S3 Upload consumes a lot of memory

#36342
Posted: 03/30/2016 10:52:22
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 71

Hi,

I've made many tests, but it's still quite problematic.

Setting threshold = 20MB and part size = 10MB, as you suggested, works correctly with a small file, but try uploading a 2-3 GB file with these settings: the interface is completely frozen.

Can you run a test with a large file? I think this issue is quite easy to reproduce.

Just make this simple test: use your sample AWS app, select a 5 GB file and try to send it. The app freezes.
#36343
Posted: 03/30/2016 11:34:36
by Alexander Ionov (EldoS Corp.)

Quote
ntr1 wrote:
Can you run a test with a large file?

Sure. We'll run a test and let you know as soon as we have any news.


--
Best regards,
Alexander Ionov
#36344
Posted: 03/30/2016 12:11:01
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 71

Hi,

thank you for your prompt reply.
I hope this issue can be solved quickly, since we have a lot of customers with this problem and we wrongly assumed the component worked with both small and large files...
#36355
Posted: 03/31/2016 07:43:55
by Alexander Ionov (EldoS Corp.)

Well, here are the results.

1. If a file is transferred as a single part, the program allocates memory approximately equal to the size of the data to be transferred (if no external temporary stream is provided).

2. If I set MultipartUploadThreshold to a smaller value, the file is transferred in multipart mode and the program allocates a bit more memory than the value of the MultipartUploadPartSize property (BTW, we've found a bug that prevented the component from using a part size greater than 5MB).

If you have the SBAWSDataStorage unit sources, you can fix it yourself: find the following line in the TElAWSS3DataStorage.DSRawWriteObject method and replace the "Min" call with "Max".
Code
MultipartChunkLen := Min(FMultipartUploadPartSize, 5242880); // S3 defines 5MB as minimum part size
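For reference, after that one-word change the line should read like this (a sketch assuming the rest of the method stays as is):
Code
MultipartChunkLen := Max(FMultipartUploadPartSize, 5242880); // never use a part smaller than S3's 5MB minimum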


3. The OnProgress event is called after each part is sent to the transport component. The sample app doesn't update the progress form because of another bug. Please replace the code of the TfrmProgress.UpdateForm method with the following lines:
Code
// Show the raw progress values in the label
lblProgress.Caption := 'Operation progress: ' + IntToStr(Current) + ' / ' + IntToStr(Total);
// Scale the counts down until Total fits into the progress bar's range
while Total > High(ProgressBar1.Max) do
begin
  Total := Total div 1024;
  Current := Current div 1024;
end;
ProgressBar1.Max := Total;
ProgressBar1.Position := Current;
// Pump the message queue so the form repaints and the Cancel button stays responsive
Application.ProcessMessages();
// Report back whether the user has requested cancellation
Cancelled := FCancelled;

After this, the sample will update the progress bar and you'll be able to interrupt the upload by clicking the Cancel button.

Please let us know if this solves your issue.


--
Best regards,
Alexander Ionov
#36358
Posted: 03/31/2016 09:51:19
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 71

Hi,

thank you for your answer.

I'll try, but I don't have that source file (I have the compiled version).
Could you send it to me with the fix applied, so I can use it temporarily until you release a new version?

In the meantime, I'll simply try to add that code to the OnProgress event.

Should I use threshold = 20MB and part size = 10MB?

I'll try with a large file and let you know the result.
#36360
Posted: 03/31/2016 10:13:15
by Alexander Ionov (EldoS Corp.)

Quote
ntr1 wrote:
Should I use threshold = 20MB and part size = 10MB?

You need to set the threshold to any value smaller than your file size to turn on multipart uploading. The part size doesn't matter in your case, because the bug in your copy of the component (not yet fixed) prevents the use of parts larger than 5MB.
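
For illustration, enabling multipart mode is a single assignment (a sketch reusing the FStorage variable from your code; any value below the size of the files you upload will do):
Code
// anything smaller than the file size turns on multipart uploading
FStorage.MultipartUploadThreshold := 20 * 1024 * 1024; // 20 MB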


--
Best regards,
Alexander Ionov
#36363
Posted: 03/31/2016 10:47:41
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 71

It seems it doesn't work.

Again starting from your sample app:

1. I've replaced the code in the UpdateForm method.

2. I've added a single line to set the threshold:

FStorage.MultipartUploadThreshold := ...; (tried with 10 MB, 1 MB, 65 KB...)

3. Then I tried to upload a 5 GB file. The app is frozen and the OnProgress event simply isn't fired.
#36365
Posted: 03/31/2016 11:01:22
by Alexander Ionov (EldoS Corp.)

Let's move to our Helpdesk for further investigation.


--
Best regards,
Alexander Ionov
#36539
Posted: 04/21/2016 05:34:32
by Raviraj Magadum (Basic support level)
Joined: 04/21/2016
Posts: 1

Hi ntr1, Alexander.

I am facing the same issue. I have 4GB of RAM and I'm trying to upload a 5GB file with a PHP multipart upload, but it consumes a lot of memory and throws a memory allocation error. Is this a bug in AWS? If so, how can it be achieved some other way?
Will giving a direct S3 URL to the upload form help?

You can also find my question on Stack Overflow:

http://stackoverflow.com/questions/36334483/aws-s3-multipart-upload-fails-memory-allocation-exhausted-php
#36541
Posted: 04/21/2016 05:57:33
by Eugene Mayevski (EldoS Corp.)

Raviraj, are you using our components (SecureBlackbox) in your script? If not, we won't be able to help you: this forum is about our products.


Sincerely yours
Eugene Mayevski