EldoS | Feel safer!

Software components for data protection, secure storage and transfer

Error uploading large files to Google Drive

#31207
Posted: 10/30/2014 09:40:31
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 73

Hello,

while trying to upload a 1 GB file to Google Drive (which takes a few hours), I'm experiencing errors and the upload never completes successfully. I'm not sure it's a problem with the component, because I've found the same issue reported by other users:

https://code.google.com/p/gdata-issues/issues/detail?id=5700
http://stackoverflow.com/questions/23696484/google-drive-api-1-8-1-large-file-upload-download-issue

Here are the errors I've got:

Google Drive server reports the following error. Code: authError. Description: Invalid Credentials

or

Invalid OAuth response format: No Location field in the response (error code is 14593)


Could it be a bug in the API? Have you tried uploading very large files? Or could it simply be a connection problem?
#31208
Posted: 10/30/2014 10:06:08
by Alexander Ionov (EldoS Corp.)

Thank you for contacting us.

The Google Drive API supports three types of file upload: https://developers.google.com/drive/web/manage-uploads
Our component uses the first one, i.e. "uploadType=media". Although the manual mentions a 5 MB limit, we have tried uploading larger files and succeeded.
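For reference, a "media" upload is a single request against the Drive REST endpoint. The sketch below (Python, with a placeholder token; v2 was the current Drive API version at the time) only builds the request URL and headers, since the component's internal API isn't shown in this thread:

```python
# Minimal sketch of a single-request "media" upload against the Drive REST
# API. The access token value is a placeholder; nothing is sent here.

def build_media_upload_request(access_token, mime_type, file_size):
    """Return (url, headers) for a single-request media upload."""
    url = "https://www.googleapis.com/upload/drive/v2/files?uploadType=media"
    headers = {
        "Authorization": "Bearer " + access_token,  # token expires ~1 hour after issue
        "Content-Type": mime_type,
        "Content-Length": str(file_size),
    }
    return url, headers

url, headers = build_media_upload_request(
    "ya29.PLACEHOLDER", "application/octet-stream", 1024)
print(url)
```

The whole file body travels in that one request, which is why the request's lifetime equals the upload time, a detail that matters later in this thread.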

Unfortunately, our connection speed doesn't allow us to upload files as large as 1 GB, but tomorrow we'll try to upload a file several hundred MB in size.

The errors you get look like an authorization failure. Could you please create a very simple program that authorizes and uploads a file to Google Drive, then test it with a small file; if that succeeds, use the same code without any changes to upload a large file. We need to be sure that your program successfully authorizes on the Google Drive server before it tries to upload a file.


--
Best regards,
Alexander Ionov
#31209
Posted: 10/30/2014 10:54:00
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 73

Yes, the program connects and authorizes successfully. It works perfectly, and I've made several test uploads with files of about 300 MB.

The problem first occurred with a file of about 1 GB, without changing any settings. Furthermore, the error doesn't happen at the beginning, but several hours after the upload starts. That's the strange thing.

Could it be that there is some session that expires?

The API allows sending very large files, so I think the component should be tested with them.

Thank you
#31210
Posted: 10/30/2014 11:03:54
by Alexander Ionov (EldoS Corp.)

Quote
ntr1 wrote:
Could it be that there is some session that expires?

I suppose so. Unfortunately, there is no information on how long access tokens are valid. So it's possible that the access token you got at the beginning of the session expires after several hours and needs to be refreshed.
Tomorrow I'll try to upload a large file and let you know the results.


--
Best regards,
Alexander Ionov
#31211
Posted: 10/30/2014 11:09:58
by Alexander Ionov (EldoS Corp.)

BTW, did you try reading the GDrive.HTTPClient.OAuth2.Expires property? After successful authorization, this property should contain the time (in UTC) when the current access token expires. It seems access tokens are valid for one hour.
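The check Alexander describes can be illustrated like this: the OAuth2 token response carries an "expires_in" value in seconds, from which an absolute UTC expiry time is computed. The function names below are illustrative, not the component's actual API:

```python
# Rough illustration of the token-expiry check: an access token issued at
# time T with expires_in=3600 stops being valid at T + 1 hour, so any
# upload running longer than that outlives its token.
from datetime import datetime, timedelta, timezone

def token_expiry(issued_at_utc, expires_in_seconds):
    """Absolute UTC time at which the access token stops being valid."""
    return issued_at_utc + timedelta(seconds=expires_in_seconds)

def is_expired(expiry, now):
    return now >= expiry

issued = datetime(2014, 10, 30, 10, 0, 0, tzinfo=timezone.utc)
expiry = token_expiry(issued, 3600)  # Google access tokens last ~1 hour
# A 3-hour upload outlives the token; a 30-minute one does not.
print(is_expired(expiry, issued + timedelta(hours=3)))
```

This matches the symptom in the thread: 300 MB uploads (under an hour) succeed, while multi-hour 1 GB uploads fail with authorization errors.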


--
Best regards,
Alexander Ionov
#31212
Posted: 10/30/2014 11:11:44
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 73

The issue is probably caused by the UploadType you used. In fact, the best and recommended method for transferring files seems to be the "resumable" one (copied and pasted from the Google API documentation):


------------------------------------------------------------------------------
uploadType=media. For quick transfer of smaller files, for example, 5 MB or less.

uploadType=multipart. For quick transfer of smaller files and metadata; transfers the file along with metadata that describes it, all in a single request.

uploadType=resumable. For reliable transfer, especially important with larger files. With this method, you use a session initiating request, which optionally can include metadata. This is a good strategy to use for most applications, since it also works for smaller files at the cost of one additional HTTP request per upload.
------------------------------------------------------------------------------

As they say, the "resumable" type is "reliable", meant "for larger files", and "a good strategy to use for most applications".
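The resumable flow quoted above works in two steps: a session-initiating POST whose response carries a Location header (the session URI), followed by PUT requests with Content-Range headers for each chunk. A hedged sketch of just the request construction (Drive API v2 endpoints as of this thread's date; nothing is sent):

```python
# Sketch of the resumable-upload request shapes. Step 1: POST to the init
# URL; the server answers with a Location header naming the session URI.
# Step 2: PUT each chunk to that URI with a Content-Range header.

INIT_URL = "https://www.googleapis.com/upload/drive/v2/files?uploadType=resumable"

def init_headers(access_token, total_size, mime_type):
    """Headers for the session-initiating POST (body holds JSON metadata)."""
    return {
        "Authorization": "Bearer " + access_token,
        "Content-Type": "application/json; charset=UTF-8",
        "X-Upload-Content-Type": mime_type,
        "X-Upload-Content-Length": str(total_size),
    }

def chunk_content_range(offset, chunk_len, total_size):
    """Content-Range value for one chunk PUT, e.g. 'bytes 0-262143/1073741824'."""
    return "bytes %d-%d/%d" % (offset, offset + chunk_len - 1, total_size)

print(chunk_content_range(0, 262144, 1073741824))
```

Notably, the "No Location field in the response" error earlier in the thread names exactly the header this flow depends on.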

I think this method should be added to the component; otherwise it can be used only for trivial transfers, not for reliable, important ones.

Can this be included in the next maintenance release? When? This is a blocking issue for us.

Perhaps it could be an option, so the user can choose between "resumable" and "media". In any case, please let me know whether this is really the likely cause of the error.
#31216
Posted: 10/31/2014 03:34:42
by Alexander Ionov (EldoS Corp.)

I think the cause of the problem is that the access token expires before the file is uploaded. I did a test uploading a file for over an hour and got the same "No Location field in the response" error. So the problem is not the file size but the upload time: the session expires before the file finishes uploading.

Could you please explain what behavior you expect for uploading large files?

I'm going to do some experiments with resumable uploads, and only then will we be able to say whether and when we can implement this feature.


--
Best regards,
Alexander Ionov
#31225
Posted: 10/31/2014 12:10:01
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 73

Hi Alexander,

I suspected the problem could be session expiration or something similar. That's very strange, and I must admit I had the same issue with the previous component I used (it was one of the reasons I replaced it).

Perhaps it needs some sort of keepalive command, or maybe with the "resumable" type this problem can be fixed more easily; I don't know.

I've been able to send the same 1 GB file to S3 without problems.


Anyway, I expect a component that can upload at least a 10 GB file. I think that's a reasonable baseline for any backup software.


Thank you
#31236
Posted: 10/31/2014 15:01:28
by Alexander Ionov (EldoS Corp.)

The problem is not the file size but the upload time. I believe one would succeed in uploading a 10 GB file even with the "media" upload type, if it could be done within the hour during which the access token is still valid.

A "resumable" upload will fail after an hour with the same error, the difference being that the already-uploaded part of the file is on the server and is not lost when the session expires. In this case, there would need to be some means to resume the upload after a new session is established. Or maybe we have to implement automatic session renewal, after which the last chunk is re-uploaded.
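The automatic renewal described above amounts to exchanging the refresh token for a new access token at Google's token endpoint and continuing the upload with it. A hedged sketch of the refresh request (client id/secret and refresh token values are placeholders; only the form body is built, nothing is sent):

```python
# Sketch of the OAuth2 token-refresh step used for automatic session
# renewal: a POST of these form fields to Google's token endpoint returns
# a fresh access_token plus a new expires_in value.

TOKEN_URL = "https://accounts.google.com/o/oauth2/token"

def refresh_request_body(client_id, client_secret, refresh_token):
    """Form fields for the token-refresh POST (x-www-form-urlencoded)."""
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }

body = refresh_request_body("my-client-id", "my-secret", "1/REFRESH_PLACEHOLDER")
print(body["grant_type"])
```

With a resumable session, the session URI stays valid across the token refresh, so the upload can continue from the last confirmed byte with the new Authorization header.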


--
Best regards,
Alexander Ionov
#31244
Posted: 11/03/2014 02:38:40
by ntr1 (Standard support level)
Joined: 02/05/2014
Posts: 73

Hello Alexander,

I think you're right about the cause of the issue, and automatic session renewal (if it can be done) would be a good solution.

In the meantime, I've found after a few tests that the same issue affects Dropbox. A 300 MB file uploads correctly, but when I try to upload an 800 MB one, the error is always: "Connection lost (error code is 10058)".

So the same workaround must also be found for the Dropbox component, and then, I must say, everything will be perfect.
