Timeout while uploading to remote storage #161

Open
miicha opened this issue Jan 9, 2022 · 8 comments

@miicha

miicha commented Jan 9, 2022

I'm running Nextcloud Hub II (23.0.1 RC1) with Backup 1.0.4.
Creating a local backup works and occ backup:point:pack 202201... creates a pack.
When trying to use occ backup:point:upload 202201... I run into ConnectException cURL error 28: Operation timed out after 30000 milliseconds with 0 bytes received.
It seems to be kind of a server issue (nextcloud/server#26071), since it can be "resolved" by setting RequestOptions::TIMEOUT in /path/to/nextcloud/lib/private/Http/Client/Client.php to a larger value (240 s).
To link another of the countless related issues I came across: nextcloud/server#18103
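For reference, a rough sketch of that workaround as shell commands (the install path is a placeholder, the exact line differs between Nextcloud versions, and since the change touches core code it will be reverted by the next server upgrade):

  # locate the hard-coded 30 s cURL timeout in the server's HTTP client
  grep -n "RequestOptions::TIMEOUT" /path/to/nextcloud/lib/private/Http/Client/Client.php
  # edit that line and raise the value (I used 240 s), then retry occ backup:point:upload
  sudo -u www-data nano /path/to/nextcloud/lib/private/Http/Client/Client.php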

To me it looks like the chunked upload mechanism used by the web interface is not used by CLI operations. Since I have a rather slow (50 Mbit/s) DSL connection on both server instances, it is not possible to transfer the 100 MB files within the default 30 s cURL timeout.

As suggested in another issue, I played around with the federated share:

  • uploading ~100 MB files via a link to the target server in the web interface works nicely
  • doing the same via the federated share (from the server I want to back up) hits the same 30 s timeout.

Therefore, I don't really know whether this repo is the correct place for the issue. Nevertheless, the timeout workaround above might help others who run into the same problem.

@mastermns

Hello,

Which kind of external storage are you using? WebDAV?

Thanks

@miicha

miicha commented Jan 11, 2022

I'm using "Nextcloud" with user name and password.

I just found an issue that would probably have helped me as well: #141 (just setting a smaller chunk size).
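In case it helps others: I haven't verified the exact config key, but if the chunk size is exposed as an app setting (as #141 suggests), it should be possible to inspect and change it via occ, roughly like this:

  # list the Backup app's settings to find the chunk-size key
  sudo -u www-data php occ config:list backup
  # then set a smaller value (in bytes) for that key, for example:
  # sudo -u www-data php occ config:app:set backup <chunk-size-key> --value <bytes>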

@arkhan

arkhan commented Jan 28, 2022

Greetings, I have the same issue using external storage with WebDAV.

@EmmyGraugans

And it's not even limited to "slow" networks. We are on a 1 Gbit LAN, and between the two federated Nextcloud instances there is even a 10 Gbit connection, but I still can't download a 1.5 GB file via the federation; it fails with the same 30-second cURL timeout. (It manages about 1 GB, then aborts the download, leaving me a broken/truncated file.)

Snippet from the relevant line in the log file:

Operation timed out after 30000 milliseconds with 1034205482 out of 1653575321 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for https://federated.server/public.php/webdav/Folder/File","userAgent":"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:96.0) Gecko/20100101 Firefox/96.0","version":"22.2.3.0"

@vdheidenet

Same here with NC 24.0.4.1 and Backup 1.1.2.
There is a gigabit connection between the NC and WebDAV servers (both located in the same data center).
When starting a backup (both app_data and backup use remote WebDAV), the error occurs fairly quickly, after writing approx. 10 MB.

@deetuned

deetuned commented Sep 5, 2022

I am experiencing the same issue. I cannot upload my backup to my remote Nextcloud share.

@Railsimulatornet

Same problem here.

@stilobique

Hi,
I have this issue too.
