Failed to upload the chunk / connection timed out errors using Azure #361
How many threads are you using?
I haven't specified the threads option, so it is whatever the default is. The command I am using is "duplicacy backup -stats".
Can you change this line (duplicacy/src/duplicacy_azurestorage.go, line 11 in e07226b) and then rebuild from source? This will pick up the change I made to retry on timeout errors: gilbertchen/azure-sdk-for-go@53194c2. If you would rather have a working binary, please let me know.
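For reference, the suggested edit is presumably to point the Azure SDK import at the gilbertchen fork; the sketch below is a guess at what that one-line swap looks like (the package paths and surrounding file layout are assumptions, not the actual contents of duplicacy_azurestorage.go):

```go
// Sketch only: illustrating the presumed import swap, with guessed paths.
package duplicacy

import (
	// Before (assumed upstream import):
	//   "github.com/Azure/azure-sdk-for-go/storage"
	// After: the fork that retries on timeout errors
	// (gilbertchen/azure-sdk-for-go@53194c2). The blank identifier just
	// keeps this standalone sketch compiling without using the package.
	_ "github.com/gilbertchen/azure-sdk-for-go/storage"
)
```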
Sorry for the delayed response; I was out of town last week. I don't have an easy way to do the builds myself, so can you help me get a binary (for Mac) to test this with?
Can you try this build: https://acrosync.com/duplicacy/duplicacy_osx_x64_2.0.10e?
Hey, I'm having this identical problem, see here for details. I, too, am running on Mac OS X. Any problem if I test this fix as well?
@jeffaco please test the fix and let me know if it works.
Minor immediate problem: after starting the new build, I also observed that after the number of patterns loaded is printed, there is a LONG delay. I have a 1.6TB file system, so I imagine it's looking at files, but with this many files it takes 5-10 (or maybe more) minutes of no output during this step. Beyond that, this problem is pretty intermittent. I'll get back to you if/when it completes, thanks!
Looks like the new version did NOT fix the problem:
Here's the version I'm running:
Note that the error is slightly different this time. Please let me know how to proceed, thanks!
The new build https://acrosync.com/duplicacy/duplicacy_osx_x64_2.0.10f should be able to retry on this error.
I didn't want to leave you hanging. I did install this last night, around 8:00 PM (shortly after you posted the new build). This is a big backup, but so far so good:
This is promising, but certainly not conclusive, since these problems were intermittent to begin with. I'll get back with more definitive information within 2 days, 7 hours! 😄
Duplicacy aborted again while backing up to Azure. I got MUCH farther this time, farther than ever before, but it did abort:
The error this time is different from before. If you can let me know how to proceed, that would be great. Thanks so much.
The previous fix only retries on temporary errors, but this broken pipe error is not classified as temporary.
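Background note (not duplicacy's actual code): in Go, retry logic that only covers "temporary" errors usually asks the net.Error interface whether the failure is temporary, and a broken pipe (EPIPE) answers no, so it slips past such a check unless it is matched explicitly. A minimal sketch of that distinction:

```go
package main

import (
	"errors"
	"fmt"
	"net"
	"syscall"
)

// retryableAsTemporary mimics a "retry only temporary errors" policy:
// it asks the net.Error interface whether the failure is temporary.
func retryableAsTemporary(err error) bool {
	var netErr net.Error
	return errors.As(err, &netErr) && netErr.Temporary()
}

func main() {
	// Roughly what a failed write over a dead connection looks like.
	brokenPipe := &net.OpError{Op: "write", Net: "tcp", Err: syscall.EPIPE}

	fmt.Println(retryableAsTemporary(brokenPipe))     // false: EPIPE is not "temporary"
	fmt.Println(errors.Is(brokenPipe, syscall.EPIPE)) // true: it has to be matched explicitly
}
```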
I can retry, but this is a 2+ day upload. I'd like to see a full upload happen at least once (in one shot) to know that communications are good. I have no reason to believe that this won't come up again, at least at times. A broken pipe is a synchronization issue between essentially two processes (in this case, no doubt between Azure and Duplicacy). Since you know exactly which chunk you were trying to store, why wouldn't you just retry the operation? It's harmless to retry, as the new chunk would just replace the old chunk, right?
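A rough illustration of the retry being asked for above (this is not duplicacy's code; the function names, attempt count, and backoff are made up): because re-uploading the same chunk just replaces it, a failed upload can simply be attempted again.

```go
package main

import (
	"fmt"
	"time"
)

// uploadWithRetry retries a chunk upload a few times. uploadChunk stands in
// for whatever actually writes the chunk to Azure; since re-sending the
// same chunk is harmless, retrying after a broken pipe loses nothing.
func uploadWithRetry(uploadChunk func() error, attempts int) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = uploadChunk(); err == nil {
			return nil
		}
		// Back off a little before re-sending the same chunk.
		time.Sleep(time.Duration(i+1) * time.Second)
	}
	return fmt.Errorf("chunk upload failed after %d attempts: %w", attempts, err)
}

func main() {
	calls := 0
	err := uploadWithRetry(func() error {
		calls++
		if calls < 3 {
			return fmt.Errorf("write: broken pipe") // simulated transient failure
		}
		return nil
	}, 5)
	fmt.Println(calls, err) // 3 <nil>
}
```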
I ran the backup again and it did recur with the same error:
Same error: (It took about two days before this error occurred...) /Jeff
Any guidance on how to proceed here? It feels like large uploads to Azure aren't very reliable, which is concerning since I have some large uploads to do. Thanks for any advice.
This build https://acrosync.com/duplicacy/duplicacy_osx_x64_2.1.0a should handle the broken pipe error more gracefully.
Awesome, thank you so much! I've restarted the backup and I'll let you know (if all goes well, it will take up to 3 days, I estimate).
I don't think the new image (2.1.0a) fixed the problem.
Same error: This seems to be maddeningly consistent 😞. Any guidance on what to try next would be appreciated, thanks!
This build (https://acrosync.com/duplicacy/duplicacy_osx_x64_2.1.0b) should work. The previous fix didn't work because the
Backup finally finished successfully:
I would say that your latest fix does indeed resolve the broken pipe issue.
While I'm glad the new build worked for you, I noticed a bug: the total number of file chunks can't be 0. This was likely introduced after the 2.1.0 release. I'll look into it.
Does that mean that this backup is invalid (even though it does pass a check)?
It is just a problem in the output. The backup should be fine as long as it passes the check command.
Awesome, thanks!
The 0 file chunk bug has been fixed by 5d2242d.
Thanks for taking care of this! This issue can be closed since the problems reported in it are fixed. I would have closed it myself, but I don't have permission.
While running the initial backup to Azure storage from macOS and Linux computers running 2.0.10, the process eventually fails with a message like this:
Whenever this happens I just restart the backup process, and eventually it finishes. The problem is that these are the initial backups, so they are huge; instead of just running for a day, they take forever because the process fails after an hour or two and doesn't get started again until I notice it.
I am not sure whether this happens on Windows because I haven't tried from a Windows PC yet, but I assume it does. I am also not sure whether this only happens when writing to Azure storage or if it affects other clients too.