
Unable to upload 100KB+ file : Keep getting #100

Closed
Sdaas opened this issue Jan 5, 2013 · 7 comments

Comments

@Sdaas

Sdaas commented Jan 5, 2013

I am consistently unable to upload files > ~100KB using s3cmd. It keeps bombing with "Errno 32 Broken pipe". I have checked that the bucket exists.

Smaller files seem to upload fine. I am running s3cmd-1.1.0beta3

s3cmd --no-guess-mime-type put myfile s3://com.xxx.yyy
myfile -> s3://com.xxx.yyy/myfile [1 of 1]
143360 of 15397790 0% in 1s 80.98 kB/s failed
WARNING: Upload failed: /myfile.war ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.00)
WARNING: Waiting 3 sec...
....

@chrisburnor

It looks like I have this issue too. The limit seems to be elapsed time rather than file size: no matter how big or small the file, if the upload takes more than about a second it fails with a broken pipe error. This is despite the fact that I have the socket timeout set to 300s.
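
For reference, that timeout is the socket_timeout option in the s3cmd configuration file (assuming the default ~/.s3cfg location), i.e. something like:

[default]
socket_timeout = 300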

@ivebeenlinuxed
Contributor

Can you confirm this is still an issue in the latest version? Quite a lot of work has been done recently.

@havramar

@ivebeenlinuxed I had the same issue with 1.6.0 and ~14MB file. I was getting:

WARNING: Upload failed: /manual_upload ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.01)
WARNING: Waiting 6 sec...

or

WARNING: Upload failed: /manual_upload ([Errno 104] Connection reset by peer)
WARNING: Retrying on lower speed (throttle=0.00)
WARNING: Waiting 3 sec...

Running with --multipart-chunk-size-mb=5 solved the issue.
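
For anyone hitting the same thing, the full command was along these lines (bucket and file names are just the ones already mentioned in this thread):

s3cmd --multipart-chunk-size-mb=5 put manual_upload s3://com.xxx.yyy/manual_upload

With a 5 MB chunk size a ~14MB upload is split into a handful of multipart requests, which presumably keeps each individual request short enough to finish before the connection is dropped.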

@tallpsmith

I'm hitting this problem as well with 1.6.0 but the --multipart-chunk-size-mb trick is not helping me.

I do have some policies defined on the bucket that auto-expire objects (after a year) and migrate them to Glacier after 30 days, but disabling those policies doesn't seem to have an impact.

It seems to occur more often in some of our geographic locations than in others, but I can't pinpoint why that would be the case (nor why others are hitting it).

@jtdevos

jtdevos commented May 17, 2016

Could this be related to the following issue in the aws-cli project: Uploading files to S3 from EC2 instances fails on some instance types #634

It looks like at least part of the issue was that the HTTP request did not include an "Expect: 100-continue" header, which is required even for small file uploads (I think).
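
One quick way to see that handshake in action (this won't produce an authenticated S3 upload, it's only for inspecting the header exchange; bucket and object names are just the ones from the original report) is something like:

curl -v -T myfile https://s3.amazonaws.com/com.xxx.yyy/myfile

For a body this large, curl's verbose trace should show it sending "Expect: 100-continue" and waiting for the server's "HTTP/1.1 100 Continue" before transmitting the file.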

@fernandoZa

As a workaround, can we install 1.5.0 instead?
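
If s3cmd was installed via pip, pinning the older release should be as simple as (assuming 1.5.0 is available on PyPI for your platform):

pip install s3cmd==1.5.0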

@fviard
Contributor

fviard commented Nov 15, 2016

This issue should now be fixed in master.
Please give it a try.
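
If you installed via pip, one way to try the unreleased code (assuming pip with git support is available) is:

pip install --upgrade git+https://github.com/s3tools/s3cmd.git

That pulls the current master branch directly from the GitHub repository rather than the latest PyPI release.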

fviard closed this as completed Nov 15, 2016