max-retries-exceeded exceptions are confusing #1198

Closed
gabor opened this issue Feb 15, 2013 · 39 comments
@gabor

gabor commented Feb 15, 2013

hi,
for example:

>>> requests.get('http://localhost:1111')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "requests/api.py", line 55, in get
    return request('get', url, **kwargs)
  File "requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "requests/sessions.py", line 312, in request
    resp = self.send(prep, **send_kwargs)
  File "requests/sessions.py", line 413, in send
    r = adapter.send(request, **kwargs)
  File "requests/adapters.py", line 223, in send
    raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 61] Connection refused)

(assuming nothing is listening on port 1111)

The exception says "Max retries exceeded". I found this confusing because I did not specify any retry-related params; in fact, I am unable to find any documentation about specifying the retry count. After going through the code, it seems that urllib3 is the underlying transport, and it is called with max_retries=0 (so in fact there are no retries), and requests simply wraps the exception. So it is understandable, but it confuses the end-user (end-developer). I think something better should be done here, especially considering that it is very easy to get this error.
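
For illustration, a minimal sketch (mine, assuming nothing is listening on localhost:1111) showing where the message comes from: urllib3 raises MaxRetryError even when it is asked for zero retries, and requests wraps that in ConnectionError.

import urllib3
from urllib3.exceptions import MaxRetryError

# Zero retries requested, yet a connection failure still surfaces as MaxRetryError.
pool = urllib3.HTTPConnectionPool('localhost', port=1111)
try:
    pool.urlopen('GET', '/', retries=0)
except MaxRetryError as exc:
    print(type(exc).__name__)   # MaxRetryError
    print(exc.reason)           # the underlying connection error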

@sigmavirus24
Contributor

Requests wraps the exception for the user's convenience. The original exception is part of the message, although the traceback is misleading. I'll think about how to improve this.

@ghost ghost assigned sigmavirus24 Feb 15, 2013
@zew13

zew13 commented Feb 21, 2013

I think there needs to be an auto-retry option that ignores a few transient errors.

@benhoyt

benhoyt commented Apr 8, 2013

I agree this is quite confusing. Requests never retries (it sets retries=0 for urllib3's HTTPConnectionPool), so the error would be much more obvious without the HTTPConnectionPool/MaxRetryError stuff. I didn't realize requests used urllib3 until just now, when I had to dive into the source code of both libraries to figure out how many retries it was doing:

ConnectionError(MaxRetryError("HTTPSConnectionPool(host='api.venere.com', port=443): \
    Max retries exceeded with url: /xhi-1.0/services/XHI_HotelAvail.json (\
    Caused by <class 'socket.error'>: [Errno 10054] \
    An existing connection was forcibly closed by the remote host)",),)

Ideally the exception would just look something like this:

ConnectionError(<class 'socket.error'>: [Errno 10054] \
    An existing connection was forcibly closed by the remote host))

@sigmavirus24
Contributor

That would be ideal. The issue is with wrapping these exceptions like we do. They make for a great API but a poor debugging experience. I have an idea for how to fix it, though, and preserve all the information.

@Lukasa
Member

Lukasa commented Apr 9, 2013

We'd also need to consider the case where a user does configure retries, in which case this exception is appropriate.

@benhoyt

benhoyt commented Apr 9, 2013

@Lukasa, I'm not sure you do need to consider that -- Kenneth said here that Requests explicitly shouldn't support retries as part of its API.

@sigmavirus24
Contributor

Right, but there's no way to prevent a user from actually doing so.

My plan, for the record, is to traverse as far down as possible to the lowest-level exception and use that instead. The problem with @benhoyt's example is that it seems the socket error exception is unavailable to us. (Just by looking at what he has pasted; I haven't tried to reproduce it yet and play with it.)
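
A rough sketch of that traversal idea (my own illustration, not actual requests code) might look like:

def innermost_exception(exc):
    # Follow nested exceptions through .args until nothing wrapped remains.
    current = exc
    while current.args and isinstance(current.args[0], BaseException):
        current = current.args[0]
    return current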

@sigmavirus24
Contributor

@gabor 's example actually makes this easy to reproduce. Catching the exception that's raised, I did the following:

>>> e
ConnectionError(MaxRetryError("HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)",),)
>>> e.args
(MaxRetryError("HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)",),)
>>> e.args[0].args
("HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)",)
>>> e.args[0].args[0]
"HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)"
>>> isinstance(e.args[0].args[0], str)
True

So the best we could do is use only the message stored in e.args[0].args[0], which could potentially be confusing as well, but probably less so than what @benhoyt encountered. Either way, we will not parse error messages to try to extract more or fewer details, because that would just be utter insanity.

@benhoyt

benhoyt commented Apr 9, 2013

@sigmavirus24, I agree string parsing in exceptions is a terrible idea. However, urllib3's MaxRetryError already exposes a reason attribute which contains the underlying exception (see source code). So you can get what you want with e.args[0].reason.

So continuing with the example above, e.args[0].reason is an instance of socket.error:

>>> requests.get('http://localhost:1111')
Traceback (most recent call last):
  ...
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10061] No connection could be made because the target machine actively refused it)
>>> e = sys.last_value
>>> e
ConnectionError(MaxRetryError("HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10061] No connection could be made because the target machine actively refused it)",),)
>>> e.args[0]
MaxRetryError("HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10061] No connection could be made because the target machine actively refused it)",)
>>> e.args[0].reason
error(10061, 'No connection could be made because the target machine actively refused it')
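
So a caller can already dig the root cause out themselves, along these lines (a sketch; e.args[0] is the wrapped MaxRetryError, as shown above):

import requests

try:
    requests.get('http://localhost:1111')
except requests.exceptions.ConnectionError as e:
    wrapped = e.args[0]                               # urllib3's MaxRetryError
    underlying = getattr(wrapped, 'reason', wrapped)  # the real socket-level error
    print(repr(underlying))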

@sigmavirus24
Contributor

Nice catch @benhoyt. I'm not as familiar with urllib3 as I would like to be.

@piotr-dobrogost

If it really looks as you showed, i.e.
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=1111): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 61] Connection refused)

then I couldn't dream of a better exception, really.

@benhoyt

benhoyt commented Apr 10, 2013

@piotr-dobrogost, the main problem (for me) was the fact that it talks about "max retries exceeded", when there's no retrying involved at all. At first I thought it was the web service I was using saying that, so I contacted them. Then, digging further, I discovered this was a urllib3 quirk. So you can see the confusion.

@piotr-dobrogost

Have you missed the (Caused by <class 'socket.error'>: [Errno 61] Connection refused) part of the exception?

@benhoyt

benhoyt commented Apr 10, 2013

Yeah, you're right -- it's all there. But as I mentioned, I missed that at first, because the MaxRetryError is a red herring.

@ksnavely

ksnavely commented Feb 7, 2014

This max retries thing always drives me mad. Does anybody mind if I dive in and see if I can't put a PR together to squash the retries message?

I don't mean to appear out of nowhere, but I use requests tons in the Python work we do at Cloudant. We get pages that include the retries thing, and it can be a red herring.

@Lukasa
Member

Lukasa commented Feb 8, 2014

The answer is maybe.

The problem is that, while by default we don't perform any retries, you can configure Requests to automatically retry failed requests. In those situations, the MaxRetryError we've wrapped is totally reasonable. If you can come up with a solution that leaves the MaxRetryError in place when it ought to be, but removes it when you can guarantee no retry attempts have been made, we'll consider it. =)
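
A hedged sketch of that condition (a hypothetical helper, not the actual adapter code): unwrap only when zero retries were configured and the root cause is available.

from requests.exceptions import ConnectionError

def wrap_for_user(max_retry_error, configured_retries):
    # Hypothetical: surface the root cause only when no retries were ever requested.
    if configured_retries == 0 and getattr(max_retry_error, 'reason', None):
        return ConnectionError(max_retry_error.reason)
    # Retries were configured and exhausted, so MaxRetryError is the honest story.
    return ConnectionError(max_retry_error)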

@ksnavely

ksnavely commented Feb 8, 2014

@Lukasa thanks, I'm refreshing myself on the backlog here. If I get a chance to dive in I will definitely reach out.

@ksnavely

ksnavely commented Feb 9, 2014

It almost seems to me as if the right place for the change is in urllib3? MaxRetryError makes sense to raise in the context of automatic retries, but in the case of zero retries (perhaps the naive requests experience) it can be confusing.

In urllib3 it seems the confusing errors can be triggered here via requests. It'd almost be nice to only raise a MaxRetryError when the retry counter has actually counted down to 0 from a non-zero max_retries; if max_retries==0, a plain RequestError could be raised instead.

I see that the urllib3 used by requests lives in an included (vendored) package -- just curious, why is that? Anyway, these were just a few ideas I wanted to toss out there. I'm still catching up on the codebases.

@Lukasa
Member

Lukasa commented Feb 9, 2014

Whether or not the fix belongs in urllib3 is totally down to @shazow. Given that urllib3 by default does retry (3 times IIRC), it may be that he wants to keep urllib3's behaviour as is. Pinging him to get his input.

We vendor urllib3 to avoid some dependency issues. Essentially, it means we're always operating against a known version of urllib3. This has been discussed at excruciating length in #1384 and #1812 if you want the gritty details.

@ksnavely

ksnavely commented Feb 9, 2014

Phew, gritty but informative. @shazow these are just a few thoughts I had -- raising a RequestError rather than a MaxRetryError, as above. Really, I think I understand the MaxRetryError better after checking out urlopen.

Double edit: really, even just a kwarg would do, so one could raise MaxRetryError(retries=0) and alter the message when retries==0.

@shazow
Contributor

shazow commented Feb 9, 2014

How about a retries=False which would disable retries altogether and always raise the original exception instead of MaxRetryError?

@cash

cash commented Feb 19, 2014

Being able to distinguish between asking urlopen for no retries and having it count down to 0 on the number of retries would be useful. It is jarring seeing the MaxRetryError when you did not ask for retries.

@shazow
Contributor

shazow commented Feb 19, 2014

If anyone would like to do a patch+test for this, it would be appreciated. :)

@ksnavely

@shazow great, I'd be game to if I can find the cycles. I'll ping if I have anything.

@shazow
Contributor

shazow commented Feb 19, 2014

\o/

@arcolife

I was wondering whether any patch has been released? This issue seems to be about a year old.

@Lukasa
Member

Lukasa commented Aug 26, 2014

Not as far as I'm aware. =)

@shazow
Contributor

shazow commented Sep 2, 2014

retries=False should raise the original exception as of v1.9, no wrapping.
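
For example (my sketch of the behaviour described, assuming urllib3 >= 1.9 and nothing listening on the port), retries=False re-raises the original urllib3 error instead of wrapping it in MaxRetryError:

import urllib3
from urllib3.exceptions import HTTPError, MaxRetryError

http = urllib3.PoolManager()
try:
    http.request('GET', 'http://localhost:1111', retries=False)
except MaxRetryError:
    print('still wrapped')            # not expected with retries=False
except HTTPError as exc:
    print('unwrapped error:', exc)    # e.g. NewConnectionError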

@sigmavirus24
Contributor

@kevinburke thoughts?

@kevinburke
Contributor

Need a little more time


@kevinburke
Contributor

Yeah, this has been fixed, I think:

requests.get('http://localhost:11211')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "requests/api.py", line 60, in get
    return request('get', url, **kwargs)
  File "requests/api.py", line 49, in request
    return session.request(method=method, url=url, **kwargs)
  File "requests/sessions.py", line 457, in request
    resp = self.send(prep, **send_kwargs)
  File "requests/sessions.py", line 569, in send
    r = adapter.send(request, **kwargs)
  File "requests/adapters.py", line 407, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(61, 'Connection refused'))

@SiddheshS

Could you please tell me how this was resolved, since I am also getting a connection refused problem on my end. In my Python script I am trying to connect to an RPC server.

@Lukasa
Member

Lukasa commented Feb 22, 2016

@SiddheshS This issue was fixed by rewording some exceptions: it has nothing to do with the actual connection refused error. To ask for help with a problem you should consider using Stack Overflow.

@nkjulia

nkjulia commented May 3, 2016

I encountered the same problem; it happens occasionally. How can I fix it? Can anyone help me? Thanks.

requests.exceptions.ConnectionError: HTTPSConnectionPool(host='api.xxxx.com', port=443): Max retries exceeded with url: /v2/goods/?category=0&sort_type=2&page_size=3&page_num=13&t=0&count=110 (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f033a2c2590>: Failed to establish a new connection: [Errno 110] Connection timed out',))
Traceback (most recent call last):
  File "test.py", line 335, in <module>
    main()
  File "test.py", line 290, in main
    result = get_goods_info()
  File "test.py", line 67, in get_goods_info
    result = requests.get(url)
  File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 69, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 50, in request
    response = session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 423, in send
    raise ConnectionError(e, request=request)

@Lukasa
Member

Lukasa commented May 3, 2016

@nkjulia The connection attempt is timing out, which suggests that the remote server is overloaded or that your connection timeout is too low.
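
As a practical aside (not part of the original reply): the usual mitigation is an explicit timeout plus an adapter mounted with a real retry budget, roughly:

import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
session.mount('https://', HTTPAdapter(max_retries=3))   # retry transient failures

try:
    # Placeholder URL; substitute the real endpoint.
    resp = session.get('https://api.example.com/v2/goods/', timeout=10)
except requests.exceptions.ConnectionError as exc:
    print('still failing after retries:', exc)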

@linzhi

linzhi commented Jul 20, 2017

I was also misled by this....

@duginivijay

@kevinburke how did you resolve your issue after getting the connection refused error? Could you please advise, buddy. TIA

@duginivijay

Ignore my post, mate. I had multiple versions of Python on my machine, so the wrong one was being picked up and throwing the error. Posting this in case it is helpful for someone.

@pigga

pigga commented Dec 27, 2018

I encountered the same problem as @nkjulia describes above (Max retries exceeded, caused by NewConnectionError: [Errno 110] Connection timed out); it happens occasionally.

If this issue has been solved, please give me some advice.

@psf psf locked as resolved and limited conversation to collaborators Dec 27, 2018