Multiple simultaneous https requests through http proxy don't work #1340
Comments
Could explicit response closing solve your problem?
No. I tried both close and release. Also, read calls one of them internally.
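For context, a minimal sketch of what "explicit closing" looks like here, assuming aiohttp 1.x with the per-request proxy argument; the URL and proxy address are hypothetical placeholders, not values from this report:

```python
import asyncio

import aiohttp

URL = 'https://example.com/'     # hypothetical target url
PROXY = 'http://127.0.0.1:8080'  # hypothetical http proxy

async def fetch(session):
    resp = await session.get(URL, proxy=PROXY)
    try:
        await resp.read()      # read() releases the connection internally
    finally:
        await resp.release()   # explicitly releasing the connection...
        # resp.close()         # ...and explicitly closing it were both tried

async def main():
    async with aiohttp.ClientSession() as session:
        await fetch(session)

asyncio.get_event_loop().run_until_complete(main())
```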
Hmm. I recall something weird with proxies.
It's even worse. At every iteration you get 5 new entries (SelectorSocketTransport) in _acquired, so it stops after only 4 iterations. It seems to me that whatever method is responsible for taking the SSLProtocolTransport out of _acquired when the response is released doesn't do the same for the SelectorSocketTransport.
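A minimal sketch of how the leftover entries can be observed, again assuming aiohttp 1.x and hypothetical URL/proxy values. Note that _acquired is a private attribute whose exact shape differs between aiohttp versions, so the snippet simply prints whatever the connector still tracks after each response has been released:

```python
import asyncio

import aiohttp

URL = 'https://example.com/'     # hypothetical target url
PROXY = 'http://127.0.0.1:8080'  # hypothetical http proxy

async def main():
    conn = aiohttp.TCPConnector(limit=20)
    async with aiohttp.ClientSession(connector=conn) as session:
        for i in range(10):
            resp = await session.get(URL, proxy=PROXY)
            await resp.read()
            await resp.release()
            # _acquired is private and version-dependent; on an affected
            # version it keeps growing even though every response above
            # has been released.
            print(i, conn._acquired)

asyncio.get_event_loop().run_until_complete(main())
```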
* Add functional tests for raw_host in proxy request
* Add tests of proxy acquired cleanup
* Fix bug with https proxy acquired cleanup
* More acquired tests. Fix test session close.
* Do not release waiter on detach
* Update CHANGES
* Close proxy connect transport
* Update CHANGES
* Clean client session test traceback
Related to #1568?
It seems some servers do not complete the full shutdown procedure; in that case asyncio never closes the transport.
Long story short
Making multiple simultaneous requests to an https URL through an http proxy ends up filling the TCPConnector's acquired set, and it stops working.
Expected behaviour
It should be possible to make as many simultaneous requests as set by the connector's limit parameter, and connections should be reused and/or cleaned up after use. If the simultaneous limit is hit, the requests should wait in a queue.
Actual behaviour
TCPConnector keeps SelectorSocketTransports in its _acquired set. After a while it inserts a new set, until the limit is filled and it stops working.
Steps to reproduce
If you set number_of_requests to more than 20, it stops at the first iteration. If you run with 5, for example, it seems to work until about 500 requests have been made; then acquired rises to 10 and, if kept running, it eventually hits the limit and stops.
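The original reproduction script is not included in this excerpt; below is a minimal sketch of what it might look like, assuming aiohttp 1.x with the per-request proxy argument and hypothetical URL/proxy values. number_of_requests is the batch size mentioned above, and each iteration fires that many requests concurrently:

```python
import asyncio

import aiohttp

URL = 'https://example.com/'     # hypothetical target url
PROXY = 'http://127.0.0.1:8080'  # hypothetical http proxy
number_of_requests = 5           # batch size referred to above

async def fetch(session):
    resp = await session.get(URL, proxy=PROXY)
    try:
        await resp.read()
    finally:
        await resp.release()

async def main():
    conn = aiohttp.TCPConnector(limit=20)
    async with aiohttp.ClientSession(connector=conn) as session:
        iteration = 0
        while True:
            iteration += 1
            # number_of_requests simultaneous requests per iteration
            await asyncio.gather(
                *[fetch(session) for _ in range(number_of_requests)])
            # _acquired is private; this is only a rough indicator of how
            # many connections the connector still considers in use.
            print('iteration', iteration, 'acquired', len(conn._acquired))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
```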
Your environment
Linux, Python 3.5.2, aiohttp 1.0.5