
RuntimeError: Timeout context manager should be used inside a task #440

Closed
beneshed opened this issue Nov 20, 2017 · 24 comments
@beneshed

beneshed commented Nov 20, 2017

Just today I started having this issue (so far just with SQS). Context: I'm calling from inside an aiohttp server request:

class RequestView(web.View):
    async def post(self):
        body = await self.request.json()
        await self.request.app['sqs'].async_send_message(body=body)
        return web.json_response({
            'message': 'SUCCESS'
        })

In the async_send_message function:

 response = await client.send_message(QueueUrl=self.queue_url, MessageBody=body)
Traceback (most recent call last):
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/web_protocol.py", line 416, in start
    resp = yield from self._request_handler(request)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/web.py", line 323, in _handle
    resp = yield from handler(request)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/web_urldispatcher.py", line 748, in __iter__
    resp = yield from method()
  File "gozer/venvs/dev/lib/python3.6/site-packages/gozer-0.1-py3.6.egg/gozer/gatekeeper/decorators.py", line 34, in wrapper
    return await func(*args)
  File "gozer/venvs/dev/lib/python3.6/site-packages/gozer-0.1-py3.6.egg/gozer/gatekeeper/views.py", line 48, in post
    await self.request.app['sqs'].async_send_message(body=to_send)
  File "gozer/venvs/dev/lib/python3.6/site-packages/gozer-0.1-py3.6.egg/gozer/apis/sqs.py", line 53, in async_send_message
    QueueUrl=self.queue_url, MessageBody=body)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiobotocore/client.py", line 80, in _make_api_call
    operation_model, request_dict)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiobotocore/endpoint.py", line 265, in _send_request
    exception)):
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiobotocore/endpoint.py", line 297, in _needs_retry
    caught_exception=caught_exception, request_dict=request_dict)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/hooks.py", line 227, in emit
    return self._emit(event_name, kwargs)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/hooks.py", line 210, in _emit
    response = handler(**kwargs)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 183, in __call__
    if self._checker(attempts, response, caught_exception):
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 251, in __call__
    caught_exception)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 269, in _should_retry
    return self._checker(attempt_number, response, caught_exception)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 317, in __call__
    caught_exception)
  File "/gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 223, in __call__
    attempt_number, caught_exception)
  File "gozer/venvs/dev/lib/python3.6/site-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
    raise caught_exception
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiobotocore/endpoint.py", line 321, in _get_response
    request.method, request.url, request.headers, request.body)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiobotocore/endpoint.py", line 254, in _request
    timeout=None)
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/helpers.py", line 99, in __iter__
    ret = yield from self._coro
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/client.py", line 266, in _request
    with CeilTimeout(self._conn_timeout, loop=self._loop):
  File "gozer/venvs/dev/lib/python3.6/site-packages/aiohttp/helpers.py", line 744, in __enter__
    'Timeout context manager should be used inside a task')
RuntimeError: Timeout context manager should be used inside a task
aiobotocore==0.5.1
aiohttp==2.3.3
asn1crypto==0.23.0
async-timeout==2.0.0
botocore==1.7.40
cffi==1.11.2
chardet==3.0.4
cryptography==2.1.3
docutils==0.14
gunicorn==19.7.1
idna==2.6
jmespath==0.9.3
Logbook==1.1.0
marshmallow==2.14.0
multidict==3.3.2
packaging==16.8
pycparser==2.18
pyparsing==2.2.0
python-dateutil==2.6.1
riprova==0.2.3
six==1.10.0
uvloop==0.8.1
wrapt==1.10.11
yarl==0.14.2

Am I doing something obviously wrong?

@hellysmile
Member

Are you using Tornado?

@beneshed
Author

beneshed commented Nov 20, 2017

No, just plain asyncio.

I'm running an aiohttp server that's using uvloop, running inside gunicorn.

edit: been looking here szastupov/aiotg#36

@beneshed
Author

beneshed commented Nov 20, 2017

After following the stack trace for a bit, it seems that either botocore or aiohttp is ignoring the timeout=None parameter.

edit: botocore is passing a timeout of 60s... is there a version or a way to get rid of this?

edit2: it also seems to affect just sqs.send_message and sqs.get_queue_url (an SQS problem?)

@thehesiod
Collaborator

Hmm, for some reason "current_task" in aiohttp is not finding the current task. Can you debug into it to see why that is? Are you using multiple loops? Could there be loop confusion?

@beneshed
Author

@thehesiod single loop. It's getting None for the current task. Any more suggestions?

@jettify
Member

jettify commented Nov 21, 2017

Could you try to run the same code, but without the view, as just a simple async def function?

@thehesiod
Collaborator

Any way to get a simple test case for us to try?

@asvetlov
Member

Task.current_task() can be None if the method is not executed from a coroutine.
Or if it is executed from Tornado; that library is still not 100% compatible with asyncio.
It should be fixed by Tornado 5, I hope.

@DeoLeung

DeoLeung commented Dec 5, 2017

I'm getting this in Tornado; hope it can be fixed soon.

@asvetlov
Member

asvetlov commented Dec 6, 2017

For Tornado there is a workaround: create an asyncio task in the Tornado handler and wait for its result:

class MyHandler(tornado.web.RequestHandler):
    async def task(self):
        await self.boto.call(...) 

    async def get(self):
        await asyncio.ensure_future(self.task())

@beneshed
Author

beneshed commented Dec 7, 2017

OK, here's a simple test case, not in an aiohttp request:

async def fetch():
    q = SQSAsync(queue_name="working-queue", wait_time=10)
    message = await q.async_get_message()
    print(message)


if __name__ == '__main__':
    loop = asyncio.get_event_loop()

    loop.run_until_complete(asyncio.wait([asyncio.ensure_future(fetch())]))
    loop.close()

This works.

@asvetlov not using tornado :(

@thehesiod single thread (unless it's using multiple in the underlying libraries)

@thehesiod
Collaborator

If you provide an encapsulated test case, we can take a look. Another thing: if you're using multiple threads on 3.5.3+, each thread gets its own loop.

@asvetlov
Member

asvetlov commented Dec 7, 2017

  1. asyncio.ensure_future() creates a task if a coroutine is passed.
  2. asyncio.wait() calls ensure_future().
  3. loop.run_until_complete() calls ensure_future().

Every async def in asyncio is executed inside a task. There is no way to run an async function outside a task context without weird tricks.
If aiobotocore relies on such tricks, the library should be fixed.
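
This can be checked directly. Here is a minimal sketch (mine, not from aiobotocore) showing that a coroutine driven by loop.run_until_complete() always sees a current task; on Python 3.6 the call was asyncio.Task.current_task(), on 3.7+ it is asyncio.current_task():

```python
import asyncio

async def main():
    # run_until_complete() wraps a bare coroutine in a Task via
    # ensure_future(), so the current task is never None here.
    return asyncio.current_task() is not None  # Task.current_task() on 3.6

loop = asyncio.new_event_loop()
print(loop.run_until_complete(main()))  # True
loop.close()
```

If current_task() comes back None despite this, the coroutine is being driven by something other than a plain asyncio task, e.g. a different loop or a framework shim.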

@manicai

manicai commented Feb 16, 2018

I'm seeing the same error when testing with PyTest.

$ py.test test.py
==================================================== test session starts =====================================================
platform linux -- Python 3.6.4, pytest-3.4.0, py-1.5.2, pluggy-0.6.0
rootdir: /home/ian/PyCharmDeployments/DHB/server, inifile: pytest.ini
plugins: cov-2.5.1, asyncio-0.8.0, aiohttp-0.3.0
collected 1 item
run-last-failure: rerun previous 1 failure first

test.py F                                                                                                              [100%]

========================================================== FAILURES ==========================================================
________________________________________________________ test_upload _________________________________________________________

s3_cache = <test.Cache object at 0x7f7c3018ee80>
temp_file = '/tmp/pytest-of-ian/pytest-8/test_upload0/68c70c73-d1ae-4da1-8cfe-26567b11b4d5'

    @pytest.mark.asyncio
    async def test_upload(s3_cache, temp_file):
        value = b'It was the best of times, it was the worst of times'
        with open(temp_file, 'wb') as fp:
            fp.write(value)
>       await s3_cache.upload(A_HASH, temp_file)

test.py:72:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test.py:41: in upload
    Key=s3_key, Body=fp)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiobotocore/client.py:80: in _make_api_call
    operation_model, request_dict)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiobotocore/endpoint.py:265: in _send_request
    exception)):
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiobotocore/endpoint.py:297: in _needs_retry
    caught_exception=caught_exception, request_dict=request_dict)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/hooks.py:227: in emit
    return self._emit(event_name, kwargs)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/hooks.py:210: in _emit
    response = handler(**kwargs)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:183: in __call__
    if self._checker(attempts, response, caught_exception):
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:251: in __call__
    caught_exception)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:269: in _should_retry
    return self._checker(attempt_number, response, caught_exception)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:317: in __call__
    caught_exception)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:223: in __call__
    attempt_number, caught_exception)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/botocore/retryhandler.py:359: in _check_caught_exception
    raise caught_exception
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiobotocore/endpoint.py:321: in _get_response
    request.method, request.url, request.headers, request.body)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiobotocore/endpoint.py:254: in _request
    timeout=None)
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiohttp/helpers.py:104: in __iter__
    ret = yield from self._coro
/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiohttp/client.py:266: in _request
    with CeilTimeout(self._conn_timeout, loop=self._loop):
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.helpers.CeilTimeout object at 0x7f7c31141cc0>

    def __enter__(self):
        if self._timeout is not None:
            self._task = current_task(loop=self._loop)
            if self._task is None:
                raise RuntimeError(
>                   'Timeout context manager should be used inside a task')
E               RuntimeError: Timeout context manager should be used inside a task

/home/ian/.virtualenvs/server-0AsZQxST/lib/python3.6/site-packages/aiohttp/helpers.py:749: RuntimeError
================================================== 1 failed in 1.75 seconds ==================================================

I've boiled this down to the following test code.

import aiobotocore
import botocore.exceptions
import botocore.session
import os
import pytest
import uuid


region = 'eu-west-1'
A_HASH = 'd97f577bdec62038af39eff817004ddf128618075c174e35bf6154ad9926fa98'

def _is_404(client_error: botocore.exceptions.ClientError) -> bool:
    """Return true if a ClientError exception from Boto corresponds to a
    HTTP 404 not found response.
    """
    error = client_error.response.get('Error', {})
    error_code = error.get('Code', '')
    return error_code == '404'

class Cache():
    """Amazon S3 based cache."""
    def __init__(self, s3_bucket):
        super().__init__()
        self.bucket = s3_bucket
        self.session = aiobotocore.get_session()

    async def exists(self, s3_key: str):
        async with self.session.create_client('s3') as client:
            try:
                _ = await client.head_object(Bucket=self.bucket, Key=s3_key)
                return True
            except botocore.exceptions.ClientError as error:
                if _is_404(error):
                    return False
                raise

    async def upload(self, s3_key: str, file_path: str):
        async with self.session.create_client('s3') as client:
            with open(file_path, 'rb') as fp:
                await client.put_object(Bucket=self.bucket,
                                        Key=s3_key, Body=fp)

@pytest.fixture
def temp_file(tmpdir):
    filename = str(uuid.uuid4())
    path = tmpdir / filename
    yield str(path)
    os.unlink(path)


@pytest.fixture(scope="session")
def s3_bucket():
    session = botocore.session.get_session()
    client = session.create_client('s3', region_name=region)
    name = 'dhb-tests-' + str(uuid.uuid4()).replace('-', '')
    client.create_bucket(Bucket=name, ACL='private',
                         CreateBucketConfiguration=dict(LocationConstraint=region))
    yield name
    client.delete_bucket(Bucket=name)


@pytest.fixture
def s3_cache(s3_bucket):
    return Cache(s3_bucket)

@pytest.mark.asyncio
async def test_upload(s3_cache, temp_file):
    value = b'It was the best of times, it was the worst of times'
    with open(temp_file, 'wb') as fp:
        fp.write(value)
    await s3_cache.upload(A_HASH, temp_file)

    exists = await s3_cache.exists(A_HASH)
    assert exists

Python 3.6.4 with the following installed:

aiobotocore==0.5.2
aiofiles==0.3.2
aiohttp==2.3.10
aioresponses==0.3.2
apyio==0.2.0
astroid==1.6.1
async-timeout==2.0.0
attrs==17.4.0
botocore==1.7.40
certifi==2018.1.18
chardet==3.0.4
coverage==4.5
decorator==4.2.1
docutils==0.14
futures==3.1.1
idna==2.6
idna-ssl==1.0.0
ipython==6.2.1
ipython-genutils==0.2.0
isort==4.3.3
jedi==0.11.1
jmespath==0.9.3
lazy-object-proxy==1.3.1
mccabe==0.6.1
multidict==4.1.0
packaging==16.8
parso==0.1.1
pexpect==4.3.1
pickleshare==0.7.4
pkginfo==1.4.1
pluggy==0.6.0
prompt-toolkit==1.0.15
ptyprocess==0.5.2
py==1.5.2
Pygments==2.2.0
pylint==1.8.2
pyparsing==2.2.0
pytest==3.4.0
pytest-aiohttp==0.3.0
pytest-asyncio==0.8.0
pytest-cov==2.5.1
pytest-cover==3.0.0
pytest-coverage==0.0
python-dateutil==2.6.1
PyYAML==3.12
q==2.6
requests==2.18.4
requests-toolbelt==0.8.0
simplegeneric==0.8.1
six==1.11.0
tqdm==4.19.5
traitlets==4.3.2
twine==1.9.1
typing==3.6.4
urllib3==1.22
wcwidth==0.1.7
wrapt==1.10.11
yarl==1.1.0

@manicai

manicai commented Feb 16, 2018

Ah, scrub that. I've found that my problem was because the session was being created in the constructor rather than inside one of the async methods. Apologies.

@thehesiod
Collaborator

@manicai @thebenwaters any updates? If there's no test case, I'm going to close this in a few days.

@manicai

manicai commented Feb 24, 2018

@thehesiod No change; the code above will still reproduce it, but since refactoring to ensure the session is created inside a coroutine, everything works fine. Thanks.
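
For reference, the refactoring described here can be sketched roughly like this (my sketch, not the actual project code; make_session stands in for aiobotocore.get_session): defer session creation until the first call from a coroutine, so the session binds to the loop that is actually running.

```python
import asyncio

class Cache:
    """S3-backed cache with session creation deferred until we are
    inside a coroutine (sketch; `make_session` stands in for
    aiobotocore.get_session)."""
    def __init__(self, bucket, make_session):
        self.bucket = bucket
        self._make_session = make_session
        self._session = None  # NOT created in the constructor

    def _session_lazy(self):
        # Only called from async methods, i.e. while the right loop runs.
        if self._session is None:
            self._session = self._make_session()
        return self._session

    async def exists(self, key):
        session = self._session_lazy()
        # ... async with session.create_client('s3') as client: ...
        return session
```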

@thehesiod
Collaborator

@manicai I think pytest.mark.asyncio creates and destroys its own event loop, so creating a session outside test_upload will probably not work

@beneshed
Author

@manicai how did you enforce that?

@manicai

manicai commented Feb 25, 2018

I haven't found a good way of enforcing it programmatically.

@andersea

andersea commented Mar 16, 2018

Hi, I had some code using aiohttp websockets that suddenly started raising this exception.

The bug came from a class of mine whose constructor had a (to me at least) somewhat non-obvious error.

Here is an example:

import asyncio
import aiohttp

class Tester:
    def __init__(self, loop=asyncio.get_event_loop()):
        self._loop = loop

    async def connect(self):
        async with aiohttp.ClientSession(loop=self._loop) as session:
            async with session.ws_connect('wss://some.websocket.service/websocket') as ws:
                async for msg in ws:
                    print(msg.data)

async def test():
    tester = Tester()
    await tester.connect()

loop = asyncio.new_event_loop()
loop.run_until_complete(test())

The problem is the loop=asyncio.get_event_loop() default argument, in combination with creating a new loop to run the code.
The asyncio.get_event_loop() call is executed once, when the class body is evaluated, so the constructor's default is forever bound to the first event loop asyncio creates.
When I manually create a new event loop but don't supply it to the constructor, the session ends up being created with the default event loop, which is not the loop actually running the code.
The session needs to be created with the same loop that is actually running. After I made sure of this, the issue went away.
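
The fix is to evaluate get_event_loop() at call time instead of definition time. A minimal sketch of a corrected constructor (mine, for illustration):

```python
import asyncio

class Tester:
    def __init__(self, loop=None):
        # A default like `loop=asyncio.get_event_loop()` is evaluated
        # once, when the `def` statement runs, capturing whatever loop
        # exists at import time. Resolve the loop at call time instead.
        self._loop = loop if loop is not None else asyncio.get_event_loop()
```

With this pattern, a Tester created while a freshly set loop is current picks up that loop rather than the one that existed when the module was imported.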

@beneshed
Author

@andersea I will give it a try!

@thehesiod
Collaborator

In that test case I verified that the s3_cache fixture is executed before the test_upload method's run loop is created by the pytest-asyncio plugin via @pytest.mark.asyncio, so it will not be on the same run loop as the items in test_upload, which won't work. In aiobotocore's own tests a loop fixture is used so everything is explicitly created on the same loop.

Based on this finding, are we OK closing this?
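
One way to keep everything on one loop with pytest-asyncio is to override the plugin's loop fixture at a wider scope, so session-scoped fixtures and the tests share it. This is a sketch under the assumption that the plugin resolves a fixture named event_loop, as its docs describe; Cache and s3_bucket refer to the test case posted above:

```python
import asyncio
import pytest

# pytest-asyncio looks up a fixture named `event_loop`; overriding it
# at session scope means session-scoped fixtures and the tests all
# run on the same loop (assumption based on the plugin's docs).
@pytest.fixture(scope='session')
def event_loop():
    loop = asyncio.new_event_loop()
    yield loop
    loop.close()

@pytest.fixture(scope='session')
def s3_cache(event_loop, s3_bucket):
    # Cache and s3_bucket as defined in the test case above;
    # created while `event_loop` is the one the tests will use.
    return Cache(s3_bucket)
```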

@thehesiod
Collaborator

Going to close for now; feel free to re-open if my explanation doesn't make sense or if you find another issue.
