This repository has been archived by the owner on Feb 21, 2023. It is now read-only.

Sentinel client ConnectionError "Too many connections" #1173

Open
1 task done
Tracked by #1225
skhayrulin opened this issue Oct 19, 2021 · 7 comments

@skhayrulin

skhayrulin commented Oct 19, 2021

Describe the bug

Hello,

This problem is very similar to this issue; the only difference is that I'm using Sentinel. As far as I can tell, I cannot use BlockingConnectionPool as the connection_pool_class when calling master_for.
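A minimal sketch of the kind of call I mean (assuming BlockingConnectionPool is importable from aioredis.connection; passing it as connection_pool_class here is exactly the part that does not seem to work):

from aioredis.sentinel import Sentinel
# Assumption: BlockingConnectionPool exists as in redis-py; combining it with
# master_for is the part in question, since master_for normally uses SentinelConnectionPool.
from aioredis.connection import BlockingConnectionPool

sentinel = Sentinel([("sentinel_host", 26379)])
redis = sentinel.master_for(
    service_name="master_name",
    password="redis_pass",
    connection_pool_class=BlockingConnectionPool,
    max_connections=2,
)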

To Reproduce

import asyncio
from aioredis.sentinel import Sentinel

async def run():
    sentinel = Sentinel([("sentinel_host", 26379)])
    redis = sentinel.master_for(
        service_name="master_name",
        password="redis_pass",
        max_connections=2,
    )
    
    task_1 = asyncio.create_task(redis.ping())
    task_2 = asyncio.create_task(redis.ping())
    task_3 = asyncio.create_task(redis.ping())

    await asyncio.gather(*[task_1, task_2, task_3])

asyncio.run(run())

Expected behavior

The 3 ping tasks should share the 2 connections in the pool and all succeed.

Logs/tracebacks

Traceback (most recent call last):
  File "redis_conn.py", line 19, in <module>
    asyncio.run(run())
  File "/usr/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "redis_conn.py", line 17, in run
    await asyncio.gather([task_1, task_2, task_3])
  File "/usr/lib/python3.8/asyncio/tasks.py", line 823, in gather
    if arg not in arg_to_fut:
TypeError: unhashable type: 'list'
Task exception was never retrieved
future: <Task finished name='Task-4' coro=<Redis.execute_command() done, defined at ~/.venv/lib/python3.8/site-packages/aioredis/client.py:1056> exception=ConnectionError('Too many connections')>
Traceback (most recent call last):
  File "~/.venv/lib/python3.8/site-packages/aioredis/connection.py", line 1352, in get_connection
    connection = self._available_connections.pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "~/.venv/lib/python3.8/site-packages/aioredis/client.py", line 1061, in execute_command
    conn = self.connection or await pool.get_connection(command_name, **options)
  File "~/.venv/lib/python3.8/site-packages/aioredis/connection.py", line 1354, in get_connection
    connection = self.make_connection()
  File "~/.venv/lib/python3.8/site-packages/aioredis/connection.py", line 1392, in make_connection
    raise ConnectionError("Too many connections")
aioredis.exceptions.ConnectionError: Too many connections

Python Version

Python 3.8.10

aioredis Version

Name: aioredis
Version: 2.0.0
Summary: asyncio (PEP 3156) Redis support
Home-page: https://github.com/aio-libs/aioredis
Author: None
Author-email: None
License: MIT
Location: ~/.venv/lib/python3.8/site-packages
Requires: async-timeout, typing-extensions
Required-by:

Additional context

I didn't see this behavior in version 1.3.1.

Code of Conduct

  • I agree to follow the aio-libs Code of Conduct
@skhayrulin skhayrulin added the bug label Oct 19, 2021
@Andrew-Chen-Wang
Collaborator

@skhayrulin this might be your problem?

File "redis_conn.py", line 17, in run
    await asyncio.gather([task_1, task_2, task_3])

Instead, pass in task_1, task_2, task_3 or do *[...]
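
For example, either of these works:

await asyncio.gather(task_1, task_2, task_3)
# or, equivalently, unpack the list
await asyncio.gather(*[task_1, task_2, task_3])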

@skhayrulin
Author

@Andrew-Chen-Wang hi,

Thanks for replying. I've fixed that, but the problem is still the same:

Traceback (most recent call last):
  File ".venv/lib/python3.8/site-packages/aioredis/connection.py", line 1352, in get_connection
    connection = self._available_connections.pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "redis_conn.py", line 32, in <module>
    asyncio.run(run())
  File "/usr/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "redis_conn.py", line 18, in run
    await asyncio.gather(*[task_1, task_2, task_3])
  File ".venv/lib/python3.8/site-packages/aioredis/client.py", line 1061, in execute_command
    conn = self.connection or await pool.get_connection(command_name, **options)
  File ".venv/lib/python3.8/site-packages/aioredis/connection.py", line 1354, in get_connection
    connection = self.make_connection()
  File ".venv/lib/python3.8/site-packages/aioredis/connection.py", line 1392, in make_connection
    raise ConnectionError("Too many connections")
aioredis.exceptions.ConnectionError: Too many connections

@abrookins
Contributor

Hmm, this is interesting! How are you running Redis in this example? Do you have Sentinel set up locally, with or without Docker, or are you connecting to a remote Redis?

@skhayrulin
Author

Hello,
Yep, I have Sentinel set up locally with Docker. My docker-compose file is below:

version: '3.3'

networks:
  app-tier:
    driver: bridge

services:
  redis:
    image: 'bitnami/redis:latest'
    ports:
      - 6379:6379
    environment:
      - REDIS_PASSWORD=test
    networks:
      - app-tier

  redis-sentinel:
    image: 'bitnami/redis-sentinel:latest'
    environment:
      - REDIS_MASTER_HOST=localhostIp # IP of the local machine, so Redis is reachable from outside the Docker network
      - REDIS_MASTER_PASSWORD=test
    ports:
      - '26379:26379'
    networks:
      - app-tier

@abrookins
Contributor

Hi @skhayrulin, excellent, this should give us enough to reproduce. I’ll try to do that soon and update the ticket with my findings.

@mrcatspoon

Also got the same error when using ConnectionPool. We had to roll back to 1.3.1. Will it be fixed in 2.0.2?

@mbrancato

The aioredis ConnectionPool does not really work the way other pools do. Other pools maintain a set of long-lived connections and hand them out to clients as needed. Some pools grow and shrink, normally up to a reasonable maximum size. When a client wants a connection, it waits for the pool to provide one.

The pool in aioredis does not work this way. First, by default, max_connections is None (unlimited). And when a connection limit is set, the pool just throws an exception once the limit is hit, instead of making the client wait for a connection (we are async, so waiting is the common pattern). For example, the following code is impossible to run:

import asyncio
import uuid
import aioredis

async def main():
    redis_client = aioredis.from_url("redis://127.0.0.1:6379/0", max_connections=2)
    tasks = []
    for _ in range(20):
        tasks.append(asyncio.create_task(redis_client.set(str(uuid.uuid4()), 1, ex=20)))
    # fails with ConnectionError("Too many connections") once the pool is exhausted
    await asyncio.gather(*tasks)

asyncio.run(main())

The exceptions could be caught in a retry loop, but you never know whether the connection really is broken, and this also introduces a busy-wait pattern.
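
For illustration, such a retry wrapper might look like the sketch below (set_with_retry and the retry/delay numbers are made up for this example, not part of aioredis); it busy-waits and cannot distinguish an exhausted pool from a genuinely dead connection:

import asyncio
import aioredis

async def set_with_retry(client, key, value, retries=50, delay=0.05):
    # Hypothetical workaround: retry while the pool keeps rejecting us (busy-wait).
    last_exc = None
    for _ in range(retries):
        try:
            return await client.set(key, value, ex=20)
        except aioredis.exceptions.ConnectionError as exc:
            # Could be an exhausted pool or a genuinely broken connection; we can't tell here.
            last_exc = exc
            await asyncio.sleep(delay)
    raise last_exc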

Because the pool is very willing to open more connections and does not throttle client requests, the release() calls for all connections in a large set of tasks happen right at the end, which makes it prone to the "Too many connections" issue.

You'll need to use something like asyncio-connection-pool as a generic wrapper to avoid these problems.
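
Alternatively, a plain asyncio.Semaphore can act as a crude throttle (a generic sketch, not the asyncio-connection-pool API): cap the number of in-flight commands at the pool size so the limit is never exceeded.

import asyncio
import uuid
import aioredis

async def main():
    redis_client = aioredis.from_url("redis://127.0.0.1:6379/0", max_connections=2)
    sem = asyncio.Semaphore(2)  # never run more commands concurrently than the pool allows

    async def throttled_set():
        async with sem:
            await redis_client.set(str(uuid.uuid4()), 1, ex=20)

    await asyncio.gather(*(throttled_set() for _ in range(20)))

asyncio.run(main())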
