
apscheduler does not release MySQL connection resources when a scheduled job ends? #60

Closed
nswz opened this issue Jun 24, 2019 · 6 comments

Comments

@nswz

nswz commented Jun 24, 2019

No description provided.

@rettier

rettier commented Oct 14, 2019

I have the same issue, but with PostgreSQL. If a job makes a SQL query during its execution, the connection is kept open, and over time the server's connection limit is exceeded.

@shuai93

shuai93 commented Oct 23, 2019

Me too.

@rettier

rettier commented Oct 30, 2019

The problem exists because Django's connection manager creates a new connection per thread. Every time you access a model from a job, a connection is created for that thread and never cleaned up. We have fixed this for now by wrapping our job functions and calling django.db.connections.close_all() at the end, but this should definitely become part of django-apscheduler.
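The wrapping described above can be sketched as a small decorator. This is a minimal sketch, not the actual code from the project; `close_after` and `my_job` are hypothetical names, and the cleanup callable is parameterized so that in a Django project you would pass in `django.db.connections.close_all`:

```python
import functools


def close_after(cleanup):
    """Build a decorator that runs `cleanup()` after the wrapped job finishes,
    even if the job raises, so per-thread DB connections are not leaked."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            finally:
                cleanup()  # e.g. django.db.connections.close_all
        return wrapper
    return decorator


# In a Django project this would look like (assumption, not tested here):
#
# from django.db import connections
#
# @close_after(connections.close_all)
# def my_job():
#     SomeModel.objects.count()
```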

@shuai93

shuai93 commented Nov 6, 2019

Yeah, here is a workaround to try: you can apply a patch to the Django ORM code like this:

import logging

from django.db.backends.base import base as django_db_base

try:
    import MySQLdb as mysqldb
except ImportError:
    mysqldb = None

log = logging.getLogger(__name__)


def mysql_connect_patch() -> None:
    _old_ensure_connection = django_db_base.BaseDatabaseWrapper.ensure_connection
    log.info('loading patch: ensure_connection will ping to confirm the connection is available')

    def ensure_connection_with_retries(self: django_db_base.BaseDatabaseWrapper) -> None:
        # Ping the server before use; with reconnect=True, MySQLdb
        # transparently re-establishes a connection the server has dropped.
        if self.connection and self.autocommit:
            if mysqldb and isinstance(self.connection, mysqldb.connections.Connection):
                self.connection.ping(True)
        _old_ensure_connection(self)

    django_db_base.BaseDatabaseWrapper.ensure_connection = ensure_connection_with_retries


mysql_connect_patch()

Note that this approach may cause extra database overhead, since every connection check is preceded by a ping.

@nswz
Author

nswz commented Nov 14, 2019

You are awesome! I wonder when I will be as good as you.

@jcass77
Owner

jcass77 commented Jul 16, 2020

I think django_apscheduler can probably be better at managing database connections.

Django does have some configuration options for managing persistent connections that could help. This has limited application, however, and at some point (depending on job duration and throughput) the only viable option is to use a dedicated connection pooler like pgbouncer.
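The configuration option referred to here is Django's `CONN_MAX_AGE` database setting, which controls how long a connection is reused before being closed. A minimal sketch of a `settings.py` fragment (the engine and the 60-second value are illustrative assumptions, not a recommendation):

```python
# settings.py (sketch)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        # Reuse each connection for up to 60 seconds instead of opening a
        # fresh one per request/thread. 0 (the default) closes connections
        # at the end of each request; None keeps them open indefinitely.
        "CONN_MAX_AGE": 60,
    }
}
```

Note that `CONN_MAX_AGE` is only checked at request boundaries, which is why it does not fully solve the problem for long-running scheduler threads and a pooler like pgbouncer can still be needed.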
