apscheduler scheduled jobs do not release MySQL memory resources when they finish? #60
Comments
I have the same issue, but with PostgreSQL. If a job makes a SQL query during its execution, a connection is kept open, and the number of open connections slowly grows until it exceeds the connection limit.
Me too.
The problem exists because Django's connection manager creates a new connection per thread. Every time you access a model from a job, a connection is created for that thread and never cleaned up. We have fixed this for now by wrapping our job functions and calling
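The exact call is cut off in this copy of the thread; a minimal sketch of such a wrapper, assuming it relies on Django's `close_old_connections()` helper (the decorator name is just illustrative), could look like this:

```python
# Sketch only: wrap each scheduled job so unusable or expired database
# connections are cleaned up before and after it runs. close_old_connections()
# is a real Django helper; close_db_connections is an illustrative name.
from functools import wraps

from django.db import close_old_connections


def close_db_connections(job_func):
    @wraps(job_func)
    def wrapper(*args, **kwargs):
        close_old_connections()  # drop stale connections left by earlier runs
        try:
            return job_func(*args, **kwargs)
        finally:
            close_old_connections()  # release this thread's connection if expired
    return wrapper


@close_db_connections
def my_job():
    # ... query your Django models here ...
    pass
```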
Yeah, here is a way you can try: put a patch in the Django ORM code, like this.
This approach may cause extra database overhead.
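The patch itself was not preserved in this copy of the thread; a sketch in the same spirit, assuming the idea is to force-close the current thread's connection whenever a job finishes (which would explain the extra reconnect overhead mentioned above), might be:

```python
# Sketch only (the original patch is not preserved above): explicitly close
# the current thread's database connection at the end of each job, trading
# extra reconnect overhead for not leaving idle connections behind.
from django.db import connection


def my_job():
    try:
        pass  # ... run your ORM queries here ...
    finally:
        connection.close()  # release the connection instead of leaving it idle
```

Because every run has to reopen a connection, this trades the leak for the overhead the comment warns about.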
You're awesome! When will I ever be as good as you?
I think django_apscheduler can probably be better at managing database connections. Django does have some configuration options for managing persistent connections that could help, but these have limited applicability, and at some point (depending on job duration and throughput) the only viable option is to use a dedicated connection pooler like pgbouncer.
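For reference, the persistent-connection option referred to above is Django's `CONN_MAX_AGE` database setting; a minimal sketch (database name, user, and password are placeholders):

```python
# settings.py (illustrative values): keep connections open for up to 60 seconds
# so threads reuse them instead of opening a fresh connection every time.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",          # placeholder
        "USER": "myuser",        # placeholder
        "PASSWORD": "secret",    # placeholder
        "HOST": "127.0.0.1",
        "PORT": "5432",
        "CONN_MAX_AGE": 60,      # seconds; 0 closes per request, None keeps connections indefinitely
    }
}
```

Persistent connections only help if stale ones are still closed somewhere (for example via `close_old_connections()` as shown earlier); beyond that, a pooler such as pgbouncer is what actually caps the number of server-side connections.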