I have set up a decorator to provide my db connection using psycopg3, and it sometimes passes a connection that is already closed, throwing the following error when I try to use it:

> the connection is closed
Here is my implementation:
```python
import psycopg_pool
from functools import wraps
from psycopg.rows import dict_row

master_connection_pool = psycopg_pool.ConnectionPool(
    conninfo=masterDbConnectionString,
    min_size=5,
    open=True,
)

def provide_db(func):
    """
    Function decorator that provides a db connection if one isn't provided.
    """
    @wraps(func)
    def wrapper(*args, **kwargs):
        arg_session = 'db'
        func_params = func.__code__.co_varnames
        session_in_args = (arg_session in func_params
                           and func_params.index(arg_session) < len(args))
        session_in_kwargs = arg_session in kwargs
        if session_in_kwargs and kwargs[arg_session] is not None:
            return func(*args, **kwargs)
        if session_in_args and args[func_params.index(arg_session)] is not None:
            return func(*args, **kwargs)
        else:
            with master_connection_pool.connection() as conn:
                conn.row_factory = dict_row
                kwargs[arg_session] = conn
                return func(*args, **kwargs)
    return wrapper
```
Then in my app I can do this:
```python
@provide_db
def do_Stuff(user_id, db):
    db.query(f"SELECT * FROM users WHERE id={user_id}")
    ...
```
What is the proper way to reconnect, or to prevent this from happening in the first place? This is a web app, so it's a long-running process.
2 Answers
You can address this in a web application with a connection pool manager that checks that connections are still valid before handing them out, and reopens them as needed. Ensuring that connections are validated before use and correctly returned to the pool, even in the case of exceptions, should eliminate "the connection is closed" errors.
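A minimal, runnable sketch of that pattern, using a stand-in connection class so the idea can be shown without a database (`FakeConnection` and `CheckingPool` are illustrative names, not psycopg APIs). With psycopg_pool >= 3.2, the same validation is available built in via the pool's `check` hook, e.g. `ConnectionPool(conninfo, check=ConnectionPool.check_connection)`:

```python
class FakeConnection:
    """Stand-in for a real psycopg connection in this sketch."""
    def __init__(self):
        self.closed = False

class CheckingPool:
    """Tiny pool that validates connections before handing them out."""
    def __init__(self, factory, size=2):
        self._factory = factory
        self._idle = [factory() for _ in range(size)]

    def getconn(self):
        while self._idle:
            conn = self._idle.pop()
            if not conn.closed:     # validity check before reuse
                return conn
            # stale connection: discard it and try the next one
        return self._factory()      # none left: open a fresh connection

    def putconn(self, conn):
        # only healthy connections go back into the pool
        if not conn.closed:
            self._idle.append(conn)

pool = CheckingPool(FakeConnection)
c1 = pool.getconn()
c1.closed = True      # simulate the server dropping the connection
pool.putconn(c1)      # a closed connection is not returned to the pool
c2 = pool.getconn()
assert not c2.closed  # the next caller always gets a usable connection
```

The essential point is that validity is checked at checkout time, so a connection dropped by the server while idling in the pool is never handed to application code.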
From the documentation I see the connection is not closed; instead, it's put back in the pool so it can be reused. The pool parameters max_idle and max_lifetime can be used to control this behavior; you can read more about them in the psycopg_pool documentation. I hope you will be able to fix it if you play around with these.