
I’m using FastAPI with Redis. My app looks something like this:

from fastapi import FastAPI
import redis

# Instantiate redis client
r = redis.Redis(host='localhost', port=6379, db=0, decode_responses=True)

# Instantiate fastapi app
app = FastAPI()

@app.get("/foo/")
async def foo():
    x = r.get("foo")
    return {"message": x}

@app.get("/bar/")
async def bar():
    x = r.get("bar")
    return {"message": x}

Is it bad practice to create r as a module-scoped variable like this? If so what are the drawbacks?

In Tiangolo’s tutorial on setting up a SQL database connection he uses a dependency, which I guess in my case would look something like this:

from fastapi import Depends, FastAPI
import redis

# Instantiate fastapi app
app = FastAPI()

# Dependency
def get_redis():
    return redis.Redis(host='localhost', port=6379, db=0, decode_responses=True)

@app.get("/foo/")
async def foo(r = Depends(get_redis)):
    x = r.get("foo")
    return {"message": x}

@app.get("/bar/")
async def bar(r = Depends(get_redis)):
    x = r.get("bar")
    return {"message": x}

I’m a bit confused as to which of these methods (or something else) would be preferred and why.

3 Answers


  1. In your second example, every request creates a new Redis instance, and at some point you will hit the maximum connection limit. Structuring the code like this is cleaner and more reusable:

    from fastapi import FastAPI
    import redis
    
    class AppAPI(FastAPI):
        def __init__(self):
            super().__init__()
            self.redis_client = redis.Redis(host='localhost', port=6379, db=0, decode_responses=True)
            # register the route on the instance; there is no
            # module-level `app` to use as a decorator here
            self.add_api_route("/foo/", self.foo)
    
        async def foo(self):
            x = self.redis_client.get("foo")
            return {"message": x}
    
    app = AppAPI()
    
  2. Depends is evaluated every time your function receives a request, so your second example will create a new connection for each request. As @JarroVGIT said, we can use connection pooling to maintain the connection from FastAPI to Redis and reduce the cost of opening and closing connections.

    Usually, I create a different file to define the connection. Let’s say we have config/db.py:

    import redis
    
    def create_redis():
      return redis.ConnectionPool(
        host='localhost', 
        port=6379, 
        db=0, 
        decode_responses=True
      )
    
    pool = create_redis()
    

    Then in the main.py

    from fastapi import Depends, FastAPI
    import redis
    
    from config.db import pool
    
    app = FastAPI()
    
    def get_redis():
      # Here, we re-use our connection pool
      # not creating a new one
      return redis.Redis(connection_pool=pool)
    
    @app.get("/items/{item_id}")
    def read_item(item_id: int, cache = Depends(get_redis)):
      status = cache.get(item_id)
      return {"item_name": status}
    
    
    @app.put("/items/{item_id}")
    def update_item(item_id: int, cache = Depends(get_redis)):
      cache.set(item_id, "available")
      return {"status": "available", "item_id": item_id}
    

    Usually I also split the dependencies into their own file, as the docs do, so they can be imported from the routing modules; but for simplicity I will leave it like this.

    You can check this repo to experiment yourself. It has more comprehensive code, and I have already created several scenarios that might help you understand the difference. It also covers how your first example may block other endpoints.

  3. See https://github.com/redis/redis-py#connection-pools. You can define the pool at module level and import it wherever it is needed; all Redis connections will then be created out of the pool.

    pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
    r = redis.Redis(connection_pool=pool)
    