
I’m sure the issue here is not with StackExchange.Redis but something in our Azure App Service config/startup/setup.

We have been using StackExchange.Redis happily for years. We migrated an Azure App Service to .NET 8, and now Redis is very slow and frequently errors with:

Timeout performing EXISTS (5000ms), 
next: EXISTS XXX, 
inst: 10, qu: 0, qs: 0, 
aw: False, bw: SpinningDown, 
rs: ReadAsync, 
ws: Idle, in: 208, 
last-in: 0, cur-in: 0, sync-ops: 20932, 
async-ops: 1, 
serverEndpoint: XXX-XXXXX-REDIS-ECOM-PRD-1-STAGE.redis.cache.windows.net:6380, 
conn-sec: 64302.22, aoc: 0, mc: 1/1/0, 
mgr: 10 of 10 available, 
clientName: DW0MDWK0000LG(SE.Redis-v2.8.0.27420), 
IOCP: (Busy=0,Free=1000,Min=1,Max=1000), 
WORKER: (Busy=37,Free=32730,Min=2,Max=32767), 
POOL: (Threads=37,QueuedItems=64,CompletedItems=315273,Timers=29), 
v: 2.8.0.27420 
Failed Method: StackExchange.Redis.ConnectionMultiplexer.ExecuteSyncImpl

My gut feeling is this has to do with the resources of the Azure App in some way.

I could increase the timeout, but Redis should be returning these light queries in far less than 5000ms. In the Azure portal, the Redis Server Load is under 10%. We hardly use it. Memory usage is 40 MB of 250 MB.

 slowlog get 10
 1) 1) (integer) 260
    2) (integer) 1725447601
    3) (integer) 30987
    4) 1) "EVALSHA"
       2) "3915ee22fda531a1d5661f2523d0443fd35ff0a4"
       3) "1"
       4) "CatalogueDataAccess.GetX"
       5) "638610516012637021"
       6) "-1"
       7) "7200"
       8) A small amount of JSON here

I have tried and failed to interpret the error messages.

Can a Redis wizard please interpret them?

2 Answers


  1. It looks like you’re running into performance issues with StackExchange.Redis after migrating to .NET 8 on Azure App Service. The timeout errors and slow responses are frustrating, but there are a few common culprits and steps you can take to narrow them down.

    Understanding the error message

    Timeout performing EXISTS (5000ms): the EXISTS command timed out after 5 seconds, which means there is a delay somewhere between your application and the Redis server.

    Connection metrics: the connection figures (qu: 0, qs: 0, mgr: 10 of 10 available) look healthy, so a pure connectivity problem is less likely.

    Client version: you’re on StackExchange.Redis v2.8.0.27420; it’s worth checking whether a newer release with relevant fixes is available.

    Possible causes and solutions

    Resource constraints: check whether the App Service plan is short on CPU or memory. If it is under heavy load, consider scaling up to a higher tier.

    Network latency: make sure the Redis instance and the App Service are in the same region; high latency between them can cause timeouts.

    Connection configuration: review and adjust your Redis connection settings (timeouts, etc.); the defaults may need tuning for your workload. See the sketch after this list.

    Redis server performance: monitor the Redis server itself. If it is overloaded or misconfigured, responses will be slow.

    Application changes: since you migrated to .NET 8, check whether anything changed in how the application interacts with Redis.

    Upgrade StackExchange.Redis: a newer version may include improvements that help with performance issues.
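
    For illustration, here is a minimal sketch of how those connection settings can be applied when creating the multiplexer. The endpoint and access key are placeholders, and the values are examples rather than recommendations:

    using StackExchange.Redis;

    var options = new ConfigurationOptions
    {
        // Placeholder endpoint and key for your Azure Cache for Redis instance
        EndPoints = { "your-cache.redis.cache.windows.net:6380" },
        Password = "<access-key>",
        Ssl = true,
        AbortOnConnectFail = false, // keep retrying rather than failing at startup
        ConnectTimeout = 15000,     // ms
        SyncTimeout = 5000,         // ms; raising this only hides the underlying delay
        AsyncTimeout = 5000         // ms
    };

    // Create one ConnectionMultiplexer and share it for the lifetime of the app
    var muxer = await ConnectionMultiplexer.ConnectAsync(options);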

  2. POOL: (Threads=37,QueuedItems=64,CompletedItems=315273,Timers=29),
    

    A low thread count combined with a high QueuedItems value and low CPU usage indicates thread pool exhaustion. The same thing happened to our app after the .NET 8 upgrade.

    StackExchange.Redis still relies on the shared thread pool, and under a high request rate it simply fails first, so you see RedisTimeoutException. However, anything that relies on the thread pool, including ASP.NET itself, will be slow because of the exhaustion as well; the Redis client is just the canary that dies first.
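
    If you want to confirm this from your own logs, a minimal sketch (assuming .NET Core 3.0 or later, where these counters exist) is:

    using System;
    using System.Threading;

    // A high PendingWorkItemCount alongside a low ThreadCount is the same
    // signal as the POOL: line in the Redis timeout message.
    Console.WriteLine($"Threads={ThreadPool.ThreadCount}, " +
                      $"Queued={ThreadPool.PendingWorkItemCount}, " +
                      $"Completed={ThreadPool.CompletedWorkItemCount}");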

    The fix is to set the minimum thread count to 200-300 so that .NET can grow the pool quickly.
    Add the following at application start:

    ThreadPool.SetMinThreads(300, 300);
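
    In a minimal ASP.NET Core Program.cs this would look roughly like the sketch below. The 300/300 values are the ones suggested above and should be tuned for your workload rather than taken as a fixed recommendation:

    using System.Threading;

    // Raise the thread pool floor before anything else runs, so bursts don't
    // have to wait for the default (slow) thread-injection rate.
    ThreadPool.SetMinThreads(workerThreads: 300, completionPortThreads: 300);

    var builder = WebApplication.CreateBuilder(args);
    // ... register services here, including a single shared ConnectionMultiplexer ...
    var app = builder.Build();
    app.Run();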
    

    Ref: Azure Cache for Redis management FAQs / Important details about ThreadPool growth
