
I have an appendonly.aof file that has grown too large (1.5 GB and 29,558,054 lines).

When I try to start Redis, it hangs on "Loading data into memory" for what seems like all day (it still hasn't finished).

Is there anything I can do to optimize this file, as it likely contains many redundant operations (like repeatedly deleting the same record)?

Or is there anything I can do to see progress, so I know whether I'm waiting for nothing (or how long it will take) before I try to restore an older backup?
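
One thing Redis does answer while it is loading is INFO, so the persistence section can be polled for a progress estimate; a minimal sketch, assuming redis-cli can reach the server on the default port:

    # Most commands return -LOADING during startup, but INFO still works
    redis-cli INFO persistence | grep ^loading
    # Useful fields: loading (1 while loading), loading_loaded_perc,
    # and loading_eta_seconds (estimated seconds remaining)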

2 Answers


  1. With Redis 4+ you can use a mixed format to optimize the append-only file by setting aof-use-rdb-preamble to yes.

    With this setting in place, Redis dumps the data in RDB format into the AOF file on every BGREWRITEAOF call, which you can verify by checking that the AOF file's contents start with the REDIS magic string.

    On restart, when the AOF file starts with that REDIS magic string and aof-use-rdb-preamble is enabled, Redis loads the RDB preamble first and then replays the remaining AOF commands, which is much faster than replaying the entire command history. A sketch of the workflow follows.
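
    A minimal sketch of enabling this and triggering a rewrite, assuming redis-cli can reach the server (the file path is illustrative):

        # Enable the RDB preamble at runtime (or set it in redis.conf)
        redis-cli CONFIG SET aof-use-rdb-preamble yes
        # Rewrite the AOF in the background; this compacts duplicate and
        # obsolete operations into a minimal dump of the current dataset
        redis-cli BGREWRITEAOF
        # Verify the rewritten file now starts with the RDB magic string
        head -c 5 /var/lib/redis/appendonly.aof    # prints: REDIS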

  2. You can tune your Redis server's AOF rewrite settings so the file is compacted automatically before it grows this large (see the sketch below).

    And if you are using Docker, be careful about how frequently your container restarts, since every restart has to reload the whole AOF into memory.
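
    A minimal redis.conf sketch of the relevant auto-rewrite settings (these values are the Redis defaults, shown for illustration):

        # Trigger a background rewrite once the AOF doubles in size
        # relative to its size after the last rewrite
        auto-aof-rewrite-percentage 100
        # ...but never rewrite while the file is smaller than this
        auto-aof-rewrite-min-size 64mb
        # Combine with the RDB preamble from the answer above (Redis 4+)
        aof-use-rdb-preamble yes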
