
I have some web-scraped data. I'm using Redis as my data store, fetching the data from Node.js, and displaying it in React.

This is what I have:

2 Redis keys, each ~200 KB (so far; this will grow to ~10-20 MB in the future)

I fetch data from Redis by getting both keys, parsing the data, and then sending it. Just to note, I'm storing the data as one big JSON string per key.

Express route:

  try {
    // Fetch both values from Redis
    const data1 = await GET_ASYNC('data1');
    const data2 = await GET_ASYNC('data2');

    // Parse the JSON strings into arrays
    const parsed1 = JSON.parse(data1);
    const parsed2 = JSON.parse(data2);

    // Merge the parsed arrays, not the raw strings
    const result = [...parsed2, ...parsed1];

    res.send(result);

  } catch (err) {
    // Don't swallow errors silently
    console.error(err);
    res.status(500).send('Something went wrong');
  }

The problem is that Node.js takes ~2000 ms to fetch and send the data; for my situation I need to lower this to 500 ms or less. How could I do this? Is my problem in Node.js, or in how I'm storing the data in Redis? Thanks.
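For reference, here is a minimal sketch of the same route with both GETs issued concurrently via Promise.all instead of one after the other, which cuts one Redis round trip from the critical path. It assumes GET_ASYNC is the promisified node-redis v3 GET from the snippet above; the route path is illustrative:

    const express = require('express');
    const redis = require('redis');
    const { promisify } = require('util');

    const client = redis.createClient();                 // node-redis v3-style client
    const GET_ASYNC = promisify(client.get).bind(client);

    const app = express();

    // Illustrative route name; adjust to your app
    app.get('/data', async (req, res) => {
      try {
        // Issue both GETs concurrently instead of awaiting them sequentially
        const [data1, data2] = await Promise.all([
          GET_ASYNC('data1'),
          GET_ASYNC('data2'),
        ]);

        const result = [...JSON.parse(data2), ...JSON.parse(data1)];
        res.send(result);
      } catch (err) {
        console.error(err);
        res.status(500).send('Something went wrong');
      }
    });

    app.listen(3000);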

2 Answers


  1. My suggestion would be to use many smaller keys instead of just 2 large keys (see the sketch after this list). This has a few advantages:

    1. You save network time by not transferring such huge values in a single round trip.
    2. You save processing time on the Node.js side by not having to parse one big JSON structure.
    3. Smaller structures are also easier to maintain: if you need to delete data, you can just put an expiry on each key, whereas with one big structure you have to handle deletion manually.
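    A minimal sketch of that idea, assuming the same promisified node-redis v3 client as in the question; the data1:<i> key scheme and the helper names are illustrative:

        const { promisify } = require('util');
        const SET_ASYNC = promisify(client.set).bind(client);
        const MGET_ASYNC = promisify(client.mget).bind(client);

        // Write: store each scraped item under its own key, with a TTL so
        // stale entries expire on their own instead of being deleted manually.
        async function saveItems(items) {
          await Promise.all(
            items.map((item, i) =>
              SET_ASYNC(`data1:${i}`, JSON.stringify(item), 'EX', 3600)
            )
          );
        }

        // Read: fetch only the range of keys a given request actually needs.
        async function loadItems(from, to) {
          const keys = [];
          for (let i = from; i < to; i++) keys.push(`data1:${i}`);
          const values = await MGET_ASYNC(keys);              // one round trip
          return values.filter(Boolean).map((v) => JSON.parse(v)); // skip expired keys
        }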
  2. My suggestions:

    1. Compress the data before SET and decompress after GET; the CPU cost of compression is worth the network savings (see the sketch after this list).
    2. Operations on big keys block Redis, which is single-threaded; try splitting the data across smaller keys, or even across separate Redis instances.
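    A minimal sketch of the compress-before-SET idea, assuming a node-redis v3 client created with return_buffers so that GET hands back a Buffer; the helper names are illustrative:

        const redis = require('redis');
        const zlib = require('zlib');
        const { promisify } = require('util');

        const client = redis.createClient({ return_buffers: true });
        const SET_ASYNC = promisify(client.set).bind(client);
        const GET_ASYNC = promisify(client.get).bind(client);

        async function saveCompressed(key, obj) {
          // gzip shrinks repetitive scraped JSON considerably
          await SET_ASYNC(key, zlib.gzipSync(JSON.stringify(obj)));
        }

        async function loadCompressed(key) {
          const buf = await GET_ASYNC(key);                  // Buffer, not string
          if (!buf) return null;
          return JSON.parse(zlib.gunzipSync(buf).toString());
        }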