
ConnectionError: Error 104 while writing to socket. Connection reset by peer.

Environment:
Ubuntu: 16.04
Python: 3.6
PC total memory: 32 GB

I have Redis 3.0.6 installed.

Inserting 500,000 rows succeeds, but inserting roughly 40 million rows fails.

When I try to insert the pandas DataFrame into Redis, it fails because the serialized payload is too large.
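
A quick way to confirm how large the serialized payload actually gets (log_df is the DataFrame from the failing example below; this only measures the problem, it does not fix it):

    payload = log_df.to_json()
    # len() counts characters; for plain-ASCII JSON that equals bytes
    print(len(payload) / 2**20, 'MB')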

Successful insert:

    r = redis.StrictRedis(host='localhost', port=6379, db=0)
    log_df_50.shape
    -> (500000, 6)
    r.setex('log_df_50', 100, log_df_50.to_json())
    -> True

Failed insert:

    r = redis.StrictRedis(host='localhost', port=6379, db=0)
    log_df.shape
    -> (41757802, 6)
    r.setex('session', 100, log_df.to_json())

ConnectionResetError                      Traceback (most recent call last)
~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/connection.py in send_packed_command(self, command, check_health)
    705             for item in command:
--> 706                 sendall(self._sock, item)
    707         except socket.timeout:

~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/_compat.py in sendall(sock, *args, **kwargs)
      8 def sendall(sock, *args, **kwargs):
----> 9     return sock.sendall(*args, **kwargs)
     10

ConnectionResetError: [Errno 104] Connection reset by peer

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input> in <module>
----> 1 r.setex('session', 100, log_df.to_json())

~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/client.py in setex(self, name, time, value)
   1820         if isinstance(time, datetime.timedelta):
   1821             time = int(time.total_seconds())
-> 1822         return self.execute_command('SETEX', name, time, value)
   1823
   1824     def setnx(self, name, value):

~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/client.py in execute_command(self, *args, **options)
    898         conn = self.connection or pool.get_connection(command_name, **options)
    899         try:
--> 900             conn.send_command(*args)
    901             return self.parse_response(conn, command_name, **options)
    902         except (ConnectionError, TimeoutError) as e:

~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/connection.py in send_command(self, *args, **kwargs)
    724         "Pack and send a command to the Redis server"
    725         self.send_packed_command(self.pack_command(*args),
--> 726                                  check_health=kwargs.get('check_health', True))
    727
    728     def can_read(self, timeout=0):

~/anaconda3/envs/Colabo/lib/python3.6/site-packages/redis/connection.py in send_packed_command(self, command, check_health)
    716             errmsg = e.args[1]
    717             raise ConnectionError("Error %s while writing to socket. %s." %
--> 718                                   (errno, errmsg))
    719         except BaseException:
    720             self.disconnect()

ConnectionError: Error 104 while writing to socket. Connection reset by peer.

Any hints on the cause?

How do you insert a large Python DataFrame into Redis?

What should I do to solve this problem?

2 Answers


  1. I had the same issue. Setting ssl=True in

         r = redis.StrictRedis(host='localhost', port=6379, db=0, ssl=True)

     solved it for me.

  2. I think the problem is the 512 MB limit on keys and values in Redis' protocol:
     https://redis.io/topics/protocol
     https://redis.io/topics/data-types-intro

     There is not much you can do about the limit itself.

     You probably have to slice your DataFrame into smaller pieces (see the sketch after this list).

     UPDATE:
     You can also use a newer Redis (5.0 or later) and increase proto-max-bulk-len in redis.conf:
     https://github.com/redis/redis/issues/7354
     I also had to increase client-query-buffer-limit (an example configuration follows the list).

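A minimal sketch of the slicing approach from answer 2 (the helper names, the key scheme, and the rows_per_chunk default are illustrative, not part of redis-py; tune rows_per_chunk so each chunk's JSON stays well under 512 MB):

    import math

    import pandas as pd
    import redis

    r = redis.StrictRedis(host='localhost', port=6379, db=0)

    def setex_df_chunked(r, name, ttl, df, rows_per_chunk=1_000_000):
        """Store df as several JSON values, each below the 512 MB bulk limit."""
        n_chunks = math.ceil(len(df) / rows_per_chunk)
        r.setex('%s:chunks' % name, ttl, n_chunks)  # remember how many pieces exist
        for i in range(n_chunks):
            chunk = df.iloc[i * rows_per_chunk:(i + 1) * rows_per_chunk]
            # one command per chunk, so no single request approaches the limit
            r.setex('%s:%d' % (name, i), ttl, chunk.to_json())

    def get_df_chunked(r, name):
        """Reassemble the DataFrame from its chunk keys."""
        n_chunks = int(r.get('%s:chunks' % name))
        parts = [pd.read_json(r.get('%s:%d' % (name, i)).decode())
                 for i in range(n_chunks)]
        return pd.concat(parts, ignore_index=True)

    setex_df_chunked(r, 'session', 100, log_df)  # log_df as in the question
    restored = get_df_chunked(r, 'session')

One caveat: with a short TTL such as the 100 seconds used in the question, chunk keys can expire while a reader is still reassembling them, so pick a TTL with some headroom.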
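
For the UPDATE in answer 2, the corresponding redis.conf settings look like this (the 4gb values are illustrative; size them to your largest single value, and per that answer you need Redis 5.0 or newer to raise proto-max-bulk-len past 512 MB):

    # redis.conf
    proto-max-bulk-len 4gb
    client-query-buffer-limit 4gb

Raising these trades memory headroom on the server for the convenience of single-key storage; splitting the data keeps the defaults intact.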