We have a Postgres database table into which a realtime application continuously adds 50 rows every second. We need to read every newly added row (one row at a time) using a Python script. Is it possible to read all the rows without missing any?
If possible, please help me with a Python script to read the realtime-updated rows (50 rows/second) from the Postgres database.
I tried the script below:
import psycopg2

print('\nConnecting to the PostgreSQL database...\n')
conn = psycopg2.connect(**params)
# create a cursor
cur = conn.cursor()
while True:
    cur.execute('SELECT * FROM postgrestable ORDER BY TIMESTAMP DESC LIMIT 1')
    result = cur.fetchone()
    print(result)
    # commit your changes in the database
    conn.commit()
Response: Database access is working, but I am only able to get a row every two seconds. Every 2 seconds my database gets 100 rows, but I am able to fetch only 1 row out of those 100.
Is this because of the cur.execute() function? Does this function take time to execute?
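For comparison, a polling loop that remembers the last row it has seen, instead of selecting only the single newest row, would return every row inserted since the previous poll. A minimal sketch, assuming postgrestable has a monotonically increasing integer id as its first column (an assumption not stated in the question):

import time
import psycopg2

conn = psycopg2.connect(**params)
cur = conn.cursor()
last_id = 0  # assumes an auto-incrementing id column

while True:
    # fetch everything inserted since the previous poll, oldest first
    cur.execute('SELECT * FROM postgrestable WHERE id > %s ORDER BY id',
                (last_id,))
    for row in cur.fetchall():
        print(row)
        last_id = row[0]  # assumes id is the first column
    conn.commit()  # avoid holding a long-lived transaction open between polls
    time.sleep(0.5)

Because the query filters on id > last_id rather than taking the latest row, no insert is skipped even if many rows arrive between polls.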
2 Answers
Try this updated code: I replaced orderBy: [asc(document.columns.latestUpdatedAt)] with orderBy: [desc(document.columns.latestUpdatedAt)]. You'll get the latest chat rooms first, because this sorts the result set by the latestUpdatedAt column in descending order.
Hope it works 🙂
You can set your Python client to LISTEN to pg_notify() messages broadcast from a trigger (AFTER INSERT OR UPDATE) on the table collecting your signals. A sample insert will then trigger a message that is relayed to your listening client:
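For instance, a one-off setup script along these lines could install such a trigger; the channel name new_row, an integer id primary key, and Postgres 11+ (for EXECUTE FUNCTION) are all assumptions here:

import psycopg2

conn = psycopg2.connect(**params)
cur = conn.cursor()

# trigger function: broadcast the new row's id on channel "new_row"
cur.execute("""
    CREATE OR REPLACE FUNCTION notify_new_row() RETURNS trigger AS $$
    BEGIN
        PERFORM pg_notify('new_row', NEW.id::text);
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;
""")
# fire it after every insert or update on the table
cur.execute("""
    DROP TRIGGER IF EXISTS postgrestable_notify ON postgrestable;
    CREATE TRIGGER postgrestable_notify
    AFTER INSERT OR UPDATE ON postgrestable
    FOR EACH ROW EXECUTE FUNCTION notify_new_row();
""")
conn.commit()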
To work around the fact that you can send only a text payload via NOTIFY, you can instead relay only the unique identifier of the new row, then let your client simply go pick it up:
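A sketch of the listening client, using psycopg2's notification API in autocommit mode; it assumes the new_row channel and id column from the trigger sketch above:

import select
import psycopg2

conn = psycopg2.connect(**params)
conn.autocommit = True  # notifications are delivered outside transactions
cur = conn.cursor()
cur.execute('LISTEN new_row;')

while True:
    # block until the connection's socket is readable (5 s timeout)
    if select.select([conn], [], [], 5) == ([], [], []):
        continue
    conn.poll()
    while conn.notifies:
        notify = conn.notifies.pop(0)
        # the payload is the new row's id; fetch the full row by it
        cur.execute('SELECT * FROM postgrestable WHERE id = %s',
                    (int(notify.payload),))
        print(cur.fetchone())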
And let psycopg2 handle the parameter binding.