I have deep learning models (TensorFlow, saved in HDF5 format) that I want to upload to a PostgreSQL database.
A single model may be up to 500 MB, and the models need to be updated and uploaded/downloaded from the database frequently (e.g., once every 30 minutes).

I’m a beginner with PostgreSQL, so any pointers or recipes to get started would be appreciated.

2 Answers


  1. Storing large binary data in a relational DB is possible but unwise. Besides, TensorFlow can only load a model from a file, so you would have to copy it out of the DB onto disk on every download anyway. Why not store your model directly on disk?
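A minimal sketch of the store-on-disk pattern: keep the model file on the filesystem and record only its path (plus a checksum for integrity) in the database. SQLite is used here as a stand-in for PostgreSQL so the example is self-contained; the table name `models` and function names are hypothetical, and the same schema and queries carry over to PostgreSQL (with `%s` placeholders via psycopg2).

```python
import hashlib
import sqlite3
from pathlib import Path


def register_model(conn, name, path):
    """Record a model file's path and SHA-256 checksum instead of its bytes."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    conn.execute(
        "INSERT OR REPLACE INTO models (name, path, sha256) VALUES (?, ?, ?)",
        (name, str(path), digest),
    )
    conn.commit()
    return digest


def model_path(conn, name):
    """Look up where a model lives; the caller then loads it with TensorFlow."""
    row = conn.execute(
        "SELECT path FROM models WHERE name = ?", (name,)
    ).fetchone()
    return row[0] if row else None


# In-memory DB for the sketch; a real setup would connect to PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE models (name TEXT PRIMARY KEY, path TEXT, sha256 TEXT)"
)
```

Updating a model every 30 minutes then becomes an atomic file write plus a one-row `UPDATE`, rather than pushing 500 MB through the database.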

  2. A relational DB is the wrong tool for this. Store a link to the artifact instead, for example in an S3 bucket, or in your own private object store if the model is confidential.
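A sketch of the store-a-link approach: the database row holds only a short, versioned object URI, never the blob itself. The bucket name and key scheme below are hypothetical; the boto3 calls are shown as comments because they need real AWS credentials to run.

```python
BUCKET = "my-model-bucket"  # hypothetical bucket name


def model_key(name: str, version: int) -> str:
    """Build a deterministic, versioned object key for a model artifact."""
    return f"models/{name}/v{version:04d}.h5"


def model_uri(name: str, version: int) -> str:
    """The short URI you store in PostgreSQL instead of the 500 MB blob."""
    return f"s3://{BUCKET}/{model_key(name, version)}"


# Transfer would then go through boto3, e.g.:
#   boto3.client("s3").upload_file("model.h5", BUCKET, model_key("resnet", 3))
#   boto3.client("s3").download_file(BUCKET, model_key("resnet", 3), "model.h5")
```

Versioned keys mean each 30-minute update writes a new object, so a download never observes a half-written model and old versions remain available for rollback.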
