
Is there any open-source tool or application available for migrating data from CSV files to MongoDB and Postgres databases?

  1. The tool should be able to take in the data source (in our case, the CSV file), and we should be able to specify the target database (MongoDB and Postgres in our case).
  2. By reading the CSV headers (data source) and the collection’s attributes or table’s columns (target database), we should be able to define a mapping of fields between the data source and the target database.
  3. The tool should be able to import the data from the data source into the target database and produce a report of the import operation: the status of the import, the number of records imported, the number of records that failed, etc.

I couldn’t find any tool which satisfies all of the above conditions.

Please suggest if there are any tools available.

Thanks.

2 Answers


  1. For MongoDB there is a tool available: MongoDB Database Tools: mongoimport

    I don’t think you will find any tool which can do both MongoDB and Postgres databases unless you go for an ETL suite like Pentaho Data Integration. But these tools are rather big and complex and might be overkill.
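    For the MongoDB side, a typical mongoimport invocation looks like this (the URI, database, collection, and file names below are placeholders, not values from the question):

    ```shell
    # Import people.csv into the "people" collection of the "demo" database,
    # using the CSV header row as the field names.
    mongoimport --uri="mongodb://localhost:27017" \
      --db=demo --collection=people \
      --type=csv --headerline \
      --file=people.csv
    ```

    mongoimport prints the number of documents imported and failed when it finishes, which covers the reporting requirement in point 3, though it does not do field mapping beyond `--fields`/`--headerline`.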

  2. I think it would be best to define your own pipeline in a programming language. Taking Python as an example, you can use psycopg2 (for Postgres) or pymongo (for MongoDB). Below is the approach you can follow:

    • Use read_csv to read the data.
    • Use transform_data to clean, transform, and map the data based on the target database.
    • If the target database is MongoDB, use load_to_mongo to insert the data into the specified collection.
    • If the target database is Postgres, use load_to_postgres to insert the data into the specified table.
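    The steps above can be sketched as follows. The function names come from the list, but their signatures, the field-mapping scheme, and the connection details are illustrative assumptions, not a fixed API; the database driver imports are kept inside the load functions so the script still runs when only one driver is installed:

    ```python
    import csv
    import io

    def read_csv(source):
        # Read rows from a CSV file object into a list of dicts keyed by header.
        return list(csv.DictReader(source))

    def transform_data(rows, field_map):
        # Map source column names to target field names; unmapped columns are dropped.
        return [{target: row[src] for src, target in field_map.items()} for row in rows]

    def load_to_mongo(rows, uri, db, collection):
        # Hypothetical loader: requires pymongo and a running MongoDB.
        from pymongo import MongoClient
        client = MongoClient(uri)
        result = client[db][collection].insert_many(rows)
        return len(result.inserted_ids)  # simple "records imported" report

    def load_to_postgres(rows, dsn, table, columns):
        # Hypothetical loader: requires psycopg2 and a running Postgres.
        import psycopg2
        placeholders = ", ".join(["%s"] * len(columns))
        sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.executemany(sql, [[row[c] for c in columns] for row in rows])
        return len(rows)

    # The read/transform steps work without any database:
    source = io.StringIO("name,age\nAda,36\nAlan,41\n")
    rows = transform_data(read_csv(source), {"name": "full_name", "age": "age"})
    # rows == [{"full_name": "Ada", "age": "36"}, {"full_name": "Alan", "age": "41"}]
    ```

    Wrapping the load calls in try/except and counting successes per row would give you the import report asked for in point 3.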

    Also, if you prefer a tool, you can use OpenRefine to transform the data and then load it into the database.
