
I want to create an operational data store in MongoDB. The source data is in an Oracle database. The problem is that during migration the data in the source DB may get updated. How can I reconcile the source and destination databases effectively, given that there are millions of records in the source?

I tried using a Spring Batch program to reconcile based on timestamps, but that's not effective.

2 Answers


  1. What you're looking for is CDC (Change Data Capture).
    There are commercial tools like Oracle GoldenGate and SharePlex, which can eavesdrop on changes in the database and send them into a Kafka stream.

    You can pick up these changes from Kafka and store them in Mongo or Elastic.

    There are also about a dozen "free" CDC tools based on Oracle LogMiner; just Google "Oracle CDC Kafka".

    UPDATE: According to this answer, Debezium can also use OpenLogReplicator. This can offer much better throughput and has less impact on the stability and performance of the database.

  2. Possibly MOVEX CDC matches your requirements?
    https://github.com/osp-ottogroup/movex-cdc

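The "pick up the changes from Kafka and store them in Mongo" step from the first answer can be sketched roughly as follows. This is a minimal sketch only: the event envelope (`op`/`key`/`row` fields) is a made-up stand-in for whatever the chosen CDC tool actually emits (Debezium, for example, uses its own richer envelope), and a plain dict stands in for the MongoDB collection so the logic is runnable without a broker or database.

```python
# Minimal sketch of applying CDC events to a target store.
# Assumption: events have already been consumed from a Kafka topic;
# the op/key/row envelope below is hypothetical, not a real tool's format.

def apply_cdc_event(store, event):
    """Apply one change event to `store`, a dict keyed by primary key
    that stands in for a MongoDB collection."""
    op = event["op"]   # "c" = create, "u" = update, "d" = delete
    key = event["key"]
    if op in ("c", "u"):
        # Treat inserts and updates identically (an upsert), so
        # replaying the same event twice is harmless (idempotent).
        store[key] = event["row"]
    elif op == "d":
        # Tolerate deletes of keys we never saw.
        store.pop(key, None)
    return store

# Replay a small stream of events in order.
events = [
    {"op": "c", "key": 1, "row": {"id": 1, "name": "alice"}},
    {"op": "u", "key": 1, "row": {"id": 1, "name": "alicia"}},
    {"op": "c", "key": 2, "row": {"id": 2, "name": "bob"}},
    {"op": "d", "key": 2, "row": None},
]
store = {}
for e in events:
    apply_cdc_event(store, e)
print(store)  # {1: {'id': 1, 'name': 'alicia'}}
```

The upsert-style handling is what makes this approach robust for the original problem: if a source row changes mid-migration, the CDC event simply overwrites the stale copy in the target, so no separate timestamp-based reconciliation pass is needed.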