  • Real Time Data Sync from ORACLE OLTP DB to Kafka on Cloud

Question and Answer

Connor McDonald

Thanks for the question, Ankit.

Asked: July 05, 2019 - 12:14 pm UTC

Last updated: July 25, 2019 - 2:46 am UTC

Version: 12.3

Viewed 1000+ times

You Asked

Hello Tom,


I have a business requirement in my project to replicate data in real time from production OLTP Oracle DBs, which are very heavily transaction-active, to Kafka, ultimately feeding a target in the cloud for data analytics and machine learning. One of the options explored is GoldenGate. The requirement is to send all the columns of each table regardless of which columns were updated. In my opinion, enabling full supplemental logging (trandata) at the DB or table level will serve the purpose, but we don't want any potential performance impact on the source Oracle OLTP DB, as it may affect other systems. Are there any other best practices or recommended solutions along the same lines, or with any other tool, that can help meet my business requirement?
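
For context, the sort of thing I have in mind is below (just a sketch; the table name is a placeholder):

  -- Database-wide: log all columns of a row whenever it is changed
  ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

  -- Or per table, to limit the scope
  ALTER TABLE app_schema.orders ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;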

Appreciate some quick responses on this.



Thanks & Regards,
Ankit Gadhiya

and Connor said...

I think GoldenGate is pretty much your only option here. There used to be a third-party product called Dbvisit, but it's been discontinued. Similarly, other options inside Oracle (Streams, Change Data Capture) are also now deprecated in favour of GoldenGate.

As with anything that changes the volume of I/O on your system, I'd strongly encourage you to test the impact of supplemental logging on a non-production system first.

For systems that have ample I/O bandwidth, supplemental logging is barely a blip on the radar. But for a system whose I/O is already very close to maximum throughput, you don't want to add anything that might push you over the edge of that cliff.
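
One simple way to gauge it on a test system (a sketch, not a full benchmark) is to snapshot redo generation before and after enabling supplemental logging, running the same representative workload each time:

  -- Run an identical workload with and without supplemental logging;
  -- the difference in this counter approximates the extra redo volume
  SELECT name, value
  FROM   v$sysstat
  WHERE  name = 'redo size';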

The other thing I'd strongly recommend is a thorough investigation of the requirement: "send all the columns of each table regardless of any updates". Often that is the opening position of a business because it's the easiest (ie, no one wants to undertake the effort of assessing what is definitely required, so we just go with "take everything"). Which is fine if you go that way, but understand that the database is ultimately paying that cost in effort instead of you.
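
And if that investigation shows only a subset of columns is truly needed downstream, the supplemental logging can be scoped to match (a sketch; the table and column names are hypothetical):

  -- Log only the columns the feed actually needs, on every change
  ALTER TABLE app_schema.orders
    ADD SUPPLEMENTAL LOG GROUP orders_feed (order_id, status, amount) ALWAYS;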


Comments

Thank You.

Ankit Gadhiya, July 08, 2019 - 5:57 pm UTC

Hi Connor,

Thank You so much for your detailed response.
Is there a way to log all columns only in the case of updates? The reason we need this is that the target is Kafka topics, and the application architect is unable to determine other ways of collecting the data in the data lake from Kafka. Any alternative or better solution would be highly appreciated.

Thanks,
Ankit Gadhiya

Connor McDonald
July 25, 2019 - 2:46 am UTC

I think GoldenGate is your best option here.
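
Note that full supplemental logging only adds the extra column data to the redo when a row is actually changed, so "all columns, but only on updates" is effectively what you already get. With GoldenGate that is typically enabled via trandata (the schema and table names below are placeholders):

  GGSCI> DBLOGIN USERIDALIAS ggadmin
  GGSCI> ADD TRANDATA app_schema.orders ALLCOLS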

dbvisit

A reader, July 11, 2019 - 9:11 am UTC

... There used to be a third-party product called Dbvisit, but it's been discontinued ...


Errm?
Where did you read that?
https://replicate-connector-for-kafka.readthedocs.io/en/latest/
Connor McDonald
July 25, 2019 - 2:22 am UTC

Sorry, I should have been more precise:

https://dbvisit.com/dbvisit-replicate
