The video describes the initial load strategy using Oracle GoldenGate, as well as how to perform a zero-downtime migration and upgrade using Oracle GoldenGate.
It is really a very nice explanation, given with a lot of patience by Ashish. Great session.
Thanks Samarjit!
Great job! I'm about to try a proof of concept upgrading from 10g to 19c using GoldenGate.
Hi Ashish. Thank you for the awesome videos. Nice explanation, good work. Keep it up!
Thanks kasieswar!
Please upload the other videos in the GoldenGate series. You explained it really well. Thanks!
Hi Raghu,
Thanks for your comments. I will upload more videos on the channel. Please like and subscribe to the channel for updates.
Regards,
Ashish Agarwal
Whatsapp@+91-7208729978
Call@+1-4168715708 (Canada)
Email: ashishagarwalag@gmail.com
Hi Ashish. Thank you for the awesome videos. Really easy to understand.
What is the next video that continues after this one? I am not finding it.
Could you please list the following videos so that I won't mess up the continuation?
Thanks in advance.
Thanks Sanjiv. The full GG course is also available. You may email me at ashishagarwalag@gmail.com for details.
Is this setup for the classic architecture, for microservices, or can it be used for both?
Hi Ashish, I have already subscribed. Data compression is possible when we use the pump process with the COMPRESS parameter. Your explanation is simple and easy to understand. You're doing a great job, brother. I am waiting for your series videos; please upload them.
Yes, compression is possible by using the COMPRESS parameter in the data pump parameter file.
Once again, thanks very much for your motivating comments.
If you don't mind, may I ask when you will upload the next videos in the series?
Hi Raghu, I will upload them soon. However, if you have any questions you can drop me a note at ashishagarwalag@gmail.com.
Compression of trail files is not possible; only the data can be compressed while it is being written to the trail file by the extract process.
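For reference, a minimal data pump parameter file using COMPRESS might look like the sketch below (the process name, host, and trail path are made-up examples). COMPRESS is an option of RMTHOST: it compresses the blocks sent over the network, which are decompressed on arrival, so the trail files on disk are not themselves compressed.

```text
-- Data pump parameter file (illustrative sketch; names are examples)
EXTRACT PMPFIN
PASSTHRU
-- COMPRESS compresses outgoing blocks before they are sent to the target;
-- they are decompressed on arrival, so the remote trail is not compressed on disk
RMTHOST targethost, MGRPORT 7809, COMPRESS
RMTTRAIL ./dirdat/rt
TABLE ODSFIN.*;
```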
Hi Ashish. Is it possible to get the load status by mail every day, with the record count and table name?
What initial load method do you recommend for schemas with big tables? The source is Oracle and the target is SQL Server.
You can use SSMS to create the structure and load the data. Another option is to use SSMS to create the structure and then do the initial load using Oracle GoldenGate.
Thanks a lot, Ashish. My source database is an Oracle data warehouse from which I want to replicate only 3 schemas. For example, the Oracle data warehouse is called ASIADEV and the schema I want to replicate is ODSFIN.* (all tables). All these tables already exist in SQL Server, but the naming convention there is different: on SQL Server, ODSFIN is a database, not a schema, and the schema is DBO.* (all tables). How will I map that in the replicat? Will it be
MAP ODSFIN.*, TARGET ODSFIN.ODS.*;
? Thanks a lot
On the source extract I have schema.table names, like this:
MAP ODSFIN.*;
And I even generated the source definitions for all the tables. I am confused about the SQL Server mapping in the replicat.
@@landrybalbin560 The recommendation is to list table names individually instead of using *; however, even if you specify it the way shown above, that is also OK.
I hope the data type conversion etc. has been done successfully.
@@landrybalbin560 You can specify it in the same way as for Oracle. On the source extract side you have to specify TABLE ODSFIN.*, or list individual table names instead of *. You can send me an email at ashishagarwalag@gmail.com or WhatsApp me at +14168715708 and I can help you further.
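Putting this thread's advice together, the parameter files might look roughly like the sketch below. The process names, ODBC DSN, and definitions-file path are hypothetical; for a SQL Server target the database typically comes from the replicat's connection, so the MAP target is given as schema.table.

```text
-- Source (Oracle) extract parameter file (sketch)
EXTRACT EXTFIN
USERIDALIAS ogg_src
EXTTRAIL ./dirdat/lt
TABLE ODSFIN.*;

-- Target (SQL Server) replicat parameter file (sketch)
REPLICAT REPFIN
TARGETDB odsfin_dsn            -- ODBC DSN pointing at the ODSFIN database
SOURCEDEFS ./dirdef/odsfin.def -- definitions file generated on the source
MAP ODSFIN.*, TARGET dbo.*;    -- Oracle schema ODSFIN -> SQL Server dbo schema
```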
Hi Ashish, in this video the initial load extract process is done, but you stopped the replicat process. When you start the replicat, how does the data get inserted?
Hi Mohammed,
Once you restart the replicat, it picks up the transactions that the extract has been capturing. The replicat restarts from the point where it was stopped.
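As an illustration, this checkpoint behaviour can be seen from GGSCI (the replicat name REPFIN here is a made-up example):

```text
GGSCI> INFO REPLICAT REPFIN, DETAIL
-- The checkpoint shows the trail sequence number and RBA last processed
GGSCI> START REPLICAT REPFIN
-- Replicat resumes reading the trail from that checkpoint, so nothing is skipped
```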