Oracle GoldenGate Initial Load and Zero-Downtime Migration and Upgrade Strategy

  • Added 5 Sep 2024
  • The video describes the initial load strategy using Oracle GoldenGate, as well as how to perform a zero-downtime migration and upgrade using Oracle GoldenGate.
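
  The initial load the description refers to is typically done with a one-time extract/replicat pair. Below is a minimal sketch of the file-to-replicat variant; this is one common method, not necessarily the exact one demonstrated in the video, and the host, credentials, paths and HR schema are illustrative assumptions.

    -- Initial-load extract parameter file (source), run as a one-time task
    SOURCEISTABLE
    USERID ggadmin, PASSWORD ggadmin
    RMTHOST targethost, MGRPORT 7809
    RMTFILE ./dirdat/ie, MAXFILES 10, MEGABYTES 500
    TABLE HR.*;

    -- Initial-load replicat parameter file (target), one-time run
    SPECIALRUN
    END RUNTIME
    USERID ggadmin, PASSWORD ggadmin
    EXTFILE ./dirdat/ie
    ASSUMETARGETDEFS
    MAP HR.*, TARGET HR.*;

  In this variant both processes are started from the OS shell with the extract and replicat executables (paramfile/reportfile options) rather than as continuous GGSCI groups, while a separate change-capture extract runs in parallel to support the zero-downtime cutover.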

Comments • 26

  • @samarjitpanigrahy659 • 5 years ago +1

    It is really a very nice explanation with a lot of patience by Ashish. Great session.

  • @jamesakins4599 • 2 years ago

    Great job! I'm about to try a proof of concept upgrading from 10g to 19c using GoldenGate.

  • @kasieswary5989 • 6 years ago +1

    Hi Ashish. Thank you for the awesome videos. Nice explanation, good work. Keep it up.

  • @clraghu • 7 years ago +1

    Please upload the other videos in the GoldenGate series. You explained it really well. Thanks.

    • @AshishAgarwal_GG • 7 years ago +1

      Hi Raghu,
      Thanks for your comments. I will upload more videos on the channel. Please like and subscribe to the channel for updates.
      Regards,
      Ashish Agarwal
      Whatsapp@+91-7208729978
      Call@+1-4168715708 (Canada)
      Email: ashishagarwalag@gmail.com

  • @sanjivranjit4292 • 6 years ago +1

    Hi Ashish. Thank you for the awesome videos. Really easy to understand.
    Which video comes next after this one? I am not able to find it.
    Could you please list the following videos so that I don't mess up the continuation?
    Thanks in advance.

    • @AshishAgarwal_GG • 6 years ago

      Thanks Sanjiv. The full GG course is also available. You may email me at ashishagarwalag@gmail.com for details.

  • @kirankumar-np7tc • 4 months ago

    Is this setup for the classic architecture, the microservices architecture, or can it be used for both?

  • @clraghu • 7 years ago +1

    Hi Ashish, I have already subscribed. Data compression is possible when we use the pump process with the COMPRESS parameter. Your explanation is simple and easy to understand. You're doing a great job, brother. I am waiting for your series videos; please upload them.

    • @AshishAgarwal_GG • 7 years ago

      Yes, compression of trail files is possible by using the COMPRESS parameter in the data pump parameter file.
      Once again, thanks very much for your motivating comments.

    • @clraghu • 7 years ago

      If you don't mind, may I ask when you will upload the next videos in the series?

    • @AshishAgarwal_GG • 7 years ago

      Hi Raghu, I will upload them soon. However, if you have any questions you can drop me a note at ashishagarwalag@gmail.com.

    • @hemalshah4683 • 7 years ago

      Compression of trail files is not possible; only the data can be compressed while it is being written to the trail file by the extract process.
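
    On the compression point discussed above, a minimal sketch of a data pump parameter file using the COMPRESS option of RMTHOST follows. COMPRESS compresses the blocks of captured data sent across the network, and they are decompressed on the target before being written to the remote trail, which fits the clarification that the trail file itself is not stored compressed. The group name, host and schema here are illustrative assumptions.

      -- Data pump parameter file (source side); names are illustrative
      EXTRACT PMPODS
      PASSTHRU
      RMTHOST targethost, MGRPORT 7809, COMPRESS
      RMTTRAIL ./dirdat/rt
      TABLE ODSFIN.*;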

  • @MrDilip83 • 2 months ago

    Hi Ashish. Is it possible to get the load status by email every day, with the record count and table name?

  • @landrybalbin560 • 4 years ago +1

    What initial load method do you recommend for schemas with big tables? Source is Oracle, target is SQL Server.

    • @AshishAgarwal_GG • 4 years ago +1

      You can use SSMS to create the structure and load the data. Another option is to use SSMS to create the structure and do the initial load using Oracle GoldenGate.

    • @landrybalbin560 • 4 years ago

      Thanks a lot, Ashish. My source database is an Oracle data warehouse from which I want to replicate only 3 schemas. For example, the Oracle data warehouse is called ASIADEV and the schema I want to replicate is ODSFIN.* (for all the tables). Now all these tables exist in SQL Server, but the naming convention there is different: on SQL Server, ODSFIN is a database, not a schema, and the schema is called DBO.* (for all tables). How will I map that in the replicat? Will it be
      MAP ODSFIN.*, TARGET ODSFIN.ODS.*;
      ? Thanks a lot.

    • @landrybalbin560 • 4 years ago

      On the source extract I have schema.table names, like this:
      MAP ODSFIN.*;
      And I even got the source definitions for all the tables. I am confused by the SQL Server mapping on the replicat.

    • @AshishAgarwal_GG • 4 years ago

      @landrybalbin560 The recommendation is to give table names individually instead of using *; however, even if you give it the way above, that is also OK.
      I hope the datatype conversion etc. has been done successfully.

    • @AshishAgarwal_GG • 4 years ago

      @landrybalbin560 You can give it in the same way as for Oracle. On the source extract side you have to give TABLE ODSFIN.*, or instead of * give individual table names. You can send me an email at ashishagarwalag@gmail.com or WhatsApp me at +14168715708 and I can help you more.
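
    Pulling the mapping advice in this thread together, a hedged sketch of the Oracle-to-SQL Server wildcard mapping might look like the one below. The group names, ODBC DSN, credentials and paths are assumptions; only the ODSFIN source schema and the dbo target schema come from the discussion above.

      -- Source extract parameter file (Oracle); captures all tables in the ODSFIN schema
      EXTRACT EXTODS
      USERID ggsource, PASSWORD ggsource
      EXTTRAIL ./dirdat/et
      TABLE ODSFIN.*;

      -- Target replicat parameter file (SQL Server); the target database is set by the ODBC DSN
      REPLICAT REPODS
      TARGETDB odsfin_dsn
      SOURCEDEFS ./dirdef/odsfin.def
      MAP ODSFIN.*, TARGET dbo.*;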

  • @nswaknwaj • 3 years ago

    Hi Ashish, in this video the initial load extract process is done, but you stopped the replicat process. When you start the replicat, how does the data get inserted?

    • @AshishAgarwal_GG • 3 years ago

      Hi Mohammed,
      Once you restart the replicat, it picks up the transactions that the extract has been capturing. The replicat restarts from the point where it was stopped.
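
      A short GGSCI sketch of the sequence being described, assuming hypothetical group names EXTCHG (change-capture extract) and REPCHG (change replicat), and assuming HANDLECOLLISIONS is set in the replicat parameter file to absorb any overlap between the initial load and the captured changes:

        GGSCI> INFO EXTRACT EXTCHG             -- change-capture extract kept running during the initial load
        GGSCI> START REPLICAT REPCHG           -- replicat resumes from its last checkpoint in the trail
        GGSCI> INFO REPLICAT REPCHG, DETAIL    -- verify the trail file and position it is reading from
        GGSCI> STATS REPLICAT REPCHG           -- confirm inserts/updates/deletes are being applied

      Once the replicat has caught up past the initial-load point, HANDLECOLLISIONS is typically turned off (for example with SEND REPLICAT REPCHG, NOHANDLECOLLISIONS).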