Processing Slowly Changing Dimensions with ADF Data Flows

  • Published: July 30, 2019
  • Do you want to learn how to process slowly changing dimensions with Azure Data Factory?
    This session will begin with an overview of Azure Data Factory Data Flows and a review of dimension processing patterns, followed by demos.
    Next step on your journey:
    👉 On-Demand Learning Courses FREE Trial: www.pragmaticworkstraining.co...
    🔗Pragmatic Works On-Demand Learning Packages: pragmaticworks.com/pricing/
    🔗Pragmatic Works Boot Camps: pragmaticworks.com/boot-camps/
    🔗Pragmatic Works Hackathons: pragmaticworks.com/private-tr...
    🔗Pragmatic Works Virtual Mentoring: pragmaticworks.com/virtual-me...
    🔗Pragmatic Works Enterprise Private Training: pragmaticworks.com/private-tr...
    🔗Pragmatic Works Blog: blog.pragmaticworks.com/
    Let's connect:
    ✔️Twitter: / pragmaticworks
    ✔️Facebook: / pragmaticworks
    ✔️Instagram: / pragmatic.works
    ✔️LinkedIn: / pragmaticworks
    ✔️YouTube: / pragmaticworks
    Pragmatic Works
    7175 Hwy 17, Suite 2 Fleming Island, FL 32003
    Phone: (904) 413-1911
    Email: training@pragmaticworks.com

Comments • 19

  • @abelfelleke7290
    5 years ago +2

    Thank you, it was neatly and concisely explained.

  • @terryliu3635
    4 years ago

    Great intro! Thanks!

  • @vishal259
    4 years ago +4

    It would be good if you could upload the scripts you used in the demo; they would be a good reference for the audience.

  • @theownmages
    4 years ago

    How would I query to see what the dataset looked like at a specific date, if I structured it like this as Type 2?

  • @Sarabjitsethi
    3 years ago +1

    How are you handling the case where the same hash already exists?

  • @moesmael
    3 years ago +1

    It was a very good explanation, but the Insert sink will not insert the newly updated rows with new keys and a new ActiveEndDate. Please double-check that part of the process. Still, I would like to thank you very much; it was a very good and helpful demo.

  • @shivanityagi1073
    a year ago

    Can you please explain how to implement Type 3 and Type 2 SCDs in ADF?

  • @srinin6252
    3 years ago +1

    Great explanation! Can you provide the source file you used for the current SCD Type 1? I will try it myself.

  • @YTEEsenior
    5 years ago +1

    On insert, for uniqueidentifier columns, how can you create a new GUID?

    • @bobrubocki5305
      4 years ago +1

      Assuming you are loading to a SQL Server DB, you could create a default constraint and use the NEWID() function.
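      As a minimal T-SQL sketch of that suggestion (the table and column names below are hypothetical, not from the demo):

```sql
-- Hypothetical dimension table: give the surrogate GUID column a default
-- constraint so each inserted row gets a fresh GUID from NEWID()
-- without ADF having to supply one.
ALTER TABLE dbo.DimCustomer
    ADD CONSTRAINT DF_DimCustomer_RowGuid
    DEFAULT (NEWID()) FOR RowGuid;
```

      With the default in place, the Data Flow sink can simply omit the GUID column from its mapping.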

  • @jagadishparale8632
    3 years ago

    Can you share the files used in the demos?

  • @reckitt9173
    11 months ago

    16:42 Correct me if I am wrong, but this step will prove to be a bottleneck when the row count is higher. Instead, it could have been added at 24:04 as a lookup. We need logic here so that rows with a match are treated as updates and the rest as inserts.

  • @kal1leo
    4 years ago

    How do I export and import a data factory, including all of its pipelines, data flows, and datasets?

    • @rameshchowdary6390
      4 years ago

      You can go to the resource group where you created your data factory and click Export template. It generates the JSON; click Download and the ARM template will be saved to your local machine.

  • @abelfelleke7290
    5 years ago

    Can you make a video showing how you transferred those dataset files?

    • @bobrubocki5305
      4 years ago

      For this demo, I manually uploaded the files with Azure Storage Explorer.

  • @amitnavgire8113
    4 years ago

    It's just too slow and requires a lot of patience to watch this video...