Processing Slowly Changing Dimensions with ADF Data Flows
- Published 30 Jul 2019
- Do you want to learn how to process slowly changing dimensions with Azure Data Factory?
This session will begin with an overview of Azure Data Factory Data Flows and a review of dimension processing patterns, followed by demos.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Next step on your journey:
👉 On-Demand Learning Courses FREE Trial: www.pragmaticworkstraining.co...
🔗Pragmatic Works On-Demand Learning Packages: pragmaticworks.com/pricing/
🔗Pragmatic Works Boot Camps: pragmaticworks.com/boot-camps/
🔗Pragmatic Works Hackathons: pragmaticworks.com/private-tr...
🔗Pragmatic Works Virtual Mentoring: pragmaticworks.com/virtual-me...
🔗Pragmatic Works Enterprise Private Training: pragmaticworks.com/private-tr...
🔗Pragmatic Works Blog: blog.pragmaticworks.com/
Let's connect:
✔️Twitter: / pragmaticworks
✔️Facebook: / pragmaticworks
✔️Instagram: / pragmatic.works
✔️LinkedIn: / pragmaticworks
✔️YouTube: / pragmaticworks
Pragmatic Works
7175 Hwy 17, Suite 2 Fleming Island, FL 32003
Phone: (904) 413-1911
Email: training@pragmaticworks.com
Thank you, it's neat and briefly explained.
Great intro! Thanks!
It would be good if you can upload scripts you have used in demo. It would be a good reference for audience.
How would I query to see what a dataset looked like at a specific date, if I structured it as Type 2?
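For an as-of-date query against a Type 2 dimension, the idea is to keep the version whose effective date range contains the requested date. A minimal Python sketch (column names like `StartDate`/`EndDate` are hypothetical; in a Type 2 table the current version typically has an open end date):

```python
from datetime import date

# Hypothetical Type 2 dimension rows: each version of a business key
# carries an effective date range; EndDate is None for the current version.
dim_customer = [
    {"CustomerKey": 1, "CustomerId": "C1", "City": "Oslo",
     "StartDate": date(2018, 1, 1), "EndDate": date(2019, 3, 1)},
    {"CustomerKey": 2, "CustomerId": "C1", "City": "Bergen",
     "StartDate": date(2019, 3, 1), "EndDate": None},
]

def as_of(rows, when):
    """Return the version of each row that was active on `when`."""
    return [r for r in rows
            if r["StartDate"] <= when
            and (r["EndDate"] is None or when < r["EndDate"])]

print(as_of(dim_customer, date(2018, 6, 1)))  # the Oslo version
```

In SQL the equivalent is a `WHERE StartDate <= @AsOf AND (@AsOf < EndDate OR EndDate IS NULL)` filter on the dimension table.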
How are you handling the case where the same hash already exists?
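The usual pattern when the incoming row's hash matches the active row's hash is to treat the row as unchanged and drop it; only differing hashes trigger a new version. A sketch of that check in Python (column names and the tracked-column list are illustrative, not from the demo):

```python
import hashlib

def row_hash(row, cols):
    """Concatenate the tracked columns with a separator and hash them."""
    payload = "|".join(str(row[c]) for c in cols)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Hash of the currently active dimension version, keyed by business key.
existing = {"C1": row_hash({"City": "Oslo", "Tier": "Gold"}, ["City", "Tier"])}

incoming = {"CustomerId": "C1", "City": "Oslo", "Tier": "Gold"}
h = row_hash(incoming, ["City", "Tier"])

# Same hash as the active row -> unchanged, skip it entirely;
# different hash -> expire the old version and insert a new one.
if existing.get(incoming["CustomerId"]) == h:
    action = "skip"
else:
    action = "insert-new-version"
```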
It was a very good explanation, but the Insert sink will not insert the newly updated rows with new keys and a new ActiveEndDate. Please double-check that step. Still, thank you very much; it was a very good and helpful demo.
Why do you think it won't work?
Can you please explain how to implement Type 3 and Type 2 SCDs in ADF?
Great explanation. Can you provide the source file you used for the SCD Type 1 demo? I'll try it myself.
youtube.com/watch?v=-cGPnND2880
On insert, for uniqueidentifier columns, how can you create a new GUID?
Assuming you are loading to a SQL Server DB, you could create a default constraint and use the NEWID() function.
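If you'd rather generate the GUID in the pipeline instead of relying on a database default, a version 4 UUID per row serves the same purpose. A small Python sketch (row shape is illustrative):

```python
import uuid

# Assign a surrogate GUID to each incoming row before the sink,
# mirroring what a NEWID() default constraint would do in SQL Server.
rows = [{"CustomerId": "C1"}, {"CustomerId": "C2"}]
for r in rows:
    r["RowGuid"] = str(uuid.uuid4())
```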
Can you share the files used in the demos
16:42 Correct me if I am wrong, but this step will prove a bottleneck when the row count is higher. It could instead have been added at 24:04 as a lookup. We need logic here where records that are found in the lookup are treated as updates and the rest as inserts.
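The split the comment describes can be sketched in Python, assuming a lookup against the set of business keys already present in the dimension (names are hypothetical):

```python
# Business keys already present in the dimension table (the lookup side).
existing_keys = {"C1", "C2"}

incoming = [
    {"CustomerId": "C1", "City": "Oslo"},
    {"CustomerId": "C3", "City": "Bergen"},
]

# Keys found in the lookup become updates; unmatched keys become inserts.
updates = [r for r in incoming if r["CustomerId"] in existing_keys]
inserts = [r for r in incoming if r["CustomerId"] not in existing_keys]
```

In an ADF mapping data flow, the same branching is typically expressed with an Exists (or Lookup) transformation feeding an Alter Row transformation with separate update and insert conditions.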
How do I export and import a data factory, including all the pipelines, data flows, and datasets?
You can go to the resource group where you created your data factory and click Export. It generates JSON; click Download and the ARM template will be saved to your local machine.
Can you make a video showing how you transferred those dataset files?
For this demo, I manually uploaded the files with Azure Storage Explorer.
It's just too slow and requires a lot of patience to watch this video...