Delta Tables
- added 27 Mar 2023
- Have you heard of Delta Lake? What's it all about? In this second video in our new series on Delta Lake, Austin covers the basics: what a Delta table is, how to store data in the data lake in Delta format, how the transaction log works, and how to query a Delta table with serverless SQL on-demand in Azure Synapse Analytics.
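The transaction log mentioned above is the heart of a Delta table: a `_delta_log` folder of numbered JSON commit files whose "add" and "remove" actions determine which Parquet data files are currently live. Here is a minimal toy model in plain Python, an illustration only, not the Delta Lake implementation; the file names and action shapes are simplified:

```python
import json
import os
import tempfile

def write_commit(table_path, version, actions):
    """Append one commit to the _delta_log folder.
    Delta names commit files as zero-padded 20-digit version numbers."""
    log_dir = os.path.join(table_path, "_delta_log")
    os.makedirs(log_dir, exist_ok=True)
    commit_file = os.path.join(log_dir, f"{version:020d}.json")
    with open(commit_file, "w") as f:
        for action in actions:
            f.write(json.dumps(action) + "\n")

def live_files(table_path):
    """Replay commits in version order: 'add' actions register data
    files, 'remove' actions tombstone them. What survives is the
    current state of the table."""
    log_dir = os.path.join(table_path, "_delta_log")
    files = set()
    for name in sorted(os.listdir(log_dir)):
        with open(os.path.join(log_dir, name)) as f:
            for line in f:
                action = json.loads(line)
                if "add" in action:
                    files.add(action["add"]["path"])
                elif "remove" in action:
                    files.discard(action["remove"]["path"])
    return files

# Simulate an initial write, then an overwrite that replaces the file.
table = tempfile.mkdtemp()
write_commit(table, 0, [{"add": {"path": "part-0000.parquet"}}])
write_commit(table, 1, [{"remove": {"path": "part-0000.parquet"}},
                        {"add": {"path": "part-0001.parquet"}}])
print(live_files(table))  # only part-0001.parquet is still live
```

A query engine (such as Synapse serverless SQL) does effectively this replay before scanning any data: it reads the log first, then touches only the Parquet files the log says are current.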
👍 If you enjoy this video and are interested in formal training on Power BI, Power Apps, Azure, or other Microsoft products, you can use my code "Austin30" at checkout when purchasing our On-Demand Learning classes to get an extra 30% off - pragmaticworks.com/pricing/
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Next step on your journey:
👉 On-Demand Learning - Start With The FREE Community Plan: tinyurl.com/2ujujsb5
🔗Pragmatic Works On-Demand Learning Packages: pragmaticworks.com/pricing/
🔗Pragmatic Works Boot Camps: pragmaticworks.com/boot-camps/
🔗Pragmatic Works Hackathons: pragmaticworks.com/private-tr...
🔗Pragmatic Works Virtual Mentoring: pragmaticworks.com/virtual-me...
🔗Pragmatic Works Enterprise Private Training: pragmaticworks.com/private-tr...
🔗Pragmatic Works Blog: blog.pragmaticworks.com/
Let's connect:
✔️Twitter: / pragmaticworks
✔️Facebook: / pragmaticworks
✔️Instagram: / pragmatic.works
✔️LinkedIn: / pragmaticworks
✔️YouTube: / pragmaticworks
Pragmatic Works
7175 Hwy 17, Suite 2 Fleming Island, FL 32003
Phone: (904) 638-5743
Email: training@pragmaticworks.com
#PragmaticWorks
Great video. Thank you!
did that really take 39sec to process 5 rows? why?
The purpose of this type of tech is to work on big data, not tiny data (5 rows). It's distributed compute, so multiple machines split up the processing. The 39 seconds is the overhead of orchestrating and executing the request across the various systems. It seems like a long time when you just want 5 rows of data, but it would seem short when you're pulling billions of records and watching SQL jobs run for hours to days.
Imagine spinning up 5 computers to have each pull 1 row; that's roughly what's going on in that example. It's showing that it works, not that it's the right solution for the task at hand (reading 5 rows).
@pauljeffcott8770 I'm perfectly aware of overhead, I was just surprised it's that much. It disqualifies the tech for several applications, which is what I was looking into.
But thx for the reply!
@pauljeffcott8770 In other words: 5 rows take 39 seconds, 500,000 rows take 41 seconds, 5,000,000 rows take 50 seconds, etc., because the overhead is relatively large for small data but relatively small for big data.
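The point the thread is making can be sketched as a simple cost model: total time is a fixed orchestration overhead plus a per-row scan cost, so the overhead dominates for tiny queries and vanishes into the noise for large ones. The 39-second overhead comes from the video; the scan rate below is a made-up illustrative number, not a measured Synapse figure:

```python
def total_seconds(rows, fixed_overhead=39.0, rows_per_second=500_000.0):
    """Toy cost model: fixed startup/orchestration cost plus a
    linear per-row scan cost. All constants are illustrative."""
    return fixed_overhead + rows / rows_per_second

# Overhead dominates small queries and amortizes away on large ones.
for n in (5, 500_000, 5_000_000):
    t = total_seconds(n)
    print(f"{n:>9} rows -> {t:5.1f}s  (overhead = {39.0 / t:4.0%} of total)")
```

Under this model a million-fold increase in rows barely changes the wall-clock time, which is exactly the trade-off the replies describe: the architecture pays a large fixed price to make the variable price per row tiny.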