Automated maintenance features in Fabric Warehouse

  • Added Aug 26, 2024
  • Welcome back to another episode of the Fabric Espresso DW series. In this episode, we explore maintenance features that automatically optimize your tables for performant querying. Stay tuned to learn how these features let you focus on delivering value to business users instead of maintaining the warehouse.
    🎙 Meet the Speakers:
    👤 Guest from Microsoft Fabric Product Group: Kevin Conan is a Senior Product Manager working on transactions, metadata, and Delta Lake log publishing for the warehouse in Microsoft Fabric. He has more than 20 years of experience in databases and analytics.
    LinkedIn: / kconan
    👤 Host: Filip Popović is a Senior Product Manager at Microsoft and part of the Fabric Product Group, focusing on performance. Before joining Microsoft in 2019, he spent over ten years in the software and financial services industries, delivering business analytics solutions using the Microsoft technology stack.
    LinkedIn: / popovicfilip
    Twitter: / filippop_msft
    👍 Like this video? Don't forget to hit the 'Like' button and share it with your network!
    🔔 Stay Updated: For more insights into Microsoft Fabric Data Warehousing, and all things tech, make sure to subscribe to our channel and hit the notification bell so you never miss an episode!
    #Microsoft #MicrosoftFabric #FabricWarehouse #Fabricators

Comments • 3

  • @jesi6929 • 7 months ago

    What's synapse BG DQ

  • @adeeshaadeesha • 4 months ago

    How can we delete old Parquet files?

  • @keen8five • 7 months ago

    Does this mean that if I _know_ upfront that the table I just inserted many rows into will soon be queried by end users, I should end the loading process with a "SELECT COUNT(*)" to trigger optimize/compaction, so that the first end user already benefits from the merged files? Is there a dedicated OPTIMIZE statement / sp_optimize procedure?