Dive into Microsoft Fabric's Power BI Direct Lake

  • Published 15 Aug 2024
  • Let's break down Power BI Direct Lake in Microsoft Fabric and explain how you can leverage one copy of the data from OneLake. Patrick explains!
    Direct Lake
    learn.microsof...
    📢 Become a member: guyinacu.be/me...
    *******************
    Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
    🎓 Guy in a Cube courses: guyinacu.be/co...
    *******************
    LET'S CONNECT!
    *******************
    -- / guyinacube
    -- / awsaxton
    -- / patrickdba
    -- guyinacube.com
    **Gear**
    🛠 Check out my Tools page - guyinacube.com...
    #PowerBI #DirectLake #GuyInACube
  • Science & Technology

Comments • 23

  • @gvasvas
    @gvasvas 5 months ago +3

    Awesome demo! Quick and spot on.

  • @archanasrivastava6531
    @archanasrivastava6531 3 months ago +2

    Thanks for this insightful video.
    Do you have any performance/capability metrics comparing Import, DirectQuery, and Direct Lake? If so, please share. Thanks in advance.

  • @robcarrol
    @robcarrol 1 month ago

    Great demo. I've been using Direct Lake in a current project and absolutely love it.

  • @toma4528
    @toma4528 5 months ago +2

    Great video, Patrick!

  • @brunomagalhaes9349
    @brunomagalhaes9349 1 month ago

    I have several semantic models that are alike. Do I need to have a Fabric capacity to merge them and treat the data like I do for SQL? Thanks a lot.

  • @christophehervouet3280
    @christophehervouet3280 4 months ago +1

    Super post, Patrick, as usual.

  • @shekharkumardas
    @shekharkumardas 5 months ago +2

    How do you create a DAX column in a Direct Lake dataset?
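
    Direct Lake semantic models currently don't support DAX calculated columns on Direct Lake tables (DAX measures work fine), so the usual workaround is to materialize the column in the underlying Delta table instead. A minimal PySpark sketch, assuming a Fabric notebook attached to the lakehouse; the table name and column logic are illustrative:

        # Add a physical column to the Delta table; the Direct Lake model
        # picks it up on the next reframe/refresh. `spark` is the session
        # predefined in Fabric notebooks.
        spark.sql("ALTER TABLE sales ADD COLUMNS (margin DOUBLE)")
        spark.sql("UPDATE sales SET margin = revenue - cost")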

  • @gnomesukno
    @gnomesukno 5 months ago

    Not using it currently but I can see some potential benefits to it. Will have to look into it

  • @nishantkumar9570
    @nishantkumar9570 5 months ago +5

    How will costing work for Direct Lake mode?

    • @toulasantha
      @toulasantha 3 months ago

      Less to start with
      Will be rocketing up after that
      Just like everything else MS 😂

  • @UnbelievableOdyssey
    @UnbelievableOdyssey 2 months ago

    If my Delta Lake is in Azure Data Lake Storage, can I still use Direct Lake?

  • @user-iv5tq4qk7m
    @user-iv5tq4qk7m 5 months ago +3

    Q: I love the ease of creating new semantic models, but I keep coming across the problem whereby I have to give somebody access to the whole lakehouse in order to give them access to a segmented part of that data which I only want them to see via a semantic model. Is there any way I can create a gold lakehouse in one workspace, then create multiple semantic models in other workspaces and only give users access to those?

    • @npergand
      @npergand 5 months ago

      You don't need to give users access to the lakehouse; that's just the default behavior. What happens is that when you create a new semantic model, it uses a gateway connection to the lakehouse that is configured with SSO. You can see this in the semantic model settings screen. You can change it by creating a new connection to the lakehouse that uses a specific credential.
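
      To check programmatically which connection the model is bound to (the same information as the settings screen), the Power BI REST API exposes the model's data sources. A minimal sketch, assuming a token acquired elsewhere; all IDs are placeholders:

          import requests

          WORKSPACE_ID = "<workspace-guid>"       # placeholder
          DATASET_ID = "<semantic-model-guid>"    # placeholder
          TOKEN = "<azure-ad-access-token>"       # e.g., acquired via MSAL

          # List the data sources (and hence the connection) behind the model.
          url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
                 f"/datasets/{DATASET_ID}/datasources")
          resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
          resp.raise_for_status()
          for ds in resp.json()["value"]:
              print(ds["datasourceType"], ds.get("connectionDetails"))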

  • @Mike-en1rd
    @Mike-en1rd 3 months ago

    Do you know when Direct Lake will be available to use in Power BI Desktop?

  • @danrolfe7862
    @danrolfe7862 5 months ago

    THIS IS BANANAS!!!!!!!! WOOOHOOOO
    Is there still a row limit? (On data that you can actually bring into Power BI)
    I seem to remember hitting an upper limit on rows using the SQL endpoint / DirectQuery. I had this MONSTER dataset of about 14M rows, and the stakeholder insisted he needed all of the data.

  • @googlogmob
    @googlogmob 5 months ago

    Patrick, thanks 👍

  • @NicolasPappasA
    @NicolasPappasA 2 months ago

    Is Direct Lake using Delta Live Tables? It seems like it's the same technology.

  • @dilipinamdarpatil6301
    @dilipinamdarpatil6301 5 months ago

    Awesome 🙏

  • @NateHerring1
    @NateHerring1 5 months ago +1

    I watch Patrick

  • @Milhouse77BS
    @Milhouse77BS 5 months ago

    I’m up

  • @EBAN4444
    @EBAN4444 5 months ago +1

    Does this mean the massive 25 GB model I have, which holds too many years of data because the "business" needs it even though they only look at a few years, can be removed, and then only the partitions of data that are needed will be held in memory? Lowering the memory used on the capacity, and the amount of data and CPU needed to crunch all the measures?
    Can I recreate the model using Direct Lake against our ADLS Gen2 Databricks parquet files, which are already the fact tables we pull in? Do you need to set up partitions in OneLake, or does it do that automatically for you?
    This does seem to remove the query folding performance gains, so it seems like the parquet files will need to be rewritten to be better optimized and only include the data that is needed in the model.
    Also, is that Python library to refresh a dataset available outside of OneLake? I.e., I'd love an easy way to refresh a PBI model from an Azure Databricks notebook versus an ADF XMLA call.
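
    The notebook library referenced is presumably semantic-link (sempy), which as far as I know is available only inside Fabric notebooks. From outside Fabric, e.g. an Azure Databricks notebook, the Power BI REST API can queue the refresh instead; for a Direct Lake model a refresh is just a cheap metadata reframe to the latest Delta version. A minimal sketch, assuming a service principal that has been granted access to the workspace; all IDs and secrets are placeholders:

        import msal
        import requests

        # Placeholders - a real service principal needs the Power BI tenant
        # setting for service principal API access enabled, plus workspace access.
        TENANT_ID = "<tenant-guid>"
        CLIENT_ID = "<app-registration-id>"
        CLIENT_SECRET = "<client-secret>"
        WORKSPACE_ID = "<workspace-guid>"
        DATASET_ID = "<semantic-model-guid>"

        # Acquire an Azure AD token for the Power BI resource.
        app = msal.ConfidentialClientApplication(
            CLIENT_ID,
            authority=f"https://login.microsoftonline.com/{TENANT_ID}",
            client_credential=CLIENT_SECRET,
        )
        token = app.acquire_token_for_client(
            scopes=["https://analysis.windows.net/powerbi/api/.default"]
        )["access_token"]

        # Queue an asynchronous refresh; a 202 response means it was accepted.
        # Poll the same URL with GET to check refresh status.
        url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
               f"/datasets/{DATASET_ID}/refreshes")
        resp = requests.post(
            url,
            headers={"Authorization": f"Bearer {token}"},
            json={"notifyOption": "NoNotification"},  # required for service principals
        )
        resp.raise_for_status()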

  • @googlogmob
    @googlogmob 5 months ago

    Is Fabric available for developers for free?

    • @srikanthm4504
      @srikanthm4504 5 months ago

      No, your admin must enable it, and can do so for a specific workspace.