3 REASONS to use a single dataset for your Power BI reports

  • Added 29 Jul 2024
  • Are you copying your Power BI Dataset or PBIX multiple times? Here are 3 reasons why you shouldn't be doing that.
    Christopher Wagner Article: / dataflows-datasets-pow...
    ExceleratorBI article: exceleratorbi.com.au/new-powe...
    *******************
    Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
    Guy in a Cube courses: guyinacu.be/courses
    *******************
    ********
    LET'S CONNECT!
    ********
    -- / guyinacube
    -- / awsaxton
    -- / patrickdba
    -- guyinacube.com
    **Gear**
    Check out my Tools page - guyinacube.com/tools/
    #PowerBI #DataSilos #GuyInACube
  • Science & Technology

Comments • 252

  • @ntm0709
    @ntm0709 4 years ago +64

    Another top reason: only having to maintain the Row Level Security on 1 dataset and applying RLS across workspaces with centralized control!

    • @Maartenravie
      @Maartenravie 4 years ago

      I noticed that when you publish the report in an app (and the end-user does not have view permissions on the original dataset), the end-user is not able to view the app.
      When the dataset is in the same workspace as the one the app is based on, it works fine.

    • @ntm0709
      @ntm0709 4 years ago +5

      @@Maartenravie I also ran into this problem. However, I found that you can go into Manage Permissions on the dataset and manually grant permission/access to report consumers, which then allows users to view the report in apps other than the one in the workspace where the dataset exists. The reason it works automatically when the report is in the same app as the workspace is that when you publish the app, it assigns those app users these permissions. This is not the case when publishing other apps, which is why you need to do it manually.

    • @ceciliawang3538
      @ceciliawang3538 1 year ago

      We recently ran into this issue but couldn’t work it out. We have the golden dataset in a PPU workspace. The reports are built in a different PPU workspace pointing to the golden dataset and published to the app. Users have a Pro license and we gave them Viewer and Build permission on the golden dataset and Viewer permission on the workspace that hosts the reports. However, users still received the prompt that they don’t have access to the underlying dataset. Appreciate it if you can answer this.

  • @tonycovarrubias5931
    @tonycovarrubias5931 4 years ago +12

    "I'm not lazy; I'm efficient!" ~ Thank you for that. It drives me crazy when people say they're lazy when clearly they are not.

    • @MichaelRygaard
      @MichaelRygaard 4 years ago +1

      I am lazy, but in order to be properly lazy I have to be really efficient; you can be both ;) - but in truth I'm not lazy, since I will pursue an idea that can save me 10 minutes for hours, because I know that idea can and will save me 10 minutes 100 times in the next year.

  • @manosmaniatakis4067
    @manosmaniatakis4067 4 years ago +4

    Thank you Patrick for the video! Just had the conversation internally on why to maintain just one dataset. Excellent timing!

    • @GuyInACube
      @GuyInACube  4 years ago +1

      That's awesome! Thanks for watching! 👊

  • @lindarn6--269
    @lindarn6--269 4 years ago

    AMAZING!!! I will start digging myself out of my 4-dataset-maintenance-nightmare first thing tomorrow!!! Thank you Patrick!!

  • @CCHSmathematics
    @CCHSmathematics 4 years ago +3

    I'm really looking forward to composite models enhancing this experience. Thanks for the video Patrick!

  • @NeumsFor9
    @NeumsFor9 3 years ago +5

    Good tips. The flip side of this is finding a good way to organize your measures (display folders, governance). Before you know it, your "central datasets" can get pretty crowded... which is a better problem to have than data anarchy.

  • @wardsteensels4598
    @wardsteensels4598 2 years ago

    Great, Patrick. Just starting with Power BI, but I was googling whether I could use one shared dataset for several reports. Your video explains it perfectly; now using golden data models!

  • @jb360360
    @jb360360 3 years ago

    This is solid advice. Applying this now for an enterprise roster and headcount request project solution I’m working on with HR. I currently have like 6 pages developed against one dataset and each page/report has a site and dept filter. Could probably create a report for each site using this method and only have one central data model. I love it. Thanks kindly sir!!

  • @hosamalanazi5289
    @hosamalanazi5289 3 years ago +2

    This actually inspired me to consolidate all the company's reports onto one dataset for each set of needed data: our product reports are all connected to one dataset, all sales reports to one dataset, and all mobile app reports to our online mobile app dataset.

  • @mohamedaboobackersiddique7322

    Thanks to both Patrick and Chris for these great tips!

    • @GuyInACube
      @GuyInACube  4 years ago

      Hope you found them helpful. Thanks for watching. 👊

  • @JoanBoadas-Boadas
    @JoanBoadas-Boadas 1 year ago

    Amazing video. You have saved me hours of copying measures and code from one report to another.
    Love this channel.

  • @jackypacky13
    @jackypacky13 4 years ago +1

    The information I didn't know I needed! Thank you

  • @jaymehta3320
    @jaymehta3320 4 years ago +2

    Thanks Patrick for sharing this! I have always tried to look for ways to increase efficiency, and this one helps a lot.

    • @GuyInACube
      @GuyInACube  4 years ago

      You are very welcome. Thanks for watching! 👊

  • @kostaskandralids6979
    @kostaskandralids6979 4 years ago +9

    I always use that practice to separate the data model and visuals. But there is an aspect you must pay attention to. The visual files (I mean the ones connected to the Service) let you write "local" extra measures. If you have many measures in your "mother" dataset model, you can't tell which ones were made in the dataset and which were written "locally" in the visual .pbix. Tragically, (after three months) you must click every single measure: if the formula appears at the top of the screen it is local, otherwise it is from the dataset. Tip: before you publish your data model, create a pseudo-table with one DAX expression, say Patrick = 1, and name it "DAX Table" for the non-dataset DAX measures. Thank you for your accommodation... :)
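    The tip above can be sketched as a tiny disconnected calculated table that exists only to hold report-local measures (the table, column, and measure names here are illustrative, not from the video):

    ```dax
    -- Marker table created before publishing the shared model:
    -- it carries a single dummy row and serves only as a home
    -- for report-local ("non-dataset") measures.
    MeasureHome = ROW ( "Placeholder", 1 )

    -- Example of a measure parked on that table in a thin report;
    -- any measure living elsewhere is known to come from the shared dataset.
    Local Sales Total = SUM ( Sales[Amount] )
    ```

    Hiding the Placeholder column leaves the table showing only measures, so a glance at the field list tells you at once which measures are local.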

  • @yornav
    @yornav 4 years ago +4

    Fully agree with your reasons. I do it all the time and it makes life so much easier. With regard to measures, with shared data models you are still able to add your own measures to the reports. So if other people build reports based on the shared model, they can add their own measures. Then, if people create useful measures that might benefit other users as well, I 'migrate' them to the source model.
    4th reason: reports built on a shared model are WAY smaller and publish MUCH faster.

  • @gabrielfuma4890
    @gabrielfuma4890 4 years ago

    Thank you Patrick, this is exactly what I needed!

  • @yannickfranckum6589
    @yannickfranckum6589 4 years ago +1

    Thanks Patrick. I have already implemented that scenario/architecture in my company and it's very helpful.

  • @hrishiw1989
    @hrishiw1989 2 years ago +1

    It really makes sense to create multiple reports from a single data model.
    I would even say that, to make better use of datasets coming from different sources, and maybe even for different purposes, it is easier to put them all in one data model for easier management.

  • @simonestrizzolo5357
    @simonestrizzolo5357 4 years ago +1

    Very good suggestion! I have multiple identical reports, which are different only in language (eng, ita etc.). Setting up automatic refresh is always a nightmare!

  • @JackOfTrades12
    @JackOfTrades12 4 years ago +1

    I'm not really a fan of connecting to datasets, as you can't create calculated columns on tables.
    However, my team recently onboarded to Premium, and we are taking advantage of connecting to the workspace SSAS API. We use our published datasets, open a new file, connect using the Analysis Services connector, and specify a DAX query with SUMMARIZE to get the data we need. It's been incredibly helpful since we don't own all reports, and need to re-use and centralize many of our OKRs/KPIs. This also removes burden from the source SQL servers, and preserves logic. Of course the only downside is recreating measures, but with some cleanly processed values, it's not so bad.

  • @driouchemountasir7894
    @driouchemountasir7894 4 years ago +4

    Hello Patrick, thank you for this video !
    Always interesting to learn good practices from professionals like Adam and yourself.
    Thanks for sharing with your community :) We appreciate it

    • @GuyInACube
      @GuyInACube  4 years ago +1

      You are welcome. Thanks for watching.

  • @abigi4me
    @abigi4me 4 years ago +1

    Wow, what a great tip to maintain the data model. Love it.
    Thanks Patrick 👌🏻👌🏻👌🏻👌🏻👌🏻

  • @JEffigy
    @JEffigy 4 years ago

    Another awesome video! But hey, you forgot to mention the Common Data Model and dataflow entities. It would also be worth covering migration from the slightly older-school SSAS Tabular model, if in a larger enterprise environment.

  • @jonbaylis2203
    @jonbaylis2203 4 years ago +1

    Superb and informative as always. Hope you and Adam are well 👍.

  • @raz8657
    @raz8657 4 years ago

    This is Raj; I appreciate that you brought up Christopher's blog post. He was my manager earlier.. hehe

  • @Ruudje1878
    @Ruudje1878 1 year ago

    Just what I needed. Thanks Patrick!

  • @janquieldapper
    @janquieldapper 4 years ago +1

    In a simple video I can learn a lot about databases! I have an online server with SQL databases; on this server I installed the Power BI gateway and created my database, and now I can work directly from my laptop with this data... oh, of course, scheduled updates on this server... all data updated all the time!

  • @sarahacton7192
    @sarahacton7192 3 years ago

    This is exactly what I needed! Thanks team

  • @donaldkidd7427
    @donaldkidd7427 4 years ago

    This video is very timely, as I have been selling this "golden" model concept to the powers that be; now I can send them a link to this video to support my efforts. We have an extensive set of dashboards in PerformancePoint based on an SSAS multidimensional data mart. The executives like their dashboards, but we are scheduled to upgrade to SharePoint Online, which is not compatible with PerformancePoint. I'm in the process of creating a tabular data model in Power BI to duplicate the dashboards, because the multidimensional SSAS data model is built specifically for PerformancePoint with named sets, which are not compatible with Power BI. So this is a long-winded thank you for this video.

  • @read89simo
    @read89simo 3 years ago

    OMG, I made almost 30 reports by copy-pasting. This is a great tip, thank you.

  • @geniesworld4643
    @geniesworld4643 4 years ago

    Thanks Patrick, it's very informative. Having a golden data model is good because we can maintain one single version of the truth. But this will have an impact on the dataset size: having all tables in one dataset will increase the size of the dataset.

  • @chayakiraneng
    @chayakiraneng 4 years ago +1

    Thanks Patrick. We have been following this design/architecture in our firm for a long time now and are a big fan/proponent of this design. We use the terms 'core model' and 'thin report' for this design. Hopefully MS adopts some terminology for it.
    The pain points we see with this are:
    1) Whenever a column or table is renamed in the core model or the golden dataset, all the linked thin reports using those tables/columns break. This is unlike other BI tools such as Webi. Hopefully MS can enhance this.
    2) We know that a measure can be added to the linked thin report, but we often encounter a scenario where a calculated column is needed. It would be great if MS could provide a feature for this.
    3) The last scenario we often get challenged on is being able to add multiple relationships between the same set of tables in the core data model and, at thin-report building time, specify the relationship context. This is supported in other BI tools such as Webi universes; in Power BI, we can only have one set of active relationships, and the other set becomes inactive. Does MS plan to have the concept of 'context' for relationships?
    Thanks again!

    • @youjohnny16
      @youjohnny16 4 years ago

      Can't you just use DAX in your thin report to solve problems 2 and 3?
      Try looking at the ADDCOLUMNS and USERELATIONSHIP DAX functions.
      2. dax.guide/addcolumns/
      3. dax.guide/userelationship/
      John
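      For point 3 in particular, a minimal USERELATIONSHIP sketch (table and column names are illustrative, not from the thread): assuming Sales has an active relationship to 'Date' on Sales[OrderDate] and an inactive one on Sales[ShipDate], a thin-report measure can switch the relationship context without touching the core model:

      ```dax
      -- Activates the inactive ShipDate relationship
      -- for the duration of this calculation only.
      Sales by Ship Date =
      CALCULATE (
          SUM ( Sales[Amount] ),
          USERELATIONSHIP ( Sales[ShipDate], 'Date'[Date] )
      )
      ```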

  • @lloyd_talks
    @lloyd_talks 10 months ago

    Great Video! Efficiency is key!!

  • @roaming_bob8591
    @roaming_bob8591 4 years ago +1

    Patrick, I'm looking for a solution that will show the previous year's result, and then a static line for the improvement goal for the new year. Appreciate your help.

  • @adamb4950
    @adamb4950 4 years ago +1

    Nice touch with the LSU watch. GEAUX TIGAHHS!

  • @emmanuelmandica2733
    @emmanuelmandica2733 3 years ago

    Thanks Patrick, as usual your videos rock and are very helpful.

  • @unigirl123
    @unigirl123 3 years ago

    mind blown!!! Thank you, thank you, thank you!

  • @MuhammadMansoorAhmed
    @MuhammadMansoorAhmed 4 years ago +1

    Great tips. Really enjoy your conversation

    • @GuyInACube
      @GuyInACube  4 years ago

      Thanks Muhammad! Glad you liked it. 👊

  • @johanaalarse6870
    @johanaalarse6870 4 years ago

    Thanks Patrick! You're great. What about security? Is it possible to share only one report and drill through to different pages in other reports?

  • @robsonnascimento5935
    @robsonnascimento5935 4 years ago +1

    Thanks Patrick for these tips, awesome!!!

  • @Silverlythia
    @Silverlythia 4 years ago +1

    Yes yes yes!! So many reasons, but just do it! But... Patrick, I'm not fond of the borders ;). You can create it (connected to the central dataset) and then give them the PBIX to publish too, and they can create measures if they are savvy.

  • @mudyasaad
    @mudyasaad 4 years ago +2

    Hello Patrick, great video as always!
    I am with you on this one 100%, but the only reason I can't do this is that I have different reports going to different audiences, and I share these reports as apps. If we were able to separate a workspace into multiple apps with their own access permissions, that would be awesome.

    • @GuyInACube
      @GuyInACube  4 years ago +5

      Why not have a shared dataset with different workspaces for the different roles, and an app for each? All reports hit the same dataset. The biggest blocker there is that you can't add anything to the shared dataset today, but once the updates to composite models come to allow that, it really makes for a great way to handle it.

  • @Avenger24601
    @Avenger24601 3 years ago

    Super useful. Thanks!

  • @vinaypugalia6090
    @vinaypugalia6090 4 years ago +2

    Hello @patrick, the concept of a shared dataset is really useful. However, there are situations where it can drive us nuts. Below are a few of them, and I would like to hear from the community on how to handle them -
    1. If there are multiple reports pointing to the same shared dataset, what is the best place to define the measures - in the dataset or in that particular report? I took the approach of defining the common ones in the dataset and the report-specific ones in the report, but got stuck while preparing my dataset for Q&A, as I was not able to train the dataset for the measures defined at the report level.
    2. Having a central/shared dataset sounds really exciting, but when we have really big ones, things can go crazy - a small mistake will impact a lot more reports, and every report user will be affected. How do we handle this?
    3. What do we do when 2 reports pointing to the same shared dataset want a different relationship or filter direction?
    I am practically facing these issues and am looking for some guidance/suggestions from you and the community.
    Thanks in advance!

  • @marinakhanukaeva5666
    @marinakhanukaeva5666 4 years ago +2

    Hi Patrick, thanks for the video. Any chance you know about the limitations of using Power BI datasets across reports? I'm trying to build reports using a single Power BI dataset, but it turns out that some visuals (like Sparkline by OKViz) stop working when I use the Power BI dataset in the embedded report, while they work properly with data imported from the DB instead. Thanks!

  • @leonidiakovlev
    @leonidiakovlev 3 years ago

    Another reason not to have a single source of truth is that you would need to host the whole dataset in every workspace... even for tiny reports... because you can publish reports only to workspaces that hold the dataset the model is connected to... Thanks for the video anyway, you are great!

  • @malcorub
    @malcorub 4 years ago +31

    I'm not lazy, just efficient.

    • @TheLeotLion
      @TheLeotLion 3 years ago

      I used to work with 'Eddie'; I worked with him for a year. He never shared his last name, but he used to say this all the time (he also worked in data in the early days of computing).

  • @karolisausra4416
    @karolisausra4416 3 years ago +13

    Great video, Patrick, thank you very much. I can really see why having one central dataset is very useful in many cases.
    But why are we talking only about the simplest ideal-world scenarios, where someone just wants to add some tiny new calculation on top of the existing model (a new measure, for example)? In reality there will be more complex requests, where you end up adding new dimensions, increasing granularity, adding aggregations, creating new relationships, or even having to change existing dimensions to meet the new requirements of the end-users. In the end the central model would soon grow into a monster model, where each report uses only a tiny fraction of it, or some older reports even break because of changes made to the central model to solve new requirements.
    So is there some rule of thumb (or set of rules) for when to use a centralized model (and inevitably keep growing it) and when it is better to create a new model (although slightly redundant to the existing one)? Or is this idea of using a central model perfect in all respects, so you should always use it and I am just not getting the idea right (sorry if this is the case) :)
    Thank you!

    • @alexandremachado1014
      @alexandremachado1014 2 years ago +4

      Great question, would love it if they could answer it.

    • @anojones1
      @anojones1 2 years ago +2

      I think you're getting the idea perfectly. What I find in real-life scenarios is that they like the measures, but the definition is slightly different, and you end up writing many measures. Or they need extra dimensions or facts.

    • @bessa0
      @bessa0 2 years ago

      Yes. A great video would be one answering "When to use more than 1 dataset". In my case, as I do not have Power BI Premium, I can't have a dataset that is too heavy. So what do I do? I separate datasets with tables that are not related to each other. That is, data that won't be shown in the same report/page, or that doesn't have to be updated at the same frequency. But that is what works in my scenario.

  • @earlnoli
    @earlnoli 2 years ago

    When I was doing Report Builder there was an option to create SMDL files that host common data models. Good to know Power BI also has this.

  • @miragliag
    @miragliag 4 years ago +1

    Thanks! That's great! The only downside is that if you rename measures or columns, the linked reports crash... in the original report all graphs are updated automatically. Is there a way to fix this?

  • @CrazySw3de
    @CrazySw3de 3 years ago +1

    Would taking an approach like this increase loading time when opening different reports?
    I am curious if there would be performance issues opening a report containing more data than it is necessarily using, or if it would only impact that dataset refresh.
    (Ex: Linking data from multiple departments on the off-chance you may want to cross-analyze in the future)

  • @husnabanu4370
    @husnabanu4370 3 years ago

    Hi,
    Thank you for the wonderful explanation... when we use a shared dataset we connect to the entire dataset; is it possible to choose only a few tables from the shared dataset?

  • @namaa1000
    @namaa1000 2 years ago

    Thanks Patrick for your great work!
    Is there an alternative solution for Power BI Report Server?

  • @teamofsteve
    @teamofsteve 3 years ago

    Very well explained

  • @masonwhitehouse7587
    @masonwhitehouse7587 4 years ago

    Great video Patrick, but a quick question: how many people can use a given shared model at a time without creating issues?

  • @oladaposorinola6216
    @oladaposorinola6216 4 years ago +1

    Thank you Patrick!

    • @GuyInACube
      @GuyInACube  4 years ago

      Glad you liked it. Thanks for watching!👊

  • @juancarlosfigueroafigueroa943

    Great, thank you. I'm wondering: how can you reverse the path from 2 files (dataset & report for viz) back to a single master PBI file? To test the RLS, for example.

  • @abhijeetdesai3350
    @abhijeetdesai3350 4 years ago

    Good one Patrick, we are using the same approach while developing any new reports with the same model... even if a measure is not present in the model, we can create it for that particular report, in the report only...

    • @atomek1000
      @atomek1000 4 years ago

      That also creates a burden, as you could end up with the same measure calculated differently in two reports.

  • @Jahspecs1
    @Jahspecs1 7 months ago

    Excellent!!!

  • @rudisoft
    @rudisoft 4 years ago

    Hi Patrick, thanks for this! So why can't we join data from multiple datasets? It would make cascaded datasets so attractive.

  • @MrNurbolb
    @MrNurbolb 4 years ago

    Hi. I really like your videos. You help me learn something new in Power BI every day. Can you answer this question: how can I get a list of all visualisations and the objects used in them? I use DAX Studio, but it cannot help me. Can you?

  • @DIGITAL_COOKING
    @DIGITAL_COOKING 4 years ago +2

    👍🙂good one Patrick

  • @annmunro9468
    @annmunro9468 4 years ago

    Thanks Patrick, I've been trying this and it's great just having one data model. The problem I've found is that while the dataset updates, the reports based on this one dataset don't always display up-to-date information when they are opened. My users keep having to refresh the cache or hit the refresh button to get the app or report to show the latest data, or I get called continually about it. Is there a solution to this? We are only on Pro, so I think that's maybe why we have this limitation.

  • @Art-tv7yu
    @Art-tv7yu 3 years ago

    Good advice.

  • @Karenshow
    @Karenshow 4 years ago

    What about if I don't want the new person from the new department to be able to see the new measure that you create, Patrick?

  • @stevenfoster5799
    @stevenfoster5799 4 years ago +2

    2 videos in 2 days, is this heaven?

  • @massimocolaceci8537
    @massimocolaceci8537 4 years ago

    Hello Patrick and folks! Brilliant video. Let me share a question with you: I have a dataset about purchases used for 3 categories of analysis: products in general, car maintenance costs, and another very specific subdomain. The three share almost all measures, queries and tables, but the latter two have some peculiarities: they have their own additional measures and tables. We can say they inherit everything (or almost everything) from the "father dataset" and extend it.
    Given that it's not allowed to extend a dataset which was published to the Power BI Service (afaik), how can I manage the inheritance of the "child datasets"?
    Thanks

  • @terryliu3635
    @terryliu3635 4 years ago

    Thanks for sharing Patrick. Honestly, I do not quite get it. Say we have an executive report (20 pages to deliver) which needs both Sales data and EHS data. If we already have a Sales dataset and EHS dataset from separate Workspaces, do we have to create a new dataset and duplicate some of the metrics / calculations?

  • @regulardev
    @regulardev 2 years ago

    Thanks for this video. I was doing this in my projects. I have a question - does this make the reports relatively slower, as they are no longer in import mode but in live connect mode to the published dataset?

  • @DanielScottDavis
    @DanielScottDavis 4 years ago +1

    You can also add a local measure to reports using a single dataset. We do this for 3 reasons:
    1. Modeling efficiency. Because Power BI was built for use by power users, it has some built-in safety nets that can be a problem if you have really complex DAX. Essentially, every time you create a new measure it creates a blank measure in the background, then creates another one to overwrite the blank once you hit enter or exit the formula pane. In both instances it recalculates ALL measures and calculated columns in the background. One inefficient measure can lock up the dataset.
    2. Data preservation. Adding a measure to a PBIX file and uploading overwrites the published model, including any up-to-date data. You end up having to refresh locally before pushing, which can be very time consuming.
    3. Clutter... sometimes users need a VERY specific DAX measure that no one else will use. For these, it's fine to leave it in the visual file.
    Basically, we develop all measures in the report files for testing and staging. When we are ready to publish, we determine whether each should stay local or be pushed to the "golden dataset".

  • @arnohoedelmans
    @arnohoedelmans 4 years ago

    Hi Patrick, I like this. Is there a possibility to develop a single dataset and the different reports in Desktop, and then publish the single dataset and the different reports based on it? So that changes to the dataset during development can be made quickly in Desktop without having to publish it.

  • @oseliocandido1807
    @oseliocandido1807 3 years ago

    This is definitely the problem I was facing. Clever solution.

  • @rasonline
    @rasonline 4 years ago

    Does it carry over your DAX formulas too? Or do you have to recreate those in the "live connected" version?

  • @jenyura
    @jenyura 4 years ago

    Hi, I tried to use the data model from another person's report to have one set of data, but it was not possible to add an additional dimension table I needed for my report. How could that be solved?

  • @alexrosen8762
    @alexrosen8762 4 years ago +1

    Great stuff!

  • @dannydavis1573
    @dannydavis1573 2 years ago

    This is such a great video! Thank you Patrick!
    I do have an important question though - how can you handle the deployment process to higher environments with this approach and deployment pipelines in a Premium capacity? Can you? Ideally, I would have my DEVELOPMENT thin reports connected to my DEVELOPMENT golden dataset, and my PRODUCTION thin reports connected to my PRODUCTION golden dataset. Normally with deployment pipelines I would handle this environment promotion and dataset connection using parameters or data source rules, but it doesn't look like this option is available with a live connection to datasets. How would you handle this?

  • @Babayagaom
    @Babayagaom 4 years ago

    Awesome 🙏

  • @monilgandhi822
    @monilgandhi822 3 years ago +1

    Hey, that's what I was looking for. However, I have already created versions of pbix files with the same data. Is there a way to link all of them to one data model and not recreate all visuals?

  • @NaNuTumblue
    @NaNuTumblue 4 years ago +1

    Thank you from Thailand

    • @GuyInACube
      @GuyInACube  4 years ago

      Thank you for watching our videos! 👊

  • @jomzrpg6312
    @jomzrpg6312 3 years ago

    I was just wondering how the performance is for these reports? Do visuals still load the same? Faster? Slower?

  • @CaribouDataScience
    @CaribouDataScience 1 year ago

    Good stuff.

  • @PrathamDiaries
    @PrathamDiaries 4 years ago

    Hi Patrick: my question is a little tricky, and I thought I might get your suggestion on it. There are 7 reports coming from the same data model. However, the nature of all 7 reports, basically all the custom columns and KPIs, is different for every report I make. Should I control it from the data model? With 7 reports and growing, the calculations differ. But I can't create custom columns in Power BI if coming from a live data model. How should I control it, then? Is it feasible to make 7 individual copies, one for each? Appreciate your help on this.

  • @inchristalone2594
    @inchristalone2594 4 years ago +29

    Whooohooo. Guy in the Cube famous!

  • @njbkilcoyne
    @njbkilcoyne 4 years ago +1

    Can't agree with you more... perfect... efficient, not lazy.

  • @Adeelkhalid17
    @Adeelkhalid17 1 year ago

    I have a dataset that is consumed by multiple reports. Now I am adding data from the last 20 years to the same dataset, but existing reports should only use the last n years. Is there a way to filter to the last n years when connecting to the shared dataset, or is my only option to apply a report filter on all reports?

  • @dheitsc7871
    @dheitsc7871 3 years ago

    I already had the same thought last year, and so I developed a very large data model which combines many different sources (SharePoint, SQL, MySQL, etc.). The problem now is that when even one source is not available, the whole dataset won't get refreshed. It would be really great if I could have one master dataset per source and combine them as needed in my reports. The problem is that it's currently not possible to create a report from more than one Power BI dataset.

  • @michaelmaloy6378
    @michaelmaloy6378 2 years ago

    Relatively new to Power BI, and trying to make sure I understand how this should work.
    To create datasets, do you create new PBIX "reports" that only have the datasets in them and then publish them (organized in a special workspace), so that other reports are able to access those datasets directly?

  • @nareshrocks9532
    @nareshrocks9532 2 years ago

    Hi Patrick, where the start date is the lowest login date of any user with that domain and the end date is the most recent login of any user with the domain

  • @intuitivelearning4179
    @intuitivelearning4179 4 years ago +1

    Hi,
    I have 2 reports using a single dataset/data model; one report is a daily refresh and the other is a monthly consolidated refresh.
    How can this be done using shared datasets? Any help on this please?

  • @sharavananp5570
    @sharavananp5570 4 years ago

    Hi, can you make a video about shared dimensions? Like a model having 6 facts and 5 shared dimensions, where you have no choice to make a view in the backend. So the only way is to build the data model with joins in Power BI. But then the fear is: will it slow down the report?

  • @DanielTavares29
    @DanielTavares29 3 years ago

    Wooooow!! If I am not wrong, this topic is perfect for the new composite models in Power BI (Dec/2020), right?!

  • @yogeshmahadnac9426
    @yogeshmahadnac9426 2 years ago

    Hi Patrick! Many thanks for this very useful video! 👏
    Situation:
    Let's say we have 1 Dataset Workspace where all datasets are published. And then we have 3 reports connecting to 1 dataset in that workspace, ok?
    Questions:
    When you have finished building those reports in your Power BI Desktop, you publish them and now you want to give access to them (to designated users in your Active Directory, for example), do you only need to give access to the report, or, do you also need to give access to the Dataset Workspace as well?
    Thanks for your reply 🙏😊

  • @mathickling223
    @mathickling223 4 years ago

    Question: if I've already failed at this and have multiple ongoing reports where I've copied and pasted the PBIX - is it possible to just take one of them, make it the golden dataset, and then point my other reports at that dataset? Or do I need to rebuild all of my other reports off that golden dataset? Hoping that if I just change the source and end up with the same tables, columns, etc., it'll just smoothly continue on like nothing has happened.

  • @Wolf7448
    @Wolf7448 4 years ago

    Is it possible to do this when you're running Report Server instead of powerbi.com?

  • @rajkumarrajan8059
    @rajkumarrajan8059 3 years ago

    Patrick - is there a way to connect Excel data from the desktop to Power BI online?
    I have an Excel report which is updated every week; what I do now is update the data in Excel, open Power BI Desktop, refresh my report, and then publish the report to the workspace.
    Instead, I want to connect the Excel report directly to my Power BI online. Please provide some tips.

  • @texmexdragon2160
    @texmexdragon2160 2 years ago

    But is this still applicable with the advent of composite models? And what about chaining datasets? Seems to me there is also an argument to be made for having subject matter datasets that serve specific audiences within an organization.

  • @zme888
    @zme888 4 months ago

    Q: if I have one semantic model/dataset, what if there are multiple different frequencies at which the data needs to be refreshed?
    From slowly changing dimensions that are REAL slow, and reference tables updated yearly, to fact tables that need updating every hour.
    When I schedule a refresh of just the one semantic model, do I have to refresh everything frequently, even if some tables/sources haven't changed?

  • @jonaskarlsson477
    @jonaskarlsson477 3 years ago

    Do you have a video on how to set up a single dataset?

  • @vincenzonosso7400
    @vincenzonosso7400 2 years ago

    Yòooo Patrick!
    Great video, as usual :)
    What if I have different remote coworkers who want me to "merge" their Excel files (into a single dataset)?
    I mean, how would you efficiently manage this scenario?
    How to collect the files (maybe via a shared folder)?
    How the refreshing process could work?
    Thank you in advance!

  • @nguyenlamtong8503
    @nguyenlamtong8503 4 years ago

    Thanks for your tip. But can you also name some of the reasons we should NOT use the same dataset?