Creating real-time dashboards in Power BI with push datasets
- Published 22 Oct 2021
- Though you can build real-time reports with DirectQuery, push datasets offer a more scalable, economical, and effective solution especially when combined with an Import model already in place. This video introduces the architecture of push datasets.
Article and download: www.sqlbi.com/?p=729917&aff=yt
How to learn DAX: www.sqlbi.com/guides/dax/?aff=yt
The definitive guide to DAX: www.sqlbi.com/books/the-defin... - Science & Technology
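Under the hood, the push-dataset architecture described in the video relies on the Power BI REST API "Post Rows" call. A minimal Python sketch of pushing a batch of rows, assuming a hypothetical dataset ID, table name, column names, and access token:

```python
import json
from urllib import request

# The "Post Rows" endpoint of the Power BI REST API; the dataset ID,
# table name, and bearer token used below are placeholders.
PUSH_ROWS_URL = (
    "https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}"
    "/tables/{table_name}/rows"
)

def build_rows_payload(rows):
    """Wrap a list of row dicts in the JSON body expected by Post Rows."""
    return {"rows": rows}

def push_rows(dataset_id, table_name, rows, token):
    """POST a batch of rows to a push dataset (sketch; needs a valid token)."""
    url = PUSH_ROWS_URL.format(dataset_id=dataset_id, table_name=table_name)
    body = json.dumps(build_rows_payload(rows)).encode("utf-8")
    req = request.Request(
        url,
        data=body,
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # the service answers 200 on success
        return resp.status

# Payload for one fact row (hypothetical column names).
payload = build_rows_payload([{"Date": "2021-10-22", "Amount": 25.4}])
```

As soon as rows land in the push dataset, dashboard tiles built on it update automatically, which is the key difference from a scheduled refresh.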
This is absolutely wonderful, thank you for the thorough introduction! The last time we needed "real time" in a report, we resorted to DirectQuery and "refreshing" - super slow and inconvenient... I hope to try this out soon!
What a fortuitous video! I have recently been posed the challenge of finding a way to provide a real-time dashboard within my organisation, and this information and tool will be invaluable. Marco & Alberto, thank you so much for this video and the article.
This sounds really exciting. Great job, great explanation.
It sounds fantastic. I'm very excited to watch the video. Thanks @Sqlbi team!
Always good to see dashboards get some love.
Thanks for sharing, excellent video. This push-data method also opens up a lot of new possibilities.
Great stuff!
It might be very helpful for a specific dashboard of a client, where I am currently using direct query.
Thanks for the video
This is sooooo crazy, I am in a project that requires push datasets. Going to read all the documentation right now :)
Thanks for sharing this video... so wonderful!
This is just wow, thank you!
You guys did it again! Seriously this is great stuff
Great solution!
Excellent explanation, congratulations!
Thank you
I love it!
Thanks for the amazing video! I apologize in advance for my ignorance, but is the API call still gonna work if my machine is off?
amazing
This is really helpful. Is there any way that we can add slicers?
At present, we cannot add slicers to a dashboard.
If we add them to the report, we have to refresh the page for the latest data.
Is there any other way to do this?
No, you cannot add slicers to a dashboard.
Hi, thanks for a great solution. I've worked on it and got it working as far as incrementally refreshing the push dataset after batch-updating it from DAX results of its twin in-memory model. Then I hit a wall with RLS - is RLS not supposed to work with push datasets? I reviewed the Microsoft documentation etc., but couldn't find a definite answer.
Unfortunately, RLS is not supported in push datasets.
Excellent video! Would you recommend using this tool if the original data source is JSON files in an Azure Blob Storage?
It's hard to implement real-time updates with that architecture: you should create a system (with Power Automate or a script/service) that polls the blob storage looking for new files very often. Not sure it would be much different from a frequent incremental update (but that would be every 1-2 hours, not every minute).
@SQLBI Thanks for the reply! Maybe it would make sense to add an additional layer that transforms the data into a table.
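The polling approach suggested above boils down to remembering which files have been seen and pushing only the new ones. A minimal sketch in Python, where the listing function simulates a blob-container listing (a real poller would call the storage API on a schedule):

```python
def detect_new_files(list_files, seen):
    """Return files not seen in previous polls; update the seen set in place."""
    current = set(list_files())
    new = sorted(current - seen)
    seen |= current
    return new

# Two simulated polls of a blob container; in a real service these would be
# storage API calls made every minute or so.
listings = iter([
    ["sales_001.json"],
    ["sales_001.json", "sales_002.json"],
])
seen = set()
first_poll = detect_new_files(lambda: next(listings), seen)   # one new file
second_poll = detect_new_files(lambda: next(listings), seen)  # only the new one
```

Each batch of new files would then be parsed and pushed to the dataset, which is why this is closer to frequent incremental refresh than to true real-time.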
It would be a good idea to cover the implementation of the library presented in this video for an Azure cloud DB connection.
Is anyone else getting this error message: Confirm-PushDataset : The term 'Confirm-PushDataset' is not recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:3 char:1
+ Confirm-PushDataset -Model 'C:\Demo\Contoso.bim'
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (Confirm-PushDataset:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
Thanks for the informative video. By any chance, is there an alternative tool that could generate the .bim, as Tabular Editor is neither open source nor freeware? Your feedback and comments are appreciated.
Tabular Editor 2 is free and open source: www.sqlbi.com/tools/tabular-editor/ and github.com/TabularEditor/TabularEditor
What about building a composite model with a similar architecture to what you described? You have your import tables and then a single DQ table that pulls in today's data? I know you have the DAX limitations, but it seems like the performance component would be mitigated in a large way.
You would not remove the scalability issue for DirectQuery and you would lose the automatic refresh of the dashboard tile when there is new data. Performance depends on the implementation.
I haven't finished the video, but is this possible when I'm getting the data from a connected database, say Smartsheet or another online data source?
Please, can you clarify the question?
Awesome Stuff! Thanks. Will this work with Oracle Database?
It is not related to the data source; it is up to you to build the technique to push updates, though.
Hi, for continuous running of the dashboard, do we need to run the PowerShell application continuously? Is it possible to run the PowerShell command automatically?
You should use an automation process to run PowerShell, on Azure or on a PC - you can also look at Power Automate.
I have a question regarding Tabular Editor. We have some client laptops; I made one model and deployed it to SSAS. The next day I am on another laptop: can I load the deployed model settings in Tabular Editor, or do I need to bring the saved local file to the new machine and open it before I can see all my work (CKF, measures, relationships, etc.)? Thanks in advance.
You can either copy the model.bim file manually or connect Tabular Editor to the server and retrieve the model from the server. However, we suggest that you use the Tabular Editor forum for these questions: github.com/TabularEditor/TabularEditor/issues
@SQLBI Thanks a mill! I did it manually before; I will try retrieving it from the server. Thanks again.
You are running a Command line utility (your Push app) in the shell on your desktop. How does that work in Power BI Service as a command line app? Do we need to install anything on the Service?
The command-line app is for testing purposes only. In order to push data into the dataset, you need to call the Power BI REST API through a service in the cloud that intercepts new data in your system. As a polling technique and no-code solution, you can use Power Automate to do the job.
I have to build a real-time dashboard, but for one of the visualizations I need to perform a pivot transformation. Is this achievable through this process? (The pivot transformation is not supported when using DirectQuery.)
Push datasets don't have the limitations of DirectQuery. The Power BI matrix visual (which is the way you obtain a pivot transformation) is compatible - but that's only at the visual level. If you are talking about M, then push datasets don't have anything like that: you must transform the data upfront.
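Since push datasets have no Power Query step, any pivoting must happen before the rows are pushed. A minimal pure-Python sketch of that upfront transformation, with made-up field names for illustration:

```python
from collections import defaultdict

def pivot(rows, index, column, value):
    """Pivot flat records into {index_value: {column_value: summed value}}."""
    out = defaultdict(lambda: defaultdict(float))
    for r in rows:
        out[r[index]][r[column]] += r[value]
    return {k: dict(v) for k, v in out.items()}

# Flat fact rows as they might arrive from the source system.
rows = [
    {"store": "A", "month": "Jan", "amount": 10.0},
    {"store": "A", "month": "Feb", "amount": 5.0},
    {"store": "A", "month": "Jan", "amount": 2.0},
]
wide = pivot(rows, "store", "month", "amount")
# wide == {"A": {"Jan": 12.0, "Feb": 5.0}}
```

In practice you would usually push the flat rows as-is and let the matrix visual do the pivoting; a sketch like this is only needed when the target table itself must be in the pivoted shape.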
Thanks for the great video. I get:
Failed to get shared database name. Description: {"Message":"API is not accessible for application"
when trying to refresh data from the import DB to the push DB.
Please, write the technical questions in the comments of the related article and make sure you follow all the steps described here: www.sqlbi.com/articles/implementing-real-time-updates-in-power-bi-using-push-datasets-instead-of-directquery/
I really like the idea, but... I get the feeling that you present DirectQuery in its pure form, which I guess is not the best comparison: we have dual storage mode and aggregations, which increase DirectQuery feasibility a lot, and soon automatic aggregations will be introduced. With that in mind, is the push dataset really the better solution?
Push datasets have the unique feature of updating the dashboard tiles as soon as new data arrives. Aggregations in DirectQuery should be refreshed periodically for the entire table (no partitioning), so currently there is no way to have a hybrid table (part DirectQuery and part Import) that would solve the issue. However, even with hybrid tables you would still have a scalability issue with the DirectQuery part in case you have many users.
@SQLBI Thanks a lot for the explanation.
When I repeat your work in PowerShell, Confirm-PushDataset -Model 'C:\xxx.\desktop\Contoso.bim' shows a "command name not recognized" error.
Read the article: you should install the PowerShell cmdlet first.
Thanks, looks interesting. However, just a question/limitation with push datasets: although it looks fantastic to see live data, for me it falls down on duplicates. It's tough to dedup live data if the incoming data is not filtered properly and there are data surprises - it can produce an erroneous dashboard.
Some existing tools that use pushed data on ports, like machine logs, struggle with matching legacy and live data.
Overnight-pulled data can be cleansed and filtered through business rules, etc. But I liked your diagrams and data-flow presentations.
The entire idea behind these real-time dashboards is to implement "vanity projects" without creating a dedicated infrastructure that is way more expensive to maintain. The daily cleanup partially solves the issue you mentioned. Clearly, you must have an error tolerance about the live data.
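One way to limit the duplicate problem raised above is to deduplicate incoming rows by a business key before pushing them, keeping a set of keys already sent between polls. A minimal sketch with a hypothetical id key and amt measure:

```python
def dedup_new_rows(incoming, pushed_keys, key_fields):
    """Drop rows whose key was already pushed; remember the new keys."""
    fresh = []
    for row in incoming:
        key = tuple(row[f] for f in key_fields)
        if key not in pushed_keys:
            pushed_keys.add(key)
            fresh.append(row)
    return fresh

# Two overlapping batches from a live feed; row id 2 arrives twice.
pushed = set()
batch1 = dedup_new_rows([{"id": 1, "amt": 10}, {"id": 2, "amt": 8}],
                        pushed, ["id"])
batch2 = dedup_new_rows([{"id": 2, "amt": 8}, {"id": 3, "amt": 4}],
                        pushed, ["id"])
```

This only guards against exact re-delivery within the day; the daily cleanup and full refresh mentioned above remain the safety net for everything else.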
@SQLBI I need your personal contact, please. I have a project in the mining industry.
I think we can use Power Automate to push the data, no?
There is a Power Automate "add rows to a dataset" connector.
Yes you can!
Just pay attention to what you use in Power Automate to detect the presence of new rows, because a polling *might* be resource consuming for the server. But it definitely works!
How is the Dual mode in Power BI different from this approach sir?
Push datasets update dashboards as soon as new data comes in. Dual mode can keep data in Power Query, but you always have to trigger a dashboard refresh in order to see new data.
Is the hybrid table going to achieve the same?
It depends. You cannot have automatic updates in the dashboard as shown; you could implement automatic refresh in the report. However, every report sends queries to the DirectQuery partition, potentially introducing DirectQuery latency to every report (where is your data source?) and creating a bigger workload on the data source (read: consider scalability for your data source).
Hybrid tables should be easier to implement if the data source fits well in the DirectQuery scenario and you are able to configure the partition wisely. Push datasets are a good alternative for data sources that would not perform well in DirectQuery and/or when you have transformations that don't comply with the folding requirements for hybrid tables.
As usual, every tool has pros and cons, having more options is better because you have more choices and depending on the scenario you can choose the best one.
Will this tool work with power bi report server?
No, push datasets only work in the Power BI service.
where did you get your chair ?
You can find it here: www.sqlbi.com/blog/alberto/2021/02/14/on-my-recording-gear/
Thank you very much! So it seems that it doesn't work very well to "update" data.
What do you mean?
@SQLBI I mean that it works really well when we are adding to the database, but it seems that when we need to update some figures it doesn't work. Is that right?
For the push dataset part, an update can be obtained by pushing a compensating transaction. The biggest issue is that you have to identify the change first. Once you know that a transaction changed from $10 to $8, you have to push a compensating transaction of $-2
Or, you can just ignore that, knowing that these adjustments will be reflected in the next daily refresh.
For example, in an implementation we made, we just ignore refunds made in the day, because they will appear the following day in the regular daily refresh.
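The compensating-transaction idea above can be illustrated with a tiny sketch (field names are hypothetical): build a new row that carries only the difference between the corrected and the original measure, keeping the same key.

```python
def compensating_row(original, corrected, key_fields, measure):
    """Build a row whose measure value offsets the old amount to the new one."""
    row = {f: original[f] for f in key_fields}
    row[measure] = corrected[measure] - original[measure]
    return row

# A $10 transaction later corrected to $8 becomes a -$2 adjustment row,
# which gets pushed like any other new row.
adj = compensating_row({"id": 42, "amount": 10.0},
                       {"id": 42, "amount": 8.0},
                       ["id"], "amount")
# adj == {"id": 42, "amount": -2.0}
```

The hard part, as noted above, is detecting the change in the first place; the arithmetic of the adjustment row is trivial.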
Another solution is to watch an FTP directory with a script that runs the same process that updates the data.
Is it just me, or are live dashboards really not worth the cost? I mean, when would you benefit from live dashboards in business over a regular report?
The "real-time" behavior could be good for a monitor that shows information updated in real time. We call them "vanity projects" as they do not have real business value, but they can be a motivational object. We've seen them used on a monitor in the hall of a company (or in private areas when there is sensitive company data!).
Perhaps when it gets to the level of investor meetings?
Other than that, as said, vanity.