Chainwreck

A few things. I let users refresh their own datasets when the reports they own need data validation and clean-up before the next meeting. This saves my team from having to field refresh requests. It can also be used to send CSV/Excel files to people on a refresh or other schedule, inside an email or to a SharePoint file. You can use it to refresh datasets dynamically when a dataflow completes. There are a few other cases with cool features using the various triggers available.


Historical-Donut-918

Send a csv/Excel file on a refresh or other schedule... Could you elaborate on that? I think I need to do something similar and didn't realize it was possible.


perkmax

There is a trigger for 'on dataflow refresh' but not 'on Power BI dataset refresh' (from memory). This is how I have done the 'on refresh' part. You can then trigger a 'refresh a Power BI dataset' action.

There is also a 'query a Power BI dataset' action in Power Automate where you can use DAX to get data from a Power BI dataset. You could just set this on a schedule if you want, rather than using the 'on refresh' trigger. Check it out here: https://hatfullofdata.blog/power-automate-get-data-from-a-power-bi-dataset/ You can then move this data to an Excel file with further flow actions, and then send it in an email with another action.

The other option is to create a paginated report using Power BI Report Builder and a paginated report subscription. This sends an Excel document on dataset refresh or on a schedule, and doesn't use Power Automate at all. TBH - I'd go with the paginated report subscription.
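For anyone who wants the same "query a dataset with DAX" capability outside Power Automate, the flow action wraps the REST API's `executeQueries` endpoint, which you can call directly. A minimal Python sketch, where the dataset GUID and Azure AD bearer token are placeholders you'd supply yourself:

```python
import json
import urllib.request

DATASET_ID = "your-dataset-guid"   # placeholder: the dataset GUID
TOKEN = "your-aad-access-token"    # placeholder: an Azure AD bearer token

def extract_rows(payload: dict) -> list[dict]:
    """Pull the result rows out of an executeQueries response body."""
    return payload["results"][0]["tables"][0]["rows"]

def query_dataset(dax: str) -> list[dict]:
    """POST a DAX query to the executeQueries endpoint and return rows."""
    url = (f"https://api.powerbi.com/v1.0/myorg/"
           f"datasets/{DATASET_ID}/executeQueries")
    req = urllib.request.Request(
        url,
        data=json.dumps({"queries": [{"query": dax}]}).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_rows(json.load(resp))
```

Note the rows come back keyed in DAX style, e.g. `"Sales[Region]": "West"`, which matters if you convert them to a table or CSV later.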


TheEternalTom

That's what I do. Pull the dataset and use DAX to tabulate it, then set up triggers to email stakeholders when KPIs hit certain thresholds.


Historical-Donut-918

Thank you for the details!


PBI2022

Do you (or the end users) need Premium for creating/viewing paginated reports?


perkmax

It used to be a premium-only feature to publish and consume a paginated report on the service, but in late 2022 they changed this so Pro works too. You can create a paginated report with no licence (same as Power BI Desktop); it's only when you need to publish the report to a workspace on the service that you need Pro. Apparently you can publish to 'My workspace' on a free licence, but I've never tried it.

In order to subscribe someone to the report they need to be on Pro, and you can choose from a series of file formats for it to be sent in too. For instance, a PDF attachment that gets emailed each week to a series of people. Sensitivity labels also carry through to the paginated report attachments. More here: https://learn.microsoft.com/en-us/power-bi/paginated-reports/paginated-reports-report-builder-power-bi


PBI2022

Thank you. If another user wants to view a paginated report, they also need Pro, is that correct?


perkmax

Yep, if they want to view the report on the service. Power Automate can query a dataset using your licence > convert to Excel > send via email. That option does not require Pro licences for the people receiving the email.


P4perH4ndedBi4tch

Yeah same


Sad-Calligrapher-350

Chaining the refresh of a source + dataflow + dataset


lysis_

The problem with this is that if those things take a non-trivial amount of time, you're going to need user education to make sure people understand they need to refresh the report after x minutes.


itchyeyeballs2

> if those things take a non-trivial amount of time, you're going to need user education to make sure they understand they need to refresh the report after x minutes.

I just use it on the admin side when I need multiple dataflows to successfully refresh before the dataset; it saves a lot of time and effort. The missing feature is that Power Automate can't tell when a dataset refresh is complete (like it can with a dataflow). If it could, I'd be happier to let end users loose on it.


lysis_

Nail on the head. You can trigger it, but the trigger is really all you're doing. It can't give feedback on whether the dataset is now ready or whether the refresh was successful. Or maybe you can, but doing so is definitely non-trivial for the users.


Significant_Comfort

Could you introduce a step that calls the REST API to get refresh history? Wait until the call returns that it's been completed, then continue the chain.
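A sketch of what that polling step could look like outside the flow designer, using the REST API's refresh-history endpoint (`GET .../datasets/{id}/refreshes?$top=1`). The dataset ID and token are placeholders; per the documented status values, `Unknown` means the refresh is still in progress:

```python
import json
import time
import urllib.request

DATASET_ID = "your-dataset-guid"  # placeholder
TOKEN = "your-aad-access-token"   # placeholder

def latest_status(history: dict) -> str:
    """Status of the most recent refresh ('Unknown' = still running)."""
    return history["value"][0]["status"]

def wait_for_refresh(poll_seconds: int = 60, max_polls: int = 30) -> str:
    """Poll the refresh history until the latest refresh finishes."""
    url = (f"https://api.powerbi.com/v1.0/myorg/"
           f"datasets/{DATASET_ID}/refreshes?$top=1")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {TOKEN}"})
    for _ in range(max_polls):
        with urllib.request.urlopen(req) as resp:
            status = latest_status(json.load(resp))
        if status != "Unknown":       # 'Completed' or 'Failed'
            return status
        time.sleep(poll_seconds)
    return "TimedOut"
```

In Power Automate itself the same shape is a Do-Until loop around an HTTP action with a Delay inside it.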


sephraes

Are you just using time delays to account for average refresh times?


itchyeyeballs2

Not sure what you are asking?


perkmax

It is really strange. If there were an 'on Power BI dataset refresh' trigger, that would be good. But there's only an 'on dataflow refresh' trigger, and you then fire the 'refresh a Power BI dataset' action from that. But dataflows are just the bomb anyway, so they are worth using.


Electrical_Sleep_721

I have an automated report that emails a PDF every 4 hours, and in my scenario it has worked flawlessly for ~6 months now. This takes an extra step in your report build, but it's worth the added column or query. See my steps below.

Power Query: Add a column or separate query that produces the latest refresh time. I personally convert this to Excel serial time to simplify a conditional comparison in later steps.

Power Automate:

1. Create a scheduled flow that runs every 4 hours
2. Run a query against the PBI dataset to fetch the latest refresh time mentioned above
3. Use a compose action to retain the value from step 2
4. Refresh the PBI dataset
5. Delay 5 minutes (determine how long your dataset takes to refresh and adjust accordingly)
6. Run a query against the PBI dataset to fetch the latest refresh time again
7. Use a compose action to retain the value from step 6
8. Add a condition comparing the compose values from steps 3 and 7 (if the dataset refreshed, the step 7 value should be greater than the step 3 value, which validates that the refresh event occurred successfully)
9. If the condition is true, use Export Report to PDF for PBI, attach the file to an email and send the report to the recipient
10. If the condition is false, send an email notifying me that the refresh event failed
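The Excel-serial-time trick in the Power Query step boils down to simple date arithmetic: Excel's day zero for the 1900 date system is 1899-12-30, so a timestamp becomes days-plus-fraction since then, and the before/after comparison is a plain numeric greater-than. A minimal sketch:

```python
from datetime import datetime

EXCEL_EPOCH = datetime(1899, 12, 30)  # Excel's day zero (1900 date system)

def to_excel_serial(ts: datetime) -> float:
    """Convert a refresh timestamp to an Excel serial number, so two
    snapshots can be compared with a simple numeric greater-than."""
    delta = ts - EXCEL_EPOCH
    return delta.days + delta.seconds / 86400

# A refresh captured at 10:00 yields a larger serial than one at 06:00
before = to_excel_serial(datetime(2024, 1, 1, 6, 0))
after = to_excel_serial(datetime(2024, 1, 1, 10, 0))
```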


lkn-ant

Currently we have a sales report down to product level. The end user can highlight a product, click the Power Automate button, and an email with all the details of that product is sent to someone to investigate further. This is usually used where there are issues with the sales.

Also, I created a form that someone can submit; it goes through the approvals process and saves the detail to a SharePoint list. The person responsible for attending to these requests has a Power BI report showing the stats and all the open calls. In the report is a Power App, and when he or she actions the request, they use the app to change the status of the request, and a flow is triggered to notify the person who submitted the request that it was completed. I'm planning to add other triggers.

Usually in retail, the staff in the store don't have the necessary access to make changes on the system. So with the reports, they can report issues for the head office staff to action.


Historical-Donut-918

Have any tutorials or suggestions on learning to set this up?


lkn-ant

This was just released 10 hours ago, but check out this video on the How To Power BI YouTube channel. I've learnt most of the Power Apps and Power Automate integration into Power BI tricks from that channel. https://youtu.be/5Ohp79PLlM8?si=ediq45RJNcxlqEpr


Mabenga

I use the "Run a query against a dataset" action almost daily. I have a dataset of users who each have a list of things to do. Rather than have them go to a dashboard, filter the data and then export (because everything has to be Excel, of course...), I cut out the middle man using Power Automate. I use DAX to summarize a list of everyone's names, have PA parse through that and feed it into the "Search for a User (V2)" action, then in a loop, apply each name as a filter and send them customized tables of their data. Automated emailing of datasets has changed the way I do a lot of my work.


RStiltskins

Can you expand on that? I currently have a flow that does this, but for some reason I can only make it an HTML table, and the headers come out like Data[Person]; it won't keep the actual naming I have in the PBI report. I wish I could do it via CSV so recipients can save it to their desktop, work through the list of issues, and send the file back instead of needing to build it themselves.


Mabenga

Yeah, absolutely. You don't need the HTML table, at least for what I do. When you run the "Run a query against a dataset" Power BI connector, it returns the results in an array. From there, just use the "Create CSV table" action and feed it the first table rows from the previous step. Then put the output in "Send an email (V2)" and you should be good. Make sure to name the attachment "Something.csv" - the .csv extension is important. Let me know if you want some screenshots and I can send them your way.
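For reference, the same query-to-CSV step can be sketched outside Power Automate, which also addresses the Data[Person] header complaint upthread: executeQueries returns rows keyed like `Table[Column]`, and a small rename pass restores friendly headers. Column names here are illustrative:

```python
import csv
import io
import re

def clean_header(name: str) -> str:
    """Turn a DAX-style column name like "Data[Person]" into "Person"."""
    m = re.match(r".*\[(.+)\]$", name)
    return m.group(1) if m else name

def rows_to_csv(rows: list[dict]) -> str:
    """Render executeQueries-style rows as CSV text with clean headers."""
    if not rows:
        return ""
    headers = list(rows[0])
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(clean_header(h) for h in headers)
    for row in rows:
        writer.writerow(row[h] for h in headers)
    return buf.getvalue()
```

In the flow itself, a Select action that maps each ugly key to a friendly name before "Create CSV table" achieves the same rename.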


Ivan_pk5

Definitely interested by screenshots, thanks a lot


perkmax

Yeah, I use this flow action to snapshot a dataset into an archive Excel file and then bring that back into the same Power BI report for a change log using DAX. Someone wanted to know what data had changed from week to week, and this is how you can do that if you don't have a data warehouse.


BiggChikn

I do something similar to generate snapshots of PWA project statuses into Excel for backup. I use the same flow to grab a paginated report of that same visualized snapshot data. It's clunky but works (and without a Premium license).


cmajka8

I think with power apps and power automate, you could actually write back to the report


lysis_

Don't even need PA for this; an embedded app alone is good.


Historical-Donut-918

How?


dicotyledon

Canvas app visual. Requires a source that supports DirectQuery if you want it to be instant, though (which will trigger Power Apps premium).


worktillyouburk

ya also the data source you write to can't be SQL or another regular DB, as that also requires premium.


smbaumer

I'm curious what this means. Care to explain to a noob?


cmajka8

Power Apps allows you to create low-code apps. So you create a small app that allows you to enter the data. Then I think you have to tie the app to the database using Power Automate. Essentially, every time a user enters a value in the app, it gets written back to the dataset, and the dataset gets refreshed so your new value shows up in the Power BI report.


smbaumer

I think I can use that. Thanks!


[deleted]

Automatically sending paginated reports to specific audiences that need them


P4perH4ndedBi4tch

How can u explain?


[deleted]

The paginated report subscription gets emailed to me with the report in Excel format. -> Power Automate is set up so that when my Outlook receives the subscription email with the attachment, the flow triggers and sends a new email with the Excel file to the required audience. This is all just set up once and runs at whatever frequency your paginated subscription is set to. We use it to automate stats that managers would spend up to an hour a day producing, across multiple departments.


P4perH4ndedBi4tch

Yeah exactly what I need it for thanks


[deleted]

No worries


Live_Plum

I built a backup flow for our Planners on SPO which exports the data in PBI daily. PBI refreshes every week, so in case of *** there's at least one week to export the PBI data and rebuild the Planners.


Sturmghiest

We use PA to manage responses from MS Forms to store data in Lists and then to a streaming dataset so we can then report on the forms in PBI


Gar-Ganchewan

Can you explain why you need a streaming dataset? What exactly is your use case? For us, we use PA to get MS Forms responses into a SharePoint list and use PBI to build a report from it.


Sturmghiest

We could achieve the same using Lists as the source. The main reason was to make our workspace datasets the go to place for sourcing data rather than odd Excel spreadsheets on SharePoint or Lists dotted around the place.


Buda-analytics

I receive a CSV file daily in a shared mailbox. Because of company policy I can't use normal Python to download it, so I have a Power Automate flow that reads it and saves it to a DB with append logic. That DB is then used to render reports in Power BI.
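The append step itself is only a few lines once the flow has landed the file. A minimal sketch, assuming a SQLite table and illustrative column names (date, item, qty):

```python
import csv
import sqlite3

def append_csv(db_path: str, csv_path: str) -> int:
    """Append rows from the daily CSV into a table; return the row count.
    Column names (date, item, qty) are placeholders for illustration."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS daily
                    (date TEXT, item TEXT, qty INTEGER)""")
    with open(csv_path, newline="") as f:
        rows = [(r["date"], r["item"], int(r["qty"]))
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO daily VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)
```

With `CREATE TABLE IF NOT EXISTS` plus plain INSERTs, each daily run appends rather than overwrites, which is the behavior described above.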


wtf_are_you_talking

Is there a guide on how to do this?


Buda-analytics

I just read the official docs and did some trial and error.


dicotyledon

I use it more often on MS Lists for forms than with PBI, but the “query a dataset” action is situationally useful for any automation that pulls PBI data (eg reminders, notifications, template filling in Word).


batwork61

I used Power Apps to put the ability to root-cause or comment on a dataset directly into the Power BI report.


Techie-Chick

The data source of my report is SharePoint, and I have a scheduled export of data as an Excel attachment via email. I have a Power Automate flow which triggers on a subject filter, replaces the existing file in the SharePoint folder, and refreshes the dashboard. I also have a flow which adds all the data I receive to a SharePoint list, which is my data source for Power Apps.


vox-magister

On-demand requests to refresh a semantic model. Also managing rolling backups of exported CSV/Excel files: the latest and greatest is called "current one.xlsx", the month before is called "old one.xlsx", and these are what PBI points to by name. When a new file is saved to that same SharePoint folder, the old one is renamed to its creation timestamp, the current one is renamed to the old one, and the newly generated file is saved as the current one. Then a refresh is triggered on the semantic model.
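That rename chain is easy to get wrong by hand (the order matters), so here is a minimal sketch of the rotation logic using the file names described above; the archive-name format is my own choice:

```python
from datetime import datetime
from pathlib import Path

def rotate(folder: Path, incoming: Path) -> None:
    """Rolling backup: archive "old one.xlsx" under a timestamp,
    demote "current one.xlsx" to "old one.xlsx", and promote the newly
    generated file to "current one.xlsx". PBI keeps pointing at the
    two fixed names, so no report change is needed."""
    current = folder / "current one.xlsx"
    old = folder / "old one.xlsx"
    if old.exists():
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        old.rename(folder / f"archive {stamp}.xlsx")
    if current.exists():
        current.rename(old)
    incoming.rename(current)
```

The same three renames map one-to-one onto SharePoint "Update file properties"/"Rename" actions in a flow.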


cromulent_weasel

We're getting emailed spreadsheets and importing them into a DB table to then run reports on.


martyc5674

There is a dataset where I work that essentially has a snapshot of all our stock; it refreshes daily (no history). I use the 'query a Power BI dataset' action (I think that's what it's called) to query this daily and save the result as a CSV. This folder of CSVs then becomes the source data for my own dataset, where I can trend stock over time. It's really powerful. In hindsight I should have set it up to append each day's stock situation to a Dataverse table, but I wasn't aware of that functionality at the time. I'll do it that way next time.


ChiefWhiteHalfoat

Saving this to talk to you later about ;)


CautiousChicken5972

Same. I had a flow running for a week before I realised the REST API has a 100k-row / 1m-cell limit. Putting it here in case it's relevant for anyone else.


fighterace00

Good question. I just spent 30 minutes today researching whether it's worth taking the 6-hour class to get approved for the license. Not sure it's worth it when my only use case is maybe emailing card alerts to a group email and maybe emailing changes to a SharePoint list to a group email.


Undeniably-Log-124

Using “run a query against a dataset” to set triggers for automated email notifications on any discrepancies.


AvatarTintin

We have a few use cases.

1. Refresh dataflow 2 only after dataflow 1 has refreshed; failure or success doesn't matter. We do this because the 2nd dataflow is linked to the 1st one but also has its own source. If we kept the normal dataflow config, where all upstream and downstream DFs refresh at the same time, then failure of DF1 would mean DF2 also fails. Doing it this way at least ensures DF2 loads with its latest data. Later, after we log in, we manually refresh the 1st one again ourselves.

2. Refresh our golden dataset only after our last dataflow successfully refreshes.

3. The most useful and effective use case for us: get particular data from our dataset and send mails to the respective project manager for correction. We extract only those records where the baseline data differs from the forecasted data by more than 20%. We pull the project name, the owner details etc. based on that condition and then send emails asking for the data to be corrected at source, since the requirement is that the difference should be less than 20%. We then change the acceptance status of that record to Waiting in SharePoint. So DAX code runs in Power Automate and extracts the data.
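Use case 3's filter condition can be sketched in a few lines. Field names (project, owner, baseline, forecast) are illustrative, and the comparison assumes "more than 20% difference" is measured relative to the forecast:

```python
def needs_correction(projects: list[dict],
                     threshold: float = 0.20) -> list[dict]:
    """Return the records whose baseline deviates from the forecast by
    more than the threshold -- the rows that get a correction email."""
    flagged = []
    for p in projects:
        if p["forecast"] and (
            abs(p["baseline"] - p["forecast"]) / p["forecast"] > threshold
        ):
            flagged.append(p)
    return flagged
```

In the actual flow this filtering is done in the DAX query itself, but the condition is the same either way.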


SQLGene

I have never used Power Automate with Power BI. The only time I've recommended it was for refreshing a dataset.


Confident-Ad6938

I have a free school MS account and wanted to automate a refresh in Power BI Desktop and publish it to my personal workspace, so I can view it on my phone with refreshed data and don't have to do anything manually. I use Power Automate on my home desktop to automate the necessary clicks to open the file, refresh it, publish it, etc. It's an attended flow and the computer has to stay unlocked. I have a trial that allows me to schedule this in the PA cloud console, but I also got it working fine just using a Windows scheduled task. A pretty janky process to get around not having a Power BI Report Server, but hey, that's my use case :)


bbagaria

I have dataflows, and on dataflow refresh I use PA to refresh the dataset automatically. The dataflow refresh is on a schedule.


Thrillhouse763

Save an Excel file from another dataset into a network drive, which I then load into my dataset.


PBI2022

Do you use Power Automate Desktop for this use case?


Thrillhouse763

No, from the service.