Send to Power BI using Chunk Loop. Microsoft Authentication Token Timeout.

Hi
I am trying to load a dataset of 800k–1M rows. I am using the "Chunk Loop" node to work around the issue of "exceeding the 1 million row limit within 60 minutes."

My loop has a timer that waits 60 minutes before continuing the load, but now I get another error: “Try-Catch block: Error: Access token has expired, resubmit with a new access token, Code: TokenExpired”

Any ideas how to refresh the "automatic" Microsoft authentication token within the loop?
The alternative "manual" scenario would be to at least capture the last row that was loaded into the dataset, then start again and continue from the point where the previous run finished.
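For reference, what I am aiming for is roughly the sketch below: acquire a fresh token at the top of every chunk iteration instead of reusing the one from the start of the run. This is only a sketch and assumes a Python Script node, the msal package, and an Azure AD app registration with client-credentials access to the Power BI REST API; all IDs are placeholders.

```python
# Sketch only: refresh the Azure AD access token on every chunk iteration.
# Assumes the "msal" package and an Azure AD app registration with access
# to the Power BI REST API; all IDs below are placeholders.
import msal

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

def get_token() -> str:
    """Return a valid access token for the Power BI REST API.

    Recent msal versions cache the client-credentials token internally and
    only go back to the network once it nears expiry, so calling this at the
    top of every loop iteration should avoid the "TokenExpired" error.
    """
    result = app.acquire_token_for_client(
        scopes=["https://analysis.windows.net/powerbi/api/.default"]
    )
    if "access_token" not in result:
        raise RuntimeError(f"Token request failed: {result.get('error_description')}")
    return result["access_token"]
```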

Hi
Where is this limitation coming from? Power BI? The API?
Any reason you need/want to send it directly to the Power BI Service instead of writing it to e.g. a DB and then having a scheduled refresh of the PBI dataset via the Service?
br

The limitation comes from the API:
Power BI REST APIs push semantic model limitations - Power BI | Microsoft Learn
I linked the table in a Power BI .pbix file, then did a couple of final transformations, some charts, etc., and published that in the Power BI App.
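In case it helps, the throttling I have in mind looks roughly like the sketch below. If I read the linked page correctly, a single POST can carry at most 10,000 rows and a dataset accepts roughly 1,000,000 added rows per hour, so the loop sleeps once it approaches the hourly allowance. It assumes the rows arrive as a list of dicts, a get_token() helper like the one in my first post, placeholder dataset/table IDs, and a dataset in "My workspace" (a group workspace would need the /groups/{groupId}/ path).

```python
# Rough throttling sketch for the push-rows endpoint; not a definitive implementation.
import time
import requests

DATASET_ID = "<dataset-id>"
TABLE_NAME = "<table-name>"

ROWS_URL = (f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"
            f"/tables/{TABLE_NAME}/rows")

MAX_ROWS_PER_POST = 10_000      # per-request limit (per the linked page, as I read it)
MAX_ROWS_PER_HOUR = 1_000_000   # per-dataset hourly limit from the linked page

def push_all(all_rows, get_token):
    """Push all_rows (list of dicts) in chunks, pausing to respect the hourly limit."""
    pushed_this_hour = 0
    hour_started = time.monotonic()
    for start in range(0, len(all_rows), MAX_ROWS_PER_POST):
        chunk = all_rows[start:start + MAX_ROWS_PER_POST]
        # If this chunk would exceed the hourly allowance, wait for the window to reset.
        if pushed_this_hour + len(chunk) > MAX_ROWS_PER_HOUR:
            time.sleep(max(0, 3600 - (time.monotonic() - hour_started)))
            pushed_this_hour, hour_started = 0, time.monotonic()
        resp = requests.post(
            ROWS_URL,
            headers={"Authorization": f"Bearer {get_token()}"},  # fresh token per chunk
            json={"rows": chunk},
        )
        resp.raise_for_status()
        pushed_this_hour += len(chunk)
```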

I want to take this to the next level. A DB, you say? Hmm, any proposal? I would like it NOT to be stored in a local file. I want to place my KNIME workflow on the KNIME Hub and start using scheduled automatic refreshing.
I am also looking for a way to update the dataset in Power BI with only the new rows created, let's say, within the current month. I don't want to reload the complete dataset. That seems like a leaner, faster approach to me.
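What I picture for the incremental part is roughly this: filter the KNIME output to rows created in the current month and append only those via the same push helper. A sketch only, assuming a pandas DataFrame with a made-up "created" timestamp column and the push_all()/get_token() helpers from above.

```python
# Sketch of the incremental idea: append only rows created in the current month.
import pandas as pd

def current_month_rows(df: pd.DataFrame) -> list[dict]:
    """Return the current month's rows as JSON-friendly records for the rows POST body."""
    now = pd.Timestamp.now()
    mask = (df["created"].dt.year == now.year) & (df["created"].dt.month == now.month)
    new_rows = df.loc[mask].copy()
    # Serialize the timestamp column so the payload is valid JSON.
    new_rows["created"] = new_rows["created"].dt.strftime("%Y-%m-%dT%H:%M:%S")
    return new_rows.to_dict(orient="records")

# Usage (with the helpers sketched earlier):
# push_all(current_month_rows(df), get_token)   # append only the new rows
```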

I would appreciate your feedback! Thank you.

Could you do the transformations in Power Query? The question to me is whether KNIME is required at all.
br
