Power Automate - Optimization Case 1

Performance considerations when loading data into Azure SQL with Power Automate.

The Problem

I was asked to review a reporting solution that was taking approximately 10 minutes to ingest and upload a single spreadsheet. The team found this level of performance unacceptable, especially when multiple individuals were loading spreadsheets at the same time.

Upon further review, I found that the team was using a Power Automate flow to ingest spreadsheets from a SharePoint folder, iterate over all the rows, then invoke an Azure SQL stored procedure against each row to load the data. Each spreadsheet consists of around a thousand rows and two dozen columns, mostly numbers with a few strings. For every row, the flow calls the Execute stored procedure (V2) action of the Microsoft SQL Server connector to upload that single row.
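
For illustration, the per-row procedure might look something like the sketch below. The procedure, table, and column names here are hypothetical stand-ins, not the team's actual schema (the real spreadsheet has around two dozen columns):

```sql
-- Hypothetical per-row procedure, invoked once per spreadsheet row
-- via Execute stored procedure (V2). Names are illustrative only.
CREATE OR ALTER PROCEDURE dbo.LoadSpreadsheetRow
    @RecordDate DATE,
    @Region     NVARCHAR(50),
    @Amount     DECIMAL(18, 2),
    @Quantity   INT
AS
BEGIN
    SET NOCOUNT ON;

    -- One network round trip from the flow for every inserted row.
    INSERT INTO dbo.ReportData (RecordDate, Region, Amount, Quantity)
    VALUES (@RecordDate, @Region, @Amount, @Quantity);
END;
```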

The inefficiency here stems from calling the stored procedure once per row. Every invocation of the connector makes a separate API call to the SQL service and inserts a single row. With each call averaging 0.3 s, this serial approach takes 5 minutes at a minimum, and often longer, since some calls may take 2 s or more due to external factors.
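
To put a number on that floor, with around a thousand rows per spreadsheet:

1,000 calls × 0.3 s/call = 300 s = 5 minutes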

In Search of a Solution

Apply to each loops in Power Automate have a feature called concurrency control. Enabling it and setting a degree of parallelism allows multiple iterations of the loop to execute concurrently, up to a maximum of 50. By setting this value to 10 concurrent iterations, I was able to bring ingestion down to under 2 minutes. Raising the value beyond 10 produced no further improvement in speed, probably due to some kind of rate limiting on the connector or elsewhere.

The improvement was substantial, but still far from optimal: inserting this quantity of data should take next to no time. Can we go even further? Yes we can.

Uploading this data one row at a time is slow. We can improve on this by uploading larger chunks of data, as long as we stay within the SQL Server connector limits. Instead of iterating, we take the entire content of the spreadsheet, convert it into a JSON array, and pass it to the connector in a single call. This requires modifying the Azure SQL stored procedure to accept the JSON as one parameter and process all of the objects it contains.
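
A minimal sketch of what the reworked procedure could look like, using OPENJSON (available in Azure SQL and SQL Server 2016+) to shred the payload into a rowset. The table and column names are the same hypothetical stand-ins as above:

```sql
-- Hypothetical batch-insert procedure: the whole spreadsheet arrives
-- as one JSON array, and the insert runs as a single set-based statement.
CREATE OR ALTER PROCEDURE dbo.LoadSpreadsheetRows
    @rows NVARCHAR(MAX)  -- entire spreadsheet as a JSON array of objects
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the JSON array into rows and insert them all at once.
    INSERT INTO dbo.ReportData (RecordDate, Region, Amount, Quantity)
    SELECT RecordDate, Region, Amount, Quantity
    FROM OPENJSON(@rows)
    WITH (
        RecordDate DATE           '$.recordDate',
        Region     NVARCHAR(50)   '$.region',
        Amount     DECIMAL(18, 2) '$.amount',
        Quantity   INT            '$.quantity'
    );
END;

-- Example call with a two-row payload:
-- EXEC dbo.LoadSpreadsheetRows @rows = N'[
--   {"recordDate":"2024-01-05","region":"West","amount":1234.50,"quantity":7},
--   {"recordDate":"2024-01-05","region":"East","amount":980.00,"quantity":3}
-- ]';
```

On the flow side, a Select data operation can shape the spreadsheet rows into the expected JSON array, which is then passed as the @rows parameter in a single Execute stored procedure (V2) call: one round trip instead of a thousand.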

The whole upload now takes 10 seconds, down from 10 minutes.

Diagram

A diagram comparing the original and optimized flow logic side by side.
