Nov 22, 2025

A way to process more than 5,000 records in a Cloud Flow execution

If you need to process a large number of records, for example in Dataverse, a common approach is to create a scheduled Cloud Flow (Power Automate). If the processing logic is complex, we can encapsulate it in a Custom API and call it from the Cloud Flow.

To fetch the records to process, you use the List Rows action of the Dataverse connector, which returns at most 5,000 records at once. This is a limitation. To overcome it, we can use the trick below.

1) Introduce a Boolean variable
2) Create a Do Until loop that runs until this variable is False
3) Inside the loop
    - Add a List Rows action to retrieve a batch of records
    - Set the variable to True if List Rows returns a non-zero number of records, and to False otherwise so the loop can exit
4) Still inside the loop, add a For Each (Apply to each) control to call a bound Custom API operation against each record in the batch (a sketch of this pattern follows the list).
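
For illustration only, here is the same pattern sketched in Python against the Dataverse Web API instead of the flow designer. The table name (new_workitems), primary key column (new_workitemid), flag column (new_isprocessed) and bound Custom API name (new_ProcessWorkItem) are hypothetical placeholders, and a real OAuth access token would be required.

import requests

# Hypothetical values - replace with your environment URL, table, flag column,
# Custom API name and a valid OAuth access token.
ORG_URL = "https://yourorg.crm.dynamics.com"
API = ORG_URL + "/api/data/v9.2"
TABLE = "new_workitems"             # entity set name of the table to process
FLAG = "new_isprocessed"            # Boolean flag column used as the fetch condition
CUSTOM_API = "new_ProcessWorkItem"  # bound Custom API that processes one record
TOKEN = "<access token>"

HEADERS = {
    "Authorization": "Bearer " + TOKEN,
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

def fetch_batch():
    # Equivalent of the List Rows action: up to 5,000 records that still
    # carry the "not processed" flag.
    resp = requests.get(
        API + "/" + TABLE,
        headers=HEADERS,
        params={
            "$select": "new_workitemid",
            "$filter": FLAG + " eq false",
            "$top": "5000",
        },
    )
    resp.raise_for_status()
    return resp.json()["value"]

def process_record(record_id):
    # Equivalent of calling the bound Custom API against one record.
    resp = requests.post(
        API + "/" + TABLE + "(" + record_id + ")/Microsoft.Dynamics.CRM." + CUSTOM_API,
        headers={**HEADERS, "Content-Type": "application/json"},
        json={},
    )
    resp.raise_for_status()

# Equivalent of the Do Until loop: keep fetching while batches are non-empty.
more_records = True
while more_records:
    batch = fetch_batch()
    more_records = len(batch) > 0        # the Boolean variable from step 1
    for row in batch:                    # the For Each / Apply to each control
        process_record(row["new_workitemid"])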


Caution

1) The List Rows fetch should contain a filter condition that identifies the records to be processed (perhaps a Boolean flag column).
2) Within the Custom API, it is important to change this flag value once a record has been processed, whether the processing succeeds or fails (see the sketch after this list).
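
The real Custom API would normally be implemented as a plug-in, but the essential rule (flip the flag whether processing succeeds or fails) can be sketched as follows. The names (new_workitems, new_isprocessed) and the do_the_actual_work helper are hypothetical, and the Web API call here merely illustrates the update the plug-in would perform.

import requests

def do_the_actual_work(record_id):
    # Placeholder for the business logic performed by the Custom API; may raise.
    pass

def process_and_mark(record_id, headers,
                     api="https://yourorg.crm.dynamics.com/api/data/v9.2"):
    try:
        do_the_actual_work(record_id)
    finally:
        # Flip the flag whether the work succeeded or failed, so the next
        # List Rows fetch (filtered on "new_isprocessed eq false") no longer
        # returns this record.
        resp = requests.patch(
            api + "/new_workitems(" + record_id + ")",
            headers={**headers, "Content-Type": "application/json"},
            json={"new_isprocessed": True},
        )
        resp.raise_for_status()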

This way, any record that has been processed is omitted from the next fetch, so each record is processed only once. Otherwise, the flow will fall into an infinite loop, which could cause issues.

That said, Cloud Flows can tolerate badly designed loops to an extent, since a Do Until will stop at its default maximum number of iterations, but this should still be avoided.
