Save your flow run history to Azure Table Storage

Power Automate flow run history is retained for no more than 28 days in EMEA-located environments because of GDPR.

The General Data Protection Regulation (GDPR) requires run logs to be kept for no longer than 28 days. To maintain a longer history, you'll need to capture run histories yourself before they are deleted.

Information on the 28-day run history

With the custom connector for retrieving flow run history, it's now possible to automatically capture run histories before they are deleted. One option is to save your flow run history to Azure Table Storage.

I’ve created a table using the Create table action of the Azure Table Storage connector. The name of the table is based on the name of the environment in which the flow to be created and the target flows are running. Because of the restrictions on table names, the environment GUID itself can’t be used, so I use an expression to create a valid name, like: env55152933bb484026a392d374b7c9fb5b

concat('env',replace(workflow()['tags']['environmentName'],'-',''))
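In Python terms, this expression does the equivalent of the following (the GUID below is a hypothetical environment name, used only for illustration):

```python
# Hypothetical environment GUID; in the flow this comes from
# workflow()['tags']['environmentName'].
environment_name = "55152933-bb48-4026-a392-d374b7c9fb5b"

# concat('env', replace(environmentName, '-', ''))
# Prefixing with 'env' also satisfies the rule that table names
# must begin with a letter.
table_name = "env" + environment_name.replace("-", "")

print(table_name)  # env55152933bb484026a392d374b7c9fb5b
```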

The setup of the table storage logic is:

  • One storage table per environment
  • Partition key = flow name (id)
  • Row key = run name (id)
  • All properties are of type String, except startTime and endTime, which are of type DateTime.

Here is an overview of the flow. In this example I use a fixed flow name, but that can be made dynamic by using the List Flows as Admin action of the Power Automate Management connector to loop through all (or a selection of) flows in the specified environment.

The flow starts with a Recurrence trigger. The fixed flow name is stored in a variable. The environment name is derived from the current environment with the following expression:

workflow()['tags']['environmentName']

Use this JSON schema to parse the JSON of the Get Flow Runs output:

{
    "type": "object",
    "properties": {
        "value": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "name": {
                        "type": "string"
                    },
                    "id": {
                        "type": "string"
                    },
                    "type": {
                        "type": "string"
                    },
                    "properties": {
                        "type": "object",
                        "properties": {
                            "startTime": {
                                "type": "string"
                            },
                            "endTime": {
                                "type": "string"
                            },
                            "status": {
                                "type": "string"
                            },
                            "code": {
                                "type": "string"
                            },
                            "error": {
                                "type": "object",
                                "properties": {
                                    "code": {
                                        "type": "string"
                                    },
                                    "message": {
                                        "type": "string"
                                    }
                                }
                            },
                            "correlation": {
                                "type": "object",
                                "properties": {
                                    "clientTrackingId": {
                                        "type": "string"
                                    }
                                }
                            },
                            "trigger": {
                                "type": "object",
                                "properties": {
                                    "name": {
                                        "type": "string"
                                    },
                                    "startTime": {
                                        "type": "string"
                                    },
                                    "endTime": {
                                        "type": "string"
                                    },
                                    "scheduledTime": {
                                        "type": "string"
                                    },
                                    "originHistoryName": {
                                        "type": "string"
                                    },
                                    "correlation": {
                                        "type": "object",
                                        "properties": {
                                            "clientTrackingId": {
                                                "type": "string"
                                            }
                                        }
                                    },
                                    "code": {
                                        "type": "string"
                                    },
                                    "status": {
                                        "type": "string"
                                    }
                                }
                            }
                        }
                    }
                },
                "required": [
                    "name",
                    "id",
                    "type",
                    "properties"
                ]
            }
        }
    }
}

Then, for each flow run, the JSON for the table entity is composed from the input:

{
  "name": "@{items('Apply_to_each')?['name']}",
  "id": "@{items('Apply_to_each')?['id']}",
  "type": "@{items('Apply_to_each')?['type']}",
  "startTime@odata.type": "Edm.DateTime",
  "startTime": "@{items('Apply_to_each')?['properties']?['startTime']}",
  "endTime@odata.type": "Edm.DateTime",
  "endTime": "@{items('Apply_to_each')?['properties']?['endTime']}",
  "status": "@{items('Apply_to_each')?['properties']?['status']}",
  "code": "@{items('Apply_to_each')?['properties']?['code']}",
  "errorCode": "@{items('Apply_to_each')?['properties']?['error']?['code']}",
  "errorMessage": "@{items('Apply_to_each')?['properties']?['error']?['message']}",
  "clientTrackingId": "@{items('Apply_to_each')?['properties']?['correlation']?['clientTrackingId']}",
  "trigger": "@{items('Apply_to_each')?['properties']?['trigger']}",
  "PartitionKey": "@{variables('flowName')}",
  "RowKey": "@{items('Apply_to_each')?['name']}"
}
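The same mapping can be sketched in Python, assuming a run item shaped like the parsed JSON above (the field names match the schema; any sample values are hypothetical):

```python
import json

def run_to_entity(run: dict, flow_name: str) -> dict:
    """Map one flow run (shaped like the parsed JSON above) to a table entity.

    PartitionKey is the flow name (id) and RowKey is the run name (id),
    matching the table design described earlier.
    """
    props = run.get("properties", {})
    error = props.get("error") or {}
    correlation = props.get("correlation") or {}
    return {
        "name": run.get("name"),
        "id": run.get("id"),
        "type": run.get("type"),
        # The @odata.type annotations tell Table Storage these are DateTime,
        # not String, properties.
        "startTime@odata.type": "Edm.DateTime",
        "startTime": props.get("startTime"),
        "endTime@odata.type": "Edm.DateTime",
        "endTime": props.get("endTime"),
        "status": props.get("status"),
        "code": props.get("code"),
        "errorCode": error.get("code"),
        "errorMessage": error.get("message"),
        "clientTrackingId": correlation.get("clientTrackingId"),
        # The full trigger object is stored as one serialized string property.
        "trigger": json.dumps(props.get("trigger", {})),
        "PartitionKey": flow_name,
        "RowKey": run.get("name"),
    }
```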

The last action inserts the entity with the flow run data into the table, or replaces it if the entity already exists.

This flow will fill the storage table with the available run history of the targeted flow. To save on API requests, you could filter for the oldest runs, the ones that will be deleted within the next day or so, and save only those.
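That filter could look like the following Python sketch, assuming the 28-day retention described above (the retention and window parameters are assumptions you'd tune to your schedule):

```python
from datetime import datetime, timedelta, timezone

def runs_about_to_expire(runs, retention_days=28, window_days=2, now=None):
    """Keep only the runs whose startTime is within `window_days` of being
    deleted by the `retention_days` retention policy."""
    now = now or datetime.now(timezone.utc)
    oldest_kept = now - timedelta(days=retention_days)
    cutoff = oldest_kept + timedelta(days=window_days)
    expiring = []
    for run in runs:
        # startTime is ISO 8601 with a trailing 'Z', e.g. 2024-01-02T08:00:00Z.
        start = datetime.fromisoformat(
            run["properties"]["startTime"].replace("Z", "+00:00")
        )
        if start <= cutoff:
            expiring.append(run)
    return expiring
```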

You can query the tables and entities by carefully constructing a request URI to get the run history from Azure Table Storage, or by using the actions of the Azure Table Storage connector.
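As an illustration of such a request URI, the sketch below builds a Query Entities URI with an OData filter on the partition key. The storage account name, table name, and flow id are hypothetical placeholders, and a real request would additionally need authentication (for example a SAS token or Shared Key header):

```python
from urllib.parse import quote

# Hypothetical names; substitute your own storage account, table, and flow id.
account = "mystorageaccount"
table = "env55152933bb484026a392d374b7c9fb5b"
flow_id = "0f52ad98-0000-0000-0000-000000000000"

# OData filter: all runs of one flow (PartitionKey = flow name/id).
odata_filter = f"PartitionKey eq '{flow_id}'"

# Query Entities URI: https://<account>.table.core.windows.net/<table>()?$filter=...
uri = (
    f"https://{account}.table.core.windows.net/{table}()"
    f"?$filter={quote(odata_filter)}"
)
print(uri)
```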

4 comments

  1. Thanks for the article Stefan. In the example you are working with a fixed flow name from one environment though you say iteration is possible. I’m trying to iterate through all flows in an environment, even ones which I do not own. Even though I have a Power Admin account, this is still not enough to retrieve the Run History of flows from other users and fails with permission error: “The caller with object id [account id] does not have permission for connection [flow GUID] under Api ‘shared_logicflows’” Do you know if above is possible at all?

  2. Thanks for the article. Strangely we are working in the US GCC cloud and seeing this happen. Why would GDPR apply to US datacenters? Or is this a bug and we need to open a support ticket with MS?

    1. GDPR applies if you have any data belonging to an EU citizen regardless of the location of the data centre.
