I'm currently exporting a SharePoint list to a CSV file on my local machine, converting it into an array of [PSCustomObject] with PowerShell, and then manually pasting that data into my Azure Function App. I'm looking for a way to automate this process and avoid the manual updates. Ideally, I want a cached copy of the data that refreshes every four hours, rather than having the function pull the latest data from SharePoint on every run. I initially thought about dot-sourcing the data into the function app, but that seems complicated. I've also considered loading the CSV from an Azure Storage account and updating that file separately. However, I'm concerned about multiple function app executions trying to download the same file at once. Any thoughts on how I can achieve this?
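The storage-account idea you describe could be split into a small timer-triggered function that refreshes the CSV every four hours, so the blob always holds a recent snapshot. Here is a minimal PowerShell sketch of that refresh function, assuming the PnP.PowerShell and Az.Storage modules are loaded; the site URL, list name, container name, and app settings (`SharePointSiteUrl`, `StorageConnectionString`) are hypothetical placeholders, not real values from your setup:

```powershell
# run.ps1 for a timer-triggered function
# (function.json schedule "0 0 */4 * * *" = every four hours)
param($Timer)

# Assumes the function app's managed identity has access to the site
Connect-PnPOnline -Url $env:SharePointSiteUrl -ManagedIdentity

# Flatten the list items into PSCustomObjects, mirroring the manual export
$items = Get-PnPListItem -List 'MyList' -PageSize 500 | ForEach-Object {
    [PSCustomObject]@{
        Id    = $_.Id
        Title = $_['Title']
        # ...other columns as needed
    }
}

# Write the snapshot to a temp file, then overwrite the cached blob
$tmp = Join-Path $env:TEMP 'MyList.csv'
$items | Export-Csv -Path $tmp -NoTypeInformation

$ctx = New-AzStorageContext -ConnectionString $env:StorageConnectionString
Set-AzStorageBlobContent -File $tmp -Container 'cache' -Blob 'MyList.csv' `
    -Context $ctx -Force
```

On the concurrency worry: multiple executions *reading* the same blob at once is harmless; only the single timer function ever writes it, so there is no contention to manage.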
1 Answer
Why not set up a Logic App that triggers the function whenever an item in the SharePoint list is created or modified? That way your function is notified about updates and grabs the latest data only when something has actually changed, rather than fetching it every time.
I understand that idea, but I forgot to mention that my Function App is triggered by blob events. It needs the SharePoint list data to generate its output, so whenever it runs, I still need reasonably current information. But I don't want it to reload from SharePoint on every trigger; I need the updated data to be made available on a schedule instead.
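Given that constraint, the blob-triggered function could read the scheduled CSV snapshot from storage and memoize it in process, so warm invocations skip the download entirely. A rough sketch, assuming the CSV is maintained separately in a `cache` container and that `StorageConnectionString` is a hypothetical app setting (note: the PowerShell worker reuses its runspace between invocations, which is why a `$global:` variable can survive across warm runs, but this is best-effort caching, not a guarantee):

```powershell
# run.ps1 for the blob-triggered function
param([byte[]] $InputBlob, $TriggerMetadata)

function Get-CachedListData {
    # $global:ListCache can persist across invocations on a warm worker;
    # refresh it when it is missing or older than four hours.
    $maxAge = New-TimeSpan -Hours 4
    if ($global:ListCache -and ((Get-Date) - $global:ListCacheTime) -lt $maxAge) {
        return $global:ListCache
    }

    $ctx = New-AzStorageContext -ConnectionString $env:StorageConnectionString
    $tmp = Join-Path $env:TEMP 'MyList.csv'
    Get-AzStorageBlobContent -Container 'cache' -Blob 'MyList.csv' `
        -Destination $tmp -Context $ctx -Force | Out-Null

    $global:ListCache     = Import-Csv -Path $tmp
    $global:ListCacheTime = Get-Date
    return $global:ListCache
}

$listData = Get-CachedListData
# ...use $listData together with $InputBlob to generate the output...
```

If several cold instances start simultaneously, each just downloads its own copy of the blob; concurrent reads don't conflict, so the worst case is a little redundant network traffic rather than corruption.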