I'm currently exporting a SharePoint list to a CSV file on my local machine, converting it into an array of [PSCustomObject] with PowerShell, and then manually pasting that data into my Azure Function App. I'm looking for a way to automate this and avoid the manual updates. Ideally, I want a cached copy of the data that refreshes every four hours, rather than the function pulling the latest data from SharePoint on every run. I initially thought about dot-sourcing the data in the function app, but that seems complicated. I've also considered loading the CSV from an Azure Storage account and updating that file separately; however, I'm concerned about multiple function executions trying to download the same file at once. Any thoughts on how I can achieve this?
3 Answers
If the function doesn't need fresh data on every run, could you retrieve it from SharePoint only when something changes? Another option is to store the data in Azure Blob Storage or Table Storage and have a separate timer-triggered function repopulate it periodically.
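As a rough sketch of the second option: a timer-triggered PowerShell function that re-exports the list to a cached blob every four hours. This assumes the PnP.PowerShell and Az.Storage modules are loaded via the function app's requirements.psd1, and that the site URL, list name, storage account, and field names below are placeholders for your own.

```powershell
# run.ps1 for a timer-triggered function; set the schedule in function.json
# to "0 0 */4 * * *" for an every-four-hours refresh.
param($Timer)

# Pull the current list items from SharePoint (placeholder site/list names)
Connect-PnPOnline -Url 'https://contoso.sharepoint.com/sites/MySite' -ManagedIdentity
$items = Get-PnPListItem -List 'MyList' -PageSize 500

# Flatten to the same [PSCustomObject] shape you were pasting in manually
$rows = $items | ForEach-Object {
    [PSCustomObject]@{
        Id    = $_.FieldValues.ID
        Title = $_.FieldValues.Title
    }
}

# Write a CSV and overwrite the cached copy in blob storage
$tmp = Join-Path $env:TEMP 'mylist.csv'
$rows | Export-Csv -Path $tmp -NoTypeInformation

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -UseConnectedAccount
Set-AzStorageBlobContent -File $tmp -Container 'cache' -Blob 'mylist.csv' -Context $ctx -Force
```

On the concern about concurrent executions: many readers downloading the same blob at once is fine, since blob reads are independent; the overwrite above is atomic per blob, so a reader gets either the old or the new CSV, never a partial one.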
You could set up an HTTP trigger that responds to updates in the SharePoint list. Use a Logic App to call the Function when a change occurs, so you only fetch new values when needed instead of every time the Function runs.
Why not set up a Logic App that triggers the function whenever there's a change or update to the SharePoint list? This way, your function can be notified about updates and grab the latest data when needed, rather than fetching it every time.
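For the Logic App route, a minimal sketch of the HTTP-triggered PowerShell function it would call (the Logic App would use the SharePoint "When an item is created or modified" trigger and then an HTTP action pointing at this function's URL; the refresh logic itself is elided):

```powershell
# run.ps1 for an HTTP-triggered function that a Logic App calls
# whenever the SharePoint list changes.
using namespace System.Net
param($Request, $TriggerMetadata)

# Re-export the list to the cached CSV here (e.g. via PnP.PowerShell),
# then acknowledge the Logic App call.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = 'Cache refresh triggered'
})
```

This pushes updates into the cache on change instead of on a fixed schedule, so the blob is only rewritten when the list actually changes.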

I understand that idea, but I forgot to mention that my Function App is triggered by blob events. It needs the data from the SharePoint list to generate its output, so whenever it runs it still needs current information. But I don't want it to reload from SharePoint on every trigger; I need the updated data to be available on a schedule instead.
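Given that constraint, the blob-triggered function could read the cached CSV from storage instead of calling SharePoint, while a separate scheduled process keeps that CSV current. A minimal sketch, assuming the same placeholder storage account and container names as above:

```powershell
# run.ps1 for the blob-triggered function. Instead of querying SharePoint,
# read the cached CSV that a scheduled job refreshes every four hours.
param([byte[]]$InputBlob, $TriggerMetadata)

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -UseConnectedAccount
$tmp = Join-Path $env:TEMP 'mylist.csv'
Get-AzStorageBlobContent -Container 'cache' -Blob 'mylist.csv' `
    -Destination $tmp -Context $ctx -Force | Out-Null

# Same array of [PSCustomObject] you were previously pasting in by hand
$data = Import-Csv -Path $tmp

# ... generate the output from $InputBlob / $TriggerMetadata and $data ...
```

Alternatively, declare the cached CSV as a second blob input binding in function.json so the runtime hands its content to the function directly; either way, concurrent executions reading the same blob are safe.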