I'm seeking advice on how to effectively orchestrate daily jobs using Bedrock Flows. I've created several flows that handle complex tasks, each taking about 15 minutes to complete, and these flows need to be executed once a day for various tenants. My primary challenge is figuring out how to orchestrate these executions.

I initially tried using a Lambda function triggered by a cron job (EventBridge Scheduler), but I hit Lambda's 15-minute maximum execution timeout. I then explored Step Functions, but there seems to be no direct service integration for the `InvokeFlow` action from the Bedrock API, unlike the existing `InvokeModel` action.

Given these constraints, what architectural patterns and services would you suggest for efficiently orchestrating these long-running tasks while keeping scalability and cost in mind?
3 Answers
If you haven't already considered it, think about wrapping your flow invocation in a Docker image, pushing the image to ECR, and running it as an ECS Fargate task. Fargate tasks aren't subject to Lambda's 15-minute timeout, and you can run them on a schedule with ECS scheduled tasks, which might give you a more manageable setup. Using containers for orchestration can also help maintain consistency across your different flow implementations.
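As a rough sketch of what the container's entrypoint could look like: the flow IDs, node names, and tenant source below are all illustrative assumptions, and the exact `invoke_flow` request shape should be checked against the current `bedrock-agent-runtime` API. The Bedrock call is injectable so the per-tenant loop can be exercised without AWS credentials.

```python
import os

# Hypothetical identifiers -- substitute your real flow and alias IDs.
FLOW_ID = os.environ.get("FLOW_ID", "MYFLOWID")
FLOW_ALIAS_ID = os.environ.get("FLOW_ALIAS_ID", "MYALIASID")


def run_flow_for_tenant(tenant_id, invoke=None):
    """Invoke the flow synchronously for one tenant and drain the stream.

    `invoke` defaults to the real Bedrock call; it is injectable for testing.
    """
    if invoke is None:
        import boto3  # only needed for the real call

        client = boto3.client("bedrock-agent-runtime")

        def invoke(payload):
            response = client.invoke_flow(
                flowIdentifier=FLOW_ID,
                flowAliasIdentifier=FLOW_ALIAS_ID,
                inputs=[{
                    "content": {"document": payload},
                    "nodeName": "FlowInputNode",   # assumed input node name
                    "nodeOutputName": "document",
                }],
            )
            # Drain the event stream; the container has no 15-minute cap,
            # so blocking here for ~15 minutes per flow is fine.
            return list(response["responseStream"])

    return invoke({"tenantId": tenant_id})


def run_all(tenants, invoke=None):
    """Run the daily flow for every tenant, collecting per-tenant results."""
    results = {}
    for tenant in tenants:
        try:
            results[tenant] = ("ok", run_flow_for_tenant(tenant, invoke))
        except Exception as exc:  # one failing tenant must not block the rest
            results[tenant] = ("error", str(exc))
    return results


if __name__ == "__main__":
    tenant_ids = os.environ.get("TENANT_IDS", "").split(",")
    print(run_all([t for t in tenant_ids if t]))
```

An EventBridge Scheduler rule can then launch this container once a day as a Fargate task, with the tenant list passed via environment overrides.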
Have you looked into AWS Batch for this? It's worth exploring, especially since your flows consistently take around 15 minutes. AWS Batch efficiently handles jobs that exceed typical Lambda limits, and it could be a good fit for your needs.
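Fanning out one Batch job per tenant from a scheduled trigger might look roughly like this; the job queue and job definition names are made up for illustration, and the `submit_job` call is injectable so the fan-out logic can be tested offline:

```python
def submit_daily_jobs(tenants, submit=None):
    """Submit one AWS Batch job per tenant; returns a tenant -> job id map.

    `submit` defaults to the real Batch call; inject a stub for testing.
    """
    if submit is None:
        import boto3  # only needed for the real call

        batch = boto3.client("batch")

        def submit(name, env):
            return batch.submit_job(
                jobName=name,
                jobQueue="daily-flows-queue",         # assumed queue name
                jobDefinition="bedrock-flow-runner",  # assumed job definition
                containerOverrides={"environment": env},
            )["jobId"]

    return {
        t: submit(f"flow-{t}", [{"name": "TENANT_ID", "value": t}])
        for t in tenants
    }
```

Each job's container reads `TENANT_ID` and runs that tenant's flow; Batch handles retries and queueing, and there is no 15-minute ceiling on the job runtime.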
If the lack of a direct Step Functions integration for the Bedrock Flow API is the blocker, one option is to trigger a Lambda function that starts the flow and returns its execution ID to your Step Functions workflow; you can then use that ID in a polling loop (a Wait state plus a status-check Lambda) until the flow completes.

Another approach is SQS delay queues. Trigger a Lambda function on an EventBridge schedule, call `invokeFlow` to get an execution ID, and place that ID on a delay queue. A consuming Lambda checks whether the flow has completed; if not, it requeues the message with a delay so it is triggered again later, repeating until the flow is done.

Also check whether Bedrock Flows emits completion events to EventBridge; if so, you could use EventBridge itself as your orchestrator, provided your workflow is linear without many alternate paths.
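The delay-queue polling loop might look roughly like this. The message shape, the retry cap, and the status values are all assumptions, and the actual Bedrock status lookup should be checked against the current API, so the Bedrock and SQS calls are injected rather than hard-coded:

```python
import json

# Assumed message body shape: {"executionId": "...", "checks": 3}.
MAX_CHECKS = 30          # safety valve: give up after ~30 polls
POLL_DELAY_SECONDS = 60  # note: SQS DelaySeconds is capped at 900 (15 min)


def handle_record(record, get_status, requeue):
    """Process one SQS record: finish if the flow is done, otherwise requeue.

    `get_status` and `requeue` are injected so the polling logic stays
    testable; in the real Lambda they would wrap the Bedrock status call
    and `sqs.send_message(..., DelaySeconds=POLL_DELAY_SECONDS)`.
    """
    body = json.loads(record["body"])
    status = get_status(body["executionId"])
    if status in ("Succeeded", "Failed"):
        return status                      # terminal state: stop polling
    checks = body.get("checks", 0) + 1
    if checks > MAX_CHECKS:
        return "TimedOut"                  # prevent an endless requeue loop
    requeue(json.dumps({**body, "checks": checks}))
    return "Requeued"


def lambda_handler(event, context, get_status=None, requeue=None):
    """SQS-triggered Lambda entry point (dependency-injected for tests)."""
    return [handle_record(r, get_status, requeue) for r in event["Records"]]
```

Tracking a `checks` counter in the message body is worth the small extra bookkeeping: without it, a flow that never reaches a terminal state would keep the message bouncing through the queue forever.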
That's a solid workaround. It's interesting how many options there are once we dig deeper into AWS services.
Just to clarify, I had a different experience when I asked about this integration earlier. AWS Step Functions does support invoking multiple Bedrock models in parallel through its optimized service integration, including `InvokeModel`. You can set up parallel tasks to run different models concurrently, which could help with the orchestration challenge, since each branch can process different inputs.
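For reference, the parallel pattern described here looks roughly like the following Amazon States Language fragment. The model IDs are illustrative, the request body is pulled from the state input since its shape is model-specific, and note that this uses the optimized `invokeModel` integration, not `InvokeFlow`:

```json
{
  "Type": "Parallel",
  "Branches": [
    {
      "StartAt": "InvokeModelA",
      "States": {
        "InvokeModelA": {
          "Type": "Task",
          "Resource": "arn:aws:states:::bedrock:invokeModel",
          "Parameters": {
            "ModelId": "anthropic.claude-3-haiku-20240307-v1:0",
            "Body.$": "$.requestA"
          },
          "End": true
        }
      }
    },
    {
      "StartAt": "InvokeModelB",
      "States": {
        "InvokeModelB": {
          "Type": "Task",
          "Resource": "arn:aws:states:::bedrock:invokeModel",
          "Parameters": {
            "ModelId": "amazon.titan-text-express-v1",
            "Body.$": "$.requestB"
          },
          "End": true
        }
      }
    }
  ],
  "End": true
}
```

This is a single `Parallel` state to drop into a larger state machine; each branch receives the same input and the state's output is an array of the branch results.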