How to Use Azure API Management with Anthropic’s Claude Model?

Asked By TechExplorer89

Hey everyone! I'm trying to find out if it's possible to set up Azure API Management in front of an Anthropic model hosted on Foundry. I've successfully managed to connect APIM with OpenAI models, but when I attempt to integrate with the Anthropic models, I keep running into issues where it says either "not supported" or "resource not found." Can anyone provide insights on the parameters or settings I should use for this integration? Thanks in advance!

3 Answers

Answered By CloudNinja42

I faced the same problem when routing Anthropic through APIM initially. The main issue is that Claude's API requires a different path structure and authentication header compared to OpenAI. The default OpenAI policy in APIM rewrites to "/openai/deployments/..." which doesn't work for Anthropic.

What worked for me was creating a separate API definition in APIM that targets your Anthropic model's AI Foundry endpoint directly:

- Set the backend URL to the specific model inference endpoint (something like https://your-foundry.services.ai.azure.com/models/chat/completions).
- Use a set-header policy to send the API key instead of a bearer token.
- Remove the OpenAI-specific path rewrite from the API.

Also, the "not supported" error usually means APIM is appending the OpenAI api-version query parameter, which the Anthropic endpoints don't accept. Use a set-query-parameter policy to delete it.
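For reference, here's a rough sketch of what that inbound policy section could look like. The backend URL and the {{foundry-api-key}} named value are placeholders for your own resource, so treat this as a starting point rather than a drop-in config:

```xml
<policies>
    <inbound>
        <base />
        <!-- Point APIM at the Foundry model inference endpoint
             (hostname is a placeholder for your resource). -->
        <set-backend-service base-url="https://your-foundry.services.ai.azure.com/models" />
        <!-- Authenticate with an api-key header instead of a bearer token. -->
        <set-header name="api-key" exists-action="override">
            <value>{{foundry-api-key}}</value>
        </set-header>
        <set-header name="Authorization" exists-action="delete" />
        <!-- Drop the OpenAI-specific api-version query parameter. -->
        <set-query-parameter name="api-version" exists-action="delete" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
</policies>
```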

Answered By RandomDude22

I see you have all the moneys

TechExplorer89 -

I wish!

Answered By DataGuru77

Getting it to work is definitely doable, but keep in mind that the routes and headers for Anthropic (Claude) APIs are not the same as those for OpenAI. The "resource not found" error usually comes from mismatched URLs or authentication headers.

You should also verify that your model is correctly deployed in your Foundry instance; that's a common mistake. Many people opt to handle routing outside of APIM, using simple proxies or serverless functions instead of trying to force it to behave like OpenAI.
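If you go the proxy/serverless route, the core of it is just translating an incoming OpenAI-style request into a Foundry-style one. A minimal sketch in Python (the base URL, env var, and header names here are assumptions based on this thread, not anything official):

```python
import os

# Hypothetical values for illustration; substitute your own resource.
FOUNDRY_BASE = "https://your-foundry.services.ai.azure.com"
FOUNDRY_KEY = os.environ.get("FOUNDRY_API_KEY", "<key>")


def build_upstream_request(path: str, headers: dict) -> tuple[str, dict]:
    """Map an incoming OpenAI-style request onto the Foundry model
    inference endpoint: fixed path, api-key header, no bearer token."""
    upstream_url = f"{FOUNDRY_BASE}/models/chat/completions"
    # Drop hop-specific and OpenAI-specific auth headers before forwarding.
    upstream_headers = {
        k: v for k, v in headers.items()
        if k.lower() not in ("authorization", "host", "api-key")
    }
    upstream_headers["api-key"] = FOUNDRY_KEY
    return upstream_url, upstream_headers
```

A serverless function (an Azure Function, say) would call this helper, forward the JSON body to the returned URL with the returned headers, and relay the response back to the caller unchanged.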

