azure-openai
chat
embedding
vision
Test your Azure OpenAI API key.
Validate Azure OpenAI credentials (endpoint + api-key + api-version) and list deployments.
Stateless proxy — keys never logged, stored, or persisted. What happens to your key →
Detected
Azure OpenAI
What this key does
Azure OpenAI runs OpenAI's models on Azure infrastructure. Auth uses an api-key header instead of a Bearer token, and routes are scoped to your resource and a deployment name, not a model id directly. The validator here accepts a JSON-encoded composite credential.
How to get an Azure OpenAI API key
- Open the Azure portal and find your Azure OpenAI resource.
- Under Keys and Endpoint, copy KEY 1 and the endpoint URL.
- Pick an API version (2024-10-21 is a stable current default).
- Paste a JSON object here: {"endpoint":"…","apiKey":"…","apiVersion":"2024-10-21"}.
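The steps above can be sketched in code. This is a minimal, hedged example: the field names (`endpoint`, `apiKey`, `apiVersion`) match the JSON shape shown above, the URL follows Azure's standard chat-completions route, and the deployment name (`gpt-4o` here) is a placeholder for whatever name you assigned in your resource.

```python
import json

def build_chat_url(raw: str, deployment: str) -> tuple[str, dict]:
    """Turn the composite credential into a request URL and headers.

    Azure routes requests as
    {endpoint}/openai/deployments/{deployment}/chat/completions,
    with the api-version pinned via a query parameter.
    """
    cred = json.loads(raw)
    url = (
        f"{cred['endpoint'].rstrip('/')}/openai/deployments/"
        f"{deployment}/chat/completions?api-version={cred['apiVersion']}"
    )
    # Azure expects the key in an api-key header, not Authorization: Bearer.
    headers = {"api-key": cred["apiKey"], "Content-Type": "application/json"}
    return url, headers

raw = '{"endpoint":"https://myres.openai.azure.com","apiKey":"KEY1","apiVersion":"2024-10-21"}'
url, headers = build_chat_url(raw, "gpt-4o")
```

POST a standard chat-completions body to that URL with those headers and the request goes to your deployment rather than a model id.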
Common errors and fixes
- 401 Unauthorized: Key is invalid, revoked, or pasted with extra whitespace. Generate a new key from the provider console and try again.
- 403 Forbidden: Key is valid but lacks permission for this resource. Check project / org / workspace scope, or that billing is set up for this key.
- 429 Too Many Requests: You hit the per-minute or per-day rate limit. Wait a moment and retry, or upgrade your tier.
- 404 Not Found: The endpoint or model id changed. Check the provider docs for the current path and model identifier.
- 5xx: The provider is having issues. Check their status page before assuming the bug is yours.
- 404 Deployment not found: The model id you're calling must be the deployment name from your Azure resource, not the OpenAI model name.
Security best practices
- Store keys in an env var or secret manager — never commit them to a repo, even a private one.
- Restrict scope: prefer per-project or per-deployment keys over a single root key shared across services.
- Rotate on a schedule (90 days is a sane default) and immediately on suspected leak.
- Audit usage in the provider console after rotation to confirm the old key has zero traffic.
- Set per-key spend limits where the provider supports them, so a leaked key has a bounded blast radius.
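The first practice above, reading the key from the environment instead of source, can be sketched like this. The variable name `AZURE_OPENAI_API_KEY` is a common convention, not a requirement; the stripping also guards against the whitespace-induced 401s mentioned earlier.

```python
import os

def load_key(var: str = "AZURE_OPENAI_API_KEY") -> str:
    """Read the API key from an environment variable; fail loudly if missing.

    Keeping the key out of source means a leaked repo doesn't leak the key.
    """
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export it or wire it in from your secret manager")
    return key.strip()  # stray whitespace is a common cause of 401s
```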
Pricing at a glance
Azure OpenAI mirrors OpenAI's per-million-token pricing with provisioned throughput options for predictable QoS. Billing flows through your Azure subscription.
FAQ
- Why does Azure use api-key instead of Bearer?
- Legacy choice from Cognitive Services. The OpenAI SDK abstracts this difference.
- What's an api-version for?
- Pins the request shape. Azure releases new versions as OpenAI ships features; old ones are deprecated on a schedule.
- Can I use the same key on Azure and OpenAI direct?
- No, they're separate auth systems.
- Why does my deployment 404?
- You're calling a model id, not a deployment name. Azure routes via deployment names you assign.
- Is Entra ID auth supported?
- Yes — token-based auth is preferred for production. The simple api-key path is fine for testing.
- Where can I check usage?
- Azure cost analysis, scoped to the resource.
Test other providers
Related reading
- API key security best practices for LLMs: How to store, scope, rotate, and revoke LLM API keys without leaking them through git, logs, or shared environments.
- OpenAI vs Anthropic pricing in 2026: A side-by-side breakdown of OpenAI and Anthropic per-token pricing, batch discounts, and prompt-caching savings.