Use with OpenRouter, Bedrock, Vertex, or Vercel
If you're using Superset in the terminal, any CLI agent (Claude Code, Codex, Cursor Agents, etc.) just works automatically. That's the beauty of being terminal-first.
For the Chat UI, there's a short setup flow.
Chat UI
- Open the model picker at the bottom of the chat
- Click the key icon next to Anthropic
- Select Use API key
- Enter the environment variables for your provider (one per line, in VAR_NAME=value format)
- Click Save settings
See the provider sections below for the specific variables to use.
Terminal
Set environment variables in your shell profile or per workspace via Settings > Env.
For Anthropic-compatible providers (OpenRouter, Vercel AI Gateway), set ANTHROPIC_API_KEY to an empty string to prevent direct Anthropic authentication.
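For example, to route a terminal session through OpenRouter, you might add something like the following to your shell profile (e.g. ~/.zshrc or ~/.bashrc); the key value is a placeholder:

```shell
# Route Anthropic-compatible traffic through OpenRouter (example values).
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="your-openrouter-api-key"  # placeholder
# Empty string prevents direct Anthropic authentication.
export ANTHROPIC_API_KEY=""
```

Restart your shell (or `source` the profile) so new sessions pick up the variables.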
Vercel AI Gateway
Route requests through the Vercel AI Gateway.
ANTHROPIC_BASE_URL=https://ai-gateway.vercel.sh
ANTHROPIC_AUTH_TOKEN=your-vercel-ai-gateway-api-key
ANTHROPIC_API_KEY=
OpenRouter
OpenRouter provides access to many models through a single API.
ANTHROPIC_BASE_URL=https://openrouter.ai/api
ANTHROPIC_AUTH_TOKEN=your-openrouter-api-key
ANTHROPIC_API_KEY=
AWS Bedrock
Use Claude models through your AWS account with Amazon Bedrock.
Ensure your AWS credentials are configured via aws configure, environment variables, or SSO profile.
CLAUDE_CODE_USE_BEDROCK=1
AWS_REGION=us-east-1
Optionally override the region for the small/fast model (Haiku):
ANTHROPIC_SMALL_FAST_MODEL_AWS_REGION=us-west-2
Pin specific model versions to prevent breakage when new models are released:
ANTHROPIC_DEFAULT_OPUS_MODEL=us.anthropic.claude-opus-4-6-v1
ANTHROPIC_DEFAULT_SONNET_MODEL=us.anthropic.claude-sonnet-4-6
ANTHROPIC_DEFAULT_HAIKU_MODEL=us.anthropic.claude-haiku-4-5-20251001-v1:0
For full setup details including IAM configuration, see the Claude Code on Amazon Bedrock docs.
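Before launching, it can help to confirm your AWS credentials actually resolve. Assuming the AWS CLI is installed, a quick sanity check is:

```shell
# Prints your account ID and caller ARN if credentials resolve.
aws sts get-caller-identity
# Confirm the region that will be used.
echo "AWS_REGION=${AWS_REGION}"
```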
Google Vertex AI
Run Claude models on Google Cloud Vertex AI.
Authenticate with gcloud auth application-default login, then set:
CLAUDE_CODE_USE_VERTEX=1
CLOUD_ML_REGION=global
ANTHROPIC_VERTEX_PROJECT_ID=your-project-id
Pin specific model versions to prevent breakage when new models are released:
ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-6
ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-6
ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-haiku-4-5@20251001
For full setup details including IAM configuration, see the Claude Code on Google Vertex AI docs.
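As a quick check that application-default credentials are in place (assuming the gcloud CLI is installed):

```shell
# Succeeds only if application-default credentials are configured.
gcloud auth application-default print-access-token >/dev/null && echo "ADC OK"
# Confirm the project the requests will be billed to.
echo "Project: ${ANTHROPIC_VERTEX_PROJECT_ID}"
```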
Microsoft Foundry (Azure)
Use Claude models via Microsoft Foundry.
Authenticate with an API key or Microsoft Entra ID (az login), then set:
CLAUDE_CODE_USE_FOUNDRY=1
ANTHROPIC_FOUNDRY_RESOURCE=your-resource-name
# If using API key authentication:
ANTHROPIC_FOUNDRY_API_KEY=your-azure-api-key
Pin specific model versions to prevent breakage when new models are released:
ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-6
ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-6
ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-haiku-4-5
For full setup details including RBAC configuration, see the Claude Code on Microsoft Foundry docs.
Other Anthropic-Compatible Providers
Any provider that exposes an Anthropic-compatible API (such as LiteLLM) can be used:
ANTHROPIC_BASE_URL=https://your-provider.example.com
ANTHROPIC_AUTH_TOKEN=your-api-key
ANTHROPIC_API_KEY=
For LLM gateway configuration details, see the Claude Code LLM gateway docs.
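To sanity-check that a gateway really speaks the Anthropic Messages API before pointing the agent at it, you can send it a minimal request by hand. The URL, key, and model name below are placeholders; gateways that read ANTHROPIC_AUTH_TOKEN typically accept a Bearer token, though some may expect an x-api-key header instead:

```shell
# Minimal Messages API request; a well-formed JSON response with a
# "content" field suggests the endpoint is Anthropic-compatible.
curl -s "https://your-provider.example.com/v1/messages" \
  -H "Authorization: Bearer your-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-sonnet-4-6", "max_tokens": 32,
       "messages": [{"role": "user", "content": "ping"}]}'
```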