deepeval automatically loads environment variables from dotenv files in this order: `.env` → `.env.{APP_ENV}` → `.env.local` (highest precedence). Existing process environment variables are never overwritten: process env always wins.
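The load order can be sketched in plain Python. This is a simplified illustration of the precedence rules above, not deepeval's actual loader (which uses a real dotenv parser); it handles only bare `KEY=VALUE` lines:

```python
import os

def load_env_files(app_env: str = "development") -> None:
    """Sketch of the dotenv precedence: process env > .env.local
    > .env.{APP_ENV} > .env. Handles only simple KEY=VALUE lines."""
    # Highest-precedence file first: os.environ.setdefault never overwrites
    # an existing key, so earlier sources (starting with the process
    # environment itself) always win.
    for path in (".env.local", f".env.{app_env}", ".env"):
        try:
            lines = open(path).read().splitlines()
        except FileNotFoundError:
            continue
        for line in lines:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```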
Boolean environment variables in deepeval are parsed using env-style boolean semantics. Tokens are case-insensitive, and any surrounding quotes or whitespace are ignored.
Truthy tokens: `1`, `true`, `t`, `yes`, `y`, `on`, `enable`, `enabled`

Falsy tokens: `0`, `false`, `f`, `no`, `n`, `off`, `disable`, `disabled`

Rules:

- `bool` values are used as-is.
- Numeric values are `False` when `0`, otherwise `True`.
- Strings are matched against the tokens above.
- If a value is unset (or doesn't match any token), deepeval falls back to the setting's default.

In the tables below, boolean variables are shown as `1` / `0` / unset, but all of the tokens above are accepted.
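The parsing rules above amount to something like the following sketch (an illustration of the documented semantics, not deepeval's actual implementation):

```python
_TRUTHY = {"1", "true", "t", "yes", "y", "on", "enable", "enabled"}
_FALSY = {"0", "false", "f", "no", "n", "off", "disable", "disabled"}

def parse_bool(value, default=False):
    """Env-style boolean parsing per the rules above (illustrative sketch)."""
    if isinstance(value, bool):
        return value                    # bool values are used as-is
    if isinstance(value, (int, float)):
        return value != 0               # numeric: False only when 0
    if isinstance(value, str):
        # case-insensitive; surrounding quotes and whitespace ignored
        token = value.strip().strip("\"'").lower()
        if token in _TRUTHY:
            return True
        if token in _FALSY:
            return False
    return default                      # unset / unrecognized -> default
```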
You can configure model providers by setting a combination of environment variables (API keys, model names, provider flags, etc.). However, we recommend using the CLI commands instead, which will set these variables for you.
Explicit constructor arguments (e.g. `OpenAIModel(api_key=...)`) always take precedence over environment variables. You can also set `TEMPERATURE` to provide a default temperature for all model instances.
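The precedence order can be illustrated with a small helper (`resolve_setting` is a hypothetical name for this sketch, not a deepeval API):

```python
import os

def resolve_setting(explicit, env_var, default=None):
    """Illustrates the precedence order: explicit constructor argument
    > environment variable > built-in default."""
    if explicit is not None:
        return explicit
    value = os.environ.get(env_var)
    return value if value is not None else default

# e.g. a model's temperature, falling back to TEMPERATURE when no
# explicit value is passed:
os.environ["TEMPERATURE"] = "0.3"
temperature = float(resolve_setting(None, "TEMPERATURE", "0"))  # -> 0.3
```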
When set to a truthy value, `USE_{PROVIDER}_MODEL` (e.g. `USE_OPENAI_MODEL`) tells deepeval which provider to use for LLM-as-a-judge metrics when no model is explicitly passed.
Each provider also has its own set of variables for API keys, model names, and other provider-specific options. Expand the sections below to see the full list for each provider.
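For example, a minimal `.env` that selects OpenAI as the judge model (the key and model name below are illustrative placeholders):

```env
# .env
USE_OPENAI_MODEL=1
OPENAI_API_KEY=<your-api-key>
OPENAI_MODEL_NAME=gpt-4o
```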
### AWS / Amazon Bedrock

If `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are not set, the AWS SDK's default credential provider chain is used.

| Variable | Values | Effect |
| --- | --- | --- |
| `AWS_ACCESS_KEY_ID` | string / unset | Optional AWS access key ID for authentication. |
| `AWS_SECRET_ACCESS_KEY` | string / unset | Optional AWS secret access key for authentication. |
| `USE_AWS_BEDROCK_MODEL` | `1` / `0` / unset | Prefer Bedrock as the default LLM provider (where applicable). |
| `AWS_BEDROCK_MODEL_NAME` | string / unset | Bedrock model ID (e.g. `anthropic.claude-3-opus-20240229-v1:0`). |
| `AWS_BEDROCK_REGION` | string / unset | AWS region (e.g. `us-east-1`). |
| `AWS_BEDROCK_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `AWS_BEDROCK_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### Anthropic

| Variable | Values | Effect |
| --- | --- | --- |
| `ANTHROPIC_API_KEY` | string / unset | Anthropic API key. |
| `ANTHROPIC_MODEL_NAME` | string / unset | Optional default Anthropic model name. |
| `ANTHROPIC_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `ANTHROPIC_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### Azure OpenAI

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_AZURE_OPENAI` | `1` / `0` / unset | Prefer Azure OpenAI as the default LLM provider (where applicable). |
| `AZURE_OPENAI_API_KEY` | string / unset | Azure OpenAI API key. |
| `AZURE_OPENAI_ENDPOINT` | string / unset | Azure OpenAI endpoint URL. |
| `OPENAI_API_VERSION` | string / unset | Azure OpenAI API version. |
| `AZURE_DEPLOYMENT_NAME` | string / unset | Azure deployment name. |
| `AZURE_MODEL_NAME` | string / unset | Optional Azure model name (for metadata / reporting). |
| `AZURE_MODEL_VERSION` | string / unset | Optional Azure model version (for metadata / reporting). |
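Put together, a typical Azure OpenAI configuration might look like this in a `.env` file (the endpoint, deployment, and API-version values are illustrative placeholders; use the ones from your Azure resource):

```env
# .env
USE_AZURE_OPENAI=1
AZURE_OPENAI_API_KEY=<your-api-key>
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
OPENAI_API_VERSION=2024-02-01
AZURE_DEPLOYMENT_NAME=<your-deployment>
```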
### OpenAI

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_OPENAI_MODEL` | `1` / `0` / unset | Prefer OpenAI as the default LLM provider (where applicable). |
| `OPENAI_API_KEY` | string / unset | OpenAI API key. |
| `OPENAI_MODEL_NAME` | string / unset | Optional default OpenAI model name. |
| `OPENAI_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `OPENAI_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### DeepSeek

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_DEEPSEEK_MODEL` | `1` / `0` / unset | Prefer DeepSeek as the default LLM provider (where applicable). |
| `DEEPSEEK_API_KEY` | string / unset | DeepSeek API key. |
| `DEEPSEEK_MODEL_NAME` | string / unset | Optional default DeepSeek model name. |
| `DEEPSEEK_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `DEEPSEEK_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### Gemini

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_GEMINI_MODEL` | `1` / `0` / unset | Prefer Gemini as the default LLM provider (where applicable). |
| `GOOGLE_API_KEY` | string / unset | Google API key. |
| `GEMINI_MODEL_NAME` | string / unset | Optional default Gemini model name. |
| `GOOGLE_GENAI_USE_VERTEXAI` | `1` / `0` / unset | If set, use Vertex AI via `google-genai` (where supported). |
| `GOOGLE_CLOUD_PROJECT` | string / unset | Optional GCP project (Vertex AI). |
| `GOOGLE_CLOUD_LOCATION` | string / unset | Optional GCP location/region (Vertex AI). |
| `GOOGLE_SERVICE_ACCOUNT_KEY` | string / unset | Optional service account key (Vertex AI). |
| `VERTEX_AI_MODEL_NAME` | string / unset | Optional Vertex AI model name. |
### Grok

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_GROK_MODEL` | `1` / `0` / unset | Prefer Grok as the default LLM provider (where applicable). |
| `GROK_API_KEY` | string / unset | Grok API key. |
| `GROK_MODEL_NAME` | string / unset | Optional default Grok model name. |
| `GROK_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `GROK_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### LiteLLM

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_LITELLM` | `1` / `0` / unset | Prefer LiteLLM as the default LLM provider (where applicable). |
| `LITELLM_API_KEY` | string / unset | Optional API key passed to LiteLLM. |
| `LITELLM_MODEL_NAME` | string / unset | Default LiteLLM model name. |
| `LITELLM_API_BASE` | string / unset | Optional base URL for the LiteLLM endpoint. |
| `LITELLM_PROXY_API_BASE` | string / unset | Optional proxy base URL (if using a proxy). |
| `LITELLM_PROXY_API_KEY` | string / unset | Optional proxy API key (if using a proxy). |
### Local Model

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_LOCAL_MODEL` | `1` / `0` / unset | Prefer the local model adapter as the default LLM provider (where applicable). |
| `LOCAL_MODEL_API_KEY` | string / unset | Optional API key for the local model endpoint (if required). |
| `LOCAL_MODEL_NAME` | string / unset | Optional default local model name. |
| `LOCAL_MODEL_BASE_URL` | string / unset | Base URL for the local model endpoint. |
| `LOCAL_MODEL_FORMAT` | string / unset | Optional format hint for the local model integration. |
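For instance, to point deepeval at an OpenAI-compatible server running locally. The URL below assumes an Ollama-style endpoint on its default port; both the URL and the model name are placeholders to adjust for your server:

```env
# .env
USE_LOCAL_MODEL=1
LOCAL_MODEL_BASE_URL=http://localhost:11434/v1
LOCAL_MODEL_NAME=<your-model>
```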
### Kimi (Moonshot)

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_MOONSHOT_MODEL` | `1` / `0` / unset | Prefer Moonshot as the default LLM provider (where applicable). |
| `MOONSHOT_API_KEY` | string / unset | Moonshot API key. |
| `MOONSHOT_MODEL_NAME` | string / unset | Optional default Moonshot model name. |
| `MOONSHOT_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `MOONSHOT_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### Ollama

| Variable | Values | Effect |
| --- | --- | --- |
| `OLLAMA_MODEL_NAME` | string / unset | Optional default Ollama model name. |
### Portkey

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_PORTKEY_MODEL` | `1` / `0` / unset | Prefer Portkey as the default LLM provider (where applicable). |
| `PORTKEY_API_KEY` | string / unset | Portkey API key. |
| `PORTKEY_MODEL_NAME` | string / unset | Optional default model name passed to Portkey. |
| `PORTKEY_BASE_URL` | string / unset | Optional Portkey base URL. |
| `PORTKEY_PROVIDER_NAME` | string / unset | Optional provider name (Portkey routing). |
### OpenRouter

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_OPENROUTER_MODEL` | `1` / `0` / unset | Prefer OpenRouter as the default LLM provider (where applicable). |
| `OPENROUTER_API_KEY` | string / unset | OpenRouter API key. |
| `OPENROUTER_MODEL_NAME` | string / unset | Optional default model name passed to OpenRouter. |
| `OPENROUTER_BASE_URL` | string / unset | Optional OpenRouter base URL. |
| `OPENROUTER_COST_PER_INPUT_TOKEN` | float / unset | Optional input-token cost used for cost reporting. |
| `OPENROUTER_COST_PER_OUTPUT_TOKEN` | float / unset | Optional output-token cost used for cost reporting. |
### Embeddings

| Variable | Values | Effect |
| --- | --- | --- |
| `USE_AZURE_OPENAI_EMBEDDING` | `1` / `0` / unset | Prefer Azure OpenAI embeddings as the default embeddings provider (where applicable). |
| `AZURE_EMBEDDING_DEPLOYMENT_NAME` | string / unset | Azure embedding deployment name. |
| `USE_LOCAL_EMBEDDINGS` | `1` / `0` / unset | Prefer local embeddings as the default embeddings provider (where applicable). |
| `LOCAL_EMBEDDING_API_KEY` | string / unset | Optional API key for the local embeddings endpoint (if required). |
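For example, to route embeddings through Azure OpenAI, presumably alongside the Azure OpenAI key and endpoint variables listed earlier (the deployment name is an illustrative placeholder):

```env
# .env
USE_AZURE_OPENAI_EMBEDDING=1
AZURE_EMBEDDING_DEPLOYMENT_NAME=<your-embedding-deployment>
```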