Azure OpenAI
deepeval allows you to directly integrate Azure OpenAI models into all available LLM-based metrics. You can configure the model through the command line or directly within your Python code.
Command Line
Run the following command in your terminal to configure your deepeval environment to use Azure OpenAI for all metrics.
deepeval set-azure-openai \
--base-url=<endpoint> \ # e.g. https://example-resource.openai.azure.com/
--model-name=<model_name> \ # e.g. gpt-4.1
--deployment-name=<deployment_name> \ # e.g. Test Deployment
--api-version=<api_version> # e.g. 2025-01-01-preview
The CLI command above sets Azure OpenAI as the default provider for all metrics, unless overridden in Python code. To use a different default model provider, you must first unset Azure OpenAI:
deepeval unset-azure-openai
You can persist CLI settings with the optional --save flag.
See Flags and Configs -> Persisting CLI settings.
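For example, to persist the configuration you could append `--save` to the same command (the values below are placeholders, and the exact `--save` syntax may differ; see Flags and Configs -> Persisting CLI settings):

```shell
# Persist the Azure OpenAI configuration across shell sessions (placeholder values)
deepeval set-azure-openai \
--base-url="https://example-resource.openai.azure.com/" \
--model-name="gpt-4.1" \
--deployment-name="Test Deployment" \
--api-version="2025-01-01-preview" \
--save
```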
Python
Alternatively, you can specify your model directly in code using AzureOpenAIModel from deepeval's model collection.
This approach is ideal when you need to use separate models for specific metrics.
from deepeval.models import AzureOpenAIModel
from deepeval.metrics import AnswerRelevancyMetric
model = AzureOpenAIModel(
model="gpt-4.1",
deployment_name="Test Deployment",
api_key="Your Azure OpenAI API Key",
api_version="2025-01-01-preview",
base_url="https://example-resource.openai.azure.com/",
temperature=0
)
answer_relevancy = AnswerRelevancyMetric(model=model)
There are ZERO mandatory and ELEVEN optional parameters when creating an AzureOpenAIModel:
- [Optional] `model`: A string specifying the name of the Azure OpenAI model to use. Defaults to `AZURE_MODEL_NAME` if not passed; raises an error at runtime if unset.
- [Optional] `api_key`: A string specifying your Azure OpenAI API key. Defaults to `AZURE_OPENAI_API_KEY` if not passed; raises an error at runtime if `azure_ad_token` and `azure_ad_token_provider` are also unset.
- [Optional] `azure_ad_token`: A string specifying your Azure AD token. Defaults to `AZURE_OPENAI_AD_TOKEN` if not passed; raises an error at runtime if `api_key` and `azure_ad_token_provider` are also unset.
- [Optional] `azure_ad_token_provider`: A callback of either `AsyncAzureADTokenProvider` or `AzureADTokenProvider` that can be used for credentials (see example usage). Raises an error at runtime if `api_key` and `azure_ad_token` are also unset.
- [Optional] `base_url`: A string specifying your Azure OpenAI endpoint URL. Defaults to `AZURE_OPENAI_ENDPOINT` if not passed; raises an error at runtime if unset.
- [Optional] `temperature`: A float specifying the model temperature. Defaults to `TEMPERATURE` if not passed; falls back to `0.0` if unset.
- [Optional] `cost_per_input_token`: A float specifying the cost for each input token for the provided model. Defaults to `OPENAI_COST_PER_INPUT_TOKEN` if available in deepeval's model cost registry, else `None`.
- [Optional] `cost_per_output_token`: A float specifying the cost for each output token for the provided model. Defaults to `OPENAI_COST_PER_OUTPUT_TOKEN` if available in deepeval's model cost registry, else `None`.
- [Optional] `deployment_name`: A string specifying the name of your Azure OpenAI deployment. Defaults to `AZURE_DEPLOYMENT_NAME` if not passed; raises an error at runtime if unset.
- [Optional] `api_version`: A string specifying the OpenAI API version used in your deployment. Defaults to `OPENAI_API_VERSION` if not passed; raises an error at runtime if unset.
- [Optional] `generation_kwargs`: A dictionary of additional generation parameters forwarded to the Azure OpenAI `chat.completions.create(...)` and `beta.chat.completions.parse(...)` calls.
Although no parameter is mandatory at initialization time, several are still required at runtime. You can provide them either explicitly as constructor arguments, or via deepeval settings / environment variables (constructor arguments take precedence). See Environment variables and settings for the Azure OpenAI-related environment variables.
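As a sketch of the `azure_ad_token_provider` route, the snippet below uses the `azure-identity` package's `get_bearer_token_provider` with `DefaultAzureCredential` to authenticate without an API key. The scope URL and package usage are standard Azure patterns, but treat this as an illustrative assumption rather than deepeval's canonical example:

```python
# Sketch: keyless auth via Azure AD (requires `pip install azure-identity`)
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from deepeval.models import AzureOpenAIModel

# Token provider callback for the Cognitive Services scope
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

model = AzureOpenAIModel(
    model="gpt-4.1",
    deployment_name="Test Deployment",
    azure_ad_token_provider=token_provider,  # used instead of api_key
    api_version="2025-01-01-preview",
    base_url="https://example-resource.openai.azure.com/",
)
```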
Any `**kwargs` you would like to use for your model can be passed through the `generation_kwargs` parameter. However, we recommend double-checking which parameters your model and model provider support in their official docs.
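For instance, you might forward common OpenAI chat-completions options such as `top_p` and `seed` (these particular parameters are assumptions; confirm your deployment supports them):

```python
from deepeval.models import AzureOpenAIModel

# Extra chat-completions parameters are forwarded via generation_kwargs
model = AzureOpenAIModel(
    model="gpt-4.1",
    deployment_name="Test Deployment",
    api_key="Your Azure OpenAI API Key",
    api_version="2025-01-01-preview",
    base_url="https://example-resource.openai.azure.com/",
    generation_kwargs={"top_p": 0.9, "seed": 42},  # assumed supported by the deployment
)
```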
Available Azure OpenAI Models
Below is a list of commonly used Azure OpenAI models. This list displays only some of the available models; for a comprehensive list, refer to Azure OpenAI's official documentation.

- gpt-4.1
- gpt-4.5-preview
- gpt-4o
- gpt-4o-mini
- gpt-4
- gpt-4-32k
- gpt-35-turbo
- gpt-35-turbo-16k
- gpt-35-turbo-instruct
- o1
- o1-mini
- o1-preview
- o3-mini