DeepSeek
deepeval allows you to use deepseek-chat and deepseek-reasoner directly from DeepSeek to run all of deepeval's metrics, which can be configured through the CLI or in Python.
Command Line
To configure your DeepSeek model through the CLI, run the following command:
deepeval set-deepseek --model=deepseek-chat \
--temperature=0
The CLI command above sets deepseek-chat as the default model for all metrics, unless overridden in Python code. To use a different default model provider, you must first unset DeepSeek:
deepeval unset-deepseek
You can persist CLI settings with the optional --save flag.
See Flags and Configs -> Persisting CLI settings.
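For example, to persist the configuration above (the dotenv save target shown here is an assumption; see Flags and Configs for the exact values the flag accepts):
deepeval set-deepseek --model=deepseek-chat \
--temperature=0 \
--save=dotenv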
Python
You can also specify your model directly in code using DeepSeekModel.
from deepeval.models import DeepSeekModel
from deepeval.metrics import AnswerRelevancyMetric
model = DeepSeekModel(
model="deepseek-chat",
api_key="your-api-key",
temperature=0
)
answer_relevancy = AnswerRelevancyMetric(model=model)
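To sanity-check the setup, here is a minimal sketch of running the metric on a single test case, assuming deepeval's standard LLMTestCase API (the input and output strings are placeholders):
from deepeval.test_case import LLMTestCase

# Placeholder test case for illustration only.
test_case = LLMTestCase(
    input="What is DeepSeek?",
    actual_output="DeepSeek develops open large language models such as deepseek-chat.",
)

# The metric uses the DeepSeekModel configured above as its judge.
answer_relevancy.measure(test_case)
print(answer_relevancy.score, answer_relevancy.reason)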
There are ZERO mandatory and SIX optional parameters when creating a DeepSeekModel:
- [Optional] model: A string specifying the name of the DeepSeek model to use. Defaults to DEEPSEEK_MODEL_NAME if not passed; raises an error at runtime if unset.
- [Optional] api_key: A string specifying your DeepSeek API key for authentication. Defaults to DEEPSEEK_API_KEY if not passed; raises an error at runtime if unset.
- [Optional] temperature: A float specifying the model temperature. Defaults to TEMPERATURE if not passed; falls back to 0.0 if unset.
- [Optional] cost_per_input_token: A float specifying the cost of each input token for the provided model. Defaults to DEEPSEEK_COST_PER_INPUT_TOKEN if available in deepeval's model cost registry, else None.
- [Optional] cost_per_output_token: A float specifying the cost of each output token for the provided model. Defaults to DEEPSEEK_COST_PER_OUTPUT_TOKEN if available in deepeval's model cost registry, else None.
- [Optional] generation_kwargs: A dictionary of additional generation parameters forwarded to the OpenAI chat.completions.create(...) call.
Parameters may be passed explicitly at initialization time or configured through deepeval's optional settings. Although no parameter is mandatory at construction time, model and api_key are required at runtime; you can provide them either explicitly as constructor arguments or via deepeval settings / environment variables (constructor arguments take precedence). See Environment variables and settings for the DeepSeek-related environment variables.
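As an illustrative sketch, assuming the environment variables above are read whenever the corresponding constructor arguments are omitted (in practice you would typically export them in your shell or a .env file rather than in code):
import os

from deepeval.models import DeepSeekModel

# Assumption: these settings are picked up when the constructor arguments are omitted.
os.environ["DEEPSEEK_MODEL_NAME"] = "deepseek-chat"
os.environ["DEEPSEEK_API_KEY"] = "your-api-key"
os.environ["TEMPERATURE"] = "0"

# No constructor arguments: model, api_key, and temperature fall back to the settings above.
model = DeepSeekModel()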
Any additional **kwargs you would like to use for your model can be passed through the generation_kwargs parameter. However, make sure to double-check the parameters supported by the model and your model provider in their official docs.
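For instance, a hedged sketch that forwards two common OpenAI-compatible generation parameters (max_tokens and top_p) and supplies per-token costs (the numbers below are placeholders, not official DeepSeek pricing):
from deepeval.models import DeepSeekModel

model = DeepSeekModel(
    model="deepseek-chat",
    api_key="your-api-key",
    temperature=0,
    cost_per_input_token=2.7e-7,    # placeholder figure for cost tracking only
    cost_per_output_token=1.1e-6,   # placeholder figure for cost tracking only
    generation_kwargs={
        "max_tokens": 1024,  # forwarded to chat.completions.create(...)
        "top_p": 0.95,
    },
)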
Available DeepSeek Models
Below is the comprehensive list of available DeepSeek models in deepeval:
- deepseek-chat
- deepseek-v3.2
- deepseek-v3.2-exp
- deepseek-v3.1
- deepseek-v3
- deepseek-reasoner
- deepseek-r1
- deepseek-r1-lite
- deepseek-v2.5
- deepseek-coder
- deepseek-coder-6.7b
- deepseek-coder-33b