DeepSeek
DeepEval allows you to use deepseek-chat and deepseek-reasoner directly from DeepSeek to run all of DeepEval's metrics. The model can be configured through the CLI or directly in Python.
Command Line
To configure your DeepSeek model through the CLI, run the following command:
deepeval set-deepseek --model deepseek-chat \
    --api-key="your-api-key" \
    --temperature=0
The CLI command above sets deepseek-chat as the default model for all metrics, unless overridden in Python code. To use a different default model provider, you must first unset DeepSeek:
deepeval unset-deepseek
You can persist CLI settings with the optional --save flag.
See Flags and Configs -> Persisting CLI settings.
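For example, to persist the configuration you could append the flag to the same command. This is a sketch only; --save may expect a value (such as a target file), so check the linked section for the exact form:
deepeval set-deepseek --model deepseek-chat \
    --api-key="your-api-key" \
    --temperature=0 \
    --save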
Python
You can also specify your model directly in code using DeepSeekModel.
from deepeval.models import DeepSeekModel
from deepeval.metrics import AnswerRelevancyMetric
model = DeepSeekModel(
model="deepseek-chat",
api_key="your-api-key",
temperature=0
)
answer_relevancy = AnswerRelevancyMetric(model=model)
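You can then run evaluations with this metric as usual. Below is a minimal sketch that continues the snippet above; the test case contents are placeholder values for illustration:
from deepeval import evaluate
from deepeval.test_case import LLMTestCase

# Hypothetical single-turn test case used only to demonstrate the metric
test_case = LLMTestCase(
    input="Why did the chicken cross the road?",
    actual_output="To get to the other side."
)

# Runs AnswerRelevancyMetric using the DeepSeek model configured above
evaluate(test_cases=[test_case], metrics=[answer_relevancy])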
There are TWO mandatory and TWO optional parameters when creating a DeepSeekModel:
- model: A string specifying the name of the DeepSeek model to use. Either deepseek-chat or deepseek-reasoner.
- api_key: A string specifying your DeepSeek API key for authentication.
- [Optional] temperature: A float specifying the model temperature. Defaulted to 0.
- [Optional] generation_kwargs: A dictionary of additional generation parameters supported by your model provider.
Any additional keyword arguments for your model can be passed through the generation_kwargs parameter. Be sure to double-check which parameters your model and model provider support in their official documentation.
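For example, here is a minimal sketch that forwards a sampling parameter through generation_kwargs. The top_p value is only illustrative; confirm it is supported by deepseek-chat and the DeepSeek API before relying on it:
model = DeepSeekModel(
    model="deepseek-chat",
    api_key="your-api-key",
    temperature=0,
    # Assumption: entries here are passed through to the DeepSeek API as-is
    generation_kwargs={"top_p": 0.95}
)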
Available DeepSeek Models
Below is the comprehensive list of available DeepSeek models in DeepEval:
- deepseek-chat
- deepseek-reasoner