Vertex AI

You can also use Google Cloud's Vertex AI models, including Gemini and your own fine-tuned models, with DeepEval.

info

To use Vertex AI, you must have the following:

  1. A Google Cloud project with the Vertex AI API enabled
  2. Application Default Credentials set up:
gcloud auth application-default login
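
If the Vertex AI API isn't enabled for your project yet, you can usually enable it from the command line as well (the project ID below is a placeholder):

gcloud services enable aiplatform.googleapis.com --project=<project_id>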

Command Line

Run the following command in your terminal to configure your deepeval environment to use Gemini models through Vertex AI for all metrics.

# Example values: --model-name="gemini-2.0-flash-001", --location="us-central1"
deepeval set-gemini \
    --model-name=<model> \
    --project-id=<project_id> \
    --location=<location>
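
For example, to point all metrics at gemini-2.0-flash-001 in us-central1 (the project ID below is a placeholder):

deepeval set-gemini \
    --model-name="gemini-2.0-flash-001" \
    --project-id="your-project-id" \
    --location="us-central1"
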
info

The CLI command above sets Gemini (via Vertex AI) as the default provider for all metrics, unless overridden in Python code. To use a different default model provider, you must first unset Gemini:

deepeval unset-gemini
Persisting settings

You can persist CLI settings with the optional --save flag. See Flags and Configs -> Persisting CLI settings.

Python

Alternatively, you can specify your model directly in code using GeminiModel from DeepEval's model collection. By default, model is set to gemini-1.5-pro.

from deepeval.models import GeminiModel
from deepeval.metrics import AnswerRelevancyMetric

model = GeminiModel(
    model="gemini-1.5-pro",
    project="Your Project ID",
    location="us-central1",
    temperature=0
)

answer_relevancy = AnswerRelevancyMetric(model=model)
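
Once configured, the model can be passed to any DeepEval metric and used to evaluate a test case. Below is a minimal sketch; the input and output strings are placeholders:

from deepeval.test_case import LLMTestCase

test_case = LLMTestCase(
    input="What regions does Vertex AI support?",
    actual_output="Vertex AI is available in several regions, including us-central1."
)
answer_relevancy.measure(test_case)
print(answer_relevancy.score, answer_relevancy.reason)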

There are ZERO mandatory and SIX optional parameters when creating a GeminiModel through Vertex AI:

  • [Optional] model: A string specifying the name of the Gemini model to use. Defaults to GEMINI_MODEL_NAME if not passed; raises an error at runtime if unset.
  • [Optional] temperature: A float specifying the model temperature. Defaults to TEMPERATURE if not passed; falls back to 0.0 if unset.
  • [Optional] project: A string specifying the Google Cloud project ID for Vertex AI. Defaults to GOOGLE_CLOUD_PROJECT if not passed.
  • [Optional] location: A string specifying the Google Cloud location for Vertex AI. Defaults to GOOGLE_CLOUD_LOCATION if not passed.
  • [Optional] service_account_key: A JSON string containing the service account key for authentication when using Vertex AI. This string can be either the path to a service account key file or the raw JSON string. Defaults to GOOGLE_SERVICE_ACCOUNT_KEY if not passed.
  • [Optional] generation_kwargs: A dictionary of additional generation parameters supported by your model provider.
note

To use Vertex AI you must set project and location (via args or GOOGLE_CLOUD_PROJECT / GOOGLE_CLOUD_LOCATION). service_account_key is optional if you use Application Default Credentials.
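
For example, these can be exported as environment variables before running your evaluations (the values below are placeholders):

export GOOGLE_CLOUD_PROJECT="your-project-id"
export GOOGLE_CLOUD_LOCATION="us-central1"
# Optional when relying on Application Default Credentials:
export GOOGLE_SERVICE_ACCOUNT_KEY="/path/to/service-account-key.json"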

tip

Any **kwargs you would like to use for your model can be passed through the generation_kwargs parameter, as shown in the sketch below. However, please double-check which parameters your model and model provider support in their official documentation.
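
As a hedged sketch, additional generation parameters could be passed like this; the keys shown (top_p, max_output_tokens) are assumptions and must match what your Gemini model's generation config actually supports:

model = GeminiModel(
    model="gemini-2.0-flash-001",
    project="your-project-id",  # placeholder project ID
    location="us-central1",
    generation_kwargs={"top_p": 0.95, "max_output_tokens": 1024}
)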

Available Vertex AI Models

note

This list only displays some of the available models. For a comprehensive list, refer to Vertex AI's official documentation.

Below is a list of commonly used Gemini models:

gemini-2.0-pro-exp-02-05
gemini-2.0-flash
gemini-2.0-flash-001
gemini-2.0-flash-002
gemini-2.0-flash-lite
gemini-2.0-flash-lite-001
gemini-1.5-pro
gemini-1.5-pro-001
gemini-1.5-pro-002
gemini-1.5-flash
gemini-1.5-flash-001
gemini-1.5-flash-002
gemini-1.0-pro
gemini-1.0-pro-001
gemini-1.0-pro-002
gemini-1.0-pro-vision
gemini-1.0-pro-vision-001