Vertex AI

You can also use Google Cloud's Vertex AI models, including Gemini and your own fine-tuned models, with DeepEval.

info

To use Vertex AI, you must have the following:

  1. A Google Cloud project with the Vertex AI API enabled
  2. Application Default Credentials set up by running:

     gcloud auth application-default login

Command Line

Run the following command in your terminal to configure DeepEval to use Gemini models through Vertex AI for all metrics.

deepeval set-gemini \
    --model-name=<model_name> \
    --project-id=<project_id> \
    --location=<location>

For example, <model_name> might be gemini-2.0-flash-001 and <location> us-central1. (Inline comments cannot follow a trailing backslash, as they would break the line continuation.)

info

The CLI command above sets Gemini (via Vertex AI) as the default provider for all metrics, unless overridden in Python code. To use a different default model provider, you must first unset Gemini:

deepeval unset-gemini

Python

Alternatively, you can specify your model directly in code using GeminiModel from DeepEval's model collection. By default, model_name is set to gemini-1.5-pro.

from deepeval.models import GeminiModel
from deepeval.metrics import AnswerRelevancyMetric

model = GeminiModel(
    model_name="gemini-1.5-pro",
    project="Your Project ID",
    location="us-central1"
)

answer_relevancy = AnswerRelevancyMetric(model=model)
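
Putting the pieces together, here is a minimal end-to-end sketch of evaluating a single test case with the metric above. It assumes deepeval is installed and your Application Default Credentials and project are configured; the project ID and the test case strings are hypothetical placeholders.

```python
from deepeval.models import GeminiModel
from deepeval.metrics import AnswerRelevancyMetric
from deepeval.test_case import LLMTestCase

# Configure Gemini (via Vertex AI) as the judge model for this metric.
model = GeminiModel(
    model_name="gemini-1.5-pro",
    project="your-project-id",  # hypothetical: replace with your GCP project ID
    location="us-central1",
)

answer_relevancy = AnswerRelevancyMetric(model=model, threshold=0.7)

# A hypothetical input/output pair from your LLM application.
test_case = LLMTestCase(
    input="What is the capital of France?",
    actual_output="The capital of France is Paris.",
)

# Scores the test case; the score and reason are produced by the judge model.
answer_relevancy.measure(test_case)
print(answer_relevancy.score, answer_relevancy.reason)
```

Because this makes live calls to Vertex AI, it requires valid credentials and will incur model usage against your project.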

Available Gemini Models

Below is a list of commonly used Gemini models:

gemini-2.0-pro-exp-02-05
gemini-2.0-flash
gemini-2.0-flash-001
gemini-2.0-flash-002
gemini-2.0-flash-lite
gemini-2.0-flash-lite-001
gemini-1.5-pro
gemini-1.5-pro-001
gemini-1.5-pro-002
gemini-1.5-flash
gemini-1.5-flash-001
gemini-1.5-flash-002
gemini-1.0-pro
gemini-1.0-pro-001
gemini-1.0-pro-002
gemini-1.0-pro-vision
gemini-1.0-pro-vision-001