
Amazon Bedrock

DeepEval supports Amazon Bedrock models that are available through the Bedrock Runtime Converse API for all evaluation metrics. To get started, you'll need to set up your AWS credentials.

note

AmazonBedrockModel requires aiobotocore and botocore. DeepEval will prompt you to install them if they are missing.

Setting Up Your API Key

To use Amazon Bedrock for DeepEval's LLM-based evaluations (metrics evaluated using an LLM), provide your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the CLI:

export AWS_ACCESS_KEY_ID=<your-aws-access-key-id>
export AWS_SECRET_ACCESS_KEY=<your-aws-secret-access-key>

Alternatively, if you're working in a notebook environment (e.g., Jupyter or Colab), set your keys in a cell:

%env AWS_ACCESS_KEY_ID=<your-aws-access-key-id>
%env AWS_SECRET_ACCESS_KEY=<your-aws-secret-access-key>

Python

To use Amazon Bedrock models for DeepEval metrics, define an AmazonBedrockModel and specify the model you want to use.

from deepeval.models import AmazonBedrockModel
from deepeval.metrics import AnswerRelevancyMetric

model = AmazonBedrockModel(
    model="anthropic.claude-3-opus-20240229-v1:0",
    region="us-east-1",
    generation_kwargs={"temperature": 0},
)
answer_relevancy = AnswerRelevancyMetric(model=model)
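
The metric can then be used on a test case as usual. Below is a minimal usage sketch, assuming DeepEval's standard LLMTestCase and measure flow; the input and actual_output values are placeholders:

from deepeval.test_case import LLMTestCase

# Placeholder example data; replace with your application's real input/output.
test_case = LLMTestCase(
    input="What is the capital of France?",
    actual_output="The capital of France is Paris.",
)

# The metric sends its evaluation prompts to the Bedrock model defined above.
answer_relevancy.measure(test_case)
print(answer_relevancy.score, answer_relevancy.reason)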

There are ZERO mandatory and SEVEN optional parameters when creating an AmazonBedrockModel. Every parameter can be passed explicitly as a constructor argument or configured through DeepEval settings / environment variables, with constructor arguments taking precedence. model and region are still required at runtime, but they may come from either source. See Environment variables and settings for the Bedrock-related environment variables; a fuller construction sketch follows the list below:

  • [Optional] model: A string specifying the Bedrock model identifier to call (e.g. anthropic.claude-3-opus-20240229-v1:0). Defaults to AWS_BEDROCK_MODEL_NAME if not passed; raises an error at runtime if unset.
  • [Optional] region: A string specifying the AWS region hosting your Bedrock endpoint (e.g. us-east-1). Defaults to AWS_BEDROCK_REGION if not passed; raises an error at runtime if unset.
  • [Optional] aws_access_key_id: A string specifying your AWS Access Key ID. Defaults to AWS_ACCESS_KEY_ID if not passed; if still omitted, falls back to the AWS default credentials chain.
  • [Optional] aws_secret_access_key: A string specifying your AWS Secret Access Key. Defaults to AWS_SECRET_ACCESS_KEY if not passed; if still omitted, falls back to the AWS default credentials chain.
  • [Optional] cost_per_input_token: A float specifying the per-input-token cost in USD. Defaults to AWS_BEDROCK_COST_PER_INPUT_TOKEN (else 0).
  • [Optional] cost_per_output_token: A float specifying the per-output-token cost in USD. Defaults to AWS_BEDROCK_COST_PER_OUTPUT_TOKEN (else 0).
  • [Optional] generation_kwargs: A dictionary of generation parameters that will be sent to Bedrock as inferenceConfig. Available keys may vary by the Bedrock model you choose. See the AWS Bedrock inference parameters docs.
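
For reference, here is a construction sketch with all seven parameters set explicitly; any of them could equally come from the environment variables listed above. The cost figures below are placeholders, not official Bedrock pricing:

from deepeval.models import AmazonBedrockModel

# Every value here could instead be supplied via environment variables
# (e.g. AWS_BEDROCK_MODEL_NAME, AWS_BEDROCK_REGION); constructor arguments
# take precedence. Cost values are placeholders for illustration only.
model = AmazonBedrockModel(
    model="anthropic.claude-3-opus-20240229-v1:0",
    region="us-east-1",
    aws_access_key_id="<your-aws-access-key-id>",
    aws_secret_access_key="<your-aws-secret-access-key>",
    cost_per_input_token=0.000015,   # placeholder USD cost per input token
    cost_per_output_token=0.000075,  # placeholder USD cost per output token
    generation_kwargs={"temperature": 0, "topP": 0.9, "maxTokens": 1024},
)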
tip

Pass generation parameters like temperature, topP, or maxTokens via generation_kwargs (they are sent as inferenceConfig).

Extra **kwargs passed to AmazonBedrockModel(...) are forwarded to the underlying Bedrock client (aiobotocore/botocore) and are not treated as generation parameters.
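
For example, assuming the standard botocore client argument endpoint_url is accepted by the forwarded **kwargs (this is an illustration, not a documented AmazonBedrockModel parameter):

from deepeval.models import AmazonBedrockModel

# endpoint_url is a standard botocore client argument; whether any given extra
# keyword is meaningful depends on the underlying aiobotocore/botocore client.
model = AmazonBedrockModel(
    model="anthropic.claude-3-opus-20240229-v1:0",
    region="us-east-1",
    endpoint_url="https://bedrock-runtime.us-east-1.amazonaws.com",
)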

Available Amazon Bedrock Models

note

This list displays only some of the available models. For a comprehensive list, refer to Amazon Bedrock's official documentation.

Below is a list of commonly used Amazon Bedrock foundation models:

  • anthropic.claude-3-opus-20240229-v1:0
  • anthropic.claude-3-sonnet-20240229-v1:0
  • anthropic.claude-opus-4-20250514-v1:0
  • anthropic.claude-opus-4-1-20250805-v1:0
  • anthropic.claude-sonnet-4-20250514-v1:0
  • anthropic.claude-sonnet-4-5-20250929-v1:0
  • anthropic.claude-haiku-4-5-20251001-v1:0
  • amazon.titan-text-express-v1
  • amazon.titan-text-premier-v1:0
  • amazon.nova-micro-v1:0
  • amazon.nova-lite-v1:0
  • amazon.nova-pro-v1:0
  • amazon.nova-premier-v1:0
  • meta.llama4-maverick-17b-instruct-v1:0
  • meta.llama4-maverick-17b-instruct-128k-v1:0
  • meta.llama4-scout-17b-instruct-v1:0
  • meta.llama4-scout-17b-instruct-128k-v1:0
  • mistral.mistral-large-2407-v1:0
  • mistral.mistral-large-2411-v1:0
  • mistral.pixtral-large-2411-v1:0
  • mistral.pixtral-large-2502-v1:0
  • mistral.pixtral-large-2511-v1:0
  • openai.gpt-oss-20b-1:0
  • openai.gpt-oss-120b-1:0