
LM Studio

deepeval supports running evaluations using local LLMs that expose OpenAI-compatible APIs. One such provider is LM Studio, a user-friendly desktop app for running models locally.

Command Line

To start using LM Studio with deepeval, follow these steps:

  1. Make sure LM Studio is running with a model loaded. The default base URL for LM Studio is: http://localhost:1234/v1/.
  2. Run the following command in your terminal to connect deepeval to LM Studio:
deepeval set-local-model \
    --model=<model_name> \
    --base-url="http://localhost:1234/v1/"
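Before pointing deepeval at LM Studio, it can help to confirm the server is actually reachable. The sketch below uses only the Python standard library and assumes LM Studio's default port; the `/v1/models` endpoint is part of the OpenAI-compatible API and lists the models currently loaded:

```python
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import URLError

BASE_URL = "http://localhost:1234/v1/"  # LM Studio's default base URL

# The OpenAI-compatible API exposes a models endpoint that lists
# the models currently loaded in LM Studio.
models_endpoint = urljoin(BASE_URL, "models")

def server_is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint responds with HTTP 200, False otherwise."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False
```

If `server_is_up(models_endpoint)` returns False, start LM Studio and enable its local server before running `deepeval set-local-model`.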

Reverting to OpenAI

To switch back to using OpenAI’s hosted models, run:

deepeval unset-local-model
