Evaluation Models
LM Studio
deepeval supports running evaluations using local LLMs that expose OpenAI-compatible APIs. One such provider is LM Studio, a user-friendly desktop app for running models locally.
Command Line
To start using LM Studio with deepeval, follow these steps:
- Make sure LM Studio is running. The typical base URL for LM Studio is http://localhost:1234/v1/.
- Run the following command in your terminal to connect deepeval to LM Studio:
deepeval set-local-model \
    --model=<model_name> \
    --base-url="http://localhost:1234/v1/"

Reverting to OpenAI
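Before pointing deepeval at the local endpoint, it can be useful to confirm that LM Studio's server is actually reachable. The sketch below is a minimal, hypothetical helper (not part of deepeval) that probes the OpenAI-compatible /models route at the base URL; the URL and timeout are assumptions you may adjust.

```python
import urllib.request


def server_is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an OpenAI-compatible server answers at base_url.

    Probes the standard /models listing route exposed by
    OpenAI-compatible servers such as LM Studio.
    """
    url = base_url.rstrip("/") + "/models"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False


if __name__ == "__main__":
    # Default LM Studio endpoint (assumption: server on localhost:1234).
    print(server_is_up("http://localhost:1234/v1/"))
```

If this prints False, start LM Studio's local server (or check the port) before running the set-local-model command above.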
To switch back to using OpenAI’s hosted models, run:
deepeval unset-local-model