dspy.OllamaLocal

note

Adapted from documentation provided by https://github.com/insop

Ollama is a tool for running LLMs locally, such as Mistral, Llama2, and Phi. The following instructions cover installing Ollama and running a model with it.

Prerequisites

Install Ollama by following the instructions on the Ollama website: https://ollama.com

Download model: ollama pull

Download a model by running the ollama pull command, for example Mistral, Llama2, or Phi.

# download mistral
ollama pull mistral

The Ollama model library lists the other models you can download: https://ollama.com/library

Running an Ollama model

Run model: ollama run

You can test a model by running it interactively with the ollama run command.

# run mistral
ollama run mistral
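
Ollama also exposes a local REST API, by default at http://localhost:11434, and this is what clients such as dspy.OllamaLocal send requests to. Before wiring the model into DSPy, you can check that the server is reachable with a raw HTTP request. A minimal sketch, assuming the default port, the mistral model pulled above, and the requests library:

import requests

# Query the local Ollama server directly; the /api/generate endpoint
# takes a model name and a prompt, and stream=False returns one JSON body.
response = requests.post(
    'http://localhost:11434/api/generate',
    json={'model': 'mistral', 'prompt': 'Why is the sky blue?', 'stream': False},
)
print(response.json()['response'])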

Sending requests to the server

Here is the code to load a model through Ollama:

import dspy

lm = dspy.OllamaLocal(model='mistral')
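
Once the client is created, the usual next step is to register it as DSPy's default LM and call it through a module. A minimal end-to-end sketch, assuming the mistral model pulled above and DSPy's settings.configure and Predict APIs; the string signature 'question -> answer' is just an illustrative example:

import dspy

# Assumes the Ollama server is running locally with mistral pulled.
lm = dspy.OllamaLocal(model='mistral')
dspy.settings.configure(lm=lm)  # make it the default LM for DSPy modules

# A simple question-answering module built from a string signature.
qa = dspy.Predict('question -> answer')
result = qa(question='What is the capital of France?')
print(result.answer)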