Using Local LLMs (Ollama)

Ollama lets you run LLMs locally for privacy and offline translation.

Install Ollama

  1. Download and install from https://ollama.ai
  2. Start Ollama
  3. Pull a model (example):
```sh
ollama pull llama3
```
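Once a model is pulled, the local server can be reached over HTTP. The sketch below, using only the standard library, shows a single non-streaming request to Ollama's `/api/generate` endpoint on its default port (11434); the helper and prompt are illustrative, not part of Supervertaler.

```python
# Minimal sketch of querying a local Ollama server over its HTTP API.
# Endpoint and payload follow Ollama's /api/generate: the request body
# is {"model", "prompt", "stream"} and the reply carries a "response" field.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(model: str, prompt: str) -> dict:
    """Request body for a single non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a pulled model and return its response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Calling `generate("llama3", "Say hello.")` with the server running returns the model's completion; nothing leaves the machine.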

Use in Supervertaler

Once Ollama is installed and running, Supervertaler can use it as a provider.
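To make the provider idea concrete, here is a hedged sketch of what a translation request to a local model can look like. The endpoint (`/api/chat`) and message format are Ollama's; the prompt wording and helper names are illustrative assumptions, not Supervertaler's actual implementation.

```python
# Illustrative translation call against Ollama's /api/chat endpoint.
# The system prompt below is a made-up example, not Supervertaler's own.
import json
import urllib.request

CHAT_URL = "http://localhost:11434/api/chat"

def translation_messages(text: str, source: str, target: str) -> list:
    """Chat messages asking the model to translate the given text."""
    return [
        {"role": "system",
         "content": (f"You are a translator. Translate from {source} "
                     f"to {target}. Reply with the translation only.")},
        {"role": "user", "content": text},
    ]

def translate(model: str, text: str, source: str, target: str) -> str:
    """Run one non-streaming chat turn and return the assistant's text."""
    body = {
        "model": model,
        "messages": translation_messages(text, source, target),
        "stream": False,
    }
    req = urllib.request.Request(
        CHAT_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama's chat reply nests the text under message.content.
        return json.load(resp)["message"]["content"]
```

With the server running, `translate("llama3", "Good morning", "English", "Dutch")` returns the translated string from the local model.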