Using Local LLMs (Ollama)
Ollama lets you run open LLMs on your own machine, so translations stay private and work offline.
Install Ollama
- Download and install from https://ollama.ai
- Start Ollama (it serves a local API on port 11434 by default)
- Pull a model (example):
  ollama pull llama3

Use in Supervertaler
Once Ollama is installed and running, Supervertaler can use it as a provider.
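Under the hood, providers like this talk to Ollama over its local HTTP API. The sketch below shows what a minimal, non-streaming translation request to Ollama's `/api/generate` endpoint could look like; the helper name `build_translate_request` and the prompt wording are illustrative assumptions, not Supervertaler's actual code.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_translate_request(text, source_lang, target_lang, model="llama3"):
    """Build a non-streaming payload for Ollama's /api/generate endpoint."""
    prompt = (
        f"Translate the following {source_lang} text to {target_lang}. "
        f"Reply with the translation only.\n\n{text}"
    )
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_translate_request("Goedemorgen", "Dutch", "English")
print(payload["model"])  # llama3

# Sending the request requires Ollama to be running locally:
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The `"stream": False` field asks Ollama to return the full translation in a single JSON response rather than token-by-token chunks, which keeps a simple integration straightforward.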