Run open-source LLMs locally on your computer
Ollama makes it easy to run powerful open-source models such as Llama 3, Mistral, Gemma, and Phi locally on your macOS, Linux, or Windows machine. No cloud connection is required, so your prompts and data stay completely private. It offers a simple command-line interface and a growing library of ready-to-run models.
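A typical session looks something like the following. This is a sketch that assumes Ollama is already installed; `llama3` is one example model name from the public library, and any other available model tag works the same way:

```shell
# Download a model from the Ollama library
# (llama3 is an example; substitute any model tag you like)
ollama pull llama3

# Start an interactive chat session with the model;
# type your prompt at the >>> prompt, /bye to exit
ollama run llama3

# Show the models downloaded on this machine
ollama list
```

Models are fetched once and cached locally, so subsequent `ollama run` invocations start immediately without re-downloading.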