Ollama
Run LLMs on your own computer with a single command — no GPU required for smaller models
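The "single command" claim is literal: once Ollama is installed, downloading and chatting with a model is one terminal line. A typical session might look like the transcript below (model names like `llama3.2` and `mistral` are examples; check the Ollama model library for what's currently available):

```
$ ollama run llama3.2     # downloads the model on first run, then opens a chat
>>> Why is the sky blue?
...
$ ollama list             # show the models you've already downloaded
$ ollama pull mistral     # fetch a model without starting a chat
```

Everything runs locally, so once a model is pulled, no internet connection is needed to chat with it.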
LM Studio
Desktop app to discover, download, and run local LLMs with a gorgeous interface
LM Studio is the most polished desktop GUI for local AI, period. If Ollama feels too command-line-heavy and you'd rather click through a visual interface, LM Studio gives you an app-store experience for AI models: browse what's available, check benchmarks, download with one click, and start chatting, all in a single window. The side-by-side model comparison is brilliant for deciding which model best fits your needs. It's optimized for Apple Silicon, where it runs Llama-class models remarkably smoothly, and its local server mode lets other apps connect via an OpenAI-compatible API. The main limitation is that it's proprietary software with a free tier, so you're trusting a company rather than a community project. Still, as a daily driver for exploring local AI, it's hard to beat.
Good first choice if you want a practical tool without spending the afternoon reading developer docs.
Browse available models, see benchmarks, and test them all without touching a terminal
Optimized for Apple Silicon chips — runs Llama 3 8B smoothly on M1/M2/M3 Macs
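The local server mode mentioned above can be scripted from any language. Below is a minimal sketch using only the Python standard library, assuming LM Studio's server is running on its default port 1234 with a model loaded; the model identifier `llama-3-8b-instruct` is a placeholder for whatever name LM Studio shows for your loaded model. The endpoint follows the OpenAI chat-completions request shape.

```python
import json
import urllib.request

# LM Studio's local server default address; the port is configurable
# in the app's server tab, 1234 is just the default.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, model: str = "llama-3-8b-instruct") -> dict:
    """Build an OpenAI-style chat-completion payload.

    The model name is a placeholder; use the identifier LM Studio
    displays for the model you actually have loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask(prompt: str) -> str:
    """Send a prompt to the local server (requires LM Studio to be running)."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-style response shape: first choice's message content.
    return body["choices"][0]["message"]["content"]


# Usage (with LM Studio's server running):
#   print(ask("Summarize the plot of Hamlet in one line."))
```

Because the request never leaves localhost, this works fully offline once the model is downloaded, which is the whole point of running models locally.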