A powerful AI chat application that runs entirely on your computer. No cloud servers, no subscriptions, no data collection. Just you and your AI.
Everything you need for an exceptional AI chat experience, running entirely on your machine.
All processing happens locally. Your conversations never leave your computer.
Create AI personas with unique personalities, expertise, and communication styles.
Analyze images, extract text, and work with visual content using vision models.
Listen to AI responses with natural-sounding voices in multiple languages.
Local Assistant uses Ollama to run large language models directly on your hardware. Choose from dozens of open-source models, including Llama, Mistral, and more.
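To picture how a client like Local Assistant talks to Ollama: Ollama exposes a local HTTP API (port 11434 by default), and a chat turn is one JSON request to its `/api/chat` endpoint. A minimal sketch — the helper names and the `llama3` model name are illustrative, not Local Assistant's actual code:

```python
import json
import urllib.request

# Ollama's default local API address; no data leaves your machine.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming chat request for Ollama's /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one JSON reply instead of streamed chunks
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def chat(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Calling `chat("llama3", "Hello")` assumes Ollama is running and the model has already been pulled (e.g. with `ollama pull llama3`).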
A glimpse of what you can do with Local Assistant.
Get started in minutes. Free forever, no account required.
Python 3.10+ is required. Make sure to check "Add Python to PATH" during installation.
Ollama runs AI models locally on your computer. It's the AI engine that powers Local Assistant.
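After installing Ollama, it's worth verifying the `ollama` binary is on your PATH before launching Local Assistant. A hedged sketch, assuming the standard CLI and its `--version` flag:

```python
# Report whether the Ollama CLI is installed and, if so, its version.
import shutil
import subprocess

def ollama_status() -> str:
    """Return the installed Ollama version, or a hint if it is missing."""
    if shutil.which("ollama") is None:
        return "ollama not found on PATH - install it from https://ollama.com"
    out = subprocess.run(["ollama", "--version"], capture_output=True, text=True)
    return out.stdout.strip() or out.stderr.strip()

print(ollama_status())
```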
Download the ZIP, extract to a folder of your choice, then double-click start.bat to run.
Download and extract to a folder of your choice, e.g. C:\LocalAssistant
The first run of start.bat automatically creates a virtual environment and installs all dependencies.
Your browser opens to http://localhost:5000 — select a model and go
Or from a terminal:

```shell
cd C:\LocalAssistant
start.bat
```
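If you script the launch yourself, you can wait for the UI to come up before opening a browser. A minimal sketch, assuming the default http://localhost:5000 address from the steps above:

```python
# Poll the Local Assistant UI until it responds (or the timeout expires).
import time
import urllib.error
import urllib.request

def wait_for_server(url: str = "http://localhost:5000",
                    timeout: float = 30.0) -> bool:
    """Return True once `url` answers an HTTP request, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2):
                return True
        except (urllib.error.URLError, OSError):
            time.sleep(0.5)  # server not up yet; retry shortly
    return False
```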
Need help? Check out our detailed installation guide.
Make sure your computer meets the minimum requirements.
| Component | Minimum | Recommended |
|---|---|---|
| CPU | 4-core (i5/Ryzen 5) | 8+ cores (i7/Ryzen 7) |
| RAM | 8GB | 16GB+ |
| Storage | 10GB free | 50GB+ SSD |
| GPU | Not required | NVIDIA GPU with 6GB+ VRAM |
Take control of your AI experience with complete privacy and zero cost. Download Local Assistant today and enjoy powerful AI conversations — entirely on your terms.