100% Private & Local

Your Private AI Companion

A powerful AI chat application that runs entirely on your computer. No cloud servers, no subscriptions, no data collection. Just you and your AI.

100%
Private & Local
$0
Forever Free
Unlimited Usage

Powerful Features

Everything you need for an exceptional AI chat experience, running entirely on your machine.

Complete Privacy

All processing happens locally. Your conversations never leave your computer.

Custom Characters

Create AI personas with unique personalities, expertise, and communication styles.

Vision & Images

Analyze images, extract text, and work with visual content using vision models.

Text-to-Speech

Listen to AI responses with natural-sounding voices in multiple languages.

Web Search

Enable web search to get current information and real-time answers.

Export to PDF

Download conversations and summaries as professionally formatted PDFs.

Powered by Ollama

Local Assistant uses Ollama to run large language models directly on your hardware. Choose from dozens of open-source models including Llama, Mistral, and more.

Get Ollama
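Under the hood, Ollama exposes a local HTTP API (by default at http://localhost:11434) that chat front-ends like Local Assistant can call. A minimal sketch of a non-streaming chat request, assuming Ollama is installed and a model has been pulled (the model name here is just an example):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming chat request for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    """Send one prompt to a locally running Ollama server and return the reply."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Requires the Ollama server running and a model pulled first, e.g.:
#   ollama pull llama3.2:1b
# print(chat("llama3.2:1b", "Hello!"))
```

Because the request never leaves localhost, the prompt and the reply stay on your machine end to end.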

See It In Action

A glimpse of what you can do with Local Assistant.

Screenshots include:

  • Modern Chat Interface
  • Custom Characters
  • Image Analysis
  • Customizable Settings
  • PDF Export

Where does your data go?

Cloud AI Services

  • Data sent to remote servers
  • Conversations may be stored & trained on
  • Monthly subscription fees
  • Requires internet connection
VS

Local Assistant

  • Runs 100% on your machine
  • Zero data collection, ever
  • Completely free, no limits
  • Works fully offline

Download Local Assistant

Get started in minutes. Free forever, no account required.

Early Access

Step 1: Install Python

Python 3.10+ is required. Make sure to check "Add Python to PATH" during installation.

Get Python
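To confirm Python is installed and on your PATH, run a quick version check from a terminal (the launcher name varies by platform, which is an assumption of this sketch):

```shell
# Check that Python 3.10+ is on PATH.
# On Windows the launcher is usually "python" (or "py");
# on macOS/Linux it is often "python3".
python --version 2>/dev/null || python3 --version
```

If this prints something like "Python 3.12.x", you are ready for the next step; if the command is not found, re-run the installer and enable "Add Python to PATH".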

Step 2: Install Ollama

Ollama runs AI models locally on your computer. It's the AI engine that powers Local Assistant.

Get Ollama

Step 3: Download & Run

Download the ZIP, extract to a folder of your choice, then double-click start.bat to run.

Download v1.0.0

Quick Start

1
Extract the ZIP

Download and extract to a folder of your choice, e.g. C:\LocalAssistant

2
Double-click start.bat

First run automatically creates a virtual environment and installs all dependencies

3
Start chatting!

Your browser opens to http://localhost:5000 — select a model and go

CLI alternative: cd C:\LocalAssistant then start.bat

Need help? Check out our detailed installation guide.

System Requirements

Make sure your computer meets the minimum requirements.

Hardware Requirements

Component   Minimum               Recommended
CPU         4-core (i5/Ryzen 5)   8+ cores (i7/Ryzen 7)
RAM         8GB                   16GB+
Storage     10GB free             50GB+ SSD
GPU         Not required          NVIDIA 6GB+ VRAM

RAM by Model Size

  • 8GB RAM: Small models (1-3B) - llama3.2:1b, phi3:mini
  • 16GB RAM: Medium models (7-8B) - llama3.1:8b, mistral
  • 24GB+ RAM: Large models (13B+) - codellama:13b
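The tiers above can be sanity-checked with a rough back-of-the-envelope estimate: a 4-bit quantized model needs very roughly half a byte per parameter, plus fixed overhead for the KV cache, runtime, and OS. The constants below (0.6 bytes per parameter, 2 GB overhead) are illustrative assumptions, not Ollama's actual figures:

```python
def estimate_ram_gb(params_billion: float, bytes_per_param: float = 0.6,
                    overhead_gb: float = 2.0) -> float:
    """Rough lower-bound RAM estimate for a quantized model.

    bytes_per_param ~0.6 approximates 4-bit quantization with metadata;
    overhead_gb covers the KV cache, runtime, and OS headroom. Both are
    illustrative assumptions, not measured values.
    """
    return params_billion * bytes_per_param + overhead_gb

for size in (1, 8, 13):
    print(f"{size}B params -> ~{estimate_ram_gb(size):.1f} GB")
```

An 8B model comes out around 7 GB by this estimate; the RAM tiers above are deliberately more conservative, leaving headroom for long contexts and everything else running on the machine.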

View full requirements

Ready to Get Started?

Take control of your AI experience with complete privacy and zero cost. Download Local Assistant today and enjoy powerful AI conversations — entirely on your terms.