Leif — Local-First AI Assistant
Private. Fast. Yours. A local, voice-driven personal assistant for your computer.
What Leif does
- Runs local models for private, low-latency responses.
- Understands context on your machine: files, notes, and apps.
- Launches scripts/CLI tasks and returns results inline.
- Dictation + hotkeys for hands-free commands anywhere.
- Retrieval with source links so you can verify answers.
Why it’s different
- Local-first by default — cloud optional per action.
- Composable actions — wire any tool via small adapters.
- Deterministic guardrails — explicit permissions.
- Fast startup, small footprint (Tauri + Rust).
- Offline-friendly — stays useful without internet.
Local Models
Ollama-backed models on your box. No external calls unless allowed.
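As a sketch of what "no external calls" means in practice: requests go to Ollama's local HTTP API on the loopback interface. The model name below is an illustrative assumption, not a shipped default.

```typescript
// Sketch: building a request for Ollama's local generate endpoint.
// Everything targets localhost, so no data leaves the machine.

interface OllamaRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

function buildLocalRequest(model: string, prompt: string): { url: string; body: OllamaRequest } {
  return {
    // Ollama's default local port; loopback only.
    url: "http://localhost:11434/api/generate",
    body: { model, prompt, stream: false },
  };
}

const req = buildLocalRequest("llama3", "Summarize my notes on Tauri.");
console.log(req.url);
```

The actual call (e.g. via `fetch`) happens only after the guardrail check for the "network: local" capability passes.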
Retrieval
Index your docs and projects. Answers include citations.
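The citation mechanism can be sketched as a nearest-neighbor lookup over pre-computed embeddings, where every hit carries its source path. The documents and vectors here are toy assumptions, not Leif's real index format.

```typescript
// Sketch: local retrieval over pre-computed embeddings. The best match is
// returned together with its source path, so answers can link back to it.

interface Doc {
  path: string;     // source link used as the citation
  vector: number[]; // embedding (toy 3-d vectors here)
  text: string;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function retrieve(query: number[], index: Doc[]): Doc {
  // Linear scan: fine for a personal corpus; a vector store replaces this at scale.
  return index.reduce((best, d) =>
    cosine(query, d.vector) > cosine(query, best.vector) ? d : best);
}

const index: Doc[] = [
  { path: "notes/tauri.md", vector: [1, 0, 0], text: "Tauri setup notes" },
  { path: "notes/ollama.md", vector: [0, 1, 0], text: "Ollama model list" },
];

const hit = retrieve([0.9, 0.1, 0], index);
console.log(`${hit.text} (source: ${hit.path})`);
```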
Automations
Trigger scripts, APIs, and workflows with parameters.
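A minimal automation runner might look like the following: invoke a local command with parameters and capture its output inline. `echo` stands in for a real script; the function name is an illustrative assumption.

```typescript
// Sketch: run a local command with parameters and return its output inline,
// rather than streaming it to a terminal.
import { execFileSync } from "node:child_process";

function runAutomation(command: string, params: string[]): string {
  // execFileSync avoids shell interpolation of the parameters.
  return execFileSync(command, params, { encoding: "utf8" }).trim();
}

console.log(runAutomation("echo", ["deploy", "--env=staging"]));
```

Using `execFileSync` (rather than a shell string) keeps parameters out of shell parsing, which matters once the shell capability is gated by guardrails.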
Guardrails
Per-action prompts for filesystem, network, and shell.
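"Deterministic guardrails" can be read as pure set containment: an action declares the capabilities it needs, and it runs only if every one has been explicitly granted. The capability and action names below are illustrative assumptions.

```typescript
// Sketch: deterministic per-action permission check. No heuristics; an action
// runs only if the user granted every capability it declares.

type Capability = "filesystem" | "network" | "shell";

interface Action {
  name: string;
  needs: Capability[];
}

function isAllowed(action: Action, granted: Set<Capability>): boolean {
  // Pure set containment: same inputs always give the same answer.
  return action.needs.every((c) => granted.has(c));
}

const backupNotes: Action = { name: "backup-notes", needs: ["filesystem", "shell"] };
const granted = new Set<Capability>(["filesystem"]);

// "shell" was never granted, so the action is denied.
console.log(isAllowed(backupNotes, granted));
```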
Hotkeys
Global shortcuts to talk, ask, or run playbooks instantly.
Profiles
Switch personas (coder/writer/analyst) with tuned defaults.
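Tuned defaults per persona could be a simple lookup table; the model names, temperatures, and prompts below are illustrative assumptions, not shipped presets.

```typescript
// Sketch: persona profiles with tuned defaults, selected by name.

interface Profile {
  model: string;
  temperature: number;
  systemPrompt: string;
}

const profiles: Record<string, Profile> = {
  coder:   { model: "codellama", temperature: 0.2, systemPrompt: "Be precise; prefer code." },
  writer:  { model: "llama3",    temperature: 0.8, systemPrompt: "Favor clear, flowing prose." },
  analyst: { model: "llama3",    temperature: 0.3, systemPrompt: "Show reasoning and cite sources." },
};

function switchProfile(name: string): Profile {
  const p = profiles[name];
  if (!p) throw new Error(`unknown profile: ${name}`);
  return p;
}

console.log(switchProfile("coder").model);
```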
Stack & Integrations
- Tauri (UI shell) + Rust (core) + Node bridges where needed.
- Ollama models; optional cloud LLMs per rule.
- Adapters for ClickUp, Slack, Git, and shell scripts.
- Embeddings + vector store for local retrieval.
- Pluggable actions via a tiny JSON manifest.
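To make the last point concrete, here is one way a tiny JSON manifest and its loader could look. The field names are assumptions about the manifest shape, not a published schema.

```typescript
// Sketch: a hypothetical action manifest plus a loader that validates
// required fields before the action is registered.

interface ActionManifest {
  name: string;
  command: string;       // executable or script to run
  params: string[];      // default arguments for the command
  permissions: string[]; // capabilities the user must grant (e.g. "shell")
}

function loadManifest(json: string): ActionManifest {
  const m = JSON.parse(json);
  for (const field of ["name", "command", "params", "permissions"]) {
    if (!(field in m)) throw new Error(`manifest missing field: ${field}`);
  }
  return m as ActionManifest;
}

const manifest = loadManifest(`{
  "name": "git-status",
  "command": "git",
  "params": ["status", "--short"],
  "permissions": ["shell"]
}`);

console.log(manifest.name);
```

Validating up front keeps a malformed adapter from reaching the guardrail layer with missing permission declarations.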