Libre WebUI

Privacy-First AI Chat Interface

Self-hosted • Open Source • Extensible
Enterprise support by Kroonen AI

[Screenshot: Dark theme]


GDPR Ready • HIPAA Compatible • SOC 2 Ready

Website • Documentation • 𝕏 • Sponsor • Get Started


Why Libre WebUI?

A simple, self-hosted interface for AI chat. Run it locally with Ollama, connect to OpenAI, Anthropic, HuggingFace, or 10+ providers—all from one UI.

  • Your data stays yours — Zero telemetry, fully self-hosted
  • Extensible plugin system — Ollama, OpenAI, Anthropic, and any OpenAI-compatible API
  • Simple & focused — Keyboard shortcuts, dark mode, responsive design

Features

Core Experience

  • Real-time streaming chat
  • Dark/light themes
  • VS Code-style keyboard shortcuts
  • Mobile-responsive design
  • Native Desktop App — macOS (Windows & Linux coming soon)

AI Providers

  • Local: Ollama (full integration)
  • Cloud: OpenAI, Anthropic, Google, Groq, Mistral, OpenRouter, HuggingFace, and more
  • HuggingFace Hub — 1M+ models for chat, TTS, image gen, embeddings, STT
  • Image Generation — ComfyUI with Flux models
  • Plugin System — Add any OpenAI-compatible API via JSON config
  • Plugin Variables — Per-plugin configurable settings (temperature, endpoint, etc.)
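
A plugin definition is a small JSON file describing an OpenAI-compatible endpoint. As a minimal sketch — the field names below are illustrative assumptions, not the project's actual schema, so check the documentation for the real format — it might look like:

```json
{
  "id": "my-provider",
  "name": "My Provider",
  "base_url": "https://api.example.com/v1",
  "api_key_env": "MY_PROVIDER_API_KEY",
  "models": ["example-model-small", "example-model-large"]
}
```

Pointing `base_url` at any OpenAI-compatible server (and naming the environment variable holding its key) is what lets one config format cover many providers.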

Advanced Capabilities

  • Document Chat (RAG) — Upload PDFs, chat with your docs
  • Custom Personas — AI personalities with memory
  • Interactive Artifacts — Live HTML, SVG, code preview
  • Text-to-Speech — Multiple voices and providers
  • SSO Authentication — GitHub, Hugging Face OAuth

Security

  • AES-256-GCM encryption
  • Role-based access control
  • Enterprise compliance ready

Quick Start

Requirements: Ollama (for local AI) or API keys for cloud providers

One Command Install

npx libre-webui

That's it. The app opens at http://localhost:8080

Homebrew (macOS)

# CLI version (includes backend server)
brew tap libre-webui/tap
brew install libre-webui
libre-webui

# Or desktop app
brew install --cask libre-webui

Run as a background service:

brew services start libre-webui

Docker

Setup options:

  • Bundled Ollama (CPU): docker-compose up -d
  • Bundled Ollama (NVIDIA GPU): docker-compose -f docker-compose.gpu.yml up -d
  • External Ollama (already running on host): docker-compose -f docker-compose.external-ollama.yml up -d

Access at http://localhost:8080

Development builds (unstable)

Warning: Development builds are automatically generated from the dev branch and may contain experimental features, breaking changes, or bugs. Use at your own risk and do not use in production environments.

Setup options:

  • Dev + Bundled Ollama (CPU): docker-compose -f docker-compose.dev.yml up -d
  • Dev + Bundled Ollama (NVIDIA GPU): docker-compose -f docker-compose.dev.gpu.yml up -d
  • Dev + External Ollama: docker-compose -f docker-compose.dev.external-ollama.yml up -d

Development builds use separate data volumes (libre_webui_dev_data) to prevent conflicts with stable installations.

To pull the latest dev image manually:

docker pull librewebui/libre-webui:dev

Kubernetes (Helm)

helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui
Helm configuration options
# With external Ollama
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.enabled=false \
  --set ollama.external.enabled=true \
  --set ollama.external.url=http://my-ollama:11434

# With NVIDIA GPU support
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.gpu.enabled=true

# With Ingress
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ingress.enabled=true \
  --set ingress.hosts[0].host=chat.example.com

See helm/libre-webui/values.yaml for all configuration options.

Development Setup

# 1. Clone the repo
git clone https://github.com/libre-webui/libre-webui
cd libre-webui

# 2. Configure environment
cp backend/.env.example backend/.env

# 3. Install and run
npm install && npm run dev

Configuration

Edit backend/.env to add your API keys:

# Local AI (Ollama)
OLLAMA_BASE_URL=http://localhost:11434

# Cloud AI Providers (add the ones you need)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
HUGGINGFACE_API_KEY=hf_...

Desktop App (In Development)

Note: The desktop app is currently in active development. The macOS build is pending Apple notarization, which may cause security warnings or installation issues on some systems. We're working to resolve this. Feedback and bug reports are welcome!

Download the native desktop app from GitHub Releases:

Platform status:

  • macOS (Apple Silicon): Beta (.dmg or .zip)
  • Windows: Coming soon
  • Linux: Coming soon


Enterprise Services

Need a custom deployment? Kroonen AI provides professional services for Libre WebUI deployments.

Services:

  • On-premise & cloud deployment: HIPAA, SOC 2, air-gapped environments
  • SSO integration: Okta, Azure AD, SAML, LDAP
  • Custom development: Integrations, white-labeling, plugins
  • SLA-backed support: Priority response, dedicated channel

Contact: enterprise@kroonen.ai | Learn more →


Support Development

Libre WebUI is built and maintained independently. Your support keeps it free and open source.


Become a Sponsor — Help fund active development




Apache 2.0 License • Copyright © 2025–present Libre WebUI™

Built & maintained by Kroonen AI • Enterprise Support
