
docs: add OpenUI + Ollama local setup tutorial with troubleshooting (issue #1) #20

Open
shogun444 wants to merge 1 commit into thesysdev:main from shogun444:docs/openui-ollama-setup

Conversation


@shogun444 shogun444 commented Apr 9, 2026

Summary

This PR replaces the hardcoded model selection in route.ts with `process.env.MODEL || 'gpt-5.4'`, allowing the model to be overridden through environment configuration while keeping the existing default behavior intact.


Changes

  • Updated a single line in route.ts to use process.env.MODEL, removing the hardcoded model dependency
-const MODEL = "gpt-5.4";
+const MODEL = process.env.MODEL || "gpt-5.4";
  • Enabled flexible model configuration across environments
  • No breaking changes
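
The one-line change follows a common pattern worth isolating; a minimal sketch (the `resolveModel` helper and its surrounding wiring are illustrative, not part of route.ts):

```typescript
// Hypothetical helper mirroring the PR's one-line change in route.ts:
// prefer the MODEL environment variable, otherwise keep the old default.
export function resolveModel(env: NodeJS.ProcessEnv = process.env): string {
  return env.MODEL || "gpt-5.4";
}

// Equivalent to the patched line: const MODEL = process.env.MODEL || "gpt-5.4";
const MODEL = resolveModel();
```

Note that `||` also falls through on an empty string (`MODEL=""`), which is usually the desired behavior for env-var defaults.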

Here's the command:

docker run --rm -p 3000:3000 \
  -e OPENAI_BASE_URL=http://host.docker.internal:11434/v1 \
  -e OPENAI_API_KEY=ollama \
  -e MODEL=minimax-m2.7:cloud \
  openui-chat

Why This Change

OpenUI depends heavily on structured, instruction-following outputs.

In practice:

  • Small local models (3B–8B) frequently ignore instructions, hallucinate UI structure, and produce inconsistent results
  • Cloud models generate stable and correct UI output

Relying purely on local models is not practical without modifying OpenUI internals or the prompt.

This change allows developers to:

  • Use local models for experimentation
  • Switch to reliable cloud models when needed
  • Avoid editing core files just to change models

Usage

Run with Docker (set model via MODEL):

docker run --rm -p 3000:3000 -e OPENAI_BASE_URL=http://host.docker.internal:11434/v1 -e OPENAI_API_KEY=ollama -e MODEL=minimax-m2.7:cloud openui-chat 

Or export the variable in your shell before starting OpenUI:

export MODEL=minimax-m2.7:cloud

Then run OpenUI as usual.
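
With these settings, the configured model flows into the chat request OpenUI sends to the OpenAI-compatible endpoint. A rough sketch (the `makeChatBody` helper is hypothetical; field names follow the standard /v1/chat/completions schema, and the actual route.ts wiring may differ):

```typescript
// Hypothetical: build an OpenAI-compatible chat request body using the
// env-driven model, falling back to the same default as route.ts.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

export function makeChatBody(
  prompt: string,
  model: string = process.env.MODEL || "gpt-5.4"
) {
  return {
    model, // e.g. "minimax-m2.7:cloud" when MODEL is set as in the docker command
    messages: [{ role: "user", content: prompt }] as ChatMessage[],
  };
}
```

Because the Ollama server at OPENAI_BASE_URL speaks the same chat schema, no other request fields need to change when switching between local and cloud models.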


Testing

  • Verified working end-to-end at http://localhost:3000
  • Tested with minimax-m2.7:cloud (recommended)
  • Local models tested but found unreliable for UI generation

Proof

UI breaking in small models
(screenshots: 2026-04-08 131401, 2026-04-09 145035, 2026-04-09 144739)

Results after using the cloud models
(screenshots: 2026-04-09 193315, 2026-04-09 193735, 2026-04-09 150951, 2026-04-09 150715)

Notes

  • Focuses on minimal setup and developer flexibility
  • Keeps Ollama as the local runtime
  • Allows seamless switching between local and cloud models

