Sets up OpenHands to run locally with LM Studio or other APIs

purohitdeep/openhands-setup

OpenHands + LM Studio Setup

A local autonomous AI agent setup using OpenHands (run in Docker) with LM Studio as the LLM provider.

Quick Start

  1. LM Studio:
    • Load a model (recommended: qwen2.5-coder-7b-instruct or similar).
    • Start the Local Server on port 1234.
  2. Start OpenHands:
    docker-compose up -d
  3. Access: open http://localhost:3000.
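The repository's own docker-compose.yml is authoritative; as a rough sketch of what such a file typically contains (the image name, tag, and mount paths below are assumptions, not taken from this repo):

```yaml
# Hypothetical sketch -- see the repo's docker-compose.yml for the real file.
services:
  openhands:
    image: docker.all-hands.dev/all-hands-ai/openhands:latest  # assumed image/tag
    ports:
      - "3000:3000"                        # UI at http://localhost:3000
    env_file:
      - .env                               # LLM_MODEL, LLM_BASE_URL, etc.
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # OpenHands spawns sandbox containers
      - ./workspace:/opt/workspace_base    # shared workspace directory (assumed path)
    extra_hosts:
      - "host.docker.internal:host-gateway"  # makes host.docker.internal resolve on Linux
```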

Configuration

Settings are in .env.

  • LLM_MODEL: Must match the model ID loaded in LM Studio, with a provider prefix (e.g., openai/qwen/qwen3-coder-30b; the openai/ prefix tells OpenHands to treat the endpoint as OpenAI-compatible).
  • LLM_BASE_URL: http://host.docker.internal:1234/v1 (required so the OpenHands container can reach LM Studio running on the host).

To switch models:

  1. Edit .env (uncomment desired model).
  2. Restart: docker-compose up -d (recreating the container applies the new .env values; docker-compose restart alone does not re-read .env).
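For reference, a .env along these lines would match the settings described above (the model IDs are illustrative examples; use whatever ID is actually loaded in LM Studio):

```shell
# Active model -- must match the ID shown in LM Studio,
# with the openai/ provider prefix.
LLM_MODEL=openai/qwen/qwen3-coder-30b

# Alternative model: uncomment this and comment out the line above.
# LLM_MODEL=openai/qwen2.5-coder-7b-instruct

# LM Studio's server as seen from inside the OpenHands container.
LLM_BASE_URL=http://host.docker.internal:1234/v1
```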

Troubleshooting

Run the pre-flight check to verify connectivity:

./preflight-check.sh
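The script's contents are not shown in this README; a minimal version of such a connectivity check might look like the following (the endpoint and port are assumed from the configuration above — LM Studio exposes an OpenAI-compatible /v1/models endpoint):

```shell
#!/bin/sh
# Minimal connectivity check: ask LM Studio's OpenAI-compatible
# /v1/models endpoint whether any model is being served.
BASE_URL="${LLM_BASE_URL:-http://localhost:1234/v1}"
echo "Checking LM Studio at $BASE_URL ..."
if curl -sf "$BASE_URL/models" >/dev/null 2>&1; then
  echo "OK: LM Studio is reachable"
else
  echo "FAIL: no response from $BASE_URL (is the Local Server running?)"
fi
```

The check fails gracefully rather than aborting, so it can be run before either service is up.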

File Structure

  • docker-compose.yml: Container definition.
  • .env: Configuration variables.
  • workspace/: Shared directory for agent files.
