AI-powered code documentation platform. Multi-model RAG pipeline generates interactive wikis with Mermaid diagrams, 3D visualizations, dependency graphs, and export to 5 formats. Next.js 15 + FastAPI + Claude/GPT/Gemini.

BetterCodeWiki

Originally based on DeepWiki-Open by AsyncFuncAI. BetterCodeWiki builds on that foundation with an enhanced UI, a 3D landing experience, improved diagrams, and additional features.

BetterCodeWiki automatically creates beautiful, interactive wikis for any GitHub, GitLab, or Bitbucket repository. Just enter a repo name, and BetterCodeWiki will:

  1. Analyze the code structure
  2. Generate comprehensive documentation
  3. Create visual diagrams to explain how everything works
  4. Organize it all into an easy-to-navigate wiki

Features

  • Instant Documentation: Turn any GitHub, GitLab, or Bitbucket repo into a wiki in seconds
  • Private Repository Support: Securely access private repositories with personal access tokens
  • Smart Analysis: AI-powered understanding of code structure and relationships
  • Beautiful Diagrams: Automatic Mermaid diagrams with animated flows and multi-color nodes
  • 3D Landing Experience: Immersive Three.js hero with interactive knowledge cube
  • Easy Navigation: Intuitive sidebar with search, table of contents, and reading mode
  • Ask Feature: Chat with your repository using RAG-powered AI to get accurate answers
  • DeepResearch: Multi-turn research process that thoroughly investigates complex topics
  • Multiple Model Providers: Support for Google Gemini, OpenAI, OpenRouter, and local Ollama models
  • Flexible Embeddings: Choose between OpenAI, Google AI, or local Ollama embeddings
  • Visual Dependency Graph: Force-directed graph visualization of code relationships
  • Export Options: Export to Markdown, JSON, Notion, Confluence, and HTML formats
  • Global Search: Command palette (Cmd+K) for quick navigation
  • Reading Mode: Distraction-free reading with Alt+R toggle
  • MCP Server: Expose wiki data to AI agents (Claude Desktop, Claude Code, Cursor, Windsurf) via Model Context Protocol

Quick Start

Option 1: Using Docker

```bash
# Clone the repository
git clone https://github.com/REDFOX1899/BetterCodeWiki.git
cd BetterCodeWiki

# Create a .env file with your API keys
echo "GOOGLE_API_KEY=your_google_api_key" > .env
echo "OPENAI_API_KEY=your_openai_api_key" >> .env
# Optional: Use Google AI embeddings instead of OpenAI
echo "DEEPWIKI_EMBEDDER_TYPE=google" >> .env
# Optional: Add OpenRouter API key
echo "OPENROUTER_API_KEY=your_openrouter_api_key" >> .env
# Optional: Add Ollama host if not local
echo "OLLAMA_HOST=your_ollama_host" >> .env
# Optional: Azure OpenAI
echo "AZURE_OPENAI_API_KEY=your_azure_openai_api_key" >> .env
echo "AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint" >> .env
echo "AZURE_OPENAI_VERSION=your_azure_openai_version" >> .env

# Run with Docker Compose
docker-compose up
```

For detailed instructions on using with Ollama and Docker, see Ollama Instructions.

Where to get API keys:

Option 2: Manual Setup

Step 1: Set Up Your API Keys

Create a .env file in the project root:

```
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
# Optional: Use Google AI embeddings
DEEPWIKI_EMBEDDER_TYPE=google
# Optional: OpenRouter models
OPENROUTER_API_KEY=your_openrouter_api_key
# Optional: Azure OpenAI
AZURE_OPENAI_API_KEY=your_azure_openai_api_key
AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint
AZURE_OPENAI_VERSION=your_azure_openai_version
# Optional: Ollama host
OLLAMA_HOST=your_ollama_host
```
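Before starting the services, it can help to sanity-check which providers your .env actually enables. A minimal Python sketch, using only the variable names shown above (the grouping logic is illustrative, not BetterCodeWiki's own):

```python
import os

# Provider -> the environment variables that enable it (names from the .env above)
PROVIDER_VARS = {
    "google": ["GOOGLE_API_KEY"],
    "openai": ["OPENAI_API_KEY"],
    "openrouter": ["OPENROUTER_API_KEY"],
    "azure": ["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_VERSION"],
}

def configured_providers(env):
    """Return providers whose required variables are all set and non-empty."""
    return [
        name for name, keys in PROVIDER_VARS.items()
        if all(env.get(k, "").strip() for k in keys)
    ]

if __name__ == "__main__":
    print(configured_providers(dict(os.environ)))
```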

Step 2: Start the Backend

```bash
python -m pip install poetry==2.0.1 && poetry install -C api
python -m api.main
```

Step 3: Start the Frontend

```bash
yarn install
yarn dev
```

Step 4: Use BetterCodeWiki!

  1. Open http://localhost:3000 in your browser
  2. Enter a GitHub, GitLab, or Bitbucket repository URL
  3. For private repositories, click "+ Add access tokens" and enter your personal access token
  4. Click "Generate Wiki" and watch the magic happen!

How It Works

BetterCodeWiki uses AI to:

  1. Clone and analyze the repository (including private repos with token authentication)
  2. Create embeddings of the code for smart retrieval
  3. Generate documentation with context-aware AI (Google Gemini, OpenAI, OpenRouter, Azure OpenAI, or Ollama)
  4. Create visual diagrams to explain code relationships
  5. Organize everything into a structured wiki
  6. Enable intelligent Q&A through the Ask feature
  7. Provide in-depth research capabilities with DeepResearch
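Steps 2 and 6 rest on embedding retrieval: code chunks are embedded once, and a question is answered from the most similar chunks. A toy sketch of that retrieval step, with placeholder vectors standing in for real embeddings (the actual pipeline lives in api/rag.py):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec, chunks, k=2):
    """chunks: list of (text, embedding). Return the k most similar texts."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

chunks = [
    ("def clone_repo(url): ...", [0.9, 0.1, 0.0]),
    ("README badge markup",      [0.1, 0.9, 0.0]),
    ("def checkout(ref): ...",   [0.8, 0.2, 0.1]),
]
print(top_k([1.0, 0.0, 0.0], chunks))  # the two git-related chunks rank first
```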
```mermaid
graph TD
    A[User inputs GitHub/GitLab/Bitbucket repo] --> AA{Private repo?}
    AA -->|Yes| AB[Add access token]
    AA -->|No| B[Clone Repository]
    AB --> B
    B --> C[Analyze Code Structure]
    C --> D[Create Code Embeddings]

    D --> M{Select Model Provider}
    M -->|Google Gemini| E1[Generate with Gemini]
    M -->|OpenAI| E2[Generate with OpenAI]
    M -->|OpenRouter| E3[Generate with OpenRouter]
    M -->|Local Ollama| E4[Generate with Ollama]
    M -->|Azure| E5[Generate with Azure]

    E1 --> E[Generate Documentation]
    E2 --> E
    E3 --> E
    E4 --> E
    E5 --> E

    D --> F[Create Visual Diagrams]
    E --> G[Organize as Wiki]
    F --> G
    G --> H[Interactive BetterCodeWiki]

    classDef process stroke-width:2px;
    classDef data stroke-width:2px;
    classDef result stroke-width:2px;
    classDef decision stroke-width:2px;

    class A,D data;
    class AA,M decision;
    class B,C,E,F,G,AB,E1,E2,E3,E4,E5 process;
    class H result;
```

Project Structure

```
BetterCodeWiki/
├── api/                  # Backend API server
│   ├── main.py           # API entry point
│   ├── api.py            # FastAPI implementation
│   ├── rag.py            # Retrieval Augmented Generation
│   ├── data_pipeline.py  # Data processing utilities
│   ├── mcp/              # MCP server (standalone)
│   │   └── server.py     # 5 tools for AI agent access to wikis
│   ├── pyproject.toml    # Python dependencies (Poetry)
│   └── poetry.lock       # Locked Python dependency versions
│
├── src/                  # Frontend Next.js app
│   ├── app/              # Next.js app directory
│   │   └── page.tsx      # Main application page (3D landing)
│   └── components/       # React components
│       ├── landing/      # 3D landing page components
│       ├── Mermaid.tsx   # Enhanced diagram renderer
│       └── ...           # Additional UI components
│
├── public/               # Static assets
├── package.json          # JavaScript dependencies
└── .env                  # Environment variables (create this)
```

Supported Providers and Models

  • Google: Default gemini-2.5-flash, also supports gemini-2.5-flash-lite, gemini-2.5-pro, etc.
  • OpenAI: Default gpt-5-nano, also supports gpt-5, gpt-4o, etc.
  • OpenRouter: Access to multiple models via a unified API
  • Azure OpenAI: Default gpt-4o, also supports o4-mini, etc.
  • Ollama: Support for locally running open-source models like llama3
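When no model is specified, each provider falls back to its default from the list above. A minimal sketch of that selection in Python, using the defaults as listed (the fallback logic itself is illustrative, not BetterCodeWiki's own):

```python
# Default model per provider, taken from the list above.
DEFAULTS = {
    "google": "gemini-2.5-flash",
    "openai": "gpt-5-nano",
    "azure": "gpt-4o",
    "ollama": "llama3",
}

def pick_model(provider, override=None):
    """Return an explicit model override, or the provider's default."""
    if override:
        return override
    try:
        return DEFAULTS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider!r}") from None

print(pick_model("google"))           # gemini-2.5-flash
print(pick_model("openai", "gpt-5"))  # gpt-5
```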

Environment Variables

| Variable | Description | Required |
|----------|-------------|----------|
| `GOOGLE_API_KEY` | Google Gemini API key | No (required for Gemini) |
| `OPENAI_API_KEY` | OpenAI API key | Conditional |
| `OPENROUTER_API_KEY` | OpenRouter API key | No |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI API key | No |
| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI endpoint | No |
| `AZURE_OPENAI_VERSION` | Azure OpenAI version | No |
| `OLLAMA_HOST` | Ollama host (default: `http://localhost:11434`) | No |
| `DEEPWIKI_EMBEDDER_TYPE` | `openai`, `google`, `ollama`, or `bedrock` | No |
| `OPENAI_BASE_URL` | Custom OpenAI API endpoint | No |
| `DEEPWIKI_CONFIG_DIR` | Custom config file location | No |
| `SERVER_BASE_URL` | API server URL (default: `http://localhost:8001`) | No |

Docker Setup

```bash
# Pull and run
docker pull ghcr.io/asyncfuncai/deepwiki-open:latest

docker run -p 8001:8001 -p 3000:3000 -p 8008:8008 \
  -e GOOGLE_API_KEY=your_key \
  -e OPENAI_API_KEY=your_key \
  -v ~/.adalflow:/root/.adalflow \
  ghcr.io/asyncfuncai/deepwiki-open:latest
```

Or use Docker Compose:

```bash
docker-compose up
```

Ports

| Port | Service |
|------|---------|
| 3000 | Next.js frontend |
| 8001 | FastAPI backend |
| 8008 | MCP server (streamable-http) |

MCP Server — AI Agent Integration

BetterCodeWiki includes a built-in MCP (Model Context Protocol) server that lets AI coding agents query your wiki documentation in real-time.

What It Does

Any MCP-compatible client (Claude Desktop, Claude Code, Cursor, Windsurf) can:

  • Discover which repos have generated wikis
  • Look up architecture overviews and specific wiki pages
  • Search documentation for relevant context
  • Ask questions about a codebase using wiki content

5 MCP Tools

| Tool | Description |
|------|-------------|
| `list_projects` | Discover all cached wiki repos |
| `get_wiki_overview` | Get project architecture in one call |
| `get_wiki_page` | Fetch a specific page by title (fuzzy match) or ID |
| `search_wiki` | Full-text search across all wiki pages |
| `ask_codebase` | Get relevant wiki context for a question |
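Under the hood, MCP tool invocations are JSON-RPC 2.0 `tools/call` requests. A minimal sketch of the payload a client would POST to the streamable-http endpoint (session handshake and HTTP headers are omitted; the tool names come from the table above):

```python
import itertools
import json

_ids = itertools.count(1)

def tools_call(tool, arguments):
    """Build a JSON-RPC 2.0 `tools/call` request body, as an MCP client would."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# e.g. fetch a wiki page by title, per the table above
print(tools_call("get_wiki_page", {"title": "Architecture"}))
```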

Connect via Docker (HTTP)

When running with Docker, the MCP server is available at http://localhost:8008/mcp.

Claude Code:

```bash
claude mcp add bettercodewiki --transport streamable-http http://localhost:8008/mcp
```

Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

```json
{
  "mcpServers": {
    "bettercodewiki": {
      "url": "http://localhost:8008/mcp"
    }
  }
}
```

Connect Locally (stdio — no Docker needed)

# Set up the venv once
python3 -m venv api/mcp/.venv
api/mcp/.venv/bin/pip install "mcp[cli]"

# Register with Claude Code
claude mcp add bettercodewiki -- api/mcp/.venv/bin/python api/mcp/server.py

Claude Desktop (stdio mode):

```json
{
  "mcpServers": {
    "bettercodewiki": {
      "command": "/absolute/path/to/BetterCodeWiki/api/mcp/.venv/bin/python",
      "args": ["/absolute/path/to/BetterCodeWiki/api/mcp/server.py"]
    }
  }
}
```

For full setup details, see MCP_SETUP.md.

Ask & DeepResearch

  • Ask: Chat with your repository using RAG for context-aware, streaming responses
  • DeepResearch: Multi-turn research with structured plans, updates, and comprehensive conclusions
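If you script against the Ask feature, streaming responses typically arrive as server-sent-event-style `data:` lines; that is an assumption about the wire format here, so check api/api.py for the exact shape. A minimal parser sketch:

```python
def sse_data(lines):
    """Yield the payload of each `data:` line from a server-sent-event stream."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            yield line[len("data:"):].strip()

# Hypothetical sample of streamed lines, for illustration only
sample = ["data: The repo uses", "", "data: FastAPI on port 8001", ""]
print(" ".join(sse_data(sample)))  # The repo uses FastAPI on port 8001
```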

Troubleshooting

  • "Missing environment variables": Check your .env file in the project root
  • "API key not valid": Verify the key with no extra spaces
  • "Cannot connect to API server": Ensure the API server is running on port 8001
  • "Error generating wiki": Try a smaller repository first

Credits

This project is based on DeepWiki-Open by AsyncFuncAI. The original project is licensed under the MIT License.

License

This project is licensed under the MIT License - see the LICENSE file for details.
