A transparent proxy for capturing and visualizing in-flight Claude Code requests and conversations, with optional agent routing to different LLM providers.
Claude Code Proxy serves three main purposes:
- Claude Code Proxy: Intercepts and monitors requests from Claude Code (claude.ai/code) to the Anthropic API, allowing you to see what Claude Code is doing in real-time
- Conversation Viewer: Displays and analyzes your Claude API conversations with a beautiful web interface
- Agent Routing (Optional): Routes specific Claude Code agents to different LLM providers (e.g., route code-reviewer agent to GPT-4o)
- Transparent Proxy: Routes Claude Code requests through the monitor without disruption
- Agent Routing (Optional): Map specific Claude Code agents to different LLM models
- Request Monitoring: SQLite-based logging of all API interactions
- Live Dashboard: Real-time visualization of requests and responses
- Conversation Analysis: View full conversation threads with tool usage
- Easy Setup: One-command startup for both services
- Option 1: Go 1.20+ and Node.js 18+ (for local development)
- Option 2: Docker (for containerized deployment)
- Claude Code
1. Clone the repository:

   ```bash
   git clone https://github.com/seifghazi/claude-code-proxy.git
   cd claude-code-proxy
   ```

2. Configure the proxy:

   ```bash
   cp config.yaml.example config.yaml
   ```

3. Install and run (first time):

   ```bash
   make install   # Install all dependencies
   make dev       # Start both services
   ```

4. Subsequent runs (after initial setup):

   ```bash
   make dev   # or ./run.sh
   ```
1. Clone the repository:

   ```bash
   git clone https://github.com/seifghazi/claude-code-proxy.git
   cd claude-code-proxy
   ```

2. Configure the proxy:

   ```bash
   cp config.yaml.example config.yaml   # Edit config.yaml as needed
   ```

3. Build and run with Docker:

   ```bash
   # Build the image
   docker build -t claude-code-proxy .

   # Run with default settings
   docker run -p 3001:3001 -p 5173:5173 claude-code-proxy
   ```

4. Run with persistent data and custom configuration:

   ```bash
   # Create a data directory for the persistent SQLite database
   mkdir -p ./data

   # Option 1: Run with a config file (recommended)
   docker run -p 3001:3001 -p 5173:5173 \
     -v ./data:/app/data \
     -v ./config.yaml:/app/config.yaml:ro \
     claude-code-proxy

   # Option 2: Run with environment variables
   docker run -p 3001:3001 -p 5173:5173 \
     -v ./data:/app/data \
     -e ANTHROPIC_FORWARD_URL=https://api.anthropic.com \
     -e PORT=3001 \
     -e WEB_PORT=5173 \
     claude-code-proxy
   ```

5. Docker Compose (alternative):

   ```yaml
   # docker-compose.yml
   version: '3.8'
   services:
     claude-code-proxy:
       build: .
       ports:
         - "3001:3001"
         - "5173:5173"
       volumes:
         - ./data:/app/data
         - ./config.yaml:/app/config.yaml:ro  # Mount config file
       environment:
         - ANTHROPIC_FORWARD_URL=https://api.anthropic.com
         - PORT=3001
         - WEB_PORT=5173
         - DB_PATH=/app/data/requests.db
   ```

   Then run:

   ```bash
   docker-compose up
   ```
To use this proxy with Claude Code, set:

```bash
export ANTHROPIC_BASE_URL=http://localhost:3001
```

Then launch Claude Code with the `claude` command. Claude Code's requests will now be routed through the proxy for monitoring.
- Web Dashboard: http://localhost:5173
- API Proxy: http://localhost:3001
- Health Check: http://localhost:3001/health
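Once the stack is up, the health endpoint above gives a quick sanity check. A minimal sketch wrapping it in a shell function (the endpoint and port come from the list above; the function name and messages are illustrative):

```shell
#!/bin/sh
# Reachability check against the proxy's health endpoint (default port 3001).
check_proxy() {
  if curl -fsS --max-time 2 "http://localhost:3001/health" >/dev/null 2>&1; then
    echo "proxy: up"
  else
    echo "proxy: not reachable (is 'make dev' running?)"
  fi
}

check_proxy
```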
If you need to run the services independently:

```bash
# Run proxy only
make run-proxy

# Run web interface only (in another terminal)
make run-web
```

Other useful make targets:

```bash
make install    # Install all dependencies
make build      # Build both services
make dev        # Run in development mode
make clean      # Clean build artifacts
make db-reset   # Reset database
make help       # Show all commands
```

Create a `config.yaml` file (or copy from `config.yaml.example`):
```yaml
server:
  port: 3001

providers:
  anthropic:
    base_url: "https://api.anthropic.com"
  openai:  # if enabling subagent routing
    api_key: "your-openai-key"  # Or set OPENAI_API_KEY env var

storage:
  db_path: "requests.db"
```

The proxy supports routing specific Claude Code agents to different LLM providers. This optional feature is disabled by default.
1. Enable the feature in `config.yaml`:

   ```yaml
   subagents:
     enable: true  # Set to true to enable subagent routing
     mappings:
       code-reviewer: "gpt-4o"
       data-analyst: "o3"
       doc-writer: "gpt-3.5-turbo"
   ```

2. Set up your Claude Code agents following Anthropic's official documentation.

How it works: when Claude Code uses a subagent that matches one of your mappings, the proxy automatically routes the request to the specified model instead of Claude.
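The routing decision described above amounts to looking the subagent name up in the mappings table and falling back to Claude when there is no match. A hedged shell sketch (the real proxy implements this in Go; the function name and "claude" fallback label are illustrative):

```shell
#!/bin/sh
# Illustrative routing lookup: subagent name -> target model.
# Mirrors the example mappings from the config above.
route_model() {
  case "$1" in
    code-reviewer) echo "gpt-4o" ;;
    data-analyst)  echo "o3" ;;
    doc-writer)    echo "gpt-3.5-turbo" ;;
    *)             echo "claude" ;;  # no mapping: forward to Anthropic as usual
  esac
}

route_model code-reviewer
route_model some-other-agent
```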
Example 1: Code Review Agent → GPT-4o

```yaml
# config.yaml
subagents:
  enable: true
  mappings:
    code-reviewer: "gpt-4o"
```

Use case: Route code review tasks to GPT-4o for faster responses while keeping complex coding tasks on Claude.

Example 2: Reasoning Agent → O3

```yaml
# config.yaml
subagents:
  enable: true
  mappings:
    deep-reasoning: "o3"
```

Use case: Send complex reasoning tasks to O3 while using Claude for general coding.

Example 3: Multiple Agents

```yaml
# config.yaml
subagents:
  enable: true
  mappings:
    streaming-systems-engineer: "o3"
    frontend-developer: "gpt-4o-mini"
    security-auditor: "gpt-4o"
```

Use case: Different specialists for different tasks, optimizing for speed, cost, and quality.
Override config via environment variables:

- `PORT` - Server port
- `OPENAI_API_KEY` - OpenAI API key
- `DB_PATH` - Database path
- `SUBAGENT_MAPPINGS` - Comma-separated mappings (e.g., `"code-reviewer:gpt-4o,data-analyst:o3"`)
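The `SUBAGENT_MAPPINGS` format above is just comma-separated `agent:model` pairs. This snippet shows how such a string splits apart with standard shell tools (a sketch for illustration, not the proxy's actual parsing code):

```shell
#!/bin/sh
# Split a SUBAGENT_MAPPINGS-style string into agent/model pairs.
SUBAGENT_MAPPINGS="code-reviewer:gpt-4o,data-analyst:o3"

echo "$SUBAGENT_MAPPINGS" | tr ',' '\n' | while IFS=':' read -r agent model; do
  echo "agent=$agent model=$model"
done
```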
All environment variables can be configured when running the Docker container:

| Variable | Default | Description |
|---|---|---|
| `PORT` | `3001` | Proxy server port |
| `WEB_PORT` | `5173` | Web dashboard port |
| `READ_TIMEOUT` | `600` | Server read timeout (seconds) |
| `WRITE_TIMEOUT` | `600` | Server write timeout (seconds) |
| `IDLE_TIMEOUT` | `600` | Server idle timeout (seconds) |
| `ANTHROPIC_FORWARD_URL` | `https://api.anthropic.com` | Target Anthropic API URL |
| `ANTHROPIC_VERSION` | `2023-06-01` | Anthropic API version |
| `ANTHROPIC_MAX_RETRIES` | `3` | Maximum retry attempts |
| `DB_PATH` | `/app/data/requests.db` | SQLite database path |
Example with custom configuration:

```bash
docker run -p 3001:3001 -p 5173:5173 \
  -v ./data:/app/data \
  -e PORT=8080 \
  -e WEB_PORT=3000 \
  -e ANTHROPIC_FORWARD_URL=https://api.anthropic.com \
  -e DB_PATH=/app/data/custom.db \
  claude-code-proxy
```

Project structure:

```
claude-code-proxy/
├── proxy/            # Go proxy server
│   ├── cmd/          # Application entry points
│   ├── internal/     # Internal packages
│   └── go.mod        # Go dependencies
├── web/              # React Remix frontend
│   ├── app/          # Remix application
│   └── package.json  # Node dependencies
├── run.sh            # Start script
├── .env.example      # Environment template
└── README.md         # This file
```
- All API requests logged to SQLite database
- Searchable request history
- Request/response body inspection
- Conversation threading
- Real-time request streaming
- Interactive request explorer
- Conversation visualization
- Performance metrics
MIT License - see LICENSE for details.
