Summary
Claude Code CLI (claude --model qwen2.5-coder:14b -p) hangs indefinitely during startup when configured to use a local Ollama instance via the Anthropic Messages API protocol.
Environment
- Claude Code Version: 2.1.39
- OS: Linux (Debian/Ubuntu)
- Node.js: v24.13.1
- Ollama Version: v0.15.6
- Model: qwen2.5-coder:14b (14.8B, Q4_K_M)
Configuration
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434
claude --model qwen2.5-coder:14b "Hello" -p
Expected Behavior
- Claude Code should initialize the session and respond to the prompt within 5-10 seconds
- Output should be printed and the process should exit (due to the -p flag)
Actual Behavior
- Process hangs indefinitely after initialization
- Debug logs show it's stuck loading MCP servers from https://api.anthropic.com/v1/mcp_servers?limit=1000
- No error messages or timeout errors
- Process must be manually killed (SIGKILL)
- Works fine with the default Anthropic API key (ANTHROPIC_API_KEY)
Debug Output
2026-02-13T00:32:19.854Z [DEBUG] [STARTUP] Loading MCP configs...
2026-02-13T00:32:19.857Z [DEBUG] Found 0 plugins (0 enabled, 0 disabled)
2026-02-13T00:32:19.857Z [DEBUG] [claudeai-mcp] Checking gate (cached)...
2026-02-13T00:32:19.857Z [DEBUG] [claudeai-mcp] Gate returned: true
2026-02-13T00:32:19.857Z [DEBUG] [claudeai-mcp] Fetching from https://api.anthropic.com/v1/mcp_servers?limit=1000
(HANGS HERE - never completes)
Workaround
Using sessions_spawn with a local model or invoking Ollama directly works perfectly:
ollama run qwen2.5-coder:14b "Hello" # Works fine
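Since Claude Code talks to Ollama over HTTP rather than through the CLI, it can also help to confirm the HTTP server itself answers. Below is a minimal probe of Ollama's native /api/generate endpoint; it only confirms the server is up and the model loads, and does not exercise the Anthropic-compatible route Claude Code would use.

```ts
// check-ollama.ts — sanity check that the local Ollama HTTP server responds.
// Probes the native /api/generate endpoint; model name taken from this report.
async function checkOllama(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5-coder:14b",
      prompt: "Hello",
      stream: false, // return a single JSON object instead of a stream
    }),
    // Generous cap: the first request may need to load the model into memory.
    signal: AbortSignal.timeout(120_000),
  });
  const body = await res.json();
  console.log(`HTTP ${res.status}: ${body.response ?? JSON.stringify(body)}`);
}

checkOllama().catch((err) => {
  console.error(`Ollama did not respond: ${err}`);
  process.exit(1);
});
```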
Root Cause Hypothesis
Claude Code CLI appears to attempt to fetch MCP server configurations from the Anthropic cloud endpoint (api.anthropic.com) even when configured to use a local Ollama backend. Because the request to the cloud endpoint never resolves and no timeout is applied, startup hangs indefinitely.
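To test this hypothesis independently of the CLI, the same endpoint can be requested with an explicit timeout. If the probe below times out under the same network conditions, the hang is consistent with an unbounded request inside Claude Code. The auth header used here is an assumption, not necessarily what the CLI actually sends.

```ts
// probe-mcp-endpoint.ts — check whether the MCP discovery request itself stalls.
const MCP_URL = "https://api.anthropic.com/v1/mcp_servers?limit=1000";

async function probe(): Promise<void> {
  try {
    const res = await fetch(MCP_URL, {
      // Assumed header; the headers Claude Code really sends are not known here.
      headers: { "x-api-key": process.env.ANTHROPIC_AUTH_TOKEN ?? "ollama" },
      // The hypothesis is that the CLI performs an equivalent fetch with no
      // such bound, so a stalled connection hangs startup forever.
      signal: AbortSignal.timeout(10_000), // 10 s cap
    });
    console.log(`responded: HTTP ${res.status}`);
  } catch (err) {
    console.error(`failed or timed out within 10 s: ${err}`);
  }
}

probe();
```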
Steps to Reproduce
- Install Ollama (https://ollama.ai)
- Pull a model: ollama pull qwen2.5-coder:14b
- Set environment variables:
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434
- Run: claude --model qwen2.5-coder:14b "Hello" -p
- Observe: the process hangs for >60 seconds and must be killed
Suggested Fixes
- Add startup timeout: Implement a configurable timeout (e.g., 10s) for MCP server fetching
- Skip cloud MCP calls for local backends: Detect when ANTHROPIC_BASE_URL points at localhost and skip cloud MCP discovery
- Fallback mechanism: If the MCP fetch fails or times out, log a warning and continue with base tools only (see the sketch after this list)
- Option flag: Add a --skip-mcp or --mcp-disabled flag for users who don't need MCP servers
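A minimal sketch of how fixes 1-3 could fit together, assuming Node's built-in fetch and AbortSignal.timeout; the function names and response shape below are hypothetical and are not Claude Code's actual internals:

```ts
// mcp-discovery.ts — illustrative sketch of the suggested behavior.
const MCP_DISCOVERY_URL = "https://api.anthropic.com/v1/mcp_servers?limit=1000";

// Fix 2: detect a local backend from ANTHROPIC_BASE_URL.
function isLocalBackend(baseUrl: string | undefined): boolean {
  if (!baseUrl) return false;
  try {
    const host = new URL(baseUrl).hostname;
    return host === "localhost" || host === "127.0.0.1";
  } catch {
    return false; // malformed URL: treat as non-local
  }
}

async function loadMcpServers(timeoutMs = 10_000): Promise<unknown[]> {
  if (isLocalBackend(process.env.ANTHROPIC_BASE_URL)) {
    console.warn("[mcp] local backend detected, skipping cloud MCP discovery");
    return [];
  }
  try {
    // Fix 1: bound the discovery request with a configurable timeout.
    const res = await fetch(MCP_DISCOVERY_URL, {
      signal: AbortSignal.timeout(timeoutMs),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const body = await res.json();
    return body.data ?? []; // response shape assumed for illustration
  } catch (err) {
    // Fix 3: on failure or timeout, warn and continue with base tools only.
    console.warn(`[mcp] discovery failed (${err}); continuing without MCP servers`);
    return [];
  }
}
```

Fix 4 (a --skip-mcp / --mcp-disabled flag) would simply gate the same discovery call behind a CLI option.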