fix: support local providers (ollama, new-api, aihubmix) for extended thinking and API server #12796
Force-pushed from 20e3c31 to b723780
EurFelux left a comment
Review Summary
Overall this PR follows existing patterns well and the test coverage is good. The core idea — allowing ollama as an Anthropic-compatible provider for extended thinking — makes sense and is a useful feature.
However, there are a few concerns worth addressing before merge:
Key Issues
- Scope creep in `getAvailableProviders`: Adding `ollama` and `new-api` to the API server's `supportedTypes` is broader than what's needed for extended thinking support. This exposes all models from these providers through the API server, not just the Anthropic-compatible ones.
- Unintended behavioral change for aihubmix: Removing the explicit `aihubmix` case from `getProviderAnthropicModelChecker` silently changes its behavior from filtering to Claude-only models to allowing all models. This is an unrelated regression that should be separated into its own change.
- Side-effect mutation in validation: The `validateAgentModels` method now mutates `validation.provider.apiKey` as a side effect, which is unexpected for a validation method. If the provider object is shared or referenced elsewhere, this could cause subtle bugs.
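To illustrate the mutation concern, here is a minimal sketch (all names and shapes are illustrative, not the PR's actual code) showing how a validation function that writes to its input leaks that change to every holder of the same reference:

```typescript
// Hypothetical sketch: a validation routine with a side effect on its input.
type Provider = { id: string; apiKey: string };

const LOCAL_PROVIDERS_WITHOUT_API_KEY = ['ollama', 'lmstudio'];

function validateAgentModels(provider: Provider): boolean {
  if (LOCAL_PROVIDERS_WITHOUT_API_KEY.includes(provider.id) && !provider.apiKey) {
    provider.apiKey = provider.id; // side effect: the caller's object changes
  }
  return provider.apiKey.length > 0;
}

const shared: Provider = { id: 'ollama', apiKey: '' };
validateAgentModels(shared);
// shared.apiKey is now 'ollama' everywhere this object is referenced,
// even in code that only expected a read-only validity check.
```

Returning a copy with the placeholder filled in, instead of mutating the argument, would avoid the aliasing hazard entirely.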
Minor Items
- Loop allocation: The `localProvidersWithoutApiKey` array is re-created on each loop iteration unnecessarily.
- Migration edge case: The v196 migration doesn't handle the case where `apiHost` is empty/undefined for an ollama provider; consider a fallback to the default `http://localhost:11434`.
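The suggested fallback can be sketched as follows (the type and function names are assumptions for illustration, not the repository's actual migration code):

```typescript
// Illustrative sketch of the migration edge case: fall back to the default
// Ollama host when apiHost is empty or undefined.
const DEFAULT_OLLAMA_HOST = 'http://localhost:11434';

type OllamaProvider = { id: 'ollama'; apiHost?: string; anthropicApiHost?: string };

function migrateOllamaProvider(p: OllamaProvider): OllamaProvider {
  // '' and undefined are both falsy, so || covers both edge cases
  return { ...p, anthropicApiHost: p.apiHost || DEFAULT_OLLAMA_HOST };
}
```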
What looks good
- Test coverage for the API key bypass logic is thorough (4 test cases covering key scenarios)
- Migration v196 follows the established pattern
- Adding `anthropicApiHost` to the ollama system config is clean
- The PR description is well written, with a clear explanation of changes and trade-offs
Force-pushed from e669eda to 9786114
EurFelux left a comment
Review Summary
This PR adds ollama as an Anthropic-compatible provider for extended thinking, fixes API key validation for local providers, and expands the API server to support ollama/new-api. Overall this is a well-structured change with good test coverage.
What looks good
- Test coverage: 4 test cases covering the key API key bypass scenarios; well done
- Migration v197: Clean, follows established patterns, with a proper fallback for empty `apiHost`
- JSDoc documentation: The side-effect mutation in `validateAgentModels` is now properly documented
- Type safety: `satisfies SystemProviderId[]` on `localProvidersWithoutApiKey` ensures the list stays in sync with the type system
- Earlier review feedback: All previous comments have been addressed (hoisted array, fallback in migration, JSDoc added)
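The `satisfies` pattern praised above can be sketched like this (the union below is a stand-in; the real `SystemProviderId` type lives in the repository):

```typescript
// Sketch of the `satisfies` type-safety pattern (TypeScript 4.9+).
type SystemProviderId = 'anthropic' | 'ollama' | 'lmstudio' | 'new-api' | 'aihubmix';

// A typo such as 'olama' here would be a compile-time error, so the list
// cannot silently drift out of sync with the provider ID union.
const localProvidersWithoutApiKey = ['ollama', 'lmstudio'] satisfies SystemProviderId[];
```

Unlike an explicit annotation (`const x: SystemProviderId[] = …`), `satisfies` checks the values against the union while preserving the narrower inferred element type.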
Minor suggestions (non-blocking)
- Dual source of truth: `localProvidersWithoutApiKey` in `BaseService.ts` and `NOT_SUPPORT_API_KEY_PROVIDERS` in `provider.ts` serve related purposes; consider a shared constant in `@shared` in the future to keep them in sync
- Placeholder key value: Using the provider ID as a fake API key (`apiKey = 'ollama'`) works, but a more explicit value like `'no-key-required'` would be clearer if it surfaces in logs
- Test gap: Consider adding a `valid: false` test case for `validateAgentModels` to cover the early-throw path
- Comment on `supportedTypes`: A brief comment explaining why ollama/new-api are included would help future maintainers
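The first two suggestions could be combined into one shared helper; the sketch below is a hypothetical shape (constant and function names are invented here, not taken from the codebase):

```typescript
// Hypothetical shared constant plus an explicit, log-friendly placeholder key.
const NOT_SUPPORT_API_KEY_PROVIDERS = ['ollama', 'lmstudio'] as const;
const PLACEHOLDER_API_KEY = 'no-key-required';

function resolveApiKey(providerId: string, apiKey: string): string {
  // Local providers reject nothing, but downstream SDKs reject empty keys,
  // so substitute an explicit placeholder instead of the provider ID.
  if (!apiKey && (NOT_SUPPORT_API_KEY_PROVIDERS as readonly string[]).includes(providerId)) {
    return PLACEHOLDER_API_KEY;
  }
  return apiKey;
}
```

A single constant exported from `@shared` would let both `BaseService.ts` and `provider.ts` import the same list.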
aihubmix change
The removal of the explicit aihubmix case from `getProviderAnthropicModelChecker` is a behavioral change (from Claude-only to all models). The author's explanation that aihubmix now supports all models as Anthropic-compatible makes sense. Since it falls through to the `default: () => true` case, this is correct if the upstream provider has indeed expanded support.
LGTM — approving with minor suggestions above.
…nking

- Add ollama to ANTHROPIC_COMPATIBLE_PROVIDER_IDS for UI configuration
- Update API provider filter to include ollama and new-api alongside anthropic
- Add anthropicApiHost to ollama provider config (defaults to apiHost)
- Skip API key validation for local providers (ollama, lmstudio) with placeholder
- Add migration v195 to set anthropicApiHost for existing ollama configs
- Add comprehensive test coverage for API key bypass logic
- Improve code comments clarifying provider support differences

This enables local ollama instances to support Claude's extended thinking feature when properly configured.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
- Hoist localProvidersWithoutApiKey outside the for loop to avoid per-iteration re-allocation; switch from .some() to .includes()
- Add JSDoc to validateAgentModels documenting the side-effect mutation of provider.apiKey for local providers
- Add fallback to 'http://localhost:11434' in migration 196 for ollama providers with empty/undefined apiHost

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Force-pushed from 9786114 to d213de3
- BaseService.ts: keep both resolveAccessiblePaths from main and the enhanced JSDoc for validateAgentModels from this branch
- store/index.ts: bump persist version to 200
- migrate.ts: keep main's migrations (198: minimax, 199: shortcuts); renumber the ollama anthropicApiHost migration to 200

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
What this PR does
Before this PR:
- aihubmix filtered Anthropic-compatible models to Claude-only via `m.id.includes('claude')`
- ollama had no `anthropicApiHost` configuration, so extended thinking was unavailable for local models

After this PR:
- ollama carries an `anthropicApiHost` configuration
- aihubmix falls through to the default checker (`() => true`), allowing all its Anthropic-compatible models

Fixes: #13247
Why we need it and why it was done in this way
Ollama extended thinking: Local AI models served through Ollama should be able to access extended thinking features like other Anthropic-compatible providers. The implementation adds ollama to the Anthropic-compatible provider list and configures it with an `anthropicApiHost` pointing to the local Ollama server.

API server provider expansion: `new-api` already had Anthropic model support via `getProviderAnthropicModelChecker` but was missing from `supportedTypes`, so its models weren't exposed through the API server. Similarly, ollama needs to be in the supported providers list for `validateModelId` to resolve ollama models for agents.

aihubmix model checker: aihubmix now supports all LLM models as Anthropic-compatible (not just Claude models), so the previous Claude-only filter was overly restrictive.
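The fall-through behavior described for aihubmix can be sketched as below; the switch body is illustrative, only the function name and the `() => true` default come from the discussion above:

```typescript
// Hedged sketch: removing the explicit aihubmix case makes it hit the
// permissive default checker, so all of its models are treated as
// Anthropic-compatible.
type Model = { id: string };
type ModelChecker = (model: Model) => boolean;

function getProviderAnthropicModelChecker(providerId: string): ModelChecker {
  switch (providerId) {
    // previously: case 'aihubmix': return (m) => m.id.includes('claude');
    default:
      return () => true; // aihubmix now falls through here: all models allowed
  }
}
```

Under the old Claude-only filter, a non-Claude aihubmix model would have been excluded; with the fall-through, it passes.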
API key bypass: Local providers (ollama, lmstudio) don't require real API keys, but downstream SDK calls reject empty keys. The validation method now sets a placeholder key for these providers, documented via JSDoc.
Breaking changes
None — this is a backwards-compatible enhancement.
Special notes for your reviewer
- The migration sets `anthropicApiHost` for existing ollama configs, with a fallback to `http://localhost:11434` if `apiHost` is empty
- The `validateAgentModels` JSDoc now documents the side effect of mutating `provider.apiKey` for local providers
- `localProvidersWithoutApiKey` is hoisted outside the loop to avoid per-iteration allocation

Checklist
Release note