
fix: support local providers (ollama, new-api, aihubmix) for extended thinking and API server #12796

Open
DeJeune wants to merge 5 commits into main from DeJeune/fix-extended-thinking

Conversation

Collaborator

@DeJeune DeJeune commented Feb 7, 2026

What this PR does

Before this PR:

  • Ollama was not recognized as an Anthropic-compatible provider for extended thinking
  • Local providers (ollama, lmstudio) without API keys would fail agent model validation
  • The API server only exposed OpenAI and Anthropic provider types, excluding ollama and new-api
  • aihubmix's Anthropic model checker was restricted to Claude-only models (m.id.includes('claude'))

After this PR:

  • Ollama is supported as an Anthropic-compatible provider with anthropicApiHost configuration
  • Local providers automatically use a placeholder API key if none is provided
  • The API server now exposes ollama and new-api provider types alongside OpenAI and Anthropic
  • aihubmix falls through to the default checker (() => true), allowing all its Anthropic-compatible models

Fixes: #13247

Why we need it and why it was done in this way

Ollama extended thinking: Local AI models through Ollama should be able to access extended thinking features like other Anthropic-compatible providers. The implementation adds ollama to the Anthropic-compatible provider list and configures it with an anthropicApiHost pointing to the local Ollama server.
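The wiring described above can be sketched roughly as follows. This is an illustrative model, not the actual codebase types: the `ProviderConfig` shape and `resolveAnthropicHost` helper are assumptions; only the `anthropicApiHost` field and the default Ollama address come from the PR.

```typescript
// Sketch: an ollama provider entry carrying an anthropicApiHost so
// Anthropic-style requests (extended thinking) reach the local server.
// Field names are illustrative, not the real codebase types.
interface ProviderConfig {
  id: string
  apiHost: string
  anthropicApiHost?: string
  apiKey?: string
}

const ollamaProvider: ProviderConfig = {
  id: 'ollama',
  apiHost: 'http://localhost:11434',
  // Anthropic-compatible endpoint defaults to the same local host
  anthropicApiHost: 'http://localhost:11434'
}

// Hypothetical helper: resolve the host for Anthropic-style calls,
// falling back to the plain apiHost when none is configured.
function resolveAnthropicHost(p: ProviderConfig): string {
  return p.anthropicApiHost ?? p.apiHost
}
```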

API server provider expansion: new-api already had Anthropic model support via getProviderAnthropicModelChecker but was missing from supportedTypes, so its models weren't exposed through the API server. Similarly, ollama needs to be in the supported providers list for validateModelId to resolve ollama models for agents.
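A minimal sketch of the `supportedTypes` filter the paragraph refers to, assuming a simple type-based filter inside `getAvailableProviders` (the real function and its provider model are more involved; the names of the helper types here are hypothetical):

```typescript
// Sketch of the API server's provider filter. Before the PR only
// 'openai' and 'anthropic' were listed; the fix adds 'ollama' and
// 'new-api' so their models are exposed through the API server.
type ProviderType = 'openai' | 'anthropic' | 'ollama' | 'new-api' | 'gemini'

interface ApiProvider { id: string; type: ProviderType }

const supportedTypes: ProviderType[] = ['openai', 'anthropic', 'ollama', 'new-api']

function getAvailableProviders(all: ApiProvider[]): ApiProvider[] {
  return all.filter((p) => supportedTypes.includes(p.type))
}
```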

aihubmix model checker: aihubmix now supports all LLM models as Anthropic-compatible (not just Claude models), so the previous Claude-only filter was overly restrictive.
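The fall-through behaviour can be sketched like this; the `'example-provider'` case is purely hypothetical stand-in for whatever explicit cases remain in the real `getProviderAnthropicModelChecker`:

```typescript
// Sketch of the checker dispatch. The key point: with the explicit
// 'aihubmix' case removed, aihubmix lookups fall through to the
// default checker, which accepts every model.
interface Model { id: string }

function getProviderAnthropicModelChecker(providerId: string): (m: Model) => boolean {
  switch (providerId) {
    // Previously: case 'aihubmix': return (m) => m.id.includes('claude')
    case 'example-provider': // hypothetical Claude-only provider
      return (m) => m.id.includes('claude')
    default:
      return () => true // aihubmix now lands here: all models allowed
  }
}
```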

API key bypass: Local providers (ollama, lmstudio) don't require real API keys, but downstream SDK calls reject empty keys. The validation method now sets a placeholder key for these providers, documented via JSDoc.
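A hedged sketch of that bypass, assuming the surrounding `Provider` shape (the real logic lives inside `validateAgentModels` in BaseService.ts; only the hoisted list and the deliberate mutation are taken from the PR):

```typescript
// Sketch of the validation-time placeholder for local providers.
interface Provider { id: string; apiKey?: string }

// Hoisted to module scope to avoid per-iteration allocation.
const localProvidersWithoutApiKey = ['ollama', 'lmstudio']

// Hypothetical helper name. Note the deliberate side effect: the
// provider object is mutated so downstream SDK calls, which reject
// empty keys, receive a non-empty placeholder.
function ensureApiKey(provider: Provider): void {
  if (!provider.apiKey && localProvidersWithoutApiKey.includes(provider.id)) {
    provider.apiKey = provider.id // placeholder, e.g. 'ollama'
  }
}
```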

Breaking changes

None — this is a backwards-compatible enhancement.

Special notes for your reviewer

  • The migration (v196) safely initializes anthropicApiHost for existing ollama configs, with a fallback to http://localhost:11434 if apiHost is empty
  • The validateAgentModels JSDoc now documents the side-effect of mutating provider.apiKey for local providers
  • localProvidersWithoutApiKey is hoisted outside the loop to avoid per-iteration allocation
  • All tests pass (lint, type-check, unit tests)
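The migration behaviour described in the first bullet can be sketched like this; the migration scaffolding (version bump, store plumbing) is omitted, and `migrateOllamaConfig` is a hypothetical name:

```typescript
// Sketch: initialize anthropicApiHost from apiHost, falling back to
// the default Ollama address when apiHost is empty or undefined.
interface OllamaConfig { apiHost?: string; anthropicApiHost?: string }

const DEFAULT_OLLAMA_HOST = 'http://localhost:11434'

function migrateOllamaConfig(cfg: OllamaConfig): OllamaConfig {
  if (!cfg.anthropicApiHost) {
    cfg.anthropicApiHost = cfg.apiHost || DEFAULT_OLLAMA_HOST
  }
  return cfg
}
```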

Checklist

  • Code follows established patterns in the codebase
  • Tests added and passing (API key validation bypass, provider compatibility)
  • Build check passed (lint, test, typecheck)
  • Documentation: No user-facing feature (internal provider configuration)

Release note

NONE

@DeJeune DeJeune requested a review from 0xfullex as a code owner February 7, 2026 20:23
@DeJeune DeJeune force-pushed the DeJeune/fix-extended-thinking branch from 20e3c31 to b723780 on February 7, 2026 20:31
Collaborator

@EurFelux EurFelux left a comment


Review Summary

Overall this PR follows existing patterns well and the test coverage is good. The core idea — allowing ollama as an Anthropic-compatible provider for extended thinking — makes sense and is a useful feature.

However, there are a few concerns worth addressing before merge:

Key Issues

  1. Scope creep in getAvailableProviders: Adding ollama and new-api to the API server's supportedTypes is broader than what's needed for extended thinking support. This exposes all models from these providers through the API server, not just Anthropic-compatible ones.

  2. Unintended behavioral change for aihubmix: Removing the explicit aihubmix case from getProviderAnthropicModelChecker silently changes its behavior from Claude-only filtering to allowing all models. This is an unrelated change with regression risk and should be separated out.

  3. Side-effect mutation in validation: The validateAgentModels method now mutates validation.provider.apiKey as a side effect, which is unexpected for a validation method. If the provider object is shared/referenced elsewhere, this could cause subtle bugs.

Minor Items

  1. Loop allocation: localProvidersWithoutApiKey array is re-created on each loop iteration unnecessarily.

  2. Migration edge case: The v196 migration doesn't handle the case where apiHost is empty/undefined for an ollama provider — consider a fallback to the default http://localhost:11434.

What looks good

  • Test coverage for the API key bypass logic is thorough (4 test cases covering key scenarios)
  • Migration v196 follows the established pattern
  • Adding anthropicApiHost to the ollama system config is clean
  • PR description is well-written with clear explanation of changes and trade-offs

@DeJeune DeJeune changed the title from "fix: support ollama as anthropic-compatible provider" to "fix: support local providers (ollama, new-api, aihubmix) for extended thinking and API server" on Feb 8, 2026
@DeJeune DeJeune requested a review from EurFelux February 8, 2026 14:30
@DeJeune DeJeune force-pushed the DeJeune/fix-extended-thinking branch from e669eda to 9786114 on February 10, 2026 05:09
Collaborator

@EurFelux EurFelux left a comment


Review Summary

This PR adds ollama as an Anthropic-compatible provider for extended thinking, fixes API key validation for local providers, and expands the API server to support ollama/new-api. Overall this is a well-structured change with good test coverage.

What looks good

  • Test coverage: 4 test cases covering the key API key bypass scenarios — well done
  • Migration v197: Clean, follows established patterns, with proper fallback for empty apiHost
  • JSDoc documentation: The side-effect mutation in validateAgentModels is now properly documented
  • Type safety: satisfies SystemProviderId[] on localProvidersWithoutApiKey ensures the list stays in sync with the type system
  • Earlier review feedback: All previous comments have been addressed (hoisted array, fallback in migration, JSDoc added)
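The `satisfies` pattern praised above looks roughly like this; `SystemProviderId` here is a stand-in for the real union type in the codebase:

```typescript
// Illustration of `satisfies`: the array is checked against the union
// type without widening its element type, so a typo like 'olama'
// would fail to compile, keeping the list in sync with the type system.
type SystemProviderId = 'ollama' | 'lmstudio' | 'openai' | 'anthropic'

const localProvidersWithoutApiKey = ['ollama', 'lmstudio'] satisfies SystemProviderId[]
```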

Minor suggestions (non-blocking)

  1. Dual source of truth: localProvidersWithoutApiKey in BaseService.ts and NOT_SUPPORT_API_KEY_PROVIDERS in provider.ts serve related purposes — consider a shared constant in @shared in the future to keep them in sync
  2. Placeholder key value: Using the provider ID as a fake API key (apiKey = 'ollama') works but a more explicit value like 'no-key-required' would be clearer if it surfaces in logs
  3. Test gap: Consider adding a valid: false test case for validateAgentModels to cover the early-throw path
  4. Comment on supportedTypes: A brief comment explaining why ollama/new-api are included would help future maintainers
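The shared-constant suggestion in item 1 could look something like the following sketch; the module location, constant name, and type guard are all hypothetical:

```typescript
// Hypothetical shared module (e.g. somewhere under @shared) that both
// BaseService.ts and provider.ts could import, replacing the two
// parallel lists with a single source of truth.
export const LOCAL_PROVIDERS_WITHOUT_API_KEY = ['ollama', 'lmstudio'] as const

export type LocalProviderWithoutApiKey = (typeof LOCAL_PROVIDERS_WITHOUT_API_KEY)[number]

// Type guard so call sites get narrowing for free.
export function isLocalProviderWithoutApiKey(id: string): id is LocalProviderWithoutApiKey {
  return (LOCAL_PROVIDERS_WITHOUT_API_KEY as readonly string[]).includes(id)
}
```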

aihubmix change

The removal of the explicit aihubmix case from getProviderAnthropicModelChecker is a behavioral change (from Claude-only to all models). The author's explanation that aihubmix now supports all models as Anthropic-compatible makes sense. Since it falls through to the default: () => true case, this is correct if the upstream provider has indeed expanded support.

LGTM — approving with minor suggestions above.

DeJeune and others added 4 commits February 13, 2026 00:27
…nking

- Add ollama to ANTHROPIC_COMPATIBLE_PROVIDER_IDS for UI configuration
- Update API provider filter to include ollama and new-api alongside anthropic
- Add anthropicApiHost to ollama provider config (defaults to apiHost)
- Skip API key validation for local providers (ollama, lmstudio) with placeholder
- Add migration v195 to set anthropicApiHost for existing ollama configs
- Add comprehensive test coverage for API key bypass logic
- Improve code comments clarifying provider support differences

This enables local ollama instances to support Claude's extended thinking feature when properly configured.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
- Hoist localProvidersWithoutApiKey outside the for loop to avoid
  per-iteration re-allocation, switch from .some() to .includes()
- Add JSDoc to validateAgentModels documenting the side-effect mutation
  of provider.apiKey for local providers
- Add fallback to 'http://localhost:11434' in migration 196 for ollama
  providers with empty/undefined apiHost

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@DeJeune DeJeune force-pushed the DeJeune/fix-extended-thinking branch from 9786114 to d213de3 on February 12, 2026 16:27
@DeJeune DeJeune requested a review from GeorgeDong32 March 8, 2026 01:25
- BaseService.ts: keep both resolveAccessiblePaths from main and
  enhanced JSDoc for validateAgentModels from this branch
- store/index.ts: bump persist version to 200
- migrate.ts: keep main's migrations (198: minimax, 199: shortcuts),
  renumber ollama anthropicApiHost migration to 200

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

[Bug]: The code tool cannot recognize the New API
