
fix: whisper breaking on cuda-13 (use absolute path for CUDA directory detection)#8678

Merged
mudler merged 1 commit into mudler:master from localai-bot:fix-whisper-cuda-13
Feb 28, 2026
Conversation

@localai-bot (Contributor)

Description

The whisper backend (and other CUDA backends) was failing on CUDA-13 containers because the system capability detection used the relative path usr/local/cuda-13 instead of the absolute path /usr/local/cuda-13.

When LocalAI runs from a working directory other than /, the relative path check fails, so the system incorrectly detects the generic nvidia capability instead of nvidia-cuda-13. This leads to the wrong backend being selected: cuda12-whisper is loaded instead of cuda13-whisper, which then fails with:

panic: libcudart.so.12: cannot open shared object file: No such file or directory

Root Cause

In pkg/system/capabilities.go, the init() function checks for CUDA directories using:

os.Stat(filepath.Join("usr", "local", "cuda-13"))

This creates the relative path usr/local/cuda-13, which is resolved against the current working directory and therefore only matches when the process runs from /.

Fix

Changed to an absolute path by prefixing the join with os.PathSeparator:

os.Stat(filepath.Join(string(os.PathSeparator), "usr", "local", "cuda-13"))

This ensures the path is always /usr/local/cuda-13 regardless of the current working directory.

Testing

The fix has been verified to compile successfully. The change is minimal and follows Go best practices for cross-platform path handling.

Related Issues

The capability detection was using a relative path 'usr/local/cuda-13'
which doesn't work when LocalAI is run from a different working directory.
This caused whisper (and other backends) to fail on CUDA-13 containers
because the system incorrectly detected 'nvidia' capability instead of
'nvidia-cuda-13', leading to wrong backend selection (cuda12-whisper
instead of cuda13-whisper).

Fixes: mudler#8033
@netlify

netlify bot commented Feb 28, 2026

Deploy Preview for localai ready!

Name Link
🔨 Latest commit 3fc4de2
🔍 Latest deploy log https://app.netlify.com/projects/localai/deploys/69a2a1a6811f130008b1321f
😎 Deploy Preview https://deploy-preview-8678--localai.netlify.app

@mudler mudler merged commit 42e580b into mudler:master Feb 28, 2026
33 of 35 checks passed
@mudler mudler added the bug Something isn't working label Mar 14, 2026
localai-bot added a commit to localai-bot/LocalAI that referenced this pull request Mar 25, 2026
…y detection) (mudler#8678)

fix: use absolute path for CUDA directory detection


Co-authored-by: localai-bot <localai-bot@users.noreply.github.com>

Labels

bug Something isn't working

Projects

None yet

Development

Successfully merging this pull request may close these issues.

whisper breaks on cuda-13

2 participants