Open
Labels: bug (Something isn't working)
Description
I updated my llamaswap:cpu container to try out the new Qwen3-VL support.
```
ghcr.io/mostlygeek/llama-swap   cpu   ee1d1321e80b   18 hours ago   130MB
ghcr.io/mostlygeek/llama-swap         c4db55692ff6   7 days ago     130MB
```
I was getting error messages from llama-swap saying it couldn't load the models, even though they had all worked previously.
I got into the llamaswap container with `docker exec -it harbor.llamaswap /bin/bash` and found I was unable to access anything under /root.
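For anyone trying to reproduce it, this is roughly what I ran inside the container (a sketch; the cache path under /root is an assumption based on where llama.cpp downloads models by default):

```sh
# Open a shell inside the running llama-swap container
docker exec -it harbor.llamaswap /bin/bash

# Inside the container:
ls -ld /root                 # the directory exists, but...
ls /root/.cache/llama.cpp    # ...listing the previously downloaded models fails with "Permission denied"
id                           # shows which UID/GID the shell is running as
```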
As a temporary measure, I have altered my compose.llamaswap.yaml and llamaswap/config.yaml to use /nroot/.cache/llama.cpp instead, and the models are working again.
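Roughly the kind of change I mean, as a sketch rather than my exact files. The volume source path and the model entry are placeholders from my setup, not something shipped with llama-swap:

```yaml
# compose.llamaswap.yaml (sketch): mount the model cache under /nroot instead of /root
services:
  llamaswap:
    volumes:
      - ./llamaswap/cache:/nroot/.cache/llama.cpp
```

And the matching model entry in llamaswap/config.yaml (the model file name is just an example):

```yaml
models:
  "qwen3-vl":
    cmd: >
      llama-server
      --port ${PORT}
      -m /nroot/.cache/llama.cpp/Qwen3-VL-8B-Instruct-Q4_K_M.gguf
```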