Has anybody gotten this working with llama.cpp? When I try, it just says it's not a valid provider.
Answered by MODSetter, Dec 23, 2025
Replies: 2 comments
We use litellm in the backend. This should do it:
BerriAI/litellm#9138
https://docs.litellm.ai/docs/providers/openai_compatible
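For reference, here is a minimal sketch of calling a local llama.cpp server through litellm's OpenAI-compatible provider, per the docs linked above. The model name, port, and API key are placeholders for whatever your setup uses:

```python
# Minimal sketch, assuming a local llama.cpp server started with something like:
#   llama-server -m model.gguf --port 8080
from litellm import completion

response = completion(
    model="openai/local-model",           # "openai/" prefix routes to litellm's OpenAI-compatible provider
    api_base="http://localhost:8080/v1",  # llama.cpp's OpenAI-compatible endpoint
    api_key="sk-no-key-required",         # llama.cpp ignores the key, but litellm expects one
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```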
Answer selected by schlick7
I tried that three times, and now it works. The only change I made was to start using the new model router setting in llama.cpp, so either that was needed or I was doing something wrong. Thanks!