Question on running with LM Studio and Ollama #835
juergen69 started this conversation in General Discussion
Replies: 0 comments
Hi,

I wanted to test ArchonOS with a local LM Studio instance and Ollama models. Both show as "online" in Archon, but Archon never finds any chat models.

In the LM Studio logs I can see that Archon always calls /api/tags and gets back the error "unexpected endpoint or method". Since the models are listed at
http://127.0.0.1:1234/v1/models
I wonder what I need to configure.
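For reference, here is a minimal sketch of how the two model-listing routes can be compared against the same server (assuming the default LM Studio port 1234 from my setup; /api/tags is Ollama's native listing endpoint, while LM Studio serves the OpenAI-compatible /v1/models):

```python
# Quick probe of both model-listing routes against the local server.
# Assumes LM Studio on its default port 1234 (as in my setup):
# /api/tags is the Ollama-style route Archon seems to call,
# /v1/models is the OpenAI-compatible route LM Studio serves.
import json
import urllib.error
import urllib.request

BASE = "http://127.0.0.1:1234"

for path in ("/api/tags", "/v1/models"):
    try:
        with urllib.request.urlopen(BASE + path, timeout=5) as resp:
            body = json.load(resp)
            print(path, "->", resp.status, json.dumps(body)[:120])
    except urllib.error.HTTPError as err:
        # This is what I expect for the Ollama-style route on LM Studio
        print(path, "-> HTTP", err.code, err.reason)
    except urllib.error.URLError as err:
        print(path, "-> connection failed:", err.reason)
```

Against LM Studio, the first probe fails the way the logs show and the second returns the model list, which makes me suspect Archon is speaking Ollama's protocol to an OpenAI-compatible server.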
Any hints?

Thanks,
Juergen