Ollama integration

Feature Request · Andreas · 1 year ago · edited

I host my own LLMs at home using Ollama. I saw that Mixpost has an OpenAI integration, and Ollama's API is meant to be compatible with the OpenAI one, so it looks as if adding a field to override the API URL would be enough.

It would probably be cleaner to add a dedicated Ollama provider, but since it could reuse the same code base as the OpenAI one, this looks like low-hanging fruit.

Can we get an Ollama integration?
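For reference, here is a minimal sketch of what "just override the API URL" means in practice. Ollama exposes an OpenAI-compatible endpoint under `/v1` on its default port 11434; the host, model name, and prompt below are assumptions for a typical local install, not anything Mixpost currently supports:

```python
import json
from urllib import request

# Assumed default for a local Ollama install; Mixpost would make this
# a configurable field instead of a hardcoded constant.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat-completion request.

    Because the request shape is identical for OpenAI and any
    OpenAI-compatible backend (such as Ollama), only base_url and
    the model name need to change between providers.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# "llama3" is a hypothetical locally pulled model.
req = build_chat_request(OLLAMA_BASE_URL, "llama3", "Draft a short social post.")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending this request (e.g. via `urllib.request.urlopen(req)`) would hit the local Ollama server exactly the way the same payload would hit api.openai.com, which is why a single overridable URL field could be enough.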

madanyang · 9 months ago (edited)

Rather than targeting Ollama specifically, the goal should be to at least allow editing the host and endpoint variables, so that any OpenAI-compatible LLM can be integrated.


Voters (7)

Sascha Foerster
Oluwatobi
Byron Garcia
Felix
Andreas