Ollama integration
Feature Request · Andreas · 1 year ago · edited
I host my own LLMs at home using Ollama. I saw that Mixpost has an OpenAI integration, and Ollama's API is designed to be compatible with OpenAI's, so it looks like it would be enough to add a field where we can override the API URL and be good.
I guess it would be cleaner to add a dedicated Ollama provider, but it could reuse the same code base as the OpenAI one, so it would probably be low-hanging fruit.
Can we get an Ollama integration?
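To illustrate why this should be cheap: Ollama exposes an OpenAI-compatible API under `/v1` on its local port (11434 by default), so the only thing that changes between providers is the base URL. Below is a minimal stdlib Python sketch (Mixpost itself is PHP, so this is purely illustrative); the function name `build_chat_request` and the model name are my own assumptions, not anything from Mixpost or Ollama.

```python
import json
import urllib.request

# Assumed endpoints: the official OpenAI API vs. Ollama's local
# OpenAI-compatible server (default port 11434).
OPENAI_BASE_URL = "https://api.openai.com/v1"
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, messages: list,
                       api_key: str = "ollama") -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request against any base URL.

    Hypothetical helper: shows that swapping providers only means
    swapping base_url; the request shape stays identical.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    headers = {
        "Content-Type": "application/json",
        # Ollama ignores the key, but OpenAI-style clients expect one.
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# Same code, different base URL -- the only change an integration would need.
req = build_chat_request(OLLAMA_BASE_URL, "llama3",
                         [{"role": "user", "content": "hi"}])
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

In other words, a single configurable "API base URL" field in the existing OpenAI integration would cover Ollama and any other OpenAI-compatible backend.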
Rather than Ollama specifically, the goal should be to at least allow editing the host and endpoint variables so that any OpenAI-compatible LLM can be integrated.