
Ollama support#18

Merged
zortax merged 4 commits into zortax:main from AlexJonker:ollama-support
Jan 13, 2026

Conversation

@AlexJonker
Contributor

Allow the use of Ollama models in AI mode.
Also made the Ask AI button show only if AI is actually configured.

The llm crate doesn't seem to support streaming even though the Ollama API does, so this uses the regular .chat() call.
Not a big problem, I think; you just have to wait for the full message to generate before you can see it.
I kept the old function commented out, so if the llm crate updates and does support streaming you can probably just swap the new function back for the old one.
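For context, Ollama's REST API supports a non-streaming mode: POSTing to /api/chat with "stream": false returns the complete message in a single response, which matches the wait-for-the-full-message behavior of the non-streaming .chat() call described above. A minimal sketch of the request body (the model name and host below are placeholders, not taken from this PR):

```python
import json

def build_chat_payload(prompt: str, model: str = "llama3") -> str:
    """Build the JSON body for Ollama's POST /api/chat endpoint.

    "stream": False asks the server to return the whole completion in one
    response instead of newline-delimited chunks.
    """
    return json.dumps({
        "model": model,  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    })

# The body would be POSTed to http://localhost:11434/api/chat (default host);
# the reply's message.content field then holds the full answer.
body = build_chat_payload("Why is the sky blue?")
```

If the crate later gains streaming, the same endpoint with "stream": true yields incremental chunks, so only the response handling would need to change.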

@AlexJonker
Contributor Author

@CodedNil Could you check whether this still works with your Gemini, OpenAI, and OpenRouter keys? I don't have premium for any of those, so I can't test.

@CodedNil
Contributor

Still works for me with OpenRouter. I don't have the other two keys at the moment, but just looking at the code there's no reason they shouldn't work too.

@zortax
Owner

zortax commented Jan 13, 2026

Gemini also still works fine

Owner

@zortax left a comment


lgtm

@zortax zortax merged commit b145a92 into zortax:main Jan 13, 2026
1 check passed
@AlexJonker AlexJonker deleted the ollama-support branch January 13, 2026 16:44
zortax pushed a commit that referenced this pull request Jan 19, 2026
* Ollama support

* Disable ai mode if not configured

3 participants