Add support for custom AI provider (OpenAI-compatible)
complete
Jakub Pomykała
We could add an option to use third-party (or even locally hosted) LLMs by allowing users to configure a custom auto-translation provider. This provider would behave like OpenAI but with a customizable base URL. That way, users could integrate Gemini, Mistral, or local LLMs without needing a separate configuration for each provider or relying on OpenRouter.ai.
Jakub Pomykała
marked this post as
complete
We can now use custom AI models like Mistral, Claude, and Gemini, as well as any other OpenAI-compatible models, including local ones via tools like LM Studio + ngrok.
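This works because OpenAI-compatible providers all accept the same chat-completions request shape; only the base URL changes. As a rough illustration (function and field names here are hypothetical, not the app's actual implementation), building such a request might look like:

```python
import json

def build_chat_request(base_url: str, model: str, text: str, target_lang: str):
    """Build an OpenAI-compatible chat-completions request for translation.

    The same payload shape works for Mistral, Gemini's OpenAI-compatible
    endpoint, or a local LM Studio server -- only `base_url` differs.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": f"Translate the user's text to {target_lang}."},
            {"role": "user", "content": text},
        ],
    }
    return url, json.dumps(payload).encode("utf-8")

# Swapping the base URL targets a different provider with no other changes,
# e.g. a local LM Studio server:
url, body = build_chat_request("http://localhost:1234/v1", "mistral-small", "Hello", "French")
```

Exposing the base URL as the only provider-specific setting is what makes one configuration cover all of these backends.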
Jakub Pomykała
Merged in a post:
Add support for Mistral AI
Jakub Pomykała
The goal of this feature request is to add Mistral AI as an AI & auto-translation provider.
Jakub Pomykała
marked this post as
in progress