We could add support for third-party (or even locally hosted) LLMs by letting users configure a custom auto-translation provider. This provider would work like the existing OpenAI provider, but with a configurable base URL. That way, users could integrate Gemini, Mistral, or local LLMs through their OpenAI-compatible endpoints without creating a separate provider configuration for each one or relying on OpenRouter.ai.
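As a rough sketch of the idea: the same chat-completions request shape can be pointed at any OpenAI-compatible endpoint just by swapping the base URL. The helper name, the local port, and the model name below are all hypothetical, not part of any existing implementation.

```python
import json
from urllib.parse import urljoin


def build_chat_request(base_url: str, api_key: str, model: str, text: str):
    """Build an OpenAI-compatible chat-completions request for a custom base URL."""
    # Keep a trailing slash so urljoin preserves path prefixes like /v1.
    url = urljoin(base_url.rstrip("/") + "/", "chat/completions")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": text}],
    }).encode()
    return url, headers, body


# The identical request works against OpenAI, Mistral, or a local server;
# only the configured base URL changes (local endpoint shown is hypothetical).
url, headers, body = build_chat_request(
    "http://localhost:11434/v1", "sk-local", "llama3",
    "Translate 'hello' to French.",
)
```

The single `base_url` setting is the only provider-specific piece; everything else (auth header, payload shape) stays identical to the current OpenAI integration.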