Commit b2279067e2dc1b3c6a17ce63bea52cabbd42b75d
All checks successful: Build and Push Docker Image / build (push) in 8m22s
- Add ai_provider setting: 'openai' (API key) or 'litellm' (ChatGPT subscription proxy)
- Auto-strip max_tokens/max_completion_tokens for chatgpt/-prefixed models (the ChatGPT subscription backend rejects token-limit fields)
- LiteLLM mode: use a dummy API key when none is configured; a base URL is required
- isOpenAIConfigured() checks the base URL instead of the API key for LiteLLM
- listAvailableModels() returns a manualEntry flag for LiteLLM (no models.list endpoint)
- Settings UI: conditional fields, an info banner, and a manual model input with chatgpt/ prefix examples when LiteLLM is selected
- All 7 AI services work transparently via buildCompletionParams()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
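The behavior the commit describes can be sketched as below. This is a hedged, minimal reconstruction, not the repository's actual code: the Settings shape, resolveApiKey, and the exact signatures of buildCompletionParams and isOpenAIConfigured are assumptions; only the function names and the stripping/dummy-key rules come from the commit message.

```typescript
// Hypothetical sketch of the provider-aware logic described in the commit
// message. Types and signatures are assumptions, not the actual codebase.

type AIProvider = "openai" | "litellm";

interface Settings {
  aiProvider: AIProvider;
  apiKey?: string;  // optional in LiteLLM mode
  baseUrl?: string; // required in LiteLLM mode
}

interface CompletionParams {
  model: string;
  messages: { role: string; content: string }[];
  max_tokens?: number;
  max_completion_tokens?: number;
}

// The ChatGPT subscription backend (reached via LiteLLM under a "chatgpt/"
// model prefix) rejects token-limit fields, so strip them for those models
// and pass everything else through unchanged.
function buildCompletionParams(params: CompletionParams): CompletionParams {
  if (params.model.startsWith("chatgpt/")) {
    const { max_tokens, max_completion_tokens, ...rest } = params;
    return rest;
  }
  return params;
}

// For LiteLLM, authentication happens at the proxy, so "configured" means
// a base URL is set; for plain OpenAI it means an API key is set.
function isOpenAIConfigured(s: Settings): boolean {
  return s.aiProvider === "litellm" ? Boolean(s.baseUrl) : Boolean(s.apiKey);
}

// LiteLLM mode falls back to a dummy key when none is configured
// (hypothetical helper; the commit only states the dummy-key behavior).
function resolveApiKey(s: Settings): string {
  if (s.aiProvider === "litellm") return s.apiKey ?? "dummy";
  return s.apiKey ?? "";
}
```

With a layout like this, the seven AI services never branch on provider themselves: they always call buildCompletionParams() and get parameters the selected backend will accept.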
Languages: TypeScript 99.4%, JavaScript 0.2%, CSS 0.2%, Shell 0.2%