Implement LLM provider configuration and update user settings
- Added functionality to update the default LLM provider for users via a new endpoint in UserController.
- Introduced LlmProvider enum to manage available LLM options: Auto, Gemini, OpenAI, and Claude.
- Updated User and UserEntity models to include DefaultLlmProvider property.
- Enhanced database context and migrations to support the new LLM provider configuration.
- Integrated LLM services into the application bootstrap for dependency injection.
- Updated TypeScript API client to include methods for managing LLM providers and chat requests.
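The enum and client method described above could look like the following sketch. The enum member names come from the commit message; the endpoint path, function name, and request shape are assumptions for illustration, not the repository's actual API.

```typescript
// LlmProvider mirrors the four options named in the commit message.
export enum LlmProvider {
  Auto = "Auto",
  Gemini = "Gemini",
  OpenAI = "OpenAI",
  Claude = "Claude",
}

// Hypothetical client helper that sends the user's chosen default provider
// to a UserController endpoint; the URL and payload field are assumptions.
export async function updateDefaultLlmProvider(
  baseUrl: string,
  provider: LlmProvider,
): Promise<Response> {
  return fetch(`${baseUrl}/api/users/me/default-llm-provider`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ defaultLlmProvider: provider }),
  });
}
```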
@@ -9,8 +9,6 @@
       }
     }
   },
-
-
   "InfluxDb": {
     "Organization": "managing-org"
   },
@@ -28,6 +26,17 @@
   "Flagsmith": {
     "ApiUrl": "https://flag.kaigen.ai/api/v1/"
   },
+  "Llm": {
+    "Gemini": {
+      "DefaultModel": "gemini-2.0-flash"
+    },
+    "OpenAI": {
+      "DefaultModel": "gpt-4o"
+    },
+    "Claude": {
+      "DefaultModel": "claude-haiku-4-5-20251001"
+    }
+  },
   "N8n": {
   },
   "Sentry": {
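On the client side, the "Llm" configuration section shown in the diff could be modeled with a typed interface. The section and field names mirror the diff; the interface, the resolution helper, and the choice of treating "Auto" as Gemini are assumptions made purely for illustration.

```typescript
// Shape of one provider entry in the "Llm" config section.
interface LlmProviderConfig {
  DefaultModel: string;
}

// Shape of the whole "Llm" section, with values copied from the diff.
interface LlmConfig {
  Gemini: LlmProviderConfig;
  OpenAI: LlmProviderConfig;
  Claude: LlmProviderConfig;
}

const llmConfig: LlmConfig = {
  Gemini: { DefaultModel: "gemini-2.0-flash" },
  OpenAI: { DefaultModel: "gpt-4o" },
  Claude: { DefaultModel: "claude-haiku-4-5-20251001" },
};

// Hypothetical helper mapping a user's DefaultLlmProvider choice to the
// configured model; mapping "Auto" to Gemini is an arbitrary illustration,
// not the commit's actual fallback logic.
function resolveDefaultModel(
  provider: "Auto" | "Gemini" | "OpenAI" | "Claude",
): string {
  const key = provider === "Auto" ? "Gemini" : provider;
  return llmConfig[key].DefaultModel;
}
```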