- Added functionality to update the default LLM provider for users via a new endpoint in UserController.
- Introduced LlmProvider enum to manage available LLM options: Auto, Gemini, OpenAI, and Claude.
- Updated User and UserEntity models to include DefaultLlmProvider property.
- Enhanced database context and migrations to support the new LLM provider configuration.
- Integrated LLM services into the application bootstrap for dependency injection.
- Updated TypeScript API client to include methods for managing LLM providers and chat requests.
Frontend Fix for MCP Implementation
Issue
The frontend was trying to import ManagingApi, which doesn't exist in the generated API client:
import { ManagingApi } from '../generated/ManagingApi' // ❌ Wrong
Error: The requested module '/src/generated/ManagingApi.ts' does not provide an export named 'ManagingApi'
Solution
The generated API client uses individual client classes for each controller, not a single unified ManagingApi class.
Correct Import Pattern
import { LlmClient } from '../generated/ManagingApi' // ✅ Correct
Correct Instantiation Pattern
Following the pattern used throughout the codebase:
// ❌ Wrong - this pattern doesn't exist
const apiClient = new ManagingApi(apiUrl, userToken)
// ✅ Correct - individual client classes
const llmClient = new LlmClient({}, apiUrl)
const accountClient = new AccountClient({}, apiUrl)
const botClient = new BotClient({}, apiUrl)
// etc.
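The per-controller pattern can be sketched as below. The stub classes stand in for the real generated clients in src/generated/ManagingApi.ts; the constructor shape `(configuration, baseUrl)` is taken from the codebase examples above, and the `apiUrl` value is a hypothetical placeholder.

```typescript
// Minimal stand-ins for the generated per-controller clients.
// The real classes live in src/generated/ManagingApi.ts.
class ClientBase {
  constructor(protected configuration: object, protected baseUrl: string) {}
}
class LlmClient extends ClientBase {}
class AccountClient extends ClientBase {}
class BotClient extends ClientBase {}

const apiUrl = 'https://localhost:5001' // hypothetical base URL

// One client instance per controller -- there is no unified ManagingApi class.
const llmClient = new LlmClient({}, apiUrl)
const accountClient = new AccountClient({}, apiUrl)
const botClient = new BotClient({}, apiUrl)
```

Each controller on the backend maps to exactly one client class, so a component only constructs the clients it actually calls.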
Files Fixed
1. aiChatService.ts
Before:
import { ManagingApi } from '../generated/ManagingApi'
export class AiChatService {
private apiClient: ManagingApi
constructor(apiClient: ManagingApi) { ... }
}
After:
import { LlmClient } from '../generated/ManagingApi'
export class AiChatService {
private llmClient: LlmClient
constructor(llmClient: LlmClient) { ... }
}
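A fuller sketch of the refactored service is below. The LlmClient stub only loosely mirrors the generated class: the `llm_Chat` signature and return type here are assumptions for illustration, and `sendMessage` is a hypothetical wrapper method, not taken from the real AiChatService.

```typescript
// Stub mirroring the generated LlmClient; the real llm_Chat signature may differ.
class LlmClient {
  constructor(private configuration: object, private baseUrl: string) {}
  async llm_Chat(message: string): Promise<string> {
    // Stand-in for the real HTTP call to the Llm controller.
    return `echo: ${message}`
  }
}

// The service now depends on the concrete LlmClient rather than a
// non-existent unified ManagingApi class.
class AiChatService {
  private llmClient: LlmClient

  constructor(llmClient: LlmClient) {
    this.llmClient = llmClient
  }

  async sendMessage(message: string): Promise<string> {
    return this.llmClient.llm_Chat(message)
  }
}

const service = new AiChatService(new LlmClient({}, 'https://localhost:5001'))
```

Narrowing the constructor parameter from the (non-existent) unified client to LlmClient also makes the service's single dependency explicit.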
2. AiChat.tsx
Before:
import { ManagingApi } from '../../generated/ManagingApi'
const apiClient = new ManagingApi(apiUrl, userToken)
const service = new AiChatService(apiClient)
After:
import { LlmClient } from '../../generated/ManagingApi'
const llmClient = new LlmClient({}, apiUrl)
const service = new AiChatService(llmClient)
Available Client Classes
The generated ManagingApi.ts exports these client classes:
- AccountClient
- AdminClient
- BacktestClient
- BotClient
- DataClient
- JobClient
- LlmClient ← Used for AI chat
- MoneyManagementClient
- ScenarioClient
- SentryTestClient
- SettingsClient
- SqlMonitoringClient
- TradingClient
- UserClient
- WhitelistClient
Testing
After these fixes, the frontend should work correctly:
- No more import errors
- LlmClient properly instantiated
- All methods available: llm_Chat(), llm_GetProviders(), llm_GetTools()
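A quick smoke test of the three methods can look like the sketch below. The stub's return types and data are assumptions: the provider names come from the LlmProvider enum mentioned in the commit summary (Auto, Gemini, OpenAI, Claude), while the tool list and chat reply shapes are placeholders for whatever the real backend returns.

```typescript
// Hypothetical stub exercising the three LlmClient methods; the real client
// issues HTTP requests to the Llm controller instead of returning fixed data.
class LlmClient {
  constructor(private configuration: object, private baseUrl: string) {}
  async llm_GetProviders(): Promise<string[]> {
    // Values from the backend LlmProvider enum.
    return ['Auto', 'Gemini', 'OpenAI', 'Claude']
  }
  async llm_GetTools(): Promise<string[]> {
    return [] // real tool list comes from the backend
  }
  async llm_Chat(message: string): Promise<string> {
    return `reply to: ${message}`
  }
}

const client = new LlmClient({}, 'https://localhost:5001')
```

Running the equivalent calls against the real client (e.g. `client.llm_GetProviders()`) is a fast way to confirm the import fix before testing the chat UI itself.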
The AI chat button should now appear and function correctly when you run the app.