# Frontend Fix for MCP Implementation

## Issue

The frontend was trying to import `ManagingApi`, which doesn't exist in the generated API client:

```typescript
import { ManagingApi } from '../generated/ManagingApi' // ❌ Wrong
```

**Error**: `The requested module '/src/generated/ManagingApi.ts' does not provide an export named 'ManagingApi'`

## Solution

The generated API client uses individual client classes for each controller, not a single unified `ManagingApi` class.

### Correct Import Pattern

```typescript
import { LlmClient } from '../generated/ManagingApi' // ✅ Correct
```

### Correct Instantiation Pattern

Following the pattern used throughout the codebase:

```typescript
// ❌ Wrong - this pattern doesn't exist
const apiClient = new ManagingApi(apiUrl, userToken)

// ✅ Correct - individual client classes
const llmClient = new LlmClient({}, apiUrl)
const accountClient = new AccountClient({}, apiUrl)
const botClient = new BotClient({}, apiUrl)
// etc.
```
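
Since every generated client follows the same `(config, baseUrl)` constructor shape shown above, one option is to centralize instantiation in a small factory. This is only a sketch: the `createClients` helper, the chosen clients, and the relative import path are assumptions for illustration, not something that exists in the codebase.

```typescript
// Hypothetical helper (not part of the generated code): builds a few clients
// following the `new XxxClient({}, apiUrl)` pattern shown above, so the base
// URL is passed in exactly one place.
import { AccountClient, BotClient, LlmClient } from '../generated/ManagingApi'

export function createClients(apiUrl: string) {
  return {
    account: new AccountClient({}, apiUrl),
    bot: new BotClient({}, apiUrl),
    llm: new LlmClient({}, apiUrl),
  }
}

// Usage: const { llm } = createClients(apiUrl)
```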

## Files Fixed

### 1. aiChatService.ts

**Before**:

```typescript
import { ManagingApi } from '../generated/ManagingApi'

export class AiChatService {
  private apiClient: ManagingApi
  constructor(apiClient: ManagingApi) { ... }
}
```

**After**:

```typescript
import { LlmClient } from '../generated/ManagingApi'

export class AiChatService {
  private llmClient: LlmClient
  constructor(llmClient: LlmClient) { ... }
}
```
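
The constructor body is elided above. For completeness, here is a hypothetical equivalent using a TypeScript parameter property, which declares the field and assigns the argument in one step; the delegation comment is illustrative only, since the exact `llm_Chat()` signature comes from the generated client and is not reproduced in this document.

```typescript
import { LlmClient } from '../generated/ManagingApi'

// Hypothetical shorthand for the class above: `private llmClient` in the
// constructor signature both declares the field and stores the argument.
export class AiChatService {
  constructor(private llmClient: LlmClient) {}

  // Chat requests are then delegated to the generated client, e.g. through
  // this.llmClient.llm_Chat(...); its parameters are defined by the generated
  // code and are not shown here.
}
```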

### 2. AiChat.tsx

**Before**:

```typescript
import { ManagingApi } from '../../generated/ManagingApi'

const apiClient = new ManagingApi(apiUrl, userToken)
const service = new AiChatService(apiClient)
```

**After**:

```typescript
import { LlmClient } from '../../generated/ManagingApi'

const llmClient = new LlmClient({}, apiUrl)
const service = new AiChatService(llmClient)
```
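
Because `AiChat.tsx` is a React component, one optional refinement (a suggestion only, not part of the fix above) is to memoize the client and service so they are not re-created on every render. The hook name and import paths below are illustrative assumptions.

```typescript
import { useMemo } from 'react'
import { LlmClient } from '../../generated/ManagingApi'
import { AiChatService } from '../../services/aiChatService' // assumed path

// Hypothetical hook: builds the service once per apiUrl instead of on every
// render of the component that uses it.
export function useAiChatService(apiUrl: string): AiChatService {
  return useMemo(() => new AiChatService(new LlmClient({}, apiUrl)), [apiUrl])
}
```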

## Available Client Classes

The generated `ManagingApi.ts` exports these client classes:

- `AccountClient`
- `AdminClient`
- `BacktestClient`
- `BotClient`
- `DataClient`
- `JobClient`
- **`LlmClient`** ← Used for AI chat
- `MoneyManagementClient`
- `ScenarioClient`
- `SentryTestClient`
- `SettingsClient`
- `SqlMonitoringClient`
- `TradingClient`
- `UserClient`
- `WhitelistClient`

## Testing

After these fixes, the frontend should work correctly:

1. No more import errors
2. `LlmClient` is properly instantiated
3. All methods are available: `llm_Chat()`, `llm_GetProviders()`, `llm_GetTools()` (see the smoke-test sketch below)
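
A quick way to verify the wiring is to call the two read-only methods from the list above and log the results. This is a hypothetical smoke test, assuming those generated methods take no arguments and return promises; their signatures and response shapes are not reproduced in this document.

```typescript
import { LlmClient } from '../generated/ManagingApi'

// Hypothetical smoke test: confirms the client resolves against the API.
// Response shapes are not documented here, so the results are only logged.
async function smokeTestLlmClient(apiUrl: string): Promise<void> {
  const llmClient = new LlmClient({}, apiUrl)
  console.log('providers:', await llmClient.llm_GetProviders())
  console.log('tools:', await llmClient.llm_GetTools())
}

smokeTestLlmClient('https://localhost:5001').catch(console.error) // assumed local API URL
```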

The AI chat button should now appear and function correctly when you run the app.