Implement LLM provider configuration and update user settings
- Added functionality to update the default LLM provider for users via a new endpoint in UserController.
- Introduced LlmProvider enum to manage available LLM options: Auto, Gemini, OpenAI, and Claude.
- Updated User and UserEntity models to include DefaultLlmProvider property.
- Enhanced database context and migrations to support the new LLM provider configuration.
- Integrated LLM services into the application bootstrap for dependency injection.
- Updated TypeScript API client to include methods for managing LLM providers and chat requests.
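
As a rough illustration of the client-side surface these changes describe, the sketch below mirrors the LlmProvider options and a default-provider update call in TypeScript. The endpoint path, request shape, and helper names are assumptions for illustration; the real definitions live in UserController and the generated ManagingApi.ts.

```typescript
// Illustrative sketch only -- names and routes are assumptions, not the generated client.
export enum LlmProvider {
  Auto = 'Auto',
  Gemini = 'Gemini',
  OpenAI = 'OpenAI',
  Claude = 'Claude',
}

// Hypothetical call to the new UserController endpoint that stores DefaultLlmProvider.
export async function updateDefaultLlmProvider(provider: LlmProvider): Promise<void> {
  const response = await fetch('/api/User/DefaultLlmProvider', {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ defaultLlmProvider: provider }),
  });
  if (!response.ok) {
    throw new Error(`Failed to update default LLM provider: ${response.status}`);
  }
}
```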
New file: assets/documentation/MCP-Quick-Start.md (198 lines)
# MCP Quick Start Guide

## Prerequisites

- .NET 8 SDK
- Node.js 18+
- At least one LLM API key (Gemini, OpenAI, or Claude)

## Setup Steps

### 1. Configure LLM API Keys

Add your API key to `appsettings.Development.json` or user secrets:
```json
{
  "Llm": {
    "Claude": {
      "ApiKey": "YOUR_CLAUDE_API_KEY_HERE"
    }
  }
}
```

Or use .NET user secrets (recommended):

```bash
cd src/Managing.Api
dotnet user-secrets set "Llm:Claude:ApiKey" "YOUR_API_KEY"
```

Or use environment variables:

```bash
export Llm__Claude__ApiKey="YOUR_API_KEY"
dotnet run --project src/Managing.Api
```
### 2. Build the Backend

```bash
cd src
dotnet build Managing.sln
```

### 3. Run the Backend

```bash
cd src/Managing.Api
dotnet run
```

The API will be available at `https://localhost:7001` (or the configured port).
### 4. Generate API Client (if needed)

If the LLM endpoints aren't in the generated client yet:

```bash
# Make sure the API is running
cd src/Managing.Nswag
dotnet build
```

This will regenerate `ManagingApi.ts` with the new LLM endpoints.
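
Once the client is regenerated, the new endpoints can be called from the React app. The class and method names below (`LlmClient`, `ChatRequest`, `chat`) are placeholders for illustration; check the actual names NSwag emits into `ManagingApi.ts`.

```typescript
// Placeholder names -- substitute whatever NSwag actually generates in ManagingApi.ts.
import { LlmClient, ChatRequest } from './ManagingApi';

const client = new LlmClient('https://localhost:7001'); // assumes a base-URL constructor

export async function askAssistant(message: string): Promise<string> {
  // Request/response shapes are assumptions, not the generated contract.
  const request = new ChatRequest({ message });
  const result = await client.chat(request);
  return result.reply ?? '';
}
```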
### 5. Run the Frontend

```bash
cd src/Managing.WebApp
npm install  # if first time
npm run dev
```

The app will be available at `http://localhost:5173` (or the configured port).
### 6. Test the AI Chat

1. Log in to the application
2. Look for the floating chat button in the bottom-right corner
3. Click it to open the AI chat
4. Try these example queries (or hit the endpoint directly, as sketched after this list):
   - "Show me my backtests"
   - "Find my best performing strategies"
   - "What are my BTC backtests?"
   - "Show backtests with a score above 80"
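
To smoke-test the chat route without the UI, you can post one of these queries straight to `/api/Llm/Chat` (the route shown in the architecture overview below). The request body shape and bearer-token auth are assumptions; adjust them to your actual contract.

```typescript
// Minimal smoke test of the chat endpoint; body shape and auth are assumptions.
export async function smokeTestChat(token: string): Promise<void> {
  const response = await fetch('https://localhost:7001/api/Llm/Chat', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`, // assumes bearer-token auth
    },
    body: JSON.stringify({ message: 'Show me my backtests' }),
  });
  console.log(response.status, await response.json());
}
```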
## Getting LLM API Keys

### Anthropic Claude (Recommended - Best for MCP)

1. Go to [Anthropic Console](https://console.anthropic.com/)
2. Sign in or create an account
3. Navigate to API Keys and create a new key
4. Copy and add to configuration
5. Note: Requires payment setup

### Google Gemini (Free Tier Available)

1. Go to [Google AI Studio](https://makersuite.google.com/app/apikey)
2. Click "Get API Key"
3. Create a new API key
4. Copy and add to configuration

### OpenAI

1. Go to [OpenAI Platform](https://platform.openai.com/api-keys)
2. Create a new API key
3. Copy and add to configuration
4. Note: Requires payment setup
## Architecture Overview

```
User Browser
    ↓
AI Chat Component (React)
    ↓
LlmController (/api/Llm/Chat)
    ↓
LlmService (Auto-selects provider)
    ↓
Gemini/OpenAI/Claude Provider
    ↓
MCP Service (executes tools)
    ↓
BacktestTools (queries data)
```
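
The `Auto` option is resolved to a concrete provider before any request is sent. The real logic lives in the C# `LlmService`; the TypeScript sketch below is only a conceptual illustration, and the preference order is an assumption.

```typescript
// Conceptual illustration of "Auto" provider selection; the real logic lives in LlmService (C#).
type ConcreteProvider = 'Gemini' | 'OpenAI' | 'Claude';

export function resolveProvider(
  requested: ConcreteProvider | 'Auto',
  configuredKeys: Partial<Record<ConcreteProvider, string>>,
): ConcreteProvider {
  if (requested !== 'Auto') {
    return requested;
  }
  // Assumed preference order: Claude first (recommended above), then Gemini, then OpenAI.
  const order: ConcreteProvider[] = ['Claude', 'Gemini', 'OpenAI'];
  const available = order.find((provider) => Boolean(configuredKeys[provider]));
  if (!available) {
    throw new Error('No providers available - configure at least one API key.');
  }
  return available;
}
```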
## Troubleshooting

### No providers available

- Check that at least one API key is configured
- Verify the API key is valid
- Check application logs for provider initialization

### Tool calls not working

- Verify `IBacktester` service is registered
- Check user has backtests in the database
- Review logs for tool execution errors

### Frontend errors

- Ensure API is running
- Check browser console for errors
- Verify `ManagingApi.ts` includes LLM endpoints

### Build errors

- Run `dotnet restore` in src/
- Ensure all NuGet packages are restored
- Check for version conflicts in project files
## Example Queries

### Simple Queries

```
"Show me my backtests"
"What's my best strategy?"
"List all my BTC backtests"
```

### Filtered Queries

```
"Find backtests with a score above 85"
"Show me backtests from the last 30 days"
"List backtests with low drawdown (under 10%)"
```

### Complex Queries

```
"What are my best performing ETH strategies with a winrate above 70%?"
"Find backtests using RSI indicator sorted by Sharpe ratio"
"Show me my top 5 backtests by growth percentage"
```
## Next Steps

- Add more MCP tools for additional functionality
- Customize the chat UI to match your brand
- Implement chat history persistence
- Add streaming support for better UX
- Create custom tools for your specific use cases

## Support

For issues or questions:

1. Check the logs in the `Managing.Api` console
2. Review the browser console for frontend errors
3. Verify API keys are correctly configured
4. Ensure all services are running

## Additional Resources

- [MCP Architecture Documentation](./MCP-Architecture.md)
- [Implementation Summary](./MCP-Implementation-Summary.md)
- [Model Context Protocol Spec](https://modelcontextprotocol.io)