ARDA Chat Agent Service

LLM-powered chat agent service that orchestrates MCP (Model Context Protocol) tools to provide intelligent assistance across the ARDA platform.

Overview

The Chat Agent service acts as a middleware layer between frontend chat widgets and MCP proxy servers. It uses Anthropic's Claude models with tool calling to intelligently orchestrate tool calls and provide context-aware responses.

Key Features

  • Multi-Server MCP Support: Automatically discovers and uses tools from multiple MCP servers
  • Context-Aware: Adapts responses based on application context (platform, ari-ui)
  • Intelligent Tool Routing: Claude reasons over available tools and calls the appropriate ones
  • Comprehensive Logging: Detailed logs for debugging and monitoring
  • Graceful Fallbacks: Works even when MCP servers are unavailable

Architecture Diagram

┌─────────────────────────────────────────────────────────────────┐
│                         Frontend Apps                            │
│  ┌──────────┐      ┌──────────┐      ┌──────────┐              │
│  │ Platform │      │Credit App│      │   IDR    │              │
│  └────┬─────┘      └────┬─────┘      └────┬─────┘              │
│       └─────────────────┴──────────────────┘                     │
│                    ChatWidget (Shared)                           │
└─────────────────────────┬───────────────────────────────────────┘
                          │ HTTP POST /api/v1/chat
                          ▼
        ┌─────────────────────────────────────────┐
        │      ARDA Chat Agent (Claude 3.5)       │
        │  • Context-aware prompts                │
        │  • Tool orchestration                   │
        │  • Multi-server routing                 │
        └─────────────────┬───────────────────────┘
                          │
                          ▼
                ┌──────────────────┐
                │   MCP Proxy      │
                │   (FastAPI)      │
                └────────┬─────────┘
                         │
           ┌─────────────┴─────────────┐
           ▼                           ▼
    ┌──────────────┐          ┌──────────────┐
    │ MCP Server 1 │          │ MCP Server 2 │
    │ (arda-global)│          │  (other)     │
    └──────────────┘          └──────────────┘

Quick Start

Prerequisites

  • Node.js >= 18.0.0
  • Anthropic API key
  • MCP Proxy server running (optional - will work without it)

1. Install Dependencies

cd apps/chat-agent
npm install

2. Configure Environment

cp .env.example .env

Edit .env:

# Required: Your Anthropic API key
ANTHROPIC_API_KEY=sk-ant-your-actual-key-here

# Required: MCP Proxy URL
MCP_PROXY_URL=https://proxy.dev.arda.xyz

# Required: MCP API Keys (one per client type)
# Each client type can access different MCP servers based on their API key
MCP_API_KEY_PLATFORM=platform_key_abc123def456
MCP_API_KEY_ARI_UI=ari_ui_key_xyz789ghi012

# Optional: CORS allowed origins (comma-separated)
# Default: Empty (blocks all origins) - you MUST set this explicitly
# For development: ALLOWED_ORIGINS=*
# For production: ALLOWED_ORIGINS=https://app.arda.xyz,https://dashboard.arda.xyz
ALLOWED_ORIGINS=*

3. Start the Service

# Development mode with auto-reload
npm run dev

# Or build and run production
npm run build
npm start

The service will start on http://localhost:3002.

4. Verify It's Running

# Basic health check
curl http://localhost:3002/health

# Check MCP connectivity
curl http://localhost:3002/health/mcp

# List available MCP servers
curl http://localhost:3002/health/mcp/servers

# View interactive API documentation
open http://localhost:3002/api-docs

5. Explore the API

Visit http://localhost:3002/api-docs in your browser to see the full interactive Swagger UI documentation with:

  • All available endpoints
  • Request/response schemas
  • Interactive "Try it out" functionality
  • Example requests for each endpoint
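The Swagger UI is the authoritative reference for the request schema. Purely as an illustration, a request to the chat endpoint shown in the architecture diagram might carry the user message plus an application context (the `message` and `context` field names here are assumptions, not confirmed schema):

```json
{
  "message": "What's my portfolio performance?",
  "context": "platform"
}
```

Check `/api-docs` for the actual field names and any additional required fields before integrating.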

Architecture

Project Structure

├── src/
│   ├── index.ts              # Main Express server
│   ├── routes/
│   │   ├── chat.ts          # Chat endpoint
│   │   └── health.ts        # Health checks
│   ├── services/
│   │   ├── llmService.ts    # Claude orchestration
│   │   └── mcpClient.ts     # MCP proxy client
│   ├── middleware/
│   │   └── validation.ts    # Request validation
│   ├── types/
│   │   └── index.ts         # TypeScript types
│   └── utils/
│       └── config.ts        # Environment config
├── Dockerfile
├── docker-compose.yml
├── package.json
└── tsconfig.json

Data Flow

  1. User sends message in frontend chat widget with context data
  2. Chat Agent validates request and extracts context
  3. LLM Service builds context-aware system prompt
  4. LLM Service discovers tools from all available MCP servers
  5. Claude receives prompt + all available tools (tagged by server)
  6. Claude decides which tools to call (if any)
  7. Chat Agent executes tools via MCP proxy (routed to correct server)
  8. Tool results sent back to Claude
  9. Claude synthesizes final response
  10. Response sent to frontend
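Steps 6–8 of this loop can be sketched as follows. The types and function below are illustrative only, not the actual llmService.ts implementation:

```typescript
// Illustrative sketch of steps 6-8: executing the tool calls Claude requested
// and packaging the results for the follow-up request to Claude.
type ToolUse = { id: string; name: string; input: Record<string, unknown> };
type ToolResult = { type: "tool_result"; tool_use_id: string; content: string };
type ToolExecutor = (name: string, input: Record<string, unknown>) => Promise<string>;

async function runToolCalls(calls: ToolUse[], execute: ToolExecutor): Promise<ToolResult[]> {
  const results: ToolResult[] = [];
  for (const call of calls) {
    let content: string;
    try {
      // In the real service this call goes through the MCP proxy client
      content = await execute(call.name, call.input);
    } catch (err) {
      // Surface tool failures to Claude instead of aborting the turn
      content = `Tool error: ${(err as Error).message}`;
    }
    // Results are matched back to the request via the tool_use id
    results.push({ type: "tool_result", tool_use_id: call.id, content });
  }
  return results;
}
```

Returning tool errors as results (rather than throwing) lets Claude acknowledge the failure and still synthesize a useful final response.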

Context Management

Two application contexts are supported:

Platform (context: "platform")

  • Focus: Main ARDA platform application
  • Questions: "What's my portfolio performance?", "Show me available deals"

ARI UI (context: "ari-ui")

  • Focus: ARDA Research Insights UI application
  • Questions: "Search for code examples", "Analyze technical documentation"
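Context selection could be as simple as a switch over the context value. This is a hypothetical sketch; the prompt text is invented for illustration and is not the service's actual system prompt:

```typescript
// Hypothetical sketch of context-aware system prompt selection.
type AppContext = "platform" | "ari-ui";

function systemPromptFor(context: AppContext): string {
  switch (context) {
    case "platform":
      // Main ARDA platform: portfolio and deal questions
      return "You assist users of the main ARDA platform with portfolio and deal questions.";
    case "ari-ui":
      // ARDA Research Insights UI: code search and documentation analysis
      return "You assist users of the ARDA Research Insights UI with code search and documentation analysis.";
  }
}
```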

Multi-Server MCP Support

The Chat Agent automatically discovers and uses tools from multiple MCP servers.

How It Works

1. Server Discovery

On startup and when fetching tools:

// Automatically discovers all MCP servers
const servers = await mcpClient.listServers();
// Returns: [{ name: "arda-global", status: "connected", enabled: true, ... }]

// Fetches tools from all enabled servers
const allTools = await mcpClient.listAllTools();
// Returns tools tagged with their server name

2. Tool Presentation to Claude

Each tool's description is enhanced with server context:

Original: "Perform semantic search across code embeddings"
Enhanced: "[MCP Server: arda-global] Perform semantic search across code embeddings"

Claude sees which server each tool belongs to and can reason about which to use.
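The enhancement is a simple prefix on the description. A minimal sketch (the function name and tool type are hypothetical):

```typescript
// Sketch of the server-tagging step; tool descriptions are prefixed with
// their server name before being handed to Claude.
type McpTool = { name: string; description: string; server: string };

function tagWithServer(tool: McpTool): string {
  // Prefix the server name so Claude can reason about tool provenance
  return `[MCP Server: ${tool.server}] ${tool.description}`;
}
```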

3. Intelligent Routing

When Claude calls a tool:

// Chat Agent maintains internal mapping
toolToServerMap.set('semantic_search', 'arda-global');

// Routes call to correct server
const serverName = toolToServerMap.get('semantic_search');
const result = await mcpClient.callTool('semantic_search', args, serverName);

4. MCP Proxy Routes to Server

The MCP proxy receives:

{
  "tool_name": "semantic_search",
  "arguments": {...},
  "server_name": "arda-global"
}

And routes to the appropriate MCP server instance.

Benefits

  • Extensible: Add new MCP servers without changing Chat Agent code
  • Isolated: Different servers provide specialized toolsets
  • Intelligent: Claude reasons about which server's tools to use
  • Observable: Detailed logging shows server routing

Debugging Multi-Server Setup

# List all configured MCP servers
curl http://localhost:3002/health/mcp/servers

# Response shows server status:
{
  "status": "ok",
  "servers": [
    {
      "name": "arda-global",
      "status": "connected",
      "enabled": true,
      "tool_count": 15
    }
  ],
  "total": 1,
  "enabled": 1,
  "connected": 1
}

Watch the logs when processing requests:

[MCP Client] Fetching tools from all servers
[MCP Client] Found 2 enabled and connected servers
[MCP Client] Successfully fetched 10 tools from arda-global
[MCP Client] Successfully fetched 5 tools from other-server
[LLM Service] Loaded 15 MCP tools for Claude from 2 servers:
  - arda-global: 10 tools
  - other-server: 5 tools

Development

Development Commands

# Development with auto-reload
npm run dev

# Type checking
npm run type-check

# Build TypeScript
npm run build

# Production start
npm start

# Linting
npm run lint

# Open API documentation in browser
npm run docs

# Export OpenAPI spec to file
npm run spec

Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| ANTHROPIC_API_KEY | Yes | - | Your Anthropic API key |
| PORT | No | 3002 | Server port |
| ANTHROPIC_MODEL | No | claude-sonnet-4-5-20250929 | Claude model to use |
| MCP_PROXY_URL | Yes | https://proxy.dev.arda.xyz | MCP proxy server URL |
| MCP_API_KEY_PLATFORM | Yes* | - | API key for Platform client to access MCP proxy |
| MCP_API_KEY_ARI_UI | Yes* | - | API key for ARI UI client to access MCP proxy |
| ALLOWED_ORIGINS | No | (empty) | CORS allowed origins (comma-separated). Blocks all origins by default; set to `*` for dev or specific origins for prod |
| NODE_ENV | No | development | Environment mode |
| LOG_LEVEL | No | info | Logging verbosity |
| OPIK_API_KEY | No | - | Opik API key for LLM observability and tracing |
| OPIK_PROJECT_NAME | No | arda-chat-agent | Opik project name |
| OPIK_ENABLED | No | true | Enable/disable Opik tracing |

*At least one MCP API key (MCP_API_KEY_PLATFORM or MCP_API_KEY_ARI_UI) must be configured. Each client type requires its corresponding API key to access MCP tools.
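A startup check for these requirements might look like the following minimal sketch; the real utils/config.ts may structure this differently:

```typescript
// Minimal sketch of startup validation for the rules above: two hard-required
// variables, plus at least one of the two client API keys.
function validateEnv(env: Record<string, string | undefined>): string[] {
  const errors: string[] = [];
  if (!env.ANTHROPIC_API_KEY) errors.push("ANTHROPIC_API_KEY is required");
  if (!env.MCP_PROXY_URL) errors.push("MCP_PROXY_URL is required");
  // At least one client API key must be configured
  if (!env.MCP_API_KEY_PLATFORM && !env.MCP_API_KEY_ARI_UI) {
    errors.push("Set MCP_API_KEY_PLATFORM and/or MCP_API_KEY_ARI_UI");
  }
  return errors;
}
```

Failing fast with a list of every missing variable (rather than one at a time) makes misconfigured deployments quicker to diagnose.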

Available Claude Models

Production (Default):

  • claude-3-5-sonnet-20241022 - Best balance of intelligence & speed
  • Cost: $3/M input tokens, $15/M output tokens

Budget:

  • claude-3-haiku-20240307 - Fastest & cheapest
  • Cost: $0.25/M input tokens, $1.25/M output tokens

Premium:

  • claude-3-opus-20240229 - Most capable
  • Cost: $15/M input tokens, $75/M output tokens

Logging

The service provides comprehensive logging with prefixed tags:

[MCP Client] Initialized with base URL: http://localhost:8000
[LLM Service] Initializing Anthropic client
[LLM Service] Using model: claude-3-5-sonnet-20241022
[Chat] New request received: Context: platform
[MCP Client] Fetching tools from all servers
[LLM Service] Loaded 15 MCP tools for Claude from 2 servers
[LLM Service] Claude requested 2 tool calls
[MCP Client] Calling tool: semantic_search on server: arda-global
[Chat] Request completed in 2500ms

License

Proprietary - ARDA Platform


Version: 0.1.0
Last Updated: October 2025
LLM Provider: Anthropic Claude 3.5 Sonnet
