AI Architecture

This document provides a comprehensive overview of the AI architecture in the LMS, including provider services, tool calling, and configuration.

Overview

The LMS uses a provider-agnostic AI architecture that enables:

  • Multiple AI provider support (OpenAI, Anthropic, Google, Custom backends)
  • Per-course AI configuration with database-driven settings
  • AI tool calling for rich interactions with course data
  • Mock provider for testing and development

Architecture Diagram

mermaid
graph TD
    subgraph "Frontend"
        A[React Components] --> B[AiChatService]
        B --> C[CourseAwareAiProviderService]
    end

    subgraph "Backend Services"
        C --> D[OpenAiService]
        C --> E[AnthropicService]
        C --> F[GoogleAiService]
        C --> G[CustomBackendService]
        C --> H[MockAiService]

        I[AIToolRegistry] --> D
        I --> E
        I --> F
    end

    subgraph "Database"
        J[CourseAiConfig] --> C
        K[AIToolDefinition] --> I
        L[ChatAnalytics]
    end

    N[External APIs] --> D
    N --> E
    N --> F

AI Provider Services

IAiProviderService Interface

The IAiProviderService interface defines the contract for all AI providers:

typescript
export interface IAiProviderService {
  /**
   * Count tokens in input text
   */
  getInputTokenCount(input: string): Promise<number>;

  /**
   * Get chat completion as a stream
   */
  getChatCompletion(
    messages: ChatMessage[],
    tools?: ToolDefinition[],
    model?: string,
  ): Promise<Readable>;

  /**
   * Get chat completion as a single response
   */
  getChatCompletionSync(
    messages: ChatMessage[],
    tools?: ToolDefinition[],
    model?: string,
  ): Promise<ChatCompletionResult>;

  /**
   * Check if provider is available
   */
  isAvailable(): Promise<boolean>;
}
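Because every provider implements the same contract, callers can treat providers interchangeably. As a minimal sketch (with simplified stand-in types, not the real `ChatMessage`/`ChatCompletionResult` definitions), a hypothetical helper could probe `isAvailable()` to pick the first working provider:

```typescript
// Simplified stand-ins for the real contract types.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type ChatCompletionResult = { content: string; tokensUsed?: number };

interface IAiProviderService {
  isAvailable(): Promise<boolean>;
  getChatCompletionSync(messages: ChatMessage[]): Promise<ChatCompletionResult>;
}

// Hypothetical helper: return the first provider that reports itself available.
async function firstAvailable(
  providers: IAiProviderService[],
): Promise<IAiProviderService | undefined> {
  for (const p of providers) {
    if (await p.isAvailable()) return p;
  }
  return undefined;
}
```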

Supported Providers

| Provider | Service Class | Default Model | Use Case |
| --- | --- | --- | --- |
| OpenAI | OpenAiService | gpt-4.1-mini | Default, general-purpose AI |
| Anthropic | AnthropicService | claude-sonnet-4-20250514 | Claude models for complex reasoning |
| Google | GoogleAiService | gemini-1.5-pro | Gemini models for multimodal input |
| Custom | CustomBackendService | Configurable | Self-hosted LLMs (Ollama, LM Studio) |
| Mock | MockAiService | N/A | Testing and development |

Provider Loading Flow

mermaid
sequenceDiagram
    participant User
    participant AiChatService
    participant CourseAwareAiProviderService
    participant CourseAiConfig
    participant OpenAiService
    participant AnthropicService

    User->>AiChatService: generateResponse(conversationId, message, courseId)
    AiChatService->>CourseAwareAiProviderService: getChatCompletionSync(messages, tools, courseId)

    CourseAwareAiProviderService->>CourseAiConfig: findOne({ courseId, isEnabled: true })

    alt No course config
        CourseAiConfig-->>CourseAwareAiProviderService: null
        CourseAwareAiProviderService->>OpenAiService: getChatCompletionSync(messages)
    else Has course config
        CourseAiConfig-->>CourseAwareAiProviderService: Config with provider
        CourseAwareAiProviderService->>CourseAwareAiProviderService: selectProvider(config)

        alt provider === 'anthropic'
            CourseAwareAiProviderService->>AnthropicService: getChatCompletionSync(messages)
        else provider === 'openai'
            CourseAwareAiProviderService->>OpenAiService: getChatCompletionSync(messages)
        end
    end

    OpenAiService-->>CourseAwareAiProviderService: ChatCompletionResult
    AnthropicService-->>CourseAwareAiProviderService: ChatCompletionResult
    CourseAwareAiProviderService-->>AiChatService: result
    AiChatService-->>User: AI Response
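The core decision in the diagram is the fallback rule: a course with no enabled config gets the default provider. A minimal sketch of that rule, assuming an in-memory list of config rows rather than the real TypeORM `findOne` call:

```typescript
type Provider = "openai" | "anthropic" | "google" | "custom" | "mock";
type CourseAiConfigRow = { courseId: number; provider: Provider; isEnabled: boolean };

// Sketch of the selection rule: use the course's enabled config if one
// exists, otherwise fall back to the default provider (OpenAI).
function resolveProvider(configs: CourseAiConfigRow[], courseId: number): Provider {
  const config = configs.find((c) => c.courseId === courseId && c.isEnabled);
  return config?.provider ?? "openai";
}
```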

Course Configuration

CourseAiConfig Entity

The CourseAiConfig entity stores per-course AI settings:

typescript
@Entity("course_ai_configs")
export class CourseAiConfig {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  courseId: number;

  @Column({
    type: "varchar",
    length: 50,
    default: "openai",
  })
  provider: "openai" | "anthropic" | "google" | "custom" | "mock";

  @Column({ nullable: true })
  apiFormat: "openai" | "anthropic" | "google" | "custom";

  @Column({ nullable: true })
  baseUrl: string; // For custom backends (e.g., http://localhost:11434)

  @Column({ nullable: true })
  apiKey: string; // Course-specific API key

  @Column({ nullable: true })
  model: string; // Override default model

  @Column({ default: false })
  useCustomBackend: boolean;

  @Column({ default: true })
  isEnabled: boolean;
}
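As an illustration, a config row pointing a course at a self-hosted Ollama backend might look like this (the `courseId` and `model` values are hypothetical; field names match the entity above):

```typescript
// Hypothetical config row for a course using a local Ollama backend.
const ollamaConfig = {
  courseId: 42,
  provider: "custom" as const,
  apiFormat: "openai" as const, // Ollama exposes an OpenAI-compatible API
  baseUrl: "http://localhost:11434",
  model: "llama3.1",
  useCustomBackend: true,
  isEnabled: true,
};
```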

Environment Variables

Configure default providers via environment variables:

env
# OpenAI (default)
OPENAI_API_KEY=sk-...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_MODEL=claude-sonnet-4-20250514

# Google
GOOGLE_AI_API_KEY=...
GOOGLE_AI_MODEL=gemini-1.5-pro
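A provider service would typically read these variables with a fallback to its built-in default. A minimal sketch (the function name is illustrative; variable names and defaults match the table and env listing above):

```typescript
// Sketch: resolve the default model from the environment, falling back
// to the provider's built-in default when the variable is unset.
function defaultModel(provider: "anthropic" | "google"): string {
  if (provider === "anthropic") {
    return process.env.ANTHROPIC_MODEL ?? "claude-sonnet-4-20250514";
  }
  return process.env.GOOGLE_AI_MODEL ?? "gemini-1.5-pro";
}
```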

AI Tool Calling

AIToolRegistry

The AIToolRegistry enables AI agents to call backend services:

typescript
@singleton()
export class AIToolRegistry {
  /**
   * Register a new AI tool
   */
  async registerTool(toolData: {
    name: string; // e.g., "calendar_create_event"
    displayName: string; // e.g., "Create Calendar Event"
    description: string; // e.g., "Creates a new calendar event"
    category: string; // e.g., "calendar"
    handlerPath: string; // e.g., "CalendarService.createEvent"
    parameters?: Array<{
      name: string;
      type: "string" | "number" | "boolean" | "date" | "array" | "object";
      description: string;
      isRequired?: boolean;
    }>;
    requiresAuth?: boolean;
    requiredPermission?: string;
  }): Promise<AIToolDefinition>;

  /**
   * Get available tools
   */
  async getAvailableTools(category?: string): Promise<AIToolDefinition[]>;

  /**
   * Execute a tool
   */
  async executeTool(
    toolName: string,
    params: Record<string, any>,
    userId?: number,
  ): Promise<any>;
}
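Before a registered tool can be offered to a model, its parameter metadata has to be translated into the JSON-Schema-style spec that OpenAI-compatible APIs expect. A hedged sketch of that translation (the exact wire format varies by provider; `ToolParameter` mirrors the `parameters` argument of `registerTool`):

```typescript
type ToolParameter = {
  name: string;
  type: "string" | "number" | "boolean" | "date" | "array" | "object";
  description: string;
  isRequired?: boolean;
};

// Convert registry parameter metadata into a JSON-Schema-like object.
function toJsonSchema(params: ToolParameter[]) {
  const properties: Record<string, { type: string; description: string }> = {};
  for (const p of params) {
    // JSON Schema has no "date" type; represent dates as strings.
    properties[p.name] = {
      type: p.type === "date" ? "string" : p.type,
      description: p.description,
    };
  }
  return {
    type: "object",
    properties,
    required: params.filter((p) => p.isRequired).map((p) => p.name),
  };
}
```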

Supported Tools

| Category | Service | Available Tools |
| --- | --- | --- |
| Calendar | CalendarService | calendar_create_event, calendar_get_events |
| Quiz | QuizService | quiz_get_attempts, quiz_get_questions |
| Course | CourseService | course_get_modules, course_get_chapters |
| Discussion | DiscussionService | discussion_search, discussion_get_posts |

Tool Definition Example

typescript
await aiToolRegistry.registerTool({
  name: "calendar_create_event",
  displayName: "Create Calendar Event",
  description: "Creates a new event in a calendar",
  category: "calendar",
  handlerPath: "CalendarService.createEvent",
  parameters: [
    {
      name: "calendarId",
      type: "number",
      description: "ID of the calendar",
      isRequired: true,
    },
    {
      name: "title",
      type: "string",
      description: "Event title",
      isRequired: true,
    },
    {
      name: "startDate",
      type: "date",
      description: "Start date/time",
      isRequired: true,
    },
    {
      name: "endDate",
      type: "date",
      description: "End date/time",
      isRequired: false,
    },
  ],
  requiresAuth: true,
});

Tool Calling Flow

mermaid
sequenceDiagram
    participant User
    participant AI
    participant AIToolRegistry
    participant CalendarService

    User->>AI: "Create an event for tomorrow's meeting"

    AI->>AIToolRegistry: getAvailableTools("calendar")
    AIToolRegistry-->>AI: [calendar_create_event]

    AI->>User: Would you like me to create a calendar event? I'll need: calendarId, title, startDate

    User->>AI: Yes, calendar ID 1, title "Team Meeting", start "2024-01-15T10:00:00Z"

    AI->>AIToolRegistry: executeTool("calendar_create_event", {...}, userId)
    AIToolRegistry->>CalendarService: createEvent({...}, userId)
    CalendarService-->>AIToolRegistry: Event created
    AIToolRegistry-->>AI: { id: 123, title: "Team Meeting", ... }

    AI-->>User: I've created the calendar event "Team Meeting" for tomorrow at 10:00 AM.
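The loop the diagram implies can be sketched as follows: when a model turn contains a tool call, execute it through the registry and hand the result back; otherwise the turn is a plain reply. The `ModelTurn` and `ToolExecutor` shapes here are simplified stand-ins, not the real service types:

```typescript
type ToolCall = { name: string; params: Record<string, unknown> };
type ModelTurn = { content?: string; toolCall?: ToolCall };
type ToolExecutor = (
  name: string,
  params: Record<string, unknown>,
) => Promise<unknown>;

// Sketch: dispatch one model turn. A tool call is executed via the
// registry; in practice its result is then fed back to the model.
async function runTurn(
  turn: ModelTurn,
  executeTool: ToolExecutor,
): Promise<{ reply?: string; toolResult?: unknown }> {
  if (turn.toolCall) {
    const toolResult = await executeTool(turn.toolCall.name, turn.toolCall.params);
    return { toolResult };
  }
  return { reply: turn.content };
}
```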

Adding a New AI Provider

Step 1: Create Provider Service

typescript
// apps/client-backend/src/integrations/providers/newprovider.service.ts
import { singleton } from "tsyringe";
import type { Readable } from "stream";
import type {
  IAiProviderService,
  ChatMessage,
  ChatCompletionResult,
  ToolDefinition,
} from "../contracts/IAiProviderService";

@singleton()
export class NewProviderService implements IAiProviderService {
  async getInputTokenCount(input: string): Promise<number> {
    // Implementation
  }

  async getChatCompletion(
    messages: ChatMessage[],
    tools?: ToolDefinition[],
    model?: string,
  ): Promise<Readable> {
    // Implementation
  }

  async getChatCompletionSync(
    messages: ChatMessage[],
    tools?: ToolDefinition[],
    model?: string,
  ): Promise<ChatCompletionResult> {
    // Implementation
  }

  async isAvailable(): Promise<boolean> {
    // Health check
  }
}

Step 2: Update CourseAwareAiProviderService

Add your provider to the constructor and selectProvider method:

typescript
constructor(
  private readonly openAiService: OpenAiService,
  private readonly anthropicService: AnthropicService,
  // ... existing providers
  private readonly newProviderService: NewProviderService,  // Add this
) {}

private selectProvider(config: CourseAiConfig): IAiProviderService {
  switch (config.provider) {
    // ... existing cases
    case "newprovider":  // Add this case
      return this.newProviderService;
    default:
      return this.openAiService;
  }
}

Step 3: Update CourseAiConfig Entity

Add the new provider to the union type:

typescript
@Column({
  type: "varchar",
  length: 50,
  default: "openai",
})
provider: "openai" | "anthropic" | "google" | "custom" | "mock" | "newprovider";

Testing with Mock Provider

Enable Mock Mode

Option 1: Environment Variable

env
USE_MOCK_AI=true

Option 2: Course Configuration

Set provider: "mock" in the course's CourseAiConfig record.

Configure Mock Responses

typescript
// In tests
const mockAiService = container.resolve(MockAiService);

// Configure a response
mockAiService.configureResponse("hello", "Hello! How can I help you today?");

// Configure a tool call response
mockAiService.configureToolCall("calendar_create_event", {
  success: true,
  eventId: 123,
});

// Reset mock state
mockAiService.reset();
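For intuition, a mock like this can be thought of as canned responses keyed by a trigger substring of the prompt. The class below is an illustrative miniature, not the real MockAiService; the substring-matching rule and default reply are assumptions:

```typescript
// Illustrative miniature of a mock AI service: canned responses keyed
// by a trigger substring. Method names mirror the snippet above.
class MiniMockAi {
  private responses = new Map<string, string>();

  configureResponse(trigger: string, reply: string): void {
    this.responses.set(trigger.toLowerCase(), reply);
  }

  respond(prompt: string): string {
    for (const [trigger, reply] of this.responses) {
      if (prompt.toLowerCase().includes(trigger)) return reply;
    }
    return "Mock response"; // default when no trigger matches
  }

  reset(): void {
    this.responses.clear();
  }
}
```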