AI Configuration Guide
This guide explains how to configure AI chat assistants for your courses, including provider selection, model configuration, and custom backend setup.
Overview
The LMS supports multiple AI providers that can be configured on a per-course basis:
- OpenAI (GPT models) - Default provider for general-purpose AI
- Anthropic (Claude models) - Advanced reasoning and complex tasks
- Google AI (Gemini models) - Multimodal capabilities
- Custom Backend - Self-hosted solutions (Ollama, LM Studio, etc.)
- Mock Provider - Testing and development
Accessing AI Configuration
For Instructors
- Navigate to your course
- Click on Course Settings in the sidebar
- Select the AI Configuration tab
- Configure your preferred AI provider and settings
Prerequisites
- Instructor or administrator role for the course
- (Optional) API keys for commercial AI providers
- (Optional) Self-hosted AI backend for custom deployments
Provider Configuration
OpenAI (GPT)
OpenAI's GPT models provide general-purpose AI capabilities with strong performance across various tasks.
Configuration Steps:
- Select OpenAI as the provider
- (Optional) Enter your OpenAI API key
- If not provided, uses system default
- Per-course keys allow cost tracking
- Select model:
  - gpt-4.1-mini (Default) - Fast, cost-effective
  - gpt-4 - More capable, higher cost
  - gpt-3.5-turbo - Fastest, lowest cost
- Click Save Configuration
When to Use:
- General course assistance
- Quick responses
- Code explanations
- Document summarization
Anthropic (Claude)
Anthropic's Claude models excel at complex reasoning, detailed analysis, and maintaining context in long conversations.
Configuration Steps:
- Select Anthropic as the provider
- Enter your Anthropic API key
- Required for Anthropic provider
- Get key from: https://console.anthropic.com
- Select model:
  - claude-sonnet-4-20250514 (Default) - Balanced performance
  - claude-3-opus - Highest capability
  - claude-3-haiku - Fastest responses
- Click Save Configuration
When to Use:
- Complex problem-solving
- Detailed explanations
- Research assistance
- Long-form content generation
Google AI (Gemini)
Google's Gemini models offer multimodal capabilities and strong performance across text, code, and reasoning tasks.
Configuration Steps:
- Select Google AI as the provider
- Enter your Google AI API key
- Required for Google AI provider
- Get key from: https://makersuite.google.com/app/apikey
- Select model:
  - gemini-1.5-pro (Default) - Best overall performance
  - gemini-1.5-flash - Faster, lower cost
- Click Save Configuration
When to Use:
- Multimodal tasks
- Code generation and analysis
- General course assistance
- Complex reasoning
Custom Backend (Self-Hosted)
Configure self-hosted AI models using Ollama, LM Studio, or other OpenAI-compatible backends.
Configuration Steps:
- Select Custom Backend as the provider
- Configure settings:
  - Base URL: Your backend URL (e.g., http://localhost:11434)
  - API Format: Select compatibility mode
    - OpenAI (most common)
    - Anthropic
  - Model Name: Model identifier from your backend
    - For Ollama: llama3, mistral, codellama, etc.
    - For LM Studio: Model name from the UI
- (Optional) API Key: If your backend requires authentication
- Click Save Configuration
Supported Backends:
| Backend | Default Port | API Format | Example URL |
|---|---|---|---|
| Ollama | 11434 | OpenAI | http://localhost:11434 |
| LM Studio | 1234 | OpenAI | http://localhost:1234 |
| text-generation-webui | 5000 | OpenAI | http://localhost:5000 |
| vLLM | 8000 | OpenAI | http://localhost:8000 |
When to Use:
- Privacy-sensitive courses
- Offline or air-gapped environments
- Cost control
- Custom-trained models
- Research and experimentation
Setting Up Ollama
```bash
# Install Ollama
curl https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama3

# Start Ollama (usually starts automatically)
ollama serve
```

Configure in LMS:
- Base URL: http://localhost:11434
- API Format: openai
- Model: llama3
Setting Up LM Studio
- Download and install LM Studio from https://lmstudio.ai
- Download your preferred model through LM Studio UI
- Start the local server from LM Studio (Server tab)
- Note the port (usually 1234)
Configure in LMS:
- Base URL: http://localhost:1234
- API Format: openai
- Model: Name shown in the LM Studio UI
AI Tool Permissions
Configure which AI tools students can use in their conversations. Tools allow the AI to interact with course data and perform actions.
Available Tool Categories
| Category | Tools | Purpose |
|---|---|---|
| Calendar | View events, create events | Help students manage deadlines |
| Course Content | Browse modules, chapters | Navigate course materials |
| Quiz | View quiz info, review attempts | Study assistance |
| Discussion | Search discussions, view posts | Find relevant conversations |
Permission Scopes
Control how long tool permissions last:
- Always: Tool can be used anytime (default for read-only tools)
- Conversation: Permission lasts for the current conversation
- Once: Single-use permission, requires re-approval
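The three scopes above imply different approval lifetimes. A hypothetical sketch of how a client might decide when to re-prompt the user (the `Grant` shape and function names are illustrative, not the LMS's actual API):

```python
from dataclasses import dataclass

@dataclass
class Grant:
    tool: str
    scope: str            # "always" | "conversation" | "once"
    conversation_id: str  # conversation in which the grant was made
    used: bool = False    # relevant only for "once" grants

def needs_reapproval(grant: Grant, conversation_id: str) -> bool:
    """Return True if the user must approve the tool call again."""
    if grant.scope == "always":
        return False
    if grant.scope == "conversation":
        # Grant expires when the conversation changes.
        return grant.conversation_id != conversation_id
    if grant.scope == "once":
        # Single use: any second call requires a fresh approval.
        return grant.used
    raise ValueError(f"Unknown scope: {grant.scope}")
```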
Configuring Tool Permissions
- Go to Course Settings → AI Configuration
- Scroll to Tool Permissions section
- For each tool category:
- Enable/disable the category
- Set default permission scope
- Configure auto-approval rules
- Click Save Configuration
Best Practice
Enable read-only tools (browsing content, viewing events) with "Always" scope, and action tools (creating content, modifying data) with "Conversation" or "Once" scope for better control.
Managing AI Features
Enabling/Disabling AI
Toggle AI chat for your course:
- Navigate to Course Settings → AI Configuration
- Use the Enable AI Chat toggle
- When disabled, students won't see the AI chat interface
Testing Configuration
Before making AI available to students:
- Configure your preferred provider
- Use the Test Configuration button
- Send a test message to verify connectivity
- Check response quality and latency
Monitoring Usage
Track AI usage in your course:
- Navigate to Course Analytics
- View AI Chat Analytics section
- See metrics:
- Total conversations
- Messages per student
- Token usage (cost estimation)
- Most used tools
- Common questions
Cost Management
API Key Strategies
System-Wide Keys (Set by Administrator):
- Shared across all courses
- Simplest setup
- Central cost tracking
Per-Course Keys (Set by Instructor):
- Individual course budgets
- Isolated cost tracking
- Useful for research projects
Per-User Keys (Advanced):
- Students provide their own keys
- Zero cost to institution
- Requires student technical setup
Cost Optimization Tips
Choose Appropriate Models
- Use smaller models (gpt-4.1-mini, claude-haiku) for simple queries
- Reserve larger models for complex tasks
Set Usage Limits
- Configure rate limits per student
- Set daily/weekly message caps
- Monitor token usage
Consider Custom Backends
- One-time hardware cost
- No per-token fees
- Unlimited usage
Tool Selection
- Enable only necessary tools
- Reduce tool calls to minimize tokens
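Token usage from the analytics page can be turned into a rough cost estimate by multiplying by your provider's per-token prices. A sketch with placeholder rates (check your provider's current price sheet; the numbers in the example are illustrative, not real prices):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Rough cost in dollars; prices are per 1,000 tokens.

    Substitute your provider's actual input/output rates.
    """
    return (prompt_tokens / 1000) * price_in_per_1k \
         + (completion_tokens / 1000) * price_out_per_1k

# Example: 2M prompt tokens and 500k completion tokens in a month,
# at made-up rates of $0.00015 in / $0.0006 out per 1k tokens.
monthly = estimate_cost(2_000_000, 500_000,
                        price_in_per_1k=0.00015, price_out_per_1k=0.0006)
```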
Troubleshooting
AI Chat Not Responding
Check Configuration:
- Verify AI is enabled for the course
- Confirm API key is valid
- Test connection with "Test Configuration" button
Common Issues:
- Invalid or expired API key → Update in settings
- Backend URL unreachable → Check network/firewall
- Model not available → Verify model name spelling
Slow Response Times
Causes:
- Large model selected → Try smaller model
- High token count → Shorter context works faster
- Backend overloaded → Check backend resources
Incorrect Responses
Improvements:
- Adjust model selection
- Review tool permissions
- Check context window size
- Consider different provider
Best Practices
For General Courses
- Provider: OpenAI (gpt-4.1-mini)
- Tools: Enable calendar, course content, discussions
- Permissions: Read-only tools on "Always", write tools on "Conversation"
For Programming Courses
- Provider: Anthropic (claude-sonnet) or OpenAI (gpt-4)
- Tools: Enable course content, quiz review
- Permissions: Mostly "Always" for reference materials
For Privacy-Sensitive Courses
- Provider: Custom Backend (Ollama + open model)
- Tools: Limited to essential only
- Permissions: "Once" or "Conversation" for all tools
For Research Projects
- Provider: Per-course API keys (any provider)
- Tools: Full access
- Permissions: Track usage, adjust based on needs
Related Documentation
- AI Architecture (Developer Guide) - Technical implementation details
- Instructor Guide - General course management
- Student Guide - Using AI chat as a student
Getting Help
If you need assistance with AI configuration:
- Check the troubleshooting section above
- Review the AI Architecture documentation
- Contact your system administrator
- Consult your institution's IT support