AI Configuration (Deep Dive)
The Tabbed Interface
When creating or editing an assistant, you’ll see three main tabs:
What is the LLM?
The LLM (Large Language Model) is the “brain” of your assistant. It understands what callers say and generates smart, helpful responses.
Key Settings
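The exact settings vary by LLM provider, but most configurations cover the same handful of knobs: which provider and model to use, how creative the responses should be, how long they may run, and the system prompt that defines the assistant's persona. A minimal sketch (the field names here are illustrative, not Burki's actual configuration schema):

```python
# Illustrative LLM settings. Field names are hypothetical examples,
# not Burki's actual configuration schema.
llm_settings = {
    "provider": "openai",      # which LLM backend to use
    "model": "gpt-4o",         # model name at that provider
    "temperature": 0.7,        # 0 = deterministic, higher = more varied
    "max_tokens": 300,         # cap on response length per turn
    "system_prompt": (
        "You are a friendly phone assistant. "
        "Keep answers short and conversational."
    ),
}

print(llm_settings["model"])
```

As a rule of thumb, lower the temperature for scripted, factual use cases and raise it slightly when you want more natural small talk.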
How do Fallbacks Work?
If your main LLM provider fails (e.g., rate limit, downtime), the system automatically tries your backup providers in order. This keeps your assistant reliable—even if one provider has issues.
Best Practices
- Start simple: use recommended defaults, then experiment with advanced settings.
- Test with real calls: try different voices, models, and fallback setups.
- Document your changes: keep track of what works best for your use case.
What’s Next?
- 📞 Call Management: configure interruption handling, timeouts, and conversation flow control.
- 🎙️ STT Advanced Settings: fine-tune speech detection timing and audio processing.
- 🔊 TTS Provider Details: deep dive into voice options and audio optimization.
- 🛠️ Tools & Actions: enable call actions like transfers and automated responses.