Mullai v0.0.1
Mullai is an advanced AI Assistant framework built entirely on .NET. It provides a robust foundation for creating intelligent, multi-turn conversational AI agents equipped with tools, memory, and skills.
Leveraging Microsoft.Extensions.AI and Microsoft.Agents.AI, Mullai empowers
developers to build sophisticated AI assistants with a highly scalable and observable architecture.
Quick Install
Install the Mullai CLI directly to your system using our automated install scripts.
macOS / Linux (Bash)
curl -sSL https://raw.githubusercontent.com/agentmatters/mullai-bot/main/scripts/install.sh | bash
Windows (PowerShell)
irm https://raw.githubusercontent.com/agentmatters/mullai-bot/main/scripts/install.ps1 | iex
Key Features
- Multi-Provider with Fallback: Integrate Gemini, Groq, Cerebras, Mistral, and more. If one fails, the next in line is automatically tried.
- Centralized Model Catalog: Manage all model metadata (priority, pricing, context window) from a single models.json file.
- Rich Tool Ecosystem: Empower agents with WeatherTool, CliTool, and FileSystemTool.
- Robust Middleware: Pipeline for FunctionCalling, PII Handling, and Guardrails.
- Memory & Skills: Persistent UserInfoMemory and dynamic AgentSkills.
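The provider-fallback behavior described above can be sketched roughly as follows. This is an illustrative sketch, not Mullai's actual implementation: IChatClient, ChatMessage, and ChatResponse come from Microsoft.Extensions.AI (recent versions), while the FallbackChatClient wrapper and the way providers are ordered are hypothetical.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

// Sketch only: try each provider in priority order and return the
// first successful response. Everything except the Microsoft.Extensions.AI
// types is hypothetical, not Mullai's real API surface.
public sealed class FallbackChatClient
{
    private readonly IReadOnlyList<IChatClient> _clients; // ordered by Priority

    public FallbackChatClient(IEnumerable<IChatClient> clientsByPriority)
        => _clients = clientsByPriority.ToList();

    public async Task<ChatResponse> GetResponseAsync(
        IList<ChatMessage> messages, CancellationToken ct = default)
    {
        Exception? last = null;
        foreach (var client in _clients)
        {
            try
            {
                // The first provider that answers wins.
                return await client.GetResponseAsync(messages, cancellationToken: ct);
            }
            catch (Exception ex)
            {
                last = ex; // remember the failure and try the next provider in line
            }
        }
        throw new InvalidOperationException("All providers failed.", last);
    }
}
```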
Project Architecture
Mullai is built with a modular and decoupled architecture, promoting maintainability and extensibility.
Core Layers
- Abstractions: Fundamental interfaces and base classes.
- Agents: The brain of Mullai, housing agent personalities and factory logic.
- Channels: Interaction via CLI, API, Telegram, or WebAssembly.
- Providers: Adapters for various LLM backends (Gemini, Groq, etc.).
- Telemetry: Shared OpenTelemetry configuration for tracing and metrics.
Provider Configuration
Mullai separates model configuration from sensitive API keys for flexibility and security.
models.json — Model Catalog
Defined in src/Mullai.Global.ServiceConfiguration/models.json.
{
  "MullaiProviders": {
    "Providers": [{
      "Name": "Gemini",
      "Priority": 1,
      "Enabled": true,
      "Models": [{ "ModelId": "gemini-2.0-flash", ... }]
    }]
  }
}
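For illustration, a fuller model entry might carry the metadata the catalog is said to manage. The field names below (other than ModelId) and the placeholder values are guesses based on the "priority, pricing, context window" description; check the actual models.json schema for the real names.

```json
{
  "ModelId": "gemini-2.0-flash",
  "ContextWindow": 1000000,
  "InputPricePerMillionTokens": 0.0,
  "OutputPricePerMillionTokens": 0.0
}
```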
appsettings.json — API Keys
Sensitive keys are stored in src/Mullai.Global.ServiceConfiguration/appsettings.json.
{
  "Gemini": { "ApiKey": "YOUR_API_KEY" },
  "Groq": { "ApiKey": "YOUR_API_KEY" }
}
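Committing real keys to appsettings.json is risky. Assuming Mullai uses the standard .NET configuration stack (typical for Microsoft.Extensions.AI projects), keys can be kept out of source control during development with the built-in user-secrets tool; the key paths below mirror the appsettings.json structure shown above.

```shell
# Run from the project directory (assumes standard .NET configuration binding):
dotnet user-secrets init
dotnet user-secrets set "Gemini:ApiKey" "YOUR_API_KEY"
dotnet user-secrets set "Groq:ApiKey" "YOUR_API_KEY"
```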
Observability
Mullai integrates with OpenTelemetry to provide deep insights out of the box, emitting:
- Distributed Traces: Parent spans for requests, child spans for each LLM attempt.
- Structured Logs: Granular insight into the agent's decision-making process.
- Metrics: Quantitative views of health and performance.
A local stack (Jaeger, Prometheus) can be launched via docker/observability.
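Assuming docker/observability contains a Compose file with the default name (an assumption; check the directory), the local stack can be started with:

```shell
# Launch the local Jaeger + Prometheus stack from the repository root.
cd docker/observability
docker compose up -d
```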
Contributing
We welcome contributions, whether you're adding providers, creating tools, or improving documentation.
- Report Bugs: Use GitHub issues for bug reports or feature suggestions.
- Development: Follow .NET best practices and our branching strategy.
- Pull Requests: Ensure your code is well-tested and documented.
See our Full Contributing Guidelines for more details.