Current

BotSharp

An open-source, .NET-based multi-agent framework for Conversation as a Platform (CaaP), using plugin-driven pipeline execution to build cross-platform intelligent assistants.

Signal

BotSharp · GitHub · 2026-03-24

AI Multi-Agent Framework in .NET | ai-agent, chatbot, multi-agent. BotSharp is an open-source machine learning framework for building AI bot platforms. The project incorporates natural language understanding, computer vision, and audio processing technologies, and aims to promote the development and application of intelligent assistants in information systems. Out-of-the-box machine learning algorithms let ordinary programmers develop artificial intelligence applications faster and more easily. It is written in C# on .NET Core, a fully cross-platform framework, and adopts a plug-in and pipeline flow execution design.

Context

BotSharp operates within the .NET ecosystem, targeting developers who require cross-platform AI agent capabilities without leaving the C#/.NET Core environment. The framework emphasizes "Conversation as a Platform (CaaP)," suggesting a structural approach where conversational interfaces serve as the primary orchestration layer for business logic and agent workflows. Key architectural features include plugin modularity and pipeline flow execution, allowing for granular control over the AI processing pipeline.
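The plugin-and-pipeline pattern described above can be sketched as follows. This is a language-agnostic illustration in Python, not BotSharp's actual C# API; all names here (`Pipeline`, `register`, the plugin functions) are hypothetical.

```python
from typing import Callable, Dict, List

# Hypothetical sketch: a conversation pipeline where independent plugins
# each transform a shared context dict in a fixed, configurable order.
Plugin = Callable[[Dict], Dict]

class Pipeline:
    def __init__(self) -> None:
        self._plugins: List[Plugin] = []

    def register(self, plugin: Plugin) -> "Pipeline":
        self._plugins.append(plugin)
        return self

    def run(self, context: Dict) -> Dict:
        # Each plugin receives the context produced by the previous one.
        for plugin in self._plugins:
            context = plugin(context)
        return context

def detect_intent(ctx: Dict) -> Dict:
    ctx["intent"] = "greet" if "hello" in ctx["utterance"].lower() else "unknown"
    return ctx

def route_to_agent(ctx: Dict) -> Dict:
    ctx["agent"] = {"greet": "welcome-agent"}.get(ctx["intent"], "fallback-agent")
    return ctx

pipeline = Pipeline().register(detect_intent).register(route_to_agent)
result = pipeline.run({"utterance": "Hello there"})
print(result["agent"])  # welcome-agent
```

Because each plugin only depends on the context it receives, plugins can be added, removed, or reordered without touching the others, which is the modularity the entry attributes to BotSharp's design.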

Relevance

BotSharp fills a specific infrastructure gap for organizations and developers invested in the .NET stack who previously lacked a native, open-source multi-agent orchestration framework comparable to Python-based alternatives. Its pipeline design supports deterministic control over agent interactions, which is critical for enterprise-grade applications requiring auditability and structured execution flows. The inclusion of NLU, computer vision, and audio processing capabilities within the core framework reduces dependency on external microservices for multimodal tasks.
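The auditability property mentioned above can be illustrated with a minimal sketch: each pipeline step is executed in a fixed order and recorded with its name and a snapshot of the context it produced, so a run can be inspected or replayed after the fact. This is an assumption-laden illustration in Python, not BotSharp's implementation.

```python
import json
import time
from typing import Callable, Dict, List, Tuple

# Hypothetical sketch: deterministic step order plus a per-step audit log.
def run_audited(steps: List[Tuple[str, Callable[[Dict], Dict]]], context: Dict):
    audit_log = []
    for name, step in steps:
        context = step(context)
        audit_log.append({
            "step": name,
            "timestamp": time.time(),
            # Deep snapshot via a JSON round-trip, so later steps
            # cannot mutate earlier log entries.
            "context": json.loads(json.dumps(context)),
        })
    return context, audit_log

steps = [
    ("normalize", lambda ctx: {**ctx, "utterance": ctx["utterance"].strip().lower()}),
    ("classify", lambda ctx: {**ctx, "intent": "order_status"
                              if "order" in ctx["utterance"] else "other"}),
]
final, log = run_audited(steps, {"utterance": "  Where is my ORDER?  "})
print(final["intent"])           # order_status
print([e["step"] for e in log])  # ['normalize', 'classify']
```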

Current State

The project is hosted on GitHub under the SciSharp organization with Apache 2.0 licensing. It maintains a NuGet package for distribution and has an active Discord community. Documentation is available via ReadTheDocs. The build pipeline is automated via GitHub Actions. The framework supports integration with major LLM providers through its abstraction layer, aligning with the broader Openflows infrastructure model.
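A provider abstraction layer of the kind described above typically looks like the following sketch: agent code targets one chat-completion interface, and concrete providers (an OpenAI-style API, a local Ollama runtime, etc.) are swapped in by configuration. Shown in Python for brevity; the interface and names are illustrative assumptions, not BotSharp's actual API.

```python
from abc import ABC, abstractmethod
from typing import Dict, List, Type

# Hypothetical provider interface: all backends expose the same method.
class ChatCompletionProvider(ABC):
    @abstractmethod
    def complete(self, messages: List[Dict[str, str]]) -> str: ...

class EchoProvider(ChatCompletionProvider):
    """Stand-in provider for testing; echoes the last user message."""
    def complete(self, messages: List[Dict[str, str]]) -> str:
        return "echo: " + messages[-1]["content"]

# Provider selection is driven by configuration, not hard-coded imports.
PROVIDERS: Dict[str, Type[ChatCompletionProvider]] = {"echo": EchoProvider}

def get_provider(name: str) -> ChatCompletionProvider:
    return PROVIDERS[name]()

reply = get_provider("echo").complete([{"role": "user", "content": "ping"}])
print(reply)  # echo: ping
```

Registering a real backend would mean adding one entry to the provider map, leaving agent code unchanged.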

Open Questions

  • Adoption: How does the .NET-specific focus impact adoption compared to language-agnostic frameworks in the broader ecosystem?
  • Model Support: What is the breadth of supported model providers and inference backends compared to frameworks like OpenClaw or CrewAI?
  • Security: How does the pipeline execution design handle untrusted code execution and sandboxing, particularly in enterprise environments?
  • Performance: Does the .NET runtime overhead impact inference latency compared to native Python implementations for high-throughput scenarios?

Connections

BotSharp is directly comparable to openclaw, crewai, hermes-agent, and deerflow as a multi-agent orchestration framework. It integrates with infrastructure layers such as fastapi-llm-gateway, xinference, vllm, and ollama for model serving and inference. It is cataloged within the open-source-ai-agent-framework-landscape-2026 as a representative .NET solution. The framework's focus on CaaP aligns with qwen-agent's application-oriented approach, though with a distinct runtime environment.

  • OpenClaw - Competing multi-agent orchestration framework (Current · en)
  • CrewAI - Competing multi-agent orchestration framework (Current · en)
  • Open-Source AI Agent Framework Landscape 2026 - Included in 2026 framework landscape overview (Current · en)
  • Qwen-Agent - Comparable LLM application framework (Current · en)
  • Hermes Agent - Comparable autonomous agent platform (Current · en)
  • DeerFlow - Comparable multi-agent orchestration framework (Current · en)
  • FastAPI LLM Gateway - Compatible API integration layer (Current · en)
  • Xinference - Compatible unified inference API (Current · en)
  • vLLM - Compatible inference engine (Current · en)
  • Ollama - Compatible local inference runtime (Current · en)

External references

Mediation note

Tooling: OpenRouter / qwen/qwen3.5-flash-02-23

Use: drafted entry from external signal, assessed linkage against existing knowledge base

Human role: review, edit, and approve before publication

Limits: signal content may be incomplete; verify primary sources before publishing