OpenKnowledge is an open-source toolkit designed to drastically simplify the process of creating, configuring, and deploying an AI chatbot linked to a specialized knowledge base.
By leveraging a Markdown-Driven Configuration approach, developers can define an agent's identity, behavior, security guardrails, and knowledge entirely through readable `.md` files, with zero code changes needed to iterate on AI behavior.
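For a feel of what that looks like in practice, a minimal `behavior.md` might read as follows (the contents here are purely illustrative; the file names `behavior.md`, `security.md`, and `knowledge/*.md` follow the convention the core engine loads):

```markdown
<!-- agent-config/behavior.md (illustrative) -->
You are a friendly support assistant for the Acme documentation site.
Answer only from the provided knowledge base, and keep replies concise.
```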
We use a Turbo-powered monorepo consisting of:
- `packages/core`: The headless Node.js engine. It reads local Markdown files and routes prompt generation via `@tanstack/ai` to providers like OpenAI, Anthropic, and Gemini.
- `packages/react`: A purely UI-focused, controlled chat widget built with React, Tailwind CSS, and Radix UI. It maintains zero knowledge of your API keys.
- `apps/playground`: A Vite-based SSR playground for testing the widget UI and simulating the backend Server-to-Client communication flow.
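On disk, that translates to a workspace shaped roughly like this (a sketch assuming a standard Turbo monorepo layout; root-level files may differ):

```
openknowledge/
├── apps/
│   └── playground/    # Vite SSR playground
├── packages/
│   ├── core/          # headless Node.js engine
│   └── react/         # React chat widget
├── turbo.json
└── package.json
```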
- Node.js (v18+)
- npm
- Clone the repository:

  ```shell
  git clone https://github.com/your-org/openknowledge.git
  cd openknowledge
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Set up environment variables: Navigate to `apps/playground` and create a `.env` file (you can copy from `.env.example` if it exists) with your AI provider keys:

  ```
  GEMINI_API_KEY=your_key_here
  ```

- Start the development server:

  ```shell
  npm run dev
  ```

  This will start the Turbo pipeline, building the packages in watch mode and spinning up the Vite SSR playground.
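The watch-mode behavior comes from Turbo's task pipeline; a minimal `turbo.json` along these lines would produce it (a sketch, not the repository's actual config):

```json
{
  "tasks": {
    "build": { "dependsOn": ["^build"], "outputs": ["dist/**"] },
    "dev": { "cache": false, "persistent": true }
  }
}
```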
The OpenKnowledge architecture strictly enforces a Server-to-Client bridge to protect API keys.
Create your agent using `@lucasformiga/openknowledge-core` and expose an endpoint:
```ts
import { createAgent, parseEnv } from '@lucasformiga/openknowledge-core';

// Automatically loads behavior.md, security.md, and knowledge/*.md from the directory
const agent = await createAgent(parseEnv(process.env), './agent-config');

export async function POST(req, res) {
  // Pass the new message and the conversation history array for context
  const { message, history } = req.body;
  const response = await agent.ask(message, undefined, history);
  res.json({ text: response });
}
```

Use the `@lucasformiga/openknowledge-react` UI widget to interact with your secure endpoint. The widget supports Markdown parsing out of the box and uses Compound Components for maximum flexibility:
```tsx
import { Widget, useWidgetMessages } from '@lucasformiga/openknowledge-react';

export default function ChatWidget() {
  const { messages, isProcessing, appendMessage, setIsProcessing } = useWidgetMessages();

  const handleSendMessage = async (text: string) => {
    appendMessage({ role: 'user', content: text });
    setIsProcessing(true);
    try {
      const res = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          message: text,
          history: messages // Send history to maintain context
        })
      });
      const data = await res.json();
      appendMessage({ role: 'assistant', content: data.text });
    } finally {
      setIsProcessing(false);
    }
  };

  return (
    <Widget.Root messages={messages} isProcessing={isProcessing} onSendMessage={handleSendMessage}>
      <Widget.Trigger />
      <Widget.Content>
        <Widget.Header />
        <Widget.Body />
        <Widget.Footer />
      </Widget.Content>
    </Widget.Root>
  );
}
```

We welcome contributions from the community! If you're looking to make your first PR, here are a few Good First Issues derived from our current roadmap:
- Add Vector Database/RAG Support Example
  - Context: The core agent currently loads `.md` files directly from the file system, which is great for small configurations.
  - Task: Create a new example in `packages/core/examples` demonstrating how to connect the `KnowledgeRouter` (or `AgentInstance`) to a Vector Database like Pinecone, Supabase Vector, or similar using the manual initialization flow.

- Expand AI Provider Support in `@lucasformiga/openknowledge-core`
  - Context: The core package currently supports `openai`, `anthropic`, and `gemini` via `@tanstack/ai`.
  - Task: Add support for a new provider (e.g., Groq, DeepSeek, or Mistral). You will need to update `config.ts`, `router.ts`, and write covering Vitest tests.

- Enhance Widget Theming Options
  - Context: The widget supports 4 preset themes and a custom primary color override.
  - Task: Expose more CSS variables (such as border radii, font families, or secondary colors) via the `themeVariables` prop to allow deeper developer customization.
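For orientation, the existing `themeVariables` prop is the kind of surface this issue would extend. A sketch of such an override map follows; the variable names here are hypothetical, not the widget's actual API:

```typescript
// Hypothetical CSS-variable overrides to pass to the widget.
// The real variable names in @lucasformiga/openknowledge-react may differ.
const themeVariables: Record<string, string> = {
  '--ok-primary': '#6d28d9',        // existing: custom primary color override
  '--ok-radius': '0.75rem',         // proposed: border radius
  '--ok-font': 'Inter, sans-serif', // proposed: font family
};
```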
Before contributing, please read our Agent / Developer Guide for strict engineering, architecture, and testing guidelines.
This project is licensed under the MIT License.