docs/content/docs/features/ai/backend-integration.mdx (10 additions, 5 deletions)
@@ -11,12 +11,16 @@ The most common (and recommended) setup to integrate BlockNote AI with an LLM is
 ## Default setup (Vercel AI SDK)

 The example below closely follows the [basic example from the Vercel AI SDK](https://ai-sdk.dev/docs/ai-sdk-ui/chatbot#example) for Next.js.
-The only difference is that we're retrieving the BlockNote tools from the request body and using the `toolDefinitionsToToolSet` function to convert them to AI SDK tools. The LLM will now be able to invoke these tools to make modifications to the BlockNote document as requested by the user. The tool calls are forwarded to the client application where they're handled automatically by the AI Extension.
+The only difference is that we're retrieving the BlockNote tools from the request body and using the `toolDefinitionsToToolSet` function to convert them to AI SDK tools. We also forward the serialized document state (selection, cursor, block IDs) that BlockNote adds to every user message by calling `injectDocumentStateMessages`. The LLM will now be able to invoke these tools to make modifications to the BlockNote document as requested by the user. The tool calls are forwarded to the client application where they're handled automatically by the AI Extension.
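To make the conversion described above concrete, here is a minimal, self-contained sketch. The types and the helper below are hypothetical stand-ins, not BlockNote's actual exports: they only illustrate the idea behind `toolDefinitionsToToolSet`, namely that the client serializes its tool definitions into the request body and the server converts them into a name-keyed tool set for the LLM.

```typescript
// Illustrative sketch only: these types and this helper are hypothetical
// stand-ins, not BlockNote's actual exports.

type ToolDefinition = {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the tool's input
};

type ToolSet = Record<
  string,
  { description: string; parameters: Record<string, unknown> }
>;

function toolDefinitionsToToolSetSketch(defs: ToolDefinition[]): ToolSet {
  const toolSet: ToolSet = {};
  for (const def of defs) {
    // Each definition becomes one entry, keyed by tool name, so the LLM
    // can invoke it by name in a tool call.
    toolSet[def.name] = {
      description: def.description,
      parameters: def.parameters,
    };
  }
  return toolSet;
}

// Example request body as the client application might send it.
const body = {
  toolDefinitions: [
    {
      name: "applyDocumentOperations",
      description: "Apply edits to the document",
      parameters: {},
    },
  ],
};

const tools = toolDefinitionsToToolSetSketch(body.toolDefinitions);
console.log(Object.keys(tools)); // the tool names the LLM may invoke
```

The real helper additionally wires up the AI SDK's tool schema types; the shape above is only the data-flow idea.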
@@ -65,7 +70,7 @@ const model = createOpenAICompatible({
 })('model-id');

 // ...
-createAIExtension({
+AIExtension({
   transport: new ClientSideTransport({
     model,
   }),
@@ -103,4 +108,4 @@ You can connect BlockNote AI features with more advanced AI pipelines. You can i
 with BlockNote AI, [get in touch](/about).
 </Callout>

-- By default, BlockNote AI composes the LLM request (messages) based on the user's prompt and passes these to your backend. See [this example](https://github.com/TypeCellOS/BlockNote/blob/main/examples/09-ai/07-server-promptbuilder/src/App.tsx) for an example where composing the LLM request (prompt building) is delegated to the server.
+- By default, BlockNote AI sends the entire LLM chat history to the backend. See [the server persistence example](https://github.com/TypeCellOS/BlockNote/tree/main/examples/09-ai/07-server-persistence) for a pattern where the backend stores the chat history and the client sends only the latest message.
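The server-persistence pattern mentioned in the hunk above can be pictured with a small sketch. All names here are hypothetical, not BlockNote's API: the backend keeps the chat history keyed by a chat ID, so each request from the client only needs to carry the latest message.

```typescript
// Illustrative sketch only: names are hypothetical, not BlockNote's API.

type ChatMessage = { role: "user" | "assistant"; content: string };

// Stand-in for a database table of chat histories.
const chatStore = new Map<string, ChatMessage[]>();

function appendMessage(chatId: string, latest: ChatMessage): ChatMessage[] {
  // Load the stored history, append the incoming message, persist it,
  // and return the full history to pass to the LLM.
  const history = chatStore.get(chatId) ?? [];
  const updated = [...history, latest];
  chatStore.set(chatId, updated);
  return updated;
}

appendMessage("chat-1", { role: "user", content: "Make the title bold" });
const full = appendMessage("chat-1", { role: "assistant", content: "Done" });
console.log(full.length); // full history length after two messages
```

A real backend would persist to a database and also store the assistant's streamed reply once it completes; the in-memory map only shows the request/response shape.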
@@ -104,6 +109,12 @@ export async function POST(req: Request) {
 }
 ```

+This follows the regular `streamText` pattern of the AI SDK, with 3 exceptions:
+
+- The BlockNote document state is extracted from message metadata and injected into the messages, using `injectDocumentStateMessages`
+- BlockNote client-side tool definitions are extracted from the request body and passed to the LLM using `toolDefinitionsToToolSet`
+- The system prompt is set to the default BlockNote system prompt (`aiDocumentFormats.html.systemPrompt`). You can override or extend the system prompt; if you do, make sure your modified system prompt still explains to the AI how to modify the BlockNote document.
+
 See [Backend integrations](/docs/features/ai/backend-integration) for more information on how to integrate BlockNote AI with your backend.
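The first of the exceptions listed above, injecting document state from message metadata, can be sketched as follows. This is a hypothetical stand-in for `injectDocumentStateMessages`, not BlockNote's real implementation: it only shows the idea that document state (selection, cursor, block IDs) travels as metadata on user messages and is expanded into an extra message before the LLM call.

```typescript
// Illustrative sketch only: a hypothetical stand-in, not BlockNote's
// real `injectDocumentStateMessages`.

type ChatMessage = {
  role: "system" | "user" | "assistant";
  content: string;
  metadata?: { documentState?: string };
};

function injectDocumentStateSketch(messages: ChatMessage[]): ChatMessage[] {
  const out: ChatMessage[] = [];
  for (const msg of messages) {
    if (msg.role === "user" && msg.metadata?.documentState) {
      // Inject the serialized document state just before the user message
      // that carried it, so the LLM reads it as context.
      out.push({
        role: "system",
        content: `Current document:\n${msg.metadata.documentState}`,
      });
    }
    out.push(msg);
  }
  return out;
}

const injected = injectDocumentStateSketch([
  {
    role: "user",
    content: "Fix typos",
    metadata: { documentState: '<p id="b1">Helo</p>' },
  },
]);
console.log(injected.length); // one injected state message + the user message
```

The real helper works on AI SDK message types and BlockNote's own serialization format; the sketch only captures the metadata-to-message expansion step.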