Want to try something different? Let’s build a custom Strapi plugin that integrates Vercel’s AI SDK, giving our backend native AI capabilities we can securely expose to our Next.js app.
Here is a complementary video to go along with the post.
Instead of wiring AI logic directly into the frontend, we'll centralize it inside Strapi so we can:

- 🔐 Keep API keys secure
- 🧠 Standardize prompts and AI workflows
- 🔌 Expose clean, reusable endpoints to Next.js
- 🚀 Scale and iterate without touching frontend logic

Think of it as turning Strapi into our AI control center, while Next.js becomes the experience layer.
If we do this right, we won’t just “add AI.” We’ll build an extensible AI layer into our architecture.
We will cover the following three examples: non-streaming text generation, SSE streaming, and a chat interface. Here is the project structure we will end up with:

```
ai-sdk/
├── server/                          # Strapi Backend
│   ├── config/
│   │   ├── plugins.ts               # Plugin configuration
│   │   ├── database.ts              # Database settings
│   │   └── server.ts                # Server settings
│   ├── src/
│   │   └── plugins/
│   │       └── ai-sdk/              # Our AI SDK Plugin
│   │           ├── admin/           # Admin UI components
│   │           └── server/          # Backend logic
│   │               ├── src/
│   │               │   ├── controllers/
│   │               │   ├── services/
│   │               │   ├── routes/
│   │               │   └── lib/     # AI SDK integration
│   │               └── package.json
│   ├── .env                         # Environment variables
│   └── package.json
│
├── next-client/                     # Next.js Frontend
│   ├── app/
│   │   ├── layout.tsx               # Root layout
│   │   └── page.tsx                 # Home page
│   ├── components/
│   │   ├── AskExample.tsx           # Non-streaming demo
│   │   ├── AskStreamExample.tsx     # SSE streaming demo
│   │   └── ChatExample.tsx          # Chat interface demo
│   ├── hooks/
│   │   ├── useAsk.ts                # Non-streaming hook
│   │   └── useAskStream.ts          # Streaming hook
│   ├── lib/
│   │   └── api.ts                   # API client functions
│   └── package.json
│
└── docs/                            # Documentation
    └── TUTORIAL.md                  # This file
```

Backend stack:

| Technology | Version | Purpose |
|---|---|---|
| Strapi | 5.33.3 | Headless CMS framework |
| AI SDK | 6.0.39 | Vercel's unified AI SDK |
| @ai-sdk/anthropic | 3.0.15 | Anthropic provider for AI SDK |
| TypeScript | 5.x | Type-safe development |
| SQLite | - | Default database (configurable) |

Frontend stack:

| Technology | Version | Purpose |
|---|---|---|
| Next.js | 16.1.3 | React framework |
| React | 19.2.3 | UI library |
| @ai-sdk/react | 3.0.41 | React hooks for AI SDK |
| Tailwind CSS | 4.x | Utility-first styling |
| TypeScript | 5.x | Type-safe development |

AI service:

| Service | Model | Purpose |
|---|---|---|
| Anthropic Claude | claude-sonnet-4-20250514 | Primary LLM |
In AI SDK v6, there are two distinct message formats:
| Format | Used By | Structure |
|---|---|---|
| UIMessage | useChat hook, frontend state | { id, role, parts: [{ type: "text", text }] } |
| ModelMessage | streamText, generateText | { role, content } |
The useChat hook manages conversation state using UIMessage format (richer, includes IDs and parts). However, the AI SDK's streamText function expects ModelMessage format.
Use convertToModelMessages(uiMessages) to convert between them:
```ts
import { convertToModelMessages, type UIMessage } from "ai";

// In your service layer
async chat(messages: UIMessage[]) {
  const modelMessages = await convertToModelMessages(messages);
  return streamText({ model, messages: modelMessages });
}
```

Note: `convertToModelMessages` is async in AI SDK v6 to support async `Tool.toModelOutput()`.
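To make the two shapes concrete, here is a hand-rolled conversion for text-only parts. This is an illustrative sketch only; in real code, use `convertToModelMessages` from the `ai` package, which is also designed to handle non-text parts such as tool calls and files:

```typescript
// Simplified shapes mirroring the table above (text-only parts).
type UIMessage = {
  id: string;
  role: "user" | "assistant";
  parts: { type: "text"; text: string }[];
};
type ModelMessage = { role: "user" | "assistant"; content: string };

// Flatten each message's text parts into a single content string,
// dropping the UI-only `id` field along the way.
function toModelMessages(messages: UIMessage[]): ModelMessage[] {
  return messages.map((m) => ({
    role: m.role,
    content: m.parts
      .filter((p) => p.type === "text")
      .map((p) => p.text)
      .join(""),
  }));
}
```

The key takeaway: `UIMessage` carries presentation details the model never sees, so the conversion is lossy by design.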
Note: Make sure to use Node 22 (the current LTS version).

Note: This tutorial uses `npm` for all commands. If you prefer Yarn or pnpm, the commands are interchangeable (e.g., `npm run dev` → `yarn dev`, `npm install` → `yarn add`).
```bash
# Create a new Strapi project
npx create-strapi@latest server
```

Answer the following prompts:

```
paul@dev test npx create-strapi-app@latest server

 Strapi v5.33.3 🚀 Let's create your new project

🚀 Welcome to Strapi! Ready to bring your project to life?

Create a free account and get:
30 days of access to the Growth plan, which includes:
✨ Strapi AI: content-type builder, media library and translations
✅ Live Preview
✅ Single Sign-On (SSO) login
✅ Content History
✅ Releases

? Please log in or sign up. Skip
? Do you want to use the default database (sqlite) ? Yes
? Start with an example structure & data? No
? Start with Typescript? Yes
? Install dependencies with npm? Yes
? Initialize a git repository? Yes
? Participate in anonymous A/B testing (to improve Strapi)? No
 Strapi Creating a new application at /Users/paul/test/server
```

Navigate to the project:

```bash
cd server
```

Your new project will have the following structure:

```
server/
├── config/
│   ├── admin.ts
│   ├── api.ts
│   ├── database.ts
│   ├── middlewares.ts
│   ├── plugins.ts
│   └── server.ts
├── database/
├── public/
├── src/
│   └── admin/
├── types/
├── .env
├── .env.example
├── package.json
└── tsconfig.json
```

Create or update your .env file:
```
# Server
HOST=0.0.0.0
PORT=1337

# Secrets (generate with: openssl rand -base64 32)
APP_KEYS=your-app-keys-here
API_TOKEN_SALT=your-api-token-salt
ADMIN_JWT_SECRET=your-admin-jwt-secret
TRANSFER_TOKEN_SALT=your-transfer-token-salt
JWT_SECRET=your-jwt-secret

# Database (for SQLite)
DATABASE_CLIENT=sqlite
DATABASE_FILENAME=.tmp/data.db

# AI SDK configuration (you can use any provider; I chose Anthropic)
ANTHROPIC_API_KEY=sk-ant-your-api-key-here
ANTHROPIC_MODEL=claude-sonnet-4-20250514
```

Start the development server:

```bash
cd server
npm run dev
```

Visit http://localhost:1337 to create your admin account.
Once you have created your admin user, you will be greeted by the Strapi Admin area.

If you have never used Strapi, check out this Crash Course Tutorial that I created.
This is the core of our implementation: a Strapi plugin that integrates the Vercel AI SDK with Anthropic Claude.
We'll use the official Strapi Plugin SDK to scaffold our plugin. This CLI toolkit provides a complete project structure with all necessary configuration files.
```bash
# From the server directory, run the plugin init command
cd server
npx @strapi/sdk-plugin@latest init ai-sdk
```

You'll be prompted with several questions. Here are the recommended answers for our AI SDK plugin:

```
npx @strapi/sdk-plugin@latest init ai-sdk
[INFO] Creating a new package at: src/plugins/ai-sdk
✔ plugin name … ai-sdk
✔ plugin display name … AI SDK
✔ plugin description … Integrate AI capabilities using Vercel AI SDK
✔ plugin author name … Paul Bratslavsky
✔ plugin author email … paul.bratslavsky@strapi.io
✔ git url …
✔ plugin license … MIT
✔ register with the admin panel? … yes
✔ register with the server? … yes
✔ use editorconfig? … yes
✔ use eslint? … yes
✔ use prettier? … yes
✔ use typescript? … yes
```

Important: Make sure to answer Yes to both "register with the admin panel" and "register with the server" since our plugin needs backend API routes.
Troubleshooting: Peer Dependency Error During Scaffolding

The `@strapi/sdk-plugin init` command automatically runs `npm install` at the end of scaffolding. You may encounter a peer dependency conflict like this:

```
npm error ERESOLVE unable to resolve dependency tree
npm error peer react@"19" from react-intl@8.1.3
```

This happens because the scaffolded plugin template includes `react-intl@^8.1.3` (which requires React 19) but sets `react@^18.3.1` as a devDependency. Don't worry: the plugin files are still generated successfully; only the `npm install` step failed.

To fix this, navigate to the plugin directory and install dependencies manually:

```bash
cd src/plugins/ai-sdk
npm install --legacy-peer-deps
```

Alternatively, if you're using Yarn or pnpm, you can install with those instead, as they handle peer dependency conflicts more gracefully:

```bash
cd src/plugins/ai-sdk
yarn install  # or: pnpm install
```
Once this process is done, you will see the following message:
```
You can now enable your plugin by adding the following in ./config/plugins.ts
───────────────────────────────────
export default {
  // ...
  'ai-sdk': {
    enabled: true,
    resolve: './src/plugins/ai-sdk'
  },
  // ...
}
───────────────────────────────────
[INFO] Plugin generated successfully.
```

Go ahead and do that now.
Now let's install our dependencies and build our plugin for the first time.

```bash
# Navigate to the plugin directory
cd src/plugins/ai-sdk
npm run build
```

You can also start your plugin in watch mode with the following command:

```bash
npm run watch
```

Then restart your Strapi application by running `npm run dev` in the root, and you should see the following.
The plugin should now show up in the admin Menu and in Settings.
Now that we know our basic plugin structure is set up, let's start by installing all the required dependencies.

Make sure to run this in the src/plugins/ai-sdk folder:

```bash
npm install @ai-sdk/anthropic ai
```

Note: If you encounter the same `ERESOLVE` peer dependency error from Step 2.1, use the `--legacy-peer-deps` flag:

```bash
npm install @ai-sdk/anthropic ai --legacy-peer-deps
```

Note: Don't forget to rebuild your plugin and restart the Strapi application to apply the changes.
After scaffolding, your plugin structure will look like this:
```
src/plugins/ai-sdk/
├── admin/
│   └── src/
│       ├── components/   # Admin UI components (generated)
│       ├── pages/        # Admin pages (generated)
│       └── index.ts      # Admin entry point
├── server/
│   └── src/
│       ├── controllers/  # We'll add our API controllers here
│       ├── services/     # We'll add our service layer here
│       ├── routes/       # We'll add our routes here
│       └── index.ts      # Server entry point
├── package.json
├── tsconfig.json
└── README.md
```

Now let's create the additional directories we need for our implementation.

Run the following in the root folder of your plugin:

```bash
# Create the lib directory for our core AI SDK integration
mkdir -p server/src/lib
mkdir -p server/src/routes/content-api
```

We will cover these in detail in a bit.
Let's update the plugin config to initialize and connect to the Anthropic API. We'll verify it works before adding any endpoints.
First, let's tell Strapi about our plugin and pass it the API key. Update config/plugins.ts in your Strapi root:
```ts
export default ({ env }) => ({
  "ai-sdk": {
    enabled: true,
    resolve: "./src/plugins/ai-sdk",
    config: {
      anthropicApiKey: env("ANTHROPIC_API_KEY"),
      chatModel: env("ANTHROPIC_MODEL", "claude-sonnet-4-20250514"),
    },
  },
});
```

Make sure your .env file has the API key:

```
ANTHROPIC_API_KEY=sk-ant-your-api-key-here
ANTHROPIC_MODEL=claude-sonnet-4-20250514
```

Create src/plugins/ai-sdk/server/src/lib/types.ts with just what we need for initialization:
```ts
/**
 * Supported Claude model names
 */
export const CHAT_MODELS = [
  "claude-sonnet-4-20250514",
  "claude-opus-4-20250514",
  "claude-3-5-sonnet-20241022",
  "claude-3-5-haiku-20241022",
  "claude-3-haiku-20240307",
] as const;

export type ChatModelName = (typeof CHAT_MODELS)[number];
export const DEFAULT_MODEL: ChatModelName = "claude-sonnet-4-20250514";

/**
 * Plugin configuration interface
 */
export interface PluginConfig {
  anthropicApiKey: string;
  chatModel?: ChatModelName;
  baseURL?: string;
}
```

We'll add more types as we need them in later phases.
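A side benefit of declaring `CHAT_MODELS` with `as const`: the same tuple can double as a runtime validator for untrusted config values. A small sketch using the constants defined above:

```typescript
// Same constants as in types.ts above.
const CHAT_MODELS = [
  "claude-sonnet-4-20250514",
  "claude-opus-4-20250514",
  "claude-3-5-sonnet-20241022",
  "claude-3-5-haiku-20241022",
  "claude-3-haiku-20240307",
] as const;

type ChatModelName = (typeof CHAT_MODELS)[number];

// Type guard: narrows an arbitrary string to ChatModelName at runtime.
// Widening to readonly string[] lets us call includes() with any string.
function isChatModel(name: string): name is ChatModelName {
  return (CHAT_MODELS as readonly string[]).includes(name);
}
```

This is the same `includes()` check the manager's `initialize()` method performs below before accepting a configured model name.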
Create src/plugins/ai-sdk/server/src/lib/init-ai-sdk.ts. We'll start with just initialization - no AI calls yet:
```ts
import { createAnthropic, type AnthropicProvider } from "@ai-sdk/anthropic";
import {
  CHAT_MODELS,
  DEFAULT_MODEL,
  type PluginConfig,
  type ChatModelName,
} from "./types";

/**
 * AISDKManager - Core class for AI SDK integration
 * We'll add AI methods in the next phase
 */
class AISDKManager {
  private provider: AnthropicProvider | null = null;
  private model: ChatModelName = DEFAULT_MODEL;

  /**
   * Initialize the manager with plugin configuration
   * Returns false if config is missing required fields
   */
  initialize(config: unknown): boolean {
    const cfg = config as Partial<PluginConfig> | undefined;

    if (!cfg?.anthropicApiKey) {
      return false;
    }

    this.provider = createAnthropic({
      apiKey: cfg.anthropicApiKey,
      baseURL: cfg.baseURL,
    });

    if (cfg.chatModel && CHAT_MODELS.includes(cfg.chatModel)) {
      this.model = cfg.chatModel;
    }

    return true;
  }

  getChatModel(): ChatModelName {
    return this.model;
  }

  isInitialized(): boolean {
    return this.provider !== null;
  }
}

export const aiSDKManager = new AISDKManager();
```

Create src/plugins/ai-sdk/server/src/register.ts. This runs when Strapi starts:
```ts
import type { Core } from "@strapi/strapi";
import { aiSDKManager } from "./lib/init-ai-sdk";

const register = ({ strapi }: { strapi: Core.Strapi }) => {
  const config = strapi.config.get("plugin::ai-sdk");
  const initialized = aiSDKManager.initialize(config);

  if (!initialized) {
    strapi.log.warn(
      "AI SDK plugin: anthropicApiKey not configured, plugin will not be initialized",
    );
    return;
  }

  strapi.log.info(
    `AI SDK plugin initialized with model: ${aiSDKManager.getChatModel()}`,
  );
};

export default register;
```

Rebuild and start Strapi:

```bash
npm run build
npm run develop
```

Check the console. You will see:

```
[2026-01-18 16:10:49.069] warn: AI SDK plugin: anthropicApiKey not configured, plugin will not be initialized
```

We need to add our credentials to our .env file:

```
ANTHROPIC_API_KEY=your_api_key
```

Now, restart and you should see the following message:

```
[2026-01-18 16:15:50.508] info: AI SDK plugin initialized with model: claude-sonnet-4-20250514
```

Now let's add our first AI endpoint.
Update src/plugins/ai-sdk/server/src/lib/init-ai-sdk.ts to add text generation capabilities.
Add to imports:
```ts
import { generateText, type LanguageModel } from "ai";
```

Add these methods to the AISDKManager class (before getChatModel()):

```ts
private getLanguageModel(): LanguageModel {
  if (!this.provider) {
    throw new Error("AI SDK Manager not initialized");
  }
  return this.provider(this.model);
}

async generateText(prompt: string, options?: { system?: string }) {
  const result = await generateText({
    model: this.getLanguageModel(),
    prompt,
    system: options?.system,
  });
  return { text: result.text };
}
```

Your file should now look like this:
```ts
import { createAnthropic, type AnthropicProvider } from "@ai-sdk/anthropic";
import { generateText, type LanguageModel } from "ai"; // ← Added
import {
  CHAT_MODELS,
  DEFAULT_MODEL,
  type PluginConfig,
  type ChatModelName,
} from "./types";

class AISDKManager {
  private provider: AnthropicProvider | null = null;
  private model: ChatModelName = DEFAULT_MODEL;

  initialize(config: unknown): boolean {
    const cfg = config as Partial<PluginConfig> | undefined;

    if (!cfg?.anthropicApiKey) {
      return false;
    }

    this.provider = createAnthropic({
      apiKey: cfg.anthropicApiKey,
      baseURL: cfg.baseURL,
    });

    if (cfg.chatModel && CHAT_MODELS.includes(cfg.chatModel)) {
      this.model = cfg.chatModel;
    }

    return true;
  }

  // ↓ New method
  private getLanguageModel(): LanguageModel {
    if (!this.provider) {
      throw new Error("AI SDK Manager not initialized");
    }
    return this.provider(this.model);
  }

  // ↓ New method
  async generateText(prompt: string, options?: { system?: string }) {
    const result = await generateText({
      model: this.getLanguageModel(),
      prompt,
      system: options?.system,
    });
    return { text: result.text };
  }

  getChatModel(): ChatModelName {
    return this.model;
  }

  isInitialized(): boolean {
    return this.provider !== null;
  }
}

export const aiSDKManager = new AISDKManager();
```

Create src/plugins/ai-sdk/server/src/services/service.ts:
```ts
import type { Core } from "@strapi/strapi";
import { aiSDKManager } from "../lib/init-ai-sdk";

const service = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(prompt: string, options?: { system?: string }) {
    const result = await aiSDKManager.generateText(prompt, options);
    return result.text;
  },

  isInitialized() {
    return aiSDKManager.isInitialized();
  },
});

export default service;
```

Create src/plugins/ai-sdk/server/src/controllers/controller.ts:
```ts
import type { Core } from "@strapi/strapi";
import type { Context } from "koa";

const controller = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(ctx: Context) {
    const { prompt, system } = (ctx.request as any).body as {
      prompt?: string;
      system?: string;
    };

    if (!prompt || typeof prompt !== "string") {
      ctx.badRequest("prompt is required and must be a string");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const result = await service.ask(prompt, { system });
    ctx.body = { data: { text: result } };
  },
});

export default controller;
```

Create src/plugins/ai-sdk/server/src/routes/content-api/index.ts:
```ts
export default {
  type: "content-api",
  routes: [
    {
      method: "POST",
      path: "/ask",
      handler: "controller.ask",
      config: { policies: [] },
    },
  ],
};
```

Create src/plugins/ai-sdk/server/src/routes/index.ts:

```ts
import contentApi from "./content-api";

export default {
  "content-api": contentApi,
};
```

Rebuild, restart, and test:
```bash
curl -X POST http://localhost:1337/api/ai-sdk/ask \
  -H "Content-Type: application/json" \
  -d '{"prompt": "What is 2 + 2? Reply with just the number."}'
```

You will get a Forbidden message:

```json
{
  "data": null,
  "error": {
    "status": 403,
    "name": "ForbiddenError",
    "message": "Forbidden",
    "details": {}
  }
}
```

We need to first enable our API in Strapi Admin: Settings → Users & Permissions → Roles → Public → Ai-sdk → ask.

Now, try again.

Expected response:

```json
{ "data": { "text": "4" } }
```

Create tests/test-ask.mjs in your Strapi project root:
```js
const API_URL = "http://localhost:1337/api/ai-sdk";

async function testAsk() {
  console.log("Testing /ask endpoint...\n");

  const response = await fetch(`${API_URL}/ask`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt: "What is Strapi and why should I use it?",
    }),
  });

  if (!response.ok) {
    console.error("Request failed:", response.status, response.statusText);
    const error = await response.text();
    console.error(error);
    process.exit(1);
  }

  const data = await response.json();
  console.log("Response:", JSON.stringify(data, null, 2));

  if (data.data?.text) {
    console.log("\n✅ Test passed!");
  } else {
    console.error("\n❌ Test failed: unexpected response format");
    process.exit(1);
  }
}

testAsk().catch(console.error);
```

Add the test script to your package.json:

```json
{
  "scripts": {
    "test:ask": "node tests/test-ask.mjs"
  }
}
```

Run the test:

```bash
npm run test:ask
```

Expected output:
```
Testing /ask endpoint...

Response: {
  "data": {
    "text": "Strapi is an open-source headless CMS..."
  }
}

✅ Test passed!
```

Now let's build a simple frontend to interact with our API.

In a separate directory (outside your Strapi project):

```bash
npx create-next-app@latest next-client
```

```
paul@dev test npx create-next-app@latest next-client
? Would you like to use the recommended Next.js defaults? › - Use arrow-keys. Return to submit.
❯   Yes, use recommended defaults
    TypeScript, ESLint, Tailwind CSS, App Router
    No, reuse previous settings
    No, customize settings

Using npm.

Initializing project with template: app-tw

Installing dependencies:
- next
- react
- react-dom

Installing devDependencies:
- @tailwindcss/postcss
- @types/node
- @types/react
- @types/react-dom
- eslint
- eslint-config-next
- tailwindcss
- typescript

added 357 packages, and audited 358 packages in 30s

142 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities
Generating route types...
✓ Types generated successfully
Initialized a git repository.

Success! Created next-client at /Users/paul/test/next-client
```

Create the lib folder and add lib/api.ts:
```ts
const API_BASE = process.env.NEXT_PUBLIC_API_URL || "http://localhost:1337/api/ai-sdk";

// Simple fetch wrapper
export async function askAI(prompt: string, options?: { system?: string }) {
  const res = await fetch(`${API_BASE}/ask`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, ...options }),
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  const data = await res.json();
  return data.data?.text as string;
}

export { API_BASE };
```

Create the hooks folder and add hooks/useAsk.ts:
```ts
"use client";

import { useState, useCallback } from "react";
import { askAI } from "@/lib/api";

export function useAsk() {
  const [response, setResponse] = useState("");
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<Error | null>(null);

  const ask = useCallback(async (prompt: string, options?: { system?: string }) => {
    setLoading(true);
    setError(null);
    setResponse("");

    try {
      const text = await askAI(prompt, options);
      setResponse(text);
      return text;
    } catch (err) {
      const error = err instanceof Error ? err : new Error(String(err));
      setError(error);
      throw error;
    } finally {
      setLoading(false);
    }
  }, []);

  const reset = useCallback(() => {
    setResponse("");
    setError(null);
  }, []);

  return { ask, response, loading, error, reset };
}
```

Create hooks/index.ts to export the hook:

```ts
export { useAsk } from "./useAsk";
```

Create the components folder and add components/AskExample.tsx:
```tsx
"use client";

import { useState, type FormEvent } from "react";
import { useAsk } from "@/hooks";

export function AskExample() {
  const [prompt, setPrompt] = useState("What is the capital of France?");
  const { ask, response, loading, error } = useAsk();

  // React exports FormEvent (not SubmitEvent) for form submissions
  const handleSubmit = async (e: FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    await ask(prompt);
  };

  return (
    <section className="bg-white dark:bg-zinc-900 rounded-lg p-6 shadow">
      <h2 className="text-xl font-semibold mb-4 text-black dark:text-white">
        /ask - Non-streaming
      </h2>
      <form onSubmit={handleSubmit} className="space-y-4">
        <input
          type="text"
          value={prompt}
          onChange={(e) => setPrompt(e.target.value)}
          className="w-full p-3 border rounded-lg dark:bg-zinc-800 dark:border-zinc-700 dark:text-white"
          placeholder="Enter your prompt..."
        />
        <button
          type="submit"
          disabled={loading}
          className="px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 disabled:opacity-50"
        >
          {loading ? "Loading..." : "Ask"}
        </button>
      </form>
      {error && (
        <div className="mt-4 p-4 bg-red-100 dark:bg-red-900 rounded-lg">
          <p className="text-red-700 dark:text-red-200">{error.message}</p>
        </div>
      )}
      {response && (
        <div className="mt-4 p-4 bg-zinc-100 dark:bg-zinc-800 rounded-lg">
          <p className="text-black dark:text-white whitespace-pre-wrap">{response}</p>
        </div>
      )}
    </section>
  );
}
```

Create components/index.ts to export the component:

```ts
export { AskExample } from "./AskExample";
```

Replace app/page.tsx:
```tsx
import { AskExample } from "@/components";

export default function Home() {
  return (
    <div className="min-h-screen bg-zinc-50 dark:bg-black p-8">
      <main className="max-w-4xl mx-auto space-y-8">
        <h1 className="text-3xl font-bold text-center text-black dark:text-white">
          AI SDK Test
        </h1>

        <div className="grid gap-8">
          <AskExample />
        </div>
      </main>
    </div>
  );
}
```

Run the dev server:

```bash
npm run dev
```

Open http://localhost:3000, type a prompt, and click Ask. You should see the AI response appear.
Now let's enhance our app with streaming responses for a better user experience. Instead of waiting for the entire response, users will see text appear token-by-token in real time.
Update src/plugins/ai-sdk/server/src/lib/init-ai-sdk.ts to add streaming capabilities.
Update the import to add streamText:
```ts
import { generateText, streamText, type LanguageModel } from "ai"; // ← Added streamText
```

Add this method to the AISDKManager class (after generateText()):

```ts
async streamText(prompt: string, options?: { system?: string }) {
  const result = streamText({
    model: this.getLanguageModel(),
    prompt,
    system: options?.system,
  });
  return { textStream: result.textStream };
}
```

Your full file should now look like this:
```ts
import { createAnthropic, type AnthropicProvider } from "@ai-sdk/anthropic";
import { generateText, streamText, type LanguageModel } from "ai"; // ← Added streamText
import {
  CHAT_MODELS,
  DEFAULT_MODEL,
  type PluginConfig,
  type ChatModelName,
} from "./types";

class AISDKManager {
  private provider: AnthropicProvider | null = null;
  private model: ChatModelName = DEFAULT_MODEL;

  initialize(config: unknown): boolean {
    const cfg = config as Partial<PluginConfig> | undefined;

    if (!cfg?.anthropicApiKey) {
      return false;
    }

    this.provider = createAnthropic({
      apiKey: cfg.anthropicApiKey,
      baseURL: cfg.baseURL,
    });

    if (cfg.chatModel && CHAT_MODELS.includes(cfg.chatModel)) {
      this.model = cfg.chatModel;
    }

    return true;
  }

  private getLanguageModel(): LanguageModel {
    if (!this.provider) {
      throw new Error("AI SDK Manager not initialized");
    }
    return this.provider(this.model);
  }

  async generateText(prompt: string, options?: { system?: string }) {
    const result = await generateText({
      model: this.getLanguageModel(),
      prompt,
      system: options?.system,
    });
    return { text: result.text };
  }

  // ↓ New method
  async streamText(prompt: string, options?: { system?: string }) {
    const result = streamText({
      model: this.getLanguageModel(),
      prompt,
      system: options?.system,
    });
    return { textStream: result.textStream };
  }

  getChatModel(): ChatModelName {
    return this.model;
  }

  isInitialized(): boolean {
    return this.provider !== null;
  }
}

export const aiSDKManager = new AISDKManager();
```

The SSE utility we're about to create uses Koa's Context type. Install the type definitions in your plugin directory:
```bash
cd src/plugins/ai-sdk
npm install --save-dev @types/koa --legacy-peer-deps
```

Note: After installing, if your editor still shows a "Cannot find module 'koa'" error, don't worry: this is just your IDE not picking up the newly installed types. Reload VS Code (Cmd+Shift+P → "Developer: Reload Window") and the error will disappear. The build itself will work fine.
We need a helper to set up Server-Sent Events (SSE) on Koa's response. Create src/plugins/ai-sdk/server/src/lib/utils.ts:
```ts
import type { Context } from "koa";
import { PassThrough } from "node:stream";

export function createSSEStream(ctx: Context): PassThrough {
  ctx.set({
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache, no-transform",
    Connection: "keep-alive",
    "X-Accel-Buffering": "no",
  });

  const stream = new PassThrough();
  ctx.body = stream;
  ctx.res.flushHeaders();

  return stream;
}

export function writeSSE(stream: PassThrough, data: unknown): void {
  stream.write(`data: ${JSON.stringify(data)}\n\n`);
}
```

Update src/plugins/ai-sdk/server/src/services/service.ts to add streaming support.
Add this method (after ask()):
```ts
async askStream(prompt: string, options?: { system?: string }) {
  const result = await aiSDKManager.streamText(prompt, options);
  return result.textStream;
},
```

Your full service file should now look like this:
```ts
import type { Core } from "@strapi/strapi";
import { aiSDKManager } from "../lib/init-ai-sdk";

const service = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(prompt: string, options?: { system?: string }) {
    const result = await aiSDKManager.generateText(prompt, options);
    return result.text;
  },

  // ↓ New method
  async askStream(prompt: string, options?: { system?: string }) {
    const result = await aiSDKManager.streamText(prompt, options);
    return result.textStream;
  },

  isInitialized() {
    return aiSDKManager.isInitialized();
  },
});

export default service;
```

Update src/plugins/ai-sdk/server/src/controllers/controller.ts.
Add the import for our SSE utilities at the top:
```ts
import { createSSEStream, writeSSE } from "../lib/utils";
```

Add this method (after ask()):
```ts
async askStream(ctx: Context) {
  const { prompt, system } = (ctx.request as any).body as { prompt?: string; system?: string };

  if (!prompt || typeof prompt !== "string") {
    ctx.badRequest("prompt is required");
    return;
  }

  const service = strapi.plugin("ai-sdk").service("service");
  if (!service.isInitialized()) {
    ctx.badRequest("AI SDK not initialized");
    return;
  }

  const textStream = await service.askStream(prompt, { system });
  const stream = createSSEStream(ctx);

  void (async () => {
    try {
      for await (const chunk of textStream) {
        writeSSE(stream, { text: chunk });
      }
      stream.write("data: [DONE]\n\n");
    } catch (error) {
      strapi.log.error("AI SDK stream error:", error);
      writeSSE(stream, { error: "Stream error" });
    } finally {
      stream.end();
    }
  })();
},
```

Your full controller file should now look like this:
```ts
import type { Core } from "@strapi/strapi";
import type { Context } from "koa";
import { createSSEStream, writeSSE } from "../lib/utils";

const controller = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(ctx: Context) {
    const { prompt, system } = (ctx.request as any).body as {
      prompt?: string;
      system?: string;
    };

    if (!prompt || typeof prompt !== "string") {
      ctx.badRequest("prompt is required and must be a string");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const result = await service.ask(prompt, { system });
    ctx.body = { data: { text: result } };
  },

  // ↓ New method
  async askStream(ctx: Context) {
    const { prompt, system } = (ctx.request as any).body as {
      prompt?: string;
      system?: string;
    };

    if (!prompt || typeof prompt !== "string") {
      ctx.badRequest("prompt is required");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const textStream = await service.askStream(prompt, { system });
    const stream = createSSEStream(ctx);

    void (async () => {
      try {
        for await (const chunk of textStream) {
          writeSSE(stream, { text: chunk });
        }
        stream.write("data: [DONE]\n\n");
      } catch (error) {
        strapi.log.error("AI SDK stream error:", error);
        writeSSE(stream, { error: "Stream error" });
      } finally {
        stream.end();
      }
    })();
  },
});

export default controller;
```

Update src/plugins/ai-sdk/server/src/routes/content-api/index.ts to add the new endpoint:
```ts
export default {
  type: "content-api",
  routes: [
    {
      method: "POST",
      path: "/ask",
      handler: "controller.ask",
      config: { policies: [] },
    },
    {
      method: "POST",
      path: "/ask-stream",
      handler: "controller.askStream",
      config: { policies: [] },
    },
  ],
};
```

Rebuild and restart your plugin, then test:

```bash
curl -X POST http://localhost:1337/api/ai-sdk/ask-stream \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Count from 1 to 5"}'
```
-d '{"prompt": "Count from 1 to 5"}'Important: Don't forget to enable the new endpoint in Strapi Admin: Settings → Users & Permissions → Roles → Public → Ai-sdk → ask-stream.
You should see SSE events streaming in:
```
data: {"text":"1"}

data: {"text":","}

data: {"text":" 2"}

...

data: [DONE]
```

Create tests/test-stream.mjs in your Strapi project root:
1const API_URL = "http://localhost:1337/api/ai-sdk";
2
3async function testStream() {
4 console.log("Testing /ask-stream endpoint...\n");
5
6 const response = await fetch(`${API_URL}/ask-stream`, {
7 method: "POST",
8 headers: { "Content-Type": "application/json" },
9 body: JSON.stringify({ prompt: "Count from 1 to 5" }),
10 });
11
12 if (!response.ok) {
13 console.error("Request failed:", response.status, response.statusText);
14 const error = await response.text();
15 console.error(error);
16 process.exit(1);
17 }
18
19 const reader = response.body.getReader();
20 const decoder = new TextDecoder();
21
22 while (true) {
23 const { done, value } = await reader.read();
24 if (done) break;
25 process.stdout.write(decoder.decode(value));
26 }
27
28 console.log("\n\n✅ Stream test passed!");
29}
30
31testStream().catch(console.error);Add the test script to your package.json:
{
  "scripts": {
    "test:ask": "node tests/test-ask.mjs",
    "test:stream": "node tests/test-stream.mjs"
  }
}

Run the test:
npm run test:stream

Now let's add streaming support to our Next.js app. We'll create a dedicated hook and component that sit alongside the existing non-streaming ones.
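Before writing the hook, note what the client has to do with every chunk it receives: split it into lines, parse each `data: ` payload as JSON, and stop at the `[DONE]` sentinel. A sketch of that core logic as a pure function (`extractTokens` is a hypothetical name for illustration, not part of the tutorial code):

```typescript
// Hypothetical helper: extract text tokens from complete SSE lines
// produced by our /ask-stream endpoint.
function extractTokens(lines: string[]): string[] {
  const tokens: string[] = [];
  for (const line of lines) {
    if (!line.startsWith("data: ")) continue; // ignore blanks and comments
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") break; // end-of-stream sentinel
    try {
      const data = JSON.parse(payload);
      if (typeof data.text === "string") tokens.push(data.text);
    } catch {
      // skip malformed JSON fragments
    }
  }
  return tokens;
}
```

Feeding it the example output above, `extractTokens(['data: {"text":"1"}', 'data: [DONE]'])` yields `["1"]` — the same parsing our React hook will perform incrementally.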
Add the askStreamAI function to lib/api.ts:

export async function askStreamAI(
  prompt: string,
  options?: { system?: string }
): Promise<Response> {
  const res = await fetch(`${API_BASE}/ask-stream`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, ...options }),
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  return res;
}

Create hooks/useAskStream.ts:

"use client";

import { useState, useCallback } from "react";
import { askStreamAI } from "@/lib/api";

export function useAskStream() {
  const [response, setResponse] = useState("");
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<Error | null>(null);

  const askStream = useCallback(
    async (prompt: string, options?: { system?: string }) => {
      setLoading(true);
      setError(null);
      setResponse("");

      try {
        const res = await askStreamAI(prompt, options);
        const reader = res.body?.getReader();
        const decoder = new TextDecoder();

        if (!reader) throw new Error("No reader available");

        while (true) {
          const { done, value } = await reader.read();
          if (done) break;

          const chunk = decoder.decode(value);
          const lines = chunk.split("\n");

          for (const line of lines) {
            if (line.startsWith("data: ") && line !== "data: [DONE]") {
              try {
                const data = JSON.parse(line.slice(6));
                if (data.text) {
                  setResponse((prev) => prev + data.text);
                }
              } catch {
                // Skip invalid JSON
              }
            }
          }
        }
      } catch (err) {
        const error = err instanceof Error ? err : new Error(String(err));
        setError(error);
        throw error;
      } finally {
        setLoading(false);
      }
    },
    []
  );

  const reset = useCallback(() => {
    setResponse("");
    setError(null);
  }, []);

  return { askStream, response, loading, error, reset };
}

Update hooks/index.ts:

export { useAsk } from "./useAsk";
export { useAskStream } from "./useAskStream";

Create components/AskStreamExample.tsx:

"use client";

import { useState, type FormEvent } from "react";
import { useAskStream } from "@/hooks";

export function AskStreamExample() {
  const [prompt, setPrompt] = useState("Count from 1 to 10 and explain each number.");
  const { askStream, response, loading, error } = useAskStream();

  const handleSubmit = async (e: FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    await askStream(prompt);
  };

  return (
    <section className="bg-white dark:bg-zinc-900 rounded-lg p-6 shadow">
      <h2 className="text-xl font-semibold mb-4 text-black dark:text-white">
        /ask-stream - SSE Streaming
      </h2>
      <form onSubmit={handleSubmit} className="space-y-4">
        <input
          type="text"
          value={prompt}
          onChange={(e) => setPrompt(e.target.value)}
          className="w-full p-3 border rounded-lg dark:bg-zinc-800 dark:border-zinc-700 dark:text-white"
          placeholder="Enter your prompt..."
        />
        <button
          type="submit"
          disabled={loading}
          className="px-4 py-2 bg-green-600 text-white rounded-lg hover:bg-green-700 disabled:opacity-50"
        >
          {loading ? "Streaming..." : "Ask (Stream)"}
        </button>
      </form>
      {error && (
        <div className="mt-4 p-4 bg-red-100 dark:bg-red-900 rounded-lg">
          <p className="text-red-700 dark:text-red-200">{error.message}</p>
        </div>
      )}
      {response && (
        <div className="mt-4 p-4 bg-zinc-100 dark:bg-zinc-800 rounded-lg">
          <p className="text-black dark:text-white whitespace-pre-wrap">{response}</p>
        </div>
      )}
    </section>
  );
}

Update components/index.ts:

export { AskExample } from "./AskExample";
export { AskStreamExample } from "./AskStreamExample";

Update app/page.tsx to include both examples:

import { AskExample, AskStreamExample } from "@/components";

export default function Home() {
  return (
    <div className="min-h-screen bg-zinc-50 dark:bg-black p-8">
      <main className="max-w-4xl mx-auto space-y-8">
        <h1 className="text-3xl font-bold text-center text-black dark:text-white">
          AI SDK Test
        </h1>

        <div className="grid gap-8">
          <AskExample />
          <AskStreamExample />
        </div>
      </main>
    </div>
  );
}

Start both your Strapi server and Next.js client:
# Terminal 1 - Strapi
cd server
npm run develop
# Terminal 2 - Next.js
cd next-client
npm run dev

Open http://localhost:3000. You should now see two sections: the original non-streaming "Ask" and the new "Ask (Stream)". Try the streaming version and watch the response appear token-by-token in real time.
Congratulations! You now have both non-streaming and streaming AI responses working end-to-end.
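One caveat worth knowing about the hook's parsing loop: it splits each decoded chunk on newlines independently, so a JSON line that straddles two network chunks could be dropped. Carrying a small buffer across chunks avoids this. A sketch (`SSELineBuffer` is a hypothetical helper, not part of the tutorial code):

```typescript
// Sketch: carry partial SSE lines across chunks so a JSON payload
// split between two network reads is never lost.
class SSELineBuffer {
  private buffer = "";

  // Feed one decoded chunk; returns only the complete lines seen so far.
  push(chunk: string): string[] {
    this.buffer += chunk;
    const lines = this.buffer.split("\n");
    this.buffer = lines.pop() ?? ""; // keep the trailing partial line
    return lines;
  }
}
```

Feeding `'data: {"te'` and then `'xt":"hi"}\n'` emits nothing on the first push and the complete line on the second, which is exactly the behavior a naive per-chunk split gets wrong.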
This final phase adds full multi-turn chat support compatible with the useChat hook from @ai-sdk/react. Unlike our previous endpoints that take a single prompt, chat maintains conversation history and streams responses using the UI Message Stream protocol.
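Unlike our hand-rolled `data: {"text": ...}` events, the UI Message Stream protocol sends typed JSON parts over SSE. The exact event vocabulary is defined by the AI SDK (consult its docs for the full protocol); the sketch below uses a simplified, illustrative subset just to show how a client could reassemble the assistant's text from such events:

```typescript
// Illustrative (simplified) UI Message Stream events — shapes are an
// assumption for demonstration, not the SDK's full protocol.
type UIStreamEvent =
  | { type: "text-start"; id: string }
  | { type: "text-delta"; id: string; delta: string }
  | { type: "text-end"; id: string }
  | { type: "finish" };

// Accumulate the assistant's text from a sequence of parsed events.
function accumulateText(events: UIStreamEvent[]): string {
  let text = "";
  for (const ev of events) {
    if (ev.type === "text-delta") text += ev.delta;
  }
  return text;
}
```

In practice the `useChat` hook does this reassembly for you; the point is only that `/chat` speaks a richer protocol than `/ask-stream`.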
We need to add types for handling chat messages. Update src/plugins/ai-sdk/server/src/lib/types.ts:
import type { ModelMessage } from "ai";

export const CHAT_MODELS = [
  "claude-sonnet-4-20250514",
  "claude-opus-4-20250514",
  "claude-3-5-sonnet-20241022",
  "claude-3-5-haiku-20241022",
  "claude-3-haiku-20240307",
] as const;

export type ChatModelName = (typeof CHAT_MODELS)[number];
export const DEFAULT_MODEL: ChatModelName = "claude-sonnet-4-20250514";
export const DEFAULT_TEMPERATURE = 0.7;

export interface PluginConfig {
  anthropicApiKey: string;
  chatModel?: ChatModelName;
  baseURL?: string;
}

export interface GenerateOptions {
  system?: string;
  temperature?: number;
  maxOutputTokens?: number;
}

export interface PromptInput extends GenerateOptions {
  prompt: string;
}

export interface MessagesInput extends GenerateOptions {
  messages: ModelMessage[];
}

export type GenerateInput = PromptInput | MessagesInput;

export function isPromptInput(input: GenerateInput): input is PromptInput {
  return "prompt" in input;
}

Update src/plugins/ai-sdk/server/src/lib/init-ai-sdk.ts to support the useChat hook.
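The GenerateInput union is what lets one code path serve both single-prompt and multi-turn calls, with the isPromptInput guard narrowing the type at the branch point. A quick standalone illustration (a self-contained copy of the pattern, not the plugin's actual types):

```typescript
// Standalone copy of the union + guard pattern, for illustration only.
interface PromptInput { prompt: string; system?: string }
interface MessagesInput { messages: { role: string; content: string }[]; system?: string }
type GenerateInput = PromptInput | MessagesInput;

function isPromptInput(input: GenerateInput): input is PromptInput {
  return "prompt" in input;
}

// One dispatch point serves both call shapes; inside each branch the
// compiler knows exactly which fields exist.
function describe(input: GenerateInput): string {
  return isPromptInput(input)
    ? `prompt: ${input.prompt}`
    : `messages: ${input.messages.length}`;
}
```

This is the same shape `buildParams` will use below: check the guard once, then spread the branch-specific field.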
Update the imports from ./types to include the new types:
import {
  CHAT_MODELS,
  DEFAULT_MODEL,
  DEFAULT_TEMPERATURE, // ← Added
  isPromptInput, // ← Added
  type PluginConfig,
  type ChatModelName,
  type GenerateInput, // ← Added
} from "./types";

Add this interface before the AISDKManager class:

export interface StreamTextRawResult {
  readonly textStream: AsyncIterable<string>;
  toUIMessageStreamResponse(): Response;
}

Add these methods to the AISDKManager class (after getLanguageModel()):
  private buildParams(input: GenerateInput) {
    const base = {
      model: this.getLanguageModel(),
      system: input.system,
      temperature: input.temperature ?? DEFAULT_TEMPERATURE,
      maxOutputTokens: input.maxOutputTokens,
    };

    return isPromptInput(input)
      ? { ...base, prompt: input.prompt }
      : { ...base, messages: input.messages };
  }

Add this method (after streamText()):

  streamRaw(input: GenerateInput): StreamTextRawResult {
    return streamText(this.buildParams(input)) as StreamTextRawResult;
  }

Your full file should now look like this:
import { createAnthropic, type AnthropicProvider } from "@ai-sdk/anthropic";
import { generateText, streamText, type LanguageModel } from "ai";
import {
  CHAT_MODELS,
  DEFAULT_MODEL,
  DEFAULT_TEMPERATURE,
  isPromptInput,
  type PluginConfig,
  type ChatModelName,
  type GenerateInput,
} from "./types";

export interface StreamTextRawResult {
  readonly textStream: AsyncIterable<string>;
  toUIMessageStreamResponse(): Response;
}

class AISDKManager {
  private provider: AnthropicProvider | null = null;
  private model: ChatModelName = DEFAULT_MODEL;

  initialize(config: unknown): boolean {
    const cfg = config as Partial<PluginConfig> | undefined;

    if (!cfg?.anthropicApiKey) {
      return false;
    }

    this.provider = createAnthropic({
      apiKey: cfg.anthropicApiKey,
      baseURL: cfg.baseURL,
    });

    if (cfg.chatModel && CHAT_MODELS.includes(cfg.chatModel)) {
      this.model = cfg.chatModel;
    }

    return true;
  }

  private getLanguageModel(): LanguageModel {
    if (!this.provider) {
      throw new Error("AI SDK Manager not initialized");
    }
    return this.provider(this.model);
  }

  // ↓ New method
  private buildParams(input: GenerateInput) {
    const base = {
      model: this.getLanguageModel(),
      system: input.system,
      temperature: input.temperature ?? DEFAULT_TEMPERATURE,
      maxOutputTokens: input.maxOutputTokens,
    };

    return isPromptInput(input)
      ? { ...base, prompt: input.prompt }
      : { ...base, messages: input.messages };
  }

  async generateText(prompt: string, options?: { system?: string }) {
    const result = await generateText({
      model: this.getLanguageModel(),
      prompt,
      system: options?.system,
    });
    return { text: result.text };
  }

  async streamText(prompt: string, options?: { system?: string }) {
    const result = streamText({
      model: this.getLanguageModel(),
      prompt,
      system: options?.system,
    });
    return { textStream: result.textStream };
  }

  // ↓ New method
  streamRaw(input: GenerateInput): StreamTextRawResult {
    return streamText(this.buildParams(input)) as StreamTextRawResult;
  }

  getChatModel(): ChatModelName {
    return this.model;
  }

  isInitialized(): boolean {
    return this.provider !== null;
  }
}

export const aiSDKManager = new AISDKManager();

Update src/plugins/ai-sdk/server/src/services/service.ts.
Update the imports:
import type { Core } from "@strapi/strapi";
import type { UIMessage } from "ai"; // ← Added
import { convertToModelMessages } from "ai"; // ← Added
import { aiSDKManager, type StreamTextRawResult } from "../lib/init-ai-sdk"; // ← Added type

Add this method (after askStream()):

  async chat(messages: UIMessage[], options?: { system?: string }): Promise<StreamTextRawResult> {
    const modelMessages = await convertToModelMessages(messages);
    return aiSDKManager.streamRaw({
      messages: modelMessages,
      system: options?.system,
    });
  },

Your full service file should now look like this:
import type { Core } from "@strapi/strapi";
import type { UIMessage } from "ai";
import { convertToModelMessages } from "ai";
import { aiSDKManager, type StreamTextRawResult } from "../lib/init-ai-sdk";

const service = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(prompt: string, options?: { system?: string }) {
    const result = await aiSDKManager.generateText(prompt, options);
    return result.text;
  },

  async askStream(prompt: string, options?: { system?: string }) {
    const result = await aiSDKManager.streamText(prompt, options);
    return result.textStream;
  },

  // ↓ New method
  async chat(messages: UIMessage[], options?: { system?: string }): Promise<StreamTextRawResult> {
    const modelMessages = await convertToModelMessages(messages);
    return aiSDKManager.streamRaw({
      messages: modelMessages,
      system: options?.system,
    });
  },

  isInitialized() {
    return aiSDKManager.isInitialized();
  },
});

export default service;

Update src/plugins/ai-sdk/server/src/controllers/controller.ts.
Add the import for Readable at the top:
import { Readable } from "node:stream";

Add this method (after askStream()):
  async chat(ctx: Context) {
    const { messages, system } = (ctx.request as any).body as {
      messages?: any[];
      system?: string;
    };

    if (!messages || !Array.isArray(messages) || messages.length === 0) {
      ctx.badRequest("messages is required and must be a non-empty array");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const result = await service.chat(messages, { system });
    const response = result.toUIMessageStreamResponse();

    ctx.status = 200;
    ctx.set("Content-Type", "text/event-stream; charset=utf-8");
    ctx.set("Cache-Control", "no-cache, no-transform");
    ctx.set("Connection", "keep-alive");
    ctx.set("X-Accel-Buffering", "no");
    ctx.set("x-vercel-ai-ui-message-stream", "v1");

    ctx.body = Readable.fromWeb(
      response.body as import("stream/web").ReadableStream
    );
  },

Your full controller file should now look like this:
import type { Core } from "@strapi/strapi";
import type { Context } from "koa";
import { Readable } from "node:stream";
import { createSSEStream, writeSSE } from "../lib/utils";

const controller = ({ strapi }: { strapi: Core.Strapi }) => ({
  async ask(ctx: Context) {
    const { prompt, system } = (ctx.request as any).body as {
      prompt?: string;
      system?: string;
    };

    if (!prompt || typeof prompt !== "string") {
      ctx.badRequest("prompt is required and must be a string");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const result = await service.ask(prompt, { system });
    ctx.body = { data: { text: result } };
  },

  async askStream(ctx: Context) {
    const { prompt, system } = (ctx.request as any).body as {
      prompt?: string;
      system?: string;
    };

    if (!prompt || typeof prompt !== "string") {
      ctx.badRequest("prompt is required");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const textStream = await service.askStream(prompt, { system });
    const stream = createSSEStream(ctx);

    void (async () => {
      try {
        for await (const chunk of textStream) {
          writeSSE(stream, { text: chunk });
        }
        stream.write("data: [DONE]\n\n");
      } catch (error) {
        strapi.log.error("AI SDK stream error:", error);
        writeSSE(stream, { error: "Stream error" });
      } finally {
        stream.end();
      }
    })();
  },

  // ↓ New method
  async chat(ctx: Context) {
    const { messages, system } = (ctx.request as any).body as {
      messages?: any[];
      system?: string;
    };

    if (!messages || !Array.isArray(messages) || messages.length === 0) {
      ctx.badRequest("messages is required and must be a non-empty array");
      return;
    }

    const service = strapi.plugin("ai-sdk").service("service");
    if (!service.isInitialized()) {
      ctx.badRequest("AI SDK not initialized");
      return;
    }

    const result = await service.chat(messages, { system });
    const response = result.toUIMessageStreamResponse();

    ctx.status = 200;
    ctx.set("Content-Type", "text/event-stream; charset=utf-8");
    ctx.set("Cache-Control", "no-cache, no-transform");
    ctx.set("Connection", "keep-alive");
    ctx.set("X-Accel-Buffering", "no");
    ctx.set("x-vercel-ai-ui-message-stream", "v1");

    ctx.body = Readable.fromWeb(
      response.body as import("stream/web").ReadableStream
    );
  },
});

export default controller;

Update src/plugins/ai-sdk/server/src/routes/content-api/index.ts to add the chat endpoint:
export default {
  type: "content-api",
  routes: [
    {
      method: "POST",
      path: "/ask",
      handler: "controller.ask",
      config: { policies: [] },
    },
    {
      method: "POST",
      path: "/ask-stream",
      handler: "controller.askStream",
      config: { policies: [] },
    },
    {
      method: "POST",
      path: "/chat",
      handler: "controller.chat",
      config: { policies: [] },
    },
  ],
};

Rebuild your plugin, restart Strapi, and test:
curl -X POST http://localhost:1337/api/ai-sdk/chat \
-H "Content-Type: application/json" \
-d '{
"messages": [{
"id": "1",
"role": "user",
"parts": [{"type": "text", "text": "Hello!"}]
}]
}'

Important: Don't forget to enable the new endpoint in Strapi Admin: Settings → Users & Permissions → Roles → Public → Ai-sdk → chat.
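Before testing, it helps to see what convertToModelMessages does with the payload above: it flattens each UIMessage's parts into the plain role/content shape the model expects. A simplified, text-only sketch of that transformation (the real AI SDK function also handles other part types such as tool calls and files):

```typescript
// Simplified, text-parts-only sketch of what convertToModelMessages does.
// Illustrative only — not the AI SDK's actual implementation.
interface TextPart { type: "text"; text: string }
interface SimpleUIMessage {
  id: string;
  role: "user" | "assistant" | "system";
  parts: TextPart[];
}

function toModelMessages(messages: SimpleUIMessage[]) {
  return messages.map((m) => ({
    role: m.role,
    // Concatenate the text parts; ids are dropped because the model
    // only sees role/content pairs.
    content: m.parts.filter((p) => p.type === "text").map((p) => p.text).join(""),
  }));
}
```

Applied to the curl payload above, the single user message with one text part becomes `{ role: "user", content: "Hello!" }`.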
Create tests/test-chat.mjs in your Strapi project root:
/**
 * Test chat endpoint with UIMessage format
 *
 * In AI SDK v6, the useChat hook sends UIMessage format:
 * - id: unique message identifier
 * - role: "user" | "assistant" | "system"
 * - parts: array of content parts (text, tool calls, etc.)
 */
const API_URL = "http://localhost:1337/api/ai-sdk";

async function testChat() {
  console.log("Testing /chat endpoint...\n");

  const response = await fetch(`${API_URL}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [
        {
          id: "msg-1",
          role: "user",
          parts: [{ type: "text", text: "Hello! What is Strapi?" }],
        },
      ],
    }),
  });

  if (!response.ok) {
    console.error("Request failed:", response.status, response.statusText);
    const error = await response.text();
    console.error(error);
    process.exit(1);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value));
  }

  console.log("\n\n✅ Chat test passed!");
}

testChat().catch(console.error);

Add the test script to your package.json:
{
  "scripts": {
    "test:ask": "node tests/test-ask.mjs",
    "test:stream": "node tests/test-stream.mjs",
    "test:chat": "node tests/test-chat.mjs"
  }
}

Run the test:
npm run test:chat

Now let's add a full chat interface using the useChat hook from @ai-sdk/react.
cd next-client
npm install @ai-sdk/react

Create components/ChatExample.tsx:

"use client";

import { useState, type FormEvent } from "react";
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";

const transport = new DefaultChatTransport({
  api: "http://localhost:1337/api/ai-sdk/chat",
});

export function ChatExample() {
  const [input, setInput] = useState("");
  const { messages, sendMessage, status } = useChat({ transport });

  const isLoading = status === "submitted" || status === "streaming";

  const handleSubmit = async (e: FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;

    const text = input;
    setInput("");
    await sendMessage({ text });
  };

  return (
    <section className="bg-white dark:bg-zinc-900 rounded-lg p-6 shadow">
      <h2 className="text-xl font-semibold mb-4 text-black dark:text-white">
        /chat - Multi-turn Chat
      </h2>

      <div className="space-y-3 mb-4 max-h-96 overflow-y-auto">
        {messages.map((message) => (
          <div
            key={message.id}
            className={`p-3 rounded-lg ${
              message.role === "user"
                ? "bg-blue-100 dark:bg-blue-900 ml-auto max-w-[80%]"
                : "bg-zinc-100 dark:bg-zinc-800 mr-auto max-w-[80%]"
            }`}
          >
            <p className="text-xs font-semibold mb-1 text-zinc-500 dark:text-zinc-400">
              {message.role === "user" ? "You" : "AI"}
            </p>
            <p className="text-black dark:text-white whitespace-pre-wrap">
              {message.parts
                .filter((part) => part.type === "text")
                .map((part) => part.text)
                .join("")}
            </p>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type a message..."
          className="flex-1 p-3 border rounded-lg dark:bg-zinc-800 dark:border-zinc-700 dark:text-white"
          disabled={isLoading}
        />
        <button
          type="submit"
          disabled={isLoading || !input.trim()}
          className="px-4 py-2 bg-purple-600 text-white rounded-lg hover:bg-purple-700 disabled:opacity-50"
        >
          {isLoading ? "..." : "Send"}
        </button>
      </form>
    </section>
  );
}

Note: In AI SDK v6,
useChat no longer manages input state internally. You manage the input with useState yourself and use sendMessage({ text }) instead of the old handleSubmit. The status property ('ready' | 'submitted' | 'streaming' | 'error') replaces isLoading.
Update components/index.ts:

export { AskExample } from "./AskExample";
export { AskStreamExample } from "./AskStreamExample";
export { ChatExample } from "./ChatExample";

Update app/page.tsx to include all three examples:

import { AskExample, AskStreamExample, ChatExample } from "@/components";

export default function Home() {
  return (
    <div className="min-h-screen bg-zinc-50 dark:bg-black p-8">
      <main className="max-w-4xl mx-auto space-y-8">
        <h1 className="text-3xl font-bold text-center text-black dark:text-white">
          AI SDK Test
        </h1>

        <div className="grid gap-8">
          <AskExample />
          <AskStreamExample />
          <ChatExample />
        </div>
      </main>
    </div>
  );
}

Start both your Strapi server and Next.js client:
# Terminal 1 - Strapi
cd server
npm run develop
# Terminal 2 - Next.js
cd next-client
npm run dev

Open http://localhost:3000. You should now see all three sections: Ask (non-streaming), Ask (Stream), and Chat.
Try sending multiple messages in the Chat section - the conversation history is maintained, so the AI remembers context from previous messages.
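This works because each request to /chat carries the entire history: the client keeps appending messages and resends them all, and the backend sees the full conversation every time. A sketch of how that history grows turn by turn (appendTurn is a hypothetical helper for illustration; useChat maintains this array for you):

```typescript
// Illustrative only: how a UIMessage history grows across turns.
interface ChatPart { type: "text"; text: string }
interface ChatMsg { id: string; role: "user" | "assistant"; parts: ChatPart[] }

// Append one completed turn (user question + assistant reply) to the history.
function appendTurn(history: ChatMsg[], userText: string, assistantText: string): ChatMsg[] {
  const n = history.length;
  return [
    ...history,
    { id: `msg-${n + 1}`, role: "user", parts: [{ type: "text", text: userText }] },
    { id: `msg-${n + 2}`, role: "assistant", parts: [{ type: "text", text: assistantText }] },
  ];
}
```

After two turns the history holds four messages with alternating roles, and all four are sent with the third question — which is exactly why the model "remembers" earlier context.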
Nice—we did it! Great job. So what’s next? I’d love to see the AI integrations you build based on what you’ve learned.
As for me, I might have gone a bit overboard. I kept building and ended up adding AI chat directly into the Strapi Admin, with the ability to call tools and expose everything via MCP. It’s still a work in progress, but it’s coming together nicely.
Project repo: https://github.com/PaulBratslavsky/fun-strapi-ai-plugin
Did you know? Strapi hosts Open Office Hours on Discord Monday through Friday at 12:30 PM CST (running until about 1:30 PM CST). It's a great way to connect, get help, and share your feedback directly with the team.