pref0 + Vercel AI SDK

Add preference learning to your Vercel AI SDK app. pref0 works alongside streamText and generateText to personalize every response.

Quick start

typescript
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

const PREF0_API = "https://api.pref0.com";
const PREF0_KEY = process.env.PREF0_API_KEY!;

// Fetch the user's learned preference profile from pref0 and format it
// for the system prompt. Returns an empty string if the profile is missing.
async function getPreferences(userId: string) {
  const res = await fetch(`${PREF0_API}/v1/profiles/${userId}`, {
    headers: { Authorization: `Bearer ${PREF0_KEY}` },
  });
  if (!res.ok) return "";

  const { preferences = [] } = await res.json();
  return preferences
    .filter((p: any) => p.confidence >= 0.5) // keep only preferences pref0 is reasonably confident about
    .map((p: any) => `- ${p.key}: ${p.value}`)
    .join("\n");
}

export async function POST(req: Request) {
  const { messages, userId } = await req.json();
  const learned = await getPreferences(userId);

  const result = streamText({
    model: openai("gpt-4o"),
    // Prepend whatever pref0 has learned; skip the block until preferences exist.
    system: learned
      ? `You are a helpful assistant.\n\nLearned preferences:\n${learned}`
      : "You are a helpful assistant.",
    messages,
  });

  return result.toDataStreamResponse();
}
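
The route reads userId from the request body. A minimal client sketch, assuming the AI SDK's React bindings (useChat from ai/react) and that userId comes from your own auth layer:

tsx
"use client";

import { useChat } from "ai/react";

// Hypothetical chat component. Passing userId in `body` sends it with every
// request, so the route above can load that user's pref0 profile.
export function Chat({ userId }: { userId: string }) {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat",
    body: { userId },
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Say something" />
    </form>
  );
}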

Why use pref0 with Vercel AI SDK

Works with streamText

Inject learned preferences into the system prompt before streaming. No changes to your streaming setup.

Edge-ready

Loading a profile is one small HTTP request before generation starts, so it adds little latency. Works in the Edge Runtime and in serverless functions.
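
If you deploy on the Next.js App Router, the quick-start route can opt into the Edge Runtime as-is. A sketch; the file path is just an example:

typescript
// app/api/chat/route.ts (example path)
// fetch() and streamText both run on the edge, so the handler body is unchanged.
export const runtime = "edge";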

Framework-native

Fits naturally into Next.js API routes, server actions, and the Vercel AI SDK's patterns.
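
For example, the same helper works inside a server action with generateText for one-shot calls. A sketch, assuming getPreferences is exported from a shared module (the @/lib/pref0 path is hypothetical):

typescript
"use server";

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { getPreferences } from "@/lib/pref0"; // hypothetical shared module holding the quick-start helper

// One-shot generation outside a route handler, personalized the same way.
export async function summarize(userId: string, text: string) {
  const learned = await getPreferences(userId);

  const { text: summary } = await generateText({
    model: openai("gpt-4o"),
    system: `You are a helpful assistant.\n\nLearned preferences:\n${learned}`,
    prompt: `Summarize this for me:\n\n${text}`,
  });

  return summary;
}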

Any model provider

Use pref0 with OpenAI, Anthropic, Google, or any provider supported by the Vercel AI SDK.
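
Because the preference block is plain text in the system prompt, switching providers only changes the model line. A sketch with Anthropic (the model id is just an example, and @/lib/pref0 is a hypothetical shared module):

typescript
import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
// or: import { google } from "@ai-sdk/google";
import { getPreferences } from "@/lib/pref0";

export async function POST(req: Request) {
  const { messages, userId } = await req.json();
  const learned = await getPreferences(userId);

  const result = streamText({
    model: anthropic("claude-3-5-sonnet-latest"),
    system: `You are a helpful assistant.\n\nLearned preferences:\n${learned}`,
    messages,
  });

  return result.toDataStreamResponse();
}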

Add preference learning to Vercel AI SDK

Your users are already teaching your agent what they want. pref0 makes sure the lesson sticks.