How to Integrate the AI SDK for Embeddings on Vercel
Integrate the AI SDK for embeddings on Vercel by installing the @ai-sdk/openai package, configuring your API key as an environment variable, and creating an API route that generates embeddings with the embed function. Then deploy the project to Vercel with the same environment variables configured.
Prerequisites
- Basic knowledge of JavaScript/TypeScript
- Vercel account and project setup
- OpenAI or compatible AI provider API key
- Understanding of vector embeddings concepts
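If vector embeddings are new to you, the key idea is that each text maps to an array of floats, and texts with similar meaning map to vectors pointing in similar directions. A minimal sketch of comparing two vectors with cosine similarity (the ai package also exports a cosineSimilarity helper you can use instead of rolling your own):

```ts
// Embeddings are arrays of floats; semantically similar texts produce
// vectors that point in similar directions. Cosine similarity measures
// the angle between two vectors, ignoring their magnitudes.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors pointing the same way score 1; orthogonal vectors score 0.
cosineSimilarity([1, 2, 3], [2, 4, 6]); // → 1
cosineSimilarity([1, 0], [0, 1]);       // → 0
```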
Step-by-Step Instructions
Install AI SDK Dependencies
```bash
npm install ai @ai-sdk/openai
```
This installs the core AI SDK and the OpenAI provider for generating embeddings.
Configure Environment Variables
Create a .env.local file in your project root and add your API key:
```
OPENAI_API_KEY=your_openai_api_key_here
```
Make sure to add .env.local to your .gitignore file so the key never lands in version control.
Create Embedding API Route
Create app/api/embeddings/route.ts (for the App Router) or pages/api/embeddings.ts (for the Pages Router):
```ts
import { embed } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { text } = await req.json();

  const { embedding } = await embed({
    model: openai.embedding('text-embedding-3-small'),
    value: text,
  });

  return Response.json({ embedding });
}
```
This creates an endpoint that accepts text and returns its embedding.
Create Client-Side Integration
```ts
const generateEmbedding = async (text: string) => {
  const response = await fetch('/api/embeddings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });

  const { embedding } = await response.json();
  return embedding;
};
```
This function sends text to your API route and receives the embedding vector.
Configure Vercel Environment Variables
In your Vercel project settings, add the environment variable:
- Key: OPENAI_API_KEY
- Value: your actual API key
- Environment: Production, Preview, and Development
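The OpenAI provider reads OPENAI_API_KEY from the environment automatically, and a missing or misnamed variable is one of the most common deployment failures. A small fail-fast guard can surface the problem at startup instead of at request time (a sketch; requireEnv is a hypothetical helper, not part of the SDK):

```ts
// Hypothetical helper: fail fast if a required variable is missing,
// so a misconfigured deployment errors at startup with a clear message.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// The AI SDK's OpenAI provider picks up OPENAI_API_KEY on its own;
// this call (commented out here) just makes the failure mode explicit:
// const apiKey = requireEnv('OPENAI_API_KEY');
```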
Test Embeddings Locally
Start your development server:
```bash
npm run dev
```
Test your embedding endpoint with a tool like curl or from your frontend:
```bash
curl -X POST http://localhost:3000/api/embeddings \
  -H "Content-Type: application/json" \
  -d '{"text":"Hello, world!"}'
```
Verify that you receive a numerical array representing the embedding.
Deploy to Vercel
```bash
vercel --prod
```
Or commit and push your changes if you use Git integration; Vercel will automatically build and deploy your project with the embeddings functionality.
Implement Vector Storage (Optional)
```bash
npm install @pinecone-database/pinecone
```
Modify your API route to store embeddings. A minimal sketch using the Pinecone client (the PINECONE_API_KEY variable and the 'embeddings' index name are example assumptions):
```ts
import { Pinecone } from '@pinecone-database/pinecone';

const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
const index = pinecone.index('embeddings'); // example index name

// After generating the embedding
await index.upsert([
  { id: crypto.randomUUID(), values: embedding, metadata: { text } },
]);
```
This enables similarity search and retrieval over your stored vectors.
Common Issues & Troubleshooting
API key not found or authentication failed
Verify that OPENAI_API_KEY is correctly set in Vercel environment variables and redeploy your application. Check that the API key has sufficient credits and permissions.
Embedding API returns 500 internal server error
Check the Vercel function logs in your dashboard. Common issues include incorrect model names, malformed requests, or rate limiting. Ensure your embed function call syntax matches the AI SDK documentation.
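To make 500s easier to diagnose, validate the input and log before returning. A sketch of a more defensive handler, with the AI SDK embed call stubbed out as a hypothetical createEmbedding function so the error paths stand on their own:

```ts
// Stand-in for the AI SDK call; in the real route this would be:
//   const { embedding } = await embed({ model: openai.embedding('text-embedding-3-small'), value: text });
async function createEmbedding(text: string): Promise<number[]> {
  return [0, 0, 0]; // placeholder vector
}

export async function POST(req: Request) {
  try {
    const { text } = await req.json();
    // Reject bad input with a 400 instead of letting it surface as a 500.
    if (typeof text !== 'string' || text.length === 0) {
      return Response.json({ error: 'text must be a non-empty string' }, { status: 400 });
    }
    const embedding = await createEmbedding(text);
    return Response.json({ embedding });
  } catch (err) {
    console.error('embedding route failed:', err); // shows up in Vercel function logs
    return Response.json({ error: 'Internal error' }, { status: 500 });
  }
}
```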
Embeddings request times out
Large texts may exceed Vercel's serverless function timeout. Split long texts into smaller chunks before processing, or consider upgrading to Vercel Pro for longer execution times.
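A minimal chunking sketch (the 1000-character default is an arbitrary example; real limits are token-based and model-specific). The resulting chunks can then be embedded in a single request with the AI SDK's embedMany function:

```ts
// Split long text into word-bounded chunks no longer than maxChars.
// maxChars = 1000 is an arbitrary example; tune it for your model's limits.
function chunkText(text: string, maxChars = 1000): string[] {
  const words = text.split(/\s+/);
  const chunks: string[] = [];
  let current = '';
  for (const word of words) {
    // Start a new chunk when appending this word would exceed the limit.
    if (current && current.length + word.length + 1 > maxChars) {
      chunks.push(current);
      current = word;
    } else {
      current = current ? `${current} ${word}` : word;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```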
CORS errors when calling embedding API from frontend
Same-origin calls from your own Next.js frontend do not trigger CORS, so they need no extra configuration. If the endpoint is called from a different origin, return the appropriate headers from your route (for example 'Access-Control-Allow-Origin': '*' and 'Access-Control-Allow-Methods': 'POST') and handle the OPTIONS preflight request.
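When you do need cross-origin access, a preflight handler sketch for the App Router route (the wildcard origin is an example; restrict it to known origins in production):

```ts
// Example CORS headers; '*' allows any origin and should be
// narrowed to your real frontend origin(s) in production.
const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type',
};

// Browsers send an OPTIONS preflight before cross-origin POSTs;
// answer it with the CORS headers and no body.
export async function OPTIONS() {
  return new Response(null, { status: 204, headers: corsHeaders });
}
```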