When Using Vercel AI SDK with Remix, Include Zod
When using Vercel AI SDK with Remix, you may encounter a “z8.string(…).base64 is not a function” error. This occurs because Vercel AI SDK depends on the zod library for type validation, which needs to be installed explicitly. Simply run “npm install zod” to resolve the issue and enable seamless integration of AI capabilities into your Remix app.
While working on a recent project combining Remix and Vercel AI SDK, I hit an error even though the code seemed simple. The message z8.string(...).base64 is not a function puzzled me for a moment, but the fix turned out to be simple. In this article, I'll explain the dependencies you need and how to resolve this error when using Vercel AI SDK in a Remix environment.
Introduction: The Power of Remix + Vercel AI SDK
The combination of Remix (a framework created by the React Router development team) and Vercel AI SDK (which simplifies integration with large language models) is incredibly powerful for building interactive web applications that leverage AI. You can take advantage of Remix’s server-side capabilities and client-side navigation while easily connecting with AI models through Vercel AI SDK.
One feature that greatly enhances user experience is displaying AI responses through streaming responses. However, you may encounter unexpected errors when combining these technologies.
Prerequisites and Environment Setup
This article assumes the following environment:
- Remix v2 series (@remix-run/node)
- Vercel AI SDK v3 series
- Node.js 18 or higher
First, install the necessary packages:
```sh
npm install ai @ai-sdk/anthropic
```
Error Details and Cause
I implemented the following code:

```ts
import { getAuth } from '@clerk/remix/ssr.server';
import type { ActionFunctionArgs } from '@remix-run/node';
import { streamText } from 'ai';
import { createAnthropic } from '@ai-sdk/anthropic';

export async function action(args: ActionFunctionArgs) {
  const { CLAUDE_API_KEY } = args.context.cloudflare.env;
  const { userId } = await getAuth(args);
  console.log(userId);

  const model = createAnthropic({
    apiKey: CLAUDE_API_KEY,
  })('claude-3-5-sonnet-20241022');

  const result = streamText({
    model,
    messages: [
      {
        role: 'user',
        content: 'Hello, how are you?',
      },
    ],
  });

  return result.toDataStreamResponse();
}
```
However, when executing this code, I encountered the following error:
```
13:14:26 [vite] Internal server error: z8.string(...).base64 is not a function
```
While it’s difficult to intuitively understand the cause from this error message, it turns out that Vercel AI SDK depends on a type validation library called zod, and this dependency wasn’t explicitly installed.
Solution: Install Zod
The solution is very simple. Just install the zod package:
```sh
npm install zod
```
After running this command, the same code will execute without errors.
Why Does This Error Occur?
Vercel AI SDK uses zod internally for type validation; specifically, zod is used when processing streaming responses and validating messages. Package managers like npm or yarn install regular dependencies automatically, but packages declared as peerDependencies are not always installed for you (behavior varies by package manager and version), so they may need to be installed explicitly.
In Vercel AI SDK, zod is treated as a peerDependency, which is why you need to install it yourself.
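To make this concrete, a peerDependency declaration in a package manifest looks roughly like the hypothetical excerpt below (the actual version range shipped by the ai package may differ; you can check the real values with `npm info ai peerDependencies`):

```json
{
  "name": "ai",
  "peerDependencies": {
    "zod": "^3.0.0"
  }
}
```

A peerDependency expresses "my consumer must provide this package", which is why your own project has to list zod in its dependencies.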
Complete Implementation Example
Here’s a complete implementation example with zod added. This example uses a Remix application with Cloudflare Workers, Clerk authentication, and Anthropic’s API:
```ts
import { getAuth } from '@clerk/remix/ssr.server';
import type { ActionFunctionArgs } from '@remix-run/node';
import { streamText } from 'ai';
import { createAnthropic } from '@ai-sdk/anthropic';
import { z } from 'zod'; // Import zod

// Input data validation schema
const inputSchema = z.object({
  message: z.string().min(1).max(1000),
});

export async function action(args: ActionFunctionArgs) {
  const { CLAUDE_API_KEY } = args.context.cloudflare.env;
  const { userId } = await getAuth(args);

  // Check for unauthenticated users
  if (!userId) {
    return new Response('Unauthorized', { status: 401 });
  }

  // Get and validate form data
  const formData = await args.request.formData();
  const userMessage = formData.get('message')?.toString() || '';

  try {
    // Validate input data
    const { message } = inputSchema.parse({ message: userMessage });

    // Initialize Anthropic model
    const model = createAnthropic({
      apiKey: CLAUDE_API_KEY,
    })('claude-3-5-sonnet-20241022');

    // Create streaming response
    const result = streamText({
      model,
      messages: [
        {
          role: 'user',
          content: message,
        },
      ],
    });

    return result.toDataStreamResponse();
  } catch (error) {
    console.error('Error processing AI request:', error);
    return new Response('Error processing request', { status: 500 });
  }
}
```
On the frontend side, you can receive and display the streaming response like this:
```tsx
import { Form, useActionData } from '@remix-run/react';
import { useEffect, useState } from 'react';

export default function AIChat() {
  const actionData = useActionData<typeof action>();
  const [message, setMessage] = useState('');
  const [streamedText, setStreamedText] = useState('');

  // Process streaming data
  useEffect(() => {
    if (actionData?.stream) {
      const reader = actionData.stream.getReader();
      const readStream = async () => {
        let done = false;
        let accumulatedText = '';
        while (!done) {
          const { value, done: doneReading } = await reader.read();
          done = doneReading;
          if (value) {
            const text = new TextDecoder().decode(value);
            accumulatedText += text;
            setStreamedText(accumulatedText);
          }
        }
      };
      readStream();
    }
  }, [actionData]);

  return (
    <div className="p-4 max-w-3xl mx-auto">
      <h1 className="text-2xl font-bold mb-4">AI Chat</h1>
      <Form method="post" className="mb-4">
        <div className="flex gap-2">
          <input
            type="text"
            name="message"
            value={message}
            onChange={(e) => setMessage(e.target.value)}
            className="flex-1 p-2 border rounded"
            placeholder="Enter message..."
          />
          <button
            type="submit"
            className="px-4 py-2 bg-blue-500 text-white rounded"
          >
            Send
          </button>
        </div>
      </Form>
      {streamedText && (
        <div className="p-4 border rounded bg-gray-50">
          <h2 className="font-bold mb-2">AI Response:</h2>
          <div className="whitespace-pre-wrap">{streamedText}</div>
        </div>
      )}
    </div>
  );
}
```
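The decode-and-accumulate loop inside the useEffect can be exercised in isolation. Here is a minimal sketch using only the standard web Streams API, where the hand-built stream stands in for the AI response body:

```typescript
// Build a small stream of encoded chunks to stand in for the AI response.
const encoder = new TextEncoder();
const stream = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const chunk of ['Hello, ', 'world!']) {
      controller.enqueue(encoder.encode(chunk));
    }
    controller.close();
  },
});

// Accumulate decoded text, the same pattern as the component's loop.
async function readAll(s: ReadableStream<Uint8Array>): Promise<string> {
  const reader = s.getReader();
  const decoder = new TextDecoder();
  let text = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // stream: true handles multi-byte characters split across chunks
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

const finalText = await readAll(stream);
console.log(finalText); // "Hello, world!"
```

Passing `{ stream: true }` to `TextDecoder.decode` is a small improvement over decoding each chunk independently: it avoids corrupting multi-byte UTF-8 characters that happen to straddle a chunk boundary.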
Summary and Future Development
By combining Remix and Vercel AI SDK, you can efficiently develop interactive AI applications. The error “z8.string(…).base64 is not a function” was resolved simply by installing zod, but such error messages can be difficult to intuitively understand.
When you can't identify the cause from an error message, it's important to check the library's dependencies; the library's GitHub Issues can also be a helpful resource.
For future development, consider the following:
- Optimize Remix loading: Improve loading states while AI responses are being generated
- Enhance error handling: Implement mechanisms to properly handle errors from AI models
- Switch between multiple models: Implement functionality that allows users to select different AI models
- Manage message history: Add functionality to save chat history and maintain context
Small issues like this frequently occur during development, but understanding the cause and addressing it appropriately leads to more robust application development. Combine Remix’s powerful server-side capabilities with Vercel AI SDK’s flexibility to develop fantastic AI applications.