Summary
When calling gemini-3-pro-preview through the OpenRouter provider, the returned providerMetadata object includes fields with the literal value undefined (e.g., cost: undefined). This violates the JSONValue type expected by the Vercel AI SDK.
Reasoning models require passing the prior providerMetadata back into subsequent turns. When doing so, the AI SDK performs Zod validation and throws an AI_InvalidPromptError, because undefined is not valid JSON and is not allowed in the schema.
Manually sanitizing the metadata (e.g., with a deep prune or JSON.parse(JSON.stringify(...))) works around the issue.
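A minimal sketch of that workaround (pruneUndefined is just an illustrative helper here, not part of either SDK; a plain JSON round-trip achieves the same effect):

```js
// Illustrative helper: recursively drop object keys whose value is undefined,
// so the metadata only contains valid JSONValue entries.
function pruneUndefined(value) {
  if (Array.isArray(value)) {
    return value.map(pruneUndefined);
  }
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value)
        .filter(([, v]) => v !== undefined)
        .map(([key, v]) => [key, pruneUndefined(v)])
    );
  }
  return value;
}

// Quick-and-dirty alternative: JSON.stringify silently drops undefined-valued keys.
// const sanitized = JSON.parse(JSON.stringify(providerMetadata));

// providerMetadata here is the object returned by generateText in the repro below.
const sanitized = pruneUndefined(providerMetadata);
```

Passing sanitized (instead of the raw providerMetadata) into the follow-up generateText call avoids the validation error.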
Reproduction
Code
```js
import { openrouter } from "@openrouter/ai-sdk-provider";
import { generateText } from "ai";

const model = openrouter("google/gemini-3-pro-preview");

const messages = [
  { role: "user", content: "Hi" }
];

const { text, providerMetadata } = await generateText({
  model,
  messages,
});

console.log(text);
console.dir(providerMetadata, { depth: null });

const { text: text2 } = await generateText({
  model,
  messages: [
    ...messages,
    { role: "assistant", content: text, providerOptions: providerMetadata },
    { role: "user", content: "wassup?" },
  ],
});

console.log(text2);
```
Output
```
Hello! How can I help you today?
{
  openrouter: {
    provider: 'Google AI Studio',
    reasoning_details: [ ...<truncated>... ],
    usage: {
      promptTokens: 2,
      promptTokensDetails: { cachedTokens: 0 },
      completionTokens: 233,
      completionTokensDetails: { reasoningTokens: 224 },
      totalTokens: 235,
      cost: undefined,   <-- ❌ violates schema
      costDetails: { upstreamInferenceCost: 0 }
    }
  }
}

file:///Users/ramnique/work/ai-sdk-undefined/node_modules/ai/dist/index.mjs:1393
      throw new InvalidPromptError2({
            ^

InvalidPromptError [AI_InvalidPromptError]: Invalid prompt: The messages must be a ModelMessage[]. If you have passed a UIMessage[], you can use convertToModelMessages to convert them.
    at standardizePrompt (file:///Users/ramnique/work/ai-sdk-undefined/node_modules/ai/dist/index.mjs:1393:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async generateText (file:///Users/ramnique/work/ai-sdk-undefined/node_modules/ai/dist/index.mjs:2180:25)
    at async file:///Users/ramnique/work/ai-sdk-undefined/test.js:16:25 {
  cause: _TypeValidationError [AI_TypeValidationError]: Type validation failed: Value: ..<truncated>
```
See the full log here: https://gist.github.com/ramnique/29a67913be644e9465010607284a5696
Expected Behavior
- providerMetadata should contain only valid JSONValue types:

```ts
providerMetadata?: Record<string, Record<string, JSONValue>> | undefined;
```

The relevant definitions:
- https://github.com/vercel/ai/blob/ai%405.0.104/packages/ai/src/types/provider-metadata.ts
- https://github.com/vercel/ai/blob/ai%405.0.104/packages/ai/src/types/json-value.ts

undefined is not permitted in the above schema.
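For illustration (not taken from the repro above): the key is still present on the in-memory object, but undefined has no JSON representation, which is both why the JSONValue schema rejects it and why the JSON round-trip workaround strips it:

```js
const usage = { totalTokens: 235, cost: undefined };

// The key exists on the in-memory object...
console.log("cost" in usage); // true

// ...but undefined cannot be represented in JSON, so a round-trip drops the key.
console.log(JSON.parse(JSON.stringify(usage))); // { totalTokens: 235 }
```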
Other info
```
$ node --version
v22.16.0
```

```json
"dependencies": {
  "@openrouter/ai-sdk-provider": "^1.2.8",
  "ai": "^5.0.104"
}
```