AI Gateway wraps AI SDK v6 providers into a single model factory. Switch between Vercel AI Gateway, LLM Gateway, and OpenRouter without rewriting your chat logic.
Features
- Three Gateway Providers: Vercel AI Gateway, LLM Gateway, and OpenRouter
- Unified API: One `createGatewayModel` call for every provider
- Type-Safe: Works with the AI SDK `LanguageModel` type
- Flexible Authentication: Environment variables or runtime API keys
- Streaming Ready: Pairs with `useGatewayChat` for text streaming
- Error Handling: Gateway-specific error classes
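The unified call shape those bullets describe can be sketched as a plain options type. The field names below are inferred from the examples later on this page, so treat the exact shape as an assumption:

```typescript
// Hypothetical options shape for createGatewayModel, inferred from this page's examples.
type GatewayModelOptions = {
  provider?: "vercel" | "llmgateway" | "openrouter"; // omit to auto-detect from env vars
  modelId: string; // e.g. "openai/gpt-4o-mini"
  apiKey?: string; // runtime key instead of an environment variable
};

// Switching gateways is a data change, not a rewrite of your chat logic:
const configs: GatewayModelOptions[] = [
  { provider: "vercel", modelId: "anthropic/claude-sonnet-4" },
  { provider: "openrouter", modelId: "openai/gpt-4o" },
];
```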
Installation
```bash
bunx --bun shadcn@latest add https://sidekick.montek.dev/r/gateway.json
```

This installs:

- `lib/gateway.ts` (model factory + helpers)
- `hooks/use-gateway-chat.ts` (AI SDK chat hook wrapper)
Quick Start
1. Set environment variables
```bash
# Vercel AI Gateway
VERCEL_AI_GATEWAY_API_KEY=your_key_here
# or
AI_GATEWAY_API_KEY=your_key_here

# LLM Gateway
LLM_GATEWAY_API_KEY=llmgtwy_xxxxx

# OpenRouter
OPENROUTER_API_KEY=sk-or-v1-xxxxx
OPENROUTER_SITE_URL=https://your-site.com # Optional
OPENROUTER_SITE_NAME=Your Site Name # Optional
```

2. Create an API route
`useGatewayChat` expects a text stream response.
```ts
// app/api/chat/route.ts
import { streamText } from "ai";
import { createGatewayModel } from "@/lib/gateway";

export async function POST(request: Request) {
  const { messages, gateway } = await request.json();

  const model = createGatewayModel(gateway);

  const result = streamText({
    model,
    messages,
  });

  return result.toTextStreamResponse();
}
```

3. Use with Sidekick
`useGatewayChat` exposes `sendMessage` and a streaming `status` that you can wire into `PromptInput`.
```tsx
import {
  Conversation,
  ConversationContent,
  Message,
  MessageContent,
  Sidekick,
  SidekickContent,
  SidekickFooter,
  SidekickHeader,
} from "@/components/sidekick";
import {
  PromptInput,
  PromptInputBody,
  PromptInputFooter,
  PromptInputSubmit,
  PromptInputTextarea,
  PromptInputTools,
} from "@/components/ui/prompt-input";
import { useGatewayChat } from "@/hooks/use-gateway-chat";

function getMessageText(message: {
  parts?: Array<{ type: string; text?: string }>;
}) {
  return (
    message.parts
      ?.filter((part) => part.type === "text")
      .map((part) => part.text ?? "")
      .join("") ?? ""
  );
}

export function SidekickChat() {
  const { messages, sendMessage, status } = useGatewayChat({
    gateway: { modelId: "openai/gpt-4o-mini" },
  });

  const handleSubmit = async ({ text }: { text?: string }) => {
    if (!text?.trim()) return;
    await sendMessage({ text });
  };

  return (
    <Sidekick standalone>
      <SidekickHeader>Support</SidekickHeader>
      <SidekickContent>
        <Conversation>
          <ConversationContent>
            {messages.map((msg) => (
              <Message
                key={msg.id}
                from={msg.role === "assistant" ? "assistant" : "user"}
              >
                <MessageContent
                  from={msg.role === "assistant" ? "assistant" : "user"}
                >
                  {getMessageText(msg)}
                </MessageContent>
              </Message>
            ))}
          </ConversationContent>
        </Conversation>
      </SidekickContent>
      <SidekickFooter>
        <PromptInput onSubmit={handleSubmit}>
          <PromptInputBody>
            <PromptInputTextarea placeholder="Ask a question..." />
          </PromptInputBody>
          <PromptInputFooter>
            <PromptInputTools />
            <PromptInputSubmit status={status} />
          </PromptInputFooter>
        </PromptInput>
      </SidekickFooter>
    </Sidekick>
  );
}
```

If you need attachments, read `message.files` from `PromptInput` and pass them to your API route.
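One way to forward attachments is to fold the files into the JSON body you POST to the route. The `PromptFile` shape below is a hypothetical sketch, not PromptInput's actual type; check the message shape in your installed component:

```typescript
// Hypothetical attachment shape; verify against PromptInput's actual message type.
type PromptFile = { filename?: string; mediaType: string; url: string };

// Build the JSON body for the /api/chat route, forwarding attachments as file parts.
function buildChatBody(
  text: string,
  files: PromptFile[] | undefined,
  gateway: { modelId: string },
) {
  return {
    gateway,
    messages: [
      {
        role: "user" as const,
        parts: [
          { type: "text" as const, text },
          ...(files ?? []).map((file) => ({ type: "file" as const, ...file })),
        ],
      },
    ],
  };
}
```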
Gateway Providers
All providers are optional. The gateway auto-detects which provider to use based on available environment variables.
Vercel AI Gateway
```ts
const model = createGatewayModel({
  provider: "vercel",
  modelId: "anthropic/claude-sonnet-4",
});
```

Uses `VERCEL_AI_GATEWAY_API_KEY` or `AI_GATEWAY_API_KEY`.
LLM Gateway
```ts
const model = createGatewayModel({
  provider: "llmgateway",
  modelId: "openai/gpt-4o",
});
```

Requires `LLM_GATEWAY_API_KEY`. (The LLM Gateway provider reads the key from the environment.)
OpenRouter
```ts
const model = createGatewayModel({
  provider: "openrouter",
  modelId: "openai/gpt-4o",
});
```

Requires `OPENROUTER_API_KEY`. Use `OPENROUTER_SITE_URL` and `OPENROUTER_SITE_NAME` to set ranking headers.
Auto-Detect Priority
When `provider` is not specified, the gateway checks, in order:

1. Vercel AI Gateway (`VERCEL_AI_GATEWAY_API_KEY` or `AI_GATEWAY_API_KEY`)
2. LLM Gateway (`LLM_GATEWAY_API_KEY`)
3. OpenRouter (`OPENROUTER_API_KEY`)
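That priority can be sketched as a small standalone helper (a hypothetical function for illustration; the real detection lives inside `lib/gateway.ts`):

```typescript
type GatewayProvider = "vercel" | "llmgateway" | "openrouter";

// Check env keys in the documented order: Vercel AI Gateway, then LLM Gateway, then OpenRouter.
function detectProvider(env: Record<string, string | undefined>): GatewayProvider {
  if (env.VERCEL_AI_GATEWAY_API_KEY || env.AI_GATEWAY_API_KEY) return "vercel";
  if (env.LLM_GATEWAY_API_KEY) return "llmgateway";
  if (env.OPENROUTER_API_KEY) return "openrouter";
  throw new Error("No gateway provider configured");
}
```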
Error Handling
```ts
import { generateText } from "ai";
import { createGatewayModel, GatewayError } from "@/lib/gateway";

try {
  const model = createGatewayModel({ modelId: "openai/gpt-4o" });
  const result = await generateText({ model, prompt: "Hello" });
} catch (error) {
  if (error instanceof GatewayError) {
    console.error(`Gateway Error [${error.provider}]:`, error.message);
  }
}
```

Troubleshooting
"No gateway provider configured"
Make sure you have at least one of these environment variables set:
- `VERCEL_AI_GATEWAY_API_KEY` or `AI_GATEWAY_API_KEY`
- `LLM_GATEWAY_API_KEY`
- `OPENROUTER_API_KEY`
"Vercel AI Gateway requires VERCEL_AI_GATEWAY_API_KEY"
Set `VERCEL_AI_GATEWAY_API_KEY` or pass an `apiKey` parameter.
"LLM Gateway requires LLM_GATEWAY_API_KEY"
Set `LLM_GATEWAY_API_KEY` in your environment.
"OpenRouter requires OPENROUTER_API_KEY"
Set `OPENROUTER_API_KEY` in your environment or pass an `apiKey` parameter.