Get Your Nordlys API Key
Sign up here to create an account and generate your API key.
Overview
The Vercel AI SDK works seamlessly with Nordlys through two methods:
Nordlys SDK (Recommended): Use the native @nordlys-llm/nordlys-ai-provider package.
OpenAI-Compatible Client: Use Nordlys via @ai-sdk/openai with a custom base URL.
Method 1: Nordlys SDK
Installation
npm install ai @nordlys-llm/nordlys-ai-provider
Basic Setup
import { nordlys } from "@nordlys-llm/nordlys-ai-provider";

// Use the default configuration
const model = nordlys();
Method 2: OpenAI-Compatible Client
Installation
npm install ai @ai-sdk/openai
Configuration
JavaScript/Node.js
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Point the OpenAI-compatible client at the Nordlys endpoint
const nordlysOpenAI = createOpenAI({
  apiKey: process.env.NORDLYS_API_KEY,
  baseURL: 'https://api.nordlylabs.com/v1',
});

const { text } = await generateText({
  model: nordlysOpenAI('nordlys/hypernova'),
  prompt: 'Explain quantum computing simply',
});
Text Generation
Nordlys Provider
import { generateText } from 'ai';
import { nordlys } from '@nordlys-llm/nordlys-ai-provider';

const { text } = await generateText({
  model: nordlys(),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
Streaming
Nordlys Provider
import { streamText } from 'ai';
import { nordlys } from '@nordlys-llm/nordlys-ai-provider';

const { textStream } = await streamText({
  model: nordlys(),
  prompt: 'Explain machine learning step by step',
});

// Print tokens as they arrive
for await (const delta of textStream) {
  process.stdout.write(delta);
}
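Streaming works the same way through the OpenAI-compatible client. A minimal sketch, assuming the `createOpenAI` setup from Method 2 and the `nordlys/hypernova` model ID shown there:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Same Nordlys endpoint as in Method 2
const nordlysOpenAI = createOpenAI({
  apiKey: process.env.NORDLYS_API_KEY,
  baseURL: 'https://api.nordlylabs.com/v1',
});

const { textStream } = await streamText({
  model: nordlysOpenAI('nordlys/hypernova'),
  prompt: 'Explain machine learning step by step',
});

// Print tokens as they arrive
for await (const delta of textStream) {
  process.stdout.write(delta);
}
```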
React Chat Component
useChat Hook
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat', // Your API route using Nordlys
  });

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          <strong>{m.role}: </strong>
          {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
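The hook above posts to `/api/chat`, so you also need a matching server route. A minimal sketch of a Next.js App Router handler using the Nordlys provider, assuming a recent AI SDK version (the file path `app/api/chat/route.ts` is a conventional choice, not mandated by Nordlys):

```typescript
// app/api/chat/route.ts (assumed path)
import { streamText } from 'ai';
import { nordlys } from '@nordlys-llm/nordlys-ai-provider';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: nordlys(),
    messages,
  });

  // Stream the response back in the format useChat expects
  return result.toDataStreamResponse();
}
```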
Configuration
Nordlys keeps advanced tuning internal. Use a Nordlys model ID and focus on your product logic.
Tool Calling
Nordlys Provider
import { generateText, tool } from 'ai';
import { z } from 'zod';
import { nordlys } from '@nordlys-llm/nordlys-ai-provider';

const { text } = await generateText({
  model: nordlys(),
  prompt: "What's the weather in New York?",
  tools: {
    getWeather: tool({
      description: 'Get weather for a location',
      parameters: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => {
        return `Weather in ${location} is sunny and 72°F`;
      },
    }),
  },
});
Environment Variables
# .env.local
NORDLYS_API_KEY=your-nordlys-api-key
Error Handling
Nordlys retries transient errors automatically. For comprehensive error-handling patterns, see the Error Handling Guide.
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, error, reload } = useChat({
    api: '/api/chat',
    onError: (error) => {
      console.error('Nordlys chat error:', error);
    },
  });

  return (
    <div>
      {/* Your chat UI */}
      {error && (
        <div className="error">
          {error.message}
          <button onClick={() => reload()}>Retry</button>
        </div>
      )}
    </div>
  );
}
What You Get
Mixture of Models Activation: the right specialists activate for each prompt.
Built-in Streaming: real-time response streaming with React components.
Cost Optimization: significant cost savings through lab-tuned models.
Response Metadata: inspect latency and quality signals per request.
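To inspect metadata per request, you can read the standard fields the AI SDK returns alongside the text. A sketch, assuming the Nordlys provider populates the SDK's usual `usage` and `response` fields (latency here is simply measured client-side):

```typescript
import { generateText } from 'ai';
import { nordlys } from '@nordlys-llm/nordlys-ai-provider';

const start = Date.now();
const { text, usage, response } = await generateText({
  model: nordlys(),
  prompt: 'Summarize the benefits of streaming.',
});

// Client-side latency plus token usage from the standard result object
console.log('latency ms:', Date.now() - start);
console.log('total tokens:', usage.totalTokens);
console.log('model id:', response.modelId);
```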
Next Steps