Get started with Nordlys by changing one line of code. No complex setup required.
Nordlys is a Mixture-of-Models system: each prompt is automatically routed to the right models under the hood.
Step 1: Get Your API Key
Generate Key
Generate your API key from the dashboard
Step 2: Install SDK (Optional)
JavaScript/Node.js
Python
cURL
No installation required - cURL is available on most systems.
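If you do use an SDK, the packages behind the examples below can be installed in one step. This is only a sketch of the typical install commands; install just the libraries you actually plan to use.

# JavaScript/Node.js (OpenAI, Anthropic, Gemini, Vercel AI SDK, LangChain)
npm install openai @anthropic-ai/sdk @google/genai ai @ai-sdk/openai @langchain/openai

# Python
pip install openai anthropic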
Step 3: Make Your First Request
Choose your preferred language and framework:
OpenAI SDK
Anthropic SDK
Gemini SDK
Vercel AI SDK
LangChain
JavaScript/Node.js
Python
cURL
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'your-nordlys-api-key',
  baseURL: 'https://api.nordlylabs.com/v1'
});

const response = await client.chat.completions.create({
  model: 'nordlys/hypernova',
  messages: [{ role: 'user', content: 'Hello!' }]
});

console.log(response.choices[0].message.content);
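If you prefer to skip the SDK entirely, the same request can be sent with cURL. This sketch assumes the standard OpenAI-compatible chat completions path under the base URL shown above:

curl https://api.nordlylabs.com/v1/chat/completions \
  -H "Authorization: Bearer your-nordlys-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "nordlys/hypernova",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'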
JavaScript/Node.js
Python
cURL
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: 'your-nordlys-api-key',
  baseURL: 'https://api.nordlylabs.com/v1'
});

const response = await client.messages.create({
  model: 'nordlys/hypernova',
  max_tokens: 1000,
  messages: [{ role: 'user', content: 'Hello!' }]
});

console.log(response.content[0].text);
JavaScript/Node.js
Python
cURL
import { GoogleGenAI } from '@google/genai';

// The Google Gen AI SDK, pointed at the Nordlys base URL
const genAI = new GoogleGenAI({
  apiKey: process.env.NORDLYS_API_KEY || 'your-nordlys-api-key',
  httpOptions: {
    baseUrl: 'https://api.nordlylabs.com/v1beta'
  }
});

const response = await genAI.models.generateContent({
  model: 'nordlys/hypernova',
  contents: [
    {
      role: 'user',
      parts: [{ text: 'Hello!' }]
    }
  ],
  config: {
    maxOutputTokens: 512
  }
});

console.log(response.text);
Basic Text Generation
Streaming
React Components
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Configure an OpenAI-compatible provider pointed at Nordlys
const nordlys = createOpenAI({
  baseURL: 'https://api.nordlylabs.com/v1',
  apiKey: 'your-nordlys-api-key'
});

const { text } = await generateText({
  model: nordlys('nordlys/hypernova'),
  prompt: 'Hello!'
});

console.log(text);
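Streaming works the same way through the AI SDK. A minimal sketch using the streamText helper, reusing the provider configuration from the example above:

import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const nordlys = createOpenAI({
  baseURL: 'https://api.nordlylabs.com/v1',
  apiKey: 'your-nordlys-api-key'
});

// Print tokens to stdout as they arrive
const result = streamText({
  model: nordlys('nordlys/hypernova'),
  prompt: 'Hello!'
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}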
JavaScript/Node.js
Python
Chains
import { ChatOpenAI } from '@langchain/openai';

const model = new ChatOpenAI({
  openAIApiKey: 'your-nordlys-api-key',
  configuration: {
    baseURL: 'https://api.nordlylabs.com/v1'
  },
  modelName: 'nordlys/hypernova'
});

const response = await model.invoke('Hello!');
console.log(response.content);
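For the Chains tab, the model composes with LangChain's runnable interface like any other chat model. A minimal sketch, assuming the same ChatOpenAI configuration as above:

import { ChatOpenAI } from '@langchain/openai';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { StringOutputParser } from '@langchain/core/output_parsers';

const model = new ChatOpenAI({
  openAIApiKey: 'your-nordlys-api-key',
  configuration: {
    baseURL: 'https://api.nordlylabs.com/v1'
  },
  modelName: 'nordlys/hypernova'
});

// Prompt template -> model -> plain string output
const chain = ChatPromptTemplate
  .fromTemplate('Summarize in one sentence: {input}')
  .pipe(model)
  .pipe(new StringOutputParser());

console.log(await chain.invoke({ input: 'Nordlys routes each prompt to the right model.' }));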
Error Handling
Always implement proper error handling in production. Nordlys provides detailed error information to help you build resilient applications.
TypeScript
Python
JavaScript (Browser)
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.NORDLYS_API_KEY,
  baseURL: 'https://api.nordlylabs.com/v1'
});

async function chatWithRetry(message: string, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      const response = await client.chat.completions.create({
        model: 'nordlys/hypernova',
        messages: [{ role: 'user', content: message }]
      });
      return response.choices[0].message.content;
    } catch (error: any) {
      console.error(`Attempt ${attempt} failed:`, error.message);
      if (attempt === maxRetries) throw error;
      // Exponential backoff before the next attempt
      await new Promise(resolve =>
        setTimeout(resolve, Math.pow(2, attempt) * 1000)
      );
    }
  }
}

// Usage
try {
  const result = await chatWithRetry('Explain quantum computing');
  console.log(result);
} catch (error) {
  console.error('All retries failed:', error);
  // Implement your preferred recovery behavior (fallback message, etc.)
}
Production Tip: Always log the request_id from error responses for debugging. For comprehensive error handling patterns, see the Error Handling Best Practices guide.
Key Features
Nordlys Model: Use nordlys/hypernova for the default lab-grade model experience
Cost Savings: Save 60-90% on AI costs with automatic model routing
6+ Providers: Access OpenAI, Anthropic, Google, Groq, DeepSeek, and Grok
Drop-in Replacement: Works with existing OpenAI and Anthropic SDK code
Example Response
OpenAI Format
Anthropic Format
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "gpt-5-nano",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "Hello! I'm ready to help you."
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 5,
    "completion_tokens": 10,
    "total_tokens": 15
  }
}
Nordlys returns standard OpenAI or Anthropic-compatible responses.
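Because responses follow these standard formats, token usage can be read straight off the OpenAI-style response for cost tracking. For example, with the response object from the OpenAI SDK example in Step 3:

// Token counts come back in the usage object
const { prompt_tokens, completion_tokens, total_tokens } = response.usage;
console.log(`${total_tokens} tokens (${prompt_tokens} prompt + ${completion_tokens} completion)`);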
Testing Your Integration
Send Test Request
Run your code with a simple message like “Hello!” to verify the connection
Check Response
Confirm you receive a response and check the model field to see which underlying model handled the request, as in the snippet below
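With the response object from the OpenAI SDK example in Step 3, for instance:

// The model field reports which underlying model handled the request
console.log(response.model); // e.g. "gpt-5-nano"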
Next Steps
Need Help?