JavaScript/TypeScript SDK

A dedicated SolidRusT SDK is not yet available; in the meantime, use the OpenAI JavaScript SDK with our OpenAI-compatible API.

```sh
npm install openai
# or
yarn add openai
# or
pnpm add openai
```
Create a client pointed at the SolidRusT endpoint, then send a chat completion:

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'YOUR_API_KEY',
  baseURL: 'https://api.solidrust.ai/v1',
});

const response = await client.chat.completions.create({
  model: 'vllm-primary',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is JavaScript?' }
  ],
});

console.log(response.choices[0].message.content);
```
Set `stream: true` to receive tokens as they are generated:

```typescript
const stream = await client.chat.completions.create({
  model: 'vllm-primary',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
Embeddings use the same client:

```typescript
const response = await client.embeddings.create({
  model: 'bge-m3',
  input: 'Text to embed',
});

const embedding = response.data[0].embedding;
console.log(`Embedding dimension: ${embedding.length}`);
```
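Embeddings are typically compared with cosine similarity. A minimal sketch of such a comparison; `cosineSimilarity` is a local helper written here for illustration, not part of the SDK:

```typescript
// Cosine similarity between two embedding vectors of equal length:
// dot(a, b) / (|a| * |b|). Returns a value in [-1, 1]; closer to 1
// means more similar.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

You would call this on two `response.data[n].embedding` arrays, e.g. to rank documents against a query embedding.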
The SDK throws typed errors you can catch and branch on:

```typescript
import OpenAI from 'openai';

// `client` is configured as shown above.
try {
  const response = await client.chat.completions.create({
    model: 'vllm-primary',
    messages: [{ role: 'user', content: 'Hello' }],
  });
} catch (error) {
  if (error instanceof OpenAI.AuthenticationError) {
    console.log('Invalid API key');
  } else if (error instanceof OpenAI.RateLimitError) {
    console.log('Rate limited - implement backoff');
  } else if (error instanceof OpenAI.APIError) {
    console.log(`API error: ${error.message}`);
  }
}
```
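The rate-limit branch suggests implementing backoff. A minimal sketch of an exponential-backoff wrapper; `withBackoff` is a local helper written here for illustration (it assumes rate-limit errors carry `status === 429`, which the OpenAI SDK's errors do):

```typescript
// Retry an async operation with exponential backoff on HTTP 429.
// Delays grow as baseDelayMs, 2*baseDelayMs, 4*baseDelayMs, ...
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  baseDelayMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error: any) {
      // Rethrow anything that is not a rate limit, or once retries are exhausted.
      if (error?.status !== 429 || attempt >= maxRetries) throw error;
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Usage: `const response = await withBackoff(() => client.chat.completions.create({ model: 'vllm-primary', messages }));`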

Full TypeScript support with proper types:

```typescript
import OpenAI from 'openai';
import type { ChatCompletion, ChatCompletionMessageParam } from 'openai/resources/chat';

const messages: ChatCompletionMessageParam[] = [
  { role: 'user', content: 'Hello!' }
];

const response: ChatCompletion = await client.chat.completions.create({
  model: 'vllm-primary',
  messages,
});
```

The OpenAI SDK works in browsers, but never expose your API key in client-side code.

Use a backend proxy instead:

```typescript
// Backend (Node.js) - assumes an Express app with JSON body parsing,
// and `client` configured with your API key as shown above.
app.post('/api/chat', async (req, res) => {
  const response = await client.chat.completions.create({
    model: 'vllm-primary',
    messages: req.body.messages,
  });
  res.json(response);
});
```

```typescript
// Frontend - the API key never leaves the server.
const response = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ messages }),
});
const data = await response.json();
```
In production, read credentials from environment variables rather than hard-coding them:

```typescript
const client = new OpenAI({
  apiKey: process.env.SOLIDRUST_API_KEY,
  baseURL: process.env.SOLIDRUST_BASE_URL || 'https://api.solidrust.ai/v1',
});
```