## Installation

Install the package with `npm install glass-ai`. Requires Node.js 18 or higher.

## Quick Start
```typescript
import { init, trace, interaction } from 'glass-ai';
import OpenAI from 'openai';

// Initialize Glass
init({ apiKey: 'your-glass-api-key' });

// Your AI calls are now automatically traced
const client = new OpenAI();

const generateResponse = trace(async function generateResponse(prompt: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: prompt }],
  });
  return response.choices[0].message.content!;
});

// Track user interactions
await interaction({ userId: 'user_123', sessionId: 'sess_abc' }, async (ctx) => {
  const result = await generateResponse('What is the meaning of life?');
  ctx.finish({ output: { response: result } });
});
```
That’s it. All your OpenAI, Anthropic, and Google Generative AI calls are now traced automatically.
## API Reference

### init()
Initializes the Glass SDK. Call this once at application startup.
```typescript
import { init } from 'glass-ai';

// Basic initialization
init({ apiKey: 'your-api-key' });

// With debug mode (logs traces to console)
init({ apiKey: 'your-api-key', debug: true });
```
Options:

| Option | Description |
|---|---|
| `apiKey` | Your Glass API key. Falls back to the `GLASS_API_KEY` environment variable if not provided. |
| `debug` | Enable console output for local development. |
| `skipDefaultInstrumentations` | Skip the default auto-instrumentations. |
### trace()
Wraps any function with tracing. Automatically records arguments, return values, and exceptions.
```typescript
import { trace } from 'glass-ai';

// Basic usage
const processData = trace(function processData(data: object): object {
  return { processed: true, ...data };
});

// Custom span name
const myFunction = trace({ name: 'custom-operation' }, function myFunction() {
  // ...
});

// With attributes
const createEmbedding = trace(
  { attributes: { model: 'text-embedding-3-small' } },
  function createEmbedding(text: string): number[] {
    return [0.1, 0.2, 0.3];
  }
);
```
Works with async functions too:
```typescript
const asyncProcess = trace(async function asyncProcess(data: string): Promise<string> {
  await someAsyncOperation();
  return `processed: ${data}`;
});
```
Options:

| Option | Description |
|---|---|
| `name` | Custom span name. Defaults to the function name. |
| `attributes` | Additional attributes to attach to the span. |
### interaction()
Tracks a user interaction. Propagates user context to all nested traces.
```typescript
import { interaction, trace } from 'glass-ai';

const callLlm = trace(async function callLlm(prompt: string): Promise<string> {
  return 'LLM response';
});

await interaction(
  { userId: 'user_123', sessionId: 'sess_abc', project: 'conversational-chat', input: 'Hello!' },
  async (ctx) => {
    const result = await callLlm('Hello!');
    ctx.finish({ output: { response: result } });
  }
);
```
Options:

| Option | Description |
|---|---|
| `project` | The project under which to classify traces. |
|  | Service name for routing. |
Methods on the context:
| Method | Description |
|---|---|
| `finish(output)` | Record the final output of the interaction |
| `setAttribute(key, value)` | Set a custom attribute |
| `recordException(exception)` | Record an exception |
### taskSpan()
Creates a task span with explicit input/output recording.
```typescript
import { taskSpan } from 'glass-ai';

await taskSpan('embedding-task', { attributes: { model: 'ada-002' } }, async (task) => {
  task.recordInput({ text: 'Hello, world!' });
  const embedding = await computeEmbedding('Hello, world!');
  task.recordOutput({ embedding, dimensions: 1536 });
});
```
Parameters:

| Parameter | Description |
|---|---|
| `name` | The name of the task span. |
| `attributes` | Additional attributes for the span. |
Methods on the task:
| Method | Description |
|---|---|
| `recordInput(data)` | Record input data |
| `recordOutput(data)` | Record output data |
| `setAttribute(key, value)` | Set a custom attribute |
| `recordException(exception)` | Record an exception |
## Full Example: RAG Pipeline
Here’s how the primitives compose together:
```typescript
import { init, trace, interaction, taskSpan } from 'glass-ai';
import OpenAI from 'openai';

init({ apiKey: 'your-api-key' });

const client = new OpenAI();

const retrieveContext = trace(async function retrieveContext(query: string): Promise<string[]> {
  return ['context 1', 'context 2'];
});

const generateResponse = trace(async function generateResponse(
  query: string,
  context: string[]
): Promise<string> {
  const response = await client.chat.completions.create({
    model: 'gpt-4',
    messages: [
      { role: 'system', content: `Context: ${context}` },
      { role: 'user', content: query },
    ],
  });
  return response.choices[0].message.content!;
});

const ragQuery = trace({ name: 'rag-pipeline' }, async function ragQuery(query: string): Promise<string> {
  let context: string[] = [];
  await taskSpan('retrieval', {}, async (task) => {
    task.recordInput({ query });
    context = await retrieveContext(query);
    task.recordOutput({ numDocs: context.length });
  });
  return generateResponse(query, context);
});

// Track the full user interaction
await interaction({ userId: 'user_123', input: 'What is quantum computing?' }, async (ctx) => {
  const result = await ragQuery('What is quantum computing?');
  ctx.finish({ output: { answer: result } });
});
```
This creates a trace hierarchy like:
```
interaction (userId=user_123)
└── rag-pipeline
    ├── retrieval (taskSpan)
    │   └── retrieveContext
    └── generateResponse
        └── OpenAI chat.completions.create (auto)
```
## Environment Variables
| Variable | Description |
|---|---|
| `GLASS_API_KEY` | Your Glass API key (alternative to passing in code) |
```bash
export GLASS_API_KEY="your-api-key"
```

```typescript
import { init } from 'glass-ai';

// API key is read from environment
init({});
```