
What are Instrumentations?

Glass is built on OpenTelemetry, the industry standard for observability. OpenTelemetry uses instrumentations to automatically capture traces from libraries and frameworks, without requiring any code changes. When you call init(), Glass automatically instruments your AI providers: every API call is traced with request/response data, token usage, latency, and errors.

Default Instrumentations

Glass automatically instruments these AI providers out of the box:
Provider               Package               What's Traced
OpenAI                 openai                Chat completions, embeddings, images, audio
Anthropic              anthropic             Messages, completions
Google Generative AI   google-generativeai   Gemini generate, chat, embeddings
Default instrumentations are enabled automatically. No configuration needed.
from glass import init
from openai import OpenAI

init(api_key="your-api-key")

# This call is automatically traced
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
Each traced call captures:
  • Request and response payloads
  • Token usage (input/output/total)
  • Model name and parameters
  • Latency
  • Errors and exceptions
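To make the captured data concrete, here is an illustrative sketch of the attributes a traced chat completion span might carry. The attribute names follow the OpenTelemetry GenAI semantic conventions (gen_ai.*); the exact keys and values emitted depend on the instrumentation, so treat this as an assumption, not Glass's guaranteed output.

```python
# Illustrative only: rough span attributes for one chat completion,
# using OpenTelemetry GenAI semantic convention names. The exact keys
# depend on the instrumentation version.
span_attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4",
    "gen_ai.usage.input_tokens": 9,    # tokens in the prompt
    "gen_ai.usage.output_tokens": 12,  # tokens in the completion
    "gen_ai.response.model": "gpt-4-0613",
}

# Total token usage is derivable from the input/output counts.
total_tokens = (span_attributes["gen_ai.usage.input_tokens"]
                + span_attributes["gen_ai.usage.output_tokens"])
```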

Adding Custom Instrumentations

Need to trace other libraries? You can add any OpenTelemetry-compatible instrumentation.

Example: Adding HTTP Request Tracing

from glass import init
from opentelemetry.instrumentation.requests import RequestsInstrumentor

init(
    api_key="your-api-key",
    instrumentations=[RequestsInstrumentor()]
)
Now all requests library calls are traced alongside your AI calls.

Example: Multiple Custom Instrumentations

from glass import init
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor

init(
    api_key="your-api-key",
    instrumentations=[
        RequestsInstrumentor(),
        HTTPXClientInstrumentor(),
        SQLAlchemyInstrumentor(),
    ]
)
Custom instrumentations are added on top of the defaults. Your AI providers are still traced automatically.
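Anything with the standard OpenTelemetry instrumentor interface can be passed here. Real instrumentations subclass BaseInstrumentor from the opentelemetry-instrumentation package, which provides instrument() and uninstrument(); the dependency-free stand-in below only sketches that shape, on the assumption that Glass activates each entry by calling its instrument() method, as OpenTelemetry instrumentors are conventionally activated.

```python
# A minimal, dependency-free sketch of the instrumentor interface.
# Real instrumentations subclass
# opentelemetry.instrumentation.instrumentor.BaseInstrumentor; this
# stand-in is for illustration only.
class MyLibraryInstrumentor:
    def __init__(self):
        self._instrumented = False

    def instrument(self, **kwargs):
        # A real instrumentor would patch the target library here so
        # that its calls start and end spans.
        self._instrumented = True

    def uninstrument(self, **kwargs):
        # A real instrumentor would undo the patching here.
        self._instrumented = False


instrumentor = MyLibraryInstrumentor()
instrumentor.instrument()
```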

Disabling Default Instrumentations

If you want full control over what gets instrumented, disable the defaults:
from glass import init
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

init(
    api_key="your-api-key",
    skip_default_instrumentations=True,
    instrumentations=[
        OpenAIInstrumentor(),  # Only instrument OpenAI
    ]
)
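Conceptually, the selection behavior described above combines two inputs: the built-in defaults (dropped when skip_default_instrumentations=True) and whatever you pass in instrumentations. The function below is a hypothetical sketch of that logic for illustration; it is not Glass's actual implementation.

```python
# Hypothetical sketch of the selection logic described above -- not
# Glass's actual code. Defaults are included unless skipped; custom
# instrumentations are always appended on top.
DEFAULTS = ["openai", "anthropic", "google-generativeai"]

def resolve_instrumentations(custom, skip_defaults=False):
    """Return the instrumentations init() would activate."""
    selected = [] if skip_defaults else list(DEFAULTS)
    selected.extend(custom)
    return selected
```

With skip_defaults=False you get the three defaults plus your custom entries; with skip_defaults=True, only what you passed explicitly.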

Popular Instrumentations

Here are commonly used instrumentations for AI/ML workflows:
Library      Instrumentation Package                     Install
LangChain    opentelemetry-instrumentation-langchain     pip install opentelemetry-instrumentation-langchain
LlamaIndex   opentelemetry-instrumentation-llamaindex    pip install opentelemetry-instrumentation-llamaindex
Cohere       opentelemetry-instrumentation-cohere        pip install opentelemetry-instrumentation-cohere
Bedrock      opentelemetry-instrumentation-bedrock       pip install opentelemetry-instrumentation-bedrock
Replicate    opentelemetry-instrumentation-replicate     pip install opentelemetry-instrumentation-replicate
Pinecone     opentelemetry-instrumentation-pinecone      pip install opentelemetry-instrumentation-pinecone
Chroma       opentelemetry-instrumentation-chromadb      pip install opentelemetry-instrumentation-chromadb
Weaviate     opentelemetry-instrumentation-weaviate      pip install opentelemetry-instrumentation-weaviate
Qdrant       opentelemetry-instrumentation-qdrant        pip install opentelemetry-instrumentation-qdrant

OpenTelemetry Registry

Browse all available Python instrumentations in the OpenTelemetry Registry.