Plugins

For usage examples, see Usage - Plugins.

trackOpenAI / track_openai

Wraps an OpenAI client to automatically trace API calls.

function trackOpenAI(
  openaiClient: OpenAI,
  options: TrackOpenAIOptions
): OpenAI

Parameters:

  • openaiClient: OpenAI - The OpenAI client instance to wrap
  • options: TrackOpenAIOptions - Plugin configuration options

Returns: Wrapped OpenAI client with tracing enabled

TrackOpenAIOptions:

| Option | Type | Required | Description |
|---|---|---|---|
| mentioraClient | MentioraClient | Yes | Mentiora client instance for sending traces |
| threadId | string | No | Thread/conversation ID (UUID v7) for grouping traces |
| tags | string[] | No | Optional tags to add to all traces |
| metadata | Record&lt;string, unknown&gt; | No | Optional metadata to add to all traces |
| captureContent | boolean | No | Whether to capture input/output content (default: true). Set to false for privacy. |
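threadId is expected to be a UUID v7, which is time-ordered and therefore sorts by creation time. If your environment does not already provide one, a minimal generator can be sketched as follows; this helper is an assumption for illustration, not part of the SDK:

```typescript
import { randomBytes } from 'node:crypto';

// Sketch of a UUID v7 generator: 48-bit Unix-millisecond timestamp,
// followed by random bits, with the version and variant fields set
// per RFC 9562.
function uuidv7(): string {
  const bytes = randomBytes(16);
  const ts = Date.now();
  // Big-endian 48-bit millisecond timestamp in bytes 0-5
  bytes[0] = (ts / 2 ** 40) & 0xff;
  bytes[1] = (ts / 2 ** 32) & 0xff;
  bytes[2] = (ts / 2 ** 24) & 0xff;
  bytes[3] = (ts / 2 ** 16) & 0xff;
  bytes[4] = (ts / 2 ** 8) & 0xff;
  bytes[5] = ts & 0xff;
  bytes[6] = (bytes[6] & 0x0f) | 0x70; // version 7
  bytes[8] = (bytes[8] & 0x3f) | 0x80; // RFC 9562 variant
  const hex = bytes.toString('hex');
  return `${hex.slice(0, 8)}-${hex.slice(8, 12)}-${hex.slice(12, 16)}-${hex.slice(16, 20)}-${hex.slice(20)}`;
}
```

Pass the result as threadId to group all traces from one conversation together.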

Example:

import { trackOpenAI } from '@mentiora.ai/sdk/openai';
import OpenAI from 'openai';

const openaiClient = new OpenAI();

// mentioraClient is an initialized MentioraClient instance
const trackedClient = trackOpenAI(openaiClient, {
  mentioraClient,
  tags: ['production'],
});

Captured trace data:

Each traced call produces a TraceEvent with:

  • input: All parameters passed to chat.completions.create (model, messages, temperature, tools, etc.), plus a prompt field extracted from the last user message. Multimodal content arrays (text + images) are supported.
  • output: Full response data including id, created, system_fingerprint, service_tier, usage, and per-choice logprobs and refusal fields.
  • metadata: Includes openai_id (the OpenAI response ID, e.g. chatcmpl-...) and created timestamp for correlation with OpenAI's logs.
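The prompt extraction described above could be approximated as follows. This is an illustrative sketch of the documented behavior, not the plugin's actual implementation, and the Message/ContentPart types are simplified assumptions:

```typescript
type ContentPart =
  | { type: 'text'; text: string }
  | { type: 'image_url'; image_url: { url: string } };

type Message = { role: string; content: string | ContentPart[] };

// Sketch: derive the `prompt` field from the last user message,
// flattening multimodal content arrays to their text parts only.
function extractPrompt(messages: Message[]): string | undefined {
  const lastUser = [...messages].reverse().find((m) => m.role === 'user');
  if (!lastUser) return undefined;
  if (typeof lastUser.content === 'string') return lastUser.content;
  return lastUser.content
    .filter((p): p is Extract<ContentPart, { type: 'text' }> => p.type === 'text')
    .map((p) => p.text)
    .join('\n');
}
```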

For streaming requests, the plugin automatically injects stream_options: { include_usage: true } to capture token usage statistics. Refusal content is accumulated across stream chunks.
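The stream_options injection can be sketched as below. This is an illustrative reconstruction of the described behavior (the plugin performs it internally; you do not need to call anything like this yourself):

```typescript
type ChatParams = {
  stream?: boolean;
  stream_options?: { include_usage: boolean };
  [key: string]: unknown;
};

// Sketch: streaming requests get stream_options.include_usage so the
// final stream chunk reports token usage; other params pass through.
function withUsageCapture(params: ChatParams): ChatParams {
  if (params.stream && !params.stream_options) {
    return { ...params, stream_options: { include_usage: true } };
  }
  return params;
}
```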

MentioraTracingLangChain

Callback handler for automatically tracing LangChain executions.

class MentioraTracingLangChain extends BaseCallbackHandler {
  constructor(options: MentioraTracingLangChainOptions)
}

MentioraTracingLangChainOptions:

| Option | Type | Required | Description |
|---|---|---|---|
| mentioraClient | MentioraClient | Yes | Mentiora client instance for sending traces |
| threadId | string | No | Thread/conversation ID (UUID v7) for grouping traces |
| tags | string[] | No | Optional tags to add to all traces |
| metadata | Record&lt;string, unknown&gt; | No | Optional metadata to add to all traces |
| captureContent | boolean | No | Whether to capture input/output content (default: true). Set to false for privacy. |

Example:

import { MentioraTracingLangChain } from '@mentiora.ai/sdk/langchain';

const callback = new MentioraTracingLangChain({
  mentioraClient,
  tags: ['production'],
});

await chain.invoke({ input: '...' }, { callbacks: [callback] });

Note: This class extends LangChain's BaseCallbackHandler and implements all required callback methods for tracing LLM calls, chain executions, tool calls, and agent operations.


See also: Tracing | Streaming Helpers