Conversation

@pichlermarc pichlermarc commented Nov 24, 2025

Fixes #3240
Introduced via #3164 -> #3157

Errors stem from OpenAI types being used in the instrumentation's public interface. Instrumentations are supposed to no-op without causing errors even if the instrumented package is not installed.

This PR fixes the issue by adding `private` access modifiers to the OpenAIInstrumentation properties that were intended to be private. This erases their types from the compiled declaration output and removes the `openai` type imports.
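The mechanism behind the fix can be sketched with a hypothetical `DemoInstrumentation` class (not code from this PR): when a class member is marked `private`, `tsc --declaration` emits only `private _counter;` with no type annotation, so a type-only import that was needed solely to name a private member's type drops out of the generated `.d.ts`.

```typescript
// Hypothetical sketch: `private` members lose their type in declaration output.
//
// Without `private` (public by default), the .d.ts must name the type:
//     _counter: SomeImportedType;   // forces `import type { SomeImportedType } ...`
// With `private`, tsc emits just:
//     private _counter;             // no type, so the import is dropped
class DemoInstrumentation {
  private _counter = 0; // this type never appears in the emitted .d.ts

  record(): number {
    this._counter += 1;
    return this._counter;
  }
}

const demo = new DemoInstrumentation();
console.log(demo.record()); // 1
```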

You can test this by following these steps:

  • check out main
  • run npm ci && npm run compile
  • inspect ./packages/instrumentation-openai/build/src/instrumentation.d.ts
    • you'll see type-imports from openai
  • check out this PR's branch
  • run npm ci && npm run compile
  • inspect ./packages/instrumentation-openai/build/src/instrumentation.d.ts
    • type imports from openai should be gone
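The grep-based inspection in the steps above can be sketched as follows. A stand-in declaration file (an assumption, mimicking the post-fix output) makes the sketch runnable outside the repo; in the real repo you would point `DTS` at `./packages/instrumentation-openai/build/src/instrumentation.d.ts` after `npm ci && npm run compile`.

```shell
# Stand-in .d.ts so this sketch runs anywhere; replace with the real path in the repo.
DTS="$(mktemp)"
cat > "$DTS" <<'EOF'
import { InstrumentationBase } from '@opentelemetry/instrumentation';
export declare class OpenAIInstrumentation extends InstrumentationBase {
    private _startChatCompletionsSpan;
}
EOF

# Check whether any `openai` type import leaked into the public declarations.
if grep -q "from 'openai" "$DTS"; then
  LEAKED=yes
else
  LEAKED=no
fi
echo "leaked=$LEAKED"
rm -f "$DTS"
```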

Before this change, the compiled output looks like this:

// instrumentation.d.ts
import type { Attributes, Context, Span } from '@opentelemetry/api';
import { InstrumentationBase, InstrumentationNodeModuleDefinition } from '@opentelemetry/instrumentation';
import type { ChatCompletion, ChatCompletionCreateParams, ChatCompletionChunk } from 'openai/resources/chat/completions';
import type { CreateEmbeddingResponse, EmbeddingCreateParams } from 'openai/resources/embeddings';
import { OpenAIInstrumentationConfig } from './types';
export declare class OpenAIInstrumentation extends InstrumentationBase<OpenAIInstrumentationConfig> {
    private _genaiClientOperationDuration;
    private _genaiClientTokenUsage;
    constructor(config?: OpenAIInstrumentationConfig);
    setConfig(config?: OpenAIInstrumentationConfig): void;
    protected init(): InstrumentationNodeModuleDefinition[];
    _updateMetricInstruments(): void;
    _getPatchedChatCompletionsCreate(): any;
    /**
     * Start a span for this chat-completion API call. This also emits log events
     * as appropriate for the request params.
     */
    _startChatCompletionsSpan(params: ChatCompletionCreateParams, config: OpenAIInstrumentationConfig, baseURL: string | undefined): {
        span: Span;
        ctx: Context;
        commonAttrs: Attributes;
    };
    /**
     * This wraps an instance of a `openai/streaming.Stream.iterator()`, an
     * async iterator. It should yield the chunks unchanged, and gather telemetry
     * data from those chunks, then end the span.
     */
    _onChatCompletionsStreamIterator(streamIter: AsyncIterator<ChatCompletionChunk>, span: Span, startNow: number, config: OpenAIInstrumentationConfig, commonAttrs: Attributes, ctx: Context): AsyncGenerator<any, void, unknown>;
    _onChatCompletionsCreateResult(span: Span, startNow: number, commonAttrs: Attributes, result: ChatCompletion, config: OpenAIInstrumentationConfig, ctx: Context): void;
    _createAPIPromiseRejectionHandler(startNow: number, span: Span, commonAttrs: Attributes): (err: Error) => void;
    _getPatchedEmbeddingsCreate(): any;
    /**
     * Start a span for this chat-completion API call. This also emits log events
     * as appropriate for the request params.
     */
    _startEmbeddingsSpan(params: EmbeddingCreateParams, baseURL: string | undefined): {
        span: Span;
        ctx: Context;
        commonAttrs: Attributes;
    };
    _onEmbeddingsCreateResult(span: Span, startNow: number, commonAttrs: Attributes, result: CreateEmbeddingResponse): void;
}
//# sourceMappingURL=instrumentation.d.ts.map

After this change, the compiled output looks like this:

// instrumentation.d.ts
import { InstrumentationBase, InstrumentationNodeModuleDefinition } from '@opentelemetry/instrumentation';
import { OpenAIInstrumentationConfig } from './types';
export declare class OpenAIInstrumentation extends InstrumentationBase<OpenAIInstrumentationConfig> {
    private _genaiClientOperationDuration;
    private _genaiClientTokenUsage;
    constructor(config?: OpenAIInstrumentationConfig);
    setConfig(config?: OpenAIInstrumentationConfig): void;
    protected init(): InstrumentationNodeModuleDefinition[];
    _updateMetricInstruments(): void;
    private _getPatchedChatCompletionsCreate;
    /**
     * Start a span for this chat-completion API call. This also emits log events
     * as appropriate for the request params.
     */
    private _startChatCompletionsSpan;
    /**
     * This wraps an instance of a `openai/streaming.Stream.iterator()`, an
     * async iterator. It should yield the chunks unchanged, and gather telemetry
     * data from those chunks, then end the span.
     */
    private _onChatCompletionsStreamIterator;
    private _onChatCompletionsCreateResult;
    private _createAPIPromiseRejectionHandler;
    private _getPatchedEmbeddingsCreate;
    /**
     * Start a span for this chat-completion API call. This also emits log events
     * as appropriate for the request params.
     */
    private _startEmbeddingsSpan;
    private _onEmbeddingsCreateResult;
}
//# sourceMappingURL=instrumentation.d.ts.map

@pichlermarc pichlermarc requested a review from a team as a code owner November 24, 2025 13:17
@github-actions github-actions bot requested review from seemk and trentm November 24, 2025 13:17
@pichlermarc pichlermarc added the bug (Something isn't working) and priority:p1 (Bugs which cause problems in end-user applications such as crashes, data inconsistencies) labels Nov 24, 2025
@pichlermarc pichlermarc merged commit 29e294a into open-telemetry:main Nov 24, 2025
26 checks passed
@pichlermarc pichlermarc deleted the fix/openai-types branch November 24, 2025 13:44

trentm commented Nov 24, 2025

Sorry about this. Thanks for fixing it!

trentm added a commit to elastic/elastic-otel-node that referenced this pull request Nov 24, 2025
…1182)

Summary of changes:

    0.6.0 -> 0.7.0 @opentelemetry/instrumentation-openai (range-bump)
    0.53.0 -> 0.53.1 @opentelemetry/instrumentation-knex
    0.57.0 -> 0.57.1 @opentelemetry/instrumentation-redis
    0.61.0 -> 0.61.1 @opentelemetry/instrumentation-pg

Refs: open-telemetry/opentelemetry-js-contrib#3242 (p1 bug in instr-openai)

Labels

  • bug: Something isn't working
  • pkg:instrumentation-openai
  • priority:p1: Bugs which cause problems in end-user applications such as crashes, data inconsistencies

Successfully merging this pull request may close these issues:

  • tsc errors when bumping to @opentelemetry/[email protected]

4 participants