SigNoz Observability

SigNoz is a single tool for all your monitoring and observability needs. Here are a few reasons why you should choose SigNoz:

  • Single tool for observability (logs, metrics, and traces)
  • Built on top of OpenTelemetry, the open-source standard which frees you from any type of vendor lock-in
  • Correlated logs, metrics, and traces for much richer context while debugging
  • Uses ClickHouse (used by the likes of Uber & Cloudflare) as the datastore - an extremely fast and highly optimized storage for observability data
  • DIY Query builder, PromQL, and ClickHouse queries to fulfill all your use cases around querying observability data

Setup

Instrument your Next.js application

Check out the detailed instructions on how to set up OpenTelemetry instrumentation in your Next.js applications and view your application traces in SigNoz here.

Send traces directly to SigNoz Cloud

Step 1. Install OpenTelemetry packages

npm install @vercel/otel @opentelemetry/api

Step 2. Update next.config.mjs to include instrumentationHook

This step is only needed when using Next.js 14 and below.

/** @type {import('next').NextConfig} */
const nextConfig = {
  // include instrumentationHook experimental feature
  experimental: {
    instrumentationHook: true,
  },
};

export default nextConfig;

Step 3. Create an instrumentation.ts file (in the root project directory)

import { registerOTel, OTLPHttpJsonTraceExporter } from '@vercel/otel';
// Add otel logging
import { diag, DiagConsoleLogger, DiagLogLevel } from '@opentelemetry/api';

diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.ERROR); // set the diag log level to DEBUG when debugging

export function register() {
  registerOTel({
    serviceName: '<service_name>',
    traceExporter: new OTLPHttpJsonTraceExporter({
      url: 'https://ingest.<region>.signoz.cloud:443/v1/traces',
      headers: { 'signoz-ingestion-key': '<your-ingestion-key>' },
    }),
  });
}

  • <service_name> is the name of your service
  • Set the <region> to match your SigNoz Cloud region (see the example endpoint below)
  • Replace <your-ingestion-key> with your SigNoz ingestion key
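
For example, if your SigNoz Cloud region were us (shown purely for illustration; substitute your own region), the exporter URL in instrumentation.ts would read:

url: 'https://ingest.us.signoz.cloud:443/v1/traces',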

The instrumentation file should be in the root of your project and not inside the app or pages directory. If you're using the src folder, then place the file inside src alongside pages and app.
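
For example, a project using the src folder might be laid out like this (the project and folder names are only for illustration):

my-app/
  next.config.mjs
  src/
    instrumentation.ts
    app/
    pages/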

Your Next.js app should be properly instrumented now.

Enable Telemetry for Vercel AI SDK

The Vercel AI SDK uses OpenTelemetry to collect telemetry data. OpenTelemetry is an open-source observability framework designed to provide standardized instrumentation for collecting telemetry data.

Enabling Telemetry

For more detailed information about Vercel AI SDK’s telemetry options, visit here.

You can then use the experimental_telemetry option to enable telemetry on specific function calls while the feature is experimental:

const result = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Write a short story about a cat.',
  experimental_telemetry: { isEnabled: true },
});

When telemetry is enabled, you can also control whether you want to record the input values and the output values for the function. By default, both are enabled. You can disable them by setting the recordInputs and recordOutputs options to false.

experimental_telemetry: { isEnabled: true, recordInputs: false, recordOutputs: false }

Disabling the recording of inputs and outputs can be useful for privacy, data transfer, and performance reasons. You might, for example, want to disable recording inputs if they contain sensitive information.
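
For example, a complete call that keeps telemetry on but records neither the prompt nor the completion might look like this (a sketch; the model and prompt are placeholders):

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Spans are still emitted, but input and output values are omitted from them.
const result = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Summarize this confidential support ticket: ...',
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: false,
    recordOutputs: false,
  },
});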

Telemetry Metadata

You can provide a functionId to identify the function that the telemetry data is for, and metadata to include additional information in the telemetry data.

const result = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Write a short story about a cat.',
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'my-awesome-function',
    metadata: {
      something: 'custom',
      someOtherThing: 'other-value',
    },
  },
});

Custom Tracer

You may provide a tracer which must return an OpenTelemetry Tracer. This is useful in situations where you want your traces to use a TracerProvider other than the one provided by the @opentelemetry/api singleton.

import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';

const tracerProvider = new NodeTracerProvider();

const result = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Write a short story about a cat.',
  experimental_telemetry: {
    isEnabled: true,
    tracer: tracerProvider.getTracer('ai'),
  },
});
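
Note that a custom TracerProvider bypasses the exporter registered by @vercel/otel, so if you want these spans to reach SigNoz you need to attach an exporter to it yourself. A minimal sketch, assuming @opentelemetry/sdk-trace-node 1.x and @opentelemetry/exporter-trace-otlp-http are installed (the endpoint and ingestion-key placeholders are the same as in instrumentation.ts above):

import { NodeTracerProvider, BatchSpanProcessor } from '@opentelemetry/sdk-trace-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

// Export spans from this provider to SigNoz Cloud over OTLP/HTTP.
const exporter = new OTLPTraceExporter({
  url: 'https://ingest.<region>.signoz.cloud:443/v1/traces',
  headers: { 'signoz-ingestion-key': '<your-ingestion-key>' },
});

const tracerProvider = new NodeTracerProvider();
// sdk-trace-node 1.x attaches processors like this; 2.x takes spanProcessors in the constructor instead.
tracerProvider.addSpanProcessor(new BatchSpanProcessor(exporter));

You can then pass tracerProvider.getTracer('ai') to experimental_telemetry as in the snippet above.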

Your Vercel AI SDK commands should now automatically emit traces, spans, and events. You can find more details on the types of spans and events generated here.

Finally, you should be able to view this data in SigNoz Cloud under the Traces tab.