If you’re on Next.js, follow the Next.js guide.
1. Install the SDK

Run the following command in your terminal:
npm install @traceloop/node-server-sdk
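If you manage your project with yarn or pnpm instead of npm, the equivalent install commands are:
yarn add @traceloop/node-server-sdk
pnpm add @traceloop/node-server-sdk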
In your LLM app, initialize the Traceloop tracer like this:
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize();
Because of the way JavaScript module loading works, you must import the Traceloop SDK before importing any LLM module such as OpenAI.
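For example, a typical entry point could look like the sketch below; the openai package is only an illustration here, and any other LLM client follows the same import-order rule:

// Import the Traceloop SDK before any LLM client module
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

traceloop.initialize();

// The OpenAI client is now instrumented automatically
const openai = new OpenAI();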
If you’re running this locally, you may want to disable batch sending so you can see the traces immediately:
traceloop.initialize({ disableBatch: true });
If you’re using Sentry, make sure to disable its OpenTelemetry configuration, as it overrides OpenLLMetry: when calling Sentry.init, pass skipOpenTelemetrySetup: true.
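As a rough sketch, assuming the Node Sentry SDK (@sentry/node v8 or later, where the skipOpenTelemetrySetup option is available), that looks like:

import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  // Let OpenLLMetry own the OpenTelemetry setup instead of Sentry
  skipOpenTelemetrySetup: true,
});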
2. Annotate your workflows

If you have complex workflows or chains, you can annotate them to get a better understanding of what’s going on. You’ll see the complete trace of your workflow on Traceloop or any other dashboard you’re using. We have a set of methods and decorators to make this easier. Say you have a function that renders a prompt and calls an LLM: simply wrap it in a withWorkflow() call. We also provide compatible TypeScript decorators for class methods, which are more convenient.
If you’re using a supported LLM framework, we’ll do that for you; no need to add any annotations to your code.
async function suggestAnswers(question: string) {
  return await withWorkflow({ name: "suggestAnswers" }, () => {
    ...
  });
}
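For illustration, here is a filled-in version of that sketch, assuming withWorkflow is called through the traceloop namespace import from step 1 and the openai client is used inside the workflow; the model name and prompt are placeholders, not part of the SDK:

import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

traceloop.initialize();
const openai = new OpenAI();

async function suggestAnswers(question: string) {
  // Everything awaited inside the callback is traced under the "suggestAnswers" workflow
  return await traceloop.withWorkflow({ name: "suggestAnswers" }, async () => {
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // placeholder model name
      messages: [{ role: "user", content: question }],
    });
    return completion.choices[0].message.content;
  });
}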
For more information, see the dedicated section in the docs.
3. Configure trace exporting

Lastly, you’ll need to configure where to export your traces. The two environment variables controlling this are TRACELOOP_API_KEY and TRACELOOP_BASE_URL. For Traceloop, read on. For other options, see Exporting.

Using Traceloop Cloud

You need an API key to send traces to Traceloop. Generate one in Settings by selecting a project and environment, then clicking Generate API key. ⚠️ Important: copy the key immediately; it won’t be shown again after you close or reload the page. Detailed instructions →
Set the API key as an environment variable in your app named TRACELOOP_API_KEY:
export TRACELOOP_API_KEY=your_api_key_here
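If you set environment variables in PowerShell rather than a POSIX shell, the equivalent is:
$env:TRACELOOP_API_KEY = "your_api_key_here"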
Done! You’ll get instant visibility into everything that’s happening in your LLM app. If you’re calling a vector DB or any other external service or database, you’ll also see it in the Traceloop dashboard.
Not seeing traces? Make sure you’re viewing the correct project and environment in the dashboard that matches your API key. See Troubleshooting.