docs(vercel): nested executions within parent trace

hassiebp committed Nov 25, 2024
1 parent 1f33cea commit 90998c1
Showing 2 changed files with 50 additions and 5 deletions.
55 changes: 50 additions & 5 deletions pages/docs/integrations/vercel-ai-sdk.mdx
@@ -174,21 +174,62 @@ We created a sample repository ([langfuse/langfuse-vercel-ai-nextjs-example](htt

## Customization

### Group multiple executions in one trace

You can create a trace in Langfuse and pass its ID to AI SDK calls to group multiple execution spans under one trace. The name passed in `functionId` becomes the root span name of the respective execution.

```typescript
import { randomUUID } from "crypto";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { Langfuse } from "langfuse";

const langfuse = new Langfuse();
const parentTraceId = randomUUID();

langfuse.trace({
  id: parentTraceId,
  name: "holiday-traditions",
});

for (let i = 0; i < 3; i++) {
  const result = await generateText({
    model: openai("gpt-3.5-turbo"),
    maxTokens: 50,
    prompt: "Invent a new holiday and describe its traditions.",
    experimental_telemetry: {
      isEnabled: true,
      functionId: `holiday-tradition-${i}`,
      metadata: {
        langfuseTraceId: parentTraceId,
      },
    },
  });

  console.log(result.text);
}

await langfuse.flushAsync();
// `sdk` is the OpenTelemetry NodeSDK instance registered with the LangfuseExporter
await sdk.shutdown();
```

The resulting trace hierarchy will be:

![Vercel nested trace in Langfuse UI](/images/docs/vercel-nested-trace.png)

### Disable Tracking of Input/Output

By default, the exporter captures the input and output of each request. You can disable this behavior by setting the `recordInputs` and `recordOutputs` options to `false`.
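A minimal sketch of how this looks with the AI SDK's `experimental_telemetry` settings (the model and prompt are illustrative):

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "Invent a new holiday and describe its traditions.",
  experimental_telemetry: {
    isEnabled: true, // still emit spans to the exporter
    recordInputs: false, // omit the prompt from the span
    recordOutputs: false, // omit the completion from the span
  },
});
```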

### Link Langfuse prompts to traces

You can link Langfuse prompts to Vercel AI SDK generations by setting the `langfusePrompt` property in the `metadata` field:

```typescript
import { generateText } from "ai"
import { Langfuse } from "langfuse"
import { generateText } from "ai";
import { Langfuse } from "langfuse";

const langfuse = new Langfuse()
const langfuse = new Langfuse();

const fetchedPrompt = await langfuse.getPrompt('my-prompt')
const fetchedPrompt = await langfuse.getPrompt("my-prompt");

const result = await generateText({
model: openai("gpt-4o"),
@@ -243,6 +284,10 @@ new LangfuseExporter({ debug: true });
- set `skipOpenTelemetrySetup: true` in Sentry.init (see the sketch after this list)
- follow Sentry's docs on how to manually set up Sentry with OTEL
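A minimal sketch of the first step, assuming `@sentry/node` v8+ (which accepts `skipOpenTelemetrySetup` in `Sentry.init`):

```typescript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  // Skip Sentry's own OpenTelemetry setup so it does not override
  // the tracer provider that exports spans to Langfuse.
  skipOpenTelemetrySetup: true,
});
```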

### Short-lived environments

In short-lived environments such as Vercel Cloud Functions or AWS Lambdas, you can force an export and flush of pending spans after the function execution and before the environment freezes or shuts down by awaiting a call to the `forceFlush` method on the `LangfuseExporter` instance.
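A minimal sketch, assuming the exporter instance is kept in scope of the handler (the handler name and setup are illustrative):

```typescript
import { LangfuseExporter } from "langfuse-vercel";

const exporter = new LangfuseExporter();
// ... register `exporter` with your OpenTelemetry setup as usual ...

// Illustrative serverless handler.
export async function handler() {
  // ... run your AI SDK calls with experimental_telemetry enabled ...

  // Export and flush all pending spans before the environment freezes.
  await exporter.forceFlush();
}
```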

## Learn more

See the [telemetry documentation](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry) of the Vercel AI SDK for more information.
Binary file added public/images/docs/vercel-nested-trace.png