feat: track token usage in Langfuse (and fix memory leak)
marcklingen committed Sep 18, 2024
1 parent de253bf commit 77dc049
Showing 1 changed file with 17 additions and 4 deletions.
examples/filters/langfuse_filter_pipeline.py: 21 changes (17 additions & 4 deletions)
@@ -113,13 +113,26 @@ async def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
             return body
 
         generation = self.chat_generations[body["chat_id"]]
+        assistant_message = get_last_assistant_message(body["messages"])
+
+        # Extract usage information
+        info = assistant_message.get("info", {})
+        usage = None
+        if "prompt_tokens" in info and "completion_tokens" in info:
+            usage = {
+                "input": info["prompt_tokens"],
+                "output": info["completion_tokens"],
+                "unit": "TOKENS",
+            }
 
         user_message = get_last_user_message(body["messages"])
-        generated_message = get_last_assistant_message(body["messages"])
 
         # Update generation
         generation.end(
-            output=generated_message,
+            output=assistant_message,
             metadata={"interface": "open-webui"},
+            usage=usage,
         )
 
+        # Clean up the chat_generations dictionary
+        del self.chat_generations[body["chat_id"]]
+
         return body
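
For context beyond the diff: the new block reads the token counts that Open WebUI attaches to the last assistant message under an "info" key and converts them into the usage dict that Langfuse's generation.end() accepts. Below is a minimal, self-contained sketch of that extraction step; the function name extract_usage and the example message are illustrative only and not part of the pipeline.

from typing import Optional


def extract_usage(assistant_message: dict) -> Optional[dict]:
    """Build a Langfuse-style usage dict from an Open WebUI assistant message.

    Illustrative helper, not part of the pipeline: it mirrors the logic added
    in this commit but takes the message dict directly.
    """
    info = assistant_message.get("info", {})
    if "prompt_tokens" in info and "completion_tokens" in info:
        return {
            "input": info["prompt_tokens"],
            "output": info["completion_tokens"],
            "unit": "TOKENS",
        }
    # Providers that do not report token counts yield no usage.
    return None


if __name__ == "__main__":
    message = {
        "role": "assistant",
        "content": "Hello!",
        "info": {"prompt_tokens": 12, "completion_tokens": 5},
    }
    print(extract_usage(message))
    # -> {'input': 12, 'output': 5, 'unit': 'TOKENS'}

When a model or provider does not report prompt_tokens and completion_tokens, usage stays None and the generation is ended without token counts. The added del self.chat_generations[body["chat_id"]] is the memory-leak fix from the commit title: without it, the pipeline keeps one Langfuse generation handle per chat for the lifetime of the process.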
