fix(telemetry): restore userId and sessionId metadata in experimental_telemetry #8195
Conversation
The following comment was made by an LLM, it may be inaccurate: No duplicate PRs found
Wow, this is very useful. Some models' cache hits also depend on this. Why hasn't anyone merged it yet? 😂
Hey @rekram1-node , can you take a quick look at this PR when you have time? It's a simple regression fix.
This would be a great addition to the experimental telemetry. Hoping to see this merged soon! |
I'm experiencing the same issue from #8193 and would benefit from this fix. |
Please merge, I also need this.
Hi bro, can I get your help? I tried modifying the source code. Do you have any ideas? Here is the log: I noticed the request body does not include the metadata params.
@tisoz The Vercel AI SDK doesn't send any metadata values present in `experimental_telemetry` to LLM provider APIs. The `experimental_telemetry` parameter is only used for telemetry purposes - these values are NOT sent in the LLM request body.
Thanks bro, some providers use metadata to enable cache features. Maybe I need to find a custom way. Not looking good :(
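To illustrate the point above, here is a minimal, self-contained TypeScript sketch of how telemetry metadata stays separate from the provider request body. The helper names (`buildRequestBody`, `buildSpanAttributes`) and the `ai.telemetry.metadata.*` attribute prefix are assumptions for illustration, not the SDK's actual internals:

```typescript
// Sketch: experimental_telemetry metadata is mapped onto telemetry span
// attributes, while the body sent to the LLM provider is built without it.

type TelemetryOptions = {
  isEnabled?: boolean;
  metadata?: Record<string, string>;
};

type CallOptions = {
  model: string;
  prompt: string;
  experimental_telemetry?: TelemetryOptions;
};

// Build the JSON body for the provider: telemetry options are stripped out.
function buildRequestBody(opts: CallOptions): Record<string, unknown> {
  const { experimental_telemetry, ...rest } = opts;
  return rest;
}

// Map telemetry metadata onto span attributes for the tracing backend.
function buildSpanAttributes(opts: CallOptions): Record<string, string> {
  const attrs: Record<string, string> = {};
  const meta = opts.experimental_telemetry?.metadata ?? {};
  for (const [key, value] of Object.entries(meta)) {
    attrs[`ai.telemetry.metadata.${key}`] = value;
  }
  return attrs;
}

const call: CallOptions = {
  model: "gpt-4o-mini",
  prompt: "Hello",
  experimental_telemetry: {
    isEnabled: true,
    metadata: { userId: "user-123", sessionId: "sess-456" },
  },
};

console.log(buildRequestBody(call)); // no metadata in the request body
console.log(buildSpanAttributes(call)); // metadata only on the span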
What does this PR do?
Fixes #8193
Restores `userId` and `sessionId` metadata to `experimental_telemetry` in `llm.ts`, which was removed during the LLM refactoring. Originally added in #5279.
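As a rough sketch of the kind of wiring being restored (the `Session` shape and `telemetryFor` helper are hypothetical names for illustration; the actual change lives in `llm.ts`):

```typescript
// Sketch: feed the current session's identifiers back into the
// experimental_telemetry metadata on each LLM call, so tracing backends
// can group traces by session and attribute them to a user.
type Session = { id: string; userId: string };

function telemetryFor(session: Session) {
  return {
    isEnabled: true,
    metadata: {
      sessionId: session.id,
      userId: session.userId,
    },
  };
}

const telemetry = telemetryFor({ id: "sess-456", userId: "user-123" });
console.log(telemetry.metadata); // { sessionId: "sess-456", userId: "user-123" }
```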
How did you verify your code works?
Tested with Langfuse - traces correctly group by session and display user attribution.