Use this endpoint to send a user interaction together with its generation trace. A generation trace is the list of chain steps that the agentic AI system performed before generating its response.
To track the costs of the interaction correctly, you must provide
the API calls made to the LLM as LLMTraces. Make sure to pass the input and
output tokens consumed so that the cost estimation is more accurate.
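As a rough illustration, here is a minimal sketch of one LLM trace entry built in Python. The field names (`model`, `input`, `output`, `input_tokens`, `output_tokens`) are hypothetical placeholders, not confirmed keys; consult the request schema on this page for the exact names.

```python
# A minimal sketch of a single LLM trace entry, assuming hypothetical
# field names; check the request schema for the exact keys.
llm_trace = {
    "model": "gpt-4o",                                      # model that was called
    "input": "What's the weather like in Turin?",           # prompt sent to the LLM
    "output": "The weather is currently rainy in Turin.",   # text the LLM returned
    "input_tokens": 42,    # tokens consumed by the prompt (for cost estimation)
    "output_tokens": 15,   # tokens produced by the model (for cost estimation)
}
```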
The start time of the call to the LLM. You can approximate this with the time when the user sends the interaction. The accepted format is ISO 8601. Example: 2023-12-07T15:00:00.000Z
The end time of the call to the LLM. This is when the user receives the full answer from the model. The accepted format is ISO 8601. Example: 2023-12-07T15:00:00.000Z
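For example, a small Python helper that produces timestamps in the expected format; the helper name is ours, not part of the API:

```python
from datetime import datetime, timezone

def to_iso8601(dt: datetime) -> str:
    """Format a UTC datetime as ISO 8601 with millisecond precision
    and a trailing 'Z', e.g. '2023-12-07T15:00:00.000Z'."""
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"

start = datetime.now(timezone.utc)   # just before the call to the LLM
# ... call the LLM here ...
end = datetime.now(timezone.utc)     # once the full answer has been received
```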
An ID or username that uniquely identifies the end user. We recommend hashing the username or email address, so that no identifying information is sent to us.
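One way to do this in Python is to take a SHA-256 hash of the email address; this is a sketch of one possible scheme, not a required one:

```python
import hashlib

def anonymize_user(identifier: str) -> str:
    """Return a stable, non-reversible ID derived from a username or email."""
    normalized = identifier.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

end_user = anonymize_user("jane.doe@example.com")
```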
Tag user interactions by adding key-value pairs using this parameter. Each key represents the tag name, and the corresponding value is the tag value. For example, if you want to tag an interaction with the model version used to reply to user input, provide it as an argument for nebuly_tags, e.g. {"version": "v1.0.0"}. You have the flexibility to define custom tags, making them available as potential filters on the Nebuly platform.
A list of messages from the conversation so far, following the same format used by the OpenAI Chat Completion endpoints. Each message has two required fields:
- role: the role of the message sender. Possible values are: system, user, assistant, tool.
- content: the content of the message.
The messages in the list should be ordered according to the conversation’s sequence, from the oldest message to the most recent. Example:
"messages": [ { "role": "system", "content": "This is a system prompt" }, { "role": "user", "content": "What's the weather like in Turin?" }, { "role": "assistant", "content": "The weather is currently rainy in Turin." }, { "role": "user", "content": "What's the weather like in Rome?" }]