Send Interaction with trace
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
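As a minimal sketch, the auth header can be built like this (the token value and the Content-Type header are placeholders; substitute your actual auth token):

```python
# Hedged sketch: attach the Bearer auth header to a request.
# "YOUR_AUTH_TOKEN" is a placeholder, not a real credential.
def auth_headers(token: str) -> dict:
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

headers = auth_headers("YOUR_AUTH_TOKEN")
```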
The interaction to send to the nebuly platform.
The user input in the interaction.
The LLM output in the interaction (the text shown to the user as assistant response).
The start time of the call to the LLM. You can approximate this to the time when the user sends the interaction. The accepted format is ISO 8601. Example: 2023-12-07T15:00:00.000Z
The end time of the call to the LLM. This is when the user receives the full answer from the model. The accepted format is ISO 8601. Example: 2023-12-07T15:00:00.000Z
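A sketch of producing timestamps in the expected shape (ISO 8601 with millisecond precision and a trailing "Z", matching the example 2023-12-07T15:00:00.000Z), using Python's standard library:

```python
from datetime import datetime, timezone

# Produce a UTC ISO 8601 timestamp with millisecond precision, e.g.
# "2023-12-07T15:00:00.000Z".
def iso_now() -> str:
    return (
        datetime.now(timezone.utc)
        .isoformat(timespec="milliseconds")
        .replace("+00:00", "Z")
    )

time_start = iso_now()  # approximate: when the user sends the interaction
# ... call the LLM here ...
time_end = iso_now()    # when the full answer is received from the model
```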
Previous interactions between the user and the assistant.
The input of the interaction.
The output of the interaction.
An id or username uniquely identifying the end-user. We recommend hashing their username or email address, in order to avoid sending us any identifying information.
Tag user interactions by adding key-value pairs using this parameter. Each key represents the tag name, and the corresponding value is the tag value. For example, if you want to tag an interaction with the model version used to reply to user input, provide it as an argument for nebuly_tags, e.g. {"version" => "v1.0.0"}. You have the flexibility to define custom tags, making them available as potential filters on the Nebuly platform.
This field is used for the AB testing feature. Please refer to its documentation for further details.
The full trace of your LLM agent or chain.
The LLM used.
A list of messages from the conversation so far, following the same format used by OpenAI Chat Completion endpoints.
Each message has two required fields:
- role: the role of who is sending the message. Possible values are: system, user, assistant, tool.
- content: the content of the message.
The messages in the list should be ordered according to the conversation’s sequence, from the oldest message to the most recent.
Example:

```json
"messages": [
  {
    "role": "system",
    "content": "This is a system prompt"
  },
  {
    "role": "user",
    "content": "What's the weather like in Turin?"
  },
  {
    "role": "assistant",
    "content": "The weather is currently rainy in Turin."
  },
  {
    "role": "user",
    "content": "What's the weather like in Rome?"
  }
]
```
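The message constraints above can be checked client-side before sending. This is a hypothetical helper, not part of any Nebuly SDK:

```python
# Allowed roles per the message format described above.
VALID_ROLES = {"system", "user", "assistant", "tool"}

def validate_messages(messages: list) -> bool:
    """Raise ValueError if any message is missing a required field
    or uses an unknown role; return True otherwise."""
    for m in messages:
        if not {"role", "content"} <= set(m):
            raise ValueError(f"message missing 'role' or 'content': {m}")
        if m["role"] not in VALID_ROLES:
            raise ValueError(f"unknown role: {m['role']}")
    return True
```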
The LLM output message.
The number of input tokens.
The number of output tokens.
Boolean flag to anonymize your data.
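Putting the parameters together, here is a hedged sketch of a full request body. The exact JSON field names and token counts are assumptions based on the parameters described above, not the authoritative schema; check the official Nebuly API reference before sending:

```python
import json

# Assumed field names (illustrative only): interaction, trace, anonymize, etc.
payload = {
    "interaction": {
        "input": "What's the weather like in Rome?",
        "output": "The weather is currently sunny in Rome.",
        "time_start": "2023-12-07T15:00:00.000Z",
        "time_end": "2023-12-07T15:00:02.500Z",
        "history": [
            {"input": "What's the weather like in Turin?",
             "output": "The weather is currently rainy in Turin."},
        ],
        "end_user": "hashed-user-id",   # hash usernames/emails before sending
        "tags": {"version": "v1.0.0"},  # free-form key-value tags
    },
    "trace": {
        "model": "gpt-4",  # placeholder model name
        "messages": [
            {"role": "system", "content": "This is a system prompt"},
            {"role": "user", "content": "What's the weather like in Turin?"},
            {"role": "assistant", "content": "The weather is currently rainy in Turin."},
            {"role": "user", "content": "What's the weather like in Rome?"},
        ],
        "output": "The weather is currently sunny in Rome.",
        "input_tokens": 58,   # illustrative counts
        "output_tokens": 12,
    },
    "anonymize": True,
}

body = json.dumps(payload)
# The serialized body would then be POSTed to the endpoint with the
# Authorization: Bearer <token> header, e.g. via requests.post(...).
```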