IBM Watsonx
The nebuly platform provides full support for IBM Watsonx models. In this section, we will show you how to easily monitor all the requests made to these models.
First of all, let’s perform a simple chat request using the IBM Watsonx SDK, tracking the interaction’s start and end times:
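A minimal sketch of such a request, assuming the ibm_watsonx_ai Python SDK; the endpoint URL, API key, project id, and chosen model are placeholders to adapt to your environment:

```python
from datetime import datetime, timezone

from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",   # your watsonx.ai endpoint
    api_key="<YOUR_IBM_CLOUD_API_KEY>",
)

model = ModelInference(
    model_id="ibm/granite-3-8b-instruct",
    credentials=credentials,
    project_id="<YOUR_PROJECT_ID>",
)

messages = [{"role": "user", "content": "What is the capital of Italy?"}]

# Record the interaction start and end times (UTC) for the nebuly payload.
time_start = datetime.now(timezone.utc)
response = model.chat(messages=messages)
time_end = datetime.now(timezone.utc)

# Recent SDK versions return an OpenAI-style response structure.
assistant_reply = response["choices"][0]["message"]["content"]
```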
Now, let’s build the payload with all the useful information to be sent to the nebuly platform. We are going to include all the conversation details, along with the details of the model used, which the platform uses to compute the cost of the interaction.
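A sketch of how such a payload could look, built from the variables of the previous snippet. The field names (interaction, end_user, conversation_id, anonymize, and so on) are assumptions based on the parameter descriptions later in this section; check the API Reference for the exact schema.

```python
# Hypothetical payload structure; verify the exact field names in the
# API Reference section.
payload = {
    "interaction": {
        "input": messages[-1]["content"],        # user message
        "output": assistant_reply,               # model answer
        "time_start": time_start.isoformat(),
        "time_end": time_end.isoformat(),
        "history": [],                           # previous turns, if any
        "end_user": "hashed-user-id-1234",       # hashed username or email
        "conversation_id": "conv-5678",          # groups interactions together
        "model": "ibm/granite-3-8b-instruct",    # used to compute the cost
    },
    "anonymize": True,  # run PII detection on input and output messages
}
```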
The costs are available only for the following IBM Foundation models (Granite models):
- ibm/granite-13b-instruct-v2
- ibm/granite-8b-japanese
- ibm/granite-20b-multilingual
- ibm/granite-3-2b-instruct
- ibm/granite-3-8b-instruct
- ibm/granite-guardian-3-2b
- ibm/granite-guardian-3-8b
- ibm/granite-3b-code-instruct
- ibm/granite-8b-code-instruct
- ibm/granite-20b-code-instruct
- ibm/granite-34b-code-instruct
You can also send traces for other models, but their cost will not be computed.
At this point, we have all the information needed to send the request to the nebuly platform. The following code snippet shows how to easily send the request:
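For example, using the requests library; the ingestion endpoint URL and authentication header below are placeholders, so take the exact values from the API Reference:

```python
import requests

NEBULY_API_KEY = "<YOUR_NEBULY_API_KEY>"
# Placeholder endpoint; see the API Reference for the exact URL.
NEBULY_ENDPOINT = "https://backend.nebuly.com/event-ingestion/api/v1/events/trace_interaction"

response = requests.post(
    NEBULY_ENDPOINT,
    headers={
        "Authorization": f"Bearer {NEBULY_API_KEY}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=10,
)
response.raise_for_status()
```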
You can find a detailed explanation of some of the specific parameters used in the code snippets above:
- An id or username uniquely identifying the end-user. We recommend hashing their username or email address to avoid sending us any identifying information.
- A unique identifier for the conversation. It is used to group all the individual interactions exchanged during the conversation.
- If set to True, a PII detection algorithm will be applied to the input and output messages to remove any personal information.
You can find more details in the API Reference section.