Google Vertex AI
The Nebuly SDK enables you to monitor all of the following requests:
- Single calls to VertexAI Chat models
- Single calls to VertexAI Text models
- Chains of calls to VertexAI models
All of them are also supported when using stream or async modes.
The process is straightforward; you just need to:
- initialize the SDK with your API key
- include the user_id in your original Google VertexAI method calls
You can then use the platform to analyze the results and get insights about your LLM users.
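As a minimal sketch of these two steps (the model name and exact setup are illustrative, and assume the SDK is imported as the `nebuly` package and initialized via `nebuly.init`):

```python
import nebuly
from vertexai.language_models import ChatModel

# Step 1: initialize the SDK with your Nebuly API key (assumed entry point).
nebuly.init(api_key="<your_nebuly_api_key>")

# Step 2: include the user_id in your original Google VertexAI method calls.
chat = ChatModel.from_pretrained("chat-bison").start_chat()  # illustrative model name
response = chat.send_message("Hello!", user_id="hashed-user-id")
```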
Chat Models
You can find a detailed explanation of the allowed nebuly additional keyword arguments below:
- user_id: An id or username uniquely identifying the end-user. We recommend hashing their username or email address in order to avoid sending us any identifying information.
- nebuly_tags: Tag user interactions by adding key-value pairs using this parameter. Each key represents the tag name, and the corresponding value is the tag value. For example, if you want to tag an interaction with the model version used to reply to user input, provide it as an argument for nebuly_tags, e.g. {"version": "v1.0.0"}. You have the flexibility to define custom tags, making them available as potential filters on the Nebuly platform.
- API key override: You can use this field to temporarily override the Nebuly API key for the selected model call. The interaction will be stored in the project associated with the provided API key.
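For example, a single chat call carrying these keyword arguments might look like the sketch below (the model name is illustrative; the nebuly arguments are passed alongside the regular VertexAI ones, and the API key override would be supplied in the same way):

```python
from vertexai.language_models import ChatModel

chat_model = ChatModel.from_pretrained("chat-bison")  # illustrative model name
chat = chat_model.start_chat()

response = chat.send_message(
    "What can you help me with?",
    user_id="hashed-user-id",            # end-user identifier (hashed)
    nebuly_tags={"version": "v1.0.0"},   # custom tags, filterable on the platform
)
print(response.text)
```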
Generate Text Models
You can find a detailed explanation of the allowed nebuly additional keyword arguments below:
- user_id: An id or username uniquely identifying the end-user. We recommend hashing their username or email address in order to avoid sending us any identifying information.
- nebuly_tags: Tag user interactions by adding key-value pairs using this parameter. Each key represents the tag name, and the corresponding value is the tag value. For example, if you want to tag an interaction with the model version used to reply to user input, provide it as an argument for nebuly_tags, e.g. {"version": "v1.0.0"}. You have the flexibility to define custom tags, making them available as potential filters on the Nebuly platform.
- API key override: You can use this field to temporarily override the Nebuly API key for the selected model call. The interaction will be stored in the project associated with the provided API key.
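Similarly, a sketch of a single text-generation call with the same keyword arguments (again, the model name is illustrative):

```python
from vertexai.language_models import TextGenerationModel

model = TextGenerationModel.from_pretrained("text-bison")  # illustrative model name

response = model.predict(
    "Summarize the key benefits of monitoring LLM usage.",
    user_id="hashed-user-id",            # end-user identifier (hashed)
    nebuly_tags={"version": "v1.0.0"},   # custom tags, filterable on the platform
)
print(response.text)
```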
Track Traces
To track the traces of your Google models, we currently expose two different methodologies:
- The chain of models: you can use the context manager integration built into the nebuly SDK to catch all the model calls (a sketch is shown after this list). More information can be found in the chain of models section.
- Raw endpoints: you can directly use the exposed APIs to send the raw interactions and traces to the nebuly platform. You can find a usage example of the endpoint here, while the formal endpoint definition is available here.
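Purely as an illustrative sketch of the first methodology (the `new_interaction` context manager name and its import path are assumptions here; see the chain of models section for the actual interface), a chain of VertexAI calls could be grouped like this:

```python
import nebuly
from nebuly.contextmanager import new_interaction  # assumed import path
from vertexai.language_models import TextGenerationModel

nebuly.init(api_key="<your_nebuly_api_key>")
model = TextGenerationModel.from_pretrained("text-bison")  # illustrative model name

# All model calls made inside the context manager are grouped into a single
# interaction attributed to the given end-user (assumed behaviour).
with new_interaction(user_id="hashed-user-id"):
    outline = model.predict("Draft an outline for a blog post about LLM monitoring.")
    post = model.predict("Expand this outline into a full post:\n" + outline.text)
```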