The Nebuly SDK enables you to monitor all the requests made to your Hugging Face Transformers pipelines.

The process is straightforward; you just need to:

  • initialize the SDK with your API key
  • include the user_id in your original HF Pipelines method calls.

You can then use the platform to analyze the results and get insights about your LLM users.

Conversational Models

import nebuly

nebuly.init(api_key="<YOUR_NEBULY_API_KEY>")

from transformers import pipeline, Conversation
converse = pipeline("conversational")

conversation = Conversation(
    "Going to the movies tonight - any suggestions?",
    past_user_inputs=[],
    generated_responses=[],
)
output = converse(
    conversation,
    # ... other optional hf hub kwargs
    # Nebuly additional kwargs
    user_id="<YOUR_USER_ID>",
)

Note that, in order to get a correct answer from the HF model, you must provide the conversation history as lists of past_user_inputs and generated_responses. This history is also used to analyze the model's performance on the Nebuly platform.
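The two history lists grow in lockstep, one entry per completed turn. A minimal sketch of how you might maintain them between calls (the record_turn helper and the sample response below are illustrative, not part of the SDK or of Transformers):

```python
# Conversation history, kept outside the pipeline between turns.
past_user_inputs = []
generated_responses = []

def record_turn(user_input, model_response):
    """Append one completed user/model exchange to the history lists."""
    past_user_inputs.append(user_input)
    generated_responses.append(model_response)

# After the model replies, record the finished turn:
record_turn(
    "Going to the movies tonight - any suggestions?",
    "The Big Lebowski is always a good choice.",
)

# The next turn would then be built from the accumulated history, e.g.:
# Conversation(new_input,
#              past_user_inputs=past_user_inputs,
#              generated_responses=generated_responses)
```

Each list must stay the same length: entry i of generated_responses is the model's reply to entry i of past_user_inputs.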

You can find a detailed explanation of the supported Nebuly keyword arguments below:

user_id
string
required

An ID or username that uniquely identifies the end user. We recommend hashing the username or email address to avoid sending us any identifying information.
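One way to derive such a value is to hash the email address with the standard library; the hashed_user_id helper below is an illustrative sketch, not an SDK function:

```python
import hashlib

def hashed_user_id(email: str) -> str:
    """Derive a stable, non-identifying user_id from an email address."""
    # Normalize first so "Jane@Example.com" and "jane@example.com"
    # map to the same ID, then hash with SHA-256.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

user_id = hashed_user_id("jane.doe@example.com")
```

Because the hash is deterministic, the same end user always maps to the same user_id across requests, while the raw email never leaves your system.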

nebuly_tags
dict

Tag user interactions by adding key-value pairs using this parameter. Each key represents the tag name, and the corresponding value is the tag value.

For example, if you want to tag an interaction with the model version used to reply to user input, provide it as an argument for nebuly_tags, e.g. {"version": "v1.0.0"}. You have the flexibility to define custom tags, making them available as potential filters on the Nebuly platform.
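A sketch of what such a tags dictionary might look like (the keys and values here are made up; you choose your own):

```python
# Illustrative custom tags; each key is a tag name, each value a tag value.
nebuly_tags = {
    "version": "v1.0.0",
    "environment": "production",
}

# They would then be passed alongside user_id in the pipeline call, e.g.:
# output = converse(conversation, user_id=uid, nebuly_tags=nebuly_tags)
```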

nebuly_api_key
string

You can use this field to temporarily override the Nebuly API key for the selected model call. The interaction will be stored in the project associated with the provided API key.

Text Generation Models

import nebuly

nebuly.init(api_key="<YOUR_NEBULY_API_KEY>")

from transformers import pipeline
generator = pipeline("text-generation", model="gpt2")

output = generator(
    "Who is the president of Russia?",
    # ... other optional hf hub kwargs
    # Nebuly additional kwargs
    user_id="<YOUR_USER_ID>",
)

The Nebuly keyword arguments (user_id, nebuly_tags, and nebuly_api_key) are the same as those described for conversational models above.

Track Traces

To track your Hugging Face traces, we currently expose two different approaches:

  • The chain of models: you can use the context-manager integration built into the Nebuly SDK to capture all the model calls. More information can be found in the chain of models section.
  • Raw endpoints: you can directly use the exposed APIs to send the raw interactions and traces to the Nebuly platform. You can find a usage example of the endpoint here, while the formal endpoint definition is available here.