LangChain
In this guide, we will show how to use Nebuly's JavaScript and TypeScript SDK to collect interactions from LangChain's chains and agents.
The Nebuly SDK enables you to monitor all the requests made by your LangChain chat models, retrieval chains, and agents.
Currently, the langchain-js integration is in beta: it supports only agents and chains based on ChatModels, such as "gpt-4" and "gpt-3.5-turbo".
The process is straightforward; you just need to:
- import the NebulyCallbackHandler from the Nebuly SDK
- pass the NebulyCallbackHandler to your original LangChain method calls
Below you can find a simplified snippet showing how to use it:
import { NebulyCallbackHandler } from '@nebuly-ai/nebuly-js-sdk';

const handler = new NebulyCallbackHandler('<YOUR_USER_ID>', '<NEBULY_API_KEY>');
// Here, attach the handler to the calls of your LangChain chains or agents.
await handler.sendData();
To send the data to Nebuly's platform, the handler's sendData method must be called AFTER the chain has run.
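For example, with a generic chain (here chain stands in for any of your LangChain chains or agents):

// chain is a placeholder for any of your LangChain chains or agents.
const result = await chain.invoke(
  { input: "what is LangSmith?" },
  { callbacks: [handler] }
);
// Only after the run has completed, flush the collected data to Nebuly.
await handler.sendData();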
The required parameters are:
- An ID or username that uniquely identifies the end-user. We recommend hashing their username or email address, so that no identifying information is sent to us (see the sketch below).
- The Nebuly API key. If no API key is explicitly provided, the handler will search for it in the NEBULY_API_KEY environment variable.
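A minimal sketch of deriving an anonymous user ID, assuming a Node.js runtime (the SHA-256 choice and the example email are our own illustration, not an SDK requirement):

import { createHash } from "node:crypto";
import { NebulyCallbackHandler } from "@nebuly-ai/nebuly-js-sdk";

// Hash the user's email so no identifying information leaves your system.
const userId = createHash("sha256").update("user@example.com").digest("hex");

// With no API key argument, the handler falls back to the
// NEBULY_API_KEY environment variable.
const handler = new NebulyCallbackHandler(userId);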
You can then use the platform to analyze the results and get insights about your LLM users.
Chat Models
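The snippet below attaches the handler directly to a chat model call: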
import { ChatOpenAI } from "@langchain/openai";
import { NebulyCallbackHandler } from "@nebuly-ai/nebuly-js-sdk";

const nebulyCallbackHandler = new NebulyCallbackHandler('<YOUR_USER_ID>', '<NEBULY_API_KEY>');

const chatModel = new ChatOpenAI({
  openAIApiKey: "<YOUR_OPENAI_API_KEY>"
});

// Pass the handler in the callbacks option of the call.
const response = await chatModel.invoke(
  "what is LangSmith?",
  {
    callbacks: [nebulyCallbackHandler]
  }
);

// Upload the collected interaction once the call has completed.
await nebulyCallbackHandler.sendData();
Retrievers
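The same pattern applies to retrieval chains: build the chain as usual and pass the handler in the options of the final invoke call: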
import { CheerioWebBaseLoader } from "langchain/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { OpenAIEmbeddings } from "@langchain/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createRetrievalChain } from "langchain/chains/retrieval";
import { ChatOpenAI } from "@langchain/openai";
import { NebulyCallbackHandler } from "@nebuly-ai/nebuly-js-sdk";

const myCallbackHandler = new NebulyCallbackHandler('<YOUR_USER_ID>', '<NEBULY_API_KEY>');
// Load and split the source documents.
const loader = new CheerioWebBaseLoader(
  "https://docs.smith.langchain.com/overview"
);
const docs = await loader.load();
const splitter = new RecursiveCharacterTextSplitter();
const splitDocs = await splitter.splitDocuments(docs);

// Index the chunks in an in-memory vector store.
const embeddings = new OpenAIEmbeddings();
const vectorstore = await MemoryVectorStore.fromDocuments(
  splitDocs,
  embeddings
);

const prompt =
  ChatPromptTemplate.fromTemplate(`Answer the following question based only on the provided context:
<context>
{context}
</context>
Question: {input}`);

const chatModel = new ChatOpenAI({
  openAIApiKey: "<YOUR_OPENAI_API_KEY>"
});

// Combine the retrieved documents with the prompt and the chat model.
const documentChain = await createStuffDocumentsChain({
  llm: chatModel,
  prompt,
});

const retriever = vectorstore.asRetriever();
const retrievalChain = await createRetrievalChain({
  combineDocsChain: documentChain,
  retriever,
});

// Pass the handler in the callbacks option of the final call.
const result = await retrievalChain.invoke(
  {
    input: "what is LangSmith?",
  },
  {
    callbacks: [myCallbackHandler],
  }
);

// Upload the collected interaction once the chain has completed.
await myCallbackHandler.sendData();
Agents
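Finally, the handler can also be attached to an agent executor; as before, it is passed through the callbacks option of the invoke call: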
import { CheerioWebBaseLoader } from "langchain/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { OpenAIEmbeddings } from "@langchain/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { pull } from "langchain/hub";
import { createOpenAIFunctionsAgent, AgentExecutor } from "langchain/agents";
import { createRetrieverTool } from "langchain/tools/retriever";
import { NebulyCallbackHandler } from "@nebuly-ai/nebuly-js-sdk";

const myCallbackHandler = new NebulyCallbackHandler('<YOUR_USER_ID>', '<NEBULY_API_KEY>');
// Load, split, and index the source documents.
const loader = new CheerioWebBaseLoader(
  "https://docs.smith.langchain.com/overview"
);
const docs = await loader.load();
const splitter = new RecursiveCharacterTextSplitter();
const splitDocs = await splitter.splitDocuments(docs);
const embeddings = new OpenAIEmbeddings();
const vectorstore = await MemoryVectorStore.fromDocuments(
  splitDocs,
  embeddings
);
const retriever = vectorstore.asRetriever();

// Create the agent and expose the retriever as a tool.
const retrieverTool = await createRetrieverTool(retriever, {
  name: "langsmith_search",
  description:
    "Search for information about LangSmith. For any questions about LangSmith, you must use this tool!",
});
const tools = [retrieverTool];

const agentPrompt = await pull<ChatPromptTemplate>(
  "hwchase17/openai-functions-agent"
);
const agentModel = new ChatOpenAI({
  modelName: "gpt-3.5-turbo-1106",
  temperature: 0,
  openAIApiKey: "<YOUR_OPENAI_API_KEY>"
});

const agent = await createOpenAIFunctionsAgent({
  llm: agentModel,
  tools,
  prompt: agentPrompt,
});
const agentExecutor = new AgentExecutor({
  agent,
  tools,
  verbose: false,
});

// Pass the handler in the callbacks option of the executor call.
const agentResult = await agentExecutor.invoke(
  {
    chat_history: [
      new HumanMessage("Can LangSmith help test my LLM applications?"),
      new AIMessage("Yes!"),
    ],
    input: "Tell me how",
  },
  {
    callbacks: [myCallbackHandler],
  }
);

// Upload the collected interaction once the agent run has completed.
await myCallbackHandler.sendData();
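As in the previous examples, sendData is awaited only after the agent run has completed, so the full interaction is uploaded to Nebuly's platform for analysis.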