Events / Callbacks

LangChain provides a callback system that allows you to hook into the various stages of your LLM application. This is useful for logging, monitoring, streaming, and other tasks.

You can subscribe to these events via the callbackManager argument available throughout the API. A CallbackManager is an object that manages a list of CallbackHandlers; when an event is triggered, it calls the appropriate method on each handler.

interface CallbackManager {
  addHandler(handler: CallbackHandler): void;
  removeHandler(handler: CallbackHandler): void;
  setHandlers(handlers: CallbackHandler[]): void;
  setHandler(handler: CallbackHandler): void;
}
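To make the dispatch concrete, here is a minimal, self-contained sketch of the pattern. `MiniCallbackManager` and its simplified `Handler` type are illustrative stand-ins, not langchain's actual implementation:

```typescript
// Simplified handler shape: each event method is optional, mirroring the
// optional methods on the real CallbackHandler interface.
type Handler = {
  handleLLMStart?: (llm: { name: string }, prompts: string[]) => Promise<void>;
};

class MiniCallbackManager {
  private handlers: Handler[] = [];

  addHandler(handler: Handler): void {
    this.handlers.push(handler);
  }

  removeHandler(handler: Handler): void {
    this.handlers = this.handlers.filter((h) => h !== handler);
  }

  setHandlers(handlers: Handler[]): void {
    this.handlers = handlers;
  }

  // Replace all existing handlers with a single one.
  setHandler(handler: Handler): void {
    this.setHandlers([handler]);
  }

  // When an event fires, fan it out to every handler that implements it.
  async handleLLMStart(llm: { name: string }, prompts: string[]): Promise<void> {
    for (const handler of this.handlers) {
      await handler.handleLLMStart?.(llm, prompts);
    }
  }
}

const seen: string[] = [];
const manager = new MiniCallbackManager();
manager.addHandler({
  async handleLLMStart(llm) {
    seen.push(llm.name);
  },
});

await manager.handleLLMStart({ name: "openai" }, ["1 + 2 ="]);
console.log(seen); // [ 'openai' ]
```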

CallbackHandlers are objects that implement the CallbackHandler interface, which defines an optional method for each event that can be subscribed to.

abstract class BaseCallbackHandler {
  handleLLMStart?(
    llm: { name: string },
    prompts: string[],
    verbose?: boolean
  ): Promise<void>;

  handleLLMNewToken?(token: string, verbose?: boolean): Promise<void>;

  handleLLMError?(err: Error, verbose?: boolean): Promise<void>;

  handleLLMEnd?(output: LLMResult, verbose?: boolean): Promise<void>;

  handleChainStart?(
    chain: { name: string },
    inputs: ChainValues,
    verbose?: boolean
  ): Promise<void>;

  handleChainError?(err: Error, verbose?: boolean): Promise<void>;

  handleChainEnd?(outputs: ChainValues, verbose?: boolean): Promise<void>;

  handleToolStart?(
    tool: { name: string },
    input: string,
    verbose?: boolean
  ): Promise<void>;

  handleToolError?(err: Error, verbose?: boolean): Promise<void>;

  handleToolEnd?(output: string, verbose?: boolean): Promise<void>;

  handleText?(text: string, verbose?: boolean): Promise<void>;

  handleAgentAction?(action: AgentAction, verbose?: boolean): Promise<void>;

  handleAgentEnd?(action: AgentFinish, verbose?: boolean): Promise<void>;
}

Using an existing handler

LangChain provides a few built-in handlers that you can use to get started. These are available in the langchain/callbacks module. The most basic handler is the ConsoleCallbackHandler, which simply logs all events to the console. In the future we will add more default handlers to the library.

import { CallbackManager, ConsoleCallbackHandler } from "langchain/callbacks";
import { LLMChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";

export const run = async () => {
  const callbackManager = new CallbackManager();
  callbackManager.addHandler(new ConsoleCallbackHandler());

  const llm = new OpenAI({ temperature: 0, callbackManager });
  const prompt = PromptTemplate.fromTemplate("1 + {number} =");
  const chain = new LLMChain({ prompt, llm, callbackManager });

  await chain.call({ number: 2 });
  /*
  Entering new llm_chain chain...
  Finished chain.
  */
};

Creating a one-off handler

The CallbackManager class offers a static fromHandlers method that allows you to create a one-off handler from plain functions. This is useful if, e.g., you need a handler for a single request, such as streaming the output of an LLM/Agent/etc. to a websocket.

This is a more complete example that passes a CallbackManager to a ChatModel, an LLMChain, a Tool, and an Agent.

import { LLMChain } from "langchain/chains";
import { AgentExecutor, ZeroShotAgent } from "langchain/agents";
import { CallbackManager } from "langchain/callbacks";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { Calculator } from "langchain/tools/calculator";

export const run = async () => {
  // Create a callback manager that will be used throughout
  const callbackManager = CallbackManager.fromHandlers({
    async handleLLMNewToken(token: string) {
      console.log("token", { token });
    },
    async handleLLMStart(llm, _prompts: string[]) {
      console.log("handleLLMStart", { llm });
    },
    async handleChainStart(chain) {
      console.log("handleChainStart", { chain });
    },
    async handleAgentAction(action) {
      console.log("handleAgentAction", action);
    },
    async handleToolStart(tool) {
      console.log("handleToolStart", { tool });
    },
  });

  const model = new ChatOpenAI({
    temperature: 0,
    callbackManager, // this is needed to see handleLLMStart and handleLLMNewToken
    streaming: true, // needed to enable streaming, which enables handleLLMNewToken
  });

  const tools = [
    new Calculator(
      true,
      callbackManager /* this is needed to see handleToolStart */
    ),
  ];
  const agentPrompt = ZeroShotAgent.createPrompt(tools);
  const llmChain = new LLMChain({
    llm: model,
    prompt: agentPrompt,
    callbackManager, // this is needed to see handleChainStart
  });
  const agent = new ZeroShotAgent({
    llmChain,
    allowedTools: ["calculator"], // must match the names of the tools above
  });

  const agentExecutor = AgentExecutor.fromAgentAndTools({
    agent,
    tools,
    callbackManager, // this is needed to see handleAgentAction
  });

  const result = await agentExecutor.call({
    input: "What is 2 to the power of 8",
  });
  /*
  handleChainStart { chain: { name: 'agent_executor' } }
  handleChainStart { chain: { name: 'llm_chain' } }
  handleLLMStart { llm: { name: 'openai' } }
  token { token: '' }
  token { token: 'I' }
  token { token: ' need' }
  token { token: ' to' }
  token { token: ' calculate' }
  token { token: ' ' }
  token { token: '2' }
  token { token: ' raised' }
  token { token: ' to' }
  token { token: ' the' }
  token { token: ' power' }
  token { token: ' of' }
  token { token: ' ' }
  token { token: '8' }
  token { token: '\n' }
  token { token: 'Action' }
  token { token: ':' }
  token { token: ' calculator' }
  token { token: '\n' }
  token { token: 'Action' }
  token { token: ' Input' }
  token { token: ':' }
  token { token: ' ' }
  token { token: '2' }
  token { token: '^' }
  token { token: '8' }
  token { token: '' }
  handleAgentAction {
    tool: 'calculator',
    toolInput: '2^8',
    log: 'I need to calculate 2 raised to the power of 8\n' +
      'Action: calculator\n' +
      'Action Input: 2^8'
  }
  handleToolStart { tool: { name: 'calculator' } }
  handleChainStart { chain: { name: 'llm_chain' } }
  handleLLMStart { llm: { name: 'openai' } }
  token { token: '' }
  token { token: 'That' }
  token { token: "'s" }
  token { token: ' the' }
  token { token: ' answer' }
  token { token: ' to' }
  token { token: ' the' }
  token { token: ' question' }
  token { token: '\n' }
  token { token: 'Final' }
  token { token: ' Answer' }
  token { token: ':' }
  token { token: ' ' }
  token { token: '256' }
  token { token: '' }
  */

  console.log(result);
  /*
  { output: '256' }
  */
};

Creating a custom handler

You can also create your own handler by extending the BaseCallbackHandler class. This is useful if you want to do something more complex than just logging to the console, e.g. sending the events to a logging service. As an example, here is a simple handler that mirrors the behavior of the built-in ConsoleCallbackHandler:

import { BaseCallbackHandler } from "langchain/callbacks";
import { AgentAction, AgentFinish, ChainValues } from "langchain/schema";

export class MyCallbackHandler extends BaseCallbackHandler {
  async handleChainStart(chain: { name: string }) {
    console.log(`Entering new ${chain.name} chain...`);
  }

  async handleChainEnd(_output: ChainValues) {
    console.log("Finished chain.");
  }

  async handleAgentAction(action: AgentAction) {
    console.log(action.log);
  }

  async handleToolEnd(output: string) {
    console.log(output);
  }

  async handleText(text: string) {
    console.log(text);
  }

  async handleAgentEnd(action: AgentFinish) {
    console.log(action.log);
  }
}

You could then use it as described in the section above.
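If you just want to see what such a handler produces, you can also drive it directly, simulating the events a chain would emit. The sketch below is self-contained: the stub BaseCallbackHandler class and the lines array are stand-ins for illustration, not langchain APIs.

```typescript
// Stand-in for langchain's BaseCallbackHandler, so this sketch runs on
// its own; in a real project you would extend the imported class.
abstract class BaseCallbackHandler {}

// Output is collected in an array rather than logged, for easy inspection.
const lines: string[] = [];

class MyCallbackHandler extends BaseCallbackHandler {
  async handleChainStart(chain: { name: string }) {
    lines.push(`Entering new ${chain.name} chain...`);
  }

  async handleChainEnd() {
    lines.push("Finished chain.");
  }
}

// Invoke the handler directly with the events an LLMChain run would fire.
const handler = new MyCallbackHandler();
await handler.handleChainStart({ name: "llm_chain" });
await handler.handleChainEnd();

console.log(lines.join("\n"));
// Entering new llm_chain chain...
// Finished chain.
```

This reproduces the same output as the ConsoleCallbackHandler example earlier on this page.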