
Class: ChatAnthropic

chat_models/anthropic.ChatAnthropic

Wrapper around Anthropic large language models.

To use this model, you should have the @anthropic-ai/sdk package installed and the ANTHROPIC_API_KEY environment variable set.

Remarks

Any parameters that are valid to be passed to anthropic.complete can be passed through invocationKwargs, even if not explicitly available on this class.

Hierarchy

  • BaseChatModel

    ↳ ChatAnthropic

Implements

  • AnthropicInput

Constructors

constructor

new ChatAnthropic(fields?)

Parameters

  • fields?: Partial<AnthropicInput> & BaseLanguageModelParams & { anthropicApiKey?: string }

Overrides

BaseChatModel.constructor

Defined in

langchain/src/chat_models/anthropic.ts:130

Properties

apiKey

Optional apiKey: string

Implementation of

AnthropicInput.apiKey

Defined in

langchain/src/chat_models/anthropic.ts:106


callbackManager

callbackManager: CallbackManager

Inherited from

BaseChatModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

BaseChatModel.caller

Defined in

langchain/src/base_language/index.ts:40


invocationKwargs

Optional invocationKwargs: Kwargs

Implementation of

AnthropicInput.invocationKwargs

Defined in

langchain/src/chat_models/anthropic.ts:118


maxTokensToSample

maxTokensToSample: number = 2048

Implementation of

AnthropicInput.maxTokensToSample

Defined in

langchain/src/chat_models/anthropic.ts:114


modelName

modelName: string = "claude-v1"

Implementation of

AnthropicInput.modelName

Defined in

langchain/src/chat_models/anthropic.ts:116


stopSequences

Optional stopSequences: string[]

Implementation of

AnthropicInput.stopSequences

Defined in

langchain/src/chat_models/anthropic.ts:120


streaming

streaming: boolean = false

Implementation of

AnthropicInput.streaming

Defined in

langchain/src/chat_models/anthropic.ts:122


temperature

temperature: number = 1

Implementation of

AnthropicInput.temperature

Defined in

langchain/src/chat_models/anthropic.ts:108


topK

topK: number = -1

Implementation of

AnthropicInput.topK

Defined in

langchain/src/chat_models/anthropic.ts:110


topP

topP: number = -1

Implementation of

AnthropicInput.topP

Defined in

langchain/src/chat_models/anthropic.ts:112


verbose

verbose: boolean

Whether to print out response text.

Inherited from

BaseChatModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_combineLLMOutput

_combineLLMOutput(): never[]

Returns

never[]

Overrides

BaseChatModel._combineLLMOutput

Defined in

langchain/src/chat_models/anthropic.ts:296


_generate

_generate(messages, stopSequences?): Promise<ChatResult>

Calls out to Anthropic's completion endpoint with the given list of chat messages.

Example

import { ChatAnthropic } from "langchain/chat_models/anthropic";
import { HumanChatMessage } from "langchain/schema";

const anthropic = new ChatAnthropic();
const response = await anthropic.generate([
  [new HumanChatMessage("Tell me a joke.")],
]);

Parameters

  • messages: BaseChatMessage[] (the messages to pass into the model)
  • stopSequences?: string[] (optional list of stop sequences to use when generating)

Returns

Promise<ChatResult>

The full LLM output.

Overrides

BaseChatModel._generate

Defined in

langchain/src/chat_models/anthropic.ts:222


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

  • model_name: string

Overrides

BaseChatModel._identifyingParams

Defined in

langchain/src/chat_models/anthropic.ts:177


_llmType

_llmType(): string

Returns

string

Overrides

BaseChatModel._llmType

Defined in

langchain/src/chat_models/anthropic.ts:292


_modelType

_modelType(): string

Returns

string

Inherited from

BaseChatModel._modelType

Defined in

langchain/src/chat_models/base.ts:76


call

call(messages, stop?): Promise<BaseChatMessage>

Parameters

  • messages: BaseChatMessage[]
  • stop?: string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.call

Defined in

langchain/src/chat_models/base.ts:97


callPrompt

callPrompt(promptValue, stop?): Promise<BaseChatMessage>

Parameters

  • promptValue: BasePromptValue
  • stop?: string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.callPrompt

Defined in

langchain/src/chat_models/base.ts:106


generate

generate(messages, stop?): Promise<LLMResult>

Parameters

  • messages: BaseChatMessage[][]
  • stop?: string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generate

Defined in

langchain/src/chat_models/base.ts:39


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

  • promptValues: BasePromptValue[]
  • stop?: string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generatePrompt

Defined in

langchain/src/chat_models/base.ts:82


getNumTokens

getNumTokens(text): Promise<number>

Parameters

  • text: string

Returns

Promise<number>

Inherited from

BaseChatModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model

Returns

Object

  • model_name: string

Defined in

langchain/src/chat_models/anthropic.ts:187


invocationParams

invocationParams(): Omit<SamplingParameters, "prompt"> & Kwargs

Get the parameters used to invoke the model

Returns

Omit<SamplingParameters, "prompt"> & Kwargs

Defined in

langchain/src/chat_models/anthropic.ts:164


serialize

serialize(): SerializedLLM

Return a json-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseChatModel.serialize

Defined in

langchain/src/base_language/index.ts:108


deserialize

Static deserialize(data): Promise<BaseLanguageModel>

Load an LLM from a json-like object describing it.

Parameters

  • data: SerializedLLM

Returns

Promise<BaseLanguageModel>

Inherited from

BaseChatModel.deserialize

Defined in

langchain/src/base_language/index.ts:119