
Class: BaseChatModel

chat_models/base.BaseChatModel

Base class for chat models.

Hierarchy

- BaseLanguageModel

  ↳ BaseChatModel

Constructors

constructor

• Protected new BaseChatModel(«destructured»)

Parameters

| Name | Type |
| :--- | :--- |
| «destructured» | BaseLanguageModelParams |

Overrides

BaseLanguageModel.constructor

Defined in

langchain/src/chat_models/base.ts:31

Properties

callbackManager

• callbackManager: CallbackManager

Inherited from

BaseLanguageModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

• Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

BaseLanguageModel.caller

Defined in

langchain/src/base_language/index.ts:40
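
The concurrency and retry behavior can be pictured with a minimal sketch. This is an illustrative assumption about the caller's role, not the real AsyncCaller implementation (which also limits concurrency and backs off between attempts); `callWithRetry` and its `maxRetries` parameter are hypothetical names.

```typescript
// Simplified sketch of the retry behavior an async caller provides:
// retry a failing promise-returning function up to maxRetries times,
// rethrowing the last error if every attempt fails.
async function callWithRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt += 1) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}
```

A subclass would route its API requests through such a wrapper so transient network failures are retried transparently.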


verbose

• verbose: boolean

Whether to print out response text.

Inherited from

BaseLanguageModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_combineLLMOutput

▸ Optional Abstract _combineLLMOutput(...llmOutputs): undefined | Record<string, any>

Parameters

| Name | Type |
| :--- | :--- |
| ...llmOutputs | (undefined \| Record<string, any>)[] |

Returns

undefined | Record<string, any>

Defined in

langchain/src/chat_models/base.ts:35
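
A typical implementation of this hook merges the per-call llmOutput records from a batched generation into one, for example by summing token usage. The sketch below is a hypothetical example of such a merge, not the code of any concrete subclass; the `tokenUsage` key is an assumption.

```typescript
// Hypothetical _combineLLMOutput-style merge: sum the numeric fields of
// each call's tokenUsage record, skipping undefined entries.
function combineLLMOutput(
  ...llmOutputs: (Record<string, any> | undefined)[]
): Record<string, any> {
  const totals: Record<string, number> = {};
  for (const output of llmOutputs) {
    const usage = output?.tokenUsage ?? {};
    for (const [key, value] of Object.entries(usage)) {
      totals[key] = (totals[key] ?? 0) + (value as number);
    }
  }
  return { tokenUsage: totals };
}
```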


_generate

▸ Abstract _generate(messages, stop?): Promise<ChatResult>

Parameters

| Name | Type |
| :--- | :--- |
| messages | BaseChatMessage[] |
| stop? | string[] |

Returns

Promise<ChatResult>

Defined in

langchain/src/chat_models/base.ts:92
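
A subclass implements _generate to turn one conversation into a ChatResult, honoring optional stop sequences. The sketch below uses simplified stand-in types, not the real BaseChatMessage and ChatResult classes, and a toy echo model in place of a real API call.

```typescript
// Simplified stand-ins for the real langchain types, just to show the
// shape of the _generate contract.
interface BaseChatMessage { text: string; }
interface ChatGeneration { text: string; message: BaseChatMessage; }
interface ChatResult { generations: ChatGeneration[]; llmOutput?: Record<string, any>; }

class EchoChatModel {
  async _generate(messages: BaseChatMessage[], stop?: string[]): Promise<ChatResult> {
    // Echo the last message back; a real model would call an API here.
    const last = messages[messages.length - 1];
    let text = last?.text ?? "";
    // Honor stop sequences by truncating at the first occurrence.
    for (const s of stop ?? []) {
      const idx = text.indexOf(s);
      if (idx >= 0) text = text.slice(0, idx);
    }
    return { generations: [{ text, message: { text } }] };
  }
}
```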


_identifyingParams

▸ _identifyingParams(): Record<string, any>

Get the identifying parameters of the LLM.

Returns

Record<string, any>

Inherited from

BaseLanguageModel._identifyingParams

Defined in

langchain/src/base_language/index.ts:101


_llmType

▸ Abstract _llmType(): string

Returns

string

Overrides

BaseLanguageModel._llmType

Defined in

langchain/src/chat_models/base.ts:80


_modelType

▸ _modelType(): string

Returns

string

Overrides

BaseLanguageModel._modelType

Defined in

langchain/src/chat_models/base.ts:76


call

▸ call(messages, stop?): Promise<BaseChatMessage>

Parameters

| Name | Type |
| :--- | :--- |
| messages | BaseChatMessage[] |
| stop? | string[] |

Returns

Promise<BaseChatMessage>

Defined in

langchain/src/chat_models/base.ts:97
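
call is the single-conversation convenience over the batched path: it runs one list of messages through the model and unwraps the first generated message. A minimal sketch of that relationship, assuming simplified stand-in types rather than the real langchain classes:

```typescript
// Stand-in types; not the real langchain interfaces.
interface BaseChatMessage { text: string; }
interface ChatResult { generations: { text: string; message: BaseChatMessage }[] }

abstract class MiniChatModel {
  abstract _generate(messages: BaseChatMessage[], stop?: string[]): Promise<ChatResult>;

  // call delegates to _generate and returns only the first message.
  async call(messages: BaseChatMessage[], stop?: string[]): Promise<BaseChatMessage> {
    const result = await this._generate(messages, stop);
    return result.generations[0].message;
  }
}

// Toy subclass: "generates" by reversing the last message's text.
class ReverseChatModel extends MiniChatModel {
  async _generate(messages: BaseChatMessage[]): Promise<ChatResult> {
    const text = (messages[messages.length - 1]?.text ?? "")
      .split("").reverse().join("");
    return { generations: [{ text, message: { text } }] };
  }
}
```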


callPrompt

▸ callPrompt(promptValue, stop?): Promise<BaseChatMessage>

Parameters

| Name | Type |
| :--- | :--- |
| promptValue | BasePromptValue |
| stop? | string[] |

Returns

Promise<BaseChatMessage>

Defined in

langchain/src/chat_models/base.ts:106
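
callPrompt accepts a BasePromptValue, which can render itself as chat messages, and forwards those messages to call. A sketch under that assumption, with simplified stand-in types and a hypothetical free function standing in for the method:

```typescript
// Stand-in types; the real BasePromptValue has more to it than this.
interface BaseChatMessage { text: string; }
interface BasePromptValue { toChatMessages(): BaseChatMessage[]; }

// Hypothetical free-function version of callPrompt: render the prompt
// into messages, then delegate to the provided call function.
async function callPrompt(
  promptValue: BasePromptValue,
  call: (messages: BaseChatMessage[], stop?: string[]) => Promise<BaseChatMessage>,
  stop?: string[]
): Promise<BaseChatMessage> {
  return call(promptValue.toChatMessages(), stop);
}
```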


generate

▸ generate(messages, stop?): Promise<LLMResult>

Parameters

| Name | Type |
| :--- | :--- |
| messages | BaseChatMessage[][] |
| stop? | string[] |

Returns

Promise<LLMResult>

Defined in

langchain/src/chat_models/base.ts:39
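
generate accepts several conversations at once (an array of message arrays) and returns one LLMResult whose generations array is parallel to the input. A sketch of that batching contract with simplified stand-in types, not the real langchain classes:

```typescript
// Stand-in types for the batching contract.
interface BaseChatMessage { text: string; }
interface Generation { text: string; }
interface LLMResult { generations: Generation[][]; }

async function generate(
  conversations: BaseChatMessage[][]
): Promise<LLMResult> {
  // Produce one inner array of generations per input conversation; a
  // real model would run each conversation through the chat API.
  const generations = await Promise.all(
    conversations.map(async (messages) => [
      { text: `${messages.length} message(s)` },
    ])
  );
  return { generations };
}
```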


generatePrompt

▸ generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

| Name | Type |
| :--- | :--- |
| promptValues | BasePromptValue[] |
| stop? | string[] |

Returns

Promise<LLMResult>

Overrides

BaseLanguageModel.generatePrompt

Defined in

langchain/src/chat_models/base.ts:82


getNumTokens

▸ getNumTokens(text): Promise<number>

Parameters

| Name | Type |
| :--- | :--- |
| text | string |

Returns

Promise<number>

Inherited from

BaseLanguageModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62
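
A rough rule of thumb for GPT-style tokenizers is about four characters per token, commonly used as a fallback when no tokenizer is available. The helper below shows only that heuristic; it is an approximation for illustration, not the library's exact behavior.

```typescript
// Approximate token count via the ~4-characters-per-token heuristic.
// This is an illustrative fallback, not a real tokenizer.
function approxNumTokens(text: string): number {
  return Math.ceil(text.length / 4);
}
```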


serialize

▸ serialize(): SerializedLLM

Return a JSON-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseLanguageModel.serialize

Defined in

langchain/src/base_language/index.ts:108
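
serialize pairs with the static deserialize below to round-trip a model through a JSON-like record tagged with its type. The sketch illustrates that pattern with a hypothetical TinyModel; its fields are not the real SerializedLLM shape.

```typescript
// Hypothetical serialized shape: a type tag plus arbitrary parameters.
interface SerializedLLM {
  _type: string;
  [key: string]: any;
}

class TinyModel {
  constructor(public temperature: number = 0.7) {}

  _llmType(): string {
    return "tiny";
  }

  // Capture the identifying parameters plus the type tag.
  serialize(): SerializedLLM {
    return { _type: this._llmType(), temperature: this.temperature };
  }

  // Use the tag to decide which class to reconstruct.
  static deserialize(data: SerializedLLM): TinyModel {
    if (data._type !== "tiny") {
      throw new Error(`Unknown model type: ${data._type}`);
    }
    return new TinyModel(data.temperature);
  }
}
```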


deserialize

▸ Static deserialize(data): Promise<BaseLanguageModel>

Load an LLM from a JSON-like object describing it.

Parameters

| Name | Type |
| :--- | :--- |
| data | SerializedLLM |

Returns

Promise<BaseLanguageModel>

Inherited from

BaseLanguageModel.deserialize

Defined in

langchain/src/base_language/index.ts:119