Class: SimpleChatModel

chat_models/base.SimpleChatModel

Base class for chat models whose core logic is implemented in a single _call method that returns a string.

Hierarchy

BaseChatModel

  ↳ SimpleChatModel

Constructors

constructor

• Protected new SimpleChatModel(«destructured»)

Parameters

Name | Type
«destructured» | BaseLanguageModelParams

Inherited from

BaseChatModel.constructor

Defined in

langchain/src/chat_models/base.ts:31

Properties

callbackManager

• callbackManager: CallbackManager

Inherited from

BaseChatModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

• Protected caller: AsyncCaller

Subclasses should use the async caller for any asynchronous calls, so that those calls benefit from the built-in concurrency and retry logic (see the sketch below).

Inherited from

BaseChatModel.caller

Defined in

langchain/src/base_language/index.ts:40
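
For instance, a subclass's _call could route its network request through this.caller so the shared retry and rate-limiting logic applies. The fragment below is only a sketch of a hypothetical subclass method: the endpoint URL, the use of fetch, and the response shape are assumptions, and the caller.call(fn) usage reflects AsyncCaller in the langchain source rather than anything stated on this page.

async _call(messages: BaseChatMessage[], _stop?: string[]): Promise<string> {
  // Wrap the raw request in this.caller.call so it picks up the shared
  // concurrency and retry handling mentioned above.
  const data = await this.caller.call(async () => {
    const res = await fetch("https://example.com/v1/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages: messages.map((m) => m.text) }),
    });
    return res.json();
  });
  return data.output as string;
}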


verbose

• verbose: boolean

Whether to print out response text.

Inherited from

BaseChatModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_call

▸ Abstract _call(messages, stop?): Promise<string>

Parameters

Name | Type
messages | BaseChatMessage[]
stop? | string[]

Returns

Promise<string>

Defined in

langchain/src/chat_models/base.ts:116
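
A minimal subclass only has to supply _call and _llmType (plus, optionally, _combineLLMOutput). The sketch below is a toy echo model with no real backend; the import paths are assumptions based on the module names shown on this page, and the echo logic is purely illustrative.

import { SimpleChatModel } from "langchain/chat_models/base";
import { BaseChatMessage } from "langchain/schema";

// Toy model that echoes the last message back; a real subclass would call a
// provider API here, ideally through this.caller as sketched above.
class EchoChatModel extends SimpleChatModel {
  _llmType(): string {
    return "echo";
  }

  _combineLLMOutput(): Record<string, any> | undefined {
    // Nothing provider-specific to merge for this toy model.
    return undefined;
  }

  async _call(messages: BaseChatMessage[], stop?: string[]): Promise<string> {
    let text = `Echo: ${messages[messages.length - 1].text}`;
    // Respect stop sequences by truncating at the first match, if any.
    for (const s of stop ?? []) {
      const idx = text.indexOf(s);
      if (idx >= 0) text = text.slice(0, idx);
    }
    return text;
  }
}

Because SimpleChatModel overrides _generate (see below), the string returned by _call is wrapped into a ChatResult for you; the subclass never assembles generations by hand.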


_combineLLMOutput

▸ Optional Abstract _combineLLMOutput(...llmOutputs): undefined | Record<string, any>

Parameters

Name | Type
...llmOutputs | (undefined | Record<string, any>)[]

Returns

undefined | Record<string, any>

Inherited from

BaseChatModel._combineLLMOutput

Defined in

langchain/src/chat_models/base.ts:35


_generate

▸ _generate(messages, stop?): Promise<ChatResult>

Parameters

Name | Type
messages | BaseChatMessage[]
stop? | string[]

Returns

Promise<ChatResult>

Overrides

BaseChatModel._generate

Defined in

langchain/src/chat_models/base.ts:118
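
This override is what turns _call's plain string into a structured result. A sketch of the relationship, using the hypothetical EchoChatModel from the _call section; the exact shape of ChatResult.generations is an assumption based on the langchain source of this era, not stated on this page.

import { HumanChatMessage } from "langchain/schema";

// EchoChatModel is the toy subclass sketched under _call above.
const model = new EchoChatModel({});
const result = await model._generate([new HumanChatMessage("hello")]);
// result.generations[0].text    -> the string returned by _call
// result.generations[0].message -> that same text wrapped in a chat message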


_identifyingParams

▸ _identifyingParams(): Record<string, any>

Get the identifying parameters of the LLM.

Returns

Record<string, any>

Inherited from

BaseChatModel._identifyingParams

Defined in

langchain/src/base_language/index.ts:101


_llmType

▸ Abstract _llmType(): string

Returns

string

Inherited from

BaseChatModel._llmType

Defined in

langchain/src/chat_models/base.ts:80


_modelType

▸ _modelType(): string

Returns

string

Inherited from

BaseChatModel._modelType

Defined in

langchain/src/chat_models/base.ts:76


call

▸ call(messages, stop?): Promise<BaseChatMessage>

Parameters

Name | Type
messages | BaseChatMessage[]
stop? | string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.call

Defined in

langchain/src/chat_models/base.ts:97
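
call is the convenience entry point for a single conversation: pass an array of chat messages and get the model's reply message back. A usage sketch, assuming the message classes come from langchain/schema and reusing the hypothetical EchoChatModel from above:

import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const model = new EchoChatModel({});
const reply = await model.call(
  [
    new SystemChatMessage("You are a terse assistant."),
    new HumanChatMessage("What is the capital of France?"),
  ],
  ["\n"] // optional stop sequences
);
console.log(reply.text);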


callPrompt

▸ callPrompt(promptValue, stop?): Promise<BaseChatMessage>

Parameters

Name | Type
promptValue | BasePromptValue
stop? | string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.callPrompt

Defined in

langchain/src/chat_models/base.ts:106
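
callPrompt accepts a BasePromptValue instead of a raw message array, which is what a chat prompt template produces. A sketch, assuming ChatPromptTemplate and HumanMessagePromptTemplate are available from langchain/prompts as in langchain.js of this period:

import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
} from "langchain/prompts";

// model: an instance of the EchoChatModel sketched under _call above.
const prompt = ChatPromptTemplate.fromPromptMessages([
  HumanMessagePromptTemplate.fromTemplate("Translate to French: {text}"),
]);

const promptValue = await prompt.formatPromptValue({ text: "Good morning" });
const message = await model.callPrompt(promptValue);
console.log(message.text);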


generate

▸ generate(messages, stop?): Promise<LLMResult>

Parameters

Name | Type
messages | BaseChatMessage[][]
stop? | string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generate

Defined in

langchain/src/chat_models/base.ts:39
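
generate is the batch variant: it takes an array of message arrays, one per conversation, and returns an LLMResult whose generations line up with the inputs. A sketch; the generations[i][0].text access pattern is based on the LLMResult shape in langchain.js of this period and is an assumption here.

import { HumanChatMessage } from "langchain/schema";

// model: an instance of the EchoChatModel sketched under _call above.
const result = await model.generate([
  [new HumanChatMessage("First conversation")],
  [new HumanChatMessage("Second conversation")],
]);

// One entry per input conversation; each entry is a list of generations.
for (const generations of result.generations) {
  console.log(generations[0].text);
}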


generatePrompt

▸ generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

Name | Type
promptValues | BasePromptValue[]
stop? | string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generatePrompt

Defined in

langchain/src/chat_models/base.ts:82


getNumTokens

▸ getNumTokens(text): Promise<number>

Parameters

Name | Type
text | string

Returns

Promise<number>

Inherited from

BaseChatModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62
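
getNumTokens is useful for budgeting prompts against a context window; treat the result as an estimate unless the subclass provides its own tokenizer. A usage sketch, reusing the hypothetical model instance from above:

const question = "How many tokens is this sentence likely to use?";
const numTokens = await model.getNumTokens(question);
console.log(`Estimated tokens: ${numTokens}`);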


serialize

▸ serialize(): SerializedLLM

Return a JSON-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseChatModel.serialize

Defined in

langchain/src/base_language/index.ts:108


deserialize

▸ Static deserialize(data): Promise<BaseLanguageModel>

Load an LLM from a JSON-like object describing it.

Parameters

Name | Type
data | SerializedLLM

Returns

Promise<BaseLanguageModel>

Inherited from

BaseChatModel.deserialize

Defined in

langchain/src/base_language/index.ts:119
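
Together, serialize and deserialize are intended for persisting a model's configuration and restoring it later. A sketch of the round trip; whether every chat model of this version can actually be restored this way is not stated on this page, so treat it as illustrative.

import { SimpleChatModel } from "langchain/chat_models/base";

// model: an instance of the EchoChatModel sketched under _call above.
const data = model.serialize();    // SerializedLLM, a JSON-like object
const json = JSON.stringify(data); // safe to write to disk

const restored = await SimpleChatModel.deserialize(JSON.parse(json));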