
Class: BaseLLM

llms/base.BaseLLM

LLM Wrapper. Provides a call (and generate) function that takes in a prompt (or prompts) and returns a string.
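The relationship between call and generate can be sketched with a minimal stand-in (illustrative only, not the real langchain classes): generate handles batches of prompts, while call wraps it for the single-prompt, single-string case.

```typescript
// Illustrative stand-in for the BaseLLM contract (SketchLLM and EchoLLM are
// hypothetical names, not part of langchain).
interface Generation { text: string }
interface LLMResult { generations: Generation[][] }

abstract class SketchLLM {
  // Subclasses implement the batched, provider-specific logic here.
  abstract _generate(prompts: string[], stop?: string[]): Promise<LLMResult>;

  async generate(prompts: string[], stop?: string[]): Promise<LLMResult> {
    return this._generate(prompts, stop);
  }

  // Convenience wrapper: one prompt in, the first generation's text out.
  async call(prompt: string, stop?: string[]): Promise<string> {
    const { generations } = await this.generate([prompt], stop);
    return generations[0][0].text;
  }
}

class EchoLLM extends SketchLLM {
  async _generate(prompts: string[]): Promise<LLMResult> {
    // One list of Generations per input prompt.
    return { generations: prompts.map((p) => [{ text: `echo: ${p}` }]) };
  }
}
```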

Hierarchy

• BaseLanguageModel

  ↳ BaseLLM

Constructors

constructor

• new BaseLLM(«destructured»)

Parameters

• «destructured»: BaseLLMParams

Overrides

BaseLanguageModel.constructor

Defined in

langchain/src/llms/base.ts:33

Properties

cache

• Optional cache: BaseCache<Generation[]>

Defined in

langchain/src/llms/base.ts:31


callbackManager

• callbackManager: CallbackManager

Inherited from

BaseLanguageModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

• Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, so that they benefit from the configured concurrency and retry logic.

Inherited from

BaseLanguageModel.caller

Defined in

langchain/src/base_language/index.ts:40
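The retry behavior such a caller provides can be sketched as follows (a hypothetical stand-in, not langchain's actual AsyncCaller implementation): a failing async call is retried a fixed number of times before the last error is rethrown.

```typescript
// Minimal retry sketch: run fn, retrying up to maxRetries extra times on
// failure, then rethrow the last error if every attempt failed.
async function callWithRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // remember the failure and try again
    }
  }
  throw lastError;
}
```

A subclass would route its provider HTTP requests through the caller in this spirit, so transient network errors are retried rather than surfaced immediately.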


name

• name: string

The name of the LLM class.

Defined in

langchain/src/llms/base.ts:29


verbose

• verbose: boolean

Whether to print out response text.

Inherited from

BaseLanguageModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_generate

▸ Abstract _generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input.

Parameters

• prompts: string[]
• stop?: string[]

Returns

Promise<LLMResult>

Defined in

langchain/src/llms/base.ts:57
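A concrete _generate might look like the following sketch (FakeListLLM is a hypothetical subclass for illustration, not part of langchain): it returns canned responses, truncates each at the first stop sequence, and produces one list of Generations per input prompt.

```typescript
// Illustrative shapes matching the documented _generate signature.
interface Generation { text: string }
interface LLMResult { generations: Generation[][] }

class FakeListLLM {
  constructor(private responses: string[]) {}

  async _generate(prompts: string[], stop?: string[]): Promise<LLMResult> {
    const generations = prompts.map((_, i) => {
      // Cycle through the canned responses, one per prompt.
      let text = this.responses[i % this.responses.length];
      // Honor stop sequences by cutting the text at the first match.
      for (const s of stop ?? []) {
        const idx = text.indexOf(s);
        if (idx !== -1) text = text.slice(0, idx);
      }
      return [{ text }];
    });
    return { generations };
  }
}
```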


_identifyingParams

▸ _identifyingParams(): Record<string, any>

Get the identifying parameters of the LLM.

Returns

Record<string, any>

Overrides

BaseLanguageModel._identifyingParams

Defined in

langchain/src/llms/base.ts:140


_llmType

▸ Abstract _llmType(): string

Return the string type key uniquely identifying this class of LLM.

Returns

string

Overrides

BaseLanguageModel._llmType

Defined in

langchain/src/llms/base.ts:147


_modelType

▸ _modelType(): string

Returns

string

Overrides

BaseLanguageModel._modelType

Defined in

langchain/src/llms/base.ts:160


call

▸ call(prompt, stop?): Promise<string>

Convenience wrapper for generate that takes in a single string prompt and returns a single string output.

Parameters

• prompt: string
• stop?: string[]

Returns

Promise<string>

Defined in

langchain/src/llms/base.ts:131


generate

▸ generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input, handling caching.

Parameters

• prompts: string[]
• stop?: string[]

Returns

Promise<LLMResult>

Defined in

langchain/src/llms/base.ts:84


generatePrompt

▸ generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

• promptValues: BasePromptValue[]
• stop?: string[]

Returns

Promise<LLMResult>

Overrides

BaseLanguageModel.generatePrompt

Defined in

langchain/src/llms/base.ts:44


getNumTokens

▸ getNumTokens(text): Promise<number>

Parameters

• text: string

Returns

Promise<number>

Inherited from

BaseLanguageModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62
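Token counts are typically used to check a prompt against a model's context-window limit before calling it. A rough sketch of a common fallback heuristic, assuming roughly four characters per token for English text (the real method prefers a proper tokenizer when one is available):

```typescript
// Approximate token count: ~4 characters per token is a common rule of
// thumb for English text; this is only an estimate, not a tokenizer.
function approxNumTokens(text: string): number {
  return Math.ceil(text.length / 4);
}
```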


serialize

▸ serialize(): SerializedLLM

Return a JSON-like object representing this LLM.

Returns

SerializedLLM

Overrides

BaseLanguageModel.serialize

Defined in

langchain/src/llms/base.ts:152


deserialize

▸ Static deserialize(data): Promise<BaseLLM>

Load an LLM from a JSON-like object describing it.

Parameters

• data: SerializedLLM

Returns

Promise<BaseLLM>

Overrides

BaseLanguageModel.deserialize

Defined in

langchain/src/llms/base.ts:167
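The serialize/deserialize round trip can be sketched with hypothetical stand-in types (SerializedSketch and SketchFakeLLM are illustrative names, not langchain's real types): serialize records the _llmType key alongside the identifying parameters, and deserialize uses that key to decide which concrete class to reconstruct.

```typescript
// Hypothetical serialized shape: a type key plus arbitrary params.
interface SerializedSketch {
  _type: string;
  [key: string]: unknown;
}

class SketchFakeLLM {
  constructor(public temperature: number) {}

  _llmType(): string {
    return "sketch-fake";
  }

  // Capture the type key and identifying params as a JSON-like object.
  serialize(): SerializedSketch {
    return { _type: this._llmType(), temperature: this.temperature };
  }

  // Dispatch on the type key to rebuild the concrete class.
  static async deserialize(data: SerializedSketch): Promise<SketchFakeLLM> {
    if (data._type !== "sketch-fake") {
      throw new Error(`Unknown LLM type: ${data._type}`);
    }
    return new SketchFakeLLM(data.temperature as number);
  }
}
```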