Class: HuggingFaceInference

llms/hf.HuggingFaceInference

LLM class that provides a simpler interface to subclass than BaseLLM.

Subclasses need only implement the simpler _call method instead of _generate.

Hierarchy

  • LLM

    HuggingFaceInference

Implements

  • HFInput

Constructors

constructor

new HuggingFaceInference(fields?)

Parameters

Name | Type
fields? | Partial<HFInput> & BaseLLMParams

Overrides

LLM.constructor

Defined in

langchain/src/llms/hf.ts:43

Properties

apiKey

apiKey: undefined | string = undefined

Implementation of

HFInput.apiKey

Defined in

langchain/src/llms/hf.ts:41


cache

Optional cache: BaseCache<Generation[]>

Inherited from

LLM.cache

Defined in

langchain/src/llms/base.ts:31


callbackManager

callbackManager: CallbackManager

Inherited from

LLM.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

LLM.caller

Defined in

langchain/src/base_language/index.ts:40


frequencyPenalty

frequencyPenalty: undefined | number = undefined

Implementation of

HFInput.frequencyPenalty

Defined in

langchain/src/llms/hf.ts:39


maxTokens

maxTokens: undefined | number = undefined

Implementation of

HFInput.maxTokens

Defined in

langchain/src/llms/hf.ts:33


model

model: string = "gpt2"

Implementation of

HFInput.model

Defined in

langchain/src/llms/hf.ts:29


name

name: string

The name of the LLM class

Inherited from

LLM.name

Defined in

langchain/src/llms/base.ts:29


temperature

temperature: undefined | number = undefined

Implementation of

HFInput.temperature

Defined in

langchain/src/llms/hf.ts:31


topK

topK: undefined | number = undefined

Implementation of

HFInput.topK

Defined in

langchain/src/llms/hf.ts:37


topP

topP: undefined | number = undefined

Implementation of

HFInput.topP

Defined in

langchain/src/llms/hf.ts:35


verbose

verbose: boolean

Whether to print out response text.

Inherited from

LLM.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_call

_call(prompt, _stop?): Promise<string>

Run the LLM on the given prompt and input.

Parameters

Name | Type
prompt | string
_stop? | string[]

Returns

Promise<string>

Overrides

LLM._call

Defined in

langchain/src/llms/hf.ts:69


_generate

_generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input.

Parameters

Name | Type
prompts | string[]
stop? | string[]

Returns

Promise<LLMResult>

Inherited from

LLM._generate

Defined in

langchain/src/llms/base.ts:197


_identifyingParams

_identifyingParams(): Record<string, any>

Get the identifying parameters of the LLM.

Returns

Record<string, any>

Inherited from

LLM._identifyingParams

Defined in

langchain/src/llms/base.ts:140


_llmType

_llmType(): string

Return the string type key uniquely identifying this class of LLM.

Returns

string

Overrides

LLM._llmType

Defined in

langchain/src/llms/hf.ts:65


_modelType

_modelType(): string

Returns

string

Inherited from

LLM._modelType

Defined in

langchain/src/llms/base.ts:160


call

call(prompt, stop?): Promise<string>

Convenience wrapper for generate that takes in a single string prompt and returns a single string output.

Parameters

Name | Type
prompt | string
stop? | string[]

Returns

Promise<string>

Inherited from

LLM.call

Defined in

langchain/src/llms/base.ts:131


generate

generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input, handling caching.

Parameters

Name | Type
prompts | string[]
stop? | string[]

Returns

Promise<LLMResult>

Inherited from

LLM.generate

Defined in

langchain/src/llms/base.ts:84


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

Name | Type
promptValues | BasePromptValue[]
stop? | string[]

Returns

Promise<LLMResult>

Inherited from

LLM.generatePrompt

Defined in

langchain/src/llms/base.ts:44


getNumTokens

getNumTokens(text): Promise<number>

Parameters

Name | Type
text | string

Returns

Promise<number>

Inherited from

LLM.getNumTokens

Defined in

langchain/src/base_language/index.ts:62


serialize

serialize(): SerializedLLM

Return a json-like object representing this LLM.

Returns

SerializedLLM

Inherited from

LLM.serialize

Defined in

langchain/src/llms/base.ts:152


deserialize

Static deserialize(data): Promise<BaseLLM>

Load an LLM from a json-like object describing it.

Parameters

Name | Type
data | SerializedLLM

Returns

Promise<BaseLLM>

Inherited from

LLM.deserialize

Defined in

langchain/src/llms/base.ts:167


imports

Static imports(): Promise<{ HfInference: typeof HfInference }>

Returns

Promise<{ HfInference: typeof HfInference }>

Defined in

langchain/src/llms/hf.ts:88