
Class: OpenAIChat

llms/openai.OpenAIChat

Wrapper around OpenAI large language models that use the Chat endpoint.

To use this class, install the openai package and set the OPENAI_API_KEY environment variable.

Remarks

Any parameters that are valid to pass to openai.createChatCompletion can be passed through modelKwargs, even if not explicitly available on this class.
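
Example

A minimal sketch of forwarding an endpoint parameter this class does not expose directly; user is a real createChatCompletion parameter, used here purely for illustration:

import { OpenAIChat } from "langchain/llms/openai";

// "user" is forwarded to the underlying createChatCompletion call
// even though OpenAIChat has no dedicated property for it.
const model = new OpenAIChat({
  modelKwargs: { user: "example-end-user-id" },
});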

Hierarchy

  • LLM

    OpenAIChat

Implements

  • OpenAIInput

Constructors

constructor

new OpenAIChat(fields?, configuration?)

Parameters

Name            Type
fields?         Partial<OpenAIInput> & BaseLLMParams & { openAIApiKey?: string }
configuration?  ConfigurationParameters
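
Example

A construction sketch; openAIApiKey falls back to the OPENAI_API_KEY environment variable when omitted, and the optional second argument is passed through to the openai SDK's Configuration (basePath is shown only as an illustrative override):

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat(
  { modelName: "gpt-3.5-turbo", temperature: 0, openAIApiKey: "sk-..." },
  // Optional: ConfigurationParameters for the openai SDK.
  { basePath: "https://api.openai.com/v1" }
);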

Overrides

LLM.constructor

Defined in

langchain/src/llms/openai-chat.ts:118

Properties

cache

Optional cache: BaseCache<Generation[]>

Inherited from

LLM.cache

Defined in

langchain/src/llms/base.ts:31


callbackManager

callbackManager: CallbackManager

Inherited from

LLM.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, so that they benefit from the concurrency and retry logic.

Inherited from

LLM.caller

Defined in

langchain/src/base_language/index.ts:40


frequencyPenalty

frequencyPenalty: number = 0

Implementation of

OpenAIInput.frequencyPenalty

Defined in

langchain/src/llms/openai-chat.ts:92


logitBias

Optional logitBias: Record<string, number>

Implementation of

OpenAIInput.logitBias

Defined in

langchain/src/llms/openai-chat.ts:98


maxTokens

Optional maxTokens: number

Implementation of

OpenAIInput.maxTokens

Defined in

langchain/src/llms/openai-chat.ts:100


modelKwargs

Optional modelKwargs: Kwargs

Implementation of

OpenAIInput.modelKwargs

Defined in

langchain/src/llms/openai-chat.ts:106


modelName

modelName: string = "gpt-3.5-turbo"

Implementation of

OpenAIInput.modelName

Defined in

langchain/src/llms/openai-chat.ts:102


n

n: number = 1

Implementation of

OpenAIInput.n

Defined in

langchain/src/llms/openai-chat.ts:96


name

name: string

The name of the LLM class

Inherited from

LLM.name

Defined in

langchain/src/llms/base.ts:29


prefixMessages

Optional prefixMessages: ChatCompletionRequestMessage[]

Implementation of

OpenAIInput.prefixMessages

Defined in

langchain/src/llms/openai-chat.ts:104
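
Example

Because this wrapper exposes a plain-string LLM interface over the chat endpoint, prefixMessages is where a system message (or few-shot turns) can be pinned ahead of every prompt. A sketch:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat({
  prefixMessages: [
    { role: "system", content: "You are a terse assistant. Answer in one sentence." },
  ],
});
// The prompt below is appended as a user message after the prefix.
const answer = await model.call("What is TypeScript?");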


presencePenalty

presencePenalty: number = 0

Implementation of

OpenAIInput.presencePenalty

Defined in

langchain/src/llms/openai-chat.ts:94


stop

Optional stop: string[]

Implementation of

OpenAIInput.stop

Defined in

langchain/src/llms/openai-chat.ts:110


streaming

streaming: boolean = false

Implementation of

OpenAIInput.streaming

Defined in

langchain/src/llms/openai-chat.ts:112
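
Example

When streaming is true, tokens are emitted through the callback manager as they arrive rather than returned only at the end. A sketch, assuming the CallbackManager.fromHandlers helper from langchain/callbacks available in contemporaneous versions of the library; in versions without it, pass any handler implementing handleLLMNewToken:

import { OpenAIChat } from "langchain/llms/openai";
import { CallbackManager } from "langchain/callbacks";

const model = new OpenAIChat({
  streaming: true,
  callbackManager: CallbackManager.fromHandlers({
    // Invoked once per token as the response streams in.
    async handleLLMNewToken(token: string) {
      process.stdout.write(token);
    },
  }),
});
await model.call("Write a haiku about the sea.");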


temperature

temperature: number = 1

Implementation of

OpenAIInput.temperature

Defined in

langchain/src/llms/openai-chat.ts:88


timeout

Optional timeout: number

Implementation of

OpenAIInput.timeout

Defined in

langchain/src/llms/openai-chat.ts:108


topP

topP: number = 1

Implementation of

OpenAIInput.topP

Defined in

langchain/src/llms/openai-chat.ts:90


verbose

verbose: boolean

Whether to print out response text.

Inherited from

LLM.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_call

_call(prompt, stop?): Promise<string>

Call out to OpenAI's chat completion endpoint with the given prompt.

Example

import { OpenAIChat } from "langchain/llms/openai";
const model = new OpenAIChat();
const response = await model.call("Tell me a joke.");

Parameters

Name     Type       Description
prompt   string     The prompt to pass into the model.
stop?    string[]   Optional list of stop words to use when generating.

Returns

Promise<string>

The full LLM output.

Overrides

LLM._call

Defined in

langchain/src/llms/openai-chat.ts:222


_generate

_generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input.

Parameters

Name      Type
prompts   string[]
stop?     string[]

Returns

Promise<LLMResult>

Inherited from

LLM._generate

Defined in

langchain/src/llms/base.ts:197


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

Name         Type
model_name   string

Overrides

LLM._identifyingParams

Defined in

langchain/src/llms/openai-chat.ts:180


_llmType

_llmType(): string

Return the string type key uniquely identifying this class of LLM.

Returns

string

Overrides

LLM._llmType

Defined in

langchain/src/llms/openai-chat.ts:341


_modelType

_modelType(): string

Returns

string

Inherited from

LLM._modelType

Defined in

langchain/src/llms/base.ts:160


call

call(prompt, stop?): Promise<string>

Convenience wrapper for generate that takes in a single string prompt and returns a single string output.

Parameters

Name     Type
prompt   string
stop?    string[]

Returns

Promise<string>

Inherited from

LLM.call

Defined in

langchain/src/llms/base.ts:131
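
Example

Typical usage, per the signature above; the optional second argument supplies stop words:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat({ temperature: 0 });
// Generation ends early at the first stop word (here, a newline).
const text = await model.call("Translate 'hello' to French.", ["\n"]);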


generate

generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts and input, handling caching.

Parameters

Name      Type
prompts   string[]
stop?     string[]

Returns

Promise<LLMResult>

Inherited from

LLM.generate

Defined in

langchain/src/llms/base.ts:84
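
Example

generate accepts several prompts at once and returns an LLMResult whose generations array is indexed first by prompt, then by choice (n completions per prompt). A sketch:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat();
const result = await model.generate(["Tell me a joke.", "Tell me a riddle."]);
// generations[promptIndex][choiceIndex].text holds each completion.
console.log(result.generations[0][0].text);
console.log(result.generations[1][0].text);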


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

Name           Type
promptValues   BasePromptValue[]
stop?          string[]

Returns

Promise<LLMResult>

Inherited from

LLM.generatePrompt

Defined in

langchain/src/llms/base.ts:44


getNumTokens

getNumTokens(text): Promise<number>

Parameters

Name   Type
text   string

Returns

Promise<number>

Inherited from

LLM.getNumTokens

Defined in

langchain/src/base_language/index.ts:62
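
Example

Useful for budgeting a prompt against the model's context window before sending it; note that some versions fall back to a rough characters-per-token estimate when the tokenizer cannot be loaded. A sketch:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat({ maxTokens: 256 });
const prompt = "Summarize the plot of Hamlet in two sentences.";
// Leave room for the completion itself when sizing maxTokens.
const promptTokens = await model.getNumTokens(prompt);
console.log(`prompt uses ${promptTokens} tokens`);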


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model

Returns

Object

Name         Type
model_name   string

Defined in

langchain/src/llms/openai-chat.ts:191


invocationParams

invocationParams(): Omit<CreateChatCompletionRequest, "messages"> & Kwargs

Get the parameters used to invoke the model

Returns

Omit<CreateChatCompletionRequest, "messages"> & Kwargs

Defined in

langchain/src/llms/openai-chat.ts:164
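
Example

Handy for logging or debugging exactly what will be sent to the chat completion endpoint (everything except the messages array). A sketch:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat({ temperature: 0.2, maxTokens: 64 });
const params = model.invocationParams();
// e.g. { model: "gpt-3.5-turbo", temperature: 0.2, max_tokens: 64, ... }
console.log(params.model, params.temperature);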


serialize

serialize(): SerializedLLM

Return a json-like object representing this LLM.

Returns

SerializedLLM

Inherited from

LLM.serialize

Defined in

langchain/src/llms/base.ts:152


deserialize

Static deserialize(data): Promise<BaseLLM>

Load an LLM from a json-like object describing it.

Parameters

Name   Type
data   SerializedLLM

Returns

Promise<BaseLLM>

Inherited from

LLM.deserialize

Defined in

langchain/src/llms/base.ts:167
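
Example

Together, serialize and the static deserialize allow round-tripping a configured model, for example to persist it as JSON. A sketch; deserialize returns a BaseLLM, so cast if a concrete type is needed:

import { OpenAIChat } from "langchain/llms/openai";

const model = new OpenAIChat({ temperature: 0 });
const saved = model.serialize();                 // JSON-serializable description
const restored = await OpenAIChat.deserialize(saved);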