Class: ChatOpenAI

chat_models/openai.ChatOpenAI

Wrapper around OpenAI large language models that use the Chat endpoint.

To use this class, you should have the openai package installed and the OPENAI_API_KEY environment variable set.

Remarks

Any parameters that are valid to pass to openai.createChatCompletion can be passed through modelKwargs, even if they are not explicitly available on this class.
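
For example, a minimal sketch (the values are illustrative; user is a standard chat completion parameter that is not surfaced directly on this class):

import { ChatOpenAI } from "langchain/chat_models/openai";

// Pass `user` through to the underlying request via modelKwargs.
const chat = new ChatOpenAI({
  modelKwargs: { user: "example-user-id" },
});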

Hierarchy

  • BaseChatModel

      ↳ ChatOpenAI

Implements

  • OpenAIInput

Constructors

constructor

new ChatOpenAI(fields?, configuration?)

Parameters

Name            Type
fields?         Partial<OpenAIInput> & BaseLanguageModelParams & { cache?: boolean ; concurrency?: number ; openAIApiKey?: string }
configuration?  ConfigurationParameters

Overrides

BaseChatModel.constructor

Defined in

langchain/src/chat_models/openai.ts:165
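
A construction sketch (values are illustrative, not defaults; assumes OPENAI_API_KEY is set in the environment):

import { ChatOpenAI } from "langchain/chat_models/openai";

const chat = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0.7,
  maxTokens: 256,
  // concurrency caps parallel requests via the underlying AsyncCaller.
  concurrency: 2,
});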

Properties

callbackManager

callbackManager: CallbackManager

Inherited from

BaseChatModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

Subclasses should use this async caller to make any async calls, so those calls benefit from the configured concurrency and retry logic.

Inherited from

BaseChatModel.caller

Defined in

langchain/src/base_language/index.ts:40


frequencyPenalty

frequencyPenalty: number = 0

Implementation of

OpenAIInput.frequencyPenalty

Defined in

langchain/src/chat_models/openai.ts:141


logitBias

Optional logitBias: Record<string, number>

Implementation of

OpenAIInput.logitBias

Defined in

langchain/src/chat_models/openai.ts:147


maxTokens

Optional maxTokens: number

Implementation of

OpenAIInput.maxTokens

Defined in

langchain/src/chat_models/openai.ts:159


modelKwargs

Optional modelKwargs: Kwargs

Implementation of

OpenAIInput.modelKwargs

Defined in

langchain/src/chat_models/openai.ts:151


modelName

modelName: string = "gpt-3.5-turbo"

Implementation of

OpenAIInput.modelName

Defined in

langchain/src/chat_models/openai.ts:149


n

n: number = 1

Implementation of

OpenAIInput.n

Defined in

langchain/src/chat_models/openai.ts:145


presencePenalty

presencePenalty: number = 0

Implementation of

OpenAIInput.presencePenalty

Defined in

langchain/src/chat_models/openai.ts:143


stop

Optional stop: string[]

Implementation of

OpenAIInput.stop

Defined in

langchain/src/chat_models/openai.ts:153


streaming

streaming: boolean = false

Implementation of

OpenAIInput.streaming

Defined in

langchain/src/chat_models/openai.ts:157
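
A streaming sketch, assuming CallbackManager.fromHandlers from langchain/callbacks; when streaming is true, each token is delivered through the handleLLMNewToken callback:

import { ChatOpenAI } from "langchain/chat_models/openai";
import { CallbackManager } from "langchain/callbacks";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({
  streaming: true,
  callbackManager: CallbackManager.fromHandlers({
    // Called once per streamed token.
    async handleLLMNewToken(token: string) {
      process.stdout.write(token);
    },
  }),
});
await chat.call([new HumanChatMessage("Write a haiku about rivers.")]);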


temperature

temperature: number = 1

Implementation of

OpenAIInput.temperature

Defined in

langchain/src/chat_models/openai.ts:137


timeout

Optional timeout: number

Implementation of

OpenAIInput.timeout

Defined in

langchain/src/chat_models/openai.ts:155


topP

topP: number = 1

Implementation of

OpenAIInput.topP

Defined in

langchain/src/chat_models/openai.ts:139


verbose

verbose: boolean

Whether to print out response text.

Inherited from

BaseChatModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_combineLLMOutput

_combineLLMOutput(...llmOutputs): OpenAILLMOutput

Parameters

Name           Type
...llmOutputs  OpenAILLMOutput[]

Returns

OpenAILLMOutput

Overrides

BaseChatModel._combineLLMOutput

Defined in

langchain/src/chat_models/openai.ts:454


_generate

_generate(messages, stop?): Promise<ChatResult>

Calls out to OpenAI's chat completions endpoint with the provided messages.

Example

import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI();
const response = await chat.call([new HumanChatMessage("Tell me a joke.")]);

Parameters

Name      Type               Description
messages  BaseChatMessage[]  The messages to pass into the model.
stop?     string[]           Optional list of stop words to use when generating.

Returns

Promise<ChatResult>

The full LLM output.

Overrides

BaseChatModel._generate

Defined in

langchain/src/chat_models/openai.ts:258


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

Name        Type
model_name  string

Overrides

BaseChatModel._identifyingParams

Defined in

langchain/src/chat_models/openai.ts:228


_llmType

_llmType(): string

Returns

string

Overrides

BaseChatModel._llmType

Defined in

langchain/src/chat_models/openai.ts:450


_modelType

_modelType(): string

Returns

string

Inherited from

BaseChatModel._modelType

Defined in

langchain/src/chat_models/base.ts:76


call

call(messages, stop?): Promise<BaseChatMessage>

Parameters

Name      Type
messages  BaseChatMessage[]
stop?     string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.call

Defined in

langchain/src/chat_models/base.ts:97
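
A usage sketch (message classes are from langchain/schema; the resolved value is a BaseChatMessage whose text property holds the reply):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const chat = new ChatOpenAI();
const response = await chat.call([
  new SystemChatMessage("You are a terse assistant."),
  new HumanChatMessage("What is the capital of France?"),
]);
console.log(response.text);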


callPrompt

callPrompt(promptValue, stop?): Promise<BaseChatMessage>

Parameters

Name         Type
promptValue  BasePromptValue
stop?        string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.callPrompt

Defined in

langchain/src/chat_models/base.ts:106
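
A sketch that builds the BasePromptValue from a chat prompt template (template classes from langchain/prompts; the variable names are illustrative):

import { ChatOpenAI } from "langchain/chat_models/openai";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate,
} from "langchain/prompts";

const chat = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromPromptMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "You translate {input_language} to {output_language}."
  ),
  HumanMessagePromptTemplate.fromTemplate("{text}"),
]);
// formatPromptValue resolves the template into a BasePromptValue.
const promptValue = await prompt.formatPromptValue({
  input_language: "English",
  output_language: "French",
  text: "I love programming.",
});
const response = await chat.callPrompt(promptValue);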


generate

generate(messages, stop?): Promise<LLMResult>

Parameters

Name      Type
messages  BaseChatMessage[][]
stop?     string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generate

Defined in

langchain/src/chat_models/base.ts:39
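
A batching sketch: each inner array is one conversation, and the LLMResult carries one generations entry per conversation plus shared llmOutput (e.g. token usage):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI();
const result = await chat.generate([
  [new HumanChatMessage("Tell me a joke.")],
  [new HumanChatMessage("Tell me a limerick.")],
]);
console.log(result.generations.length); // 2, one per conversation
console.log(result.llmOutput); // provider metadata such as token usage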


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

Name          Type
promptValues  BasePromptValue[]
stop?         string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generatePrompt

Defined in

langchain/src/chat_models/base.ts:82


getNumTokens

getNumTokens(text): Promise<number>

Parameters

Name  Type
text  string

Returns

Promise<number>

Inherited from

BaseChatModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62
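
A counting sketch (the input string is illustrative):

import { ChatOpenAI } from "langchain/chat_models/openai";

const chat = new ChatOpenAI();
const count = await chat.getNumTokens("How many tokens is this sentence?");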


getNumTokensFromMessages

getNumTokensFromMessages(messages): Promise<{ countPerMessage: number[] ; totalCount: number }>

Parameters

Name      Type
messages  BaseChatMessage[]

Returns

Promise<{ countPerMessage: number[] ; totalCount: number }>

Defined in

langchain/src/chat_models/openai.ts:394
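
A sketch of counting tokens for a message list before sending it (the messages are illustrative):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const chat = new ChatOpenAI();
const { totalCount, countPerMessage } = await chat.getNumTokensFromMessages([
  new SystemChatMessage("You are a helpful assistant."),
  new HumanChatMessage("Hello!"),
]);
console.log(totalCount, countPerMessage);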


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model.

Returns

Object

Name        Type
model_name  string

Defined in

langchain/src/chat_models/openai.ts:239


invocationParams

invocationParams(): Omit<CreateChatCompletionRequest, "messages"> & Kwargs

Get the parameters used to invoke the model.

Returns

Omit<CreateChatCompletionRequest, "messages"> & Kwargs

Defined in

langchain/src/chat_models/openai.ts:212
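
A quick inspection sketch (the returned keys follow the createChatCompletion request shape, e.g. model, temperature, top_p):

import { ChatOpenAI } from "langchain/chat_models/openai";

const chat = new ChatOpenAI({ temperature: 0 });
// Everything sent to the chat completions API except the messages.
console.log(chat.invocationParams());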


serialize

serialize(): SerializedLLM

Returns a JSON-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseChatModel.serialize

Defined in

langchain/src/base_language/index.ts:108


deserialize

Static deserialize(data): Promise<BaseLanguageModel>

Loads an LLM from a JSON-like object describing it.

Parameters

Name  Type
data  SerializedLLM

Returns

Promise<BaseLanguageModel>

Inherited from

BaseChatModel.deserialize

Defined in

langchain/src/base_language/index.ts:119
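
A round-trip sketch (persisting the serialized object as JSON is assumed to happen elsewhere):

import { ChatOpenAI } from "langchain/chat_models/openai";

const chat = new ChatOpenAI({ temperature: 0.5 });
const serialized = chat.serialize();
// ...store `serialized`, then later reconstruct the model:
const restored = await ChatOpenAI.deserialize(serialized);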