Class: ChatOpenAI
chat_models/openai.ChatOpenAI
Wrapper around OpenAI large language models that use the Chat endpoint.
To use this class you should have the openai package installed, with the
OPENAI_API_KEY environment variable set.
Remarks
Any parameters that are valid to be passed to openai.createChatCompletion can be passed through modelKwargs, even
if they are not explicitly available on this class.
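For illustration, the merge can be sketched as below. This is a minimal sketch of the assumed behavior only; `buildRequest` and the `user` field are illustrative names, not part of the library:

```typescript
// Minimal sketch (assumption): fields given via modelKwargs are spread
// into the request body alongside the explicitly supported parameters.
type Kwargs = Record<string, unknown>;

function buildRequest(
  base: { model: string; temperature: number },
  modelKwargs: Kwargs = {}
) {
  // modelKwargs is spread last, so it can add request fields this class
  // does not expose directly (e.g. OpenAI's `user` field).
  return { ...base, ...modelKwargs };
}

const req = buildRequest(
  { model: "gpt-3.5-turbo", temperature: 1 },
  { user: "my-app" }
);
```

In this sketch a name collision would let modelKwargs win; whether the real implementation permits overriding explicit parameters is not specified here.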
Hierarchy
BaseChatModel

↳ ChatOpenAI
Implements
OpenAIInput
Constructors
constructor
• new ChatOpenAI(fields?, configuration?)
Parameters
| Name | Type | 
|---|---|
| fields? | Partial<OpenAIInput> & BaseLanguageModelParams & { cache?: boolean; concurrency?: number; openAIApiKey?: string } |
| configuration? | ConfigurationParameters |
Overrides
BaseChatModel.constructor
Defined in
langchain/src/chat_models/openai.ts:165
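A sketch of a typical fields argument, using the option names documented in the Properties section below. `ChatOpenAIFields` is an illustrative local type standing in for the real constructor parameter, not the library's `OpenAIInput`:

```typescript
// Illustrative subset of the constructor's `fields` argument, using the
// option names documented on this page.
interface ChatOpenAIFields {
  modelName?: string;    // defaults to "gpt-3.5-turbo"
  temperature?: number;  // defaults to 1
  topP?: number;         // defaults to 1
  n?: number;            // defaults to 1
  streaming?: boolean;   // defaults to false
  maxTokens?: number;
  openAIApiKey?: string; // falls back to the OPENAI_API_KEY env var
  concurrency?: number;
}

const fields: ChatOpenAIFields = {
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 256,
};
```

Options left unset fall back to the defaults listed in the Properties section.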
Properties
callbackManager
• callbackManager: CallbackManager
Inherited from
BaseLanguageModel.callbackManager
Defined in
langchain/src/base_language/index.ts:34
caller
• Protected caller: AsyncCaller
The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.
Inherited from
BaseLanguageModel.caller
Defined in
langchain/src/base_language/index.ts:40
frequencyPenalty
• frequencyPenalty: number = 0
Implementation of
OpenAIInput.frequencyPenalty
Defined in
langchain/src/chat_models/openai.ts:141
logitBias
• Optional logitBias: Record<string, number>
Implementation of
OpenAIInput.logitBias
Defined in
langchain/src/chat_models/openai.ts:147
maxTokens
• Optional maxTokens: number
Implementation of
OpenAIInput.maxTokens
Defined in
langchain/src/chat_models/openai.ts:159
modelKwargs
• Optional modelKwargs: Kwargs
Implementation of
OpenAIInput.modelKwargs
Defined in
langchain/src/chat_models/openai.ts:151
modelName
• modelName: string = "gpt-3.5-turbo"
Implementation of
OpenAIInput.modelName
Defined in
langchain/src/chat_models/openai.ts:149
n
• n: number = 1
Implementation of
OpenAIInput.n
Defined in
langchain/src/chat_models/openai.ts:145
presencePenalty
• presencePenalty: number = 0
Implementation of
OpenAIInput.presencePenalty
Defined in
langchain/src/chat_models/openai.ts:143
stop
• Optional stop: string[]
Implementation of
OpenAIInput.stop
Defined in
langchain/src/chat_models/openai.ts:153
streaming
• streaming: boolean = false
Implementation of
OpenAIInput.streaming
Defined in
langchain/src/chat_models/openai.ts:157
temperature
• temperature: number = 1
Implementation of
OpenAIInput.temperature
Defined in
langchain/src/chat_models/openai.ts:137
timeout
• Optional timeout: number
Implementation of
OpenAIInput.timeout
Defined in
langchain/src/chat_models/openai.ts:155
topP
• topP: number = 1
Implementation of
OpenAIInput.topP
Defined in
langchain/src/chat_models/openai.ts:139
verbose
• verbose: boolean
Whether to print out response text.
Inherited from
BaseLanguageModel.verbose
Defined in
langchain/src/base_language/index.ts:32
Methods
_combineLLMOutput
▸ _combineLLMOutput(...llmOutputs): OpenAILLMOutput
Parameters
| Name | Type | 
|---|---|
| ...llmOutputs | OpenAILLMOutput[] |
Returns
OpenAILLMOutput
Overrides
BaseChatModel._combineLLMOutput
Defined in
langchain/src/chat_models/openai.ts:454
_generate
▸ _generate(messages, stop?): Promise<ChatResult>
Call out to OpenAI's chat completions endpoint with the provided messages.
Example
```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI();
const response = await chat.call([new HumanChatMessage("Tell me a joke.")]);
```
Parameters
| Name | Type | Description | 
|---|---|---|
| messages | BaseChatMessage[] | The messages to pass into the model. |
| stop? | string[] | Optional list of stop words to use when generating. |
Returns
Promise<ChatResult>
The full LLM output.
Overrides
BaseChatModel._generate
Defined in
langchain/src/chat_models/openai.ts:258
_identifyingParams
▸ _identifyingParams(): Object
Get the identifying parameters of the LLM.
Returns
Object
| Name | Type | 
|---|---|
| model_name | string |
Overrides
BaseChatModel._identifyingParams
Defined in
langchain/src/chat_models/openai.ts:228
_llmType
▸ _llmType(): string
Returns
string
Overrides
BaseChatModel._llmType
Defined in
langchain/src/chat_models/openai.ts:450
_modelType
▸ _modelType(): string
Returns
string
Inherited from
BaseChatModel._modelType
Defined in
langchain/src/chat_models/base.ts:76
call
▸ call(messages, stop?): Promise<BaseChatMessage>
Parameters
| Name | Type | 
|---|---|
| messages | BaseChatMessage[] |
| stop? | string[] |
Returns
Promise<BaseChatMessage>
Inherited from
BaseChatModel.call
Defined in
langchain/src/chat_models/base.ts:97
callPrompt
▸ callPrompt(promptValue, stop?): Promise<BaseChatMessage>
Parameters
| Name | Type | 
|---|---|
| promptValue | BasePromptValue |
| stop? | string[] |
Returns
Promise<BaseChatMessage>
Inherited from
BaseChatModel.callPrompt
Defined in
langchain/src/chat_models/base.ts:106
generate
▸ generate(messages, stop?): Promise<LLMResult>
Parameters
| Name | Type | 
|---|---|
| messages | BaseChatMessage[][] |
| stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
BaseChatModel.generate
Defined in
langchain/src/chat_models/base.ts:39
generatePrompt
▸ generatePrompt(promptValues, stop?): Promise<LLMResult>
Parameters
| Name | Type | 
|---|---|
| promptValues | BasePromptValue[] |
| stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
BaseChatModel.generatePrompt
Defined in
langchain/src/chat_models/base.ts:82
getNumTokens
▸ getNumTokens(text): Promise<number>
Parameters
| Name | Type | 
|---|---|
| text | string |
Returns
Promise<number>
Inherited from
BaseLanguageModel.getNumTokens
Defined in
langchain/src/base_language/index.ts:62
getNumTokensFromMessages
▸ getNumTokensFromMessages(messages): Promise<{ countPerMessage: number[] ; totalCount: number  }>
Parameters
| Name | Type | 
|---|---|
| messages | BaseChatMessage[] |
Returns
Promise<{ countPerMessage: number[] ; totalCount: number  }>
Defined in
langchain/src/chat_models/openai.ts:394
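The shape of the returned object can be sketched as follows. The fixed reply-priming overhead here is an assumption about typical chat token accounting, not a value taken from the implementation:

```typescript
// Illustrative sketch: the total token count is the sum of the per-message
// counts plus a small fixed overhead for priming the assistant's reply.
function combineTokenCounts(
  countPerMessage: number[],
  primingTokens = 3 // assumed overhead; the real value varies by model
): { countPerMessage: number[]; totalCount: number } {
  const totalCount =
    countPerMessage.reduce((sum, c) => sum + c, 0) + primingTokens;
  return { countPerMessage, totalCount };
}

const counts = combineTokenCounts([12, 7, 25]);
```

The per-message counts themselves would come from a tokenizer such as tiktoken, which this sketch does not include.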
identifyingParams
▸ identifyingParams(): Object
Get the identifying parameters for the model
Returns
Object
| Name | Type | 
|---|---|
| model_name | string |
Defined in
langchain/src/chat_models/openai.ts:239
invocationParams
▸ invocationParams(): Omit<CreateChatCompletionRequest, "messages"> & Kwargs
Get the parameters used to invoke the model
Returns
Omit<CreateChatCompletionRequest, "messages"> & Kwargs
Defined in
langchain/src/chat_models/openai.ts:212
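A minimal sketch of the returned shape, assuming the request fields mirror the properties documented above. `CreateChatCompletionRequestLike` and `invocationParamsSketch` are illustrative stand-ins, not the openai package's real types:

```typescript
// Illustrative stand-in for openai's CreateChatCompletionRequest.
interface CreateChatCompletionRequestLike {
  model: string;
  messages: unknown[];
  temperature: number;
  top_p: number;
  n: number;
  stream: boolean;
}

// Sketch: every request field except `messages`, merged with modelKwargs.
function invocationParamsSketch(self: {
  modelName: string;
  temperature: number;
  topP: number;
  n: number;
  streaming: boolean;
  modelKwargs?: Record<string, unknown>;
}): Omit<CreateChatCompletionRequestLike, "messages"> & Record<string, unknown> {
  return {
    model: self.modelName,
    temperature: self.temperature,
    top_p: self.topP,
    n: self.n,
    stream: self.streaming,
    ...self.modelKwargs,
  };
}

const params = invocationParamsSketch({
  modelName: "gpt-3.5-turbo",
  temperature: 1,
  topP: 1,
  n: 1,
  streaming: false,
});
```

The `Omit<..., "messages">` in the signature reflects that the messages are supplied per call, not stored on the model.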
serialize
▸ serialize(): SerializedLLM
Return a JSON-like object representing this LLM.
Returns
SerializedLLM
Inherited from
BaseLanguageModel.serialize
Defined in
langchain/src/base_language/index.ts:108
deserialize
▸ Static deserialize(data): Promise<BaseLanguageModel>
Load an LLM from a JSON-like object describing it.
Parameters
| Name | Type | 
|---|---|
| data | SerializedLLM |
Returns
Promise<BaseLanguageModel>
Inherited from
BaseLanguageModel.deserialize
Defined in
langchain/src/base_language/index.ts:119
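A sketch of the serialize/deserialize round trip. The `_type` key and the exact payload shape are assumptions based on the SerializedLLM name; `serializeSketch` and `deserializeSketch` are illustrative helpers, not library functions:

```typescript
// Illustrative JSON-like payload: the model's type tag plus its
// identifying parameters.
interface SerializedLLMLike {
  _type: string;
  [key: string]: unknown;
}

function serializeSketch(
  llmType: string,
  params: Record<string, unknown>
): SerializedLLMLike {
  return { _type: llmType, ...params };
}

function deserializeSketch(
  data: SerializedLLMLike
): { llmType: string; params: Record<string, unknown> } {
  // Split the type tag back out from the identifying parameters.
  const { _type, ...params } = data;
  return { llmType: _type, params };
}

const data = serializeSketch("openai", { model_name: "gpt-3.5-turbo" });
const restored = deserializeSketch(data);
```

Because the payload is plain JSON-like data, it can be persisted and later passed to deserialize to reconstruct an equivalent model.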