Class: OpenAIChat
llms/openai.OpenAIChat
Wrapper around OpenAI large language models that use the Chat endpoint.
To use, you should have the openai package installed and the
OPENAI_API_KEY environment variable set.
Remarks
Any parameters that are valid to pass to openai.createCompletion can be passed through modelKwargs, even
if they are not explicitly available on this class.
Hierarchy
LLM

↳ OpenAIChat
Implements
OpenAIInput
Constructors
constructor
• new OpenAIChat(fields?, configuration?)
Parameters
| Name | Type | 
|---|---|
| fields? | Partial<OpenAIInput> & BaseLLMParams & { openAIApiKey?: string } |
| configuration? | ConfigurationParameters |
Overrides
Defined in
langchain/src/llms/openai-chat.ts:118
Properties
cache
• Optional cache: BaseCache<Generation[]>
Inherited from
Defined in
langchain/src/llms/base.ts:31
callbackManager
• callbackManager: CallbackManager
Inherited from
Defined in
langchain/src/base_language/index.ts:34
caller
• Protected caller: AsyncCaller
The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.
Inherited from
Defined in
langchain/src/base_language/index.ts:40
frequencyPenalty
• frequencyPenalty: number = 0
Implementation of
OpenAIInput.frequencyPenalty
Defined in
langchain/src/llms/openai-chat.ts:92
logitBias
• Optional logitBias: Record<string, number>
Implementation of
OpenAIInput.logitBias
Defined in
langchain/src/llms/openai-chat.ts:98
maxTokens
• Optional maxTokens: number
Implementation of
OpenAIInput.maxTokens
Defined in
langchain/src/llms/openai-chat.ts:100
modelKwargs
• Optional modelKwargs: Kwargs
Implementation of
OpenAIInput.modelKwargs
Defined in
langchain/src/llms/openai-chat.ts:106
modelName
• modelName: string = "gpt-3.5-turbo"
Implementation of
OpenAIInput.modelName
Defined in
langchain/src/llms/openai-chat.ts:102
n
• n: number = 1
Implementation of
OpenAIInput.n
Defined in
langchain/src/llms/openai-chat.ts:96
name
• name: string
The name of the LLM class
Inherited from
Defined in
langchain/src/llms/base.ts:29
prefixMessages
• Optional prefixMessages: ChatCompletionRequestMessage[]
Implementation of
OpenAIInput.prefixMessages
Defined in
langchain/src/llms/openai-chat.ts:104
presencePenalty
• presencePenalty: number = 0
Implementation of
OpenAIInput.presencePenalty
Defined in
langchain/src/llms/openai-chat.ts:94
stop
• Optional stop: string[]
Implementation of
OpenAIInput.stop
Defined in
langchain/src/llms/openai-chat.ts:110
streaming
• streaming: boolean = false
Implementation of
OpenAIInput.streaming
Defined in
langchain/src/llms/openai-chat.ts:112
temperature
• temperature: number = 1
Implementation of
OpenAIInput.temperature
Defined in
langchain/src/llms/openai-chat.ts:88
timeout
• Optional timeout: number
Implementation of
OpenAIInput.timeout
Defined in
langchain/src/llms/openai-chat.ts:108
topP
• topP: number = 1
Implementation of
OpenAIInput.topP
Defined in
langchain/src/llms/openai-chat.ts:90
verbose
• verbose: boolean
Whether to print out response text.
Inherited from
Defined in
langchain/src/base_language/index.ts:32
Methods
_call
▸ _call(prompt, stop?): Promise<string>
Call out to OpenAI's chat endpoint with the given prompt.
Example
```typescript
import { OpenAI } from "langchain/llms/openai";

const openai = new OpenAI();
const response = await openai.generate(["Tell me a joke."]);
```
Parameters
| Name | Type | Description | 
|---|---|---|
| prompt | string | The prompt to pass into the model. |
| stop? | string[] | Optional list of stop words to use when generating. |
Returns
Promise<string>
The full LLM output.
Overrides
Defined in
langchain/src/llms/openai-chat.ts:222
_generate
▸ _generate(prompts, stop?): Promise<LLMResult>
Run the LLM on the given prompts and input.
Parameters
| Name | Type | 
|---|---|
| prompts | string[] |
| stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
Defined in
langchain/src/llms/base.ts:197
_identifyingParams
▸ _identifyingParams(): Object
Get the identifying parameters of the LLM.
Returns
Object
| Name | Type | 
|---|---|
| model_name | string |
Overrides
Defined in
langchain/src/llms/openai-chat.ts:180
_llmType
▸ _llmType(): string
Return the string type key uniquely identifying this class of LLM.
Returns
string
Overrides
Defined in
langchain/src/llms/openai-chat.ts:341
_modelType
▸ _modelType(): string
Returns
string
Inherited from
Defined in
langchain/src/llms/base.ts:160
call
▸ call(prompt, stop?): Promise<string>
Convenience wrapper for generate that takes in a single string prompt and returns a single string output.
Parameters
| Name | Type | 
|---|---|
| prompt | string |
| stop? | string[] |
Returns
Promise<string>
Inherited from
Defined in
langchain/src/llms/base.ts:131
generate
▸ generate(prompts, stop?): Promise<LLMResult>
Run the LLM on the given prompts and input, handling caching.
Parameters
| Name | Type | 
|---|---|
| prompts | string[] |
| stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
Defined in
langchain/src/llms/base.ts:84
generatePrompt
▸ generatePrompt(promptValues, stop?): Promise<LLMResult>
Parameters
| Name | Type | 
|---|---|
| promptValues | BasePromptValue[] |
| stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
Defined in
langchain/src/llms/base.ts:44
getNumTokens
▸ getNumTokens(text): Promise<number>
Parameters
| Name | Type | 
|---|---|
| text | string |
Returns
Promise<number>
Inherited from
Defined in
langchain/src/base_language/index.ts:62
identifyingParams
▸ identifyingParams(): Object
Get the identifying parameters for the model
Returns
Object
| Name | Type | 
|---|---|
| model_name | string |
Defined in
langchain/src/llms/openai-chat.ts:191
invocationParams
▸ invocationParams(): Omit<CreateChatCompletionRequest, "messages"> & Kwargs
Get the parameters used to invoke the model
Returns
Omit<CreateChatCompletionRequest, "messages"> & Kwargs
Defined in
langchain/src/llms/openai-chat.ts:164
serialize
▸ serialize(): SerializedLLM
Return a JSON-like object representing this LLM.
Returns
SerializedLLM
Inherited from
Defined in
langchain/src/llms/base.ts:152
deserialize
▸ Static deserialize(data): Promise<BaseLLM>
Load an LLM from a JSON-like object describing it.
Parameters
| Name | Type | 
|---|---|
| data | SerializedLLM |
Returns
Promise<BaseLLM>
Inherited from
Defined in
langchain/src/llms/base.ts:167