Class: OpenAI

llms/openai.OpenAI

Wrapper around OpenAI large language models.

To use, you should have the openai package installed and the OPENAI_API_KEY environment variable set.

Remarks

Any parameters that are valid to be passed to openai.createCompletion can be passed through modelKwargs, even if not explicitly available on this class.
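The pass-through behavior can be pictured as a simple object spread: named fields are set first, then anything in modelKwargs is merged into the request body. The sketch below is illustrative only; the type names and the `buildRequest` helper are hypothetical, not part of the library.

```typescript
// Hypothetical sketch of modelKwargs pass-through: extra keys are spread
// into the request payload alongside the explicitly supported fields.
type Kwargs = Record<string, unknown>;

interface CompletionRequestSketch {
  model: string;
  temperature: number;
  max_tokens: number;
  [key: string]: unknown;
}

function buildRequest(
  base: { model: string; temperature: number; max_tokens: number },
  modelKwargs: Kwargs = {}
): CompletionRequestSketch {
  // Named fields first, then any pass-through kwargs (e.g. `user`).
  return { ...base, ...modelKwargs };
}

const req = buildRequest(
  { model: "text-davinci-003", temperature: 0.7, max_tokens: 256 },
  { user: "example-user" }
);
```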

Hierarchy

  • BaseLLM
    ↳ OpenAI

Implements

  • OpenAIInput

Constructors

constructor

new OpenAI(fields?, configuration?)

Parameters

Name            Type
fields?         Partial<OpenAIInput> & BaseLLMParams & { openAIApiKey?: string }
configuration?  ConfigurationParameters

Overrides

BaseLLM.constructor

Defined in

langchain/src/llms/openai.ts:133

Properties

batchSize

batchSize: number = 20

Implementation of

OpenAIInput.batchSize

Defined in

langchain/src/llms/openai.ts:121
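A batchSize of 20 means a list of prompts is split into sub-batches of at most 20, one completion request per chunk. A minimal sketch of that chunking, assuming this is the splitting strategy (the `chunkPrompts` helper is hypothetical, not a library export):

```typescript
// Hypothetical sketch: split a prompt list into sub-batches of
// `batchSize` (default 20), one request per chunk.
function chunkPrompts(prompts: string[], batchSize = 20): string[][] {
  const chunks: string[][] = [];
  for (let i = 0; i < prompts.length; i += batchSize) {
    chunks.push(prompts.slice(i, i + batchSize));
  }
  return chunks;
}

const prompts = Array.from({ length: 45 }, (_, i) => `prompt ${i}`);
const chunks = chunkPrompts(prompts);
// 45 prompts split into chunks of 20, 20, and 5
```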


bestOf

bestOf: number = 1

Implementation of

OpenAIInput.bestOf

Defined in

langchain/src/llms/openai.ts:113


cache

Optional cache: BaseCache<Generation[]>

Inherited from

BaseLLM.cache

Defined in

langchain/src/llms/base.ts:31


callbackManager

callbackManager: CallbackManager

Inherited from

BaseLLM.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

BaseLLM.caller

Defined in

langchain/src/base_language/index.ts:40


frequencyPenalty

frequencyPenalty: number = 0

Implementation of

OpenAIInput.frequencyPenalty

Defined in

langchain/src/llms/openai.ts:107


logitBias

Optional logitBias: Record<string, number>

Implementation of

OpenAIInput.logitBias

Defined in

langchain/src/llms/openai.ts:115


maxTokens

maxTokens: number = 256

Implementation of

OpenAIInput.maxTokens

Defined in

langchain/src/llms/openai.ts:103


modelKwargs

Optional modelKwargs: Kwargs

Implementation of

OpenAIInput.modelKwargs

Defined in

langchain/src/llms/openai.ts:119


modelName

modelName: string = "text-davinci-003"

Implementation of

OpenAIInput.modelName

Defined in

langchain/src/llms/openai.ts:117


n

n: number = 1

Implementation of

OpenAIInput.n

Defined in

langchain/src/llms/openai.ts:111


name

name: string

The name of the LLM class

Inherited from

BaseLLM.name

Defined in

langchain/src/llms/base.ts:29


presencePenalty

presencePenalty: number = 0

Implementation of

OpenAIInput.presencePenalty

Defined in

langchain/src/llms/openai.ts:109


stop

Optional stop: string[]

Implementation of

OpenAIInput.stop

Defined in

langchain/src/llms/openai.ts:125


streaming

streaming: boolean = false

Implementation of

OpenAIInput.streaming

Defined in

langchain/src/llms/openai.ts:127


temperature

temperature: number = 0.7

Implementation of

OpenAIInput.temperature

Defined in

langchain/src/llms/openai.ts:101


timeout

Optional timeout: number

Implementation of

OpenAIInput.timeout

Defined in

langchain/src/llms/openai.ts:123


topP

topP: number = 1

Implementation of

OpenAIInput.topP

Defined in

langchain/src/llms/openai.ts:105


verbose

verbose: boolean

Whether to print out response text.

Inherited from

BaseLLM.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_generate

_generate(prompts, stop?): Promise<LLMResult>

Call out to OpenAI's endpoint with k unique prompts

Example

import { OpenAI } from "langchain/llms/openai";
const openai = new OpenAI();
const response = await openai.generate(["Tell me a joke."]);

Parameters

Name     Type      Description
prompts  string[]  The prompts to pass into the model.
stop?    string[]  Optional list of stop words to use when generating.

Returns

Promise<LLMResult>

The full LLM output.

Overrides

BaseLLM._generate

Defined in

langchain/src/llms/openai.ts:238


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

Name        Type
model_name  string

Overrides

BaseLLM._identifyingParams

Defined in

langchain/src/llms/openai.ts:208


_llmType

_llmType(): string

Return the string type key uniquely identifying this class of LLM.

Returns

string

Overrides

BaseLLM._llmType

Defined in

langchain/src/llms/openai.ts:383


_modelType

_modelType(): string

Returns

string

Inherited from

BaseLLM._modelType

Defined in

langchain/src/llms/base.ts:160


call

call(prompt, stop?): Promise<string>

Convenience wrapper for generate that takes in a single string prompt and returns a single string output.

Parameters

Name    Type
prompt  string
stop?   string[]

Returns

Promise<string>

Inherited from

BaseLLM.call

Defined in

langchain/src/llms/base.ts:131
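The relationship between call and generate can be sketched as follows: call wraps the single prompt in an array, delegates to generate, and unwraps the first generation's text. The `generate` stand-in below is a local stub, not the real API-backed implementation; the `Generation`/`LLMResult` shapes mirror the types used in this document.

```typescript
// Sketch of the call/generate relationship, using a local stub for
// generate() (the real implementation calls OpenAI's endpoint).
interface Generation { text: string; }
interface LLMResult { generations: Generation[][]; }

async function generate(prompts: string[]): Promise<LLMResult> {
  // Stub: echo each prompt back as a single generation.
  return { generations: prompts.map((p) => [{ text: `echo: ${p}` }]) };
}

async function call(prompt: string): Promise<string> {
  const { generations } = await generate([prompt]);
  // call() returns only the first generation's text.
  return generations[0][0].text;
}
```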


generate

generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts, handling caching.

Parameters

Name     Type
prompts  string[]
stop?    string[]

Returns

Promise<LLMResult>

Inherited from

BaseLLM.generate

Defined in

langchain/src/llms/base.ts:84


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

Name          Type
promptValues  BasePromptValue[]
stop?         string[]

Returns

Promise<LLMResult>

Inherited from

BaseLLM.generatePrompt

Defined in

langchain/src/llms/base.ts:44


getNumTokens

getNumTokens(text): Promise<number>

Parameters

Name  Type
text  string

Returns

Promise<number>

Inherited from

BaseLLM.getNumTokens

Defined in

langchain/src/base_language/index.ts:62


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model

Returns

Object

Name        Type
model_name  string

Defined in

langchain/src/llms/openai.ts:219


invocationParams

invocationParams(): CreateCompletionRequest & Kwargs

Get the parameters used to invoke the model

Returns

CreateCompletionRequest & Kwargs

Defined in

langchain/src/llms/openai.ts:191


serialize

serialize(): SerializedLLM

Return a json-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseLLM.serialize

Defined in

langchain/src/llms/base.ts:152


deserialize

Static deserialize(data): Promise<BaseLLM>

Load an LLM from a json-like object describing it.

Parameters

Name  Type
data  SerializedLLM

Returns

Promise<BaseLLM>

Inherited from

BaseLLM.deserialize

Defined in

langchain/src/llms/base.ts:167