Class: PromptLayerOpenAI

llms/openai.PromptLayerOpenAI

PromptLayer wrapper for OpenAI

Hierarchy

- OpenAI

  ↳ **PromptLayerOpenAI**

Constructors

constructor

new PromptLayerOpenAI(fields?)

Parameters

| Name | Type |
| :--- | :--- |
| `fields?` | `Partial<OpenAIInput>` & `BaseLLMParams` & `{ openAIApiKey?: string }` & `{ plTags?: string[] ; promptLayerApiKey?: string }` |

Overrides

OpenAI.constructor

Defined in

langchain/src/llms/openai.ts:397

Properties

batchSize

batchSize: number = 20

Inherited from

OpenAI.batchSize

Defined in

langchain/src/llms/openai.ts:121


bestOf

bestOf: number = 1

Inherited from

OpenAI.bestOf

Defined in

langchain/src/llms/openai.ts:113


cache

Optional cache: BaseCache<Generation[]>

Inherited from

OpenAI.cache

Defined in

langchain/src/llms/base.ts:31


callbackManager

callbackManager: CallbackManager

Inherited from

OpenAI.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

OpenAI.caller

Defined in

langchain/src/base_language/index.ts:40


frequencyPenalty

frequencyPenalty: number = 0

Inherited from

OpenAI.frequencyPenalty

Defined in

langchain/src/llms/openai.ts:107


logitBias

Optional logitBias: Record<string, number>

Inherited from

OpenAI.logitBias

Defined in

langchain/src/llms/openai.ts:115


maxTokens

maxTokens: number = 256

Inherited from

OpenAI.maxTokens

Defined in

langchain/src/llms/openai.ts:103


modelKwargs

Optional modelKwargs: Kwargs

Inherited from

OpenAI.modelKwargs

Defined in

langchain/src/llms/openai.ts:119


modelName

modelName: string = "text-davinci-003"

Inherited from

OpenAI.modelName

Defined in

langchain/src/llms/openai.ts:117


n

n: number = 1

Inherited from

OpenAI.n

Defined in

langchain/src/llms/openai.ts:111


name

name: string

The name of the LLM class

Inherited from

OpenAI.name

Defined in

langchain/src/llms/base.ts:29


plTags

Optional plTags: string[]

Defined in

langchain/src/llms/openai.ts:395


presencePenalty

presencePenalty: number = 0

Inherited from

OpenAI.presencePenalty

Defined in

langchain/src/llms/openai.ts:109


promptLayerApiKey

Optional promptLayerApiKey: string

Defined in

langchain/src/llms/openai.ts:393


stop

Optional stop: string[]

Inherited from

OpenAI.stop

Defined in

langchain/src/llms/openai.ts:125


streaming

streaming: boolean = false

Inherited from

OpenAI.streaming

Defined in

langchain/src/llms/openai.ts:127


temperature

temperature: number = 0.7

Inherited from

OpenAI.temperature

Defined in

langchain/src/llms/openai.ts:101


timeout

Optional timeout: number

Inherited from

OpenAI.timeout

Defined in

langchain/src/llms/openai.ts:123


topP

topP: number = 1

Inherited from

OpenAI.topP

Defined in

langchain/src/llms/openai.ts:105


verbose

verbose: boolean

Whether to print out response text.

Inherited from

OpenAI.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_generate

_generate(prompts, stop?): Promise<LLMResult>

Call out to OpenAI's endpoint with k unique prompts

Example

```typescript
import { OpenAI } from "langchain/llms/openai";

const openai = new OpenAI();
const response = await openai.generate(["Tell me a joke."]);
```

Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `prompts` | `string[]` | The prompts to pass into the model. |
| `stop?` | `string[]` | Optional list of stop words to use when generating. |

Returns

Promise<LLMResult>

The full LLM output.

Inherited from

OpenAI._generate

Defined in

langchain/src/llms/openai.ts:238


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

| Name | Type |
| :--- | :--- |
| `model_name` | `string` |

Inherited from

OpenAI._identifyingParams

Defined in

langchain/src/llms/openai.ts:208


_llmType

_llmType(): string

Return the string type key uniquely identifying this class of LLM.

Returns

string

Inherited from

OpenAI._llmType

Defined in

langchain/src/llms/openai.ts:383


_modelType

_modelType(): string

Returns

string

Inherited from

OpenAI._modelType

Defined in

langchain/src/llms/base.ts:160


call

call(prompt, stop?): Promise<string>

Convenience wrapper for generate that takes in a single string prompt and returns a single string output.

Parameters

| Name | Type |
| :--- | :--- |
| `prompt` | `string` |
| `stop?` | `string[]` |

Returns

Promise<string>

Inherited from

OpenAI.call

Defined in

langchain/src/llms/base.ts:131


completionWithRetry

completionWithRetry(request, options?): Promise<CreateCompletionResponse>

Parameters

| Name | Type |
| :--- | :--- |
| `request` | `CreateCompletionRequest` |
| `options?` | `StreamingAxiosConfiguration` |

Returns

Promise<CreateCompletionResponse>

Overrides

OpenAI.completionWithRetry

Defined in

langchain/src/llms/openai.ts:418


generate

generate(prompts, stop?): Promise<LLMResult>

Run the LLM on the given prompts, handling caching.

Parameters

| Name | Type |
| :--- | :--- |
| `prompts` | `string[]` |
| `stop?` | `string[]` |

Returns

Promise<LLMResult>

Inherited from

OpenAI.generate

Defined in

langchain/src/llms/base.ts:84


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

| Name | Type |
| :--- | :--- |
| `promptValues` | `BasePromptValue[]` |
| `stop?` | `string[]` |

Returns

Promise<LLMResult>

Inherited from

OpenAI.generatePrompt

Defined in

langchain/src/llms/base.ts:44


getNumTokens

getNumTokens(text): Promise<number>

Parameters

| Name | Type |
| :--- | :--- |
| `text` | `string` |

Returns

Promise<number>

Inherited from

OpenAI.getNumTokens

Defined in

langchain/src/base_language/index.ts:62


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model

Returns

Object

| Name | Type |
| :--- | :--- |
| `model_name` | `string` |

Inherited from

OpenAI.identifyingParams

Defined in

langchain/src/llms/openai.ts:219


invocationParams

invocationParams(): CreateCompletionRequest & Kwargs

Get the parameters used to invoke the model

Returns

CreateCompletionRequest & Kwargs

Inherited from

OpenAI.invocationParams

Defined in

langchain/src/llms/openai.ts:191


serialize

serialize(): SerializedLLM

Return a json-like object representing this LLM.

Returns

SerializedLLM

Inherited from

OpenAI.serialize

Defined in

langchain/src/llms/base.ts:152


deserialize

Static deserialize(data): Promise<BaseLLM>

Load an LLM from a json-like object describing it.

Parameters

| Name | Type |
| :--- | :--- |
| `data` | `SerializedLLM` |

Returns

Promise<BaseLLM>

Inherited from

OpenAI.deserialize

Defined in

langchain/src/llms/base.ts:167