Class: ConversationChain
chains.ConversationChain
Chain to run queries against LLMs.
Example
import { LLMChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
const prompt = PromptTemplate.fromTemplate("Tell me a {adjective} joke");
const chain = new LLMChain({ llm: new OpenAI(), prompt });
Hierarchy
LLMChain

↳ ConversationChain
Constructors
constructor
• new ConversationChain(fields)
Parameters
Name | Type |
---|---|
fields | Object |
fields.llm | BaseLanguageModel |
fields.memory? | BaseMemory |
fields.outputKey? | string |
fields.prompt? | BasePromptTemplate |
Overrides

LLMChain.constructor
Defined in
langchain/src/chains/llm_chain.ts:150
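The constructor fields above can be sketched in use. This is a minimal, non-authoritative example: it assumes an OpenAI API key is available in the environment, and that BufferMemory (one BaseMemory implementation) is exported from "langchain/memory" in this version of the library; only the llm field is required.

```typescript
import { ConversationChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";

// Only `llm` is required; `memory`, `prompt`, and `outputKey` are optional.
// BufferMemory here is an assumption about an available BaseMemory
// implementation, not part of this class's own API surface.
const chain = new ConversationChain({
  llm: new OpenAI({ temperature: 0 }),
  memory: new BufferMemory(),
});
```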
Properties
callbackManager
• callbackManager: CallbackManager
Inherited from

LLMChain.callbackManager
Defined in
langchain/src/chains/base.ts:25
llm
• llm: BaseLanguageModel
Inherited from

LLMChain.llm
Defined in
langchain/src/chains/llm_chain.ts:44
memory
• Optional memory: BaseMemory

Inherited from

LLMChain.memory
Defined in
langchain/src/chains/base.ts:21
outputKey
• outputKey: string = "text"

Inherited from

LLMChain.outputKey
Defined in
langchain/src/chains/llm_chain.ts:46
outputParser
• Optional outputParser: BaseOutputParser

Inherited from

LLMChain.outputParser
Defined in
langchain/src/chains/llm_chain.ts:48
prompt
• prompt: BasePromptTemplate
Inherited from

LLMChain.prompt
Defined in
langchain/src/chains/llm_chain.ts:42
verbose
• verbose: boolean
Inherited from

LLMChain.verbose
Defined in
langchain/src/chains/base.ts:23
Accessors
inputKeys
• get inputKeys(): string[]

Returns

string[]
Inherited from
LLMChain.inputKeys
Defined in
langchain/src/chains/llm_chain.ts:50
Methods
_call
▸ _call(values): Promise<ChainValues>
Run the core logic of this chain and return the output
Parameters
Name | Type |
---|---|
values | ChainValues |
Returns
Promise<ChainValues>

Inherited from

LLMChain._call
Defined in
langchain/src/chains/llm_chain.ts:85
_chainType
▸ _chainType(): "llm_chain"
Return the string type key uniquely identifying this class of chain.
Returns
"llm_chain"
Inherited from

LLMChain._chainType
Defined in
langchain/src/chains/llm_chain.ts:113
_getFinalOutput
▸ _getFinalOutput(generations, promptValue): Promise<unknown>
Parameters
Name | Type |
---|---|
generations | Generation [] |
promptValue | BasePromptValue |
Returns
Promise<unknown>

Inherited from

LLMChain._getFinalOutput
Defined in
langchain/src/chains/llm_chain.ts:68
apply
▸ apply(inputs): Promise<ChainValues>
Call the chain on all inputs in the list
Parameters
Name | Type |
---|---|
inputs | ChainValues [] |
Returns
Promise<ChainValues>

Inherited from

LLMChain.apply
Defined in
langchain/src/chains/base.ts:109
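"Call the chain on all inputs in the list" can be sketched as follows. The "input" key is an assumption about the chain's prompt variables (a ConversationChain built with the default prompt), and `chain` is assumed to be a constructed ConversationChain as above.

```typescript
// apply runs the chain once per input object in the list and
// resolves when all of them have completed.
const results = await chain.apply([
  { input: "Hi, my name is Alice." },
  { input: "Suggest a good name for a cat." },
]);
```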
call
▸ call(values): Promise<ChainValues>
Run the core logic of this chain, adding to the output if desired.
Wraps _call and handles memory.
Parameters
Name | Type |
---|---|
values | ChainValues |
Returns
Promise<ChainValues>

Inherited from

LLMChain.call
Defined in
langchain/src/chains/base.ts:79
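A sketch of call, assuming a constructed chain as above with the default "input" prompt variable. The "text" key is grounded in the documented default of outputKey; reading it by name is the assumption-free part, since call resolves to a ChainValues object rather than a bare string.

```typescript
// call wraps _call: it loads memory variables before the LLM call
// and saves the new turn to memory afterwards.
const res = await chain.call({ input: "Hello, how are you?" });
console.log(res.text); // outputKey defaults to "text"
```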
predict
▸ predict(values): Promise<string>
Format prompt with values and pass to LLM
Example
chain.predict({ adjective: "funny" })
Parameters
Name | Type | Description |
---|---|---|
values | ChainValues | keys to pass to prompt template |
Returns
Promise<string>

Completion from the LLM.

Inherited from

LLMChain.predict
Defined in
langchain/src/chains/llm_chain.ts:108
run
▸ run(input): Promise<string>
Parameters
Name | Type |
---|---|
input | any |
Returns
Promise<string>

Inherited from

LLMChain.run
Defined in
langchain/src/chains/base.ts:55
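run is the single-value convenience over call: when the chain has exactly one input key, it accepts that value directly and resolves to the string output instead of a ChainValues object. A hedged sketch, assuming the same chain as above:

```typescript
// run("...") is shorthand for call({ input: "..." }) followed by
// reading the single output key.
const answer = await chain.run("What did I just ask you?");
console.log(answer);
```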
serialize
▸ serialize(): SerializedLLMChain
Return a JSON-like object representing this chain.
Returns

SerializedLLMChain

Inherited from

LLMChain.serialize
Defined in
langchain/src/chains/llm_chain.ts:132
deserialize
▸ Static deserialize(data): Promise<LLMChain>
Load a chain from a JSON-like object describing it.
Parameters
Name | Type |
---|---|
data | SerializedLLMChain |
Returns
Promise<LLMChain>

Inherited from

LLMChain.deserialize
Defined in
langchain/src/chains/llm_chain.ts:117
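serialize and deserialize can be combined into a round trip, as a sketch. Note that deserialize is documented here as resolving to an LLMChain, so the restored chain is typed as the parent class; memory is not part of the serialized form.

```typescript
import { LLMChain } from "langchain/chains";

// Assumes `chain` is a constructed ConversationChain as in the
// constructor example above.
const serialized = chain.serialize(); // JSON-like SerializedLLMChain
const restored = await LLMChain.deserialize(serialized);
```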