Integrations: LLMs
LangChain offers a number of LLM implementations that integrate with various model providers. These are:
OpenAI
import { OpenAI } from "langchain/llms/openai";
const model = new OpenAI({
temperature: 0.9,
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});
const res = await model.call(
"What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
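Beyond call, LLM instances also expose a generate method that batches several prompts in a single request. A minimal sketch, assuming the same OpenAI integration and a valid API key in the environment:

```typescript
import { OpenAI } from "langchain/llms/openai";

// Assumes OPENAI_API_KEY is set in the environment.
const model = new OpenAI({ temperature: 0.9 });

// generate() takes an array of prompts and returns an LLMResult whose
// `generations` array lines up index-by-index with the input prompts.
const result = await model.generate([
  "What would be a good company name for a company that makes colorful socks?",
  "What would be a good company name for a company that makes artisanal soap?",
]);

for (const generation of result.generations) {
  // Each entry is itself an array of candidate generations; print the first.
  console.log(generation[0].text);
}
```

This is useful when you have many independent prompts, since the underlying provider calls can be issued together rather than one at a time.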
HuggingFaceInference
Install the package with your preferred package manager:
- npm: npm install @huggingface/inference
- Yarn: yarn add @huggingface/inference
- pnpm: pnpm add @huggingface/inference
import { HuggingFaceInference } from "langchain/llms/hf";
const model = new HuggingFaceInference({
model: "gpt2",
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.HUGGINGFACEHUB_API_KEY
});
const res = await model.call("1 + 1 =");
console.log({ res });
Cohere
Install the package with your preferred package manager:
- npm: npm install cohere-ai
- Yarn: yarn add cohere-ai
- pnpm: pnpm add cohere-ai
import { Cohere } from "langchain/llms/cohere";
const model = new Cohere({
maxTokens: 20,
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.COHERE_API_KEY
});
const res = await model.call(
"What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
Replicate
Install the package with your preferred package manager:
- npm: npm install replicate
- Yarn: yarn add replicate
- pnpm: pnpm add replicate
import { Replicate } from "langchain/llms/replicate";
const model = new Replicate({
model:
"daanelson/flan-t5:04e422a9b85baed86a4f24981d7f9953e20c5fd82f6103b74ebc431588e1cec8",
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.REPLICATE_API_TOKEN
});
const res = await model.call(
"What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
Additional LLM Implementations
PromptLayerOpenAI
LangChain integrates with PromptLayer for logging and debugging prompts and responses. To add support for PromptLayer:
- Create a PromptLayer account here: https://promptlayer.com.
- Create an API token and pass it either as the promptLayerApiKey argument to the PromptLayerOpenAI constructor or via the PROMPTLAYER_API_KEY environment variable.
import { PromptLayerOpenAI } from "langchain/llms/openai";
const model = new PromptLayerOpenAI({
temperature: 0.9,
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});
const res = await model.call(
"What would be a good company name a company that makes colorful socks?"
);
The request and the response will be logged in the PromptLayer dashboard.
Note: In streaming mode PromptLayer will not log the response.
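For reference, streaming is typically enabled via the streaming flag together with a token callback. A hedged sketch, assuming a LangChain version that supports the callbacks array and valid API keys in the environment:

```typescript
import { PromptLayerOpenAI } from "langchain/llms/openai";

// Assumes OPENAI_API_KEY and PROMPTLAYER_API_KEY are set in the environment.
const model = new PromptLayerOpenAI({
  temperature: 0.9,
  streaming: true,
  callbacks: [
    {
      // Invoked once per token as it arrives from the API.
      handleLLMNewToken(token: string) {
        process.stdout.write(token);
      },
    },
  ],
});

// Tokens stream to stdout as they arrive; per the note above,
// PromptLayer will not log the response in streaming mode.
await model.call(
  "What would be a good company name for a company that makes colorful socks?"
);
```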