-
I am using Next.js. I was thinking OpenAI would automatically process the .env? This is the code I have in pages/api/ai.js in Next.js.
Obviously this code isn't working, maybe because I am not processing the .env? How could I include the processing of the .env in the code? Also, the same code works in plain Node.js with the addition of:
But Next.js, as I understand it, doesn't need dotenv?
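For reference, here is a minimal sketch of the kind of pages/api/ai.js handler being described. The actual code from the post isn't shown, so the model settings and prompt below are illustrative, not the author's:

```javascript
// Hypothetical pages/api/ai.js sketch -- not the original poster's code.
// Assumes OPENAI_API_KEY is defined in .env.local at the project root.
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

export default async function handler(req, res) {
  // API routes run on the server, so process.env is readable here without dotenv.
  const chat = new ChatOpenAI({
    modelName: "gpt-4",
    temperature: 0,
    openAIApiKey: process.env.OPENAI_API_KEY,
  });

  const reply = await chat.predictMessages([
    new HumanMessage(req.body?.prompt ?? "What is the weather in New York?"),
  ]);
  res.status(200).json({ result: reply.content });
}
```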
Replies: 5 comments 3 replies
-
🤖 Hello,

The error message "OpenAI or Azure OpenAI API key not found" suggests that the API key for OpenAI is not being found in your Next.js application. This could be due to the way environment variables are handled in Next.js, which is different from Node.js.

In Next.js, environment variables are loaded from the .env.local file in your project root at build time. However, these variables are only available on the server side by default. If you want to expose some of these variables to the browser, you need to prefix them with NEXT_PUBLIC_. Assuming your OpenAI API key is stored in an environment variable named OPENAI_API_KEY, you should rename it to NEXT_PUBLIC_OPENAI_API_KEY in your .env.local file. Then, you can access it in your code like this:

```javascript
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";

const tools = [new Calculator(), new SerpAPI()];
const chat = new ChatOpenAI({ modelName: "gpt-4", temperature: 0, openAIApiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY });

const executor = await initializeAgentExecutorWithOptions(tools, chat, {
  agentType: "openai-functions",
  verbose: true,
});

const result = await executor.run("What is the weather in New York?");
console.log(result);
```

Remember to restart your Next.js server after making changes to your .env.local file.

As for the correct way to initialize and use the OpenAI model in the langchainjs framework, you first need to import the ChatOpenAI model from the langchain/chat_models/openai module. Then, you can create a new instance of the ChatOpenAI model by calling its constructor. After initializing the model, you can use its methods, such as predictMessages, to generate responses. Here is the code snippet from the context shared:

```javascript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";
import { SerpAPI } from "langchain/tools";
import { createOpenAPIChain } from "../openapi";

const model = new ChatOpenAI({
  temperature: 0.9,
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});

const modelForFunctionCalling = new ChatOpenAI({
  modelName: "gpt-4",
  temperature: 0,
});

await modelForFunctionCalling.predictMessages(
  [new HumanMessage("What is the weather in New York?")],
  { tools: [new SerpAPI()] }
);

const chain = await createOpenAPIChain(
  "https://gist.githubusercontent.com/roaldnefs/053e505b2b7a807290908fe9aa3e1f00/raw/0a212622ebfef501163f91e23803552411ed00e4/openapi.yaml",
  {
    llm: new ChatOpenAI({ modelName: "gpt-4-0613", temperature: 0 }),
  }
);

const result = await chain.run(`What's today's comic?`);
console.log(result);
```

In this example, the model is asked for the weather in New York and for the comic of the day, and the results of these calls are logged to the console.

I hope this helps! Let me know if you have any other questions.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
How can I process the .env from this line of code?
I am using it inside the Next.js API route, pages/api/ai.js, and it still gives me the error.
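A quick sanity check (a sketch, assuming the key is meant to live in .env.local at the project root) is to log whether the variable is visible inside the API route before constructing the model:

```javascript
// Temporary check inside pages/api/ai.js: is the variable loaded at all?
export default async function handler(req, res) {
  const present = Boolean(process.env.OPENAI_API_KEY);
  console.log("OPENAI_API_KEY present on the server?", present);
  res.status(200).json({ keyPresent: present });
}
```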
-
@Elindo586 in Next.js there is a restriction on environment variables exposed to the client side, as per the official docs here. We need to add the NEXT_PUBLIC_ prefix.
But this poses a security risk, as it exposes the API key to the browser.
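Roughly, the visibility rules look like this (an illustrative sketch, assuming both variable names are defined in .env.local):

```javascript
// .env.local (illustrative):
//   OPENAI_API_KEY=sk-...              -> server-side only (API routes, getServerSideProps)
//   NEXT_PUBLIC_OPENAI_API_KEY=sk-...  -> inlined into the browser bundle at build time

// In server code such as pages/api/ai.js, both are readable:
console.log(Boolean(process.env.OPENAI_API_KEY));             // true
console.log(Boolean(process.env.NEXT_PUBLIC_OPENAI_API_KEY)); // true

// In client components only the NEXT_PUBLIC_ variable is defined, which is
// exactly why prefixing the OpenAI key exposes it to anyone opening dev tools.
```

Keeping the LangChain call inside an API route sidesteps the prefix entirely, since the key then never leaves the server.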
-
The langchain API will automatically look for OPENAI_API_KEY in your env, as far as I can tell. Just make it available through bash.
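In other words (a sketch, assuming OPENAI_API_KEY is exported in your shell or set in .env.local), the constructor can be left without an explicit key:

```javascript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

// No openAIApiKey option: the model falls back to process.env.OPENAI_API_KEY,
// so exporting the variable in bash (or defining it in .env.local) is enough.
const chat = new ChatOpenAI({ modelName: "gpt-4", temperature: 0 });

const reply = await chat.predictMessages([new HumanMessage("Say hello")]);
console.log(reply.content);
```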
-
Hey everyone, I thought I would chime in here. I was having this exact same problem in a Next 14.2 project, and then I realized that I had called my .env.local variable OPEN_AI_API_KEY instead of OPENAI_API_KEY. Please note the additional underscore between OPEN and AI. Apparently OPENAI is one word, and if it's not that exact variable name, the langchain library will not be able to find it.
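If you'd rather keep a differently named variable, here is a small sketch of passing it explicitly via the openAIApiKey option shown earlier in this thread (OPEN_AI_API_KEY here is the non-standard name from this comment):

```javascript
import { ChatOpenAI } from "langchain/chat_models/openai";

// Automatic lookup only checks OPENAI_API_KEY; a custom name such as
// OPEN_AI_API_KEY has to be handed to the constructor explicitly.
const chat = new ChatOpenAI({
  modelName: "gpt-4",
  temperature: 0,
  openAIApiKey: process.env.OPEN_AI_API_KEY,
});
```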
My solution was not to use langchain.
I just used OpenAI directly. There is also a good community support forum for OpenAI.
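For anyone going the same route, here is a minimal sketch of calling OpenAI directly from a Next.js API route, assuming the official openai npm package (v4-style client) and an OPENAI_API_KEY entry in .env.local:

```javascript
// pages/api/ai.js -- using the OpenAI SDK directly instead of langchain.
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export default async function handler(req, res) {
  const completion = await client.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: req.body?.prompt ?? "Say hello" }],
  });
  res.status(200).json({ result: completion.choices[0].message.content });
}
```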