How to get response from Pinecone using LangChain? #5262

GoldenDragon0710 opened this issue Apr 30, 2024 · 1 comment


Checklist

  • I added a very descriptive title to this issue.
  • I included a link to the documentation page I am referring to (if applicable).

Issue with current documentation:

// v0.0.x-era LangChain entrypoints used by this snippet
import { Pinecone } from "@pinecone-database/pinecone";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { PineconeStore } from "langchain/vectorstores/pinecone";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationalRetrievalQAChain } from "langchain/chains";

const getReply = async (messages, namespaceId) => {
  const pinecone = new Pinecone({
    apiKey: process.env.PINECONE_API_KEY,
  });
  const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX);
  // Reconnect to the existing index, scoped to this caller's namespace.
  const vectorStore = await PineconeStore.fromExistingIndex(
    new OpenAIEmbeddings(),
    { pineconeIndex, namespace: namespaceId }
  );

  const CONDENSE_PROMPT = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
        Chat History: {chat_history}
        Follow Up Input: {question}
        Standalone question:`;

  const QA_PROMPT = `
        ${systemPrompt}
        =========
        {context}
        =========
        Question: {question}
        
        Answer:`;
  const chat = new ChatOpenAI({
    modelName: "gpt-4",
    temperature: 0.2,
    apiKey: process.env.OPENAI_API_KEY,
  });

  const chain = ConversationalRetrievalQAChain.fromLLM(
    chat,
    vectorStore.asRetriever(),
    {
      qaTemplate: QA_PROMPT,
      questionGeneratorTemplate: CONDENSE_PROMPT,
      returnSourceDocuments: true,
    }
  );

  const result = await chain.call({
    chat_history: messages,
  });
  return result.text;
};

This is my code for generating a response from Pinecone using LangChain v0.0.91, but it does not work with LangChain v0.1.36. How should I update it?

Idea or request for content:

No response


dosubot bot commented Apr 30, 2024

To update your code for LangChain v0.1.36, follow these key steps (a consolidated sketch follows the list):

  1. Update Import Statements to match the latest module names and paths, for example:

    import { Pinecone } from "@pinecone-database/pinecone";
    import { Document } from "@langchain/core/documents";
    import { OpenAIEmbeddings } from "@langchain/openai";
    import { PineconeStore } from "@langchain/pinecone";
  2. Initialize Pinecone Client more simply with:

    const pinecone = new Pinecone();
    const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX!);
  3. Set Up Embeddings and PineconeStore with:

    const embeddings = new OpenAIEmbeddings();
    const pineconeStore = new PineconeStore(embeddings, { pineconeIndex });
  4. Create Documents using the Document class:

    const docs = [
      new Document({
        metadata: { foo: "bar" },
        pageContent: "pinecone is a vector db",
      }),
      // Additional documents...
    ];
  5. Add and Search Documents with updated API usage:

    const ids = await pineconeStore.addDocuments(docs);
    const results = await pineconeStore.similaritySearch("pinecone is a vector db", 2, { foo: "bar" });
  6. Delete Documents using:

    await pineconeStore.delete({ ids: [ids[0], ids[1]] });
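
Putting steps 1 through 6 together, a minimal end-to-end sketch (in TypeScript) could look like the following. The main wrapper, the query string, and the console logging are illustrative assumptions rather than part of the official example:

import { Pinecone } from "@pinecone-database/pinecone";
import { Document } from "@langchain/core/documents";
import { OpenAIEmbeddings } from "@langchain/openai";
import { PineconeStore } from "@langchain/pinecone";

const main = async () => {
  try {
    // The Pinecone client reads PINECONE_API_KEY from the environment.
    const pinecone = new Pinecone();
    const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX!);

    // Wrap the existing index in a LangChain vector store.
    const embeddings = new OpenAIEmbeddings();
    const pineconeStore = new PineconeStore(embeddings, { pineconeIndex });

    // Create and upsert a document.
    const docs = [
      new Document({
        metadata: { foo: "bar" },
        pageContent: "pinecone is a vector db",
      }),
    ];
    const ids = await pineconeStore.addDocuments(docs);

    // Query the store with a metadata filter.
    const results = await pineconeStore.similaritySearch(
      "pinecone is a vector db",
      2,
      { foo: "bar" }
    );
    console.log(results);

    // Remove the documents that were just added.
    await pineconeStore.delete({ ids });
  } catch (error) {
    console.error("Pinecone/LangChain operation failed:", error);
  }
};

main();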

Make sure asynchronous operations are wrapped in try/catch blocks for error handling, as the sketch above does. The ConversationalRetrievalQAChain class initialization has changed as well, so refer to the current ConversationalRetrievalQAChain documentation when configuring its parameters and options.
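
As a starting point, here is a sketch (in TypeScript) of how the original getReply might be adapted for v0.1.x. It assumes the legacy ConversationalRetrievalQAChain still exported from "langchain/chains", that the chat history arrives as a pre-serialized string, and that the question is passed separately; the parameter names and the error handling are illustrative, and the qaTemplate / questionGeneratorTemplate options can still be supplied as in the original code:

import { Pinecone } from "@pinecone-database/pinecone";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { PineconeStore } from "@langchain/pinecone";
import { ConversationalRetrievalQAChain } from "langchain/chains";

const getReply = async (question: string, chatHistory: string, namespaceId: string) => {
  const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
  const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX!);

  // Reconnect to the existing index, scoped to the caller's namespace.
  const vectorStore = await PineconeStore.fromExistingIndex(new OpenAIEmbeddings(), {
    pineconeIndex,
    namespace: namespaceId,
  });

  const chat = new ChatOpenAI({
    modelName: "gpt-4",
    temperature: 0.2,
    openAIApiKey: process.env.OPENAI_API_KEY,
  });

  // qaTemplate, questionGeneratorTemplate and returnSourceDocuments are still
  // accepted here, as in the original code.
  const chain = ConversationalRetrievalQAChain.fromLLM(chat, vectorStore.asRetriever(), {
    returnSourceDocuments: true,
  });

  try {
    // In v0.1.x, chains are runnables and are called with invoke(); the chain
    // expects both `question` and `chat_history` inputs.
    const result = await chain.invoke({ question, chat_history: chatHistory });
    return result.text;
  } catch (error) {
    console.error("Chain invocation failed:", error);
    throw error;
  }
};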

This should make your code compatible with LangChain v0.1.36. If you need further assistance, feel free to ask.


dosubot added the auto:bug label (related to a bug, vulnerability, or unexpected error with an existing feature) on Apr 30, 2024.