ConversationalRetrievalChain

Hello everyone, I tried chatting with my own data. When I pass the parameter return_source_documents=True I get ValueError: One output key expected, got dict_keys(['answer', 'source_documents']), but when I don't pass the parameter I get a response from the LLM. I don't know why that is. Can someone help me with a possible solution?
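For reference, here is a minimal sketch of the kind of setup that produces this error (llm and vectordb are set up as in the course notebook; the exact details may differ from my code):

from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory

# Memory that keeps the chat history between turns
memory = ConversationBufferMemory(
    memory_key="chat_history",
    return_messages=True
)
qa = ConversationalRetrievalChain.from_llm(
    llm,
    retriever=vectordb.as_retriever(),
    memory=memory,
    return_source_documents=True  # with this set, the ValueError above is raised
)
result = qa({"question": "Is probability a class topic?"})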


Also, I am a bit confused about the difference between using OpenAI and ChatOpenAI. Are they the same or different, and what makes them so? Thank you.
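For context, this is the kind of usage difference I mean. As far as I understand, OpenAI wraps the text-completion endpoint (string in, string out) while ChatOpenAI wraps the chat endpoint (messages in, message out), but please correct me if that is wrong:

from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = OpenAI()       # completion-style model: takes a plain string prompt
chat = ChatOpenAI()  # chat-style model, e.g. gpt-3.5-turbo

llm("Tell me a joke")                           # returns a string
chat([HumanMessage(content="Tell me a joke")])  # returns an AIMessage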

Hi @Nnaemeka_Nwankwo

I'm not sure if this is the case, but your code is returning a Python dictionary instead of a single result.

Looking at the original code, we have this:

# Run chain (llm, vectordb, and QA_CHAIN_PROMPT are defined in earlier cells)
from langchain.chains import RetrievalQA

qa_chain = RetrievalQA.from_chain_type(
    llm,
    retriever=vectordb.as_retriever(),
    return_source_documents=True,
    chain_type_kwargs={"prompt": QA_CHAIN_PROMPT}
)
question = "Is probability a class topic?"
result = qa_chain({"query": question})
result["result"]

I believe that the mistake is here:

result = qa_chain({"question": question}) 

You are using a key named "question", which does not exist for RetrievalQA. The correct key is "query".


I think I found the issue: the chain only allows a single output key. It can return either the answer, the source_documents, or the generated_question, but not all of them at once, because of how the output is passed along. This applies to the ConversationalRetrievalChain, not the RetrievalQA. Thank you. If that is not the case, please let me know.
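In case it helps anyone else, a workaround I came across (not tested on my exact setup) is to tell the memory explicitly which output key to save; then the chain can return several keys without the memory raising the ValueError:

from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(
    memory_key="chat_history",
    return_messages=True,
    output_key="answer"  # memory stores only "answer"; the other keys are still returned
)
qa = ConversationalRetrievalChain.from_llm(
    llm,
    retriever=vectordb.as_retriever(),
    memory=memory,
    return_source_documents=True
)
result = qa({"question": "Is probability a class topic?"})
result["answer"]            # the model's answer
result["source_documents"]  # the retrieved documents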
