Hello everyone. I tried chatting with my own data. When I set the parameter `return_source_documents=True` I get `ValueError: One output key expected, got dict_keys(['answer', 'source_documents'])`, but when I don't pass the parameter I get a response from the LLM just fine. I don't know why that is. Can someone help me with a possible solution?
Also, I am a bit confused about the difference between `OpenAI` and `ChatOpenAI`. Are they the same or different, and if so, what makes them so? Thank you.
I think I found the issue: the memory only allows one output key. A `ConversationalRetrievalChain` (unlike `RetrievalQA`) can return several keys at once, such as `answer`, `source_documents`, and `generated_question`, and because of how the output is passed to the memory, it cannot store all of them at a time. Thank you. If that is not the case, please let me know.
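For anyone hitting the same error, here is a minimal, illustrative pure-Python sketch (not the actual LangChain source) of why it happens: the memory has to pick exactly one output key to store, and with `return_source_documents=True` the chain returns more than one. In LangChain the usual fix is to tell the memory which key to save, e.g. `ConversationBufferMemory(memory_key="chat_history", output_key="answer")`.

```python
def save_context(outputs, output_key=None):
    """Mimics how a conversation memory picks the single output to store.

    With no explicit output_key and more than one key in the chain's
    output dict, it fails the same way the real chain does.
    """
    if output_key is None:
        if len(outputs) != 1:
            raise ValueError(
                f"One output key expected, got {list(outputs.keys())}"
            )
        return next(iter(outputs.values()))
    return outputs[output_key]


# With return_source_documents=True the chain returns two keys:
result = {"answer": "Paris is the capital.", "source_documents": ["doc1"]}

try:
    save_context(result)  # no output_key -> the ValueError you saw
except ValueError as e:
    print(e)

# Telling the memory which key to save resolves the ambiguity:
print(save_context(result, output_key="answer"))
```

So the chain can still return all the keys; the memory just needs to be told which one to remember.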