My question is in reference to the 06_chat notebook, specifically this code below:
# Build prompt
from langchain.prompts import PromptTemplate
template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. Use three sentences maximum. Keep the answer as concise as possible. Always say "thanks for asking!" at the end of the answer.
{context}
Question: {question}
Helpful Answer:"""
QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question"], template=template)
# Run chain
from langchain.chains import RetrievalQA
question = "Is probability a class topic?"
qa_chain = RetrievalQA.from_chain_type(
    llm,
    retriever=vectordb.as_retriever(),
    return_source_documents=True,
    chain_type_kwargs={"prompt": QA_CHAIN_PROMPT},
)
result = qa_chain({"query": question})
result["result"]
My questions:
- In qa_chain, I only see one input: the text value held by "question". But in the prompt template we define two input variables: QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question"], template=template). How does the second variable get filled?
- In the chain run, qa_chain({"query": question}), the input key is "query". Why is it not "question", since that is how we named the input variable in the prompt template?
- Also, which code passes the similarity-search documents to the "context" input variable in the prompt template?
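To make my questions concrete, here is my rough mental model of what I assume RetrievalQA does internally. This is only a simplified sketch, not the actual library internals, and answer_with_retrieval is a made-up helper name, not part of the LangChain API:

# Simplified sketch of what I *assume* RetrievalQA does internally.
# `answer_with_retrieval` is a hypothetical helper, not LangChain code.
def answer_with_retrieval(llm, retriever, prompt, query):
    # The "query" input is used for the similarity search
    docs = retriever.get_relevant_documents(query)
    # The retrieved documents are joined into one context string
    context = "\n\n".join(doc.page_content for doc in docs)
    # Both template variables get filled here: "context" from the
    # retrieved docs, "question" from the "query" input
    filled_prompt = prompt.format(context=context, question=query)
    # The filled prompt is sent to the LLM
    return llm.predict(filled_prompt)

Is that roughly right, i.e. does the chain map the "query" key to the "question" template variable and fill "context" from the retriever on its own?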
Perhaps I am missing something obvious.
Please advise. Thank you.