acm
June 10, 2023, 8:56am
I get this Python error on the line `chat = ChatOpenAI(temperature=0.0)`:
```
File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for PromptTemplate
__root__
  Invalid format specifier (type=value_error)
```
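From what I can tell, str.format() treats anything after a colon inside braces as a format specifier, so a template that contains literal braces (a JSON example, for instance) can raise exactly this error. A minimal guess at a repro (made-up template, not my actual prompt):
```
from langchain.prompts import PromptTemplate

# '"answer"' is parsed as a field name and ' "..."' as its format
# specifier, which str.format() rejects with "Invalid format specifier".
PromptTemplate.from_template('Respond as {"answer": "..."}')
```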
Any help? All library versions seem correct and compatible. I'm using Windows 10 with:
python 3.10.11
pydantic==1.10.9
langchain==0.0.195
langchainplus-sdk==0.0.8
openai==0.27.7
openapi-schema-pydantic==1.2.4
Have a look at this issue I found via a Google search:
GitHub issue, opened 08:38AM - 25 Mar 23 UTC:
I am referring to the documentation [here](https://langchain.readthedocs.io/en/latest/modules/chat/examples/chat_vector_db.html). I created a file `test.py`:
```
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.text_splitter import CharacterTextSplitter
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage,
)
from langchain.document_loaders import TextLoader
loader = TextLoader('test.txt')
documents = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=200, chunk_overlap=0)
documents = text_splitter.split_documents(documents)
embeddings = OpenAIEmbeddings()
vectorstore = Chroma.from_documents(documents, embeddings)
system_template="""Use the following pieces of context to answer the users question.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}"""
messages = [
SystemMessagePromptTemplate.from_template(system_template),
HumanMessagePromptTemplate.from_template("{question}")
]
prompt = ChatPromptTemplate.from_messages(messages)
qa = ConversationalRetrievalChain.from_llm(ChatOpenAI(temperature=0), vectorstore, qa_prompt=prompt)
chat_history = []
query = "Who is the CEO?"
result = qa({"question": query, "chat_history": chat_history})
print(result)
```
Which throws
```
Traceback (most recent call last):
  File "test.py", line 44, in <module>
    qa = ConversationalRetrievalChain.from_llm(ChatOpenAI(temperature=0), vectorstore, qa_prompt=prompt)
  File "/home/prajin/works/ults/gpt/lamda-openai-chat-python/venv/lib/python3.8/site-packages/langchain/chains/conversational_retrieval/base.py", line 140, in from_llm
    return cls(
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ConversationalRetrievalChain
retriever
  instance of BaseRetriever expected (type=type_error.arbitrary_type; expected_arbitrary_type=BaseRetriever)
```
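That issue's error is a different one from mine, but for reference, the fix there appears to be passing the vector store as a retriever instead of raw:
```
# vectorstore.as_retriever() returns a BaseRetriever, which is what the
# chain's validator expects (vectorstore and prompt as in the script above)
qa = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(temperature=0),
    vectorstore.as_retriever(),
    qa_prompt=prompt,
)
```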