Hey! I'm having some trouble, and I can't find anyone online who has run into this. I'm running Llama 3.1 locally via Ollama. When I run simple code like the following, it returns an empty string! I don't know what I'm doing wrong.
The logs don't show anything unusual, and the local Ollama server is running. What should I try to get a response? I want to be able to summarize some documents.
from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama
# from langchain.chains.combine_documents import create_stuff_documents_chain

prompt = ChatPromptTemplate.from_messages(
    [("system", "What are everyone's favorite colors:\n\n{context}")]
)
llm = ChatOllama(model="llama3.1:latest")

# chain = create_stuff_documents_chain(llm, prompt, output_parser=None)
chain = prompt | llm

docs = [
    Document(page_content="Jesse loves red but not yellow"),
    Document(page_content="Jamal loves green but not as much as he loves orange"),
]

print(chain.invoke({"context": docs}))
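One guess I had: maybe the `{context}` slot in the prompt wants a plain string rather than a list of `Document` objects? For reference, this is the kind of joined string I'd expect the chain to stuff into the prompt (no LangChain needed to run this; `FakeDoc` is just a stand-in for `Document` to show the formatting):

```python
from dataclasses import dataclass


@dataclass
class FakeDoc:
    # Stand-in for langchain_core.documents.Document, holding only
    # the page_content field this example cares about.
    page_content: str


docs = [
    FakeDoc(page_content="Jesse loves red but not yellow"),
    FakeDoc(page_content="Jamal loves green but not as much as he loves orange"),
]

# Join the documents the way a "stuff" chain would: plain text,
# separated by blank lines, ready to drop into {context}.
context = "\n\n".join(d.page_content for d in docs)
print(context)
```

Is that the shape the chain expects, or should passing the `Document` list directly work?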
Just in case, I tried both chains (the `prompt | llm` one and the commented-out `create_stuff_documents_chain` one), and both give the same result: nothing. With ChatOpenAI, it works perfectly!
Thanks!