I created a local conda env on my PC for this course, as follows:
conda deactivate
conda create -n openai_rag python
conda activate openai_rag
pip install python-dotenv
pip install numpy
pip install jupyter
pip install trulens_eval
pip install -U llama-index-readers-file
pip install -U llama-index-core
pip install -U llama-index-llms-openai
pip install -U llama-index-embeddings-openai
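Since python-dotenv is installed, I load the OpenAI key from a local .env file before running the notebooks. A minimal sketch (it assumes the key is stored as OPENAI_API_KEY=... in a .env file next to the notebook):

import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current working directory
openai_api_key = os.getenv("OPENAI_API_KEY")
assert openai_api_key, "OPENAI_API_KEY not found - check the .env file"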
The problem with llm = OpenAI() was solved by writing
from llama_index.llms.openai import OpenAI
instead of
from llama_index.core import llms
from llms import OpenAI
Then the following line will work:
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)
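As a quick smoke test for that import (just a sketch; it assumes OPENAI_API_KEY is already loaded into the environment):

from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)
# complete() is the simplest call to check that the client and key work
print(llm.complete("Say hello in one word."))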
pip install llama-index-embeddings-huggingface
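The huggingface embeddings package is for the local embedding model used later in the course. A minimal check that it works (the model name BAAI/bge-small-en-v1.5 is my assumption of the course default; adjust if yours differs):

from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# downloads the model on first use and runs locally, no API key needed
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
vector = embed_model.get_text_embedding("hello world")
print(len(vector))  # embedding dimensionality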
In sessions L1, L2, etc., I always get an error while executing the command
tru_recorder = get_prebuilt_trulens_recorder(query_engine, feedbacks=feedbacks, app_id="Direct Query Engine")
This is the Python error I get; can anybody help?
ValidationError Traceback (most recent call last)
Cell In[46], line 1
----> 1 tru_recorder = get_trulens_recorder(query_engine, feedbacks=feedbacks, app_id="Direct Query Engine")
Cell In[42], line 2, in get_trulens_recorder(query_engine, feedbacks, app_id)
1 def get_trulens_recorder(query_engine, feedbacks, app_id):
----> 2 tru_recorder = TruLlama(
3 query_engine,
4 app_id=app_id,
5 feedbacks=feedbacks
6 )
7 return tru_recorder
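For anyone trying to reproduce, this is roughly how the recorder is wired up on my side (a sketch only; query_engine and feedbacks come from the course notebook, and Tru() just opens the local TruLens database):

from trulens_eval import Tru, TruLlama

tru = Tru()  # opens/creates the local TruLens SQLite database

def get_trulens_recorder(query_engine, feedbacks, app_id):
    # same wrapper as in the traceback above
    return TruLlama(query_engine, app_id=app_id, feedbacks=feedbacks)

tru_recorder = get_trulens_recorder(query_engine, feedbacks=feedbacks, app_id="Direct Query Engine")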