L5: Role of the LLM

Hi, I am just gathering steam in GenAI, so I'm trying to understand the role of the LLM in L5 on chaining prompts. I am wondering how crucial the role of the LLM really was.

It seems to me that we could just as well have implemented the chatbot in plain Python, maybe with some additional code, without the LLM. What role did “gpt-3.5-turbo” play? All the information required to answer the query was in the prompt itself; nothing really came from the LLM.

Or is my understanding correct that the LLM facilitated the dialogue in natural language, but the information (categories + products) was all contained in the prompt(s)?

Cheers
Ram

First of all, I'm a newbie here, so take my answer at your own risk :wink:

The LLM is the tool that works from a user question (in the course the question used most is “I want info about the SmartX ProPhone, also the FotoSnap camera, the DSLR one, and also tell me about TVs”). Obviously you have to pass the system a list of the categories and products with their data somehow. Then the LLM “understands” the question, picks out the most relevant results, and returns a response in natural language.
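For what it's worth, here is a minimal sketch of that extraction step (written in the pre-1.0 `openai` library style used in the course notebooks; the shortened catalogue and the exact wording of the system message are my own placeholders, not the course's):

```python
import openai  # pre-1.0 interface; assumes OPENAI_API_KEY is set in the environment

# Hypothetical, shortened product catalogue; in the course this comes from a helper
# that returns all categories and products as one big string.
products_and_categories = """
Smartphones and Accessories: SmartX ProPhone
Cameras and Camcorders: FotoSnap DSLR Camera
Televisions and Home Theater Systems: CineView 4K TV, SoundMax Home Theater
"""

system_message = f"""
You will be given a customer service query.
Output a Python list of objects, where each object has the format
    {{'category': <one of the allowed categories>, 'products': <list of product names>}}
Only use categories and products from this allowed list:
{products_and_categories}
If no products or categories are mentioned, output an empty list.
"""

user_message = ("tell me about the smartx pro phone and the fotosnap camera, "
                "the dslr one, and also tell me about your tvs")

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    temperature=0,
    messages=[
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_message},
    ],
)

# The model maps the informal wording ("smartx pro phone", "the dslr one") onto the
# exact catalogue entries -- that mapping is the part plain Python string matching
# would struggle with, even though the catalogue itself sits in the prompt.
print(response.choices[0].message["content"])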

Towards the end of the course you will learn how to evaluate the LLM's response and modify the prompt to improve the result and the customer's satisfaction.
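As a rough illustration of the kind of evaluation meant here (my own simple check, not the course's rubric), you could compare the extracted list against a hand-written ideal answer:

```python
import ast

# Hypothetical hand-written "ideal" answer for the query above.
ideal = [
    {"category": "Smartphones and Accessories", "products": ["SmartX ProPhone"]},
    {"category": "Cameras and Camcorders", "products": ["FotoSnap DSLR Camera"]},
]

def fraction_recovered(model_output: str, ideal_list: list) -> float:
    """Fraction of the ideal (category, product) pairs found in the model's list."""
    try:
        predicted = ast.literal_eval(model_output)  # the model was asked for a Python list
    except (ValueError, SyntaxError):
        return 0.0  # unparseable output scores zero
    ideal_pairs = {(d["category"], p) for d in ideal_list for p in d["products"]}
    found_pairs = {(d.get("category"), p)
                   for d in predicted for p in d.get("products", [])}
    return len(ideal_pairs & found_pairs) / len(ideal_pairs)

# e.g. fraction_recovered(response.choices[0].message["content"], ideal)
```

A low score would then be a signal to tweak the system message and try again.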