Prompting follow-up questions in an LLM session

Both ChatGPT and Gemini support chat sessions. After the initial question, you can continue asking follow-up questions in the same session. Under the hood, are the previous questions in a session sent to the LLM as part of the prompt? How does the LLM process follow-up questions in a session? Is an LLM session stateful or stateless – can the LLM remember what has been asked and what it has answered in the same session?

It can remember, but as far as I recall you have to trigger it explicitly. Check out the LangChain short course here; it shows how to keep ChatGPT in context!

Basically, the LLM must be provided with the previous history in some form. Most naively, you can directly pass in the previous question-answer pairs. In many cases, though, you can use more advanced techniques such as similarity-based history retrieval, where only the past turns most relevant to the current input are passed in.
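To make the naive approach concrete, here is a minimal sketch: keep every turn in a list and resend the whole list on each call. The `messages` format mirrors the role/content structure used by chat APIs such as OpenAI's; `call_llm` is a hypothetical stand-in for a real API call.

```python
def call_llm(messages):
    # Placeholder: a real implementation would send `messages` to the
    # model's API and return the generated reply.
    return f"(reply to: {messages[-1]['content']})"

# The session starts with an optional system message.
messages = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(question):
    messages.append({"role": "user", "content": question})
    answer = call_llm(messages)  # the FULL history goes in every call
    messages.append({"role": "assistant", "content": answer})
    return answer

ask("What is an LLM?")
ask("Is it stateful?")  # "remembered" only because history is resent
print(len(messages))    # system + 2 questions + 2 answers = 5
```

The model itself stays stateless; the illusion of memory comes entirely from the client resending the growing `messages` list.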

In any case, you MUST pass some form of history to the LLM in the prompt for it to remember and reason over the previous conversation. The chat sessions of ChatGPT and others do this behind the scenes.
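The similarity-based retrieval idea mentioned above can be sketched as follows. Instead of resending every past turn, each stored question-answer pair is scored against the new question and only the top-k are included in the prompt. Production systems typically use embedding similarity; here a simple word-overlap (Jaccard) score stands in so the example is self-contained, and all names (`retrieve`, `jaccard`) are illustrative, not from any real library.

```python
def jaccard(a: str, b: str) -> float:
    # Toy similarity: fraction of shared words between two strings.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def retrieve(history, query, k=2):
    # history: list of (question, answer) pairs from earlier in the session.
    # Return the k pairs whose question best matches the new query.
    scored = sorted(history, key=lambda qa: jaccard(qa[0], query), reverse=True)
    return scored[:k]

history = [
    ("How do I sort a list in Python?", "Use sorted() or list.sort()."),
    ("What is the capital of France?", "Paris."),
    ("How do I reverse a list in Python?", "Use reversed() or slicing [::-1]."),
]

relevant = retrieve(history, "How can I sort a Python list in place?", k=1)
print(relevant[0][0])  # the sorting question is the closest match
```

Only the retrieved pairs (plus the new question) would then be assembled into the prompt, which keeps the context window small as the session grows.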