ChatBot token limit

How should the ChatBot track token usage? I imagine that the context of a long conversation could grow beyond the token limit. How can that be tracked and addressed?
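For concreteness, here is the kind of tracking I have in mind, as a rough sketch. It uses an approximate 4-characters-per-token estimate (a real application would count with the model's actual tokenizer, e.g. the tiktoken library), and the `trim_to_budget` helper simply drops the oldest non-system turns once the estimate exceeds a budget:

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token on average."""
    return max(1, len(text) // 4)

def conversation_tokens(messages: list[dict]) -> int:
    """Sum the estimated tokens over every message in the history."""
    return sum(estimate_tokens(m["content"]) for m in messages)

def trim_to_budget(messages: list[dict], budget: int) -> list[dict]:
    """Drop the oldest non-system messages until the history fits the budget."""
    trimmed = list(messages)
    while conversation_tokens(trimmed) > budget and len(trimmed) > 1:
        # Keep the system prompt at index 0; drop the oldest turn after it.
        trimmed.pop(1)
    return trimmed
```

This only bounds the history size; it throws away old context rather than preserving it.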

Here is a link I found How do I check my token usage? | OpenAI Help Center


Along with that, every model has its own pricing plan and limits.

I have started learning GPT and implementing it in a few projects, but I get stuck with this too.

Thanks. What I’m really wondering about is how to address that within the chat application. As you get close to the token limit, what should you do? I suspect you should summarize to compress the tokens, but I was wondering if there were recommended ways of approaching this. I’ve started the LangChain short course and that might help answer this question.
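The summarize-and-compress idea might look something like this sketch. The `summarize` function here is only a placeholder (in practice it would be one extra model call asking for a summary of the older turns), and `keep_recent` is an assumed parameter for how many recent turns to keep verbatim:

```python
def summarize(messages: list[dict]) -> str:
    """Placeholder summary: a real implementation would call the model here."""
    joined = " ".join(m["content"] for m in messages)
    return "Summary of earlier conversation: " + joined[:100]

def compress_history(messages: list[dict], keep_recent: int = 4) -> list[dict]:
    """Replace everything except the system prompt and the most recent
    turns with a single summary message."""
    system, rest = messages[0], messages[1:]
    if len(rest) <= keep_recent:
        return messages  # nothing to compress yet
    old, recent = rest[:-keep_recent], rest[-keep_recent:]
    summary = {"role": "system", "content": summarize(old)}
    return [system, summary] + recent
```

Unlike simply dropping old turns, this keeps a compressed trace of the earlier conversation, at the cost of an extra summarization call.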

Yes, either compress the conversation or set max_tokens as a limit.
Here is a link which I think will help you out