Pizza Chatbot: how to deal with a "difficult customer"

Hi, I’m using the Pizza chatbot and it works well. However, I’m trying to find a way to deal with “difficult customers,” and I’ve come across two problems.

First, the bot is free to answer any question. So, if a user asks "tell me a joke" or "write an essay on George Washington," the assistant will do it. I tried to solve this by adding the following rule:

“Keep the conversation focused on placing the order and avoid answering any other questions unrelated to the order.”

User: tell me a joke
Assistant: Why did the tomato turn red? Because it saw the salad dressing! Now, let’s get back to your order. What would you like to order today?

User: write a essay about trees
Assistant: Sure, I’d be happy to help you with that. However, as an OrderBot, my primary function is to assist you with your order. Is there anything else I can help you with related to placing an order for food?

It works better, but it is still susceptible to manipulation.
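One way to make the rule easier to enforce in code is to ask the model for a fixed refusal prefix whenever it declines an off-topic request. This is a minimal sketch, not the official OrderBot prompt: the `OFF_TOPIC:` sentinel and the `build_messages` helper are my own assumptions, but the message-list shape matches the Chat Completions API.

```python
# Sketch of a stricter system prompt for an OrderBot-style assistant.
# The "OFF_TOPIC:" sentinel is a hypothetical convention: forcing a fixed
# prefix on refusals makes off-topic turns trivial to detect in code.
SYSTEM_PROMPT = """\
You are OrderBot, an automated service that collects pizza orders.
Only discuss the menu, the order, delivery, and payment.
If the user asks about anything else, reply with exactly:
"OFF_TOPIC: I can only help with your pizza order."
Do not tell jokes, write essays, or answer general questions.
"""

def build_messages(history):
    """Prepend the system prompt to the running conversation history."""
    return [{"role": "system", "content": SYSTEM_PROMPT}] + history

# Example: the list you would pass as `messages` to the chat API.
msgs = build_messages([{"role": "user", "content": "tell me a joke"}])
```

The model can still be manipulated into ignoring the sentinel, but it gives your application code a hook to act on, rather than having to guess whether a reply was on topic.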

Second, I tried to implement a rule that if the customer repeatedly tries to do anything other than place an order, the conversation should be politely ended and no further questions answered. However, the chatbot kept responding and never terminated the conversation, which could lead to unnecessary expenses (or maybe something similar to a DoS attack).

If anyone knows how to solve these problems, please let me know. Thank you.

Hi ElBossman, welcome to the community! That’s a good question, and honestly one whose solution is still up in the air when it comes to building applications on the GPT API.

The most straightforward approach is probably to keep expanding and tightening the rule constraints you give the model, which should keep it further on track. At the same time, because of the inherent randomness of LLMs, there will always be some tendency to stray unless you hard-code guards outside the model’s context, such as a conversation limit.

Anyway, great question, and I think it’s one we’ll get more answers to as the technology continues to change!
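To make the hard-coded conversation limit concrete, here is a minimal sketch. `call_model` is a hypothetical stand-in for your actual API call; the point is that the cap lives entirely outside the model, so no amount of prompt manipulation can bypass it.

```python
# Hard-coded turn cap enforced outside the model's context.
# `call_model` is a placeholder for the real chat-completion call.
MAX_TURNS = 20  # total messages (user + assistant) allowed in the history

def chat_turn(history, user_msg, call_model):
    """Run one turn, refusing to call the model once the cap is reached."""
    if len(history) >= MAX_TURNS:
        return history, "Turn limit reached -- please call the store to finish your order."
    history = history + [{"role": "user", "content": user_msg}]
    reply = call_model(history)
    history = history + [{"role": "assistant", "content": reply}]
    return history, reply
```

Because the check runs before the API call, a conversation that hits the cap costs nothing further, regardless of what the user types.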