ChatGPT: Printing Pizza Ordering Bot response doesn't work

Hi, I'm using PyCharm and practicing the ChatGPT pizza ordering bot, and I'm pasting this code right after the dashboard:

messages = context.copy()
messages.append(
    {'role': 'system',
     'content': 'Format the previous food order in JSON format. Itemize the price for each item. '
                'The fields should be 1) pizza, include size 2) list of toppings 3) list of drinks, include size '
                '4) list of sides, include size 5) total price 6) name, email and phone number '
                '7) delivery or pick up; if it is delivery, show the delivery address'},
)
# The fields should be 1) pizza, price 2) list of toppings 3) list of drinks, include size, include price 4) list of sides, include size, include price, 5) total price

response = get_completion_from_messages(messages, temperature=0)

This code is supposed to execute and print the pizza order summary right after the order is completed, but it doesn't do that. I'm using venv instead of conda like in the video.

Have you tried it inside the course notebook? Try that and see if you get better results. The exercise in the course prints out the summary at the end.

So one thing I don't understand is this line. As far as I can tell, when running it in a Jupyter notebook in my environment, response is a JSON-formatted string. When you pass it as a parameter to json.loads(), you get back a Python dictionary. From the variable name you're assigning to, json_response, it seems like you expect what you're printing to be JSON, but it won't be.

However, when I run your code in my env, it doesn’t even get that far, because json.loads() throws a JSONDecodeError. Is that what you are seeing? Try printing out the response first, before calling json.loads() and see if that doesn’t give a hint where the trouble is.

Here’s what mine does…

response = get_completion_from_messages(messages, temperature=0)

Here's a JSON summary of the previous food order:

{
  "pizza": {
    "type": "pepperoni",
    "size": "medium",
    "price": 12.95
  },
  "toppings": [],
  "drinks": [],
  "sides": [],
  "total_price": 11.66
}

Note that there are no toppings, drinks, or sides in this order. The total price reflects the 10% discount applied to the original price of $12.95.

Notice the non-JSON text at the beginning and end.

dict = json.loads(response)

→ 337 obj, end = self.raw_decode(s, idx=_w(s, 0).end())
338 end = _w(s, end).end()
339 if end != len(s):

File ~/miniforge3/envs/openai/lib/python3.11/json/, in JSONDecoder.raw_decode(self, s, idx)
353 obj, end = self.scan_once(s, idx)
354 except StopIteration as err:
→ 355 raise JSONDecodeError("Expecting value", s, err.value) from None
356 return obj, end

JSONDecodeError: Expecting value: line 1 column 1 (char 0)

This line loads the JSON string contained in response into a JSON object.

When you call the OpenAI API, it sends back a JSON string that you receive in 'response'.

I don't think this is correct. ChatGPT is already returning in its response a string that is (mostly) JSON formatted. json.loads() converts a JSON-formatted string into a Python dictionary.
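You can see this without calling the API at all. This is a minimal sketch using a hand-written order string (modeled on the output printed earlier in this thread, not a live response):

```python
import json

# A JSON-formatted string, similar to what ChatGPT returns in `response`
s = '{"pizza": {"type": "pepperoni", "size": "medium", "price": 12.95}, "total_price": 11.66}'

order = json.loads(s)        # parses the string into Python objects
print(type(order))           # <class 'dict'>
print(order["total_price"])  # 11.66
```

The parsed result is a plain Python dict, not JSON.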


Looks like what the decoder produces depends on the input type it receives:

class json.JSONDecoder(*, object_hook=None, parse_float=None, parse_int=None, parse_constant=None, strict=True, object_pairs_hook=None)

Simple JSON decoder.

Performs the following translations in decoding by default:

So the "it's a dict" part of my response looks to be incorrect. However, I still think the problem is that ChatGPT isn't returning a string that is entirely JSON formatted. At least in my environment, there is plain text in the response too, which causes the loads() method to bark.

OpenAI returns a response as a string in JSON format. The key here is "JSON format": it is a string containing JSON, not a JSON object. json.loads(…) converts it into a JSON object. And you are right, in Python a JSON object is represented as a dict, which is the data structure Python uses for it.

Do this: when you receive the response from the call to the OpenAI API, do print(response), copy that, and paste it in. You'll see the JSON right there.

I doubt that at any point the response is improperly formatted JSON.

OK, but look a few replies up in this thread, where I printed out the response string directly. It starts with unformatted text ("Here's a JSON summary of the previous food order:"), followed by some JSON within the {}, then ends with more unformatted text ("Note that there are no toppings, drinks, or sides in this order. The total price reflects the 10% discount applied to the original price of $12.95.").

As far as I can tell, that causes a runtime exception when passed into loads().
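That is easy to reproduce without the API. Here a hand-written string mimics a reply that wraps the JSON in commentary (the text is illustrative, not an actual API response):

```python
import json

# Mimics a model reply with plain text before the JSON
response = 'Here is a JSON summary: {"total_price": 11.66}'

try:
    json.loads(response)
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```

The decoder fails at character 0 because the very first token ('H') is not the start of any valid JSON value.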

You need to look at the function get_completion_from_messages. Most probably it is modifying the original response from the API. Try to get the response directly as received from the API.

This is the function:

import openai

def get_completion_from_messages(messages, model="gpt-3.5-turbo", temperature=0):
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=temperature,  # this is the degree of randomness of the model's output
    )
    # print(str(response.choices[0].message))
    return response.choices[0].message["content"]

As you can see, it returns only a portion of the actual response. If you take the full response from openai.ChatCompletion.create, you'll get complete JSON.
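For reference, the full object looks roughly like this. This is an abridged sketch of the v0.x ChatCompletion response shape, not captured from a live call; the id and content values are placeholders. The function above keeps only the innermost content string:

```python
# Abridged sketch of a ChatCompletion response (openai v0.x style)
full_response = {
    "id": "chatcmpl-abc123",
    "object": "chat.completion",
    "model": "gpt-3.5-turbo",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Here's a JSON summary of the previous food order: ...",
            },
            "finish_reason": "stop",
        }
    ],
}

# This is all that get_completion_from_messages hands back
content = full_response["choices"][0]["message"]["content"]
print(content)
```

Everything else in the envelope (id, model, finish_reason, usage, etc.) is valid JSON, but the content field inside it is free-form text.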

So what the function is doing is returning the content key of the first element of the choices array in the response JSON, and that content can be in any format.

You could try to get strict JSON by explicitly asking in the prompt for the model to return only JSON, with no surrounding text.
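Another workaround, if the model keeps adding commentary around the JSON, is to slice out just the {...} part before parsing. extract_json below is a hypothetical helper, not part of the course code, and it assumes the reply contains exactly one top-level JSON object:

```python
import json

def extract_json(text):
    # Find the first '{' and the last '}' and parse only that slice,
    # discarding any prose before or after the JSON object
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in response")
    return json.loads(text[start:end + 1])

reply = 'Here is your order summary: {"total_price": 11.66} Enjoy your pizza!'
print(extract_json(reply))  # {'total_price': 11.66}
```

This is brittle (e.g. it would break if the surrounding prose itself contained braces), so asking the model for pure JSON in the prompt is still the better first step.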