I just discovered that if I run the examples on the website, with the video on the left-hand side, everything seems to work. So everything works inside the notebook environment provided by DeepLearning.AI.
I only get the issues when copying the code into a freshly installed Jupyter.
Yes, I have tried to pin my installation to the 0.28 version by running this command in Jupyter:
!pip install openai==0.28
but I’m receiving this error message:
Building wheels for collected packages: aiohttp, frozenlist, multidict, yarl
Building wheel for aiohttp (pyproject.toml): started
Building wheel for aiohttp (pyproject.toml): finished with status 'error'
Building wheel for frozenlist (pyproject.toml): started
Building wheel for frozenlist (pyproject.toml): finished with status 'error'
Building wheel for multidict (pyproject.toml): started
Building wheel for multidict (pyproject.toml): finished with status 'error'
Building wheel for yarl (pyproject.toml): started
Building wheel for yarl (pyproject.toml): finished with status 'error'
Failed to build aiohttp frozenlist multidict yarl
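Those wheel-build failures are usually not about openai itself: openai 0.28 pulls in dependency pins (aiohttp, frozenlist, multidict, yarl) that have no prebuilt wheels for newer Python versions, so pip tries to compile them from source and fails. A quick check, assuming the notebook kernel is the same environment pip installs into, is to print the Python version:

import sys

# If this shows Python 3.12 or newer, the old dependency pins behind
# openai==0.28 often have to be built from source, which is what fails above.
print(sys.version)

If that is the case, migrating the code to the current openai package (as described below) is usually easier than forcing the 0.28 pin to build.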
Then I copy all the commands from the website into my Jupyter notebook, and when I run the prompt I still receive this error:
In the latest OpenAI package the type of the response.choices object has changed, so you must read the response like this:
print(response.choices[0].message.content)
Changes needed in the course code:
“”" Change this:
import openai
import os
from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())
openai.api_key = os.getenv(‘OPENAI_API_KEY’)
“”"
To this:
from openai import OpenAI
client = OpenAI()
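One caveat, assuming the API key lives in a .env file as in the course notebooks: OpenAI() only picks the key up from the OPENAI_API_KEY environment variable, so the dotenv lines are still needed before creating the client. A minimal sketch:

from openai import OpenAI
from dotenv import load_dotenv, find_dotenv

_ = load_dotenv(find_dotenv())  # load OPENAI_API_KEY from the .env file into the environment
client = OpenAI()               # the client reads OPENAI_API_KEY from the environment automatically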
“”" Change this:
def get_completion(prompt, model=“gpt-3.5-turbo”):
messages = [{“role”: “user”, “content”: prompt}]
response = openai.ChatCompletion.create(
model=model,
messages=messages,
temperature=0, # this is the degree of randomness of the model’s output
)
return response.choices[0].message[“content”]
“”"
To this:
def get_completion(prompt, model="gpt-3.5-turbo"):
    messages = [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0,  # this is the degree of randomness of the model's output
    )
    return response.choices[0].message.content
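With that helper in place, it can be called exactly as in the course notebooks; the prompt below is just a hypothetical example:

prompt = "Translate the following text to French: 'Hello, world!'"
print(get_completion(prompt))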