API Error in "Week 2: Generative AI Projects" Jupyter coding sessions

In Week 2: Generative AI Projects, the two Jupyter coding sessions

  • Lesson 2 “Prompting an LLM in code” (GENAI4E_Activity1.ipynb)
  • Lesson 3 “Reputation monitoring system” (GENAI4E_Activity2.ipynb)

raise an error when running the code cell that prompts the LLM.

Z

It's better to post a screenshot of the error here so that a mentor can get an idea of what's going on!

OK, I avoided that because of the number of error lines it generated, but here it is:

---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
File /usr/local/lib/python3.9/site-packages/urllib3/connection.py:174, in HTTPConnection._new_conn(self)
    173 try:
--> 174     conn = connection.create_connection(
    175         (self._dns_host, self.port), self.timeout, **extra_kw
    176     )
    178 except SocketTimeout:

File /usr/local/lib/python3.9/site-packages/urllib3/util/connection.py:72, in create_connection(address, timeout, source_address, socket_options)
     68     return six.raise_from(
     69         LocationParseError(u"'%s', label empty or too long" % host), None
     70     )
---> 72 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
     73     af, socktype, proto, canonname, sa = res

File /usr/local/lib/python3.9/socket.py:954, in getaddrinfo(host, port, family, type, proto, flags)
    953 addrlist = []
--> 954 for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    955     af, socktype, proto, canonname, sa = res

gaierror: [Errno -2] Name or service not known

During handling of the above exception, another exception occurred:

NewConnectionError                        Traceback (most recent call last)
File /usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py:714, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    713 # Make the request on the httplib connection object.
--> 714 httplib_response = self._make_request(
    715     conn,
    716     method,
    717     url,
    718     timeout=timeout_obj,
    719     body=body,
    720     headers=headers,
    721     chunked=chunked,
    722 )
    724 # If we're going to release the connection in ``finally:``, then
    725 # the response doesn't need to know about the connection. Otherwise
    726 # it will also try to release it and we'll have a double-release
    727 # mess.

File /usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py:415, in HTTPConnectionPool._make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
    414     else:
--> 415         conn.request(method, url, **httplib_request_kw)
    417 # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
    418 # legitimately able to close the connection after sending a valid response.
    419 # With this behaviour, the received response is still readable.

File /usr/local/lib/python3.9/site-packages/urllib3/connection.py:244, in HTTPConnection.request(self, method, url, body, headers)
    243     headers["User-Agent"] = _get_default_user_agent()
--> 244 super(HTTPConnection, self).request(method, url, body=body, headers=headers)

File /usr/local/lib/python3.9/http/client.py:1285, in HTTPConnection.request(self, method, url, body, headers, encode_chunked)
   1284 """Send a complete request to the server."""
-> 1285 self._send_request(method, url, body, headers, encode_chunked)

File /usr/local/lib/python3.9/http/client.py:1331, in HTTPConnection._send_request(self, method, url, body, headers, encode_chunked)
   1330     body = _encode(body, 'body')
-> 1331 self.endheaders(body, encode_chunked=encode_chunked)

File /usr/local/lib/python3.9/http/client.py:1280, in HTTPConnection.endheaders(self, message_body, encode_chunked)
   1279     raise CannotSendHeader()
-> 1280 self._send_output(message_body, encode_chunked=encode_chunked)

File /usr/local/lib/python3.9/http/client.py:1040, in HTTPConnection._send_output(self, message_body, encode_chunked)
   1039 del self._buffer[:]
-> 1040 self.send(msg)
   1042 if message_body is not None:
   1043 
   1044     # create a consistent interface to message_body

File /usr/local/lib/python3.9/http/client.py:980, in HTTPConnection.send(self, data)
    979 if self.auto_open:
--> 980     self.connect()
    981 else:

File /usr/local/lib/python3.9/site-packages/urllib3/connection.py:205, in HTTPConnection.connect(self)
    204 def connect(self):
--> 205     conn = self._new_conn()
    206     self._prepare_conn(conn)

File /usr/local/lib/python3.9/site-packages/urllib3/connection.py:186, in HTTPConnection._new_conn(self)
    185 except SocketError as e:
--> 186     raise NewConnectionError(
    187         self, "Failed to establish a new connection: %s" % e
    188     )
    190 return conn

NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f741c324040>: Failed to establish a new connection: [Errno -2] Name or service not known

During handling of the above exception, another exception occurred:

MaxRetryError                             Traceback (most recent call last)
File /usr/local/lib/python3.9/site-packages/requests/adapters.py:489, in HTTPAdapter.send(self, request, stream, timeout, verify, cert, proxies)
    488 if not chunked:
--> 489     resp = conn.urlopen(
    490         method=request.method,
    491         url=url,
    492         body=request.body,
    493         headers=request.headers,
    494         redirect=False,
    495         assert_same_host=False,
    496         preload_content=False,
    497         decode_content=False,
    498         retries=self.max_retries,
    499         timeout=timeout,
    500     )
    502 # Send the request.
    503 else:

File /usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py:798, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    796     e = ProtocolError("Connection aborted.", e)
--> 798 retries = retries.increment(
    799     method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
    800 )
    801 retries.sleep()

File /usr/local/lib/python3.9/site-packages/urllib3/util/retry.py:592, in Retry.increment(self, method, url, response, error, _pool, _stacktrace)
    591 if new_retry.is_exhausted():
--> 592     raise MaxRetryError(_pool, url, error or ResponseError(cause))
    594 log.debug("Incremented Retry for (url='%s'): %r", url, new_retry)

MaxRetryError: HTTPConnectionPool(host='jupyter-api-proxy.stg.internal.dlai', port=80): Max retries exceeded with url: /rev-proxy/gen_ai_for_everyone/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f741c324040>: Failed to establish a new connection: [Errno -2] Name or service not known'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:516, in APIRequestor.request_raw(self, method, url, params, supplied_headers, files, stream, request_id, request_timeout)
    515 try:
--> 516     result = _thread_context.session.request(
    517         method,
    518         abs_url,
    519         headers=headers,
    520         data=data,
    521         files=files,
    522         stream=stream,
    523         timeout=request_timeout if request_timeout else TIMEOUT_SECS,
    524     )
    525 except requests.exceptions.Timeout as e:

File /usr/local/lib/python3.9/site-packages/requests/sessions.py:587, in Session.request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    586 send_kwargs.update(settings)
--> 587 resp = self.send(prep, **send_kwargs)
    589 return resp

File /usr/local/lib/python3.9/site-packages/requests/sessions.py:701, in Session.send(self, request, **kwargs)
    700 # Send the request
--> 701 r = adapter.send(request, **kwargs)
    703 # Total elapsed time of the request (approximately)

File /usr/local/lib/python3.9/site-packages/requests/adapters.py:565, in HTTPAdapter.send(self, request, stream, timeout, verify, cert, proxies)
    563         raise SSLError(e, request=request)
--> 565     raise ConnectionError(e, request=request)
    567 except ClosedPoolError as e:

ConnectionError: HTTPConnectionPool(host='jupyter-api-proxy.stg.internal.dlai', port=80): Max retries exceeded with url: /rev-proxy/gen_ai_for_everyone/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f741c324040>: Failed to establish a new connection: [Errno -2] Name or service not known'))

The above exception was the direct cause of the following exception:

APIConnectionError                        Traceback (most recent call last)
Cell In[2], line 9
      1 prompt = '''
      2     Classify the following review 
      3     as having either a positive or
   (...)
      6     The banana pudding was really tasty!
      7 '''
----> 9 response = llm_response(prompt)
     10 print(response)

Cell In[1], line 7, in llm_response(prompt)
      6 def llm_response(prompt):
----> 7     response = openai.ChatCompletion.create(
      8         model='gpt-3.5-turbo',
      9         messages=[{'role':'user','content':prompt}],
     10         temperature=0
     11     )
     12     return response.choices[0].message['content']

File /usr/local/lib/python3.9/site-packages/openai/api_resources/chat_completion.py:25, in ChatCompletion.create(cls, *args, **kwargs)
     23 while True:
     24     try:
---> 25         return super().create(*args, **kwargs)
     26     except TryAgain as e:
     27         if timeout is not None and time.time() > start + timeout:

File /usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py:153, in EngineAPIResource.create(cls, api_key, api_base, api_type, request_id, api_version, organization, **params)
    127 @classmethod
    128 def create(
    129     cls,
   (...)
    136     **params,
    137 ):
    138     (
    139         deployment_id,
    140         engine,
   (...)
    150         api_key, api_base, api_type, api_version, organization, **params
    151     )
--> 153     response, _, api_key = requestor.request(
    154         "post",
    155         url,
    156         params=params,
    157         headers=headers,
    158         stream=stream,
    159         request_id=request_id,
    160         request_timeout=request_timeout,
    161     )
    163     if stream:
    164         # must be an iterator
    165         assert not isinstance(response, OpenAIResponse)

File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:216, in APIRequestor.request(self, method, url, params, headers, files, stream, request_id, request_timeout)
    205 def request(
    206     self,
    207     method,
   (...)
    214     request_timeout: Optional[Union[float, Tuple[float, float]]] = None,
    215 ) -> Tuple[Union[OpenAIResponse, Iterator[OpenAIResponse]], bool, str]:
--> 216     result = self.request_raw(
    217         method.lower(),
    218         url,
    219         params=params,
    220         supplied_headers=headers,
    221         files=files,
    222         stream=stream,
    223         request_id=request_id,
    224         request_timeout=request_timeout,
    225     )
    226     resp, got_stream = self._interpret_response(result, stream)
    227     return resp, got_stream, self.api_key

File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:528, in APIRequestor.request_raw(self, method, url, params, supplied_headers, files, stream, request_id, request_timeout)
    526     raise error.Timeout("Request timed out: {}".format(e)) from e
    527 except requests.exceptions.RequestException as e:
--> 528     raise error.APIConnectionError(
    529         "Error communicating with OpenAI: {}".format(e)
    530     ) from e
    531 util.log_debug(
    532     "OpenAI API response",
    533     path=abs_url,
   (...)
    536     request_id=result.headers.get("X-Request-Id"),
    537 )
    538 # Don't read the whole stream for debug logging unless necessary.

APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='jupyter-api-proxy.stg.internal.dlai', port=80): Max retries exceeded with url: /rev-proxy/gen_ai_for_everyone/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f741c324040>: Failed to establish a new connection: [Errno -2] Name or service not known'))
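
If it helps with triage: the failure happens before any OpenAI logic runs. The request goes to jupyter-api-proxy.stg.internal.dlai, and that hostname cannot be resolved from the lab environment (gaierror: Name or service not known), so this looks like a lab/proxy configuration issue rather than something in the notebook code. A quick check that reproduces just the DNS failure (my own sketch, not part of the notebook):

import socket

# Try to resolve the proxy host and port taken from the traceback above.
# If this raises socket.gaierror, no request from the notebook can succeed,
# regardless of the prompt or the OpenAI client code.
try:
    socket.getaddrinfo("jupyter-api-proxy.stg.internal.dlai", 80)
    print("Proxy host resolves; the problem is elsewhere.")
except socket.gaierror as exc:
    print(f"DNS lookup failed: {exc}")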


Hi, and thank you for reporting. This might have been just a glitch. The labs are running fine on my end. Can you try again? On another note, do you see the video playing on the right side by default?

Sorry, I see the errors now. I was trying a different approach earlier. I will raise this to the team.

I am facing the same issue.

Hi, and thank you for following up. I'm sorry for the experience. The team acknowledged the issue last week. I will follow up as well, but it might take until tomorrow to resolve because it's the weekend.

I took the opportunity to create my own Jupyter environment (using Anaconda), download the Jupyter notebooks (.ipynb files) from the course, and run them locally. You will need an OpenAI API key, but it doesn't cost much: $5 goes a long way … jump ahead to the Week 2 Cost Intuition lesson for details.

If you can do it, you should try … you will have to do it eventually; a rough sketch of the local setup is below.
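
For reference, here is a minimal sketch of what the local version looks like. It assumes the pre-1.0 openai Python library (the version shown in the traceback above) and that you put your own key in an OPENAI_API_KEY environment variable (my naming, not the notebook's); the prompt is paraphrased from the notebook cell.

import os
import openai

# Use your own key directly instead of the course's internal proxy.
openai.api_key = os.environ["OPENAI_API_KEY"]

def llm_response(prompt):
    # Same call the notebook makes (openai<1.0 API).
    response = openai.ChatCompletion.create(
        model='gpt-3.5-turbo',
        messages=[{'role': 'user', 'content': prompt}],
        temperature=0
    )
    return response.choices[0].message['content']

# Example prompt, paraphrased from the notebook's cell.
prompt = '''
Classify the following review
as having either a positive or negative sentiment:

The banana pudding was really tasty!
'''
print(llm_response(prompt))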

Z

Hi everyone! Sorry for the delay. The notebooks should now be fixed. I just tried them, and all cells run. Thank you again for reporting!