Authentication Error when running Notebooks

I have tested the L3, L4 and L5 notebooks and they currently all give me the same error. As far as I understand it, there is an issue with the OpenAI authentication token (example from the L5 notebook, Cell 19):

---------------------------------------------------------------------------
AuthenticationError                       Traceback (most recent call last)
Cell In[19], line 1
----> 1 docs = db.similarity_search(query)

File /usr/local/lib/python3.9/site-packages/langchain/vectorstores/docarray/base.py:123, in DocArrayIndex.similarity_search(self, query, k, **kwargs)
    111 def similarity_search(
    112     self, query: str, k: int = 4, **kwargs: Any
    113 ) -> List[Document]:
    114     """Return docs most similar to query.
    115 
    116     Args:
   (...)
    121         List of Documents most similar to the query.
    122     """
--> 123     results = self.similarity_search_with_score(query, k=k, **kwargs)
    124     return [doc for doc, _ in results]

File /usr/local/lib/python3.9/site-packages/langchain/vectorstores/docarray/base.py:101, in DocArrayIndex.similarity_search_with_score(self, query, k, **kwargs)
     89 def similarity_search_with_score(
     90     self, query: str, k: int = 4, **kwargs: Any
     91 ) -> List[Tuple[Document, float]]:
     92     """Return docs most similar to query.
     93 
     94     Args:
   (...)
     99         List of Documents most similar to the query and score for each.
    100     """
--> 101     query_embedding = self.embedding.embed_query(query)
    102     query_doc = self.doc_cls(embedding=query_embedding)  # type: ignore
    103     docs, scores = self.doc_index.find(query_doc, search_field="embedding", limit=k)

File /usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py:297, in OpenAIEmbeddings.embed_query(self, text)
    288 def embed_query(self, text: str) -> List[float]:
    289     """Call out to OpenAI's embedding endpoint for embedding query text.
    290 
    291     Args:
   (...)
    295         Embedding for the text.
    296     """
--> 297     embedding = self._embedding_func(text, engine=self.deployment)
    298     return embedding

File /usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py:263, in OpenAIEmbeddings._embedding_func(self, text, engine)
    259 if self.model.endswith("001"):
    260     # See: https://github.com/openai/openai-python/issues/418#issuecomment-1525939500
    261     # replace newlines, which can negatively affect performance.
    262     text = text.replace("\n", " ")
--> 263 return embed_with_retry(
    264     self,
    265     input=[text],
    266     engine=engine,
    267     request_timeout=self.request_timeout,
    268     headers=self.headers,
    269 )["data"][0]["embedding"]

File /usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py:64, in embed_with_retry(embeddings, **kwargs)
     60 @retry_decorator
     61 def _embed_with_retry(**kwargs: Any) -> Any:
     62     return embeddings.client.create(**kwargs)
---> 64 return _embed_with_retry(**kwargs)

File /usr/local/lib/python3.9/site-packages/tenacity/__init__.py:289, in BaseRetrying.wraps.<locals>.wrapped_f(*args, **kw)
    287 @functools.wraps(f)
    288 def wrapped_f(*args: t.Any, **kw: t.Any) -> t.Any:
--> 289     return self(f, *args, **kw)

File /usr/local/lib/python3.9/site-packages/tenacity/__init__.py:379, in Retrying.__call__(self, fn, *args, **kwargs)
    377 retry_state = RetryCallState(retry_object=self, fn=fn, args=args, kwargs=kwargs)
    378 while True:
--> 379     do = self.iter(retry_state=retry_state)
    380     if isinstance(do, DoAttempt):
    381         try:

File /usr/local/lib/python3.9/site-packages/tenacity/__init__.py:314, in BaseRetrying.iter(self, retry_state)
    312 is_explicit_retry = fut.failed and isinstance(fut.exception(), TryAgain)
    313 if not (is_explicit_retry or self.retry(retry_state)):
--> 314     return fut.result()
    316 if self.after is not None:
    317     self.after(retry_state)

File /usr/local/lib/python3.9/concurrent/futures/_base.py:439, in Future.result(self, timeout)
    437     raise CancelledError()
    438 elif self._state == FINISHED:
--> 439     return self.__get_result()
    441 self._condition.wait(timeout)
    443 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File /usr/local/lib/python3.9/concurrent/futures/_base.py:391, in Future.__get_result(self)
    389 if self._exception:
    390     try:
--> 391         raise self._exception
    392     finally:
    393         # Break a reference cycle with the exception in self._exception
    394         self = None

File /usr/local/lib/python3.9/site-packages/tenacity/__init__.py:382, in Retrying.__call__(self, fn, *args, **kwargs)
    380 if isinstance(do, DoAttempt):
    381     try:
--> 382         result = fn(*args, **kwargs)
    383     except BaseException:  # noqa: B902
    384         retry_state.set_exception(sys.exc_info())  # type: ignore[arg-type]

File /usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py:62, in embed_with_retry.<locals>._embed_with_retry(**kwargs)
     60 @retry_decorator
     61 def _embed_with_retry(**kwargs: Any) -> Any:
---> 62     return embeddings.client.create(**kwargs)

File /usr/local/lib/python3.9/site-packages/openai/api_resources/embedding.py:33, in Embedding.create(cls, *args, **kwargs)
     31 while True:
     32     try:
---> 33         response = super().create(*args, **kwargs)
     35         # If a user specifies base64, we'll just return the encoded string.
     36         # This is only for the default case.
     37         if not user_provided_encoding_format:

File /usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py:153, in EngineAPIResource.create(cls, api_key, api_base, api_type, request_id, api_version, organization, **params)
    127 @classmethod
    128 def create(
    129     cls,
   (...)
    136     **params,
    137 ):
    138     (
    139         deployment_id,
    140         engine,
   (...)
    150         api_key, api_base, api_type, api_version, organization, **params
    151     )
--> 153     response, _, api_key = requestor.request(
    154         "post",
    155         url,
    156         params=params,
    157         headers=headers,
    158         stream=stream,
    159         request_id=request_id,
    160         request_timeout=request_timeout,
    161     )
    163     if stream:
    164         # must be an iterator
    165         assert not isinstance(response, OpenAIResponse)

File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:230, in APIRequestor.request(self, method, url, params, headers, files, stream, request_id, request_timeout)
    209 def request(
    210     self,
    211     method,
   (...)
    218     request_timeout: Optional[Union[float, Tuple[float, float]]] = None,
    219 ) -> Tuple[Union[OpenAIResponse, Iterator[OpenAIResponse]], bool, str]:
    220     result = self.request_raw(
    221         method.lower(),
    222         url,
   (...)
    228         request_timeout=request_timeout,
    229     )
--> 230     resp, got_stream = self._interpret_response(result, stream)
    231     return resp, got_stream, self.api_key

File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:624, in APIRequestor._interpret_response(self, result, stream)
    616     return (
    617         self._interpret_response_line(
    618             line, result.status_code, result.headers, stream=True
    619         )
    620         for line in parse_stream(result.iter_lines())
    621     ), True
    622 else:
    623     return (
--> 624         self._interpret_response_line(
    625             result.content.decode("utf-8"),
    626             result.status_code,
    627             result.headers,
    628             stream=False,
    629         ),
    630         False,
    631     )

File /usr/local/lib/python3.9/site-packages/openai/api_requestor.py:687, in APIRequestor._interpret_response_line(self, rbody, rcode, rheaders, stream)
    685 stream_error = stream and "error" in resp.data
    686 if stream_error or not 200 <= rcode < 300:
--> 687     raise self.handle_error_response(
    688         rbody, rcode, resp.data, rheaders, stream_error=stream_error
    689     )
    690 return resp

AuthenticationError: Your authentication token is not from a valid issuer.

Can this be fixed individually, or does it require action from the course organiser?
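
For reference, this is roughly how I checked which key the notebook is picking up before it hits the error above. It is only a minimal sketch: I am assuming the setup cell loads the key from an OPENAI_API_KEY environment variable via python-dotenv, which may not match the course notebook exactly.

import os
import openai
from dotenv import load_dotenv, find_dotenv

# Load the key the way the setup cell appears to (assumption: .env file + OPENAI_API_KEY).
_ = load_dotenv(find_dotenv())
api_key = os.environ.get("OPENAI_API_KEY")
print("Key found:", api_key is not None)

openai.api_key = api_key

# Smallest call that goes through the same code path as db.similarity_search
# (embed_query -> embed_with_retry -> openai.Embedding.create).
try:
    openai.Embedding.create(input=["test"], model="text-embedding-ada-002")
    print("Embedding call succeeded")
except openai.error.AuthenticationError as err:
    print("AuthenticationError:", err)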


I am encountering the exact same error message while running the notebooks for a different course on ChatGPT prompting, so this seems like a general problem with OpenAI authentication that needs a systemic fix rather than individual workarounds.

Is there any way to open a ticket with the DeepLearning.AI helpdesk for this issue?


One solution is to set up your own API key on OpenAI and upgrade to a paid account. After doing this, I can run the examples locally on my machine.
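
Roughly, the local setup looks like this. Treat it as a sketch: the environment variable name and the embedding sanity check are assumptions based on the usual pre-1.0 openai / LangChain setup the notebooks use, not code taken from the course itself.

import os
import openai
from langchain.embeddings import OpenAIEmbeddings

# Override the course-provided key with your own paid-account key.
# (Better practice: keep it in a .env file rather than hard-coding it.)
os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your own key
openai.api_key = os.environ["OPENAI_API_KEY"]

# Quick sanity check: embedding a short string should now succeed
# without raising an AuthenticationError.
embeddings = OpenAIEmbeddings()
vector = embeddings.embed_query("hello world")
print(len(vector))  # typically 1536 for the default ada-002 embedding model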

Note: I get the same error when I try to run the exercises for the ChatGPT Prompt Engineering class, which worked in the past. I suspect something has gone wrong with their key.

I was wondering about that too. I was looking for help with another issue, and the only support channel I could find is this forum, which is fantastic for most cases.

EDIT: It looks like it could be working again, as the issue has been reported as fixed in another course:
AuthenticationError - Short Courses / ChatGPT Prompt Engineering for Developers - DeepLearning.AI