Always quota exceeded now - _Always_

2 examples:

  1. LangChain for LLM Application Development: Question and Answer section, Cell [7]
index = VectorstoreIndexCreator(
    vectorstore_cls=DocArrayInMemorySearch
).from_loaders([loader])

Retrying langchain.embeddings.openai.embed_with_retry…_embed_with_retry in 4.0 seconds as it raised RateLimitError: exceeded quota for this month.

  2. LangChain Chat with Your Data: Document Loading section, Cell [11]
url="https://www.youtube.com/watch?v=jGwO_UgTS7I"
save_dir="docs/youtube/"
loader = GenericLoader(
    YoutubeAudioLoader([url],save_dir),
    OpenAIWhisperParser()
)
docs = loader.load()

The extensive error output ends with: RateLimitError: exceeded quota for this month

This kind of error means that your free OpenAI quota for that account is finished!

Yes, I know that. However, I was under the impression from the short courses leading up to this that the OPENAI_API_KEY is local to the environment. Can you indicate where it says otherwise?

For LangChain for LLM Application Development, Cell [4] indicates the file is local to the short course notebook,

file = 'OutdoorClothingCatalog_1000.csv'
loader = CSVLoader(file_path=file)

For LangChain Chat with Your Data, Cell [3] says,

# The course will show the pip installs you would need to install packages on your own machine.
# These packages are already installed on this platform and should not be run again.
#! pip install pypdf

The OpenAI server, which is where usage is metered and rate limit imposed, is decidedly not local to your environment.
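For reference, the "Retrying ... in 4.0 seconds" lines in the first error are the client backing off and retrying after the server rejects the request. Here is a minimal sketch of that exponential-backoff pattern; the function name and the use of RuntimeError as a stand-in for the OpenAI rate-limit exception are hypothetical, not LangChain's actual implementation:

```python
import time

def embed_with_retry_sketch(call, max_retries=5, base_delay=4.0):
    """Retry `call` with exponential backoff when it raises.

    Hypothetical sketch of the pattern behind LangChain's
    'Retrying ... in 4.0 seconds' log lines; RuntimeError stands in
    for the real rate-limit exception raised by the OpenAI client.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError as err:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            delay = base_delay * (2 ** attempt)
            print(f"Retrying in {delay} seconds as it raised {err}")
            time.sleep(delay)
```

Note that no amount of retrying helps when the quota itself is exhausted: the server will reject every attempt, which is why the retries eventually give up with the same RateLimitError.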


Dear ai_curious,

Please read my comment with the idea of being helpful.

The OpenAI API key is, of course, created on the OpenAI Platform.

That key is then stored, as is standard, in a local environment variable, usually OPENAI_API_KEY.

This is no longer working, e.g. openai.api_key = os.getenv('OPENAI_API_KEY')
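A minimal sketch of that standard pattern, with a loud failure when the variable is unset, so a missing key is not confused with a server-side quota error (the helper name is hypothetical, not from the course notebooks):

```python
import os

def get_openai_key(var="OPENAI_API_KEY"):
    """Read the API key from the environment.

    Raises if the variable is unset, so a configuration problem fails
    immediately instead of surfacing later as a confusing API error.
    """
    key = os.getenv(var)
    if not key:
        raise RuntimeError(f"{var} is not set in this environment")
    return key
```

If os.getenv returns a key but the API still answers with RateLimitError, the problem is on the account's server-side quota, not in how the key was loaded.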

If the recommendation is to stop depending on the course servers and do a local install, or if there is a Google Colab alternative, just say so and kindly give a recommendation.

I’m looking for how others are either using Colab or local setups to complete the short courses.
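For anyone trying the Colab route, one common pattern is to prompt for the key at the top of the notebook when the environment does not already have one. This is a hedged sketch with a hypothetical helper name, not something taken from the course notebooks:

```python
import os
from getpass import getpass

def ensure_openai_key():
    """Return the OpenAI API key, prompting for it if unset.

    A common Colab pattern: avoids hard-coding the key in a cell while
    still letting the rest of the notebook read it from the environment.
    """
    if not os.getenv("OPENAI_API_KEY"):
        os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
    return os.environ["OPENAI_API_KEY"]
```

Running locally or on Colab, you would still need your own key from the OpenAI Platform, and your own account's quota would then apply.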

The message at top indicates that the labs are now working (see top of page or message below).
Is there a quota attached to our deeplearning.ai account?
Or are we supposed to download the files and use locally?
I went back to the very first Short Course, and even it fails on the very first call using openai.api_key = os.getenv('OPENAI_API_KEY')

Update: 10 August 2023

Hi all,

Coursera has informed us that the labs are now working.


(9 August 2023)

Hi all learners!

Since the title of the thread and the first post mention rate limit and quota, I inferred that there was some confusion about where that was being measured, which is not in the local environment. Apparently the issue is not related to quota at all, but dealing with the mechanics of setting and retrieving the key. My bad. Best of luck with your gen AI exploration.

Thank you.
Is anyone taking these Short Courses anymore?

I mean, it says the Coursera labs are now working, but if you go back to the very first course, “ChatGPT Prompt Engineering for Developers”, it still says rate limit error. All of them do.

I’m looking to either give these up, download what I can, or use Google Colab. Have others given up? Just looking for status.

I’m taking the Generative AI with LLMs course on Coursera and took the Deep Learning Specialization on Coursera some time ago. Just trying to advance in LLMs and Stable Diffusion.

Hi @Robert.Thompson,

Hope you are doing well.

Just to be clear, that message reads, “Coursera labs are now working”. This message refers to the site-wide issue Coursera had on their platform a few days ago.

Now, the short courses you are taking are not hosted on Coursera. They are hosted on DeepLearning.AI’s own platform. Please do not confuse the two.

As for the issue you are having, I’ll try my best to help out, but give me some time to get back to you regarding this (after I investigate).

Best,
Mubsi