C4W1: Resource exhaustion error and kernel dying at the Decoder class. Runs fine on Colab but not in the course assignment environment


In Assignment 1 of Natural Language Processing with Attention Models, for the cell below the Decoder class:

  • The kernel dies every time I run this cell, and I got a resource exhaustion error once.
  • The same cell ran fine when I tried running it in Google Colab.
  • I tried restarting the kernel and rebooting the assignment, but I face the same problem at that cell every time.
  • Is this a kernel-related issue?

Is this lab meant to be run on Colab or on Coursera? Obviously Colab gives you more memory, but unless you have modified something, if it is supposed to be run on Coursera it should run fine there.

Perhaps resetting the assignment to its original state might solve the issue.

If you are running the code locally on Colab, then the whole reasoning behind your issue needs to be reviewed against the output you are getting, because the course assignment you are talking about doesn't use Google Colab but the Coursera-provided Jupyter Lab.

When executing the code cells in the Coursera-provided Jupyter Lab, a persistent issue occurred at the cell immediately following the definition of the Decoder class: the kernel crashed and then automatically reconnected. This behavior was consistently reproducible only on that particular cell, while all preceding cells completed without incident across multiple attempts.

I then tried restarting the kernel, performing a kernel reboot, and completely restarting the assignment session; however, the same problem continued. To confirm that there is no bug in the code, I ran it in Colab, where it executed successfully. Based on this, I think there might be a problem with the environment, though I am not sure. To run it in Colab, I downloaded all the lab files, opened the same notebook, and uploaded the lab files as well, except for the w1_unittest script, which needs the dlai_grader package available only in Coursera's Jupyter Lab; everything else executed.

Hi @Rohith_17,
Chances are your code failing the Decoder class unit test in the Coursera-provided lab is the result of a discrepancy in how a variable or function is referenced (I cannot confirm until I see the code).

You need to understand that the unit tests check whether your previously written code was written correctly as specified, for example not using self. when implementing a class method, or using global variables instead of local ones.
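As an illustration of the global-versus-instance-variable mistake mentioned above, here is a minimal, hypothetical sketch (the class and attribute names are made up for this example, not taken from the assignment):

```python
# A notebook-level global, defined in some earlier cell.
units = 64

class DecoderWrong:
    def __init__(self, units):
        self.units = units

    def hidden_size(self):
        # BUG: references the notebook global `units`,
        # silently ignoring the value passed to __init__.
        return units

class DecoderRight:
    def __init__(self, units):
        self.units = units

    def hidden_size(self):
        # Correct: uses the instance attribute.
        return self.units

print(DecoderWrong(128).hidden_size())  # -> 64, silently wrong
print(DecoderRight(128).hidden_size())  # -> 128
```

The buggy version can appear to work in an interactive notebook (because the global happens to exist), which is exactly the kind of state-dependent behavior a fresh unit-test run is designed to catch.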

Kindly post a screenshot of the output for the unit test you are failing. Also remember that passing the unit tests doesn't always mean one will pass the graded assignment, as there are multiple dependencies required to pass. Your code might be correct in general (the reason it executes successfully in Colab) but not as per the autograder.


Thanks, I identified my mistake; I had just overlooked it several times before. I used vocab_size instead of units in the pre_attention_rnn layer, which caused this error. Sorry, I should have looked for it more thoroughly. Colab ran fine because I edited the notebook there directly rather than copying my code over, which complicated the problem.
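For anyone hitting the same error, some back-of-the-envelope arithmetic shows why swapping vocab_size in for units exhausts memory. The sketch below assumes the recurrent layer is an LSTM and uses hypothetical sizes (embedding_dim = 256, units = 256, vocab_size = 12000); the actual assignment values may differ:

```python
def lstm_param_count(input_dim, units):
    # An LSTM has 4 gates, each with an input kernel (input_dim x units),
    # a recurrent kernel (units x units), and a bias vector (units).
    return 4 * (input_dim * units + units * units + units)

embedding_dim = 256
units = 256          # intended hidden size
vocab_size = 12000   # mistakenly passed as the hidden size

print(lstm_param_count(embedding_dim, units))       # -> 525312 (~0.5M params)
print(lstm_param_count(embedding_dim, vocab_size))  # -> 588336000 (~588M params)
```

The dominant term is the units x units recurrent kernel, so the parameter count grows roughly quadratically in the hidden size: a thousand-fold blow-up like this (hundreds of millions of weights, plus gradients and optimizer state) easily kills a memory-limited kernel while a Colab instance with more RAM may survive it.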


Nothing to be sorry for; glad you could debug your code yourself. The error you mentioned is a common mistake overlooked by previous learners, especially when they miss the instructions below the exercise headers.

Keep Learning!