DLS Course 5, Week 2, Emojify-Notebook not passed

Hello,
I have passed all tests, but after submitting I get the message “Cell #13. Can’t compile the student’s code. Error: ValueError('operands could not be broadcast together with shapes (5,2) (5,) ',)”

Unfortunately, I can’t see where this Cell #13 is located. I also don’t think I unintentionally changed something I shouldn’t have.

I have read the “how-to-refresh-your-workspace” help and also searched the Discourse for answers. Since this is the first time this message has occurred for me, I am a bit irritated. My lab ID is: hqntuxwd.

Can you please help me to fix this problem? I can also send the entire notebook if desired.
Regards
Volker

Passing all the unit tests does not guarantee your code is perfect. The unit tests only catch some of the possible errors.


Hello Tom,

Thank you for taking the time to find the possible error in my notebook.

0/100 means to me that no cell was filled in correctly, but this goes against my experience from the last four courses. I completed all of my other programming exercises with 100/100, so I’m a little confused. Please let me know what kind of mistake I made when filling out the notebook.

Have a nice day

Volker

Mit freundlichen Grüßen / Best regards

Dr. Volker Drewes

Powertrain Analysis and Operation Strategies (PS/EPP1)
Robert Bosch GmbH | Postfach 30 02 40 | 70442 Stuttgart | GERMANY | www.bosch.com
Tel. +49 711 811-8731 | Telefax +49 711 811-5111134 | Volker.Drewes@de.bosch.com

Registered office: Stuttgart, Register court: Stuttgart Local Court, HRB 14000;
Chairman of the Supervisory Board: Prof. Dr. Stefan Asenkerschbaumer; Management Board: Dr. Stefan Hartung,
Dr. Christian Fischer, Filiz Albrecht, Dr. Markus Forschner, Dr. Markus Heyn, Rolf Najork

{mentor edit: code removed}

In sentence_to_avg(), try making this change:
avg = np.zeros(word_to_vec_map[any_word].shape)

Your code:
avg = np.zeros(np.shape(any_word))
… returns avg as a 0-d array (effectively a single float, 0.0). But avg should be an array of floats, the same size as a word vector.
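For anyone curious, here is a minimal sketch of the shape difference. The toy word_to_vec_map and words below are illustrative stand-ins, not the assignment’s actual GloVe data:

```python
import numpy as np

# Toy embedding map with 5-dimensional word vectors (illustrative only).
word_to_vec_map = {"cucumber": np.ones(5), "spinach": np.zeros(5)}
any_word = "cucumber"

# Buggy: np.shape() of a Python string is (), so np.zeros(()) gives
# a 0-d array -- effectively the single float 0.0.
bad_avg = np.zeros(np.shape(any_word))
print(bad_avg.shape)   # ()

# Fixed: take the shape of the word *vector*, not the word string itself.
good_avg = np.zeros(word_to_vec_map[any_word].shape)
print(good_avg.shape)  # (5,)
```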

Your code will pass the unit test, but I think it does not pass the grader. I am still looking into why.

Please let me know your results.

I think I figured out why your code passes the unit test but fails the grader.

Your code generates the wrong shape for avg in the case where the sentence contains no words from the vocabulary. The avg should always have the same shape as a word vector (the embedding dimension), but your code can return a scalar zero instead of a vector of zeros.

The assert in the unit test does not catch this problem, because it does not check the shape of avg for the case where the sentence’s only word (“love”) isn’t in the vocabulary.

But I think the grader uses a different test, and failing it is causing the grader to crash, so it doesn’t evaluate the rest of your notebook.
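Here is a minimal sketch of that failure mode (the shapes are illustrative assumptions; the grader’s internals aren’t visible to us). A 0-d scalar avg silently broadcasts through elementwise arithmetic, which is why the notebook’s own cells still appear to run, but it blows up as soon as anything shape-sensitive touches it:

```python
import numpy as np

# If no word in the sentence is in the vocabulary, the buggy init
# leaves avg as a 0-d scalar instead of a zero vector.
bad_avg = np.zeros(np.shape("love"))   # shape () -- scalar 0.0
good_avg = np.zeros((5,))              # shape (5,) -- proper zero vector

# Elementwise ops silently broadcast the scalar, so downstream code
# in the notebook can appear to work:
vec = np.ones(5)
print((vec + bad_avg).shape)   # (5,) -- no error, bug goes unnoticed

# But a shape-sensitive operation, e.g. a matmul with a weight matrix,
# fails outright on the 0-d operand:
W = np.ones((2, 5))
try:
    W @ bad_avg
except ValueError as e:
    print("matmul failed:", e)
print((W @ good_avg).shape)    # (2,)
```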

I’ll check into this further tomorrow, and submit a bug report if appropriate. That may be where the grader’s message about (5,) comes from: the grader might be using five-dimensional word vectors. I’ll check into that.

Hello Tom,

Thank you for checking my code. I made your suggested change (avg = np.zeros(word_to_vec_map[any_word].shape)) and now get “full score”. I’m a bit surprised that this error makes the difference between 0/100 and 100/100 :slight_smile:

Thanks again for your support!!!

Volker


The reason you got either 0 or 100 is that the defect in the code caused the grader to crash before it even looked at the rest of the notebook.