C4W2A1 ResNet not working locally - public_tests.py mismatch

Hello,

I had the same first issue as described in this post about ResNet.

Afterwards, I set up a local virtual environment with Python 3.8.10 and TensorFlow 2.9.1, thanks to this other post.

In the “Programming Assignment: Residual Networks” lab, my submission was accepted online, but locally I still have two discrepancies on the first exercise, described below.

If you have any clues, please let me know.

Regards,
Francis

# UNQ_C1
# GRADED FUNCTION: identity_block
  1. My output is faulty:
     [Screenshot_ResNet_1]
     AssertionError: Wrong values with training=True

  2. public_tests.py is not checking the same values as what we find online :upside_down_face:
     Expected:
     [Screenshot_ResNet_3]

If you want to run the labs locally, the preferred method is to install the same package versions as are used by Coursera Labs.

That’s what I’ve done and written, @TMosh.
Also, it doesn’t explain why public_tests.py is obsolete.

import sys
print("Python version:", sys.version)
!pip list

Out:

Python version: 3.8.10 (default, Mar 15 2022, 12:22:08)
[GCC 9.4.0]
Package                      Version
---------------------------- --------------------
absl-py                      1.0.0
argon2-cffi                  21.3.0
argon2-cffi-bindings         21.2.0
asttokens                    2.0.5
astunparse                   1.6.3
attrs                        21.4.0
backcall                     0.2.0
beautifulsoup4               4.11.1
bleach                       5.0.0
blis                         0.7.4
cachetools                   5.1.0
catalogue                    2.0.4
certifi                      2020.12.5
cffi                         1.15.0
chardet                      4.0.0
click                        7.1.2
cycler                       0.11.0
cymem                        2.0.5
dbus-python                  1.2.16
debugpy                      1.6.0
decorator                    5.1.1
defusedxml                   0.7.1
entrypoints                  0.4
executing                    0.8.3
fastjsonschema               2.15.3
flatbuffers                  1.12
fonttools                    4.33.3
gast                         0.4.0
google-auth                  2.6.6
google-auth-oauthlib         0.4.6
google-pasta                 0.2.0
graphviz                     0.16
grpcio                       1.46.3
h5py                         3.6.0
idna                         2.10
imageio                      2.9.0
importlib-metadata           4.0.1
importlib-resources          5.7.1
ipykernel                    5.1.1
ipython                      8.3.0
ipython-genutils             0.2.0
ipywidgets                   7.7.0
jedi                         0.17.2
Jinja2                       2.11.3
joblib                       1.1.1
jsonschema                   4.5.1
jupyter                      1.0.0
jupyter-client               7.3.1
jupyter-console              6.4.3
jupyter-core                 4.10.0
jupyter-http-over-ws         0.0.8
jupyterlab-pygments          0.2.2
jupyterlab-widgets           1.1.0
keras                        2.9.0
Keras-Preprocessing          1.1.2
kiwisolver                   1.4.2
libclang                     14.0.1
Markdown                     3.3.7
MarkupSafe                   1.1.1
matplotlib                   3.5.2
matplotlib-inline            0.1.3
mistune                      0.8.4
murmurhash                   1.0.5
nbclient                     0.6.3
nbconvert                    6.5.0
nbformat                     4.4.0
nest-asyncio                 1.5.5
notebook                     6.4.11
numpy                        1.22.3
oauthlib                     3.2.0
opt-einsum                   3.3.0
packaging                    20.9
pandas                       1.5.2
pandocfilters                1.5.0
parso                        0.7.1
pathy                        0.5.2
pexpect                      4.8.0
pickleshare                  0.7.5
Pillow                       9.0.0
pip                          20.2.4
preshed                      3.0.5
prometheus-client            0.14.1
prompt-toolkit               3.0.29
protobuf                     3.19.4
psutil                       5.9.1
ptyprocess                   0.7.0
pure-eval                    0.2.2
pyasn1                       0.4.8
pyasn1-modules               0.2.8
pycparser                    2.21
pydantic                     1.7.3
pydot                        1.2.3
Pygments                     2.12.0
PyGObject                    3.36.0
pyparsing                    2.4.7
pyrsistent                   0.18.1
python-apt                   2.0.0+ubuntu0.20.4.7
python-dateutil              2.8.1
pytz                         2021.1
pyzmq                        23.0.0
qtconsole                    5.3.0
QtPy                         2.1.0
requests                     2.25.1
requests-oauthlib            1.3.1
requests-unixsocket          0.2.0
rsa                          4.8
scikit-learn                 1.2.0
scipy                        1.7.3
Send2Trash                   1.8.0
setuptools                   62.3.2
six                          1.15.0
sklearn                      0.0
smart-open                   3.0.0
soupsieve                    2.3.2.post1
spacy                        3.0.3
spacy-legacy                 3.0.5
srsly                        2.4.1
stack-data                   0.2.0
tensorboard                  2.9.0
tensorboard-data-server      0.6.1
tensorboard-plugin-wit       1.8.1
tensorflow                   2.9.1
tensorflow-estimator         2.9.0
tensorflow-io-gcs-filesystem 0.26.0
termcolor                    1.1.0
terminado                    0.15.0
thinc                        8.0.3
threadpoolctl                2.1.0
tinycss2                     1.1.1
tornado                      6.1
tqdm                         4.49.0
traitlets                    5.2.1.post0
typer                        0.3.2
typing-extensions            3.10.0.0
urllib3                      1.26.4
wasabi                       0.8.2
wcwidth                      0.2.5
webencodings                 0.5.1
Werkzeug                     2.1.2
wheel                        0.34.2
widgetsnbextension           3.6.0
wrapt                        1.14.1
zipp                         3.4.1

Compare the TensorFlow and Keras versions between your local and Coursera’s environments.

They are identical (TensorFlow 2.9.1 & Keras 2.9.0).

I compared my code and found one difference:

My faulty local code, from when I tried to get TensorFlow 2.16 running:

X = Conv2D(filters = F1, kernel_size = (1, 1), strides = (1, 1), padding = 'valid', kernel_initializer = initializer(seed=0))(X)

My correct code on Coursera:

X = Conv2D(filters = F1, kernel_size = 1, strides = (1, 1), padding = 'valid', kernel_initializer = initializer(seed=0))(X)

This explains the first issue, but still

public_tests.identity_block_test(identity_block)

gives this wrong output:

AssertionError: Wrong values with training=True

So, as I suspected earlier, the public_tests.py file available for download on Coursera is obsolete: it has not been updated and compares against an older version of the lab.

The public_tests.py file is not obsolete.

Note that there are two test cases: one in the notebook itself and one in the public_tests.py file, and they are not the same. The input tensors are the same, but compare the code carefully. The difference is subtle: the notebook uses f = 2 in the training = True case, while public_tests.py uses f = 3 in its training = True case. Both tests passed for me on the Coursera website.
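Schematically, the difference amounts to this (a sketch, not the literal test code; both tests also fix the initializer and compare the output against hard-coded expected arrays):

# Notebook cell (sketch): f = 2 in the training=True case
A = identity_block(X, f=2, filters=[4, 4, 3], training=True)

# public_tests.py (sketch): f = 3 in the training=True case
A = identity_block(X, f=3, filters=[4, 4, 3], training=True)

So the two tests are not expected to produce identical values.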

If they are failing on your local system, then something else is wrong.

BTW the difference between kernel_size = 1 and kernel_size = (1, 1) is not meaningful.
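A quick check that Keras normalizes the scalar form to a tuple internally:

from tensorflow.keras.layers import Conv2D

# Both forms end up with the same layer configuration
a = Conv2D(filters=4, kernel_size=1)
b = Conv2D(filters=4, kernel_size=(1, 1))
print(a.get_config()["kernel_size"], b.get_config()["kernel_size"])  # (1, 1) (1, 1)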


No, it isn’t.


Subtle indeed, @paulinpaloalto! :slight_smile:


Locally, I have discrepancies at the third decimal place.

I aligned the TensorFlow, NumPy, h5py and Python versions, but not all dependencies.
In fact, many of the other package versions do not match.

Interesting. Speaking of subtleties, that was a clever way to explore and diagnose the problem! I’m sure you noticed in that process that you need to do a Kernel -> Restart to pick up a change to one of the imported files.

I’m actually surprised that they used such a high error threshold in the first place, since typically 1e-7 is fine even in float32. But errors in the 3rd decimal place of the mantissa are starting to get a bit scary. Normally that couldn’t be a mere rounding error unless the calculation in question is quite unstable, and we fundamentally assume that the algorithms in TF are written and tested for numerical stability in general. Just for curiosity’s sake, I modified the threshold, and both tests for identity_block work with 1e-9 for me on the course website.
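If you want to play with the thresholds yourself, the comparison is essentially an np.allclose-style check, which you can get a feel for with stand-in numbers (hypothetical arrays, not the course’s expected outputs):

import numpy as np

expected = np.array([0.1234567, 2.3456789])
computed = expected + 1e-8  # inject an error around the 8th decimal place
print(np.allclose(computed, expected, rtol=0, atol=1e-7))  # True
print(np.allclose(computed, expected, rtol=0, atol=1e-9))  # False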

So if you’re sure you haven’t changed anything in the notebook itself or public_tests.py in the download process, then we’re left with your theory that some other supporting library must have changed. There is a more general way to create and duplicate environments, rather than doing individual installs of particular things on your base platform. We don’t have fully detailed instructions, but this thread describes the process. It’s not easy, unfortunately, but if you do the full version based on conda or anaconda, then you get a generalizable solution to “versionitis”.

I tried user424’s advice.
On Coursera:

!pip freeze > requirements.txt

I downloaded the file and created an env with Python 3.8.10, but the install failed

pip install -r requirements.txt

with these messages:

Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting absl-py==1.0.0 (from -r requirements.txt (line 1))
Downloading absl_py-1.0.0-py3-none-any.whl.metadata (2.3 kB)
Collecting argon2-cffi==21.3.0 (from -r requirements.txt (line 2))
Downloading argon2_cffi-21.3.0-py3-none-any.whl.metadata (5.4 kB)
Collecting argon2-cffi-bindings==21.2.0 (from -r requirements.txt (line 3))
Downloading argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB)
Collecting asttokens==2.0.5 (from -r requirements.txt (line 4))
Downloading asttokens-2.0.5-py2.py3-none-any.whl.metadata (4.6 kB)
Collecting astunparse==1.6.3 (from -r requirements.txt (line 5))
Downloading astunparse-1.6.3-py2.py3-none-any.whl.metadata (4.4 kB)
Collecting attrs==21.4.0 (from -r requirements.txt (line 6))
Downloading attrs-21.4.0-py2.py3-none-any.whl.metadata (9.8 kB)
Collecting backcall==0.2.0 (from -r requirements.txt (line 7))
Downloading backcall-0.2.0-py2.py3-none-any.whl.metadata (2.0 kB)
Collecting beautifulsoup4==4.11.1 (from -r requirements.txt (line 8))
Downloading beautifulsoup4-4.11.1-py3-none-any.whl.metadata (3.5 kB)
Collecting bleach==5.0.0 (from -r requirements.txt (line 9))
Downloading bleach-5.0.0-py3-none-any.whl.metadata (26 kB)
Collecting blis==0.7.4 (from -r requirements.txt (line 10))
Downloading blis-0.7.4-cp38-cp38-manylinux2014_x86_64.whl.metadata (7.3 kB)
Collecting cachetools==5.1.0 (from -r requirements.txt (line 11))
Downloading cachetools-5.1.0-py3-none-any.whl.metadata (5.0 kB)
Collecting catalogue==2.0.4 (from -r requirements.txt (line 12))
Downloading catalogue-2.0.4-py3-none-any.whl.metadata (14 kB)
Collecting certifi==2020.12.5 (from -r requirements.txt (line 13))
Downloading certifi-2020.12.5-py2.py3-none-any.whl.metadata (3.0 kB)
Collecting cffi==1.15.0 (from -r requirements.txt (line 14))
Downloading cffi-1.15.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (1.2 kB)
Collecting chardet==4.0.0 (from -r requirements.txt (line 15))
Downloading chardet-4.0.0-py2.py3-none-any.whl.metadata (3.5 kB)
Collecting click==7.1.2 (from -r requirements.txt (line 16))
Downloading click-7.1.2-py2.py3-none-any.whl.metadata (2.9 kB)
Collecting cycler==0.11.0 (from -r requirements.txt (line 17))
Downloading cycler-0.11.0-py3-none-any.whl.metadata (785 bytes)
Collecting cymem==2.0.5 (from -r requirements.txt (line 18))
Downloading cymem-2.0.5-cp38-cp38-manylinux2014_x86_64.whl.metadata (8.2 kB)
Collecting dbus-python==1.2.16 (from -r requirements.txt (line 19))
Downloading dbus-python-1.2.16.tar.gz (576 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 576.7/576.7 kB 7.5 MB/s eta 0:00:00
Preparing metadata (setup.py) ... done
Collecting debugpy==1.6.0 (from -r requirements.txt (line 20))
Downloading debugpy-1.6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (1.1 kB)
Collecting decorator==5.1.1 (from -r requirements.txt (line 21))
Downloading decorator-5.1.1-py3-none-any.whl.metadata (4.0 kB)
Collecting defusedxml==0.7.1 (from -r requirements.txt (line 22))
Downloading defusedxml-0.7.1-py2.py3-none-any.whl.metadata (32 kB)
Collecting entrypoints==0.4 (from -r requirements.txt (line 23))
Downloading entrypoints-0.4-py3-none-any.whl.metadata (2.6 kB)
Collecting executing==0.8.3 (from -r requirements.txt (line 24))
Downloading executing-0.8.3-py2.py3-none-any.whl.metadata (8.6 kB)
Collecting fastjsonschema==2.15.3 (from -r requirements.txt (line 25))
Downloading fastjsonschema-2.15.3-py3-none-any.whl.metadata (1.9 kB)
Collecting flatbuffers==1.12 (from -r requirements.txt (line 26))
Downloading flatbuffers-1.12-py2.py3-none-any.whl.metadata (872 bytes)
Collecting fonttools==4.33.3 (from -r requirements.txt (line 27))
Downloading fonttools-4.33.3-py3-none-any.whl.metadata (125 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 125.6/125.6 kB 9.1 MB/s eta 0:00:00
Collecting gast==0.4.0 (from -r requirements.txt (line 28))
Downloading gast-0.4.0-py3-none-any.whl.metadata (1.1 kB)
Collecting google-auth==2.6.6 (from -r requirements.txt (line 29))
Downloading google_auth-2.6.6-py2.py3-none-any.whl.metadata (3.6 kB)
Collecting google-auth-oauthlib==0.4.6 (from -r requirements.txt (line 30))
Downloading google_auth_oauthlib-0.4.6-py2.py3-none-any.whl.metadata (2.7 kB)
Collecting google-pasta==0.2.0 (from -r requirements.txt (line 31))
Downloading google_pasta-0.2.0-py3-none-any.whl.metadata (814 bytes)
Collecting graphviz==0.16 (from -r requirements.txt (line 32))
Downloading graphviz-0.16-py2.py3-none-any.whl.metadata (7.1 kB)
Collecting grpcio==1.46.3 (from -r requirements.txt (line 33))
Downloading grpcio-1.46.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB)
Collecting h5py==3.6.0 (from -r requirements.txt (line 34))
Downloading h5py-3.6.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (1.9 kB)
Collecting idna==2.10 (from -r requirements.txt (line 35))
Downloading idna-2.10-py2.py3-none-any.whl.metadata (9.1 kB)
Collecting imageio==2.9.0 (from -r requirements.txt (line 36))
Downloading imageio-2.9.0-py3-none-any.whl.metadata (2.6 kB)
Collecting importlib-metadata==4.0.1 (from -r requirements.txt (line 37))
Downloading importlib_metadata-4.0.1-py3-none-any.whl.metadata (3.8 kB)
Collecting importlib-resources==5.7.1 (from -r requirements.txt (line 38))
Downloading importlib_resources-5.7.1-py3-none-any.whl.metadata (3.1 kB)
Collecting ipykernel==5.1.1 (from -r requirements.txt (line 39))
Downloading ipykernel-5.1.1-py3-none-any.whl.metadata (919 bytes)
Collecting ipython==8.3.0 (from -r requirements.txt (line 40))
Downloading ipython-8.3.0-py3-none-any.whl.metadata (4.9 kB)
Collecting ipython-genutils==0.2.0 (from -r requirements.txt (line 41))
Downloading ipython_genutils-0.2.0-py2.py3-none-any.whl.metadata (755 bytes)
Collecting ipywidgets==7.7.0 (from -r requirements.txt (line 42))
Downloading ipywidgets-7.7.0-py2.py3-none-any.whl.metadata (1.9 kB)
Collecting jedi==0.17.2 (from -r requirements.txt (line 43))
Downloading jedi-0.17.2-py2.py3-none-any.whl.metadata (19 kB)
Collecting Jinja2==2.11.3 (from -r requirements.txt (line 44))
Downloading Jinja2-2.11.3-py2.py3-none-any.whl.metadata (3.5 kB)
Collecting joblib==1.1.1 (from -r requirements.txt (line 45))
Downloading joblib-1.1.1-py2.py3-none-any.whl.metadata (5.2 kB)
Collecting jsonschema==4.5.1 (from -r requirements.txt (line 46))
Downloading jsonschema-4.5.1-py3-none-any.whl.metadata (7.8 kB)
Collecting jupyter==1.0.0 (from -r requirements.txt (line 47))
Downloading jupyter-1.0.0-py2.py3-none-any.whl.metadata (995 bytes)
Collecting jupyter-client==7.3.1 (from -r requirements.txt (line 48))
Downloading jupyter_client-7.3.1-py3-none-any.whl.metadata (5.5 kB)
Collecting jupyter-console==6.4.3 (from -r requirements.txt (line 49))
Downloading jupyter_console-6.4.3-py3-none-any.whl.metadata (1.2 kB)
Collecting jupyter-core==4.10.0 (from -r requirements.txt (line 50))
Downloading jupyter_core-4.10.0-py3-none-any.whl.metadata (1.4 kB)
Collecting jupyter-http-over-ws==0.0.8 (from -r requirements.txt (line 51))
Downloading jupyter_http_over_ws-0.0.8-py2.py3-none-any.whl.metadata (1.1 kB)
Collecting jupyterlab-pygments==0.2.2 (from -r requirements.txt (line 52))
Downloading jupyterlab_pygments-0.2.2-py2.py3-none-any.whl.metadata (1.9 kB)
Collecting jupyterlab-widgets==1.1.0 (from -r requirements.txt (line 53))
Downloading jupyterlab_widgets-1.1.0-py3-none-any.whl.metadata (3.7 kB)
Collecting keras==2.9.0 (from -r requirements.txt (line 54))
Downloading keras-2.9.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting Keras-Preprocessing==1.1.2 (from -r requirements.txt (line 55))
Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl.metadata (1.9 kB)
Collecting kiwisolver==1.4.2 (from -r requirements.txt (line 56))
Downloading kiwisolver-1.4.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl.metadata (6.4 kB)
Collecting libclang==14.0.1 (from -r requirements.txt (line 57))
Downloading libclang-14.0.1-py2.py3-none-manylinux1_x86_64.whl.metadata (4.8 kB)
Collecting Markdown==3.3.7 (from -r requirements.txt (line 58))
Downloading Markdown-3.3.7-py3-none-any.whl.metadata (4.6 kB)
Collecting MarkupSafe==1.1.1 (from -r requirements.txt (line 59))
Downloading MarkupSafe-1.1.1-cp38-cp38-manylinux2010_x86_64.whl.metadata (3.2 kB)
Collecting matplotlib==3.5.2 (from -r requirements.txt (line 60))
Downloading matplotlib-3.5.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl.metadata (6.7 kB)
Collecting matplotlib-inline==0.1.3 (from -r requirements.txt (line 61))
Downloading matplotlib_inline-0.1.3-py3-none-any.whl.metadata (397 bytes)
Collecting mistune==0.8.4 (from -r requirements.txt (line 62))
Downloading mistune-0.8.4-py2.py3-none-any.whl.metadata (8.5 kB)
Collecting murmurhash==1.0.5 (from -r requirements.txt (line 63))
Downloading murmurhash-1.0.5-cp38-cp38-manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting nbclient==0.6.3 (from -r requirements.txt (line 64))
Downloading nbclient-0.6.3-py3-none-any.whl.metadata (5.1 kB)
Collecting nbconvert==6.5.0 (from -r requirements.txt (line 65))
Downloading nbconvert-6.5.0-py3-none-any.whl.metadata (6.0 kB)
Collecting nbformat==4.4.0 (from -r requirements.txt (line 66))
Downloading nbformat-4.4.0-py2.py3-none-any.whl.metadata (1.1 kB)
Collecting nest-asyncio==1.5.5 (from -r requirements.txt (line 67))
Downloading nest_asyncio-1.5.5-py3-none-any.whl.metadata (2.7 kB)
Collecting notebook==6.4.11 (from -r requirements.txt (line 68))
Downloading notebook-6.4.11-py3-none-any.whl.metadata (2.5 kB)
Collecting numpy==1.22.3 (from -r requirements.txt (line 69))
Downloading numpy-1.22.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.0 kB)
Collecting oauthlib==3.2.0 (from -r requirements.txt (line 70))
Downloading oauthlib-3.2.0-py3-none-any.whl.metadata (7.4 kB)
Collecting opt-einsum==3.3.0 (from -r requirements.txt (line 71))
Downloading opt_einsum-3.3.0-py3-none-any.whl.metadata (6.5 kB)
Collecting packaging==20.9 (from -r requirements.txt (line 72))
Downloading packaging-20.9-py2.py3-none-any.whl.metadata (13 kB)
Collecting pandas==1.5.2 (from -r requirements.txt (line 73))
Downloading pandas-1.5.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (11 kB)
Collecting pandocfilters==1.5.0 (from -r requirements.txt (line 74))
Downloading pandocfilters-1.5.0-py2.py3-none-any.whl.metadata (9.0 kB)
Collecting parso==0.7.1 (from -r requirements.txt (line 75))
Downloading parso-0.7.1-py2.py3-none-any.whl.metadata (7.0 kB)
Collecting pathy==0.5.2 (from -r requirements.txt (line 76))
Downloading pathy-0.5.2-py3-none-any.whl.metadata (14 kB)
Collecting pexpect==4.8.0 (from -r requirements.txt (line 77))
Downloading pexpect-4.8.0-py2.py3-none-any.whl.metadata (2.2 kB)
Collecting pickleshare==0.7.5 (from -r requirements.txt (line 78))
Downloading pickleshare-0.7.5-py2.py3-none-any.whl.metadata (1.5 kB)
Collecting Pillow==9.0.0 (from -r requirements.txt (line 79))
Downloading Pillow-9.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.6 kB)
Collecting preshed==3.0.5 (from -r requirements.txt (line 80))
Downloading preshed-3.0.5-cp38-cp38-manylinux2014_x86_64.whl.metadata (2.2 kB)
Collecting prometheus-client==0.14.1 (from -r requirements.txt (line 81))
Downloading prometheus_client-0.14.1-py3-none-any.whl.metadata (21 kB)
Collecting prompt-toolkit==3.0.29 (from -r requirements.txt (line 82))
Downloading prompt_toolkit-3.0.29-py3-none-any.whl.metadata (7.1 kB)
Collecting protobuf==3.19.4 (from -r requirements.txt (line 83))
Downloading protobuf-3.19.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (806 bytes)
Collecting psutil==5.9.1 (from -r requirements.txt (line 84))
Downloading psutil-5.9.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (21 kB)
Collecting ptyprocess==0.7.0 (from -r requirements.txt (line 85))
Downloading ptyprocess-0.7.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting pure-eval==0.2.2 (from -r requirements.txt (line 86))
Downloading pure_eval-0.2.2-py3-none-any.whl.metadata (6.2 kB)
Collecting pyasn1==0.4.8 (from -r requirements.txt (line 87))
Downloading pyasn1-0.4.8-py2.py3-none-any.whl.metadata (1.5 kB)
Collecting pyasn1-modules==0.2.8 (from -r requirements.txt (line 88))
Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl.metadata (1.9 kB)
Collecting pycparser==2.21 (from -r requirements.txt (line 89))
Downloading pycparser-2.21-py2.py3-none-any.whl.metadata (1.1 kB)
Collecting pydantic==1.7.3 (from -r requirements.txt (line 90))
Downloading pydantic-1.7.3-cp38-cp38-manylinux2014_x86_64.whl.metadata (84 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 84.9/84.9 kB 5.7 MB/s eta 0:00:00
Collecting pydot==1.2.3 (from -r requirements.txt (line 91))
Downloading pydot-1.2.3.tar.gz (20 kB)
Preparing metadata (setup.py) ... done
Collecting Pygments==2.12.0 (from -r requirements.txt (line 92))
Downloading Pygments-2.12.0-py3-none-any.whl.metadata (1.5 kB)
Collecting PyGObject==3.36.0 (from -r requirements.txt (line 93))
Downloading PyGObject-3.36.0.tar.gz (714 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 714.8/714.8 kB 14.6 MB/s eta 0:00:00
Installing build dependencies ... error
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [46 lines of output]
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com, https://pypi.ngc.nvidia.com
Collecting setuptools
Downloading setuptools-69.5.1-py3-none-any.whl.metadata (6.2 kB)
Collecting wheel
Downloading wheel-0.43.0-py3-none-any.whl.metadata (2.2 kB)
Collecting pycairo
Downloading pycairo-1.26.0.tar.gz (346 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 346.9/346.9 kB 9.2 MB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Installing backend dependencies: started
Installing backend dependencies: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Downloading setuptools-69.5.1-py3-none-any.whl (894 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 894.6/894.6 kB 33.4 MB/s eta 0:00:00
Downloading wheel-0.43.0-py3-none-any.whl (65 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.8/65.8 kB 63.4 MB/s eta 0:00:00
Building wheels for collected packages: pycairo
Building wheel for pycairo (pyproject.toml): started
Building wheel for pycairo (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error

    × Building wheel for pycairo (pyproject.toml) did not run successfully.
    │ exit code: 1
    ╰─> [12 lines of output]
        running bdist_wheel
        running build
        running build_py
        creating build
        creating build/lib.linux-x86_64-cpython-38
        creating build/lib.linux-x86_64-cpython-38/cairo
        copying cairo/__init__.py -> build/lib.linux-x86_64-cpython-38/cairo
        copying cairo/__init__.pyi -> build/lib.linux-x86_64-cpython-38/cairo
        copying cairo/py.typed -> build/lib.linux-x86_64-cpython-38/cairo
        running build_ext
        'pkg-config' not found.
        Command ['pkg-config', '--print-errors', '--exists', 'cairo >= 1.15.10']
        [end of output]
  
    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for pycairo
  Failed to build pycairo
  ERROR: Could not build wheels for pycairo, which is required to install pyproject.toml-based projects
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

:frowning:

Sorry, but I have not tried that process and do not have the ability to debug it for you.

Possible approaches that I can think of are:

  1. Try to figure out why those errors are being thrown. Is it that the requested versions just aren’t available on the repo you are using? Or some other cause? One candidate is visible in your log; see the sketch after this list.
  2. Give up on this strategy and try to directly debug why your answers are different in that one test case. That would involve adding instrumentation to print intermediate values in both cases and using the comparison between the behavior on Coursera and in your local environment to figure out where things go off the rails.
  3. Just ignore this issue and try using the notebook to run the training and see what the performance of the resulting model is. Is it actually any worse than the version trained online with the passing tests? You could even edit the test case to make the expected values agree with the values you get. In other words, maybe this is not a big deal and things still work just fine, although maybe there is something that affects the deterministic behavior of setting the seeds.
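On that first approach, one thing does jump out of your log: pycairo fails to build because the system tool pkg-config is missing, and pycairo is only pulled in as a build dependency of PyGObject. Packages such as PyGObject, dbus-python and python-apt are Ubuntu system packages that pip freeze captured from the Coursera image; the lab itself should not need them. Here is a sketch of filtering them out before installing (the output filename is hypothetical):

# Drop system-provided packages that pip cannot build in a plain venv
# (assumption: the lab never imports these).
skip = {"dbus-python", "PyGObject", "python-apt"}
with open("requirements.txt") as src:
    keep = [line for line in src if line.split("==")[0].strip() not in skip]
with open("requirements-local.txt", "w") as dst:
    dst.writelines(keep)

Then run pip install -r requirements-local.txt instead.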

If you learn more, it would be nice to know what you discover. It could help others who hit this issue later.


I usually advocate running locally because you can use your preferred IDE, debug, and learn from issues like managing dependencies, which you will come across in real life anyway. Also, you will have the labs saved on your computer once you have finished (and forgotten?) the course (if you don’t practice soon).
Since I spent three days on these issues versus one hour to complete the assignment online, I wonder if it is still worth it in this case.

I will let you know if I find something.

Gratefully,
Francis

@Francis60 just to be honest, IMHO, instead of trying to copy/rerun code that obviously was not designed to be a ‘plug-in repo’, one’s time is better spent A) finding a suitable GitHub repo that produces/replicates the type of model one seeks, or B) even better, if one has followed the class closely enough, using the lab code as a rough guide and rebuilding (i.e. rewriting) the model code from scratch yourself. If nothing else, you will actually learn something in the process that way.


Well, I moved on to the next assignment.
I noticed this:

Please know, this assignment uses GPU.

Locally I can run Torch with my Nvidia GPU, but I was not able to configure TensorFlow for it. Maybe running on the CPU is one of the reasons why I get discrepancies at the 4th decimal place.
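For reference, this is how I check whether TF sees the GPU (standard TF API):

import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))  # [] means TF will fall back to the CPU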

If someone manages to run the lab locally on CPU, let me know.

If you have an Nvidia GPU, then you need to get CUDA into the loop to interact with the GPU. I have never personally used CUDA, so I don’t know if TF has an interface to it, but I can name that google search in one guess. :laughing:

In general all the math libraries involved implement the IEEE 754 standard, so the rounding behavior should be the same on any given single computation, but it is true that you can get different results by running things in different orders. TF can multithread things on multicore CPUs, but the effect is even more pronounced on a GPU, because there are many more ways to parallelize things there. Here’s an article from the TF/Keras website that gives some insight into why results can be different (non-deterministic) and how to get reproducible results, but at the expense of performance.
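In TF 2.9 terms, the reproducibility recipe boils down to something like this (a sketch based on the public TF 2.9 APIs; deterministic kernels cost performance):

import tensorflow as tf

tf.keras.utils.set_random_seed(1)  # seeds the Python, NumPy and TF RNGs in one call
tf.config.experimental.enable_op_determinism()  # force deterministic op implementations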

Although now that I think \epsilon more about it, what you are seeing is not non-determinism: it’s reproducible results that are different from what you expect. So the above information is just for general understanding. I recently learned of it from a discussion by mentor Raymond.


Yes @Nevermnd, I sometimes get the feeling of “illusion of competence” with the multiple-choice quizzes and “fill-in-the-blank” code labs, which I can mostly fill in zombie mode. Trying to rewrite one model, for example in PyTorch, could be a worthwhile exercise.

@Francis60 I think it is great that they make it ‘accessible’. I mean, prior to this I tried auditing the related MITx course on edX, and I know, having taken MIT courses before (I completed the Computation Structures course when they once offered it), that they don’t muck around. But that course was just so hard, and they don’t even give you any code at all to start with. Nor does it even cover as much ground as DLS does.

But here, seeing all the pieces of the puzzle more clearly, I think it is a worthwhile independent exercise to ‘roll your own’ model. It is also a test for oneself that you really ‘know how it works’.

I mean, for example, though I’ve been a bit tied up lately (and I know we’re not supposed to share our actual labs from here online), I’ve been working to try to translate the models in the first two courses to R. I figure that’s fair game.


Hi @paulinpaloalto,

I put the requirements.txt investigation on my list for later.
Where it goes off the rails: the divergence appears at the first BatchNormalization(), and from there the discrepancy keeps growing.
Your point about non-deterministic multi-threaded computation is very interesting.
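For anyone chasing the same thing, here is roughly the standalone check I used to compare the first BatchNormalization across environments (a sketch: run the same snippet online and locally and diff the printed numbers):

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import BatchNormalization

tf.keras.utils.set_random_seed(0)  # make the random input reproducible
x = tf.constant(np.random.randn(3, 4, 4, 3), dtype=tf.float32)
y = BatchNormalization(axis=3)(x, training=True)  # same call shape as in identity_block
print(float(tf.reduce_mean(y)), float(tf.math.reduce_std(y)))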