Copy of C3_W3_Lab_1 access denied when trying to download the dataset

Access denied with the following error:

Cannot retrieve the public link of the file. You may need to change
the permission to 'Anyone with the link', or have had many accesses. 

You may still be able to access the file from the browser:

 https://drive.google.com/uc?id=0B0d9ZiqAgFkiOHR1NTJhWVJMNEU 

FileNotFoundError                         Traceback (most recent call last)
<ipython-input> in <module>()
      4 # extract the downloaded dataset to a local directory: /tmp/fcnn
      5 local_zip = '/tmp/fcnn-dataset.zip'
----> 6 zip_ref = zipfile.ZipFile(local_zip, 'r')
      7 zip_ref.extractall('/tmp/fcnn')
      8 zip_ref.close()

/usr/lib/python3.7/zipfile.py in __init__(self, file, mode, compression, allowZip64, compresslevel)
   1238         while True:
   1239             try:
-> 1240                 self.fp = io.open(file, filemode)
   1241             except OSError:
   1242                 if filemode in modeDict:

FileNotFoundError: [Errno 2] No such file or directory: '/tmp/fcnn-dataset.zip'


Yeah there must be a permission issue with the hosting bucket. I will raise an issue about this for the QA team.


Hi all! We just added code to work around the download issue. Please reopen the notebook from the classroom to see the changes. Thank you for reporting!


Hi, I still have the same problem. I reopened the notebook from the classroom and still see the access denied.

Hi Dea. Are you talking about the cell with the gdown command? Below that is a troubleshooting section where you will instead run a script to get around that command. I believe the current restriction is temporary so we’re keeping the gdown command there. If it’s still throwing the error after 1 week then we’ll remove it entirely. Nonetheless, the workaround should let you download the dataset into your workspace. Hope this helps!

Thank you! Yes, it works. Sorry about that.


Hello Chris, sorry, I still have the problem with the gdown command. Same access denied error. I cannot find the “troubleshooting section”. Could you please explain in more detail how to do the workaround? Thank you.

Hi Andrew! The troubleshooting section should be directly below the cell that has the gdown command. It shows a cell with a bash script to download the file via curl instead of gdown. Hope this helps!

Sorry Chris, but I still cannot find the curl command. I even searched for it in the file and found nothing. Including my screenshot for your reference.

Hi Andrew! This seems like a notebook from a different specialization so you are not seeing the fix. Can you tell me which course and week this is? Please also give the lab title. Thanks.

I am working on “Optional Assignment - Exploring overfitting in NLP”, C3_W3_Assignment.ipynb.

Oh I see. That is for a different specialization. This thread is for TensorFlow Advanced Techniques, but you might be in the TensorFlow Developer specialization. Kindly create a topic there to keep the forums organized. But I will still check this dataset. Thanks!

Thank you. I have raised a topic over there. By the way, when I click the open tool in the Developer specialization (C3_W3), I get connected here. Not sure if this is a bug in the course.

Oh I see. OK, I will check the Discourse link and the notebook you’re referring to as soon as I can tomorrow. Thank you!

Thank you, Chris, for fixing the issue in Colab.
I’m trying to download the dataset from the Jupyter notebook on my local Windows computer, but cannot get it working. When executing the bash file, I get the error "Failure writing output to destination". Any idea how to solve it? My current workaround is to download in Colab, thanks to your solution, and then grab the zip file from there. A better solution would of course be to download directly from the local notebook.
Thanks again for the support on previous issues.

Hi Tomas! The directory on your machine might be write-protected. What you can do is modify the filename to point to an accessible part of your computer. For example, if you want to download it to the same directory as the notebook, you can do something like:

%%writefile download.sh

#!/bin/bash
fileid="0B0d9ZiqAgFkiOHR1NTJhWVJMNEU"
filename="./fcnn-dataset.zip"
html=`curl -c ./cookie -s -L "https://drive.google.com/uc?export=download&id=${fileid}"`
curl -Lb ./cookie "https://drive.google.com/uc?export=download&`echo ${html}|grep -Po '(confirm=[a-zA-Z0-9\-_]+)'`&id=${fileid}" -o ${filename}
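As a quick sanity check before unzipping (a sketch, not part of the notebook; the filename is taken from the script above), a Python cell can verify that the download produced a real zip archive and not a Drive error page:

```python
import os
import zipfile

# Filename assumed from the download script above.
local_zip = "./fcnn-dataset.zip"

# A failed Drive download often saves a small HTML error page instead of
# the archive, which later surfaces as BadZipFile during extraction.
if not os.path.exists(local_zip):
    status = "missing"    # the script did not run or curl wrote nothing
elif not zipfile.is_zipfile(local_zip):
    status = "not-a-zip"  # likely an HTML "access denied" page
else:
    status = "ok"         # safe to extract
print(status)
```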

It will have implications on other cells though. Make sure to also revise the locations when unzipping the file. For example:

# extract the downloaded dataset to a local directory: /tmp/fcnn
local_zip = './fcnn-dataset.zip'
zip_ref = zipfile.ZipFile(local_zip, 'r')
zip_ref.extractall('./fcnn')
zip_ref.close()

After that, you’ll need to replace all instances of /tmp/fcnn with ./fcnn (or simply fcnn) in other cells. This is tedious, and we might revise the notebook to use a single WORKING_DIR global variable so learners only have to modify one place. Thanks for bringing this up, and I hope the workaround is okay for now.
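A minimal sketch of that idea (WORKING_DIR is a hypothetical name, and the subfolder in the last line is just a placeholder): define the directory once and build every other path from it, with a guard so a missing archive gives a clear message instead of a FileNotFoundError:

```python
import os
import zipfile

# Hypothetical single configuration point; change this one line to switch
# between /tmp/fcnn on Colab and ./fcnn on a local machine.
WORKING_DIR = "./fcnn"
local_zip = "./fcnn-dataset.zip"

# Extract only if the download actually produced the archive.
if os.path.exists(local_zip):
    with zipfile.ZipFile(local_zip, "r") as zip_ref:
        zip_ref.extractall(WORKING_DIR)
else:
    print(f"{local_zip} not found -- run the download cell first")

# Later cells derive their paths from WORKING_DIR instead of hard-coding
# /tmp/fcnn; "train" here is a placeholder subfolder name.
train_dir = os.path.join(WORKING_DIR, "train")
```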

Hi Chris, and thanks for the fast and detailed response!
It helped me figure out that my Windows bash environment is not properly set up. I will set it up and try out the solution you gave above, so thanks for the support!
