Local installation of tf_utils

Where does the tf_utils module that is invoked in the assignment come from?
If I attempt to import it into my local environment via

from tensorflow import tf_utils

the module is not found. I believe this means that it is written by Coursera. How can I incorporate it into my local environment?


Hi EduardoChicago,

After opening an assignment notebook, you can go to File → Open…

This will show the files and folders used as utilities for the assignment. You can download them from there and subsequently incorporate the files into your local environment.
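For example, once you have downloaded the helper file (say tf_utils.py) and placed it next to your own notebook, or anywhere on Python's search path, a minimal local-import sketch might look like this (the path below is only a placeholder):

import sys

# Point Python at the folder where the downloaded helper file lives.
sys.path.append("/path/to/downloaded/utils")

import tf_utils   # imported directly as its own module, not from tensorflow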


Hi Reinoud, and Mentors.
No luck. When I do File → Open, the directory (listed below) does not contain a tensorflow package from which tf_utils could be imported.
Can you dig deeper, please? (As I may have explained, I ask because I intend to make this lab work in my local environment, so that I can use my own training sets.)
Thanks.

(This is what I find:

[datasets]
[images]
[Residual_Networks training my set.ipynb]
[Residual_Networks.ipynb]
[outputs.py]
[public_tests.py]
[resnet50.h5]
[resnets_utils.py]
[test_utils.py])

Hi EduardoChicago,

In the residual networks assignment, tf_utils is not imported, so you will not find it in the local folder used by that assignment. If you need tf_utils, you can get it from an assignment that does use it.

See also python - How to fix the error as 'tf_utils module not found' - Stack Overflow


Thanks again Reinoud. Sorry to keep asking, but you answer, and no good deed goes unpunished.
So, here is a related one. It’s so related that it does not deserve to be labeled as a “new topic”.
When testing the Identity_block function, the lab test includes these lines

A3np = A3.numpy()
print(np.around(A3.numpy()[:,(0,-1),:,:].mean(axis = 3), 5))
resume = A3np[:,(0,-1),:,:].mean(axis = 3)
print(resume[1, 1, 0] , resume[1, 1, 0])

These lines create A3np which, we are told, is a NumPy array rather than a tensor (A3 itself is a tensor).
When I try to execute these lines in my (tf) environment, I get an error

----> 1 A3np = A3.numpy()
AttributeError: 'Tensor' object has no attribute 'numpy'

Evidently, the rules in my environment are different. How do I change the local environment to match Coursera’s? Or, how do I reformulate

A3np = A3.numpy()

in a way that is more universally interpretable?
(I keep digging the hole of my indebtedness to you deeper…)

What version of TensorFlow are you using locally? Here’s what I get when I run this code in the ResidualNet notebook:

tf.__version__
'2.3.0'
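As a quick sanity check in your own setup, you can also confirm that eager execution is on, since .numpy() only exists on tensors when TensorFlow runs eagerly (the default in 2.x). A minimal sketch; the printed values are illustrative:

import tensorflow as tf

print(tf.__version__)          # the course notebook reports 2.3.0
print(tf.executing_eagerly())  # True in TF 2.x; required for Tensor.numpy()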

Good point, Paul. You can also have a look here, EduardoChicago:


I had version 1.4, the one that comes with conda. Seeing that the conventional wisdom is against installing the update with pip, I did it with

conda install tensorflow==2.0

and then tried the simple

import tensorflow as tf

I got a scary error

ERROR! Session/line number was not unique in database. History logging moved to new session 59

After that, I got more errors (three, actually), of the type

ERROR:root:Internal Python error in the inspect module.
Below is the traceback from this internal error.

Here is a sample piece of the long traceback.

ImportError: cannot import name 'random_uniform' from 'tensorflow.keras.initializers' (C:.…py)
During handling of the above exception, another exception occurred:
…
AttributeError: 'ImportError' object has no attribute 'render_traceback'
During handling of the above exception, another exception occurred: etc.

Obviously, updating tensorflow is above my pay grade :face_with_head_bandage:
Any suggestions?

Paul: If I ask

tf.version

I get

AttributeError Traceback (most recent call last)
in
----> 1 tf.version

AttributeError: module 'tensorflow' has no attribute 'version'

but I understand that TensorFlow 2.0 is necessary to run the ResNets exercises. It's just that I cannot install it in my Anaconda3 system. Should I uninstall Anaconda and start again? I just need an environment similar to what is used in the Coursera courses. After the update to tensorflow.keras I lost much of what I had achieved.

Please have another look at the code that I gave you. It's not just tf.version; the name is bracketed with double underscores on both ends: tf.__version__. Or to put it in words, it's "under under version under under". Maybe the underscores got swallowed when the post was rendered as Markdown.

Or try googling “how do I find what TensorFlow version I have”.

Well, if you want to play that game, then you’re going to have to get your hands dirty. The course gives you an environment which is complete and usable. If you want to create your own environment, then you need to develop the skills to solve problems like this. The point is that it never ends, right? Next you’ll be asking us how to deal with whatever whacko version of Windows you’ve got. Speaking of “beyond pay grade”, please realize the mentors do not get paid to do this. We’re here to be answering questions about the course material, not teaching you how to install software in your own environment. Maybe you’ll get lucky and someone else who is listening here might be a nicer person than I am or actually enjoy solving problems like this, in which case I will be happy to let them handle it.

The good news is that there is copious information of this sort on the web. Just google “how do I install TensorFlow 2.0”. I’m sure that will take you to either StackExchange or the TF website.


I so agree that it never ends. You are nicer than you think. I would be equally mad if a postdoc asked me how to "correct a z-stack of images for blurring by the imaging system", or some open-ended question like that.
The alternative question, perhaps less irritating, would be how to implement training/testing on one’s own set of images in the Coursera working environment. Would you give this a thought? (Students are the worst!)

Sure. The double underscores were invisible in the rendering. Thanks.

Doing things in the course environment is pretty easy by comparison. Start with one of the notebooks that uses the same dataset and just make a copy of it for your experiments. Just click “File → Make a copy” or “File → Open” and then use the “Duplicate” and “Rename” functions there. Then you can change the code to run whatever experiments you want. As far as I can tell over several years of using the online Coursera environment, they don’t seem to put any limit on making copies of the various notebooks and keeping them around. The only downside is that you need to keep paying the subscription fee or else you lose access to the online environment. But there are other alternatives for executing Jupyter Notebooks: Google Colab is probably the most famous one. You can download the “ipynb” file and upload it to Colab. They have all the standard python, pytorch, tf, keras and so forth available, but you might need to bring over any of the custom utility routines that are here.

Oh, sorry, I see that I misinterpreted your question a bit. You are talking about using your own datasets. The same “copy a relevant notebook” strategy is the starting point, but then you just need to upload your datasets. From the notebook, click “File → Open” and then you’ll find an “Upload” button in the resulting “file navigation” view.

Thanks Paul. The problem is that my “training set” is a collection of images (i.e. a few hundred separate jpg files, with labels). I don’t think that is the same as a dataset, which I would not know how to make. Perhaps I haven’t taken enough courses.
As for the other approach (build my own environment), I'm almost there. I still get a couple of errors (NameError: name 'EagerTensor' is not defined), but they don't seem to affect the performance. I'm sure I will find more issues, but there is progress… I thank you again.

There are various ways to turn a directory full of files into a compound file or “database”. In this course, they prefer using the “h5” file format but there are other simpler alternatives like zip or tar.

Try googling “create a zip file from a set of images in python” and then “how do I read a zip file in python”.

Or google “how do I convert a set of images into an h5 database file in python”.

There is a theme here: this type of question is covered in great detail on the web and modern search engines can do NLP. “Ask and ye shall find”.
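To make the zip route concrete, here is a minimal sketch (the folder and file names are hypothetical):

import glob
import zipfile

# Pack every jpg in a folder into a single archive.
with zipfile.ZipFile("my_images.zip", "w") as zf:
    for path in glob.glob("my_images/*.jpg"):
        zf.write(path)

# Read the images back without unpacking to disk.
with zipfile.ZipFile("my_images.zip") as zf:
    for name in zf.namelist():
        raw = zf.read(name)   # raw bytes; decode with PIL or imageio as needed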


But I also realize that knowing the right question to ask is not always obvious. It is legitimate to ask for help at that level, so that we can help you scope out the options. On that general topic, a followup question is what is the form of the labels that you have for your data? Are they individual files with names that express the correspondence with the image files or some other format? It’s easy to use “zip” to create a compound file containing a collection of image files, but you also need to coordinate that with the labels.
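As one hedged illustration of keeping images and labels together, assuming the class name can be parsed from each filename (for example cat_001.jpg), an h5 file loosely mirroring the course datasets could be built like this (the dataset names train_set_x / train_set_y just imitate the course files; the folder and naming scheme are assumptions):

import glob
import os

import h5py
import numpy as np
from PIL import Image

image_paths = sorted(glob.glob("my_images/*.jpg"))   # hypothetical folder

images, labels = [], []
for path in image_paths:
    img = Image.open(path).convert("RGB").resize((64, 64))   # 64x64 matches the course images
    images.append(np.asarray(img))
    # Assumption: the class name is the part of the filename before the first underscore.
    labels.append(os.path.basename(path).split("_")[0])

classes = sorted(set(labels))
y = np.array([classes.index(label) for label in labels])

with h5py.File("my_dataset.h5", "w") as f:
    f.create_dataset("train_set_x", data=np.stack(images))
    f.create_dataset("train_set_y", data=y)
    f.create_dataset("list_classes", data=np.array(classes, dtype="S"))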

Given the scaling issues that are fundamental to the online education model, it’s my goal to give “fishing lessons” whenever possible instead of handing people a fish, in the sense of the famous proverb most commonly attributed to the philosopher Lao Tzu:

If I give a man a fish, he will not be hungry today. If I teach a man to fish, then he will never be hungry again.

Hence the theme of suggesting the search terms to be used …

Thanks again. I managed to read in all the files individually, via an inelegant for loop (one more inelegance in my local environment :cold_sweat:). Now I'm dealing with "shaping" issues. Please see a LinkedIn message on a "professional" issue.

There are plenty of examples of how to resize and reshape images in these courses. E.g. see the Logistic Regression exercise in Week 2 Course 1 or the Application exercise in Week 4 Course 1. Both have the reshape logic to convert 3D image tensors into vectors as well as the “Resize” logic to create standard sizes. See the “Test with Your Own Image” section of both assignments.
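For what it's worth, here is a minimal sketch of that resize-and-flatten step, along the lines of the "Test with Your Own Image" sections (the 64x64 target size and the file name are just examples):

import numpy as np
from PIL import Image

img = Image.open("my_image.jpg")          # hypothetical file
img = np.array(img.resize((64, 64)))      # standardize the size to 64x64x3
x = img.reshape(1, -1).T / 255.           # flatten to a (64*64*3, 1) column vector, scaled to [0, 1]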