When are TensorFlow abbreviations allowed and/or required?

In a C4W1 lab, we learn that we can use “tfl” as an abbreviation by doing this earlier in our code:
import tensorflow.keras.layers as tfl
(and as always, we can use “tf” by doing: import tensorflow as tf)

In a C4W2 lab, the example code does away with tf.keras.layers and tfl entirely, and does things like:
Conv2D(…)
instead of
tfl.Conv2D(…)

So my question is: when is it allowed to do away with tfl prefixes? I’m guessing it’s related to one of the unexplained
from … import …
commands at the beginning of the notebook.

As long as you are importing the right library and sub-library, it doesn’t matter whether you use tfl, some other name, or no prefix at all. There must be some other kind of import here, presumably one of those from … import … statements, that makes the bare names work. The naming itself is not the issue.

Like pretty much everything in Python, there is a lot of flexibility and power (i.e. complexity) hidden under the covers of the import mechanism.

You can read the details here: 5. The import system — Python 3.11.2 documentation

Below are some of the very high-level concepts described in detail at that link:


Python code in one module gains access to the code in another module by the process of importing it. The import statement is the most common way of invoking the import machinery, but it is not the only way.

The import statement combines two operations; it searches for the named module, then it binds the results of that search to a name in the local scope.

All modules have a name. Subpackage names are separated from their parent package name by a dot, akin to Python’s standard attribute access syntax. Thus you might have a package called email, which in turn has a subpackage called email.mime and a module within that subpackage called email.mime.text.
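To make the quoted example concrete, here is how those dotted names behave in real imports. This is a minimal sketch using only the standard library, so nothing here is TensorFlow-specific:

```python
import email.mime.text                # binds only the top-level name "email"
from email.mime.text import MIMEText  # binds the leaf class MIMEText directly

# Both bindings reach exactly the same class object:
msg = MIMEText("hello")
assert email.mime.text.MIMEText is MIMEText
```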


Or here: 7. Simple statements — Python 3.11.2 documentation
——-

If the requested module is retrieved successfully, it will be made available in the local namespace in one of three ways:

  • If the module name is followed by as, then the name following as is bound directly to the imported module.
  • If no other name is specified, and the module being imported is a top level module, the module’s name is bound in the local namespace as a reference to the imported module.
  • If the module being imported is not a top level module, then the name of the top level package that contains the module is bound in the local namespace as a reference to the top level package. The imported module must be accessed using its full qualified name rather than directly.

——-
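Those three cases can be seen side by side with standard-library modules, a neutral stand-in for the TensorFlow imports discussed in this thread:

```python
import json as j                 # case 1: "as" binds the name j to the module
import json                      # case 2: top-level module, the name json is bound
import logging.handlers          # case 3: submodule import binds only the top-level
                                 # name "logging"; the submodule must be reached
                                 # through its fully qualified name

assert j is json                                  # one module, two names
h = logging.handlers.MemoryHandler(capacity=10)   # full dotted path required
```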

How you specify the module to import depends on its own package structure, how much of that package you want to import, and how you want to refer to it in your program. The fact that you see tf versus tfl most likely only reflects that the exercises were written by different people, each using their own style. Hope that helps.


Right. I am familiar with the rules and concepts in your response, but what I’m not familiar with are the specific circumstances under which we can drop the prefixes entirely. Since this is done without fanfare or explanation in the mentioned lab, I was hoping we could get some fanfare or explanation. :slight_smile:

  1. It’s the Python mechanism linked above (not TensorFlow) that dictates the rules and options for how an import can be expressed.

  2. The ‘specific circumstances’ are that the usage of a name in the body of the code must be consistent with the way that name was bound by the import statement. That’s all. You are even free to do it differently for different names in the same program if it suits your needs.
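A minimal sketch of that consistency rule, using the standard library’s math module instead of TensorFlow so it runs anywhere:

```python
import math              # usage must be math.sqrt(...)
import math as m         # usage must be m.sqrt(...)
from math import sqrt    # usage is the bare name sqrt(...)

# Three bindings, one function; each use matches its own import form:
assert math.sqrt(4) == m.sqrt(4) == sqrt(4) == 2.0
```

The bare Conv2D(…) in the C4W2 lab corresponds to the third form: the name was bound directly by a from tensorflow.keras.layers import Conv2D statement, so no prefix is needed.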

It doesn’t meet the standard of fanfare, but the information is all there in the code in the “import” cell. It’s the first code cell in every notebook. As @ai_curious has explained, you can import any level of the TF namespace hierarchy and assign it any name you want.

E.g. here’s that cell from the Residual Networks assignment:

import tensorflow as tf
import numpy as np
import scipy.misc
from tensorflow.keras.applications.resnet_v2 import ResNet50V2
from tensorflow.keras.preprocessing import image
from tensorflow.keras.applications.resnet_v2 import preprocess_input, decode_predictions
from tensorflow.keras import layers
from tensorflow.keras.layers import Input, Add, Dense, Activation, ZeroPadding2D, BatchNormalization, Flatten, Conv2D, AveragePooling2D, MaxPooling2D, GlobalMaxPooling2D
from tensorflow.keras.models import Model, load_model
from resnets_utils import *
from tensorflow.keras.initializers import random_uniform, glorot_uniform, constant, identity
from tensorflow.python.framework.ops import EagerTensor
from matplotlib.pyplot import imshow

from test_utils import summary, comparator
import public_tests

%matplotlib inline

You can see that they import some higher-level points in the TensorFlow namespace (tf and layers), as well as some “leaf” nodes in the namespace. So with that set of imports, suppose you want to use the Keras layer class ZeroPadding2D. You have (at least) three choices for how to reference it:

ZeroPadding2D
layers.ZeroPadding2D
tf.keras.layers.ZeroPadding2D

But there’s a clear choice, right? Note that I scanned the notebook and can’t find any actual references to layers, so that import is superfluous, at least as the code currently stands.
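If you want to verify that all three references really name the same object, here is the equivalent check using the standard library’s xml package as a stand-in (so it runs even where TensorFlow isn’t installed); each line mirrors one of the import styles from the notebook cell above:

```python
import xml.etree.ElementTree as ET          # like: import tensorflow as tf
from xml.etree import ElementTree           # like: from tensorflow.keras import layers
from xml.etree.ElementTree import Element   # like: from tensorflow.keras.layers import ZeroPadding2D

# All three expressions refer to one and the same class object:
assert Element is ElementTree.Element is ET.Element
```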
