CoffeeRoasting with ReLU and improved numerical accuracy

Hey hey,

I am currently on week 2 of course 2. I have been playing around with the coffee-roasting neural network (from C2_W1_Lab02), incorporating new concepts into it as I learn them. I modified the code to use the ReLU activation function and to be more numerically accurate. This is what it looks like:

model2 = Sequential([
tf.keras.Input(shape = (2,)),
Dense(units = 3, activation = "relu", name = "layer1"),
Dense(units = 1, activation = "linear", name = "layer2")
])

model2.compile(
loss = tf.keras.losses.BinaryCrossentropy(from_logits = True),
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01))

model2.fit(Xt, Yt, epochs = 10) #normalized and tiled X, Y training data set

logit = model2(X_test)
predictions = tf.nn.sigmoid(logit)
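Since the whole point of from_logits=True is numerical accuracy, here is a small NumPy sketch of why it matters (this is an illustration of the underlying math, not the actual Keras implementation, though Keras uses the same kind of stable formula internally):

```python
import numpy as np

# A large logit with the "wrong" label: the naive sigmoid-then-log route
# underflows, while the fused from-logits formula stays finite.
z, y = 40.0, 0.0

# Naive route: sigmoid first, then cross-entropy on the probability.
p = 1.0 / (1.0 + np.exp(-z))          # rounds to exactly 1.0 in float64
with np.errstate(divide="ignore"):
    naive = -(y * np.log(p) + (1 - y) * np.log(1 - p))   # log(0) -> inf

# Stable route: cross-entropy computed directly from the logit.
stable = max(z, 0) - z * y + np.log1p(np.exp(-abs(z)))

print(naive)    # inf
print(stable)   # ~40.0
```

So with from_logits=True the loss is a sensible finite number, while the sigmoid-then-BinaryCrossentropy route can blow up for confident wrong predictions.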

########
Would appreciate feedback on whether this is an okay way to build a neural network for a binary classification problem. I just want to ensure that I have got it right so far, so that I can play around with more complicated datasets.

Cheers
Nadi


Nadi, if you put the code in between tildes "~", it will be printed in a monospace typeface (and be subject to syntax highlighting) and be easier to read. The server seems to be using extended Markdown:

model2 = Sequential([
   tf.keras.Input(shape = (2,)),
      Dense(units = 3, activation = "relu", name = "layer1"),
      Dense(units = 1, activation = "linear", name = "layer2")
   ])

model2.compile(
   loss = tf.keras.losses.BinaryCrossentropy(from_logits = True),
   optimizer = tf.keras.optimizers.Adam(learning_rate=0.01))

model2.fit(Xt, Yt, epochs = 10) #normalized and tiled X, Y training data set

logit = model2(X_test)
predictions = tf.nn.sigmoid(logit)
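One small follow-up on the last two lines: the sigmoid gives probabilities, so to get 0/1 class labels you would still threshold at 0.5. A NumPy sketch of that step (with made-up example logits, since the trained model isn't at hand here):

```python
import numpy as np

# Turning sigmoid probabilities into 0/1 class labels with a 0.5 threshold.
# The logits below are invented for illustration.
logits = np.array([[-2.3], [0.1], [4.0]])
probs = 1.0 / (1.0 + np.exp(-logits))     # what tf.nn.sigmoid computes
labels = (probs >= 0.5).astype(int)
print(labels.ravel())    # [0 1 1]
```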

Thanks for the suggestion! Cheers


Hello, Nadi @nadidixit,

I think your code is okay!

Cheers,
Raymond


I tried to run it on my local installation (PyCharm) and ran headlong into PyCharm bug PY-53599: tensorflow.keras subpackages are unresolved in TensorFlow >= 2.6.0. There goes the whole morning.

Still, I have found the following:

  1. One should not use keras.Input in a Sequential model; one should use keras.layers.InputLayer instead.
  2. Using keras.layers.InputLayer in the Sequential model is not necessary in the first place.

For point 1, we get this warning:

WARNING:tensorflow:Please add `keras.layers.InputLayer` instead of
`keras.Input` to Sequential model.
`keras.Input` is intended to be used by Functional model.

So:

import tensorflow as tf

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import InputLayer
from tensorflow.keras.losses import BinaryCrossentropy
from tensorflow.keras.optimizers import Adam

import sys

print(f"We are using Python version {sys.version} on {sys.platform}")
print(f"We are using tensorflow version {tf.__version__}")

model2 = Sequential([
    InputLayer(shape=(2,)),
    Dense(units=3, activation="relu", name="layer1"),
    Dense(units=1, activation="linear", name="layer2")
])

model2.compile(
    loss=BinaryCrossentropy(from_logits=True),
    optimizer=Adam(learning_rate=0.01)
)

If you are wrestling with PyCharm, the code needs to be modified, and (although one is absolutely not supposed to) the imports end up like this:

import tensorflow as tf
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense
from tensorflow.python.keras.layers import InputLayer
from tensorflow.python.keras.losses import BinaryCrossentropy
from tensorflow.python.keras.optimizer_v2.adam import Adam

import sys

print(f"We are using Python version {sys.version} on {sys.platform}")
print(f"We are using tensorflow version {tf.__version__}")

model2 = Sequential([
    InputLayer(input_shape=(2,)),
    Dense(units=3, activation="relu", name="layer1"),
    Dense(units=1, activation="linear", name="layer2")
])

model2.compile(
    loss=BinaryCrossentropy(from_logits=True),
    optimizer=Adam(learning_rate=0.01)
)

Additionally, for some Lovecraftian reason, InputLayer does not recognize the shape named argument here but demands input_shape. This is bad, as it means some older version is being used.

Also, the documentation:


I agree with your point 2.

I have tried to reproduce this but couldn't. Which version of TensorFlow are you using?

I checked out the latest TF codebase and found that the latest Sequential is able to extract the keras.layers.InputLayer object created by keras.Input without issuing a warning:

keras.Input does create a layer object:


Thank you Raymond

I'm as up to date as possible:

We are using tensorflow version 2.19.0
We are using tensorflow.keras version 3.9.1

The files seem to be organized differently from the "keras" project. Complexity is: HIGH.

In ./python/keras/models.py we find this line:

from tensorflow.python.keras.engine import sequential

And in ./python/keras/engine/sequential.py we find this code:

    # If we are passed a Keras tensor created by keras.Input(), we can extract
    # the input layer from its keras history and use that without any loss of
    # generality.
    if hasattr(layer, '_keras_history'):
      origin_layer = layer._keras_history[0]
      if isinstance(origin_layer, input_layer.InputLayer):
        layer = origin_layer
        logging.warning(
            'Please add `keras.layers.InputLayer` instead of `keras.Input` to '
            'Sequential model. `keras.Input` is intended to be used by '
            'Functional model.')

Nothing like this can be found in the Keras "3.9.2" tag here.
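For what it's worth, the extraction trick in that snippet can be sketched in plain Python. InputLayer and KerasTensor below are hypothetical minimal stand-ins, not the real Keras classes; the point is just how Sequential can recover the InputLayer that keras.Input() recorded in the tensor's _keras_history:

```python
# Toy sketch of the quoted extraction logic, with minimal stand-in classes.

class InputLayer:
    def __init__(self, shape):
        self.shape = shape

class KerasTensor:
    """Stand-in for what keras.Input() returns: a tensor that
    remembers the layer that created it."""
    def __init__(self, origin_layer):
        self._keras_history = (origin_layer,)

def normalize_first_layer(layer):
    # If we were handed a tensor made by Input(), pull out its InputLayer.
    if hasattr(layer, "_keras_history"):
        origin_layer = layer._keras_history[0]
        if isinstance(origin_layer, InputLayer):
            layer = origin_layer
    return layer

tensor = KerasTensor(InputLayer(shape=(2,)))
layer = normalize_first_layer(tensor)
print(type(layer).__name__, layer.shape)   # prints: InputLayer (2,)
```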

Thought: Does the Keras packaged with TensorFlow diverge significantly from the standalone Keras?

We have ChatGPT for an approximate answer:

Has the Keras packaged with TensorFlow diverged substantially from the standalone Keras?

We read:

Yes, TensorFlow Keras (tf.keras) has diverged significantly from the standalone Keras (keras from pip install keras) over time. Here’s how they compare:

So we need to go to this source, actually:

tensorflow/tensorflow/python/keras/engine/sequential.py


Oh! Interesting! Thank you, David.

:raising_hands: :raising_hands: :raising_hands:
