C4_W3_Lab_1_RNN

Hi,

With reference to this model from the lab:

model = tf.keras.models.Sequential([
  tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=-1),
                         input_shape=[window_size]),
  tf.keras.layers.SimpleRNN(40, return_sequences=True),
  tf.keras.layers.SimpleRNN(40),
  tf.keras.layers.Dense(1),
  tf.keras.layers.Lambda(lambda x: x * 100.0)
])

The lab cell suggests: “Feel free to remove this layer later after this lab and see what results you get.”

I did, and the results are terrible: an MAE of 77 or more.

I would like to learn how to convert this model so that it has no Lambda layers.
In other words, what would the code be for the pre-processing, the model, and the post-processing? The model would look something like this:

model = tf.keras.models.Sequential([
  tf.keras.layers.SimpleRNN(40, return_sequences=True),
  tf.keras.layers.SimpleRNN(40),
  tf.keras.layers.Dense(1),
])

likely with some other modifications too.

Thank you so much for your help. I really appreciate it! I’m trying to understand how all this fits in with all the other models we have created in the past (that didn’t use Lambda layers).

Thank you!
Ed

Hello,

You could experiment with it, and keep us posted too! The Lambda layers are just used for simple transformations: the first one prepares the input with the right shape by adding another axis, and the last one just multiplies the output by 100.
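
To make that concrete, here is a minimal sketch of one way to do it without any Lambda layers. I'm assuming a windowed_dataset helper along the lines of the one in the lab, and that window_size, batch_size and shuffle_buffer are already defined as usual; this is my own sketch, not the official lab code. The idea is: the expand_dims moves into the pre-processing, the 3-D input shape is declared on the first SimpleRNN, and the x * 100.0 scaling becomes an explicit post-processing step.

import tensorflow as tf

def windowed_dataset(series, window_size, batch_size, shuffle_buffer):
  # Pre-processing: add the feature axis here instead of in a Lambda layer,
  # so every window already has shape (window_size, 1).
  series = tf.expand_dims(series, axis=-1)
  dataset = tf.data.Dataset.from_tensor_slices(series)
  dataset = dataset.window(window_size + 1, shift=1, drop_remainder=True)
  dataset = dataset.flat_map(lambda w: w.batch(window_size + 1))
  dataset = dataset.shuffle(shuffle_buffer)
  dataset = dataset.map(lambda w: (w[:-1], w[-1]))
  return dataset.batch(batch_size).prefetch(1)

# Model with no Lambda layers: the 3-D input shape is declared directly.
model = tf.keras.models.Sequential([
  tf.keras.layers.SimpleRNN(40, return_sequences=True,
                            input_shape=[window_size, 1]),
  tf.keras.layers.SimpleRNN(40),
  tf.keras.layers.Dense(1)
])

# Post-processing: instead of the x * 100.0 Lambda, either train on the raw
# series and retune the learning rate, or scale the series down before
# windowing (e.g. series / 100.0) and scale the predictions back up, e.g.
# forecast = model.predict(windowed_inputs) * 100.0  # windowed_inputs is a placeholder name

The x * 100.0 layer mostly compensates for the scale of the data: the series values are much larger than the near-zero outputs an untrained tanh RNN produces, so if you delete that layer without rescaling the data or retuning the learning rate, a large MAE like the 77 you saw is not surprising.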

Would this be suitable for removing the first Lambda layer?

model = tf.keras.models.Sequential([
  tf.keras.layers.Conv1D(filters=64, kernel_size=3,
                         strides=1,
                         activation="relu",
                         padding='causal',
                         input_shape=[window_size, 1]),
  tf.keras.layers.SimpleRNN(40, return_sequences=True),
  tf.keras.layers.SimpleRNN(40),
  tf.keras.layers.Dense(1),
  tf.keras.layers.Lambda(lambda x: x * 100.0)
])

Thanks!