C4_W2_lab_3 question


I just want to confirm what is happening in this code (C4_W2_lab3):
“Ungraded Lab: Training a Deep Neural Network with Time Series Data”
section “You can get the predictions again and overlay it on the validation set.”

# Initialize a list
forecast = []

# Reduce the original series
forecast_series = series[split_time - window_size:]

# Use the model to predict data points per window size

for time in range(len(forecast_series) - window_size):
    forecast.append(model_tune.predict(forecast_series[time:time + window_size][np.newaxis]))

# Convert to a numpy array and drop single dimensional axes
results = np.array(forecast).squeeze()

# Plot the results
plot_series(time_valid, (x_valid, results))
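As a quick sanity check on the shapes involved, here is a numpy-only sketch (the data values and the fake prediction list are made up purely for illustration; no model is needed to see the shape mechanics of `[np.newaxis]` and `.squeeze()`):

```python
import numpy as np

# Hypothetical stand-ins for the lab's variables, just to show the shapes.
window_size = 20
forecast_series = np.arange(100, dtype=np.float32)  # pretend: series[split_time - window_size:]

# One window of data has shape (window_size,)
window = forecast_series[0:window_size]
print(window.shape)    # (20,)

# [np.newaxis] adds the batch dimension Keras expects: (1, window_size)
batched = window[np.newaxis]
print(batched.shape)   # (1, 20)

# Each model.predict() call returns shape (1, 1); collecting those and
# squeezing drops the singleton axes, leaving a flat (num_windows,) array.
fake_predictions = [np.ones((1, 1)) for _ in range(len(forecast_series) - window_size)]
results = np.array(fake_predictions).squeeze()
print(results.shape)   # (80,)
```

This is why `results` lines up element-for-element with `x_valid` in the plot.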

So, Please correct me if I am wrong. I believe the above code is doing the following: (let’s assume window_size=20):

  1. create a new forecast_series array by taking the original series data from split_time - window_size to the end of the available data.
  2. iterate:
  3. take a single time (integer) from 0…(len(forecast_series) - 20 - 1), since Python's range() starts at 0.
  4. use that time integer to slice a window out of forecast_series, covering the range time to time + 20 (exclusive). So, the first time, the window covers indices 0…19.
  5. feed this window (above) into the model_tune.predict() function. This essentially predicts the value at index 20 (i.e., the value for the next timestep).
  6. append this prediction to the forecast list we are building.
  7. the next iteration will create a prediction for index 21, based on the forecast_series data at indices 1…20, and so on until we have filled the forecast list with predictions.

So, essentially, this code takes observed series data of size window_size and creates a prediction exactly one timestep into the future. It repeats this for every window from split_time - window_size to the end of the available data.
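The steps above can be sketched end-to-end with a stand-in for model_tune.predict() (the fake_predict function below is hypothetical: it just echoes the last value of each window, shaped like a Keras output, so the one-step-ahead alignment is easy to see):

```python
import numpy as np

window_size = 3  # small window so the alignment is easy to check by eye

# Toy stand-in for forecast_series = series[split_time - window_size:]
forecast_series = np.array([10., 11., 12., 13., 14., 15.])

def fake_predict(window_batch):
    # Hypothetical stand-in for model_tune.predict(): echoes the last
    # value of the window, returned with shape (1, 1) like a Keras model.
    return window_batch[:, -1:].copy()

forecast = []
for time in range(len(forecast_series) - window_size):
    forecast.append(fake_predict(forecast_series[time:time + window_size][np.newaxis]))

results = np.array(forecast).squeeze()
# The window at time=0 covers indices 0…2, and its prediction aligns with
# index 3, i.e. one timestep ahead of the window's last observed point.
print(results)  # [12. 13. 14.]
```

Swapping fake_predict back for the real model gives exactly the lab's loop: one prediction per window, each one timestep ahead.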

Thank you for correcting or confirming my understanding on this.



Hey Ed!

Thanks for posting this!

Yes, your understanding is correct on this matter!