W1 Assignment 1

Error from conv_single_step: "must cast to float" — even though I do use Z = float(Z + b).

My code and the errors follow. Please help.

def conv_single_step(a_slice_prev, W, b):
    """
    Apply one filter defined by parameters W on a single slice (a_slice_prev) of the output activation 
    of the previous layer.
    
    Arguments:
    a_slice_prev -- slice of input data of shape (f, f, n_C_prev)
    W -- Weight parameters contained in a window - matrix of shape (f, f, n_C_prev)
    b -- Bias parameters contained in a window - matrix of shape (1, 1, 1)
    
    Returns:
    Z -- a scalar value, the result of convolving the sliding window (W, b) on a slice x of the input data
    """

    #(≈ 3 lines of code)
    # Element-wise product between a_slice_prev and W. Do not add the bias yet.
    # s = None
    # Sum over all entries of the volume s.
    # Z = None
    # Add bias b to Z. Cast b to a float() so that Z results in a scalar value.
    # Z = None
    # YOUR CODE STARTS HERE

    # YOUR CODE ENDS HERE
    
    return Z
Z = [[[-6.99908945]]]
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-84-ad8a5d0c1b57> in <module>
      6 Z = conv_single_step(a_slice_prev, W, b)
      7 print("Z =", Z)
----> 8 conv_single_step_test(conv_single_step)
      9 
     10 assert (type(Z) == np.float64), "You must cast the output to numpy float 64"

~/work/release/W1A1/public_tests.py in conv_single_step_test(target)
     48     expected_output = np.float64(-3.5443670581382474)
     49 
---> 50     assert (type(Z) == np.float64 or type(Z) == np.float32), "You must cast the output to float"
     51     assert np.isclose(Z, expected_output), f"Wrong value. Expected: {expected_output} got: {Z}"
     52 

AssertionError: You must cast the output to float

Try Z = Z + float(b) instead.

The docstring says Z is expected to be a scalar of float type, not an array of float type.
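To see why the placement of the cast matters, here is a minimal sketch (with made-up values, not the assignment's data): wrapping the whole expression in float() produces a built-in Python float, which fails a type check like type(Z) == np.float64, while adding the cast bias to a NumPy scalar keeps the NumPy dtype.

```python
import numpy as np

Z = np.sum(np.ones((2, 2)) * 0.5)   # np.sum returns a NumPy scalar (np.float64)
b = np.array([[[1.5]]])             # bias of shape (1, 1, 1), as in the assignment

# Casting the whole result gives a built-in Python float,
# so `type(Z) == np.float64` is False:
bad = float(Z + b.item())
print(type(bad))                    # <class 'float'>

# Adding the cast bias to the NumPy scalar keeps the NumPy dtype:
good = Z + float(b.item())
print(type(good))                   # <class 'numpy.float64'>
```

(Here b.item() extracts the single element; on older NumPy versions float(b) on a size-1 array does the same thing.)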

Good luck!
Raymond

PS: I removed the code from your post, as we can’t share it here.