Strange error while solving random_mini_batches: all the assertion tests pass, but only 2 out of 3 tests pass overall. I can’t figure out where the mistake is.
I too am having the same issue. The previous cell tests pass, but here there is a cryptic error. I am not able to find where the issue is.
I am also facing the same problem. Have you got any clue? Kindly help.
Having the same issue.
Hi, @JanGoLearn and @gotominhaz.
The most common problem I’ve seen is getting the indexes wrong when generating the last mini-batch:
- You have already processed `num_complete_minibatches` mini-batches of size `mini_batch_size`. What should the starting index be?
- You want to read all the remaining examples. What should the stopping index be?
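In code, the idea might look something like the sketch below. The helper name and variables are illustrative, not the official solution, but they follow the assignment’s convention:

```python
import numpy as np

def last_mini_batch(shuffled_X, shuffled_Y, mini_batch_size):
    # Sketch of the last (incomplete) mini-batch slice.
    m = shuffled_X.shape[1]
    num_complete_minibatches = m // mini_batch_size
    # Examples already consumed by the complete mini-batches:
    start = num_complete_minibatches * mini_batch_size
    # Read all the remaining examples up to m:
    mini_batch_X = shuffled_X[:, start:m]
    mini_batch_Y = shuffled_Y[:, start:m]
    return mini_batch_X, mini_batch_Y
```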
Let me know if that helped.
Facing the same issue. Please elaborate.
@nramon please elaborate
Hi, @A_12_EESHAN_BHANAP.
I was referring to the calculation of the last mini-batch:
```python
mini_batch_X = shuffled_X[:, starting_index:stopping_index]
mini_batch_Y = shuffled_Y[:, starting_index:stopping_index]
```
What error do you get?
Hi, @nramon
I am facing the same error,
```
2 Tests passed
1 Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-38-459bc9cfd993> in <module>
     10 print ("mini batch sanity check: " + str(mini_batches[0][0][0][0:3]))
     11
---> 12 random_mini_batches_test(random_mini_batches)

~/work/release/W2A1/public_tests.py in random_mini_batches_test(target)
     70 ]
     71
---> 72 multiple_test(test_cases, target)
     73
     74 def initialize_velocity_test(target):

/opt/conda/lib/python3.7/site-packages/dlai_tools/testing_utils.py in multiple_test(test_cases, target)
    162 print('\033[91m', len(test_cases) - success, " Tests failed")
    163 raise AssertionError(
--> 164 "Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))

AssertionError: Not all tests were passed for random_mini_batches. Check your equations and avoid using global variables inside the function.
```
I had assigned the starting index as the negative of (m % mini_batch_size) and the stopping index as m. My logic was that this would select the last 20 values.
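To illustrate my logic with a toy example (this is not the assignment code): while the remainder is nonzero, the negative start index selects the same columns as the positive form, but note that `-(0)` is just `0`, so the trick would select everything if m were exactly divisible by mini_batch_size.

```python
import numpy as np

X = np.arange(2 * 7).reshape(2, 7)   # toy data: m = 7 examples
mini_batch_size = 3
m = X.shape[1]
r = m % mini_batch_size              # r = 1 remaining example

# Positive start index: examples already used by the complete mini-batches.
positive = X[:, (m // mini_batch_size) * mini_batch_size:m]
# Negative start index: count r columns back from the end.
negative = X[:, -r:m]

assert (positive == negative).all()  # identical only while r > 0
```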
Thank you for taking your time to help!
Following up,
The problem is actually not in the last mini-batch. If you have coded the step_2 part in a certain way, everything will pass except the sanity check.
Hint: check the starting and ending indexes of X and Y in step_2. Your mini-batch has to move along the list, meaning the partitions have to shift serially from the beginning towards the end on each loop iteration. If the same slice is assigned to mini_batch_X or mini_batch_Y on every iteration, you will get this issue!
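A sketch of what “moving along the list” means (an illustrative helper, not the official solution): the start and end of the slice must depend on the loop variable so the window advances by mini_batch_size each time.

```python
import numpy as np

def complete_mini_batches(shuffled_X, shuffled_Y, mini_batch_size):
    # Sketch of step_2: loop over the complete mini-batches.
    m = shuffled_X.shape[1]
    mini_batches = []
    for k in range(m // mini_batch_size):
        start = k * mini_batch_size        # shifts forward each iteration
        end = (k + 1) * mini_batch_size
        mini_batches.append((shuffled_X[:, start:end],
                             shuffled_Y[:, start:end]))
    return mini_batches
```

If `start` and `end` do not involve `k`, every iteration appends the same slice, which is exactly the bug described above.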
Hello @nramon! Hope you are fine.
I’m reading the topics regarding random_mini_batches and they seem to help, but I’m still having trouble constructing the last mini-batch. I believe the complete mini-batches are correct, but the message I receive is attached below.
Completing this part of the code is all that is missing to finish Week 2.
Thanks in advance.
Vivian
Fixed by @Vivian_de_Carvalho_R. Just a small indexing mistake.
Good job. Enjoy the rest of the course
Hello @nramon, I’m having a similar problem to what Vivian experienced. The shapes are fine, but I can’t understand why the values are wrong.
Is it possible that something’s wrong with the values in the tests?
Hi @Francesco_Manfredi ,
Please start a fresh query to get a quick response from mentors. This thread is over 2 years old. The contributors to this thread may not be reachable.
If you attach that function in a direct message to me, I will have a look for you.
Thank you, @Kic. Your suggestions make a lot of sense.
Since it was a small mistake, I’m posting a hint that may be helpful to other learners: pay attention to the difference between `X` and `shuffled_X`.
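A toy illustration of why this matters (illustrative data, not the assignment code): once the shuffled copies are built, all later slicing must use them, or the mini-batches will have the right shapes but the wrong values.

```python
import numpy as np

np.random.seed(0)
X = np.arange(12).reshape(3, 4)   # toy data: 3 features, 4 examples
Y = np.arange(4).reshape(1, 4)

# Build one permutation and apply it to BOTH X and Y,
# so each column of X stays paired with its label.
permutation = list(np.random.permutation(X.shape[1]))
shuffled_X = X[:, permutation]    # columns reordered together
shuffled_Y = Y[:, permutation]

# Slicing X here instead of shuffled_X would give correct shapes
# but unshuffled (wrong) values.
```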
Good luck with the rest of the course, @Francesco_Manfredi
I forgot to use shuffled_X instead of X in the remaining batch, which led to different array values in spite of the correct shape. Thanks for the hint!
I hope this is the right place to propose another solution that is, at least to me, easier to grasp; maybe others will like it too. I don’t recommend using it in the notebook because of the grader, but in other cases it might be helpful.
Instead of calculating the number of complete mini-batches, the start index, the end index, and handling the special case separately (lots of calculations), I run one for loop from 0 to m (the number of training examples) with a step of mini_batch_size. Inside the loop I compute the end index of the current mini-batch as the minimum of i + mini_batch_size and m, ensuring the end index can’t exceed the length of the data sets.
In code it would look like this:
```python
import numpy as np

def random_mini_batches(X, Y, mini_batch_size):
    m = X.shape[1]
    mini_batches = []
    permutation = list(np.random.permutation(m))
    shuffled_X = X[:, permutation]
    shuffled_Y = Y[:, permutation]
    for i in range(0, m, mini_batch_size):
        # min() caps the end index at m, so the last (possibly smaller)
        # mini-batch needs no special-case handling.
        end_idx = min(i + mini_batch_size, m)
        mini_batch = (shuffled_X[:, i:end_idx], shuffled_Y[:, i:end_idx])
        mini_batches.append(mini_batch)
    return mini_batches
```
I find this easy to understand with minimum calculations and code.
I have read all the threads related to this and still can’t see where I’m going wrong.
Let me know if I can send someone my code to check; I feel I have put everything in place.
Please help.