Backpropagation across multiple layers

In the backpropagation across multiple layers example, it is shown that if we change w1[1], the cost function changes by 6 * epsilon. This assumes that w1[2] is not changed. But when we derive the values of w[1] and w[2], don't we have to change both w[1] and w[2] in the same iteration of backpropagation?
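
To make the question concrete, here is a minimal numerical sketch (the two-layer network, weight values, learning rate, and epsilon below are hypothetical, chosen only for illustration): each partial derivative is estimated by nudging one weight while the other is held fixed at its current value, and gradient descent then applies the updates to both weights together in the same iteration.

```python
def cost(w1, w2, x=1.0, y=0.5):
    # Hypothetical two-layer network with one scalar weight per layer
    # (linear activations, squared-error cost), purely for illustration.
    a1 = w1 * x      # layer 1 output
    a2 = w2 * a1     # layer 2 output
    return 0.5 * (a2 - y) ** 2

w1, w2 = 0.8, 1.5    # current weights (hypothetical values)
eps = 1e-4           # small nudge, the "epsilon" in the question

# Partial derivative with respect to w1: w2 is HELD FIXED at its current value.
dC_dw1 = (cost(w1 + eps, w2) - cost(w1, w2)) / eps

# Partial derivative with respect to w2: w1 is HELD FIXED at its current value.
dC_dw2 = (cost(w1, w2 + eps) - cost(w1, w2)) / eps

# Gradient descent then updates BOTH weights in the same iteration,
# each using the partial derivative computed at the current point.
lr = 0.1
w1, w2 = w1 - lr * dC_dw1, w2 - lr * dC_dw2
print(dC_dw1, dC_dw2, w1, w2)
```

In this sketch, holding the other weight fixed is just part of how each partial derivative is defined; it does not stop both weights from being updated in the same gradient-descent step.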

Can you please give a reference for where you see this?