In the video, \begin{bmatrix} 1 & 1 & 2 & 12 \\ 3 & -3 & -1 & 3 \\ 2 & -1 & 6 & 24 \end{bmatrix} is transformed to \begin{bmatrix} 1 & 1 & 2 & 12 \\ 0 & -6 & -7 & -33 \\ 0 & 0 & 6 & 18 \end{bmatrix}.
https://www.coursera.org/learn/machine-learning-linear-algebra/lecture/6Rxh8/row-echelon-form-in-general
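If I understand the lecture correctly, the first two steps are the usual eliminations below (my guess at the video's operations; the lecture may scale rows differently):

\begin{aligned} R_2 &\leftarrow R_2 - 3R_1 = \begin{bmatrix} 0 & -6 & -7 & -33 \end{bmatrix}, \\ R_3 &\leftarrow R_3 - 2R_1 = \begin{bmatrix} 0 & -3 & 2 & 0 \end{bmatrix} \end{aligned}

The first of these reproduces the video's second row exactly.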
However, I calculated it as follows:
Matrix[1] = (Matrix[1]/3 - Matrix[0])*3 = \begin{bmatrix}1&1&2&12\\0&-6&-7&-33\\2&-1&6&24\end{bmatrix}.
Matrix[2] = (Matrix[2]/2 - Matrix[0])*2 = \begin{bmatrix}1&1&2&12\\0&-6&-7&-33\\0&-3&2&0\end{bmatrix}.
Matrix[2] = (Matrix[2]/-3 - Matrix[1]/-6)*-6 = \begin{bmatrix}1&1&2&12\\0&-6&-7&-33\\0&0&11&33\end{bmatrix}.
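To sanity-check each step, here is a quick NumPy sketch that replays the same row operations (my own verification code, not from the course):

import numpy as np

# Augmented matrix from the video; floats so row assignments keep fractional intermediates.
M = np.array([[1.0,  1.0,  2.0, 12.0],
              [3.0, -3.0, -1.0,  3.0],
              [2.0, -1.0,  6.0, 24.0]])

# Step 1: clear the leading 3 in row 1 (0-indexed), as in Matrix[1] above.
M[1] = (M[1] / 3 - M[0]) * 3          # -> [0, -6, -7, -33]

# Step 2: clear the leading 2 in row 2.
M[2] = (M[2] / 2 - M[0]) * 2          # -> [0, -3, 2, 0]

# Step 3: clear the -3 left in row 2, column 1.
M[2] = (M[2] / -3 - M[1] / -6) * -6   # -> [0, 0, 11, 33]

print(M)

Note that step 1 is algebraically identical to the more conventional M[1] -= 3 * M[0]; dividing by the pivot and rescaling afterwards just takes a detour to the same row.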
My final row \begin{bmatrix}0&0&11&33\end{bmatrix} does not match the video's \begin{bmatrix}0&0&6&18\end{bmatrix}. How should I calculate this?