Hello everyone,
Here are some of the core linear algebra topics behind ML and DL that I would like to share with you all.
Scalars, Vectors, Matrices & Tensors
Scalar: A single number (e.g., n ∈ ℕ).
Vector: A 1D array x = [x₁, x₂, …, xₙ], representing a point in space.
Matrix: A 2D array A ∈ ℝ^(m×n), used in linear transformations.
Tensor: A generalized multi-dimensional array, essential for deep learning models (a quick sketch of all four follows below).
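A minimal NumPy sketch of these four objects (the values are illustrative, not from any particular model):

```python
import numpy as np

scalar = 3.0                        # a single number
vector = np.array([1.0, 2.0, 3.0])  # 1D array: a point in R^3
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0],
                   [5.0, 6.0]])     # 2D array: A in R^(3x2)
tensor = np.zeros((2, 3, 4))        # 3D array, e.g., a small batch of feature maps

print(vector.ndim, matrix.shape, tensor.shape)  # -> 1 (3, 2) (2, 3, 4)
```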
Matrix Operations in ML & DL
Hadamard Product (A⊙B) → Element-wise multiplication (used in LSTMs, image processing).
Dot Product (xᵀy) → Measures similarity (cosine similarity in NLP, embeddings).
Linear Equations (Ax = b) → Solved in ML optimization problems such as least squares.
Identity & Inverse Matrix: AA⁻¹ = Iₙ, but explicitly computing A⁻¹ is avoided in ML due to numerical instability; solvers are preferred, as sketched below.
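A small NumPy sketch of these operations (the matrices and vectors here are hypothetical examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
hadamard = A * B                 # element-wise (Hadamard) product A ⊙ B

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])
dot = x @ y                      # dot product xᵀy
cos_sim = dot / (np.linalg.norm(x) * np.linalg.norm(y))  # cosine similarity

b = np.array([1.0, 2.0])
# Prefer a solver over the explicit inverse for numerical stability:
x_hat = np.linalg.solve(A, b)    # solves Ax = b without forming A⁻¹
```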
Norms & Distance Measures
L1 Norm (∥x∥₁) → Sum of absolute values, used in sparse models & feature selection (e.g., Lasso).
L2 Norm (∥x∥₂) → Euclidean distance, used in regularization (Ridge Regression, weight decay in NN).
Max Norm (∥x∥∞) → Maximum absolute value, controls outliers in optimization.
Frobenius Norm (∥A∥_F) → Square root of the sum of squared entries; measures the overall size of a matrix, used in PCA & covariance matrices. A worked check of all four norms follows below.
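All four norms are available through np.linalg.norm; a quick check with an illustrative vector and matrix:

```python
import numpy as np

x = np.array([3.0, -4.0])
l1   = np.linalg.norm(x, 1)       # |3| + |-4| = 7.0
l2   = np.linalg.norm(x)          # sqrt(9 + 16) = 5.0
linf = np.linalg.norm(x, np.inf)  # max(|3|, |4|) = 4.0

A = np.array([[1.0, 2.0], [3.0, 4.0]])
fro = np.linalg.norm(A, 'fro')    # sqrt(1 + 4 + 9 + 16) ≈ 5.48
```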
Special Matrices & ML Applications
Diagonal Matrix (Dᵢⱼ = 0 for all i ≠ j) → Cheap to multiply and invert, so efficient in computation.
Symmetric Matrix (A = Aᵀ) → Used in covariance matrices.
Orthogonal Matrix (A⁻¹ = Aᵀ) → Important in eigendecomposition, PCA, SVD.
Unit Vector (∥x∥₂ = 1) → Used in directional scaling and normalization; it carries direction without magnitude (sketched below).
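A brief NumPy sketch of these special matrices (the examples are illustrative; a 2D rotation stands in for a general orthogonal matrix):

```python
import numpy as np

D = np.diag([1.0, 2.0, 3.0])     # diagonal: inverting just reciprocates the diagonal
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric: S == S.T, like a covariance matrix

theta = np.pi / 4                # a 2D rotation is a classic orthogonal matrix
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q⁻¹ = Qᵀ

v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)        # unit vector: ∥u∥₂ = 1
```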
Happy to share my understanding. Do connect if you have any doubts or want to share knowledge on this topic.
Best Regards,
Arif