Calculus and Linear Algebra
- Particularly important in ML as we often express our data as vectors and matrices.
- Consider some datapoint x which measures d different features:
$$x = \begin{bmatrix} x_1 & x_2 & \cdots & x_d \end{bmatrix}$$
- We can use the transpose to convert a row vector to a column vector and vice-versa:
$$x = \begin{bmatrix} x_1 & x_2 & \cdots & x_d \end{bmatrix}^T = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_d \end{bmatrix}$$
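A quick sketch of the row/column distinction above, using NumPy (the library choice is mine, not from the notes):

```python
import numpy as np

# A datapoint with d = 4 features, stored as a row vector of shape (1, 4).
x = np.array([[1.0, 2.0, 3.0, 4.0]])

# The transpose converts the row vector into a column vector of shape (4, 1).
x_col = x.T

print(x.shape)      # (1, 4)
print(x_col.shape)  # (4, 1)
```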
- Suppose we have n of these datapoints, representing n different samples or instances of whatever data we’re measuring.
- We can represent these in a matrix, in which each column holds the values of a single feature across all datapoints.
- Rows represent a single datapoint.
$$M = \begin{bmatrix} M_{1,1} & M_{1,2} & \cdots & M_{1,d} \\ M_{2,1} & M_{2,2} & & \vdots \\ \vdots & & \ddots & \\ M_{n,1} & \cdots & & M_{n,d} \end{bmatrix}$$
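The n × d layout can be sketched in NumPy as follows (the feature names and values are illustrative, not from the notes):

```python
import numpy as np

# n = 3 datapoints, d = 2 features: each row is one datapoint,
# each column holds one feature across all datapoints.
M = np.array([
    [170.0, 65.0],   # datapoint 1: hypothetical (height, weight)
    [160.0, 55.0],   # datapoint 2
    [180.0, 80.0],   # datapoint 3
])

print(M.shape)   # (3, 2), i.e. (n, d)
print(M[0])      # first datapoint (a row)
print(M[:, 0])   # first feature across all datapoints (a column)
```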
- We can use linear algebra operations to perform analysis and to transform the data for analysis.
- Identity matrix, diagonal matrix.
- Symmetric matrices & other structures
- Vector length / euclidean distance / L2 norm
- Orthogonality
- Multiplication (inner/dot/scalar product; outer product; cross product)
- Inverse of matrices
- Eigenvectors and eigenvalues.
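Several of the operations listed above can be tried directly in NumPy; a minimal sketch (the specific vectors and matrices are my own examples):

```python
import numpy as np

I = np.eye(3)                        # 3x3 identity matrix

v = np.array([3.0, 4.0])
length = np.linalg.norm(v)           # vector length / Euclidean distance / L2 norm -> 5.0

a = np.array([1.0, 0.0])
b = np.array([0.0, 2.0])
orthogonal = np.dot(a, b) == 0       # dot product is zero for orthogonal vectors

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])           # a diagonal (hence symmetric) matrix
A_inv = np.linalg.inv(A)             # inverse: A @ A_inv equals the identity

eigvals, eigvecs = np.linalg.eig(A)  # for a diagonal matrix, eigenvalues are the diagonal entries
```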
- Derivatives and Integrals
- Gradient of a vector (partial derivatives)
$$\nabla f(x) = \operatorname{grad} f(x) = \frac{\partial f(x)}{\partial x} = \begin{bmatrix} \dfrac{\partial f(x)}{\partial x_1} \\ \vdots \\ \dfrac{\partial f(x)}{\partial x_d} \end{bmatrix}$$
- The gradient gives the slope of f in a d-dimensional space: each component is the partial derivative with respect to one coordinate.
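The gradient definition above can be approximated numerically with central finite differences; a small sketch (the function `f` and step size `h` are my own illustrative choices):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x: each component is
    (f(x + h*e_i) - f(x - h*e_i)) / (2h) for the i-th basis vector e_i."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad

# Example: f(x) = x1^2 + 3*x2, so grad f(x) = [2*x1, 3].
f = lambda x: x[0] ** 2 + 3 * x[1]
g = numerical_gradient(f, np.array([2.0, 1.0]))
print(g)  # approximately [4.0, 3.0]
```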