Calculus and Linear Algebra

Course: Machine Learning
Semester: S1 2023

Calculus and Linear Algebra

  • Particularly important in ML as we often express our data as vectors and matrices.
  • Consider some datapoint $x$ which measures $d$ different features:

x = \begin{bmatrix}x_1 & x_2 & \cdots & x_d\end{bmatrix}

  • We can use the transpose to convert a row vector to a column vector and vice-versa:

x = \begin{bmatrix}x_1 & x_2 & \cdots & x_d\end{bmatrix}^T = \begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_d\end{bmatrix}
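As a quick sketch of the row/column distinction (using NumPy, a common choice in ML, though the notes don't name a library):

```python
import numpy as np

# A datapoint with d = 4 features, stored as a row vector of shape (1, 4).
x = np.array([[1.0, 2.0, 3.0, 4.0]])

# The transpose converts the row vector into a column vector of shape (4, 1).
x_col = x.T

print(x.shape)      # (1, 4)
print(x_col.shape)  # (4, 1)
```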

  • Consider that we have $n$ of these datapoints, representing $n$ different samples or instances of whatever data we’re measuring.
  • We can represent these in a matrix, in which a single column encodes the values of a specific feature across all datapoints.
    • Each row represents a single datapoint.

M = \begin{bmatrix}M_{1,1} & M_{1,2} & \cdots & M_{1,d} \\ M_{2,1} & M_{2,2} & & \\ \vdots & & \ddots & \\ M_{n,1} & & & M_{n,d}\end{bmatrix}


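The $n \times d$ layout above can be sketched with NumPy (the feature values here are made-up illustration data):

```python
import numpy as np

# n = 3 datapoints, d = 2 features: each row is one datapoint,
# each column holds one feature across all datapoints.
M = np.array([[5.1, 3.5],
              [4.9, 3.0],
              [4.7, 3.2]])

print(M.shape)  # (3, 2), i.e. (n, d)
print(M[0, :])  # row 0: all features of the first datapoint
print(M[:, 1])  # column 1: the second feature across all datapoints
```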

  • We can use linear algebra operations to analyse and transform the data.
    • Identity matrix, diagonal matrix.
    • Symmetric matrices & other structures
    • Vector length / Euclidean distance / L2 norm
    • Orthogonality
    • Multiplication (inner/dot/scalar product; outer/cross product)
    • Inverse of matrices
    • Eigenvectors and eigenvalues.
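A few of the operations listed above, sketched with NumPy (the vectors and matrix are example values, not from the notes):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([-4.0, 3.0])

# Vector length / Euclidean distance / L2 norm.
print(np.linalg.norm(a))  # 5.0

# Dot (inner / scalar) product; a and b are orthogonal, so it is 0.
print(np.dot(a, b))       # 0.0

# Identity matrix and matrix inverse: A @ inv(A) = I.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.allclose(A @ np.linalg.inv(A), np.eye(2)))  # True

# Eigenvalues and eigenvectors; eigh is for symmetric matrices like A.
eigvals, eigvecs = np.linalg.eigh(A)
print(eigvals)
```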
  • Derivatives and Integrals
  • Gradient of a vector (partial derivatives)

\nabla f(x) = \operatorname{grad} f(x) = \frac{\partial f(x)}{\partial x} = \begin{bmatrix}\frac{\partial f(x)}{\partial x_1} \\ \vdots \\ \frac{\partial f(x)}{\partial x_d}\end{bmatrix}

  • The gradient gives the slope of $f$ in a $d$-dimensional space.