
Free Online Course: Matrix Calculus for Machine Learning

Duration of the online course: 13 hours and 55 minutes


Master matrix calculus to boost machine learning skills—learn Jacobians, gradients, autodiff, and optimization with a free online course and practical exercises.

In this free course, learn about

  • Derivatives as linear operators; linearization for ML approximations and sensitivity analysis
  • Matrix calculus rules: differentials of products (e.g., d(XAX)) and matrix powers (e.g., d(A^3))
  • Jacobians in higher dimensions; derivatives of matrix functions and 2D linear maps as transformations
  • Vectorization and Kronecker products to express Jacobians compactly for matrix mappings
  • Finite-difference derivatives: forward vs central differences; roundoff/cancellation when step is too small
  • Gradients via inner products in general vector spaces; defining derivatives through linear maps
  • Newton’s method as iterative linearization to approximate roots of nonlinear equations
  • Derivatives of determinant and inverse; identities used in optimization and inference
  • Automatic differentiation: dual numbers (forward mode) and computational graphs for exact derivatives
  • Reverse-mode AD advantages for scalar losses with many parameters (common in ML training)
  • Adjoint differentiation for ODE solutions; discretization via time-stepping methods (e.g., Euler/Runge–Kutta)
  • Calculus of variations: differentiating functionals to obtain gradient/Euler–Lagrange conditions
  • Derivatives of random functions; reparameterization trick for low-variance gradient estimates
  • Second derivatives: bilinear forms and Hessians for curvature, uncertainty, and optimization behavior
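To make the finite-difference topic above concrete, here is a minimal Python sketch comparing forward and central differences (the test function sin and the step size are illustrative choices, not taken from the course):

```python
import math

def forward_diff(f, x, h):
    # one-sided approximation, truncation error O(h)
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # symmetric approximation, truncation error O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-5
exact = math.cos(x)  # exact derivative of sin at x
err_fwd = abs(forward_diff(math.sin, x, h) - exact)
err_ctr = abs(central_diff(math.sin, x, h) - exact)
print(err_fwd, err_ctr)  # the central-difference error is far smaller
```

Shrinking h much further would eventually *increase* the error, because floating-point cancellation in f(x + h) − f(x − h) starts to dominate the truncation error — exactly the roundoff issue the course discusses.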

Course Description

Matrix calculus is the language that turns machine learning ideas into trainable models. When your code needs to compute gradients through vectors, matrices, decompositions, or probabilistic layers, intuition alone stops being enough—you need reliable tools for derivatives in higher dimensions. This free online course helps you build that toolkit, connecting the math directly to the workflows used in modern AI and machine learning.

You will develop a clear understanding of derivatives as linear operators and learn to express changes in matrix-valued functions in a way that stays consistent and checkable. The course bridges the gap between symbolic manipulation and practical computation by showing how Jacobians, vectorization, Kronecker products, and finite-difference approximations relate to real engineering tasks, including debugging gradients and validating numerical results.
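As an example of the kind of gradient check this enables, the product-rule differential d(A³) = dA·A² + A·dA·A + A²·dA (one of the identities covered in the course) can be validated numerically; the sketch below uses NumPy with an arbitrary random matrix and perturbation:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
dA = 1e-6 * rng.standard_normal((3, 3))  # small perturbation

# first-order change predicted by the matrix product rule for A^3
predicted = dA @ A @ A + A @ dA @ A + A @ A @ dA

# actual change in f(A) = A^3 under the perturbation
actual = (A + dA) @ (A + dA) @ (A + dA) - A @ A @ A

# the two agree to first order in dA
rel_err = np.linalg.norm(actual - predicted) / np.linalg.norm(predicted)
print(rel_err)  # tiny: on the order of the perturbation size
```

This "predicted vs. actual change" pattern is the standard way to debug hand-derived matrix gradients before trusting them in a model.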

From there, the focus expands to gradients and inner products in more general vector spaces—essential when you move beyond basic Euclidean assumptions. You will see how root finding and optimization methods connect to differentiation, and why adjoint-based techniques are central when the cost of naive gradient computation becomes too high.
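The root-finding connection can be seen in a few lines: Newton's method repeatedly linearizes f around the current point and solves the linear model for the step. A minimal scalar sketch (the equation x² − 2 = 0 is an illustrative choice):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Find a root of f by iterative linearization:
    solve f(x) + f'(x) * dx = 0 for the step dx at each iterate."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# root of x^2 - 2 = 0, i.e. sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ≈ 1.41421356...
```

In higher dimensions the same idea holds with the Jacobian replacing f′, which is why the course treats derivatives-as-linear-operators and root finding together.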

A key theme is how derivatives are computed in practice. You will explore automatic differentiation concepts, dual numbers, and computational graphs, gaining insight into why reverse-mode methods underpin backpropagation and scale so well for learning problems. The course also connects differentiation to dynamical systems via adjoint differentiation of ODE solutions, offering a pathway into neural ODEs and differentiable simulation.
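Forward-mode automatic differentiation via dual numbers, mentioned above, fits in a short sketch: arithmetic on pairs (value, derivative) with ε² = 0 propagates exact derivatives through ordinary code. This toy class (supporting only + and ×, for illustration) is a simplified sketch, not the course's implementation:

```python
class Dual:
    """Dual number a + b*eps with eps^2 = 0: the eps coefficient
    carries the derivative through arithmetic (forward-mode AD)."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.eps + o.eps)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule encoded in the eps component
        return Dual(self.val * o.val,
                    self.val * o.eps + self.eps * o.val)
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

out = f(Dual(2.0, 1.0))  # seed with dx/dx = 1
print(out.val, out.eps)  # 12.0 and 14.0
```

Forward mode costs one pass per input, which is why reverse mode (one pass per *output*) wins for scalar losses with millions of parameters — the situation backpropagation is built for.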

Later topics address derivatives of random functions, including ideas such as reparameterization for low-variance gradient estimates, and the role of second derivatives and Hessians when curvature matters for optimization and uncertainty. You will also gain perspective on differentiating eigenproblems, which appear in dimensionality reduction, spectral methods, and many advanced model components. Throughout, exercises reinforce the concepts so you can translate them into more stable implementations and stronger model understanding.
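The reparameterization idea can be sketched in a few lines. Assuming the toy objective E[z²] with z ~ N(μ, σ²) (an illustrative choice; its true gradient with respect to μ is 2μ), writing z = μ + σ·ε makes each sample differentiable in μ:

```python
import random

random.seed(0)
mu, sigma = 1.5, 0.5
n = 200_000

# Reparameterize z ~ N(mu, sigma^2) as z = mu + sigma * eps with
# eps ~ N(0, 1); the gradient of E[z^2] w.r.t. mu is then estimated
# by averaging the pathwise derivative d(z^2)/d(mu) = 2 * z.
grad_est = 0.0
for _ in range(n):
    eps = random.gauss(0.0, 1.0)
    z = mu + sigma * eps
    grad_est += 2.0 * z  # pathwise gradient sample
grad_est /= n

print(grad_est)  # close to the true gradient 2 * mu = 3.0
```

Because the randomness is isolated in ε, the estimator's variance is low compared with score-function alternatives — the property that makes this trick central to variational autoencoders and similar models.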

Course content

  • Video class: Lecture 1 Part 1: Introduction and Motivation 57m
  • Exercise: In the context of matrix calculus, if A and B are matrices and you define a function F(X) = XAX, what is the correct form of the derivative dF with respect to X, assuming X is also a matrix?
  • Video class: Lecture 1 Part 2: Derivatives as Linear Operators 48m
  • Exercise: What is the primary purpose of using derivatives in the context of linearization for machine learning?
  • Video class: Lecture 2 Part 1: Derivatives in Higher Dimensions: Jacobians and Matrix Functions 1h13m
  • Exercise: In the context of transformations using matrix calculus, what does a linear map from R2 to R2 effectively accomplish?
  • Video class: Lecture 2 Part 2: Vectorization of Matrix Functions 30m
  • Exercise: In the context of matrix calculus, when dealing with a function f(A) = A^3 where A is a square matrix, what does the derivative df represent in terms of dA?
  • Video class: Lecture 3 Part 1: Kronecker Products and Jacobians 53m
  • Exercise: What is the Jacobian matrix for the transformation function of a 2x2 matrix to its LU decomposition?
  • Video class: Lecture 3 Part 2: Finite-Difference Approximations 51m
  • Exercise: What is one reason why error increases when delta x becomes too small in finite difference calculations?
  • Video class: Lecture 4 Part 1: Gradients and Inner Products in Other Vector Spaces 1h03m
  • Exercise: In the context of finite difference approximations for derivatives, what advantage does the central difference method have over the forward difference method?
  • Video class: Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods 44m
  • Exercise: What function does Newton's Method approximate to find the roots of an equation?
  • Video class: Lecture 5 Part 1: Derivative of Matrix Determinant and Inverse 28m
  • Exercise: In the context of matrix calculus for computing the derivatives, what is required to define a derivative in a vector space?
  • Video class: Lecture 5 Part 2: Forward Automatic Differentiation via Dual Numbers 36m
  • Exercise: What is one of the key benefits of automatic differentiation as highlighted in the text?
  • Video class: Lecture 5 Part 3: Differentiation on Computational Graphs 32m
  • Exercise: In the context of forward mode and reverse mode automatic differentiation, what is the main reason one might choose to calculate derivatives using reverse mode over forward mode?
  • Video class: Lecture 6 Part 1: Adjoint Differentiation of ODE Solutions 58m
  • Exercise: In numerical solution of ordinary differential equations (ODEs), which method is commonly used for discretizing the derivative in order to approximate the solution at discrete time steps?
  • Video class: Lecture 6 Part 2: Calculus of Variations and Gradients of Functionals 42m
  • Exercise: What is the primary goal when taking the derivative of a functional in the context of calculus of variations?
  • Video class: Lecture 7 Part 1: Derivatives of Random Functions 1h06m
  • Exercise: In the context of differentiating random functions, which of the following statements best describes the 'reparameterization trick'?
  • Video class: Lecture 7 Part 2: Second Derivatives, Bilinear Forms, and Hessian Matrices 46m
  • Exercise: What is the primary purpose of calculating the second derivative in matrix calculus, especially in the context of machine learning?
  • Video class: Lecture 8 Part 1: Derivatives of Eigenproblems 36m
  • Exercise: What is one way to generate a random vector uniformly on the surface of a sphere in n-dimensional space?
  • Video class: Lecture 8 Part 2: Automatic Differentiation on Computational Graphs 1h05m
  • Exercise: In the context of matrix calculus for machine learning, consider a computation graph formed by sequences of operations. If a computation graph is described as a Directed Acyclic Graph (DAG), which of the following statements is TRUE about the graph?

This free course includes:

13 hours and 55 minutes of online video lessons

Digital certificate of course completion (Free)

Exercises to test your knowledge

100% free, from content to certificate

Ready to get started? Download the app today.

Install the app now to access the course.

Over 5,000 free courses

Programming, English, Digital Marketing and much more! Learn whatever you want, for free.


Study plan with AI

Our app's Artificial Intelligence can create a study schedule for the course you choose.


From zero to professional success

Improve your resume with our free Certificate and then use our Artificial Intelligence to find your dream job.


More free courses in Artificial Intelligence and Machine Learning

Free Ebook + Audiobooks! Learn by listening or reading!

Download the app now to get access to 5,000+ free courses, exercises, certificates, and lots of content without paying anything!

  • 100% free online courses from start to finish

    Thousands of online courses in video, ebooks and audiobooks.

  • More than 60 thousand free exercises

    To test your knowledge during online courses

  • Valid free Digital Certificate with QR Code

    Generated directly from your cell phone's photo gallery and sent to your email
