
Linear Algebra for Machine Learning and Data Science (9)

by Fresh Red 2024. 5. 5.

Vectors and Linear Transformations

An individual instance (observation) of data is typically represented as a vector in machine learning. This week, you will learn about the properties and operations of vectors. You will also learn about linear transformations, the matrix inverse, and one of the most important operations on matrices: matrix multiplication. You will see how matrix multiplication arises naturally from the composition of linear transformations. Finally, you will learn how to apply some of the properties of matrices and vectors covered so far to neural networks.

Learning Objectives


  • Perform common operations on vectors like sum, difference, and dot product.
  • Multiply matrices and vectors.
  • Represent a system of linear equations as a linear transformation on a vector.
  • Calculate the inverse of a matrix, if it exists.

Vector Algebra

Vectors and their properties

A vector can be seen as an arrow in the plane or in a higher-dimensional space

Two important attributes of a vector are its magnitude (size) and its direction

Vectors and norms

The norm of a vector measures its length: the distance from its tail to its tip (from the origin to the point it represents)

Two commonly used norms are:

  • The L1-norm (aka Manhattan norm) is the sum of the absolute values of the coordinates; for a 2D vector, think of it as the absolute width plus the absolute height
  • The L2-norm (aka Euclidean norm) is the square root of the sum of the squared coordinates; for a 2D vector, this is the length of the hypotenuse of the right triangle whose legs are the coordinates, via the Pythagorean theorem
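As a minimal sketch in Python with NumPy (the vector [3, 4] is my own example, not from the course):

```python
import numpy as np

v = np.array([3.0, 4.0])

l1 = np.sum(np.abs(v))        # L1 (Manhattan) norm: |3| + |4| = 7
l2 = np.sqrt(np.sum(v ** 2))  # L2 (Euclidean) norm: sqrt(9 + 16) = 5

# np.linalg.norm computes the same values directly
assert l1 == np.linalg.norm(v, ord=1)
assert l2 == np.linalg.norm(v, ord=2)
```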

The direction of a vector can be deduced from its coordinates

Sum and difference of vectors

We can get the sum and difference of two vectors by adding or subtracting their corresponding coordinates
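A quick sketch (the vectors u and v are my own example values):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

u + v  # array([4., 1.])  -- coordinate-wise sum
u - v  # array([-2., 3.]) -- coordinate-wise difference
```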

Distance between vectors

Take the difference of the vectors (subtract corresponding coordinates) and apply either the L1-norm or the L2-norm to the result to get the distance between the vectors
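Sketched with the same example vectors as above:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

np.linalg.norm(u - v, ord=1)  # L1 distance: |-2| + |3| = 5.0
np.linalg.norm(u - v)         # L2 distance: sqrt(4 + 9) ~ 3.61 (ord=2 is the default)
```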

Multiplying a vector by a scalar

Multiplying by a positive scalar stretches or shrinks the vector while keeping its direction; multiplying by a negative scalar also flips the direction
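For instance (my own example values):

```python
import numpy as np

w = np.array([2.0, 1.0])

3 * w   # array([6., 3.])   -- same direction, three times the length
-1 * w  # array([-2., -1.]) -- same length, opposite direction
```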

The dot product

We get the dot product of two vectors by multiplying their corresponding coordinates and adding the products
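A short sketch (the vectors a and b are my own examples):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

np.dot(a, b)  # 1*4 + 2*5 + 3*6 = 32.0
a @ b         # the @ operator computes the same dot product
```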


Geometric Dot Product

The angle between two vectors determines the sign and size of their dot product

The dot product of two orthogonal (perpendicular) vectors is 0


For vectors pointing in different directions, we can get the dot product geometrically by projecting one vector perpendicularly onto the other and multiplying the (signed) length of that projection by the norm of the vector projected onto

If we know the angle θ between the vectors, multiplying the norms of the vectors by the cosine of the angle gives the same result: u · v = ‖u‖ ‖v‖ cos(θ)


The dot product can be positive, 0, or negative depending on the direction of the projection: positive when the angle is less than 90°, zero at exactly 90°, and negative when the angle is greater than 90°
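A sketch that checks the cosine formula against the coordinate formula (the vectors are my own examples):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against rounding error

# ||u|| * ||v|| * cos(theta) reproduces the coordinate-wise dot product
geometric = np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta)
assert np.isclose(geometric, u @ v)  # both equal 1.0 here; theta is 45 degrees
```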

Multiplying a matrix by a vector

Multiplying a matrix by a vector is a stack of dot products: each entry of the result is the dot product of one row of the matrix with the vector


A matrix need not be square; the one caution is that the number of columns of the matrix must match the length of the vector

As long as those inner dimensions match (an m × n matrix times an n-dimensional vector), the multiplication is defined and produces an m-dimensional vector
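A minimal sketch (the 2 × 3 matrix and the vector are my own examples):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # 2x3 matrix: 2 rows, 3 columns
x = np.array([1.0, 0.0, -1.0])   # length-3 vector, matching A's 3 columns

A @ x  # array([-2., -2.]) -- each entry is one row of A dotted with x
```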

 

All the information here is based on the Linear Algebra for Machine Learning and Data Science course on Coursera from DeepLearning.AI
