Vectors and Linear Transformations
An individual instance (observation) of data is typically represented as a vector in machine learning. This week, you will learn about the properties and operations of vectors. You will also learn about linear transformations, the matrix inverse, and one of the most important operations on matrices: matrix multiplication. You will see how matrix multiplication naturally arises from the composition of linear transformations. Finally, you will learn how to apply some of the properties of matrices and vectors that you have learned so far to neural networks.
Learning Objectives
- Perform common operations on vectors like sum, difference, and dot product.
- Multiply matrices and vectors.
- Represent a system of linear equations as a linear transformation on a vector.
- Calculate the inverse of a matrix, if it exists.
Vector Algebra
Vectors and their properties
A vector can be seen as an arrow in the plane or in higher-dimensional space
Two important components of a vector are its magnitude (size) and its direction
The norm of a vector measures its length, i.e., the distance from the origin to the point where the vector ends
Two commonly used norms are:
- The L1-norm (a.k.a. Manhattan norm) is the sum of the absolute values of the coordinates; for a 2D vector, think of it as adding the absolute values of its width and height
- The L2-norm (a.k.a. Euclidean norm) is the square root of the sum of the squared coordinates; for a 2D vector, this is the length of the hypotenuse given by the Pythagorean theorem
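A quick NumPy sketch (my own, not from the course materials) showing both norms via `np.linalg.norm`:

```python
import numpy as np

v = np.array([3.0, 4.0])

# L1-norm: sum of absolute coordinates -> |3| + |4| = 7
print(np.linalg.norm(v, ord=1))  # 7.0

# L2-norm: square root of the sum of squares -> sqrt(3^2 + 4^2) = 5
print(np.linalg.norm(v, ord=2))  # 5.0 (ord=2 is the default)
```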

The direction of a vector can be deduced from its coordinates; in 2D, for example, the angle with the horizontal axis is determined by the ratio of the y- and x-coordinates
Sum and difference of vectors
We can get the sum and difference of vectors by adding or subtracting corresponding coordinates
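A minimal NumPy illustration (my own example) of coordinate-wise addition and subtraction:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

print(u + v)  # [4. 1.]   coordinate-wise sum
print(u - v)  # [-2. 3.]  coordinate-wise difference
```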

Distance between vectors
Take the difference of the vectors (subtract corresponding coordinates) and apply either the L1-norm or the L2-norm to the result to get the distance between the vectors
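A small NumPy sketch (my own) combining the two steps:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([4.0, 6.0])

diff = u - v                        # [-3. -4.]
print(np.linalg.norm(diff, ord=1))  # L1 distance: |-3| + |-4| = 7.0
print(np.linalg.norm(diff))         # L2 distance: sqrt(9 + 16) = 5.0
```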

Multiplying a vector by a scalar
Multiplying a vector by a scalar scales each of its coordinates by that scalar; a negative scalar also reverses the vector's direction
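For example (my own NumPy sketch):

```python
import numpy as np

v = np.array([2.0, -1.0])

print(3 * v)    # [ 6. -3.]   stretches the vector by a factor of 3
print(-1 * v)   # [-2.  1.]   same length, opposite direction
print(0.5 * v)  # [ 1. -0.5]  shrinks the vector by half
```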
The dot product
We get the dot product by multiplying corresponding coordinates and summing the products
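In NumPy (my own sketch), this is `np.dot` or the `@` operator:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# 1*4 + 2*5 + 3*6 = 32
print(np.dot(u, v))  # 32.0
print(u @ v)         # 32.0, the @ operator does the same thing
```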
Geometric Dot Product
The angle between two vectors determines the sign and size of their dot product
The dot product of orthogonal (perpendicular) vectors is 0
For vectors pointing in different directions, we can get the dot product by perpendicularly projecting one vector onto the other and multiplying the (signed) length of the projection by the norm of the other vector
If we know the angle θ between the vectors, multiplying the norms of the vectors by the cosine of the angle achieves the same result: u · v = ‖u‖‖v‖ cos θ
The dot product can be positive, 0, or negative depending on the direction of the projection: positive for an acute angle, 0 for orthogonal vectors, and negative for an obtuse angle
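A quick NumPy check (my own sketch) that the coordinate formula and the geometric formula ‖u‖‖v‖ cos θ agree:

```python
import numpy as np

theta = np.pi / 4                    # the vectors are 45 degrees apart
u = np.array([2.0, 0.0])             # norm 2, along the x-axis
v = 3.0 * np.array([np.cos(theta), np.sin(theta)])  # norm 3, at 45 degrees

algebraic = u @ v                                                  # coordinate formula
geometric = np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta)  # |u||v|cos(theta)

print(algebraic, geometric)              # both ~4.2426 (= 6 * cos 45°)
print(np.isclose(algebraic, geometric))  # True
```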
Multiplying a matrix by a vector
Multiplying a matrix by a vector amounts to taking the dot product of each row of the matrix with the vector; each row's dot product gives one coordinate of the result
A matrix can be rectangular; the one caution is that the number of columns of the matrix must match the length of the vector
As long as those inner dimensions match, the multiplication is defined: an m × n matrix times an n-vector yields an m-vector
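A minimal NumPy sketch (my own example) of a rectangular matrix times a vector:

```python
import numpy as np

# A rectangular 2 x 3 matrix: its 3 columns match the vector's length of 3
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
v = np.array([1.0, 0.0, -1.0])

# Each output coordinate is the dot product of one row of A with v
print(A @ v)  # [1*1 + 2*0 + 3*(-1), 4*1 + 5*0 + 6*(-1)] = [-2. -2.]
```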
All the information here is based on the Linear Algebra for Machine Learning and Data Science course on Coursera from DeepLearning.AI