
Linear Algebra for Machine Learning and Data Science (15)

by Fresh Red 2024. 6. 7.

Determinants and Eigenvectors

Eigenvalues and Eigenvectors

Bases in Linear Algebra


A matrix can be seen as a linear transformation from a plane to a plane

Basis: Two vectors from the origin that define the plane's grid (a square, a parallelogram, etc.)

The main property of a basis is that every point in the space can be expressed as a linear combination of its elements

Two vectors that point in the same (or opposite) direction can't form a basis
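As a quick sanity check, two plane vectors form a basis exactly when the matrix with those vectors as columns is non-singular. A minimal NumPy sketch (the helper name `is_basis_2d` is ours, not from the course):

```python
import numpy as np

def is_basis_2d(u, v):
    """Do u and v form a basis of the plane? (helper name is ours)"""
    # They do iff the matrix [u | v] is non-singular, i.e. its
    # determinant is nonzero (the vectors are not parallel).
    return bool(not np.isclose(np.linalg.det(np.column_stack([u, v])), 0.0))

print(is_basis_2d([1, 0], [0, 1]))    # True  -- independent directions
print(is_basis_2d([1, 2], [2, 4]))    # False -- same direction
print(is_basis_2d([1, 2], [-1, -2]))  # False -- opposite direction
```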

Span in Linear Algebra

Span with basis and non-basis

The span of vectors is a set of points that can be reached by walking in the direction of these vectors in any combination

For a non-basis (two parallel vectors), the span is only a straight line


A basis is a minimal spanning set

For instance, to cover a line, we need 1 element (1 direction) and to cover a plane, we need 2 elements (2 directions)


Linearly independent: When one vector can’t be obtained as a linear combination of the others

Linearly dependent: When one vector can be obtained as a linear combination of the others

If we have more vectors than the dimension of the space we are trying to span, the group is always linearly dependent (the system of linear equations expressing one vector in terms of the others always has a solution)
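The "more vectors than dimensions" rule can be checked by comparing the rank of the stacked vectors against their count. A small NumPy sketch (the helper name `linearly_independent` is our own):

```python
import numpy as np

def linearly_independent(vectors):
    """Are the given vectors linearly independent? (helper name is ours)"""
    # Independent iff the rank of the matrix whose columns are the
    # vectors equals the number of vectors.
    M = np.column_stack(vectors)
    return bool(np.linalg.matrix_rank(M) == len(vectors))

# Three vectors in 2-D space: always a linearly dependent group.
print(linearly_independent([[1, 0], [0, 1], [1, 1]]))  # False
print(linearly_independent([[1, 0], [0, 1]]))          # True
```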

Are these vectors linearly independent?

A formal definition

Eigenbases

A change of basis (a change of coordinates) and an eigenbasis

Eigenbasis: A basis made of eigenvectors; viewed in this basis, the linear transformation simply stretches one parallelogram into another

Eigenvector: A vector whose direction is unchanged by the linear transformation; it is only stretched by some factor

Eigenvalues: Stretching factors in a linear transformation

This simplifies the linear transformation and calculations in linear algebra
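The simplification can be seen numerically: expressing the transformation in the eigenbasis turns its matrix into a diagonal (pure-stretch) matrix. A sketch with an assumed example matrix:

```python
import numpy as np

# Assumed example: A has eigenvalues 2 and 3 with
# eigenvectors (1, 0) and (1, 1).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],   # columns of P are the eigenvectors
              [0.0, 1.0]])

# Changing to the eigenbasis diagonalizes the transformation:
D = np.linalg.inv(P) @ A @ P
print(D)  # diagonal, with the eigenvalues 2 and 3 on the diagonal
```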

Eigenvalues and Eigenvectors


Eigenvalues and eigenvectors come in pairs

The matrix multiplication on the left-hand side simplifies to the scalar multiplication on the right-hand side, which reduces the overall computation

Look at the below example of how it's easier to calculate a linear transformation with eigenvectors and eigenvalues

Example and a recap of the terms
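The identity $Av = \lambda v$ is easy to verify numerically; here is a sketch with an assumed example matrix and one of its eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # assumed example matrix
v = np.array([1.0, 1.0])    # an eigenvector of A
lam = 3.0                   # the matching eigenvalue

# The matrix product collapses to a scalar product on eigenvectors:
print(A @ v)    # [3. 3.]
print(lam * v)  # [3. 3.]
```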

Calculating Eigenvalues and Eigenvectors


The above illustrates the concept of coinciding or matching linear transformations and their relationship to singular matrices

Two different linear transformations are generally not the same everywhere; they coincide or match only at certain points or along certain lines, and for those specific points or vectors the two transformations produce the same result.

When two linear transformations coincide at infinitely many points, it indicates that something singular is happening: the difference between the two matrices is a singular matrix, and its determinant is zero.

This demonstration introduces the concept of singular matrices and their connection to coinciding transformations. It helps to illustrate that when two transformations match in infinitely many points, it implies that the difference between the matrices is a singular matrix.

Understanding the relationship between coinciding transformations and singular matrices is important in linear algebra, as it provides insights into the behavior of linear transformations and the properties of matrices

Here's a more detailed explanation:

  1. Coinciding Transformations: When two linear transformations coincide or match infinitely many points, they produce the same result for those specific points or vectors. This indicates that the transformations have similar effects on those points or vectors. By studying coinciding transformations, we can gain insights into how different transformations behave and relate to each other.
  2. Singular Matrices: A singular matrix is a square matrix whose determinant is zero. In the context of coinciding transformations, when two transformations match at infinitely many points, the difference between the matrices representing those transformations is a singular matrix. This is because the difference matrix sends every vector where the transformations coincide — an entire line or subspace of them — to the zero vector.
  3. Relationship between Coinciding Transformations and Singular Matrices: The connection arises because when two transformations match at infinitely many points, their difference matrix maps all of those vectors to the zero vector. A matrix that sends a whole line of nonzero vectors to zero must be singular.
  4. Insights into Linear Transformations: By understanding the relationship between coinciding transformations and singular matrices, we can gain insights into the behavior of linear transformations. We can identify when two transformations have similar effects on certain vectors or subspaces. This knowledge can be useful in various applications, such as understanding the behavior of data transformations in machine learning or analyzing the stability of dynamical systems.
  5. Properties of Matrices: The concept of singular matrices is an important property of matrices. Singular matrices have unique properties and play a significant role in linear algebra. Understanding the connection between coinciding transformations and singular matrices helps us comprehend the properties and characteristics of matrices, such as invertibility, determinant, and eigenvalues.
  6. This implies that the two transformations have shared eigenvectors. Eigenvectors are special vectors that when subjected to a linear transformation, only change in scale (up to a scalar factor) and do not change their direction. In the context of coinciding transformations, the shared eigenvectors represent the points or vectors that remain unchanged or coincide under both transformations.
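A tiny numerical illustration of the points above, with two assumed transformations that agree on every vector along the x-axis:

```python
import numpy as np

# Two assumed transformations that coincide on every vector (x, 0):
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])

print(A @ np.array([5.0, 0.0]))  # [10.  0.]
print(B @ np.array([5.0, 0.0]))  # [10.  0.]

# Agreement on a whole line forces A - B to be singular:
print(np.linalg.det(A - B))  # 0.0
```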

Finding eigenvalues and eigenvectors

If a linear transformation coincides with scaling by a certain factor at infinitely many points, then that factor is called an eigenvalue

To find the eigenvalues (and from them the eigenvectors), we solve the characteristic polynomial equation:

$\det(A - \lambda I) = 0$
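NumPy can produce the characteristic polynomial's coefficients directly with `np.poly`; a sketch using the symmetric matrix from the exercise that follows:

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])  # the matrix used in the exercise below

# np.poly on a square matrix returns the coefficients of its
# characteristic polynomial, highest degree first:
coeffs = np.poly(A)               # [1, -12, 11] -> lambda^2 - 12*lambda + 11
print(np.sort(np.roots(coeffs)))  # the eigenvalues: [ 1. 11.]
```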

Find the eigenvalues and eigenvectors of this matrix:


Start by finding the eigenvalues. Remember you need to calculate the determinant of the matrix $A−λI$, where $A$ is your matrix

The eigenvalues are $\lambda_1 = 11$ and $\lambda_2 = 1$.

The characteristic polynomial is:

$\det \left[\begin{array}{cc}9-\lambda & 4\\4&3-\lambda \end{array}\right] = (9-\lambda)(3-\lambda) - 4 \cdot 4 = 0$

When you simplify the expression, you get:

$\lambda^2 - 12\lambda + 11 = 0$

Which factors as $(\lambda - 11)(\lambda -1)$

The solutions for the equation are: $\lambda_1 = 11$ and $\lambda_2 = 1$

Now use your results to find the eigenvectors. You'll need to use the eigenvalues you calculated in the last question. In case you didn't get them, they are 11 and 1. Then, for each eigenvector, solve the equation Av=λv, where λ is your eigenvalue, A is your matrix, and v is the eigenvector you're solving for.


The eigenvectors are $(2, 1)$ and $(-1, 2)$.

Correct! By using the eigenvalues you found $\lambda_1 = 11$ and $\lambda_2 = 1$, you solve the system of equations to find the eigenvectors.

$$ \left[\begin{array}{cc}9& 4\\4&3\end{array}\right] \cdot \left[\begin{array}{c}x\\y\end{array}\right] = 11 \cdot \left[\begin{array}{c}x\\y\end{array}\right] $$

Therefore, solving the linear system, you get the relation $x=2y$

$9x + 4y = 11x → 2x = 4y → x = 2y$

$4x + 3y = 11y → 4x = 8y → x = 2y$

So, any (x, y) satisfying such relation is an eigenvector. In this case, $x = 2$ and $y=1$ satisfies. So, the vector $\left[\begin{array}{c} 2\\1 \end{array}\right]$ is an eigenvector for the matrix. For the next eigenvalue, we have to solve the following set of equations:

$$ \left[\begin{array}{cc}9& 4\\4&3\end{array}\right] \cdot \left[\begin{array}{c}x\\y\end{array}\right] = 1 \cdot \left[\begin{array}{c}x\\y\end{array}\right] $$

Leading to the following relationship between x and y: $2x = -y$

$9x + 4y = x → 8x = -4y → 2x = -y$

$4x + 3y = y → 4x = -2y → 2x = -y$

And $x = -1$ and $y=2$ satisfy such a relation, so the vector $\left[\begin{array}{c} -1\\2 \end{array}\right]$ is also an eigenvector for the matrix.
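The worked answer can be cross-checked with `np.linalg.eig`:

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))  # [ 1. 11.]

# The vectors found by hand really satisfy A v = lambda v:
print(np.allclose(A @ np.array([2.0, 1.0]), 11 * np.array([2.0, 1.0])))  # True
print(np.allclose(A @ np.array([-1.0, 2.0]), 1 * np.array([-1.0, 2.0])))  # True
```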

Eigenvalues for 3x3 matrix

This only works for square matrices, since the determinant is only defined for them

Summary:

  • Eigenvalues and eigenvectors are important concepts in linear algebra.
  • Eigenvalues: Eigenvalues are the scaling factors associated with eigenvectors. When a linear transformation is applied to an eigenvector, the resulting vector is simply a scaled version of the original eigenvector. The eigenvalues represent the amount by which the eigenvectors are stretched or compressed under the linear transformation.
  • Eigenvectors: Eigenvectors are the vectors that remain in the same direction (up to scaling) when a linear transformation is applied to them. In other words, they are the vectors that are only stretched or compressed by the linear transformation, without changing their direction. Eigenvectors are associated with specific eigenvalues.
  • To find eigenvalues, you need to solve the characteristic polynomial equation, which is obtained by subtracting the identity matrix multiplied by the eigenvalue from the original matrix.
  • The roots of the characteristic polynomial equation are the eigenvalues of the matrix.
  • Eigenvectors are the vectors that satisfy the equation: matrix times eigenvector equals eigenvalue times eigenvector.
  • To find eigenvectors, you need to solve a system of linear equations derived from the eigenvalue equation.
  • Finding eigenvalues and eigenvectors is similar for both 2x2 and 3x3 matrices.
  • Eigenvalues and eigenvectors can only be computed for square matrices.
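The same recipe carries over to 3x3 matrices; a sketch with an assumed triangular example whose eigenvalues can be read off the diagonal:

```python
import numpy as np

# Assumed 3x3 example; it is triangular, so the eigenvalues
# can be read off the diagonal: 2, 3, and 5.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
vals, vecs = np.linalg.eig(A)
print(np.sort(vals.real))  # [2. 3. 5.]
```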

On the Number of Eigenvectors


Although it won’t affect the calculation, the matrix of the characteristic polynomial has an error where the elements of the second row should be $[-1, 4-\lambda, -0.5]$ instead of $[1, 4-\lambda, 0.5]$

We get three distinct eigenvectors from only two distinct eigenvalues

Again there is a sign error: in the second row, the first and third elements should have opposite signs, and the same holds for the first element of the third row of the determinant matrix

The number of eigenvectors depends on the number of distinct eigenvalues

For a two by two matrix, if the eigenvalues are different, there are always two distinct eigenvectors

If the eigenvalues are the same, there can be either one or two eigenvectors

For a three-by-three matrix, if all eigenvalues are different, there are always three distinct eigenvectors

If one eigenvalue is repeated twice, there can be either two or three eigenvectors

If the same eigenvalue is repeated three times, there can be any number of eigenvectors between one and three
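These counts can be probed numerically: compare $2I$ (repeated eigenvalue, two independent eigenvectors) with a shear (repeated eigenvalue, only one). The rank tolerance is ours, to ignore floating-point noise:

```python
import numpy as np

# Repeated eigenvalue 2, but two independent eigenvectors:
two_i = np.array([[2.0, 0.0],
                  [0.0, 2.0]])
# Repeated eigenvalue 2, but only one independent eigenvector:
shear = np.array([[2.0, 1.0],
                  [0.0, 2.0]])

for name, M in [("2I", two_i), ("shear", shear)]:
    vals, vecs = np.linalg.eig(M)
    # Rank of the returned eigenvector matrix = number of
    # independent eigenvectors (tolerance is ours, for fp noise).
    n = np.linalg.matrix_rank(vecs, tol=1e-8)
    print(name, np.sort(vals), n)
```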

 

All the information here is based on the Linear Algebra for Machine Learning and Data Science | Coursera from DeepLearning.AI

