# Linear Algebra
>[!Abstract] Eigenvalues and Eigenvectors
>The word **eigen** is German for **characteristic**, so when we talk about the eigenvalues and eigenvectors of a matrix, we are talking about finding the characteristic properties of that matrix.
>
>For an $n \times n$ matrix $A$, a nonzero vector $x$ is an **eigenvector** of $A$ if:
>$Ax = \lambda x$
>where $\lambda$ is a scalar, called the **eigenvalue** associated with $x$.
>
>**Eigenvalue**: the scalar factor by which an eigenvector is stretched when the matrix is applied to it.
>
>In machine learning, eigenvectors and eigenvalues are used to reduce noise in data, to improve the efficiency of computationally intensive tasks, to eliminate strongly correlated features, and to reduce over-fitting.
>In other words, an eigenvector is a vector whose direction does not change when the transformation is applied to it; it only becomes a scaled version of the original vector. Vectors with this property are special, and a small set of them can be used to represent a large, high-dimensional matrix.
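The defining property $Ax = \lambda x$ can be checked numerically with NumPy. This is a minimal sketch using an illustrative symmetric $2 \times 2$ matrix (the values are chosen for the example, not taken from the note above):

```python
import numpy as np

# An illustrative symmetric 2x2 matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A x = lambda x for each eigenpair
for i in range(len(eigenvalues)):
    x = eigenvectors[:, i]      # i-th eigenvector (a column of the matrix)
    lam = eigenvalues[i]        # its associated eigenvalue
    assert np.allclose(A @ x, lam * x)

print(sorted(eigenvalues))  # for this matrix: eigenvalues 1 and 3
```

Note that applying $A$ to each eigenvector only rescales it by $\lambda$; the direction is unchanged, which is exactly the "special vector" property described above.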
A decomposition of a square matrix into its eigenvectors and eigenvalues is called an *eigendecomposition*. Non-square matrices cannot be eigendecomposed; they are instead decomposed using a method called *singular value decomposition (SVD)*.
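Both factorizations are available in NumPy. As a brief sketch (with an arbitrary example matrix), SVD factors a non-square matrix $M$ into $U \Sigma V^T$, and multiplying the factors back together recovers $M$:

```python
import numpy as np

# A non-square (3x2) example matrix, so eigendecomposition does not apply
M = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# SVD factors M into U, the singular values S, and V transposed
U, S, Vt = np.linalg.svd(M, full_matrices=False)

# Reconstructing from the factors recovers the original matrix
assert np.allclose(U @ np.diag(S) @ Vt, M)
```

Keeping only the largest singular values (and the corresponding columns of $U$ and rows of $V^T$) gives a low-rank approximation of $M$, which is the mechanism behind the noise-reduction and dimensionality-reduction uses mentioned above.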
## A Mindmap of Linear Algebra
Taken from [[Mathematics for Machine Learning]]
![[linear_alg_mindmap.png]]
## References
- [What are Eigenvalues and Eigenvectors?](https://medium.com/fintechexplained/what-are-eigenvalues-and-eigenvectors-a-must-know-concept-for-machine-learning-80d0fd330e47)