SymmetricMatrixEigensystem Method

Computes the eigenvalues and eigenvectors of the matrix.

**Namespace:**
Meta.Numerics.Matrices
**Assembly:**
Meta.Numerics (in Meta.Numerics.dll) Version: 3.1.0.0 (3.1.0.0)

Syntax

C#
public RealEigensystem Eigensystem()

VB
Public Function Eigensystem As RealEigensystem

C++
public:
RealEigensystem^ Eigensystem()

F#
member Eigensystem : unit -> RealEigensystem

#### Return Value

Type: RealEigensystem
A representation of the eigenvalues and eigenvectors of the matrix.

Remarks

For a generic vector v and matrix M, the product Mv points in some direction with no particular relationship to v.
The eigenvectors of a matrix M are vectors z that satisfy Mz = λz, i.e. multiplying an eigenvector by the
matrix reproduces the same vector, up to a proportionality constant λ called the eigenvalue.

For z to be an eigenvector of M with eigenvalue λ, (M - λI)z = 0. But for a matrix to
annihilate a non-zero vector, that matrix must have zero determinant, so det(M - λI) = 0. For a matrix of
order N, this is a polynomial equation of degree N in λ. Since a degree-N polynomial always has exactly
N roots (counted with multiplicity, over the complex numbers), an order-N matrix always has exactly N eigenvalues.
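The characteristic-polynomial relationship can be checked numerically. The following NumPy sketch (illustrating the mathematics only, not the Meta.Numerics API) compares the roots of det(M - λI) with directly computed eigenvalues for a small symmetric matrix:

```python
# Verify that the roots of det(M - x I) are the eigenvalues of M.
# For M = [[2, 1], [1, 2]], det(M - x I) = x^2 - 4x + 3, with roots 1 and 3.
import numpy as np

M = np.array([[2.0, 1.0], [1.0, 2.0]])

# For a 2x2 matrix, the characteristic polynomial is x^2 - tr(M) x + det(M).
coeffs = [1.0, -np.trace(M), np.linalg.det(M)]
roots = np.sort(np.roots(coeffs))             # roots of the characteristic polynomial
eigenvalues = np.sort(np.linalg.eigvalsh(M))  # eigenvalues computed directly

print(roots)        # [1. 3.]
print(eigenvalues)  # [1. 3.]
```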

An alternative way of expressing the same relationship is to say that the eigenvalues of a matrix are its
diagonal elements when the matrix is expressed in a basis that diagonalizes it. That is, given Z such that Z^{-1}MZ = D,
where D is diagonal, the columns of Z are the eigenvectors of M and the diagonal elements of D are the eigenvalues.
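The diagonalization Z^{-1}MZ = D can likewise be checked numerically; this NumPy sketch (again illustrating the mathematics, not the Meta.Numerics API) builds D from the eigenvector matrix and confirms its diagonal holds the eigenvalues:

```python
# Diagonalize a symmetric matrix: Z^{-1} M Z = D, with eigenvalues on the diagonal.
import numpy as np

M = np.array([[2.0, 1.0], [1.0, 2.0]])
eigenvalues, Z = np.linalg.eigh(M)  # columns of Z are the eigenvectors
D = np.linalg.inv(Z) @ M @ Z        # change of basis to the eigenbasis

print(np.allclose(D, np.diag(eigenvalues)))  # True
```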

Note that the eigenvectors of a matrix are not entirely unique. Given an eigenvector z, any scaled vector αz
is an eigenvector with the same eigenvalue, so eigenvectors are unique only up to rescaling. If an eigenvalue
is degenerate, i.e. there are two or more linearly independent eigenvectors with the same eigenvalue, then any linear
combination of those eigenvectors is also an eigenvector with that eigenvalue, and in fact any set of vectors spanning the
same subspace could be taken as the eigenvector set corresponding to that eigenvalue.
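Degeneracy is easy to see in the extreme case M = 2I, where every non-zero vector is an eigenvector with eigenvalue 2. A brief NumPy sketch (a mathematical illustration, not Meta.Numerics code):

```python
# For the degenerate matrix M = 2I, every non-zero vector is an eigenvector
# with eigenvalue 2, so any basis of the plane serves as an eigenvector set.
import numpy as np

M = 2.0 * np.eye(2)
for v in (np.array([1.0, 0.0]), np.array([1.0, 1.0]), np.array([3.0, -2.0])):
    print(np.allclose(M @ v, 2.0 * v))  # True for each choice of v
```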

The eigenvalues of a symmetric matrix are always real, and its eigenvectors are always orthogonal (within a degenerate
subspace, they can always be chosen orthogonal). The transformation matrix Z is thus orthogonal (Z^{-1} = Z^{T}).
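The orthogonality of Z means its inverse is simply its transpose, which can be verified numerically. A NumPy sketch (mathematical illustration only, not the Meta.Numerics API):

```python
# For a symmetric matrix, the eigenvector matrix Z satisfies Z^T Z = I,
# so Z^{-1} = Z^T and no explicit inversion is needed.
import numpy as np

M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
eigenvalues, Z = np.linalg.eigh(M)  # columns of Z are orthonormal eigenvectors

print(np.allclose(Z.T @ Z, np.eye(3)))      # True: eigenvectors are orthonormal
print(np.allclose(Z.T, np.linalg.inv(Z)))   # True: Z^{-1} = Z^{T}
```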

Finding the eigenvalues and eigenvectors of a symmetric matrix is an O(N^{3}) operation.

If you require only the eigenvalues, not the eigenvectors, of the matrix, the Eigenvalues method
will produce them faster than this method.

See Also