Principal Component Analysis.
#include <shark/Algorithms/Trainers/PCA.h>
Public Types

  enum PCAAlgorithm { STANDARD, SMALL_SAMPLE, AUTO }

Public Types inherited from shark::AbstractUnsupervisedTrainer< LinearModel<> >

  typedef LinearModel<> ModelType
  typedef Model::InputType InputType
  typedef UnlabeledData< InputType > DatasetType

Public Member Functions

  PCA (bool whitening=false)
  PCA (UnlabeledData< RealVector > const &inputs, bool whitening=false)
  std::string name () const
      From INameable: return the class name.
  void setWhitening (bool whitening)
  void train (LinearModel<> &model, UnlabeledData< RealVector > const &inputs)
  SHARK_EXPORT_SYMBOL void setData (UnlabeledData< RealVector > const &inputs)
  SHARK_EXPORT_SYMBOL void encoder (LinearModel<> &model, std::size_t m=0)
  SHARK_EXPORT_SYMBOL void decoder (LinearModel<> &model, std::size_t m=0)
  RealVector const & eigenvalues () const
  double eigenvalue (std::size_t i) const
      Returns the ith eigenvalue.
  RealMatrix const & eigenvectors () const
  RealVector const & mean () const
      Mean of the last training.

Public Member Functions inherited from shark::AbstractUnsupervisedTrainer< LinearModel<> >

  virtual void train (ModelType &model, DatasetType const &inputset)=0
      Core of the Trainer interface.

Public Member Functions inherited from shark::INameable

  virtual ~INameable ()

Public Member Functions inherited from shark::ISerializable

  virtual ~ISerializable ()
      Virtual destructor.
  virtual void read (InArchive &archive)
      Read the component from the supplied archive.
  virtual void write (OutArchive &archive) const
      Write the component to the supplied archive.
  void load (InArchive &archive, unsigned int version)
      Versioned loading of components; calls read(...).
  void save (OutArchive &archive, unsigned int version) const
      Versioned storing of components; calls write(...).
  BOOST_SERIALIZATION_SPLIT_MEMBER ()

Protected Attributes

  bool m_whitening
      Normalize variance (yes/no).
  RealMatrix m_eigenvectors
      Eigenvectors of the last training.
  RealVector m_eigenvalues
      Eigenvalues of the last training.
  RealVector m_mean
      Mean of the last training.
  std::size_t m_n
      Number of attributes.
  std::size_t m_l
      Number of training data points.
  PCAAlgorithm m_algorithm
      Whether to use the design matrix or its transpose for building the covariance matrix.
Principal Component Analysis.
The Principal Component Analysis, also known as the Karhunen-Loève transformation, takes a symmetric \( n \times n \) matrix \( A \) (in practice, the covariance matrix of the data) and uses its decomposition
\( A = \Gamma \Lambda \Gamma^T, \)
where \( \Lambda \) is the diagonal matrix of eigenvalues of \( A \) and \( \Gamma \) is the orthogonal matrix with the corresponding eigenvectors as columns. \( \Gamma \) then defines a successive orthogonal rotation that maximizes the variances of the coordinates, i.e., the coordinate system is rotated such that the correlation between the new axes becomes zero. If there are \( p \) axes, the first axis is rotated so that the points projected onto it have maximum variance. The remaining \( p - 1 \) axes are then rotated in turn so that each new axis covers a maximum part of the variance not already covered by the preceding axes. After \( p - 1 \) axes have been rotated, the orientation of axis \( p \) is fixed. A typical application of PCA is dimensionality reduction: the components with the smallest eigenvalues (i.e., the least variance) are discarded. Furthermore, the eigenvalues may be rescaled to one, resulting in a whitening of the data.
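The decomposition above can be illustrated with a minimal, self-contained sketch. This is not Shark code and all names here are hypothetical: it builds the covariance matrix of a small data set (after centering with the mean) and extracts the leading eigenvector, i.e. the first principal component, by power iteration.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative only (hypothetical helper, not Shark API): compute the
// covariance matrix A of a non-empty data set and return the eigenvector
// of its largest eigenvalue, found via power iteration.
std::vector<double> leadingComponent(const std::vector<std::vector<double>>& data) {
    std::size_t l = data.size();     // number of data points
    std::size_t n = data[0].size();  // number of attributes
    // mean of the data
    std::vector<double> mean(n, 0.0);
    for (const auto& x : data)
        for (std::size_t j = 0; j < n; ++j) mean[j] += x[j] / l;
    // covariance matrix A = (1/l) * sum (x - mean)(x - mean)^T
    std::vector<std::vector<double>> A(n, std::vector<double>(n, 0.0));
    for (const auto& x : data)
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = 0; j < n; ++j)
                A[i][j] += (x[i] - mean[i]) * (x[j] - mean[j]) / l;
    // power iteration: repeatedly apply A and normalize; v converges to
    // the eigenvector belonging to the largest eigenvalue (the direction
    // of maximum variance)
    std::vector<double> v(n, 1.0);
    for (int iter = 0; iter < 200; ++iter) {
        std::vector<double> w(n, 0.0);
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = 0; j < n; ++j) w[i] += A[i][j] * v[j];
        double norm = 0.0;
        for (double wi : w) norm += wi * wi;
        norm = std::sqrt(norm);
        if (norm == 0.0) break;  // degenerate all-zero covariance
        for (std::size_t i = 0; i < n; ++i) v[i] = w[i] / norm;
    }
    return v;
}
```

For data spread only along the first coordinate axis, the sketch recovers that axis (up to sign) as the first principal component.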
PCA (bool whitening=false)  [inline]
Constructor. The parameter defines whether the model should also whiten the data.
Definition at line 83 of file PCA.h.
References AUTO, and m_algorithm.
SHARK_EXPORT_SYMBOL void shark::PCA::decoder (LinearModel<> &model, std::size_t m = 0)
double eigenvalue (std::size_t i) const  [inline]
Returns ith eigenvalue.
Definition at line 142 of file PCA.h.
References m_eigenvalues, m_l, and SIZE_CHECK.
Referenced by main().
RealVector const & eigenvalues () const  [inline]
Eigenvalues of last training. The number of eigenvalues is equal to the minimum of the input dimensions (i.e., number of attributes) and the number of data points used for training the PCA.
Definition at line 138 of file PCA.h.
References m_eigenvalues.
Referenced by main().
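Since each eigenvalue equals the variance along its principal direction, the eigenvalue spectrum is what one inspects to decide how many components to keep. The following hypothetical helper (not part of Shark) sketches the usual cumulative-explained-variance rule, assuming the eigenvalues are sorted in descending order as returned by eigenvalues():

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical helper, not Shark API: return the smallest number m of
// leading components whose summed eigenvalues reach the requested
// fraction of the total variance.
std::size_t chooseComponents(const std::vector<double>& eigenvalues, double fraction) {
    double total = 0.0;
    for (double ev : eigenvalues) total += ev;
    double cumulative = 0.0;
    for (std::size_t m = 0; m < eigenvalues.size(); ++m) {
        cumulative += eigenvalues[m];
        if (cumulative >= fraction * total) return m + 1;  // keep m+1 components
    }
    return eigenvalues.size();
}
```

The resulting m could then be passed to encoder() or decoder() to build a reduced-dimension model.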
RealMatrix const & eigenvectors () const  [inline]
Eigenvectors of last training. The number of eigenvectors is equal to the minimum of the input dimensions (i.e., number of attributes) and the number of data points used for training the PCA.
Definition at line 153 of file PCA.h.
References m_eigenvectors.
SHARK_EXPORT_SYMBOL void shark::PCA::encoder (LinearModel<> &model, std::size_t m = 0)
std::string name () const  [inline, virtual]
From INameable: return the class name.
Reimplemented from shark::INameable.
SHARK_EXPORT_SYMBOL void shark::PCA::setData (UnlabeledData< RealVector > const &inputs)
void setWhitening (bool whitening)  [inline]
If set to true, the encoded data has unit variance along the new coordinates.
Definition at line 103 of file PCA.h.
References m_whitening.
Referenced by main().
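The effect of whitening can be shown in isolation. This is an illustrative plain-C++ sketch with hypothetical names, not Shark code: dividing each projected coordinate by the square root of the corresponding eigenvalue rescales that direction to unit variance, which is exactly the normalization this flag enables.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical helper, not Shark API: whiten one projected coordinate by
// dividing by the square root of its eigenvalue (the variance along that
// principal direction).
std::vector<double> whiten(const std::vector<double>& projected, double eigenvalue) {
    std::vector<double> out;
    for (double c : projected) out.push_back(c / std::sqrt(eigenvalue));
    return out;
}

// Population variance of a sample, used to check the result.
double variance(const std::vector<double>& x) {
    double mean = 0.0;
    for (double v : x) mean += v / x.size();
    double var = 0.0;
    for (double v : x) var += (v - mean) * (v - mean) / x.size();
    return var;
}
```

A coordinate whose variance equals its eigenvalue has variance exactly one after whitening.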
void train (LinearModel<> &model, UnlabeledData< RealVector > const &inputs)  [inline, virtual]
Train the model to perform PCA. The model must be a LinearModel object with offset, and its output dimension defines the number of principal components to represent (the reduced dimensionality). The model returned is the one given by the encoder() function (i.e., the mapping from the original input space to the PCA coordinate system).
Definition at line 113 of file PCA.h.
References encoder(), shark::Shape::numElements(), shark::LinearModel< InputType, ActivationFunction >::outputShape(), and setData().
Referenced by main().