Public Member Functions

| Method | Description |
| --- | --- |
| __construct (?float $totalVariance = null, ?int $numFeatures = null) | Linear Discriminant Analysis (LDA) is used to reduce the dimensionality of the data. |
| fit (array $data, array $classes) | Trains the algorithm to transform the given data to a lower dimensional space. |
| transform (array $sample) | Transforms the given sample to a lower dimensional vector by using the eigenvectors obtained in the last run of fit. |
Public Attributes

| Type | Attribute | Description |
| --- | --- | --- |
| array | $counts = [] | |
| bool | $fit = false | |
| array | $labels = [] | |
| array | $means = [] | |
| int | $numFeatures = null | Number of features to be preserved after the reduction. |
| float[] | $overallMean = [] | |
| float | $totalVariance = 0.9 | Total variance to be conserved after the reduction. |
Protected Member Functions

| Method | Description |
| --- | --- |
| calculateClassCov () | Returns the between-class scatter matrix, an n by m matrix where n is the number of classes and m is the number of columns (see the sketch after this table). |
| calculateClassVar (array $data, array $classes) | Returns the within-class scatter matrix, an n by m matrix where n is the number of classes and m is the number of columns. |
| calculateMeans (array $data, array $classes) | Calculates the mean of each column for each class and returns an n by m matrix where n is the number of labels and m is the number of columns. |
| calculateVar (array $row, array $means) | Returns the result of the calculation (x - m)ᵀ · (x - m). |
| eigenDecomposition (array $matrix) | Calculates the eigenvalues and eigenvectors of the given matrix. |
| getLabels (array $classes) | Returns the unique labels in the dataset. |
| reduce (array $data) | Returns the reduced data. |
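Taken together, calculateMeans(), calculateClassVar() and calculateClassCov() build the within-class and between-class scatter matrices that LDA works with. The sketch below shows what those two quantities are for plain PHP arrays; it is an illustration of the underlying idea only, not the library's Matrix-based implementation, and the function name scatterMatrices is invented for the example.

```php
// Illustrative only: within-class (S_W) and between-class (S_B) scatter
// matrices for numeric samples grouped by class label.
function scatterMatrices(array $data, array $classes): array
{
    $d = count($data[0]);
    $n = count($data);
    $zeros = array_fill(0, $d, array_fill(0, $d, 0.0));

    // Group samples by class label and accumulate the overall mean.
    $overall = array_fill(0, $d, 0.0);
    $byClass = [];
    foreach ($data as $i => $x) {
        $byClass[$classes[$i]][] = $x;
        foreach ($x as $j => $v) {
            $overall[$j] += $v / $n;
        }
    }

    $sw = $zeros; // within-class scatter
    $sb = $zeros; // between-class scatter
    foreach ($byClass as $samples) {
        // Mean of this class.
        $mean = array_fill(0, $d, 0.0);
        foreach ($samples as $x) {
            foreach ($x as $j => $v) {
                $mean[$j] += $v / count($samples);
            }
        }
        // S_W: sum over the class samples of (x - mean_c)(x - mean_c)^T.
        foreach ($samples as $x) {
            for ($a = 0; $a < $d; $a++) {
                for ($b = 0; $b < $d; $b++) {
                    $sw[$a][$b] += ($x[$a] - $mean[$a]) * ($x[$b] - $mean[$b]);
                }
            }
        }
        // S_B: n_c * (mean_c - overallMean)(mean_c - overallMean)^T.
        $nc = count($samples);
        for ($a = 0; $a < $d; $a++) {
            for ($b = 0; $b < $d; $b++) {
                $sb[$a][$b] += $nc * ($mean[$a] - $overall[$a]) * ($mean[$b] - $overall[$b]);
            }
        }
    }

    return [$sw, $sb];
}
```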
Protected Attributes

| Type | Attribute | Description |
| --- | --- | --- |
| array | $eigValues = [] | Top eigenvalues of the matrix. |
| array | $eigVectors = [] | Top eigenvectors of the matrix. |
◆ __construct()

| Phpml\DimensionReduction\LDA::__construct ( ?float $totalVariance = null, ?int $numFeatures = null ) |

Linear Discriminant Analysis (LDA) is used to reduce the dimensionality of the data.

Unlike Principal Component Analysis (PCA), it is a supervised technique that requires the class labels in order to fit the data to a lower dimensional space.

The algorithm can be initialized by specifying either $totalVariance (a value between 0.1 and 0.99) or $numFeatures (the number of features to be preserved).

- Parameters
  | float|null | $totalVariance | Total explained variance to be preserved |
  | int|null | $numFeatures | Number of features to be preserved |
- Exceptions
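As an illustration, the transformer can be initialized in either of the following ways (a minimal sketch; the values 0.9 and 2 are arbitrary examples):

```php
use Phpml\DimensionReduction\LDA;

// Keep enough discriminants to explain at least 90% of the total variance ...
$lda = new LDA(0.9);

// ... or ask for a fixed number of preserved features instead.
$lda = new LDA(null, 2);
```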
◆ calculateVar()

| Phpml\DimensionReduction\LDA::calculateVar ( array $row, array $means ) |

protected

Returns the result of the calculation (x - m)ᵀ · (x - m).
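For a 1-by-n row vector x and mean vector m this is simply the outer product of the difference with itself, an n-by-n matrix. A minimal sketch with plain arrays (illustrative only; the helper name outerDiff is invented for the example and does not mirror the library's Matrix-based code):

```php
// Illustrative only: computes (x - m)^T (x - m) for two 1-by-n row vectors,
// giving an n-by-n matrix of pairwise products of the differences.
function outerDiff(array $x, array $m): array
{
    $result = [];
    foreach ($x as $i => $xi) {
        foreach ($x as $j => $xj) {
            $result[$i][$j] = ($xi - $m[$i]) * ($xj - $m[$j]);
        }
    }

    return $result;
}
```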
◆ eigenDecomposition()

| Phpml\DimensionReduction\EigenTransformerBase::eigenDecomposition ( array $matrix ) |

protected, inherited

Calculates the eigenvalues and eigenvectors of the given matrix.

Returns the top eigenvectors along with the largest eigenvalues. The total explained variance of these eigenvectors will be no less than the desired $totalVariance value.
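A hedged sketch of the selection rule described above: sort the eigenpairs by eigenvalue and keep them until the cumulative explained variance reaches $totalVariance. The eigendecomposition itself is assumed to have been done elsewhere, and the helper name selectTopEigenPairs is invented for the example:

```php
// Illustrative only: pick the top eigenpairs whose cumulative explained
// variance is at least $totalVariance. $eigValues and $eigVectors are
// assumed to be parallel arrays produced by an eigendecomposition.
function selectTopEigenPairs(array $eigValues, array $eigVectors, float $totalVariance): array
{
    arsort($eigValues);            // largest eigenvalues first, keys preserved
    $sum = array_sum($eigValues);

    $explained = 0.0;
    $values = [];
    $vectors = [];
    foreach ($eigValues as $index => $value) {
        $explained += $value / $sum;
        $values[] = $value;
        $vectors[] = $eigVectors[$index];
        if ($explained >= $totalVariance) {
            break;
        }
    }

    return [$values, $vectors];
}
```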
◆ transform()

| Phpml\DimensionReduction\LDA::transform ( array $sample ) |

Transforms the given sample to a lower dimensional vector by using the eigenvectors obtained in the last run of fit.

- Exceptions
  | InvalidOperationException | |
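A minimal end-to-end sketch (the toy samples and labels below are invented for illustration): fit on labelled data first, then project a new sample. Calling transform() before fit() would raise the InvalidOperationException listed above.

```php
use Phpml\DimensionReduction\LDA;

// Toy three-dimensional samples with two class labels (made-up data).
$samples = [
    [1.0, 2.0, 1.5],
    [1.2, 1.9, 1.4],
    [8.0, 8.5, 7.9],
    [7.8, 8.1, 8.2],
];
$labels = ['a', 'a', 'b', 'b'];

$lda = new LDA(null, 1);          // reduce to a single discriminant
$lda->fit($samples, $labels);     // supervised: class labels are required

$reduced = $lda->transform([1.1, 2.1, 1.6]);
```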
The documentation for this class was generated from the following file:
- lib/mlbackend/php/phpml/src/Phpml/DimensionReduction/LDA.php