In this video, we learn how to use Principal Component Analysis (PCA) to find a transformation matrix that minimises reconstruction error for dimensionality reduction. We also show how this…
Machine Learning Practical (MLP), Lecture 06, Clip 03 / 05.
Course code: INFR11132
Licence: All rights reserved
Language: English
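The clip itself is not transcribed here, but the idea in the description can be sketched briefly: PCA chooses the top-k eigenvectors of the data covariance as the columns of the transformation matrix, which minimises the mean squared reconstruction error over all rank-k linear projections. A minimal numpy illustration (not the lecture's own code; the toy data and variable names are assumptions):

```python
import numpy as np

# Illustrative PCA sketch: project centred data onto the top-k
# eigenvectors of the sample covariance matrix, then reconstruct.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))           # toy data: 200 samples, 5 features

mu = X.mean(axis=0)
Xc = X - mu                             # centre the data
cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix

eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # sort directions by variance, descending
W = eigvecs[:, order[:2]]               # transformation matrix, k = 2

Z = Xc @ W                              # low-dimensional projection
X_hat = Z @ W.T + mu                    # reconstruction from k components
err = np.mean((X - X_hat) ** 2)         # mean squared reconstruction error
```

The reconstruction error equals the variance left in the discarded directions, which is why keeping the largest-eigenvalue eigenvectors is the error-minimising choice.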
Related videos:
- Unsupervised Learning (Course code: GLHE11086; Licence: Creative Commons - Attribution ShareAlike)
- Properties of eigenfaces (Licence: Creative Commons - Attribution)
- Pros and cons of dimensionality reduction (Licence: Creative Commons - Attribution)
- Principal component analysis (Licence: Creative Commons - Attribution)
- Linear discriminant analysis (Licence: Creative Commons - Attribution)
- Classification with PCA features (Licence: Creative Commons - Attribution)
- When principal components fail (Licence: Creative Commons - Attribution)
- Eigenface representation (Licence: Creative Commons - Attribution)
- Eigen-faces (Licence: Creative Commons - Attribution)
- Principal component analysis for the impatient (Licence: Creative Commons - Attribution)
- How many principal components to use (Licence: Creative Commons - Attribution)
- Eigenvalue = variance along eigenvector (Licence: Creative Commons - Attribution)
- Eigenvector = direction of maximum variance (Licence: Creative Commons - Attribution)
- Low-dimensional projections of data (Licence: Creative Commons - Attribution)