McqMate

Q. ________ performs a PCA with non-linearly separable data sets.

A. SparsePCA

B. KernelPCA

C. SVD

D. None of the mentioned

Answer» B. KernelPCA
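A minimal sketch (not part of the original quiz) of why KernelPCA is the right answer, using scikit-learn's `KernelPCA` on two concentric circles, a classic non-linearly separable data set. Ordinary (linear) PCA cannot separate the classes, while an RBF kernel maps them into a feature space where they become nearly linearly separable. The `gamma=10` value is an illustrative choice, not prescribed by the question.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

# Two concentric circles: not linearly separable in the original 2-D space.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

# Ordinary PCA is a linear projection, so the two classes stay mixed.
X_pca = PCA(n_components=2).fit_transform(X)

# KernelPCA with an RBF kernel projects the data via a non-linear feature
# map, where the two circles separate along the leading components.
X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10).fit_transform(X)

def separation(Z, y):
    """Distance between class means along the first component."""
    return abs(Z[y == 0, 0].mean() - Z[y == 1, 0].mean())

print(f"linear PCA separation: {separation(X_pca, y):.3f}")
print(f"kernel PCA separation: {separation(X_kpca, y):.3f}")
```

The kernel projection shows a much larger gap between the class means than the linear one, which is exactly the property the question is testing.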

View all MCQs in Machine Learning (ML)

- The most popularly used dimensionality reduction algorithm is Principal Component Analysis (PCA). Which of the following is/are true about PCA? 1. PCA is an unsupervised method 2. It searches for the directions in which the data have the largest variance 3. Maximum number of principal components <= number of features 4. All principal components are orthogonal to each other
- ________ performs a PCA with non-linearly separable data sets.
- In a real problem, you should check to see if the SVM is separable and then include slack variables if it is not separable.
- PCA works better if there is 1. A linear structure in the data 2. If the data lies on a curved surface and not on a flat surface 3. If variables are scaled in the same unit
- Suppose we train a hard-margin linear SVM on n > 100 data points in R2, yielding a hyperplane with exactly 2 support vectors. If we add one more data point and retrain the classifier, what is the maximum possible number of support vectors for the new hyperplane (assuming the n + 1 points are linearly separable)?
- PCA can be used for projecting and visualizing data in lower dimensions.
- In PCA, the number of input dimensions is equal to the number of principal components
- PCA is
- What would you do in PCA to get the same projection as SVD?
- What is PCA, KPCA and ICA used for?