McqMate

Q. Which of the following is true about “Ridge” or “Lasso” regression methods in the case of feature selection?

A. Ridge regression uses subset selection of features

B. Lasso regression uses subset selection of features

C. Both use subset selection of features

D. None of the above

Answer» B. Lasso regression uses subset selection of features

The L1 penalty used by Lasso can drive some coefficients exactly to zero, which effectively selects a subset of features; Ridge's L2 penalty only shrinks coefficients toward zero without eliminating any of them.
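A minimal numeric sketch of why this holds, assuming the one-dimensional closed forms of the two penalized problems (not the full regression solvers): the ridge solution rescales a coefficient, while the lasso solution soft-thresholds it, so small coefficients land exactly at zero only under the L1 penalty.

```python
import numpy as np

def ridge_shrink(b, lam):
    # Closed-form minimizer of 0.5*(w - b)^2 + 0.5*lam*w^2 (L2 penalty):
    # shrinks toward zero but never reaches it for b != 0.
    return b / (1.0 + lam)

def lasso_shrink(b, lam):
    # Closed-form minimizer of 0.5*(w - b)^2 + lam*|w| (L1 penalty):
    # soft-thresholding sets |b| <= lam exactly to zero.
    return np.sign(b) * max(abs(b) - lam, 0.0)

coefs = [2.0, 0.3, -0.1]
lam = 0.5
print([ridge_shrink(b, lam) for b in coefs])  # all shrunk, none exactly zero
print([lasso_shrink(b, lam) for b in coefs])  # small coefficients become exactly zero
```

With `lam = 0.5`, ridge keeps all three coefficients nonzero, while lasso zeroes out 0.3 and -0.1 and keeps only the large coefficient, which is the "subset selection" behavior the answer refers to.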

View all MCQs in Machine Learning (ML)

- Which of the following is true about “Ridge” or “Lasso” regression methods in case of feature selection?
- Suppose we fit “Lasso Regression” to a data set which has 100 features (X1, X2 … X100). Now we rescale one of these features by multiplying it by 10 (say that feature is X1), and then refit Lasso regression with the same regularization parameter. Which of the following options will be correct?
- We usually use feature normalization before using the Gaussian kernel in SVM. What is true about feature normalization? 1. We do feature normalization so that new feature will dominate other 2. Sometimes, feature normalization is not feasible in case of categorical variables 3. Feature normalization always helps when we use Gaussian kernel in SVM