Q. When feature selection is applied to the data, does the number of features decrease?
A. No
B. Yes
Answer» B. Yes
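A minimal sketch of the idea, assuming scikit-learn is available (the synthetic dataset and the choice of k=5 are purely illustrative): a feature-selection step keeps only a subset of columns, so the feature count drops.

```python
# Illustrative only: feature selection reduces the number of columns.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data with 20 features (sizes chosen for demonstration).
X, y = make_classification(n_samples=200, n_features=20, random_state=0)
print(X.shape)  # (200, 20)

# Keep only the 5 features most associated with the target.
selector = SelectKBest(score_func=f_classif, k=5)
X_selected = selector.fit_transform(X, y)
print(X_selected.shape)  # (200, 5) -- fewer features after selection
```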
Related MCQs
Which of the following statement(s) can be true after adding a variable to a linear regression model?
1. R-squared and adjusted R-squared both increase
2. R-squared increases and adjusted R-squared decreases
3. R-squared decreases and adjusted R-squared decreases
4. R-squared decreases and adjusted R-squared increases
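For intuition, here is a small sketch (assuming scikit-learn and NumPy; the synthetic data and coefficients are made up for demonstration). Under ordinary least squares, training R-squared never decreases when a variable is added, while adjusted R-squared penalizes the extra predictor and can drop:

```python
# Illustrative only: compare R^2 and adjusted R^2 before and after
# adding a pure-noise variable to a linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=n)

def adjusted_r2(r2, n, p):
    # Adjusted R^2 penalizes R^2 for the number of predictors p.
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

for X_cur in (X, np.column_stack([X, rng.normal(size=n)])):  # add noise column
    p = X_cur.shape[1]
    r2 = r2_score(y, LinearRegression().fit(X_cur, y).predict(X_cur))
    print(p, round(r2, 4), round(adjusted_r2(r2, n, p), 4))
# Training R^2 never decreases with an added variable; adjusted R^2 may drop.
```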
We usually use feature normalization before using the Gaussian kernel in SVM. What is true about feature normalization?
1. We do feature normalization so that a new feature will dominate the others
2. Sometimes, feature normalization is not feasible in the case of categorical variables
3. Feature normalization always helps when we use a Gaussian kernel in SVM
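A minimal sketch of the workflow the question refers to, assuming scikit-learn (the dataset is synthetic and the pipeline is one common way to do this, not the only one). Standardization rescales each numeric feature so that no single feature dominates the RBF distance computation; it is not meaningful for raw categorical variables, which is why statement 2 comes up:

```python
# Illustrative only: feature normalization before an RBF (Gaussian) SVM.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Scale each feature to zero mean / unit variance, then fit the RBF SVM.
# Note: this scaling applies to numeric features; categorical columns
# would need a different encoding first.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```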
Suppose we fit “Lasso Regression” to a data set which has 100 features (X1, X2, …, X100). Now we rescale one of these features by multiplying it by 10 (say that feature is X1) and then refit the Lasso regression with the same regularization parameter. Which of the following options will be correct?
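A minimal sketch of that experiment, assuming scikit-learn and NumPy (alpha=0.1 and the synthetic data are arbitrary choices for demonstration): rescale X1 by 10, refit with the same regularization parameter, and compare the coefficient on X1.

```python
# Illustrative only: how rescaling one feature affects its Lasso coefficient.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))            # 100 features X1..X100
y = X[:, 0] * 2.0 + rng.normal(size=200)   # X1 carries the signal

coef_before = Lasso(alpha=0.1).fit(X, y).coef_[0]

X_rescaled = X.copy()
X_rescaled[:, 0] *= 10                     # rescale X1 by a factor of 10
coef_after = Lasso(alpha=0.1).fit(X_rescaled, y).coef_[0]

# After rescaling, a ~10x smaller coefficient gives the same fitted values,
# so the L1 penalty paid for X1 shrinks and X1 is less likely to be
# driven to zero.
print(coef_before, coef_after)
```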