Machine Learning (ML)
Q. A feature can be used as a
A. binary split
B. predictor
C. both A and B
D. none of the above
Answer» C. both A and B
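The answer points at the two roles a feature can play. The following is a minimal sketch (scikit-learn is an assumed choice, since the question names no library, and all variable names are illustrative) showing one feature x used first as a binary split in a depth-1 decision tree and then as a predictor in a linear regression.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(200, 1))            # one feature, 200 samples
y_class = (x[:, 0] > 5).astype(int)              # labels separable by a single threshold
y_reg = 3.0 * x[:, 0] + rng.normal(0, 1, 200)    # continuous target driven by x

# Role 1: the feature supplies a binary split (x <= t versus x > t).
stump = DecisionTreeClassifier(max_depth=1).fit(x, y_class)
print(export_text(stump, feature_names=["x"]))   # prints the learned threshold

# Role 2: the same feature acts as a predictor of a continuous target.
reg = LinearRegression().fit(x, y_reg)
print("slope:", reg.coef_[0], "intercept:", reg.intercept_)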
Related MCQs
We usually use feature normalization before using the Gaussian kernel in SVM. What is true about feature normalization? 1. We do feature normalization so that the new feature will dominate the other features. 2. Sometimes, feature normalization is not feasible in the case of categorical variables. 3. Feature normalization always helps when we use the Gaussian kernel in SVM. (See the sketch after this list.)
A feature F1 can take certain values: A, B, C, D, E, and F, and represents the grade of students from a college. Here the feature type is
In which type of feature selection method do we start with an empty feature set?
Suppose we fit "Lasso Regression" to a data set which has 100 features (X1, X2, …, X100). Now we rescale one of these features by multiplying it by 10 (say that feature is X1), and then refit Lasso regression with the same regularization parameter. Now, which of the following options will be correct?
Let's say you are working with categorical feature(s) and you have not looked at the distribution of the categorical variable in the test data. You want to apply one-hot encoding (OHE) on the categorical feature(s). What challenges may you face if you have applied OHE on a categorical variable of the train dataset?
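The first related MCQ above concerns normalization before a Gaussian (RBF) kernel SVM. The sketch below (scikit-learn assumed, data synthetic) shows why scale matters: the RBF kernel is computed from squared distances, so an unscaled large-range feature dominates the kernel value. It also hints at statement 2: StandardScaler operates on numeric columns only, which is why this kind of normalization is not directly feasible for categorical variables.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.normal(0, 1, 300),       # informative feature on a small scale
    rng.normal(0, 1000, 300),    # uninformative feature on a much larger scale
])
y = (X[:, 0] > 0).astype(int)    # only the small-scale feature carries signal

# StandardScaler puts both features on a comparable scale before the RBF SVM,
# so the informative feature is not drowned out in the distance computation.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)
print("train accuracy:", model.score(X, y))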