McqMate

Q.
## In a linear regression problem, we are using “R-squared” to measure goodness-of-fit. We add a feature to the linear regression model and retrain it. Which of the following options is true?

A. If R-squared increases, this variable is significant.

B. If R-squared decreases, this variable is not significant.

C. Individually, R-squared cannot tell us about variable importance; we can't say anything about it right now.

D. None of these.

Answer» C. Individually, R-squared cannot tell us about variable importance; we can't say anything about it right now.
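Why C is correct: in ordinary least squares, R-squared can never decrease when a column is added to the design matrix, so an increase by itself says nothing about whether the new variable is significant. The following numpy-only sketch (not part of the original question; data and helper function are illustrative) fits OLS via `np.linalg.lstsq` and shows that appending a pure-noise feature still does not lower R-squared:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=(n, 1))
y = 3 * x[:, 0] + rng.normal(size=n)

def r_squared(X, y):
    # Fit OLS with an intercept and return R^2 = 1 - SS_res / SS_tot.
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

r2_base = r_squared(x, y)
noise = rng.normal(size=(n, 1))  # an irrelevant, pure-noise feature
r2_more = r_squared(np.column_stack([x, noise]), y)

# R^2 is non-decreasing in the number of regressors under OLS.
assert r2_more >= r2_base - 1e-12
print(round(r2_base, 4), round(r2_more, 4))
```

This is why adjusted R-squared, which penalizes the number of regressors, is the usual fix when comparing models of different sizes.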

View all MCQs in Machine Learning (ML):

- What is/are true about ridge regression? 1. When lambda is 0, the model works like a linear regression model 2. When lambda is 0, the model doesn't work like a linear regression model 3. When lambda goes to infinity, we get very, very small coefficients approaching 0 4. When lambda goes to infinity, we get very, very large coefficients approaching infinity
- We usually use feature normalization before using the Gaussian kernel in SVM. What is true about feature normalization? 1. We do feature normalization so that the new feature will dominate others 2. Sometimes, feature normalization is not feasible in the case of categorical variables 3. Feature normalization always helps when we use the Gaussian kernel in SVM
- Suppose we train a hard-margin linear SVM on n > 100 data points in R², yielding a hyperplane with exactly 2 support vectors. If we add one more data point and retrain the classifier, what is the maximum possible number of support vectors for the new hyperplane (assuming the n + 1 points are linearly separable)?
- Suppose we fit “Lasso Regression” to a data set which has 100 features (X1, X2, …, X100). Now, we rescale one of these features by multiplying it by 10 (say that feature is X1), and then refit Lasso regression with the same regularization parameter. Which of the following options will be correct?