Q. If two variables are correlated, is it necessary that they have a linear relationship?
A. Yes
B. No
Answer» B. No
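A brief illustration of why the answer is "No": correlation only measures the strength of (roughly monotonic) association, so two variables can be strongly correlated even when their relationship is clearly nonlinear. The following Python sketch is illustrative and not part of the original question; the cubic relationship, sample size, and variable names are assumptions chosen for the example.

```python
# Minimal sketch (hypothetical example): x and y = x**3 have a nonlinear
# relationship, yet their Pearson correlation is high.
import numpy as np

rng = np.random.default_rng(0)          # fixed seed, for reproducibility
x = rng.uniform(-3, 3, size=1000)       # illustrative sample
y = x ** 3                              # monotonic but nonlinear relationship

# Pearson correlation coefficient between x and y
r = np.corrcoef(x, y)[0, 1]
print(f"Pearson correlation between x and x**3: {r:.3f}")  # ~0.9, despite nonlinearity
```

So a strong correlation does not imply that the underlying relationship is a straight line; it only indicates association.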
Related MCQs
Which of the following is/are true about weak learners used in ensemble models? 1. They have low variance and they don’t usually overfit 2. They have high bias, so they cannot solve hard learning problems 3. They have high variance and they don’t usually overfit
Suppose you find that your linear regression model is underfitting the data. In such a situation, which of the following options would you consider? 1. I will add more variables 2. I will start introducing polynomial degree variables 3. I will remove some variables
Correlated variables can have zero correlation coefficient. True or False?
A term used to describe the case when the independent variables in a multiple regression model are correlated is
We have been given a dataset with n records in which we have input attribute x and output attribute y. Suppose we use a linear regression method to model this data. To test our linear regressor, we split the data into a training set and a test set randomly. Now we increase the training set size gradually. As the training set size increases, what do you expect will happen with the mean training error?
We have been given a dataset with n records in which we have input attribute x and output attribute y. Suppose we use a linear regression method to model this data. To test our linear regressor, we split the data into a training set and a test set randomly. What do you expect will happen with the bias and variance as you increase the size of the training data?