McqMate

## What are support vectors?

A. all the examples that have a non-zero weight αk in an SVM

B. the only examples necessary to compute f(x) in an SVM

C. all of the above

D. none of the above

Answer» C. all of the above
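Both statements describe the same set of points: in the SVM dual formulation, only the support vectors carry a non-zero weight αk, so the decision function f(x) = Σ αk·yk·⟨xk, x⟩ + b can be evaluated from them alone. A minimal sketch, using a hypothetical toy 1-D problem whose dual solution is worked out by hand (the data, the α values, and the bias below are assumptions for illustration, not output of a real solver):

```python
# Toy hard-margin problem in 1-D: x = -1 labelled -1, x = +1 labelled +1.
# For this data the KKT conditions give alpha = [0.5, 0.5] and b = 0,
# so both training points are support vectors (non-zero alpha).
X = [-1.0, 1.0]     # training points
y = [-1, 1]         # class labels
alpha = [0.5, 0.5]  # dual weights; alpha_k > 0 marks a support vector
b = 0.0             # bias term

def f(x):
    """Decision function: the sum runs over support vectors only,
    since every term with alpha_k == 0 vanishes."""
    return sum(a * yi * (xi * x)               # alpha_k * y_k * <x_k, x>
               for a, yi, xi in zip(alpha, y, X)
               if a != 0) + b

print(f(2.0))   # positive -> classified +1
print(f(-0.5))  # negative -> classified -1
```

For this toy solution f(x) reduces to f(x) = x, which is why dropping the zero-weight examples changes nothing: the non-support-vector terms never contribute to the sum.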

View all MCQs in Machine Learning (ML)

- Suppose we train a hard-margin linear SVM on n > 100 data points in R2, yielding a hyperplane with exactly 2 support vectors. If we add one more data point and retrain the classifier, what is the maximum possible number of support vectors for the new hyperplane (assuming the n + 1 points are linearly separable)?
- Let S1 and S2 be the set of support vectors and w1 and w2 be the learnt weight vectors for a linearly separable problem using hard and soft margin linear SVMs respectively. Which of the following are correct?
- Support vectors are the data points that lie closest to the decision surface.
- Suppose you are using a Linear SVM classifier on a 2-class classification problem. You are given the following data, in which some points are circled in red, representing support vectors. If you remove any one of the red points from the data, will the decision boundary change?
- Support Vector Machine is
- How can we best represent ‘support’ for the following association rule: “If X and Y, then Z”.
- The _______ step eliminates the extensions of (k-1)-itemsets which are not found to be frequent, from being considered for counting support.