240+ Data Mining and Business Intelligence Solved MCQs

These multiple-choice questions (MCQs) are designed to enhance your knowledge and understanding in the following areas: Computer Science Engineering (CSE) and Information Technology Engineering (IT).

151.

All sets of items whose support is greater than the user-specified minimum support are called _____________

A. Border set
B. Frequent set
C. Maximal frequent set
D. Lattice
Answer» B. Frequent set
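To make the definition concrete, here is a minimal Python sketch (toy transaction data; the item names are invented for illustration) that enumerates every itemset whose support meets a user-specified minimum:

```python
from itertools import combinations

# Toy transaction database (hypothetical example data)
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
min_support = 2  # user-specified minimum support (absolute count)

def support(itemset):
    """Number of transactions containing every item in `itemset`."""
    return sum(1 for t in transactions if itemset <= t)

items = set().union(*transactions)
# All itemsets whose support meets the threshold are the frequent sets.
frequent = [
    set(c)
    for k in range(1, len(items) + 1)
    for c in combinations(sorted(items), k)
    if support(set(c)) >= min_support
]
print(frequent)
```

Here {bread, milk} is frequent (support 2), while {milk, butter} is not (support 1), so only the former appears in the output.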
152.

If a set is a frequent set and no superset of this set is a frequent set, then it is called a ____________

A. Maximal frequent set
B. Border set
C. Lattice
D. Infrequent sets
Answer» A. Maximal frequent set
153.

Any subset of a frequent set is a frequent set. This is the _________

A. Upward closure property
B. Downward closure property
C. Maximal frequent set
D. Border set
Answer» B. Downward closure property
154.

Any superset of an infrequent set is an infrequent set. This is the ___________

A. Maximal frequent set
B. Border set
C. Upward closure property
D. Downward closure property
Answer» C. Upward closure property
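Both closure properties can be checked exhaustively on a small example. The sketch below (toy data, invented items) verifies that every subset of a frequent set is frequent, and every superset of an infrequent set is infrequent:

```python
from itertools import combinations

# Toy transaction database (hypothetical example data)
transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a"}]
min_support = 2

def support(itemset):
    return sum(1 for t in transactions if itemset <= t)

def is_frequent(itemset):
    return support(itemset) >= min_support

items = sorted(set().union(*transactions))
all_itemsets = [frozenset(c) for k in range(1, len(items) + 1)
                for c in combinations(items, k)]

# Downward closure: every non-empty subset of a frequent set is frequent.
for s in all_itemsets:
    if is_frequent(s):
        assert all(is_frequent(frozenset(sub))
                   for k in range(1, len(s))
                   for sub in combinations(s, k))

# Upward closure: every superset of an infrequent set is infrequent.
for s in all_itemsets:
    if not is_frequent(s):
        assert all(not is_frequent(s | {extra})
                   for extra in items if extra not in s)
print("both closure properties hold")
```

These two properties are what lets Apriori prune: once an itemset is known to be infrequent, none of its supersets ever needs counting.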
155.

If an itemset is not a frequent set and no superset of it is a frequent set, then it is a ___________

A. Maximal frequent set
B. Border set
C. Upward closure property
D. Downward closure property
Answer» B. Border set
156.

The Apriori algorithm is otherwise called the _________

A. Width-wise algorithm
B. Level-wise algorithm
C. Pincer-search algorithm
D. FP growth algorithm
Answer» B. Level-wise algorithm
157.

The Apriori algorithm is a ____________

A. Top-down search
B. Breadth first search
C. Depth first search
D. Bottom-up search
Answer» D. Bottom-up search
158.

The first phase of the Apriori algorithm is ___________

A. Candidate generation
B. Itemset generation
C. Pruning
D. Partitioning
Answer» A. Candidate generation
159.

The second phase of the Apriori algorithm is ____________

A. Candidate generation
B. Itemset generation
C. Pruning
D. Partitioning
Answer» C. Pruning
160.

The _________ step eliminates those extensions of (k-1)-itemsets which are not found to be frequent from being considered for counting support.

A. Candidate generation
B. Pruning
C. Partitioning
D. Itemset eliminations
Answer» B. Pruning
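The candidate-generation (join) and pruning phases can be sketched together in a few lines of Python. This is a sketch, not an optimized implementation, and the toy 2-itemsets fed in at the bottom are invented:

```python
from itertools import combinations

def apriori_gen(frequent_k_minus_1):
    """Candidate generation (join) followed by pruning.

    `frequent_k_minus_1` is the set of frequent (k-1)-itemsets,
    each a frozenset of the same size.
    """
    freq = set(frequent_k_minus_1)
    k = len(next(iter(freq))) + 1
    # Join step: merge pairs of (k-1)-itemsets that differ in one item.
    candidates = {a | b for a in freq for b in freq if len(a | b) == k}
    # Prune step: drop candidates having any infrequent (k-1)-subset,
    # justified by the downward closure property.
    return {c for c in candidates
            if all(frozenset(s) in freq for s in combinations(c, k - 1))}

f2 = {frozenset({"a", "b"}), frozenset({"a", "c"}),
      frozenset({"b", "c"}), frozenset({"a", "d"})}
print(apriori_gen(f2))
```

Here {a, b, c} survives (all three of its 2-subsets are frequent), while {a, b, d} and {a, c, d} are pruned because {b, d} and {c, d} are not frequent.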
161.

The Apriori frequent itemset discovery algorithm moves _________ in the lattice

A. Upward
B. Downward
C. Breadthwise
D. Both upward and downward
Answer» A. Upward
162.

After the pruning step of the Apriori algorithm, _________ will remain

A. Only candidate set
B. No candidate set
C. Only border set
D. No border set
Answer» B. No candidate set
163.

The number of iterations in the Apriori algorithm _________

A. Increases with the size of the maximum frequent set
B. Decreases with increase in size of the maximum frequent set
C. Increases with the size of the data
D. Decreases with the increase in size of the data
Answer» A. Increases with the size of the maximum frequent set
164.

MFCS is the acronym of____________

A. Maximum Frequency Control Set
B. Minimal Frequency Control Set
C. Maximal Frequent Candidate Set
D. Minimal Frequent Candidate Set
Answer» C. Maximal Frequent Candidate Set
165.

The Dynamic Itemset Counting algorithm was proposed by _________

A. Brin et al.
B. Agrawal et al.
C. Toda et al.
D. Simon et al.
Answer» A. Brin et al.
166.

Itemsets in the _________ category of structures have a counter and the stop number with them

A. Dashed
B. Circle
C. Box
D. Solid
Answer» A. Dashed
167.

The itemsets in the _________ category of structures are not subjected to any counting

A. Dashed
B. Box
C. Solid
D. Circle
Answer» C. Solid
168.

Certain itemsets in the dashed circle whose support count reaches the support value during an iteration move into the ______________

A. Dashed box
B. Solid circle
C. Solid box
D. None of the above
Answer» A. Dashed box
169.

Certain itemsets enter afresh into the system and get into the _________, which are essentially the supersets of the itemsets that move from the dashed circle to the dashed box

A. Dashed box
B. Solid circle
C. Solid box
D. Dashed circle
Answer» D. Dashed circle
170.

The itemsets that have completed one full pass move from the dashed circle to the ________

A. Dashed box
B. Solid circle
C. Solid box
D. None of the above
Answer» B. Solid circle
171.

The FP-growth algorithm has _________ phases

A. One
B. Two
C. Three
D. Four
Answer» B. Two
172.

A frequent pattern tree is a tree structure consisting of ________

A. An item-prefix-tree
B. A frequent-item-header table
C. A frequent-item-node
D. Both A & B
Answer» D. Both A & B
173.

A non-root node of the item-prefix tree consists of _________ fields

A. Two
B. Three
C. Four
D. Five
Answer» B. Three
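The three fields of a non-root node (item-name, count, node-link, as described in the FP-growth paper) can be sketched as a small Python class. The `children` dictionary here is extra tree bookkeeping for this sketch, not one of the three fields:

```python
from dataclasses import dataclass, field
from typing import Optional, Dict

@dataclass
class FPNode:
    """Non-root node of an item-prefix tree."""
    item: str                              # item-name
    count: int = 1                         # count
    node_link: Optional["FPNode"] = None   # node-link: next node with same item
    # Parent/child pointers are tree plumbing, not among the three fields.
    children: Dict[str, "FPNode"] = field(default_factory=dict)

n = FPNode("milk", count=2)
print(n.item, n.count, n.node_link)
```

The node-link chains all nodes carrying the same item, which is what the frequent-item-header table's pointers traverse.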
174.

The frequent-item-header table consists of _________ fields

A. Only one.
B. Two.
C. Three.
D. Four
Answer» B. Two.
175.

The paths from the root node to the nodes labelled 'a' are called _________

A. Transformed prefix path
B. Suffix subpath
C. Transformed suffix path
D. Prefix subpath
Answer» D. Prefix subpath
176.

The transformed prefix paths of a node 'a' form a truncated database of patterns which co-occur with 'a'; this is called the ________

A. Suffix path
B. FP-tree
C. Conditional pattern base
D. Prefix path
Answer» C. Conditional pattern base
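Skipping the tree construction, the conditional pattern base of an item can be sketched directly from the transactions: take each transaction containing the item, keep only its frequent items in descending global-frequency order, and record the prefix that precedes the item (toy data loosely modelled on textbook examples; ties here are broken alphabetically, which is an arbitrary choice for this sketch):

```python
from collections import Counter

transactions = [
    ["f", "a", "c", "d", "g", "i", "m", "p"],
    ["a", "b", "c", "f", "l", "m", "o"],
    ["b", "f", "h", "j", "o"],
    ["b", "c", "k", "s", "p"],
    ["a", "f", "c", "e", "l", "p", "m", "n"],
]
min_support = 3

counts = Counter(item for t in transactions for item in t)

def ordered(t):
    """Frequent items of transaction `t`, by descending global frequency."""
    return sorted((i for i in t if counts[i] >= min_support),
                  key=lambda i: (-counts[i], i))

def conditional_pattern_base(item):
    """Prefix paths that co-occur with `item`."""
    base = []
    for t in transactions:
        o = ordered(t)
        if item in o:
            base.append(o[:o.index(item)])
    return base

print(conditional_pattern_base("m"))
```

For item 'm' this yields the prefixes ['c', 'f', 'a'], ['c', 'f', 'a', 'b'], ['c', 'f', 'a']: the truncated database of patterns co-occurring with 'm'.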
177.

The goal of _________ is to discover both the dense and sparse regions of a data set

A. Association rule
B. Classification
C. Clustering
D. Genetic Algorithm
Answer» C. Clustering
178.

Which of the following is a clustering algorithm?

A. A priori
B. CLARA
C. Pincer-Search
D. FP-growth
Answer» B. CLARA
179.

The _________ clustering technique starts with as many clusters as there are records, with each cluster having only one record

A. Agglomerative
B. Divisive
C. Partition
D. Numeric
Answer» A. Agglomerative
180.

The _________ clustering technique starts with all records in one cluster and then tries to split that cluster

A. Agglomerative.
B. Divisive.
C. Partition.
D. Numeric
Answer» B. Divisive.
181.

Which of the following is a data set in the popular UCI machine-learning repository?

A. CLARA.
B. CACTUS.
C. STIRR.
D. MUSHROOM
Answer» D. MUSHROOM
182.

In the _________ algorithm, each cluster is represented by the center of gravity of the cluster

A. K-medoid
B. K-means
C. Stirr
D. Rock
Answer» B. K-means
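The center-of-gravity idea behind k-means can be sketched in plain Python on one-dimensional toy data (the points and initial centers below are chosen arbitrarily for illustration):

```python
# Minimal 1-D k-means sketch (toy data; plain Python, no libraries).
points = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0]

def kmeans(points, centers, iterations=10):
    for _ in range(iterations):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to its cluster's center of gravity.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

print(kmeans(points, centers=[0.0, 5.0]))
```

A k-medoid algorithm such as PAM differs only in the update step: instead of the mean, it picks an actual object of the cluster as the representative.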
183.

In _________, each cluster is represented by one of the objects of the cluster located near the center

A. K-medoid
B. K-means
C. Stirr
D. Rock
Answer» A. K-medoid
184.

Pick out a k-medoid algorithm

A. DBSCAN
B. BIRCH
C. PAM
D. CURE
Answer» C. PAM
185.

Pick out a hierarchical clustering algorithm

A. DBSCAN
B. CURE
C. PAM
D. BIRCH
Answer» D. BIRCH
186.

CLARANS stands for

A. CLARA Net Server
B. Clustering Large Application Range Network Search
C. Clustering Large Applications based on Randomized Search
D. Clustering Application Randomized Search
Answer» C. Clustering Large Applications based on Randomized Search
187.

BIRCH is a ________

A. Agglomerative clustering algorithm
B. Hierarchical algorithm
C. Hierarchical-agglomerative algorithm
D. Divisive
Answer» C. Hierarchical-agglomerative algorithm
188.

The cluster features of different subclusters are maintained in a tree called_________

A. CF tree
B. FP tree
C. FP growth tree
D. B tree
Answer» A. CF tree
189.

The _________ algorithm is based on the observation that the frequent sets are normally very few in number compared to the set of all itemsets

A. A priori
B. Clustering
C. Association rule
D. Partition
Answer» D. Partition
190.

The partition algorithm uses _________ scans of the database to discover all frequent sets

A. Two
B. Four
C. Six
D. Eight
Answer» A. Two
191.

The basic idea of the Apriori algorithm is to generate _________ itemsets of a particular size and then scan the database

A. Candidate
B. Primary
C. Secondary
D. Superkey
Answer» A. Candidate
192.

The _________ is the most well-known association rule algorithm and is used in most commercial products

A. Apriori algorithm
B. Partition algorithm
C. Distributed algorithm
D. Pincer-search algorithm
Answer» A. Apriori algorithm
193.

An algorithm called ________ is used to generate the candidate itemsets for each pass after the first

A. Apriori
B. Apriori-gen
C. Sampling
D. Partition
Answer» B. Apriori-gen
194.

The basic partition algorithm reduces the number of database scans to __________ and divides the database into partitions

A. One
B. Two
C. Three
D. Four
Answer» B. Two
195.

_________ and prediction may be viewed as types of classification

A. Decision.
B. Verification.
C. Estimation.
D. Illustration
Answer» C. Estimation.
196.

_________ can be thought of as classifying an attribute value into one of a set of possible classes

A. Estimation.
B. Prediction.
C. Identification.
D. Clarification
Answer» B. Prediction.
197.

Prediction can be viewed as forecasting a _________ value

A. Non-continuous.
B. Constant.
C. Continuous.
D. variable
Answer» C. Continuous.
198.

_________ data consists of sample input data as well as the classification assignment for the data

A. Missing.
B. Measuring.
C. Non-training.
D. Training
Answer» D. Training
199.

Rule-based classification algorithms generate _________ rules to perform the classification

A. If-then
B. While
C. Do while
D. Switch
Answer» A. If-then
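As a sketch of the idea (the rules and attribute names below are invented for illustration), a rule-based classifier walks an ordered list of if-then rules and fires the first one whose condition matches:

```python
# Ordered if-then rules: (condition, class label). Hypothetical example.
rules = [
    (lambda r: r["income"] == "high" and r["debt"] == "low", "approve"),
    (lambda r: r["income"] == "low", "reject"),
]
default_class = "review"  # fallback when no rule fires

def classify(record):
    for condition, label in rules:   # fire the first matching if-then rule
        if condition(record):
            return label
    return default_class

print(classify({"income": "high", "debt": "low"}))  # approve
```

Rule order matters: a record with low income and low debt is rejected by the second rule before the default class is ever reached.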
200.

_________ are a different paradigm for computing which draws its inspiration from neuroscience

A. Computer networks
B. Neural networks
C. Mobile networks
D. Artificial networks
Answer» B. Neural networks
