Q. Entropy is the measure of

A. amount of information at the output
B. amount of information that can be transmitted
C. number of error bits from total number of bits
D. none of the mentioned
Answer: A. amount of information at the output
Explanation: Entropy is defined as the average amount of information per source output symbol.
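The definition in the explanation can be sketched in a few lines of Python (the `entropy` helper and its name are illustrative, not from the source):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per source symbol.

    This is the average amount of information per source output:
    rare symbols carry more information, and the entropy averages
    that over the source's probability distribution.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin (two equally likely outputs) carries 1 bit per symbol.
print(entropy([0.5, 0.5]))        # → 1.0
# Four equally likely symbols carry 2 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A biased source (e.g. `entropy([0.9, 0.1])`) yields less than 1 bit per symbol, reflecting that its outputs are more predictable.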