Q. Entropy is the measure of
A. amount of information at the output
B. amount of information that can be transmitted
C. number of error bits from total number of bits
D. none of the mentioned
Answer» A. amount of information at the output
Explanation: Entropy is defined as the average amount of information per source output, H(X) = -Σ p(x) log₂ p(x) bits per symbol.
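The definition above can be sketched as a short Python function (a minimal illustration; the function name and example distributions are chosen here, not taken from the source):

```python
from math import log2

def entropy(probs):
    """Average information per source output: H = -sum(p * log2(p)) bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per output; a uniform 4-symbol source carries 2 bits.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.25] * 4))   # 2.0
```

Note that a skewed distribution (e.g. `[0.9, 0.1]`) yields less than 1 bit, since more predictable outputs carry less information on average.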