McqMate

## Q. Entropy is the measure of

A. amount of information at the output

B. amount of information that can be transmitted

C. number of error bits from total number of bits

D. none of the mentioned

Answer» A. amount of information at the output

Explanation: Entropy is defined as the average amount of information per source output, i.e. H(X) = −Σ p(x) log₂ p(x) bits per symbol.
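The definition in the explanation can be computed directly. A minimal sketch (the `entropy` helper here is illustrative, not part of the original question):

```python
import math

def entropy(probs):
    """Average information per source output, in bits/symbol:
    H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source carries 1 bit per symbol.
print(entropy([0.5, 0.5]))   # 1.0
# A biased source carries less information per symbol.
print(entropy([0.9, 0.1]))   # ~0.469
```

Note that a deterministic source (one outcome with probability 1) has zero entropy: its output carries no information.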


Related MCQs in Communication Engineering:

- Entropy of a random variable is
- Which coding method uses entropy coding?
- When the base of the logarithm is e, the unit of measure of information is
- When the base of the logarithm is 2, then the unit of measure of information is
- The ratio (J/S)reqd gives the measure of
- ______ of TDMA system is a measure of the percentage of transmitted data that contains information as opposed to providing overhead for the access scheme.