Assignment Questions UNIT 1:

1. Explain the terms (i) Self-information (ii) Average information (iii) Mutual information.
2. Discuss the reason for using a logarithmic measure for the amount of information.
3. Explain the concept of the amount of information associated with a message. Also explain what infinite information and zero information are.
4. A binary source emits an independent sequence of 0s and 1s with probabilities p and (1-p) respectively. Plot the entropy of the source.
5. Explain the concepts of information, average information, information rate and redundancy as referred to information transmission.
6. Let X represent the outcome of a single roll of a fair die. What is the entropy of X?
7. A code is composed of dots and dashes. Assume that the dash is 3 times as long as the dot and has one-third the probability of occurrence. (i) Calculate the information in a dot and in a dash; (ii) Calculate the average information in the dot-dash code; and (iii) Assuming that a dot lasts for 10 ms and that the same time interval is allowed between symbols, calculate the average rate of information transmission.
8. What do you understand by the term "extension of a discrete memoryless source"? Show that the entropy of the nth extension of a DMS is n times the entropy of the original source.
9. A card is drawn from a deck of playing cards. (a) You are informed that the card you drew is a spade. How much information did you receive in bits? (b) How much information did you receive if you are told that the card you drew is an ace? (c) How much information did you receive if you are told that the card you drew is the ace of spades? Is the information content of the message "ace of spades" the sum of the information contents of the messages "spade" and "ace"?
10. A black and white TV picture consists of 525 lines of picture information. Assume that each line consists of 525 picture elements and that each element can have 256 brightness levels. Pictures are repeated at the rate of 30 per second. Calculate the average rate of information conveyed by a TV set to a viewer.
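Several of the questions above (6, 9 and 10) reduce to direct applications of self-information I = -log2 P and average information H = -Σ p_i log2 p_i. The following Python sketch is a hedged cross-check of those quantities (the function and variable names are illustrative, not part of the question paper):

```python
import math

def entropy(probs):
    """Average information H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Question 6: a fair die has six equiprobable outcomes, so H = log2(6) bits.
H_die = entropy([1 / 6] * 6)

# Question 9: self-information I = -log2 P of each message about one card.
I_spade = -math.log2(13 / 52)  # (a) P(spade) = 13/52, gives 2 bits
I_ace   = -math.log2(4 / 52)   # (b) P(ace) = 4/52
I_both  = -math.log2(1 / 52)   # (c) P(ace of spades) = 1/52
# Suit and rank are independent, so I_both = I_spade + I_ace.

# Question 10: 525 x 525 elements, 256 levels (log2 256 = 8 bits each),
# 30 pictures per second, assuming all brightness levels equally likely.
R_tv = 525 * 525 * math.log2(256) * 30  # bits per second

print(f"H(die) = {H_die:.3f} bits, I(ace of spades) = {I_both:.3f} bits, "
      f"R_tv = {R_tv:.0f} bits/s")
```

The additivity in Question 9 holds because suit and rank are independent events, so their probabilities multiply and their self-informations add.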
Information Theory & Coding (10EC55), Dept. of ECE/SJBIT
11. A zero-memory source has a source alphabet S = {S1, S2, S3} with P = {1/2, 1/4, 1/4}. Find the entropy of the source. Also determine the entropy of its second extension and verify that H(S2) = 2H(S).
12. Show that the entropy is maximum when the source transmits symbols with equal probability. Plot the entropy of this source versus p (0 ≤ p ≤ 1).
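Questions 4 and 12 both turn on the binary entropy function H(p) = -p log2 p - (1-p) log2(1-p). A minimal Python sketch (the function name is illustrative) samples the curve and shows that it is symmetric about p = 1/2, where it peaks at 1 bit:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p) in bits; H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sample the curve: symmetric about p = 0.5, maximum of 1 bit at p = 0.5.
for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"p = {p:4.2f}  H(p) = {binary_entropy(p):.4f} bits")
```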
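The identity H(S²) = 2H(S) asked for in Question 11 can also be verified numerically. This sketch (helper names are illustrative) forms the nine ordered-pair probabilities of the second extension and compares the two entropies:

```python
import math
from itertools import product

def entropy(probs):
    """H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

P = [1 / 2, 1 / 4, 1 / 4]   # Question 11 source probabilities
H1 = entropy(P)             # H(S) = 1.5 bits

# Second extension: all 9 ordered pairs, with P(si, sj) = P(si) * P(sj)
# because the source is zero-memory (successive symbols are independent).
P2 = [a * b for a, b in product(P, repeat=2)]
H2 = entropy(P2)

print(f"H(S) = {H1} bits, H(S^2) = {H2} bits")  # prints 1.5 and 3.0
```

The factorization of the pair probabilities is exactly where the zero-memory assumption enters; for a source with memory, H(S²) < 2H(S) in general.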