Objective Questions - 1 (Information Theory and Coding)
PARISUTHAM INSTITUTE OF TECHNOLOGY AND SCIENCE
DEPARTMENT OF INFORMATION TECHNOLOGY
IT2302 – Information Theory and Coding: Technical Questions and Answers
1. The new abbreviation for "binary digit" is _______
   a) Binit  b) bin  c) Digi  d) bit
2. The unit of information is _______
   a) Bytes  b) bytes/message  c) bits  d) bit
3. When there is more uncertainty about the message, the information carried is _____
   a) less  b) more  c) very less  d) both a & b
4. If the receiver already knows the message being transmitted, the amount of information carried is _______
   a) 1  b) 0  c) -1  d) 2
5. The amount of information is represented by ______
   a) Ik  b) pk  c) 1/pk  d) H
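Questions 3–5 rest on the definition of the amount of information, Ik = log2(1/pk) bits: the less probable a message, the more information it carries, and a message the receiver already knows (pk = 1) carries none. A minimal Python sketch; the function name and the example probabilities are illustrative, not from the question paper:

```python
import math

def self_information(p):
    """Amount of information Ik = log2(1/pk) in bits (question 5)."""
    return math.log2(1 / p)

print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits: rarer message, more information (question 3)
print(self_information(1.0))    # 0.0 bits: a known message carries no information (question 4)
```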
6. Average information is represented by _____
   a) Entropy  b) Code redundancy  c) Code efficiency  d) Codeword
7. Average information = _______
   a) Total information / No. of messages  b) Entropy / No. of messages  c) Entropy / No. of messages  d) Message rate / No. of messages
8. The information rate is represented by _____
   a) r  b) rH  c) R  d) rR
9. The source coding theorem is also known as _____
   a) Huffman's 1st theorem  b) Shannon's 1st theorem  c) Shannon's 2nd theorem  d) Both a & b
10. The codewords generated by the encoder should be ______
   a) Digits in nature  b) Codes in nature  c) Binary in nature  d) Values in nature
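Questions 6–8 use entropy H = ∑ pk log2(1/pk), the average information per message, and the information rate R = rH with r the message rate. A short sketch under assumed example values:

```python
import math

def entropy(probs):
    """Average information per message: H = sum of pk*log2(1/pk) (question 6)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]  # example source probabilities (assumed)
r = 1000                           # message rate in messages/second (assumed)
H = entropy(probs)
print(H)      # 1.75 bits/message
print(r * H)  # information rate R = rH = 1750 bits/second (question 8)
```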
11. The coding efficiency of the source encoder is defined as ______
   a) η = Nmin/N  b) η = H/N  c) N ≥ H  d) η = H(X²)/N
12. Code redundancy is represented by _____
   a) γ  b) 1 − γ  c) γ²  d) σ
13. Code variance is represented by _____
   a) σ⁻¹  b) pk  c) σ²  d) η²
14. Variable-length coding is done by the source encoder to get ______
   a) Lower efficiencies  b) Higher efficiencies  c) Moderate efficiencies  d) Both a & b
15. A prefix code satisfies _______
   a) the McMillan inequality  b) Shannon's 1st theorem  c) Huffman coding  d) Shannon's 2nd theorem
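Questions 11–13 and 15 involve the coding efficiency η = H/N (N being the average codeword length), the redundancy 1 − η, the code variance σ² = ∑ pk(nk − N)², and the McMillan (Kraft) inequality ∑ 2^(−nk) ≤ 1 that every prefix code satisfies. A sketch with an assumed prefix code:

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]  # assumed source probabilities
lengths = [1, 2, 3, 3]               # codeword lengths of the prefix code 0, 10, 110, 111

H = sum(p * math.log2(1 / p) for p in probs)
N = sum(p * n for p, n in zip(probs, lengths))  # average codeword length
eta = H / N                                     # coding efficiency (question 11)
print(eta, 1 - eta)                             # efficiency and redundancy (question 12)
print(sum(p * (n - N) ** 2 for p, n in zip(probs, lengths)))  # code variance (question 13)
print(sum(2 ** -n for n in lengths) <= 1)       # McMillan inequality holds (question 15)
```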
16. The channel is discrete when both X and Y are ________
   a) Analog  b) Discrete  c) Discrete analog  d) Both a & b
17. The conditional entropy H(Y/X) is called ______
   a) Uncertainty  b) Information  c) Equivocation  d) Certainty
18. By the standard probability relation, ∑(i=1 to m) p(xi, yj) = _______
   a) p(xi)  b) p(yj)  c) p(xi, yj)  d) p(yj, xi)
19. H(X,Y) = H(X/Y) + _______
   a) H(X)  b) H(Y)  c) H(Y/X)  d) H(X,Y)
20. H(X,Y) = H(Y/X) + ______
   a) H(X)  b) H(Y)  c) H(Y/X)  d) H(X,Y)
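Questions 18–20 come down to marginalizing a joint distribution (summing p(xi, yj) over i yields p(yj)) and the chain rules H(X,Y) = H(X/Y) + H(Y) = H(Y/X) + H(X). A numeric check; the joint probability matrix is assumed for illustration:

```python
import math

def H(probs):
    """Entropy of a probability list, in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

P = [[0.4, 0.1],
     [0.1, 0.4]]  # joint p(xi, yj): rows index X, columns index Y (assumed values)

px = [sum(row) for row in P]        # p(xi), summing over j
py = [sum(col) for col in zip(*P)]  # p(yj), summing over i (question 18)

HX, HY = H(px), H(py)
HXY = H([p for row in P for p in row])  # joint entropy H(X,Y)
print(HXY - HY)  # H(X/Y), so H(X,Y) = H(X/Y) + H(Y) (question 19)
print(HXY - HX)  # H(Y/X), so H(X,Y) = H(Y/X) + H(X) (question 20)
```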
21. H(X) = ∑(i=1 to m) pi log2(_______)
   a) pi  b) pk  c) 1/pi  d) 1/pk
22. The average rate of information going into the channel is given as _____
   a) Din = H(X)  b) Din = rH(X)  c) Din = H(Y)  d) Din = rH(Y)
23. The average rate of information transmission Dt across the channel is ______
   a) Dt = [H(X) − H(X/Y)]  b) Dt = [H(Y) − H(X/Y)]  c) Dt = [H(X) − H(X/Y)]r  d) Dt = [H(X) + H(X/Y)]
24. In the case of errorless transmission, H(X/Y) = 0; hence Din = ________
   a) H(X)  b) Dt  c) H(Y)  d) rH(X)
25. Mutual information is represented as ________
   a) I(X/Y)  b) I(X;Y)  c) I(X,Y)  d) I(X:Y)
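Questions 22–24 compare the rate entering the channel, Din = rH(X), with the rate actually transmitted, Dt = [H(X) − H(X/Y)]r; when the equivocation H(X/Y) is zero the two coincide. A sketch with assumed numbers:

```python
r = 1000          # symbol rate in symbols/second (assumed)
HX = 1.0          # H(X) in bits/symbol (assumed)
HX_given_Y = 0.2  # equivocation H(X/Y) in bits/symbol (assumed)

D_in = r * HX                # Din = rH(X) (question 22)
D_t = (HX - HX_given_Y) * r  # Dt = [H(X) - H(X/Y)]r (question 23)
print(D_in, D_t)             # 1000.0 800.0
print((HX - 0.0) * r == D_in)  # errorless case H(X/Y) = 0: Dt equals Din (question 24)
```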
26. Mutual information is symmetric: _______
   a) I(X;Y) = I(X,Y)  b) I(X;Y) = I(Y:X)  c) I(X;Y) = I(X:Y)  d) I(X;Y) = I(Y;X)
27. I(X;Y) = H(X) _______
   a) − H(X)  b) − H(X/Y)  c) − H(Y/X)  d) − H(X,Y)
28. I(X;Y) = H(Y) _______
   a) − H(X)  b) − H(X/Y)  c) − H(Y/X)  d) − H(X,Y)
29. Mutual information is always _______
   a) +ve  b) −ve  c) 0  d) Both a & c
30. I(X;Y) is related to the joint entropy H(X,Y) by _______
   a) I(X;Y) = H(X) − H(X,Y)  b) I(X;Y) = H(X) + H(X,Y)  c) I(X;Y) = H(X) + H(Y) − H(X,Y)  d) I(X;Y) = H(X) − H(Y) − H(X,Y)
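Questions 26–30 are the standard mutual-information identities: I(X;Y) = H(X) − H(X/Y) = H(Y) − H(Y/X) = H(X) + H(Y) − H(X,Y), symmetric in X and Y and never negative. A numeric verification over an assumed joint distribution:

```python
import math

def H(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

P = [[0.4, 0.1],
     [0.1, 0.4]]  # joint p(xi, yj), assumed for illustration
px = [sum(row) for row in P]
py = [sum(col) for col in zip(*P)]
HX, HY = H(px), H(py)
HXY = H([p for row in P for p in row])

I = HX + HY - HXY       # I(X;Y) = H(X) + H(Y) - H(X,Y) (question 30)
print(I)
print(HX - (HXY - HY))  # H(X) - H(X/Y): the same value (question 27)
print(HY - (HXY - HX))  # H(Y) - H(Y/X): symmetric, I(X;Y) = I(Y;X) (questions 26, 28)
assert I >= 0           # mutual information is never negative (question 29)
```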
31. The channel capacity of a discrete memoryless channel is _____
   a) C = max over P(xi) of I(X;Y)  b) C = max over P(yj) of I(X;Y)  c) C = max over P(xi) of I(X:Y)  d) C = max over P(xi) of I(Y;X)
32. The channel matrix is otherwise called the ______
   a) Probability matrix  b) Transition matrix  c) Probability transition matrix  d) None
33. Entropy H = 0 if pk = _______
   a) 0  b) 1  c) −1  d) Both a & b
34. For a DMS of entropy H(S), the average codeword length of a prefix code is bounded as H(S) ≤
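Question 31 defines channel capacity as the maximum of I(X;Y) over the input distribution P(xi), and question 33 turns on binary entropy vanishing at pk = 0 or 1. The sketch below checks both for a binary symmetric channel; the crossover probability and the grid search are assumptions for illustration, not part of the question paper:

```python
import math

def Hb(p):
    """Binary entropy in bits; 0 when p is 0 or 1 (question 33)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(p0, e):
    """I(X;Y) = H(Y) - H(Y/X) for a binary symmetric channel with
    crossover probability e and input distribution P(X=0) = p0."""
    py0 = p0 * (1 - e) + (1 - p0) * e  # output probability P(Y=0)
    return Hb(py0) - Hb(e)             # H(Y/X) = Hb(e) for the BSC

e = 0.1  # crossover probability (assumed)
# Question 31: C = max over the input distribution P(xi) of I(X;Y),
# found here by a coarse grid search over p0.
C = max(mutual_info(p0 / 1000, e) for p0 in range(1001))
print(C, 1 - Hb(e))  # the grid maximum matches the closed form C = 1 - Hb(e)
```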