
1. Two binary random variables X and Y are distributed according to the joint distribution given as P(X = Y = 0) = P(X = 0, Y = 1) = P(X = Y = 1) = 1/3. Then [01D01]
a. H(X) = H(Y)
b. H(X) = 2.H(Y)
c. H(Y) = 2.H(X)
d. H(X) + H(Y) = 1

2. An independent discrete source transmits letters from an alphabet consisting of A and B with respective probabilities 0.6 and 0.4. If consecutive letters are statistically independent and two-symbol words are transmitted, the probability of the words with different symbols is [01D02]
a. 0.52 b. 0.36 c. 0.48 d. 0.24

3. A memoryless source emits 2000 binary symbols/sec, and each symbol has a probability of 0.25 of being 1 and 0.75 of being 0. The minimum number of bits/symbol required for error-free transmission of this source is [01M01]
a. 0.75 b. 0.81 c. 0.65 d. 0.55

4. Which of the following channel matrices represents a symmetric channel? [01M02]
(the four candidate matrices are not reproduced in this copy)

5. (The stem and channel matrix of this question are not fully reproduced in this copy.) Where the xi's are transmitted messages and the yj's are received messages, the answer is [01M03]
a. log 3 bits b. log 5 bits c. log 4 bits d. 1 bit

6. Information rate of a source is [01S01]
a. the entropy of the source measured in bits/message
b. the entropy of the source measured in bits/sec
c. a measure of the uncertainty of the communication system
d. maximum when the source is continuous

7. If `a` is an element of a field `F`, then its additive inverse is [01S02]
a. 0 b. -a c. a d. 1

8. The minimum number of elements that a field can have is [01S03]
a. 3 b. 2 c. 4 d. 1

9. Which of the following is correct? [01S04]
a. The syndrome of a received block coded word depends on the transmitted code word.
b. The syndrome of a received block coded word depends on the received code word.
c. The syndrome of a received block coded word depends on the error pattern.
d. The syndrome of a received block coded word under error-free reception consists of all 1's.
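Questions 2 and 3 above reduce to short probability and entropy calculations. A minimal sketch in plain Python, using only the numbers given in the questions:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Q2: two-symbol words over {A, B} with P(A) = 0.6, P(B) = 0.4;
# a word has different symbols if it is AB or BA.
p_diff = 0.6 * 0.4 + 0.4 * 0.6   # = 0.48

# Q3: binary source with P(1) = 0.25, P(0) = 0.75; the minimum
# bits/symbol for error-free coding is the source entropy.
h_bits = entropy([0.25, 0.75])   # ~0.811 bits/symbol

print(p_diff, round(h_bits, 3))
```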

www.UandiStar.org

100 % free SMS ON<space>UandIStar to 9870807070 for JNTU Alerts, Techology News, Tips/Tricks, JOB Alerts & more......

10. If the output of a continuous source is limited to an average power of 2, then the maximum entropy of the source is [01S05]
(the four option expressions are not reproduced in this copy)

11. A convolutional encoder of code rate 1/2 consists of a two-stage shift register. The generator sequence of the top adder is (1,1,1) and that of the bottom adder is (1,0,1). The constraint length of the encoder is [02D01]
a. 2 b. 3 c. 4 d. 5

12. The parity check matrix of a (6,3) systematic linear block code is (not reproduced in this copy). If the syndrome vector computed for the received code word is [1 1 0], then for error correction, which bit of the received code word is to be complemented? [02D02]
a. 2 b. 3 c. 4 d. 5

13. The minimum number of bits per message required to encode the output of a source transmitting four different messages with probabilities 0.5, 0.25, 0.125 and 0.125 is [02M01]
a. 2 b. 1 c. 1.5 d. 1.75

14. A communication channel is represented by the channel matrix (not reproduced in this copy), in which rows correspond to the transmitter X and columns correspond to the receiver Y. Then the conditional entropy H(Y/X) in bits/message is [02M02]
a. zero b. log 5 c. log 3 d. 3

15. The channel matrix of a noiseless channel [02M03]
a. consists of a single nonzero number in each column
b. consists of a single nonzero number in each row
c. is an identity matrix
d. is a square matrix

16. Entropy of a source is [02S01]
a. average amount of information conveyed by the communication system
b. average amount of information transferred by the channel
c. average amount of information available with the source
d. average amount of information conveyed by the source to the receiver

17. Relative to hard decision decoding, soft decision decoding results in [02S02]
a. better bit error probability
b. better coding gain
c. less circuit complexity
d. lesser coding gain

18. Which of the following is the essential requirement of a source coding scheme? [02S03]
a. comma-free nature of the code words
b. a minimum Hamming distance of 3
c. error detection and correction capability
d. the received code word should be compatible with a matched filter

19. The transition probabilities for a BSC will be represented using [02S04]
a. joint probability matrix
b. state diagram
c. conditional probability matrix
d. trellis diagram
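The parity-check matrix referenced in question 12 is not reproduced in this copy, so the sketch below uses a hypothetical but typical (6,3) systematic H = [P-transpose | I3] purely to show the mechanics: the syndrome s = r.HT (mod 2) is computed and matched against a column of H to locate a single-bit error.

```python
# Hypothetical (6,3) parity-check matrix H = [P^T | I3]; the matrix in
# the original question is not reproduced, so this choice is an assumption.
H = [
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 0, 1, 0],
    [0, 1, 1, 0, 0, 1],
]

def syndrome(r):
    """s = r . H^T over GF(2)."""
    return tuple(sum(ri * hi for ri, hi in zip(r, row)) % 2 for row in H)

def correct(r):
    """Flip the bit whose column of H equals the syndrome (single-error case)."""
    s = syndrome(r)
    if s == (0, 0, 0):
        return list(r)
    for j in range(6):
        if tuple(H[i][j] for i in range(3)) == s:
            out = list(r)
            out[j] ^= 1
            return out
    return list(r)  # no single-bit error matches this syndrome

# A valid code word has zero syndrome; flipping one bit produces
# that bit position's H-column as the syndrome.
```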


20. A field is [02S05]
a. an Abelian group under addition
b. a group with 0 as the multiplicative identity for its members
c. a group with 1 as the additive identity for its members
d. a group with 0 as the additive inverse for its members

21. The constraint length of a convolutional encoder of code rate 1/3 is 5. If the input of the encoder is a 5-bit message sequence, the length of the output code word in bits is [03D01]
a. 33 b. 27 c. 24 d. 30

22. A communication channel is represented by its channel matrix (not reproduced in this copy), with rows representing the messages associated with the source and columns representing the messages associated with the receiver. Its capacity in bits is [03D02]
a. log 4 b. log 3 c. log 12 d. log 7

23. A binary erasure channel has P(0/0) = P(1/1) = p; P(k/0) = P(k/1) = q. Its capacity in bits/symbol is [03M01]
a. p b. q c. pq d. p/q

24. When a pair of dice is thrown, the average amount of information contained in the message "The sum of the faces is 7" in bits is [03M02]
a. 0.75 b. 0.86 c. 0.96 d. 0.68

25. A source emits messages A and B with probabilities 0.8 and 0.2 respectively. The redundancy provided by the optimum source coding scheme for the above source is [03M03]
a. 72 % b. 27 % c. 45 % d. 55 %

26. Information content of a message [03S01]
a. increases with its certainty of occurrence
b. is independent of the certainty of occurrence
c. increases with its uncertainty of occurrence
d. is the logarithm of its certainty of occurrence

27. Under error-free reception, the syndrome vector computed for the received cyclic code word consists of [03S02]
a. all ones
b. alternate 1's and 0's starting with a 1
c. alternate 0's and 1's starting with a 0
d. all zeros

28. A continuous source will have maximum entropy if the pdf associated with its output is [03S03]
a. Poisson b. Exponential c. Rayleigh d. Gaussian

29. Variable-length source coding provides better coding efficiency if the messages of the source are [03S04]
a. equiprobable
b. transmitted with different probabilities
c. discretely transmitted
d. continuously transmitted

30. Shannon's limit deals with [03S05]
a. maximum information content of a message
b. maximum entropy associated with a source
c. maximum capacity of a channel
d. maximum bit rate of a source
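Question 25 can be checked directly: for a two-message source the optimum code assigns one bit per message, so efficiency is H/L and redundancy is 1 - H/L (this definition of redundancy as one minus efficiency is the usual one, and is assumed here). For the 0.8/0.2 source:

```python
from math import log2

probs = [0.8, 0.2]
H = -sum(p * log2(p) for p in probs)  # ~0.722 bits/message
L = 1.0                               # one bit per message is optimal here
redundancy = 1 - H / L                # ~0.278, i.e. roughly the 27 % option
print(round(redundancy * 100, 1))
```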


31. The parity check matrix of a (6,3) systematic linear block code is (not reproduced in this copy). If the syndrome vector computed for the received code word is [0 1 1], then for error correction, which bit of the received code word is to be complemented? [04D01]
a. 2 b. 3 c. 4 d. 5

32. A (7,4) cyclic code has a generator polynomial given as 1+x+x3. If the error pattern is 0001000, the corresponding syndrome vector is [04D02]
a. 001 b. 010 c. 100 d. 110

33. The memory length of a convolutional encoder is 3. If a 5-bit message sequence is applied as the input of the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [04M01]
a. 4 b. 3 c. 5 d. 6

34. The code words of a systematic (6,3) linear block code are 001110, 010011, 011101, 100101, 101011, 110110, 111000. Which of the following is also a code word of the code? [04M02]
a. 000111 b. 000101 c. 000000 d. 101101

35. The syndrome S(x) of a cyclic code is given by the remainder of the division [V(x)+E(x)]/g(x), where V(x) is the transmitted code polynomial, E(x) is the error polynomial and g(x) is the generator polynomial. S(x) is also equal to [04M03]
a. remainder of V(x)/g(x)
b. remainder of E(x)/g(x)
c. remainder of [V(x).E(x)]/g(x)
d. remainder of g(x)/V(x)

36. Source 1 is transmitting two messages with probabilities 0.2 and 0.8, and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then [04S01]
a. maximum uncertainty is associated with Source 1
b. maximum uncertainty is associated with Source 2
c. both Sources 1 and 2 have the maximum amount of uncertainty associated
d. there is no uncertainty associated with either of the two sources

37. A source X and the receiver Y are connected by a noise-free channel. Its capacity is [04S02]
a. Max H(X) b. Max H(X/Y) c. Max H(Y/X) d. Max H(X,Y)

38. The entropy measure of a continuous source is a [04S03]
a. relative measure b. absolute measure c. linear measure d. non-linear measure

39. Which of the following is correct? [04S04]
a. FEC is used for error control after the receiver makes a decision about the received bit
b. ARQ is used for error control after the receiver makes a decision about the received bit
c. FEC is used for error control when the receiver is unable to make a decision about the received bit
d. FEC and ARQ are not used for error correction

40. Error-free communication may be possible by [04S05]
a. reducing redundancy during transmission
b. increasing transmission power to the required level
c. providing redundancy during transmission
d. increasing the channel bandwidth

41. For the data word 1010 in a (7,4) non-systematic cyclic code with the generator polynomial 1+x+x3, the code polynomial is [05D01]
a. 1+x+x3+x5 b. 1+x+x3+x4 c. 1+x2+x3+x4 d. 1+x+x2+x5
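Questions 32, 35 and 41 are GF(2) polynomial arithmetic: non-systematic encoding is c(x) = d(x).g(x), and the syndrome of an error pattern is the remainder of e(x)/g(x). A small sketch with polynomials as coefficient lists (index = power of x):

```python
def poly_mul(a, b):
    """Multiply two polynomials over GF(2); coefficients as lists, index = power."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def poly_rem(a, g):
    """Remainder of a(x) / g(x) over GF(2) (long division)."""
    a = a[:]
    dg = len(g) - 1
    for i in range(len(a) - 1, dg - 1, -1):
        if a[i]:
            for j in range(len(g)):
                a[i - dg + j] ^= g[j]
    return a[:dg]

g = [1, 1, 0, 1]  # g(x) = 1 + x + x^3

# Q41: non-systematic encoding of data word 1010 -> d(x) = 1 + x^2
c = poly_mul([1, 0, 1, 0], g)              # 1 + x + x^2 + x^5

# Q32: syndrome of error pattern 0001000 -> e(x) = x^3
s = poly_rem([0, 0, 0, 1, 0, 0, 0], g)     # 1 + x, i.e. syndrome 110
```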


42. The output of a continuous source is a uniform random variable in the range (not reproduced in this copy). Its entropy in bits/sample is [05D02]
a. 4 b. 2 c. 8 d. 1

43. For the source X transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8, maximum coding efficiency can be obtained by using [05M01]
a. convolutional codes
b. only the Shannon-Fano method
c. either of the Shannon-Fano and Huffman methods
d. block codes

44. In modulo-7 addition, 6 + 1 is equal to [05M02]
a. 7 b. 5 c. 4 d. 0

45. A source is transmitting two messages A and B with probabilities 3/4 and 1/4 respectively. The coding efficiency of the first-order extension of the source is [05M03]
a. 89 % b. 77 % c. 92 % d. 81 %

46. The noise characteristic of a communication channel is given as (matrix not reproduced in this copy), where rows represent the source and columns represent the receiver. The channel is a [05M04]
a. noise-free channel b. asymmetric channel c. symmetric channel d. deterministic channel
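Questions 43 and 45 compare the source entropy against the average Huffman code-word length. A compact heapq-based Huffman sketch, applied to the 3/4, 1/4 source of question 45 and to its second (pair) extension to show how extension raises efficiency:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Code-word lengths produced by Huffman's algorithm."""
    if len(probs) == 1:
        return [1]
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:          # every merge adds one bit to members
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

def efficiency(probs):
    """Coding efficiency H/L for a Huffman code on this distribution."""
    H = -sum(p * log2(p) for p in probs)
    L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
    return H / L

# Q45: source A/B with P = 3/4, 1/4, coded one symbol at a time:
eff1 = efficiency([0.75, 0.25])               # ~0.811 -> about 81 %

# Pair (second) extension of the same source:
eff2 = efficiency([9/16, 3/16, 3/16, 1/16])   # ~0.962, noticeably better
```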

47. The source coding efficiency can be increased by [05S01]
a. using source extension
b. increasing the entropy of the source
c. decreasing the entropy of the source
d. using binary coding

48. The capacity of a channel with infinite bandwidth is [05S02]
a. infinite because of infinite bandwidth
b. finite because of increase in noise power
c. infinite because of infinite noise power
d. finite because of finite message word length

49. The Hamming weight of the (6,3) linear block coded word 101011 is [05S03]
a. 3 b. 4 c. 5 d. 2

50. The cascade of two binary symmetric channels is a [05S04]
a. symmetric binary channel
b. asymmetric quaternary channel
c. symmetric quaternary channel
d. asymmetric binary channel

51. In a (6,3) systematic linear block code, the number of 6-bit code words that are not useful is [06D01]
a. 56 b. 64 c. 8 d. 45

52. (The stem of this question is not reproduced in this copy.) Then [06D02]
a. C.HT = [1], where C is a code word of the code
b. if the syndrome vector S computed for the received code word is [1 1 0], the third bit of the received code word is in error
c. the syndrome vector S of the received code word is the same as C.HT
d. the syndrome vector is S = [1 1 1] under error-free reception
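Question 48 (and the Shannon-limit questions later in the paper) follow from C = B.log2(1 + S/(N0.B)): as B grows, capacity stays finite and approaches S/(N0.ln 2). A numerical check with hypothetical values S = 1 W and N0 = 1 mW/Hz (any positive values show the same behavior):

```python
from math import log2, log

S, N0 = 1.0, 1e-3   # hypothetical signal power (W) and noise PSD (W/Hz)

def capacity(B):
    """Shannon capacity of an AWGN channel of bandwidth B Hz."""
    return B * log2(1 + S / (N0 * B))

# Finite limit as B -> infinity: S / (N0 ln 2)
shannon_limit = S / (N0 * log(2))

# capacity(B) increases with B but saturates below the Shannon limit.
```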

53. In a binary symmetric channel, a transmitted 0 is received as 0 with a probability of 1/8. Then the transition probability of the transmitted 0 is [06M01]
(the option fractions are not reproduced in this copy)

54. A source transmitting `n` messages is connected to a noise-free channel. The capacity of the channel is [06M02]
a. n bits/symbol b. n2 bits/symbol c. log n bits/symbol d. 2n bits/symbol

55. There are four binary words given as 0000, 0001, 0011, 0111. Which of these cannot be a member of the parity check matrix of a (15,11) linear block code? [06M03]
a. 0000, 0001 b. 0000 c. 0011 d. 0111

56. If X is the transmitter and Y is the receiver, and if the channel is noise-free, then the mutual information I(X,Y) is equal to [06S01]
a. joint entropy of the source and receiver
b. entropy of the source
c. conditional entropy of the receiver, given the source
d. conditional entropy of the source, given the receiver

57. Which of the following is correct? [06S02]
a. Source coding introduces redundancy
b. Channel coding is an efficient way of representing the output of a source
c. The ARQ scheme of error control is applied after the receiver makes a decision about the received bit
d. The ARQ scheme of error control is applied when the receiver is unable to make a decision about the received bit

58. Which of the following is an FEC scheme? [06S03]
a. Shannon-Fano encoding
b. Huffman encoding
c. non-systematic cyclic codes
d. duo-binary encoding

59. A discrete source X is transmitting m messages and is connected to the receiver Y through a symmetric channel. The capacity of the channel is given as [06S04]
a. log m - H(X/Y) bits/symbol
b. log m bits/symbol
c. log m - H(Y/X) bits/symbol
d. H(X) + H(Y) - H(X,Y) bits/symbol

60. If the received code word of a (6,3) linear block code is 100111 with an error in the (bit position not reproduced in this copy) bit, the corresponding error pattern will be [06S05]
a. 100000 b. 000001 c. 001000 d. 000010
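Questions 56 and 59 (and 14 earlier) are all instances of I(X;Y) = H(X) + H(Y) - H(X,Y), computable from a joint probability matrix. For a noise-free (identity) channel the mutual information equals the source entropy H(X):

```python
from math import log2

def H(probs):
    """Shannon entropy of a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability matrix."""
    px = [sum(row) for row in joint]            # row marginals
    py = [sum(col) for col in zip(*joint)]      # column marginals
    pxy = [p for row in joint for p in row]     # flattened joint
    return H(px) + H(py) - H(pxy)

# Noise-free channel: Y = X, so the joint matrix is diagonal.
joint = [[0.25, 0, 0, 0],
         [0, 0.25, 0, 0],
         [0, 0, 0.25, 0],
         [0, 0, 0, 0.25]]
# I(X;Y) = H(X) = log2(4) = 2 bits
```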

61. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1+x+x3, the code polynomial is [07D01]
a. 1+x+x3+x5 b. 1+x2+x3+x5 c. 1+x2+x3+x4 d. 1+x4+x5

62. The output of a source is band limited to 6 KHz. It is sampled at a rate 2 KHz above the Nyquist rate. If the entropy of the source is 2 bits/sample, then the entropy of the source in bits/sec is [07D02]
a. 24 Kbps b. 28 Kbps c. 12 Kbps d. 32 Kbps

63. When two fair dice are thrown simultaneously, the information content of the message "the sum of the faces is 12" in bits is [07M01]
a. 1 b. 5.17 c. 4.17 d. 3.58

64. The encoder of a (7,4) systematic cyclic encoder with generating polynomial g(x) = 1+x2+x3 is basically a [07M02]
a. 4-stage shift register
b. 3-stage shift register
c. 11-stage shift register
d. (option not reproduced in this copy)

65. A received code word of a (7,4) systematic cyclic code 1000011 is corrected as 1000111. The corresponding error pattern is [07M03]
a. 0000100 b. 0001000 c. 0000010 d. 0010001

66. The product of 5 and 6 in modulo-7 multiplication is [07S01]
a. 30 b. 1 c. 2 d. 3

67. Which of the following can be the generating polynomial for a (7,4) systematic cyclic code? [07S02]
a. x3+x+1 b. x4+x3+1 c. x7+x4+x3+1 d. x5+x2+1

68. The time domain behavior of a convolutional encoder of code rate 1/3 is defined in terms of a set of [07S03]
a. 3 step responses b. 3 impulse responses c. 3 ramp responses d. 3 sinusoidal responses

69. Which of the following is correct? [07S04]
a. In an (n,k) block code, each code word is the cyclic shift of another code word of the code.
b. In an (n,k) systematic cyclic code, the sum of two code words is another code word of the code.
c. In a convolutional encoder, the constraint length of the encoder is equal to the tail of the message sequence + 1.
d. Source encoding reduces the probability of transmission errors.

70. A linear block code with Hamming distance 5 is a [07S05]
a. single error correcting and double error detecting code
b. double error detecting code
c. triple error correcting code
d. double error correcting code

71. The channel capacity of a BSC with transition probability 1/2 is [08D01]
a. 0 bits b. 1 bit c. 2 bits d. infinity

72. White noise of PSD (value not reproduced in this copy) W/Hz is applied to an ideal LPF with one-sided bandwidth of 1 Hz. The two-sided output noise power of the channel is [08D02]
a. four times the input PSD
b. thrice the input PSD
c. twice the input PSD
d. same as the input PSD

73. A convolutional encoder of code rate 1/2 is a 3-stage shift register with a message word length of 6. The code word length obtained from the encoder (in bits) is [08M01]
a. 9 b. 18 c. 27 d. 36

74. A source X with entropy 2 bits/message is connected to the receiver Y through a noise-free channel. The conditional entropy of the source, given the receiver, is H(X/Y), and the joint entropy of the source and the receiver is H(X,Y). Then [08M02]
a. H(X,Y) = 2 bits/message
b. H(X/Y) = 2 bits/message
c. H(X,Y) = 0 bits/message
d. H(X/Y) = 1 bit/message

75. A channel with independent input and output acts as a [08M03]
a. lossless network b. resistive network c. channel with maximum capacity d. Gaussian channel

76. Automatic Repeat Request is a [08S01]
a. source coding scheme b. error correction scheme c. error control scheme d. data conversion scheme
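Questions 11, 21 and 73 concern convolutional encoders. Below is a sketch of the rate-1/2 encoder from question 11, generators (1,1,1) and (1,0,1), with the register flushed by K-1 zeros so the last message bit clears the encoder. The output-length rule is n.(L + m) for memory m; with the memory of 3 assumed in question 73 that gives 2.(6+3) = 18 bits, while this K = 3 encoder gives 2.(6+2) = 16.

```python
def conv_encode(msg, gens):
    """Rate-1/len(gens) convolutional encoder; each generator is a tap
    list of length K (the constraint length). The register is flushed
    with K-1 zeros so the last message bit clears the encoder."""
    K = len(gens[0])
    bits = list(msg) + [0] * (K - 1)   # message + flushing zeros
    state = [0] * (K - 1)              # previous bits, most recent first
    out = []
    for b in bits:
        window = [b] + state
        for g in gens:
            out.append(sum(gi & wi for gi, wi in zip(g, window)) % 2)
        state = window[:-1]
    return out

# Question 11's encoder: generators (1,1,1) and (1,0,1), K = 3.
code = conv_encode([1, 0, 1, 1, 0, 1], [[1, 1, 1], [1, 0, 1]])
# Output length = n * (L + K - 1) = 2 * (6 + 2) = 16 bits for this encoder.
```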


77. Channel coding [08S02]
a. avoids redundancy
b. reduces transmission efficiency
c. increases signal power level relative to channel noise
d. results in reduced transmission bandwidth requirement

78. The information content available with a source is referred to as [08S03]
a. mutual information b. transinformation c. capacity d. entropy

79. In a linear block code [08S04]
a. the received power varies linearly with the transmitted power
b. the encoder satisfies the superposition principle
c. the parity bits of the code word are linear combinations of the message bits
d. the communication channel is a linear system

80. In modulo-4 arithmetic, the product of 3 and 2 is [08S05]
a. 6 b. 3 c. 2 d. 4

81. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1+x2+x3, the code polynomial is [09D01]
a. 1+x+x3+x5 b. 1+x2+x3+x5 c. 1+x2+x3+x4 d. 1+x+x5

82. In a (7,4) systematic linear block code, the number of 7-bit code words that are not useful for the user is [09D02]
a. 16 b. 112 c. 128 d. 96

83. Which of the following is a valid source coding scheme for a source transmitting four messages? [09M01]
a. 0, 00, 001, 110
b. 1, 11, 111, 1110
c. 0, 10, 110, 111
d. 1, 01, 001, 0010

84. A system has a bandwidth of 4 KHz and an S/N ratio of 28 at the input to the receiver. If the bandwidth of the channel is doubled, then [09M02]
a. capacity of the channel gets doubled
b. capacity of the channel gets squared
c. S/N ratio at the input of the receiver gets halved
d. S/N ratio at the input of the receiver gets doubled

85. The memory length of a convolutional encoder is 4. If a 5-bit message sequence is applied as the input of the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [09M03]
a. 4 b. 3 c. 5 d. 6

86. In modulo-5 multiplication, the product of 4 and 3 is [09S01]
a. 12 b. 7 c. 2 d. 3

87. Which of the following provides minimum redundancy in coding? [09S02]
a. (15,11) linear block code
b. Shannon-Fano encoding
c. (6,3) systematic cyclic code
d. convolutional code

88. If C is the channel capacity, S is the signal input of the channel, and (symbol not reproduced in this copy) is the input noise PSD, then which of the following is Shannon's limit? [09S03]
(the option formulas are not reproduced in this copy)
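Question 83 (and the similar valid-source-code questions later) ask which code word set is usable; a quick test is the prefix-free property plus the Kraft inequality:

```python
def is_prefix_free(code):
    """True if no code word is a prefix of another (instantaneous code)."""
    return not any(a != b and b.startswith(a) for a in code for b in code)

def kraft_sum(code):
    """Kraft inequality sum for a binary code; <= 1 is necessary."""
    return sum(2 ** -len(w) for w in code)

good = ["0", "10", "110", "111"]   # option c of question 83: prefix-free
bad = ["0", "00", "001", "110"]    # option a: "0" is a prefix of "00"
```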


89. A communication channel is fed with an input signal x(t), and the noise in the channel is additive. The power received at the receiver input is [09S04]
a. signal power - noise power
b. signal power / noise power
c. signal power + noise power
d. signal power x noise power

90. The fundamental limit on the average number of bits/source symbol is [09S05]
a. information content of the message
b. entropy of the source
c. mutual information
d. channel capacity

91. The parity check matrix of a (6,3) systematic linear block code is (not reproduced in this copy). If the syndrome vector computed for the received code word is [0 1 0], then for error correction, which bit of the received code word is to be complemented? [10D01]
a. 2 b. 3 c. 4 d. 5

92. White noise of PSD (value not reproduced in this copy) is applied to an ideal LPF with one-sided bandwidth of B Hz. The filter provides a gain of 2. If the output power of the filter is 8 (units not reproduced), then the value of B in Hz is [10D02]
a. 2 b. 4 c. 6 d. 8

93. The memory length of a convolutional encoder is 5. If a 6-bit message sequence is applied as the input of the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [10M01]
a. 4 b. 3 c. 5 d. 6

94. A source is transmitting four messages with equal probability. Then, for optimum source coding efficiency, [10M02]
a. variable-length coding schemes should necessarily be used
b. variable-length coding schemes need not necessarily be used
c. fixed-length coding schemes should not be used
d. convolutional codes should be used

95. Which of the following is a valid source coding scheme for a source transmitting five messages? [10M03]
a. 0, 00, 110, 1110, 1111
b. 1, 11, 001, 0001, 0000
c. 0, 10, 1110, 110, 1111
d. 1, 01, 001, 0010, 1111

96. In modulo-7 addition, 6 + 4 is equal to [10S01]
a. 10 b. 2 c. 3 d. 5
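The modulo-arithmetic questions (44, 66, 80, 86, 96, and 127 below) all follow one rule: compute, then reduce modulo n. In Python:

```python
# Modulo-n arithmetic: add or multiply as usual, then reduce modulo n.
print((6 + 1) % 7)   # Q44:  6 + 1 = 0 in modulo-7 addition
print((5 * 6) % 7)   # Q66:  5 x 6 = 2 in modulo-7 multiplication
print((3 * 2) % 4)   # Q80:  3 x 2 = 2 in modulo-4 arithmetic
print((4 * 3) % 5)   # Q86:  4 x 3 = 2 in modulo-5 multiplication
print((6 + 4) % 7)   # Q96:  6 + 4 = 3 in modulo-7 addition
print((1 + 5) % 6)   # Q127: 1 + 5 = 0 in modulo-6 addition
```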

97. Which of the following provides minimum redundancy in coding? [10S02]
a. (6,3) linear block code
b. (15,11) linear block code
c. (6,3) systematic cyclic code
d. convolutional code

98. Which of the following involves the effect of the communication channel? [10S03]
a. information content of a message
b. entropy of the source
c. mutual information
d. information rate of the source

99. Which of the following can be the generating polynomial for a (7,4) systematic cyclic code? [10S04]
a. x3+x2+1 b. x4+x3+1 c. x7+x4+x3+1 d. x5+x2+1

100. Which of the following provides the facility to recognize the error at the receiver? [10S05]
a. Shannon-Fano encoding
(options b-d are not reproduced in this copy)

101. Which of the following coding schemes is linear? [11D01]
a. C = {00, 01, 10}
b. C = {000, 111, 110}
c. C = {000, 110, 111, 001}
d. C = {111, 110, 011, 101}

102. If the transition probability of messages 0 and 1 in a communication system is 0.1, the noise matrix of the corresponding communication channel is [11D02]
(the option matrices are not reproduced in this copy)

103. In a BSC, the rate of information transmission over the channel decreases as [11M01]
a. the transmission probability approaches 0.5
b. the transmission probability approaches 1
c. the transition probability approaches 0.5
d. the transition probability approaches 1

104. A source X is connected to a receiver R through a lossless channel. Then [11M02]
a. H(Y/X) = 0 b. H(X,Y) = 0 c. H(X) = I(X,Y) d. H(X/Y) = I(X,Y)

105. Which of the following is a valid source coding scheme for a source transmitting four messages? [11M03]
a. 0, 00, 110, 1110
b. 1, 11, 001, 0001
c. 0, 10, 1110, 110
d. 1, 01, 001, 0010

106. The Hamming distance of a triple error correcting code is [11S01]
a. 5 b. 6 c. 7 d. 8

107. A channel whose input is xi and output is yj is deterministic if [11S02]
(the option expressions are not reproduced in this copy)
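Questions 101 (and 132 below) test linearity directly: a binary code is linear iff it contains the all-zero word and is closed under bitwise XOR. A direct check:

```python
def is_linear(code):
    """Binary code (equal-length strings) is linear iff it contains the
    all-zero word and is closed under bitwise XOR."""
    words = set(code)
    n = len(code[0])
    if "0" * n not in words:
        return False
    xor = lambda a, b: "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))
    return all(xor(a, b) in words for a in words for b in words)

# Option c of question 101 is closed under XOR and contains 000:
print(is_linear(["000", "110", "111", "001"]))   # True
# Option b is not: 111 ^ 110 = 001 is missing from the set:
print(is_linear(["000", "111", "110"]))          # False
```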

108. If a memoryless source of information rate R is connected to a channel with channel capacity C, then on which of the following statements is the channel coding for the output of the source based? [11S03]
a. R must be greater than or equal to C
b. R must be exactly equal to C
c. R must be less than or equal to C
d. the minimum number of bits required to encode the output of the source is its entropy

109. Which of the following is correct? [11S04]
a. Source coding reduces transmission efficiency
b. Channel coding improves transmission efficiency
c. Entropy of a source is a measure of the uncertainty of its output
d. A cyclic code is an ARQ scheme of error control

110. The minimum source code word length of the message of a source is equal to [11S05]
a. its entropy measured in bits/sec
b. the channel capacity
c. its entropy measured in bits/message
d. the sampling rate required for the source

111. If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding communication channel is [12D01]
(the option matrices are not reproduced in this copy)

112. In a BSC, if the transition probability of the messages 0 and 1 is P, and if they are of equal transmission probability, then the probability of these symbols appearing at the channel output is [12D02]
a. P, P b. 1/2, 1/2 c. 1, 1 d. P, 1-P

113. The number of bits to be used by the efficient source encoder to encode the output of the source is equal to [12M01]
a. the information rate of the source
b. the entropy of the source
c. the channel capacity
d. the information content of each message

114. A source X is connected to a receiver R through a deterministic channel. Then [12M02]
a. H(X/Y) = 0 b. H(Y/X) = 0 c. H(X) = I(X,Y) d. H(X/Y) = I(X,Y)

115. Which of the following can be a valid source coding scheme for a source transmitting 3 messages? [12M03]
a. 0, 00, 110 b. 1, 01, 001 c. 0, 10, 101 d. 1, 01, 011

116. For an (n,k) cyclic code, E(x) is the error polynomial, g(x) is the generator polynomial, R(x) is the received code polynomial and C(x) is the transmitted code polynomial. Then the syndrome polynomial S(x) is [12S01]
a. remainder of C(x)/g(x)
b. remainder of E(x)/g(x)
c. E(x).g(x)
d. R(x) + g(x)

117. If (symbol not reproduced in this copy) is the input noise PSD and S is the input signal power for a communication channel of capacity C, then which of the following is Shannon's limit? [12S02]
(the option formulas are not reproduced in this copy)

118. The Hamming distance of an error correcting code capable of correcting 4 errors is [12S03]
a. 8 b. 9 c. 7 d. 6

119. BCH codes capable of correcting single errors are [12S04]
a. systematic linear block codes
b. cyclic Hamming codes
c. convolutional codes
d. non-systematic linear block codes

120. Which of the following provides the facility to recognize the error at the receiver? [12S05]
a. Shannon-Fano encoding
b. parity check codes
c. Huffman encoding
d. differential encoding

121. The output of a source is a continuous random variable uniformly distributed over (0,2). The entropy of the source in bits/sample is [13D01]
a. 4 b. 2 c. 1.5 d. 1

122. An AWGN low-pass channel with 4 KHz bandwidth is fed with white noise of PSD (value only partially reproduced in this copy; it appears as "= 10 W/Hz"). The two-sided noise power at the output of the channel is [13D02]
a. 4 nW b. 2 nW c. 6 nW d. 8 nW
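Questions 71, 103 and 112 concern the binary symmetric channel: with equiprobable inputs the information transmission rate is 1 - H(p), which falls to zero as the transition probability p approaches 0.5, while the output symbols stay equiprobable regardless of p. Numerically:

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a BSC with transition probability p, bits/symbol."""
    return 1 - Hb(p)

# Q71: p = 1/2 gives zero capacity; Q103: capacity falls toward p = 0.5.
# Q112: with P(0) = P(1) = 1/2 at the input, each output probability is
# 0.5 * (1 - p) + 0.5 * p = 1/2, independent of p.
```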

123. A system has a bandwidth of 3 KHz and an S/N ratio of 29 dB at the input of the receiver. If the bandwidth of the channel gets doubled, then [13M01]
a. its capacity gets doubled
b. the corresponding S/N ratio gets doubled
c. its capacity gets halved
d. the corresponding S/N ratio gets halved

124. A source is transmitting two symbols A and B with probabilities 7/8 and 1/8 respectively. The average source code word length can be decreased by [13M02]
a. reducing the transmission probability
b. increasing the transmission probability
c. using a noise-free channel
d. using pair coding

125. Non-uniqueness of Huffman encoding results in [13M03]
a. different coding efficiencies
b. different average code word lengths
c. different entropies
d. different sets of source code words

126. Shannon's limit is for [13S01]
a. maximum entropy of a source
b. maximum information rate of a source
c. maximum information content of a message
d. maximum capacity of a communication channel under infinite bandwidth

127. In modulo-6 addition, the sum of 1 and 5 is [13S02]
a. 4 b. 1 c. 2 d. 0

128. FEC and ARQ schemes of error control can be applied for the outputs of [13S03]
a. binary symmetric channel only
b. binary erasure channel only
c. binary erasure channel and binary symmetric channel respectively
d. binary symmetric channel and binary erasure channel respectively

129. The Hamming distance of an (n,k) systematic cyclic code is [13S04]
a. the weight of any non-zero code word
b. the weight of a code word consisting of all 1's
c. the weight of the code word consisting of alternate 1's and 0's
d. the minimum of the weights of all non-zero code words of the code

130. Which of the following is affected by the communication channel? [13S05]
a. information content of a message
b. entropy of the source
c. mutual information
d. information rate of the source

131. The maximum average amount of information, measured in bits/sec, associated with the output of a discrete information source transmitting 8 messages at 2000 messages/sec is [14D01]
a. 6 Kbps b. 3 Kbps c. 16 Kbps d. 4 Kbps

132. Which of the following coding schemes is linear? [14D02]
a. C = {00, 01, 10, 11}
b. C = {01, 10, 11}
c. C = {110, 111, 001}
d. C = {000, 110, 011}

133. A communication source is connected to a receiver using a communication channel such that the uncertainty about the transmitted symbol at the receiver, after knowing the received symbol, is zero. Then the information gained by the observer at the receiver is [14M01]
a. the same as the entropy of the source
b. the same as the entropy of the receiver
c. the same as the joint entropy of the source and the receiver
d. the same as the conditional entropy of the source, given the receiver

134. X(t) and n(t) are the signal and the noise, each band limited to 2B Hz, applied to a communication channel band limited to B Hz. Then the minimum number of samples/sec that should be transmitted to recover the input of the channel at its output is [14M02]
a. 2B b. 4B c. B d. 6B
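Questions 106 and 118 use dmin = 2t + 1 for a t-error-correcting code, and question 129's definition (dmin of a linear code is the minimum weight of its non-zero code words) can be checked on the (6,3) code listed in question 34 (the all-zero word is included below, since the code is linear; that completion is the point of question 34):

```python
def weight(w):
    """Hamming weight of a binary string."""
    return w.count("1")

# Code words of the (6,3) code from question 34, plus the all-zero word.
code = ["000000", "001110", "010011", "011101",
        "100101", "101011", "110110", "111000"]

d_min = min(weight(w) for w in code if set(w) != {"0"})  # minimum weight
t = (d_min - 1) // 2                                     # errors correctable

# Q118: correcting t = 4 errors needs d_min = 2*4 + 1 = 9.
```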

The upper limit on the minimum distance of a linear block code is [14S01] a. minimum weight of the non-zero code word of the code b. minimum number of errors that can be corrected c. maximum number of errors that can be corrected d. maximum weight of the non-zero code word of the code Information rate of a source can be used to [14S02] a. differentiate between two sources b. to find the entropy in bits/message of a source c. correct the errors at the receiving side d. design the matched filter for the receiver A source is transmitting only one message. Then [14S03] a. the reception of the message conveys zero information to the user b. the reception of the message conveys maximum information to the user c. the message received will be corrupted by noise d. the channel capacity required is infinite If a. b. c. d. C is the code word and H is the Parity check Matrix of an (n,k) linear block code, then, [14S04] each code word of the (n,k) code is orthogonal to the code word of its dual code each code word of the (n,k) code is orthogonal to the code word of the same code C.H = [0] HT.C = 0

136.

139. Theoretically, the entropy of a continuous random variable is [14S05]
a. infinity
b. zero
c. unity
d. finite, but > 0 and < 1

140. A convolutional encoder has a constraint length of 4 and, for each input bit, the encoder outputs a two-bit word. If the input message is of length 5, the exact code rate of the encoder is [15D01]
a. 50 %
b. 31.25 %
c. 45.3 %
d. 23.3 %

141. In a message conveyed through a sequence of independent dots and dashes, the probability of occurrence of a dash is one third of that of a dot. The information content of a word with two dashes, in bits, is [15D02]
a. 2
b. 4
c. 8
d. 16

142. The voice-frequency modulating signal of a PCM system is quantized into 16 levels. If the signal is band-limited to 3 kHz, the minimum symbol rate of the system is [15M01]
a. 48 kilosymbols/sec
b. 6 kilosymbols/sec
c. 96 kilosymbols/sec
d. 3 kilosymbols/sec

143. A source transmits four messages with probabilities 1/2, 1/4, 1/8 and 1/8. To have 100 % transmission efficiency, the average source code word length for the source should be [15M02]
a. 2 bits
b. 1.75 bits
c. 3 bits
d. 3.75 bits

144. A source transmits six messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32 and 1/32. Then [15M03]
a. channel coding will reduce the average source code word length
b. two different source code word sets can be obtained using Huffman coding
c. source coding improves the error performance of the communication system
d. two different source code word sets can be obtained using Shannon-Fano coding

145. The average source code word length per bit can be decreased by [15S01]
a. increasing the entropy of the source
b. extending the order of the source
c. using a channel with very large capacity
d. increasing the transmitted power

146. The trade-off between bandwidth and signal-to-noise ratio results in [15S02]
a. Shannon's limit
b. the concept of transmitting the given information using various combinations of signal power and bandwidth
c. a noise-free channel
d. minimum redundancy

147. A binary erasure channel is an example of [15S03]
a. a wideband channel
b. a severe effect of channel noise on the message transmitted
c. a narrowband channel
d. a symmetric channel

148. In a symmetric channel, [15S04]
a. transmission errors will be less
b. rows and columns of the corresponding channel matrix are identical, except for permutation
c. required transmission power will be less
d. rows and columns of the corresponding channel matrix are identical on a permutation basis
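A few of the numerical questions above (the convolutional code rate, the dot-and-dash word, and the four-message source) reduce to one-line calculations. A minimal check in Python; reading "constraint length 4" as three memory cells (hence three flush bits) is my assumption, not stated in the paper:

```python
from math import log2

# Convolutional encoder: rate-1/2 outputs, constraint length K = 4, so
# K - 1 = 3 flush bits are appended to the 5-bit message (assumed reading).
rate = 5 / (2 * (5 + 3))
print(f"{rate:.2%}")  # 31.25%

# Dots and dashes: P(dash) = P(dot)/3 and P(dash) + P(dot) = 1, so P(dash) = 1/4.
# Two independent dashes carry 2 * log2(4) bits.
info_two_dashes = 2 * -log2(1 / 4)
print(info_two_dashes)  # 4.0

# Four messages {1/2, 1/4, 1/8, 1/8}: 100 % efficiency means the average
# code word length equals the source entropy.
H = -sum(p * log2(p) for p in [1/2, 1/4, 1/8, 1/8])
print(H)  # 1.75
```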

149. Which of the following is correct? [15S05]
a. Mutual information of a communication system is the same as the entropy of the source
b. Mutual information of a communication system is the information gained by the observer
c. Mutual information is independent of the channel characteristics
d. Mutual information of a communication system is the same as the entropy of the receiver

150. A source generates three symbols with probabilities 0.25, 0.25 and 0.5 at a rate of 3000 symbols/sec. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of [16D01]
a. 6000 bps
b. 4500 bps
c. 3000 bps
d. 1500 bps

151. A memoryless source emits 2000 binary symbols/sec, and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits/sec required for error-free transmission of this source is [16D02]
a. 1500
b. 1622
c. 1734
d. 1885

152. In a communication system, due to noise in the channel, an average of one symbol in each 100 received is incorrect. The symbol transmission rate is 1000 symbols/sec. The number of symbols received in error per second is [16M01]
a. 1
b. 10
c. 100
d. 20

153. Which of the following is a valid source coding scheme for a source transmitting six messages? [16M02]
a. 0, 10, 110, 1100, 1111, 11000
b. 1, 11, 101, 1100, 11010, 11001
c. 0, 10, 110, 1110, 11110, 11111
d. 1, 10, 100, 1110, 11110, 11111

154. The encoder of a (15,11) systematic cyclic code requires [16M03]
a. a 4-bit shift register and 3 modulo-2 adders
b. a 4-bit shift register and 4 modulo-2 adders
c. a 3-bit shift register and 3 modulo-2 adders
d. a 3-bit shift register and 4 modulo-2 adders

155. The distance between any code word and the all-zero code word of an (n,k) linear block code is referred to as the [16S01]
a. Hamming distance of the code
b. code rate of the code
c. redundancy of the code
d. Hamming weight of the code word

156. As per the source coding theorem, it is not possible to find any uniquely decodable code whose average length is [16S02]
a. less than the entropy of the source
b. greater than the entropy of the source
c. equal to the number of messages from the source
d. equal to the efficiency of transmission of the source

157. The coding efficiency due to second-order extension of a source [16S03]
a. is more
b. is less
c. remains unaltered
d. cannot be computed

158. The exchange between bandwidth and signal-to-noise ratio can be justified based on [16S04]
a. Shannon's limit
b. Shannon's source coding theorem
c. the Hartley-Shannon law
d. Shannon's channel coding theorem

159. A source X is connected to a receiver Y through a noise-free channel. Its capacity is the [16S05]
a. maximum of H(X/Y)
b. maximum of H(X)
c. maximum of H(Y/X)
d. maximum of H(X,Y)
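The two bit-rate questions above (the three-symbol source and the binary 0.25/0.75 source) both come down to entropy times symbol rate. A short sketch; the helper name is mine:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits/symbol."""
    return -sum(p * log2(p) for p in probs)

# Three symbols {0.25, 0.25, 0.5} at 3000 symbols/sec.
print(entropy([0.25, 0.25, 0.5]) * 3000)  # 4500.0 bps

# Binary source with P(1) = 0.25, P(0) = 0.75 at 2000 symbols/sec.
print(entropy([0.25, 0.75]) * 2000)  # ~1622.6 bps
```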


160. A zero-memory source emits two messages A and B with probabilities 0.8 and 0.2 respectively. The entropy of the second-order extension of the source is [17D01]
a. 1.56 bits/message
b. 0.72 bits/message
c. 0.78 bits/message
d. 1.44 bits/message

161. A signal amplitude X is a uniform random variable in the range (-1,1). Its differential entropy is [17D02]
a. 2 bits/sample
b. 4 bits/sample
c. 1 bit/sample
d. 3 bits/sample

162. A communication channel is so noisy that the output Y of the channel is statistically independent of the input X. Then [17M01]
a. H(X,Y) = H(X).H(Y)
b. H(X,Y) = H(X) + H(Y)
c. H(X/Y) = H(Y/X)
d. H(X) = H(Y)

163. A transmitting terminal has 128 characters, and the data sent from the terminal consist of independent sequences of equiprobable characters. The entropy of the terminal, in bits/character, is [17M02]
a. 10
b. 7
c. 1.44
d. 14

164. For a (7,4) systematic cyclic code, the generator polynomial is 1 + x + x³. The syndrome vector corresponding to the error pattern 0000010 is [17M03]
a. 100
b. 010
c. 011
d. 111

165. Which of the following is correct? [17S01]
a. A noise-free channel is not a deterministic channel
b. For a noise-free channel, H(X/Y) = 1
c. The channel matrix of a noise-free channel is an identity matrix
d. A noise-free channel is of infinite capacity

166. In a communication system, the information lost in the channel is measured using [17S02]
a. H(X/Y)
b. I(X,Y)
c. H(X) - H(X/Y)
d. H(Y/X)

167. The capacity of a BSC with infinite bandwidth is not infinity, because [17S03]
a. noise power in the channel varies linearly with the bandwidth
b. noise power in the channel varies inversely with the bandwidth
c. noise power in the channel is independent of the bandwidth
d. noise power in the channel will not affect the signal power

168. For a noise-free channel, I(X,Y) is equal to the [17S04]
a. entropy of the source, given the receiver
b. entropy of the receiver, given the source
c. entropy of the source
d. joint entropy of the source and the receiver

169. The output of a continuous source is a Gaussian random variable with variance σ² and is band-limited to fm Hz. The maximum entropy of the source is [17S05]
a. b. c. d.

170. If the generator polynomial of a (7,4) non-systematic cyclic code is given as g(x) = 1 + x + x² + x⁴, then the binary word corresponding to x².g(x) + g(x) is [18D01]
a. 1101110
b. 1101001
c. 1010101
d. 1010111

171. A (7,4) systematic cyclic code has the generator polynomial g(x) = 1 + x + x³, and the code polynomial is V(x) = x + … . Then the remainder of the division V(x)/g(x) is [18D02]
a.
b. zero
c.
d. syndrome vector
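The extension-entropy and differential-entropy questions above check out numerically; a short sketch (the explicit enumeration of symbol pairs is mine):

```python
from math import log2
from itertools import product

# Second-order extension of the source {A: 0.8, B: 0.2}: entropy doubles.
p = {"A": 0.8, "B": 0.2}
pairs = {a + b: p[a] * p[b] for a, b in product(p, repeat=2)}
H2 = -sum(q * log2(q) for q in pairs.values())
print(H2)  # ~1.44 bits/message (= 2 x entropy of the base source)

# Differential entropy of a uniform RV on (-1, 1): log2(b - a) = log2(2).
print(log2(1 - (-1)))  # 1.0 bit/sample

# 128 equiprobable characters: log2(128).
print(log2(128))  # 7.0 bits/character
```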


172. The output of a continuous source is a uniform random variable on (0,1). Then [18M01]
a. the absolute entropy of the source is zero
b. the output of the source is a Gaussian random variable
c. the relative entropy of the source is zero
d. the source is a discrete memoryless source

173. The generator sequence of an adder in a convolutional encoder is (1,1,1,1). It is the adder's response to an input sequence of [18M02]
a. 0, 1, 0, 0, ...
b. 0, 0, 1, 0, 0, 0, 1, 0, ...
c. 1, 0, 0, 0, ...
d. 1, 1, 1, ...

174. In a communication system, the average amounts of uncertainty associated with the source, the sink, and the source and sink jointly are 1.0613, 1.5 and 2.432 bits/message respectively. Then the information transferred by the channel connecting the source and sink, in bits, is [18M03]
a. 0.1293
b. 4.9933
c. 1.945
d. 2.8707

175. The efficiency of transmission of information can be measured by [18S01]
a. comparing the entropy of the source and the maximum limit for the rate of transmission of information over the channel
b. comparing the maximum limit for the rate of transmission of information over the channel and the conditional entropy of the receiver, given the source
c. comparing the actual rate of transmission and the maximum limit for the rate of transmission of information over the channel
d. comparing the entropy of the source and the information content of each individual message of the source
(Ans. c)

176. The binary erasure channel is the mathematical model of [18S02]
a. the effect of channel noise resulting in an incorrect decision on the transmitted message bit
b. the inability of the receiver to make a decision about the received message bit in the background of noise
c. the error-correction mechanism at the receiving side
d. the error-detection mechanism at the receiving side

177. In which of the following matrices is the sum of each row one? [18S03]
a. joint probability matrix
b. channel matrix
c. conditional probability matrix of the source, given the receiver
d. generator matrix

178. If T is the code vector and H is the parity-check matrix of a linear block code, then the code is defined by the set of all code vectors for which [18S04]
a. T.Hᵀ = 0
b. Hᵀ.T = 0
c. H.T = 0
d. … = 0

179. Which of the following is correct? [18S05]
a. The entropy measure of a continuous source is not an absolute measure
b. A binary symmetric channel is a noise-free channel
c. The channel capacity of a symmetric channel is always 1 bit/symbol
d. Self-information and mutual information are one and the same

180. A BSC has a transition probability of P. The cascade of two such channels is [19D01]
a. an asymmetric channel with transition probability 2P
b. a symmetric channel with transition probability P²
c. an asymmetric channel with transition probability P(1 - P)
d. a symmetric channel with transition probability 2P(1 - P)

181. A source transmits four messages with probabilities 0.5, 0.25, 0.125 and 0.125. By using Huffman coding, the percentage reduction in the average source code word length is [19D02]
a. 10 %
b. 20 %
c. 12.5 %
d. 25 %

182. The parity polynomial in the generation of a systematic (7,4) cyclic code for the data word 1100 is 1 + x². The corresponding code word is [19M01]
a. 1 1 0 0 1 0 1
b. 1 1 1 0 0 1 0
c. 1 1 1 0 1 0 0
d. 1 1 0 1 0 1 0

183. Which of the following are prefix-free codes? [19M02]
a. b. c. d.
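The Huffman question above (messages with probabilities 0.5, 0.25, 0.125, 0.125) can be checked with the standard identity that the expected Huffman code length equals the sum of the probabilities of all merged (internal) nodes. A minimal sketch; the function name is mine:

```python
import heapq

def huffman_avg_len(probs):
    """Expected binary-Huffman code length = sum of merged-node probabilities."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        merged = heapq.heappop(heap) + heapq.heappop(heap)
        total += merged
        heapq.heappush(heap, merged)
    return total

L = huffman_avg_len([0.5, 0.25, 0.125, 0.125])
print(L)  # 1.75 bits, vs 2 bits for a fixed-length code
print((2 - L) / 2)  # 0.125, i.e. a 12.5 % reduction
```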


184. Which of the following is a single-error-correcting perfect code? [19M03]
a. (6,3) systematic cyclic code
b. (7,4) systematic linear block code
c. (15,11) Hamming code
d. convolutional code of code rate 1/2

185. If X is the transmitted message and Y is the received message, then the average information content of the pair (X,Y) is equal to the average information of Y plus [19S01]
a. the average information of X
b. the average information of Y after X is known
c. the average information of X after Y is known
d. the mutual information of (X,Y)

186. The entropy H(X2/X1) is [19S02]
a. the information of X2 to someone who knows X1
b. the information of X2 without knowing X1
c. the mutual information of X1 and X2
d. the effect of noise in receiving X1 as X2

187. If X and Y are the transmitter and the receiver in a BSC, P(X = i/Y = j) measures the [19S03]
a. uncertainty about the received bit based on the transmitted bit
b. certainty about the received bit based on the transmitted bit
c. certainty about the transmitted bit based on the received bit
d. uncertainty about the transmitted bit based on the received bit

188. If X and Y are related in a one-to-one manner, then H(X/Y) in bits is [19S04]
a. 1
b. log m, m being the number of messages of the source
c. 0
d. 0.5

189. If the output of the channel is independent of the input, then [19S05]
a. maximum information is conveyed over the channel
b. no information is transmitted over the channel
c. no errors will occur during transmission
d. the information loss is zero

190. … Its Hamming distance is [20D01]
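The "perfect code" question above can be examined with the Hamming bound: a single-error-correcting (n,k) code is perfect exactly when 2^(n-k) = n + 1. A quick check of the listed candidates:

```python
# Hamming-bound check for single-error-correcting perfect codes:
# perfect <=> 2**(n - k) == n + 1.
candidates = [(6, 3), (7, 4), (15, 11)]
results = {(n, k): 2 ** (n - k) == n + 1 for n, k in candidates}
print(results)
```

Both the (7,4) and (15,11) Hamming parameters meet the bound with equality; the (6,3) parameters do not.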

191. A source with equally likely outputs is connected to a communication channel with channel matrix … . The columns of the matrix represent the probabilities that a transmitted bit is identified as 0, that a transmitted bit is unidentified, and that a transmitted bit is identified as 1, respectively. Then the probability that a bit is not identified is [20D02]
a. 0.4
b. 0.6
c. 0.3
d. 0.2

192. The Hamming distance between the code vectors Ci and Cj is the [20M01]
a. weight of …
b. minimum of the weights of Ci and Cj
c. weight of …
d. sum of the weights of Ci and Cj

193. The minimum number of parity bits required for a single-error-correcting linear block code with 11 data bits is [20M02]
a. 3
b. 5
c. 6
d. 4

194. A source X with a symbol rate of 1000 symbols/sec is connected to a receiver Y using a BSC with transition probability P. The messages of the source are equally likely. Then the rate of information transmission over the channel, in bits per sec, is [20M03]
a. b. c. d.

195. Entropy coding is a [20S01]
a. variable-length coding scheme
b. fixed-length coding scheme
c. channel coding scheme
d. differential coding scheme

196. Which of the following is correct? [20S02]
a. Mutual information is symmetric in the transmitted and received pairs
b. The binary erasure channel is a symmetric channel
c. The channel matrix gives the joint probabilities of the transmitted and received pairs
d. The channel capacity of a noise-free channel is zero

197. For a BSC with transition probability P, the bit error probability is [20S03]
a. 1 - P
b. P
c. 2P
d. 2(1 - P)

198. A (4,3) parity-check code can [20S04]
a. correct all single error patterns
b. detect all double error patterns
c. detect all triple error patterns
d. correct all single error patterns and detect all double error patterns

199. A source with an information rate of 80 kbps is connected to a communication channel of capacity 66.6 kbps. Then [20S05]
a. error-free transmission is not possible
b. channel coding results in error-free transmission
c. source coding will make the errors corrected at the receiver
d. mutual information becomes maximum
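Two of the questions above (the parity bits needed for 11 data bits, and the information rate over a BSC with equally likely inputs) can be sketched numerically. The illustrative value P = 0.1 is my choice, since the paper keeps P symbolic:

```python
from math import log2

# Smallest r with 2**r >= m + r + 1 (Hamming bound) for m = 11 data bits.
m, r = 11, 1
while 2 ** r < m + r + 1:
    r += 1
print(r)  # 4

# Equally likely inputs on a BSC give I(X;Y) = 1 - H(P) bits/symbol.
def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

P = 0.1  # illustrative only
print(1000 * (1 - binary_entropy(P)))  # ~531 bits/sec at P = 0.1
```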

