
10 XML Retrieval (4 Marks)

1. *Explain Vector Space model for XML Retrieval.


2. **Explain challenges in XML information retrieval.
3. Write a short note on: data-centric XML retrieval.
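For question 1, the vector space model scores documents by cosine similarity between term-weight vectors; XML retrieval extends this by making each dimension a (structural context, term) pair rather than a bare term. A minimal sketch of the underlying cosine computation (the dict-based sparse representation and the example paths are illustrative assumptions, not a prescribed format):

```python
import math

def cosine(q, d):
    """Cosine similarity between two sparse term-weight vectors (dict: term -> weight)."""
    dot = sum(w * d.get(t, 0.0) for t, w in q.items())
    nq = math.sqrt(sum(w * w for w in q.values()))
    nd = math.sqrt(sum(w * w for w in d.values()))
    return dot / (nq * nd) if nq and nd else 0.0

# In XML retrieval the "terms" become (context path, term) pairs, e.g.:
query = {("book/title", "retrieval"): 1.0}
doc = {("book/title", "retrieval"): 0.8, ("book/author", "manning"): 0.5}
score = cosine(query, doc)
```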

9 Boolean Retrieval
1. **Explain Lemmatization and Stemming in detail.(7)
2. Explain Data Retrieval in brief.(3)
3. ***Write a short note on Tokenization.(4)
4. ***Explain Skip Pointers in brief.(3)
5. Explain Biword Indexes and Positional Indexes in brief(3)
6. **Explain phrase queries with suitable example.(4)
7. *Explain incidence matrix and inverted index with suitable example.(4)
8. Write a short note on: stop words removal.
9. Explain the algorithm of intersecting two postings lists in data retrieval.(7)
10. Explain Information Retrieval in detail(4)
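Question 9's postings-intersection algorithm can be sketched as a linear merge of two sorted doc-ID lists; the doc IDs below are made up for illustration:

```python
def intersect(p1, p2):
    """Merge-style intersection of two sorted postings lists (lists of doc IDs)."""
    answer = []
    i = j = 0
    while i < len(p1) and j < len(p2):
        if p1[i] == p2[j]:
            answer.append(p1[i])   # doc appears in both lists
            i += 1
            j += 1
        elif p1[i] < p2[j]:
            i += 1                 # advance the pointer on the smaller doc ID
        else:
            j += 1
    return answer

# Example: documents containing both query terms
brutus = [1, 2, 4, 11, 31, 45, 173, 174]
caesar = [1, 2, 4, 5, 6, 16, 57, 132, 173]
print(intersect(brutus, caesar))   # [1, 2, 4, 173]
```

The merge walks each list once, so the cost is linear in the total length of the two lists; skip pointers (question 4) speed up exactly this loop.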

8 Vector Quantization
1. **Explain Vector Quantization in brief.(4)
2. **Explain Scalar Quantization in brief.(4)
3. Explain pyramid vector quantization.(3)
4. Explain structured vector quantizers.(3)
5. Explain Linde-Buzo-Gray algorithm in detail(4)
6. Write a short note on tree structure vector quantizer.(7)
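Question 5's Linde-Buzo-Gray algorithm can be read as codebook splitting followed by Lloyd (k-means-style) refinement. A simplified sketch under that reading; the split perturbation, iteration count, and training points are illustrative choices:

```python
def lbg(points, codebook_size, eps=1e-6, iters=20):
    """Linde-Buzo-Gray: split the codebook, then refine with Lloyd iterations.
    codebook_size is assumed to be a power of two (each pass doubles the book)."""
    dim = len(points[0])
    # start from the centroid of the whole training set
    codebook = [[sum(p[d] for p in points) / len(points) for d in range(dim)]]
    while len(codebook) < codebook_size:
        # split every codevector into two slightly perturbed copies
        codebook = [[c + delta for c in cv] for cv in codebook for delta in (eps, -eps)]
        for _ in range(iters):
            clusters = [[] for _ in codebook]
            for p in points:   # nearest-neighbour assignment
                k = min(range(len(codebook)),
                        key=lambda j: sum((p[d] - codebook[j][d]) ** 2 for d in range(dim)))
                clusters[k].append(p)
            for k, cl in enumerate(clusters):   # centroid update
                if cl:
                    codebook[k] = [sum(p[d] for p in cl) / len(cl) for d in range(dim)]
    return codebook

# two obvious clusters -> the two codevectors settle near their centroids
codebook = lbg([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]], 2)
```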
7 Mathematical Preliminaries for Lossy Coding
1. Explain nonuniform quantization.(3)
2. Explain pdf optimized quantization.(3)
3. Explain adaptive quantization with its two approaches(7)
4. List out different types of quantizer. Explain quantization problem with example.(7)
5. Compare Uniform Quantization with Non Uniform Quantization.(3)
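Questions 4 and 5 turn on the basic quantization problem: trading bits for reconstruction error. A midrise uniform quantizer sketch (the step size and samples are arbitrary illustrations); nonuniform and pdf-optimized quantizers replace the equal-width bins with bins matched to the input density:

```python
import math

def uniform_quantize(x, step):
    """Midrise uniform quantizer: encoder picks a bin index, decoder returns its centre."""
    index = math.floor(x / step)   # what the encoder would transmit
    return (index + 0.5) * step    # reconstruction level

samples = [0.12, 0.48, -0.31, 0.95]
recon = [uniform_quantize(x, 0.25) for x in samples]
# each reconstruction error is bounded by step/2 = 0.125
```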
6 Predictive Coding
1. Explain usage of discrete cosine transform (DCT) in JPEG.(7)
2. *Encode the sequence thisḇisḇthe using Burrows-Wheeler transform and move-to-front coding.(7)
3. **Explain prediction with partial match in short.(3)(7)
4. **Explain OLD JPEG Standard.(3)
5. **Explain CALIC.(3)(7)
6. Encode the sequence etaḇcetaḇandḇbetaḇceta using Burrows-Wheeler transform and move-to-front coding.*(7)
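Questions 2 and 6 both ask for Burrows-Wheeler plus move-to-front encoding. A minimal sketch of the two transforms; it uses a "$" sentinel instead of the blank symbol ḇ purely as an implementation convenience, and the "banana" example is an illustration rather than an exam sequence:

```python
def bwt(s):
    """Burrows-Wheeler transform: last column of the sorted rotations, plus row index."""
    s = s + "$"   # unique end-of-string sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations), rotations.index(s)

def mtf(s, alphabet):
    """Move-to-front: emit each symbol's current position, then move it to the front."""
    table, out = list(alphabet), []
    for ch in s:
        k = table.index(ch)
        out.append(k)
        table.insert(0, table.pop(k))
    return out

last, row = bwt("banana")    # ("annb$aa", 4)
ranks = mtf(last, "$abn")    # [1, 3, 0, 3, 3, 3, 0]
```

The BWT groups equal symbols together, so the move-to-front stage emits many small indices, which a later entropy coder compresses well.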
5 Dictionary Techniques
1. *Explain LZ78 with suitable example.(7)
2. *Explain LZ77 with suitable example.(7)
3. Given an initial dictionary Index 1=w, 2=a, 3=b, encode the following message using the LZ78 algorithm: wabbaḇwabbaḇwabbaḇwabbaḇwooḇwooḇwoo.(7)
4. Encode the following sequence using Digram Coding of the Static Dictionary method (generate for 3 bits): abracadabra.(7)
5. *Explain LZW method with example.(7)
6. A sequence is encoded using the LZ77 algorithm. Given that C(a) = 1, C(b) = 2, C(r) = 3,
and C(t)= 4, decode the following sequence of triples: ,< 0, 0, 1>,,< 2, 8, 2>,< 3, 1, 2>,,,
Assume that the size of the window is 20 and the size of the look-ahead buffer is 10.
Encode the decoded sequence and make sure you get the same sequence of triples.(7)
7. Given an initial dictionary consisting of the letters a b r y ḇ, encode the following
message using the LZW algorithm: aḇbarḇarrayḇbyḇbarrayarḇbay(7)
8. Encode the following sequence using the LZ77 and LZ78 algorithm:
ḇarrayarḇbarḇbyḇbarrayarḇba
Assume you have a window size of 30 with a look-ahead buffer of size 15. Furthermore
assume that C(a)=1, C(b)=2, C(ḇ)=3, C(r)=4, and C(y)=5. * (7)
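Questions 1 and 3 ask to trace LZ78. A minimal encoder sketch starting from an empty dictionary; the exam questions start from a given initial dictionary, so their traces will differ, and the "abababa" input is just an illustration:

```python
def lz78_encode(message):
    """LZ78: emit (dictionary index, next symbol) pairs while growing the dictionary."""
    dictionary = {}   # phrase -> index (1-based); index 0 means "empty prefix"
    out, phrase = [], ""
    for ch in message:
        if phrase + ch in dictionary:
            phrase += ch            # keep extending the longest known phrase
        else:
            out.append((dictionary.get(phrase, 0), ch))
            dictionary[phrase + ch] = len(dictionary) + 1
            phrase = ""
    if phrase:                      # input ended inside a known phrase
        out.append((dictionary.get(phrase[:-1], 0), phrase[-1]))
    return out

pairs = lz78_encode("abababa")   # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a')]
```

LZ77 differs by replacing the explicit dictionary with ⟨offset, length, symbol⟩ triples into a sliding window, as in questions 2, 6 and 8.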

4 Arithmetic Coding
1. Write pseudocode for integer arithmetic encoding and decoding algorithm.(7)
2. Encode and Decode “AABBC” with arithmetic coding. (P(A)=0.6, P(B)=0.3, P(C)=0.1)(7)
3. *Encode “acadebaa” using Adaptive Huffman code. Derive codes and final tree.(7) (Chapter 3)
4. Write the method to generate a tag in arithmetic coding.(7)
5. Write an encoding algorithm for arithmetic coding.(7)
6. *Encode and decode “BACBA” with Arithmetic Coding. [P(A)=0.5, P(B)=0.3,P(C)=0.2](7)
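Question 4's tag-generation procedure narrows an interval once per symbol and takes the midpoint of what is left. A floating-point sketch (real coders use the integer implementation of question 1 to avoid precision loss); the example uses question 6's model:

```python
def arithmetic_tag(sequence, probs):
    """Narrow [low, high) once per symbol; the midpoint of the final interval is the tag."""
    cum, c = {}, 0.0
    for s, p in probs.items():   # cumulative distribution, in dict order
        cum[s] = c
        c += p
    low, high = 0.0, 1.0
    for s in sequence:
        span = high - low
        high = low + span * (cum[s] + probs[s])
        low = low + span * cum[s]
    return (low + high) / 2

tag = arithmetic_tag("BACBA", {"A": 0.5, "B": 0.3, "C": 0.2})   # ≈ 0.63725
```

The decoder runs the same narrowing in reverse: at each step it finds which symbol's subinterval contains the tag, emits that symbol, and rescales.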

Unit 3: Huffman coding


1. An alphabet S = {a1, a2, a3, a4, a5} has symbol probabilities P(a1) = 0.4, P(a2) = 0.3, P(a3) = 0.2, P(a4) = 0.09, and P(a5) = 0.01. Find the Huffman code, source entropy, average length and compression ratio. 04
2. Explain Huffman Coding with respect to minimum variance Huffman codes with
separate trees. 07
3. Explain the Encoding process of Adaptive Huffman Algorithm. 07
4. How does Extended Huffman coding reduce the average code length? Prove using alphabet A = {a1, a2, a3} with probabilities 0.95, 0.03 and 0.02 respectively. 07
5. Consider a source containing 26 distinct symbols [A-Z]. Encode given sequence of
symbols using Adaptive Huffman algorithm. Symbol Sequence: MUMMY. 07
7. Design a minimum variance Huffman code for a source that puts out letters from an alphabet A = {a1, a2, a3, a4, a5, a6} with P(a1) = P(a2) = 0.2, P(a3) = 0.25, P(a4) = 0.05, P(a5) = 0.15, P(a6) = 0.15. Find the entropy of the source, avg. length of the code and efficiency. Also comment on the difference between Huffman code and minimum variance Huffman code. 07

8. Explain Huffman Coding in detail with example. Define minimum variance Huffman codes.
07 // Explain Huffman Coding with suitable example. 07
9. Encode “aacdeaab” using Adaptive Huffman code. Derive Output string, Codes and final
tree. 07
10. Encode “acadebaa” using Adaptive Huffman code. Derive, Codes and final tree. 07
11. Consider a source that emits letters from an alphabet A = {a1, a2, a3, a4} with probabilities P(a1) = 0.3, P(a2) = 0.2, P(a3) = 0.35, P(a4) = 0.15. [I] Find a Huffman code using the minimum variance procedure. [II] Find the average length of the code. 07
12. Write a procedure to generate Adaptive Huffman Code. 03
13. Generate Golomb code for m=9 and n=8 to 13. 07
14. Generate Golomb code for m=5 and n=4 to 10. 04 // Generate Golomb code for m=5 and n=0 to 10. 04
15. Write a procedure to generate Tunstall code. Generate the Tunstall code for P(A)=0.6, P(B)=0.3, P(C)=0.1 and n=3 bits. 07 // Write a short note on Tunstall Code. 04
16. Explain Tunstall Codes with example. 07
17. Generate Tunstall code for P(A)=0.4, P(B)=0.3, P(C)=0.3 and n=3 bits. 04
18. Explain Rice Codes in brief. 03
19. Write different applications of Huffman Coding. 03 // Write different applications of Huffman Coding. 07
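Question 1 can be checked mechanically. A heap-based Huffman sketch; tie-breaking order, and hence the exact codewords, may differ from a hand construction (and from the minimum variance variant), but the code lengths and average length match:

```python
import heapq

def huffman(probs):
    """Huffman codes via a min-heap; the counter breaks probability ties."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}   # prepend branch bits
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.3, "a3": 0.2, "a4": 0.09, "a5": 0.01}   # question 1
codes = huffman(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)              # 2.0 bits/symbol
```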
