Please try to solve this within an hour.
Extracted text: Decode the binary data "00100011111011000" in terms of the symbols {S1, S2, S3, S4, S5, S6} with probabilities P(S) = {0.21, 0.39, 0.12, 0.13, 0.08, 0.07}, which were encoded using Huffman coding.
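One way to work this out is to rebuild a Huffman code for the given probabilities and then walk the bit string through it. A minimal sketch follows; note that the exact codewords (and therefore the decoded sequence) depend on tie-breaking and on which branch is labeled 0 vs 1, so a textbook's codebook may assign different bit patterns of the same lengths. The merge order and 0/1 labeling below are assumptions, not taken from the original question.

```python
import heapq
import itertools

# Symbol probabilities from the question.
probs = {"S1": 0.21, "S2": 0.39, "S3": 0.12,
         "S4": 0.13, "S5": 0.08, "S6": 0.07}

def huffman_code(p):
    """Build a Huffman codebook {symbol: bitstring} for probabilities p."""
    counter = itertools.count()  # tie-breaker so the heap never compares nodes
    heap = [(pr, next(counter), sym) for sym, pr in p.items()]
    heapq.heapify(heap)
    # Repeatedly merge the two least-probable nodes.
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(counter), (a, b)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, str):          # leaf: a symbol
            code[node] = prefix or "0"
        else:                              # internal node: (left, right)
            walk(node[0], prefix + "0")    # labeling convention: left = 0
            walk(node[1], prefix + "1")
    walk(heap[0][2], "")
    return code

def decode(bits, code):
    """Decode a bit string with a prefix code; returns (symbols, leftover bits)."""
    rev = {v: k for k, v in code.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:
            out.append(rev[cur])
            cur = ""
    return out, cur

code = huffman_code(probs)
symbols, leftover = decode("00100011111011000", code)
```

With this construction the codeword lengths come out as {S2: 1, S1: 3, S3: 3, S4: 3, S5: 4, S6: 4}, which matches the standard Huffman merge sequence for these probabilities (0.07+0.08, then 0.12+0.13, and so on); any valid Huffman code for this source has the same lengths and the same average length of 2.37 bits/symbol, even if the individual bit patterns differ.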
Extracted text: This is the syllabus for the question in the given picture. Course: Information Theory and Coding. Module No. 1: Introduction to information theory, information rate and entropy, measure of information, properties of entropy of a binary memoryless source. Module No. 2: Entropy, relative entropy, and mutual information; discrete memoryless channels - BSC, BEC, noise-free channel, channel with independent I/O, cascaded channels.
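Since Module 1 covers entropy and the measure of information, it is worth checking the source entropy of the symbol set in the question; the source coding theorem guarantees that the average Huffman codeword length L satisfies H(S) <= L < H(S) + 1. A small sketch, reusing the probabilities from the question:

```python
import math

# Symbol probabilities from the question.
probs = {"S1": 0.21, "S2": 0.39, "S3": 0.12,
         "S4": 0.13, "S5": 0.08, "S6": 0.07}

# Source entropy H(S) = -sum p * log2(p), in bits/symbol.
H = -sum(p * math.log2(p) for p in probs.values())
```

This gives H(S) of roughly 2.31 bits/symbol, just below the Huffman average length of 2.37 bits/symbol for this source, so the code operates close to the entropy bound.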