A source of information randomly generates symbols from a five-letter alphabet {a, b, c, d, e}. The probabilities P(a), P(b), P(c), P(d), and P(e) of the symbols are given in the table below. These symbols are encoded into binary codes using the scheme shown in the table. Let the random variable L denote the length of the binary code (for example, the length of the binary code for the symbol c is 3 bits). Find the expectation and variance of L.

While a source encoder converts a stream of symbols to a stream of bits (in this case, based on the mapping shown in the table), a source decoder converts a stream of bits back to symbols. Suppose that the amount of energy required to decode a code of length L on a particular device is given by the function g(L) = 2^L joules. Find the expectation and variance of the energy g(L).

In the field of information theory, a quantity called entropy is used as a measure of information. Let X be a discrete random variable that takes values in the set X (often referred to as the alphabet) and has probability mass function p_X(x) = P({X = x}), x ∈ X. The entropy H(X) of the discrete random variable X is defined by

H(X) = − Σ_{x ∈ X} p_X(x) log p_X(x).
Solution
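As a numerical sketch of the required computations, the following Python snippet evaluates E[L], Var(L), E[g(L)], Var(g(L)), and H(X). Since the problem's table of probabilities and code lengths is not reproduced above, the values of `probs` and `lengths` here are illustrative assumptions only (chosen to be consistent with the stated 3-bit code for c); the method, not the numbers, is the point.

```python
from math import log2

# Assumed illustrative values -- the original table is not reproduced here.
probs   = [1/2, 1/4, 1/8, 1/16, 1/16]   # P(a), ..., P(e)  (hypothetical)
lengths = [1, 2, 3, 4, 4]               # code lengths in bits (hypothetical;
                                        # consistent with "c is 3 bits")

# Expectation and variance of L: E[L] = sum p*l, Var(L) = E[L^2] - E[L]^2.
E_L   = sum(p * l for p, l in zip(probs, lengths))
E_L2  = sum(p * l**2 for p, l in zip(probs, lengths))
var_L = E_L2 - E_L**2

# Energy g(L) = 2^L joules: expectation of a function of a random variable
# (law of the unconscious statistician), E[g(L)] = sum p * 2^l.
E_g   = sum(p * 2**l for p, l in zip(probs, lengths))
E_g2  = sum(p * (2**l)**2 for p, l in zip(probs, lengths))
var_g = E_g2 - E_g**2

# Entropy in bits: H(X) = -sum p * log2(p).
H = -sum(p * log2(p) for p in probs)

print(E_L, var_L, E_g, var_g, H)
```

With these particular (dyadic) probabilities, the entropy H(X) coincides with the expected code length E[L], which is the hallmark of an optimal prefix code for a dyadic distribution.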
