Let X and Y be independent Bernoulli random variables with parameter 1/2.

Show that X + Y and |X - Y| are dependent though uncorrelated.

Solution

In probability theory and statistics, two real-valued random variables X and Y are said to be uncorrelated if their covariance, E(XY) - E(X)E(Y), is zero. A set of two or more random variables is called uncorrelated if each pair of them is uncorrelated. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero, except in the trivial case where either variable has zero variance (i.e., is almost surely constant), in which case the correlation coefficient is undefined.
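For reference, the correlation coefficient is just the covariance normalised by the two standard deviations, which makes the zero-variance caveat explicit:

$$
\rho_{X,Y} \;=\; \frac{\operatorname{Cov}(X,Y)}{\sigma_X \, \sigma_Y} \;=\; \frac{E(XY) - E(X)E(Y)}{\sigma_X \, \sigma_Y},
$$

and this expression is undefined whenever $\sigma_X = 0$ or $\sigma_Y = 0$.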

In general, uncorrelatedness is not the same as orthogonality, except in the special case where either X or Y has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E(XY) = 0.
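A one-line check of this special case: if, say, E(X) = 0, then

$$
\operatorname{Cov}(X,Y) \;=\; E(XY) - E(X)E(Y) \;=\; E(XY),
$$

so X and Y are uncorrelated exactly when E(XY) = 0, i.e. when they are orthogonal.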

If X and Y are independent, then they are uncorrelated. However, not all uncorrelated variables are independent. For example, if X is a continuous random variable uniformly distributed on [-1, 1] and Y = X², then X and Y are uncorrelated even though X determines Y and a particular value of Y can be produced by only one or two values of X.
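To verify the uncorrelatedness in this example directly, note that the uniform distribution on [-1, 1] is symmetric about 0, so all odd moments of X vanish:

$$
E(X) = 0, \qquad E(XY) = E(X \cdot X^2) = E(X^3) = 0, \qquad \operatorname{Cov}(X,Y) = E(XY) - E(X)E(Y) = 0.
$$

Yet Y = X² is a non-constant deterministic function of X, so the two variables are clearly dependent.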

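Returning to the stated exercise, one way to carry out the verification is to tabulate the joint distribution of S = X + Y and T = |X - Y| (writing S and T only for brevity) over the four equally likely outcomes of (X, Y), each of probability 1/4:

$$
\begin{array}{cc|cc}
X & Y & S = X + Y & T = |X - Y| \\ \hline
0 & 0 & 0 & 0 \\
0 & 1 & 1 & 1 \\
1 & 0 & 1 & 1 \\
1 & 1 & 2 & 0
\end{array}
$$

From the table, E(S) = (0 + 1 + 1 + 2)/4 = 1, E(T) = (0 + 1 + 1 + 0)/4 = 1/2, and E(ST) = (0 + 1 + 1 + 0)/4 = 1/2, so

$$
\operatorname{Cov}(S, T) \;=\; E(ST) - E(S)E(T) \;=\; \tfrac{1}{2} - 1 \cdot \tfrac{1}{2} \;=\; 0,
$$

which shows that X + Y and |X - Y| are uncorrelated. They are not independent, however: P(S = 2, T = 1) = 0, while P(S = 2) P(T = 1) = (1/4)(1/2) = 1/8 ≠ 0. Equivalently, P(T = 0 | S = 0) = 1 ≠ 1/2 = P(T = 0). Hence X + Y and |X - Y| are dependent though uncorrelated.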
