
X(t) is a binary waveform carrying a payload of equiprobable bits:

X(t) = \sum_{k=-\infty}^{\infty} A_k \,\mathrm{rect}(t + t_0 - kT)

where A_k is a sequence of independent and identically distributed random variables with A_k \in \{0, A\}, each value taken with probability 1/2. The random variable t_0 is assumed to be uniformly distributed over [0, T] with density

f_{t_0}(s) = \begin{cases} 1/T, & 0 \le s \le T \\ 0, & \text{otherwise} \end{cases}

What is the autocorrelation function of the random process X(t)? What is the spectrum of the random process X(t)?

Solution

Here are the notes you need for this question; you can work out the solution by following them.

a)

Autocorrelation of Random Processes

Page by: Michael Haag

Summary

Before diving into a more complex statistical analysis of random signals and processes, let us quickly review the idea of correlation. Recall that the correlation of two signals or variables is the expected value of the product of those two variables. Since our focus will be to discover more about a random process, a collection of random signals, imagine us dealing with two samples of a random process, where each sample is taken at a different point in time. Also recall that the key property of these random processes is that they are now functions of time; imagine them as a collection of signals. The expected value of the product of these two variables (or samples) will now depend on how quickly they change with respect to time. For example, if the two variables are taken from almost the same time period, then we should expect them to have a high correlation. We will now look at a correlation function that relates a pair of random variables from the same process to the time separation between them, where the argument to this correlation function will be the time difference. For the correlation of signals from two different random processes, look at the crosscorrelation function.

Autocorrelation Function

The first of these correlation functions we will discuss is the autocorrelation, where each of the random variables we will deal with comes from the same random process.

Autocorrelation

the expected value of the product of a random variable or signal realization with a time-shifted version of itself

With a simple calculation and analysis of the autocorrelation function, we can discover a few important characteristics about our random process. These include:

How quickly our random signal or process changes with respect to time

Whether our process has a periodic component and what the expected frequency might be

As was mentioned above, the autocorrelation function is simply the expected value of a product. Assume we have a pair of random variables from the same process, X_1 = X(t_1) and X_2 = X(t_2); then the autocorrelation is often written as

R_{xx}(t_1, t_2) = E[X_1 X_2] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 \, f(x_1, x_2) \, dx_2 \, dx_1

The above equation is valid for stationary and nonstationary random processes. For stationary processes, we can generalize this expression a little further. Given a wide-sense stationary process, it can be proven that the expected values from our random process will be independent of the origin of our time function. Therefore, we can say that our autocorrelation function will depend on the time difference and not on absolute time. For this discussion, we will let \tau = t_2 - t_1, and thus we generalize our autocorrelation expression as

R_{xx}(t, t+\tau) = R_{xx}(\tau) = E[X(t)\, X(t+\tau)] for the continuous-time case.
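As a quick worked illustration of this definition (an example added here, not taken from the original notes), consider the random-phase sinusoid X(t) = a\cos(2\pi f_0 t + \Theta) with \Theta uniformly distributed over [0, 2\pi). Using the product-to-sum identity and averaging over \Theta,

R_{xx}(\tau) = E\big[a\cos(2\pi f_0 t + \Theta)\, a\cos(2\pi f_0 (t+\tau) + \Theta)\big] = \frac{a^2}{2}\, E\big[\cos(2\pi f_0 \tau) + \cos(2\pi f_0 (2t+\tau) + 2\Theta)\big] = \frac{a^2}{2}\cos(2\pi f_0 \tau)

since the second cosine averages to zero over \Theta. The result depends only on \tau, as expected for a wide-sense stationary process, and it already exhibits the properties listed further below: it is even, it peaks at \tau = 0, and it is periodic with the same frequency f_0.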

In most DSP courses we will be more interested in dealing with real signal sequences, and thus we will want to look at the discrete-time case of the autocorrelation function. The formula below will prove to be more common and useful than the one above:

R_{xx}[n, n+m] = \sum_{n=-\infty}^{\infty} x[n]\, x[n+m]

And again we can generalize the notation for our autocorrelation function as

R_{xx}[n, n+m] = R_{xx}[m] = E\big[X[n]\, X[n+m]\big]
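To make the ensemble definition concrete, here is a minimal numerical sketch (my own addition, not part of the notes). It estimates R_xx[m] = E[X[n] X[n+m]] by averaging the product over many independently generated realizations; the choice of an i.i.d. {0, A} process and all sample sizes below are assumptions made only for the demo.

import numpy as np

rng = np.random.default_rng(0)
A = 1.0                 # amplitude level, chosen arbitrarily for the demo
n_realizations = 20000  # number of independent sample functions
n_samples = 64          # length of each sample function

# Each row is one realization of an i.i.d. process taking the values {0, A}
# with probability 1/2 each (stationary, so R_xx depends only on the lag m).
X = A * rng.integers(0, 2, size=(n_realizations, n_samples))

def ensemble_autocorr(X, m):
    """Estimate R_xx[m] = E[X[n] X[n+m]] by averaging over all realizations and valid n."""
    return np.mean(X[:, : X.shape[1] - m] * X[:, m:])

for m in range(4):
    print(f"R_xx[{m}] ~= {ensemble_autocorr(X, m):.3f}")
# For this toy process: R_xx[0] = E[X^2] = A^2/2 = 0.5, and R_xx[m] = (A/2)^2 = 0.25 for m != 0.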

Properties of Autocorrelation

Below we will look at several properties of the autocorrelation function that hold for stationary random processes.

Autocorrelation is an even function of \tau:

R_{xx}(\tau) = R_{xx}(-\tau)

The mean-square value can be found by evaluating the autocorrelation where \tau = 0, which gives us

R_{xx}(0) = E\big[X^2\big]

The autocorrelation function will have its largest value when \tau = 0. This value can appear again, for example in a periodic function at the equivalent periodic points, but will never be exceeded.

R_{xx}(0) \ge |R_{xx}(\tau)|

If we take the autocorrelation of a periodic function, then R_{xx}(\tau) will also be periodic with the same frequency.
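As a quick numerical check of these properties (again my own addition; the cosine-plus-noise test record and all parameters are assumptions made only for illustration), the snippet below computes the full correlation sum of one record and verifies evenness, the peak at zero lag, and the reappearance of the peak at the signal period.

import numpy as np

rng = np.random.default_rng(3)
n = np.arange(2048)
# Toy record: a cosine with a period of 32 samples plus a little noise.
x = np.cos(2 * np.pi * n / 32) + 0.2 * rng.standard_normal(n.size)

# Full correlation sum over all lags, matching the summation formula given earlier.
r = np.correlate(x, x, mode="full")   # lags run from -(N-1) to +(N-1)
zero = x.size - 1                     # index corresponding to lag 0

print(np.allclose(r[zero + 1:], r[:zero][::-1]))  # evenness: R[m] == R[-m]
print(np.all(r[zero] >= np.abs(r)))               # R[0] is never exceeded
print(r[zero + 32] / r[zero])                     # large again (near 1): the periodic component reappears at its period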

Estimating the Autocorrelation with Time-Averaging

Sometimes the whole random process is not available to us. In these cases, we would still like to be able to find out some of the characteristics of the stationary random process, even if we just have part of one sample function. In order to do this we can estimate the autocorrelation from a given interval, 0 to T seconds, of the sample function.

\hat{R}_{xx}(\tau) = \frac{1}{T} \int_0^{T} x(t)\, x(t+\tau)\, dt

However, a lot of times we will not have sufficient information to build a complete continuous-time function of one of our random signals for the above analysis. If this is the case, we can treat the information we do know about the function as a discrete signal and use the discrete-time formula for estimating the autocorrelation.

\hat{R}_{xx}[m] = \frac{1}{N-m} \sum_{n=0}^{N-m-1} x[n]\, x[n+m]
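A minimal sketch of this time-averaged estimator (my addition; the function name and the test signal are assumptions, not from the notes):

import numpy as np

def autocorr_estimate(x, max_lag):
    """Time-averaged estimate: R_xx[m] = (1 / (N - m)) * sum_{n=0}^{N-m-1} x[n] x[n+m]."""
    x = np.asarray(x, dtype=float)
    N = x.size
    return np.array([np.dot(x[: N - m], x[m:]) / (N - m) for m in range(max_lag + 1)])

# Demo on a single long sample function of the same {0, A} toy process used earlier
# (again an assumption made only for illustration).
rng = np.random.default_rng(1)
A = 1.0
x = A * rng.integers(0, 2, size=100_000)
print(autocorr_estimate(x, 3))  # should approach A^2/2 at lag 0 and A^2/4 at the other lags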

b) Let x(n) be a discrete wide-sense stationary random process with autocorrelation function

R_x(k) = E[x(n)\, x(n+k)]
For each sample function of the process x(n) we can define the truncated Fourier transform

X_N(e^{j2\pi f}) \triangleq \sum_{n=-N}^{N} x(n)\, e^{-j2\pi f n}

The corresponding truncated power spectral density is \frac{1}{2N+1} \left| X_N(e^{j2\pi f}) \right|^2.
Since x(n) is a random process, for each e^{j2\pi f} the quantity \frac{1}{2N+1} \left| X_N(e^{j2\pi f}) \right|^2 is a random variable. Let us denote its expectation by

S_{x,N}(e^{j2\pi f}) \triangleq E\!\left[ \frac{1}{2N+1} \left| X_N(e^{j2\pi f}) \right|^2 \right]
Also in this case, a natural definition of the power spectral density of the process is therefore

S_x(e^{j2\pi f}) = \lim_{N\to\infty} S_{x,N}(e^{j2\pi f})
The Wiener-Khintchine theorem asserts that the limit indicated in the previous formula exists for all e^{j2\pi f} and that its value, for a discrete-time process, is

S_x(e^{j2\pi f}) = \sum_{k=-\infty}^{\infty} R_x(k)\, e^{-j2\pi f k}
The only condition required is that the Fourier transform of the autocorrelation function exists.
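As a small numerical sanity check of the theorem (a sketch of my own; the test process and every parameter below are assumptions chosen only for the demo), the expected normalised periodogram E[|X_N(e^{j2\pi f})|^2 / (2N+1)], estimated by averaging many realizations, should approach the Fourier transform of the autocorrelation sequence.

import numpy as np

rng = np.random.default_rng(2)
N = 128                  # truncation half-length: each realization has 2N + 1 samples
n_realizations = 500
freqs = np.linspace(-0.5, 0.5, 257)
n = np.arange(-N, N + 1)

# Test process (assumed for the demo): x(n) = w(n) + w(n-1) with w zero-mean, unit-variance
# white noise, so R_x(0) = 2, R_x(+/-1) = 1 and R_x(k) = 0 otherwise.
def realization(length):
    w = rng.standard_normal(length + 1)
    return w[1:] + w[:-1]

# DTFT matrix: entry (f, n) is exp(-j 2 pi f n), so X_N = dtft @ x for each realization.
dtft = np.exp(-2j * np.pi * np.outer(freqs, n))

# Left-hand side of the theorem: E[|X_N|^2 / (2N + 1)], estimated by averaging realizations.
acc = np.zeros(freqs.size)
for _ in range(n_realizations):
    x = realization(2 * N + 1)
    acc += np.abs(dtft @ x) ** 2 / (2 * N + 1)
S_periodogram = acc / n_realizations

# Right-hand side: the transform of the known autocorrelation, sum_k R_x(k) exp(-j 2 pi f k).
S_theory = 2 + 2 * np.cos(2 * np.pi * freqs)

# The two curves agree up to estimation noise that shrinks as N and n_realizations grow.
print(np.max(np.abs(S_periodogram - S_theory)))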
