Consider a simple HTTP streaming model. Assume the HTTP video server sends bits at a constant rate of 5 Mbps.
Both answers are a. Can anyone show the working? Thank you.
20. Consider a simple HTTP streaming model. Assume the HTTP video server sends bits at a constant rate of 5 Mbps. Let the size of the client application buffer be 1000 Kb. The client buffer must have 500 Kb buffered before the client application can begin playout. Ignore TCP buffers. The video being sent by the server has a frame rate of 30 frames/sec. The size of each frame is 100,000 bits. At what time does the client application buffer become full?
a. 350 milliseconds
b. 300 milliseconds
c. 100 milliseconds
d. 250 milliseconds
e. 200 milliseconds

21. Consider the same scenario as the previous question. What is the initial playout delay?
a. 100 milliseconds
b. 1000 milliseconds
c. 75 milliseconds
d. 50 milliseconds
e. 25 milliseconds

Solution:
The HTTP video server's streaming rate is 5 Mbps = 5,000,000 bits/sec.
The client application buffer size is 1000 Kb = 1,000,000 bits.
The client must buffer 500 Kb = 500,000 bits before playout can begin.
The frame rate is 30 frames/sec and each frame is 100,000 bits, so the playout (drain) rate once playback starts is 30 × 100,000 = 3,000,000 bits/sec = 3 Mbps.
Initial playout delay (question 21): before playout begins, nothing is being consumed, so the buffer fills at the full server rate of 5 Mbps. The time to buffer 500,000 bits is 500,000 / 5,000,000 = 0.1 s = 100 ms. The answer is (a).

Time for the buffer to become full (question 20): once playout begins at t = 100 ms, the client drains the buffer at 3 Mbps while the server continues filling it at 5 Mbps, so the buffer grows at the net rate of 5 − 3 = 2 Mbps. The remaining 1,000,000 − 500,000 = 500,000 bits take 500,000 / 2,000,000 = 0.25 s = 250 ms to arrive. The buffer is therefore full at 100 ms + 250 ms = 350 ms. The answer is (a).
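As a quick sanity check, here is a small Python sketch (not part of the original solution) that reproduces both numbers using only the values given in the question:

```python
# Sanity check for questions 20 and 21; all constants come from the question.
SERVER_RATE = 5_000_000       # bits/sec (5 Mbps streaming rate)
BUFFER_SIZE = 1_000_000       # bits (1000 Kb client application buffer)
PLAYOUT_THRESHOLD = 500_000   # bits (500 Kb needed before playout starts)
FRAME_RATE = 30               # frames/sec
FRAME_SIZE = 100_000          # bits per frame

# Playout (drain) rate once the client starts consuming video.
drain_rate = FRAME_RATE * FRAME_SIZE              # 3,000,000 bits/sec

# Q21: before playout, the buffer fills at the full server rate.
playout_delay = PLAYOUT_THRESHOLD / SERVER_RATE   # 0.1 s

# Q20: after playout starts, the buffer grows at the net (fill - drain) rate.
net_rate = SERVER_RATE - drain_rate               # 2,000,000 bits/sec
time_to_full = playout_delay + (BUFFER_SIZE - PLAYOUT_THRESHOLD) / net_rate

print(f"initial playout delay: {playout_delay * 1000:.0f} ms")   # -> 100 ms
print(f"buffer becomes full at: {time_to_full * 1000:.0f} ms")   # -> 350 ms
```

Running it prints 100 ms and 350 ms, matching answer (a) for both questions.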
