Systems and Networks
Assume host A sends a 1 MB file to host B through a network. The path from host A to host B has three links with rates R1 = 600 kbps, R2 = 0.7 Mbps, and R3 = 1.3 Mbps.
Assume R2's throughput is reduced to 500 kbps. Under the new conditions, how long will it take to transfer the file to host B? As above, do not take into account the transmission delays at each router, the need for acknowledgments to be returned, or traffic conditions on the network.
Give the answer in seconds, rounded to two decimal places.
Solution
When R2's throughput is reduced to 500 kbps, it becomes the slowest link and therefore the bottleneck of the path between host A and host B. The end-to-end throughput available for transferring the 1 MB file is thus limited to R2 = 500 kbps.
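As a quick sanity check of the bottleneck reasoning, here is a minimal Python sketch (an illustration only, assuming all rates are expressed in kbps, with R2 already reduced to 500 kbps) that picks the slowest link on the path:

# Link rates along the path A -> B, in kbps (R2 reduced to 500 kbps).
link_rates_kbps = {"R1": 600, "R2": 500, "R3": 1300}

# End-to-end throughput is limited by the slowest link on the path.
bottleneck_link = min(link_rates_kbps, key=link_rates_kbps.get)
print(bottleneck_link, link_rates_kbps[bottleneck_link])  # R2 500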
Time taken = file size / bottleneck throughput = 1 MB / 500 kbps
Taking 1 MB = 2^20 bytes = 8 * 1024 kilobits (with the same "kilo" in kbps, so the prefixes cancel):
= (8 * 1024 kilobits) / (500 kilobits per second)
= 8192 / 500 seconds
= 16.384 seconds
= 16.38 seconds (rounded to two decimal places)
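The same arithmetic as a short Python sketch, under the convention above (1 MB = 2^20 bytes, so the file is 8 * 1024 kilobits):

# File size: 1 MB = 2**20 bytes = 8 * 1024 kilobits.
file_size_kilobits = 8 * 1024                # 8192 kilobits

bottleneck_kbps = 500                        # throughput limited by R2

transfer_time_s = file_size_kilobits / bottleneck_kbps
print(round(transfer_time_s, 2))             # 16.38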
