Bob sends a 300 megabyte (300 MB) binary file to Alice over a 10 megabit per second (10 Mbps) network.

How long does it take to complete the transfer? (Disregard protocol overheads and assume 1 MB = 1,000,000 bytes for simplicity.)

Suppose the network only allows 7-bit data transfer. Bob decides to encode the file in Base64. How long does it take to complete the transfer now?

Solution

Part 1: without encoding

=> the file is 300 * 1,000,000 bytes = 2.4 * 10^9 bits

=> the data transfer rate is 10 Mbps = 10^7 bits per second, so the transfer time = (2.4 * 10^9) / (10^7) = 240 seconds
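A minimal Python sketch of this calculation (the variable names are illustrative, not part of the original problem):

    MB = 1_000_000                         # 1 MB = 1,000,000 bytes, per the problem
    BITS_PER_BYTE = 8

    file_bits = 300 * MB * BITS_PER_BYTE   # 2.4 * 10^9 bits
    rate_bps = 10 * 1_000_000              # 10 Mbps = 10^7 bits per second

    print(file_bits / rate_bps)            # 240.0 seconds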

Part 2: the file has been encoded in Base64

=> each Base64 character carries 6 bits of the original data (1 group = 6 bits)

=> number of 6-bit groups for 300 MB = (300 * 1,000,000 * 8 bits) / 6 = 4 * 10^8 groups

=> the network only allows 7-bit transfer, so each 6-bit group is sent as one 7-bit character: (4 * 10^8) * 7 = 2.8 * 10^9 bits = 350 MB of data is virtually transferred

=> the data transfer rate is 10 Mbps and the virtual data is 350 MB, so the transfer time = 350 MB / 10 Mbps = ((350 * 10^6) * 8) / (10^7) = 280 seconds

So the time taken to complete the Base64 transfer is 280 seconds, i.e. 40 seconds longer than the unencoded transfer.
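The same arithmetic can be checked in Python; this sketch assumes, as the problem does, that each Base64 character is sent as 7 bits (names are again illustrative):

    import base64

    MB = 1_000_000
    rate_bps = 10 * 1_000_000              # 10 Mbps

    n_groups = (300 * MB * 8) // 6         # 6-bit groups: 4 * 10^8
    wire_bits = n_groups * 7               # one 7-bit character per group
    print(wire_bits / rate_bps)            # 280.0 seconds

    # Sanity check of the 6-bit grouping: 3 input bytes -> 4 Base64 characters
    print(len(base64.b64encode(b"abc")))   # 4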
