1. A server with an upload rate of 5,000 bits/sec must distribute a 10,000-bit file to 20 clients. Each client has a download rate of 4,000 bits/sec. What is the minimum time required to distribute the file to all the clients under the client-server model?
Solution
Under the client-server model:
Clients download copies of the file while the original remains on the server, and the server is the only uploader. Its 5,000 bits/sec upload link is shared among all 20 clients, so to serve N clients it must upload N copies of the file, i.e. N × F bits in total. Because uploading and downloading overlap (each client receives bits as the server sends them), the minimum distribution time is the larger of the server's total upload time and the slowest client's download time, not their sum:

minimum time = max(N × F / u_s, F / d_min)

where N is the number of clients, F is the file size, u_s is the server's upload rate, and d_min is the slowest client's download rate.
file size F = 10,000 bits
number of clients N = 20
server upload rate u_s = 5,000 bits/sec
client download rate d_min = 4,000 bits/sec

server upload time = N × F / u_s = (20 × 10,000) / 5,000 = 40 sec
client download time = F / d_min = 10,000 / 4,000 = 2.5 sec

minimum time = max(40 sec, 2.5 sec) = 40 sec
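As a sanity check, the formula can be evaluated directly. The following is a minimal Python sketch; the function name client_server_time and its parameter names are illustrative choices, not part of the original problem.

def client_server_time(n_clients, file_bits, server_up_bps, client_down_bps):
    # The server must push one copy of the file to every client over its
    # single shared upload link, so it uploads n_clients * file_bits in total.
    server_upload_time = n_clients * file_bits / server_up_bps
    # Each client only has to pull one copy; the slowest client bounds this.
    client_download_time = file_bits / client_down_bps
    # Uploading and downloading overlap (bits are forwarded as they arrive),
    # so the distribution time is the larger of the two, not their sum.
    return max(server_upload_time, client_download_time)

print(client_server_time(20, 10_000, 5_000, 4_000))  # prints 40.0

Here the server's 40-second upload time dominates the 2.5-second client download time, so the server's shared upload link is the bottleneck.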
