I posted this question before, but the answer I received was not correct. Could someone please answer it?
Problem: A process A wants to send a dataset of 3 MB to a process B using a message passing mechanism. Estimate the total time required to complete the transfer in each of the following cases, with the performance assumptions listed below:
a. Using connectionless (datagram) communication (for example, UDP);
b. Using connection-oriented communication (TCP);
c. When the two processes are on the same machine.
Latency per packet (remote, incurred on both send and receive): 5 ms
Latency per packet (local, incurred on both send and receive): 2 ms
Connection setup time (TCP only): 5 ms
Data transfer rate: 10 Mbps
Maximum transfer unit (max packet length): 1000 bytes.
Solution
The earlier answer of 10 ms for UDP and 15 ms for TCP is wrong: it counts only the per-packet latency once and ignores that the 3 MB dataset must be split into many packets. Assuming 1 MB = 10^6 bytes and that packets are sent one at a time (each packet incurring the stated latency on send and again on receive):

Number of packets = 3,000,000 bytes / 1000 bytes = 3000 packets.
Transmission time per packet = (1000 × 8) bits / 10 Mbps = 0.8 ms.

a. UDP: per-packet cost = 5 ms (send latency) + 0.8 ms (transmission) + 5 ms (receive latency) = 10.8 ms, so total = 3000 × 10.8 ms = 32,400 ms = 32.4 s.
b. TCP: add the 5 ms connection setup once: 5 ms + 32,400 ms = 32,405 ms ≈ 32.4 s.
c. Same machine: per-packet cost = 2 ms + 0.8 ms + 2 ms = 4.8 ms, so total = 3000 × 4.8 ms = 14,400 ms = 14.4 s. Note that the local case still depends on the packet count and data size; it is faster only because the per-packet latency is lower.
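The arithmetic above can be checked with a short script. This is a sketch under the same assumptions (1 MB = 10^6 bytes, sequential stop-and-wait sending, latency charged once on send and once on receive); all constants come from the problem statement.

```python
# Estimate transfer times from the problem's performance assumptions.
DATA_BYTES = 3_000_000        # 3 MB, taking 1 MB = 10^6 bytes (assumption)
MTU_BYTES = 1000              # max packet length
RATE_BPS = 10_000_000         # 10 Mbps data transfer rate
LAT_REMOTE_MS = 5             # per-packet latency, charged on send and receive
LAT_LOCAL_MS = 2              # same, when both processes are local
TCP_SETUP_MS = 5              # one-time connection setup (TCP only)

packets = DATA_BYTES // MTU_BYTES                  # 3000 packets
tx_ms = MTU_BYTES * 8 / RATE_BPS * 1000            # 0.8 ms to transmit one packet

udp_ms = packets * (2 * LAT_REMOTE_MS + tx_ms)     # datagram case
tcp_ms = TCP_SETUP_MS + udp_ms                     # add setup cost once
local_ms = packets * (2 * LAT_LOCAL_MS + tx_ms)    # same-machine case

print(f"UDP:   {udp_ms:.0f} ms")                   # 32400 ms
print(f"TCP:   {tcp_ms:.0f} ms")                   # 32405 ms
print(f"Local: {local_ms:.0f} ms")                 # 14400 ms
```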
