Problem #: Description
• Suppose a TCP connection with window size 1 loses every other packet. Packets that do arrive have RTT = 1 second.
• What happens? What happens to TimeOut? Work this out for two cases (a simulation sketch follows below):
a) After a packet is eventually received, we pick up where we left off, resuming with EstimatedRTT initialized to its pre-timeout value and TimeOut set to double that.
b) After a packet is eventually received, we resume with TimeOut initialized to the last exponentially backed-off value used for the timeout interval.
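The following is a minimal sketch (in Python, not from the slides) that traces TimeOut under both resumption policies. The function name simulate, the initial TimeOut of 2 seconds, and the four-packet count are illustrative assumptions; the loss pattern (every first transmission lost, every retransmission delivered) and the 1-second RTT come from the problem statement.

def simulate(policy, initial_timeout=2.0, packets=4):
    """Trace TimeOut while every packet is lost once, then delivered."""
    timeout = initial_timeout
    for pkt in range(1, packets + 1):
        backed_off = 2 * timeout       # exponential backoff after the loss
        if policy == "a":
            next_timeout = timeout     # resume with the pre-timeout value
        else:                          # policy "b"
            next_timeout = backed_off  # keep the last backed-off value
        print(f"packet {pkt}: waited {timeout:g} s, retransmitted with "
              f"TimeOut = {backed_off:g} s, next TimeOut = {next_timeout:g} s")
        timeout = next_timeout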

Problem #: Solution, Part (a)
• If every other packet is lost, we transmit each packet twice.
a) Let E ≥ 1 be the value of EstimatedRTT and T = 2 × E be the value of TimeOut.
• We lose the first transmission of a packet and back off TimeOut to 2 × T.
• Then, when the retransmitted packet arrives, we resume with EstimatedRTT = E and TimeOut = T.
• In other words, TimeOut doesn't change.
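Under the assumptions of the sketch above (initial TimeOut of 2 s), policy (a) prints the same line for every packet, since the backoff never outlives the retransmission:

simulate("a")
# packet 1: waited 2 s, retransmitted with TimeOut = 4 s, next TimeOut = 2 s
# packet 2: waited 2 s, retransmitted with TimeOut = 4 s, next TimeOut = 2 s
# ... identical for every subsequent packet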

Problem #: Solution, Part (b)
b) Let T be the value of TimeOut.
• When we transmit the packet the first time, it is lost and we wait time T.
• At this point we back off and retransmit using TimeOut = 2 × T.
• The retransmission succeeds with an RTT of 1 second, but we use the backed-off value 2 × T for the next TimeOut.
• In other words, TimeOut doubles with each packet received, even though the measured RTT stays at 1 second (see the trace below). This is not good.
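With the same sketch and the same assumed initial TimeOut of 2 s, policy (b) makes the doubling visible:

simulate("b")
# packet 1: waited 2 s, retransmitted with TimeOut = 4 s, next TimeOut = 4 s
# packet 2: waited 4 s, retransmitted with TimeOut = 8 s, next TimeOut = 8 s
# packet 3: waited 8 s, retransmitted with TimeOut = 16 s, next TimeOut = 16 s
# packet 4: waited 16 s, retransmitted with TimeOut = 32 s, next TimeOut = 32 s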