I understand that the formula used is the time it would take with no collisions divided by the actual time, with the actual time being equal to the "ideal time" plus the time wasted on the initial collision. However, I don't understand how to calculate the time lost. Isn't it the time needed for B's message to reach A, and therefore the same as the ideal time, making the actual time just twice the ideal time? I know that can't be right, but it's what I keep coming up with.
By Tom Murtagh (Admin) on Monday, October 26, 1998 - 03:52 pm:
In measuring the "ideal time", I asked you to consider the world only from A's point of view. That is, the ideal time is how long it takes A to send 1000 bits, not including the time it takes those bits to reach B. Similarly, the time lost is measured from A's point of view: it is the time from the beginning of A's first, colliding attempt to send the message to the moment when A begins its second, successful attempt.
As a result, the ideal time has nothing to do with the time it takes a message to travel the length of the network, while the wasted time is very dependent on the time it takes a signal to get from A to B.
Tom
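
To make the arithmetic concrete, here is a minimal sketch (in Python) of the efficiency calculation described above. The link rate, the propagation delay, and the assumption that A begins its successful retry roughly one round trip after its first attempt (with no backoff delay) are illustrative assumptions, not values from the original problem.

BITS = 1000                 # size of A's message, from the problem statement
RATE = 10_000_000           # assumed link rate: 10 Mbps
PROP = 25e-6                # assumed one-way propagation delay from A to B, in seconds

# "Ideal time": how long A spends putting 1000 bits on the wire,
# measured at A, ignoring propagation to B entirely.
ideal_time = BITS / RATE    # 100 microseconds under these assumptions

# "Wasted time": from the start of A's first (colliding) attempt until A
# begins its second, successful attempt.  Here we assume A only hears B's
# colliding signal after roughly one round trip and then retries at once,
# so the wasted time scales with the A-to-B propagation delay.
wasted_time = 2 * PROP      # 50 microseconds under these assumptions

actual_time = ideal_time + wasted_time
efficiency = ideal_time / actual_time

print(f"ideal  = {ideal_time * 1e6:.0f} us")
print(f"wasted = {wasted_time * 1e6:.0f} us")
print(f"actual = {actual_time * 1e6:.0f} us, efficiency = {efficiency:.2%}")

With these particular numbers the sketch prints an efficiency of about 67%; the point is that changing the propagation delay changes the wasted time (and so the efficiency) while leaving the ideal time untouched, which is exactly the distinction drawn above.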