I was backing up some data a while back to an external USB drive when I noticed a behavior that is all too common.
As I copied the contents of a rather bloated folder, the "estimated time remaining" figure was jumping all over the place.
It started out saying something like 48 minutes remaining, and then jumped down to 30, 18 and then 11 minutes. Then I noticed the estimated time remaining figure stayed at 11 minutes for more than an hour.
The progress bar was moving slowly to the right, and just as I thought the file transfer was just about over, the progress bar jumped back about a third of the way across.
As I watched, the number bounced around some more, dropping from the incredibly high 25 million mark to more manageable figures such as 45 minutes, then 28, and once again 11 minutes, where it stayed for another 45 minutes.
I finally got fed up with watching and shut off the monitor, so I don't know when it actually finished. But when I came back to it in the morning, the file copy was complete, and it didn't take 48 years.
But it got me thinking: what is this "estimated time remaining" business all about, and can that figure ever be trusted?
I've certainly seen that behavior before with other file copies, internet downloads, CD burning and similar procedures, and I've always taken the number with a grain of salt. But that 25 million figure caught my attention. What is going on here?
Well, to understand why that number fluctuates so wildly, you have to keep in mind that what a computer is doing at any given moment is not necessarily what it will be doing the next.
So the "estimated" part of "estimated time remaining" is just that: an estimate, based on conditions measured in that particular time slice. And from second to second, those conditions change. Rapidly.
What that means is that when the computer starts a file transfer, it grabs a chunk of data, looks at how much bandwidth is available at that instant, and bases its estimate of the total time on that.
But that estimate doesn't account for things such as the overhead chatter between the device and the computer, how much of the buffer is still available, and plenty of other activity happening alongside the data transfer.
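The calculation described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function name and numbers are mine, not how any particular operating system actually does it): project the remaining time from a single throughput sample, and watch how one fast or slow second swings the answer wildly.

```python
# A naive "estimated time remaining" calculation: sample how many bytes
# moved in a short window, assume the rest of the transfer will move at
# that same instantaneous rate, and project the time remaining.

def naive_eta(bytes_remaining, bytes_moved_in_window, window_seconds):
    """Project seconds remaining from one throughput sample."""
    if bytes_moved_in_window <= 0:
        return float("inf")  # a stalled window produces an absurd estimate
    rate = bytes_moved_in_window / window_seconds  # bytes per second
    return bytes_remaining / rate

# A fast second (say, a burst served from cache) makes the job look
# nearly done...
print(naive_eta(900_000_000, 100_000_000, 1.0))  # -> 9.0 seconds

# ...while one slow second (tiny files, bus overhead) balloons the
# same transfer into a multi-day forecast.
print(naive_eta(900_000_000, 1_000, 1.0))        # -> 900000.0 seconds
```

Since each new sample replaces the last, the displayed figure whipsaws along with whatever the disk happened to be doing that second, which is exactly the bouncing number described above.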
Let me rephrase it like this: as soon as the file transfer starts, all the figures the computer based its estimate on change, because of the way computers do things.
What that boils down to is that the estimated transfer time is rarely accurate, because the conditions it's based on are constantly changing and can't be relied on, especially when it predicts the job will take 48 years.
One question that is bound to come up is: "Is this strictly a Windows issue?" and to that I would have to say no. I've observed the same behavior on my Macs as well as my Windows boxes.
So that, in a nutshell, is an explanation for the never-accurate estimated time remaining issue that I'm sure we've all experienced at one time or another.
I would personally like to see them do away with the estimated time figure and just stick with a graphical progress meter. That would make a whole lot more sense than the arbitrary numbers we get now, and it's easier to read at a glance.
Sean McCarthy fixes computers. He can be reached at (888) 752-9049 or help@ComputeThisOnline.com (no hyphens).