# Timing Precision: Centiseconds or Milliseconds?



## Cride5 (Feb 25, 2010)

I have a dilemma with a JavaScript timer I'm developing. The timing functions available in JS can measure time to the nearest millisecond (a 1000th of a second). However, most folks generally seem to be happy with times precise to a 100th of a second. This is the format used in most timers currently available, and it's also the format used by the WCA, presumably because that's the highest precision available from StackMat timers.

My question is: if it's possible to measure time in milliseconds, would that be preferred, or are hundredths adequate?

Another option is to measure all times (and calculate averages) in milliseconds, but display them in hundredths by rounding. The problem with this is that it can create a discrepancy between the displayed singles and the calculated average.

So in short, what would you guys prefer: milliseconds; centiseconds; or measured in milli, displayed in centi?
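For what it's worth, here's a minimal sketch of the measure-in-milli, display-in-centi idea (function names and sample values are just illustrative, not from any particular timer):

```javascript
// Store raw millisecond times; round only at display time.
function formatCentis(ms) {
  const centis = Math.round(ms / 10);      // nearest centisecond
  const secs = Math.floor(centis / 100);
  const frac = String(centis % 100).padStart(2, "0");
  return secs + "." + frac;
}

const times = [9994, 9994, 10004];         // raw times in ms
const avgMs = times.reduce((a, b) => a + b, 0) / times.length;

console.log(times.map(formatCentis));      // [ '9.99', '9.99', '10.00' ]
console.log(formatCentis(avgMs));          // '10.00' -- yet averaging the
                                           // displayed centis would give 9.99
```

This is exactly where the discrepancy comes from: the raw average (9997.33 ms) rounds up to 10.00, while the average of the rounded singles rounds down to 9.99.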


----------



## Kirjava (Feb 25, 2010)

measured in milli, choice of which is displayed.


----------



## Mike Hughey (Feb 25, 2010)

Kirjava said:


> measured in milli, choice of which is displayed.



Oops, I voted wrong. This is the right thing to do with your coding. It gives you maximum flexibility, and is not much harder than any other choice.


----------



## qqwref (Feb 25, 2010)

Yep, measure in milliseconds, and display in centiseconds (or have an option letting the user display in centi or milli). qqTimer does this for a reason

Sometimes you don't want the precision, but sometimes it's very useful to have... ever get an average that rounded to something like 20.00? [I had a 1x.997 average of 100 once.]


----------



## tim (Feb 25, 2010)

Cride5 said:


> measured in milli, displayed in centi?



That's what we do at Cubemania. It could lead to some problems if you ever need to compare two times, though ("10.22" > "10.22" when the hidden milliseconds differ).

Btw, I found this while looking into the accuracy of JavaScript timers. Pretty interesting.


----------



## panyan (Feb 25, 2010)

i think centi is enough, milli is just over the top


----------



## ove (Feb 25, 2010)

I voted "other", because in my wildest dreams I would be happy if any computer-based timer could even *guarantee* tenth-of-a-second accuracy. 

Basically, this is down to the resolution of the OS timers used for multitasking (usually 20 milliseconds). Also, the keyboard is not a "real-time device", since a 100 millisecond delay on a keystroke wouldn't even be noticeable to a human (who types 10 letters a second?!).

With any software-based timer, accuracy depends on the machine's load and on whatever else is happening on it. If the hard disk or antivirus software kicks in during your solve, or on a slow or heavily loaded machine, you may well have 200 or 300 milliseconds of error in the final time (I've already seen that on my netbook with a heavy Java timer).
All you can expect is that, *most of the time on a reasonable machine*, the first digit (tenths) is OK. For the second digit, well... I'd bet on an average error of around 4 or 5 in that digit.


----------



## blade740 (Feb 25, 2010)

Billiseconds.


----------



## That70sShowDude (Feb 25, 2010)

It would be kind of cool if we'd always used milliseconds, but we couldn't switch to it now because it would mess up the history. If you're just practicing, though, I vote for measure in milli, display in centi.


----------



## Cride5 (Feb 26, 2010)

Looks like the general consensus is measure in milli, display in centi ... which is cool 'cos it's already like that, meaning I have no extra work to do 

However, I may (eventually) code in the ability to display all times in milliseconds as an option if folks think that would be useful. I'll also explain in the FAQ why the averages/totals don't always appear to add up.

@ove, I was going to mention that in the original message but figured it had gotten wordy enough already. If your program is swapped out when you slam your hand on that spacebar, it has to wait for the OS to swap it back onto the processor before it can take the time measurement. That could take any amount of time and isn't predictable. Avoiding it would require a program with exclusive access to the processor, which is basically impossible. The best remedy for that is to use a StackMat. I'm hoping to add StackMat interfacing as an option in future...

EDIT: The keyboard delay probably isn't a huge issue though, because it's probably relatively constant. I.e. the timer will start with a ~100ms delay, then also stop with a similar ~100ms delay, so the two largely cancel out.

@tim, interesting article. I guess the accuracy of time measurements depends not only on the kernel's scheduling of processes, but also on any schedulers running in the browser/JavaScript engine. IE performed predictably badly, yet another reason to avoid it like the plague! No surprises that Mac OS came out on top though


----------



## Stefan (Feb 26, 2010)

Just to illustrate the discrepancy when measuring millis and showing centis... something like this could happen:

(9.99)
9.99
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
(20.00)
=====
9.99 average

(5.00)
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.01
(10.01)
=====
10.01 average
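Stefan's first list can be reproduced with a quick sketch (the trim rule and helper names here are assumptions, not any particular timer's code); working in integer milliseconds shows how an average can display below nine of the ten counting times:

```javascript
// Hypothetical ao12: drop best and worst, average the rest in raw ms.
function averageOf(msTimes) {
  const counting = [...msTimes].sort((a, b) => a - b).slice(1, -1);
  return counting.reduce((a, b) => a + b, 0) / counting.length;
}

const showCentis = ms => (Math.round(ms / 10) / 100).toFixed(2);

// Two times just under 9.99, nine that round up to 10.00, and a 20.00:
const times = [9985, 9985, ...Array(9).fill(9995), 20000];
console.log(times.map(showCentis));        // two 9.99s, nine 10.00s, one 20.00
console.log(showCentis(averageOf(times))); // '9.99' -- below nine of the
                                           // ten counting times
```

The raw counting average is 9994 ms, which rounds down to 9.99 even though nine of the ten counting singles display as 10.00.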


----------



## Stefan (Feb 26, 2010)

As I see it, the only difference between *centiseconds* and *measure in milliseconds, display in centiseconds* is the average; for single times there's no difference. And the average can differ by at most 0.01 seconds. In cases where it does differ by those 0.01, measureMillis+showCentis would be slightly more accurate but at the cost of looking confusing. That's the reason I opted for centiseconds in my timer (i.e. I measure in millis because that's what the environment gives me, but I immediately round to centis and work with the rounded value from then on).
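A sketch of that round-immediately approach (names are illustrative): each raw reading is converted to whole centiseconds once, and every later calculation uses the rounded values, so the displayed average can always be reproduced from the displayed singles.

```javascript
const toCentis = ms => Math.round(ms / 10);   // round once, up front
const show = c => (c / 100).toFixed(2);

const raw = [9994, 10004, 9996];              // raw ms from the environment
const centis = raw.map(toCentis);             // [999, 1000, 1000]
const avg = Math.round(centis.reduce((a, b) => a + b, 0) / centis.length);

console.log(centis.map(show));                // [ '9.99', '10.00', '10.00' ]
console.log(show(avg));                       // '10.00', consistent with
                                              // the singles shown
```

The trade-off is exactly the one described above: a tiny loss of accuracy in exchange for averages that never look "wrong" next to the singles.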


----------



## qqwref (Feb 26, 2010)

ove said:


> Basically, this is due to the accuracy of the os timers used for multitasking (usually 20 milliseconds). Also, keyboard is not a "real time device", since a delay of 100 milliseconds in a keystroke wouldn't even be noticeable by human (who types 10 letters a second ?!).



I regularly achieve speeds of 10 characters/sec (~120wpm) while typing and there are many people who are even faster than that; as a rhythm game player I often play files that require keypresses at 15+ keys per second. A lag of 100ms is easily noticeable, depending on the application.

I do agree that there are situations in which the timer gets delayed, but most of the time for me it seems to be pretty accurate (or at least the lag is consistent enough that the difference in the computer-measured time and the real time is small).


----------



## ove (Feb 26, 2010)

Cride5 said:


> Avoiding it would require a program with exclusive access to the processor - which is basically impossible. Best remedy for that is to use a stackmat. I'm hoping to add interfacing to a stackmat as an option in future...



I remember having read documentation on MSDN about "high precision timers" (in the DirectX stuff) which would provide 1 millisecond accuracy, for use with special devices like MIDI hardware and joysticks. But I'm not sure it would be worth it anyway (certainly more difficult than just adding StackMat support).


----------



## Meisen (Feb 26, 2010)

Kirjava said:


> measured in milli, choice of which is displayed.



This


----------



## EE-Cuber (Feb 26, 2010)

Measuring in milliseconds is useless. When doing a solve, the time between finishing and smacking the stop button is on the order of tens of milliseconds, so that precision is lost in the "noise."


----------



## Stefan (Feb 26, 2010)

StefanPochmann said:


> In cases where it does differ by those 0.01, measureMillis+showCentis would be slightly more accurate but at the cost of looking confusing.



Not more accurate by 0.01, btw! I think this is the extreme case:

(4.985 show 4.99)
4.985 show 4.99
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
(4.995 show 5.00)

Average the millis:
*4.994* show *4.99*, off by 0.004

Average the centis:
4.999 show *5.00*, off by 0.006

So in the absolute extreme case, averaging the millis rather than the centis is more accurate by only 0.002 seconds. Admittedly it's more for average-of-5:

(4.985 show 4.99)
4.985 show 4.99
4.995 show 5.00
4.995 show 5.00
(4.995 show 5.00)

Average the millis:
*4.991666*... show *4.99*, off by 0.001666...

Average the centis:
4.99666... show *5.00*, off by 0.008333...

So in the absolute extreme case, averaging the millis rather than the centis is more accurate by only 0.00666... seconds.

Again, these are the extremes, centiseconds being worse by 0.002 or 0.00666 seconds. This will rarely happen, usually both versions will show the same average result, and when they do differ, the actual amount by which averaging milliseconds is more accurate will be smaller than 0.002/0.00666. I feel this is not worth the hassle of showing confusing values and having to explain them.
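The average-of-5 arithmetic above can be checked numerically; working in integer milliseconds avoids floating-point noise (the variable names are made up for the sketch):

```javascript
const show = c => (c / 100).toFixed(2);
const times = [4985, 4985, 4995, 4995, 4995];                   // ms
const counting = [...times].sort((a, b) => a - b).slice(1, -1); // drop best/worst

// Average the raw milliseconds, round once at the end:
const avgMs = counting.reduce((a, b) => a + b, 0) / counting.length;
console.log(show(Math.round(avgMs / 10)));                      // '4.99'

// Round each time to centis first, then average the rounded values:
const centis = counting.map(ms => Math.round(ms / 10));         // [499, 500, 500]
const avgCs = Math.round(centis.reduce((a, b) => a + b, 0) / centis.length);
console.log(show(avgCs));                                       // '5.00'
```

The raw average is 4991.666... ms (off by 0.001666...), while the centi-first average is 4.99666... (off by 0.008333...), matching the figures above.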


----------



## Stefan (Feb 26, 2010)

EE-Cuber said:


> Measuring in milliseconds is useless. When doing a solve, the time between finishing and smacking the stop button is on the order of tens of milliseconds, so that precision is lost in the "noise."



And that's not the case every time and for everybody?


----------



## dmitry_n (Feb 27, 2010)

*Stopwatch*

To track time with millisecond accuracy you have to give the thread REAL_TIME priority. I have no idea how you'd implement that with JS, though.

http://www.stopwatch-timer.com


----------



## Cride5 (Mar 3, 2010)

I've considered the opinions posted and decided to go with centiseconds across the board. The two main reasons were:
(1) the size of the errors introduced by scheduling, unpredictable lag and other sources means millisecond precision is unwarranted; and
(2) inconsistency in averages and totals is not really acceptable, even if explained. It just looks as if something is wrong with the code, making it less trustworthy.

Thanks for your views.


----------

