Subject: [Boost-users] [date_time] Timer anomalies on Windows
From: Rush Manbert (rush_at_[hidden])
Date: 2009-02-06 20:57:14
I am trying to implement an accurate cross-platform timer with
microsecond resolution.
I would like to use the Boost date_time library to replace our current
platform-specific code. One reason to do this is that we currently use
the Core Audio framework timing services on the Mac and we would like
to get away from that. In general, we want to statically link our
executables.
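For context, this is the kind of thin wrapper we would like to end up
with (just a sketch of an assumed interface, not our actual class):

#include <boost/date_time/posix_time/posix_time.hpp>
#include <boost/cstdint.hpp>

// Sketch only: a minimal elapsed-time stopwatch built on microsec_clock.
class MicrosecStopwatch
{
public:
        MicrosecStopwatch ()
                : m_start (boost::posix_time::microsec_clock::local_time())
        {}

        void reset ()
        {       m_start = boost::posix_time::microsec_clock::local_time();
        }

        // Elapsed time since construction or the last reset, in microseconds
        boost::int64_t elapsedMicroseconds () const
        {
                boost::posix_time::time_duration d =
                        boost::posix_time::microsec_clock::local_time() - m_start;
                return d.total_microseconds();
        }

private:
        boost::posix_time::ptime m_start;
};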
The current implementation uses the Core Audio framework on the Mac
and the QueryPerformanceCounter() API on Windows. Our experience with
the Windows code is that it doesn't perform well when measuring any
interval less than about 500 msec. I have written a number of tests
that get the number of performance counter ticks over various
intervals. One test uses the GetTickCount() millisecond timer to
measure the intervals. Another test just uses a for loop that
increments a variable, because I was suspicious that the millisecond
timer was jittery.
I have found that the QueryPerformanceCounter() results are
unreliable. One test ran durations from 100 msec to 2000 msec and
calculated the error between the number of ticks reported and the
counter frequency returned by the QueryPerformanceFrequency() API call.
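In case it helps, the error I compute is roughly this (a simplified
sketch; the names are placeholders, not my exact test code):

#include <windows.h>

// Relative error between the tick rate implied by a measurement and the
// frequency advertised by QueryPerformanceFrequency().
double relativeFrequencyError (LONGLONG measuredTicks, DWORD elapsedMillisecs)
{
        LARGE_INTEGER freq;
        QueryPerformanceFrequency (&freq);

        // Ticks per second actually observed over the measured interval
        double measuredHz =
                (double)measuredTicks * 1000.0 / (double)elapsedMillisecs;

        return (measuredHz - (double)freq.QuadPart) / (double)freq.QuadPart;
}

For the microsec_clock version of the test, the expected frequency is
simply 1,000,000 ticks per second instead of the value reported by
QueryPerformanceFrequency().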
After getting that result, I wrote the same kind of test counting
boost::posix_time::microsec_clock::local_time() ticks against the
millisecond clock. I was initially surprised to see that the errors
between the counts reported by the API calls and the expected 1 MHz
frequency are nearly identical to the errors I calculated for
QueryPerformanceCounter().
This led me to believe that the date_time microsec_clock  
implementation must be using the QueryPerformanceCounter interface. I  
have searched the sources, however, and I don't find anything that is  
Windows-specific.
One other odd thing shows up in my tests. If I run a test that times
its duration using the GetTickCount() millisecond timer, and calls
either QueryPerformanceCounter() or microsec_clock to get the number
of ticks seen during the interval, I find that the values returned by
QueryPerformanceCounter() are different for each test duration, as one
would expect. Conversely, the tick counts calculated for microsec_clock
are identical (with the value 15625) for durations of 1 msec through
16 msec. Then at a 17 msec test duration, the tick count jumps to
31250. The same count gets reported through durations of 31 msec. At
32 msec the count jumps to 46875 and remains there through a test
duration of 47 msec...
This looks like the Boost microsec_clock tick count is only updating
in 15625-microsecond steps, i.e. at roughly 60 Hz (strictly, about
64 Hz).
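A quick way to see the same thing without any timing loop is to spin
on microsec_clock until its value changes and print the step size
(sketch only, not one of the tests described above):

#include <boost/date_time/posix_time/posix_time.hpp>
#include <iostream>

// Spin until microsec_clock reports a new value, then print the step size.
void printMicrosecClockStep ()
{
        boost::posix_time::ptime t0 =
                boost::posix_time::microsec_clock::local_time();
        boost::posix_time::ptime t1 = t0;

        // Busy-wait until the reported time actually changes
        while (t1 == t0)
        {       t1 = boost::posix_time::microsec_clock::local_time();
        }

        // If the clock only advances on the coarse system tick, this should
        // print a value in the neighborhood of 15625 usec.
        std::cout << "microsec_clock step: "
                  << (t1 - t0).total_microseconds() << " usec" << std::endl;
}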
Obviously, I am very bewildered by these results. I would really  
appreciate it if anyone can enlighten me on the following:
1) On Windows, what API is actually being used to read the clock ticks?
2) Where is the platform-specific code that date_time uses to actually  
read the counters?
3) Is there really a 60 Hz update to the microsec_clock count value?
Here is an example of one of my tests. This one is for the  
QueryPerformanceCounter() API. The Boost microsec_clock test is nearly  
identical:
#include <windows.h>
#include <stdint.h>

// Declared elsewhere in my test code; waits for the next GetTickCount()
// tick so the measurement starts on a clock edge.
void windowsSyncToClockEdgeOneMillisecond ();

uint64_t WindowsPerformanceTimerResolutionSubTest (unsigned int testDurationInMillisecs)
{
        LARGE_INTEGER startHighRes;
        LARGE_INTEGER endHighRes;
        DWORD start;

        if (0 == testDurationInMillisecs)
        {       testDurationInMillisecs = 1;
        }

        // Sync to a clock edge using GetTickCount()
        windowsSyncToClockEdgeOneMillisecond ();

        // Get our start time for the performance clock, and the millisecond clock
        QueryPerformanceCounter (&startHighRes);
        start = GetTickCount();

        // Wait for the required duration
        while ((GetTickCount() - start) < testDurationInMillisecs)
        {
                // Nothing to do
        }

        // Get the performance counter value at the end of the test
        QueryPerformanceCounter (&endHighRes);

        // The caller divides this tick difference by the elapsed time in
        // seconds to get the measured ticks per second.
        uint64_t diffTicks = endHighRes.QuadPart - startHighRes.QuadPart;

        return diffTicks;
}
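And for completeness, here is roughly what the microsec_clock variant
looks like (a sketch with the same structure, not a paste of the exact
code):

#include <windows.h>
#include <stdint.h>
#include <boost/date_time/posix_time/posix_time.hpp>

void windowsSyncToClockEdgeOneMillisecond ();  // same helper as above

uint64_t BoostMicrosecClockResolutionSubTest (unsigned int testDurationInMillisecs)
{
        boost::posix_time::ptime startHighRes;
        boost::posix_time::ptime endHighRes;
        DWORD start;

        if (0 == testDurationInMillisecs)
        {       testDurationInMillisecs = 1;
        }

        // Sync to a clock edge using GetTickCount()
        windowsSyncToClockEdgeOneMillisecond ();

        // Get our start time from microsec_clock, and the millisecond clock
        startHighRes = boost::posix_time::microsec_clock::local_time();
        start = GetTickCount();

        // Wait for the required duration
        while ((GetTickCount() - start) < testDurationInMillisecs)
        {
                // Nothing to do
        }

        // Get the microsec_clock time at the end of the test; the elapsed
        // microseconds are the "tick count" compared against the expected
        // 1 MHz rate.
        endHighRes = boost::posix_time::microsec_clock::local_time();

        return (uint64_t)(endHighRes - startHighRes).total_microseconds();
}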
Thanks,
Rush