While running one of my apps on a Windows 10 VM I noticed that the timing was very different from that seen on the host PC. After lots of digging I finally found that the granularity of the system timer on the VM was around 16ms, versus around 0.5ms on the host PC. My app uses some 1-5 millisecond sleeps, but when the granularity is 16ms, a 1ms sleep actually takes 16ms! (The actual granularity is 15.6ms, due to the default 64Hz timer frequency.)
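The effect is easy to see in a few lines of C#. Here's a minimal sketch; the ~15.6ms figure in the comment assumes nothing else on the machine has already raised the timer frequency:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class GranularityDemo
{
    static void Main()
    {
        // Request a 1ms sleep and measure what we actually get. On a machine
        // still running the default 64Hz timer this reports roughly 15.6ms.
        var sw = Stopwatch.StartNew();
        Thread.Sleep(1);
        Console.WriteLine($"Thread.Sleep(1) took {sw.Elapsed.TotalMilliseconds:F1}ms");
    }
}
```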
Some cool resources on the web related to this:
- Windows Timer Resolution: Megawatts Wasted (Bruce Dawson)
- Timekeeping in VMware Virtual Machines (VMware)
I solved my problem by setting the granularity to the finest value supported by the PC; the setting remains in place until the application exits. It seems that my VM simply has nothing running that would otherwise cause the timer to run faster than the default 64Hz, whereas my development PC must have all sorts of things running the timer flat out; that's probably one reason its battery drains more quickly than expected!
To query and change the granularity I used the NtQueryTimerResolution and NtSetTimerResolution functions from ntdll.dll via C# P/Invoke.
I then wrote a little wrapper class to let me play with the timings using the .NET TimeSpan. Note: TimeSpan is a frustrating struct to use here because it really doesn't want to deal in fractions of a millisecond without a fair bit of persuasion; specifically, FromMilliseconds rounds the requested value to the nearest whole millisecond.
```csharp
using System;
using System.Runtime.InteropServices;

/// <summary>
/// Utility to query and set the system timer resolution.
/// </summary>
class TimerResolution
{
    [DllImport("ntdll.dll", SetLastError = true)]
    private static extern int NtQueryTimerResolution(
        out int MinimumResolution, out int MaximumResolution, out int CurrentResolution);

    [DllImport("ntdll.dll", SetLastError = true)]
    private static extern int NtSetTimerResolution(
        int DesiredResolution, bool SetResolution, out int CurrentResolution);

    // The native API deals in 100ns units, which is exactly the size of a
    // TimeSpan tick, so the conversion is a straight FromTicks call.
    private static TimeSpan TimeSpanFrom100nsUnits(int valueIn100nsUnits)
    {
        return TimeSpan.FromTicks(valueIn100nsUnits);
    }

    private static (TimeSpan min, TimeSpan max, TimeSpan cur) Query()
    {
        NtQueryTimerResolution(out var min, out var max, out var cur);
        return (min: TimeSpanFrom100nsUnits(min),
                max: TimeSpanFrom100nsUnits(max),
                cur: TimeSpanFrom100nsUnits(cur));
    }

    /// <summary>Gets the minimum (coarsest) timer resolution.</summary>
    public static TimeSpan MinResolution => Query().min;

    /// <summary>Gets the maximum (finest) timer resolution.</summary>
    public static TimeSpan MaxResolution => Query().max;

    /// <summary>Gets/sets the current timer resolution.</summary>
    public static TimeSpan CurrentResolution
    {
        get { return Query().cur; }
        set
        {
            // TimeSpan ticks are already 100ns units, so no scaling is needed.
            NtSetTimerResolution((int)value.Ticks, SetResolution: true, out _);
        }
    }
}
```
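As an aside, here's the TimeSpan pitfall in action, plus the wrapper class in use. This is a minimal sketch assuming the .NET Framework rounding behaviour described above (newer .NET runtimes may preserve the fraction):

```csharp
using System;

class TimeSpanPitfallDemo
{
    static void Main()
    {
        // TimeSpan.FromMilliseconds rounds to the nearest whole millisecond
        // on .NET Framework, so a requested 0.5ms silently becomes 1ms:
        Console.WriteLine(TimeSpan.FromMilliseconds(0.5)); // 00:00:00.0010000

        // Going via ticks avoids the rounding: 1 tick = 100ns, so 0.5ms = 5000 ticks.
        Console.WriteLine(TimeSpan.FromTicks(5000));       // 00:00:00.0005000

        // The wrapper class in use: push the timer to its finest supported resolution.
        TimerResolution.CurrentResolution = TimerResolution.MaxResolution;
        Console.WriteLine(TimerResolution.CurrentResolution);
    }
}
```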
A little test app on my VM produced these results…
```
Minimum resolution: 15.6ms
Maximum resolution: 0.5ms
Current resolution: 15.6ms

Attempt to change to 2ms
Current resolution:   00:00:00.0020000
DateTime granularity: 00:00:00.0020970
Sleep 0:              00:00:00.0000009
Sleep 1:              00:00:00.0020053

Attempt to change to 5ms
Current resolution:   00:00:00.0050000
DateTime granularity: 00:00:00.0050328
Sleep 0:              00:00:00.0000012
Sleep 1:              00:00:00.0049719

Attempt to change to 0.5ms
Current resolution:   00:00:00.0005000
DateTime granularity: 00:00:00.0005471
Sleep 0:              00:00:00.0000008
Sleep 1:              00:00:00.0011774

Attempt to change to 15.6ms
Current resolution:   00:00:00.0156250
DateTime granularity: 00:00:00.0156280
Sleep 0:              00:00:00.0000011
Sleep 1:              00:00:00.0155707
```
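For reference, here's a minimal sketch of what such a test app might look like. The original test code isn't shown above, so this is my guess at how the measurements were taken; in particular, the "DateTime granularity" figure is assumed to come from spinning until DateTime.UtcNow changes:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    // Spin until DateTime.UtcNow changes to estimate its update granularity.
    static TimeSpan MeasureDateTimeGranularity()
    {
        var start = DateTime.UtcNow;
        var next = start;
        while (next == start)
            next = DateTime.UtcNow;
        return next - start;
    }

    static TimeSpan MeasureSleep(int milliseconds)
    {
        var sw = Stopwatch.StartNew();
        Thread.Sleep(milliseconds);
        return sw.Elapsed;
    }

    static void Test(TimeSpan requested)
    {
        Console.WriteLine($"Attempt to change to {requested.TotalMilliseconds}ms");
        TimerResolution.CurrentResolution = requested;
        Console.WriteLine($"Current resolution:   {TimerResolution.CurrentResolution}");
        Console.WriteLine($"DateTime granularity: {MeasureDateTimeGranularity()}");
        Console.WriteLine($"Sleep 0:              {MeasureSleep(0)}");
        Console.WriteLine($"Sleep 1:              {MeasureSleep(1)}");
    }

    static void Main()
    {
        Console.WriteLine($"Minimum resolution: {TimerResolution.MinResolution.TotalMilliseconds}ms");
        Console.WriteLine($"Maximum resolution: {TimerResolution.MaxResolution.TotalMilliseconds}ms");
        Console.WriteLine($"Current resolution: {TimerResolution.CurrentResolution.TotalMilliseconds}ms");

        Test(TimeSpan.FromTicks(20000));   // 2ms
        Test(TimeSpan.FromTicks(50000));   // 5ms
        Test(TimeSpan.FromTicks(5000));    // 0.5ms (FromMilliseconds would round this!)
        Test(TimeSpan.FromTicks(156250));  // 15.6ms - back to the default
    }
}
```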