There is a big difference between measuring elapsed time intervals and doing something at specific intervals. It is quite easy to accurately measure a duration anywhere from a few centiseconds up to years, because the computer clock runs at an accurate rate. But the smaller the interval, the larger the relative inaccuracy. By the time you get down to milliseconds, the possible delay between one event and the next may easily exceed the interval itself.
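To illustrate the scheduling side of this, here is a minimal Python sketch (an assumption on my part, not GH code: GH itself would use a timer component or a script component) that asks the OS to wait 1 millisecond and then measures how long the wait actually took. On a typical desktop OS the actual delay routinely overshoots the request:

```python
import time

# Ask for a 1 ms pause, then measure the pause we actually got.
# time.perf_counter() is a monotonic, high-resolution clock,
# so the subtraction gives a reliable elapsed duration.
target = 0.001  # requested interval: 1 millisecond
start = time.perf_counter()
time.sleep(target)
actual = time.perf_counter() - start
print("asked for %.3f ms, got %.3f ms" % (target * 1000, actual * 1000))
```

The overshoot varies from run to run, which is exactly why "do X every millisecond" cannot be honoured precisely, even though the elapsed time can still be measured precisely.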
Again, these are not measuring the same thing. GH is measuring how often solutions occur, which is (roughly) 1 second + solution time, whereas the timer is measuring the difference between the current computer clock value and a previously recorded clock value.
The question is: what precisely do you need? Do you need new solutions to start at exactly 1-second intervals (if so, you'll never be able to do that), or do you need an accurate way to measure an interval (that is possible, with the right code)?
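If it's the second case, the measurement side is straightforward. A minimal Python sketch (the GH equivalent would be a GhPython or C# script component; the "solution" here is just a stand-in computation I made up for illustration):

```python
import time

# Record the clock before and after the work, then subtract.
# perf_counter() is monotonic: it never jumps backwards, unlike
# wall-clock time, so it is the right tool for durations.
start = time.perf_counter()
total = sum(i * i for i in range(100_000))  # stand-in for a solution
elapsed = time.perf_counter() - start
print("solution took %.6f seconds" % elapsed)
```

This tells you accurately how long the solution took; it does not, and cannot, make the next solution start at an exact moment.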