I’ve been playing around with the Timer component for a while. At 1-second intervals it seems to behave fine, but when I set it to 1 millisecond in order to get a smoother visual result, 1000 ms no longer adds up to 1 second.
Is there some trick?
What is the refresh rate of the GH preview in Rhino? Do I need to manipulate the counter output somehow?
Thanks in advance.
The timer sets the delay between the end of one solution and the beginning of the next, so the real frequency is one over the sum of the solution duration and the timer delay. Also, UI applications simply do not get reliable timers on multitasking operating systems; it is always possible that some other process deemed more important is awarded a disproportionate share of CPU cycles in one go, putting everything else on hold.
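The point above can be sketched numerically (this is illustrative Python, not Grasshopper code; the function name and example timings are made up):

```python
# The effective update rate is 1 / (solution_time + timer_delay),
# not 1 / timer_delay, because the delay only starts counting after
# the solution has finished.
def effective_frequency_hz(solution_time_ms: float, timer_delay_ms: float) -> float:
    period_ms = solution_time_ms + timer_delay_ms
    return 1000.0 / period_ms

# A 1 ms timer driving a solution that takes 15 ms to compute
# only updates at 62.5 Hz, nowhere near 1000 Hz:
print(effective_frequency_hz(15.0, 1.0))  # → 62.5
```

So the heavier the definition, the further the real frequency drops below the nominal one.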
Yes, I’ve read about CPU interruptions, but I believe it is not only that.
This is a comparison between the two most accurate timekeeping apps I know: XnoteStopwatch and the Google Chrome app “Clock”.
Here you can notice what you mentioned about process priority, but their intervals are the same; they just “walk” one after the other.
Here you can see different counters inside GH compared with Xnote:
Comparison between XnoteStopwatch and 1st GH counter
Comparison between XnoteStopwatch and 2nd GH counter
Comparison between XnoteStopwatch and 3rd GH counter
And here is a simulation comparing one of the above-mentioned timers (set to a 1 ms interval) against Xnote. You can see how the deviation grows even larger.
How do I make a proper simulation that doesn’t drift over time, to make it “solution independent” somehow?
How can I make 1000ms = 1 s?
There is a big difference between measuring elapsed time intervals and doing something at specific intervals. It is quite easy to accurately measure a duration from a few centiseconds up to years, because the computer clock runs at an accurate rate. But the smaller the intervals, the more inaccuracy you get. By the time you get down to milliseconds, the possible delay between one event and the next can easily exceed the intended interval.
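To illustrate that distinction with a hedged sketch (plain Python, not Grasshopper; the 10 ms sleep stands in for “the timer fires, with jitter”): even when individual events are delayed, the total elapsed time read from a high-resolution clock stays accurate.

```python
import time

# Measure the total duration with a high-resolution clock instead of
# trusting the timer's nominal interval. Each "tick" may fire late,
# but the measured elapsed time is still correct.
start = time.perf_counter()
ticks = 0
while ticks < 5:
    time.sleep(0.01)  # stand-in for a timer event plus OS scheduling jitter
    ticks += 1
elapsed_ms = (time.perf_counter() - start) * 1000.0

# Nominally 5 x 10 ms = 50 ms; the measured value will be at least that,
# and usually a little more, because sleeps only guarantee a minimum delay.
print(elapsed_ms)
```

This is exactly why measuring a duration is easy while firing at exact millisecond boundaries is not.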
Again, these are not measuring the same thing. GH is measuring how often solutions occur, which is (roughly) 1 second + solution time, whereas the timer is measuring the difference between the current computer clock value and a previous computer clock value.
The question is what precisely do you need? Do you need new solutions to start at exactly 1 second intervals (if so, you’ll never be able to do that), or do you need an accurate way to measure an interval (that’s possible with the right code).
timer.gh (6.8 KB)
Here you can see that even though the Timer object fires at 1-second delays, the Span increases by more than one second every solution. However, because this is now taken into account, the measured duration remains accurate.
Thanks for the example David.
What I need is shown in the last video with the simulation. I want to calculate the location of the points during their fall in terms of milliseconds. So first I need a calculation of their location at each millisecond, and I want each point to actually be at that location at that time. If I understand you correctly, the timer, even when set to milliseconds, is not showing milliseconds but the number of calculations performed.
Do you suggest that I use different counters, one counting seconds and another providing the delta-time values? But if the calculations are not performed exactly on the millisecond, how can I be sure of the result? In fact, I’m sure it will not be correct. Alternatively, I’d need to store the results of the calculations and then play them back, making sure 1 millisecond equals 1 calculation. How can I do that in GH?
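One way around the “calculations don’t land exactly on the millisecond” problem is to compute the position from the actual elapsed time rather than from the number of solutions. Here is a hedged sketch in plain Python (not Grasshopper code; the free-fall formula and names are illustrative):

```python
import time

G = 9.81  # gravitational acceleration in m/s^2

def fall_distance(elapsed_s: float) -> float:
    """Closed-form free-fall distance after elapsed_s seconds (no drag)."""
    return 0.5 * G * elapsed_s ** 2

# Instead of assuming each solution equals one fixed millisecond,
# derive the position from the true elapsed time. Whenever the solution
# happens to fire (early, late, irregularly), the point is placed
# exactly where it should be at that instant.
start = time.perf_counter()
for _ in range(3):
    time.sleep(0.02)                  # stand-in for an irregular solution
    t = time.perf_counter() - start   # true elapsed seconds
    y = fall_distance(t)              # exact position for this instant
    print(f"t = {t * 1000:.1f} ms, fallen {y:.6f} m")
```

Because the position is a closed-form function of elapsed time, the animation becomes “solution independent”: irregular solution timing changes how often you sample the motion, not where the point is.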
Edit: @Marika_almgren @Luc
This seems like something Bongo could solve once they finish their GH components. Hopefully they will provide a component to create a keyframe for each calculation. That calculation could then go on the Bongo timeline at the appropriate millisecond, the tweening is done by Bongo, and voilà. Can I participate in the beta of the Bongo GH components?
Then you can use the code I posted. It allows you to very accurately measure the exact number of milliseconds since the last time you checked, thus giving you an accurate way to solve for a given timestep. It’s just not going to be a constant timestep.
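The variable-timestep idea can be sketched like this (again plain Python, not the posted GH code; the integration scheme and names are assumptions): measure the delta time since the last solution and advance the simulation by exactly that amount.

```python
import time

# Advance a falling point by the *measured* delta time rather than a
# fixed step. The timestep is not constant, but because each step uses
# the true elapsed time, the motion stays in sync with the wall clock.
G = 9.81
position = 0.0
velocity = 0.0
last = time.perf_counter()

for _ in range(5):
    time.sleep(0.005)            # irregular "solution" delay
    now = time.perf_counter()
    dt = now - last              # exact seconds since the last check
    last = now
    velocity += G * dt           # semi-implicit Euler step
    position += velocity * dt

print(position)
```

The trade-off David describes is visible here: the step size varies from iteration to iteration, but the accumulated motion tracks real time instead of drifting with solution count.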
We’re working on it and hope that we’ll have something done fairly soon. Just follow the McNeel Forum and you’ll get the latest news.