I don't understand how the time intervals really work

I am trying to automate a simulation using a timer, but I really don't get how it works. It is not real time; sometimes the delay is a little longer than the interval, and sometimes the lag is much worse. If I used a dedicated, more powerful server, would it take the same time with the same timer interval?

The timer sets the interval between the end of one solution and the start of the next, not from the start of one solution to the start of the next. So the effective refresh rate is timer interval + solution duration.
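To make that concrete, here is a minimal Python sketch (the names and numbers are illustrative, not from the original code) of the naive pattern: sleep for the timer interval after each solution finishes, so the real cycle time becomes interval plus solution time.

```python
import time

def run_naive(n_cycles, interval, solution_duration):
    """Naive loop: sleep for `interval` after each solution finishes.
    The real cycle time is therefore interval + solution_duration."""
    start = time.monotonic()
    for _ in range(n_cycles):
        time.sleep(solution_duration)  # stand-in for the solver's work
        time.sleep(interval)           # the timer's wait between solutions
    return time.monotonic() - start

elapsed = run_naive(n_cycles=5, interval=0.05, solution_duration=0.02)
# With a 50 ms interval and 20 ms of work, five cycles take roughly
# 5 * (0.05 + 0.02) = 0.35 s of wall time, not 5 * 0.05 = 0.25 s.
```

This is why a faster server shortens the total time only by shrinking the solution duration; the timer interval itself stays fixed.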

You can never guarantee that some code will absolutely, positively execute at a given time on a multi-tasking operating system. However, you can get closer to an exact interval if you're willing to write your own logic. You'd have to run a timer in a separate thread, then in that timer's tick handler invoke the main UI thread and tell it to start a new solution. There are lots of places where delays can creep in, though, so you must be mindful of that.
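The separate-thread idea can be sketched as below, assuming Python for illustration (the original thread does not name a language, and in a real GUI app the callback would be marshalled onto the UI thread rather than called directly). The key trick is to sleep until an absolute deadline instead of sleeping a fixed amount, so the solver's runtime does not accumulate into the period.

```python
import threading
import time

def run_fixed_rate(solve, interval, n_ticks):
    """Timer thread that fires against absolute deadlines, so the time
    spent in `solve` does not add to the period (as long as each solve
    finishes within one interval)."""
    starts = []

    def worker():
        deadline = time.monotonic()
        for _ in range(n_ticks):
            starts.append(time.monotonic())
            solve()                      # in a GUI app: post this to the UI thread
            deadline += interval
            remaining = deadline - time.monotonic()
            if remaining > 0:
                time.sleep(remaining)    # sleep only the leftover time

    t = threading.Thread(target=worker)
    t.start()
    t.join()
    return starts

# 20 ms of "solving" per tick, but the start-to-start gap stays near 50 ms.
starts = run_fixed_rate(lambda: time.sleep(0.02), interval=0.05, n_ticks=5)
gaps = [b - a for a, b in zip(starts, starts[1:])]
```

Even this only gets you close: `time.sleep` can oversleep, and the OS scheduler can preempt the thread, so the gaps will jitter by a few milliseconds.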


Thanks for this very detailed reply. One thing I have also noticed: each simulation takes about 90 seconds to compute, so something like 10 minutes for 6 simulations. When I repeated this with nothing changed (the 6 simulations are the same), there was a very long delay. I will try running a timer in a separate thread.
Thank you!