Hey guys,
In my C++ gauge I'm trying to 'animate' a variable by having it chase a target number at various speeds. This is something I've done many times in different sims, and ordinarily I use delta_time to normalise my rates for frame time (i.e. multiply the desired rate per second by the timestep). The function looks like this:
C++:
float animate(float tgt, float currentVal, float dt = 0.2f,
              float speed = 1.0f, float offset = 1.2f)
{
    if (currentVal > tgt + offset)
    {
        return currentVal - (speed * dt);  // step down toward the target
    }
    else if (currentVal < tgt - offset)
    {
        return currentVal + (speed * dt);  // step up toward the target
    }
    else
    {
        return tgt;  // within the offset band: snap to the target
    }
}
Reading up, as well as my own tests, suggests the update loop runs at about 18 Hz (~0.055 s per tick), so in my tests I've just passed in 0.055 as dt. With speed set to 1, I expect the variable to 'animate' at a rate of 1 unit/second (or close to it, since I'm passing dt in as a constant). It simply doesn't, though: what should take 5 minutes happens in about 15-20 seconds.
The relevant code flow in the update loop:
1. Read my lvar's current value
2. Perform calculations etc.
3. Animate the lvar via the function above
4. Set the lvar to the new value from step 3
My questions:
1. What is the standard way to get and store dt as a variable in C++? Should I just read the difference in the ABSOLUTE TIME simvar each update call?
2. Can anyone explain what could be causing the animation rate to be so far off? Or, if not, can anyone describe a tried-and-true method for normalizing rates to frame time?
EDIT: On a related note, how can I detect when the sim is paused? I'm using SimConnect currently, but it only detects the dev-mode pause and does not react to the Esc-key menu state (i.e. the 'pause' feature that users will actually use 99.9% of the time).
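On the pause question: if I recall correctly, the MSFS SDK added a "Pause_EX1" SimConnect system event whose event data carries flags distinguishing the different pause states (full pause, active pause, the Esc-menu stop), whereas the legacy "Pause" event misses some of them — worth verifying against the current SDK docs. A sketch, assuming the MSFS SimConnect headers (the enum name is mine, and the flag semantics should be checked against SimConnect.h):

```cpp
// Sketch only - requires the MSFS SDK (SimConnect.h) and a live sim to run.
enum EVENT_ID { EVENT_PAUSE_EX1 };  // arbitrary client event ID of my choosing

void subscribe(HANDLE hSimConnect)
{
    // "Pause_EX1" reports a richer pause state than the legacy "Pause" event.
    SimConnect_SubscribeToSystemEvent(hSimConnect, EVENT_PAUSE_EX1, "Pause_EX1");
}

void CALLBACK dispatch(SIMCONNECT_RECV* pData, DWORD cbData, void* context)
{
    if (pData->dwID == SIMCONNECT_RECV_ID_EVENT)
    {
        auto* evt = reinterpret_cast<SIMCONNECT_RECV_EVENT*>(pData);
        if (evt->uEventID == EVENT_PAUSE_EX1)
        {
            // dwData is a bitmask of pause-state flags; 0 means unpaused.
            const bool paused = (evt->dwData != 0);
            // ... skip the animate() step / freeze dt while paused
        }
    }
}
```

Alternatively, since ABSOLUTE TIME stops advancing while the sim is paused, a measured dt of ~0 from that simvar doubles as a pause check.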
Thanks