Denormal Numbers? Still a problem!

Hey guys,

I just ran some checks and found that there is still a problem with very small numbers. I am simulating a bank of damped oscillators, which means crunching very small numbers as they decay. I can see the CPU load going up significantly when I don’t touch the keys, even though the algorithm runs constantly. I already discovered this 20 years ago and assumed back then it was a quirk of that processor. Apparently that’s not the case. Is there a clean solution or a workaround to deal with this?
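Here is a stripped-down sketch of what I mean (the coefficient and step count are made-up illustrative values, not from my actual patch):

```cpp
#include <cassert>
#include <cfloat>

// One-pole decay y[n] = r * y[n-1], a stand-in for a damped oscillator's
// amplitude envelope. Coefficient and step count are illustrative only.
float decay(float y0, float r, int steps) {
    float y = y0;
    for (int n = 0; n < steps; ++n)
        y *= r;
    return y;
}

// After enough steps the state falls below FLT_MIN (the smallest normal
// float) into the subnormal ("denormal") range: still nonzero, but
// handled by slow microcode paths on many CPUs.
```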

Thanks a lot!

You can set the corresponding MXCSR flags (FTZ/DAZ) as described here
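For the record, a minimal way to set those bits on x86 (FTZ is bit 15, DAZ is bit 6 of MXCSR) looks roughly like this; treat it as a sketch, since the exact setup depends on your compiler and host:

```cpp
#include <cfloat>
#include <xmmintrin.h>  // _mm_getcsr / _mm_setcsr (x86 SSE)

// Set FTZ (flush-to-zero: subnormal results become 0) and DAZ
// (denormals-are-zero: subnormal inputs are treated as 0) in MXCSR.
// Equivalent to _MM_SET_FLUSH_ZERO_MODE / _MM_SET_DENORMALS_ZERO_MODE.
void enable_ftz_daz() {
    const unsigned ftz = 1u << 15;  // MXCSR bit 15
    const unsigned daz = 1u << 6;   // MXCSR bit 6
    _mm_setcsr(_mm_getcsr() | ftz | daz);
}

// With FTZ on, a decaying state snaps to exactly 0.0f once it would go
// subnormal, so the slow denormal paths are never taken. The volatile
// keeps the compiler from folding the loop at compile time, so the
// runtime MXCSR setting is what actually governs the result.
float decay(float y0, float r, int steps) {
    volatile float y = y0;
    for (int n = 0; n < steps; ++n)
        y = y * r;
    return y;
}
```

Note that MXCSR is per-thread state: you have to set it on every audio thread, and it changes float behavior for the whole thread, not just your loop.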

But denormals are not supposed to incur such a performance penalty on recent CPUs, AFAIK.
Perhaps you have some kind of branching in your loop that makes the CPU take a different code path depending on the input level?
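Another common workaround in audio code, independent of MXCSR, is to add a tiny constant in the feedback path so the state can never reach the subnormal range. The value 1e-18f below is a conventional choice of mine, not from any particular codebase:

```cpp
#include <cassert>
#include <cfloat>

// Keep a decaying state out of the subnormal range by mixing in a tiny
// DC offset. 1e-18f is far below audibility, yet it is itself a normal
// float, so the state can never decay below it into denormals.
constexpr float kAntiDenormal = 1e-18f;

float decay_with_offset(float y0, float r, int steps) {
    float y = y0;
    for (int n = 0; n < steps; ++n)
        y = y * r + kAntiDenormal;
    return y;
}

// Instead of decaying into denormals, the state settles near the fixed
// point kAntiDenormal / (1 - r); e.g. with r = 0.999f that is roughly
// 1e-15, comfortably in the normal float range.
```

Some codebases alternate the offset's sign or use a tiny noise signal instead, so that DC-blocking filters downstream don't remove it.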