Who among you hasn't struggled with denormalized floating-point numbers? I hate them, for many reasons. When I started doing audio DSP I stumbled upon them at once, but what I didn't realize right away was that not all platforms suffer from the phenomenon as much as x86 does. In fact, I'm pretty sure the old PPC didn't even slow down. How it is on ARM I don't know. x64? SIMD? Various DSPs?
I have come across several implementations for un-denormalization, some faster than others.
I have also come across programmers who have no idea when to expect denormalized numbers to appear, and so try to "remove" them all the time, with an unnecessary performance penalty as a result.
As there is a disproportionate number of audio DSP programmers on this forum, I propose that those of us who are interested put our heads together and come up with a set of detection techniques and remedies. This need not be a huge "project", but I'm pretty sure many out there don't really master numerical stability. With a bit of luck, jules may even include some code in juce if it's good enough.
How about it?