By: Wilco (Wilco.Dijkstra.delete@this.ntlworld.com), July 12, 2013 4:08 am
Room: Moderated Discussions
Wilco (Wilco.Dijkstra.delete@this.ntlworld.com) on July 11, 2013 7:26 am wrote:
> none (none.delete@this.none.com) on July 11, 2013 5:12 am wrote:
> > Wilco (Wilco.Dijkstra.delete@this.ntlworld.com) on July 11, 2013 5:03 am wrote:
> > [...]
> > > Which benchmark is affected by denormals? I thought pretty much any modern
> > > CPU nowadays deals with denormals in hardware with minimal penalty...
> >
> > That's an Intel claim, so I can't say. I have no reason not to believe it.
>
> Alright. I would personally want to see some hard evidence, such as which benchmarks are affected
> and by how much. The 2 FP benchmarks where Atom does really bad are blur and sharpen image,
> but it's hard to see how you could accidentally make a simple filter use denormals.
Here is what Primate Labs said on the AnandTech forums:
"John from Primate Labs here (the company behind Geekbench).
I wanted to provide some details about what's going on with the floating point workloads the Silvermont architect referenced. Two of the Geekbench 2 floating point workloads (Sharpen Image and Blur Image) have a fencepost error. This error causes the workloads to read uninitialized memory, which can contain denorms (depending on the platform). This causes a massive drop in performance, and isn't representative of real-world performance.
We only found out about this issue a couple of months ago. Given that Geekbench 3 will be out in August, and fixing the issue in Geekbench 2 would break the ability to compare Geekbench 2 scores, we made the call not to fix the issue in Geekbench 2.
If you've got any questions about this (or about anything Geekbench) please let me know and I'd be happy to answer them. My email address is john at primatelabs dot com if you'd prefer to get in touch that way."
So I guess that explains the accidental use of denormals.
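To make the mechanism concrete, here is a minimal, hypothetical sketch (not Geekbench's actual code) of how an off-by-one loop bound in a 3-tap blur filter can pull a stale value from just past the image row into the arithmetic. If that stale slot happens to hold a denormal, every filter pass feeds it into the FP pipeline, which is exactly where hardware that traps to microcode on subnormal operands falls off a cliff:

```python
import sys

SUBNORMAL = 5e-324  # smallest positive denormal (subnormal) double

def blur3(src, n, stop):
    """3-tap box blur over a row of n initialized pixels.

    'stop' is the upper loop bound: n-1 is correct (reads src[0..n-1]);
    passing n reproduces the fencepost bug, so the last iteration reads
    src[n] -- one element past the initialized region.
    """
    return [(src[i - 1] + src[i] + src[i + 1]) / 3.0 for i in range(1, stop)]

n = 8
# Simulate a heap allocation where the slot just past the image row holds
# leftover (uninitialized) memory that happens to be a denormal.
row = [1.0] * n + [SUBNORMAL]

good = blur3(row, n, n - 1)   # correct bound: never touches row[n]
buggy = blur3(row, n, n)      # fencepost bug: last iteration reads row[n]

# The stale value really is subnormal (nonzero but below the smallest normal).
assert 0.0 < row[n] < sys.float_info.min
# The buggy loop produces one extra output computed from the denormal operand.
print(len(good), len(buggy))  # prints: 6 7
```

The result of `(1.0 + 1.0 + 5e-324) / 3.0` is numerically indistinguishable from 2/3, so the output looks fine; the cost is purely in the hardware handling of the subnormal *operand*, which is why the bug went unnoticed until someone profiled it.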
Wilco