By: vvid (no.delete@this.thanks.com), June 23, 2020 4:51 pm
Room: Moderated Discussions
pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 23, 2020 4:06 pm wrote:
> nobody in particular (nobody.delete@this.nowhe.re) on June 23, 2020 1:30 pm wrote:
> > You realize Apple's graphics are Apple-developed and totally
> > different from the GPUs in any Android system, yes?
>
> Doesn't matter, as the Apple performance tested in that article was far lower than even standard AMD iGPUs
> from more than a few generations ago, even if the other ARM GPUs are 2-5 times slower than that. For years
> Intel iGPUs were laughed at because they either pulled shenanigans to look faster or were too slow.
> >
> > Also, care to offer some evidence for your claim of using 16-bit ints?
>
> It seems like the Apple performance article I read was wrong in certain details. However,
> the Apple GPU uses 16-bit floating point where the others use 32-bit single-precision floats:
>
> https://www.realworldtech.com/apple-custom-gpu/
PowerVR has had unified shaders since SGX (Series5).
Historically it has been able to run arbitrary code on the shader units, even firmware (the SGX microkernel).
http://cdn.imgtec.com/sdk-documentation/PowerVR+Series5.Architecture+Guide+for+Developers.pdf
The shader cores have multiple ALUs of different precisions, both FP32 and FP16.
The same is covered for Series 6 (Rogue) here:
https://www.anandtech.com/show/7793/imaginations-powervr-rogue-architecture-exposed/2
> This of course would bite Apple in the ass if image quality standards were applied to it
> as they are, and were, to AMD, nVidia, and Intel GPUs.
You're totally misguided. You think that FP16 rendering is bad, but in reality it is fast and power-efficient, which is why FP16 ALUs were reintroduced in both AMD and Nvidia cards.
FP16 is used in places where its limited range and precision do not cause artifacts; see the sketch below.
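As a rough illustration only (not tied to anything Apple or Imagination actually ship), here is a minimal Metal Shading Language sketch of the usual split: half (FP16) for color math that stays near [0,1], float (FP32) for the position and texture coordinates that actually need the range. The struct and function names are made up for the example; only the MSL types and attributes are real.

#include <metal_stdlib>
using namespace metal;

struct VertexOut {
    float4 position [[position]];  // FP32: clip-space position needs the range/precision
    float2 uv;                     // FP32: texture coordinates over a large surface
    half3  tint;                   // FP16: color-like data, small range
};

fragment half4 shade_main(VertexOut in [[stage_in]],
                          texture2d<half> albedo [[texture(0)]],
                          sampler smp [[sampler(0)]])
{
    // Color math in half: the values stay near [0,1], so FP16's limited range
    // causes no visible artifacts here, while the narrower ALUs and registers
    // save power and bandwidth on a mobile GPU.
    half4 base = albedo.sample(smp, in.uv);
    return half4(base.rgb * in.tint, base.a);
}

Depth, world-space positions and large texture coordinates are the places where FP16 does fall apart, which is why the FP32 ALUs sit alongside the FP16 ones.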
> Besides, Apple renders Civ6 at 1/4 of the resolution that it displays on the iPad Pro by default.
Apple? I don't think Apple controls developer decisions to that extent.
>
> As more and more information comes out, it looks like Apple doesn't give a rat's ass about the
> needs of its customers. And sooner or later, it will cost them dearly.
>
> Pete