By: vvid (no.delete@this.thanks.com), June 24, 2020 12:52 pm
Room: Moderated Discussions
pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 24, 2020 8:48 am wrote:
> vvid (no.delete@this.thanks.com) on June 23, 2020 5:51 pm wrote:
> > pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 23, 2020 4:06 pm wrote:
> > > nobody in particular (nobody.delete@this.nowhe.re) on June 23, 2020 1:30 pm wrote:
> >
> > > > You realize Apple's graphics are Apple-developed and totally
> > > > different from the GPUs in any Android system, yes?
> > >
> > > Doesn't matter as the performance of Apple tested in that
> > > article was far lower than even standard AMD iGPUs
> > > of more than a few generations ago. Even if the others ARM GPUs are 2-5 times slower than that. For years
> > > Intel iGPUs were laughed at because they either pulled shenanigans to look faster or were too slow.
> > > >
> > > > Also, care to offer some evidence for your claim of using 16-bit ints?
> > >
> > > It seems like the Apple performance article I read was wrong in certain details. However
> > > Apple GPU uses 16 bit floating point instead of others using 32 bit SP Floats:
> > >
> > > https://www.realworldtech.com/apple-custom-gpu/
> >
> > PowerVR has unified shaders since MBX/SGX.
> > Historically it can run arbitrary code on the shader units, even firmware (SGX Micro Kernel).
> > http://cdn.imgtec.com/sdk-documentation/PowerVR+Series5.Architecture+Guide+for+Developers.pdf
> > Shader cores have multiple ALUs with different precision - both FP32 and FP16.
> >
> > This is for Series 6:
> > https://www.anandtech.com/show/7793/imaginations-powervr-rogue-architecture-exposed/2
> >
> > > This of course would bite Apple in the ass, if image quality standards were applied
> > > as they are and were to AMD, nVidia, and Intel GPUs.
> >
> > You're totally misguided. You think that FP16 rendering is bad, but in reality it is fast and
> > power efficient. This is why FP16 ALUs were reintroduced in both AMD and Nvidia cards.
> > FP16 is used in places where limited range does not cause artifacts.
> >
>
> You had better learn something about things before shooting your mouth off. 16 bit HP FP on
> both nVidia and Radeon GPUs is for AI and not graphics.
AI? Seems you're confusing the FP16 tensor cores with the FP16 precision available in the shader cores.
"Double speed FP16" was introduced with Maxwell for Tegra.
I think FP16 is widely used on the Nintendo Switch (for graphics).
For desktops it is probably not worth the hassle.
https://therealmjp.github.io/posts/shader-fp16/
> Graphics IQ with 16 bit rendering is
> worse than with 32 bit. Banding and distance are some areas that show differences between the
> methods. Software rendering which uses 32 bit SP on those AMD64 CPUs which was used for comparisons
> in the old IQ wars and 16 bit FP is not available on those CPUs. AI is moving to 8 bit FP or
> even 4 bit integers for more performance so those GPUs are adding those too.
PowerVR supports FP32 and FP16 at the same time. If an FP16 calculation results in banding,
just use FP32. But for the majority of stuff, like texture color modulation, FP16 is enough.
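To make that concrete, here is a small stand-alone C++ sketch (my own host-side illustration, not shader or engine code; the half emulation flushes denormals and ignores infinities). It pushes a texel * light modulation through FP16 and compares it against the FP32 result: for an 8-bit output the FP16 error stays below one quantization step, so the banding argument doesn't apply here.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <cstring>

// Float -> half (round to nearest, normals only; tiny values flush to zero,
// which is fine for colours in [0, 1]).
static uint16_t float_to_half(float f) {
    uint32_t x;
    std::memcpy(&x, &f, 4);
    uint32_t sign = (x >> 16) & 0x8000u;
    int32_t  exp  = (int32_t)((x >> 23) & 0xFFu) - 127 + 15;  // rebias 8-bit -> 5-bit exponent
    uint32_t mant = x & 0x7FFFFFu;
    if (exp <= 0)  return (uint16_t)sign;                     // underflow -> signed zero
    if (exp >= 31) return (uint16_t)(sign | 0x7C00u);         // overflow  -> infinity
    uint16_t h = (uint16_t)(sign | ((uint32_t)exp << 10) | (mant >> 13));
    if (mant & 0x1000u) h += 1;                               // round the dropped bits up
    return h;
}

// Half -> float (normals and zero only).
static float half_to_float(uint16_t h) {
    uint32_t sign = (uint32_t)(h & 0x8000u) << 16;
    uint32_t exp  = (h >> 10) & 0x1Fu;
    uint32_t mant = h & 0x3FFu;
    uint32_t x    = (exp == 0) ? sign
                               : sign | ((exp - 15 + 127) << 23) | (mant << 13);
    float f;
    std::memcpy(&f, &x, 4);
    return f;
}

static float as_half(float f) { return half_to_float(float_to_half(f)); }

int main() {
    // Texture colour modulation: texel * light, once in FP32 and once with
    // every operand and the product rounded to FP16.
    float worst = 0.0f;
    for (int t = 0; t <= 255; ++t) {
        for (int l = 0; l <= 255; ++l) {
            float texel = t / 255.0f, light = l / 255.0f;
            float full    = texel * light;                    // FP32 result
            float reduced = as_half(as_half(texel) * as_half(light));
            worst = std::max(worst, std::fabs(full - reduced));
        }
    }
    // One step of an 8-bit output is 1/255 ~ 0.0039; the FP16 error stays below it.
    std::printf("worst FP16 error = %g, 8-bit step = %g\n", worst, 1.0 / 255.0);
    return 0;
}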
> You sound like those guys about good enough being great. 640K was one such statement that has been
> shown to be ridiculous. 320x240 was good enough (NOT!). It has been shown time and time again that
> good enough because of hardware limitations fails at some point. Good enough because of physical
> attributes endures. The latter of human eye properties works for AA and AF. Human eye properties
> don't change much over time (except for old people which they get worse as they age). The former
> like 16 bit FP is good enough for 8 bit displays. Well displays are at 8 or 10 bits due to limitations
> of LCDs. With OLEDs they can go to 12, 14, or even 16 bits. Then you get banding using 16 bit floating
> point because eyes will see it. So good enough becomes not good at all.
14-16 bit displays are totally irrelevant now.
It is somewhat stupid to aim for the displays of 2030 in 2020 and limit yourself to, say, half of the possible framerate just on that basis.
I guess even an "IQ pedant" can understand the compromise.
Games are written for current hardware, not for some future supercomputers (except Crysis).
You'll be surprised, but HDR games today use render target formats like DXGI_FORMAT_R11G11B10_FLOAT.
That's only 6 bits of mantissa in the red and green channels, and 5 in the blue.
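For reference, a rough C++ sketch of unpacking that format (my own illustration, not SDK code; denormals and infinities are ignored). Each 11-bit channel is an unsigned mini-float with a 5-bit exponent (bias 15) and a 6-bit mantissa, the 10-bit blue channel gets only 5 mantissa bits, and there is no sign bit.

#include <cmath>
#include <cstdint>
#include <cstdio>

// Decode one channel of DXGI_FORMAT_R11G11B10_FLOAT (illustration only).
static float decode_channel(uint32_t bits, int mantissa_bits) {
    uint32_t mant = bits & ((1u << mantissa_bits) - 1u);
    uint32_t exp  = bits >> mantissa_bits;                  // bits already masked to the channel
    if (exp == 0) return 0.0f;                              // treat denormals as zero
    float m = 1.0f + (float)mant / (float)(1u << mantissa_bits);
    return std::ldexp(m, (int)exp - 15);
}

int main() {
    // A packed texel: red in bits 0..10, green in 11..21, blue in 22..31.
    uint32_t texel = 0x3DEF7BDEu;                           // arbitrary example value
    float r = decode_channel( texel        & 0x7FFu, 6);
    float g = decode_channel((texel >> 11) & 0x7FFu, 6);
    float b = decode_channel((texel >> 22) & 0x3FFu, 5);
    std::printf("r=%g g=%g b=%g\n", r, g, b);
    // With 6 (or 5) mantissa bits, adjacent representable values near 1.0 are
    // about 1.6% (3.1% for blue) apart -- far coarser than FP16, yet HDR render
    // targets use it all the time.
    return 0;
}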
> > > Besides Apple renders Civ6 at 1/4 the resolution than it displays on the iPad Pro by default.
> >
> > Apple? I don't think Apple controls developer decisions to that extent.
>
> They render fully on other android devices, so its not likely the developer,
> but Apple. They control the "ecosystem". Ditto for the frame rate lock.
>
> Pete
>
> >
> > >
> > > As far as more and more information comes out, Apple gives a rats ass to the
> > > needs of its customers. And sooner or later, it will cost them dearly.
> > >
> > > Pete