By: anon (anon.delete@this.anontech.anon), June 24, 2020 1:22 pm
Room: Moderated Discussions
pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 24, 2020 8:48 am wrote:
> vvid (no.delete@this.thanks.com) on June 23, 2020 5:51 pm wrote:
> > pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 23, 2020 4:06 pm wrote:
> > > nobody in particular (nobody.delete@this.nowhe.re) on June 23, 2020 1:30 pm wrote:
> >
> > > > You realize Apple's graphics are Apple-developed and totally
> > > > different from the GPUs in any Android system, yes?
> > >
> > > Doesn't matter, as the Apple performance tested in that article was far lower than even
> > > standard AMD iGPUs from more than a few generations ago, even if the other ARM GPUs are
> > > 2-5 times slower still. For years Intel iGPUs were laughed at because they either
> > > pulled shenanigans to look faster or were too slow.
> > > >
> > > > Also, care to offer some evidence for your claim of using 16-bit ints?
> > >
> > > It seems like the Apple performance article I read was wrong in certain details. However,
> > > the Apple GPU uses 16-bit floating point where others use 32-bit SP floats:
> > >
> > > https://www.realworldtech.com/apple-custom-gpu/
> >
> > PowerVR has had unified shaders since MBX/SGX.
> > Historically they can run arbitrary code on the shader units, even firmware (the SGX Micro Kernel).
> > http://cdn.imgtec.com/sdk-documentation/PowerVR+Series5.Architecture+Guide+for+Developers.pdf
> > Shader cores have multiple ALUs with different precision - both FP32 and FP16.
> >
> > This is for Series 6:
> > https://www.anandtech.com/show/7793/imaginations-powervr-rogue-architecture-exposed/2
> >
> > > This of course would bite Apple in the ass if the image quality standards that are and
> > > were applied to AMD, nVidia, and Intel GPUs were applied to it.
> >
> > You're totally misguided. You think that FP16 rendering is bad, but in reality it is fast and
> > power efficient. This is why FP16 ALUs were reintroduced in both AMD and Nvidia cards.
> > FP16 is used in places where limited range does not cause artifacts.
> >
>
> You had better learn something before shooting your mouth off. 16-bit HP FP on both
> nVidia and Radeon GPUs is for AI, not graphics. Graphics IQ with 16-bit rendering is
> worse than with 32-bit; banding and distant detail are some areas that show differences
> between the methods. The software rendering used for comparisons in the old IQ wars ran
> at 32-bit SP on AMD64 CPUs, since 16-bit FP is not available on those CPUs. AI is moving
> to 8-bit FP or even 4-bit integers for more performance, so those GPUs are adding those too.
>
> You sound like those guys who claim good enough is great. "640K ought to be enough" was one
> such statement that has been shown to be ridiculous. 320x240 was good enough (NOT!). It has
> been shown time and time again that "good enough" dictated by hardware limitations fails at
> some point, while "good enough" grounded in physical attributes endures. The latter, based
> on properties of the human eye, works for AA and AF, since those properties don't change
> much over time (except that vision worsens with age). The former, like 16-bit FP, is good
> enough only for 8-bit displays. Displays are currently at 8 or 10 bits due to limitations
> of LCDs, but with OLEDs they can go to 12, 14, or even 16 bits. Then you get banding with
> 16-bit floating point, because eyes will see it. So good enough becomes not good at all.
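
The banding claim above is easy to check numerically: round-trip a linear 12-bit gradient ramp through IEEE half precision and count how many distinct display codes survive. This is a rough sketch using Python's `struct` half-float (`'e'`) format; the exact surviving-code count depends on rounding details, but the collapse near white is real, since FP16's 11-bit significand gives a step of 2^-11 in [0.5, 1), coarser than a 12-bit code step of 2^-12.

```python
import struct

def to_fp16(x):
    # Round-trip a Python float through IEEE binary16 ('e' format).
    return struct.unpack('<e', struct.pack('<e', x))[0]

N = 4096  # number of codes on a hypothetical 12-bit panel
ramp = [i / (N - 1) for i in range(N)]  # linear 0..1 gradient

# Quantize to 12-bit display codes, with and without an FP16 round trip.
codes_fp32 = {round(v * (N - 1)) for v in ramp}           # full-precision path
codes_fp16 = {round(to_fp16(v) * (N - 1)) for v in ramp}  # FP16-limited path

# FP16 keeps every code in the dark half of the ramp, but in the bright
# half its step (2^-11) spans two 12-bit codes, so codes merge: banding.
print(len(codes_fp32), len(codes_fp16))
```

On an 8-bit panel the same experiment shows no collapse at all, which is why FP16 render targets have historically been considered "good enough" there.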
>
> > > Besides, Apple renders Civ6 at 1/4 of the resolution it displays on the iPad Pro by default.
> >
> > Apple? I don't think Apple controls developer decisions to that extent.
>
> They render fully on other Android devices, so it's not likely the developer,
> but Apple. They control the "ecosystem". Ditto for the frame rate lock.
You're simply wrong. As the AnandTech article (and others) have shown, there's a config text file in the app's sandbox that can be modified, and people have used it to unlock 4x resolution in the app. As for the frame rate lock, there is clearly no 27 FPS lock from the OS, since there are games that run at 120 FPS on iPad, including some very popular ones like Fortnite.
It seems you just have a grudge and don't want to look at the facts.
>
> Pete
>
> >
> > >
> > > As more and more information comes out, it's clear Apple doesn't give a rat's ass
> > > about the needs of its customers. And sooner or later, it will cost them dearly.
> > >
> > > Pete
> >
> >
> >
>
>