By: nobody in particular (nobody.delete@this.nowhe.re), June 23, 2020 4:27 pm
Room: Moderated Discussions
pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 23, 2020 4:06 pm wrote:
> nobody in particular (nobody.delete@this.nowhe.re) on June 23, 2020 1:30 pm wrote:
> > pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 23, 2020 10:09 am wrote:
> > > Maynard Handley (name99.delete@this.name99.org) on June 22, 2020 7:55 pm wrote:
> > > > pgerassi (pgerassi2012.delete@this.wi.rr.com) on June 22, 2020 5:29 pm wrote:
> > > > > Chester (lamchester.delete@this.gmail.com) on June 22, 2020 1:32 pm wrote:
> > > > > > Maynard Handley (name99.delete@this.name99.org) on June 22, 2020 11:44 am wrote:
> > > > > > > Maynard Handley (name99.delete@this.name99.org) on June 22, 2020 11:26 am wrote:
> > > > > > > > It's real!!!!
> > > > > > >
> > > > > > > Well we saw something of how x86 apps will work.
> > > > > > > Any comments from the various people making strong claims regarding this?
> > > > > > > Of particular interest was (img)
> > > > > >
> > > > > > We haven't seen much in terms of CPU performance. Several comments:
> > > > > > - Showing off Word/Excel/Powerpoint is strange. Those apps run fine on an underclocked Atom.
> > > > > > - DNG files in Lightroom - weird they didn't show exporting/raw conversion. Getting
> > > > > > low res previews of effects was very fast on 2013-era mobile Haswell. Maybe they didn't
> > > > > > show export because FPU performance is one of Intel's strengths (2x256-bit AVX execution
> > > > > > units), and processing high res images really takes advantage of that.
> > > > > > - Maya - I don't have Maya, but Blender's workspace view is a very light GPU load. I suspect
> > > > > > it's the same for Maya. If they were confident in CPU performance, they'd show a CPU render.
> > > > > > They did not.
> > > > > > - Playback of multiple 4K streams - just means their GPU has a modern video engine. Intel's
> > > > > > iGPUs could do this years ago.
> > > > > > - Tomb Raider - they ran through a small, isolated area without any enemies/allies present.
> > > > > > I expect the weakest CPUs to have no trouble with that.
> > > > >
> > > > > I looked into Apple GPU performance. What they don't tell people is that their GPU renders using 16 bit
> > > > > integers versus AMD/NV/Intel, which use 32 bit floating point. That is much easier for their GPU. It's
> > > > > like comparing ping pong balls versus basketballs. Manipulating the latter is a lot more work. And that
> > > > > is how they look more efficient or faster on these lightweight benchmarks on iPads/phones and such.
> > > > >
> > > > > When they have to do rendering for games without losing fidelity, they are much slower. Their
> > > > > top end A12Z iGPU in an iPad gets 27FPS on a 1080p AAA game like Tomb Raider while a 6CU GCN 2700U
> > > > > APU gets 52FPS. Yet the Geekbench graphics comparison says the A12Z has 1.8 times the 2700U's
> > > > > score. And of course there is the "we do better against a 2 year old part with our brand new
> > > > > part" problem. The A14 will need to compete with Ryzen 5xxxU APUs, not Ryzen 2700U APUs.
> > > > >
> > > > > So let's wait for the 3rd party benchmarks doing real work or a facsimile thereof.
> > > > >
> > > >
> > > > I'm sorry but WTF are you talking about? "GPU renders using 16 bit integers"
> > > > ??? Do you mean halves (i.e. 16 bit FLOATs)? Of course Apple GPUs use these
> > > > for some (not all) purposes, just like any modern GPU from nV or ATI.
> > > >
> > > > As for Tomb Raider perhaps you can tell us EXACTLY what game you are talking about. There
> > > > are multiple games called something or other to do with Tomb Raider, some of which date from
> > > > seven years ago. I've no idea which you mean, if you're even comparing the same two things,
> > > > or if the iPad version you are running is even using Metal as opposed to OpenGL.
> > >
> > > You have no right to demand proof when you hand wave continuously and never give
> > > proof yourself. And yes, for the Geekbench graphics tests, Apple's GPU renders using 16 bit
> > > integers and not 32 bit SP floating point like AMD, Nvidia, or Intel.
> > >
> > > BTW there are AAA games that run on ARM. Civ6 is one that does run on Android, which is ARM. However
> > > they don't run it anywhere near as fast graphics-wise as a low end AMD64 laptop, like my many year
> > > old Lenovo Idea 300, which is an A10-8700 with 6CUs at 800MHz and a single channel of 8GB DDR3-1333.
> > > The Apple GPU performance article said the 2700U had 6CUs, but the 2300U is the one with 6CUs at
> > > 1100MHz and likely dual channel 2x4GB DDR4-2666. The 2700U has 10CUs at 1300MHz. The current one
> > > for comparison is the 4700U with 7CUs at 1600MHz in the same 12-27W cTDP envelope.
> > >
> > > Pete
> >
> > You realize Apple's graphics are Apple-developed and totally
> > different from the GPUs in any Android system, yes?
>
> Doesn't matter, as the performance of the Apple GPU tested in that article was far lower than even standard
> AMD iGPUs from more than a few generations ago, even if the other ARM GPUs are 2-5 times slower than that.
> For years Intel iGPUs were laughed at because they either pulled shenanigans to look faster or were too slow.
> >
> > Also, care to offer some evidence for your claim of using 16-bit ints?
>
> It seems like the Apple performance article I read was wrong in certain details. However,
> the Apple GPU uses 16 bit floating point where the others use 32 bit SP floats:
>
> https://www.realworldtech.com/apple-custom-gpu/
>
> This of course would bite Apple in the ass if the same image quality standards were applied
> to them as are (and were) applied to AMD, nVidia, and Intel GPUs. Besides, Apple renders Civ6
> at 1/4 of the resolution it displays on the iPad Pro by default.
>
> As more and more information comes out, it's clear Apple doesn't give a rat's ass about the
> needs of its customers. And sooner or later, that will cost them dearly.
>
> Pete
>
>
>
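First, a quick aside on the fp16 point: what's in play there is half precision floats in shaders, not 16 bit integers, and the question is how much precision a given shader actually needs. If you want to see what the half-vs-single gap looks like numerically, here is a rough sketch using numpy's float16/float32 as stand-ins for GPU half/single precision - an illustration of the formats only, not a claim about what any particular GPU or game computes in:

import numpy as np

# Machine epsilon: the smallest step above 1.0 each format can represent.
print("half eps:  ", np.finfo(np.float16).eps)   # ~9.77e-04
print("single eps:", np.finfo(np.float32).eps)   # ~1.19e-07

# Accumulate 10,000 increments of 0.3 in each format.
acc16 = np.float16(0.0)
acc32 = np.float32(0.0)
for _ in range(10000):
    acc16 = np.float16(acc16 + np.float16(0.3))
    acc32 = acc32 + np.float32(0.3)

# Half precision stalls near 1024, where 0.3 is smaller than half the spacing
# between adjacent representable values; single precision lands near 3000.
print("half sum:  ", acc16)
print("single sum:", acc32)

Whether a gap like that ever shows up on screen depends on which shaders actually run in half precision, and that's exactly the part none of us can see from a benchmark score.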
If you mean the Anandtech Civ6 result, you're ignoring the fact that the iPad port - across all iPads, including low-end ones - is framerate-locked. That tells you nothing about the iPad Pro's GPU performance except that it has no trouble hitting 27fps.
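To be concrete about why a cap hides headroom, here's a toy frame-limiter loop. The 30fps cap and the 5ms frame time are made-up numbers for illustration, not measurements of the iPad port or any real device:

import time

TARGET_FPS = 30                    # hypothetical cap, for illustration only
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)              # pretend the GPU needs only 5 ms per frame

frames = 0
start = time.perf_counter()
while time.perf_counter() - start < 2.0:   # run the "game" for ~2 seconds
    t0 = time.perf_counter()
    render_frame()
    # The lock: sleep away whatever is left of the frame budget.
    leftover = FRAME_BUDGET - (time.perf_counter() - t0)
    if leftover > 0:
        time.sleep(leftover)
    frames += 1

elapsed = time.perf_counter() - start
print(f"measured FPS: {frames / elapsed:.1f}")   # ~30, no matter how fast render_frame is

The reported figure is basically the cap, whether the renderer needs 5 ms or 25 ms per frame, so a capped result tells you the GPU keeps up with the lock and nothing more.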