By: zzyzx (zzyzx.delete@this.zzyzx.sh), May 22, 2022 2:46 am
Room: Moderated Discussions
Jukka Larja (roskakori2006.delete@this.gmail.com) on May 21, 2022 9:48 pm wrote:
> Some time ago there was a "scandal" about a game not running on some (mostly old AMD) CPUs.
> Turned out the game was using POPCNT, which according to Steam Hardware Survey was missing
> from about 1-2 % of Steam users' CPUs at that time. It's rather surprising that Assassin's
> Creed Odyssey even tried to require AVX. It would be interesting to know how they actually
> fixed it. Did they just drop AVX altogether, ship two binaries or make a runtime choice?
I don't know how they fixed it.
The POPCNT one sounds familiar now that you mention it, and IIRC there was another game in the AC:Od timeframe that launched requiring AVX but made less of a fuss because the devs were clear from the start that they'd fix it.
At this point (4 years later) I bet you could just about get away with requiring AVX for something with already high-ish sysreqs. Intel only very recently stopped fusing AVX off on Pentiums and Celerons, but I was surprised at the time of the AC:Od thing that hardly anyone even mentioned those (and by now the CPUs without AVX are weak enough that it's understandable). All of the concern was about Nehalem and Westmere.
> I admit that the newest generation of consoles are different, but the last time there was a big jump
> in console CPU performance (PS3/Xbox360 to PS4/Xbox One), SIMD usage dropped. My understanding is that
> the reason was that GPU performance and programmability increased even more (also: CPU SIMD performance
> per core dropped, while core count and per core non-SIMD performance went up), which is again the case
> with newest generation (this time the CPU SIMD performance per core has also gone up though).
I haven't seen any confirmation on this, but if the PS5's FPUs are 128-bit as it looks on die shots, this gen should be another significant decrease in effective vector:scalar ratio (to whatever extent the PS5 is the lowest common denominator and determines how things are built).