I did some rough napkin math that's probably wrong. If we assume the worst-case scenario (256 Ampere cores), we'd get about a 40%+ performance increase (also assuming everything scales linearly, which it doesn't, and averaging two game results from 3 benchmarking videos)
using the GT 1030 and the T400 as the best like-for-like comparison that's currently available
RDR2 has ~30% performance increase
Cyberpunk has ~50% performance increase
and that's before DLSS (assuming DLSS, or a higher-performance equivalent, can run on 2 tensor cores) and before the architectural gains from Ampere (and possibly Lovelace). A reasonable expectation for a "Pro", but I think Nintendo will end up better than this
performance sourcing
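The averaging above can be sketched in a few lines. The uplift figures are the post's own rough estimates (GT 1030 vs. T400 as stand-ins), not benchmarks run here, and linear scaling is an acknowledged simplification:

```python
# Napkin math: average the two estimated per-game uplifts from the
# GT 1030 -> T400 comparison. Figures are the post's rough estimates.
rdr2_uplift = 0.30       # ~30% increase in Red Dead Redemption 2
cyberpunk_uplift = 0.50  # ~50% increase in Cyberpunk 2077

# Simple mean of the two games, assuming (unrealistically) linear scaling
avg_uplift = (rdr2_uplift + cyberpunk_uplift) / 2
print(f"average uplift: {avg_uplift:.0%}")  # -> average uplift: 40%
```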