• šŸ†Famiboards 2024 Gaming Celebration Ballot!šŸ†

    We've opened the vote for Famiboards game of the year 2024! Please take a look the post here and get voting! Now nominating Best Action/Adventure and Best Art Direction!

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

I saw a video that says a new Switch is in production and might release in the next 12 months.

It was based on a Famitsu translation and mentioned by a Japanese tech journalist.

Lie to me and say it has merit.
I won't lie to you, but I will ask for the link before I tear it down.
 
I don't know how reliable Red Gaming Tech is, but he has mentioned that Nintendo has done multiple revisions of the Switch Pro/4K, saying he has heard that the most recent revision has a more powerful CPU and GPU. (8:46-9:30)
 
I don't know how reliable Red Gaming Tech is, but he has mentioned that Nintendo has done multiple revisions of the Switch Pro/4K, saying he has heard that the most recent revision has a more powerful CPU and GPU. (8:46-9:30)

That is so vague.
My uncle at Sony also told me the PS6 will have a more powerful CPU and GPU.
 
I can feel those NES Joy-Con mockups digging into my palm. Sexy, though.
 
I'm starting to doubt that dev kits ever actually went out in the first place.
Crazy, I know...
but not impossible.
 
I don't care who he is, but he mentioned Famitsu and a "credible" Japanese tech journalist saying it. He mentioned his name too, but I don't remember it.
We talked about the original Famitsu article this guy is referring to. It's nothing, sorry to say.
 
I don't know how reliable Red Gaming Tech is, but he has mentioned that Nintendo has done multiple revisions of the Switch Pro/4K, saying he has heard that the most recent revision has a more powerful CPU and GPU. (8:46-9:30)

If I remember correctly, he has contacts at least within Microsoft and has leaked stuff in the past... so it wouldn't surprise me that he could have heard things. But yeah, it's super vague here; it sounds like he might be trying to get more info to make a video about it. We'll see.

Very interesting though, and it at least confirms that something was canceled while "the current iteration" is still going. I imagine Nintendo sends out dev kits, then later tells people "Guys, that ain't it," and then potentially sends a new kit to some only to say "Still not it." That doesn't mean anything is canceled (but it causes confusion and frustration for devs for sure, and if this is true you can see why we've heard what we have...); it just means we're dealing with moving targets.

This makes a lot of sense if Nintendo was originally planning the iPhone model of putting out new iterations of the same platform more quickly, but didn't like the results internally each time and kept pushing it back for another try, until now, when it's basically time for a new generation.
 
So in this context, "more powerful" could safely be assumed to mean higher clocks?
 
I can do the same on my 3060 Ti. The question is where AMD will be by the time of the PS6. I assume they'll decide to go with a similar RT solution to Nvidia/Intel by then, but I also assume node costs will go to absurd amounts by then.
We have 5 years minimum for that; we'll have consoles beating the 4090 with ease. No reason to think PS6 won't be able to run fully raytraced games; that would mean tech has stalled.
 
Just wondering, but if true... don't these videos imply we're potentially getting a different SoC from Drake? Probably still Orin, but buffed up for 2024 standards.
 
We have 5 years minimum for that; we'll have consoles beating the 4090 with ease. No reason to think PS6 won't be able to run fully raytraced games; that would mean tech has stalled.
I think you're being more than a little optimistic.
 
I prefer a smaller power brick to a fully compliant one (whose voltage level my device will not use).
There exist chargers that are not only compliant with USB PD specifications but are actually smaller than the Nintendo Switch's AC adapter and can provide more power (e.g. the Anker 65W PIQ 3.0 PPS Compact Fast Charger Adapter, the Samsung 45 W Power Adapter, etc.). So I don't think USB PD compliance causes chargers to become larger, especially with more and more companies adopting gallium nitride (GaN) technology.
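To put rough numbers on that (just a sketch; the 15 V / 2.6 A rating is what the stock Switch adapter is labeled with, and 45 W / 65 W are standard USB PD fixed profiles that chargers like the ones above advertise):

```python
# Back-of-the-envelope output power comparison (watts = volts x amps).
# Assumed ratings: stock Switch AC adapter 15 V / 2.6 A; the third-party
# GaN chargers mentioned above advertise standard PD profiles at 45 W / 65 W.
chargers = {
    "Nintendo Switch AC adapter (15 V / 2.6 A)": (15.0, 2.6),
    "45 W USB PD profile (15 V / 3 A)": (15.0, 3.0),
    "65 W USB PD profile (20 V / 3.25 A)": (20.0, 3.25),
}

for name, (volts, amps) in chargers.items():
    print(f"{name}: {volts * amps:.0f} W")
```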
 
Not at all, there have been bigger jumps before. For 2028-2030, this will be a 6-8 year old GPU. PS5 does beat the 2070, which released one year before it, so... what's your point?
The 2070 released two years before the PS5.

I just think we're beginning to bump up against manufacturing limitations that have already slowed GPU advancement significantly while causing prices to spike. The 40-series cards took two full years to come to market, they cost 2x as much as the 30 series did at launch, and their biggest FPS gains primarily stem from frame generation. I'm not super hopeful that consumer-grade console hardware is going to be able to 'beat the 4090 with ease' anytime soon barring some major manufacturing breakthrough that makes 80 billion transistors cheap.
 
Not at all, there have been bigger jumps before.
There have been bigger jumps before, but gen-on-gen leaps are getting smaller and smaller.

For 2028-2030, this will be a 6-8 year old GPU. PS5 does beat the 2070, which released one year before it, so... what's your point?
"I think you're being optimistic" is a pretty clear point :)

I do not expect an overall "generational" leap from RTX 50, and RDNA 4 will be where we see if chiplets can deliver for GPUs what they did for CPUs.

And even if they do, RT is two gens behind on AMD's side, and while developers are adopting RT, it's not clear where the balance of silicon between hardware RT and raster performance will wind up in 5 years.

There are folks who doubt the existence of a traditional "10th gen" at all. Making hard predictions about PS6, which depend on it getting manufactured on a sub-2nm process, is optimistic.
 
Epic talks Fortnite and UE 5.1. Epic allocates 4 ms to Lumen, a budget they had to change the Series S settings to hit. What would it take for Drake to fit software ray tracing in that budget?
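For a sense of scale, a quick sketch of that budget (the only inputs are the 4 ms figure above and standard 30/60 FPS frame times):

```python
# Share of a frame consumed by a fixed 4 ms Lumen budget.
lumen_budget_ms = 4.0

for fps in (30, 60):
    frame_time_ms = 1000.0 / fps          # 33.33 ms at 30 FPS, 16.67 ms at 60 FPS
    share = lumen_budget_ms / frame_time_ms
    print(f"{fps} FPS: frame time {frame_time_ms:.2f} ms, Lumen takes {share:.0%}")
```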

 
There have been bigger jumps before, but gen-on-gen leaps are getting smaller and smaller.


"I think you're being optimistic" is a pretty clear point :)

I do not expect an overall "generational" leap from RTX 50, and RDNA 4 will be where we see if chiplets can deliver for GPUs what they did for CPUs.

And even if they do, RT is two gens behind on AMD's side, and while developers are adopting RT, it's not clear where the balance of silicon between hardware RT and raster performance will wind up in 5 years.

There are folks who doubt the existence of a traditional "10th gen" at all. Making hard predictions about PS6, which depend on it getting manufactured on a sub-2nm process, is optimistic.
Except the 7900XTX is already beating the 4080 in rasterization, and that was December 2022...? Note this was a launch benchmark with unoptimized drivers; we're definitely beating that card in six years without any doubt, and it's dumb to think we won't have PS6 doing so with ease. That's not been the case with any of the nine console generations so far.

 
The 2070 released two years before the PS5.

I just think we're beginning to bump up against manufacturing limitations that have already slowed GPU advancement significantly while causing prices to spike. The 40-series cards took two full years to come to market, they cost 2x as much as the 30 series did at launch, and their biggest FPS gains primarily stem from frame generation. I'm not super hopeful that consumer-grade console hardware is going to be able to 'beat the 4090 with ease' anytime soon barring some major manufacturing breakthrough that makes 80 billion transistors cheap.
All of the 4090's improvements definitely haven't come from frame generation alone... It can do 60 FPS in Fortnite with RT and everything turned on, and that game doesn't have DLSS or any Nvidia features implemented yet.
 
Except the 7900XTX is already beating the 4080 in rasterization, and that was December 2022...? Note this was a launch benchmark with unoptimized drivers; we're definitely beating that card in six years without any doubt, and it's dumb to think we won't have PS6 doing so with ease. That's not been the case with any of the nine console generations so far.


At what cost? These GPUs aren't as expensive as they are just because of greed (though it's mostly greed). 5nm had a massive jump in costs over the 7nm that the current gen is on. Getting a 7900XT into a console-sized box and power envelope would probably take 3nm.
 
At what cost? These GPUs aren't as expensive as they are just because of greed (though it's mostly greed). 5nm had a massive jump in costs over the 7nm that the current gen is on. Getting a 7900XT into a console-sized box and power envelope would probably take 3nm.
Again, we've got six years to iron these things out. Actually eight, because there's no way we're getting a PS6 before 2030; eight years has traditionally been huge in GPU terms. Just saying, expecting a 2030 console to beat a top-end 2022 GPU is a very safe bet at worst.
 
Except the 7900XTX is already beating the 4080 in rasterization, and that was December 2022...?

Of course the 7900XTX is beating the 4080 in rasterization. To quote myself from the message you literally responded to:

while developers are adopting RT, it's not clear where the balance of silicon between hardware RT and raster performance will wind up in 5 years.
To restate: AMD has not gone all in on ray tracing, and has instead invested that silicon in more rasterization performance. Nvidia is betting that RT will matter more in the long run.

we're definitely beating that card in six years without any doubt,
I didn't say we wouldn't.

it's dumb to think we won't have PS6 doing so with ease
I would prefer if you didn't call me dumb. You can think it, but don't say it.

Let me ask you a question: how do you think those rasterization gains, generation on generation, are achieved? Where do you think they come from?
 
Of course the 7900XTX is beating the 4080 in rasterization. To quote myself from the message you literally responded to:


To restate: AMD has not gone all in on ray tracing, and has instead invested that silicon in more rasterization performance. Nvidia is betting that RT will matter more in the long run.


I didn't say we wouldn't.


I would prefer if you didn't call me dumb. You can think it, but don't say it.

Let me ask you a question: how do you think those rasterization gains, generation on generation, are achieved? Where do you think they come from?
Then we already agree, because that's what I'm saying. The 7900XTX is already comparable to a 4080 in 2022 in rasterization alone (and the RT hardware on it is already comparable to the 3090). It's a matter of time until AMD catches up in raytracing, which, as I mentioned above... A 4090 was already shown to give 50-60 FPS in fully raytraced games like Cyberpunk and The Witcher 3, with no DLSS or frame generation involved.
 
Then we already agree, because that's what I'm saying. The 7900XTX is already comparable to a 4080 in 2022 in rasterization alone (and the RT hardware on it is already comparable to the 3090).
We clearly don't agree. The RT hardware on it is nowhere near comparable to a 3090, and you're saying that you expect PS6 to show the same sorts of performance improvements that the last 9 gens of consoles have had. One of those things is not true, and I don't think the other is a slam dunk.
 
We clearly don't agree. The RT hardware on it is nowhere near comparable to a 3090, and you're saying that you expect PS6 to show the same sorts of performance improvements that the last 9 gens of consoles have had. One of those things is not true, and I don't think the other is a slam dunk.
It is comparable in many games; it's just underperforming in the titles helped by Nvidia. There are benchmarks out there, but in any case, for 2022... We're not even that far off from getting games to render through ray tracing, because all of the RT implementations so far do render at 60 FPS without any upscaling techniques or compromises, as long as we're still talking about the 4090.
 
The PS1 was built on a 1.2µm process. The tech was over 5 years old at the time.

The PS2 was built on a 250nm process. That tech was 3 years old at the time, and offered a 4.8x density win over the previous machine.

The PS3 was built on a 90nm process. The tech was 3 years old at the time, and at this point the "size" of the process was already a marketing term, not reflecting actual transistor density. Sony put 6 times the number of transistors in there, but the chip was twice as large, reflecting a doubling of cost.

The PS4 was built on a 28nm process. The tech was only 2 years old at the time, and only offered a 3x improvement over the 90nm process, but again, Sony just made it larger, and lost more money on the hardware to get to the expected performance leap.

The PS5 was built on a 7nm process. The tech was only 18 months old at the time. This was, again, closer to a 3x improvement, and Sony simply made the APU larger, again at additional cost. Microsoft, believing that node shrinks would come so slowly, and at such high cost, that a node shrink wouldn't be possible after launch (and thus no cheaper Slim model), made the Series S.

The 2nm process, which is not yet available, only offers a 45% improvement over 7nm. If Sony is willing to take an even bigger loss on PS6 than they take on PS5, they'll still need a 3x transistor density improvement to get there. There is no node on the horizon that will offer that over 7nm, and if it does come, it will not be more than a year old by PS6's 2028 launch window.
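Taking the post's own figures at face value (assuming "45% improvement" means roughly 1.45x transistor density), the gap is easy to put a number on:

```python
# Rough check of the argument above, using the post's own numbers.
needed_density_gain = 3.0    # ~3x more transistors per mm^2 than 7nm (post's figure)
assumed_2nm_gain = 1.45      # "45% improvement over 7nm" read as 1.45x density

die_growth_needed = needed_density_gain / assumed_2nm_gain
print(f"Even on 2nm, the die would need to grow ~{die_growth_needed:.1f}x "
      f"(or the transistor budget shrink) to hit a 3x jump over 7nm.")
```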
 
Hm... Changing the topic a little, I still have my doubts about the 2028 launch window. It might be true, but given COVID delaying everything across the industry, I'd rather bet on a 2029-2030 launch window. That would mean a decade or almost a decade of the PS5, just like the PS4, which is still around, with all that implies.
 
If Sony is willing to take an even bigger loss on PS6 than they take on PS5, they'll still need a 3x transistor density improvement to get there.
To be fair, to squeeze out such a monster of a console... I think you'd still be taking massive losses on developing the software in the first place; whoever is still around this business by then will be in it by their own choice.
 
The PS1 was built on a 1.2µm process. The tech was over 5 years old at the time.

The PS2 was built on a 250nm process. That tech was 3 years old at the time, and offered a 4.8x density win over the previous machine.

The PS3 was built on a 90nm process. The tech was 3 years old at the time, and at this point the "size" of the process was already a marketing term, not reflecting actual transistor density. Sony put 6 times the number of transistors in there, but the chip was twice as large, reflecting a doubling of cost.

The PS4 was built on a 28nm process. The tech was only 2 years old at the time, and only offered a 3x improvement over the 90nm process, but again, Sony just made it larger, and lost more money on the hardware to get to the expected performance leap.

The PS5 was built on a 7nm process. The tech was only 18 months old at the time. This was, again, closer to a 3x improvement, and Sony simply made the APU larger, again at additional cost. Microsoft, believing that node shrinks would come so slowly, and at such high cost, that a node shrink wouldn't be possible after launch (and thus no cheaper Slim model), made the Series S.

The 2nm process, which is not yet available, only offers a 45% improvement over 7nm. If Sony is willing to take an even bigger loss on PS6 than they take on PS5, they'll still need a 3x transistor density improvement to get there. There is no node on the horizon that will offer that over 7nm, and if it does come, it will not be more than a year old by PS6's 2028 launch window.
This is why I believe hyper-specialization through hardware acceleration for specific CPU and GPU tasks is the only way forward until we break the limitations of silicon, and Nvidia undeniably has a leg up in that.
 
Guessing the PS6 and Xbox 5 will have AI image reconstruction that will allow them to drop back down to a native 1080p and save them a lot of GPU power.

The big question will be whether frame generation will be good enough to bump 30 FPS games up to 60 FPS without input lag by 2028, and from AMD... I would guess yes.

Which would also help them save some more CPU and GPU power and some RAM.

I would assume ray tracing will also be much easier to do, in terms of power required, by 2028.

So even if the TFLOPs etc. aren't a massive leap, there should be a massive leap in actual usable power in games.

Using the power needed for a 1080p30 game to have it run at 4K60 is very nice. I think it will get to the point of being able to do full path tracing even in the most advanced games.
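As a very rough illustration of how much work that hands off to reconstruction (a sketch assuming cost scales purely with pixels rendered per second, ignoring CPU, RT, and memory):

```python
# Naive pixel-throughput ratio: native 4K60 vs. an internal 1080p30 render
# that AI upscaling + frame generation would expand to a 4K60 output.
def pixels_per_second(width, height, fps):
    return width * height * fps

native_4k60 = pixels_per_second(3840, 2160, 60)
internal_1080p30 = pixels_per_second(1920, 1080, 30)

print(f"Native 4K60 is {native_4k60 / internal_1080p30:.0f}x the pixel "
      f"throughput of an internal 1080p30 render.")
```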
 