
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I'm not sure, but I've heard that kopite is accurate a lot of the time on Nvidia info but has made multiple errors on the Drake info. Should we really consider him a reasonable source of info?
We consider Bloomberg reliable in spite of the OLED fuckup. We consider Nate reliable in spite of the late '22/early '23 info.
 
I'm not sure, but I've heard that kopite is accurate a lot of the time on Nvidia info but has made multiple errors on the Drake info. Should we really consider him a reasonable source of info?
what do you mean? Drake is nvidia, and kopite literally got multiple bits of info wrong.

Doesn't discredit his T239 tweet, though; we acknowledge that one. However, insiders are not infallible. Even Pyoro doesn't have a perfect track record either.
 
what do you mean? Drake is nvidia, and kopite literally got multiple bits of info wrong.

Doesn't discredit his T239 tweet, though; we acknowledge that one. However, insiders are not infallible. Even Pyoro doesn't have a perfect track record either.
I mean, I've heard that most of the Nvidia items revealed by kopite are accurate on everything but Drake, but there's a lot of misinformation on Drake.
 
I mean, I've heard that most of the Nvidia items revealed by kopite are accurate on everything but Drake, but there's a lot of misinformation on Drake.
Drake/T239 is Nvidia. I think you might be thinking of desktop GPUs; kopite does have a pretty good track record on that subset of Nvidia products, IIRC.
 
what do you mean? Drake is nvidia, and kopite literally got multiple bits of info wrong.

Doesn't discredit his T239 tweet, though; we acknowledge that one. However, insiders are not infallible. Even Pyoro doesn't have a perfect track record either.
When did Pyoro say anything which didn’t happen?
 
When did Pyoro say anything which didn’t happen?
It was something Pokémon-related, and he even said to take it with a grain of salt.

Same goes for Midori when she got something wrong about Yakuza. I think it was the name of the title.
 
It honestly makes me wonder why 120 Hz didn't become the standard for HD TVs rather than 60. The 16:9 aspect ratio was standardized as a middle ground between the 4:3 used in SD TVs and the 2.35:1 used in movies. Both of them were at 24 fps, so why not use a clean multiple of 24?
SDTV in the US was 30 or 60, usually 25 or 50 in Europe.

(To note, not just an NTSC vs. PAL difference; there were/are PAL 60 regions, plus non-PAL 50hz regions.)

The lowest clean common multiple of 24 and 25 is 600.

60 ended up the standard before HDTV even came in, with PAL60 becoming common for TVs to support across Europe, Brazil with its existing PAL 60 system, and of course, North America and Japan using NTSC 60 from the get-go. Everything standardised "up". I played Wii on PAL60 way back when.

As for the origin of 60: if I'm not mistaken, it was chosen to reduce flickering in early arc lighting fixtures, funnily enough. Numbers have been controlling light since forever, hahaha.
 

Architecture in Brief

Qualcomm's Adreno 6xx architecture diverges from Adreno 5xx's Radeon origins, and adds a separate low priority compute queue to the command processor. This queue lets the driver reserve the primary ring buffer for higher priority tasks, and is meant to allow background compute without display stutters. AMD in contrast jumped in the deep end with multiple asynchronous compute queues.
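As a side note (not from the article): on the API side, a dedicated compute queue generally shows up as a compute-capable queue family without the graphics bit set. A minimal Vulkan check, assuming you already have a valid VkPhysicalDevice, might look like this:

```c
/* Minimal sketch: list queue families on a device and flag compute-only ones.
 * A driver exposing a dedicated compute queue (like the low priority queue
 * described above) would typically surface it as such a family. */
#include <stdio.h>
#include <vulkan/vulkan.h>

void print_compute_queues(VkPhysicalDevice dev)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(dev, &count, NULL);

    VkQueueFamilyProperties props[16];
    if (count > 16) count = 16;                 /* keep the sketch allocation-free */
    vkGetPhysicalDeviceQueueFamilyProperties(dev, &count, props);

    for (uint32_t i = 0; i < count; i++) {
        int graphics = (props[i].queueFlags & VK_QUEUE_GRAPHICS_BIT) != 0;
        int compute  = (props[i].queueFlags & VK_QUEUE_COMPUTE_BIT)  != 0;
        printf("family %u: %u queue(s), graphics=%d, compute=%d%s\n",
               i, props[i].queueCount, graphics, compute,
               (compute && !graphics) ? "  <-- compute-only family" : "");
    }
}
```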


Adreno also focuses on tile based rendering, which tries to lower bandwidth requirements when rasterizing graphics. Primitives from vertex shaders get sorted into tiles (rectangular portions of the screen), which are rendered one at a time. That improves cache locality, and lets the GPU buffer the render-in-progress tile in specialized memory. Qualcomm calls that GMEM, and Adreno 690 gets 4 MB of that. More GMEM should let Adreno 690 better leverage its larger shader array by handling more pixels at a time. For comparison, Adreno 640 has 2 MB of GMEM.
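To make the flow concrete, here is a rough software model of tile-based rendering; it is purely illustrative (real hardware bins primitives with fixed-function logic, and the small array standing in for GMEM here is just ordinary memory):

```c
/* Illustrative software model of tile-based rendering: bin primitives to a
 * tile, render the tile into a small on-chip-style buffer, then write the
 * finished tile back to the framebuffer in one pass. */
#include <stdint.h>
#include <string.h>

#define SCREEN_W 1280
#define SCREEN_H 720
#define TILE_W   16
#define TILE_H   16

typedef struct { int x0, y0, x1, y1; uint32_t color; } Prim; /* axis-aligned box stand-in */

static uint32_t framebuffer[SCREEN_W * SCREEN_H]; /* lives in DRAM              */
static uint32_t tile_buf[TILE_W * TILE_H];        /* stands in for on-chip GMEM */

static int overlaps_tile(const Prim *p, int tx, int ty)
{
    return p->x1 >= tx && p->x0 < tx + TILE_W &&
           p->y1 >= ty && p->y0 < ty + TILE_H;
}

void render(const Prim *prims, int n)
{
    for (int ty = 0; ty < SCREEN_H; ty += TILE_H) {
        for (int tx = 0; tx < SCREEN_W; tx += TILE_W) {
            memset(tile_buf, 0, sizeof(tile_buf));        /* clear stays on-chip */
            for (int i = 0; i < n; i++) {
                if (!overlaps_tile(&prims[i], tx, ty))    /* only binned prims   */
                    continue;
                for (int y = 0; y < TILE_H; y++)
                    for (int x = 0; x < TILE_W; x++)
                        if (tx + x >= prims[i].x0 && tx + x <= prims[i].x1 &&
                            ty + y >= prims[i].y0 && ty + y <= prims[i].y1)
                            tile_buf[y * TILE_W + x] = prims[i].color;
            }
            for (int y = 0; y < TILE_H; y++)              /* one write-back to DRAM */
                memcpy(&framebuffer[(ty + y) * SCREEN_W + tx],
                       &tile_buf[y * TILE_W], TILE_W * sizeof(uint32_t));
        }
    }
}
```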

Counting SPs. Die photo from Kurnal, labels added by Clam

Mesa source code indicates Adreno 690 has 8 "CCUs". On Adreno 730, a CCU corresponds to a pair of SPs, or Shader Processors. Adreno 690 likely has a similar design. A die shot of Qualcomm's 8cx Gen 3 also shows a similar GPU layout with SPs organized in pairs, unlike Snapdragon 821's Adreno 530.


Therefore, Adreno 690 is a large GPU by Qualcomm standards. It has twice as many SPs as their highest end Snapdragon cell phone chips. Even though it uses a prior generation GPU architecture, its large shader array still demands respect.

Mesa code suggests Adreno 690 has 16 KB instruction caches and 64 KB register files. Each SP's scheduler probably has two 16 entry partitions, and manages 128-wide waves.

Cache and Memory Latency

Memory performance falls further behind compute every year, so even GPUs have complex cache hierarchies to keep their execution units fed. To match the scaled up shader array, Adreno 690 gets a 512 KB L2 cache. For comparison, the smaller Adreno 640 in the Snapdragon 855 has a 128 KB L2. Then, it shares a 6 MB system level cache with the CPU and other blocks on the chip. Finally, a LPDDR4X memory controller handles DRAM accesses.

Adreno 690's L2 cache has a hefty 137 ns of latency. L2 accesses on Adreno will have to wait longer for data than on Nvidia's GTX 1050 3 GB, or AMD's Radeon 740M.
Nvidia and AMD's GPUs have larger L2 caches as well. Adreno 690 may have a small 1 KB texture cache like Adreno 5xx and 7xx, possibly with 38 ns of latency. But it’s hard to tell because OpenCL can only hit Adreno 690 through OpenCLOn12, which is quite unstable. In any case, Adreno 690 lacks a first level cache with comparable capacity to Pascal or RDNA 3. Nvidia's Ampere/Ada and Intel's Xe-LPG/HPG architectures have even larger L1 caches.
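For context, latency figures like these usually come from a pointer-chasing microbenchmark. The article's exact test isn't shown; a minimal OpenCL C kernel in that spirit, launched as a single work-item with the buffer size swept to cross each cache level, would look roughly like this:

```c
/* Pointer-chasing latency kernel (OpenCL C), illustrative only. The host
 * fills `chain` so that chain[i] holds the index of the next element, with a
 * stride large enough to defeat any prefetching. Latency per hop is simply
 * kernel time divided by `iterations`. */
__kernel void chase(__global const uint *chain,
                    __global uint *result,
                    uint iterations)
{
    uint idx = 0;
    for (uint i = 0; i < iterations; i++)
        idx = chain[idx];   /* each load depends on the previous one */
    *result = idx;          /* keep the chain from being optimized away */
}
```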


After L2, the Snapdragon 8cx Gen 3 has a 6 MB system level cache shared with the CPU and other on-chip blocks. From a latency test, the GPU is able to use about 2 MB of that, much like the Snapdragon 8+ Gen 1's Adreno 730. System level cache latency is comparable to DRAM latency on older GPUs. Both the GTX 1050 3 GB and AMD’s Radeon 740M can get to DRAM with less latency. If Adreno 690 has to pull data from DRAM, latency is high at over 400 ns.

Bandwidth

The highly parallel nature of GPUs makes them high bandwidth consumers. GPU caches have to satisfy those bandwidth demands in addition to reducing stalls due to memory latency. While Adreno 690 has comparable compute throughput to AMD's Radeon 740M and Nvidia's GTX 1050 3 GB, it has far less bandwidth available to feed those execution units.

AMD's RDNA architecture traditionally has very high first level bandwidth. The company's prior GCN architecture already had decent 64 byte per cycle L1 caches, but RDNA doubles that to 128 bytes per cycle to match its 32x 32-bit wide vector execution scheme. Pascal has 64 byte per cycle L1 texture caches, though the Vulkan test here can't see the full bandwidth. Texture accesses via OpenCL can achieve 1274 GB/s from the GTX 1050 3 GB. Qualcomm is very far behind, and that disadvantage extends to L2.


Adreno 690 only has a bandwidth lead when it can leverage the system level cache. There, Adreno 690 can sustain 130 GB/s, while the GTX 1050 3 GB and Radeon 740M are limited by DRAM. At larger test sizes, Nvidia's GTX 1050 3 GB technically wins thanks to its high speed GDDR5 memory. LPDDR-6400 lets the Radeon 740M come in just a hair behind. Finally, Adreno 690's older LPDDR4X trails at just under 60 GB/s.
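For reference, bandwidth figures like these typically come from a streaming-read kernel where many work-items issue wide, independent loads; achieved bandwidth is just bytes read divided by kernel time. A minimal OpenCL C sketch (not the article's actual code):

```c
/* Streaming read-bandwidth kernel (OpenCL C), illustrative only. Each
 * work-item strides through the buffer with coalesced float4 loads; the sum
 * is written out so the loads can't be eliminated. */
__kernel void read_bw(__global const float4 *src,
                      __global float4 *dst,
                      uint elems_per_item)
{
    size_t gid = get_global_id(0);
    size_t gsz = get_global_size(0);
    float4 acc = (float4)(0.0f);
    for (uint i = 0; i < elems_per_item; i++)
        acc += src[gid + (size_t)i * gsz];   /* independent, coalesced loads */
    dst[gid] = acc;
}
```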

Link Bandwidth

iGPUs can enjoy faster data movement between CPU and GPU memory spaces because they're backed by the same memory controller. That holds true for Adreno 690 too. Copies between CPU and GPU memory spaces can exceed 26 GB/s of bandwidth. For comparison, Nvidia's GTX 1050 3 GB is limited by its PCIe 3.0 x16 interface.

[Chart: Adreno 690 CPU-GPU link bandwidth, Vulkan]

Adreno 690's advantage continues if the CPU uses memcpy to get data to and from GPU memory. For an iGPU, it's basically a CPU memory bandwidth test. When the Core i5-6600K tries to do the same to data hosted on the GTX 1050 3 GB’s GDDR5, bandwidth is extremely low because the CPU cores can't cope with PCIe latency.
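As a rough illustration of how copy numbers like these are produced (a sketch, not the article's code, with error checking omitted), the host can simply time an explicit transfer into a device buffer:

```c
/* Time a host-to-device copy and report effective bandwidth. On an iGPU like
 * Adreno 690 both sides sit behind the same memory controller; on a discrete
 * card the same copy has to cross PCIe. */
#include <stdio.h>
#include <time.h>
#include <CL/cl.h>

double copy_bandwidth_gbps(cl_command_queue q, cl_mem dev_buf,
                           const void *host_buf, size_t bytes)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    /* blocking write: host memory -> device memory */
    clEnqueueWriteBuffer(q, dev_buf, CL_TRUE, 0, bytes, host_buf, 0, NULL, NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gbps = bytes / secs / 1e9;
    printf("copied %.1f MB in %.3f ms -> %.2f GB/s\n",
           bytes / 1e6, secs * 1e3, gbps);
    return gbps;
}
```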

Final Words

Qualcomm's Adreno 690 is an interesting look at a mobile-first company's attempts to move into higher power and performance segments. Adreno 6xx's shader array gets scaled up, bringing its compute throughput close to that of Nvidia's GTX 1050 3 GB and AMD Ryzen Z1's integrated Radeon 740M. Its L2 cache gets a corresponding capacity boost to 512 KB. But Qualcomm's iGPU still lacks cache bandwidth compared to those AMD and Nvidia designs. Adreno 690 further suffers from higher latency throughout its memory hierarchy, so it'll need a combination of higher occupancy and more instruction level parallelism to hide that latency.

However, the Snapdragon 8cx Gen 3's 6 MB system level cache can help take the edge off memory bandwidth bottlenecks. Furthermore, Qualcomm has long relied on tile based rendering to handle rasterization with lower memory requirements. That could let Adreno 690's sizeable shader array shine, as GMEM helps absorb writes to a tile until the finished result gets written out to main memory.

That said, newer games are increasingly using compute shaders and even raytracing. Tile-based rendering's benefits could be limited if the traditional rasterization pipeline starts taking a back seat.
 
Howdy fam, we've been down bad, but as time goes on, this thing is getting closer to showing up.

These are the most likely scenarios we should consider now:

Scenario A (20% chance):

  • Switch 2 Mention/Trailer in May
  • Switch 1 Direct in June
  • Switch 2 Presentation in June
  • Switch 2 Release in Fall 2024

Scenario B (75% chance):

  • Switch 1 Direct in June
  • Switch 2 Mention in Summer
  • Switch 2 Presentation in Fall/Winter
  • Switch 2 Release in Spring 2025

Scenario D for Doom (5% chance):

  • Switch 1 Direct in June
  • Switch 1 Direct in September
  • Switch 1 Direct in February 2025
  • Switch 2 Reveal in Spring 2025
  • Switch 2 Presentation June 2025
  • Switch 2 Release in Fall 2025

May the force be with y'all
I would say there's a bigger chance that Switch 2 releases in late 2025/early 2026 than late 2024.
 
the May IR is gonna get weird with neither games past June nor the next system acknowledged

not saying it couldn't happen! it would just be weird
Not only weird, it would be depressing, because logically the only reason Nintendo wouldn't even mention upcoming hardware to their investors soon is if the Switch 2's release date is further off than early 2025.
 
Not only weird, it would be depressing, because logically the only reason Nintendo wouldn't even mention upcoming hardware to their investors soon is if the Switch 2's release date is further off than early 2025.

Eh, I'm not sure, through the years, how close any of our leaps of logic were to what actually happened.

- Nintendo has to do this
- Nintendo doesn't and it works out
Ah! Well, nevertheless

I've stopped worrying/getting excited about whether we'll get it right this time (insert Simpsons reference: MTV generation/neither highs nor lows/Eh). Which is a weird thing to say in a speculation thread, but I'm mostly just a prisoner of habit.
 
Bulkier, slower, worse battery life.

Possibly more expensive per chip, too. Almost definitely more expensive per chip later on, as 8N is probably a dead-end node by this point while the 5nm nodes have a while to go with further price reductions.
 


TSMC A16 Technology: With TSMC's industry-leading N3E technology now in production, and N2 on track for production in the second half of 2025, TSMC debuted A16, the next technology on its roadmap. A16 will combine TSMC's Super Power Rail architecture with its nanosheet transistors for planned production in 2026. It improves logic density and performance by dedicating front-side routing resources to signals, making A16 ideal for HPC products with complex signal routes and dense power delivery networks. Compared to TSMC's N2P process, A16 will provide 8-10% speed improvement at the same Vdd (positive power supply voltage), 15-20% power reduction at the same speed, and up to 1.10X chip density improvement for data center products.

N4C Technology: Bringing TSMC's advanced technology to a broader range of applications, TSMC announced N4C, an extension of N4P technology with up to 8.5% die cost reduction and low adoption effort, scheduled for volume production in 2025. N4C offers area-efficient foundation IP and design rules that are fully compatible with the widely-adopted N4P, with better yield from die size reduction, providing a cost-effective option for value-tier products to migrate to the next advanced technology node from TSMC.
 
It honestly makes me wonder why 120 Hz didn't become the standard for HD TVs rather than 60. The 16:9 aspect ratio was standardized as a middle ground between the 4:3 used in SD TVs and the 2.35:1 used in movies. Both of them were at 24 fps, so why not use a clean multiple of 24?
@Concernt has the history right. Just adding that driving refresh rates that high was likely impractical, especially on CRTs. HD standardization in the States predated home LCD TVs being common, and plasma was long a transitional technology.

In the States at least, the driving force behind HD TV was broadcasters who wanted to show sports at higher resolutions, and they were often broadcasting semi-proprietary signals at 1080i. You didn't just have to have an HD TV, but the correct brand for the channel you were trying to receive.
 
the May IR is gonna get weird with neither games past June nor the next system acknowledged

not saying it couldn't happen! it would just be weird
If there is no acknowledgment in the slightest I‘m upgrading my metaphor of Nintendo playing 4D Chess with us to 64D Chess.
 
If there is no acknowledgment in the slightest I‘m upgrading my metaphor of Nintendo playing 4D Chess with us to 64D Chess.
Did Nintendo mention the NX to investors in 2016?

Like, we know that the Switch got delayed, but did we get that information from an investors meeting as well, or was it a rumour?

Also relax, guy, in three days my birthday will happen and Miyamoto himself will mention the NG Switch with a simple tweet.
 
Did Nintendo mention the NX to investors in 2016?

Like, we know that the Switch got delayed, but did we get that information from an investors meeting as well, or was it a rumour?

Also relax, guy, in three days my birthday will happen and Miyamoto himself will mention the NG Switch with a simple tweet.
It was technically already mentioned in 2015. I really hope this new console doesn't have the NX timeline, or we could be waiting until early 2026.

But I believe now in the power of your birthday, so we are good.
 
I worded it wrong, but what I meant was: did Nintendo mention the NX in 2016 specifically, since the console got delayed?

Like, we know the Switch was meant to release in 2016, but did we hear the rumour of the delay from the investors or from an insider?
This is what was reported after the April 2016 briefing:

"The Nintendo NX will launch in March 2017, the company has confirmed. The release date for Nintendo's next console was announced as the company briefed reporters on its latest financial results in Japan. The March 2017 launch would be global, Nintendo said." (Apr 27, 2016)
 
This is what was reported after the April 2016 briefing:

"The Nintendo NX will launch in March 2017, the company has confirmed. The release date for Nintendo's next console was announced as the company briefed reporters on its latest financial results in Japan. The March 2017 launch would be global, Nintendo said." (Apr 27, 2016)
So weird that we got the release date so early, but it makes sense because of the Wii U failure and the lacklustre 3DS software sales.
 
So weird that we got the release date so early, but it makes sense because of the Wii U failure and the lacklustre 3DS software sales.
It was quite the time. It was great that we had a date, but it was torture that Nintendo waited till October to reveal the NX. Still prefer that to what we have now though.
 
It was quite the time. It was great that we had a date, but it was torture that Nintendo waited till October to reveal the NX. Still prefer that to what we have now though.
True, plus the day of reckoning is almost upon us, with either a lack of news or a small confirmation of the NG Switch.

Also, I'm surprised that the only person mentioning Switch 2 ports has been Midori, since I would have expected way more rumours surrounding that, but eh.
 
@Concernt has the history right. Just adding that driving refresh rates that high was likely impractical, especially on CRTs. HD standardization in the States predated home LCD TVs being common, and plasma was long a transitional technology.

In the States at least, the driving force behind HD TV was broadcasters who wanted to show sports at higher resolutions, and they were often broadcasting semi-proprietary signals at 1080i. You didn't just have to have an HD TV, but the correct brand for the channel you were trying to receive.
Here the transition was pretty different, but I'd like to say, on the matter of practicality, that the move to 1080(p or i) 60 came mainly at a time when televisions were still used for, well, capital T, capital V TV. NTSC and its 60-field broadcast situation was a matter of bandwidth; they couldn't push 60 FRAMES at 480+ lines over the air at that time, and when HDTV came about, getting things working at 60 Hz with progressive scan was a bandwidth headache even then, so early HDTV was still strictly interlaced.

Early revisions of HDMI topped out around 1080p60; pushing past that is that recent a capability! The bandwidth needed to send over a hundred frames per second to our screens has only been available for a relatively brief period of time, and as far as I'm aware, it just isn't done over the air.

A limitation of modern-day television sets isn't necessarily that the panel can't refresh at a high enough rate; it's usually GETTING the signal to the TV in the first place. While people so dedicated they browse and post in dedicated gaming forums (again, our "bubble" at play) might have, or want, or plan to get an HDMI 2.1-capable television... A lot of consumers don't have any idea what that means, and are still buying, and are happy with, 2.0 and 4K60.
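For a rough sense of the numbers, a quick back-of-envelope (my own math, ignoring blanking intervals, encoding overhead, and audio, so real link requirements run higher; HDMI 2.0 is nominally an 18 Gbit/s link and HDMI 2.1 a 48 Gbit/s one):

```c
/* Uncompressed video bandwidth, ignoring blanking/overhead/audio. */
#include <stdio.h>

static double gbit_per_s(int w, int h, int fps, int bits_per_pixel)
{
    return (double)w * h * fps * bits_per_pixel / 1e9;
}

int main(void)
{
    printf("1080p60,  8-bit RGB : %5.1f Gbit/s\n", gbit_per_s(1920, 1080,  60, 24));
    printf("2160p60,  8-bit RGB : %5.1f Gbit/s\n", gbit_per_s(3840, 2160,  60, 24));
    printf("2160p60,  10-bit RGB: %5.1f Gbit/s\n", gbit_per_s(3840, 2160,  60, 30));
    printf("2160p120, 10-bit RGB: %5.1f Gbit/s\n", gbit_per_s(3840, 2160, 120, 30));
    return 0;
}
```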

Be it then or now, we're, frankly, tech nerds here, right? We want the latest and greatest, the best and the fastest. Reality often has other ideas.

What Nintendo does is still very much up in the air, but I wouldn't be too surprised if they decided 4K60 HDR with chroma subsampling is "good enough". Because for most people, it is.
 
True, plus the day of reckoning is almost upon us, with either a lack of news or a small confirmation of the NG Switch.

Also, I'm surprised that the only person mentioning Switch 2 ports has been Midori, since I would have expected way more rumours surrounding that, but eh.
This really shocked me, but considering that Sega will be the biggest third-party partner for Switch 2, it's not surprising that the information was confirmed by Midori.

I think Midori's statement at least confirms that Sega has internally set Q1 2025 as the release window for Switch 2, so the odds are extremely high, and it also fits with my understanding of Furukawa's style.
 
A lot of consumers don't have any idea what that means, and are still buying, and are happy with, 2.0 and 4K60.
adding to this that I'm a tech dweeb in a video game forum bubble and I'm still on an “extremely good enough for me” 1080p 60fps max LCD TV.

there are plenty of reasons not to be enthused about newer sets. I’m riding this one out until it dies dies

so I’m not even just an average consumer — I’m a relative enthusiast — and 1080p consistently at 60fps on the Switch 2 would blow me away

I’m really, truly not expecting it to even hit consistent 4k 60fps. I think @Concernt ’s “good enough” factor will definitely come into play.

but we’ll see in a year or so!
 
I have now become old enough that 1) this isn't common knowledge and 2) the reference is 60Hz, not 30
Aren't TVs supposed to have a 24fps mode though? I have a Philips LED TV that's several years old, and I remember that when watching 24fps content, the TV switches to 24fps mode; at least it says so on the screen. (1920x1080 24fps)
 
the May IR is gonna get weird with neither games past June nor the next system acknowledged

not saying it couldn't happen! it would just be weird

I'm almost sure enough that I would bet money (I won't, though) that they will announce ReDraketed in May, shortly before or on the day of their briefing.

Though in a way that's not doing this thread any favors, with a short, dry PR release. You know ... those that start with “To whom it may...”.

With zero details, outside of maybe a planned release timeframe.
 
So weird that we got the release date so early, but it makes sense because of the Wii U failure and the lacklustre 3DS software sales.
They were also announcing a partnership with DeNA and didn't want people thinking Nintendo was going to be moving away from consoles IIRC
 
I'm almost sure enough that I would bet money (I won't, though) that they will announce ReDraketed in May, shortly before or on the day of their briefing.

Though in a way that's not doing this thread any favors, with a short, dry PR release. You know ... those that start with “To whom it may...”.

With zero details, outside of maybe a planned release timeframe.
I do remember that for the 3DS, the PR contained a reference to the major feature of the console, as well as its name (which would have been a strong indication, yes).

So it is possible that we get information, no? And even then, I think there would be costs associated with leaving investors in the dark like that.

The Wii U to Switch transition was overall a short one: anyone could understand why the Switch could not be shown or launched yet, since it was not that long since the launch of the old console. This time it is a little bit different, I guess. Fingers crossed.
 
I do remember that for the 3DS, the PR contained a reference to the major feature of the console, as well as its name (which would have been a strong indication, yes).

So it is possible that we get information, no? And even then, I think there would be costs associated with leaving investors in the dark like that.

The Wii U to Switch transition was overall a short one: anyone could understand why the Switch could not be shown or launched yet, since it was not that long since the launch of the old console. This time it is a little bit different, I guess. Fingers crossed.

If there are any new gimmicks, I doubt they'll talk about them in a PR announcement. They will maybe say it's a handheld and dockable system again, but that wouldn't be something new or interesting for this thread. ;D
 
I'm almost sure enough that I would bet money (I won't, though) that they will announce ReDraketed in May, shortly before or on the day of their briefing.

Though in a way that's not doing this thread any favors, with a short, dry PR release. You know ... those that start with “To whom it may...”.

With zero details, outside of maybe a planned release timeframe.
this is exactly what I'm expecting myself
 
I'm almost sure enough that I would bet money (I won't, though) that they will announce ReDraketed in May, shortly before or on the day of their briefing.

Though in a way that's not doing this thread any favors, with a short, dry PR release. You know ... those that start with “To whom it may...”.

With zero details, outside of maybe a planned release timeframe.

The benefit of this is that companies can start announcing whether their multiplatform projects are coming to "nx2", or whatever it will be.
 
I'm almost sure enough that I would bet money (I won't, though) that they will announce ReDraketed in May, shortly before or on the day of their briefing.

Though in a way that's not doing this thread any favors, with a short, dry PR release. You know ... those that start with “To whom it may...”.

With zero details, outside of maybe a planned release timeframe.
I feel you on your thoughts for May. However, I think whatever Nintendo says will probably cause massive, scandalous speculation on the forums and YouTube until they speak on it again.
I'm soo looking forward to that lol.
 
If a press release happens, it will be something generic like "New hardware is coming this FY. Please look forward to more details in the future."
 
adding to this that I'm a tech dweeb in a video game forum bubble and I'm still on an “extremely good enough for me” 1080p 60fps max LCD TV.

there are plenty of reasons not to be enthused about newer sets. I’m riding this one out until it dies dies

so I’m not even just an average consumer — I’m a relative enthusiast — and 1080p consistently at 60fps on the Switch 2 would blow me away

I’m really, truly not expecting it to even hit consistent 4k 60fps. I think @Concernt ’s “good enough” factor will definitely come into play.

but we’ll see in a year or so!

My TV is a 1080p TV as well, but it's only 42". Most manufacturers don't even make 4K TVs that small, probably because the increased resolution is barely noticeable on smaller TVs. So the benefits of 4K largely depend on the size of your TV. If you have a TV 50 inches or less, 1080p will resolve a nice clean image, but if you have a 75" TV, things will start to look pretty soft.
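To put rough numbers on that (my own quick math, ignoring viewing distance, which matters just as much in practice):

```c
/* Pixels-per-inch for a few screen size / resolution combinations. */
#include <math.h>
#include <stdio.h>

static double ppi(int w, int h, double diagonal_inches)
{
    return sqrt((double)w * w + (double)h * h) / diagonal_inches;
}

int main(void)
{
    printf("1080p at 42\": %.0f PPI\n", ppi(1920, 1080, 42.0)); /* ~52 PPI */
    printf("1080p at 75\": %.0f PPI\n", ppi(1920, 1080, 75.0)); /* ~29 PPI */
    printf("4K    at 75\": %.0f PPI\n", ppi(3840, 2160, 75.0)); /* ~59 PPI */
    return 0;
}
```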
 