
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

I don't expect more than one SKU (excluding maybe colours) for a bunch of reasons: economics are tough as of late, and a single SKU can be beneficial, simplifying production and supply. As things are and have been, I don't even expect colour variations. It doesn't need a lower price or a better trim to sell, especially at launch. I expect variations may be introduced after a year, but as things stand, not at launch.
 
I have a dumb question: why is VRR such a big deal for some? Even factoring in that VRR can reduce stutters and/or screen tearing by adjusting the display's refresh rate on the fly (like FreeSync or G-Sync on PC), is it considered a better solution than just locking a frame rate? I know that depending on the buffering solution, latency can be an issue with regard to controller input (V-Sync, I believe, is notorious for adding latency to a player's input), so is VRR meant to counter that?

And I suppose also, on the PC side, why would you let your PC run at max frames all the time? Personally, if I can run a game at, say, a rock-solid 60fps but it could vary up to 100fps, I'd rather limit the frame rate (through something like RivaTuner Statistics Server via MSI Afterburner) and save on resources so my computer isn't running at 100% max power all the time.

I guess my main question is: is VRR the be-all, end-all solution, or is it overhyped? What are the disadvantages of using it, as well as the advantages?
VRR allows you to run at a general range of framerates without the stutter that comes from missing the refresh-rate target, since the refresh rate is dynamic and syncs with the framerate being displayed. It won't fix some of the jarring aspects of massive framerate swings as you've cited, but I'd still pick 40-60fps over a locked 30fps, provided the former has solid frame pacing to boot.
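To put rough numbers on that, here's a minimal Python sketch (the frame times are invented, purely for illustration): with VRR each frame is held on screen for exactly as long as it took to render, so a 40-60fps stream never holds a frame as long as a locked 30fps does.

```python
# Hypothetical frame times (ms) for a game fluctuating between 40 and 60fps:
frame_times_ms = [17, 20, 25, 22, 18, 24]

# VRR: each frame is held on screen for exactly its own render time.
vrr_holds = frame_times_ms

# Locked 30fps on a 60Hz panel: every frame is held for two refreshes.
locked30_hold_ms = 2 * (1000 / 60)  # ~33.3 ms

print("VRR holds (ms):", vrr_holds)
print("Locked 30 hold (ms):", round(locked30_hold_ms, 1))
# Every VRR hold above is shorter than 33.3 ms, which is why 40-60fps
# with consistent frame pacing feels smoother than a locked 30.
```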
Nonetheless, I'm not sure we'll see Nintendo bother letting developers implement arbitrary refresh rates in portable mode, or support VRR at all (the latter in particular, as Dakhil has cited that Variable Refresh Rate support usually goes hand in hand with mobile displays that support high refresh rates).
 
Why launch with an OLED model, when they can sell people an upgrade years later?

I’ve said it before - using OLED specifically as a means of selling an upgrade has only very recent precedent. People act like it’s an essential part of the console strategy because they used it once.

Selling a brand new more expensive product with universally identical (or better) features has some value, and they can opt for some other form of upgrade later in its life. I can’t say what it might be but it’s kind of narrow thinking to say postponing OLED is a must.
 
I’ve said it before - using OLED specifically as a means of selling an upgrade has only very recent precedent. People act like it’s an essential part of the console strategy because they used it once.

Selling a brand new more expensive product with universally identical (or better) features has some value, and they can opt for some other form of upgrade later in its life. I can’t say what it might be but it’s kind of narrow thinking to say postponing OLED is a must.
In fact it's happened a grand total of [twice]... to my knowledge.

Sure, the Steam Deck OLED and Switch OLED are cool, but they're pretty much just what they are... upgrades for some people. They aren't required. They're basically just the 3DS XL of the New Age.
 
I’ve said it before - using OLED specifically as a means of selling an upgrade has only very recent precedent. People act like it’s an essential part of the console strategy because they used it once.

Selling a brand new more expensive product with universally identical (or better) features has some value, and they can opt for some other form of upgrade later in its life. I can’t say what it might be but it’s kind of narrow thinking to say postponing OLED is a must.
OLED/EL specifically, maybe, but "better screen" has been a selling point of a revision of every single Nintendo handheld. Every. Single. One.

Game Boy Light and Color, Game Boy Advance SP, DS Lite and DSi XL, 3DS XL, New 3DS, and New 3DS XL.

Every single one!

Now, I am categorically, absolutely ruling out a separation of SKUs based on screen technology. There's a reason that doesn't happen, being able to tease out sales from upgrades later among them, but I'm definitely not ruling out OLED being the default at launch on the basis of this precedent. There are ways to improve: you can have better OLEDs, larger screens, MicroLED, QDEL, etc. in a later model. You can have your OLED cake and eat it too, in a sense.

But not by separating console trims with screen technology at launch.
 
Alternatively, you could be the Vita, which started with an OLED screen and then got rid of it with its revision.

But yeah, selling a later model with a better screen is not a new thing. Virtually every Nintendo handheld did this at some point.
 
OLED/EL specifically, maybe, but "better screen" has been a selling point of a revision of every single Nintendo handheld. Every. Single. One.

Game Boy Light and Color, Game Boy Advance SP, DS Lite and DSi XL, 3DS XL, New 3DS, and New 3DS XL.

Every single one!

Now, I am categorically, absolutely ruling out a separation of SKUs based on screen technology. There's a reason that doesn't happen, being able to tease out sales from upgrades later among them, but I'm definitely not ruling out OLED being the default at launch on the basis of this precedent. There are ways to improve: you can have better OLEDs, larger screens, MicroLED, QDEL, etc. in a later model. You can have your OLED cake and eat it too, in a sense.

But not by separating console trims with screen technology at launch.

Screen improvements have been there, yes. But similarly somebody pointed out to me that Nintendo also doesn’t regress on screen tech? Not sure if that’s true but it’s an equally relevant precedent to saying “they always use screen tech for upgrade models.” Edit: I assume the original post I'm recalling (which was likely related to one of these OLED arguments) was focused on resolution and LCD quality, excluding gimmicks like 3D, but I'm not going to stand by the statement and don't know if it's true.

Edit: I’m also not saying there will be multiple SKUs based on screen type. I have no idea. But I’ve seen a lot of folks ruling out OLED like it’s an essential part of Nintendo’s strategy.
 
The Game Boy Light -> Game Boy Color transition sacrificed the backlit screen, and the DSi XL -> 3DS one lost the IPS displays (with some random exceptions). Those transitions did gain either colors or a 3D screen, though. Switch 2 could likewise gain a very big display at the expense of the OLED screen.
 
But similarly somebody pointed out to me that Nintendo also doesn’t regress on screen tech? Not sure if that’s true but it’s an equally relevant precedent to saying “they always use screen tech for upgrade models.”
But this isn't true:

Game Boy Micro to DS got you bigger screens, but not nicer ones.

DSi XL to 3DS the screen got smaller.

Game Boy Light to Game Boy Color or Game Boy Advance, you lose a lit screen!

3DS to Switch, 3D is gone!

My expectation remains an HDR 1080p60 LCD, which, while it isn't an OLED (nor EL in the display-technology sense), is a significant upgrade in resolution and colour space. Personally, I also hugely prefer HDR LCD to SDR OLED, so I very much wouldn't call it a downgrade; but beyond my own preferences, those are out-and-out higher specs, the physical makeup of the screen notwithstanding.
 
I hope the new encoding format will allow me to have beautiful screenshots that haven't been JPEG'd to hell and back without having to purchase a capture card. Let me capture Link's drag queen outfit in lossless 4K, Nintendo!
 
I can see why the PS5/Xbox Series don't have Infinity Cache on their GPUs; I didn't realize that cache needs to be balanced against the rest of the SoC. I find it fascinating just to learn how companies design their SoCs.
I believe what Thraktor was saying is that it's a balancing act deciding how much cache to include, because cache takes up precious die space and would no doubt significantly increase manufacturing cost as a result.


I'm actually wondering: could we see a set internal storage amount of 256-512GB, and then UFS-card-like additional storage of 256GB-1TB?
 
I have a dumb question: why is VRR such a big deal for some? Even factoring in that VRR can reduce stutters and/or screen tearing by adjusting the display's refresh rate on the fly (like FreeSync or G-Sync on PC), is it considered a better solution than just locking a frame rate? I know that depending on the buffering solution, latency can be an issue with regard to controller input (V-Sync, I believe, is notorious for adding latency to a player's input), so is VRR meant to counter that?
VRR opens up frame rates that otherwise don't exist.

Imagine a 60Hz screen as kind of like a train stop. Specifically, a train stop in Japan, where everything is precisely on time. Every 1/60th of a second, the train shows up and picks up the passengers. Our passengers in this case are the pixels, with the cars at the front taking the pixels for the top of the screen, and the cars at the back taking the pixels for the bottom.

The train is unerring; it never misses. But what about the passengers? If they don't also exactly sync themselves with the train, you'll get tearing, where the pixels from one frame get on at the front, and then pixels from the next frame run into the station and dash into the cars at the end.

The solution, as you mention, is buffering and/or a frame limit. A stationmaster with a whistle, on the platform, ensuring that each train has only the passengers/pixels for each frame. But what if you don't have all your pixels/passengers for a full train?

Well, the train has to go empty, which means on your screen you see the last frame for 1/60th of a second longer. You drop a frame, no big deal most of the time. But what happens if it's consistent frame droppage? Let's imagine that for every 60 refreshes you have 45 frames - 45 sets of passengers for 60 trains.

You get judder. Some frames will persist on screen for two refreshes (an empty train following a full one), and some will persist for one refresh (two full trains in a row). But it's worse than that, because the game's animations are running at a smoothly paced 45fps. Think of Link's running animation like a flip book. It's got 45 pages, and if you flip it smoothly, you get a smooth run. Now pick out 15 of those pages, copy them, and stick the copies back into the flip book. 15 times a second, Link will seem to slow down and speed up.
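You can see that uneven cadence with a tiny simulation (a Python sketch of the train analogy, purely illustrative): frames finish every 1/45th of a second, the panel refreshes every 1/60th, and each frame stays up until the first refresh after its successor is ready.

```python
import math

FPS, HZ = 45, 60

def refreshes_shown(i):
    # Frame i is ready at time i/FPS and is displayed from the first
    # 60Hz refresh at or after that moment until the first refresh at
    # or after frame i+1 is ready.
    start = math.ceil(i / FPS * HZ)
    end = math.ceil((i + 1) / FPS * HZ)
    return end - start

print([refreshes_shown(i) for i in range(12)])
# -> [2, 1, 1, 2, 1, 1, ...]: 15 of every 45 frames linger for two
# refreshes -- exactly the duplicated flip-book pages described above.
```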

The traditional solution to that is a frame limiter. The stationmaster with his whistle guarantees that the trains are filled in a way that ensures even pacing. The fastest evenly paced frame rate below 60fps is 30fps. So if your game can hit 50fps but not a consistent 60, no problem, you're still probably running it at 30fps.
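The evenly paced options are just the whole-number divisors of the refresh rate, which a quick sketch makes obvious (again, plain illustrative Python):

```python
def even_paced_rates(refresh_hz, max_divisor=6):
    # Each frame must persist for a whole number of refreshes,
    # so the achievable locked rates are refresh_hz / n.
    return [refresh_hz / n for n in range(1, max_divisor + 1)]

print(even_paced_rates(60))   # [60.0, 30.0, 20.0, ...] -- nothing between 30 and 60
print(even_paced_rates(120))  # [120.0, 60.0, 40.0, ...] -- a "locked 40" needs 120Hz
```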

But now imagine that the stationmaster is allowed to hold the train until it fills up, then send it the instant that it does. And when he does hold a train, the next one queues up behind it, so after that brief period of slowdown, he can speed back up. Not only is he not stuck sending exactly 60 trains every second, he can slightly vary how much delay there is between each train.

Now we can get judder-free frame rates between 30 and 60fps. If your game can deliver a high-ish-but-not-60 fps, you can give that smoothness to the player. This is true even if you use a locked frame rate; a "locked 40" just isn't possible without judder on a traditional 60Hz screen.

It also smooths out even random frame drops. Without VRR, if a 60fps game is slightly late on a single frame, that frame has to wait a full refresh to go out; it'll be late, and you'll get brief judder. With VRR, the frame goes out when it's ready, so while you might perceive the dip in frame rate, the animation and frame persistence stay correct.
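Putting numbers on that single late frame (same illustrative setup as before; the 18 ms frame is hypothetical): on a fixed 60Hz panel one slightly late frame is punished with a whole extra refresh, while VRR only delays by the actual overrun.

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh slot

def fixed_hold_ms(frame_ms):
    # Fixed panel: a finished frame still waits for the next refresh boundary.
    return math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS

frame_times = [16.0, 16.0, 18.0, 16.0]  # one frame runs ~1.3 ms long
print([round(fixed_hold_ms(t), 1) for t in frame_times])  # [16.7, 16.7, 33.3, 16.7]
print(frame_times)  # with VRR each frame is held as-is: barely perceptible
```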

And I suppose also, on the PC side, why would you let your PC run at max frames all the time? Personally, if I can run a game at, say, a rock-solid 60fps but it could vary up to 100fps, I'd rather limit the frame rate (through something like RivaTuner Statistics Server via MSI Afterburner) and save on resources so my computer isn't running at 100% max power all the time.
Frame limiting is really common on Steam Deck, in fact, to save battery power; that's why the frame limiter is baked into the OS. But for folks with big, beefy desktops who want high frame rates, where 60fps isn't cutting it but they can't quite push the game to 120fps, that's what VRR is for.

But VRR also reduces screen power draw, because the display doesn't need to "waste" refreshes when there isn't new data to show.

I guess my main question is: is VRR the be-all, end-all solution, or is it overhyped? What are the disadvantages of using it, as well as the advantages?
The only VRR disadvantages (as opposed to a fixed frame rate) are that the screen costs more, and that it requires some complexity on the game developer's part to make sure the vsync signal they send matches the frame rate they're delivering. If you're on UE5, it's basically free. If your engine already has a solid frame limiter, 80% of the work is in place. If your frame limiter is awful (see any From Software game), then you might have some more work to do, but even there, drivers can step in and patch up the problem in many cases.
 
January was never going to be a month for big leaks. People simply wanted to will it into existence.
mind-reading-shocked.gif
 
Could something like BFI (Black Frame Insertion) be used on handheld devices? It greatly improves motion clarity, something that modern sample-and-hold displays suffer from compared to CRTs.
 
Could something like BFI (Black Frame Insertion) be used on handheld devices? It greatly improves motion clarity, something that modern sample-and-hold displays suffer from compared to CRTs.
It significantly impairs overall display luminance, which is not exactly a forte of mobile displays to begin with. I don't think it would be worth the trade-off.
 
Just a reminder: we do have Paper Mario: The Thousand-Year Door, Princess Peach: Showtime!, and Mario vs. Donkey Kong. We also still have NSO games worth coming out. And 2023 was a packed year; you probably have a backlog that needs some TLC.

We all want Switch U and Knuckles, but we can't rush it. Enjoy a few games before the reveal. Also, we should count ourselves lucky: before, we would have a dry spell ahead of a new console coming out.
 
I don't believe Nintendo will announce its next console in March, unless Nintendo really wants Princess Peach: Showtime! to flop (all eyes would be on the Switch successor instead of Princess Peach: Showtime!, and Nintendo doesn't want that).
Nintendo has enough announced 2024 stuff to release one game a month. A game is gonna get "sacrificed" to Switch 2 no matter what.
 
I don't believe Nintendo will announce its next console in March, unless Nintendo really wants Princess Peach: Showtime! to flop (all eyes would be on the Switch successor instead of Princess Peach: Showtime!, and Nintendo doesn't want that).


Nah, they released Mario Sports Superstars on 3DS the same month the Switch launched.

March is just when we’re getting info, it won’t even be on sale yet.

Peach can release and we can still get the reveal the same month no problem.
 
I don't think the existence of Switch games will slow down the announcement/release of Switch 2.
Pokémon Black/White 2, USUM, Metroid: Samus Returns, and the Luigi's Mansion remake all came out after their successor consoles did, so it's not like the OG Switch will get dropped like a hot potato.

The hardcore audience will be there Day 1, and the casual market will move over in time, but Nintendo doesn't need to slam the brakes on one to cater to the other.
 
Can a 4MB L2 cache on the GPU significantly improve bandwidth efficiency compared to 1MB? I understand a larger cache takes up more die area, since memory sees diminishing returns when scaling to more advanced nodes. Just curious whether a 4MB GPU cache can alleviate bandwidth constraints if Switch 2 uses LPDDR5 modules instead of LPDDR5X.
Theoretically, yes.

As Nvidia mentioned, a larger L2 cache on the GPU increases the cache hit rate and reduces the miss rate when data can't be found in the GPU's L1 cache, since a larger L2 can hold more data. And because more data can be found in the L2, the GPU has to fetch from RAM less often, which theoretically frees up RAM bandwidth.
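As a back-of-the-envelope illustration of that effect (a Python sketch; the request rate and hit rates below are made up, since real figures depend entirely on the workload and chip), DRAM traffic scales with the L2 miss rate, so a bigger cache behaves like extra bandwidth:

```python
def dram_traffic_gbs(gpu_request_gbs, l2_hit_rate):
    # Only requests that miss L2 have to be served by RAM.
    return gpu_request_gbs * (1 - l2_hit_rate)

GPU_REQUESTS_GBS = 200  # hypothetical GPU-side memory demand

for label, hit_rate in (("1MB L2, hypothetical 50% hit rate", 0.50),
                        ("4MB L2, hypothetical 65% hit rate", 0.65)):
    print(f"{label}: {dram_traffic_gbs(GPU_REQUESTS_GBS, hit_rate):.0f} GB/s to RAM")
# 100 GB/s vs 70 GB/s: the same workload needs ~30% less RAM bandwidth.
```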

And a larger L2 cache on the GPU can also be beneficial for LPDDR5X, not just LPDDR5.

But as mentioned before, a larger L2 cache does take up more die space, which can also reduce chip yields (here and here).
I will note that the reduction to 1MB of L2 hurts less than it would on a desktop GPU or the cache-nerfed Series S|X/PS5 with GDDR6, because Switch 2 uses LPDDR.

The latency of LPDDR is far lower than GDDR6's very bad latency. Usually this benefits the CPU a lot, but modern GPUs are fairly latency-sensitive too, which is why vendors are stuffing more cache into them. GDDR's latency means that when the cache misses, it takes a long time for the request to fall back to the GDDR. LPDDR having extremely low latency relative to GDDR reduces the "cache miss time" of the GPU's operations versus, say, the Series S. And then you add the overall low latency of an SoC in general on top of all that (when comparing to a desktop or laptop GPU).
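The textbook way to express that trade-off is average memory access time: AMAT = hit time + miss rate × miss penalty. A sketch with placeholder numbers (not measurements of any real chip):

```python
def amat_ns(hit_time_ns, miss_rate, miss_penalty_ns):
    # Classic average-memory-access-time formula.
    return hit_time_ns + miss_rate * miss_penalty_ns

L2_HIT_NS = 20   # placeholder L2 hit latency
MISS_RATE = 0.3  # same cache and workload in both cases (illustrative)

print("GDDR6-backed :", amat_ns(L2_HIT_NS, MISS_RATE, 300), "ns")  # high DRAM latency
print("LPDDR-backed :", amat_ns(L2_HIT_NS, MISS_RATE, 150), "ns")  # much lower latency
# 110 ns vs 65 ns: cheaper misses mean a smaller cache hurts less with LPDDR.
```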

You can sort of observe the impact LPDDR latency has on an SoC when looking at the Z1 Extreme in devices like the ROG Ally. Despite having fewer shaders and constantly fighting the CPU for power, it can actually match or push past the Series S under some circumstances in its 30W mode (despite that mode averaging 3.6 TFLOPS rather than the Series S's 4).

Heck, you can even observe what is likely LPDDR latency offsetting low bandwidth/cache in the Steam Deck as well: Metro Exodus Enhanced's performance scales pretty much perfectly with the reduction in raw GPU performance, indicating that despite the large reduction in bandwidth, the latency of the LPDDR picks up the slack and compensates for it in the Van Gogh APU.

Both Ampere/Lovelace and RDNA2/3 are latency-sensitive architectures, so lower latency wherever you can get it (keeping stalls as short as possible) is better.
 
Especially if Peach still runs on the new system
Or is EVEN BETTER on it!

They had games coming out only on 3DS long after the Switch launched, and those weren't backwards compatible. The next thing very probably is; if anything, I think it could liven up sales for 2024 Switch games as people pick them up alongside their new system.
 