• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

It's an appropriate comparison, and Oodle Texture is a different thing. I've actually been meaning to explain this for a while, as it doesn't seem to be well explained anywhere, and there's a lot of misunderstanding and misinformation on the topic. So I hope you don't mind if I go on a bit of a digression, and explain some things you likely already know, just for the sake of providing a complete account of things.

Block Texture Compression

Texture compression has been a thing for a long time, with the goal of textures taking up less space not just on disk, but in memory as well. For textures to be kept in memory in a compressed format, they have to be decompressed extremely quickly by the GPU, so they have to use very specific compression techniques. Think about a compressed format like a zip file, or even a JPEG image: if you want a single piece of data from the file, you have to decompress the entire thing. For textures held in memory, decompressing the entire texture every time the GPU needs a single texel would be extremely inefficient, so instead they use a technique called block encoding.

The way block encoding works is by first dividing the texture up into blocks (typically 4x4 pixels) and compressing each block individually, with a fixed compressed size (64 or 128 bits, depending on the format). The benefit of this is that the GPU knows exactly where the texture data it needs is, and only decompresses the individual block it needs, rather than the entire texture. The compressed encoding used for the blocks is intentionally simple, so that GPUs can process a very large number of texture samples each frame.
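
As a sketch of why fixed-size blocks give cheap random access, here's the address calculation a GPU can do, assuming the common case of 4x4 blocks at 128 bits (16 bytes) each:

```python
# Why fixed-size blocks give cheap random access: the byte offset of the
# block holding any texel is simple arithmetic. Assumes 4x4 blocks at
# 128 bits (16 bytes) each, as in formats like BC7.
def block_offset(x, y, tex_width, block_dim=4, block_bytes=16):
    """Byte offset of the compressed block containing texel (x, y)."""
    blocks_per_row = tex_width // block_dim
    return ((y // block_dim) * blocks_per_row + (x // block_dim)) * block_bytes

# Texel (70, 33) in a 256-wide texture: only this one 16-byte block needs
# decoding, nothing else in the texture is touched.
print(block_offset(70, 33, 256))   # 8464
```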

The widespread use of block texture compression started in the late 90s when a graphics card manufacturer called S3 Graphics developed a block compression format called S3TC for their graphics cards. This was licensed for use in the Gamecube, and also became the basis of the texture compression formats Microsoft added to DirectX, and therefore became the standard for PC graphics cards. Microsoft have defined a range of texture compression formats referred to as BC formats, from BC1 to BC7. The first three of these are, I believe, direct implementations of S3TC's different modes, and then Microsoft added a few more to handle specific texture types and compression ratios. The last two of these formats to be added, BC6H and BC7, were introduced with DirectX 11 and have been supported widely since 2010, including on all modern games consoles.

There's also an alternative texture compression algorithm called ASTC, which was introduced by ARM and AMD in 2012. It's designed as a more flexible alternative to BC compression, as it allows for a variety of block sizes and numbers of channels. As it was never adopted by MS for DirectX, it didn't gain traction in the desktop GPU market, and isn't supported by any of Sony or MS's consoles, but it did gain some support in the mobile market, and is fully supported by Apple's chips and on Switch's TX1.

On-Disk Compression

One downside of using block compression formats is that, because they compress each block individually, they don't achieve the smallest possible file size. This is a worthwhile trade-off in memory, where you want to be able to decompress only a small part of the texture at a time, but when storing the textures on disk, this isn't really important. You're going to load the full texture from disk to memory at once, so you might as well use a technique that keeps the on-disk file size as small as possible.

For this reason almost all games (including on consoles like Switch, without dedicated decompression hardware) add an additional layer of compression when storing texture data on disk, usually a general-purpose lossless compression algorithm like DEFLATE (aka zlib). Decompressing this data when pulling it off a hard drive/SSD/game card can be a lot of work for the CPU and can become a bottleneck, which is why Sony, MS and Nintendo now all have hardware specifically to handle decompression for algorithms like these.
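
As a rough illustration of that on-disk layer (zlib here standing in for whatever an engine actually uses, and fake repetitive data standing in for block-compressed textures):

```python
import zlib

# Illustrative only: repetitive stand-in data for a block-compressed texture,
# run through DEFLATE (via zlib) the way an engine might before writing to disk.
texture_blocks = bytes(range(16)) * 4096        # 64 KiB of block data

on_disk = zlib.compress(texture_blocks, level=9)
assert zlib.decompress(on_disk) == texture_blocks   # lossless round trip
print(len(texture_blocks), len(on_disk))            # on-disk copy is far smaller
```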

Lossless Compression Algorithms

One thing that's important to note here is that there's no such thing as a free lunch with lossless compression algorithms, and there's no "inside out" compression algorithm that's going to come along and magically reduce file sizes by large amounts. There's a mathematical minimum size that any data can be losslessly compressed to, based on its entropy, and compression algorithms have been circling that minimum since the 1980s.
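
That entropy limit is easy to demonstrate. Here's a quick sketch computing the Shannon entropy of some byte strings, which (under a simple memoryless model) bounds what any lossless coder can achieve per symbol:

```python
import math
from collections import Counter

# Shannon entropy in bits per byte: under a simple memoryless model, no
# lossless coder can average fewer bits per symbol than this.
def shannon_entropy_bits(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy_bits(b"aaaaaaab"))        # ~0.544: highly compressible
print(shannon_entropy_bits(bytes(range(256))))  # 8.0: incompressible here
```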

For general-purpose compression algorithms, there's generally a trade-off between compression speed, decompression speed and compression ratios, with different algorithms being better at one or two of these three compared to others. To the extent that better compression ratios have been achieved over the years by newer algorithms it's largely because the increase in speed of CPUs has allowed for algorithms that would have been too slow otherwise. The big increase in CPU cache sizes has also allowed for algorithms that work with more data at once, which generally allows for better compression ratios. These differences are actually pretty minor, though, with differences in compression ratios often being single digit percentages between state of the art and much older compression algorithms.

The other big factor is precisely what kind of data the algorithm is compressing. Different compression algorithms can be better or worse for different data types by relatively large amounts, so there can be a big benefit to choosing a compression algorithm that's specifically tailored for the data you're trying to compress.

PS5 Decompression Hardware

PS5 has decompression hardware that supports two compression algorithms: DEFLATE and Oodle's Kraken format. DEFLATE is a widely used general purpose compression algorithm that's been around for about 30 years. Kraken doesn't have many published details, as it's proprietary, but one important thing to note here is that it's a general purpose compression algorithm, not one which is specifically designed for game data. They advertise around 10% better compression ratios than DEFLATE, although many of their examples are of non-game data, and as with any vendor provided benchmarks, I'm always a little suspicious that they may be cherry picked. I wouldn't be surprised if it does outperform DEFLATE in general though, as even a very similar LZ-based compression algorithm could provide better results by using a much larger sliding window to leverage modern CPUs' much larger cache sizes.

Rate Distortion Optimisation and Oodle Texture

Now we get to Oodle Texture. Oodle Texture is a piece of texture compression software that runs on the developer's side, implements something called Rate Distortion Optimisation (RDO), and outputs textures in the standard BC formats.

To explain RDO, it's first necessary to explain in a bit more detail how block texture compression works. Each block effectively consists of two parts, one of which is an ID for a pattern, and the second is a set of colours to map to that pattern. Each block texture format has a specific set of pre-defined patterns that can be mapped to, and the texture compression software's job is to choose the pattern and colours which match the original texture data for that block as closely as possible.
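
To make the pattern-plus-colours idea concrete, here's a toy BC1-style encoder for a single block (a hypothetical simplification: greyscale values instead of the real format's RGB565 endpoints):

```python
# Toy BC1-style encoder for one 4x4 block (hypothetical simplification:
# greyscale values 0-255 instead of the real format's RGB565 endpoints).
# Each block stores two endpoint "colours" plus a 2-bit index per texel
# selecting one of four palette entries interpolated between them.
def encode_block(texels):                  # texels: 16 greyscale values
    lo, hi = min(texels), max(texels)
    palette = [hi, lo, (2 * hi + lo) // 3, (hi + 2 * lo) // 3]
    indices = [min(range(4), key=lambda i: abs(palette[i] - t)) for t in texels]
    return lo, hi, indices                 # fixed size, whatever the content

def decode_block(lo, hi, indices):
    palette = [hi, lo, (2 * hi + lo) // 3, (hi + 2 * lo) // 3]
    return [palette[i] for i in indices]

lo, hi, idx = encode_block([10, 10, 50, 90] * 4)
print(decode_block(lo, hi, idx)[:4])       # [10, 10, 63, 90]: the 50 is lossy
```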

Because block-encoded textures have a fixed block size, the choice of which patterns and colours are used for each block has no impact on the size of the final texture, so generally the compression software will just choose whichever produces the best match to the original, uncompressed data. However, people realised that the choice of block encodings does impact that second stage of running the texture through a general-purpose lossless compression algorithm. So a technique called RDO was developed, where instead of encoding each block as the best representation of the original texture data, block encodings are chosen to minimise the overall texture size after compressing with a general-purpose algorithm like DEFLATE or Kraken.
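
A minimal sketch of that selection criterion (my own illustration, not Oodle's actual algorithm; zlib stands in for the general-purpose compressor):

```python
import zlib

# Toy illustration of the rate-distortion trade-off (my own sketch, not
# Oodle's actual algorithm): among candidate encodings that all occupy the
# same fixed size in memory, pick the one minimising
# distortion + lambda * on-disk size, with zlib as the stand-in compressor.
def rdo_select(original: bytes, candidates, lam: float) -> bytes:
    def cost(cand: bytes) -> float:
        distortion = sum((a - b) ** 2 for a, b in zip(original, cand))
        rate = len(zlib.compress(cand))     # proxy for post-compression bytes
        return distortion + lam * rate
    return min(candidates, key=cost)

original = bytes([10, 12, 11, 13] * 8)
exact = original                    # perfect quality
rounded = bytes([12] * 32)          # slightly wrong, but compresses better

# With lambda = 0 only quality matters, so the exact encoding wins; raising
# lambda shifts the choice towards whatever shrinks best on disk.
print(rdo_select(original, [exact, rounded], lam=0.0) == exact)   # True
```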

Oodle Texture is an implementation of RDO, and there are some important things to note about it. Firstly, it's not exclusive to PS5 by any means; it's just something Sony have licensed and made available to PS5 developers. RDO was already widely used during the PS4/XBO generation, either by developers writing their own implementations in-house or by licensing middleware like Oodle Texture. As it runs on developers' computers, it also doesn't depend on any specific hardware functionality in PS5, aside from BC texture support, which as mentioned has been standard for a long time now. In fact you could use RDO-optimised textures on the Gamecube if you really wanted to.

Another important note is that Oodle Texture/RDO doesn't impact texture sizes in memory. Block compression formats have a fixed size by design, so every 2K BC7 (say) texture is always going to come out the same size, whether you're using RDO or not. In fact, RDO actually makes the in-memory texture worse, not better. Because RDO relies on choosing non-optimal block encodings, there's inherently a quality loss vs non-RDO compressed textures. RDO is trading off worse in-memory representation of textures for better on-disk compression of textures, and the more you reduce file sizes with RDO, the more loss of quality you get on the textures. This can be a worthwhile trade-off if the quality loss is low enough not to be noticeable, but there's always going to be some loss of texture quality from it.
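
The fixed in-memory size is simple arithmetic. For example, for BC7 (4x4 blocks, 128 bits each, i.e. 1 byte per texel):

```python
# BC7's in-memory footprint is fixed by the format: 4x4 blocks at 128 bits
# (16 bytes) each, i.e. exactly 1 byte per texel, with or without RDO.
def bc7_size_bytes(width: int, height: int) -> int:
    return (width // 4) * (height // 4) * 16

print(bc7_size_bytes(2048, 2048))   # 4194304 bytes = 4 MiB, always
```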

Xbox Series X/S Compression Hardware

The Xbox Series S and X also have decompression hardware for lossless compression algorithms. The first one supported is DEFLATE, which is also supported on PS5's decompression hardware and is a general-purpose lossless algorithm. The second one is much more interesting, and it's called BCPack. BCPack isn't something MS have talked much about publicly, but here's their description of it from their public documentation:



The really important thing here is that BCPack isn't a general-purpose algorithm. Unlike DEFLATE and Kraken, which treat texture data the same as they would a regular text file, BCPack is specifically designed to compress BC format block-encoded textures. This is pretty important, as a lossless compression algorithm designed for a very specific structured data type will generally beat general-purpose algorithms, and block-encoded texture data is exactly the kind of highly structured data that's well suited to custom lossless compression.

We get a couple of hints about how BCPack works. Firstly, they separate colour data from pattern data, which is a pretty easy win, as the colour data is likely to be highly correlated across a texture. Secondly, they use rANS, which is a variant of Asymmetric Numeral Systems, a class of entropy coders used in, for example, the Zstd format. There's likely some extra secret sauce in there in terms of choosing a specific implementation of rANS which works best with BC-encoded texture data.
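
Here's a toy demonstration of why separating streams helps, using made-up data (correlated "colour" bytes, noisy "pattern" bytes) and zlib as the compressor; this isn't BCPack itself:

```python
import random
import zlib

# Toy demonstration of why separating streams helps (my own illustration,
# not BCPack itself): "colour" bytes that are highly correlated across
# blocks, interleaved with noisy "pattern" bytes, compress worse than the
# same bytes grouped into two separate streams.
random.seed(0)
colour = bytes(100 + (i % 3) for i in range(4096))           # correlated
pattern = bytes(random.randrange(256) for _ in range(4096))  # noisy

interleaved = b"".join(bytes((c, p)) for c, p in zip(colour, pattern))
separated = colour + pattern

print(len(zlib.compress(separated)) < len(zlib.compress(interleaved)))  # True
```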

We don't have any hard numbers on the compression ratios achieved by BCPack, but Richard Geldreich, who worked on both texture compression and lossless compression at Valve, Ensemble, and others, estimated the following:



There's definitely a significant advantage to a dedicated compression algorithm like BCPack. Sony can get close by using RDO, but that means reducing texture quality, whereas BCPack completely side-steps the need for RDO and can achieve high levels of on-disk compression without any loss to texture quality.

Why Textures Are Important

The reason that I'm focusing so much on textures here is that they comprise the bulk of the data that decompression hardware will have to deal with. Audio and video can take up quite a bit of space, but they use compression formats like MP3/H264/etc. which already have an entropy encoding stage, so there's no benefit to recompressing them a second time with DEFLATE or Kraken. Other game data like code will benefit from compression, but takes up very little space comparatively.
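
You can see the "no benefit to recompressing" effect with any near-maximal-entropy data; here random bytes stand in for an already-entropy-coded MP3/H.264 stream:

```python
import random
import zlib

# Already-entropy-coded streams (MP3, H.264, etc.) look statistically close
# to random bytes, so random bytes stand in for them here. A second DEFLATE
# pass can't shrink them; it actually grows the data slightly from framing
# overhead.
random.seed(1234)
payload = random.randbytes(65536)

recompressed = zlib.compress(payload, level=9)
print(len(payload), len(recompressed))   # recompressed is slightly larger
```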

PS5's Kraken will give a small benefit to the compression ratios of non-texture data over Xbox's DEFLATE, but as most of the data that needs compressing is texture data, and BCPack likely has a much more significant benefit in compressing texture data, the Xbox decompression hardware overall is much better suited to video games. Using RDO on PS5 can close the gap, but with a quality loss that you don't get on Xbox.

Where Nintendo Stands

We know the T239 chip used in Switch 2 has a File Decompression Engine (FDE), but we don't really know anything about it, like the compression standards it supports. It's probably a safe bet that it supports DEFLATE at least. There's a reason both Sony's and MS's hardware support the algorithm: it's extremely widely used, so it's easy for developers to integrate into their pipelines (and many probably already use it on Switch).

Ideally Nintendo and Nvidia would also implement a custom compression algorithm for textures like MS have, possibly optimised around the ASTC format Switch supports (which I assume is used by most well-optimised Switch games, although I may be wrong on that). It's the kind of thing that's in Nvidia's wheelhouse.

Even without a custom algorithm for game data, the FDE should still be a big win. The benefit of decompression hardware isn't really to achieve smaller file sizes, or to make SSDs "equivalent" to much higher speeds; it's to take the work of decompression off the CPU. Games were already using compression on PS4, XBO and Switch, and the games that have achieved significantly smaller file sizes on current-gen consoles have largely done so by removing duplicate data that was necessary because of mechanical hard drives' slow seek speeds, not by additional compression.

On Switch, CPU decompression was a big bottleneck, so removing that and moving decompression to dedicated hardware is a win-win. Effective storage speeds can increase considerably thanks to faster decompression, and the CPU is freed to work on other things. If they also include support for a custom algorithm for texture data then they could also achieve a modest game size reduction without the need for RDO and the associated texture quality hit, but honestly that's a relatively small win compared to having dedicated decompression hardware in the first place.
Bit late but I just wanna say this was a super interesting read! I've been interested in how compression for textures worked for a while now and this was very insightful for it!
 
Going from SDR OLED to HDR LCD is definitely good enough to give you that wow factor. HDR LCD may not get as dark, but it should be able to get MUCH brighter.
To be honest I don't own an OLED Switch and didn't know it's only SDR. I was making assumptions based on my home experience with the OLED Deck, LG CX OLED TV, and BenQ EX3210U IPS HDR monitor. While the BenQ may be the greatest PC monitor I've ever used, I still think the CX OLED TV looks better.

Though yeah the BenQ can get so bright I'll quickly get a headache if I'm not careful, I only keep the brightness set to around 30/100 during the day!
 
I want it too, but I won't wait. I want those first two years of exclusives and/or games at higher frame rates and resolutions. And when the inevitable OLED rev comes out I'll do exactly the same thing I did this time, which is sell the OG model for $50 less than I bought it for to cover the cost of the new system.

Same, I want both the exclusives and the Switch games I haven't played yet (waiting to play them in upgraded form). 4K60 TOTK and Bayonetta 3, please.
 
Yeah, using a webpage is good. iOS suffers from having its stock apps tied to OS updates; Music, Safari, etc. cannot be updated without an OS update, which in 2024 is silly.
Every update to Safari also changes the browsing engine used by other apps. MS ran into the same issue and decided to turn Edge into a Chrome clone to get around that.
 
There's nothing special about the eShop workload (interpreted JS) that a more powerful CPU won't improve. It should be significantly smoother.
You are correct, I overstated the case. An improved CPU will help interpreted JS significantly. But it won't match performance on even a lower specced phone, for example, unless Nintendo alters their security strategy for the store (as I'm sure you're aware, just saying for the kids at home).

Basically every browser on the market uses just-in-time (JIT) compilation for JavaScript. This is the same trick that makes emulators fast, but it requires essentially letting your web browser generate native code, with full privileges, in real time. In fact, most browsers now use the same trick to make CSS (the technology used to control how web pages display) faster.

Nintendo locks that down, because it's such an obvious security hole (and it also eats RAM under the worst cases). I'm dubious that Nintendo will move away from that security strategy for store performance.


I never knew that. Do they mean updates as in changes to the catalogue of games available, or updates as in how the eShop functions? I feel like Sony and Xbox do just fine having apps so I'm not sure why Nintendo would be against it unless their implementation means updating the system when games are added to the eshop.
Updates to how the eShop functions, in the broad sense. That includes custom layouts for, say, a seasonal sale.

The store backend is a fork of the 3DS's, which wasn't shared with the Wii U (as best I understand), and Nintendo was racing to get the Switch firmware ready for launch - launch units were flashed with beta firmware, in fact. Switch's OS is made by a non-Nintendo company (eSOL Ltd), but their online infrastructure is done by Nintendo Systems Ltd. In the AWS talk, it's mentioned that a lot of eShop front-end updates are pushed by marketing.

Total speculation, but making the eShop a webpage disconnected the rapid moving online infrastructure from the slower moving, more cautious OS group, and let them work on the store front all the way up to Switch launch, and beyond, but firmware 1.0 had to be ready at the start of mass production.
 
You are correct, I overstated the case. An improved CPU will help interpreted JS significantly. But it won't match performance on even a lower specced phone, for example, unless Nintendo alters their security strategy for the store (as I'm sure you're aware, just saying for the kids at home).

Basically every browser on the market uses just-in-time (JIT) compilation for JavaScript. This is the same trick that makes emulators fast, but it requires essentially letting your web browser generate native code, with full privileges, in real time. In fact, most browsers now use the same trick to make CSS (the technology used to control how web pages display) faster.

Nintendo locks that down, because it's such an obvious security hole (and it also eats RAM under the worst cases). I'm dubious that Nintendo will move away from that security strategy for store performance.



Updates to how the eShop functions, in the broad sense. That includes custom layouts for, say, a seasonal sale.

The store backend is a fork of the 3DS's, which wasn't shared with the Wii U (as best I understand), and Nintendo was racing to get the Switch firmware ready for launch - launch units were flashed with beta firmware, in fact. Switch's OS is made by a non-Nintendo company (eSOL Ltd), but their online infrastructure is done by Nintendo Systems Ltd. In the AWS talk, it's mentioned that a lot of eShop front-end updates are pushed by marketing.

Total speculation, but making the eShop a webpage disconnected the rapid moving online infrastructure from the slower moving, more cautious OS group, and let them work on the store front all the way up to Switch launch, and beyond, but firmware 1.0 had to be ready at the start of mass production.
Was the 3DS OS Developed internally at Nintendo or was that also done by a third party?

I assume the Switch 2 will likely base its OS heavily on the existing Switch OS, which is pretty robust security-wise; I don't think many kernel exploits have ever been found for it.
 
HDR, smaller bezels, and lamination would go a long way in improving the appearance of an LCD display.

As for how Nintendo would market it after touting the benefits of an OLED, they would probably emphasize the larger resolution, size, and HDR. As well as the overall performance boost letting games look sharper.

All until the inevitable OLED model of course. But - fingers crossed this is just OLED day 1.
 
Last edited:
I think these casual consumers will be willing to look past a worse screen if they can play all-new exciting games on it at an affordable price.
Not only that, but most people have an LCD Switch like myself. I may be in the minority, but I'm not risking it with burn-in, and I prefer to at least have the option to go for an LCD (saving some money in the process), where burn-in will not be a thing. On top of that, the Switch screen was solid imo. Technology will have evolved after 7+ years, and with the increase in horsepower (remarkably better graphics) it would be a significant improvement regardless.
 
Not only that, but most people have an LCD Switch like myself. I may be in the minority, but I'm not risking it with burn-in, and I prefer to at least have the option to go for an LCD (saving some money in the process), where burn-in will not be a thing. On top of that, the Switch screen was solid imo. Technology will have evolved after 7+ years, and with the increase in horsepower (remarkably better graphics) it would be a significant improvement regardless.
If you check out Wulff Den on YouTube, he has an ongoing series where he shows what happens to the OLED screen if you leave it on 24/7, non-stop, and honestly it's pretty impressive.

You pretty much have nothing to worry about regarding burn-in on modern OLED displays, or at least the one the Switch OLED is using.
 
People should stop worrying about OLED burn-in. :)
I've had a Sony OLED for almost 3 years with no sign of burn-in, and I use it every day to watch movies and play on it with my Switch and PS3.
 
People should stop worrying about OLED burn-in. :)
I've had a Sony OLED for almost 3 years with no sign of burn-in, and I use it every day to watch movies and play on it with my Switch and PS3.
I have a Google Pixel with an OLED screen that's about to be three years old and it has burn-in so bad I can almost read it. 😅

Not tryina be argumentative, just saying it is a thing.
 
I have a Google Pixel with an OLED screen that's about to be three years old and it has burn-in so bad I can almost read it. 😅

Not tryina be argumentative, just saying it is a thing.

Phones have vastly more persistent UI elements than game consoles do unless you play the exact same game over and over for years. If the Switch 2 has a HDR OLED screen, it very likely will be technically capable of burn-in, but not practically.
 
That is not true. That was one patent they did not use that someone made a fake mockup with; it was never even prototyped as far as we know.
They did prototype a pill-shaped design, which was basically the same idea aside from technically not being an oval. It's covered in this video (w/ a timestamp to the images):

Edit: As @Individualised said, these images are from a point in the timeline where the "Switch" name was already being used. The Mont Blanc SOC was still being used at that point.
 
Last edited:
They did prototype a pill-shaped design, which was basically the same idea aside from technically not being an oval. It's covered in this video (w/ a timestamp to the images):

Didn't this patent try to do controllers that were part of the screen?

I don't think it would have ever been anything except a concept. It probably didn't even make it to a physical prototype stage; the reddit "leak" was just a 3D-printed fake.
 
Around March 2014, the ST Switch was a very different device. It was an oval-shaped screen with two analog sticks in the middle of the display. When did Nintendo switch from this device to the one with detachable controllers? Was this the Switch by September 2014? Or was it still the original design?
As I understand it the name Switch started being used when the system became the stadium (not oval; that was just the patent) design with the sticks on screen. That was the first "dockable hybrid" iteration.
 
Except that's the color layout of Xbox controllers, not the SNES controller:

[images: Xbox 360 wireless controller, plus a second controller photo]
MS bought Nintendo confirmed.
 
With only one more retro game announced for NSO, it becomes more and more clear that something is happening either next month or in March at the latest. Perhaps it will be a Direct (Mini). Or it might just be a series of social media posts.

The cat is out of the bag, everybody and their mothers are talking about Nintendo's next device, so I'm curious if they'll really just pretend for a whole Direct presentation that they have nothing new in store in terms of hardware. It would genuinely be funny, but man, let's just at least mention officially that it exists.
 
What are the chances of seeing a "Nintendo Switch / Nintendo Switch 2" as launch platforms for a game in the next direct? Even if it's the last big thing in direct like a 3D Mario?
 
What are the chances of seeing a "Nintendo Switch / Nintendo Switch 2" as launch platforms for a game in the next direct? Even if it's the last big thing in direct like a 3D Mario?

Zero unless ReDraketed is announced before the Direct.
 
But it's quite an interesting fantasy. Just maybe, maybe Nintendo can use this T215 Luigiko chip to extend Switch 1 for another 2 years, because they want to exceed the PS2's 155 million sales. Hmmmmm
The Year of Luigi has returned, and this time; he brings a Switch revision that will crush the PS2. A much better outcome than last YoL, that's for sure.
 
The Year of Luigi has returned, and this time; he brings a Switch revision that will crush the PS2. A much better outcome than last YoL, that's for sure.
That's impossible. Only a console with the number 2 in its name (Switch 2) can outsell the PS2. It is fate. Even though they could name the chip of the Switch 2 "Luigiko" to honor the anniversary. /s
 
Lots of smoke on this. I can't help but think due to the Middle east situation that any plans involving January (If true) are up in the air. If its planned for Feb or March then I can see them just playing it by ear. I'd imagine this red sea situation looks much more clear in the next 2 weeks. Either it stops or we go on the offensive and clean house to restore security.
Option 2 it is.
 
A newer CPU architecture means dropping 32-bit support, which is a problem if Nintendo wants to maintain backwards compatibility with Nintendo Switch games.

How likely is it that a 32-bit compatibility layer would be included, and how feasible would it be?

I know the "2038 problem" is a real thing for 32-bit systems, but I thought there are ways to combat that without always resorting to moving away from 32-bit entirely? And if so, is that something Nvidia would help co-develop with Nintendo to continue BC?

Then again, one possibility is NGS will be BC, but Switch 3 and beyond will drop 32-bit entirely, and many Switch 1 games by that point will have patches/updates to work natively with Switch 2 hardware without the BC layer.
 
Didn't this patent try to do controllers that were part of the screen?

I don't think it would have ever been anything except a concept. It probably didn't even make it to a physical prototype stage; the reddit "leak" was just a 3D-printed fake.
The reddit leak was a 3D printed fake but I believe Nintendo actually prototyped it internally
 
That's impossible. Only a console with the number 2 in its name (Switch 2) can outsell the PS2. It is fate. Even though they could name the chip of the Switch 2 "Luigiko" to honor the anniversary. /s
That's why this Switch 1 gen is the big opportunity to topple PS2 sales. We don't know about the Switch 2's future, but the current Switch just needs a little more time, with a slight upgrade in power à la PS4 Pro (T215 maybe?), to become the top-selling console ever.
 
I don't think Nintendo cares as much about this as you'd expect.
I keep seeing this said but do we really know??
One day in some future Nintendo interview, they’ll come out and say it has been a long held dream of Nintendo’s to finally overcome the atrocity of ps2 dominion
 
Then again, one possibility is NGS will be BC, but Switch 3 and beyond will drop 32-bit entirely, and many Switch 1 games by that point will have patches/updates to work natively with Switch 2 hardware without the BC layer.

I suspect that Nintendo will not worry too much about having multiple generations of BC built into their hardware. I do expect SNG to have BC with Switch, but the successor to SNG may drop native Switch support. That may sound bad right now in 2024, but in the year 2030 or later, how often are you really going to be replaying a bunch of your Switch library of games? Softening this blow even further, I expect Nintendo will have added Switch games to NSO via emulation at that point. I could actually see Nintendo adding first-party Switch games to NSO here in a few years. Nintendo needs to keep new content coming to NSO to keep people subscribing, and adding in a bunch of Switch games would certainly bolster the value of the service.
 
I keep seeing this said but do we really know??
One day in some future Nintendo interview, they’ll come out and say it has been a long held dream of Nintendo’s to finally overcome the atrocity of ps2 dominion
The 3DS was announced when the DS had just finished a 27.8 million year. The only consoles to ever sell that much were the Switch in peak pandemic (28 million) and... the DS... twice (30 and 31 million).

And not only did they announce the 3DS too early (for hardware sales, that is), they also cut software support and shifted marketing completely to the 3DS.

Despite all that, they ended just a hair behind the PS2. They absolutely could have the first place, but they prioritized other things (most likely money, since flash cards became mainstream and software sales were plummeting).
 
HDR, smaller bezels, and lamination would go a long way in improving the appearance of an LCD display.

As for how Nintendo would market it after touting the benefits of an OLED, they would probably emphasize the larger resolution, size, and HDR. As well as the overall performance boost letting games look sharper.

All until the inevitable OLED model of course. But - fingers crossed this is just OLED day 1.
If it's OLED but has considerable PWM flicker issues then I don't want it. I'd much rather have an LCD panel with good colors and contrast but none or almost no flicker.
 
So, I know this hasn't been discussed in a minute (though, I've been less active here), but do we consider an OLED screen a no-go for Switch 2?
Lack of backward compatibility, that's a big no-no for me. I can survive if the Switch successor doesn't feature an OLED.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.
Last edited:

