• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Doesn't Nvidia have some fancy upscaling tech that makes this discussion somewhat moot? Wouldn't all the OG Switch games potentially get upscaled to 1080p by the GPU?
No, unfortunately.

"Fancy upscaling tech" works by finding detail that is lost in the low resolution image and adding it back into the picture. Think of aliasing - aliasing is caused by the underlying art assets support a higher resolution than the one you've got and the mismatch at the edges shows up as a stair-step. Nvidia's various AI upscaling tech finds that lost data, and restores it, giving you a smooth edge.

But in some images there just isn't more detail to find. Pixel art is the best example. That art was drawn exactly to the pixel grid. Upscaling the NES Super Mario Bros. can't find a separate mouth and mustache for Mario because there isn't one. If you try to stick that game's art on a screen whose own grid doesn't line up with the art's original grid, you get smearing.

This isn't just a pixel art problem. Even in 3D games, if you have a texture, DLSS can't invent a higher resolution version. And if the art was carefully created for a specific screen resolution, uprezzing it to something that doesn't match will introduce artifacts. It's just less of an issue in most 3D games because the camera moving around means that you're rarely in that position.
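
To put a rough number on the grid-mismatch problem, here's a toy sketch I threw together (pure Python, plain nearest-neighbour point sampling, purely illustrative - not anything Nvidia or Nintendo actually ships). It maps a 720-pixel-wide row onto 1440 and then 1080 screen pixels and counts how many screen pixels each source pixel ends up covering:

def coverage(src_width, dst_width):
    # count how many destination pixels each source pixel covers
    # under plain nearest-neighbour point sampling
    counts = [0] * src_width
    for dst_x in range(dst_width):
        counts[dst_x * src_width // dst_width] += 1
    return sorted(set(counts))

print(coverage(720, 1440))  # [2]    -> clean 2x: every source pixel is exactly 2 wide
print(coverage(720, 1080))  # [1, 2] -> 1.5x: pixels alternate 1 and 2 wide, which is the unevenness you see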

Individual users have different tolerances for these artifacts but they are artifacts. Individual users may have different preferences for trading off these artifacts for other advantages - like aging eyes not having to fight a small screen. Individual games have different levels of artifacting, depending on the nature of the assets, the engine, and the visual style.

Switch is a console with a strong retro-oriented subset of fans. Those fans are going to be most hit by the 720p->1080p shift. I think offering a "pixel perfect" BC mode, off by default, would be a pretty reasonable move. But I can imagine Nintendo being afraid of folks turning it on by accident and wondering why all of their games only take up two thirds of the screen.

I worked phone tech support for years, and the most common calls were "I accidentally turned on a useful accessibility feature, and now I think your service is broken." Well, second most, this was the early 2000s so "grandma has a thousand viruses from porn her grandson looked at" ranked higher, but the accessibility thing was a decent second.
 
I'd hope there's some revamping regarding the VC titles to provide more filters :D.
Their existing filters are already solid ngl, but perhaps with more power it's possible to do the same for N64 and GC titles, if there's any hope for additions from that catalogue.
 
It's already going to cost significant frametime to get BC running on Switch 2 (assuming it's not being done by including Switch 1 hardware in it), using DLSS in BC mode on top of that would lead to horrendous latency.
 
No, unfortunately. "Fancy upscaling tech" works by finding detail that is lost in the low resolution image and adding it back into the picture. [...]

taking-notes-bj-novak.gif
 
Doesn't Nvidia have some fancy upscaling tech that makes this discussion somewhat moot? Wouldn't all the OG Switch games potentially get upscaled to 1080p by the GPU?
They talk a lot about their techniques meant for streaming videos, but a lot of that is about cleaning up compression artifacts. Something more straightforward like FSR (or NVIDIA's equivalent, whose name I never remember) would make more sense as an option for old games.
For games running in BC mode, it does. Owlboy, Sea of Stars, Blasphemous, and The Messenger are all pixel art games designed to run at 360p, which integer-scales onto 720p. This is a case where the smaller screen doesn't do you any favors; lack of integer scaling is going to make pixel art games smeary. That's why pixel perfect modes are offered in emulators.
If they are legit running as 640x360 games scaled straight to 720p, then scaling them from 720p to 1080p with the right method would look exactly the same as scaling them from 640x360 to 1920x1080. But most pixel games I see cheat to greater or lesser degrees.
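
For what it's worth, with plain nearest-neighbour point sampling that composition really does hold. A throwaway check I wrote (one scanline, purely illustrative):

def nn_scale(row, dst_width):
    # nearest-neighbour resample of one scanline to dst_width pixels
    src_width = len(row)
    return [row[x * src_width // dst_width] for x in range(dst_width)]

row = list(range(640))                          # stand-in for one 640x360 scanline
direct = nn_scale(row, 1920)                    # 640 -> 1920 in a single 3x step
two_step = nn_scale(nn_scale(row, 1280), 1920)  # 2x to "720p", then 1.5x to "1080p"
print(direct == two_step)                       # True under this sampling scheme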
 
No, unfortunately. "Fancy upscaling tech" works by finding detail that is lost in the low resolution image and adding it back into the picture. [...]
what do you think about the idea of a future DLSS feature that upscales textures on the fly as they load into the gpu? could we eventually see (prolly not any time soon) 2k textures that can be texture dlss'd to like 4k and stuff?
 
what do you think about the idea of a future DLSS feature that upscales textures on the fly as they load into the gpu? could we eventually see (prolly not any time soon) 2k textures that can be texture dlss'd to like 4k and stuff?

Probably not, because some textures need to be seamless so they can tile with each other. Upscaling them without taking that into account would result in something like this:

ffvgarbagezoom.jpg

Thanks to Fortress of Doors for the image
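
The seam problem isn't unique to AI upscalers, either - any filter that processes a tile without seeing its neighbour will do it. A toy 1-D sketch of my own (simple linear interpolation, nothing to do with DLSS internals):

def upscale2x(tile):
    # 2x linear upscale that has to clamp at the tile borders,
    # because a lone tile knows nothing about its neighbours
    n, out = len(tile), []
    for i in range(n * 2):
        pos = min(max((i + 0.5) / 2 - 0.5, 0), n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        t = pos - lo
        out.append(tile[lo] * (1 - t) + tile[hi] * t)
    return out

a, b = [0, 10, 20, 30], [40, 50, 60, 70]     # a smooth ramp split into two tiles
print(upscale2x(a)[-2:] + upscale2x(b)[:2])  # [27.5, 30.0, 40.0, 42.5] -> hard step at the seam
print(upscale2x(a + b)[6:10])                # [27.5, 32.5, 37.5, 42.5] -> smooth when upscaled together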
 
Expected. Not based on solid info. Take it as speculation until stated otherwise.
Can you elaborate on your reasons for speculating 16 GB?

I'm thinking it will be 12 GB, because I don't see those extra 4 GB making a difference to the number of ports it will receive. Most games will run on the Series S with less than 8 GB reserved for games. So what Nintendo gets in return is slightly less compromised ports and/or better ray tracing. I don't think that will survive Nintendo's cost-benefit analysis.
 
It's already going to cost significant frametime to get BC running on Switch 2 (assuming it's not being done by including Switch 1 hardware in it), using DLSS in BC mode on top of that would lead to horrendous latency.
If the game is running fast enough to exceed its framerate cap, there's nothing to worry about. And DLSS won't be added to BC.
 
Come again?

Nintendo has been doing new model refreshes of their hardware since the Game Boy days. They haven't done it for every bit of hardware but they have done it a lot.

I wouldn’t call it “new,” but the idea that fans now expect it and take it for granted is kind of new, I guess.
That's why I specifically said "mid-gen refresh" (and said "Pro" in my previous posts). Lemme try and clarify what I meant.

Nintendo has released variants of their handhelds for decades, often hitting a different subset of the market. But the 3DS XL isn't a "premium" version, it's a variant for adults with big hands and bad eyes. The OLED model is a "premium" model, it is designed to be superior to the base model in all the ways it isn't identical. (It might not achieve that but it's certainly the intent).

When you release a console, you are pretty much guaranteed that, in seven years, every part of that console will be either 1) cheaper, 2) better, or 3) both; the worst case scenario is that tech stagnates entirely and instead they're 4) the same price. For each part you can either upgrade it or leave it be, but you don't have to regress.

When you release a premium console, you think it's the same situation, but it's not. To push the premium features down into the baseline console, it's not enough that they get cheaper, they have to get as cheap as the old baseline part. And you have fewer years to do it in.

You put a $50 LCD in your Handheld Games console. You want to sell a premium console to folks and make a few extra bucks yourself. So four years later you put a $75 OLED screen into your Premium Version.

No matter what happens you'll be able to offer the same or better screen than the LCD model, without losing any money. Even if the tech barely improves, at minimum you'll be able to just offer the same screen again. But in order to put the OLED screen into your base model without losing money, it can't just drop in price, it needs to drop to $50, in just a couple of years. And that's just to offer the exact same screen as the premium model.
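
Back-of-the-envelope version of that argument, with completely made-up numbers (say parts get roughly 10% cheaper per year):

def years_to_hit(start_price, target_price, yearly_drop=0.10):
    # how many years of price drops until a part fits a given cost slot
    years, price = 0, float(start_price)
    while price > target_price:
        price *= 1 - yearly_drop
        years += 1
    return years

print(years_to_hit(50, 50))  # 0 -> the old $50 LCD already fits the baseline budget
print(years_to_hit(75, 50))  # 4 -> the $75 OLED needs ~4 years of drops just to fit that same slot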

Successors work by riding the natural improvement cycle of hardware. It doesn't matter how much movement there is, or in which subset of the technology; a path will be open to making a superior product, at similar cost, in seven years, even if it means breaking the original architecture.

Pro refreshes depend on keeping the architecture of the original product in place, and charging customers more to make improvements elsewhere.

Making a successor that incorporates all of the upgrades of a Pro requires betting that a specific technology will make a specific sized leap in a less-than-a-full-generation period of time. That's why I think it's inevitable we see these sorts of two-step-forward-one-step-back moves in the future.
 
No, unfortunately. "Fancy upscaling tech" works by finding detail that is lost in the low resolution image and adding it back into the picture. [...]

Great explanation.

I suppose to address the spirit of the question that @BingBong43 was asking, generative AI would be able to address such an issue in the far future, but then we're squarely out of the realm of upscaling and it's more like "remastering the assets on the fly". We will probably get there with complete neural rendering, but we're definitely not there now. It does make me wonder what NVIDIA would call it because you just know it will be something super silly.
 
It's already going to cost significant frametime to get BC running on Switch 2 (assuming it's not being done by including Switch 1 hardware in it), using DLSS in BC mode on top of that would lead to horrendous latency.
The Switch 2's CPU is fully capable of running Switch CPU code natively, because that capability is something ARM has had built in since ARMv7 (Switch and Switch 2 both use ARMv8). That is the biggest part of BC, even if the GPU instructions and shaders aren't 100%. And honestly, chances are Nvidia has taken this into consideration and has already devised a means to provide BC.

DLSS is not something that can be tacked on externally to a game. It has to be integrated in. No Switch game can conceive of any functionality beyond what Switch can do. The only way a Switch game can utilize DLSS is if it's no longer a Switch game. But what Switch games on Switch 2 can do is make use of the raw power provided to them. Even if the system clocks (CPU, GPU, etc.) were set to the same as they are on Switch, the actual raw performance of the hardware at those clocks would allow Switch games to run practically at their max, which is 1080p60 at a hardware level, and whatever the top limit is at the software level, like XC2 sticking to 720p30 instead of dropping in either res or fps.
 
It's already going to cost significant frametime to get BC running on Switch 2 (assuming it's not being done by including Switch 1 hardware in it), using DLSS in BC mode on top of that would lead to horrendous latency.
DLSS is not an option anyway, cause it needs to be implemented in the game engine. It's not something you can just tack on. Side note, I have faith that Nvidia's translation layer will be very efficient.

I'm thinking maybe something similar to how the Shield can upscale video could be tacked on top of BC.

Of course it wouldn't be as good as DLSS, but better than nothing.
 
I don't need your mockery, I can do it myself ;)

360p integer scales into 1080p. Did you mean that they're 240p? I remember that was a concern with Shovel Knight, and at that point, I'd hope they would have options for 4.5x or 4x scaling on Switch 2.
Yes, thank you. I can never remember which one scales to which. Swap the games listed, then. I don't actually care about non-integer scaling of pixel art personally, it doesn't bother me, but I recognize that the artifacts are real and measurable, and to some people, very visible.

I get this way with movies, though. I see the wrong aspect ratio on something, and I'm livid. My partner will be watching Batman in 16:9, and I'll be seething in the bathroom I'm so angry. "Haha you're missing your favorite part, with the handshake and the buzzer. And I know you really respect Michael Keaton's performance in this, truly the forerunner of bringing character acting to leading man parts... do you wanna come out, babe?"

Meanwhile I'm panting and running the water so I can pretend I can't hear her, while what I really want to do is scream "2.35:1 IS THE ONLY ASPECT RATIO APPROVED BY GOD AND JOHN FORD 16:9 IS THE DEVIL'S WIDESCREEN AND CROPPING THINGS TO FIT IT IS THE 9TH FUCKING CIRCLE OF HELL."

So when I say I don't care about integer scaling, I want it made clear that I'm sympathetic.
 
what do you think about the idea of a future DLSS feature that upscales textures on the fly as they load into the gpu? could we eventually see (prolly not any time soon) 2k textures that can be texture dlss'd to like 4k and stuff?

GoW: Ragnarok on PS5 already does something like this, by using ML to upscale some textures as they're loaded, but honestly the results aren't all that impressive. There's a GDC talk on it, and the slides are here if you're interested in the nitty gritty details.

What you're talking about (and what the GoW Ragnarok implementation does) is very different than what DLSS does, though, and there's a very important distinction there which a lot of people miss, which is what oldpuck was getting at.

DLSS doesn't just take a 1080p image in and spit out a 4K image, with lots of extra detail made up by AI. DLSS takes a series of 1080p images, and uses the details from each one of them to reconstruct a 4K image. Importantly, the game engine offsets the pixels being rendered each frame to give DLSS all the information it needs to produce a 4K image, so it doesn't have to make anything up.

For a slightly simplified version, let's say you divide the 4K image up into blocks of four pixels. On the first frame, you render only the top left pixel in every block, which means rendering a 1080p image, and send them to DLSS. On the next frame, you render the top right pixel, etc. After four frames of rendering, DLSS now has all the data needed for a full 4K image, it's just being produced over time, not all at once. So DLSS's job isn't to make up what's in the extra pixels, it's just collating all the data and making sure the right data is used for the right pixels.

The tricky part is when things move within the frame, or the camera moves, which is why DLSS also takes in motion vectors (ie indications of where everything in the frame has moved since the last frame) and other info in order to reproject pixel data into the right place. However, for static scenes where nothing's moving (which is what a lot of people are looking at when they're doing DLSS on/off comparisons) these don't really matter, and it's really just accumulating data over multiple frames like my example above, except it uses a more complex sequence of sub-pixel movements each frame to allow it to achieve better than native rendering quality.

You don't even need ML for this, either. FSR 2 does a pretty good job with static scenes without any ML, although it falls behind DLSS in motion. This is really where the use of machine learning helps DLSS, in tracking complex movement patterns in order to re-use the correct data from previous frames.
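
To make the 2x2-block example above concrete, here's a tiny toy of my own (nothing like the real DLSS internals, just the accumulation idea for a perfectly static scene):

W = H = 8                                 # pretend this is the full-res target
def scene(x, y):                          # stand-in for the renderer
    return (x * 31 + y * 17) % 255

native = [[scene(x, y) for x in range(W)] for y in range(H)]

accum = [[None] * W for _ in range(H)]
for ox, oy in [(0, 0), (1, 0), (0, 1), (1, 1)]:  # four jittered "quarter-res" frames
    for by in range(0, H, 2):
        for bx in range(0, W, 2):
            x, y = bx + ox, by + oy
            accum[y][x] = scene(x, y)     # one sample per 2x2 block per frame

print(accum == native)                    # True - nothing had to be invented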
 
GoW: Ragnarok on PS5 already does something like this, by using ML to upscale some textures as they're loaded, but honestly the results aren't all that impressive. [...]

While you are correct on the differences here, as I understand it, NVIDIA does not want the technology addressing the upscaling issue to stay fragmented across different solutions; they would prefer a holistic solution (full neural rendering) and have already stated that, at some point, they imagine DLSS will evolve into something like that. In other words, upscaling is not the end goal for DLSS. Whether they achieve that goal or not remains to be seen, but if there's any company that can pull it off, I believe it is them.

Looking ahead, Catanzaro suggested that in a few years, DLSS would evolve into a "completely neural rendering system." Such a system has the potential to revolutionize the traditional rendering approach that relies on raster graphics. Between now and the development of DLSS 10, Nvidia's VP stated that the GeForce-exclusive technology would continue to undergo gradual improvements.


The ultimate goal of DLSS, as Catanzaro explained, is to empower developers to create games that are "more immersive and more beautiful" than what we can currently imagine.

SOURCE
 
I don't need your mockery, I can do it myself ;) [...] "2.35:1 IS THE ONLY ASPECT RATIO APPROVED BY GOD AND JOHN FORD 16:9 IS THE DEVIL'S WIDESCREEN AND CROPPING THINGS TO FIT IT IS THE 9TH FUCKING CIRCLE OF HELL."

Breaking news: Switch 2 to use a 4:3 screen with VHS pan-and-scan technology just to screw with oldpuck.
 
yi8dzy1i6tf81.jpg


2017 vs 2022 LCD screen tech.

As someone who's never owned an OLED, that seems like a vast improvement.
So I got REALLY curious about this and decided to take a similar picture comparing the Switch OLED, Steam Deck LCD 64GB model, Switch LCD Erista, and Vita OLED 1000, in that order. They're all at max brightness.
I know my camera isn't the best but here are the results with lights on and off.

The Steam Deck is blinding bright at times and the Vita is very dim but that might be due to age.
The Switch OLED is, obviously, the clear winner in terms of color quality and how pleasing it is to see.
The Steam Deck is good, but I feel like it isn't as pleasing to look at, likely due to the colors; at blinding max brightness it strained my eyes. The colors are good enough at that brightness, but I would realistically never use it like that, so I normally lower the brightness, and then the colors don't look as good.
The Switch LCD is also not too bad. The brightness leaves a lot to be desired, but the colors feel more pleasing to my eyes, even if they don't look as bright as they might in the picture.
The Vita is the best and worst of three worlds. You get good colors but not as good as Switch OLED, nice pixel density, and brightness is good enough for indoors play but you wouldn't see a damn thing outdoors.
Blacks are very nice in the loading screens on the OLEDs though.

0126241800.jpg

0126241800a.jpg

0126241805.jpg

0126241805a.jpg


PS: I wish we had the latest update on Vita. It really is a comfy way to play Stardew Valley.
PS.2: If we get analogue triggers on Switch 2, I hope they're like the Steam Controller's (not the Deck's). They have a click at the end and the travel is very short and nice.
 
So I got REALLY curious about this and decided to take a similar picture comparing the Switch OLED, Steam Deck LCD 64GB model, Switch LCD Erista, and Vita OLED 1000, in that order. They're all at max brightness. [...]
At low light settings like these, even the LCD Switch is not much of an issue to me; it looks quite fine. The real problem is when there is even a tiny bit of sunlight, where the screen basically gives up, which of course has a huge impact on the handheld experience even when using it at home. To be fair, it's something the Steam Deck also struggles with (even worse, judging by certain comparisons), and many other LCD screens have this issue too.
 
How long will they deny the Switch successor exists? They can't do this forever; the chances of the factory leaking something about the console are huge (despite all the security involved).
🤷 Dunno, I just wanted to use that meme because of who asked the question 😅
 
How long will they deny the Switch successor exists? They can't do this forever; the chances of the factory leaking something about the console are huge (despite all the security involved).
As long as you can't buy the Switch successor, it doesn't exist at all. One day you will wake up, pass a store, and see the Switch 2 just sitting there on a shelf. I swear Nintendo would do that if it was possible.
 
So I got REALLY curious about this and decided to take a similar picture comparing the Switch OLED, Steam Deck LCD 64GB model, Switch LCD Erista, and Vita OLED 1000, in that order. They're all at max brightness.
I know my camera isn't the best but here are the results with lights on and off.
These are really nice comparisons! It's hard to show screen differences through, well, another person's screen, but these side by sides are nice. I happen to have an OLED SteamDeck and an OLED Switch, so why not? Well, here is why not - the moire effect between the screen and my phone camera. It does both screens a huge disservice.

IMG-9401.jpg


I'll say the Steam Deck screen is brighter, but in terms of visual noise and color reproduction, they're basically identical for me. If I adjust the brightness down on the SD till it matches the Switch, they look exactly the same in the finer details. The general consensus is that the SD is the better screen. I might actually go for the Switch, whose slightly higher pixel density contributes nicely in some places. Wish I had a game with better blacks to show those off.
 
Unfortunately, the LCD display lottery's still happening with the Nintendo Switch equipped with the Tegra X1+ (Mariko).


Goddammit.

I guess I never knew because I've only had my 2017 launch Switch, and then when it started having major heating issues I got a Switch OLED in 2022. Do better Nintendo! (If possible)
 
If 80FHM6613 is indeed the model number of PS Portal's display panel, it is unlikely to be a Sharp product. Even though in the past Sharp used some incomprehensible model numbers, such as LS0ZDC0174 (5.5 inch) and LJ0DAS0022 (6.2 inch), their current model numbers are very consistent. For example:

LS058C0011 (5.8 inch)
LQ064X3LW02 (6.4 inch)
LX065A1BB04 (6.5 inch)

So unless the model number is a fully custom one to obscure its origin, the PlayStation Portal doesn’t seem to be equipped with a Sharp LCD panel.
Sorry for the late reply, but thank you so much for the information. I wasn't aware of how Sharp's model numbers are usually implemented.

I thought Sharp since the Bloomberg article mentioned Sharp being involved in the R&D for a new video game console. And there's currently no information available about who's the supplier of the 80FHM6613 displays.
 
I will say we are so technology-spoiled that we fill so many pages full of paranoia about the possibility that the screen tech in the next Nintendo machine won't be to our preference....

Guys.

It's not that heavy.
 
These are really nice comparisons! It's hard to show screen differences through, well, another person's screen, but these side by sides are nice. [...]
That's pretty nice. They do look like a match in that picture. I wish I had a Steam Deck OLED, but I could barely justify purchasing one; a second one would be pushing it lol
 
How long will they deny the Switch successor exists? They can't do this forever; the chances of the factory leaking something about the console are huge (despite all the security involved).
They very much can do this forever if they so choose, even if the device fully leaks. They will acknowledge the successor when they deem it prudent, no sooner and no later.
 
The questions asked during the last meeting like this (October? November?) were completely useless. All were about the Mario movie SMH.

I hope it won’t be more of the same but it probably will.
We need an extremely online shareholder to go ask about the switch successor for us just like the shareholder that asked why male Splatoon avatars did not have more varied hairstyles. THE PEOPLE WANNA KNOW!
 
We need an extremely online shareholder to go ask about the switch successor for us just like the shareholder that asked why male Splatoon avatars did not have more varied hairstyles. THE PEOPLE WANNA KNOW!
Shareholder Chad you are our only hope, my minuscule amount of shares isn’t enough 😔
 
I can't believe all this shit went down when I went to bed...... and now I'm 10 pages behind again.
 
In my opinion, I am convinced that the PS Portal display is made by BOE.
The part number for the PS Portal display is "80FHM6613".
ZfKinYy.jpg


Generally, the first two digits of the display part number are the manufacturer's code.
However, in the PS Portal, that code is hidden.
w8wSZWe.jpg


So what do the other digits in the part number mean?
Naturally, the following interpretations are possible
80: 8 inch
FHM: Full-HD Module
6613: Unique code

If you look up "FHM" in this section, only the BOE uses it.
In other words, we can determine that the PS Portal display is made by BOE.
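
Spelling out those field guesses (the boundaries are just my reading of the number, not a documented spec):

def decode_panel(pn):
    # split the part number into the fields guessed above
    return {
        "diagonal_inches": int(pn[:2]) / 10,  # "80" -> 8.0 inch
        "module_type": pn[2:5],               # "FHM" -> Full-HD Module
        "unique_code": pn[5:],                # "6613"
    }

print(decode_panel("80FHM6613"))
# {'diagonal_inches': 8.0, 'module_type': 'FHM', 'unique_code': '6613'}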
 
In my opinion, I am convinced that the PS Portal display is made by BOE. The part number for the PS Portal display is "80FHM6613". [...]
That's a really nice find.
 
Lots of people are guessing we are gonna get news the week of February 7th but does anyone think we'll get some more juicy info next week? I am also curious what made Nate's next video get delayed.
 
So I got REALLY curious about this and decided to take a similar picture comparing the Switch OLED, Steam Deck LCD 64GB model, Switch LCD Erista, and Vita OLED 1000, in that order. They're all at max brightness. [...]
Thank you for this, I was pulling my hair out trying to find a proper comparison of a Switch OLED to Steam Deck with a decent camera haha

And honestly, I think that just reconfirms my contentment with LCD if it means beefier specs elsewhere.
 
Lots of people are guessing we are gonna get news the week of February 7th but does anyone think we'll get some more juicy info next week? I am also curious what made Nate's next video get delayed.
I think it would be weird to announce it the week of the meeting, rather than before it. There's the possibility of both, Monday, but if anything I think next week, either the 30th or the 1st, is possible. Team January may yet see something.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.