> If it were a world where every game were forced to use the screen's max resolution, this might be a concern.
But a 1080p screen consumes more power to illuminate than a 720p screen does regardless of the render resolution.

> But a 1080p screen consumes more power to illuminate than a 720p screen does regardless of the render resolution.
How much of 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 that has a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5,000 mAh battery.

> Maybe the "OLED model" for Drake will be a 1080p screen with VRR and miniLED? And that one could actually be a pro? Speaking of which, on the chip side, what would a "new Drake" or a "Drake Pro" look like?
> Smaller node? Higher clocks for CPU/GPU? LPDDR5X RAM?
SkyQuest mentioned that yields are one of the biggest challenges for miniLED manufacturers. So currently not likely.

> Still thinking 1.25GHz or 1.78GHz for the CPU, depending on the node.
That's... oddly specific.

> That's... oddly specific.
1.78GHz is when the CPU is boosted in some Switch games to speed up decompression.
1.25GHz is just a ballpark: over the Switch's default clock but below 1.5GHz.

> 1.25GHz would be fine as the CPU would still be leagues better than the ones in the Xbone and PS4. They could probably stretch it to 1.4GHz, possibly.
Cute that you think at 1.25GHz it’ll be “leagues better”.

> I believe that there is the possibility of different clock profiles, some privileging the CPU and others the GPU while maintaining an equal energy expenditure; it would be good for CPU-bound games and also for games with performance and graphics modes.
I don’t think devs would like that.

> 1.78GHz is when the CPU is boosted in some Switch games to speed up decompression.
> 1.25GHz is just a ballpark: over the Switch's default clock but below 1.5GHz.
Really hoping it's at least 1.5GHz. I think that's doable if we get a 6 or 5nm node, but we'll see.

> I know Nintendo can be conservative, but why are people going below even conservative with the CPU clock speed?
I'm assuming because the CPU profiles are traditionally the same both handheld and docked, so it feels like there's less room to maneuver than with the GPU clocks.

> I'm assuming because the CPU profiles are traditionally the same both handheld and docked, so it feels like there's less room to maneuver than with the GPU clocks.
I’m not sure I follow. If the GPU can be clocked low in portable mode along with the memory to save battery life, I don’t understand why the CPU has to be clocked so low, below even conservative. It’s not the A57, nor is it on the 20nm node.

> I’m not sure I follow. If the GPU can be clocked low in portable mode along with the memory to save battery life, I don’t understand why the CPU has to be clocked so low, below even conservative. It’s not the A57, nor is it on the 20nm node.
> Like, are the expectations of the chip to consume 2-4W in portable mode? Because that’s unrealistic.
Hey, you asked for a rationale. I'm not smart enough to have an opinion on the matter either way.

> How much of 'more' is it? I firmly believe it's negligible.
> The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting.
About 2.25x as much. OLED power draw scales linearly with pixel count. How much that is in absolute terms I'm not sure.

> I own an iPad Mini 5 that has a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5,000 mAh battery.
It also has a 7nm A12 with a 2+4 big.LITTLE CPU design. The 500-nit screen is almost definitely soaking up more power than the Switch's, but it's running an SoC that is built for power scaling and manufactured on a vastly more efficient process. I'm not sure anything useful can be extracted from the comparison.

> If it were a world where every game were forced to use the screen's max resolution, this might be a concern. But it's not the case for Switch, wasn't for Vita, isn't for Steam Deck, won't be for Drake.
Wait, what? It's an OLED screen with physical pixels - why would the native rendering resolution matter for the screen's power draw? If it's scaling the image, it doesn't matter if the line is aliased or native resolution, it's gotta light all them pixels up.

> So leaving out a 1080p+ screen for this reason would just be punishing the games that can handle it so the games that can't won't seem as inferior. I've put it this way before,
Yes, you have said this before, but it is putting words in the mouths of people arguing for a lower resolution screen. No one is saying that the reason they want a 720p screen is because they don't want their precious 720p games to feel bad next to manly 1080p games.

> but should the Switch's screen have been 540p because more games would actually hit that?
This is why your rationale is a straw man. Folks have said - I have said - that the current 720p screen is a retina quality display, and that most players' eyes (including my own) are physically incapable of seeing the pixel distinctions on it already. A 1080p screen wouldn't add value, but it would add manufacturing and power draw cost. A 540p screen obviously isn't in the same position.
Here I've taken a few 720p shots from eShop listings or my own collection and show how they appear in 1080p by nearest neighbor, bilinear, and FSR1.
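For anyone who wants to reproduce the two simpler scalers, a minimal Pillow sketch like this is all it takes (the filenames are hypothetical, and FSR1 has no Pillow equivalent, so those images came from an external tool):

```python
from PIL import Image

# Upscale a 720p capture to 1080p with the two filters compared above.
src = Image.open("capture_720p.png")  # hypothetical input file
target = (1920, 1080)

src.resize(target, Image.NEAREST).save("1080p_nearest.png")    # hard-edged
src.resize(target, Image.BILINEAR).save("1080p_bilinear.png")  # blended
```
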
> A 1080p screen will make native 720p (unpatched) handheld pixel art / sprite based games look worse, of which I own a lot, and developers have little reason to go back and patch, let's say, Bloodstained: Curse of the Moon or Hyper Light Drifter or Touhou Luna Nights, because they all run perfectly on the current Switch.
You're not wrong, which is one of the reasons I was so resistant to a 1080p screen. However, There Will Be(tm) a 1080p handheld from Nintendo. This problem will happen eventually, and the longer Nintendo waits, the more of the library has this issue.

> What are the odds that we hear rumblings of a new Nintendo device at CES 2023 in January?
Low, Nintendo doesn't do CES anymore.

> How much of 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 that has a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5,000 mAh battery.
OLED screens may consume less power than LCD, but there are still 2.25x as many pixels in a 1080p screen needing to be lit than in a 720p screen. And you can't compare an iPad's battery life to a game console's, let alone expect that to tell you anything about the power draw of the screens.
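For reference, the 2.25x is straight pixel-count arithmetic; how closely panel power actually tracks it is the open question:

```python
# Pixel counts of the two panel resolutions under discussion.
print(1920 * 1080)                   # 2,073,600
print(1280 * 720)                    #   921,600
print((1920 * 1080) / (1280 * 720))  # 2.25
```
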
> What are the odds that we hear rumblings of a new Nintendo device at CES 2023 in January?
Best bet is NVIDIA hinting at the SoC in their keynote speech, and even then that's a big "if."

> Best bet is NVIDIA hinting at the SoC in their keynote speech, and even then that's a big "if."
The only way this happens is if they announce a new Shield with specs/features that don't align with Orin. And given Drake is made for Nintendo first, they might not even put out the specs.

> I know Nintendo can be conservative, but why are people going below even conservative with the CPU clock speed?
Betcha a dollar it's the "lol because Nintendo" crowd.

> Am I the only one that thinks if Nintendo doesn't use VR as a selling point when introducing this new system, that they will do so in a more powerful revision 3 years after this new device comes out?
Yea, just you.

> Am I the only one that thinks if Nintendo doesn't use VR as a selling point when introducing this new system, that they will do so in a more powerful revision 3 years after this new device comes out?
Nintendo might not care too much about VR. As fun as the experience of VR can be, it's still an incredibly small market share and probably will be until it becomes more normalized by businesses. Most people I know just look at VR as a novelty and with an "o neat" mentality.
One final note though: when I have friends use my Quest, they tend to think more highly of VR, but not highly enough to want to get invested in it.

> How much of 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 that has a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5,000 mAh battery.
I was trying to find data for this on Google, but it was hard to find apples-to-apples comparisons, and the closest I could find didn't seem like a slam dunk.

> A 1080p screen will make native 720p (unpatched) handheld pixel art / sprite based games look worse, of which I own a lot, and developers have little reason to go back and patch, let's say, Bloodstained: Curse of the Moon or Hyper Light Drifter or Touhou Luna Nights, because they all run perfectly on the current Switch.
The 720p screenshots of Bloodstained: Curse of the Moon already look like they're somewhat unevenly scaled, with one pretend pixel becoming a square of 2-3 pixels (with blending where they meet). Scaling these images to 1080p by bilinear or box scaling doesn't look drastically different to me. FSR in general doesn't seem a good match for keeping pixel art with its intended look.
Hyper Light Drifter has such a sharp look it's a rare case where the nearest neighbor scaling seems like it works well - at least in stills. I'd still trust bilinear/box more in motion.
Touhou looks very similar to a higher color Curse of the Moon, in that it looks like an original image is already being blown up approximately (but not exactly) 2x, so I don't think bilinear/box changes it much.

> Wait, what? It's an OLED screen with physical pixels - why would the native rendering resolution matter for the screen's power draw? If it's scaling the image, it doesn't matter if the line is aliased or native resolution, it's gotta light all them pixels up.
I was replying to someone talking not about screen power draw, but T239 power draw.

> Yes, you have said this before, but it is putting words in the mouths of people arguing for a lower resolution screen. No one is saying that the reason they want a 720p screen is because they don't want their precious 720p games to feel bad next to manly 1080p games.
It feels like the implication to me when people ask why bother having a resolution many games won't hit. If it's a resolution every game will hit, I think the bar is set too low.

> This is why your rationale is a straw man. Folks have said - I have said - that the current 720p screen is a retina quality display, and that most players' eyes (including my own) are physically incapable of seeing the pixel distinctions on it already. A 1080p screen wouldn't add value, but it would add manufacturing and power draw cost. A 540p screen obviously isn't in the same position.
Having been looking at screens of approximately these sizes and resolutions for the last decade-ish, I find it very hard to believe it's a minority of people who can distinguish between a 7" 720p and a 7" 1080p screen. The difference might be lessened if comparing photographs or video with all kinds of natural blur going on anyway, but that's a pretty different beast from what we see on gaming machines.
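To put numbers on the density argument (the 7" 1080p panel is hypothetical; whether ~210-237 PPI counts as "retina" depends on viewing distance and eyesight):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel with the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1280, 720, 6.2))   # ~237 PPI: launch-model Switch
print(ppi(1280, 720, 7.0))   # ~210 PPI: Switch OLED
print(ppi(1920, 1080, 7.0))  # ~315 PPI: hypothetical 7" 1080p panel
```
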
> Meanwhile, for those who can see the difference, it's not about the 720p games seeming inferior; it's that upscaled games from the existing Switch library will show artifacts, especially games that use pixel art, which is not exactly an uncommon art style at the moment.
It's incredibly uncommon for games to have pixel art that's actually 720p, so there's already scaling going on. I disagree that it looks significantly worse, as demonstrated in the Imgur gallery, and even if it did look worse I think it's a fair trade to have someone's unupdated faux-300p game look worse if thousands of other games from 2023-2030 are less limited.

> I appreciate what you're doing here with these, but I'm not sure it actually shows what the effect will be on actual hardware. Either I look at the images scaled to my own monitor's resolution, inside a browser viewport, or I view them at native res side by side at different physical sizes, which ignores pixel density.
I did consider that last point, but to demonstrate it properly would mean scaling both the 720p and 1080p images to 2160p, and I wasn't sure if anyone else was interested that much. But now I am, enough to do so in a limited way. For the 3D images I'll use the FSR upscale, for the 2D ones box scaling. Original on left half, scaled on right half. To my eye, viewing on a 55" 4K TV, Hyper Light Drifter seems the most affected, with slight blur noticeable on the right side, since it has such stark edges between different colors and is very sharp at 720p. It's noticeable on Curse of the Moon to a lesser extent, since it's already a bit blurry on the left side.
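The composites were produced along these lines (a Pillow sketch of my process under the choices above; the filename is hypothetical, and box scaling stands in for the 2D route):

```python
from PIL import Image

shot = Image.open("shot_720p.png")  # hypothetical 720p capture

# Route A: straight to 2160p, a clean 3x integer scale.
native = shot.resize((3840, 2160), Image.NEAREST)

# Route B: the contested 1.5x step to 1080p, then an exact 2x to 2160p.
via_1080p = (shot.resize((1920, 1080), Image.BOX)
                 .resize((3840, 2160), Image.NEAREST))

# Original route on the left half, 1080p route on the right half.
combined = native.copy()
combined.paste(via_1080p.crop((1920, 0, 3840, 2160)), (1920, 0))
combined.save("side_by_side_2160p.png")
```
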
> Nintendo might not care too much about VR. As fun as the experience of VR can be, it's still an incredibly small market share and probably will be until it becomes more normalized by businesses. Most people I know just look at VR as a novelty and with an "o neat" mentality.
> One final note though: when I have friends use my Quest, they tend to think more highly of VR, but not highly enough to want to get invested in it.
Hilariously enough, it's Miyamoto who keeps getting asked about or answering VR. All of these answers are from throughout the years, with the last one being in 2019. Outside of a financial perspective, it seems they still have issues with VR.

> The 720p screenshots of Bloodstained: Curse of the Moon already look like they're somewhat unevenly scaled, with one pretend pixel becoming a square of 2-3 pixels (with blending where they meet). Scaling these images to 1080p by bilinear or box scaling doesn't look drastically different to me. FSR in general doesn't seem a good match for keeping pixel art with its intended look.
> Hyper Light Drifter has such a sharp look it's a rare case where the nearest neighbor scaling seems like it works well - at least in stills. I'd still trust bilinear/box more in motion.
> Touhou looks very similar to a higher color Curse of the Moon, in that it looks like an original image is already being blown up approximately (but not exactly) 2x, so I don't think bilinear/box changes it much.
Whatever scaling algorithm you propose will look 'fine', but I will always prefer (beyond just these three games I plucked from my library) either native res or integer upscaling to 1440p/2160p, or patching the game itself to output 1080p. That's just the sort of person I am. The kind to buy an expensive RetroTink 4K upscaler and hook it up to my launch-day Switch so I can get 1:1 pixel mapping instead of letting my TV slather on some blur.

> You're not wrong, which is one of the reasons I was so resistant to a 1080p screen. However, There Will Be(tm) a 1080p handheld from Nintendo. This problem will happen eventually, and the longer Nintendo waits, the more of the library has this issue.
You see, I am hoping we skip 1080p entirely, i.e. stick with 720p for Switch 2 and move to a 1440p handheld screen for Switch '3', and integer scale by default. I'm not expecting it, of course. As I've said, I will accept a 1080p OLED screen and deal with the slight imperfections for my 2D indies.
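The arithmetic behind that preference is just the scale factors from 720p:

```python
print(1080 / 720)  # 1.5 -> fractional, so pixels double unevenly or get blended
print(1440 / 720)  # 2.0 -> every 720p pixel becomes a clean 2x2 block
print(2160 / 720)  # 3.0 -> likewise a clean 3x3 block on a 4K set
```
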
> Hilariously enough, it's Miyamoto who keeps getting asked about or answering VR. All of these answers are from throughout the years, with the last one being in 2019. Outside of a financial perspective, it seems they still have issues with VR.
Given that the last response was from 2019, I wonder if anything has changed. I know the Labo wasn't exactly a screaming success, and VR has improved since then, generally speaking. The fact that Miyamoto said "we want families to play together" is probably the one thing that keeps me reluctant to feel that Nintendo is pursuing VR as a major part of the next hardware. They could always attempt to make it an add-on like Sony does with PSVR, but I find it hard to believe that Nintendo would want to fracture their install base and develop side content or games specifically for something that might not sell well, given the current cost of the lower-entry VR devices. For the record, I wouldn't be upset if Nintendo did pull some wild VR trick out of their hat.

> Given that the last response was from 2019, I wonder if anything has changed. I know the Labo wasn't exactly a screaming success, and VR has improved since then, generally speaking. The fact that Miyamoto said "we want families to play together" is probably the one thing that keeps me reluctant to feel that Nintendo is pursuing VR as a major part of the next hardware. They could always attempt to make it an add-on like Sony does with PSVR, but I find it hard to believe that Nintendo would want to fracture their install base and develop side content or games specifically for something that might not sell well, given the current cost of the lower-entry VR devices. For the record, I wouldn't be upset if Nintendo did pull some wild VR trick out of their hat.
Just judging from the responses, Nintendo's history, & business MO, I would say that their answer hasn't changed that much. It faces three main problems for Nintendo:

> About 2.25x as much. OLED power draw scales linearly with pixel count. How much that is in absolute terms I'm not sure.
But improvements have also been made so that higher-density OLED panels emit around the same heat as lower-density ones. OLED dies on too much heat; to make it denser, you need to reduce power draw.

> Meanwhile, for those who can see the difference, it's not about the 720p games seeming inferior; it's that upscaled games from the existing Switch library will show artifacts, especially games that use pixel art, which is not exactly an uncommon art style at the moment.
They can just use the Dock Mode visuals if that's the case. Drake can trick the games into running on Dock Mode settings.

> They can just use the Dock Mode visuals if that's the case. Drake can trick the games into running on Dock Mode settings.
We've discussed this, and it probably isn't going to work without per-game testing. The games in question are unlikely to get that level of attention.

> You see, I am hoping we skip 1080p entirely, i.e. stick with 720p for Switch 2 and move to a 1440p handheld screen for Switch '3', and integer scale by default. I'm not expecting it, of course. As I've said, I will accept a 1080p OLED screen and deal with the slight imperfections for my 2D indies.
I can't imagine a 1440p screen on a sub-7-inch device, and if anyone tries to put an 8-inch Switch in my hands, my wrists will ask my feet to punch them for me, as my wrists can't do it themselves due to arthritis.

> A 1GHz Drake CPU is roughly like a PS4 Pro CPU (or One X, very similar there)
What? No, this is wrong.

> If you look at docked Drake, with a 1GHz clocked GPU, you're getting ~20% more power than a PS4, which puts you in a good position to run PS4 games with enough frame time left over to do DLSS. So that's my baseline GPU assumption.
Where are you getting 20% at 1GHz? Did you mean at 720MHz?
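To show where the numbers land (paper FLOPS only, assuming the 12-SM Ampere configuration from the leak and ignoring architectural differences):

```python
# FP32 throughput for an Ampere GPU: SMs x 128 lanes x 2 ops (FMA) x clock.
def tflops(sms, clock_ghz, lanes_per_sm=128, ops_per_clock=2):
    return sms * lanes_per_sm * ops_per_clock * clock_ghz / 1000.0

PS4_TFLOPS = 1.84  # Sony's published figure

print(tflops(12, 1.00) / PS4_TFLOPS)  # ~1.67 -> +67% over PS4 at 1GHz
print(tflops(12, 0.72) / PS4_TFLOPS)  # ~1.20 -> "~20% more" lands near 720MHz
```
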
> I've got a crapload of Digital Foundry benchmarks in a spreadsheet over here, but the analysis is mine. Essentially, if you compare cards with the same SM/CU count, AMD gets similar game perf on average, but is clocking their cards significantly faster to get there.
> RDNA2 is clearly still not as RAM efficient as Ampere either; Ampere can get way out ahead of RDNA2 in VRAM-starved situations, which confounds the analysis. In short, 35% is a conservative floor that gives RDNA2 the benefit of the doubt.
Did you account for Infinity Cache offering a 25% performance uplift, as per AMD? That's the thing the consoles lack.

> A78C is about a clock-for-clock equivalent to the Zen 2 CPU**, as you point out, both of which are about 2.5x more powerful than the Jaguar CPUs in the last gen consoles.
Quick little asterisk:
Sounds about right for chiplet-based Zen 2 (i.e., the desktop SKUs that end in X or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB of L3 cache instead of 2x16). You can actually see this with the 4700S chips, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.

> Quick little asterisk:
> Sounds about right for chiplet-based Zen 2 (i.e., the desktop SKUs that end in X or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB of L3 cache instead of 2x16). You can actually see this with the 4700S chips, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.
Yeah, and that is Orin for A78AE, which is less efficient than A78C (A78AE has multiple clusters and thus more latency, versus A78C, which is one cluster with very little latency).

> Quick little asterisk:
> Sounds about right for chiplet-based Zen 2 (i.e., the desktop SKUs that end in X or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB of L3 cache instead of 2x16). You can actually see this with the 4700S chips, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.
An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.

> An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.
According to Alex from DF, the PS5 uses 6.5 CPU cores for gaming, per their analysis of Gotham Knights. I wonder if this is something Sony will free up down the line (but I'm guessing not, since these newer consoles have a lot of system-wide functions running in the background).

Assuming we end up with 8 cores, would that be preferable if they run at lower clocks, say the 1.25GHz mentioned above, over having fewer cores at higher clocks? Don't modern games use a lot more threads?
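It mostly depends on how parallel the workload is; here's a rough Amdahl's-law sketch (the parallel fractions are assumed purely for illustration):

```python
# Relative throughput vs a single 1GHz core, Amdahl's-law style.
def relative_speed(cores, ghz, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return ghz / (serial + parallel_fraction / cores)

print(relative_speed(8, 1.25, 0.90))  # ~5.9
print(relative_speed(6, 1.60, 0.90))  # ~6.4 -> fewer, faster cores win here
print(relative_speed(8, 1.25, 0.98))  # ~8.8
print(relative_speed(6, 1.60, 0.98))  # ~8.7 -> heavy threading closes the gap
```
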
> An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.
Those PC Ryzens are probably not 100% equivalent to their console counterparts, though. Pretty sure the consoles have less cache, among other things.