> Ah yes, abbreviated as the Switch U.

Trailer shows the new Switch controller again, but this time they meant the new pro controller. The audience thinks it's still for the old Switch.
> As far as I know, the only Nintendo Switch game that was published by Nintendo that used Unity was Snipperclips, which was developed by SFB Games. But SFB Games is an independent company, not a subsidiary of Nintendo.

I believe the only other title that used Unity that was also published by Nintendo was Jump Rope Challenge.
It's coming out in a couple of days LMAOOOO
> Given the popularity of Unity, even with Nintendo themselves, I can't help but imagine a "bad omen for unity" as a bad omen for Switch.

All From Software games also have this problem, just Elden Ring is so big that people sorta assume it's because the system's overloaded. This bug won't go away with more hardware power.
> All From Software games also have this problem, just Elden Ring is so big that people sorta assume it's because the system's overloaded. This bug won't go away with more hardware power.

Is a 60 FPS cap any easier or harder to implement than a 30 FPS cap?
Not targeting you specifically with this comment, just using it as a jumping off point, should folks find it interesting
30 fps is 3x the power of 60 fps, and that's why 30 fps won't die
Lemme give you a simplified view of a game engine.
Code:
CPU operations          -> GPU queues                 -> Driver               -> Screen
 * Read inputs             * Tessellate geometry         * Keep video buffer     * Draw buffer to screen
 * Hit detection           * Render textures
 * Physics                 * Run shaders
 * Progress animations     * Perform post-processing
So the CPU does all its operations, pushing work into the GPU queue. Then the GPU does its job and draws the final frame to a buffer, controlled by the driver. Then the physical screen reads that buffer and shows it to the player.
A typical screen does its job on a 60Hz timer. Every 16.6ms, it draws whatever is in that buffer, no matter what. One little complication, actually - it takes a little time to do that, and that's going to matter in a second, but stick with me.
With a normal screen, you can't change that timer. So if you want each frame to be in front of the player's eye for the same amount of time (for the least juddery experience), you either need to run all that logic in 16.6ms - 60fps - or in 33.3ms - 30fps.
But notice - the CPU part of the frame time is dictated not by how visually complex your scene is, but by the underlying game logic. So if you're running at 60fps, you might spend 8ms on CPU stuff, and another 8ms on GPU stuff. Go to 30fps, and you still spend 8ms on CPU stuff, but 24ms on GPU. That's 3x the amount of GPU performance, a huge win. And as long as a significant number of gamers care about resolution/effects over frame rates, 30fps is here to stay.
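As a sanity check, that budget arithmetic can be sketched in a few lines (a toy calculation using the example numbers from this post, not real engine code):

```python
# Frame-time budgets on a fixed 60Hz display, using the toy numbers above.
def gpu_budget_ms(target_fps: int, cpu_ms: float) -> float:
    """GPU time left per frame once the (fixed) CPU work is subtracted."""
    frame_ms = 1000 / target_fps
    return frame_ms - cpu_ms

CPU_MS = 8.0  # example CPU cost per frame; the same at any frame rate
budget_60 = gpu_budget_ms(60, CPU_MS)  # ~8.7ms of GPU time
budget_30 = gpu_budget_ms(30, CPU_MS)  # ~25.3ms of GPU time
print(f"30fps gives {budget_30 / budget_60:.1f}x the GPU budget of 60fps")
# prints: 30fps gives 2.9x the GPU budget of 60fps
```

With these numbers the ratio comes out just under 3x, which is where the "3x the power" shorthand comes from.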
What if you run at a frame rate between 30 and 60fps?
As frame rate goes up, latency goes down. The CPU is checking your controller inputs, and the faster it can get those results to your screen - and the sooner it can get on to reading the next set of inputs - the better the latency.
But because the screen does 60 updates a second, some of your frames appear on screen longer than other frames. This causes judder. It's not a frame drop, but it feels like it, where you see a frame for multiple ticks of the screen, then new frames every tick, then back to waiting a few ticks.
Because you have more frames, smoothness increases. Smoothness is a bit of a misnomer, to me, because you're getting more frames of data, but you're also getting judder. I find it very unsmooth, but some folks would rather have the extra frames over judder.
You get screen tearing. If you're running at a frame rate that doesn't fit smoothly into the 60Hz tick-tock of the screen, eventually you will be writing to the buffer while the screen is reading it. That causes tearing, where the top of the screen is showing one frame and the bottom of the screen is showing another frame. This is especially noticeable in side-to-side camera movement, and less of an issue when the camera is static.
Doesn't variable refresh rate fix this?
Sorta! Variable refresh rate basically says that the screen will hold off on updating itself if it hasn't seen a buffer update, and then will update itself just in time when there is one. Usually there is a limit to how much flexibility in timing the screen has, but VRR will eliminate the screen tearing issue. It can't eliminate judder, however.
What about a frame rate cap?
The idea of a frame rate cap is that you set an artificial limit on how often you render frames, so that your experience runs at a locked rate, with no judder or tearing.
The basic way a frame rate limiter works is that it lets the game go as fast as it wants; then, once the game has completed a frame, the limiter will lie about the GPU still being busy until the last millisecond. The CPU part of the game waits until the GPU completes (which is being artificially slowed down by the frame limiter), the screen updates, and then the CPU goes off again.
In practice frame limiters are really tricky. I won't dig too much into why, but one reason is that as engines become more complex, the more they want to run at least some CPU operations while the GPU is going, which leads to some complex interactions between all the various systems. You don't want to get into a case where one part of game logic runs at an unlocked frame rate, and the others run at a locked frame rate.
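A minimal sketch of that basic scheme (the `simulate` and `render` callables are hypothetical stand-ins for real engine work, and a real limiter blocks on the GPU/driver rather than spinning on the CPU):

```python
import time

def run_capped(n_frames, simulate, render, target_fps=30):
    """Run n frames behind a simple frame cap; return presentation times (s)."""
    frame_s = 1.0 / target_fps
    presented = []
    deadline = time.perf_counter() + frame_s
    for _ in range(n_frames):
        simulate()  # CPU work: inputs, physics, animations
        render()    # GPU work stand-in
        # The limiter's "lie": act as if the GPU is still busy until the
        # frame boundary, so every frame comes out on the same cadence.
        while time.perf_counter() < deadline:
            pass  # a real limiter would block on the GPU/driver, not spin
        presented.append(time.perf_counter())
        deadline += frame_s  # fixed step, so timing error doesn't accumulate
    return presented
```

Note that the deadline is advanced by a fixed step rather than recomputed from "now"; that detail is what keeps the pacing even when a frame finishes slightly early or late.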
So, what the hell is wrong with the Unity frame limiter?
Real quick, let's talk about the difference between dropped frames and bad frame pacing.
A dropped frame is when your game can't do all the things it needs to do in the allotted frame time, so the display doesn't get updated. Over the course of a second, you get 29 frames instead of 30.
Bad frame pacing is a subtler situation where you get 30 frames every second, but the frames are on screen for an inconsistent amount of time. Instead of getting a new frame every 33.3ms, you get one frame in 16.6ms, then a second frame in 50ms, and then another frame in 33.3ms. Think of it this way: a 60Hz screen is like a tick-tock timer. A 60fps game updates frames on both the "tick" and the "tock"; a 30fps game is supposed to just update on the "tick".
Bad frame pacing is when you update on the tick most of the time, then you miss a tick, update on the "tock" to catch up, and eventually swing back to the tick again. This is the Unity engine problem. Even for a game that has no problem hitting >30fps all of the time, Unity will sometimes fail to correctly apply CPU back pressure to slow down the game, or will fail to update the buffer in a timely fashion, or both.
Bad frame pacing doesn't cause screen tearing, fortunately, but it still causes judder just like an unlocked frame rate does, without any of the extra smoothness or latency reductions.
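The two failure modes are easy to tell apart in captured data. Here's a toy classifier (my own illustration, not how DF or Unity measure it), assuming a 60Hz display and a 30fps target, so every frame should persist for exactly two screen ticks:

```python
TICK_MS = 1000 / 60  # one refresh of a 60Hz screen

def diagnose(present_ms, target_ticks=2):
    """Classify presentation timestamps (ms) from a nominal 30fps capture."""
    # How many screen refreshes each frame stayed up, in whole ticks.
    ticks = [round((b - a) / TICK_MS)
             for a, b in zip(present_ms, present_ms[1:])]
    if all(t == target_ticks for t in ticks):
        return "clean 30fps"
    if sum(ticks) == target_ticks * len(ticks):
        # Right number of frames overall, but 1-tick and 3-tick frames
        # averaging out to 2: judder without a lower frame rate.
        return "bad frame pacing"
    return "dropped frames"

print(diagnose([0.0, 33.3, 66.7, 100.0]))   # clean 30fps
print(diagnose([0.0, 16.7, 66.7, 100.0]))   # bad frame pacing
print(diagnose([0.0, 33.3, 100.0, 133.3]))  # dropped frames
```

The middle capture delivers the same number of frames per second as the first one; only the distribution is wrong, which is exactly why pacing problems don't show up in an average-fps counter.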
WayForward's sorta clever solution
According to DF, Advance Wars runs the frame rate limiter in cases where the player has control of the camera, and runs an unlocked frame rate when they don't.
When you're moving the camera, that's when you're going to notice screen tearing the worst, and because the map is not exactly rich in animation, there is little lost detail when running at a lower frame rate. So despite the judder, running the frame rate cap here makes sense.
In combat, the camera is static but animation detail increases. Tearing isn't an issue, and the elaborate character animations that WF has provided can run with all the extra smoothness provided by the higher frame rate.
TL;DR
Devs will always want to have 30fps on the table as an option, no matter the strength of the hardware.
Frame rate limiters are necessary to get high quality 30fps options.
Unity has an especially bad frame limiter (and, historically, so does From Software, which is even worse).
In some cases, devs may choose to go with "unstable" frame rates as preferable to Unity's bad frame rate limiter, even when 30+fps is well within their grasp performance-wise.
> Checknate.

i c wut u did there
> Naming it the switch pro would just lead to confusion about whether or not it's their next generation of hardware after so many years of Switch Pro rumors. They'll probably just pick an entirely new name based on whatever the new gimmick it has is that they advertise on (If the gimmick they focus on is 5G they'll call it the Nintendo Connect, if the gimmick is AR they'll call it the Nintendo Reality, etc) but also make sure to list backwards compatibility on the feature list and during the presentation wherein they specify what games are coming to it.

I think their surveys may have brought back that Super or 2 are the way to go. But I doubt they would name it something without the Switch. The form factor will remain the same. They just need to constantly hammer that this device is a successor.
> i c wut u did there

Why do we assume there will be surveys and not a bunch of suits gathered around a table arguing, and then when they agree the guy who agrees with the richest shareholder wins despite being outvoted 10:1?
> Why do we assume there will be surveys and not a bunch of suits gathered around a table arguing, and then when they agree the guy who agrees with the richest shareholder wins despite being outvoted 10:1?

I’m Team Switch^2 myself
It's going to have a corporate name. I don't even know if Super or 2 are realistically in the running, even though I really want Super! I'd also be happy with ^2(Squared).
> Many people complaining about Tears of the Kingdom being $70 in the US, but between discount eShop cards at CostCo and the Voucher Program, I’m only paying $50 for TotK!

I was bitter when I got my switch and learned that the voucher program was a thing
And then it came back and that bitterness went away
> Many people complaining about Tears of the Kingdom being $70 in the US, but between discount eShop cards at CostCo and the Voucher Program, I’m only paying $50 for TotK!

I got it for less than 50 after tax. While it could just have cheaper prices, hunting for price cuts on eShop cards, using Gold Points and vouchers can really add up to huge savings. I think I saved something like 30 bucks on pre-orders this year alone? It's gotten to the point where digital is cheaper and more convenient than physical.
> That perfect cube - that does nothing - is about to be the single biggest failure in the history of personal computing.

If you're talking about NeXT then I'd have to strongly disagree. Did it sell well? No. But I don't see how you can say it wasn't a success when Apple snatched up the company, and utilized what they had built for the rebirth of Mac. Steve Jobs was a pimp.
> Switch 2 with this logo

Swiitch?
> Switch 2 with this logo

Ooo, I hate that! But I'm glad someone is trying!
> Ooo, I hate that! But I'm glad someone is trying!

I am so bored
> I am so bored

You and me both, I've resorted to prompting Bing AI to generate rumours about it just for kicks. Something I have learned is it is aware of the rumours, of Famiboards, and the leaked details about T239. It even has an idea of the timeline and Linux commits. Even if I don't mention Nintendo, it seems to understand that T239 and Nintendo are somehow related. I doubt it means anything, but I find it fun!
> You and me both, I've resorted to prompting Bing AI to generate rumours about it just for kicks. Something I have learned is it is aware of the rumours, of Famiboards, and the leaked details about T239. It even has an idea of the timeline and Linux commits. Even if I don't mention Nintendo, it seems to understand that T239 and Nintendo are somehow related. I doubt it means anything, but I find it fun!

Yeah I find Bing’s hallucinations pretty wild and fun. Like look at this
> If you're talking about NeXT then I'd have to strongly disagree. Did it sell well? No. But I don't see how you can say it wasn't a success when Apple snatched up the company, and utilized what they had built for the rebirth of Mac. Steve Jobs was a pimp.

It was a quote from the movie... Woz (Seth Rogen) says it.
> It was a quote from the movie... Woz (Seth Rogen) says it.

From now on you change your avatar before all your movie references
great movie
Hey Famiboard Family! This is Mike Odyssey,
No more leaks or rumors from me.
I don't like this label I am getting lately as a leaker/insider rumor source. With all honesty, I just want to make content that makes people happy. Those who have followed me for a while know this to be true. I don't shy away from my disability and I work hard to inspire others.

So I've decided that any unreleased information I get from now on, I will either delete or forward to my closest YouTube friends in the field. I'm not interested in being the first with news. I am interested in making people happy, that is all. You guys have one less lunatic to worry about when it comes to leaks lol. I just want to make content. Hope you can check out my vids sometime. You might like what I have to offer. Take care everyone!
> I make a video stating there would be an announcement/event from April 12 to the 14th, within that timeframe, before the trailer was announced on the 12th and released on the 13th. To be honest I don't want to be considered a leaker/rumor/insider person. I just want to make content that inspires people. I can honestly care less about being first with news when I have a disability that is leaving me blind lol. I just released a post on here stating that I was no longer sharing info in this manner. Only official news, and rumors yes, but only the ones already out there.

It's all good man. You came off as a nice and decent person in that video to me and labelled the info as rumour coming from a source and taking it with a grain of salt. You didn't claim anything else. You didn't go overboard or crazy like some other tubers do. You were as level headed as possible. As long as you label things right you should be fine. Like you said you already got one right from your source (same source?), so keep it going. Let the days be the judge. And take care of yourself. I hope you get better.
> I like Switch 2 or super nintendo switch

I think they should save the "Super" tagline for mid gen refreshes on the Switch line. Call it Switch 2, then mid gen Super NS2
> I think we're in a "the night is darkest just before the dawn" period.

Was that a One Piece reference?
> From now on you change your avatar before all your movie references

Sigh
> Was that a One Piece reference?

Clearly a Florence and the Machine reference.
> I think we're in a "the night is darkest just before the dawn" period.

Hahahhaa it’s gonna be a looong couple of weeks…
In all fairness I can see them naming it something outlandish like Ultra Switch but who knows. They’re not as daring under Furukawa
> The Nintendo Switch - Furukawa Model

Ah yes, the Fwitch
> The Nintendo Switch - Furukawa Model

Nintendo Switch (Furukawa's Version)
> Nintendo Switch (Furukawa's Version)

It's still wild to me that OLED Model changed its official name between reveal and launch.
> Why is it that things like frame pacing and screen tearing didn't seem to be an issue prior to the PS3/360 generation? Framerate drops were surely common, but I do not remember screen tearing and frame pacing issues back on the PS2/GC.

I think interlacing, analogue outputs and CRTs may have had something to do with that.
> Is a 60 FPS cap any easier or harder to implement than a 30 FPS cap?

In theory it’s the same. In practice, it’s less complicated, partially just because frame pacing issues get smaller if you have a 120Hz display, and latency issues that a cap can introduce tend to hide behind the higher frame rates.
> It's still wild to me that OLED Model changed its official name between reveal and launch.

How so?
look...you can't fix stupid. That's just a sad reality and youtube is a gold mine for stupid.
Maybe that's too harsh but I just don't care lmao.
> It's still wild to me that OLED Model changed its official name between reveal and launch.

What was the official name at reveal??
> It's still wild to me that OLED Model changed its official name between reveal and launch.

Do you mean how (OLED Model) went from being inside parentheses to following a dash?
> Do you mean how (OLED Model) went from being inside parentheses to following a dash?

Yep.
> Yep.

lmaoo I thought it was something else entirely
> Is it safe to assume Metroid Prime 4 gets the Twilight Princess/BoTW treatment and launches on Switch and Switch2?

Probably. With it releasing this late in the year and late in the Switch’s life, it’s highly probable.
> then it would be a switch game with optional enhancements

That is what almost every cross gen title is.
> Is it safe to assume Metroid Prime 4 gets the Twilight Princess/BoTW treatment and launches on Switch and Switch2?

Honestly at this point I almost expect it.
> Honestly at this point I almost expect it.
My opinion is this game could be a good contender for a cross gen launch title.
I would love to believe Nintendo thinks Metroid Prime 4 is worthy of being a cross-generation highlight in lieu of Zelda.
I’m only expecting a Switch (1) game for now tho.