• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

As far as I know, the only Nintendo-published Switch game built in Unity was Snipperclips, which was developed by SFB Games - and SFB Games is an independent studio, not a Nintendo subsidiary.
I believe the only other Unity title that Nintendo published was Jump Rope Challenge.
 
It's coming out in a couple of days LMAOOOO


I know I am late on this one, but with the XBC3 DLC dropping much sooner than expected, right after Advance Wars and shortly before Zelda TotK, Nintendo continues to stack the first half of this year. The idea that Nintendo planned to carry the Switch through the second half of the year with DLC continues to look far less plausible. Instead of DLC being a driving force in the second half, it looks far more likely that most of this DLC is planned to wrap up well before the final quarter of the calendar year. Instead of holding back something like the XBC3 DLC to fill a gap later this year, they are squeezing it into a spot that doesn't really need it. Zelda TotK is going to cast a big shadow, and there is no reason to put anything within a few weeks of it, but here we are with significant DLC dropping less than three weeks from TotK's release date. I think it's fair to assume that a lackluster second half of the year will not be a thing. There is no way Nintendo released all this content in the first half only to turn around and suffer a drought. It might not be new hardware; it could just be a few high-quality first-party games that haven't been announced. Anyone sticking to their guns with this being a normal year for Nintendo is fooling themselves. We are a few weeks away from there being only one more dated first-party game from Nintendo. We assume Prime 4 is still coming to Switch, but they didn't even mention it when stealth-dropping Prime Remastered.

The writing on the wall here is that Zelda is the final big push for the Switch, with less notable titles still flowing in for quite some time. The N64's swan song was Zelda MM, the GameCube's was Zelda TP, the Wii's was Zelda SS, and the Wii U's was BotW. History suggests that a Zelda title late in a Nintendo console's life aligns with a plan to move on to new hardware. Until Nintendo shows its hand for second-half software, I am sticking with the late 2023 release of redacted.
 
Naming it the Switch Pro would just lead to confusion about whether or not it's their next generation of hardware, after so many years of Switch Pro rumors. They'll probably pick an entirely new name based on whatever new gimmick they advertise around (if the gimmick they focus on is 5G they'll call it the Nintendo Connect, if it's AR they'll call it the Nintendo Reality, etc.), but also make sure to list backwards compatibility on the feature list and in the presentation where they specify which games are coming to it.
 
Given the popularity of Unity, even with Nintendo themselves, I can't help but read a "bad omen for Unity" as a bad omen for the Switch.
All From Software games also have this problem; it's just that Elden Ring is so big that people sort of assume it's because the system is overloaded. This bug won't go away with more hardware power.

Not targeting you specifically with this comment, just using it as a jumping-off point, should folks find it interesting.

30 fps is 3x the power of 60 fps, and that's why 30 fps won't die

Lemme give you a simplified view of a game engine.

Code:
CPU operations          -> GPU Queues                 -> Driver                 -> Screen
*Read inputs               *Tessellate geometry          *Keep video buffer        *Draw buffer to screen
*Hit detection             *Render textures
*Physics                   *Run shaders
*Progress animations       *Perform post-processing

So the CPU does all its operations, pushing work into the GPU queue. Then the GPU does its job and draws the final frame to a buffer controlled by the driver. Then the physical screen reads that buffer and shows it to the player.
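
To make that concrete, here's a minimal sketch of one pass through that pipeline. This is hypothetical C++ with stand-in function names, not any real engine's API:

Code:
// Hypothetical stand-ins for the stages in the diagram above.
void read_inputs() {}         // CPU: poll the controller
void update_simulation() {}   // CPU: hit detection, physics, animations
void submit_gpu_work() {}     // CPU fills the GPU queue; GPU draws asynchronously
void present() {}             // driver hands the finished buffer to the screen

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        read_inputs();
        update_simulation();
        submit_gpu_work();
        present();  // the screen reads this buffer on its next 60Hz tick
    }
}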

A typical screen does its job on a 60Hz timer. Every 16.6ms, it draws whatever is in that buffer, no matter what. One little complication: it actually takes a little time to do that draw. That's going to matter in a second, but stick with me.

With a normal screen, you can't change that timer. So if you want each frame to be in front of the player's eye for the same amount of time (for the least juddery experience), you either need to run all that logic in 16.6ms, or in 33.3ms - 30fps.

But notice - the CPU part of the frame time is dictated not by how visually complex your scene is, but by the underlying game logic. So if you're running at 60fps, you might spend 8ms on CPU stuff, and another 8ms on GPU stuff. Go to 30fps, and you still spend 8ms on CPU stuff, but 24ms on GPU. That's 3x the amount of GPU performance, a huge win. And as long as a significant number of gamers care about resolution/effects over frame rates, 30fps is here to stay.
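
If you want to sanity-check that math yourself, here's the back-of-the-envelope version (the 8ms CPU figure is just the illustrative number from above, not a measurement):

Code:
#include <cstdio>

int main() {
    const double frame60 = 1000.0 / 60.0;  // ~16.6ms budget per frame at 60fps
    const double frame30 = 1000.0 / 30.0;  // ~33.3ms budget per frame at 30fps
    const double cpu = 8.0;                // game logic cost, same either way
    printf("GPU budget at 60fps: %.1fms\n", frame60 - cpu);  // ~8.7ms
    printf("GPU budget at 30fps: %.1fms\n", frame30 - cpu);  // ~25.3ms
    printf("GPU headroom ratio: %.1fx\n",
           (frame30 - cpu) / (frame60 - cpu));               // ~2.9x
}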

What if you run at a frame rate between 30 and 60fps?

As frame rate goes up, latency goes down. The CPU is checking your controller inputs, and the faster it can get those results to your screen - and the sooner it can get on to reading the next set of inputs - the better the latency.

But because the screen does 60 updates a second, some of your frames appear on screen longer than others. This causes judder. It's not a frame drop, but it feels like one, where you see a frame for multiple ticks of the screen, then new frames every tick, then back to waiting a few ticks.

Because you have more frames, smoothness increases. Smoothness is a bit of a misnomer, to me, because you're getting more frames of data, but you're also getting judder. I find it very unsmooth, but some folks would rather have the extra frames over the judder.

You get screen tearing. If you're running at a frame rate that doesn't fit smoothly into the 60Hz tick-tock of the screen, eventually you will be writing to the buffer while the screen is reading it. That causes tearing, where the top of the screen is showing one frame and the bottom of the screen is showing another. It's especially noticeable during side-to-side camera movement, and less of an issue when the camera is static.
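
Here's a toy simulation of that judder, assuming a game that finishes a frame every ~22ms (about 45fps) on a 60Hz screen. Frames can only appear on a vsync tick, so hold times alternate between one tick and two:

Code:
#include <cstdio>

int main() {
    const double vsync = 1000.0 / 60.0;  // screen tick every ~16.6ms
    const double render = 22.0;          // game frame time, ~45fps
    double ready = 0.0, shown = 0.0;
    for (int frame = 1; frame <= 6; ++frame) {
        ready += render;                     // when the game finishes this frame
        double tick = shown;
        while (tick < ready) tick += vsync;  // wait for the next screen tick
        printf("frame %d appears at %6.1fms (previous held %4.1fms)\n",
               frame, tick, tick - shown);
        shown = tick;
    }
    // Output alternates ~16.7ms and ~33.3ms holds: that's the judder.
}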

Doesn't variable refresh rate fix this?
Sorta! Variable refresh rate basically says that the screen will hold off on updating itself if it hasn't seen a buffer update, and then will update itself Just In Time when there is one. Usually there is a limit to how much flexibility in timing the screen has, but VRR will eliminate the screen tearing issue. It can't eliminate judder, however.

What about a frame rate cap?
The idea of a frame rate cap is that you set an artificial limit on how often you render frames, so that your experience runs at a locked rate, with no judder or tearing.

The basic way a frame rate limiter works is that it lets the game go as fast as it wants; then, once the game has completed a frame, the limiter will lie about the GPU still being busy until the last millisecond. The CPU part of the game waits until the GPU completes (which is being artificially slowed down by the frame limiter), the screen updates, and then the CPU goes off again.
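
A minimal sketch of that limiter logic, with hypothetical simulate()/render()/present() stand-ins for a real engine's steps, might look like this:

Code:
#include <chrono>
#include <thread>

void simulate() {}  // CPU: inputs, hit detection, physics, animation
void render() {}    // CPU submits GPU work; GPU draws to the back buffer
void present() {}   // driver flips the finished buffer to the screen

int main() {
    using clock = std::chrono::steady_clock;
    const auto target = std::chrono::microseconds(33333);  // 33.3ms per frame
    auto deadline = clock::now() + target;
    for (int frame = 0; frame < 90; ++frame) {  // ~3 seconds at 30fps
        simulate();
        render();
        std::this_thread::sleep_until(deadline);  // the "GPU is still busy" lie
        present();
        deadline += target;  // schedule the next 33.3ms boundary
    }
}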

In practice, frame limiters are really tricky. I won't dig too much into why, but one reason is that the more complex engines become, the more they want to run at least some CPU operations while the GPU is going, which leads to some complex interactions between all the various systems. You don't want to get into a case where one part of the game logic runs at an unlocked frame rate and the rest runs at a locked frame rate.

So, what the hell is wrong with the Unity frame limiter?
Real quick, let's talk about the difference between dropped frames and bad frame pacing.

A dropped frame is when your game can't do all the things it needs to do in the allotted frame time, so the display doesn't get updated. Over the course of a second, you get 29 frames instead of 30.

Bad frame pacing is a subtler situation where you get 30 frames every second, but the frames are on screen for an inconsistent amount of time. Instead of getting a new frame every 33.3ms, you get one frame in 16.6ms, then a second frame in 50ms, and then another frame in 33.3ms. Think of it this way: a 60Hz screen runs on a tick-tock timer. A 60fps game updates frames on both the "tick" and the "tock"; a 30fps game is supposed to update just on the "tick".

Bad frame pacing is when you update on the tick most of the time, then you miss a tick, update on the "tock" to catch up, and eventually swing back to the tick again. This is the Unity engine problem. Even for a game that has no problem hitting >30fps all of the time, Unity will sometimes fail to correctly apply CPU back pressure to slow down the game, or will fail to update the buffer in a timely fashion, or both.
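
The difference is easy to see in the numbers. Both of the made-up frame-time sequences below average 30fps, but the second one is badly paced, mirroring the 16.6/50/33.3ms example above:

Code:
#include <cstdio>
#include <vector>

int main() {
    const std::vector<std::vector<double>> runs = {
        {33.3, 33.3, 33.3, 33.3},  // well paced: every frame holds two ticks
        {16.6, 50.0, 33.3, 33.3},  // badly paced: one tick, then three
    };
    for (const auto& run : runs) {
        double total = 0, longest = 0;
        for (double ms : run) {
            total += ms;
            if (ms > longest) longest = ms;
        }
        // Same average frame rate, very different worst-case hold time.
        printf("avg %.1fms/frame, longest hold %.1fms\n",
               total / run.size(), longest);
    }
}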

Bad frame pacing doesn't cause screen tearing, fortunately, but it still causes judder just like an unlocked frame rate, without any of the extra smoothness or latency reductions.

WayForward's sorta clever solution
According to DF, Advance Wars runs the frame rate limiter in cases where the player has control of the camera, and runs an unlocked frame rate when they don't.

When you're moving the camera, that's when you're going to notice screen tearing the worst, and because the map is not exactly rich in animation, there is little lost detail when running at a lower frame rate. So despite Unity's frame pacing judder, running the frame rate cap here makes sense.

In combat, the camera is static but animation detail increases: tearing isn't an issue, and the elaborate character animations WF has provided can run with all the extra smoothness of the higher frame rate.

TL;DR
Devs will always want to have 30fps on the table as an option, no matter the strength of the hardware.

Frame rate limiters are necessary to get high quality 30fps options.

Unity has an especially bad frame limiter (and, historically, so does From Software, which is even worse).

In some cases, devs may choose to go with "unstable" frame rates as preferable to Unity's bad frame rate limiter, even when 30+fps is well within their grasp performance-wise.
 
30 fps is 3x the power of 60 fps, and that's why 30 fps won't die
Is a 60 FPS cap any easier or harder to implement than a 30 FPS cap?
 
Checknate.
i c wut u did there 👁️

Naming it the Switch Pro would just lead to confusion about whether or not it's their next generation of hardware, after so many years of Switch Pro rumors. They'll probably pick an entirely new name based on whatever new gimmick they advertise around (if the gimmick they focus on is 5G they'll call it the Nintendo Connect, if it's AR they'll call it the Nintendo Reality, etc.), but also make sure to list backwards compatibility on the feature list and in the presentation where they specify which games are coming to it.
I think their surveys may have come back showing that Super or 2 is the way to go. But I doubt they would name it something without Switch in it. The form factor will remain the same. They just need to constantly hammer home that this device is a successor.
 
i c wut u did there 👁️


I think their surveys may have come back showing that Super or 2 is the way to go. But I doubt they would name it something without Switch in it. The form factor will remain the same. They just need to constantly hammer home that this device is a successor.
Why do we assume there will be surveys, and not a bunch of suits gathered around a table arguing until the guy who agrees with the richest shareholder wins despite being outvoted 10:1?

It's going to have a corporate name. I don't even know if Super or 2 are realistically in the running, even though I really want Super! I'd also be happy with ^2 (Squared).
 
Why do we assume there will be surveys, and not a bunch of suits gathered around a table arguing until the guy who agrees with the richest shareholder wins despite being outvoted 10:1?

It's going to have a corporate name. I don't even know if Super or 2 are realistically in the running, even though I really want Super! I'd also be happy with ^2 (Squared).
I’m Team Switch^2 myself :p

I was actually thinking more of a focus group than a survey. A survey would be too open.
 
Many people are complaining about Tears of the Kingdom being $70 in the US, but between discounted eShop cards at Costco and the Voucher Program, I'm only paying $50 for TotK!
I got it for less than $50 after tax. While the eShop could simply have lower prices, hunting for price cuts on eShop cards and using Gold Points and vouchers can really add up to huge savings. I think I saved something like 30 bucks on pre-orders this year alone? It's gotten to the point where digital is cheaper and more convenient than physical.
 
That perfect cube - that does nothing - is about to be the single biggest failure in the history of personal computing.
If you're talking about NeXT then I'd have to strongly disagree. Did it sell well? No. But I don't see how you can say it wasn't a success when Apple snatched up the company and used what they had built for the rebirth of the Mac. Steve Jobs was a pimp.
 
Switch 2 with this logo

[image]
Swiitch?
 
I got it for less than $50 after tax. While the eShop could simply have lower prices, hunting for price cuts on eShop cards and using Gold Points and vouchers can really add up to huge savings. I think I saved something like 30 bucks on pre-orders this year alone? It's gotten to the point where digital is cheaper and more convenient than physical.

I'm with you there. My daughter got 3 physical games for her birthday and it's driving me crazy keeping track of all the cartridges and switching them in and out all the time. I really hate the cartridge slot on the Switch. I hope they somehow make it easier to load games in and out on the next Switch. Do we really need the flap? This thing isn't rated for water resistance anyway.
 
I am so bored
You and me both. I've resorted to prompting Bing AI to generate rumours about it just for kicks. Something I've learned is that it's aware of the rumours, of Famiboards, and of the leaked details about T239. It even has an idea of the timeline and the Linux commits. Even if I don't mention Nintendo, it seems to understand that T239 and Nintendo are somehow related. I doubt it means anything, but I find it fun!
 
You and me both. I've resorted to prompting Bing AI to generate rumours about it just for kicks. Something I've learned is that it's aware of the rumours, of Famiboards, and of the leaked details about T239. It even has an idea of the timeline and the Linux commits. Even if I don't mention Nintendo, it seems to understand that T239 and Nintendo are somehow related. I doubt it means anything, but I find it fun!
Yeah I find Bing’s hallucinations pretty wild and fun. Like look at this

[image]
 
If you're talking about NeXT then I'd have to strongly disagree. Did it sell well? No. But I don't see how you can say it wasn't a success when Apple snatched up the company and used what they had built for the rebirth of the Mac. Steve Jobs was a pimp.
It was a quote from the movie... Woz (Seth Rogen) says it.

great movie
 
Hey Famiboard Family! This is Mike Odyssey,

No more leaks or rumors from me.

I don't like this label I am getting lately as a leaker/insider rumor source. In all honesty, I just want to make content that makes people happy. Those who have followed me for a while know this to be true. I don't shy away from my disability and I work hard to inspire others. So I've decided that any unreleased information I get from now on, I will either delete or forward to my closest YouTube friends in the field. I'm not interested in being the first with news. I am interested in making people happy, that is all. You guys have one less lunatic to worry about when it comes to leaks lol. I just want to make content. Hope you can check out my vids sometime. You might like what I have to offer. Take care everyone!

I made a video stating there would be an announcement/event within the April 12 to 14 timeframe, before the trailer was announced on the 12th and released on the 13th. To be honest, I don't want to be considered a leaker/rumor/insider person. I just want to make content that inspires people. I honestly couldn't care less about being first with news when I have a disability that is leaving me blind lol. I just released a post on here stating that I was no longer sharing info in this manner. Only official news, and rumors yes, but only the ones already out there.
It's all good man. You came off as a nice and decent person in that video, and you labelled the info as a rumour coming from a source, to be taken with a grain of salt. You didn't claim anything else. You didn't go overboard or crazy like some other tubers do. You were as level-headed as possible. As long as you label things right you should be fine. Like you said, you already got one right from your source (same source?), so keep it going. Let the days be the judge. And take care of yourself. I hope you get better.

And thanks for the explanation ☺️
 
In all fairness, I can see them naming it something outlandish like Ultra Switch, but who knows. They're not as daring under Furukawa.
 
Oh it’s 100% gonna be called something insane that nobody could ever guess but will somehow make sense and then slowly grow on us
 
30 fps is 3x the power of 60 fps, and that's why 30 fps won't die

Why is it that things like frame pacing and screen tearing didn't seem to be an issue prior to the PS3/360 generation? Framerate drops were surely common, but I do not remember screen tearing and frame pacing issues back on the PS2/GC.
 
Why is it that things like frame pacing and screen tearing didn't seem to be an issue prior to the PS3/360 generation? Framerate drops were surely common, but I do not remember screen tearing and frame pacing issues back on the PS2/GC.
I think interlacing, analogue outputs and CRTs may have had something to do with that.
 
Is a 60 FPS cap any easier or harder to implement than a 30 FPS cap?
In theory it's the same. In practice, a 60fps cap is less complicated, partly because frame pacing issues get smaller if you have a 120Hz display, and partly because the latency issues a cap can introduce tend to hide behind the higher frame rate.
 
Is it safe to assume Metroid Prime 4 gets the Twilight Princess/BotW treatment and launches on both Switch and Switch 2?

I am not sure what comes after Pikmin 4 aside from Pokemon DLC, but I feel that game would make a great showcase for new hardware.

Wouldn't be surprised if TotK gets an enhancement patch or a Master/GoTY version or something.

Don't think 3D Mario makes launch, but it could be a launch window title; at least, I hope it is.
 


I would love to believe Nintendo thinks Metroid Prime 4 is worthy of being a cross-generation highlight in lieu of Zelda.

I’m only expecting a Switch (1) game for now tho.
My opinion is this game could be a good contender for a cross-gen launch title.
I can see this game as a launch release (maybe launch window); it's been a long time since the game was rebooted.
 