• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Slight digression, sorry: does anyone know at what point Switch EDEV became available to people with a “normal” level of developer access? I use EDEV as an example of something more affordable. Basically, I'm trying to figure out when NG kits might become available to us peons, and I couldn't find the original Switch kit timeline anywhere. Thanks!
 
So there's ...
  • Gamescom - August 23-27
  • PAX West - September 1-4
  • Possible Direct before TGS
  • Tokyo Game Show - September 21-24
  • Nintendo quarterly results - November
  • The Game Awards - December
Anything else relevant this year?
They're not mentioning Switch 2 until early 2024 at the earliest, IMO. Then a full reveal in spring/summer and a release in autumn/winter 2024.
 
One curious thing I noticed recently is that both the Steam Deck (see here) and the ROG Ally (here), which each have 16GB of LPDDR5 on a 128-bit bus, use four 32-bit 4GB parts instead of two 64-bit 8GB parts. I would have assumed that an 8GB 64-bit module would be cheaper than two 4GB modules, but perhaps that's not the case. Of course, there could be other reasons for using four modules instead of two (for instance, improved heat dissipation), but given that RAM would be one of the most expensive components after the SoC, I would have expected them to go with the cheaper option, particularly in the Steam Deck's case, where they're working with little to no margin.

Nintendo used a pair of 32-bit 2GB modules instead of one 64-bit 4GB module on the Switch (and all subsequent revisions), so they're no strangers to using a larger number of lower-capacity LPDDR parts. Thinking about this, I suspect it's a limitation of the memory controllers. I believe all TX1-based products use a pair of 32-bit modules, and AMD's LPDDR5 controllers may also be built for 32-bit parts. Looking at the Orin Jetson boards, they all appear to use 64-bit LPDDR5 modules, so if T239 is using the same memory controller, it would seem that Switch NG may also have to use 64-bit parts. Which is pretty much what I expected anyway, but it never occurred to me that it might be enforced by the memory controller.
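If anyone wants to sanity-check the bus maths, here's a quick Python sketch; the data rates are illustrative assumptions, not confirmed specs for any of these devices:

```python
# Peak bandwidth (GB/s) = (total bus width in bits / 8) * data rate in GT/s,
# where total bus width = number of parts * per-part width.
def peak_bandwidth_gbs(parts: int, part_width_bits: int, gtps: float) -> float:
    return (parts * part_width_bits / 8) * gtps

# Steam Deck / ROG Ally style: four 32-bit LPDDR5 parts (128-bit bus) at 5.5 GT/s.
print(peak_bandwidth_gbs(4, 32, 5.5))  # 88.0 GB/s
# Two 64-bit 8GB parts: same total width, so the same peak bandwidth.
print(peak_bandwidth_gbs(2, 64, 5.5))  # 88.0 GB/s
# Original Switch for comparison: two 32-bit LPDDR4 parts (64-bit bus) at 3.2 GT/s.
print(peak_bandwidth_gbs(2, 32, 3.2))  # 25.6 GB/s
```

The part count doesn't change the bandwidth at all, only the total width does, which is why the two-versus-four choice comes down to cost, board space, thermals, and what the memory controller supports.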
 
One curious thing I noticed recently is that both the Steam Deck (see here) and the ROG Ally (here), which each have 16GB of LPDDR5 on a 128-bit bus, use four 32-bit 4GB parts instead of two 64-bit 8GB parts. […]
So what capacities exist in 64-bit LPDDR5 modules? Would using two 64-bit modules rule out certain capacities?
 
According to Nightdive, WB owns the NOLF IP.

They found out that WB owns it, but when asked, WB pretended not to know and rejected their license request.
It's just complicated. The game was released by Fox Interactive with Sierra as publisher. Sierra was Vivendi and is now Activision. Fox Interactive was actually also sold off to Vivendi and should also belong to Activision now. But who knows, maybe some shenanigans happened and the rights stayed with Fox, so it's possible that some copyrights now reside with Disney.
Monolith Productions (not Soft!), the studio that created it, was meanwhile sold off to Warner Bros., which is currently most likely holding all the rights. But there is the potential that either Activision (soon MS) or Disney still owns something of this franchise. To clear this up, they would need to activate their lawyers, and that is most likely just way too expensive to kick off for a franchise no one expects any revenue from.
 
It's somewhat difficult to directly compare Nintendo's best work to a team like Naughty Dog. Nintendo has basically been working with hardware a generation behind Naughty Dog ever since the PS3 hit the market. While a game like Zelda TotK isn't going to wow anyone in 2023 based on its visuals, on a technical level it had developers commenting on how impressive it was. The physics system was already impressive, very impressive relative to the hardware it's on, but then they threw in a crazy crafting system that was probably a nightmare to keep from causing countless game-breaking glitches. I believe Nintendo is very competent when it comes to technical prowess, but even better at managing their resources. So many developers let the game get away from them and try to find optimizations towards the end of development, often coming up short of maintaining the target framerate. Nintendo more often than not hits its desired framerate and holds it.



I think it's fair to point out that Naughty Dog is a single team, not a publisher with lots of teams. Is Naughty Dog more technically capable than most Nintendo teams? Yes, but Nintendo has a lot of teams and a few of them are top tier. Still, the PS5 is out and exclusives for the hardware are going to be a thing. No matter how talented Nintendo's teams are, they can't make up for a 6 TFLOP deficit and CPU performance that is less than half. Nintendo will make some terrific-looking games, but on a technical level they can't be on the same level as Sony's top games; how could they, when they have less to work with?

I believe Nintendo, like other developers out there, plays the waiting game, sits in the wings, and observes what others are doing before acting on how to get the most out of hardware. But since they've been “behind” in power since the PS3 era, the initial impression seems to be that Nintendo has no idea how HD development works, let alone 4K, DLSS, or ray tracing development.

I personally believe it's not so clear-cut, as evidenced by what they can develop on such limited hardware; case in point, TOTK, as you mentioned earlier. The physics system alone is probably a masterclass in game development, and in how you can really optimize and polish a game's code to simply make things work. We're talking about a physics system that operates on only THREE physical CPU cores (on top of everything else the CPU has to calculate), and there's a possibility they're using some of the GPU's silicon to assist with that too. I can't remember a time outside of Nintendo's own IPs when a massively large open-world title simply just “worked.” Now in fairness, a lot of that is due to Nintendo's commitment to polish, and to only launching a game when it's ready rather than being subjected to beancounters upstairs who are trying to meet some deadline or quota for their end-of-year bonus.

That said, we know from others out there, either directly in the industry or even enthusiasts, that new techniques are always being discovered and utilized. Take, for example, Randy Linden, the man responsible for reverse-engineering Doom (1993): he built his own game engine for the SNES and, using the power of the Super FX 2 chip, managed to get the game running on it. There's a fantastic interview with Digital Foundry from almost a year ago that is a great deep dive into game development; he even mentions hearing of a new technique from a programmer that may actually give you more performance with Doom on SNES today (skip to just before the 15-minute mark for the relevant part).



Of course, that was in the 90s, during the height of the SNES's popularity. We've seen since then that enthusiasts keep finding new ways to either optimize an existing game on a platform or develop a brand-new title that works on native hardware. As for the latter, there was Micro Mages for the NES, which, for an actual NES title, looks absolutely incredible for 8-bit.



As for the former, there is the wizard that is Kaze Emanuar, who “fixed” Super Mario 64's source code to get it running on real hardware at a full 60 fps. He has since built his own levels that run on real hardware using the techniques he has developed over time, but here's the video where he explains achieving triple the FPS in Mario 64:



I know you've seen these videos before, GT, but for others, this is a great way to spend an hour seeing what was possible back then, and what is possible now given discoveries made after the fact.

I suppose what I'm getting at is that just because Nintendo, or other developers for that matter, don't have the latest and greatest hardware doesn't mean they're necessarily behind the competition. More than likely, they're discovering new techniques in the background for optimizing their games even further than most developers managed back in the day. We've seen what Zelda TOTK can be on hardware that is effectively a PS3.5. Imagine what they can do with a PS4.5.

As a side note, had the GCN been more popular, I think a game like GTA3 would've been possible despite the limited storage, either by using multiple game discs or even a single disc. Same goes, I believe, for other games that at the time were deemed too large to fit within the constraints of the GCN's 1.5GB disc. And who knows, maybe some enthusiast has thought of that very thing. The same is true for the Wii U. Yes, its hardware was “outdated” by the time it launched, but that doesn't mean it was particularly slow. It did not have the grunt of the PS4, but I think through some optimizations even a game like Doom 2016, for example, would've been theoretically possible, a sort of Doom-on-SNES moment for the Wii U. But that's just me saying this without any programming experience. Since the Wii U is now more of a system for the homebrew community, I wonder if we may see certain games or demos down the road. It's interesting how many late-gen PS3 titles really were too much for those systems, but I'd be curious how some of these titles would've fared with a more modern GPU, more available RAM, plus the eDRAM pool.
 
So what capacities exist in 64-bit LPDDR5 modules? Would using two 64-bit modules rule out certain capacities?
In theory, Nintendo could get a manufacturer to make any capacity they want. So nothing is completely ruled out.

Apple shipped 225 million iPhones last year alone. In practice, Nintendo probably isn’t large enough to drive custom part prices down for it to be worth it - the phone market has pushed the margins down to the bottom already.

It doesn't seem like there are any 4GB modules in production; the smallest are 6GB. 12GB seems pretty close to a lock to me.
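As a quick illustration of why 12GB looks like the floor (taking "the smallest in production is 6GB" above as the premise; the larger capacities listed are assumptions, not a confirmed parts catalogue):

```python
# Total RAM from two 64-bit LPDDR5 modules, for assumed per-module capacities.
assumed_capacities_gb = [6, 8, 12, 16]  # assumption based on the post above
for cap in assumed_capacities_gb:
    print(f"2 x {cap}GB -> {2 * cap}GB total")
# With no 4GB parts available, 2 x 6GB = 12GB is the smallest possible configuration.
```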
 
I wonder what happened to Fox Interactive properties...properties like No One Lives Forever...
No One Lives Forever is in IP limbo (Warner Bros., Fox, and Activision all might or might not own part or all of it), one of the last major 90s/00s PC games not to get a modern re-release/port/remake. System Shock 1 and 2 were in a similar position for a while, where somehow an insurance company owned the IP, or at least part of it. But that series is higher-profile, so it was easier to resolve.
 
I believe Nintendo, like other developers out there, plays the waiting game, sits in the wings, and observes what others are doing before acting on how to get the most out of hardware. […] As a side note, had the GCN been more popular, I think a game like GTA3 would've been possible despite the limited storage, either by using multiple game discs or even a single disc. […]


Good points. Nintendo is forced to come up with more solutions and new techniques that reduce performance demands because they are working with less capable hardware than their competitors on PlayStation and Xbox. Nintendo filed a few patents relating to techniques they implemented in Zelda TotK; I believe one was specific to how Ascend worked.

No question GTA3 could have been on GameCube, seeing as True Crime: Streets of LA and True Crime: New York City were both on GameCube. The capacity limitation of GC discs is an exaggerated problem. The primary reason the console underperformed was that it was marketed towards kids and looked a bit too Fisher-Price. Nintendo now realizes that if they market to young adults, the kids want to be there just the same. Sony had the head start and the momentum coming off the PS1, and on top of that Microsoft added additional competition. Nintendo had an image problem in the late '90s and early 2000s, and, as long as they don't screw it up, it's largely been resolved with the Switch.

The Wii U would have been considered capable hardware if it had released in 2010 and had software tools anywhere near as good as the Switch's. Instead, the Wii U came out in 2012, way too late, and with terrible software development tools, making it a nightmare for developers to get the most out of the hardware. My nephews will oftentimes boot up Black Ops 2 on my Wii U when they are over, and every time I see that game I am shocked at how well the visuals hold up. The framerate in single-player wasn't perfect, but multiplayer was nearly locked to 60fps. I have been playing Batman: Arkham Origins on my Wii U for the past week on the GamePad; same thing, the visuals hold up surprisingly well. With that said, the Wii U was too little, too late, and developers never had the incentive to really optimize their games for the hardware. Most games didn't even seem to take advantage of the fact that it had twice the memory of the 360/PS3. I remember hearing complaints about the lack of a hard drive for faster streaming of assets, a grievance that didn't make sense to me because they should have been able to cache that data in the extra 512MB of memory.
 
One curious thing I noticed recently is that both the Steam Deck (see here) and the ROG Ally (here), which each have 16GB of LPDDR5 on a 128-bit bus, use four 32-bit 4GB parts instead of two 64-bit 8GB parts. […] Thinking about this, I suspect it's a limitation of the memory controllers. […]
Sorry to quote you twice, but I had a thought.

If it is indeed just the memory controller, then why wouldn't they fix it for Mariko? Mariko got a new memory controller anyway, to support LPDDR4X. If there was money to be saved by going with fewer 64-bit modules, and no significant drawback, Mariko would have been an opportunity to correct that.

Maybe, as you said, it was for thermal reasons.
 
Nintendo had an image problem in the late '90s and early 2000s,

[image]
 
Sorry to quote you twice, but I had a thought. If it is indeed just the memory controller, then why wouldn't they fix it for Mariko? […]
Unless I'm mistaken, that was likely to maintain compatibility by keeping the number of channels the same.
 
If it is indeed just the memory controller, then why wouldn't they fix it for Mariko? Mariko got a new memory controller anyway, to support LPDDR4X. If there was money to be saved by going with fewer 64-bit modules, and no significant drawback, Mariko would have been an opportunity to correct that.
In the case of Mariko, I suspect that 64-bit would have been a bad choice. 64-bit RAM tends to be at a premium, and going single-channel instead of dual-channel might have been less power efficient.

I think Thraktor's question is: why not go with 32-bit parts on Drake? And I think he's right that the memory controller dictates it. Your question, why go with that memory controller, I suspect comes down to two factors.

First, Nvidia has invested a lot of money and time in their memory controllers to make them maximally power efficient. If Orin had a good design, reusing Orin's likely saved dev time. Second, if they'd gone 32-bit, their only real options would have been 8GB of RAM (4x2GB modules) or 16GB (4x4GB modules). Or, I suppose, 2x12GB modules at half the bus speed (because only two channels instead of four).

8GB would probably be too tight on RAM, 16GB would be too expensive, and a narrow bus would be repeating the one big technical downfall of the Switch. Dual-channel 64-bit hits a sweet spot of cost, RAM size, and RAM speed. That they could reuse Orin's memory controller was a nice bonus.
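To put rough numbers on that trade-off, here's a small sketch of the configurations discussed above (the 6GB per-module capacity on the 64-bit route is an assumption for illustration, not a confirmed part):

```python
# (label, number of parts, per-part width in bits, per-part capacity in GB)
options = [
    ("4 x 2GB, 32-bit", 4, 32, 2),
    ("4 x 4GB, 32-bit", 4, 32, 4),
    ("2 x 6GB, 64-bit", 2, 64, 6),  # assumed capacity, for illustration
]
for label, parts, width_bits, cap_gb in options:
    print(f"{label}: {parts * cap_gb}GB total on a {parts * width_bits}-bit bus")
```

All three land on a full 128-bit bus, but only the 64-bit route hits 12GB: more than the too-tight 8GB option and cheaper than the 16GB one.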
 
One curious thing I noticed recently is that both the Steam Deck (see here) and the ROG Ally (here), which each have 16GB of LPDDR5 on a 128-bit bus, use four 32-bit 4GB parts instead of two 64-bit 8GB parts. […] Looking at the Orin Jetson boards, they all appear to use 64-bit LPDDR5 modules, so if T239 is using the same memory controller, it would seem that Switch NG may also have to use 64-bit parts. […]
Actually, the Jetson Orin Nano is using two 32-bit LPDDR5 modules (here and here).

So I assume Drake can theoretically also use 32-bit LPDDR5/LPDDR5X modules if Nintendo so chooses.
But I imagine space is at a premium, especially if Nintendo wants the motherboard for their new hardware to be similar in size to the OLED model's motherboard (here and here). (This is assuming that Drake can hypothetically support 32-bit LPDDR5/LPDDR5X modules.)

Edit: Actually, I think ServeTheHome's picture is the 8GB variant of the Jetson Orin Nano, not the 4GB variant. Never mind.

Anyway, although off-topic, the good news is Locuza's a believer.
 
Simpsons: Hit and Run 2 👀

First, I'm not “people”. My expression of disgust was more at the idea that Nintendo EPD haven't accomplished anything which holds up to Naughty Dog and Guerrilla on a technical level, when they've surpassed both, and at the narrative of them being behind the industry and playing perpetual catch-up, when they've been leaders on technical levels, among others. I also find it hella wild that you're telling me, someone in creative events, that I've mistaken “art style” for “technical prowess”. All I wrote was “URGH”. So the fact that you took THAT much from it, which is wrong, by the way, then tried to techbrosplain to me is telling, hilarious, and too cute. In the case of Guerrilla especially, please don't take my word for it when I tell you Breath surpassed Horizon: Zero Dawn on multiple counts in 2017. I'll let a past post and the video speak. Whatever. As You Were.
This video is worth more than 1000000 words.
 
Brazil Game Show 2023 - October 11-15

Nintendo announced their presence; they will be one of the event's major sponsors, and the organizers announced it will be the biggest stand ever at BGS.
Brazilian devs can also hold meetings and business talks there regarding development for Nintendo's platforms.

EDIT for reference: last year, Nintendo's stand was 1,000 m².
Nintendo Live - September 1-4

(They won't reveal anything there but it's a noteworthy event.)

Is Nintendo Live Japan still happening? Couldn't find it.

It seems Nintendo is returning and/or expanding their presence pretty much everywhere.
It seems like the right thing to do, whether or not they're announcing new hardware.

They're increasing brand awareness.

They're milking their current platform to the last drop.
They're sitting on a huge install base of very active users and have a great portfolio of amazing games with wide appeal.
Rarely have we seen such a situation. Any investment in marketing could see huge returns in the short term.

When they announce new hardware, they'll have a lot of eyes on them and a lot of cash on hand.

All next year's events will be very interesting.
 
I believe Nintendo, like other developers out there, plays the waiting game, sits in the wings, and observes what others are doing before acting on how to get the most out of hardware. […]


Whether Nintendo has been technically less capable than competitors is debatable. Personally I really don't think so.

But I kind of agree that they've historically been working with less capable hardware while observing what competitors do with bleeding-edge tech.
But that's about to change.

Some consumers seem to be shifting to handheld devices and the industry is taking notice. The Switch is a success and Steam handhelds are now the base target for most multi-platform games. That puts a soft-cap on the industry.

Its partnership with Nvidia puts Nintendo at the forefront of emergent tech and rendering techniques. No competitors have access to dedicated ML hardware or advanced RT acceleration. Beyond DLSS, that partnership may extend to physics, voice recognition/generation, behavioral AI, etc.

It will be very interesting to watch what Nintendo does, both software and hardware-wise, in the next decade.
 
They're not mentioning Switch 2 until early 2024 at the earliest, IMO. Then a full reveal in spring/summer and a release in autumn/winter 2024.
That's what I'm predicting/expecting in regard to Nintendo's next hardware: announce it in summer of next year for a fall/holiday release.
 
Whether Nintendo has been technically less capable than competitors is debatable. Personally I really don't think so.
I don't think it's really that debatable. I think detractors are just confusing rendering features with hardware. While Nintendo jumped into physically based rendering fairly late, their headliner, Zelda, was chock-full of modern rendering features, as well as a bunch of methods you don't really see in other games. Nintendo tends to adapt high-end features and scale them down to their hardware.
 
Actually, the Jetson Orin Nano is using two 32-bit LPDDR5 modules (here and here). […] Anyway, although off-topic, the good news is Locuza's a believer.

Who is Locuza?
 
In an hour, we'll have a THQ Nordic presentation.


I totally expect 100% of the announcements there to be released on the successor. Their Switch support has been spectacular; some of their miracle ports were among the best. They're gonna surpass themselves on the successor.

Alone in the Dark should be a launch title. They released the console-pushing GBC game on NSO; that is definitely a hint of things to come.
 
In an hour, we'll have a THQ Nordic presentation. […]

I just want to see the TimeSplitters trilogy ported to modern consoles.
 
I just want to see the TimeSplitters trilogy ported to modern consoles.
That's Deep Silver.

But THQ Nordic has Second Sight, another game from David Doak, a creator of TimeSplitters, GoldenEye and Perfect Dark. They promised to remaster it in 2021; maybe it's finally ready? They released it on Steam last year.
 
That's Deep Silver. […]

THQ Nordic owns Deep Silver.

Apparently I was mistaken.
 
Higher upscaling factors take longer and introduce more scaling artifacts. So it's not just the difference between 1440p and 4K output; it's also about delivering a high-enough-quality base image that the upscaling works well.
I wouldn't consider "more scaling artifacts" a big minus. It's going to be scaled to the screen resolution one way or another, and using two (or more) different methods of scaling to get there probably results in more types of artifacts simultaneously.
Integer scaling is artifactless, so there are cases where there is no penalty for upscaling "twice".
Even if it's not the same by definition, I'd put "all the pixels are huge and chunky" in the same category as scaling artifacts as bad things to happen to an image.
All I'm seeing is 60fps games won't have much upscaling. Better pray they hit some integer scale of 2160p
I mean... the next integer scale step down would be 1080p60, like a lot of Switch games already are. But the higher res the screen is, the less noticeable off-integer scaling is. Something like 1300->2160 would still look a lot better than Switch games that do 900->1080.
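The scale-factor comparison is easy to eyeball in code; here's a quick sketch using the resolutions mentioned above:

```python
# Vertical scale factor from render resolution to display resolution.
# Integer factors map each source pixel to an exact NxN block, so no filtering artifacts.
def scale_factor(src_h: int, dst_h: int) -> float:
    return dst_h / src_h

for src_h, dst_h in [(1080, 2160), (1300, 2160), (900, 1080)]:
    f = scale_factor(src_h, dst_h)
    kind = "integer" if f.is_integer() else "fractional"
    print(f"{src_h}p -> {dst_h}p: {f:.2f}x ({kind})")
```

1300p to 2160p is fractional, but at 1.66x the resampling error is spread over a much denser 4K pixel grid than 900p to 1080p's 1.20x, which is why the former reads as the cleaner image.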
Am I dumb to think Nintendo can just rely on Switch 1 if the Switch 2 tanks?
Little bit? It might have a PS2 effect of giving Switch 1 a longer tail since PS3/Switch 2 isn't taking over properly, but it's more the silver lining of the cloud than something a company can rely on for long.
Love this idea and surprised we haven't seen Nintendo do something like this before.
Game Boy Micro and New 3DS not-XL both did such things.
 
THQ Nordic owns Deep Silver.
They're separate entities; Embracer owns all of them, plus Gearbox, Saber, etc.

They do sometimes exchange IPs between them. In 2020, THQ gave Red Faction to Deep Silver, and Deep Silver gave Risen in return. THQ released Risen on Switch in January this year.
 
In an hour, we'll have a THQ Nordic presentation. […]
I'd love to see Call of the Wild on Switch or Switch 2.
 
Good points. […] Most games didn't even seem to take advantage of the fact that it had twice the memory of the 360/PS3. I remember hearing complaints about the lack of a hard drive for faster streaming of assets, a grievance that didn't make sense to me because they should have been able to cache that data in the extra 512MB of memory.

What's funny is they had the 1GB of memory, plus that fast pool of eDRAM, which was only 32MB. Most developers never took advantage, like you said. But I think the same held true for the Xbone in many cases with its eSRAM pool.

I want to say developers just don't like split pools of RAM because it means they have to segregate their resources based on where each pool is needed, correct? And that with a unified pool, you can just dump your resources and optimize from there?

And I could see the family-friendly marketing angle to some extent, but on the other hand, the GCN was never short of teen- or adult-oriented content either. The perception of Nintendo, though, probably didn't help things, so I can see where you're coming from with that.

Whether Nintendo has been technically less capable than competitors is debatable. […] It will be very interesting to watch what Nintendo does, both software- and hardware-wise, in the next decade.

I made a post earlier about this, and it had to do with the advances in mobile hardware, ARM, and what this means for gaming. Sony and Microsoft are using x86 right now, but I anticipate that over the next decade or so, Arm-based gaming devices will take center stage as the primary driver.

Apple has already done this with their own Arm-based silicon, and Nintendo is the only one of the big three to have made the transition too, which also reflects how far the advances have gone over the last decade.

Even in terms of enterprise-level computing, more Arm-based hardware is heading that way, which could be huge for both efficiency and power consumption.

There's also RISC-V, which I'm not too familiar with, but I'd imagine that'll get some use as well. I feel x86 will start declining in the near future, but I could be wrong. Legacy support will of course be around for decades, so even if Arm becomes the dominant architecture, x86 systems will still get support for years afterwards, I'm sure.
 
In an hour, we'll have a THQ Nordic presentation. […]
Wonder if it'll be a DQ11 situation, like when Square showed that off for the first time and confirmed it for the NX.
 
I want to say developers just don't like split pools of RAM because it means they have to segregate their resources based on where each pool is needed, correct? And that with a unified pool, you can just dump your resources and optimize from there?

And I could see the family-friendly marketing angle to some extent, but on the other hand, the GCN was never short of teen- or adult-oriented content either. The perception of Nintendo, though, probably didn't help things, so I can see where you're coming from with that.

Wii U's eDRAM was a bit more appropriate compared to Xbox One's. The 32MB on Wii U is enough to store the frame buffers for a 720p image, and thus developers could do all their rendering directly in the eDRAM. 32MB is not enough to hold all the frame buffers at 1080p or even 900p, which meant they would have to use a workaround such as tiling. Remember that the Wii U had low main memory bandwidth, something like 12GB/s, so the eDRAM really did do its job of alleviating the bandwidth burden of rendering.
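A rough back-of-the-envelope check on that 32MB figure, assuming 4 bytes per pixel per render target (the actual formats and target counts varied per game):

```python
# Size in MB of one full-screen render target at 4 bytes per pixel.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

for width, height in [(1280, 720), (1920, 1080)]:
    one = target_mb(width, height)
    # e.g. colour + depth + a couple of auxiliary targets (an illustrative mix)
    print(f"{height}p: {one:.1f}MB per target, {4 * one:.1f}MB for four (eDRAM: 32MB)")
```

At 720p, four targets use about 14MB, leaving plenty of eDRAM headroom; at 1080p the same four targets already eat roughly 32MB, i.e. essentially the entire pool, hence workarounds like tiling.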

On GC there was quite a bit of M-rated content that came to the console, but it always sold worse than those titles did on PlayStation or Xbox. The Capcom deal was really an attempt to broaden the appeal of the console, but it was never able to shake off the Fisher-Price lunchbox perception among the mainstream audience.
 
I think Nintendo will be fine launching in the holidays. It's not some new revelation. Remember, Nintendo wanted a March release for the Switch; they said they needed to make sure games were ready.
 
Actually, the Jetson Orin Nano is using two 32-bit LPDDR5 modules (here and here). […] Anyway, although off-topic, the good news is Locuza's a believer.

I'd love for Locuza to do a die annotation for Orin. I'm working on one right now but Locuza is far more skilled.

Slightly off-topic, but I'm having trouble finding the posts about the internal storage standard likely for T239. Would appreciate someone linking me to them!
 
Watch his video; it's more of an op-ed saying not to launch in the holiday timeframe.
I will give it a watch from start to finish. I tend to really dislike YouTube talking-head opinion pieces, where people often no more qualified than us just say things with authority, but I will give it a chance, assuming that isn't really what this is.
I think Nintendo will be fine launching in the holidays. It's not some new revelation. Remember, Nintendo wanted a March release for the Switch; they said they needed to make sure games were ready.
Yeah, I think one of the major factors must be when a suitable selection of games will be ready.
 
Why, didn't PS4 launch in November?

It did, but I agree with what he is saying. Consoles no longer need the holiday season for a successful launch. Consoles see a spike in sales for the holiday, no question, but at launch a console is always supply-limited, so a holiday launch compounds the issue by combining the massive launch demand for a new console with the increased demand of the holiday season. September would make for a good launch month: get the initial wave of hardcore gamers out of the way, then move on to the more mainstream audience as they enter the holiday season in November. Even under this scenario, Nintendo would need to start manufacturing at the very beginning of 2024 to make sure adequate supply is available. I am convinced that they can move 20 million units very fast if they have the supply.
 
Actually, the Jetson Orin Nano is using two 32-bit LPDDR5 modules (here and here). […] Anyway, although off-topic, the good news is Locuza's a believer.


I'd love for Locuza to do a die annotation for Orin. […]
Pardon the ignorance, but what is "die annotation"? Is this just looking at the die and breaking it down piece by piece? I am not exactly sure what Locuza is saying or who they are, but that Tweet has me excited for some reason lol.

It did, but I agree with what he is saying. Consoles no longer need the holiday season for a successful launch. […]
Oh for sure, I never thought consoles needed to launch during the holidays to receive some kind of bonus on sales; I just never regarded it as a bad thing. But in terms of the already sparse inventory at launch, I can totally see how it has very little upside, since in a worst-case scenario it can take a console years to become readily available. The Switch, for example, launched in March 2017 and probably wouldn't have seen any more success with a holiday launch, given how tough it was to get one at launch.

I am crossing my fingers for a good supply at launch, although, the realist in me still expects to see some shortages lol.
 
Pardon the ignorance, but what is "die annotation"? Is this just looking at the die and breaking it down piece by piece? I am not exactly sure what Locuza is saying or who they are, but that Tweet has me excited for some reason lol.
No worries at all! Yeah, so it's looking at a high-resolution photo of the die and then identifying the various logic blocks on it, like CPU cores, cache, accelerators, etc.

Here's an example; this is the IO die for Zen 4:
 
It did, but I agree with what he is saying. Consoles no longer need the holiday season for a successful launch. […]
This!

Releasing in November, having to supply day-one buyers and casuals, is a perfect setup for scalpers.

I would also add that by releasing in summer with one major launch title, then releasing another every 2-3 months, they would reach the holidays with multiple major games, probably in very distinct genres, to maximize casual appeal.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.