• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Yeah, now that they have the Lite V2 and OLED to kinda take the lower price brackets, they're fine to go for 500 in the first year and then move it down... but I don't think they will.
They seemingly try not to change the price of hardware too much once it's announced, so I assume they will target a price they feel won't be too high 1-2 years from now; €400-450 seems fine.
I can't see Nintendo releasing a console above $400, honestly. Their target demographic largely consists of casual gamers and families. I would personally be fine with even a $500 console, and many of us here probably would be if it's really good hardware. But most people I meet already think $300 is a lot of money for their Animal Crossing or Mario Kart device.
 
That used a sensor bar, but yes, the same concept can be applied to the TV/screen of the device using a modern full-colour camera. All the sensor bar is is two bright IR LEDs. Using a normal camera sensor, the controller could lock onto any object in the room as its "point of reference" instead of relying on a horrid, overly complicated, messy, annoying sensor bar, or one built into the dock that limits how it can be set up.

People who want the sensor bar back haven't used one in a WHILE; it was no panacea, and it was a PAIN to set up compared to the Joy-Con's no-setup motion controls.

It's not hard for a sensor to detect "bright rectangle = TV = forward". Even low-end mobile phones can do AR nowadays without an IR blaster as a point of reference.
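For what it's worth, the detection step really is cheap. Below is a minimal sketch of the "bright rectangle = TV = forward" idea using OpenCV; the brightness threshold, minimum area, and camera index are illustrative assumptions, not anything derived from actual Joy-Con hardware.

Code:
# Minimal sketch: find the TV as the largest bright quadrilateral in frame.
# Threshold (200), minimum area (5000 px), and camera index 0 are assumptions.
import cv2
import numpy as np

def find_tv(frame: np.ndarray):
    """Return the pixel centre of the largest bright quadrilateral, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # bright regions
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 5000.0  # ignore small glints and reflections
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        area = cv2.contourArea(approx)
        if len(approx) == 4 and area > best_area:  # four corners = rectangle-ish
            best, best_area = approx, area
    if best is None:
        return None
    m = cv2.moments(best)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(find_tv(frame))  # offset from the frame centre ~ pointing direction
cap.release()

The offset of that centre from the middle of the camera frame is essentially the same "where am I pointing" signal the Wii derived from the two sensor-bar LEDs.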

I am not asking for another sensor bar, I would like a camera similar to modern VR camera setups.
 
I am not asking for another sensor bar, I would like a camera similar to modern VR camera setups.
So, on the controller.


Because "modern VR camera setups" tend to put the cameras on the person. Which on a controller would be... well, the controllers.

Unless you mean a stationary unit like lighthouses or Oculus Rift sensors? So... like the sensor bar?
 
So do the VR headset companies spend a bunch of money on cameras for no reason, and it doesn't actually help at all with tracking motion?
Nope, they put them on something the person has on them, though, so they don't have to use some fixed point of reference.

So, on the controllers themselves?
 
How expensive is inside-out tracking currently?
Well, given even low-end phones can do it, not very? It's not like it needs PSVR2 levels of tracking accuracy. Meanwhile, EXTERNAL tracking remains supremely expensive because, well, it's more hardware. A pair of Valve Lighthouses can cost in the $100 range, no controllers included.

Meanwhile, the Joy-Con (R) already has most of the components necessary for inside-out tracking; it just wasn't designed to do it. It has a camera, its own processor, motion sensors, a gyroscope, all the "bits", just not put together. It would be an R&D challenge, sure, but it would be absolutely possible to slip in some decent inside-out tracking.

Plus, that way they wouldn't have to do something monumentally backwards like a sensor bar. Why have the remote output light and an awkward stationary unit do the positional tracking when the controller can just... do both, as the Nintendo Switch already does! Nintendo Labo's Variety Kit included a mode that used the IR motion camera and gyro sensor on the Joy-Con (R) to pretty much DO inside-out tracking for the racetrack creation tool. This is a solved problem; give it a backup motion camera, and give the Joy-Con (L) a pair too! Then it just has to be supported at the system level.
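To make the "controller does both" point concrete, here's a toy complementary filter of the kind such tracking would need: integrate the gyro for fast updates, then pull toward an absolute camera fix whenever the camera sees its reference object. All rates and the 0.98/0.02 blend factor are illustrative assumptions, not Nintendo's or Labo's actual implementation.

Code:
# Toy complementary filter: gyro dead-reckoning corrected by occasional
# absolute fixes from a camera. All numbers below are purely illustrative.
from typing import Optional

def fuse_yaw(yaw: float, gyro_rate: float, dt: float,
             camera_yaw: Optional[float], alpha: float = 0.98) -> float:
    yaw += gyro_rate * dt                 # integrate gyroscope (fast, drifts)
    if camera_yaw is not None:            # camera saw the reference object
        yaw = alpha * yaw + (1 - alpha) * camera_yaw  # bleed off the drift
    return yaw

# Simulated second of tracking: gyro at 200 Hz, camera fix every 10th sample.
yaw = 0.0
for i in range(200):
    cam = 15.0 if i % 10 == 0 else None   # pretend the camera reports 15 deg
    yaw = fuse_yaw(yaw, gyro_rate=2.0, dt=0.005, camera_yaw=cam)
print(f"fused yaw after 1 s: {yaw:.1f} deg")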
 
Nintendo could also potentially benefit from future proofing with the migration from Armv8 to Armv9, as well as have access to higher amounts of L3 cache, if Nintendo decides to use the Cortex-A710 or the Cortex-A715.

The question is: does Nintendo see any worth in using the Cortex-A710 or the Cortex-A715?

Based on a review from Geekerwan, the Cortex-A710 has noticeably worse power efficiency with respect to performance than the Cortex-A78, regardless of the foundry company being used.
[Graph from Geekerwan: performance vs. power curves showing the Cortex-A710 trailing the Cortex-A78 in efficiency]


And based on another review from Geekerwan, although the Cortex-A715 makes significant improvements to power efficiency relative to the Cortex-A710, it still can't quite match the Cortex-A78 at lower TDPs; it only starts to beat the Cortex-A78 at TDPs of roughly 1.25 W and above.
[Graph from Geekerwan: performance vs. power curves for the Cortex-A715, Cortex-A710, and Cortex-A78]



I think the Geekerwan videos are very informative, and they've done as good a job as you could reasonably do demonstrating the efficiency of these cores, but there are a couple of caveats to be kept in mind when interpreting them. The first is that power consumption is implementation-dependent, and the same core on the same process can behave differently in different chips. As a case in point, compare the A710 on the Dimensity 9000 on TSMC N4 in the first graph to the A710 in the Snapdragon 8+ Gen1 on TSMC N4 in the second graph. Both identical CPUs on identical processes, but the Snapdragon appears to have a meaningful improvement in efficiency over the Dimensity 9000, closing the gap quite a bit on the A78 in the Dimensity 8100. A variety of factors could play into this, including the cache configuration, the binning of each chip, etc.

Another one to keep in mind is that they're measuring "Platform Power" rather than the power consumed purely by the CPU core itself. There's nothing inherently wrong with this, but it can include various uncore power consumption which may vary from SoC to SoC independently of the CPU core. This probably wouldn't have a huge effect, relatively speaking, at the higher end of the curve, but at lower power consumptions it may throw off the results somewhat.
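To see why that matters more at the bottom of the curve, here's a toy calculation with made-up numbers; the 0.5 W uncore floor and the scores are pure assumptions for illustration.

Code:
# Hypothetical numbers only. "Platform power" = core power + uncore power
# (memory, rails, etc.). A fixed uncore floor distorts perf/W comparisons
# most at low power, where it's a bigger fraction of the total.
UNCORE_W = 0.5  # assumed constant uncore draw

def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

for name, score, platform_w in [("core_a", 1000, 1.25), ("core_b", 1000, 1.50)]:
    plat = perf_per_watt(score, platform_w)
    core = perf_per_watt(score, platform_w - UNCORE_W)
    print(f"{name}: platform {plat:.0f} perf/W, core-only {core:.0f} perf/W")

# core_a looks 1.2x as efficient as core_b on platform power,
# but 1.33x as efficient once the assumed uncore floor is subtracted.

So two cores that look fairly close on platform power can sit further apart (or closer) in core-only terms, and the effect shrinks as you move up the curve.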


Thanks, I didn't see that it was officially confirmed. I'd still expect A78 cores, but with Nvidia seemingly moving to Neoverse cores for everything else, there's always the off-chance that Nvidia would go with the N2 for T239.

I forgot to mention there was an article from the Financial Times saying SoftBank wants to mandate that companies that buy Arm-based SoC designs from other companies (e.g. Nvidia) pay Arm a device royalty fee.

And assuming Qualcomm's claim is true (that after 2024, companies with a Cortex technology licence can't use SoC designs mixing Arm IP and non-Arm IP, only designs with exclusively Arm IP), and assuming that restriction applies to companies paying Arm a device royalty fee, then an SoC design pairing Neoverse N CPU IP with Nvidia GPU IP is likely off limits to Nintendo.

One way to bypass that restriction, as I've mentioned here, is if Nintendo used a custom Arm based CPU design from Nvidia for SoCs designed after 2024. Nvidia has been working on a custom Arm based CPU design codenamed Estes.


As part of the termination of the ARM acquisition, Nvidia has ended up with a 20 year ARM license agreement, so they'll be able to work on those license terms until at least 2040. Incidentally, while I do fully expect Nvidia to move over to their own ARM CPU designs in the next few years, they will likely be HPC designs aimed to compete more with Xeon and Epyc than smartphone SoCs. It's entirely possible that an off-the-shelf ARM core may be more suitable for Nintendo's purposes. Of course this presupposes a lot about what Nintendo do after [redacted], whether they stick with Nvidia, move to a very different form-factor, etc., so I'm probably getting a bit ahead of myself.

I should clarify that I meant to say the main point of the patent has nothing really to do with Thunderbolt.

As far as I know, the patent only uses Thunderbolt as an example, not necessarily that Nintendo's looking into using Thunderbolt.

As I said before, I don't think Nintendo's going to use Thunderbolt anytime soon, because it would probably need to pay a third-party lab to certify that it follows Intel's Thunderbolt specifications.

There's a possibility Nintendo could take a look into supporting USB4 40 Gbps. But nothing's guaranteed on that front.

And I don't think Nintendo's going to support external GPUs via USB4 40 Gbps anytime soon due to the technical issues highlighted by ILikeFeet.

I'll start by saying that I agree that any kind of "GPU in the dock" idea is both a significant technical headache and would add far more cost than it's worth. However, in the unlikely event they wanted to do something like this, Nintendo wouldn't need either Thunderbolt or USB4.

The important thing to note is that the communication between the Switch and the dock is handled by a proprietary USB-C alt mode. The provision for this was included back in the first version of the USB-C specification, and it works just like any other alt mode, except that Nintendo has no obligation to be compatible with any other devices. They get 8 pins (4 differential pairs) to work with, and they can send whatever kind of signals they want over them. It just so happens that the DisplayPort alt mode already did everything Nintendo needed for the original Switch, so to avoid lots of custom parts they just co-opted that and wrapped it inside their proprietary alt mode.

If Nintendo wanted to send a lot of data back and forth to the dock, and had a custom chipset in the dock like a GPU to communicate with, then there's no reason to use a protocol like Thunderbolt or USB4. Both of these are relatively expensive because they need to reliably transmit data over several metres of cable, whereas Nintendo only needs to transmit through the USB-C port itself. There's an existing standard designed for high bandwidth over short distances: PCIe. Nintendo could literally just wire two lanes of PCIe 4.0 to the USB-C port and get similar bandwidth to USB4 or Thunderbolt without any of the extra hardware. The total trace length from Switch SoC to dock chip would be the same or shorter than most PCIe traces in desktop PCs, and although there would be some impedance mismatch from the actual USB-C connector itself, Nintendo has control over the physical connectors used, so I'd imagine they could make it work.
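As a quick back-of-envelope check of that claim, using public signalling rates (and ignoring protocol overheads beyond line encoding):

Code:
# PCIe 4.0 runs at 16 GT/s per lane with 128b/130b line encoding.
lanes = 2
pcie_gbps = lanes * 16.0 * (128 / 130)
print(f"PCIe 4.0 x{lanes}: {pcie_gbps:.1f} Gbit/s usable")  # ~31.5 Gbit/s

# USB4 Gen 3x2 / Thunderbolt 3+ signal at 40 Gbit/s raw, before protocol
# overhead, so two PCIe 4.0 lanes land in the same ballpark.
print("USB4 / Thunderbolt: 40 Gbit/s raw")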
 
I can't see Nintendo releasing a console above $400, honestly. Their target demographic largely consists of casual gamers and families. I would personally be fine with even a $500 console, and many of us here probably would be if it's really good hardware. But most people I meet already think $300 is a lot of money for their Animal Crossing or Mario Kart device.
Oh, I can't see $500 from a similar perspective... Then again, that demographic is not the one that buys at launch.
The first 10-20 million were mostly pushed by core fans for Zelda, 3D Mario, and Xeno 2. After that, the price becomes a problem.
So if they reduce it by $50-100 after the first 18 months, yeah, $500 can work. But since they're against price reductions, they won't choose that strategy.
 
$300 absolutely is a lot of money to the average person.

Personally, my cats would want me to buy them a new Nintendo console.
Yep. To me it's worth it because I'll spend thousands of hours on it, but most people aren't nerds or fanboys.

Something I've been wondering about: how will they differentiate the box art? Will the logo be different enough? Will it still be red? Will the size stay the same?

Or will they maybe even go the Xbox route and have one universal case design, with a text box or emblem saying "Switch 2 enhanced/exclusive"?
 
That used a sensor bar, but yes, the same concept can be applied to the TV/screen of the device using a modern full-colour camera. All the sensor bar is is two bright IR LEDs. Using a normal camera sensor, the controller could lock onto any object in the room as its "point of reference" instead of relying on a horrid, overly complicated, messy, annoying sensor bar, or one built into the dock that limits how it can be set up.

People who want the sensor bar back haven't used one in a WHILE; it was no panacea, and it was a PAIN to set up compared to the Joy-Con's no-setup motion controls.

It's not hard for a sensor to detect "bright rectangle = TV = forward". Even low-end mobile phones can do AR nowadays without an IR blaster as a point of reference.
The advantage of bringing back an IR sensor bar is that it's simple and draws very little power, so Nintendo could easily add a wireless one that detaches from the dock and recharges via USB-C at very little cost, especially since they're already spending money to put IR cameras on the right-hand Joy-Cons.

Other cameras or magnetometers would have advantages but would be much more expensive.
 
You described how it was appealing to you. You can't argue that something being appealing is an objective fact... you can only say why you think that thing is appealing. Which you did, and I appreciate your viewpoint... but I see no merit in continuing to discuss such things.
Many people judged the Wii U by how it looked, and I feel like its aesthetic contributed a lot to the narrative that Nintendo consoles are just for little kids. I don't think that was the main reason for its failure, but when your hardware already has many issues in the first place, it doesn't help that it also looks like a Fisher-Price toy.
 
As part of the termination of the ARM acquisition, Nvidia has ended up with a 20 year ARM license agreement, so they'll be able to work on those license terms until at least 2040.
I was talking more about Nintendo than Nvidia since as mentioned, Nvidia's probably fine with the 20 year Arm licence.

But assuming Nintendo has to pay Arm a device royalty fee starting in 2024, my question is: can Nintendo buy Arm-based SoC designs from companies where the design mixes Arm IP and non-Arm IP (e.g. an Arm CPU with an Nvidia GPU)?
Or, as a condition of paying Arm a device royalty fee, can Nintendo only buy Arm-based SoC designs with exclusively Arm IP?
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
* Hidden text: cannot be quoted. *
So we can expect a June/July announcement with a Holiday release?
 
Happy Easter, everyone. I mindlessly scrolled this board too often so I gave it up for Lent. I was (mostly) successful, but I did check it a few times. But even on those occasions…I held fast not checking this thread.

So please. It’s been over 40 days. Please…tell me there’s good news.
 
Happy Easter, everyone. I mindlessly scrolled this board too often so I gave it up for Lent. I was (mostly) successful, but I did check it a few times. But even on those occasions…I held fast not checking this thread.

So please. It’s been over 40 days. Please…tell me there’s good news.
Check two posts above yours. 😅
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
* Hidden text: cannot be quoted. *


I hope
 
Check two posts above yours. 😅
Lol, what are the odds? I clicked the button to go straight to the newest post, thinking "I'm dozens of pages behind, no way I'll catch anything!" That's exciting, but I'm trying not to get my hopes up too much, again...

I saw a stray mention on Reddit weeks back about dev kits going out? Was that recent, old, reliable, yay or nay? What happened/what were this thread’s thoughts?
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
* Hidden text: cannot be quoted. *
Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.
 
Nintendo won't release what doesn't exist, there is no Drake, no T239, no Switch 2, just the Switch until the end of the decade and WE WILL LIKE IT :mad:
 
If the Xenoblade Chronicles saga is over with this DLC, then it'd make sense for Monolith's new cash cow to be the X universe. There's a story to be told, and they can address the issues the game had via an HD remaster. Not saying it's definite, but it's possible.

Though I give more validity to the first part of the leak than the rest of it.
 
I sometimes find this thread difficult to read because of randoms who don't seem to be contributing but do seem antagonistic.
 
* Hidden text: cannot be quoted. *
Oh my god, are you sure? I mean I guess that could work but I think it needs to be explained really well or else it’ll do worse than the Virtual Boy. Idk, Nintendo does wacky things but this might take the cake as the craziest one of all.
 
Meanwhile, the Joy-Con (R) already has most of the components necessary for inside-out tracking; it just wasn't designed to do it. It has a camera, its own processor, motion sensors, a gyroscope, all the "bits", just not put together. It would be an R&D challenge, sure, but it would be absolutely possible to slip in some decent inside-out tracking.
Honestly, removing the camera from the right Joy-Con and adding a regular one plus an IR? ToF? sensor alongside it at the front of the tablet would probably be much better anyway. Not only would games like the eating minigame in 1-2-Switch still work, but you'd also have better tracking.
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
* Hidden text: cannot be quoted. *
Jeez, I don't think they needed to be banned. They weren't being too hostile or anything, just stubborn at best.
 
Oh my god, are you sure? I mean I guess that could work but I think it needs to be explained really well or else it’ll do worse than the Virtual Boy. Idk, Nintendo does wacky things but this might take the cake as the craziest one of all.
Idk, I think that's a bit of dooming. Yeah, it's not a great decision, but Virtual Boy? C'mon, it'll at least do Wii U numbers.

EDIT: Dammit.
 
Just when I'm about to go to sleep, stuff starts going again. I'm still trying to wrap my head around that Persona leak! 😫
 
Thanks for sharing. This patent isn't suggestive of an eGPU, though. In the patent description and drawings, the "USB, Thunderbolt, or other general standard" ports are located on the back of the dock; they aren't the connection between the Switch and the dock. Furthermore, the inclusion of "Thunderbolt" alongside other connector standards is there to broaden the embodiments of the patent. It's doubtful that Nintendo actually intended to utilize the standard.

What this patent really covers is a "swivel block" on the back of the dock that houses the HDMI, USB, and other external connectors. It mainly solves the problem of a TV sitting on the opposite side from the dock's HDMI port. In that scenario, the HDMI cable is bent into a U shape, creating a torque that can potentially pull the dock and console onto the floor. By swiveling the HDMI port to face the TV, the cable no longer needs to be bent. You might think that's a small problem, but Nintendo obviously thought otherwise and probably had the repair data from its service depots to justify a solution.

The clever design notwithstanding, any product manager worth their salt would tell you that the solution appears too complex and costly. It isn't a surprise that Nintendo solved it with a couple of simpler modifications on the OLED Model: the dock's back cover is now removable, and the HDMI cable is made softer, lessening the torque when bent.
 
As soon as one disputatious member was banned, another popped right up. Regardless of whether sock puppets were involved, the effects on the thread are the same. I personally prefer to ignore or report rather than engage. Back to hardware speculation:
* Hidden text: cannot be quoted. *
Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.
 
The main question I've been asking myself is if there's any evidence that T239 is intended for other customers alongside Nintendo. The support being added to L4T (and upstreamed to the Linux kernel) suggests that it's not an exclusive product, and in theory could mean it would be taped out before Nintendo requires it (or even if Nintendo cancelled the device intended to use it). That would require at least one significant customer other than Nintendo being lined up for it, though, and I can't see any realistic situation where that's the case. The lack of modules like the PVA makes it unsuitable for automotive use-cases, which is effectively Nvidia's only other market for SoCs outside Nintendo. The only other possible customers I can imagine would be in ARM Windows laptops, but that's still a very small niche, and the lack of games compiled for ARM would negate Nvidia's biggest advantage, GPU performance. The use of ARM big cores (ie A78/A710/etc.) would also put them behind the likes of Qualcomm in CPU performance, who are already behind Intel and AMD.
IMO the bolded is a flawed assumption a lot of us jumped to early on, and I no longer believe it's true. I think Nvidia works on their Linux stack for everything: as a starting point for development, as a basis for simulation and automated testing in the future, and as future-proofing in case they ever do decide to productize it as a Jetson board or whatever. There's also a reason why Linux development specifically would be relevant to the Switch and the upcoming hardware: the nvservices GPU driver in the Switch firmware is largely derived from Nvidia's Linux GPU drivers.

We actually have very strong evidence that Linux support does not indicate plans for a non-Nintendo product: the May 2016 release 24.1 of L4T contains several commits referencing Odin, i.e. the Switch('s motherboard). The first of these is from April 2015, shortly after the TX1 Switch project was kicked off. A later commit even mentions the SDEV 0.9 devkit, and another one casually includes the specs for the Switch's LCD.
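For anyone who wants to reproduce that kind of archaeology, a case-insensitive search of an L4T kernel checkout's history is enough; the repo path below is a placeholder, but git log --grep is standard git.

Code:
# Search a local L4T kernel checkout's commit history for a codename.
import subprocess

def grep_commits(repo_path: str, keyword: str) -> str:
    return subprocess.run(
        ["git", "-C", repo_path, "log", "--all", "-i",
         f"--grep={keyword}", "--oneline"],
        capture_output=True, text=True, check=True,
    ).stdout

print(grep_commits("path/to/l4t-kernel", "odin"))  # placeholder path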
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.