• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

@Concernt

Hmmmm... what can I say, perhaps I was just lucky with my setup, I don't know... never ever had any issues with the wiimotes, and I still use them to this day so it's definitely not about fuzzy memory on my part.

Accessibility is obviously a plus so please don't misunderstand me, I'd happily welcome whatever option they can come up with as long as it doesn't mess with muscle memory, which is the second huge issue of the joycon "pointer" solution (first being gyro drift, but I guess it could be argued the two are mutually related somehow)

I'm just puzzled about your own claim of Switch being all about convenience; or should I say, I can certainly agree that's the end goal, a goal which has mostly been achieved, just not particularly well regarding this very specific control scheme IMHO. On the other hand, I think it can be argued pointer is definitely still a thing with Nintendo, considering it has been continuously implemented for almost 20 years. So I'd be quite happy if they could find a better implementation for those of us who prefer a more reliable, muscle-memory friendly (that's the key!) scheme --> btw while I'd personally prefer this being the one and only default control, I do understand it would realistically only be implemented as one of many different options (due to the accessibility issues mentioned before), and I'd be ok with that :giggle:

PS: thanks for the trackpad explanation, didn't know about that patent
 
(Completely unrelated, but now there's only 70,000 posts to page 4000! Well, technically 69,998.)


Wii Sensor Bar is... misremembered, I feel. It was nowhere near as reliable as people recall, and it's not just that recalibration wasn't necessary; if something went wrong, you weren't ALLOWED to recalibrate. It was a camera pointing at two IR LEDs on a stick, and with the low bitrate of the Wii Remote's Bluetooth connection, the motion data from the sensor bar/camera combo was jittery, smoothed out by software fixes in individual games.
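The per-game smoothing being described is typically something as simple as an exponential (one-pole) low-pass over the raw pointer samples; a minimal sketch, with an illustrative alpha value (nothing from any actual game):

```python
def smooth_pointer(samples, alpha=0.2):
    """Exponentially smooth jittery (x, y) pointer samples.

    alpha near 0 = heavy smoothing (laggy but stable),
    alpha near 1 = light smoothing (responsive but jittery).
    """
    sx, sy = samples[0]
    out = [(sx, sy)]
    for x, y in samples[1:]:
        sx += alpha * (x - sx)
        sy += alpha * (y - sy)
        out.append((sx, sy))
    return out

# Jittery readings around (100, 100) get pulled toward a stable point:
raw = [(100, 100), (104, 97), (98, 102), (101, 99)]
print(smooth_pointer(raw)[-1])  # stays near (100, 100)
```

The trade-off is the usual one: the heavier the smoothing, the more the cursor lags the hand, which is why it had to be tuned per game.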

It was definitely plain inconvenient, having to route another cable to play the console, but there's so much more to it. With gyro, more broadly "inside out" motion, you don't need to lift your hand and point at the screen. You press the calibrate button wherever you're comfortable and can, with small movements, have very accurate control of the camera. We'd lose that, and that's also an accessibility feature, not having to lift the remote up so it can see the screen. Nintendo also doesn't realistically have the cachet to dictate how people organise their living rooms; it wouldn't be a hype new feature, it would bring back an inconvenience.

I think the most significant problem is that Nintendo Switch simply isn't a home console, it's a hybrid. Inside-out and gyro tracking can work in all modes, beacon based or sensor bar tracking cannot. An inconvenience that reduces the viability of handheld mode? That takes serious engineering effort to integrate into tabletop mode? One they moved on from, one they put in the Wii U to a deafening silence?

It's not 2006 anymore, better options are absolutely available for tracking controllers, without locking it to one mode, and if then, only when you have your setup organised right.

Nintendo Switch is all about convenience, an experience that just works without hassle, and that is 100% part of the appeal. Requiring reorganisation of entertainment centres or an external accessory with another wire or set of batteries to worry about goes against that, it just doesn't align with the brand.
sry if it has been asked before, but what would be the drawbacks of getting the tablet to receive IR info and recalibrating from there? it would work on both tabletop and docked. maybe put the IR camera along the shoulder buttons?
 


Likely fake, but thought I'd drop it here.

march 14th for wilds

If this is real we may have our release date : p

big confidential text isn't a great sign but hey the logos look great.


Definitely fake:

  • If that Resident Evil 9 logo is real, whoever approved it needs to be fired.
  • The Dragon's Dogma 2 DLC is set for Black Friday? Nothing gets released the day after Thanksgiving (a US holiday).
  • Why would the timeline go as far back as 2022? If this was made in early 2022, there is no way they would've nailed all of those release dates in a row like that.
The lack of anything Mega Man makes it somewhat believable though.
 



sry if it has been asked before, but what would be the drawbacks of getting the tablet to receive IR info and recalibrating from there? it would work on both tabletop and docked. maybe put the IR camera along the shoulder buttons?
On the Wii, the sensor bar and Wii U GamePad did not "receive IR info".

If you mean implementing it so the console itself has a sensor bar type device, then... Well.

Look where that got the Wii U GamePad, in design and popularity.
 
I'd happily welcome whatever option they can come up with as long as it doesn't mess with muscle memory, which is the second huge issue of the joycon "pointer" solution (first being gyro drift, but I guess it could be argued the two are mutually related somehow)

My solution is a camera on the upper bezel of the tablet and some IR LEDs on the joy-cons (probably as accessories that would come in the box)
With this setup body tracking would also be possible. And it would work on both docked and tabletop modes. Aaaand it's cheap.

Just don't tell Concernt I said that xDDD
 
@Concernt

Hmmmm... what can I say, perhaps I was just lucky with my setup, I don't know... never ever had any issues with the wiimotes, and I still use them to this day so it's definitely not about fuzzy memory on my part.

Accessibility is obviously a plus so please don't misunderstand me, I'd happily welcome whatever option they can come up with as long as it doesn't mess with muscle memory, which is the second huge issue of the joycon "pointer" solution (first being gyro drift, but I guess it could be argued the two are mutually related somehow)

I'm just puzzled about your own claim of Switch being all about convenience; or should I say, I can certainly agree that's the end goal, a goal which has mostly been achieved, just not particularly well regarding this very specific control scheme IMHO. On the other hand, I think it can be argued pointer is definitely still a thing with Nintendo, considering it has been continuously implemented for almost 20 years. So I'd be quite happy if they could find a better implementation for those of us who prefer a more reliable, muscle-memory friendly (that's the key!) scheme --> btw while I'd personally prefer this being the one and only default control, I do understand it would realistically only be implemented as one of many different options (due to the accessibility issues mentioned before), and I'd be ok with that :giggle:

PS: thanks for the trackpad explanation, didn't know about that patent
The thing is that your muscle memory is not the muscle memory of the populace at large. Your experience is not everyone's experience. Personally, playing so much Splatoon and its sequels, my muscle memory is far more attuned to gyro controls than IR pointer controls. If you want a gyro pointer that "feels" like an IR pointer, LG TV remotes make use of a technology called "Freespace" that does exactly that. A refinement on gyro, rather than a regression to IR pointing. The solution realistically has to work for almost everyone, practically, in every mode, something IR pointers cannot do, but improved inside-out tracking can.

I would also like to point out I never once denied Nintendo's implementation of pointers. In fact, it goes way further back to their early mouse implementations. However, times and technology have changed and so how it is implemented has changed. Even there, their most successful pointing device was the touch screen, something they maintain to this day.

I wouldn't like to see an effective "move backwards" in technology for the sake of marginal improvements in pointing functionality, when we can have global improvements to Joy-Con motion that can also be used for pointer controls.
 
My solution is a camera on the upper bezel of the tablet and some IR LEDs on the joy-cons (probably as accessories that would come in the box)
With this setup body tracking would also be possible. And it would work on both docked and tabletop modes. Aaaand it's cheap.

Just don't tell Concernt I said that xDDD
Cheap, in the BOM, SURE.

Design wise? Again, Wii U GamePad. There's a 'cost' there.

Having to set up the console centered beneath the TV facing the player? Eaugh. Who's doing that. Nintendo can't expect their super-convenience box to demand players rearrange the living room, as I keep saying.

Why can't we just have BETTER CONTROLLERS?
 
@Concernt oh, i was talking from the perspective of the controllers doing the grunt work (like what we have in Joy-Con R), with the console just there as an interpreter

i know the Wii Family just shines the fancy lights XD
 
That analysis is cool, but wouldn't the right thing to do on RTX Ampere be to isolate the performance of each RT core, to fully understand the time it takes to render ray tracing? Or am I wrong?
You are only half right :) It's not just about the number of cores, but also about their clock speeds - how fast those cores are running.

We have a way to combine that into one number - TFLOPS, which is what I used instead of cores.
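To make the cores-times-clocks point concrete, here's a rough sketch of how the usual FP32 TFLOPS figure is computed (the factor of 2 assumes one fused multiply-add per CUDA core per cycle, which is how Nvidia's marketing numbers are counted; the example inputs are purely illustrative):

```python
def tflops(shader_cores: int, clock_ghz: float) -> float:
    """Marketing-style FP32 TFLOPS: cores * 2 FLOPs/cycle (FMA) * clock."""
    return shader_cores * 2 * clock_ghz / 1000.0

# Same core count, different clocks -> very different TFLOPS:
print(tflops(1536, 1.0))   # 3.072 TFLOPS
print(tflops(1536, 0.5))   # 1.536 TFLOPS
```

Which is exactly why comparing core counts alone between two GPUs doesn't tell you much on its own.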
 
Cheap, in the BOM, SURE.

Design wise? Again, Wii U GamePad. There's a 'cost' there.

Having to set up the console centered beneath the TV facing the player? Eaugh. Who's doing that. Nintendo can't expect their super-convenience box to demand players rearrange the living room, as I keep saying.

Why can't we just have BETTER CONTROLLERS?

xDDDDD


Either you're right, or I'm right, or neither of us will get better motion controls at all. Only time will tell.

I just hope we get the same level of tracking that a standalone headset was able to achieve 4 years ago.
 
xDDDDD


Either you're right, or I'm right, or neither of us will get better motion controls at all. Only time will tell.

I just hope we get the same level of tracking that a standalone headset was able to achieve 4 years ago.
I sure hope "better controllers" is what wins...
 
Since GTC is coming up in a few hours, I wonder if we'll get any more information on Thor.
Possibly, especially since Arm announced the Neoverse V3AE almost a week ago, and Nvidia mentioned that Thor uses Arm Neoverse Poseidon AE for its CPU.

Speaking of GTC 2024, there's a rumour courtesy of SemiAnalysis about B100 being fabricated using TSMC's 4 nm* process node.

If that turns out to be true, then TSMC's 4N process node's a very long lasting process node for Nvidia. Fortunately, Nvidia's GTC 2024 presentation's today at 13:00 (UTC-07:00).

* → a marketing nomenclature used by all foundry companies
 
Definitely fake:

  • If that Resident Evil 9 logo is real, whoever approved it needs to be fired.
  • The Dragon's Dogma 2 DLC is set for Black Friday? Nothing gets released the day after Thanksgiving (a US holiday).
  • Why would the timeline go as far back as 2022? If this was made in early 2022, there is no way they would've nailed all of those release dates in a row like that.
The lack of anything Mega Man makes it somewhat believable though.
(image: Street Fighter logo)
 
RT cores don't render anything, shader cores do. A 3090 will be faster than a 3060 even with 24 RT cores because there are so many more shader cores
RT cores calculate the ray traversals and the resulting final color, then pass that information on so the GPU can display the correct colors for the scene. Without them, the GPU and CPU would have to spend their own time doing that type of calculation, which takes longer and results in fewer frames per second.

You are only half right :) It's not just about the number of cores, but also about their clock speeds - how fast those cores are running.

We have a way to combine that into one number - TFLOPS, which is what I used instead of cores.
Do the ray tracing cores enter into the TFLOPS calculation of that GPU?
 
If they had no basis on reality, then what was the point of the tests? They didn't just randomly pick some numbers, made tests, and then tell their superiors that nothing about them has any meaning. They're based on something. Not running on actual hardware doesn't mean years of work and analysis didn't help determine these numbers. They aren't output. They are input. Will those combos be exact on actual hardware? Unlikely, but they would also unlikely be way off.

* Hidden text: cannot be quoted. *
I appreciate that "if" is in caps. But it should be a little bigger: IF


No.
* Hidden text: cannot be quoted. *
* Hidden text: cannot be quoted. *
They were not directly tested, they were running on Windows on some RTX 20 or 30 dGPU. Not on Nintendo's OS and not on T239. The point of the test was to benchmark DLSS in relative terms, not absolute ones; we have no reason to conclude the clock speed numbers they chose -- which just needed to be locked to some value to get reproducible results -- were related to the wattage names, or to T239. @Z0m3le, you know this. If you want to draw a different conclusion, cool, but this is extremely important context and you should stop omitting it.

* Hidden text: cannot be quoted. *
Always nice to have a LiC jumpscare lol
 
The thing is that your muscle memory is not the muscle memory of the populace at large. Your experience is not everyone's experience.
Never claimed either.
And I'm not asking for an IR-based solution, just whatever allows me to reliably (i.e. no swearing required - that's moving backwards if you ask me! XD) handle a virtual pointer on screen.

But I do believe we can certainly agree on the fact that

1. current pseudo-pointer implementation is inherently, objectively worse in terms of muscle memory involvement, regardless of mine, yours, or any other people's experience due to - as far as I understand - unavoidable gyro drift;
2. a solution does not need to work in every mode, as multiple control schemes do exist and are widely adopted in this industry, the most popular one being likely the dualism of pad / keyboard+mouse

If not, I guess we'll just have to agree to disagree. I definitely do not want to change your or anybody else's mind, hope you can accept that. No hard feelings by the way! ^__^
 
Never claimed either.
And I'm not asking for an IR-based solution, just whatever allows me to reliably (i.e. no swearing required - that's moving backwards if you ask me! XD) handle a virtual pointer on screen.

But I do believe we can certainly agree on the fact that

1. current pseudo-pointer implementation is inherently, objectively worse in terms of muscle memory involvement, regardless of mine, yours, or any other people's experience due to - as far as I understand - unavoidable gyro drift;
2. a solution does not need to work in every mode, as multiple control schemes do exist and are widely adopted in this industry, the most popular one being likely the dualism of pad / keyboard+mouse

If not, I guess we'll just have to agree to disagree. I definitely do not want to change your or anybody else's mind, hope you can accept that. No hard feelings by the way! ^__^
While I agree on 2 (this has been the case before, such as with the touch screen), it's still the case that Nintendo would rather features benefit all modes.

On 1, I cannot possibly agree, my muscle memory is far better with Splatoon 3's gyro than the Wii's pointer. It's just not true that the pointer is strictly superior in this way, nor is it true that gyro drift is unavoidable, there's several systems that have been deployed to remedy it. Wii U GamePad had a magnetometer for autocalibration, for instance.

I maintain that I ABSOLUTELY want to change minds; I would rather people didn't have the wrong idea, like that gyro drift is somehow unfixable.
 


Here are at least the performance claims of this Tango emulator. It wouldn't work out of the box on a hypothetical Switch 3, since it seems fairly Linux specific, but a customized version could be used, should Nintendo choose to license it.

This is definitely not the only option that exists, though. Just off the top of my head, there's also dynarmic (which I believe was notably used by Yuzu) and probably qemu.
This answers what I want answered, that it should be fairly low overhead. I'm less worried about linux syscalls. At bare minimum, something like this could be used in a patch without having to rework a bunch of other code.
 
While I agree on 2 (this has been the case before, such as with the touch screen), it's still the case that Nintendo would rather features benefit all modes.

On 1, I cannot possibly agree, my muscle memory is far better with Splatoon 3's gyro than the Wii's pointer. It's just not true that the pointer is strictly superior in this way, nor is it true that gyro drift is unavoidable, there's several systems that have been deployed to remedy it. Wii U GamePad had a magnetometer for autocalibration, for instance.

I maintain that I ABSOLUTELY want to change minds; I would rather people didn't have the wrong idea, like that gyro drift is somehow unfixable.
I would argue that gyro control and pointer control wouldn't even be the same option in something like splatoon as they are fundamentally different ways of interacting with the game. (As in there would be an additional control scheme)
However they solve the pointer issue, I hope it feels better than Wii AND Switch
A game like skyward sword suffers because of it ... it sucks that we have a 60 FPS HD version of skyward sword only for it to still feel better on Wii.

I feel like nintendo has sort of an uphill battle trying to support all of its many control schemes over the years... it's probably one of the reasons Kid Icarus is stuck on 3DS
 
2024 sadly got delayed because Nintendo are freakish when it comes to having their main line games as polished as they want.
There is nothing "freakish" about that, in my opinion. An excellent thread recently opened on Fami reminds us, for example, that many of Mario Sunshine's problems stem from the fact that it had to be released in a hurry. I know the conversation here is focused on the hardware, obviously, but the hardware would be useless without the games.

So there's nothing sad about Nintendo delaying a release date. What is sad is to have launched the Wii U with NSMBU as a flagship, for example. Talking about games that are "as polished as they want" might suggest that this is a whim or a bad thing.
 
Let's say Drake is using TSMC 4N. That alone is by far the biggest improvement between Ampere and Ada. Backported from Ada we also have the clock gate thing (I don't know exactly what it is, but I know its benefit) and the media block (AV1). Only Ada's OFA would be missing. What other features Ada has that we probably wouldn't see making it into Drake's "Ampere" GPU?

I mean, at this point, why wouldn't they just use an Ada GPU?? That's something I never understood but never asked for opinions here. Are there any other features from Ada that would make it more expensive, and so they just opted for an "Ampere" GPU that's almost an Ada GPU?
Let's say Drake is using TSMC 4N. That alone is by far the biggest improvement between Ampere and Ada. Backported from Ada we also have the clock gate thing (I don't know exactly what it is, but I know its benefit) and the media block (AV1). Only Ada's OFA would be missing. What other features Ada has that we probably wouldn't see making it into Drake's "Ampere" GPU?

I mean, at this point, why wouldn't they just use an Ada GPU?? That's something I never understood but never asked for opinions here. Are there any other features from Ada that would make it more expensive, and so they just opted for an "Ampere" GPU that's almost an Ada GPU?

I believe that's why the SEC 8nm never left my thoughts. Because, for me, knowing that Drake has an Ampere GPU (could someone remind me where we got this confirmation?) was always making that connection with the node. If they chose to just go with TSMC 4N, then I don't understand why not go with Lovelace anyway. Would Nvidia charge more just because it was their next architecture? And is there still a possibility of it being Lovelace somehow?

I believe that's why the SEC 8nm never left my thoughts. Because, for me, knowing that Drake has an Ampere GPU (could someone remind me where we got this confirmation?) was always making that connection with the node. If they chose to just go with TSMC 4N, then I don't understand why not go with Lovelace anyway. Would Nvidia charge more just because it was their next architecture? And is there still a possibility of it being Lovelace somehow?
Because there's no Ada on Tegra yet? I think that's scheduled for 2025?
 
Wii U GamePad had a magnetometer for autocalibration, for instance.

On wii u I would reset the gyro like every 20 seconds. That's because my main input for aiming is the gyro, not the stick. So doing fast rotations would make the gyro accumulate errors too fast. In my 800 hours on splatoon, I can say the magnetometer never really did anything for me.

I don't recall swearing, but I maintain that I ABSOLUTELY want to change minds; I would rather people didn't have the wrong idea, like that gyro drift is somehow unfixable.

I think a big difference is that gyro works fine when holding the controller with two hands. But if you want to play wii style, then you're going to use the accelerometer just as much as the gyro, because now you'll also move your arms instead of only rotating your wrists. And that's the biggest problem, because while gameplay with gyro only can hold up decently, when you need to use the accelerometer too, the experience overall falls off a cliff. You need the accelerometer to be able to use the joy-con like a gun, and the accelerometer needs a way to have its position in space corrected at least 30 times per second.

So, when I tried to play splatoon with the joy-con in a wii style, it was simply impossible. You can't build any muscle memory because it just doesn't work as anyone would expect it to work. You can't rely on the accelerometer in a 3DoF controller. Maybe (just maybe) that's what Turrican3 was talking about (using the joy-cons separately)
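To put a number on why the accelerometer "accumulates errors on the very first move": double-integrating even a tiny constant bias makes the position estimate drift quadratically with time. A toy sketch (the bias and rates are made up, just to show the shape of the problem):

```python
def position_drift(bias_ms2: float, dt: float, steps: int) -> float:
    """Double-integrate a constant accelerometer bias; return position error (m)."""
    vel, pos = 0.0, 0.0
    for _ in range(steps):
        vel += bias_ms2 * dt      # bias leaks into velocity linearly...
        pos += vel * dt           # ...and into position quadratically
    return pos

# A small 0.05 m/s^2 bias, sampled at 1 kHz:
print(position_drift(0.05, 0.001, 1000))   # ~0.025 m off after 1 second
print(position_drift(0.05, 0.001, 5000))   # ~0.63 m off after 5 seconds
```

That quadratic growth is why a 3DoF controller can't track its own position in space for more than a moment without some external reference.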
 
Never claimed either.
And I'm not asking for an IR-based solution, just whatever allows me to reliably (i.e. no swearing required - that's moving backwards if you ask me! XD) handle a virtual pointer on screen.

But I do believe we can certainly agree on the fact that

1. current pseudo-pointer implementation is inherently, objectively worse in terms of muscle memory involvement, regardless of mine, yours, or any other people's experience due to - as far as I understand - unavoidable gyro drift;
2. a solution does not need to work in every mode, as multiple control schemes do exist and are widely adopted in this industry, the most popular one being likely the dualism of pad / keyboard+mouse

If not, I guess we'll just have to agree to disagree. I definitely do not want to change your or anybody else's mind, hope you can accept that. No hard feelings by the way! ^__^
I want A Wiimote for backwards compatibility/legacy reasons. Doesn't need to be the same Wiimote. Stick a fish-eye lens on it and put together some means of it tracking the position of the TV and I'm pretty darn happy. Very agreed that the two IR points method sucked. I also think that if it were two IR points that flashed in a track-able pattern and we had a fish-eye lens to help make sure we always had view of it, it could be much better. The actual data sent to an emulated game could be calculated from the better Wiimote.
 
There is nothing "freakish" about that, in my opinion. An excellent thread recently opened on Fami reminds us, for example, that many of Mario Sunshine's problems stem from the fact that it had to be released in a hurry. I know the conversation here is focused on the hardware, obviously, but the hardware would be useless without the games.

So there's nothing sad about Nintendo delaying a release date. What is sad is to have launched the Wii U with NSMBU as a flagship, for example. Talking about games that are "as polished as they want" might suggest that this is a whim or a bad thing.
I meant freakish in a good way. It's something more developers should do in this day and age.
Like how they scrapped Metroid Prime 4 because they didn't like the product they were producing and decided to hand it over to Retro Studios, which is something that takes courage.

Sorry if I didn't word it better 😓
 
On 1, I cannot possibly agree, my muscle memory is far better with Splatoon 3's gyro than the Wii's pointer. It's just not true that the pointer is strictly superior in this way, nor is it true that gyro drift is unavoidable, there's several systems that have been deployed to remedy it. Wii U GamePad had a magnetometer for autocalibration, for instance.

I will remain that I ABSOLUTELY want to change minds, I would rather people didn't have the wrong idea, like that gyro is somehow unrecoverable.
Then I would kindly ask you to explain to me how that actually works, because I am a very long-time follower of these kinds of discussions, and as far as I understand (understood?), the consensus was that gyro would sooner rather than later accumulate enough reading errors/noise/whatever to force the user to reset.

So, when I tried to play splatoon with the joy-con in a wii style, it was simply impossible. You can't build any muscle memory because it just doesn't work as anyone would expect. You can't rely on the accelerometer in a 3DoF controller. Maybe (just maybe) that's what Turrican3 was talking about (using the joy-cons separately)
I apologize if I didn't make it clear before, I thought that mentioning my awful, awful control issues with Metroid Prime Remastered in the context of a wiimote-style pointing discussion was enough but I guess I was wrong, sorry!

So yeah, I mean pointer as in typical FPS/TPS/RTS control scenarios (a-la-NPC Pikmin, Metroid Prime Trilogy on the Wii, Resident Evil 4 Wii, etc.)
 
I apologize if I didn't make it clear before, I thought that mentioning Metroid Prime Remastered in the context of a wiimote-style pointing was enough but I guess I was wrong, sorry!

So yeah, I mean pointer as in typical FPS and TPS control scenarios (a-la-NPC Pikmin, Metroid Prime Trilogy on the Wii, Resident Evil 4 Wii, etc.)

That's what I thought lol

When we play with two hands, it's natural that we'll only rotate the controller, so we're only using the gyro. It will drift (how fast will depend on how you play), but it's playable (otherwise I wouldn't be able to beat the last splatoon 3 level lol). But if you move the controller sideways, now you're using the accelerometer too. These games fuse both acc and gyro inputs. If you're playing with the right joy-con separately, it's natural that you'll move your arm more than you'll rotate your wrist (no one really shoots by only rotating their wrist lol). The problem is that the accelerometer will accumulate errors on the very first move you do...
It just can't keep up with its position in space by itself (like would be required for it to work as a gun). It's even worse than playing with really old optical mice that had both negative and positive acceleration; you couldn't really build muscle memory because even if you performed the same move every time (like the distance traveled on your mousepad being always the same) it would produce a different input because the sensor wasn't reliable enough. The accelerometer is just even worse than that lol

So you need either an external camera doing positional tracking of the controller (correcting both gyro and acc 30 or 60 times per second) or you need to do this tracking on the controller itself (inside-out tracking). I'll say that even with the gyro drift, I like to hold the controller with two hands. There are some benefits to it. But I also want to be able to play with a single controller, like a gun, because for me it's just more fun...
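For what it's worth, the gyro+accel fusion these games do for tilt is usually some variant of a complementary filter; here's a minimal one-axis sketch (the coefficient and rates are illustrative, not anything from an actual Joy-Con):

```python
import math

def complementary_filter(angle: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Fuse a drifting gyro with a noisy-but-absolute accelerometer tilt.

    The gyro integrates smoothly but drifts; the accelerometer gives an
    absolute gravity-referenced angle but is noisy. Blending the two keeps
    short-term smoothness while pulling long-term drift back toward truth.
    """
    gyro_angle = angle + gyro_rate * dt          # short-term: integrate gyro
    accel_angle = math.atan2(accel_x, accel_z)   # long-term: gravity reference
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Controller held still (true angle 0) while the gyro reports a constant
# bias: the estimate settles at a small bounded error instead of drifting
# without limit (pure integration would reach 0.2 rad here).
angle = 0.0
for _ in range(2000):
    angle = complementary_filter(angle, gyro_rate=0.01,
                                 accel_x=0.0, accel_z=1.0, dt=0.01)
print(angle)  # settles near 0.0049 rad
```

The catch, as noted above, is that gravity gives you a reference for tilt but not for position, so this fixes drift in rotation, not translation.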
 
Further thoughts on updated wiimote.

Fish-eye lens. Sensor bar is still part of it, but it's USB powered - you can power it off the TV. Wiimote samples the camera at 240hz. The sensor bar has two LED clusters (left and right) and also runs at 240hz. Left cuts out 1/3 flashes and right cuts out 1/4 flashes. Maybe some variation on this that's more easily tracked. End result is that with the fish-eye lens, the sensor bar is always in view if the Wiimote is pointed at the TV. A small bit of compute corrects for the fish-eye, and similar software also helps adjust the returned values to match a smaller TV as was expected in the Wii era (27"?) to a modern TV (48"?).

Assuming that all Wii games there would be emulated, the emulator could have a per-game adjustment.

Nunchuck could be completely optional, as a left joy-con handles all the buttons and the stick present on the original nunchuck.
 
But if you want to play wii style, then you're going to use the accelerometer just as much as the gyro, because now you'll also move your arms instead of only rotating your wrists.
Why on earth would you do such a thing? Aiming is all in the fingers. Maybe a bit of wrist too. It's this kind of misinformation that gives motion controls a bad name.

I'm playing Katrielle Layton, which has a lovely gyro cursor. I play it in bedtime mode when all tuckered in with my arm resting. I'm not waving my arm about in bed.

Even when playing docked (not Layton but something like TOTK) my arm is resting on the arm of the chair.
 
I do wonder what that third power draw test is for. The first 2 seem pretty obviously handheld and docked respectively, but the third? I think the prevailing theory is that it’s a stress test but what if it was a rough equivalent of the PS5 Pro’s High CPU mode but instead for the GPU? Boost the GPU clock higher in exchange for a lower CPU clock.
 
Further thoughts on updated wiimote.

Fish-eye lens. Sensor bar is still part of it, but it's USB powered - you can power it off the TV. Wiimote samples the camera at 240hz. The sensor bar has two LED clusters (left and right) and also runs at 240hz. Left cuts out 1/3 flashes and right cuts out 1/4 flashes. Maybe some variation on this that's more easily tracked. End result is that with the fish-eye lens, the sensor bar is always in view if the Wiimote is pointed at the TV. A small bit of compute corrects for the fish-eye, and similar software also helps adjust the returned values to match a smaller TV as was expected in the Wii era (27"?) to a modern TV (48"?).

Assuming that all Wii games there would be emulated, the emulator could have a per-game adjustment.

Nunchuck could be completely optional, as a left joy-con handles all the buttons and the stick present on the original nunchuck.
I really don't think we would need a sensor bar these days... image tracking /motion sensing could be enough with improved software
The LG tv remote feels pretty natural albeit a little wonky
But something like that paired with knowing where the TV is at all times could keep the pointer from drifting I would think
 
A single IR source should be enough. I mostly say sensor bar because I have nostalgia for it. A single IR LED flashing in a recognizable pattern, centered above or below the TV, plus the rest of the available data should be more than enough given a little bit of compute. All having two LEDs adds is distance from the TV.
 
I guess Thor will be counted as Hopper architecture, but idk.
Nvidia did explicitly mention during GTC 2022 that Thor's using a GPU based on the Ada Lovelace architecture.
The next-generation superchip comes packed with the cutting-edge AI capabilities first introduced in the NVIDIA Hopper™ Multi-Instance GPU architecture, along with the NVIDIA Grace™ CPU and NVIDIA Ada Lovelace GPU.
 
With the Switch successor being substantially more powerful than the Nintendo Switch, should we expect a significant jump in file sizes? Super Mario Odyssey, for example, is 5.6GB. Should we expect the next 3D Mario to jump all the way to something like 50GB, or a more modest increase, say from Odyssey's 5.6GB to around 8.8GB?
 
I mean ... it's the same way they did the Village and Resi 7 logos.

I would say yes and no, because the VIII in Village fits perfectly as Roman numerals. Sticking "X" at the end of the game title, despite the color highlights, WILL give the false impression that this is the 10th RE rather than the 9th it's supposed to be.

A better option is to use IX as part of a word after the word Evil.

Some examples include, but are not limited to:

Resident Evil CrucifIX
Resident Evil MatrIX
Resident Evil VIXens
Resident Evil PhoenIX

In fairness, if that photo is true, it could also just be a placeholder until the actual title is announced. Truthfully, I think the photo is fake, though I do expect a 9th RE game to be announced in the near future.
 
Absolutely: with more complex assets comes greater file size. Harder to predict an amount, though. Game space use has generally been more "How much can we get away with?" than "What is the logical next step compared to the previous game?" And choices like how much space to spend on video throw things all out of whack. Like, Super Mario Galaxy 2 uses less than half the space of Super Mario Galaxy, largely because it doesn't use over 2GB of videos. Odyssey is then less than twice as large as Galaxy, or more than four times as large as Galaxy 2.

But FWIW, it seems like if we ignore video, Sunshine is around 300 MB. Galaxy and Galaxy 2 a bit over 1 GB. Odyssey is listed as 5.6 but I have no insight into its internal makeup. About x4 again next time might make sense.
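The rough arithmetic sketches out like this - sizes are the approximate non-video figures above, and the 4x multiplier is just the guess in this post, not a known number:

```python
# Back-of-the-envelope on the "roughly 4x per step" pattern described above.
# Sizes are the approximate non-video figures quoted in the post (GB).

sizes_gb = {
    "Super Mario Sunshine": 0.3,
    "Super Mario Galaxy (ex. video)": 1.1,   # "a bit over 1 GB"
    "Super Mario Odyssey": 5.6,
}

growth = sizes_gb["Super Mario Odyssey"] / sizes_gb["Super Mario Galaxy (ex. video)"]
print(f"Galaxy -> Odyssey growth: ~{growth:.1f}x")
print(f"Next 3D Mario at a guessed 4x: ~{sizes_gb['Super Mario Odyssey'] * 4:.1f} GB")
```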


LATE EDIT ADDING NICHE DATA FOR LATE READERS: I used Dolphin to extract the data to regular Windows folders, then Snap2HTML to make easily browsable versions of those directories, so if you want to see how the file makeup of Super Mario Sunshine, Super Mario Galaxy, and Super Mario Galaxy 2 compare, there you go.
 
Do the ray tracing cores enter into the FLOPS calculation of that GPU?
The ray tracing cores run at the same clock speed as the shader cores, and the ratio between the number of shader cores and RT cores is fixed. So RT doesn't affect the TFLOPS number, but the TFLOPS number is an accurate reflection of the number of RT cores X the clock speed.
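A minimal sketch of that relationship, using Ampere's layout (128 CUDA cores and 1 RT core per SM); the SM count and clock below are illustrative assumptions, not confirmed specs:

```python
# FP32 TFLOPS = CUDA cores x 2 FLOPs/clock (FMA) x clock, and the RT core
# count scales with the SM count at a fixed ratio - so TFLOPS also tracks
# RT cores x clock. Ampere layout: 128 CUDA cores, 1 RT core per SM.

CUDA_CORES_PER_SM = 128
RT_CORES_PER_SM = 1

def gpu_stats(sm_count, clock_ghz):
    cuda_cores = sm_count * CUDA_CORES_PER_SM
    rt_cores = sm_count * RT_CORES_PER_SM
    tflops = cuda_cores * 2 * clock_ghz / 1000  # 2 FLOPs per clock per core
    return cuda_cores, rt_cores, tflops

# Hypothetical 12-SM GPU at 1 GHz.
cores, rt, tf = gpu_stats(12, 1.0)
print(cores, rt, tf)  # -> 1536 12 3.072
```

Doubling the clock doubles both the TFLOPS and the RT throughput, which is why the TFLOPS figure doubles as a proxy for RT capability too.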
 
Bad kerning and spelling is a Resident Evil staple at this point ;-)
 
It should be stressed that Nintendo is also very good with file sizes compared to other companies. I have little doubt they'll further improve their compression tools to compensate. They might whittle that increase down to 3.5x, 3x, or even just twice the size of their previous games if they get it right.

They're always going to increase whether we like it or not, but Nintendo will likely try to mitigate it.
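To illustrate the compression point - a toy example only, since real asset pipelines use texture-specific codecs rather than zlib, and the data here is made up:

```python
# Toy demonstration: the same raw data compressed at different settings,
# showing how better compression tooling can shave a sizable fraction off
# shipped file size without touching the assets themselves.
import zlib

raw = b"vertex 1.0 2.0 3.0 normal 0.0 1.0 0.0 " * 2000
fast = zlib.compress(raw, level=1)   # quick-and-dirty setting
best = zlib.compress(raw, level=9)   # maximum-effort setting
print(len(raw), len(fast), len(best))
```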
 
I also think their art styles help them get by without assets as high-res as other games need.
 
Wake up TSMC 4N believers! Nintendo can now use TSMC 4N because it's an outdated node now🤭


Edit: For context, next-gen DCAI parts from Nvidia are using TSMC 4NP, an evolution of 4N. Probably a further customized 4N with N4P enhancements.
 