• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I don’t know why I have an obsession with RAM but I do. 😭 I’m really hoping we get 5X instead of just 5. Whenever NG was planned, and it seems they always planned for a 2024 release, surely Nintendo knew there would be a RAM revision by that time. LPDDR5 came out (all the way back) in 2020, and it seems tape-out was in 2022, even after 5X was announced. During planning, couldn’t Samsung have dropped the hint that 5X would be widespread by 2024? It just seems that, barring a high price, 5X should be the obvious choice for an H2 2024 device.
They absolutely knew the 5X roadmap, and they could have included a 5X controller if they thought it was needed, or even used it to replace the RAM years from now once 5X gets cheaper than 5.

They have a fixed budget though, and each dollar or watt overspent in one area is a dollar or watt you have to cut somewhere else. There's a big balancing act in place, and 5 may end up the best choice once all things are accounted for.

Keep in mind LPDDR5 might deliver bandwidth proportional to what the Ampere graphics cards got. There will be some sacrifices to be made in ports from PS5/Series, sure, but the same goes for 5X. Only devs with the hardware in hand will truly know how much of a difference that extra 30% will make in general, and whether it is worth cutting somewhere else for it.
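That "extra 30%" can be sanity-checked with simple arithmetic. A sketch, assuming a 128-bit memory bus (speculation for this chip, not confirmed) and peak per-pin rates of 6400 MT/s for LPDDR5 and 8533 MT/s for LPDDR5X; these are peak figures, and real-world effective bandwidth is lower:

```python
# Back-of-the-envelope peak bandwidth, LPDDR5 vs LPDDR5X.
# Assumes a 128-bit bus (an assumption, not confirmed hardware).

def peak_bandwidth_gbps(data_rate_mtps: int, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return data_rate_mtps * (bus_width_bits // 8) / 1000

lpddr5 = peak_bandwidth_gbps(6400, 128)    # 102.4 GB/s
lpddr5x = peak_bandwidth_gbps(8533, 128)   # ~136.5 GB/s
print(f"LPDDR5:  {lpddr5:.1f} GB/s")
print(f"LPDDR5X: {lpddr5x:.1f} GB/s (+{(lpddr5x / lpddr5 - 1):.0%})")
```

The uplift comes out to roughly a third, which is in the same ballpark as the "extra 30%" being discussed.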
 
I don’t know why I have an obsession with RAM but I do. 😭 I’m really hoping we get 5X instead of just 5.
But I thought 5X (and 5, apparently) went through a node reduction recently?
The primary driver of power consumption for memory isn't the modules, it's the memory controller itself. Which, in Drake's case, is part of the chip. So whatever the power efficiency is, it's baked in.

The standard was published in 2021, so in theory Nvidia could have implemented the standard, which is very similar to 5. But I don't believe there was time to actually test with real RAM. There is a slim possibility that they decided to go for it as future-proofing, and an even slimmer possibility that, since clocks are settled last, they decided to take advantage of some of the bandwidth later in development.

But I'd bet against all of that - if you look at the Jetson power tools for Orin, the difference between the memory clock at 3200MHz and 2133MHz is more than the difference between all 8 CPU cores at 1GHz and 2GHz. As @ILikeFeet has pointed out before, memory clocks tend to be pushed towards the limit, and pushing them higher eats power faster than it gives you performance.
 
I don't expect Nvidia to adopt RISC-V outside of microcontrollers (p. 3) for a very long time, due to Nvidia's 20-year Arm licence.
As for Nintendo, it depends on whether Nintendo decides to continue partnering with Nvidia for 20 years.

20 years is a long time. If the benefits of moving to RISC-V were large enough, they would probably still do it, license or not.

Also, there was info from the Nvidia RISC-V chip posted in this thread.

found this in the nvidia leak

[image: screenshot from the Nvidia leak]

T254 NVRISC5 seems pretty cleartext to me. Even if T254 is some cancelled chip, it confirms Nvidia has been dabbling with RISC-V.


As for Nintendo, they have no reason to move as long as Arm offers a generational performance boost, imo.
 
I've been playing Skyward Sword HD recently, and I was reminded that it was apparently emulating the GPU while the CPU was native, which was not the same implementation as Super Mario Galaxy in 3D All-Stars. Combined with recent comments from Miyamoto about backwards compatibility being easier than ever, this is probably their ideal approach to backwards compatibility moving forward: just make a translation layer if the Switch 1 GPU isn't forward compatible with Switch 2, if I understand it correctly.

Ring Fit Adventure and Labo have no digital versions. Nintendo Switch Sports has a digital version, on the other hand.
To my knowledge, this is incorrect. I believe Skyward Sword is using a compatibility layer for the Wii graphics API, rather than emulating the hardware.

However, Super Mario Galaxy, Pikmin 1, and Pikmin 2 are all native CPU ports running with an emulated Wii GPU (I believe it was confirmed that the Pikmin games were based on the New Play Control versions).
 
They absolutely knew the 5X roadmap, and they could have included a 5X controller if they thought it was needed, or even used it to replace the RAM years from now once 5X gets cheaper than 5.

They have a fixed budget though, and each dollar or watt overspent in one area is a dollar or watt you have to cut somewhere else. There's a big balancing act in place, and 5 may end up the best choice once all things are accounted for.

Keep in mind LPDDR5 might deliver bandwidth proportional to what the Ampere graphics cards got. There will be some sacrifices to be made in ports from PS5/Series, sure, but the same goes for 5X. Only devs with the hardware in hand will truly know how much of a difference that extra 30% will make in general, and whether it is worth cutting somewhere else for it.

The primary driver of power consumption for memory isn't the modules, it's the memory controller itself. Which, in Drake's case, is part of the chip. So whatever the power efficiency is, it's baked in.

The standard was published in 2021, so in theory Nvidia could have implemented the standard, which is very similar to 5. But I don't believe there was time to actually test with real RAM. There is a slim possibility that they decided to go for it as future-proofing, and an even slimmer possibility that, since clocks are settled last, they decided to take advantage of some of the bandwidth later in development.

But I'd bet against all of that - if you look at the Jetson power tools for Orin, the difference between the memory clock at 3200MHz and 2133MHz is more than the difference between all 8 CPU cores at 1GHz and 2GHz. As @ILikeFeet has pointed out before, memory clocks tend to be pushed towards the limit, and pushing them higher eats power faster than it gives you performance.

Okay, thanks for the feedback!

The only thing I kind of want everyone on here to know, though, is that it seems there ARE power efficiency gains with 5X, at least. This thread, I got the impression, thought there were no efficiency gains with 5X. Can we at least fully confirm this? At least so I can say that I added to the conversation instead of just asking n00b questions. 🫣
 
20 years is a long time. If the benefits of moving to RISC-V were large enough, they would probably still do it, license or not.

Also, there was info from the Nvidia RISC-V chip posted in this thread.

T254 NVRISC5 seems pretty cleartext to me. Even if T254 is some cancelled chip, it confirms Nvidia has been dabbling with RISC-V.

As for Nintendo, they have no reason to move as long as Arm offers a generational performance boost, imo.
Nvidia uses RISC-V, and has for a while, but not for the main CPU. Their recent GPUs contain a RISC-V microcontroller for management and security purposes.
 
And why exactly is that? People could still emulate stuff early even back in the day. I would even say back then it was worse, cause stuff like the GameCube sold not so great compared to the Switch, so it would lose more sales. With the Switch, the games will sell well regardless cause it's so popular.
Uh, people could emulate Switch games even before release. Worse, people could just play on a hacked Switch before release and upload it to various video and social media sites.

Back in the day, you wouldn't even know that a game had leaked until you looked for it. Information wasn't as widespread back then as it is now.
 
20 years is a long time. If the benefits of moving to RISC-V were large enough, they would probably still do it, license or not.

Also, there was info from the Nvidia RISC-V chip posted in this thread.

T254 NVRISC5 seems pretty cleartext to me. Even if T254 is some cancelled chip, it confirms Nvidia has been dabbling with RISC-V.

As for Nintendo, they have no reason to move as long as Arm offers a generational performance boost, imo.
I'm pretty sure these are in reference to the Falcon security microcontroller, which debuted in the Turing era (iirc) and uses the RISC-V ISA. If you look into the Nvidia leak files, there are some .mk files that reference RISC-V for each Nvidia RTX generation.
 
I disagree when the DS and PSP existed
Dude, we had download services during the DS and PSP era, and where I live piracy is rampant. Everyone buys a PSP but no one buys games, a totally different time than the GB and GBA.

During the GB and GBA era, not so much. All I saw were bootlegs, but those require money, and the online presence was not that widespread unless you were into that thing or subscribed to the scene. Everyone I know just goes to a ROM site and downloads whatever their friend recommended; they don't actively seek out or become aware of new game leaks.

Nowadays, you don't even need to seek it out; the news about the leak will come to you in some form.
 
Okay, thanks for the feedback!

The only thing I kind of want everyone on here to know, though, is that it seems there ARE power efficiency gains with 5X, at least.
My understanding is that the RAM itself gets a power improvement because it was itself node-shrunk; it's not a protocol enhancement. So using 5X RAM may have some power benefits? I'm genuinely unsure.
 
Dude, we had download services during the DS and PSP era, and where I live piracy is rampant. Everyone buys a PSP but no one buys games, a totally different time than the GB and GBA.

During the GB and GBA era, not so much. All I saw were bootlegs, but those require money, and the online presence was not that widespread unless you were into that thing or subscribed to the scene. Everyone I know just goes to a ROM site and downloads whatever their friend recommended; they don't actively seek out or become aware of new game leaks.

Nowadays, you don't even need to seek it out; the news about the leak will come to you in some form.
that's why I think the DS and PSP era was worse. it actively affected sales, more so than it's doing on Switch. the increase in hardware performance is its own DRM, in a way. most folks don't own that powerful of a PC, and emulating Switch would give most of those folks an experience not dissimilar to the Switch itself
 
EDIT: I'm an idiot, I knew it. Thraktor posted about this here back in August. Ignore this whole post.

EDIT 2: Further to Thraktor's post, this new Oct. 2023 investor presentation at least shows that NVIDIA still doesn't consider game consoles as a "growth driver".

I hope I'm not a couple of months behind (edit: I am) or totally missing information right in front of me, but did NVIDIA just stop acknowledging/considering their Nintendo partnership as a "growth driver", in their investor presentations?

I mean I understand that they would choose to highlight the leading driving factors at any given time (e.g. big push to AI atm), but at the same time you would think their partnership with Nintendo is important/profitable-enough for them to at least mention it.

I'm not sure how this works - do they exclude some sensitive information/data from publicly available investor briefings? (or maybe they're under NDA?)
Maybe it's not a big deal for them at this specific point in time, and so they prefer to focus on other areas?

They seem to mention the Switch, or at least acknowledge "game consoles" as a growth driver, up until the May 30th, 2023 investor presentation; from August 2023 onward, they replaced "Gaming laptops & game consoles" with "Gaming laptop & Gen AI on PCs".

Fake edit: Btw you can get the inv. presentations here.

February 2022
[slide image]

November 21st, 2022
[slide image]

February 27th, 2023
[slide image]

May 30, 2023
[slide image]

August 28th, 2023
[slide image]

October 2023
[slide image]
 
I agree with this completely; there are so many details we aren't currently aware of with Drake. I do believe that memory latency issues are a concern in the industry; if they weren't, we wouldn't have companies using expensive solutions like large on-die caches. Us trying to get a rough idea of how Drake will perform by comparing to desktop RTX is flawed as well, because this will be the very first Ampere-based SoC dedicated to gaming.
I very much do not think that it is any level of concern in this industry, but I'm an outsider looking in. If it were such a concern, they would not be developing new generations of GDDR memory. If it were such a concern, they would have gone with DDR memory for the consoles.

What I'm getting at is that the latency concern is being oversold when, in the end, it will literally not matter as much as people think. Bandwidth is what will matter the most. Period.
 
If I may ask, is it focusing on demanding games running on the RTX 2050 (the likes of Cyberpunk, Elden Ring, etc.)?
It's a meaty video - the edit I saw was over 30 minutes.

About half of it is a summary of rumors/leaks; the other half is game tests.

I wasn't privy to why Rich picked the games he did - though I suggested one that he did test, extensively - but all the games tested have solid PC ports, with DLSS support, and are games for which DF has good console-matched settings. I won't go any deeper, because I don't want to spark a conversation that isn't based on the video itself. But I think we should keep @LiC's admonition in mind: this isn't hard data about what T239 can do, but more about expectation setting.
 
5X is not an efficiency win though. It's able to be clocked higher, but at the cost of power consumption (unlike 4X which is more power efficient than 4).

So it may not be that attractive to Nintendo for that reason.
To add to what he said, Samsung's LPDDR5X gave 20% better power efficiency at the same clocks as LPDDR5. I think that was from the node reduction alone. Micron apparently offers up to 24% power reduction for LPDDR5X, and I'm assuming that's at the same clock speeds.

I think the bigger reason, if they hold back on LPDDR5X, could be the price. RAM is one of the most expensive parts in a BOM, more expensive than the T239 most likely, or at least as expensive. LPDDR5, as we know, is very mature, but LPDDR5X has only been common on flagship phones since the S23 came out in February.

I hope that if they do go with LPDDR5X, they choose to max out the clocks in docked mode.
If that's true, they may be happy to just get a more power-efficient version of 5, so they can clock it faster in handheld mode.
88GB/s might just be enough for handheld 🤔. I know some have suggested as low as 68GB/s, but I hope we don't get that. That could be an annoying bottleneck for ports in handheld mode.

My only questions now are:

(I can research this later, but I'm busy)
1. What is the power draw for LPDDR5 at 88GB/s?

2. What is the best theoretical bandwidth split between handheld and docked mode so that neither feels bottlenecked relative to the other?
This is assuming we get a similar 2-2.5x GPU speed difference between modes as Switch 1.
- Switch handheld and docked mode bandwidth was only a 20% difference. It was not great, and the lack of bandwidth in general was why we didn't get a lot of 1080p ports. If Switch 2 ends up with 88 and 102GB/s, we could end up in a worse position, as the difference is only 15%.
- Something like 102 and 133GB/s would fare better if we somehow get LPDDR5X (obviously it's already been predetermined by now). 88GB/s and 133GB/s seems good too 🤔
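For reference, the GB/s figures above can be mapped back to memory data rates, assuming the same hypothetical 128-bit bus (the actual bus width isn't confirmed):

```python
# Convert discussed bandwidth figures back to the LPDDR data rates that
# would produce them on an assumed 128-bit bus: GB/s = MT/s * 16 B / 1000.

def required_data_rate_mtps(bandwidth_gbps: float, bus_width_bits: int = 128) -> float:
    """Data rate (MT/s) needed to hit a given peak bandwidth."""
    return bandwidth_gbps * 1000 / (bus_width_bits / 8)

for gbps in (68, 88, 102, 133):
    print(f"{gbps:>3} GB/s -> {required_data_rate_mtps(gbps):.0f} MT/s")
```

Under that assumption, 68 and 102GB/s line up with the standard LPDDR5-4266 and LPDDR5-6400 rates, while 88 and 133GB/s would correspond to downclocked steps of LPDDR5-6400 and LPDDR5X-8533 respectively.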
 
It's a meaty video - the edit I saw was over 30 minutes.

About half of it is a summary of rumors/leaks; the other half is game tests.

I wasn't privy to why Rich picked the games he did - though I suggested one that he did test, extensively - but all the games tested have solid PC ports, with DLSS support, and are games for which DF has good console-matched settings. I won't go any deeper, because I don't want to spark a conversation that isn't based on the video itself. But I think we should keep @LiC's admonition in mind: this isn't hard data about what T239 can do, but more about expectation setting.
How did you and Rich become friends (/acquaintances)? Was it a result of your very public efforts in this thread? That’s a pretty cool connection.
 
I do wonder how Nintendo is going to solve BC for games that have a locked sub-native resolution if the screen is going to be 1080p.

Many games are going to look nasty if devs don’t patch them. Is there nothing Nvidia can do, or does it require going to the lengths Microsoft did with BC?
 
I do wonder how Nintendo is going to solve BC for games that have a locked sub-native resolution if the screen is going to be 1080p.

Many games are going to look nasty if devs don’t patch them. Is there nothing Nvidia can do, or does it require going to the lengths Microsoft did with BC?
There was some speculation that the NERD AI upscaling patent was for BC, and not a DLSS substitute.

But most likely these games will just look like shit if not patched.
 
[attached image]

Since it's a bit on the technical side of things, this is what I meant in the Switch 2 topic. No VR, just the 3D viewing-box feature from the 3DS in a headset for Switch 2. Another way to Switch, while maintaining the traditional gameplay, and thus no real VR capabilities. The production costs are probably much lower than making an actual VR headset, and Nintendo has experience with this form of 3D. And I'd still buy it; I really did like the 3D feature of the 3DS.
 
Did Nate give any update on when his video is coming out?
Just tweeted that he's recording soon, so probably sometime this week. This might not be a Switch 2 video, though. He could still be corroborating and sourcing info. I'm guessing that it's probably an Activision Blizzard/Microsoft podcast, considering that's pretty huge news and Nate likes Xbox.
 
How did you and Rich become friends (/acquaintances)? Was it a result of your very public efforts in this thread? That’s a pretty cool connection.
Definitely acquaintances.

I realized I was watching enough DF videos that, even though I'm not much of a PC gamer, I could afford the bare minimum on Patreon, which got me on the Discord, where I use the same handle. We talked a little about Nintendo there, he saw my name on some posts here while doing research, and we started talking.
 
Switch games have been leaked since like 2018... I remember Ultimate was leaked 2 weeks early, and it still went on to be the best-selling game in the series by a LONG shot.
 
I do wonder how Nintendo is going to solve BC for games that have a locked sub-native resolution if the screen is going to be 1080p.

Many games are going to look nasty if devs don’t patch them. Is there nothing Nvidia can do, or does it require going to the lengths Microsoft did with BC?
They probably just won't do anything. It'll be suboptimal, but it's already pretty common for Switch games to be subnative in portable in the first place.
 
I doubt higher data throughput would be necessary to add such features, since each Joy-Con connector is, I believe, USB 2.0. Plenty of headroom as regards input, even VR sensors can cope with USB 2.0 speeds.

As for compatibility, I 100% expect backwards compatibility in the connector. Sure, they can improve how they ATTACH, but changing the connector means Switch owners bringing over their Joy-Con would be left with no way to charge them on the new system, and I doubt that's acceptable. They can improve them in many ways without completely breaking compatibility. Though, as I said, while I expect them to CHARGE Joy-Con fine, I doubt they'll allow you to play in handheld mode with original Joy-Con, due to the likely larger size of the device. This is a fairly easy thing to deal with UX-wise: don't have a latch for OG Joy-Con on the new console, and pop up a little error message saying "Please attach Joy-Con 2.0 to use Handheld Mode."
You're right, I guess what I meant was power, since haptic feedback might consume more in addition to the existing HD Rumble.

If the connectors charge the OG Joy-Con but you can't use them in handheld mode because of a larger screen size, that'd be disappointing but understandable. I would probably be playing docked in some scenarios anyway, at least because of potentially better image quality with something like DLSS, or just the raw horsepower of NG. I have the Nyxi Wizard Joy-Cons, which are pretty comfortable to use docked or unattached with the included grip, kind of like a Pro Controller.
 
I think it's a pretty safe bet anyways but I'm really hoping for better ergonomics. The Switch is just a bit too... rectangular :p
Yep. That was always my biggest issue with the original Joy-Cons. They are fantastic in terms of feature set but there's just not a lot of room for my slightly above average-sized palms to rest, in stark contrast to something like a Dualsense or the Pro Controller. I get that it's shaped the way it is for portability but hopefully with a bigger screen we get a pair of Joy-Cons that are mostly just as portable but are far more comfortable.
 
Bro I so want this. I want Jensen to walk out like a PIMP in the big Nintendo event showcasing the next console and proceed to wax eloquent about Drake. How good would that be? Please be a true rumor.



🔥
Leak Express: Nvidia wants its name on Switch 2. The work of customizing the chip for the new Nintendo console was such, and they are so proud of the result, that they want to be part of the presentation of the console, or even have the Nvidia brand appear on the box.

SUPER (NVIDIA) POWER
 
I really wish Samsung UFS 3.0 cards were still a thing, because what you are suggesting is not remotely future-proof enough imo. Perhaps for the expandable storage, if the idea is that we have to manage it with a smaller, fast internal storage; but, for example, the Xbox Series S has 2.4GB/s uncompressed and 4.8GB/s compressed read speeds.

I totally get we are talking a handheld vs a console, but phones have been using UFS 4.0 internal storage as of early this year. If this is a device which launches in potentially late 2024, is expected to last 6+ years, and is hoping to have 1/5 the raw read speed of the lowest common denominator from 2020, this could become a huge issue for NG ports in time.

I hope we get 3D NAND carts, UFS 4.0 storage... less concerned on the expandable storage speed
UFS is plausible. Orin supports UFS 3.0 (pre-pandemic tech). A custom product will support the next one. It’s a well-understood form of tech. In the Bill Of Materials for the Note20 Ultra phone, you can see that 12GB LPDDR5 RAM and 128GB UFS 3.0 storage are included together at one price, which was $61.50. That was in 2020. In 2024, the same RAM can be had for less (meaning 16GB RAM is possible), while UFS 4.0 isn’t out of reach. Both RAM and storage can be had for a good price. The closest we have on the successor’s RAM amount is the official Nvidia tweet from earlier this year, which mentioned a 16GB RAM device for low-powered consoles. When one steps outside the rumour mills from Celebrity Internet Youtwitch Podcasters, and heads to the horse’s mouth, the discoveries are quite incredible.

I don't think 16GB LPDDR5/512GB UFS 3.1 is off the table for Nintendo, but I think 12GB/256GB is more likely. Nintendo has a lot more money being spent on the device in other areas, like coming with two wireless controllers, a dock, etc.

Counterpoints are that they also have the online subscription service, and more games will be bought for a successor to offset those costs; then they expect to sell more units, so they would be able to get a better rate on components.

I keep coming back to Nvidia’s tweet, known complaints about the XSS, phones moving rather quickly from 12GB to 16GB, AND the fact that less powerful contemporaries have 16GB, THEN having to reconcile the loading time and performance reports, competent RT and DLSS, and targeted specs with the idea of less. I feel that not enough people (not you) understand that industry-leading (portable) performance was a big factor in the Switch’s success. If it was just portability without the industry-leading part, then the 3DS should’ve been closer to the DS than the NES (their second best-selling home console), not their least successful portable. I also feel that a big part of the advantage of portability is the device being quick and snappy, so a higher amount of RAM, faster storage and a well-clocked CPU are imperative to delivering that.

It’s one of many reasons why the pessimism and lowball speculating never washed with me. There is coherent reasoning and messaging coming from developers, from the horse’s mouth, and it’s irreconcilable with some of the posts on here, as well as the forever-flawed DF premises.
 
Bro I so want this. I want Jensen to walk out like a PIMP in the big Nintendo event showcasing the next console and proceed to wax eloquent about Drake. How good would that be? Please be a true rumor.



🔥
Leak Express: Nvidia wants its name on Switch 2. The work of customizing the chip for the new Nintendo console was such, and they are so proud of the result, that they want to be part of the presentation of the console, or even have the Nvidia brand appear on the box.

they can be on the box and in the presentation, but they won't be in the name. it's still a custom product after all; they were paid to design and make it
 
Is it just me or has the Nintendo Switch trailer disappeared?
Nintendo getting ready to upload the Switch 2 trailers, and they probably don't want people to get confused and click on the Switch 1 trailers while searching, I'm guessing.

edit: yep, just searched all over, it is gone. I can only find a re-upload by GameSpot. It's happening
 
Seeing a few posts about bandwidth, and I have a question because I’m very uncertain on the point/what it means. According to the official ARM site, the A78C CPU is reported to support up to 60GB/s bandwidth - Would that be additional to the bandwidth numbers discussed here, or is it serving an entirely different purpose?
 
Bro I so want this. I want Jensen to walk out like a PIMP in the big Nintendo event showcasing the next console and proceed to wax eloquent about Drake. How good would that be? Please be a true rumor.



🔥
Leak Express: Nvidia wants its name on Switch 2. The work of customizing the chip for the new Nintendo console was such, and they are so proud of the result, that they want to be part of the presentation of the console, or even have the Nvidia brand appear on the box.

SUPER (NVIDIA) POWER

Totally agree. But being on the box and the presentation alone would be pretty big, imo.
I doubt this will happen, but it wouldn't be the first time. Remember the Gamecube?
[attached image]


Seeing a few posts about bandwidth, and I have a question because I’m very uncertain on the point/what it means. According to the official ARM site, the A78C CPU is reported to support up to 60GB/s bandwidth - Would that be additional to the bandwidth numbers discussed here, or is it serving an entirely different purpose?
The 60GB/s refers to the L3 cache bandwidth. This isn't surprising, as internal caches provide bandwidth in the same ballpark or even higher. In some cases, like Nvidia Ada GPUs, the L2 cache can offer in excess of 1TB/s of bandwidth. That being said, these caches are too small and (generally) private to the cores, meaning that they aren't usable for workloads that exceed the cache size. The bandwidth we discuss is the one provided by the RAM, which is the slowest memory but provides the largest capacity.
This old article by Digital Foundry shows how memory usage is distributed in a modern game.
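The cache-size caveat can be made concrete with a toy model. The numbers below are purely illustrative (the ~1TB/s L2 figure mentioned above, a ballpark ~100GB/s for RAM, and a 96MB cache chosen only as an example), not measurements of any real chip:

```python
# Toy model: a fast but small cache in front of slower RAM. A working
# set that fits in cache streams at cache speed; anything larger is
# limited by RAM bandwidth. All numbers are illustrative only.

CACHE_SIZE_MB = 96    # example large on-die cache
CACHE_BW_GBPS = 1000  # ~1 TB/s, cache-resident case
RAM_BW_GBPS = 100     # ~100 GB/s, cache-exceeded case

def effective_bandwidth_gbps(working_set_mb: float) -> float:
    """Sustained bandwidth for streaming a working set of the given size."""
    return CACHE_BW_GBPS if working_set_mb <= CACHE_SIZE_MB else RAM_BW_GBPS

for mb in (32, 96, 512, 4096):
    print(f"{mb:>5} MB working set -> {effective_bandwidth_gbps(mb)} GB/s")
```

This is why game-scale working sets, which run into gigabytes, are bound by RAM bandwidth rather than the much larger cache figures on spec sheets.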
 
Bro I so want this. I want Jensen to walk out like a PIMP in the big Nintendo event showcasing the next console and proceed to wax eloquent about Drake. How good would that be? Please be a true rumor.



🔥
Leak Express: Nvidia wants its name on Switch 2. The work of customizing the chip for the new Nintendo console was such, and they are so proud of the result, that they want to be part of the presentation of the console, or even have the Nvidia brand appear on the box.

SUPER (NVIDIA) POWER

It wouldn't be the first time by any means; if anything, the LACK of Nvidia branding on ANY part of the Nintendo Switch was the exception, not the rule. Even the Wii had "ATI" printed on it, and Nintendo made a special ATI-branded GameCube for the engineers way back when. Nintendo has a history of PROUDLY announcing their GPU vendor. If Nvidia wants their name on the box, that isn't some big ask; heck, they could have their name on the CONSOLE if they want, in the small text alongside the legalese, right where ATI used to get one.

There's also a really nice bit of wordplay possible-
SUPER Nintendo Switch used Nvidia Deep Learning SUPER Resolution technology.

I would tend to believe this rumour despite the source being questionable. If anything, I think the distancing of Nvidia from the branding of the Nintendo Switch may have been their own choice; since the Switch was never going to be a graphical powerhouse, it may have risked cheapening their brand. With the T239 processor being, bluntly, more impressive than the Tegra X1, I think Nvidia might be more willing to embrace it publicly.
 
Nintendo getting ready to upload the Switch 2 trailers, and they probably don't want people to get confused and click on the Switch 1 trailers while searching, I'm guessing.

edit: yep, just searched all over, it is gone. I can only find a re-upload by GameSpot. It's happening
OLED Trailer is still up, though.

The LCD display rumor might turn out to be real after all!
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.