• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

You don't have to, lol; you're free not to believe me if you want. I also didn't say anything crazy or out there.
I'm just wondering because if people ask about future Switch stuff in any conversation or forum, I would prefer to link to Famiboards threads like this, once I know there's accurate info. That's all.
 
I'm just wondering because if people ask about future Switch stuff in any conversation or forum, I would prefer to link to Famiboards threads like this, once I know there's accurate info. That's all.

Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.


I'm not saying you're lying, but it is strange to keep a source anonymous yet give out hints as to who they are and confirm who they're not. Shouldn't you be trying to protect your source?

Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.
 
Would a well-timed PS5 Pro announcement/reveal kill any hype for a Switch successor announcement?

* Hidden text: cannot be quoted. *
The PS5 Pro is supposedly being announced in September. I know 2025 for Switch 2 is being brought up now (and I have admittedly been dismissive of it, though I still think it's unlikely), but isn't the common consensus still 2024? Even Nate seems to be very certain it's 2024; I think most people missed that post. I don't see how a PS5 Pro announcement would affect a Switch 2 announcement that would probably have already happened months prior.
 
Please maintain the conversation in HIDE
There is no HIDE, only HYDE

 
I'm just wondering because if people ask about future Switch stuff in any conversation or forum, I would prefer to link to Famiboards threads like this, once I know there's accurate info. That's all.
Keep in mind to remind your friends that information here should always be taken with a grain of salt, regardless of how trustworthy it seems. Rumors often don't show the full picture, can be misinterpreted, or may simply be outdated. No one can ensure that what they share is accurate except for Nintendo.
 
Meelow doing riddles to make people go crazy...
Remember > if it's a small company, then forget Square, Atlus, Sega, Koei Tecmo, Capcom, EA, Bandai

they already have the devkit btw

Ehh, someone's got to. At least what I am saying is factual, and not like those old Twitter users who would post "Mother 3 HD Remake getting announced at the Nintendo Direct tomorrow!!!" lmao.

I also have been on these forums for far too long, and I won't post anything that I don't know is true. I'm not trying to be in the insider game but the little info I do have is exciting.

The last time I had any information was in 2017 when I was told by a friend who worked with SE that FFVII Remake would be three parts and I posted that on Era and haven't said anything since.
 
You have to tell me why people keep asking for proof of most insider info when 99% of alleged insider information has zero evidence backing it up aside from the credibility of the person sharing it. Like, no one is willing to snitch on their sources, which is both a blessing and a curse.

It's okay if you don't believe anything, just extend basic courtesy. Don't be like the people who constantly shill for or hate on insiders; they all hate it.
 
Quick reminder: brainchild (hi buddy!) mentioned this in two of his posts more than a year ago:

I guess if you don't care about extreme pop-in it's not an issue, but for me, I've already run the project on slow HDDs and the results were atrocious, despite the game still being "functional". There's just a minimum standard I'm not willing to compromise. And many of my assets are "high-density". I would still be using UE4 if I didn't think Nanite was necessary. I don't need Lumen. There are a billion ways to fake global illumination. There is very little comparable to the perceived quality of geometry offered with Nanite and I'm taking full advantage of that.
With all due respect, I completely disagree with this take.

First, you do not need a big team to create models with high poly counts and high resolutions. What you need are comprehensive resources and access to powerful procedural, generative, and/or machine learning tools. Houdini, for instance, allows you to create cities on the scale of that Matrix demo, even if you're only one person. I should know, as I have already done it. If you use OpenStreetMap data, your turnaround time will be even faster.

In fact, the Matrix demo artists used Houdini to generate the city in the demo:



And the presenter above explicitly mentions its benefits for smaller teams:

[screenshot: presentation slide highlighting Houdini's benefits for smaller teams]


There are even tools that convert 2D photos into high-quality 3D models, which I also use. Converting such assets to Nanite is trivial and allows indies to create game worlds on a scale comparable to AAA productions.
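To make the procedural angle concrete, here's a toy sketch of the idea in plain Python: carve a street grid and scatter building footprints with a downtown height falloff. It's nothing like Houdini's node graphs, and every parameter is invented, but it shows how one person can generate a city layout programmatically:

```python
import random

# Toy procedural city generator. All parameters (grid size, block size,
# height falloff) are made up for illustration; real pipelines like
# Houdini's drive this with OSM data and far more sophisticated rules.

GRID = 16       # city is GRID x GRID blocks
BLOCK = 40.0    # metres per block, streets included
STREET = 8.0    # street margin in metres

def generate_city(seed=42):
    rng = random.Random(seed)
    buildings = []
    for bx in range(GRID):
        for by in range(GRID):
            # Crude "downtown" falloff: taller buildings near the centre.
            cx, cy = bx - GRID / 2, by - GRID / 2
            falloff = max(0.1, 1.0 - (cx * cx + cy * cy) ** 0.5 / GRID)
            buildings.append({
                "x": bx * BLOCK + STREET,
                "y": by * BLOCK + STREET,
                "w": BLOCK - 2 * STREET,
                "d": BLOCK - 2 * STREET,
                "h": rng.uniform(10, 120) * falloff,   # metres tall
            })
    return buildings

city = generate_city()
print(len(city), "buildings; tallest:", round(max(b["h"] for b in city), 1), "m")
```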

Also, the Lumen comparison is flawed, since the lighting was not replaced with any alternative solution. Simply turning Lumen off, as if any game would actually ship like that, is not a fair comparison. RTXGI with sufficient probe sampling will produce better results (mainly due to reduced visual latency) while also yielding much better performance. Lumen is simply not production-ready, and other solutions have already been proven to work just fine. The problem is that there aren't that many examples of Nanite assets being illuminated by GI solutions other than Lumen. I can assure you, however, that RTXGI looks great on Nanite assets. In fact, procedurally produced Nanite assets + DirectX ray-traced reflections + RTXGI = indie cheat code for scalable, photorealistic AAA graphics.

EDIT:

Also, NVIDIA Omniverse is great for indies. I'm excited to use the new UE5 connector, but I want to wait until I can migrate my UE4 project first.

From the viewpoint of an indie developer, support for the UE5 toolset is more important than the raw specs themselves. And based on the reports from GDC, we can assume that the next Switch has aces up its sleeve that would satisfy developers.

Thus, I suggest we tone down the discussion about raw specs and pivot to a discussion about which games are viable on the next gen and which are not.

Personally, I assume that most indie titles built around ray tracing will be ported. Desordre and Jusant, to name just two very recent titles, will likely come to Switch 2. Racing games, combat games, and sports games in general from all kinds of publishers will likely also be considered for porting. Even some sandbox games might make the jump. Imagine playing Dead or Alive, The Crew, or WRC with ray tracing enabled in portable mode. That in itself is an exciting prospect.
 
Those all could work on paper, but I'm not sure how the general consumer would handle it. Phones can get incredibly hot while charging, and that's without them being pushed hard. Not to mention the extra noise that comes from something like a Switch with passive cooling. It's for sure a solid idea on paper, but I'm not convinced something like this would work well.

Plus, imagine the blowback if a game targets console + "Expansion Pak" in handheld, with the consumer left with a juddery, low-res mess if they play it on their normal Switch 2. While the optics of a big-budget game like GTA VI running poorly don't look good for Nintendo, I'd imagine having a bad experience unless you spend extra money and carry around more shit looks ten times worse. No, that wouldn't be too dissimilar to a game running badly on the base PS4 and running fine on a PS4 Pro, but I feel like the optics of it happening with the battery pack are worse, even if it would cost less for the consumer. A Hyper Ultra Switch 2 Advance/PS4 Pro is sold as a flat upgrade, and the PS4 Pro did things the PS4 couldn't, like output at 4K. The battery pack would just have portable games properly render at 1080p, or at least get closer to it. It wouldn't be an upgrade like a PS4 Pro; it would just be used to bring subpar ports/games "too big" for the system up to a slightly more playable state.


* Hidden text: cannot be quoted. *

I mean that situation is present today with the Switch as is. Look at the Batman Arkham ports or the recent Mortal Kombat game. Would you rather have the option to overclock the system (legally) and get proper performance out of the game if you owned it? We've seen clearly it's possible to get much better performance from Batman by overclocking, and no one's system is melting or anything like that. It's just an issue of battery draining faster.

To be honest a battery pack + maybe a stand for a cheap upgrade price is more consumer friendly than even the DSi and New 3DS approach where you'd have to buy a completely new system to get the benefit of extra horsepower and certainly more consumer friendly than Sony/MS' mid-gen refreshes that cost hundreds of dollars.

Nintendo could sell an official battery pack + stand for like $25 if they really wanted to. To me that's more consumer friendly than saying "hey pay another $400 for a Switch 2 Pro".

Or you could even just integrate the solution into the Joy-Cons. I mean, they are literally two attachments on the sides of the system with batteries inside of them, so why not let them function as a battery extender that unlocks a performance mode if the player wants that? If the main battery in the system is, say, 5000 mAh, but you can squeeze an extra 1200 mAh into each Joy-Con, suddenly you have an overall battery "pool" that's fairly large.
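Running the numbers on that idea (to be clear, the 5000/1200 mAh figures are the hypothetical ones from this post, not anything confirmed):

```python
# Hypothetical battery pool from the post above. None of these figures
# are confirmed specs; they're just the post's illustrative numbers.
main_mah = 5000      # internal battery
joycon_mah = 1200    # per controller, one on each side
pool_mah = main_mah + 2 * joycon_mah

print(f"Total pool: {pool_mah} mAh "
      f"({pool_mah / main_mah:.0%} of the internal battery alone)")

# At a nominal 3.7 V cell voltage, and a made-up 12 W "performance mode" draw:
watt_hours = pool_mah / 1000 * 3.7
print(f"~{watt_hours:.1f} Wh -> ~{watt_hours / 12:.1f} h at a 12 W draw")
```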
 
Which is wild, seeing how we had a lot of rumors between August and October.
My sentiments exactly. It’s like Nintendo’s Legal Department is on par with Disney’s: everyone’s scared.


I'm assuming things will start picking up again next month with CES.
Why at CES, tho? Isn't the video game business barely present there?


No. A lot of people I've seen don't even understand why a PS5 Pro is happening, and it's just an upgrade; the PS4 Pro and Xbox One X didn't hurt the Switch.
Exactly. Plus, that type of revision is usually for the more hardcore crowd. The PS5 Pro isn't like a Switch OLED, which is pretty much for everyone.


Would a well-timed PS5 Pro announcement/reveal kill any hype for a Switch successor announcement?
No, especially since Nintendo devices have been companion devices for years (a Nintendo console sharing a household with a Sony/Microsoft one).
 
Apologies to all those who replied!

I don't know why I thought the PS4 Pro had two APUs and was not just a 2x overclock.

With that said, a dock that included its own separate T239 that provided an extra boost to hardware performance, without the need for a separate developer profile, would satisfy the few who primarily play in docked mode.
 
Some users here asked yesterday in this thread why it might not be viable to shrink T239 to newer nodes if it's fabbed on a bleeding-edge node such as TSMC 4N. Nikkei Asia released a story that provides some good insights into why, with opinions from people inside the silicon manufacturing chain:

“The logic that the cost per transistor goes down ended, basically, with 28 nm,” said Ondrej Burkacky, Senior Partner of McKinsey & Co. Cutting-edge chips will become increasingly less affordable, he added, and advances will have to make economic sense for customers.

“While processor performance still improves, the rate of increase is declining with each generation. … We need to invest tens or hundreds of times as much in those successor technologies [beyond semiconductor physics] because the improvement to computers is so important to an economy,” said Neil Thompson, director of the FutureTech research project at the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Lab.
[chart: estimated chipmaking cost rising with each newer process node, via Nikkei Asia]



It's a good read👍

Apologies to all those who replied!

I don't know why I thought the PS4 Pro had two APUs and was not just a 2x overclock.

With that said, a dock that included its own separate T239 that provided an extra boost to hardware performance, without the need for a separate developer profile, would satisfy the few who primarily play in docked mode.
The PS4 Pro is an entirely new APU; it isn't just a 2x overclock. You might be thinking of the butterfly GPU structure, which basically mirrors the PS4's GPU layout and doubles it.

And while it's true that a dock housing the Switch 2's hardware inside would be a solution for players only interested in docked play, I'd question the economic feasibility of this, whether because it would need a separately designed dock or because the pool of docked-only players is very small. Besides, it dilutes the value of the Switch's hybrid proposition.
 
Some users here asked yesterday in this thread why it might not be viable to shrink T239 to newer nodes if it's fabbed on a bleeding-edge node such as TSMC 4N. Nikkei Asia released a story that provides some good insights into why, with opinions from people inside the silicon manufacturing chain:




[chart: estimated chipmaking cost rising with each newer process node, via Nikkei Asia]



It's a good read👍
I'm assuming this cost figure per chip assumes a constant die size rather than a constant transistor count? Obviously a 1 cm² chip on 2 nm will be a lot denser than one on 28 nm.
 
I'm assuming this cost figure per chip assumes a constant die size rather than a constant transistor count? Obviously a 1 cm² chip on 2 nm will be a lot denser than one on 28 nm.
Given that these are estimates for Apple SoCs, probably not. The biggest factors driving the rising costs are that newer nodes require more capital-intensive R&D; new materials, solutions, and techniques in the fabbing process; increased usage of EUV machinery; longer cycle times; little to no SRAM scaling; etc. All of this increases the cost of a chip fabbed on each newer node. You can see this by looking at the wafer costs per node and how quickly they're increasing, especially after 5nm and EUV. And, to boot, the gains from jumping node to node are becoming marginal while costs are surging.
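A back-of-the-envelope way to see the point: a shrink buys you density, but you pay more per wafer, and that ratio is what sets cost per transistor. The wafer prices and density ratios below are rough, publicly circulated ballparks (not Nikkei's figures), chosen only to show the shape of the trend:

```python
# Illustrative only: rough wafer-cost and density ballparks, not real quotes.
nodes = {
    # node: (approx. wafer cost in USD, logic density relative to 7nm)
    "7nm": (9_000, 1.0),
    "5nm": (16_000, 1.8),
    "3nm": (20_000, 2.0),  # SRAM barely scales, dragging the average down
}

base_wafer, base_density = nodes["7nm"]
for node, (wafer, density) in nodes.items():
    # Cost per transistor ~ (wafer cost) / (transistors per wafer).
    rel = (wafer / base_wafer) / (density / base_density)
    print(f"{node}: ~{rel:.2f}x the cost per transistor of 7nm")
```

On those ballparks, a 5nm shrink is roughly cost-neutral per transistor and 3nm is actually worse, which is exactly the "shrinking stopped paying for itself" situation the article describes.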
 
Compatibility problems, expense, diminishing returns. Nintendo has definitely explored the concept (supplemental computing devices), but they haven't done it. Not for lack of technology, but if you want games to smoothly dock and undock, you can't just move the whole thing to another GPU while the game is running. You're better off starting with a large GPU and downclocking it for portable play, which is what T239 appears to do.

I think there's more to it than we even realize as well.

I am not a programmer, never typed a line of code in my life, but I do at least understand that all these systems are very complicated, with thousands and thousands of lines of code that must work in harmony. Case in point: the whole concept of going from docked to handheld with the Nintendo Switch. There must be some code written that tells the system, "Hey, you're in docked mode, so do X, Y, and Z to the SoC." "Ok, great. Thanks!" I'm probably oversimplifying it, but my point is this whole process of going from docked to handheld, and vice versa, must work. Every. Single. Time. And work fast, work well, and each game must be coded in a way to adjust on the fly, whether it's the game itself doing it or the system; I don't know specifically. Again, with the Switch, it just works. And it works very well. I can attach and remove the Switch from the dock over and over again, and it'll always work. That to me just sounds like good programming, but maybe I'm again oversimplifying it.

I think I'm right in saying that no current eGPU setup, whether it's a Thunderbolt 4 laptop with an eGPU enclosure or the Asus ROG Ally with the XG Mobile, can do what the Switch can do in terms of just picking the device up and it works. No input needed from the user.

This is something Nintendo would want to solve with a Supplemental Computing Device. It couldn't have any lag, or require external input from the user. It would have to work like the Switch 1 currently works going from docked to handheld, and vice versa. I remember this SCD patent showing up during the Wii U days, and I do wonder if this was something Nintendo at one point prototyped, but decided it was cheaper, and less complicated if they simply had an additional chip that would communicate directly with the Switch to tell it if it was in docked, or handheld mode.

Like you said, it was probably better to have a beefier chip on board and underclock it between the two modes rather than use two separate chips.


Again, if someone smarter than me can correct me on certain details, and nuances of how the Switch works between the two modes, please feel free.
 
Can someone with knowledge describe the difference between what the CPU does, and the GPU? Like maybe in terms of a game like TOTK? (Ex: the CPU handles load times while GPU handles enemies on screen, or whatever I’m sure I’m wrong.) A few pages ago it was said the A78C is much weaker than Zen2, and I just want some real world context. From my understanding, CPU and RAM bandwidth will be the biggest bottlenecks for Switch2. I want to learn more about what CPUs do in general. Like what is a “CPU heavy” game?
 
Can someone with knowledge describe the difference between what the CPU does, and the GPU? Like maybe in terms of a game like TOTK? (Ex: the CPU handles load times while GPU handles enemies on screen, or whatever I’m sure I’m wrong.) A few pages ago it was said the A78C is much weaker than Zen2, and I just want some real world context. From my understanding, CPU and RAM bandwidth will be the biggest bottlenecks for Switch2. I want to learn more about what CPUs do in general. Like what is a “CPU heavy” game?
A CPU is a Central Processing Unit. Its main goal is to turn the game data from storage into game logic and respond to your inputs. A GPU is a Graphics Processing Unit. Its main goal is to turn that game logic into images that can be projected by the screen. If a game has a lot of data to turn into game logic, e.g. a large world, complex enemy AI, large amounts of NPCs, etc., it's considered CPU heavy. If a game has a lot of image data that needs processing, e.g. ray tracing effects, global illumination, anti-aliasing, etc., it's considered GPU heavy. A game like Total War or Shadow of Mordor is super CPU heavy because it has to calculate thousands of soldiers but has relatively light graphics, whereas Minecraft or Portal RTX is super GPU heavy because ray tracing is very graphically intensive but the game logic is relatively simple.
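A crude way to picture the split is a game's frame loop: everything in update() is the CPU's job, and draw() stands in for work handed to the GPU. This is a toy sketch with invented names, not any real engine's API:

```python
import random
from dataclasses import dataclass, field

# Toy frame loop illustrating the CPU/GPU split. update() is CPU work
# (game logic); draw() stands in for work submitted to the GPU.

@dataclass
class Enemy:
    x: float = 0.0
    def think(self, dt: float):
        self.x += random.uniform(-1, 1) * dt  # stand-in for AI + physics

@dataclass
class World:
    enemies: list = field(default_factory=lambda: [Enemy() for _ in range(1000)])

def update(world: World, dt: float):
    # CPU-heavy phase: more enemies / deeper simulation = more CPU time.
    for e in world.enemies:
        e.think(dt)

def draw(world: World) -> int:
    # GPU-heavy phase: the CPU only decides what's visible and records
    # commands; resolution, ray tracing, and anti-aliasing scale GPU cost.
    visible = [e for e in world.enemies if -50 < e.x < 50]
    return len(visible)  # stand-in for submitting draw calls

world = World()
for _ in range(60):  # one second of a 60 fps game
    update(world, dt=1 / 60)
    draw(world)
```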
 
A CPU is a Central Processing Unit. Its main goal is to turn the game data from the cartridge into game logic and respond to your inputs. A GPU is a Graphics Processing Unit. Its main goal is to turn that game logic into images that can be projected by the screen. If a game has a lot of data to turn into game logic, e.g. a large world, complex enemy AI, large amounts of NPCs, etc., it's considered CPU heavy. If a game has a lot of image data that needs processing, e.g. ray tracing effects, global illumination, anti-aliasing, etc., it's considered GPU heavy. A game like Total War or Shadow of Mordor is super CPU heavy because it has to calculate thousands of soldiers, but is relatively light graphically, whereas Minecraft or Portal RTX is super GPU heavy because ray tracing is very graphically intensive, but has relatively simple game logic.

Sorry to be pedantic, but isn't it Central Processing Unit?

EDIT: See you corrected yourself. lol
 
Can someone with knowledge describe the difference between what the CPU does, and the GPU? Like maybe in terms of a game like TOTK? (Ex: the CPU handles load times while GPU handles enemies on screen, or whatever I’m sure I’m wrong.) A few pages ago it was said the A78C is much weaker than Zen2, and I just want some real world context. From my understanding, CPU and RAM bandwidth will be the biggest bottlenecks for Switch2. I want to learn more about what CPUs do in general. Like what is a “CPU heavy” game?
At the same clock, an A78C is more powerful because it has a higher IPC (even though Zen 2 has twice the threads), but since the Switch 2 will be a portable device, which can't clock as high as stationary devices due to battery life, it will be less "powerful", and developers will have to cut down a bit on CPU-bound games, lower the framerate, or both.
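The rule of thumb behind that is performance ≈ IPC × clock (per core). With invented numbers purely to show the shape of the argument, not benchmarks:

```python
# Illustrative only: the IPC ratio and clocks below are guesses, not
# measurements, just to show why clocks dominate in a portable.
a78c_ipc, zen2_ipc = 1.15, 1.0   # suppose A78C has ~15% higher IPC
zen2_clock = 3.5                 # GHz, PS5/Series-class CPU clock

for handheld_clock in (1.0, 1.5, 2.0):  # plausible portable CPU clocks
    ratio = (a78c_ipc * handheld_clock) / (zen2_ipc * zen2_clock)
    print(f"A78C @ {handheld_clock} GHz ~ {ratio:.0%} "
          f"of Zen 2 @ {zen2_clock} GHz per core")
```

Even with an IPC edge, a chip clocked a third as high lands well under half the per-core throughput, which is why CPU-bound ports get trimmed or run at a lower framerate.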
 
...maybe Nintendo isn't the only company that needs new hardware

'cause those Xbox November sales in its best market are BAD, and this should be its best year too
 
Frankly, it doesn't matter what Xbox does at this point. They're always gonna be on the backfoot against PlayStation, and have been continuously since 2013.
The thing is, the PlayStation brand is so strong. Sony doesn't really have to do much; people will buy their consoles for the EA games, for Fortnite, for Call of Duty, etc. Microsoft has to make inroads there, but they are struggling and it really isn't their focus. Although now they have Call of Duty, lmao.
 
Can someone with knowledge describe the difference between what the CPU does, and the GPU? Like maybe in terms of a game like TOTK? (Ex: the CPU handles load times while GPU handles enemies on screen, or whatever I’m sure I’m wrong.) A few pages ago it was said the A78C is much weaker than Zen2, and I just want some real world context. From my understanding, CPU and RAM bandwidth will be the biggest bottlenecks for Switch2. I want to learn more about what CPUs do in general. Like what is a “CPU heavy” game?
GPUs are specialized processors designed to run tasks related to graphics. They are broadly responsible for generating the image that you see, but little else. Sometimes they can also be leveraged to accelerate certain kinds of compute heavy tasks, such as physics, but it's fairly situational, and not universally done. This role may sound limited, but graphics workloads are quite heavy in general, especially in high budget modern games, which is why we needed a dedicated processor for it to begin with.

There are frequently also some other, smaller dedicated processors/blocks that assist with tasks like audio and video encoding/decoding. More recently, blocks for file decompression have begun appearing, and T239 is expected to include one of those.

CPUs are general purpose processors that do basically everything else, including telling the specialized processors what to do.
 
I mean that situation is present today with the Switch as is. Look at the Batman Arkham ports or the recent Mortal Kombat game. Would you rather have the option to overclock the system (legally) and get proper performance out of the game if you owned it? We've seen clearly it's possible to get much better performance from Batman by overclocking, and no one's system is melting or anything like that. It's just an issue of battery draining faster.

To be honest a battery pack + maybe a stand for a cheap upgrade price is more consumer friendly than even the DSi and New 3DS approach where you'd have to buy a completely new system to get the benefit of extra horsepower and certainly more consumer friendly than Sony/MS' mid-gen refreshes that cost hundreds of dollars.

Nintendo could sell an official battery pack + stand for like $25 if they really wanted to. To me that's more consumer friendly than saying "hey pay another $400 for a Switch 2 Pro".

Or you could even just integrate the solution into the Joy-Cons. I mean, they are literally two attachments on the sides of the system with batteries inside of them, so why not let them function as a battery extender that unlocks a performance mode if the player wants that? If the main battery in the system is, say, 5000 mAh, but you can squeeze an extra 1200 mAh into each Joy-Con, suddenly you have an overall battery "pool" that's fairly large.
I'd prefer my ports to be functional in the first place.
 
Apologies to all those who replied!

I don't know why I thought the PS4 Pro had two APUs and was not just a 2x overclock.

With that said, a dock that included its own separate T239 that provided an extra boost to hardware performance, without the need for a separate developer profile, would satisfy the few who primarily play in docked mode.
I'm not sure what you meant by a dock with a T239. There's no power difference between using the T239 in the dock and in the console. If you meant a dock that would play Switch 2 games on your Switch, then that's a stationary console with more steps.
 
I think there's more to it than we even realize as well.

I am not a programmer, never typed a line of code in my life, but I do at least understand that all these systems are very complicated, with thousands and thousands of lines of code that must work in harmony. Case in point: the whole concept of going from docked to handheld with the Nintendo Switch. There must be some code written that tells the system, "Hey, you're in docked mode, so do X, Y, and Z to the SoC." "Ok, great. Thanks!" I'm probably oversimplifying it, but my point is this whole process of going from docked to handheld, and vice versa, must work. Every. Single. Time. And work fast, work well, and each game must be coded in a way to adjust on the fly, whether it's the game itself doing it or the system; I don't know specifically. Again, with the Switch, it just works. And it works very well. I can attach and remove the Switch from the dock over and over again, and it'll always work. That to me just sounds like good programming, but maybe I'm again oversimplifying it.

I think I'm right in saying that no current eGPU setup, whether it's a Thunderbolt 4 laptop with an eGPU enclosure or the Asus ROG Ally with the XG Mobile, can do what the Switch can do in terms of just picking the device up and it works. No input needed from the user.

This is something Nintendo would want to solve with a Supplemental Computing Device. It couldn't have any lag, or require external input from the user. It would have to work like the Switch 1 currently works going from docked to handheld, and vice versa. I remember this SCD patent showing up during the Wii U days, and I do wonder if this was something Nintendo at one point prototyped, but decided it was cheaper, and less complicated if they simply had an additional chip that would communicate directly with the Switch to tell it if it was in docked, or handheld mode.

Like you said, it was probably better to have a beefier chip on board and underclock it between the two modes rather than use two separate chips.


Again, if someone smarter than me can correct me on certain details, and nuances of how the Switch works between the two modes, please feel free.
I can only guess how it works and would also like an explanation to be honest.

But my guess would be: when the system is docked, after the USB handshake has happened and the system knows what it was docked to, it sends an interrupt and the game pauses after the last frame. It sets a flag so that from the next frame onward the engine should use the different render profile, sends a command to the GPU driver to switch to that profile (higher clocks, external output instead of the internal display), and once that's done it sends the game another interrupt to tell it to resume. Now the game starts again and renders the next frame, but with the new GPU profile.

At least that's my guess.
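That guess maps pretty naturally onto an event handler. Here it is as pure speculation in code form, mirroring the steps above; none of the names reflect actual Switch system software, and the 307/768 MHz figures are just the commonly reported Switch 1 GPU clocks:

```python
import time

# Speculative dock-transition flow; invented names, not real internals.
PROFILES = {
    "handheld": {"gpu_mhz": 307, "output": "internal display"},
    "docked":   {"gpu_mhz": 768, "output": "HDMI out"},
}

class System:
    def __init__(self):
        self.profile = PROFILES["handheld"]
        self.rendering = True

    def pause_after_current_frame(self):
        self.rendering = False  # game finishes the frame in flight, then holds
        time.sleep(1 / 60)      # stand-in for one frame of latency

    def apply_gpu_profile(self, profile):
        print(f"GPU -> {profile['gpu_mhz']} MHz, scanout -> {profile['output']}")

    def on_dock_event(self, docked: bool):
        # 1. The USB handshake has already identified what we're docked to.
        # 2. Interrupt: let the game finish its current frame, then hold.
        self.pause_after_current_frame()
        # 3. Swap clocks and display output while nothing is mid-render.
        self.profile = PROFILES["docked" if docked else "handheld"]
        self.apply_gpu_profile(self.profile)
        # 4. Second interrupt: resume; the next frame uses the new profile.
        self.rendering = True

System().on_dock_event(docked=True)
```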

Can someone with knowledge describe the difference between what the CPU does, and the GPU? Like maybe in terms of a game like TOTK? (Ex: the CPU handles load times while GPU handles enemies on screen, or whatever I’m sure I’m wrong.) A few pages ago it was said the A78C is much weaker than Zen2, and I just want some real world context. From my understanding, CPU and RAM bandwidth will be the biggest bottlenecks for Switch2. I want to learn more about what CPUs do in general. Like what is a “CPU heavy” game?
Game logic (where things are, what is happening, physics, what needs to be kept track of, what needs to be rendered, ...) -> CPU
Visuals (deciding which of the given objects are visible, rendering the scene, post-processing, filters, compositing the different rendering steps, etc.) -> GPU

That's the reason CPU load can't really scale between docked and handheld (it would mean the system handles less in handheld mode; think: struggling with 50 enemies on screen while it's fine docked, or a physics simulation that is less precise and feels different between handheld and docked), and it's also why a strong CPU is important for ports. In the worst case you would need to cut the framerate to have more time per frame for the calculations.
Easy tricks ports have used to reduce CPU load on Switch: render fewer background characters than the other consoles, have flags not move in the background, etc. But that's between different consoles; it would hardly be an option between different states of the same Switch.

On the other hand, if the GPU is weaker, you can reduce the amount of detail to reduce the work it has to do.
Render fewer pixels -> bam. Rendering the shadows with so much detail takes way too long? No problem: render everything at high resolution, but the shadows (since they're rendered in their own pass) at a quarter of the resolution.

There are obviously limits to how much you can scale GPU tasks.

And the reason why RAM can switch between fast and slow modes between docked and handheld:
it's shared with the CPU; the GPU uses less of the bandwidth when it's running at a lower clock speed, and the CPU is not using more, so there is free bandwidth going unused. Reducing the memory clock therefore needs less voltage and less power without impairing either the CPU or the GPU.
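To put rough numbers on the GPU-scaling point (pixel counts only; real costs don't scale perfectly linearly, but resolution is the first knob ports turn):

```python
# Pixel-count arithmetic behind "render fewer pixels" scaling.
def pixels(w: int, h: int) -> int:
    return w * h

docked = pixels(1920, 1080)
handheld = pixels(1280, 720)
print(f"720p is {handheld / docked:.0%} of 1080p's pixel count")  # ~44%

# "A quarter of the resolution" read as half width x half height:
# the shadow pass then touches ~1/4 as many pixels.
shadow_full = pixels(2048, 2048)      # hypothetical shadow-map size
shadow_quarter = pixels(1024, 1024)
print(f"Quarter-res shadow map: {shadow_quarter / shadow_full:.0%} of the work")
```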
 
Frankly, it doesn't matter what Xbox does at this point. They're always gonna be on the backfoot against PlayStation, and have been continuously since 2013.

I'm not a marketing expert, but perhaps Microsoft should've stopped with the goddamn confusing names and just kept it simple. Seriously, it started with Xbox, then Xbox 360 (which means it made a full circle and ended up right back where it started?), then Xbox One (which, WTF, Microsoft?), and now Xbox Series, and I still don't understand what that's supposed to mean.

Sony at least kept up the naming continuity with 1, 2, 3, 4, and now 5. Simple. Straight to the point, and everyone knows it's new.

Should've been Xbox, Xbox 2, Xbox 3, and then Xbox 4. Even if the successor to Xbox 360 was Xbox 720, it would've maintained some consistency, though I'd be curious what Xbox Series would've been named. Xbox 1080? Xbox 1440? At that point it sounds like they're referring to its resolution output.

Legitimately, the name "Xbox" is an awesome name for a gaming console, but the naming schemes they use are awful imo, which I think have at least partially contributed to them falling from grace since the 360 days.
 
I'm not a marketing expert, but perhaps Microsoft should've stopped with the goddamn confusing names and just kept it simple. Seriously, it started with Xbox, then Xbox 360 (which means it made a full circle and ended up right back where it started?), then Xbox One (which, WTF, Microsoft?), and now Xbox Series, and I still don't understand what that's supposed to mean.

Sony at least kept up the naming continuity with 1, 2, 3, 4, and now 5. Simple. Straight to the point, and everyone knows it's new.

Should've been Xbox, Xbox 2, Xbox 3, and then Xbox 4. Even if the successor to Xbox 360 was Xbox 720, it would've maintained some consistency, though I'd be curious what Xbox Series would've been named. Xbox 1080? Xbox 1440? At that point it sounds like they're referring to its resolution output.

Legitimately, the name "Xbox" is an awesome name for a gaming console, but the naming schemes they use are awful imo, which I think have at least partially contributed to them falling from grace since the 360 days.
MS are just terrified of being behind Sony in the numbering.

Imo they should have called the 360 Xbox 3 (where's 2? who cares, it's just a name) and continued with straight numbering from there.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

