> Nintendo says the next generation hardware needs unique new experiences. (And about backward compatibility.)
If anything, that just means we can expect some kind of Drake exclusives at or around launch.
> Lack of cross-gen seemingly implies that Drake might launch later than expected. However, I think it's exciting that Nintendo is keen on having exclusives for it (I would've probably complained about Switch holding games back)
I don't think that's what is implied at all. They just want some software that sells the virtues of the platform. It doesn't discount cross-gen, but it hints at some early exclusives.
> Is there any insight in these answers? They seem generic as hell, as per usual. Would the DLSS model even be a "next-gen" system? Nintendo is so fucking frustrating my god.
Nintendo follows a pattern of not hinting at anything. If it's not announced, it doesn't exist. Even if it's announced the day after, they'll speak as if nothing is coming for the foreseeable future.
"We're not at the point where we can talk about the next generation console." (original Japanese: 次世代機のことはまだお話できる段階にはありませんが)
> We're not at the point where we can talk about the next generation console.
Well, at least something exists. Not like it was a secret or anything, but still.
> Well, at least something exists. Not like it was a secret or anything, but still.
I am sure they mentioned a "next generation console or hardware" multiple times in past years.
> I think anyone in this thread could have told you Nintendo isn't ready to talk about new hardware yet
I'm still surprised people are surprised by their non-answers.
> I'm still surprised people are surprised by their non-answers.
Even I don't blink at Nintendo Q&As.
> "Good questions, guys. Yeah, Switch 2 coming out next year. You better be ready!"
Since they did not say that, it has to be 2024, right??
> How would the NVIDIA driver on your PC interacting with the in-game DLSS info be any different than the DLSS driver on NVN2 interacting with the in-game info in a Drake game?
> Both a "come on guys" statement but also a genuine question, cause I don't know the answer and I'm not all that familiar with API and driver relations.
On PC, DLSS, graphics APIs and GPU drivers are all different pieces of software that communicate with each other through generic interfaces, which allow you to wire them up in different combinations and let the OS perform various security checks. These security checks prevent a Word document sent to your email from running on your graphics hardware and then gaining access to main memory, where it scans for your credit card number and sends it to a phisherman in Ethiopia.
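Not from the post above, but a minimal sketch of the PC layering being described, using Vulkan as the generic interface: the game never calls the NVIDIA driver directly, it calls the loader, which validates the call and dispatches into whichever vendor driver happens to be installed. On a console there is exactly one driver, so NVN2 and the DLSS library can presumably be wired together far more directly, without the generic indirection.

```c
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void) {
    /* The game only ever sees this generic API... */
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo ci = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance inst;
    /* ...while the loader routes the call into the installed vendor ICD
       (NVIDIA's driver on a GeForce box, someone else's elsewhere). */
    if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(inst, &count, NULL);
    printf("%u GPU(s) reachable through the generic interface\n", count);

    vkDestroyInstance(inst, NULL);
    return 0;
}
```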
> I will repeat again.
> Seeing COD:MW2 numbers, it would be a very smart move for Nintendo to get a port of it (with the rumoured expansion) for the Drake launch window, alongside Warzone 2 and past remasters.
> COD + Rockstar stuff in year 1 would be a killer combination that could increase the appeal of Nintendo systems in the West. In the Asian market they have a very big advantage over the competition, but they need to strengthen themselves in overseas markets.
Nintendo should try to get more games like Genshin as well.
> How much stronger would 8 A78s at 1.1GHz be than 8 Jaguars at 1.6GHz?
Way, way stronger. Jaguar wasn't even that good against contemporary ARM cores (for the same performance, the A57 consumed less power, for example).
The PS3/Xbox 360 generation isn't really comparable to the current generation in CPU terms, though. The PS3 and Xbox 360 CPUs were very, ahem, idiosyncratic CPUs which had a lot of theoretical performance but required code to be carefully tailored to actually achieve that.
The PS3's Cell CPU in particular is a CPU with massive theoretical performance, but made in such a way that achieving anywhere close to that theoretical performance is very challenging. While people have claimed that it was ahead of its time in terms of moving to multi-core CPUs, in reality it represented a design philosophy which quickly became obsolete. The PPE was pretty typical of this philosophy, an in-order PowerPC core with a long pipeline and a high clock speed. The 7 SPEs, though, took this to the extreme. The SPEs:
- Were in-order
- Had a long pipeline
- Had no cache
- Had no branch predictor

If you weren't tailoring your code very carefully for the SPE, this is a recipe for horrible performance. IBM's paper on the SPE design rather nonchalantly notes that the penalty for a mispredicted branch is 18 cycles.
An 18-cycle mispredict penalty when you don't even have a branch predictor is pretty scary stuff. The recommendation is, pretty much, just not to use branches. For example, on SPEs, the recommended approach for loops was to unroll them. That is, instead of having a standard for loop which iterates, say 100 times, you just duplicate the code 100 times instead, hence avoiding branches and avoiding the branch mispredict penalty.
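As a rough illustration of what that trade looks like (plain C rather than real SPE SIMD intrinsics, so treat it as a sketch of the idea only):

```c
/* The same work written with a loop (one conditional branch per iteration)
   and fully unrolled (no branches left to mispredict). */
void add4_loop(float *out, const float *a, const float *b) {
    for (int i = 0; i < 4; i++)   /* branch at the bottom of every iteration */
        out[i] = a[i] + b[i];
}

void add4_unrolled(float *out, const float *a, const float *b) {
    out[0] = a[0] + b[0];         /* duplicated body: more code, zero branches */
    out[1] = a[1] + b[1];
    out[2] = a[2] + b[2];
    out[3] = a[3] + b[3];
}
```

Scale that duplication up to the 100-iteration case and you can see why SPE code ballooned in size.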
The lack of cache was another thing that had to be carefully worked around. Each SPE had 256KB of SRAM, and there was a DMA engine to pull in data from main memory, so if you made sure the SPE always had the data it needed in SRAM you'd be okay, but if the SPE ever tried to access main memory the latency would be enormous and the SPE would stall completely until the data arrived.
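The standard workaround was double-buffering: DMA chunk N+1 into one half of the local store while computing on chunk N in the other half. A rough sketch, using the MFC intrinsic names from IBM's SDK (spu_mfcio.h) as best I remember them, so treat the exact signatures as approximate:

```c
#include <spu_mfcio.h>  /* IBM Cell SDK MFC intrinsics; names approximate */

#define CHUNK 16384     /* 16KB, the maximum size of a single DMA transfer */

static volatile char buf[2][CHUNK] __attribute__((aligned(128)));

void process(volatile char *data, unsigned size);  /* the actual work */

/* Keep the DMA engine filling one buffer while the SPE computes on the
   other, so execution never stalls waiting on main memory. */
void consume(unsigned long long ea, unsigned nchunks) {
    unsigned cur = 0;
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);          /* start first transfer */
    for (unsigned i = 0; i < nchunks; i++) {
        unsigned nxt = cur ^ 1;
        if (i + 1 < nchunks)                          /* prefetch chunk i+1 */
            mfc_get(buf[nxt], ea + (unsigned long long)(i + 1) * CHUNK,
                    CHUNK, nxt, 0, 0);
        mfc_write_tag_mask(1u << cur);                /* wait on chunk i only */
        mfc_read_tag_status_all();
        process(buf[cur], CHUNK);                     /* compute while DMA runs */
        cur = nxt;
    }
}
```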
The first line of the paper is actually quite telling about the design philosophy, as it calls the SPE an "11 FO4 streaming data processor". FO4 isn't a phrase you hear too often these days about CPU designs, but it was arguably the key metric for CPU designers in the early 2000s. FO4 (fan-out-of-4) measures the delay of each stage in a pipelined processor in units of a standard inverter delay. A lower FO4 means each stage in the pipeline is electrically simpler, which means signals propagate through it more quickly and therefore you can hit higher clock speeds. Chasing lower and lower FO4 values was the design philosophy of the time, motivated by the notion that ever-shorter stages would unlock optimal theoretical performance. This meant longer and longer pipelines, and the Pentium 4, the Xbox 360's Xenon, and the PS3's Cell were all products of this design philosophy.
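For rough scale (my numbers, not the paper's): Cell shipped at 3.2GHz, i.e. a ~312ps cycle. An FO4 delay on the 90nm processes of the day was on the order of ~28ps, and 11 × 28ps ≈ 310ps, so "11 FO4" is pretty much a recipe for a 3.2GHz part, with each pipeline stage doing only about eleven inverter-delays of work per cycle.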
Another popular idea at the time was that compilers (or just software developers) would become smart enough to deal with even the most esoteric design, so CPU designers could just focus on theoretical performance and leave it to someone else to figure out how to actually make use of it. Intel's ill-fated Itanium processors, with their VLIW design, were another example of this somewhat earlier than Cell.
And have CPUs since Cell followed its lead in chasing low-FO4 designs with long pipelines and throwing out useless trinkets like caches and branch predictors? Well, no. In fact, CPU design has since gone in exactly the opposite direction. The Core 2 Duo was an early example of this, already on the market just before the PS3 hit. It had a shorter pipeline and lower clock speeds than the preceding Pentium 4, but outperformed it in most real-world code, because it was designed around how real-world code would actually run on it, rather than around theoretically optimal code.
If you look at the arguable leader in modern CPU design, Apple, and how they've managed to achieve that level of performance, it's by doing exactly the opposite of what Sony and IBM did with Cell. Rather than pushing long pipelines and high clock speeds, they've got large, low-latency caches, an extremely large re-order buffer for out-of-order execution, and very sophisticated branch prediction. Apple are pushing things a bit more than others in this direction, but this is representative of modern CPU design; try to make as much real-world code run as well as possible, rather than asking developers to contort themselves to reach the chip's theoretical performance.
Getting back to my original point, the Core 2 Duo was designed so that, if you were to just write some typical code without much care for what CPU it would run on, it would outperform the likes of a Pentium 4 or Xenon or Cell. Early games in the PS3/Xbox 360 generation wouldn't have been heavily optimised around the Xenon and Cell CPUs; however, if there's one place where software developers get to tightly optimise their code around specific hardware, it's the games industry. Over the course of the generation AAA developers figured out how to squeeze more and more out of those CPUs, and the likes of the Core 2 Duo weren't able to keep up with the performance you could get out of Xenon or Cell when code was extremely well optimised for them.
The PS5 and Xbox Series X/S are in a very different position. Not just because the CPU cores are the same as or very similar to those used in many gaming PCs, but because the CPUs aren't weird esoteric designs which take years for developers to properly leverage. Developers will undoubtedly get more out of them over the generation, and as cross-gen games disappear and we see more 30fps current-gen exclusives, PCs will need more power to hit 60+fps, but it's not going to be anywhere near what we saw in the PS3/Xbox 360 generation.

> The PS3/Xbox 360 generation isn't really comparable to the current generation in CPU terms, though. The PS3 and Xbox 360 CPUs were very, ahem, idiosyncratic CPUs which had a lot of theoretical performance but required code to be carefully tailored to actually achieve that.

I agree that the difference won't be as dramatic as that gen, but I believe that developers will sooner or later start to optimize towards the PS5/XSX's smaller caches, and the difference in perf/clock between Ariel and desktop Ryzen will lessen as the generation unfolds. Then I believe that the advantages the PS5/XSX do have will make systems that do not outperform them on paper, like a system with lower core counts, start to age towards the end of the gen.
> How much stronger would 8 A78s at 1.1GHz be than 8 Jaguars at 1.6GHz?
Like a little over twice or so.
Clocking it that low isn’t doing any favors.
If you clocked it at 1.6GHz, it should be a bit over 3x the CPU performance.
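Back-of-envelope on that: if A78 really is ~3x Jaguar per clock, then at matched 1.6GHz clocks you'd get the full ~3x, and at 1.1GHz you scale by the clock ratio, 3.0 × (1.1 / 1.6) ≈ 2.1x, which is where "a little over twice" comes from.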
With Nintendo’s change in FY forecast, does this cast any doubt on a March launch?
It feels like, if anything, it would mean that the new system's sales would not be tracked in these numbers, correct?
Yes, I know the leading/convenient theory is May; just talking about the "early 2023" option that's been floated for ages.
> I mean, I figure it would be much stronger, but I was wondering ballpark. Like, I know A78 is about 3x A57 clock for clock. So I was thinking core for core, clock for clock, A78 vs Jaguar would be about 3x. And 1.1GHz vs 1.6GHz should produce about double the performance, roughly?
> It's much more performant, but I'm just trying to scale it, I guess.
Lots of folks would look at IPC numbers, but you can't do that when comparing x86 and ARM architectures. Even if you could, it ignores things like Jaguar's weak cache design and remarkably bad branch predictor. There are some benchmarks, however, that suggest a 2.5x perf increase at the same clocks.
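Same back-of-envelope with that benchmark-derived figure: 2.5 × (1.1 / 1.6) ≈ 1.7x, so the honest ballpark for 8 A78s at 1.1GHz over 8 Jaguars at 1.6GHz is somewhere between ~1.7x and ~2.1x.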
> With Nintendo's change in FY forecast, does this cast any doubt on a March launch? It feels like, if anything, it would mean that the new system's sales would not be tracked in these numbers, correct?
I think anything in this FY was written off. If it was launching before April, we'd know by now, akin to the initial Switch reveal.
> What's your major issue with clocking it that low?
Not everything will scale simply with better IPC. Better IPC is a useful generic way to look at it, but it's more nuanced than that.
> I see. I kind of assumed, especially given so many peoples' stances on "they absolutely will not announce it this holiday at the risk of tanking sales", that it would need a January or later reveal.
> What you're saying kind of implies March was never an option, but there's been plenty of arguments for a reveal-to-launch timeframe of a couple of months, so I'm not sure why it's not possible?
It's still possible, but I think it's unlikely. A two-month window would be cutting it very close, and I believe leaks would be more full-force than they are now. There was quite a bit we learned about the Switch prior to its October reveal.
> Lots of folks would look at IPC numbers, but you can't do that when comparing x86 and ARM architectures. Even if you could, it ignores things like Jaguar's weak cache design and remarkably bad branch predictor. There are some benchmarks, however, that suggest a 2.5x perf increase at the same clocks.
This is an Orin benchmark against one of the various Jaguar APUs. There are very few Orin benches out there, but this one tracks with some smartphone benches I've seen, and the user is well enough known in the Nvidia community that I'm fairly certain he didn't screw up Orin's power config (which I've seen several benchmarkers do). And this isn't even one of the better Orin scores.
> It's still possible, but I think it's unlikely. A two-month window would be cutting it very close, and I believe leaks would be more full-force than they are now.
I do think the lack of recent leaks is suspicious, but I also think the number of leaks we got through last year was substantial.
> I never saw March anyway
Smarch was always the obvious option.
> Smarch was always the obvious option.
I'm partial to Ju-march-ne myself.
Isn't production improving? I really doubt that Nintendo would release a new console next year if they are starting to produce more Switches; surely they will wait till early 2024 for a Drake reveal and release it during summer 2024, most likely? By then the Switch will have sold 130+ million without a price drop, at least.
I'm SO ready to consume every millisecond of every DF video about Drake, every comparison, every discussion, every frame...
Nintendo could have put Zelda anywhere. Why May 12?
she says, as an enabler of consumption
> Because that's when they think they can finish it? It's a Friday, and the developers probably did reasonable estimates of work left with some buffer. I think the only prerequisite they'd have is on or after Drake launch… I'll never forgive them or understand their decision-making otherwise.
I guess I should say: why not June? Or late May, even. It's a weird day.