
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

No way Nintendo EPD 3 is working on the new Zelda right now with textures at the same resolution as in BotW or TotK. Yeah, they won't push photorealism, but if they can push further in another art style, they will push.

Maybe. But by the time that next big revolutionary 3D Zelda game comes out, the next hardware upgrade AFTER this Drake hardware will be coming out.

I think the Switch ecosystem and Nintendo will feel perfectly fine with TotK-type gaming (in terms of grandness, scale and gameplay) for another 5 years or so at least. This new hardware will just make these games look and run fantastic, as if they were released on modern hardware from this decade.
 
I think it was needed a long time ago.

We shouldn't try and force a conversation on nonsense because it's quiet. When there is something interesting or legit to talk about the thread will blow up. This thread does not have to be consistently active.
I get that, but that's something that'll just happen on its own. Enforcing the thread rules can help, but I just think it's getting excessive.
 
Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.
It's confirmed, but hope is not lost.

Nvidia's own documentation on DLSS seems to paint a very different picture of runtime speed compared to Rich's test. It's possible he ran into some kind of significant bottleneck.
 
Fashion Dreamer has launched, coloured buttons in tow. As has been explored, the coloured buttons are a pretty big coincidence if it's just a coincidence.
 
It's confirmed, but hope is not lost.

Nvidia's own documentation on DLSS seems to paint a very different picture of runtime speed compared to Rich's test. It's possible he ran into some kind of significant bottleneck.
Fair enough. I wasn't losing hope, was just curious as to how and when it was deconfirmed.


Well those are indicative specs as well, hard to judge without the CPU, customisations, total RAM etc. I imagine there will be some aspects that are better than what Rich came up with while others might not be as good.
 
Sorry to bother, as all these terms you're all using are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?
 
Sorry to bother, as all these terms you're all using are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?
If you haven't watched the DF video, go ahead and do it now. Don't worry about the tech stuff if you're not sure what it means; you should still get some ideas from the visuals alone.

Keep in mind that this is the floor of what Switch 2 can do (meaning it's most likely going to be better than what you see in the DF video).

If anyone was disappointed, it's probably because they set themselves up with unrealistic expectations. Switch 2, because of its docked/handheld form factor, isn't meant to compete with PS5 and XSX's raw power.
 
Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.
Yes. LiC checked it and the DLA mentions were removed.
Follow-up: I'm double-checking, and there are some auto-generated files for T239 in the leak that remove most mentions of DLA while adding the FDE.
In my notes I've got two places where DLA definitions show up publicly for Orin but not Drake: T234/T239, and T234/T239. These are from a June 2021 commit, and work was ongoing as you mentioned, so we don't know for sure that they're final, but at this stage they at least imply some differences. The second one is also where the NVJPG, PVA, and camera blocks appear to be removed from T239.

For completeness: I never really bothered reporting on DLA-related findings from the leak because the info seemed incomplete and not important to me. I think the one file that mentions it for T239 is just an identical copy of T234's file for the same classes, including the PVA and camera stuff, so it looks like it just wasn't updated there.

That's not my understanding.

Oldpuck said it was a misstep of his, DLA is NOT eliminated.

Edit: Or does that mean DLA won't be included on Drake? I've been known to misinterpret
You misunderstood him. "I couldn't locate the DLA stuff (Differences between T239 and T234 to see if DLA was removed from T239), and told Rich that (The DLA) it wasn't eliminated (from T239)".

However, LiC showed that the DLA was indeed removed from T239.

Sorry to bother, as all these terms you're all using are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?
I suggest that you watch both DF videos: the one on the main channel and the one on DF Clips. But if you want a TL;DR without delving into tech, Switch 2 will be a proper generational leap and will be able to run current-gen (PS5/XSeries) games with much more ease and less effort from developers compared to all the cuts devs had to make for Switch downports. This level of jump in performance is basically as if Nintendo went from Wii to Wii U, and Nintendo games on the new hardware will look downright fantastic.
 
You misunderstood him. "I couldn't locate the DLA stuff (Differences between T239 and T234 to see if DLA was removed from T239), and told Rich that (The DLA) it wasn't eliminated (from T239)".

However, LiC showed that the DLA was indeed removed from T239.
Ah okay. Not as tech savvy on this as some of you, but this would mean DLSS usage would be pretty costly based on Rich's DF mention, right? (like 18ms for a certain scenario). But some of that can be mitigated by having better clock speeds (better than the ones Rich/DF was able to use for the video), right?
 
RTX 2050 is Ampere based, hence why he's using it as an approximate GPU to T239. RTX 2050 = RTX 3050M but with half the memory bus. They use the same die, GA107.

Think of the RTX 2050 as the GTX 16xx of the Ampere generation. They used the new architecture, but Nvidia named them as if they were part of the generation before. RTX 3050/Ti, RTX 2050 and MX570 are all Ampere-based GPUs built on the same die, GA107, but with different power budgets, memory, memory clocks and memory bus.
I don't understand why they couldn't just be normal and call it something like an RTX 3040. RTX 2050 is just a pointlessly confusing name.
 
Fair enough. I wasn't losing hope, was just curious as to how and when it was deconfirmed.


Well those are indicative specs as well, hard to judge without the CPU, customisations, total RAM etc. I imagine there will be some aspects that are better than what Rich came up with while others might not be as good.
I believe DLSS in particular will likely run very close to optimal speed on Drake, and it might not have on Rich's setup.

Considering Nintendo and Nvidia have had DLSS in mind since the conceptual stage of the NG hardware.
 
Would 1440p 30fps docked mode translate closely to 1080p 30fps handheld mode?
Probably! I brought up 720p only because the benchmark includes native 720p performance, which doesn't quite hit 60 on that rig, and 720p is the internal resolution for 1440p Performance mode.

Which is a roundabout way of saying "DLSS overhead isn't the reason 1440p can't hit 60fps, it's just the game itself"
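To make the resolution relationship concrete, here's a rough sketch using the commonly cited DLSS mode scale factors (treat the exact factors as approximate; games can override them):

```python
# Rough sketch: internal (render) resolution per DLSS mode.
# Scale factors below are the commonly cited per-axis defaults, not confirmed Switch 2 values.
DLSS_MODES = {
    "Quality": 2 / 3,          # ~0.667x per axis
    "Balanced": 0.58,
    "Performance": 0.5,        # 1440p output -> 720p internal, 4K -> 1080p
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_MODES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
```

Per those factors, 1440p Performance docked and 1080p Quality handheld would both render internally at 720p, which is why the two tend to track each other.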
 
If you haven't watched the DF video, go ahead and do it now. Don't worry about the tech stuff if you're not sure what it means; you should still get some ideas from the visuals alone.

Keep in mind that this is the floor of what Switch 2 can do (meaning it's most likely going to be better than what you see in the DF video).

I saw the video, yeah, but wasn't really sure because I didn't understand the terms used. But it looked alright to me. I wasn't bothered by the 30fps whatsoever. But if this could be the lowest setting, so to speak, of what the Switch 2 could do, and it could actually be better than what the DF video showed, I'm not one to complain. Thanks😊
 
Why wouldn't they? If they can get extra performance for essentially free, they should, especially on an 8-inch tablet screen where the occasional visual error is much less noticeable.
DLSS is magic, but it's not infinite magic. 360->1080 looks noticeably rougher than 540->1080. So, not free. And no, being on an 8" screen doesn't make the differences unnoticeable, any more than it's unnoticeable when a Switch portable game uses subnative resolution.
If DLSS scales this badly at 2 or 4 teraflops, then the Switch 2's potential is massively lowered, as DLSS would be unusable and FSR2 would likely be unusable as well (forcing the Switch 2 to have most of its cycles eaten up by native resolution), but I'm pretty doubtful of this.
FSR2 being unusable seems especially unlikely, considering it's started showing up in Switch games.
You reduce the output resolution just enough to get below 16ms and scale up the rest of the way to 4K.
I don't know, man, if the question is "Doesn't that make X impossible?", "Do not do X" isn't much of a counterpoint, it's just agreeing.
Question though, and I know this has been talked about heavily in this thread, so apologies if it's been confirmed as not possible, but why wouldn't it be possible to DLSS from 480p? I noticed DF didn't bother testing that at all in the video, unless I missed it.
You can go from any resolution to any resolution. It's just that the greater the difference between input and output resolution, the worse a job the final result will do at fooling you into thinking it's a proper version of the output resolution. 480->720, you'll accept it as 720 easy. 480->1080, you might accept it as 1080 most of the time. 480->4K, you will think that is one messed up 4K.
For a non-serious example, though, my favorite is this one: 72p to 1440p.


Here's the fuller version, which also includes 144p and 288p. 288->1440 would proportionally be similar to 432->4K.
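To put rough numbers on those examples (purely illustrative arithmetic, nothing Switch 2 specific):

```python
# Illustrative arithmetic: how much of the output frame the upscaler has to synthesize
# for each input -> output pair mentioned above.
pairs = [
    ((854, 480), (1280, 720)),    # 480p -> 720p
    ((854, 480), (1920, 1080)),   # 480p -> 1080p
    ((854, 480), (3840, 2160)),   # 480p -> 4K
]

for (iw, ih), (ow, oh) in pairs:
    per_axis = oh / ih
    pixel_ratio = (ow * oh) / (iw * ih)
    print(f"{ih}p -> {oh}p: {per_axis:.2f}x per axis, "
          f"~{pixel_ratio:.1f} output pixels per rendered pixel")
# 480p -> 720p:  1.50x per axis, ~2.2 output pixels per rendered pixel
# 480p -> 1080p: 2.25x per axis, ~5.1
# 480p -> 4K:    4.50x per axis, ~20.2
```

At 480p -> 4K the upscaler is inventing roughly 19 out of every 20 output pixels, which is why it reads as "one messed up 4K".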

The big problem with the DLSS testing in the vid is that it ignores the fact that PC games often scale stuff like LOD depending on OUTPUT resolution.
Obviously 4K DLSS looks unfeasible: because they're running the 4K LODs and/or post-process and/or textures.
Well, that's the way it SHOULD be done if you want to produce an image that looks proper 4K, rather than just... 1080p with benefits, or whatever.
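For the curious, this is also why output resolution matters even when the internal render resolution is low: upscaler integration guidance (DLSS included, as far as I understand it) has games bias texture mip selection toward the output resolution, roughly like this (illustrative sketch, not an exact spec):

```python
# Sketch of the texture LOD (mip) bias commonly recommended when using an upscaler:
# sample textures as if rendering at the OUTPUT resolution, not the internal one.
# Formula per my understanding of the published guidance; treat as illustrative.
import math

def texture_lod_bias(render_width, display_width):
    return math.log2(render_width / display_width)  # negative when upscaling

print(texture_lod_bias(1920, 3840))  # -1.0: one mip level sharper than native 1080p would use
print(texture_lod_bias(1280, 2560))  # -1.0
```

That sharper texture sampling (plus output-resolution LODs and post-processing) is part of what separates proper 4K output from "1080p with benefits".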
I still don't understand why we're expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?
Because it's the first one with hardware that specifically allows it to reach such high resolutions decently from lower-resolution input.
 
Ah okay. Not as tech savvy on this as some of you, but this would mean DLSS usage would be pretty costly based on Rich's DF mention, right? (like 18ms for a certain scenario). But some of that can be mitigated by having better clock speeds (better than the ones Rich/DF was able to use for the video), right?
I think instead of fixating on the 18ms frametime cost of DLSS 4K that Rich showed, it's better to just think of DLSS as not being a free resource. But to answer you properly, yes, DLSS 4K would be pretty costly from what Rich gathered. That being said, there are ways to mitigate the cost, like having DLSS work on an already-buffered frame while the engine works on the next frame, in exchange for increased latency. But that's up to the devs.

As I said, for now, just think of DLSS 4K as expensive and not a "free lunch". For some games it will be unfeasible to target DLSS 4K, and they might go with DLSS 1440p, 1800p, etc.
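Some back-of-envelope frame budget math, using the ~18ms DLSS-to-4K figure from the video (purely illustrative; real costs depend on clocks and how the work is scheduled):

```python
# Back-of-envelope frame budget math. The 18 ms figure is the one quoted from
# Rich's test at 4K; actual costs on final hardware are unknown.
def remaining_budget_ms(target_fps, dlss_cost_ms):
    budget = 1000.0 / target_fps  # total frame time available
    return budget - dlss_cost_ms

print(remaining_budget_ms(30, 18.0))  # ~15.3 ms left for everything else in the frame
print(remaining_budget_ms(60, 18.0))  # negative: doesn't fit serially in a 60 fps frame
print(remaining_budget_ms(60, 3.0))   # ~13.7 ms if a cheaper target/cost is used
```

Which is the point: at that cost, 4K output only fits a 30fps budget unless the work is pipelined or the target is lowered.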
I don't understand why they couldn't just be normal and call it something like an RTX 3040. RTX 2050 is just a pointlessly confusing name.
Agreed. I guess their thinking was that RTX was too premium of a brand to be used on an xx40-class part. But they just made things very confusing in the process. IMO they should have simply called it the MX570 and cancelled the real MX570 (which makes zero sense and was barely picked up by laptop OEMs).
 
I don't understand why they couldn't just be normal and call it something like an RTX 3040. RTX 2050 is just a pointlessly confusing name.
It is confusing, but it's not entirely pointless. The 2050M wasn't launched with the rest of the 20 series (obviously, it's 30-series silicon). RTX 20 cards were given a second launch during the pandemic for capacity reasons, using bins of dies that might ordinarily be scrapped, or just warehoused dies that would have been cut in price radically when the 30 series came out. The 2050M was part of the same move, taking a GA107 that likely would have been completely scrapped in ordinary times.

I don't think Nvidia wanted to call attention to a product that had previous gen performance, and that they intended to retire the moment the chip shortage ended.
 
Sorry to bother, as all these terms you're all using are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?
Yay.

Switch 2 will be able to run pretty much any next-generation game at 30fps and DLSS 4K, especially with the specific optimization that goes into consoles vs. PC, the fact that the real-life docked clock speeds will be higher, and the continued optimization of DLSS, which will only get better considering more developers will be using it in a closed console environment.
 
Ah okay. Not as tech savvy on this as some of you, but this would mean DLSS usage would be pretty costly based on Rich's DF mention, right? (like 18ms for a certain scenario). But some of that can be mitigated by having better clock speeds (better than the ones Rich/DF was able to use for the video), right?

And via DLSS Concurrency. Add an extra frame of latency, but then get more responsiveness back due to the higher framerates.
 
I don't understand why they couldn't just be normal and call it something like an RTX 3040. RTX 2050 is just a pointlessly confusing name.
Sense doesn't make sense in the video game industry. Shame on you for thinking otherwise.
I saw the video, yeah, but wasn't really sure because I didn't understand the terms used. But it looked alright to me. I wasn't bothered by the 30fps whatsoever. But if this could be the lowest setting, so to speak, of what the Switch 2 could do, and it could actually be better than what the DF video showed, I'm not one to complain. Thanks😊
Honestly, your comment made me realise something else. There are a lot of people in this world who don't particularly care about 60fps on everything, and a lot of Switch games don't even hit that marker consistently. I think the Switch would be fine with 30fps as a bare minimum, and that's all cool and stuff. Besides, not every game has 60fps as a standard on "high power" systems anyway. No point losing sleep over that. 60fps is nice, but thinking it's a requirement is a bit goofy.

Also, resolution-wise, as long as the game goes as high as it can after DLSS, I'm happy with some games going a bit lower, sacrificing resolution for a 60fps boost.
 
that they intended to retire the moment the chip shortage ended.
Well, if that's the case, then they failed, because it's being picked up by OEMs for CUDA-compatible office laptops and cheap gaming laptops. I think OEMs fully intend to replace the RTX 3050 4GB with the RTX 2050 and upsell the RTX 3050 6GB.
 
But some of that can be mitigated by having better clock speeds (better than the ones Rich/DF was able to use for the video), right?

Not as tech savvy here either, but when Rich tried the higher clocks, it was already beyond what we could see on Switch 2 based on what we know about it (because the 2050 already has 33% more CUDA cores and tensor cores).

So, at 750MHz I would say you would need Drake at 1GHz to equal the raw performance. Anything we get past that is profit lol
Of course, I'm talking theoretically here, nothing more. How Switch 2 will work (in comparison with this tested setup) is something we can't know.
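The raw math behind that, for anyone who wants it (core counts are the commonly cited ones: 2048 for the RTX 2050 and 1536 for T239 per the leak, so treat this as ballpark only):

```python
# Rough FP32 throughput comparison behind the "750 MHz vs 1 GHz" point.
# Core counts are assumptions from public reporting, not confirmed final specs.
def tflops(cuda_cores, clock_ghz):
    return 2 * cuda_cores * clock_ghz / 1000  # 2 FP32 ops per core per cycle

print(tflops(2048, 0.75))  # ~3.07 TFLOPS: RTX 2050 downclocked as in the test
print(tflops(1536, 1.00))  # ~3.07 TFLOPS: Drake would need ~1 GHz to match
```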
 
Not as tech savvy here either, but when Rich tried the higher clocks, it was already beyond what we could see on Switch 2 based on what we know about it (because the 2050 already has 33% more CUDA cores and tensor cores).

So, at 750MHz I would say you would need Drake at 1GHz to equal the raw performance. Anything we get past that is profit lol
Of course, I'm talking theoretically here, nothing more. How Switch 2 will work (in comparison with this tested setup) is something we can't know.

I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

It's not comparable to the Steam Deck. This is dedicated hardware with a custom chipset, not a sized-down PC that has to be compatible with "everything".
 
I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?
Yep, since the demoed 2050 had less bandwidth than what the T239 already has in its LPDDR5 configuration. Not by much, but it essentially gets all of the benefits in comparison.
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.
What, not better? It literally doubled the framerate in most situations, not to mention the higher settings all of the games had in comparison.
 
I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?

It seems the advantage for Switch 2 is gonna be more RAM for the GPU, but less bandwidth (because it has to share the bandwidth with the CPU, while on PC the GPU and CPU each have their own memory)

But I don't have enough knowledge here, and I imagine it's too hard trying to compare a closed system with a PC when the specs aren't that far from each other.
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

Don't be too pessimistic. I'm repeating myself here, but Switch 2 will have several advantages:

  • dedicated console (good for optimisation) with lighter OS
  • better GPU RAM across the board
  • the ability to offload the DLSS rendering to the next frame, making it essentially "free" at the cost of an extra frame of latency, which could then be countered by the higher framerate (rough numbers sketched below)
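A toy illustration of that last point (the numbers are made up purely to show the trade-off, not measured):

```python
# Toy latency math for the "extra frame of latency" trade-off above.
# Assumes latency is roughly frames-in-flight * frame time; real pipelines vary.
def latency_ms(fps, frames_in_flight):
    return frames_in_flight * 1000.0 / fps

print(latency_ms(30, 2))  # ~66.7 ms at 30 fps without the extra DLSS stage
print(latency_ms(60, 3))  # ~50.0 ms at 60 fps even with one extra frame in flight
```

So if pipelining DLSS is what gets a game from 30 to 60, end-to-end latency can still come out ahead despite the extra frame.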
 
For real. The Switch 2's T239 system-on-a-chip having RT cores and tensor cores (for AI processing) means that, after multiple generations, Nintendo is using hardware features more advanced than the PS5's and Xbox Series X's 🤩
Arguably the better chip design as well. ARM seems to be the future.
 
Yep, since the demoed 2050 had less bandwidth than what the T239 already has in its LPDDR5 configuration. Not by much, but it essentially gets all of the benefits in comparison.

Thanks, that's what I thought.

It seems the advantage for Switch 2 is gonna be more RAM for the GPU, but less bandwidth (because it has to share the bandwidth with the CPU, while on PC the GPU and CPU each have their own memory)

But I don't have enough knowledge here, and I imagine it's too hard trying to compare a closed system with a PC when the specs aren't that far from each other.

Are you sure that's how it works? I thought that each unit would have full bandwidth within the amount of RAM they were using, they just have to share the RAM.
 
Thanks, that's what I thought.



Are you sure that's how it works? I thought that each unit would have full bandwidth within the amount of RAM they were using, they just have to share the RAM.
My understanding is the interface supports a maximum total bandwidth regardless of what is doing the accessing. It's one big pipe with two mouths to feed.
 
I get that, but that's something that'll just happen on its own. Enforcing the thread rules can help, but I just think it's getting excessive.
I disagree.

Has anyone even been banned? I've just seen a bunch of warnings. It's not that big of a deal. If there was a wave of bans maybe I'd see your side, but I really don't. I don't see where it's been excessive at all.
 
Are you sure that's how it works? I thought that each unit would have full bandwidth within the amount of RAM they were using, they just have to share the RAM.

I have always read (elsewhere) that when you have one single pool of memory, the bandwidth is shared between the CPU and GPU. But if the way it works is that only one of them has access to the RAM at a time, then I think it shouldn't matter and both should use the full bandwidth all the time?

But then everything I've read so far in other places was a lie and there was never a problem to begin with lol, which would be weird if this is the case.

EDIT: thinking about it, if we have a system where the CPU and GPU have their own memory, they can access these memories at the same time. But if you have a system with only one pool, then the CPU needs to wait for the GPU to finish its work before it can use the RAM too, and vice versa (if both aren't accessing it at the same time), so when we talk about bandwidth here we're also talking about time. If the GPU has to wait for the CPU to use the RAM, then the GPU will have less time to use it, which means less bandwidth in practice. And if both access it at the same time (if that's how it could also work), then bandwidth is still shared anyway. Someone correct me if I'm wrong here.
 
I disagree.

Has anyone even been banned? I've just seen a bunch of warnings. It's not that big of a deal. If there was a wave of bans maybe I'd see your side, but I really don't. I don't see where it's been excessive at all.
You have a point. The constant warnings just seem tantamount to nagging to me, especially when I don't consider going off topic all that detrimental to the thread.
 
Sense doesn't make sense in the video game industry. Shame on you for thinking otherwise.

Honestly, your comment made me realise something else. There are a lot of people in this world who don't particularly care about 60fps on everything, and a lot of Switch games don't even hit that marker consistently. I think the Switch would be fine with 30fps as a bare minimum, and that's all cool and stuff. Besides, not every game has 60fps as a standard on "high power" systems anyway. No point losing sleep over that. 60fps is nice, but thinking it's a requirement is a bit goofy.

Also, resolution-wise, as long as the game goes as high as it can after DLSS, I'm happy with some games going a bit lower, sacrificing resolution for a 60fps boost.

I don't care too much what third parties do, but I really hope Nintendo focuses on 60fps this gen (another reason why I want the system to get as close to Series S as possible in raw flops before DLSS), or offers performance modes.

30fps may be fine for some, but the age of LCD is dying off fast, and QD-OLED/OLED and whatever technologies come later have near-instant response times, making 30fps look like an absolute slideshow to people like me without some kind of motion blur setting.
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.
The video shows every game tested exceeding the Steam Deck in performance, image quality and resolution.

I’m not sure what it would take to make you excited?

I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?
It’s a bit apples and oranges. More RAM, better latency, but the bandwidth pool is only slightly larger and has to feed the CPU as well, which the laptop’s doesn’t.

Ultimately, the only way to get closer is to wait for the thing itself to come out.
 
So would it be possible to do something like

0-16.6 ms: CPU works on first frame
16.6-33.3 ms: CPU works on second frame, GPU works on first frame
33.3-50 ms: CPU works on third frame, GPU works on second frame, tensor cores do DLSS step on first frame

etc

But with the latency of a game running at 60 Hz by separating game logic from the rendering?

Or am I missing something and saying something stupid here?

I would very much like this to be the case so that Smash is forced to separate game logic from rendering so that rollback is easy to introduce.
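To sketch out what I mean (assuming each stage fits in one 16.6ms slot, which is of course the big if):

```python
# Sketch of the three-stage schedule described above (CPU sim -> GPU render -> tensor-core DLSS),
# assuming each stage fits in one 60 Hz slot. Purely illustrative scheduling math.
FRAME_MS = 1000.0 / 60  # ~16.7 ms per slot

def frame_timeline(frame_index):
    cpu_start  = frame_index * FRAME_MS
    gpu_start  = cpu_start + FRAME_MS   # GPU renders frame N while CPU preps frame N+1
    dlss_start = gpu_start + FRAME_MS   # tensor cores upscale frame N the slot after that
    present    = dlss_start + FRAME_MS
    return cpu_start, present

for i in range(3):
    sampled, shown = frame_timeline(i)
    print(f"frame {i}: input sampled at {sampled:5.1f} ms, displayed at {shown:5.1f} ms")
# A new frame completes every ~16.7 ms (60 fps throughput), but each frame is shown
# ~50 ms after its input was sampled: three stages of pipeline latency.
```

If I have this right, throughput would be 60Hz but each frame would carry roughly three slots of latency from input to display, which is what I'm wondering about.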
 
I don't care too much what third parties do, but I really hope Nintendo focuses on 60fps this gen (another reason why I want the system to get as close to Series S as possible in raw flops before DLSS), or offers performance modes.

30fps may be fine for some, but the age of LCD is dying off fast, and QD-OLED/OLED and whatever technologies come later have near-instant response times, making 30fps look like an absolute slideshow to people like me without some kind of motion blur setting.
Nintendo seems to go for 60fps or 30fps depending on the developer and what they want to do for their game. Monolith Soft and the Zelda team are fine with trading in performance, while the 3D Mario and Smash Bros teams understand why their games need 60fps. It's something I can live with, especially since it usually makes sense why a given game is 30fps.
 
Nintendo seems to go for 60fps or 30fps depending on the developer and what they want to do for their game. Monolith Soft and the Zelda team are fine with trading in performance, while the 3D Mario and Smash Bros teams understand why their games need 60fps. It's something I can live with, especially since it usually makes sense why a given game is 30fps.

Usually they don't have much of a choice either, since Nintendo hardware has often been very limited since the Wii. This time, there needn't be that kind of constraint. Zelda and XB are big games with bigger worlds vs 3D Mario and Smash.

Or they could just throw in a 60fps mode and get with the program!
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.
I'm not saying that you're wrong (everyone has their own perception), but what was shown in the DF video is significantly ahead of what the Steam Deck produces. Cyberpunk 2077 at 1080p30 with PS5-equivalent settings, or Control at 1080p30 with PS5-equivalent settings but with full ray-tracing reflections, are well beyond what the Steam Deck could dream of doing. I'm guessing what you're trying to convey is that what the DF video showed isn't a leap over what the Steam Deck can already do, which is to run last-gen and current-gen games at decent image quality and graphical fidelity?
And if both access it at the same time (if that's how it could also work), then bandwidth is still shared anyway
That's how UMA (Unified Memory Architecture) works, yes. Both the CPU and GPU can fetch data at the same time, without the need for copies between CPU and GPU memory. The disadvantage of this model is that, because the CPU and GPU share memory, they compete with each other for bandwidth, which can lead to contention issues. But that's something the OS (on PCs) and developers (on fixed hardware, i.e. consoles) need to work around.
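A toy illustration of that contention point (the total figure is a placeholder, not a claim about T239's actual memory configuration):

```python
# Toy illustration of unified-memory bandwidth contention ("one big pipe, two mouths").
# TOTAL_BANDWIDTH_GBPS is a made-up placeholder, not T239's real spec.
TOTAL_BANDWIDTH_GBPS = 100.0

def gpu_bandwidth_left(cpu_usage_gbps):
    return TOTAL_BANDWIDTH_GBPS - cpu_usage_gbps

print(gpu_bandwidth_left(10.0))  # 90.0 GB/s left for the GPU
print(gpu_bandwidth_left(25.0))  # 75.0 GB/s: heavier CPU traffic squeezes the GPU
```

In practice it's messier than simple subtraction (access patterns and contention overhead matter), but that's the basic budget developers have to share.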
 
Usually they don't have much of a choice either, since Nintendo hardware has often been very limited since the Wii. This time, there needn't be that kind of constraint. Zelda and XB are big games with bigger worlds vs 3D Mario and Smash.

Or they could just throw in a 60fps mode and get with the program!
Oh, I dream that there's a performance and a quality mode. That'd be ideal, but I understand if some games just outright don't have them.

I should probably point out that Nintendo is a... unique beast when it comes to the games they make. The reason a lot of their games probably won't get quality/performance modes is because games like Zelda are very gameplay-focused, to the point that changing the graphics might not change the performance. If we get to that point, then performance modes might not be a thing. It's a shame, but I'll flail my arms in the air, shout "it is what it is", and give up.
 
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.
Do you own a Steam Deck? (I don't)

I was wondering what made you say that, considering the other replies point out that the setup in the DF video, which we know can't replicate T239 exactly and is at the lower end of the spectrum of what T239 can do, is outperforming the Steam Deck easily. Maybe you saw something others didn't; what was it?
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.