Looks like I'm gonna nail the Switch power prediction and the Mario Wonder prediction. Wow, I'm good.
Your expectations:
- Performance ridiculously ...: 0 (0%)
- Really below current gen,...: 2 (100%)
- Slightly below current ge...: 0 (0%)
- On par with current gen,...: 0 (0%)
Total: 2
Pemalite said: 1. Considering how low-performing an RTX 2050 is, it would likely be a waste having more than 4GB of memory, it's not pushing high-end 1440P/4k visuals, not with only a paltry 112GB/s of memory bandwidth anyway... 2. That is blatantly false. 3. Look. Developers work within the confines of the hardware. |
1. We don't even need to discuss 1440p/4k, which I didn't bring up and don't expect the Switch 2 to be able to achieve natively in demanding titles. Even running modern games at 1080p is starting to push past the limits of 4GB GPUs. Here is what Cyberpunk 2077 uses at the settings in the Digital Foundry video (Medium, DLSS Quality, 1080p): roughly 4.5 GB of GPU memory and 3 GB of system RAM.
There have been many other examples: Hogwarts Legacy, The Last of Us Part 1, etc. released this year.
2. You said "the Switch 2 will not allocate RAM for any specific uses" and then "The Xbox One/Playstation 4/Xbox Series X/Playstation 5 ALL allocate at-least 2.5GB of Ram OR MORE for the OS."
Anyway, I said "be able to allocate", not that it is some system-determined reservation. After you account for the OS's minimum reservation, CPU-workload demands for the game, etc. you should expect about 6-8 GB out of about 12-16GB to be available.
Using the numbers we have here, for example: assuming at least 2.5GB for the OS (which I still think is an overestimate for Nintendo's feature-sparse systems) and ~3GB for CPU-based game-related tasks, 12GB - 2.5GB - 3GB = 6.5GB left unallocated.
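Purely as a back-of-envelope check on that arithmetic (the OS and CPU-side figures are the post's assumptions, not confirmed specs):

```python
# Rough RAM budget for a hypothetical 12GB Switch 2, using the
# assumed figures from the post above (2.5GB OS, ~3GB CPU-side game data).
def free_for_gpu(total_gb: float, os_reserved_gb: float, cpu_game_gb: float) -> float:
    """Return the RAM left over for GPU/graphics workloads."""
    return total_gb - os_reserved_gb - cpu_game_gb

print(free_for_gpu(12.0, 2.5, 3.0))  # 6.5
```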
3. I mean that is even more reason why alleviating a VRAM bottleneck could lead to performance gains.
4. There has been a lot of effort to streamline pipelines to work around the memory hierarchy, I don't disagree. But at least in the current PC gaming situation, VRAM is a big issue, even at 1080p.
5. Eh, when I said the neutered 2050 is an "underestimate" I wasn't thinking something crazy like the Switch 2 being able to output 1440p native at 60fps or anything. I suppose I should've put the word "bit" before it. Basically, I suspect the Switch 2 will have more capacity to hit native 1080p at low-medium settings than the defanged 2050 showed in Digital Foundry's videos. That's not an unrealistic expectation for low-end hardware at the end of 2024.
Last edited by sc94597 - on 15 November 2023
sc94597 said: 1. We don't even need to discuss 1440p/4k, which I didn't bring up and don't expect the Switch 2 to be able to achieve natively in demanding titles. |
Seems you didn't read my post.
The point is, 1440P is relevant... Because that is a resolution the Switch 2.0 will -not- be running at, thus reducing its need for higher amounts of Ram.
That's my point.
sc94597 said: Here is what Cyberpunk 2077 uses at the settings in the Digital Foundry video (Medium, DLSS Quality, 1080p.) Roughly 4.5 GB of GPU memory and 3 GB of system ram. There have been many other examples: Hogwarts Legacy, The Last of Us Part 1, etc. released this year. |
There have been many instances in the past where games have exceeded a GPU's VRAM limit and ran fine. - They just buffer data into System Ram and fetch the data they need on an as-needed basis. It's not ideal, but it's hardly the end of the world.
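For what it's worth, the cost of that spillover can be sketched with a toy model (my own illustration, not from either poster; the PCIe figure is an assumption):

```python
# Toy model: if a fraction of a frame's working set spills from VRAM into
# system RAM, the average fetch rate is the harmonic mean of the two links,
# weighted by how much data travels over each.
def effective_bandwidth(vram_gbps: float, pcie_gbps: float, spill_fraction: float) -> float:
    """Per-byte time = (1 - spill)/vram + spill/pcie; return the inverse."""
    return 1.0 / ((1.0 - spill_fraction) / vram_gbps + spill_fraction / pcie_gbps)

# Assumed numbers: the 2050's quoted 112 GB/s of VRAM bandwidth and a
# ~16 GB/s PCIe x8 link. Spilling just 10% of accesses drops effective
# throughput from 112 to about 70 GB/s.
print(round(effective_bandwidth(112.0, 16.0, 0.10), 1))  # 70.0
```

So both points can be true at once: spillover works, but it is far from free.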
An RTX 2050 being a slow heap of trash is not going to have the horsepower to efficiently leverage 8GB/16GB of VRAM. - That's the reality of it... If you are working with a dataset that large, the 2050's performance is going to be terrible anyway.
As for the Switch 2.0, it's ultimately irrelevant as consoles tend to have a unified memory architecture, making all this redundant... But I think you would be deceiving yourself if you believe it will somehow be significantly better than a 2050's capability.
sc94597 said: 2. You said "the Switch 2 will not allocate RAM for any specific uses" and then "The Xbox One/Playstation 4/Xbox Series X/Playstation 5 ALL allocate at-least 2.5GB of Ram OR MORE for the OS." Anyway, I said "be able to allocate", not that it is some system-determined reservation. After you account for the OS's minimum reservation, CPU-workload demands for the game, etc. you should expect about 6-8 GB out of about 12-16GB to be available. Using the numbers we have here for example, assuming at least 2.5GB for the OS (which I still think is an overestimate for Nintendo's feature sparse systems), ~3GB for CPU-based game-related tasks; 12GB - 2.5GB - 3GB = 6.5GB left unallocated. |
Irrelevant. As I have already established, Console operating systems are not more memory or CPU efficient than a PC, not since the 7th gen.
sc94597 said: 3. I mean that is even more reason why alleviating a VRAM bottleneck could lead to performance gains. |
They would be better off investing in more functional units or faster VRAM than wasting money on higher-capacity VRAM for the RTX 2050; that nets a larger return on investment if performance and/or visuals are your goal.
There is such a thing as "diminishing returns" when it comes to Ram. Unless you are filling the entire pool, 100% of the time, extra capacity has diminishing returns when the hardware's processing capabilities can't keep up with a dataset that large anyway.
I also have an RTX 3060 notebook with 6GB of VRAM, which is significantly faster than the RTX 2050, but doesn't come with a corresponding increase in VRAM.
And it's fine in everything within expectation. - Some games will use 7GB/8GB of VRAM (1GB/2GB Buffered into the 64GB of System memory) and it's fine.
System behaves as it should.
sc94597 said: 4. There has been a lot of efforts to streamline the pipelines to work around the memory-hierarchy, I don't disagree. But at least in the current PC gaming situation VRAM is a big issue, even for 1080p. |
It's really not.
The situation is overblown unless you are doing 1440P/4k gaming with Ray Tracing and Ultra settings.
sc94597 said: 5. Eh, when I said the neutered 2050 is an "underestimate" I wasn't thinking something crazy like the Switch 2 will be able to output at 1440p native 60fps or anything. I suppose I should've put the word "bit" before it. Basically I suspect we'll get more capacity to hit native 1080p low-medium settings with the Switch 2 than the defanged 2050 was able to in Digital Foundry's videos. That's not an unrealistic expectation for low-end hardware at the end of the year 2024. |
I wouldn't be surprised if the Switch 2.0 is another 720P device, just like the Steamdeck. (Technically 800P.)
More power savings can be had at that resolution... And more chances to actually have games perform 30/60fps consistently... And it places even less emphasis on the Ram capacity.
--::{PC Gaming Master Race}::--
Duplicate
Last edited by sc94597 - on 15 November 2023
I hope this doesn't lead to another 30-page debate on what the Switch will perform like. Personally, the graphics chase has been long over since we attained a quasi-realistic 3D representation of our world. Artistic direction, world design and the like have proven how needless the chase for power has been in light of this generation's diminishing returns.
Anywoo, it'll perform similarly to a buffed-up PS4 to PS4 Pro in handheld mode, and I'll be happy with my 720p/60fps games.
Goodnight!
Though as far as we know, the most-discussed rumor about the Switch 2's innards, the Tegra239, is quite old now, no? Like more than a year. With the rapid and exponential advance in technology, what's to guarantee this kind of info isn't outdated by now and they haven't opted for another chipset? We're about a year or more from release, IMO, so these kinds of tweaks can definitely happen.
Mannnn... I really can't wait for the actual reveal so we can finally be finished with this ceaseless debate.
Switch Friend Code : 3905-6122-2909
Pemalite said: Seems you didn't read my post. The point is, 1440P is relevant... Because that is a resolution the Switch 2.0 will -not- be running at, thus reducing its need for higher amounts of Ram. That's my point. |
And my point was that you don't need to be targeting 1440p for VRAM to be an issue or bottleneck. Which is why you said "reducing its need" and not "eliminating its need." Posted it in another comment as you posted your response, but the RTX 3050 6GB (Refresh) outperforms the RTX 3050 Ti 4GB at 1080p in the majority of modern titles, despite having less memory bandwidth (144 GB/s for the 3050 6GB vs. 192 GB/s for the 3050 Ti) and an average of 50 MHz lower max clock speeds at a given TDP. That is purely because of the extra VRAM capacity.
There have been many instances in the past where games have exceeded a GPU's VRAM limit and ran fine. - They just buffer data into System Ram and fetch the data they need on an as-needed basis. It's not ideal, but it's hardly the end of the world. An RTX 2050 being a slow heap of trash is not going to have the horsepower to efficiently leverage 8GB/16GB of VRAM. - That's the reality of it... If you are working with a dataset that large, the 2050's performance is going to be terrible anyway. As for the Switch 2.0, it's ultimately irrelevant as consoles tend to have a unified memory architecture, making all this redundant... But I think you would be deceiving yourself if you believe it will somehow be significantly better than a 2050's capability. |
Sure, and there are many instances of games (especially those released this year) where it becomes the primary bottleneck, even at 1080p. It won't be able to utilize 8/16GB, but 6GB is becoming the minimum for 1080p gaming these days. A 2050 likely will be able to utilize at least a portion of that.
Define "significantly". Being able to achieve 1080p 30fps natively in many modern titles at low-medium settings? I don't think that is far-fetched.
Irrelevant. As I have already established, Console operating systems are not more memory or CPU efficient than a PC, not since the 7th gen. |
How much RAM did the original Switch's OS use? What sort of features do you think Nintendo will add that will make the Switch 2's OS consume more memory? Memory usage is directly proportional to the feature set of the OS (assuming efficiency isn't very different). Microsoft (with Windows and Xbox) and Sony have a plethora of features in their base OS; Nintendo, far fewer.
They would be better off investing in more functional units or faster VRAM than wasting money on higher capacity VRAM with the rtx2050, it nets you a larger return on investment if performance and/or visuals are your goal. |
Again, the 3050 Ti vs. 3050 6GB example shows that VRAM capacity (6GB vs. 4GB) can lead to higher performance than VRAM bandwidth (192 GB/s vs. 144 GB/s).
I wouldn't be surprised if the Switch 2.0 is another 720P device, just like the Steamdeck. (Technically 800P.) More power savings can be had at that resolution... And more chances to actually have games perform 30/60fps consistently... And it places even less emphasis on the Ram capacity. |
I think there will be quite a few games that target 720p and upscale to 1080p, so we're in agreement there.
Last edited by sc94597 - on 15 November 2023
Soundwave said: The poll categories are meaningless because it's not how hardware functions any longer. Hardware architecture is a massive factor these days in terms of what games a system can play. Xbox Series S also dramatically lowers the floor for all next-gen games as basically every game has to have a version for that. Of course it will not run as well as the PS5/XBS but no one is expecting literally the same settings, we all know some compromises will be made, but for the most part it's still the same freaking game that you can now take with you on the go. But at the same time PS5/XS don't run at the same settings as high end PCs, yet people still play and enjoy those games just fine. |
Because it runs PS5 games at lower settings and with visual downgrades... like the PS4.
Let's examine your logic. My 3050 can run the same games as the 4090, albeit at lower settings. Using your logic, my 3050 is effectively a 4090 because it runs the same games.
The Switch 2 will be a PS4, and that is a huge jump for Nintendo. Day 1 here. The PS5 stuff is nonsense.
Last edited by Chrkeller - on 16 November 2023
i7-13700k |
Vengeance 32 gb |
RTX 4090 Ventus 3x E OC |
Switch OLED
zeldaring said: Looks like I'm gonna nail the Switch power prediction and the Mario Wonder prediction. Wow, I'm good. |
Don't be so modest. We also nailed the Apple iPhone Pro prediction. You know, that magical chipset the big 3 should be afraid of, which runs a 2021 game that isn't even visually demanding, like "drunken frog."
The 3050 can run any PS5/XSX game that is on PC. A 3050 will run any game a 4090 runs. So that's kind of a dumb analogy.
Switch 2 will have many, many PS5 games that don't run on a PS4, period. So it's a PS4 that plays a lot of PS5 games... which makes it not a PS4 anymore. No one cares what label you want to put on it. Some people sure have a hard-on for needing a label on everything to comprehend it; if something doesn't quite fit that label, it's like their brain stops working.
And lol at crowing over a pocket-sized phone, smaller than a Switch, running a modern game at pretty decent performance with no active cooling. That's a great achievement, and iPhone gaming will only get better every year, because every iPhone from here on out will have even better performance than that; that is the ground floor going forward. Most people talking about that port are impressed by it; lol at clinging to one sentence.
Laughing at that is like laughing at a 14-year-old who is a bit clumsy but is also 6-foot-9, already dunking on grown men in pickup basketball, and has only been playing for a year. That kid could easily be a monster in 3 years. The M3 chip that's going to be in iPads is going to run next-gen games, full stop, on the go.
Last edited by Soundwave - on 16 November 2023
Chrkeller said:
Don't be so modest. We also nailed the Apple iPhone Pro prediction. You know, that magical chipset the big 3 should be afraid of, which runs a 2021 game that isn't even visually demanding, like "drunken frog." |
Yes sir. Still have to wait for official confirmation on the Switch and Wonder, but yeah, that Apple thread with the OP getting mad was funny.