Forums - Gaming Discussion - 2080 Max Q stronger than XSX/PS5 (Allegedly)

Trumpstyle said:
CGI-Quality said:

Actually, the 3000 series will debut next year, and you have no idea if either console will be above a 2080 (I'm betting they might, at best, match it, but they'll most likely contain console variants of the 2070, which would still be a considerable upgrade over the 1060-equivalent GPU in the Xbox One X). 

But, we've discussed this type of thing already. Not sure why you keep doing this to yourself. :P

You should read Twitter: no 3000 series next year, only a professional card on TSMC 7nm EUV and a GeForce 2080 Ti SUPER to counter Navi 21. Where the consoles will land next year we shall see; I think they will both beat the GeForce 2080.

Why do that when a good few more sources say otherwise? It's something you should learn — always do more research. Your posts are simply you telling yourself what you want to happen.

Edit: But let's humor the idea and say Ampere skips 2020; that still doesn't instantly mean the consoles will have GPUs that match a 2080, let alone 'beat' it. 




CGI-Quality said:
Trumpstyle said:

You should read Twitter: no 3000 series next year, only a professional card on TSMC 7nm EUV and a GeForce 2080 Ti SUPER to counter Navi 21. Where the consoles will land next year we shall see; I think they will both beat the GeForce 2080.

Why do that when a good few more sources say otherwise? It's something you should learn — always do more research. Your posts are simply you telling yourself what you want to happen.

Edit: But let's humor the idea and say Ampere skips 2020; that still doesn't instantly mean the consoles will have GPUs that match a 2080, let alone 'beat' it. 

I didn't know you had sources saying Nvidia will release the 3000 series next year. How can you not think the next-gen consoles will beat the GeForce 2080? You have Eurogamer, Tom Warren, a verified insider, and Windows Central saying Xbox Series X is 12 TF, Jason Schreier saying both consoles match the GeForce 2080, and three people saying PS5 is above Xbox Series X. But, as I said before, I expect Microsoft is just waiting for Sony to reveal their number, then Microsoft will beat it.

About Nvidia's GPUs: they were planning to launch their professional GPU on TSMC 7nm EUV and the gaming series (3000) on Samsung 7nm EUV, but the yields at Samsung are so bad that they were forced to switch back to TSMC and are now delayed, not "skipped". You can read more about it here:

https://wccftech.com/nvidia-next-generation-ampere-7nm-graphics-cards-landing-1h-2020/

https://wccftech.com/nvidia-tsmc-7nm-next-generation-gpu-ampere/

I'm just making a fun prediction, based on Twitter, that we will see no gaming GPUs from Nvidia next year except a GeForce 2080 Ti SUPER to counter Navi 21.



"Donald Trump is the greatest president that god has ever created" - Trumpstyle

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:
CGI-Quality said:

Why do that when a good few more sources say otherwise? It's something you should learn — always do more research. Your posts are simply you telling yourself what you want to happen.

Edit: But let's humor the idea and say Ampere skips 2020; that still doesn't instantly mean the consoles will have GPUs that match a 2080, let alone 'beat' it. 

I didn't know you had sources saying Nvidia will release the 3000 series next year. How can you not think the next-gen consoles will beat the GeForce 2080? You have Eurogamer, Tom Warren, a verified insider, and Windows Central saying Xbox Series X is 12 TF, Jason Schreier saying both consoles match the GeForce 2080, and three people saying PS5 is above Xbox Series X. But, as I said before, I expect Microsoft is just waiting for Sony to reveal their number, then Microsoft will beat it.

About Nvidia's GPUs: they were planning to launch their professional GPU on TSMC 7nm EUV and the gaming series (3000) on Samsung 7nm EUV, but the yields at Samsung are so bad that they were forced to switch back to TSMC and are now delayed, not "skipped". You can read more about it here:

https://wccftech.com/nvidia-next-generation-ampere-7nm-graphics-cards-landing-1h-2020/

https://wccftech.com/nvidia-tsmc-7nm-next-generation-gpu-ampere/

I'm just making a fun prediction, based on Twitter, that we will see no gaming GPUs from Nvidia next year except a GeForce 2080 Ti SUPER to counter Navi 21.

How can I not? Because I wait for the official word. That's something I feel you unwisely don't do. We see this every generation — people hype themselves up for something that falls short of those expectations, but it's particularly bad around the start of a new gen. 

Also, Eurogamer (Digital Foundry, more accurately) said they couldn't confirm or deny the rumored specs of the Series X, so scratch them from that list. Then we have Jason, who actually said: "The only thing to know for sure is that both Sony and Microsoft are aiming higher than that '10.7 teraflops'." As I've made clear, I'm not one of those on the teraflop bandwagon. Still, while it could be true, nothing is solid as of right now.

To the talk of Microsoft 'waiting for Sony to unveil so they can pounce' — good luck with that idea. Things aren't going to magically change when those things are unveiled, at least, not on the hardware side. They could update things through software, but even then, it's a stretch.

Now, to Ampere: your quotation marks around the word skipped tell me you aren't following. No matter the reason for a potential launch in 2021, doing so means it would 'skip' next year (whether it's for yields, polish, etc. is irrelevant). The point was that the chance of that is slim and, like I said, it sounds more like hope, because you know those cards would easily wipe away any chance of a console having an advantage (which they won't have anyway, given there are cards out now that mop the floor with a 2080, and neither console will see anything like those in them).

As for my sources?

https://www.tweaktown.com/news/68689/nvidia-geforce-rtx-3080-ti-june-2020-earlier/index.html

https://www.techradar.com/news/nvidia-rtx-3080-graphics-card-could-be-powering-gaming-pcs-in-june-2020

https://www.guru3d.com/news-story/geforce-rtx-3000-ampere-data-center-marchconsumers-june-2020.html

All reputable.

Btw, one should learn not to use WCCF for rumors. They tend to jump the gun and often have things wrong until the official word is dropped. Of course, all of it is rumor, but it's better to treat them as such. On top of that, this launch would follow a consistent path for NVIDIA, so it isn't a stretch to expect the new cards next year.

Last edited by CGI-Quality - on 26 December 2019


The 2080 Max-Q is about 10% inferior to a regular GTX 1080, so I highly doubt it would be the benchmark for the next Navi line.
Besides, J. Huang isn't really an impartial party here.



Doesn't the 5700 XT get pretty close to the RTX 2080? I think I have seen some tests in which it got within 10-20% of it. If the 12 TF figure for the Xbox turns out to be true, then it will leave the 2080 behind.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.


Don't see how this could be true. The 2080 Max-Q is weaker than a mobile 2080, which in turn is weaker than a desktop 2080: the 2080 Max-Q is 6.4 TFLOPS, compared to 9.3 TFLOPS for a mobile 2080 and 10 TFLOPS at boost clocks for the desktop 2080. If the 12 TFLOPS Xbox Series X rumor is true (and that rumor has been backed up by multiple sources), then XSX's GPU will be a good bit stronger than the 9.7 TFLOPS Radeon 5700 XT, and the Radeon 5700 XT comes within 20% of a desktop 2080 in most games. The 12 TFLOPS AMD GPU in Series X should then fall between a desktop 2080 and a 2080 Super in terms of raw specs, and possibly ahead of a 2080 Super once you account for the increased optimization that most console games receive compared to PC games, which often lack focused optimization due to the large number of potential hardware configurations on PC.
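For anyone wondering where those teraflop figures come from, here is a quick sanity check using the standard peak-FP32 formula (2 ops per shader unit per clock, counting an FMA as two operations). The shader counts and boost clocks below are the commonly cited reference figures for each card, not numbers taken from this thread, so treat the output as a ballpark sketch rather than a benchmark.

```python
# Ballpark check of the teraflop figures discussed above, assuming the
# usual peak-FP32 formula: TFLOPS = 2 ops/clock * shader units * clock (GHz).
# Shader counts and boost clocks are commonly cited reference figures,
# not numbers taken from this thread.

def peak_fp32_tflops(shader_units: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 throughput, counting an FMA as 2 ops."""
    return 2 * shader_units * boost_ghz / 1000.0

gpus = {
    "RTX 2080 Max-Q   (2944 units @ ~1.095 GHz)": (2944, 1.095),
    "RTX 2080 Mobile  (2944 units @ ~1.590 GHz)": (2944, 1.590),
    "RTX 2080 Desktop (2944 units @ ~1.710 GHz)": (2944, 1.710),
    "RX 5700 XT       (2560 units @ ~1.905 GHz)": (2560, 1.905),
}

for name, (units, clock) in gpus.items():
    print(f"{name}: ~{peak_fp32_tflops(units, clock):.1f} TFLOPS")
# Roughly 6.4, 9.4, 10.1 and 9.8 TFLOPS, in line with the 6.4 / 9.3 / 10 / 9.7
# figures quoted above; real-game performance does not scale 1:1 with these.
```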

Even if this were true, does it matter if a $2000+ laptop, or even a $1500+ laptop, has better specs than a $500 console? Apples, meet oranges. 

Last edited by shikamaru317 - on 26 December 2019

What it all comes down to is: if you shell out enough money, you will always get the most power and the best graphics on a PC. Consoles will have the better value, sure. But if you don't care about money and just want the best of the best, PC was, is, and always will be your one and only option.

Can we all agree with that?

(Oh, and it's also cool that you can get a PC with crazy power on the go in the form of a notebook. I mean, come on, that's pretty cool, isn't it?)



Official member of VGC's Nintendo family, approved by the one and only RolStoppable. I feel honored.

Again, here is the problem with the 'multiple sources confirm' talk.....

One set of the PS4's rumored specs looked like this (back when it was still under a codename)...

The supposed tech specs of the Omni:

  • CPU is an x86 system with a 256-bit bus
  • An APU with a fast GPU
  • 4GB of DDR3 RAM
  • 4GB of GDDR3 RAM
  • Capable of 3.2TFLOPS of data
  • An “off-the-shelf” design, more reminiscent of a PC than a next-gen gaming console

GameSpot's PS3 rumor (2006):

"Sony also laid out the technical specs of the device. The PlayStation 3 will feature the much-vaunted Cell processor, which will run at 3.2GHz, giving the whole system 2 teraflops of overall performance. It will sport 256MB XDR main RAM at 3.2GHz, and it will have 256MB of GDDR VRAM at 700MHz."

If that wasn't enough, full (rumored) PS3 specs:

PLAYSTATION 3 SPECIFICATIONS

CPU: Cell Processor PowerPC-base Core @3.2GHz
--1 VMX vector unit per core
--512KB L2 cache
--7 x SPE @3.2GHz
--7 x 128b 128 SIMD GPRs
--7 x 256KB SRAM for SPE
--*1 of 8 SPEs reserved for redundancy
--Total floating point performance: 218 gigaflops

GPU RSX @ 550MHz
--1.8 TFLOPS floating point Performance
--Full HD (up to 1080p) x 2 channels
--Multi-way programmable parallel Floating point shader pipelines
--Sound Dolby 5.1ch, DTS, LPCM, etc. (Cell-based processing)

MEMORY
256MB XDR Main RAM @3.2GHz
256MB GDDR3 VRAM @700MHz
System Bandwidth:
Main RAM -- 25.6GB/s
VRAM -- 22.4GB/s
RSX -- 20GB/s (write) + 15GB/s (read)
SB -- 2.5GB/s (write) + 2.5GB/s (read)

SYSTEM FLOATING POINT PERFORMANCE:
2 teraflops

STORAGE
--HDD: Detachable 2.5" HDD slot x 1
--I/O: USB Front x 4, Rear x 2 (USB 2.0)
--Memory Stick: standard/Duo, PRO x 1
--SD: standard/mini x 1
--CompactFlash (Type I, II) x 1

COMMUNICATION
--Ethernet (10BASE-T, 100BASE-TX, 1000BASE-T) x 3 (input x 1 + output x 2)
--Wi-Fi IEEE 802.11 b/g
--Bluetooth: Bluetooth 2.0 (EDR)
--Controller: Bluetooth (up to 7)
--USB 2.0 (wired)
--Wi-Fi (PSP)
--Network (over IP)

AV OUTPUT
Screen size 480i, 480p, 720p, 1080i, 1080p
HDMI out x 2
AV multi out x 1
Digital out (optical) x 1

DISC MEDIA
CD
PlayStation CD-ROM
PlayStation2 CD-ROM
CD-DA
CD-DA (ROM),
CD-R,
CD-RW
SACD Hybrid (CD layer),
SACD HD
DualDisc (audio side)
DualDisc (DVD side)
PlayStation 2 DVD-ROM
PlayStation 3 DVD-ROM
DVD-ROM
DVD-R
DVD-RW
DVD+R,
DVD+RW
Blu-ray Disc
PlayStation 3 BD-ROM
BD-ROM
BD

Since 'FLOPS' are such a big deal for a lot of you, did the PS4 launch with that much theoretical performance (though that rumor did predict the right amount of RAM, just not the type)? Was the PS3 really capable of more theoretical performance than its successor? Was the RSX Reality Synthesizer (one of the components that would prove to be a thorn in the PS3's side) really a 1.8 teraflop monster? Think those things through. No matter who backs a rumor (those came straight from ExtremeTech and GameSpot, reputable sources at the time), as Pem and I have said, temper expectations.

I'm not saying remove the excitement (yes, the next-gen consoles are going to be considerably more powerful than the Pro/X), just don't push rumors as facts (or gripe to Hell and back should they prove false), no matter how strong they may seem. I remember reading both of those articles when they were published and fell for them too (especially those PS3 specs).

Let's have a look at another report on the PS4 and Xbox One, shall we?

DigitalSpy in January 2013:

"One particularly juicy rumor surrounding the PS4 relates to the system's graphical capabilities. If the reports are to be believed, the PS4 will feature 'Ultra HD' visuals in 4K resolution, a feature which is being billed as a means to convince customers to upgrade to one of Sony's latest hi-tech television sets."

"Developers who claim to have gotten their hands on both machines recently told VG247 that the new Xbox will have a run capacity of 1.23 teraflops backed up by 8GB of RAM, 3GB of which is dedicated to the operating system, apps and security, with the rest for games. A Radeon HD 8770 GPU will serve as the system's graphics chip, which should be good enough to run the upcoming Unreal Engine 4."

"By comparison, the PS4 is said to feature a quad-core AMD APU Solution processor, an AMD 8000 series GPU and between 2 and 4GB of GDDR5 under the hood, but the big news concerns its rumored run-capability of 1.84 teraflops, almost 50% greater than what has been listed for the new Xbox."

In fairness to some of you, a few of those things happened, but crucial parts (like the GPUs) were wrong, and that's a big deal, especially when it was claimed that developers had gotten their hands on these things. And, yes, we also had rumors that said these consoles would have less RAM than they ended up having, but it never left the rumor mill and remained 'TBD'. Much of the time, parts of the rumors are true while others prove to be way off, and that's where the "I expected better" posts come from. 

On the flipside, I hope to be proven wrong, because I'd prefer machines running with a 2080 SUPER (at the lowest) equivalent, 24GB of RAM (12-16 dedicated to gaming), and procs with at least a base clock of 3.0GHz. But dreams and reality are often at odds with each other. :P

Last edited by CGI-Quality - on 26 December 2019


CGI-Quality said:

~snip

While I agree with you that tempering expectations and erring on the side of caution is probably a good thing, I do think that the situation with the rumored Xbox Series X specs is quite a bit different from the previous-gen situations you highlighted. That PS3 rumor was obviously bogus, as the top single-chip PC GPU in 2006, Nvidia's 8800 GTX, was only around 350 GFLOPS; there was no way the PS3 was getting 5x the graphical power of the top PC GPU. Meanwhile, I'm pretty sure that PS4 spec rumor came from the Orbis development kits that Sony sent out in 2012, which were PC-based dev kits designed to emulate next-gen performance, because the actual PS4 chipset hadn't been completed by AMD yet. So the specs of that leak were actually accurate, from what I've heard anyway; they just weren't the specs of the finalized PS4 design. I'm pretty sure those overspecced Orbis dev kits were what led to The Witcher 3 and Watch Dogs being downgraded and AC Unity having severe performance issues, as the games were originally built with the Orbis specs in mind rather than the finalized PS4 (Liverpool) specs. That is what some CD Projekt and Ubisoft developers hinted at, anyway. 
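To put a rough number on that "obviously bogus" call: the rumored RSX figure was 1.8 TFLOPS, while the 8800 GTX's peak shader throughput works out to roughly 0.35 TFLOPS if you count 128 stream processors at a 1.35 GHz shader clock with a MADD as two ops (the widely cited figure the post refers to). A quick check of the ratio, under those assumptions:

```python
# Ratio check for the PS3-rumor comparison above, using widely cited figures:
# 8800 GTX = 128 stream processors at a 1.35 GHz shader clock, MADD = 2 ops.
gtx_8800_tflops = 2 * 128 * 1.35 / 1000    # ~0.346 TFLOPS
rumored_rsx_tflops = 1.8                   # the "1.8 TFLOPS" RSX rumor

print(f"8800 GTX peak: ~{gtx_8800_tflops:.2f} TFLOPS")
print(f"Rumored RSX vs 8800 GTX: ~{rumored_rsx_tflops / gtx_8800_tflops:.1f}x")
# ~5.2x the top PC GPU of 2006, which is why the rumor never added up.
```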

The outlets that the Series X spec rumor came from are known to have some of the most reliable Microsoft sources in the industry. It's also worth noting that Phil Spencer said to expect 8 times the graphical power of the Xbox One and 2x the graphical power of the Xbox One X, which works out to 10.4-12 TFLOPS if he was referring to flops, though he may have been referring to actual game performance rather than theoretical performance (flops), and I'm pretty sure a 10 TFLOPS RDNA chip can double the XB1 X's older 6 TFLOPS GCN GPU in terms of real-world performance. So his statement is no guarantee of 12 TFLOPS, though it does mean that 12 TFLOPS is a possibility.
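For reference, the arithmetic behind that 10.4-12 TFLOPS range is just Spencer's two multipliers applied to the existing consoles' GPU figures: the launch Xbox One's roughly 1.31 TFLOPS (often rounded to 1.3, which gives the 10.4 figure) and the Xbox One X's 6.0 TFLOPS. A minimal sketch:

```python
# The 10.4-12 TFLOPS range quoted above, derived from Phil Spencer's
# "8x Xbox One / 2x Xbox One X" comment. GPU figures are the well-known
# launch numbers: ~1.31 TFLOPS (Xbox One) and 6.0 TFLOPS (Xbox One X).
xbox_one_tflops   = 1.31
xbox_one_x_tflops = 6.0

low_end  = 8 * xbox_one_tflops     # ~10.5 TFLOPS (~10.4 with a rounded 1.3)
high_end = 2 * xbox_one_x_tflops   # 12.0 TFLOPS

print(f"8x Xbox One:   ~{low_end:.1f} TFLOPS")
print(f"2x Xbox One X:  {high_end:.1f} TFLOPS")
# This only pins down a range if Spencer meant theoretical FLOPS rather than
# delivered game performance, which the post above rightly flags as uncertain.
```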

I'm not saying that a 12 TFLOPS Xbox Series X is a sure thing, just that it is a distinct possibility with multiple reliable sources saying it. Even if Series X is 11, 10, or even 9 TFLOPS, I will be happy, as long as the actual game graphics come pretty close to that Hellblade 2 in-engine trailer. Next-gen has already exceeded my expectations if that in-engine trailer was anywhere close to what we will actually get. 

Last edited by shikamaru317 - on 26 December 2019

shikamaru317 said:

~snip

While I agree with you that tempering expectations and erring on the side of caution is probably a good thing, I do think that the situation with the rumored Xbox Series X specs is quite a bit different from the previous-gen situations you highlighted. That PS3 rumor was obviously bogus, as the top single-chip PC GPU in 2006, Nvidia's 8800 GTX, was only around 350 GFLOPS; there was no way the PS3 was getting 5x the graphical power of the top PC GPU. Meanwhile, I'm pretty sure that PS4 spec rumor came from the Orbis development kits that Sony sent out in 2012, which were PC-based dev kits designed to emulate next-gen performance, because the actual PS4 chipset hadn't been completed by AMD yet. So the specs of that leak were actually accurate, from what I've heard anyway; they just weren't the specs of the finalized PS4 design. I'm pretty sure those overspecced Orbis dev kits were what led to The Witcher 3 being downgraded and AC Unity having severe performance issues, as the games were originally built with the Orbis specs in mind rather than the finalized PS4 (Liverpool) specs. That is what some developers hinted at, anyway. 

The outlets that the Series X spec rumor came from are known to have some of the most reliable Microsoft sources in the industry. It's also worth noting that Phil Spencer said to expect 8 times the graphical power of the Xbox One and 2x the graphical power of the Xbox One X, which works out to 10.4-12 TFLOPS if he was referring to flops, though he may have been referring to actual game performance rather than theoretical performance (flops), and I'm pretty sure a 10 TFLOPS RDNA chip can double the XB1 X's older 6 TFLOPS GCN GPU in terms of real-world performance. So his statement is no guarantee of 12 TFLOPS, though it does mean that 12 TFLOPS is a possibility.

I'm not saying that a 12 TFLOPS Xbox Series X is a sure thing, just that it is a distinct possibility with multiple reliable sources saying it. Even if Series X is 11, 10, or even 9 TFLOPS, I will be happy, as long as the actual game graphics come pretty close to that Hellblade 2 in-engine trailer. Next-gen has already exceeded my expectations if that in-engine trailer was anywhere close to what we will actually get. 

Point is, it could all be bogus, just like some of it could be true (which will likely be the case). Just like you say "if he was referring to flops", we don't know specifically what he meant. That's why only (or mostly) basing things on a potential flop count is unrealistic in my eyes.

Make no mistake, there's nothing wrong with speculating/predicting, but when those things turn into matter-of-fact discussions, that's when I challenge it and use history to remind people that, no matter how 'sure' something may seem, it is only when the horse's mouth speaks the final word that you will have that security. All I can tell you for sure, based on what I do (and have seen), is that the visual difference between gens is going to be substantial, and you would be surprised to learn that it doesn't take as much power as people think it does.

Last edited by CGI-Quality - on 26 December 2019