
Rumor: Xbox "Lockhart" specs leaked, is $300

goopy20 said:
Conina said:

300% less? So Lockhart will have minus 200 percent performance compared to the PS5?

Err... well, let me see. 9.2 Tflops divided by 4 Tflops = 2.3 times more powerful. That's quite a difference from the 0.3 Tflops edge the PS4 has over the Xbox One, which already resulted in games running at 1080p vs 720p.

Like I said, a game targeting 1080p on PS5 would run at 540p on Series S, effectively making it look like Minecraft on a 55-inch TV.

Oh geez, where should I begin?

The PS4 had a 1.84 teraflop GPU and the Xbox One had a 1.31 teraflops GPU:

https://www.techradar.com/news/gaming/consoles/ps4-vs-xbox-720-which-is-better-1127315/2

1.84 - 1.31 = 0.53, AND NOT 0.3. You almost halved the difference with your miscalculation.

1.31 / 1.84 = 0.712... so the Xbox One had 28.8% fewer Tflops than the PS4.

4 / 9.2 = 0.435... so the rumoured Lockhart would have 56.5% fewer Tflops than the rumoured PS5... AND NOT 300% less!

56.5 is a bit less than 300, don't you think?

Please learn some math!
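If anyone wants to double-check the arithmetic, here's a quick sanity check in Python (just the figures quoted above, nothing official):

```python
def percent_fewer(small_tflops: float, big_tflops: float) -> float:
    """How many percent fewer Tflops the weaker machine has."""
    return (1 - small_tflops / big_tflops) * 100

print(percent_fewer(1.31, 1.84))  # ~28.8 -> Xbox One vs PS4
print(percent_fewer(4.0, 9.2))    # ~56.5 -> rumoured Lockhart vs rumoured PS5
# "300% less" isn't even possible: 100% fewer would already mean zero Tflops.
```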



DonFerrari said:
Bofferbrauer2 said:

When an XBO game ran at 720p, the PS4 title generally ran at 900p, not 1080p. After all, 1080p has 2.25 times the pixels of 720p, but the PS4 ain't 2.25 times as strong as the XBO.

Those 30% more TFlops would get pretty expensive. Just look at the current AMD GPUs: the RX 5500 and RX 5600 are at 4.8 TFlops and 6 TFlops respectively, yet the price difference is over $100. And it's not just the price of the chip: the cooling can be much weaker, smaller and cheaper, and thus also the casing, making the console overall cheaper to produce.

You don't seem to understand what compromises need to be made for a modern console to reach a $300 price point. The big consoles will be sold at a massive loss, since they are coming with over 12 TFlops and an 8-core CPU. An RX 5700 XT, currently the strongest AMD GPU, clocks in at just 9 TFlops and yet costs about $400, and the Ryzen 7 3700X costs about $300. AMD doesn't need the console money nearly as badly anymore as they did for the current gen and thus will also ask higher prices for their hardware.

Granted, the prices will be much lower than retail prices, but for CPU and GPU alone I expect at least $500 from AMD, if not even $600, considering the expected size of that GPU chip. The raytracing part of Nvidia's RTX chips takes about 90 mm². Add that to the size of the chip needed to reach 12+ TFlops on RDNA at reduced clock speeds (done to reduce wear and extend the life of the chip and its cooler), and you'll reach around 500 mm², which gets pretty expensive to produce. And at 7nm they can't go very much lower either without risking not covering production costs.

We already got Daniel Ahmad's estimate, which seems to have been validated, that the PS5 costs about $450 to build and the XSX $460-510. That would leave marketing, logistics and a small margin for retailers. So PS5 and XSX could both be sold below $500 with some loss for the platform holder.

I know those estimates, but I don't really believe them unless he shows exactly what he calculated there. Unless he just added up the pure production costs of each part, without taking any margins into account, I find those numbers very hard to believe.



Conina said:
goopy20 said:

Err... well, let me see. 9.2 Tflops divided by 4 Tflops = 2.3 times more powerful. That's quite a difference from the 0.3 Tflops edge the PS4 has over the Xbox One, which already resulted in games running at 1080p vs 720p.

Like I said, a game targeting 1080p on PS5 would run at 540p on Series S, effectively making it look like Minecraft on a 55-inch TV.

Oh geez, where should I begin?

The PS4 had a 1.84 teraflop GPU and the Xbox One had a 1.31 teraflops GPU:

https://www.techradar.com/news/gaming/consoles/ps4-vs-xbox-720-which-is-better-1127315/2

1.84 - 1.31 = 0.53, AND NOT 0.3. You almost halved the difference with your miscalculation.

1.31 / 1.84 = 0.712... so the Xbox One had 28.8% fewer Tflops than the PS4.

4 / 9.2 = 0.435... so the rumoured Lockhart would have 56.5% fewer Tflops than the rumoured PS5... AND NOT 300% less!

56.5 is a bit less than 300, don't you think?

Please learn some math!

Okay, you got me, I suck at math. Doesn't change the fact that a game running at 1080p/30fps on PS5 will have to be scaled down to 540p to reach 30fps on Series S with half the Tflops. The difference between a 12 Tflops Series X and the 9.2 Tflops PS5 sounds pretty significant and would be worth a premium to me. But when it's a difference between 4 and 9.2, it's no big deal?


goopy20 said:
Conina said:

Oh geez, where should I begin?

The PS4 had a 1.84 teraflop GPU and the Xbox One had a 1.31 teraflops GPU:

https://www.techradar.com/news/gaming/consoles/ps4-vs-xbox-720-which-is-better-1127315/2

1.84 - 1.31 = 0.53, AND NOT 0.3. You almost halved the difference with your miscalculation.

1.31 / 1.84 = 0.712... so the Xbox One had 28.8% fewer Tflops than the PS4.

4 / 9.2 = 0.435... so the rumoured Lockhart would have 56.5% fewer Tflops than the rumoured PS5... AND NOT 300% less!

56.5 is a bit less than 300, don't you think?

Please learn some math!

Okay, you got me, I suck at math. Doesn't change the fact that a game running at 1080p/30fps on PS5 will have to be scaled down to 540p to reach 30fps on Series S with half the Tflops. The difference between a 12 Tflops Series X and the 9.2 Tflops PS5 sounds pretty significant and would be worth a premium to me. But when it's a difference between 4 and 9.2, it's no big deal?

960x540 is only one fourth (1/4) of 1920x1080. With perfect scaling (so if CPU, RAM, bandwidth... aren't bottlenecks) you would only need a 2.3 Tflop GPU (1/4 of 9.2 Tflops), not 4 Tflops (same architecture and everything else the same).

Since scaling ain't perfect, you can expect factors of 3 - 5 when you reduce the resolution to 25%.

Let's have a look at your own chart:

3.6x - 3.8x the performance by reducing the resolution to 25%, as expected.

Yeah, you still suck at math.
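For anyone who wants to redo that estimate, here's the same back-of-the-envelope calculation in Python (perfect-scaling assumption as described above, plus the 3-5x real-world factor; the Tflops figures are the rumoured ones from this thread):

```python
def pixels(width: int, height: int) -> int:
    return width * height

ratio = pixels(1920, 1080) / pixels(960, 540)  # 4.0: 1080p has 4x the pixels of 540p

# Perfect scaling: a 9.2 Tflops GPU at 1080p would need only 9.2 / 4 = 2.3 Tflops at 540p.
perfect = 9.2 / ratio
# Scaling isn't perfect, so assume a gain of 3x - 5x instead of 4x:
realistic = [round(9.2 / f, 2) for f in (3, 5)]

print(perfect, realistic)  # 2.3 [3.07, 1.84]
```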



goopy20 said:

Ray tracing might just be an optional thing on PC right now for the 1% of PC gamers who own an RTX card, but it won't be next gen.

You can do Ray Tracing on non-RTX cards on PC.
It's just that the hit to performance is rather significant, which is why we have Ray Tracing cores now to minimize that impact.

goopy20 said:

On consoles they'll be able to build their games from the ground up with RT in mind, and it should be a lot more spectacular than having some reflective puddles in BF. 

That really depends on how extensive the Ray Tracing cores are on next-gen; if the Ray Tracing capabilities come up short even against the RTX 2060, don't expect anything significant.

However, it's still early days and developers are still experimenting; we aren't going to have "spectacular" ray tracing effects for at least a few more years, even with consoles supporting the technology in hardware.

shikamaru317 said:

You're forgetting that we haven't actually seen AMD's hardware raytracing in action yet. By the time AMD's RDNA 2 GPUs release later this year, about 2 years will have passed since Nvidia released their first 20-series GPU. AMD will have had a lot of time to study Nvidia's raytracing implementation and improve upon it to make their own hardware raytracing more efficient. It's also entirely possible that some developers will choose to make ray tracing an optional feature on consoles, allowing people to choose to take a framerate and/or resolution hit in order to experience raytracing.

AMD is a generation behind Nvidia for the most part though.
Nvidia will have second-generation ray tracing hardware this year, which will likely come with a significant increase in the compute hardware dedicated to the problem, plus efficiency gains.

Bofferbrauer2 said:

When an XBO game ran at 720p, the PS4 title generally ran at 900p, not 1080p. After all, 1080p has 2.25 times the pixels of 720p, but the PS4 ain't 2.25 times as strong as the XBO.

720p vs 900p is a difference of 56.25% in pixel count (1600x900 vs 1280x720 = 1.5625x), which is more than the theoretical "30%" performance difference. (Which is why flops is bullshit.)


Bofferbrauer2 said:

Those 30% more TFlops would get pretty expensive. Just look at the current AMD GPUs: the RX 5500 and RX 5600 are at 4.8 TFlops and 6 TFlops respectively, yet the price difference is over $100. And it's not just the price of the chip: the cooling can be much weaker, smaller and cheaper, and thus also the casing, making the console overall cheaper to produce.

The price isn't in the Teraflops.

A 50% wider memory bus requires more PCB layers and more intricate power delivery.
Some variants have 50% more memory as well.

The real cost is in the die size... The RX 5600 is a die-harvested RX 5700, whereas the RX 5500 is built from its own smaller, cheaper die that is 58.8% smaller.

So in reality... the RX 5500 should be around 50% cheaper than the RX 5600 due to the roughly 50% decrease in all the hardware, but that is generally not the case... we often see a 75% price discrepancy between the two parts in Oceania.

Bofferbrauer2 said:

You don't seem to understand what compromises need to be made for a modern console to reach a $300 price point. The big consoles will be sold at a massive loss, since they are coming with over 12 TFlops and an 8-core CPU. An RX 5700 XT, currently the strongest AMD GPU, clocks in at just 9 TFlops and yet costs about $400, and the Ryzen 7 3700X costs about $300. AMD doesn't need the console money nearly as badly anymore as they did for the current gen and thus will also ask higher prices for their hardware.

PC costs and console costs tend to be a little different.
Microsoft and Sony buy in bulk, which can mean a significant price reduction, plus they get their own contracts done for fabrication.

There is also part consolidation: GPUs in PCs, for instance, have their own power delivery and RAM; in consoles that is all shared with the other components.

Plus you have the profit margins: AMD tries to retain a 64% or higher profit margin... but that really depends on how performant they are relative to Nvidia and which markets they are chasing... AMD's semi-custom division tends to get lump sums plus licensing revenue, a totally different pricing structure.

And of course you have the die-size vs clock-frequency part of the equation: a smaller chip that can clock higher can mean more Teraflops, yet be cheaper than the part it replaces. We saw this often with AMD's evolution of Graphics Core Next, where they constantly re-balanced hardware every year.
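To put that die-size vs clock trade-off in rough numbers: for GCN/RDNA, FP32 Tflops are just shader count x 2 ops per clock (fused multiply-add) x clock speed, and each CU has 64 shaders. The CU counts and clocks below are purely illustrative, not leaked specs:

```python
def tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    """FP32 Tflops = shaders * 2 ops/clock (fused multiply-add) * clock."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

# A smaller, higher-clocked chip can match a bigger, slower one:
print(tflops(36, 2.0))  # 9.216 Tflops from 36 CUs at 2.0 GHz
print(tflops(40, 1.8))  # 9.216 Tflops from 40 CUs at 1.8 GHz
```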

goopy20 said:

There are a lot of games running at 720p on the Xbox One vs 1080p on the PS4, and not just games that came out later in this console generation, for example: COD Ghosts, MGS5, Pro Evolution 2015, Golf Club. Also a lot of exclusives like Quantum Break, Titanfall, Dead Rising 3 and even Halo 4 & 5 were 720p... The funny thing is that some of you commenting probably own an Xbox One and never even realized you've been playing games at almost the same resolution as the 360. It's because it wasn't a deal breaker for the Xbox One, and not having native 4K isn't going to be a deal breaker next gen either.


A lot of those games either employ a dynamic resolution or use frame reconstruction on Xbox One, so typical pixel counting isn't accurate. Those technologies will be front and center for the 9th gen.
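For anyone unfamiliar with how dynamic resolution works, here's a minimal sketch of the usual idea (a toy controller of my own, not any engine's actual code): measure how long the GPU took last frame and nudge the render scale so it stays inside the frame budget.

```python
def adjust_render_scale(scale: float, gpu_ms: float, budget_ms: float = 33.3,
                        lo: float = 0.5, hi: float = 1.0) -> float:
    """Toy dynamic-resolution controller: keep GPU frame time under budget
    (33.3 ms = 30 fps target). GPU cost tracks pixel count, i.e. scale squared,
    so adjust the per-axis scale by the square root of the headroom."""
    headroom = budget_ms / gpu_ms           # > 1 means time to spare
    new_scale = scale * headroom ** 0.5
    return max(lo, min(hi, new_scale))

# A 1080p-target game running 20% over budget drops to ~0.91 scale next frame,
# i.e. roughly 1750x985 instead of 1920x1080:
print(adjust_render_scale(1.0, gpu_ms=40.0))
```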

Conina said:

960x540 is only one fourth (1/4) of 1920x1080. With perfect scaling (so if CPU, RAM, bandwidth... aren't bottlenecks) you would only need a 2.3 Tflop GPU (1/4 of 9.2 Tflops), not 4 Tflops (same architecture and everything else the same).

Since scaling ain't perfect, you can expect factors of 3 - 5 when you reduce the resolution to 25%.

Let's have a look at your own chart:

<SNIP>

3.6x - 3.8x the performance by reducing the resolution to 25%, as expected.

Yeah, you still suck at math.

Nah. You can't just increase Teraflops and expect a linear increase in performance: not if the bandwidth is the same, not if the texture sampling rate is the same, not if the geometry engines are the same. You need to scale everything else upwards too.

It's why Navi with fewer Teraflops performs better than Graphics Core Next with more Teraflops: because *everything else* got a substantial increase in performance.

And if your design has a bottleneck, sometimes the hit to performance is larger than the performance delta would otherwise imply.
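A toy way to picture that (my own illustration, not how any real GPU is modelled): treat throughput as capped by whichever resource runs out first, so raising compute alone stops helping once bandwidth or geometry becomes the limiter.

```python
def relative_performance(compute: float, bandwidth: float, geometry: float) -> float:
    """Toy bottleneck model: performance is limited by the scarcest resource.
    All inputs are relative units, 1.0 = the baseline machine."""
    return min(compute, bandwidth, geometry)

print(relative_performance(1.0, 1.0, 1.0))  # 1.0  baseline
print(relative_performance(2.0, 1.0, 1.0))  # 1.0  double the Tflops, still bandwidth-bound
print(relative_performance(2.0, 2.0, 2.0))  # 2.0  scale everything and you get the full gain
```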






Conina said:
goopy20 said:

Okay, you got me, I suck at math. Doesn't change the fact that a game running at 1080p/30fps on PS5 will have to be scaled down to 540p to reach 30fps on Series S with half the Tflops. The difference between a 12 Tflops Series X and the 9.2 Tflops PS5 sounds pretty significant and would be worth a premium to me. But when it's a difference between 4 and 9.2, it's no big deal?

960x540 is only one fourth (1/4) of 1920x1080. With perfect scaling (so if CPU, RAM, bandwidth... aren't bottlenecks) you would only need a 2.3 Tflop GPU (1/4 of 9.2 Tflops), not 4 Tflops (same architecture and everything else the same).

Since scaling ain't perfect, you can expect factors of 3 - 5 when you reduce the resolution to 25%.

Let's have a look at your own chart:

3.6x - 3.8x the performance by reducing the resolution to 25%, as expected.

Yeah, you still suck at math.

Yes, if games were native 4K on PS5, they would be 1080p on Series S. I'm pretty sure I told you that already, and anyone who knows anything about game design will also tell you that native 4K is not gonna be a target for next-gen console games, as it's way too expensive. My guess is that 1440p and 1080p will be standard, as that looks good enough on a TV and it frees up a ton of resources that developers can use elsewhere.

So, for argument's sake, let's say a game is 1080p/30fps on PS5; at what kind of resolution do you think it would have to run on Series S to hit that same 30 fps?



Radek said:
goopy20 said:

Yes, if games were native 4K on PS5, they would be 1080p on Series S. I'm pretty sure I told you that already, and anyone who knows anything about game design will also tell you that native 4K is not gonna be a target for next-gen console games, as it's way too expensive. My guess is that 1440p and 1080p will be standard, as that looks good enough on a TV and it frees up a ton of resources that developers can use elsewhere.

So, for argument's sake, let's say a game is 1080p/30fps on PS5; at what kind of resolution do you think it would have to run on Series S to hit that same 30 fps?

Arguing that is pointless to begin with, because PS5 games won't target 1080p. 1440p, at 44% of the pixels of native 4K, is the absolute minimum, just like 720p, at 44% of the pixels of 1080p, is the absolute minimum on base PS4. Very few games are 720p even on base Xbox One. Why would a 2020 Series S target a resolution lower (540p) than the Xbox 360 from 2005?

1800p is the minimum. We don't see 720p PS4 games, right? Then we won't really see 1440p PS5 games (not including disgraces like Ark).



The only thing I learned in this thread is goopy thinks PS5 games are aiming for 1080p.

Therefore Series S will be a 540p console.




Nu-13 said:
Radek said:

Arguing that is pointless to begin with, because PS5 games won't target 1080p. 1440p, at 44% of the pixels of native 4K, is the absolute minimum, just like 720p, at 44% of the pixels of 1080p, is the absolute minimum on base PS4. Very few games are 720p even on base Xbox One. Why would a 2020 Series S target a resolution lower (540p) than the Xbox 360 from 2005?

1800p is the minimum. We don't see 720p PS4 games, right? Then we won't really see 1440p PS5 games (not including disgraces like Ark).

I know I said I'd leave this thread, but it's too irksome to read this and not respond lol

You cannot arbitrarily compare 1440p for next-gen systems to 720p for current gen. 720p is a blurry image by modern standards which hides lots of detail like textures and DOF, making a move to higher resolutions a necessity to actually exploit the hardware's potential. It's not adequate for most people's gaming setups unless they have a small 27" TV and sit quite far from their display. Even content on YouTube rarely sits at 720p; 1080p is the standard for what people experience and expect.

1440p with quality reconstruction is not a blurry image that hides lots of detail, and it is not below standards; in fact it is a quality very few console gamers have ever experienced. It's a brilliant, clear image which makes a notable improvement on 1080p and has delivered some of (if not the) best looking games of the console generation (Uncharted 4, Death Stranding, God of War, Horizon, soon to be TLOU2 as well). Even 1080p (reconstructed to 4K) next generation will be way more acceptable compared to 720p this generation, so I expect we will see it, especially in experimental ray tracing titles and in game settings where 60fps/RTX is seen as an option but not the default. There's an actual human experience behind these numbers, which is why 30-60fps has been the standard for 30 years; the numbers aren't just going up for the sake of it. Developers are not going to mandate 1800p because it's X% of 4K or X multiples of 1080p. They're going to look at the balance of visual characteristics in their game and aim for a combination that achieves their vision and is most impressive for the end user, which is also why dynamic resolution and VRS will permeate next gen.

Again, I would really love to see someone who has played a game like God of War at 1440p on a 50" 4K display complain that the image quality is bad and that gamers will not be happy with it over the coming years.



Radek said:
Nu-13 said:

1800p is the minimum. We don't see 720p PS4 games, right? Then we won't really see 1440p PS5 games (not including disgraces like Ark).

1800p sounds good, as it's almost 3 times the pixel count of 1080p. I'd say anywhere from 1620p to 4K, with 1800p being the sweet spot for more frames and/or effects. For example, Hunt Showdown is 1800p on One X, 1440p on Pro, 1080p on PS4 and 900p on Xbox One / S. 1800p is 4 times the pixels of 900p, so it's still a big upgrade over both 900p and 1080p.

There are plenty of 900p games on PS4 as well. I am no game developer, but everyone and their grandmother knows that on consoles, developers always need to make compromises and will aim for the best bang for the buck. Native 4K or 1800p isn't that, as it would cut fps in half compared to 1080p, meaning overall visual fidelity would have to be cut in half as well. Of course, we'll see some cross-gen and indie games in native 4K, but it will be a design choice based on how visually ambitious the game is. The PS3 also had a ton of indie games running at 1080p, but that didn't mean they got praised for their awesome graphics compared to something like TLOU running at 720p.

Now, I understand 8K sounds awesome and we don't know yet how AMD's Ray Tracing cores will perform, but I don't think people realize the kind of resources native 4K and RT require, let alone 8K lol. Just to give you an idea:

To enable 4K 60FPS in Control natively, we will need a graphics card that is 2x stronger than the RTX 2080 Ti. That's without raytracing. With raytracing, that number doubles again. Nvidia's RT series may be powerful, but we need beefier graphics cards to make the 4K 60FPS dream possible. This is especially true if we want to keep adding raytracing into the mix.

https://www.overclock3d.net/reviews/gpu_displays/control_rtx_raytracing_pc_analysis/6

  

So the question is, do we really want the next gen games just to be defined by resolution, with not enough power dedicated to providing an actual leap in graphical fidelity? You know, the stuff that actually matters in defining new experiences associated with a new console generation.

I'm sure Pemalite, Cgi-quality or anyone who knows anything about graphics will tell you that developers pushing native 4K on consoles may not be the best idea. Obviously, games should look sharper on PS5, but not at the cost of half the GPU resources. It's far more likely the PS5 will be using techniques like temporal injection, checkerboard rendering etc. that'll get the job done nicely, without butchering the performance and ambitions of these next-gen games.
