
Forums - Microsoft - Rumor: Xbox "Lockhart" specs leaked, is $300

Bofferbrauer2 said:
DonFerrari said:

We already have Daniel Ahmad's estimate, which seems to have been validated, that the PS5's cost to build is $450 and the XSX's is $460-510. That would leave marketing, logistics, and a small margin for sellers. So PS5 and XSX could both be sold below $500 with some loss to the platform holder.

I know those estimates, but I don't really believe them unless he shows exactly what he calculated. Unless he just added up the pure production cost of each part, without taking any margins into account, I find those numbers very hard to believe.

Ok then, we will have to wait and see what they decide for the launch price and get a better cost estimate.




zero129 said:
goopy20 said:

Okay, you got me, I suck at math. Doesn't change the fact that a game running at 1080p/30fps on PS5 would have to be scaled down to 540p to reach 30fps on a Series S with half the Tflops. The difference between a 12 Tflops Series X and the 9.2 Tflops PS5 sounds pretty significant and would be worth a premium to me. But when it's a difference between 4 and 9.2, it's no big deal?

Why do you keep using quotes that are biased toward your agenda and don't show any kind of context?

For instance, a user could still run Dishonored 2 at 1440p on a GTX 1060 with everything on ultra, mixed with some less noticeable settings on high/medium, and I bet if you showed the average user both, they wouldn't be able to tell the difference.

Not everything has to be all ultra or all medium settings, you know. That's the great thing about game engines these days: they are all built to scale.

You said it yourself: if the average user wouldn't be able to tell the difference between high and medium settings, why would a developer waste resources on using them for the base platform? Ultra settings are an optional thing on PC for those who have the hardware to turn them on. On consoles, however, graphics settings are a design choice. Developers will optimize their games to look as good as possible on the locked specs of a base console, and they'll always do so in ways that require as few resources as possible. It's called optimization, and it's the reason why console games still look as good as they do on a 9-year-old budget GPU.

Last edited by goopy20 - on 13 March 2020

Pemalite said:

1. AMD is a generation behind nVidia for the most part though.
nVidia will have second generation ray tracing hardware this year, which will likely correspond with a significant increase in compute hardware to tackle the problem and efficiency gains.

Bofferbrauer2 said:

When an XBO game ran in 720p, then the PS4 title generally ran in 900p, not 1080p. After all, 1080p is twice as large as 720p, but the PS4 ain't twice as strong as the XBO.

2. 720P vs 900P is a difference of 56.25% in pixel count, which is more than the theoretical "30%" performance difference. (Which is why flops is bullshit.)
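The pixel counts behind these percentages are easy to verify (a rough sketch only; real-world performance rarely scales linearly with pixel count):

```python
# Pixel-count ratios between common resolutions.

def pixels(width, height):
    return width * height

res_720p  = pixels(1280, 720)    # 921,600 pixels
res_900p  = pixels(1600, 900)    # 1,440,000 pixels
res_1080p = pixels(1920, 1080)   # 2,073,600 pixels

# 900p pushes 56.25% more pixels than 720p...
print(res_900p / res_720p - 1)    # 0.5625
# ...while 1080p is 2.25x the pixels of 720p, not 2x.
print(res_1080p / res_720p)       # 2.25
```

This is also why halving a console's flops does not mean halving the resolution on each axis: resolution scales with the square root of the pixel budget.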

Bofferbrauer2 said:

Those 30% more TFlops would get pretty expensive. Just look at the current AMD GPUs: the RX 5500 and RX 5600 are at 4.8 TFlops and 6 TFlops respectively, yet the price difference is over $100. And it's not just the price of the chip: the cooling can be much weaker, smaller, and cheaper, and thus also the casing, making the console cheaper to produce overall.

3a. The price isn't in the Teraflops.

50% wider memory bus, requires more PCB layers, more intricate power delivery.
Some variants have 50% more memory as well.

The real cost is in the die size... The RX 5600 is a die-harvested RX 5700, whereas the RX 5500 is built from its own smaller, cheaper die (the RX 5600's die is roughly 58.8% larger).

So in reality... the RX 5500 should be around 50% cheaper than the RX 5600 due to the roughly 50% decrease in all the hardware, but that is generally not the case... we often see a 75% price discrepancy between the two parts in Oceania.
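Plugging in the commonly cited die areas for these parts (Navi 10 at roughly 251 mm² for the RX 5700/5600, Navi 14 at roughly 158 mm² for the RX 5500; figures assumed here for illustration, not official BOM data) shows where that percentage comes from:

```python
# Approximate die areas in mm^2 (assumed figures for illustration).
navi10_area = 251  # RX 5700; the RX 5600 is a die-harvested variant
navi14_area = 158  # RX 5500

# The RX 5600's die is ~58.9% larger than the RX 5500's...
print(navi10_area / navi14_area - 1)    # ~0.589
# ...equivalently, the RX 5500's die is ~37% smaller.
print(1 - navi14_area / navi10_area)    # ~0.37
```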

Bofferbrauer2 said:

You don't seem to understand what compromises need to be made for a modern console to reach a $300 price point. The big consoles will be sold at a massive loss since they are coming with over 12 TFlops and an 8-core CPU. The RX 5700 XT, currently the strongest AMD GPU, clocks in at just 9 TFlops and yet costs about $400; the Ryzen 7 3700X costs about $300. AMD doesn't need the console money nearly as badly as they did for the current gen and thus will also ask higher prices for their hardware.

3b. PC costs and Console costs tend to be a little different.
Microsoft and Sony buy in bulk which can be a significant price reduction, plus they get their own contracts done for fabrication.

There is also part consolidation: GPUs in PCs, for instance, have their own power delivery and RAM; on consoles that is all shared with the other components.

Plus you have the profit margins, AMD tries to retain a 64% or higher profit margin... But that really depends on how performant they are relative to nVidia and what markets they are chasing... AMD's semi-custom division tends to get lump sums with licensing revenue, totally different pricing structure.

And of course you have the die-size vs. clock-frequency part of the equation: a smaller chip that can clock higher can deliver more Teraflops while still being cheaper than the part it replaces. We saw this often with AMD's evolution of Graphics Core Next, where they constantly re-balanced hardware every year.

1: Until we know more about RDNA2 and Ampere, I'd hold off on that. AMD could be closing the gap with a strong leap, Ampere might not turn out as great as expected, and so on...

2: I know, but this was in response to someone claiming that the PS4 was twice as powerful and that the graphical disparity between the consoles was larger than it really is; I just told him that the difference was smaller than he thought.

3a+b: I know, I just didn't want to go into too much detail and wanted to keep it simple.

Last edited by Bofferbrauer2 - on 13 March 2020

Otter said:
Nu-13 said:

1800p is the minimum. We don't see 720p PS4 games, right? Then we won't really see 1440p PS5 games (not including disgraces like Ark).

I know I said I'd leave this thread, but it's too irksome to read this and not respond lol

You cannot arbitrarily compare 1440p for next-gen systems to 720p for the current gen. 720p is a blurry image by modern standards which hides lots of detail like textures and DOF, making a move to higher resolutions a necessity to actually exploit the hardware's potential. It's not adequate for most people's gaming setups unless they have a small 27" TV and sit quite far from their display. Even content on YouTube rarely sits at 720p; 1080p is the standard for what people experience and expect.

1440p with quality reconstruction is not a blurry image that hides lots of detail, and it is not below standards; in fact it is a quality very few console gamers have ever experienced. It's a brilliant, clear image which makes a notable improvement on 1080p and has delivered some of (if not the) best looking games of the console generation (Uncharted 4, Death Stranding, God of War, Horizon, soon to be TLOU2 as well). Even 1080p (reconstructed to 4K) will be far more acceptable next generation than 720p was this generation, so I expect we will see it, especially in experimental ray tracing titles and in game settings where 60fps/RTX is an option but not the default. There's an actual human experience behind these numbers, which is why 30-60fps has been the standard for 30 years; the numbers aren't just going up for the sake of it. Developers are not going to mandate 1800p because it's X% of 4K or X multiples of 1080p. They're going to look at the balance of visual characteristics in their game and aim for a combination that achieves their vision and is most impressive for the end user, which is also why dynamic resolution and VRS will permeate next gen.

Again, I would really love to see someone who has played a game like God of War at 1440p on a 50" 4K display complain that the image quality is bad and that gamers will not be happy with it over the coming years.

Just because a game is 1440P, doesn't mean everything in said game is rendered at 1440P.

Bofferbrauer2 said:

1: Until we know more about RDNA2 and Ampere, I'd hold off on that. AMD could be closing the gap with a strong leap, Ampere might not turn out as great as expected, and so on...

There will definitely be a sizable uptick in performance either way thanks to the increased transistor budget allowing nVidia some movement on that front and to account for any stuff ups.

shikamaru317 said:

2. Nvidia's 20 series will be 2 years old when next-gen consoles launch. There is a good chance that AMD's RDNA 2 ray tracing will be more efficient than Nvidia's 20 series raytracing, even if it is less efficient than Nvidia's upcoming 30 series. And like I said before, developers may make ray tracing an optional feature in many next-gen games, giving gamers a choice between ray tracing and higher resolution and/or framerate. 

There is a chance that AMD's RDNA 2 will be less efficient too, and less capable on the ray tracing front; let's not count our chickens just yet... nVidia has had the technology edge for years now.

shikamaru317 said:

3. PC games don't receive the same level of optimization that console games do, and console hardware has less OS overhead than PCs. So comparing Control on PC to next-gen console games and saying that you won't see a graphical improvement if they aim for native 4K is a bad comparison.

Do consoles receive extra optimization? Sure.
But that optimization hasn't made the Radeon 7850 unable to play console games at PlayStation 4/Xbox One levels of visuals, for the most part.

And OS overhead is actually *less* on PC. Windows 10 doesn't reserve 2x CPU cores and 3GB of RAM on a PC.





shikamaru317 said:
goopy20 said:

There are plenty of 900p games on PS4 as well. I am no game developer, but everyone and their grandmother knows that on consoles, developers always need to make compromises and will aim for the best bang for the buck. Native 4K or 1800p isn't that, as it would cut fps in half compared to 1080p, meaning overall visual fidelity would have to be cut in half as well. Of course, we'll see some cross-gen and indie games in native 4K, but it will be a design choice based on how visually ambitious the game is. The PS3 also had a ton of indie games running in 1080p, but that didn't mean they got praised for their awesome graphics compared to something like TLOU running in 720p.

Now, I understand 8K sounds awesome and we don't know yet how AMD's ray tracing cores will perform, but I don't think people realize the kind of resources native 4K and RT require, let alone 8K lol. Just to give you an idea:

To enable 4K 60FPS in Control natively, we will need a graphics card that is 2x stronger than the RTX 2080 Ti. That's without raytracing. With raytracing, that number doubles again. Nvidia's RTX series may be powerful, but we need beefier graphics cards to make the 4K 60FPS dream possible. This is especially true if we want to keep adding raytracing into the mix.

https://www.overclock3d.net/reviews/gpu_displays/control_rtx_raytracing_pc_analysis/6

  

So the question is, do we really want the next gen games just to be defined by resolution, with not enough power dedicated to providing an actual leap in graphical fidelity? You know, the stuff that actually matters in defining new experiences associated with a new console generation.

I'm sure Pemalite, Cgi-quality, or anyone who knows anything about graphics will tell you that developers pushing native 4K on consoles may not be the best idea. Obviously, games should look sharper on PS5, but not at the cost of half the GPU resources. It's far more likely the PS5 will be using techniques like temporal injection, checkerboard rendering, etc. that'll get the job done nicely without butchering the performance and ambitions of these next-gen games.

You seem to be forgetting some things:

1. Control is a game that was designed to scale on everything from a 1.3 tflop GPU, Jaguar CPU base Xbox One to PCs with 12 tflop 2080 Ti GPUs and Core i9 CPUs. As a late-gen game it had to be designed to run on a huge range of hardware, and that means visual compromises had to be made on both ends of the scale to get it to run on everything. If Control had been designed as a PC exclusive, with minimum specs at 4 tflop and recommended specs at 10 tflop for instance, you would have a much better looking game.

2. Nvidia's 20 series will be 2 years old when next-gen consoles launch. There is a good chance that AMD's RDNA 2 ray tracing will be more efficient than Nvidia's 20 series raytracing, even if it is less efficient than Nvidia's upcoming 30 series. And like I said before, developers may make ray tracing an optional feature in many next-gen games, giving gamers a choice between ray tracing and higher resolution and/or framerate. 

3. PC games don't receive the same level of optimization that console games do, and console hardware has less OS overhead than PC's. So comparing Control on PC to next-gen console games and saying that you won't see a graphical improvement if they aim for native 4K is a bad comparison.  

What does all of that mean? Well, for one thing, the minimum spec for next gen will increase by over 4x compared to last gen, from 1.3 tflop GCN to 4 tflop RDNA 2 (equivalent to 6 tflop GCN at least, possibly more, especially if MS decides to overclock XSS some before release). Assuming the same resolution as last gen, 900p for XB1 compared to 900p for XSS, that extra 4x power can be fully utilized to push graphics forward with ray tracing and other improvements. Then those graphical improvements from the 4x spec improvement on the low end can be ported over to the two higher end consoles, PS5 and XSX, and resolution scaled up as high as it can go without compromising framerate, using variable resolution tech to lower resolution in demanding scenes and increase it in less demanding scenes.

Now will devs actually aim for 900p on XSS so that the full 4x spec improvement can be put toward graphical improvements? Some probably will, while others will follow Microsoft's target and aim for 1080p. I'll be very surprised if we see more than a handful of games that run at less than 900p on XSS; the same goes for 1800p on XSX. PS5 is hard to say without knowing more about its specs; if it is actually 9.2 tflop with Sony attempting to overclock it to 10 tflop as rumored, I would say that we'll see 1440p as the minimum for PS5. And yes, I agree with you that any PS5 and XSX games that are less than native 4K will likely use methods like checkerboarding and temporal injection to improve their upscaling to 4K.
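To make the resolution-scaling argument concrete, here's a deliberately naive model (it assumes frame rate scales linearly with both pixel count and tflops, which real games only approximate at best):

```python
def required_tflops(base_tflops, base_pixels, target_pixels):
    """Naive linear estimate of the GPU power needed to hold the
    same frame rate while raising the pixel count."""
    return base_tflops * target_pixels / base_pixels

p900  = 1600 * 900
p2160 = 3840 * 2160   # native 4K

# If 4 tflops drives a game at 900p, the same workload at native 4K
# would need about 4 * (p2160 / p900) = 23 tflops under this model,
# which is why reconstruction techniques matter so much.
print(required_tflops(4, p900, p2160))
```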

Almost any current-gen game will cripple the performance of a 2080 Ti at native 4K with RT enabled.

In any case, this whole "Series S is a great idea" argument is based on assuming developers will waste a ton of resources on native 4K and ultra graphics settings (that most people won't notice) on the PS5. Yes, if they did that, they could lower the graphics settings and have the game running in 1080p on Series S. The only thing you seem to be forgetting is that developers don't just waste resources like that, and if they did, they would be doing a poor job. In reality they will be as efficient as possible: they'll use things like checkerboard rendering that don't take that much of a hit on performance, and avoid graphics settings that take up too many resources for a relatively small gain in visuals.

Also, if what you say is true and they did target 1080p on Series S, wouldn't that mean visual fidelity would have to be scaled down across all platforms? The whole idea of having minimum and high-end console specs sounds pretty terrible to me. Consoles aren't PCs, and I don't want developers making 4 different versions of their games; I want them to focus on a single platform and use it to its fullest potential.

Last edited by goopy20 - on 13 March 2020

shikamaru317 said:
victor83fernandes said:

1 - $600 back in 2006 was worth a lot more than $600 now.

2 - This could be $600, but it's a premium, only for the hardcore who can afford it, just like the Xbox One X at $500 vs the Xbox One S at $250.

3 - They don't expect it to sell a lot; just like this gen, the base consoles sell the most. It's all about having the option for people willing to pay.

4 - Options are good; it caters to everyone. Ideally 3 models: S-AD for people who don't care about graphics and will buy only digital, the S model for people who are not fussed about graphics (the same people who bought an S instead of the X this gen), and the Series X, for the kind of people who bought the X this generation because they are willing to pay for premium graphics and hardware.

5 - The cost of the Series X is definitely more than $460. I have no idea how you came up with that number, but a 12TF next-gen GPU + good CPU + controller + SSD + motherboard, power unit, antennas, case, cables, RAM, 4K Blu-ray drive + cooling system + shipping + building costs + labour costs definitely comes to more than $500. Do your research properly.

1. Perhaps, but I don't think gamers will be any more forgiving of $600 now than they were 14 years ago. Gamers are an entitled bunch; just look at how much they rage when a publisher suggests raising the price of games from $60, where it has sat for 15+ years, even though every other product has seen inflation-based price increases in that time span.

2. I doubt it. Phil has said in interviews that Microsoft understands what gamers consider to be a reasonable price for hardware. I can't see him going for $600, that is definitely over gamer expectations. 

3. I'm pretty sure they have higher expectations for Series X than you think. Sales expectations were low for Xbox One X because it was a mid-gen refresh, MS knew that selling people a $500 system 3 years before next gen starts is a tougher prospect than selling them a $500 console at the start of a new gen. Yes, sales expectations will be higher for Series S than Series X, but they likely want to sell at least 25m Series X consoles this gen imo, not 5m or so like Xbox One X.

4. I actually agree with you that they need a disc drive S model for slightly more than the discless model. That might be the model I buy if they release it. I'm just not sure that it is happening, every leak so far has mentioned S being discless, none of them have mentioned the possibility of a slightly more expensive disc drive S model. It's also harder to sell people on a 4 tflop console for $350 when PS5 is expected to be about 10 tflop for $450, only $100 more for 6 more tflops of graphical power. They could go for an unusual price like $320 or $330 for the disc drive model, but console pricing almost always comes in increments of $50. 

5. The $460-510 estimated cost to build for Series X comes from Industry Analyst Daniel Ahmad (ZhugeEX), based on the rumored specs for Xbox Series X, compared to the rumored specs for PS5, which was recently leaked to have a $450 cost to build. If he is right, I definitely expect MS to sell Series X for $500. 

1 - It's not "perhaps", it's definitely; it's called inflation. $600 back then is over $800 now, so $600 now is a bargain. People buy $1500 phones now; back then that was unthinkable. People are ready to pay. New games will be $65, so only people with money will jump in on launch day. Game prices haven't jumped yet because you already artificially pay much more: if you buy one game per month and pay $15 for online, the game effectively costs you $75, whereas back then PS3 online was free.

2 - But $600 is very, very reasonable for hardware that costs over $700; that's his own words. Gamers bought the X and thought it was reasonable at $500 with much lower specs, and some gamers even buy $1500 PCs and think that's a reasonable price. It's all relative: if the machine is 12 teraflops with an SSD and better architecture, then $600 is more than reasonable.

3 - Wrong, expectations for the X were high; in fact there was more fuss about it than about the Series X. I believe they've already accepted that they can't beat Sony no matter what. If they thought they could, they wouldn't be in a rush to market the thing; they would have waited. 25 million Series X? I don't know what drugs you are on, but 70-80% of sales will definitely be the cheap model; the Series X will sell around 15 million, and the Series S around 40 million.

4 - Why? People on this website are hardcore gamers; why buy the cheap model if there's a better model available? I will always go for the best of the best model. My doubt is whether I should just build a PC, which can play Xbox games plus thousands of PC games and emulators, with free online and cheaper games. The PS5, on the other hand, is a given that I buy because of the exclusives; Xbox won't even have exclusives for the first 2 years, so I can wait and see.

Yeah, it's hard to sell Xbox at all. It could have the same power as a PS5 at a $100 difference and people would still go for PS5. $100 is only slightly more than the price of one game; $100 is nothing these days, it's basically one night out with the girlfriend.

Also, 4TF is just a rumour, and the 11TF PS5 is also just a rumour; it could be the PS5 Pro and not the lower model. We don't know yet.

With that said, let's not forget that this generation the PS4 was more powerful and a full $100 cheaper, with no need to buy rechargeable batteries either, so it was more like $120 cheaper.

5 - Yeah, if that guy is so good, then how much did he say the shipping costs were? And the manufacturing costs, as in salaries of employees? How much did he say a next-gen controller costs? How much for the cables and antennas?

I haven't even checked and I can assure you he's wrong. Fortunately I've got a brain, and I've been reading gaming news for several generations now.

Rumoured PS5 specs? C'mon, start thinking for yourself. The specs haven't even been revealed yet; how could analysts know the final specs if even developers don't know?
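For what it's worth, the inflation claim in point 1 can be sanity-checked with a simple compound-growth estimate (the flat 2% annual rate is an assumption; actual US CPI figures put $600 from 2006 at roughly $770 in 2020 dollars, a bit short of "over 800"):

```python
def inflate(amount, annual_rate, years):
    """Compound an amount forward by a flat annual inflation rate."""
    return amount * (1 + annual_rate) ** years

# $600 in 2006 carried forward 14 years at an assumed 2% per year:
print(round(inflate(600, 0.02, 14)))   # 792
```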



Pemalite said:
Otter said:

I know I said I'd leave this thread, but it's too irksome to read this and not respond lol

You cannot arbitrarily compare 1440p for next-gen systems to 720p for the current gen. 720p is a blurry image by modern standards which hides lots of detail like textures and DOF, making a move to higher resolutions a necessity to actually exploit the hardware's potential. It's not adequate for most people's gaming setups unless they have a small 27" TV and sit quite far from their display. Even content on YouTube rarely sits at 720p; 1080p is the standard for what people experience and expect.

1440p with quality reconstruction is not a blurry image that hides lots of detail, and it is not below standards; in fact it is a quality very few console gamers have ever experienced. It's a brilliant, clear image which makes a notable improvement on 1080p and has delivered some of (if not the) best looking games of the console generation (Uncharted 4, Death Stranding, God of War, Horizon, soon to be TLOU2 as well). Even 1080p (reconstructed to 4K) will be far more acceptable next generation than 720p was this generation, so I expect we will see it, especially in experimental ray tracing titles and in game settings where 60fps/RTX is an option but not the default. There's an actual human experience behind these numbers, which is why 30-60fps has been the standard for 30 years; the numbers aren't just going up for the sake of it. Developers are not going to mandate 1800p because it's X% of 4K or X multiples of 1080p. They're going to look at the balance of visual characteristics in their game and aim for a combination that achieves their vision and is most impressive for the end user, which is also why dynamic resolution and VRS will permeate next gen.

Again, I would really love to see someone who has played a game like God of War at 1440p on a 50" 4K display complain that the image quality is bad and that gamers will not be happy with it over the coming years.

Just because a game is 1440P, doesn't mean everything in said game is rendered at 1440P.

Bofferbrauer2 said:

1: Until we know more about RDNA2 and Ampere, I'd hold off on that. AMD could be closing the gap with a strong leap, Ampere might not turn out as great as expected, and so on...

There will definitely be a sizable uptick in performance either way thanks to the increased transistor budget allowing nVidia some movement on that front and to account for any stuff ups.

shikamaru317 said:

2. Nvidia's 20 series will be 2 years old when next-gen consoles launch. There is a good chance that AMD's RDNA 2 ray tracing will be more efficient than Nvidia's 20 series raytracing, even if it is less efficient than Nvidia's upcoming 30 series. And like I said before, developers may make ray tracing an optional feature in many next-gen games, giving gamers a choice between ray tracing and higher resolution and/or framerate. 

There is a chance that AMD's RDNA 2 will be less efficient too, and less capable on the ray tracing front; let's not count our chickens just yet... nVidia has had the technology edge for years now.

shikamaru317 said:

3. PC games don't receive the same level of optimization that console games do, and console hardware has less OS overhead than PCs. So comparing Control on PC to next-gen console games and saying that you won't see a graphical improvement if they aim for native 4K is a bad comparison.

Do consoles receive extra optimization? Sure.
But that optimization hasn't made the Radeon 7850 unable to play console games at PlayStation 4/Xbox One levels of visuals, for the most part.

And OS overhead is actually *less* on PC. Windows 10 doesn't reserve 2x CPU cores and 3GB of RAM on a PC.

Guys, so much talk about resolution; it's pointless. Anything that is 1080p is already fantastic. I have a 1080p 120-inch projector, and I've played the same consoles and games before on my 4K HDR Panasonic 50-inch. I've played full 4K on my Xbox X, and I still prefer it in 1080p on the projector. (The reason I have both is that the projector is in my house back in Portugal while the TV is with me in the UK where I work; I'd love a projector here too, but I have no space.)

Movement seems more natural and the overall image looks more realistic, not as fake as on LCD TVs. The only good thing about high resolution is the supersampling, hence why my Xbox X looks better playing in 4K on a 1080p screen: the image is clean, not one single jaggie even at 120 inches. In fact, when I build my next PC I will look for a 1080p monitor to get the benefit of supersampling; I prefer antialiasing over resolution any day of the week. Even N64 games look fantastic with AA on.
On the other hand, 720p is unacceptable; it looks way more blurry, very noticeably so.
Forget resolution; 1440p is more than anyone really needs at any screen size.
Start worrying about 60fps as a minimum on all games, with at least 4x AA.

1080p is more than sharp enough; the bigger difference comes from AA and many other effects. I've also played games at 1440p on PC that look horrible even with 16x AA.

Image quality is way more than resolution. Even HDR is just marketing, because contrast looks better on my projector than on my 4K HDR Panasonic. In fact, everyone who comes over is way more impressed with the image size and how natural and realistic it looks; no one has praised my 4K TV. It's barely better than a good 1080p TV and has no wow factor.

If you want quality, then get good equipment and great speakers. A cheap Chinese-brand 4K HDR TV will look miles worse than a top-of-the-range 1080p TV.
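The supersampling effect described above (rendering at 4K, displaying at 1080p) boils down to averaging each 2x2 block of rendered pixels into one display pixel. A minimal sketch of the idea:

```python
def downsample_2x2(image):
    """Box-filter a grid of pixel values down to half resolution:
    each output pixel is the average of a 2x2 input block."""
    h, w = len(image), len(image[0])
    return [
        [(image[y][x] + image[y][x + 1] +
          image[y + 1][x] + image[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A hard black/white edge in the high-res render becomes a soft
# gradient step in the output, which is why the jaggies disappear.
frame = [[0, 0, 255, 255],
         [0, 255, 255, 255],
         [0, 0, 255, 255],
         [0, 255, 255, 255]]
print(downsample_2x2(frame))   # [[63.75, 255.0], [63.75, 255.0]]
```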

Last edited by victor83fernandes - on 13 March 2020

victor83fernandes said:

Image quality is way more than resolution. Even HDR is just marketing, because contrast looks better on my projector than on my 4K HDR Panasonic. In fact, everyone who comes over is way more impressed with the image size and how natural and realistic it looks; no one has praised my 4K TV. It's barely better than a good 1080p TV and has no wow factor.

If you want quality, then get good equipment and great speakers. A cheap Chinese-brand 4K HDR TV will look miles worse than a top-of-the-range 1080p TV.

I really doubt that the contrast of your projector looks better than the contrast of even a cheap TV without HDR.

Projectors have some advantages (screen size, better immersion due to a bigger field of view), but contrast in particular is one of their weak points: a worse black level combined with lower brightness (especially stretched across a big screen) compared to standard TVs.

Which projector do you have?

Which Panasonic TV do you have?



shikamaru317 said:

You seem to be forgetting some things:

1. Control is a game that was designed to scale on everything from a 1.3 tflop GPU, Jaguar CPU base Xbox One to PCs with 12 tflop 2080 Ti GPUs and Core i9 CPUs. As a late-gen game it had to be designed to run on a huge range of hardware, and that means visual compromises had to be made on both ends of the scale to get it to run on everything. If Control had been designed as a PC exclusive, with minimum specs at 4 tflop and recommended specs at 10 tflop for instance, you would have a much better looking game.

I agree with most of what you said, or at least have no real disagreements, but I will respond to this.

It's hard to say what Control would be if it had been designed solely for PCs (not base consoles as well), because there would be graphics settings to adjust and ray tracing is optional. Generally, PC games are already designed to work across various levels of specs so more people can play them.

What does a 4 tflop minimum really mean? Does it mean that to run the game at the lowest settings, 720p, and 30 fps you need 4 tflops? Even then, settings could likely be tweaked further for a playable experience on lower specs. In theory, Control could function on Switch with a serious visual overhaul but fundamentally be the same game.

We could say numerous 8th-gen-era games were designed with the X1 as a minimum, such as The Witcher 3, because I don't think they anticipated scaling it back for Switch.

Crysis 2 and 3 on PC require specs far better than 7th-gen consoles; you can't even lower the graphics settings to console levels in the PC versions. Yet the scaled-back versions designed for consoles are essentially the same games.

Like I've said before, it's evidently easier to lower the GPU demands of a game than really high CPU and RAM requirements. Even if a game is designed with a powerful graphics card in mind, a lower-spec console can generally handle the same game with scaled-back visuals.

Last edited by Mr Puggsly - on 14 March 2020


Mr Puggsly said:
shikamaru317 said:

You seem to be forgetting some things:

1. Control is a game that was designed to scale on everything from a 1.3 tflop GPU, Jaguar CPU base Xbox One to PCs with 12 tflop 2080 Ti GPUs and Core i9 CPUs. As a late-gen game it had to be designed to run on a huge range of hardware, and that means visual compromises had to be made on both ends of the scale to get it to run on everything. If Control had been designed as a PC exclusive, with minimum specs at 4 tflop and recommended specs at 10 tflop for instance, you would have a much better looking game.

I agree with most of what you said, or at least have no real disagreements, but I will respond to this.

It's hard to say what Control would be if it had been designed solely for PCs (not base consoles as well), because there would be graphics settings to adjust and ray tracing is optional. Generally, PC games are already designed to work across various levels of specs so more people can play them.

What does a 4 tflop minimum really mean? Does it mean that to run the game at the lowest settings, 720p, and 30 fps you need 4 tflops? Even then, settings could likely be tweaked further for a playable experience on lower specs. In theory, Control could function on Switch with a serious visual overhaul but fundamentally be the same game.

We could say numerous 8th-gen-era games were designed with the X1 as a minimum, such as The Witcher 3, because I don't think they anticipated scaling it back for Switch.

Crysis 2 and 3 on PC require specs far better than 7th-gen consoles; you can't even lower the graphics settings to console levels in the PC versions. Yet the scaled-back versions designed for consoles are essentially the same games.

Like I've said before, it's evidently easier to lower the GPU demands of a game than really high CPU and RAM requirements. Even if a game is designed with a powerful graphics card in mind, a lower-spec console can generally handle the same game with scaled-back visuals.

Not all games can be ported to the Switch, though. We've seen some pretty decent ports, but there are also some really bad ones like FIFA, Overwatch, Ark, etc. We also know that Cyberpunk can't be done on the Switch; hell, they delayed it because they're having problems getting it to run on the X1. It probably depends on whether it's a CPU- or GPU-heavy game.

Last edited by goopy20 - on 14 March 2020