
Would you rather have spent more $ for more powerful 8th gen consoles?


 

Poll: I...

am happy with current console power: 88 votes (56.77%)
would rather have paid more for more power: 67 votes (43.23%)

Total: 155

Whatever we have now is fine. It isn't #Pcmasterrace gaming, but it's more than enough for most (including me).



Made a bet with LipeJJ and HylianYoshi that the XB1 will reach 30 million before the Wii U reaches 15 million. Loser has to have their avatar picked by the winner for 6 months (or if I lose, either 6 months of avatar control for both Lipe and Hylian, or my Patrick avatar comes back forever).

CGI-Quality said:
shikamaru317 said:

A CD Projekt dev hinted that they were expecting more power from the XB1 and PS4, which is what led to the Witcher 3 downgrade. Likewise, a former Ubisoft dev who worked on Watch Dogs said that they were expecting more power as well, which is what led to the Watch Dogs downgrade.

There's also the fact that the PS4 and XB1 were equivalent to mid-range PCs at release: 1300 GFLOPS and 1800 GFLOPS for the XB1 and PS4, while the best PC GPU in 2013 had 5000+ GFLOPS. Meanwhile, when the 360 released in 2005, it had a 240 GFLOPS GPU, while the best PC GPUs at the time were around 260 GFLOPS, a much smaller gap.

I haven't heard those devs say any of that.

@Conina: That's what I was looking for. Thanks.

Not sure about CDPR and Ubi, but from what I remember, folks at Epic were saying before the PS4/XBO launched that they needed 2.5 TFLOPS consoles (which is what a fully operational PS4 GPU at 1 GHz would be rated at, but that would have meant a higher price for the PS4)... since neither console had enough juice, they put SVOGI on hold for UE4 at the time.

As for the weakest gen, I'd argue that the 4th gen was pretty weak... or at least very late to catch up with computers. The Genesis launched in '89 in NA, the SNES in '91, and the computers with similar specs (Amiga and Atari ST) launched in 1985.



No, I wouldn't have. I'm happy with the current power and graphics and such for my PS4.



Which to choose, when...
I am happy with current console power and would also have been willing to pay more for more power.



shikamaru317 said:
CGI-Quality said:

They did? What developers have said this? And how were the machines "too weak at launch"? 

A CD Projekt dev hinted that they were expecting more power from the XB1 and PS4, which is what led to the Witcher 3 downgrade. Likewise, a former Ubisoft dev who worked on Watch Dogs said that they were expecting more power as well, which is what led to the Watch Dogs downgrade.

There's also the fact that the PS4 and XB1 were equivalent to mid-range PCs at release: 1300 GFLOPS and 1800 GFLOPS for the XB1 and PS4, while the best PC GPU in 2013 had 5000+ GFLOPS. Meanwhile, when the 360 released in 2005, it had a 240 GFLOPS GPU, while the best PC GPUs at the time were around 260 GFLOPS, a much smaller gap.

Ahh, this is interesting to hear. People were so upset by the downgrades, but it turns out it's not even those developers' fault lol

All those flops numbers are so revealing. The increase in hardware performance has really come to a drastic halt.
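
For anyone wondering where those flops numbers come from: they're theoretical peaks from the usual back-of-the-envelope formula, peak GFLOPS = shader ALUs x FLOPs per ALU per clock x clock speed. A minimal Python sketch (the shader counts and clocks are the commonly quoted public specs, not anything stated in this thread) roughly reproduces them:

# Theoretical peak FP32 throughput. Back-of-the-envelope only; as argued
# further down the thread, real performance depends heavily on architecture.
def peak_gflops(alus, clock_ghz, flops_per_alu=2):
    # peak = ALUs x FLOPs per ALU per clock x clock (GHz)
    return alus * flops_per_alu * clock_ghz

# Xbox 360 (Xenos): 48 vec4+scalar shaders, ~10 FLOPs each per clock, 0.5 GHz
print(peak_gflops(48, 0.5, flops_per_alu=10))   # -> 240.0
# Xbox One: 768 GCN ALUs @ 0.853 GHz (2 FLOPs per ALU via multiply-add)
print(peak_gflops(768, 0.853))                  # -> ~1310
# PS4: 1152 GCN ALUs @ 0.8 GHz
print(peak_gflops(1152, 0.8))                   # -> ~1843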



MoHasanie said:
I'm happy with the current power of consoles. If they were more expensive, I would've bought an 8th gen console much later.


Same.



Slimebeast said:
shikamaru317 said:

A CD Projekt dev hinted that they were expecting more power from the XB1 and PS4, which is what led to the Witcher 3 downgrade. Likewise, a former Ubisoft dev who worked on Watch Dogs said that they were expecting more power as well, which is what led to the Watch Dogs downgrade.

There's also the fact that the PS4 and XB1 were equivalent to mid-range PCs at release: 1300 GFLOPS and 1800 GFLOPS for the XB1 and PS4, while the best PC GPU in 2013 had 5000+ GFLOPS. Meanwhile, when the 360 released in 2005, it had a 240 GFLOPS GPU, while the best PC GPUs at the time were around 260 GFLOPS, a much smaller gap.

Ahh, this is interesting to hear. People were so upset by the downgrades, but it turns out it's not even those developers' fault lol

All those flops numbers are so revealing. The increase in hardware performance has really come to a drastic halt.

The flop numbers are not revealing at all.

Anyone who thinks they can take a GPU from a decade ago and compare it to a modern GPU on flops alone is absolutely kidding themselves.
Even if the Xbox 360 and Xbox One had the EXACT same number of flops, the Xbox One would be substantially faster.

Fact of the matter is, the Xbox 360's GPU isn't the same as any PC-derived GPU; it has characteristics from both the Radeon X19xx series and the extremely inefficient Radeon 29xx series, so it cannot be compared to any PC GPU.

The PS3's GPU, however, closely resembled the GeForce 7900 series, but with cut-down TMUs and ROPs; it wasn't high-end, but it was close enough.
However... it was also overshadowed by the nVidia GeForce 8000 series, which launched just before the PS3, if memory serves me right.

So when the PS3 launched, its GPU was already relegated to mid-range in terms of performance anyway, relative to the PC.

The Xbox One and PlayStation 4, though, were already using hardware that was almost a couple of years old; they were only mid-range, and they were also overshadowed by AMD's more efficient Graphics Core Next 1.2/Gen 2 GPU update.

With that said... if you wish to play the GFLOP game, the PC also had multi-GPU setups back then. Overclocking the X1950 XT, you could theoretically get to almost 500 GFLOPS... thus a couple of them would yield you almost a teraflop. This was a decade ago; it puts the Xbox 360's "240 GFLOP" GPU into perspective, doesn't it?
It's still not directly comparable, for obvious reasons.
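
A toy illustration of that point: two GPUs with identical peak flops can deliver very different sustained throughput, because how much of the peak an architecture can actually keep busy varies enormously. The utilization figures below are invented purely for illustration, not measured values for any real GPU:

def delivered_gflops(peak_gflops, utilization):
    # Sustained throughput = theoretical peak x fraction of ALU cycles kept busy.
    return peak_gflops * utilization

# Hypothetical: same 240 GFLOPS peak, very different architectural efficiency.
print(delivered_gflops(240, 0.35))  # older design: ~84 GFLOPS effective
print(delivered_gflops(240, 0.70))  # newer design: ~168 GFLOPS, 2x faster at equal peak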

HoloDust said:
CGI-Quality said:

I haven't heard those devs say any of that.

@Conina: That's what I was looking for. Thanks.

Not sure about CDPR and Ubi, but from what I remember, folks at Epic were saying before the PS4/XBO launched that they needed 2.5 TFLOPS consoles (which is what a fully operational PS4 GPU at 1 GHz would be rated at, but that would have meant a higher price for the PS4)... since neither console had enough juice, they put SVOGI on hold for UE4 at the time.

As for the weakest gen, I'd argue that the 4th gen was pretty weak... or at least very late to catch up with computers. The Genesis launched in '89 in NA, the SNES in '91, and the computers with similar specs (Amiga and Atari ST) launched in 1985.

Increasing the PS4's GPU clock to 1 GHz might not have cost a single cent extra.
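
For what it's worth, the arithmetic behind HoloDust's 2.5 TFLOPS figure checks out: the PS4's APU physically carries 20 GCN compute units (64 ALUs each), two of which are disabled in shipped consoles for yield. A quick sketch:

def gcn_tflops(cus, clock_ghz):
    # GCN peak FP32: CUs x 64 ALUs x 2 FLOPs (fused multiply-add) x clock (GHz)
    return cus * 64 * 2 * clock_ghz / 1000.0

print(gcn_tflops(18, 0.8))  # shipped PS4: 18 CUs @ 800 MHz -> ~1.84 TFLOPS
print(gcn_tflops(20, 1.0))  # fully enabled die @ 1 GHz -> 2.56 TFLOPS, ~Epic's target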



--::{PC Gaming Master Race}::--

Pemalite said:
Slimebeast said:

Ahh, this is interesting to hear. People were so upset by the downgrades, but it turns out it's not even those developers' fault lol

All those flops numbers are so revealing. The increase in hardware performance has really come to a drastic halt.

The flop numbers are not revealing at all.

Anyone who thinks they can take a GPU from a decade ago and compare it to a modern GPU on flops alone is absolutely kidding themselves.
Even if the Xbox 360 and Xbox One had the EXACT same number of flops, the Xbox One would be substantially faster.

Fact of the matter is, the Xbox 360's GPU isn't the same as any PC-derived GPU; it has characteristics from both the Radeon X19xx series and the extremely inefficient Radeon 29xx series, so it cannot be compared to any PC GPU.

The PS3's GPU, however, closely resembled the GeForce 7900 series, but with cut-down TMUs and ROPs; it wasn't high-end, but it was close enough.
However... it was also overshadowed by the nVidia GeForce 8000 series, which launched just before the PS3, if memory serves me right.

So when the PS3 launched, its GPU was already relegated to mid-range in terms of performance anyway, relative to the PC.

The Xbox One and PlayStation 4, though, were already using hardware that was almost a couple of years old; they were only mid-range, and they were also overshadowed by AMD's more efficient Graphics Core Next 1.2/Gen 2 GPU update.

With that said... if you wish to play the GFLOP game, the PC also had multi-GPU setups back then. Overclocking the X1950 XT, you could theoretically get to almost 500 GFLOPS... thus a couple of them would yield you almost a teraflop. This was a decade ago; it puts the Xbox 360's "240 GFLOP" GPU into perspective, doesn't it?
It's still not directly comparable, for obvious reasons.

I didn't say flops were directly proportional to performance, but they're still a good indication.

For example, take your X1950 at almost 500 GFLOPS: it turns out it was at least 50% faster than an X360 at the time. So, not perfectly proportional, but in the same ballpark.

Your post is nevertheless very interesting. Also, you seem to agree that with the PS4 and XBO we witnessed by far the weakest generational leap, relatively speaking. In the past a new console generation could be 20 times stronger; now it's only 6-8 times stronger, and next time the difference will be even smaller.
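
Taking the thread's own peak figures at face value, that ratio is easy to check (a trivial sketch; the historical "20 times" figure isn't sourced in this thread, so it isn't computed here):

# Generational leap as a ratio of the theoretical peaks quoted earlier.
print(1310 / 240)   # Xbox 360 -> Xbox One: ~5.5x, roughly the "6-8 times" ballpark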



For Nintendo, definitely. But for Sony and MS, the consoles are expensive enough already, especially when you take into account yearly online fees.



Who knows... I just bought a GTX 1070 for 499€.



Intel Core i7 8700K | 32 GB DDR4-3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE | Crappy Monitor | HTC Vive Pro :3