
I just want an Alpha Centauri remake... I probably won't even need another turn-based game again, considering I still play that game 20 years later...

Civ 5 is definitely a solid title though, and I'm certainly preferring it over Civ 6 right now... It was the duck's nuts back when I was running triple displays.

fatslob-:O said:
Pemalite said:

They tend to do that every console generation...

Hence why it's called a "generation" where there is a break-away in technology.

Except there is no "break-away" in technology, since the new systems will be backwards compatible with old software, so I don't see eye to eye with this argument ...

And there's also no guarantee that the OS will become more demanding, considering we saw a decrease in OS requirements in the middle of the 7th and 8th gens ...

Pemalite said:

And? That wasn't an auxiliary ARM processor, nor does the PS2 having the PS1 chips for things like I/O and backwards compatibility mean that the PlayStation 5 will have auxiliary ARM processors.

My point still stands in spirit regardless. There's no precedent to rule out an auxiliary processor in a successor ...

Actually, it could stand in full if we take a look at the line of Nintendo handhelds, which had auxiliary processors for the longest time ...

Pemalite said:

Certainly not definitively.

Considering the PS4 Pro came with an ARM processor, it's pretty much a done deal at this point ...

I couldn't care less about the platform branding such as 'PS5'; what matters most is the underlying technology, since after all the Wii only proved to be an overclocked GameCube ...

We aren't going to agree on these points, so I am just going to leave it as it is.
My approach is simple... We have no evidence either for or against these points, so let's not assert them as factual yet.

fatslob-:O said:

Downsampling would be a waste of power according to most developers ... 

The majority of current shipments may be 4K displays, but it's useless since households have a very slow replacement cycle ...

You haven't forgotten that next gen isn't releasing right now, right? It's still over a year away.

And you also haven't forgotten that these consoles are going to be on the market for many years right? We probably won't be starting to talk about their generational replacements until after 2025.

fatslob-:O said:

Even with RT cores, it's still a huge performance killer as we see on Turing. I imagine most devs will also want to do their lighting/shadowing effects in RT as much as possible thereby increasing the stress on the hardware. They want more than just RT reflection or RTAO. They'd also like to have RT soft shadows, RTGI, RT transparency/transmittance, RT sub-surface scattering, etc and there's no telling what else ... (most developers will probably blow their ray budget on lighting effects over higher resolutions)

We have no idea how efficient Navi is relative to Turing, so there's no point speculating. (It comes down to that pesky evidence thing again.)
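
To put rough numbers on the "ray budget" point, here's a minimal back-of-the-envelope sketch in Python. The budget and frame rate are made-up illustrative values, not figures for Turing, Navi or any console; it just shows why piling on more RT effects eats into the rays left over for resolution.

# Rough sketch of a fixed ray budget being split between resolution and
# effects. All numbers below are illustrative assumptions, not real figures.

RAY_BUDGET_PER_SEC = 4e9   # assumed total rays/second the hardware can trace
TARGET_FPS = 60

def rays_per_pixel(width: int, height: int, effects: int) -> float:
    """Rays available per pixel, per effect, at a given resolution."""
    pixels = width * height
    rays_per_frame = RAY_BUDGET_PER_SEC / TARGET_FPS
    return rays_per_frame / (pixels * effects)

# One effect (e.g. reflections only) vs. several (GI, soft shadows, AO, ...):
print(rays_per_pixel(3840, 2160, effects=1))  # ~8.0 rays/pixel at 4K
print(rays_per_pixel(3840, 2160, effects=4))  # ~2.0 rays/pixel at 4K
print(rays_per_pixel(2560, 1440, effects=4))  # ~4.5 rays/pixel at 1440p

With assumed numbers like these, dropping resolution roughly doubles the rays available per pixel for lighting effects, which is why devs would likely spend the budget there rather than on native 4K.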

fatslob-:O said:

If you're using a low bitrate, why even stream in 4K at all ? 

Sure, using a low bitrate is possible, but by then you've pretty much killed the use case for 4K streaming, which was its higher, pristine image quality ...

Just about the entire world sucks at fixed broadband aside from South Korea or Singapore, so it's not just in the US where half the people can't even get 25Mbps on the downlink!

Because it's not all about bitrate? I expected someone like yourself to have a better understanding of it all.

Broadband speed is a tricky situation; many outlets are employing dynamic resolutions/bitrates when streaming video, so if someone's connection isn't able to maintain 4K, it will drop down... That also means if you want the best, you don't miss out either.
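
To show what I mean by dropping down, here's a minimal sketch of how an adaptive streaming client could pick a rung on a bitrate ladder from measured throughput. The ladder values and the pick_rung helper are purely illustrative, not any service's real encoding profiles.

# Minimal sketch of adaptive bitrate/resolution selection.
# Ladder values are illustrative assumptions only.

LADDER = [  # (label, required downlink in Mbps)
    ("2160p", 18.0),
    ("1440p", 10.0),
    ("1080p", 5.0),
    ("720p",  3.0),
    ("480p",  1.5),
]

def pick_rung(measured_mbps: float, headroom: float = 1.2) -> str:
    """Choose the highest rung whose bitrate (plus headroom) fits the link."""
    for label, mbps in LADDER:
        if measured_mbps >= mbps * headroom:
            return label
    return LADDER[-1][0]  # fall back to the lowest rung rather than buffering

print(pick_rung(30.0))  # "2160p" - connection sustains 4K
print(pick_rung(8.0))   # "1080p" - drops down instead of stalling

So a slow connection doesn't kill the 4K stream for everyone; it just means that viewer gets a lower rung while people with the bandwidth still get the full quality.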

Netflix and Amazon have been making some big pushes in 4K content, even mandating that video streams be in 4K for some of their curated stuff.
https://www.indiewire.com/2017/04/netflix-amazon-4k-documentary-1201799403/

So clearly there is a market for 4K.

fatslob-:O said:

@Bold Well, no, because that happens to Nvidia as well with their Tegra series. For instance their K1 chip, which debuted an integrated Kepler GPU, was introduced TWO years after the desktop counterparts. For the X1, it took two quarters for it to debut after the desktop parts, and it was their fastest lead-up to an integrated solution by far. Nvidia has only JUST started shipping their Drive PX Xavier boards recently, which feature the Volta architecture from nearly 2 years ago ...

Tegra isn't targeting the PC market.
And that is also nVidia's cadence.

Intel, for example, typically releases its mobile processors ahead of the desktop on a per-architecture basis... Ice Lake, for instance, is coming to mobile first.
And it makes sense... Mobile is actually the larger market.

fatslob-:O said:

Whatever the issue is you have with your Ryzen 2700U device, you have unrealistic expectations as to how chip development progresses ... (even if AMD wanted a 7nm APU badly for their lineup it can't happen in under 6 months) 

AMD has had years to plan for a 7nm APU; just like its desktop efforts, it's not something they would have had to do in under 6 months.
It's also not "my" issue, it's a platform-wide issue that is well documented.

AMD's 2500U/3500U just has less throttling and better temps, allowing it to turbo up more than the 2700U/3700U, resulting in better performance in some cases, meaning AMD's 2700U/3700U designs (or binning... or both!) were pretty terrible.

It's not as much of an issue on the desktop side... Because despite the desktop Ryzen 3000 series APUs only being 12nm, they have significantly more TDP headroom to let the CPU and GPU breathe... Plus 12nm is a cheap and mature process, so it makes sense to use it on low-end parts.
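
To illustrate the throttling point with made-up numbers (a toy model, not measured data for any of these SKUs):

# Toy model of sustained performance under a thermal/power limit. The clocks
# below are hypothetical, purely to show how a part with a lower peak boost
# can come out ahead once throttling kicks in.

def sustained_clock(max_boost_ghz: float, sustainable_ghz: float) -> float:
    """Effective clock is capped by whatever the cooling/TDP can sustain."""
    return min(max_boost_ghz, sustainable_ghz)

# Hypothetical laptop scenario: the higher-end SKU boosts higher on paper,
# but its chassis/binning only sustains a lower clock under load.
higher_sku = sustained_clock(max_boost_ghz=3.8, sustainable_ghz=2.4)
lower_sku  = sustained_clock(max_boost_ghz=3.6, sustainable_ghz=2.9)
print(higher_sku, lower_sku)  # 2.4 vs 2.9 - the cheaper part wins sustained

A desktop part with a bigger TDP simply has a higher sustainable clock, so the cap rarely bites, which is why the problem shows up on the mobile side.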

https://www.tomshardware.com/news/amd-ryzen-3000-zen-2-microarchitecture-7nm,39609.html

Last edited by Pemalite - on 26 June 2019

--::{PC Gaming Master Race}::--