fatslob-:O said:

Not so, because AMD intends on keeping GCN for the foreseeable future! As long as Sony, Microsoft, Google, and potentially EA as well as other ISVs (Valve wants to keep developing open-source drivers for AMD on Linux/DXVK, and Ubisoft is developing its Vulkan backend exclusively for AMD hardware on Stadia) want to keep GCN, then GCN will stay alive...

AMD's roadmaps and AnandTech seem to agree that after Navi, AMD will have a next-gen architecture though.
https://www.anandtech.com/show/12233/amd-tech-day-at-ces-2018-roadmap-revealed-with-ryzen-apus-zen-on-12nm-vega-on-7nm

And I honestly can't wait... Just like Terascale got long in the tooth in its twilight years, the same is happening with Graphics Core Next.

Unless you have information that I do not?

fatslob-:O said:

Gamecube was pretty overrated IMO, since quite a few devs didn't find an improvement whenever they ported code from the PS2. Take for example Baldur's Gate: Dark Alliance and the NFS Underground series, and one of the lead programmers proclaimed that the GC couldn't handle Burnout 3 because it was inferior to PS2 hardware ... (I'd be interested in other opinions from programmers within the industry who worked during that generation on how the GC actually fared against the PS2)

I think ERP from Beyond3D might be Brian Fehdrau, but judging by his testing, Flipper's clipping performance was a massive showstopper in its hardware design ... (now I'm definitely convinced that the GC is just overrated after actually hearing from some developers)

I'm starting to get the impression that the GC was only ever good at texturing. It couldn't handle high poly counts, alpha effects, physics, or AI like the PS2 could ...

Metroid Prime looks great but you can't really use exclusives as a measuring stick for comparison ... (no way its effects would rival the 360 at all, since the 360 would probably blow the doors off the GC when the original Xbox, by comparison, already ran nearly all of the same code faster than the GC)

Anyone who asserts that the Gamecube was inferior to the Playstation 2 is offering an opinion not worth its weight... It's pretty much that simple.

At the end of the day... The proof is in the pudding: Gamecube games in general were a big step up over the Playstation 2 and trended closer to the Original Xbox in terms of visuals than to the Playstation 2.

Metroid on Gamecube showed that the console could handle high poly counts... It even had 3D mesh water ripples... Watch the Digital Foundry video I linked earlier.
But you are right, the Gamecube was a texturing powerhouse.


fatslob-:O said:

The GeForce 2 could at most do something potentially similar to pixel shaders, but from the GeForce 3 onwards it had a truly programmable vertex pipeline ...

The only hardware where it was acceptable to do multiple passes was the PS2 with its stupidly fast eDRAM; otherwise it'll be a massive performance impact. Doing multiple passes in MGS2 brought even the Xbox to its knees in comparison to the PS2. Multiple passes were especially a no-go from the 7th gen onwards, even when the 360 had eDRAM, because programmers started realizing that poor cache locality resulted in bad performance on parallel processing ...

Don't forget the Gamecube and Wii had 1T-SRAM, the Xbox 360 had eDRAM, and the Xbox One had eSRAM.
Embedded memory isn't a prerequisite for doing multiple passes, though. In fact, deferred renderers tend to rely on it even more.

The Original Xbox wasn't built for that kind of workload though; its architecture was a traditional PC design for the time.
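For anyone following along, a rough idea of what "multiple passes" means in practice. This is only a minimal sketch in legacy OpenGL style, not code from any of the games mentioned; draw_scene, set_ambient_only and set_single_light are hypothetical placeholders. It shows why every extra pass re-submits the geometry and re-touches every covered pixel, which is exactly the framebuffer traffic that eDRAM/eSRAM/1T-SRAM pools were there to soak up:

```c
#include <GL/gl.h>

/* Hypothetical scene helpers -- placeholders, not from any real engine. */
extern void draw_scene(void);          /* submits all scene geometry       */
extern void set_ambient_only(void);    /* configures ambient-only lighting */
extern void set_single_light(int i);   /* enables only light i             */

void render_multipass(int num_lights)
{
    /* Pass 0: base/ambient lighting, also fills the depth buffer. */
    glDisable(GL_BLEND);
    glDepthFunc(GL_LESS);
    set_ambient_only();
    draw_scene();

    /* Passes 1..N: one additive pass per light, summed into the framebuffer.
     * Each pass re-submits the geometry and re-reads/re-writes every covered
     * pixel -- the bandwidth cost the embedded memory was meant to absorb. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);   /* additive: dst += src                  */
    glDepthFunc(GL_EQUAL);         /* shade only the already-visible pixels */
    glDepthMask(GL_FALSE);         /* depth was resolved in pass 0          */
    for (int i = 0; i < num_lights; ++i) {
        set_single_light(i);
        draw_scene();
    }
    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
}
```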

The GeForce 2 didn't have fully programmable pixel shader pipelines... The point I am trying to convey, though, is that certain pieces of hardware, even if they lack the intrinsic functionality of a more modern piece of hardware, can still be capable of similar effects in hardware using a few tricks of the trade.

That doesn't mean said effects will be employed, however, if there's a lack of horsepower or if said effects would blow out the render-time budget.
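One hedged illustration of that kind of trick (a sketch of the general technique, not a claim about any specific game): fixed-function parts of the GeForce 2 era could fake a per-pixel lighting effect using the OpenGL 1.3 texture combiner's DOT3 mode, no pixel shader required. This assumes OpenGL 1.3 headers, a normal map bound on texture unit 0, and the tangent-space light vector packed into the per-vertex color:

```c
#include <GL/gl.h>

void setup_dot3_bump_pass(GLuint normal_map_tex)
{
    glActiveTexture(GL_TEXTURE0);          /* requires OpenGL 1.3+ */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, normal_map_tex);

    /* Route the unit through the combiner instead of plain modulate. */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);

    /* dot(normal-map texel, light vector interpolated via vertex color). */
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
}
```

The hardware lacks the "intrinsic functionality" of a programmable pixel pipeline, but the end result on screen is a bump-mapped surface all the same.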

fatslob-:O said:
The Wii U is a mix between the R600 and the R700 according to the leading Wii U emulator developer ...

The die-shots show it to have a lot in common with R700. But R700 has a lot in common with R600 anyway... R700 is based on the R600 design; it's all Terascale.
So I wouldn't be surprised. In saying that, it doesn't really matter at the end of the day.
The Wii U was a bust.

fatslob-:O said:

Maybe so market-wise, but Nvidia won't be able to maintain support for older hardware as easily in the future ...

I don't deny that Nvidia has a sound approach, since quite a few developers are resisting the push for more explicit APIs like DX12 or Vulkan, but it'll come at the cost of bad experiences later on as hardware ages, in comparison to AMD ...

nVidia won't maintain support for older hardware; they will eventually relegate older parts to life support.
In fact, it has already started to happen to Fermi. Kepler is next.

https://www.anandtech.com/show/12624/nvidia-moves-fermi-to-legacy-ends-32bit-os-support

fatslob-:O said:
But that's just the high-level view. If we take a look at a lower level, like say Nvidia's PTX, which is their intermediate representation for their GPUs (the AMD equivalent is AMDIL IIRC), then the changes are really profound. Fermi only supports binding up to 8 RWTexture2Ds in any given shader stage while Kepler ups this to 64, Fermi doesn't support bindless handles for textures/images, and that's not to mention the changes behind PTX ...

The same can be said for Graphics Core Next... There are always smaller tweaks between Graphics Core Next revisions... It's actually expected that whenever a GPU lineup is refreshed, new features and capabilities are added.

I mean, AMD doubled the ACE units in Bonaire, for example, and introduced primitive discard in Polaris; there is always some deviation.
Hence the entire point of why I look at it from a high-level perspective: certain GPU families can certainly be grouped together... Sure, you can nit-pick individual points of contention and go from there, but at the end of the day you won't get far.
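For what it's worth, here's one concrete example of the kind of per-generation capability the quote is talking about. Bindless texture handles (GL_ARB_bindless_texture on the OpenGL side) replace the small fixed pool of per-stage binding slots with 64-bit handles a shader reads out of a buffer. A rough sketch only, assuming a GL 4.5 context with that extension available; `tex` and `ubo` are hypothetical, already-created objects and error handling is omitted:

```c
#include <GL/glew.h>   /* assumed loader for the extension entry points */

void make_texture_bindless(GLuint tex, GLuint ubo, GLintptr offset)
{
    /* Classic model: occupy one of the limited per-stage binding slots. */
    glBindTextureUnit(0, tex);                      /* GL 4.5 DSA call */

    /* Bindless model: turn the texture into a 64-bit GPU handle that a
     * shader can fetch from a buffer, with no slot limit involved. */
    GLuint64 handle = glGetTextureHandleARB(tex);
    glMakeTextureHandleResidentARB(handle);
    glNamedBufferSubData(ubo, offset, sizeof(handle), &handle);
}
```

Useful feature, sure, but it's exactly the sort of incremental addition every vendor ships with a new family, which is why I still group the families at a high level.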

fatslob-:O said:

Nvidia licenses the ISA, not ARM's in-house designs ... 

It's probably the biggest reason why Nvidia's CPUs are still trash in comparison to what ARM offers, but since they're mainly a GPU chip design company, they'll obviously cheap out whenever they can ...

nVidia's license actually includes access to everything. Around 15 companies have such a license.

Whether they will use it is another matter entirely.

fatslob-:O said:

It's not good for preservation purposes especially since Nvidia dropped 3D vision support ... (it sucks especially if a feature is dropped because the future won't be able to experience the same things we did)

I don't believe it'll be sustainable in the near future to just deprecate things especially when investment in software is rising ... 

And AMD dropped Mantle support... And made the entire TrueAudio situation confusing, as some parts didn't have the block in hardware, and then they abolished the feature from their drivers.

It happens. That's life. Neither company has a perfect track record.

fatslob-:O said:

They should launch 10nm first before talking about 7nm ... 

I still don't trust Intel's manufacturing group ... 

7nm will be less ambitious than their 10nm process, and the team handling 7nm has hit every milestone on time.

fatslob-:O said:

Older games aren't worth benchmarking because their code becomes unmaintained, so they're more of a liability to include than a source of any real useful data points ...

It's not good to use unmaintained software to collect performance data on new hardware, since bottlenecks could be due to the code rather than the hardware itself ... (older code just isn't going to look so great running on a new piece of hardware)

This is why I strongly advocate against using older games but I guess we'll agree to disagree ...

Grand Theft Auto 5 is certainly maintained.

World of WarCraft is certainly maintained.

A benchmark suite should present data that shows how the hardware performs in every single scenario, not just a nitpicked selection that shows the hardware in the best possible light.

fatslob-:O said:
It's true Nvidia holds the performance crown, but the tables are turning and the playing field is changing against them, so they're going to be in hostile territory soon ...

To be frank... I will believe it when I see it. People have been proclaiming AMD's return to being competitive in the GPU landscape for years, but other than on price... they haven't really put up much of a fight.

In saying that, I honestly hope it happens... I really do. The PC is at its best when AMD is at its best and taking the fight to nVidia and Intel; it brings prices down, innovation happens rapidly... The consumer wins.



--::{PC Gaming Master Race}::--