fatslob-:O said:
Pemalite said:

Of course Microsoft sits on top of the food chain.

Also: NEC used to build graphics chips; Intel even licensed them at one point, same with Number Nine. They hold multiple patents in graphics technologies, including parts of the rendering pipeline that Direct X leans upon.

And no. They can't just tell S3 to "take a hike", because of patents and licensing agreements.
Even successor technologies like Direct X Texture Compression use licensing from S3, as it was based on and improved upon S3 Texture Compression.

And you are right, S3TC does expire soon.

You're right, NEC did design a graphics chip and it was licensed by Intel in 1982, but that was a one-time deal before DirectX (1995) was even created! (They might've held patents before, but I imagine those have already expired, since NEC didn't quite make it into the 21st century in the videocard market and Intel designed its own graphics chips shortly thereafter, so I doubt NEC had a say in the DirectX specs.)

The main players (AMD, Intel, Nvidia) have already licensed S3TC, so Microsoft doesn't need S3 Graphics anymore, and they won't be able to bully new players anymore either (such is the tragedy of being expendable). As for Qualcomm, I'll be interested in seeing how they perform in 32-bit Windows games with the high CPU emulation overhead ...
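For context on what was actually patented: S3TC (the DXT1 format that DirectX adopted) packs each 4x4 pixel tile into 8 bytes, two RGB565 endpoint colours plus 2-bit indices into a 4-colour palette derived from them. A minimal sketch of the publicly documented palette derivation (not S3's implementation, just the format as specified):

```python
import struct

def expand_565(c):
    """Expand a 16-bit RGB565 value to 8-bit-per-channel RGB."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Replicate the high bits into the low bits to fill the 8-bit range.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def dxt1_palette(block8):
    """Derive the 4-colour palette from an 8-byte DXT1 block."""
    c0_raw, c1_raw = struct.unpack_from("<HH", block8)
    c0, c1 = expand_565(c0_raw), expand_565(c1_raw)
    if c0_raw > c1_raw:
        # Opaque mode: two colours interpolated between the endpoints.
        c2 = tuple((2 * a + b) // 3 for a, b in zip(c0, c1))
        c3 = tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))
    else:
        # Punch-through mode: one midpoint plus transparent black.
        c2 = tuple((a + b) // 2 for a, b in zip(c0, c1))
        c3 = (0, 0, 0)
    return [c0, c1, c2, c3]

# A block whose endpoints are pure red (0xF800) and pure blue (0x001F):
block = struct.pack("<HH4B", 0xF800, 0x001F, 0, 0, 0, 0)
print(dxt1_palette(block))
```

The fixed 4:1 (opaque) compression ratio with random per-block access is what made the scheme so attractive for GPUs, and why everyone ended up licensing it.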

The golden days of patenting computer graphics technology are over (I imagine we've still yet to see the peak in other fields such as biotechnology) ... (the last big patented computer graphics technologies appear to be the hardware implementation of conservative rasterization and Volta's independent thread scheduling)

Microsoft may not need S3 Graphics anymore, but it is still supporting their technologies, even in Direct X 12.
Besides. You are missing the entire point of this.

I was providing examples of where hardware manufacturers contributed to the Direct X specification. Nitpicking at a few edge-case scenarios isn't changing that fact.

As for Qualcomm. They likely license AMD's technologies and patents anyway.
"Adreno" is an anagram of "Radeon". - And Qualcomm bought the technology from AMD in the first place.

As for patenting computer graphics technology? Nope. Still happening. G-Sync, PhysX, Hairworks and so on are examples. It might not be happening at the same rate as it did during the 3D boom, but it's still occurring.

fatslob-:O said:

Not sure I see a problem with rebadging since that's mostly dependent on how frequent microarchitecture designs are ... (AMD has created 4 more graphics architectures since GCN, and Nvidia 4 more graphics architectures since Kepler, so AMD is pushing out new graphics architectures just as often as Nvidia is. Moreover, it looks like AMD has stopped rebadging this round, since their new baseline appears to be Vega all around with Vega 10, 11, 12 and Raven Ridge.)


It is a big issue.

AMD hasn't created "4 more graphics architectures since GCN".
Graphics Core Next 2, 3, 4, 5 and so on are all just iterative updates of GCN 1.0. They aren't brand-new top-to-bottom designs.

Not only that, but AMD is rebadging GCN 1.0 parts from 2012 even today with the Radeon RX 520 and 530.

And by rebadging those parts, they miss out on some vital functionality... Their tessellation performance is laughable; they have no Delta Colour Compression, no TrueAudio, no power-saving and power-state enhancements, no HEVC decoding, no HDMI 2.0... And if I remember correctly, also no 10-bit colour support and no Virtual Super Resolution... I could go on, but I've made my point.
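AMD's actual Delta Colour Compression format is proprietary, but the underlying idea is simple: within a tile, neighbouring pixels are usually close in value, so storing one anchor pixel plus small per-pixel differences takes far fewer bits than storing every pixel outright, which cuts memory bandwidth. A toy illustration of the principle (nothing here reflects AMD's real encoding):

```python
def delta_encode(row):
    """Keep the first pixel as-is, then store neighbour differences."""
    return [row[0]] + [b - a for a, b in zip(row, row[1:])]

def delta_decode(encoded):
    """Reverse: a running sum restores the original pixels exactly."""
    out = [encoded[0]]
    for d in encoded[1:]:
        out.append(out[-1] + d)
    return out

# A smooth 8-bit gradient: each raw pixel needs 8 bits,
# but the deltas all fit in 2 bits.
row = [120, 121, 121, 123, 124, 126, 127, 129]
deltas = delta_encode(row)
assert delta_decode(deltas) == row   # lossless
print(deltas)   # [120, 1, 0, 2, 1, 2, 1, 2]
```

Because the compression is lossless and transparent to shaders, GPUs that have it get an effective bandwidth boost for free; rebadged GCN 1.0 parts simply do without.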


As for Vega... Vega is just the GPU on top of the Radeon RX 500 stack. It is what Fury was to the Radeon 300 series.

fatslob-:O said:

AMD needs Eric Demers (he works at Qualcomm) back  ... (Raja Koduri isn't going to be their Jim Keller of GPUs)

Agreed.

But part of the issue is... AMD only made the Radeon group an independent entity again a couple of years ago; they are still abiding by their old plan, and we won't see the fruits of the Radeon group's separation for another year or two.

AMD needs to invest. Not just iterate.

Nvidia has done an excellent job innovating even without any real competition... And that has paid off for them.

fatslob-:O said:

It's less of a rarity than you think, practically all current AMD (GCN3+) and Intel graphics (Gen 8+) support FP16 in addition to PS4 Pro

Just because they support FP16 doesn't mean there will be any gains with it over FP32.
The number of GPUs available with double-rate FP16 is minuscule. And if you then compare those GPUs against Steam's hardware-survey statistics... Well. You get the point.

fatslob-:O said:

Even if the implementation is not double rate, FP16 can still benefit from register savings thus increasing occupancy and performance ...

Then the performance gains will likely not be worth the significantly reduced precision.
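The precision trade-off is concrete: IEEE 754 half precision has an 11-bit significand (10 stored bits plus the implicit leading 1), so it only represents about 3 decimal digits. A quick check using Python's `struct` module, whose `"e"` format is IEEE half precision:

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

# Every integer up to 2048 is exactly representable in FP16...
assert to_fp16(2048.0) == 2048.0
# ...but 2049 is not, and rounds back down to 2048.
assert to_fp16(2049.0) == 2048.0
# Relative error of roughly 2^-11: fine for colours and normals,
# risky for positions, depth, or long accumulations.
print(to_fp16(0.1))   # 0.0999755859375
```

This is why FP16 tends to be used selectively, for bandwidth-bound colour math, rather than as a wholesale FP32 replacement.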

fatslob-:O said:

You seem to think that the industry is centered around Nvidia, but Nvidia is only big when you look at Windows PCs. That's not so for other vendors (Apple, Sony, Xbox) or integrated graphics (where Nvidia has 0% market share), all of which take up a considerable portion of devs' attention ...

On the contrary. That isn't what I think.

Apple isn't doing high-end AAA games to any great extent anyway.

QUAKECore89 said:

I actually thought the i3 was gonna get 3 cores/6 threads; instead they're getting 4 cores without Hyper-Threading to compete with the Ryzen 3 1200/1300X as well as the Ryzen 5 1400.

Edit: I'm not sure about the price and performance; we'll see when they announce it...

Dual cores needed to die like 5+ years ago anyway... Haha



--::{PC Gaming Master Race}::--