fatslob-:O said:

It's a big step up in quality compared to the trilinear filtering from last-gen consoles ... Like I said, with 72 TMUs, 4x anisotropic filtering should be almost free on the PS4, so with a little more work from the developers' side (and it's their decision too) they should be able to net 8x anisotropic filtering pretty easily, and I wouldn't single out the X1 either as being able to do it. Don't worry too much about the consoles being bottlenecked by fixed-function units. 

The reason why Nvidia was inferior in AF was because their hardwired algorithm was more angle dependent than AMD's. BTW, there's an even higher quality texture filtering scheme than 16x AF ...


I am not arguing against the fact that the PS4 and Xbox One should get almost-free AF, but you did just re-affirm my entire argument: that there are higher/better levels of filtering.
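To put "a little more work from the developer's side" in concrete terms, here's a minimal sketch (my own illustration, not from the quote above) of requesting 8x AF on PC with OpenGL, assuming a current GL context and the widely supported EXT_texture_filter_anisotropic extension; the console graphics APIs expose an equivalent sampler setting:

```cpp
// Hypothetical example: requesting 8x anisotropic filtering on a texture.
// Assumes a GL context is current and EXT_texture_filter_anisotropic is available.
#include <GL/glew.h>
#include <algorithm>

void enableAnisotropy(GLuint texture, float requested = 8.0f)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    // AF only makes sense on top of mipmapped trilinear filtering.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Clamp to whatever the hardware supports (16x on most GPUs of that era).
    GLfloat maxSupported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxSupported);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    std::min(requested, maxSupported));
}
```

It's literally a couple of sampler-state lines; the cost question is purely about how much filtering throughput the TMUs have to spare.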

Actually, nVidia had the edge in filtering from the Geforce 6000 series right up until AMD launched their Radeon 6000 series (ironic, huh?).
In the Radeon 5000 series AMD had a bug in its filtering which became an eyesore in some games; it was a pet peeve of mine when I was running dual Radeon 5850s back then.
Prior to the Geforce 6000 series you had the Geforce FX, where nVidia pulled all sorts of crazy tricks in the drivers in order to achieve performance parity with ATI, including reducing filtering quality for a performance gain.
AMD did similar things once their edge started to slip against nVidia, with the Radeon X8xx and X19xx series and, obviously, the 29xx series.


fatslob-:O said:

It's not all about having higher compression ratios ... A high signal-to-noise ratio isn't very ideal for quality. 

What you described in your second statement was mipmapping. 

Actually, 3DC+ does no better than BC1 or DXT5 in terms of compression ratio ...

 

Exactly my point: it's not all about higher compression ratios, otherwise we would be sitting at 32:1 compression ratios as standard by now, since that's well and truly possible.

No, I wasn't describing Mip-mapping.

3DC, and thus 3DC+, is more or less an evolutionary step from DXT5; it's not supposed to compress to higher ratios, it's supposed to cover more kinds of texture data (things like two-channel normal maps and single-channel masks), which results in less memory required overall.
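To put some rough numbers on that, here's a back-of-the-envelope sketch (my own illustration, assuming the standard 4x4 block sizes: 8 bytes per block for BC1 and BC4/3DC+, 16 bytes per block for BC3/DXT5 and BC5/3DC) for a single 2048x2048 texture:

```cpp
// Rough compressed-size arithmetic for a 2048x2048 texture (top mip only, no mip chain).
// Block sizes: BC1/BC4 = 8 bytes per 4x4 block, BC3/BC5 = 16 bytes per 4x4 block.
#include <cstdio>

int main()
{
    const int w = 2048, h = 2048;
    const int blocks = (w / 4) * (h / 4);      // 262,144 blocks of 4x4 texels

    const int uncompressedRGBA8 = w * h * 4;   // 16 MiB of 32-bit colour
    const int bc1 = blocks * 8;                // 2 MiB -> 8:1 vs RGBA8
    const int bc3 = blocks * 16;               // 4 MiB -> 4:1 (DXT5)
    const int bc5 = blocks * 16;               // 4 MiB, two-channel (3DC)
    const int bc4 = blocks * 8;                // 2 MiB, one-channel (3DC+)

    std::printf("RGBA8: %d KiB, BC1: %d KiB, BC3/DXT5: %d KiB, BC5/3DC: %d KiB, BC4/3DC+: %d KiB\n",
                uncompressedRGBA8 / 1024, bc1 / 1024, bc3 / 1024,
                bc5 / 1024, bc4 / 1024);
    return 0;
}
```

The per-texture ratio doesn't improve over DXT5; the saving comes from being able to keep normal maps and single-channel data in a block-compressed format at all rather than storing them uncompressed.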

fatslob-:O said:

 

This can't just keep happening ... Sooner or later more and more manufacturers will have to close down, preventing a flood of DRAM in the market's future. There are only three big DRAM manufacturers left in the industry, and that's Samsung, Micron, and SK Hynix; there used to be another manufacturer known as Elpida, but they were no more after that scenario you described ...

Flooding the market with goods isn't a good idea in the long term since that just eliminates entire industries in general ...

Of course it can keep happening.
Just as there are times when DRAM isn't profitable, there are other times when it's stupidly profitable, which helps even things out. DRAM manufacturers play for the long haul and try to capitalise on market swings (for example: LPDDR2/3 and DDR4).

Over the past few years, though, there has been consolidation, which means there is less competition, but it also means there is more volatility when something goes wrong (e.g. a factory fire).



--::{PC Gaming Master Race}::--