fatslob-:O said:
Pemalite said:


Nope, Evergreen did not, this was a hardware bug, not a software one.
Keep in mind I had a pair of Radeon 5870's, then upgraded to a pair of Radeon 6950's, and the difference in some games was actually startling. (In fact I have had at least one of ATI's high-end cards going back to the dual-GPU ATI Rage Fury MAXX.)
Anand reported on it here: http://www.anandtech.com/show/3987/amds-radeon-6870-6850-renewing-competition-in-the-midrange-market/5

Nvidia claims otherwise ... The HD 6000 series did have better texture filtering, but there's no such thing as "correct" anisotropic filtering since the whole concept is arbitrary. Nvidia may have handled the transitions better, but angle independence is the only approach able to fix cases like that ...

(I hope you didn't listen to guys like this ...) 

 


I know the 6000 series had better filtering; it's the reason I ended up hanging onto them for a few years instead of flogging them off instantly like the 5870's.
I'm not disagreeing with you on angle-independent filtering, but keep in mind that when AMD launched it on the 5000 series, there was a bug in the texture units, which caused this:


For clean textures, AMD certainly had an edge, but they fell flat on their faces with any noisy textures, which kept AMD from having "the best filtering" until the 6000 series.

fatslob-:O said:

Judging from the description, you were ... Changing the texture quality for a surface on the fly implies that it is "mipmapping" in effect. 


It's not changing the texture quality, it's swapping the texture for a completely new one.
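The distinction is easy to show in code. Here's a minimal Python sketch (hypothetical function and asset names) contrasting the two: mipmapping picks a prefiltered level of the *same* texture from its screen-space footprint, while streaming swaps in a completely different asset.

```python
import math

def select_mip_level(texel_footprint: float, num_levels: int) -> int:
    """Classic mipmapping: pick a prefiltered level of the SAME texture
    based on how many texels one screen pixel covers."""
    lod = max(0.0, math.log2(max(texel_footprint, 1.0)))
    return min(int(lod), num_levels - 1)

def stream_texture(distance: float, variants: dict) -> str:
    """Texture swapping/streaming: replace the resource itself with a
    different asset depending on distance (hypothetical sketch)."""
    for max_dist, name in sorted(variants.items()):
        if distance <= max_dist:
            return name
    return variants[max(variants)]

print(select_mip_level(4.0, 10))  # 4 texels/pixel -> level 2
print(stream_texture(50.0, {10: "rock_4k.dds", 100: "rock_1k.dds"}))
```

The first is automatic and per-pixel in the hardware; the second is an engine-level decision that replaces the whole resource, which is the behaviour being described here.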

fatslob-:O said:

Being interoperable makes no difference as long as the compression algorithms support a common surface format. If you're wondering about their similarities, they're both based on S3TC, and 3Dc+ only builds upon it. There are no practical differences in terms of conserving memory usage (in fact it's worse when you're using an INT8 texture, which tons of games use); the biggest change is an increase in accuracy, which translates to higher quality, and Microsoft THEMSELVES say so ...


It does make a big difference, as you don't have to rework other systems, such as shading, to work with the new compression algorithm.

DXT compression, for instance, is inherently bad with non-decal textures; 3Dc is bloody fantastic with normal maps.
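A quick sketch of why 3Dc suits normal maps: 3Dc (BC5/ATI2) spends all of its bits on just the X and Y components of a unit normal, and the shader reconstructs Z, so you get more precision per channel than DXT5 at the same storage cost. The Python below is an illustrative approximation of that shader-side reconstruction, not any vendor's actual decoder.

```python
import math

# 3Dc stores only X and Y of a unit-length normal; Z is rebuilt
# in the shader from the constraint x^2 + y^2 + z^2 = 1.
def reconstruct_normal(x: float, y: float) -> tuple:
    """x, y in [-1, 1]; returns the full (x, y, z) unit normal."""
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

# Bytes per 4x4 texel block: 3Dc matches DXT5's footprint, so the win
# is precision (two well-encoded channels), not memory savings.
BYTES_PER_4X4 = {"DXT1": 8, "DXT5": 16, "3Dc": 16}

print(reconstruct_normal(0.6, 0.0))  # (0.6, 0.0, 0.8)
```

This is also why interoperability matters less than it sounds: a normal map compressed this way implies a matching reconstruction step in the shading code, which is exactly the "rework other systems" cost mentioned above.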


fatslob-:O said:

The big reason why we're focusing on power consumption rather than performance is for the purpose of exascale and there are better options such as stacked DRAM ...

It's mostly because mobile devices are more popular than fixed devices like a desktop PC or a console; the shift from DDR3 to DDR4 brought power savings with it, and LPDDR3 is energy efficient too.
It's only natural for mainstream technologies to shift with the times to where the most growth is, it's what makes the most business sense at the end of the day.



Cats.



--::{PC Gaming Master Race}::--