Pemalite said:


I hope 4x Anisotropic isn't all the PS4 pushes out, 8x should be the minimum, 16x should be the goal.
However, even at 16x you still have other techniques available to improve the filtering quality. Just look at anisotropic quality between nVidia and AMD: despite both having 16x options, they still offer differing levels of quality.

It's a big step up in quality compared to the trilinear filtering of last-gen consoles ... Like I said, with 72 TMUs, 4x anisotropic filtering should be almost free on the PS4, so with a little more work on the developers' side (and it's their decision too) they should be able to get 8x anisotropic filtering pretty easily, and I wouldn't rule out the X1 being able to do it either. Don't worry too much about the consoles being bottlenecked by fixed-function units.
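The "almost free" claim above can be sanity-checked with some back-of-the-envelope arithmetic. The figures below (18 CUs x 4 TMUs = 72 TMUs at 800 MHz, 1 bilinear fetch per TMU per clock) are the commonly cited PS4 GPU numbers, used here as assumptions for illustration:

```python
# Rough texel-throughput estimate, assuming the commonly cited
# PS4 GPU figures (72 TMUs at 800 MHz, one bilinear fetch per
# TMU per clock). Assumptions, not confirmed specs.
tmus = 72
clock_hz = 800e6
bilinear_texels_per_sec = tmus * clock_hz  # peak bilinear fetch rate

def af_worst_case(framerate, width, height, max_probes):
    """Bilinear probes per second if every pixel took the full
    `max_probes` anisotropic probes. Real workloads take fewer,
    since AF is adaptive per pixel."""
    return framerate * width * height * max_probes

demand = af_worst_case(60, 1920, 1080, max_probes=4)  # 4x AF at 1080p60
print(f"Peak texel rate:    {bilinear_texels_per_sec / 1e9:.1f} GTexels/s")
print(f"4x AF worst case:   {demand / 1e9:.2f} GTexels/s")
```

Even the worst case for one texture layer (~0.5 GTexels/s) is a tiny fraction of the ~57.6 GTexels/s peak, which is why 4x AF is often described as nearly free; real scenes sample many layers with overdraw, so the margin shrinks but rarely disappears.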

The reason Nvidia was inferior in AF was that their hardwired algorithm was more angle-dependent than AMD's. BTW, there's an even higher-quality texture filtering scheme than 16x AF ...

Pemalite said:

DXT5 assumes a 4:1 compression ratio, but there are methods for higher levels of compression. You could, for instance, set up a 128-bit paletted texture using 4 bits per pixel in the index buffer, which would give you an impressive 32:1 compression ratio, something that is ironically possible even on the Xbox 360.

Plus, super-high-quality textures aren't always needed, and aren't always used anyway: when someone's camera gets close enough to a particular surface, the low-quality texture is swapped for a higher-quality one, most evident in Unreal-powered games. This conserves memory. (You get the side effect of pop-in, unfortunately, but done well it can be a non-issue.)
Plus, some textures are used multiple times in a single scene, so you don't need a hundred 16K textures for every rock on the ground loaded into memory, just the one rock texture, which can then be modified with shaders to look different. Instancing is another method of differentiating copies of the same object, like grass, rocks, wood panels on a building, etc.

Besides, 3Dc+ should be the new baseline this generation, as all consoles and PC hardware support it, which also means more assets are compressed in general.
On the flip side, we have finally moved past sub-1GB memory systems; we've got to use that new memory baseline for something, and bigger textures are the obvious first step.

It's not all about having higher compression ratios ... Pushing compression harder lowers the signal-to-noise ratio, and a low signal-to-noise ratio isn't ideal for quality.
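For what it's worth, the 32:1 figure quoted above only works out if the source format is 128 bits per pixel (e.g. FP32 RGBA); against ordinary 32-bit RGBA8 the same 4-bit index buffer gives 8:1. A quick sketch of the arithmetic, with the source formats as assumptions:

```python
# Compression-ratio arithmetic for the schemes discussed above.
# A 4-bit palette index stores 4 bits/pixel regardless of what the
# palette entries hold, so the ratio depends on the source format.

def ratio(source_bpp, stored_bpp):
    """Compression ratio, ignoring the small fixed palette overhead
    (16 entries at 128 bits each is only 256 bytes per texture)."""
    return source_bpp / stored_bpp

print(ratio(128, 4))  # 4-bit index vs 128-bit FP32 RGBA source -> 32.0
print(ratio(32, 4))   # same index vs plain RGBA8 source        -> 8.0
print(ratio(32, 8))   # DXT5/BC3: 128 bits per 4x4 block = 8 bpp -> 4.0
```

So the headline ratio is real, but it's relative to an uncompressed high-precision source, and a 16-entry palette per texture is exactly the kind of aggressive quantisation that costs signal-to-noise ratio.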

What you described in your second statement is essentially mipmapping (mip streaming, in Unreal's case).
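The memory cost of mipmapping is modest, which is part of why it's the default: each level is a quarter the size of the one above, so the full chain converges to 4/3 of the base texture (1 + 1/4 + 1/16 + ... = 4/3). A sketch, assuming an uncompressed RGBA8 texture:

```python
# Memory cost of a full mip chain. Each level halves both dimensions,
# so each level is ~1/4 the size of the one above and the whole chain
# converges to 4/3 of the base level. Assumes uncompressed RGBA8.

def mip_chain_bytes(width, height, bytes_per_pixel=4):
    total = 0
    while width >= 1 and height >= 1:
        total += width * height * bytes_per_pixel
        if width == 1 and height == 1:
            break
        width = max(1, width // 2)
        height = max(1, height // 2)
    return total

base = 2048 * 2048 * 4                 # base level alone
full = mip_chain_bytes(2048, 2048)     # base plus all mip levels
print(full / base)                     # ~1.333: the chain adds only ~33%
```

Streaming then decides which of those levels are actually resident, which is where the close-up swap and the pop-in described above come from.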

Actually, 3Dc+ does no better than BC1 or DXT5 in terms of compression ratio ...
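The block sizes bear this out: all of these formats store a fixed-size block per 4x4 pixel tile, so bits per pixel is just the block size divided by 16.

```python
# Bits per pixel of the block-compression formats discussed above.
# Each stores one fixed-size block per 4x4 tile, so bpp = block_bits / 16.

formats = {
    "BC1/DXT1 (RGB)":        64,   # 64-bit block  -> 4 bpp
    "BC3/DXT5 (RGBA)":       128,  # 128-bit block -> 8 bpp
    "BC4/3Dc+ (1 channel)":  64,   # 64-bit block  -> 4 bpp
    "BC5/3Dc  (2 channels)": 128,  # 128-bit block -> 8 bpp
}

for name, block_bits in formats.items():
    print(f"{name}: {block_bits // 16} bpp")
```

So 3Dc+ matches BC1's 4 bpp rather than beating it; its advantage is per-channel quality (e.g. for normal maps and other single-channel data), not raw compression ratio.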

Pemalite said:

Of course it's not sustainable, yet it has happened constantly throughout the history of the DRAM market; heck (showing my age now), I remember it happening to EDO RAM.

DRAM manufacturers attempt to predict demand in advance, and this doesn't always align with where the market is, or where it's heading. For example, DRAM manufacturers boosted DDR3 production prior to Windows 8's launch in the anticipation that its memory requirements would increase; that didn't happen, and DRAM manufacturers the world over cried. (Windows 8's own lack of demand didn't help things either.)

Sometimes a new DRAM standard launches and the companies switch their factories over to it in order to capitalise on the initial high prices. Because of this, more DRAM gets produced than the market actually needs, flooding the marketplace and causing prices to drop.

In both cases, companies end up with massive piles of DRAM sitting around doing nothing, which means no revenue, and thus prices drop. This is why 8GB of DDR3 got as low as $30-$40 at one point, then quadrupled in price once manufacturing capacity decreased. (It's sitting at about $100 AUD now that the market has settled.)

GDDR5 production will continue to increase as the PlayStation 4 keeps steaming ahead and as nVidia and AMD eventually release more high-volume (i.e. lower-end) graphics processors with GDDR5 memory. The market will eventually "pop" once nVidia and AMD shift to DDR6 or some other memory standard and the low end shifts to cheaper DDR4. (Once economies of scale make DDR4 cheap, that is.)

Again, this has happened constantly throughout history; DRAM is a volatile market.

This can't just keep happening ... Sooner or later, more and more manufacturers will have to close down, preventing a flood of DRAM in the market's future. There are only three big DRAM manufacturers left in the industry: Samsung, Micron, and SK Hynix. There used to be another, Elpida, but they were no more after exactly the scenario you described ...

Flooding the market with goods isn't a good idea in the long term, since it ends up wiping out entire segments of the industry ...