
PS4 Blowout: GPGPU, DirectX 11; Sony London to “Set the Bar for the Industry”, Global Illumination, Instant Radiosity, More

Wh1pL4shL1ve_007 said:
^^^Ohh ok, but when tessellation is on, does it consume more "power"?


Yes, there is a performance hit associated with it. Though in many cases that performance hit would be less than just using much more detailed base models.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

Wh1pL4shL1ve_007 said:
^^^Ohh ok, but when tessellation is on, does it consume more "power"?

You ultimately have to draw the polygons regardless.  The way tessellation is used in current tech demos and games just repurposes polygons.  You don't need 100k polygons on a model while you play a game, but you would during a cutscene.  Conversely, you wouldn't need that many polygons in the background during those cutscenes, so you can put them into that 100k model, then dial the model down to 50k during gameplay and spend the extra polygons on the background instead.  It depends how you want to subdivide the polygons and set control points for them.



These should help explain what tessellation can be used for
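To give a rough idea of how that subdivision gets controlled, here is a minimal sketch of a distance-based tessellation factor, written as a plain C-style function with made-up numbers; in a real DirectX 11 pipeline this sort of decision would typically live in the hull shader's patch-constant function rather than on the CPU:

    // Hedged sketch: choose how finely to subdivide a patch based on its
    // distance from the camera, so the hero model gets the extra polygons
    // up close and background geometry stays cheap. Numbers are made up.
    float tessFactorForPatch(float distanceToCamera)
    {
        const float nearDist = 5.0f;   // full detail inside this range
        const float farDist  = 50.0f;  // minimum detail beyond this range
        const float maxTess  = 16.0f;  // D3D11 allows factors up to 64
        const float minTess  = 1.0f;   // 1 = no extra subdivision

        float t = (distanceToCamera - nearDist) / (farDist - nearDist);
        if (t < 0.0f) t = 0.0f;        // clamp to [0, 1]
        if (t > 1.0f) t = 1.0f;
        return maxTess + t * (minTess - maxTess);   // blend from max down to min
    }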



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

dahuman said:
Oh boy, I can't wait for Etho and others to all of a sudden start praising how wonderful GPGPU is and how it will change gaming after reading that thing, when it didn't matter on the Wii U :P

Hummmmm... really? Same answer... GPGPU will not be used (at least not in this new generation), and in particular there is no turning water into wine because of GPGPU... there is no miracle... no game running or looking better because of GPGPU.

The other points in the article were expected.



dahuman said:

I'd be happy enough with 720p 60FPS at pretty good detail on consoles TBH since I know they are weak; I just don't know what people are expecting from the GPGPU other than decompression, physics and "maybe" some AI assist.

lol you and me agree.

So why did you say my name before... lol lol... I never defended GPGPU.

PS. I'm a little drunk now.




wow, that tessellation stuff is intense



Sounds awesome, can't wait for the PS4.



VITA 32 GIG CARD.250 GIG SLIM & 160 GIG PHAT PS3

ethomaz said:
dahuman said:
Oh boy, I can't wait for Etho and others to all of a sudden start praising how wonderful GPGPU is and how it will change gaming after reading that thing, when it didn't matter on the Wii U :P

Hummmmm... really? Same answer... GPGPU will not be used (at least not in this new generation), and in particular there is no turning water into wine because of GPGPU... there is no miracle... no game running or looking better because of GPGPU.

The other points in the article were expected.


It will be used; UE4 uses DirectCompute, aka compute shaders (DirectX's GPGPU API), to calculate its voxel-based lighting volumes, for example. But mostly it will be used for a few gimmicky effects like fluid/cloth simulation or particle physics. I am sure Nvidia will port PhysX to the GPU on next-gen consoles in some capacity, mostly to encourage devs to implement it in cross-platform games.

But definitely not the "offloading CPU work to make up for a weak CPU" nonsense that some people seem to think. And really, it doesn't apply in the same way in the console space as it does in the PC space; devs already have low-level access to the GPU hardware on consoles, so GPGPU doesn't make much difference in that space except for slightly more capable hardware.
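To make the fluid/particle point concrete, the kind of work that maps well to compute is roughly this: one thread per particle, simple math, no dependencies between threads. The sketch below uses CUDA syntax purely for illustration (a DirectCompute/HLSL compute shader would be structured the same way), and every name and number in it is made up:

    // Hedged sketch: integrate simple particle motion, one GPU thread per particle.
    struct Particle { float3 pos; float3 vel; };

    __global__ void stepParticles(Particle* p, int count, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        p[i].vel.y -= 9.8f * dt;         // gravity
        p[i].pos.x += p[i].vel.x * dt;   // integrate position
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }

    // Launched as, e.g.: stepParticles<<<(count + 255) / 256, 256>>>(d_particles, count, dt);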



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

I feel that Nintendo's perspective on GPGPU is doing things like vertex processing on the GPU instead of relying on the CPU to help out. Compared with what is presently done in the current console generation, doing things like animation etc. on the GPU could be called GPGPU, so I guess they aren't exactly lying even if the truth is less spectacular.

We should also remember that a lot of what the GPU currently does was once the domain of the CPU. By old standards that shift already counts as GPGPU; it's just that as the GPU has picked up more workloads, we've grown accustomed to it.
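For what it's worth, "animation on the GPU" usually means something like vertex skinning: blending each vertex by its bone matrices in a shader or kernel instead of on the CPU. A made-up sketch of the per-vertex work, again in CUDA syntax only for illustration:

    // Hedged sketch: linear-blend skinning, one vertex per thread, up to 4 bone
    // influences per vertex. boneMats holds one 4x4 row-major matrix (16 floats) per bone.
    __global__ void skinVertices(const float3* restPos,
                                 const int*    boneIds,   // 4 bone ids per vertex
                                 const float*  weights,   // 4 weights per vertex, summing to 1
                                 const float*  boneMats,  // 16 floats per bone
                                 float3*       outPos,
                                 int           vertexCount)
    {
        int v = blockIdx.x * blockDim.x + threadIdx.x;
        if (v >= vertexCount) return;

        float3 p   = restPos[v];
        float3 acc = make_float3(0.0f, 0.0f, 0.0f);

        for (int k = 0; k < 4; ++k)
        {
            int          bone = boneIds[v * 4 + k];
            float        w    = weights[v * 4 + k];
            const float* m    = &boneMats[bone * 16];

            // transform the rest-pose position by this bone's matrix, weighted
            acc.x += w * (m[0] * p.x + m[1] * p.y + m[2]  * p.z + m[3]);
            acc.y += w * (m[4] * p.x + m[5] * p.y + m[6]  * p.z + m[7]);
            acc.z += w * (m[8] * p.x + m[9] * p.y + m[10] * p.z + m[11]);
        }
        outPos[v] = acc;
    }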



Tease.

As someone who has done CUDA programming, I can tell you that GPGPU programming is no easier than using Cell. Objects need to be copied to the GPU's memory, which could range from fast but small shared memory to the slower but larger global memory. When you copy back, the memory accesses need to be coalesced to achieve optimal write performance. Not to mention, you need to manage and sync warps (clusters of threads), which can cause huge bottlenecks if you require all data before sending stuff back to main memory.

Of course, this is all from 3 years ago... not sure how GPGPU programming has changed these days.
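For anyone curious what that workflow looks like, here is a bare-bones CUDA round trip: allocate device memory, copy the data in, run a kernel, copy the result back. Names are made up; the kernel just scales an array, and because neighbouring threads touch neighbouring elements, the reads and writes coalesce into wide memory transactions:

    #include <cuda_runtime.h>

    // Hedged sketch: neighbouring threads access neighbouring elements,
    // which is what keeps the global-memory traffic coalesced.
    __global__ void scale(float* data, int n, float s)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= s;
    }

    void runOnGpu(float* host, int n)
    {
        float* dev = 0;
        cudaMalloc((void**)&dev, n * sizeof(float));                       // device (global) memory
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);  // copy objects over
        scale<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);                     // one thread per element
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);  // copy the results back
        cudaFree(dev);
    }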