Real Time Raytracing Demo from Crytek running on AMD Vega 56, will PS5/Xbox 2 be able to run Raytracing???


How is this possible?

Magic: 2 (16.67%)
Flying Spaghetti: 2 (16.67%)
It's a Hoax: 1 (8.33%)
Engineering miracle: 4 (33.33%)
Someone sacrificed their life to Satan: 3 (25.00%)

Total: 12
vivster said:

Wow, so much to unpack here. I'll let pema do the bulk work here but a few things:

Is this rendered 1080p or 720p? The latter won't fly on any next gen console.
Looks like they skipped AA on this one, which is a significant additional performance drag.
Seems to be 30fps in a very quiet and empty scene, i.e. absolutely not representative of an actual game.

Low quality RT is of course possible on mainstream GPUs, as is anything low quality. Color me non-impressed.

Well, the video on Youtube is available in 4K@30fps, but that doesn't really mean anything.

Does anyone know if they have a conference at GDC? Hopefully they do and tell us more about it.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k @ stock (for now), 16GB RAM at 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.


Is it possible? Perhaps. But it is much more likely that they would use ray tracing to generate a pre-baked solution and save a lot of processing power to spend on much better things within that performance budget.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363


vivster said:

Wow, so much to unpack here. I'll let pema do the bulk work here but a few things:

Is this rendered 1080p or 720p? The latter won't fly on any next gen console.
Looks like they skipped AA on this one, which is a significant additional performance drag.
Seems to be 30fps in a very quiet and empty scene, i.e. absolutely not representative of an actual game.

Low quality RT is of course possible on mainstream GPUs, as is anything low quality. Color me non-impressed.

It's definitely not skipping anti-aliasing. I am extremely sensitive to aliasing and this demo doesn't look bad to me at all; there are far fewer jaggies than you would see in a game with no anti-aliasing at all. They just aren't using a really high-end AA setting. For instance, if there were no AA at all, the power lines running over the streets would be painful for me to look at.



I can't help but think you all might be let down by the next-gen specs when they drop. Either that, or get ready for $600 USD for the next gen, which suits me fine, but the PS3 launch looms large in Sony's mind.



Did you know the Cell could do raytracing as well?



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

kirby007 said:
Did you know the Cell could do raytracing as well?

Perfect Cell could do a lot of stuff.



Well, if I'm not gravely mistaken, this is a bit of an apples-to-oranges comparison.

Demos and games on nVidia RTX cards use path tracing, while Crytek is using Sparse Voxel Octree Total Illumination (SVOTI, which is more or less SVOGI...invented by nVidia, no less), which uses voxels to approximate scene geometry and then casts cones onto them.

I don't expect to see much of DXR and RTX type of things next gen, maybe for a few crucial things here and there, but I fully expect to see SVOGI. Right about the time this gen kicked in, Epic was building SVOGI as the main lighting system for UE4, but because the consoles were underpowered they dropped it in the end, to re-implement it at a later date (UE4 currently supports it, to my knowledge). Next-gen consoles will definitely have enough juice to use it properly.

Lately, Metro was all the rage on RTX cards, but I was secretly hoping for some company to do exactly what Crytek did: show how close to the "real" thing you can get with a voxel-based approach. Of course, there are things that will not be as good as with true ray tracing, but I think it's a fairly good approximation.
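For anyone curious what "casting cones onto voxels" means in practice, here is a minimal Python sketch of voxel cone tracing in the SVOGI style: march along the cone axis, and as the cone widens, sample progressively coarser mip levels of the voxelized scene while accumulating radiance front-to-back. The data layout (dicts per mip level) and every name here are my own toy simplifications, not CryEngine's actual implementation.

```python
import math

def cone_trace(voxel_mips, origin, direction, aperture, max_dist, voxel_size=1.0):
    """March a cone through pre-voxelized scene data.

    voxel_mips: list of dicts mapping (x, y, z) -> (radiance, opacity);
    index 0 is the finest level and each level doubles the voxel size.
    aperture: tangent of the cone's half-angle (wider cone = blurrier result).
    """
    radiance, occlusion = 0.0, 0.0
    dist = voxel_size  # start one voxel out to avoid self-sampling
    while dist < max_dist and occlusion < 1.0:
        # the cone's footprint grows with distance, so pick the mip level
        # whose voxels roughly match the cone diameter at this point
        diameter = max(voxel_size, 2.0 * aperture * dist)
        mip = min(len(voxel_mips) - 1, int(math.log2(diameter / voxel_size)))
        scale = voxel_size * (2 ** mip)
        p = tuple(int((origin[i] + direction[i] * dist) / scale) for i in range(3))
        r, a = voxel_mips[mip].get(p, (0.0, 0.0))
        # front-to-back alpha accumulation: nearer voxels occlude farther ones
        radiance += (1.0 - occlusion) * a * r
        occlusion += (1.0 - occlusion) * a
        dist += diameter * 0.5  # step proportional to cone width
    return radiance, occlusion
```

The trick that makes this so much cheaper than true ray tracing is visible in the loop: a single cone with a handful of coarse samples replaces hundreds of individual rays, at the cost of the approximation errors mentioned above.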

 



Ray Tracing is inherently a compute-limited problem... Ironically, compute is one of AMD's greatest strengths in GPU design.
We started down the path of Ray Tracing with Path Tracing during the 7th gen. (There might have been some dabbling in that space in the 6th gen with a few early deferred renderers? Not sure.)

It goes without saying that next gen we will start dabbling more in Ray Tracing... And during the 10th gen we will more or less be in a fully ray-traced world, or close enough to it anyway. Hopefully by then AMD has all its GPU ducks in a row.
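The "compute-limited" point is easy to see in code: every pixel's ray is solved independently of every other, which is exactly the kind of embarrassingly parallel arithmetic GPUs are built for. A tiny illustrative Python sketch (orthographic camera, one sphere, names all mine):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return distance to the nearest hit along the ray, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic),
    assuming direction is normalized."""
    oc = tuple(origin[i] - center[i] for i in range(3))
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace_image(width, height, center, radius):
    """Each pixel fires its own ray; no pixel depends on any other,
    so the whole loop maps directly onto thousands of GPU threads."""
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            # simple orthographic camera looking down +z, pixel-centered
            o = (x - width / 2.0 + 0.5, y - height / 2.0 + 0.5, 0.0)
            t = ray_sphere(o, (0.0, 0.0, 1.0), center, radius)
            row.append(1 if t is not None else 0)
        img.append(row)
    return img
```

The per-pixel math is just a quadratic solve and a square root: pure arithmetic, little memory traffic, which is why raw compute throughput is what bounds it.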



Pemalite said:
Ray Tracing is inherently a compute-limited problem... Ironically, compute is one of AMD's greatest strengths in GPU design.
We started down the path of Ray Tracing with Path Tracing during the 7th gen. (There might have been some dabbling in that space in the 6th gen with a few early deferred renderers? Not sure.)

It goes without saying that next gen we will start dabbling more in Ray Tracing... And during the 10th gen we will more or less be in a fully ray-traced world, or close enough to it anyway. Hopefully by then AMD has all its GPU ducks in a row.

I mostly hope that in gen 10 we'll be fully in voxel-based worlds (or some variation of them); everything else, IMO, will then more or less come by default.



HoloDust said:

Well, if I'm not gravely mistaken, this is a bit of an apples-to-oranges comparison.

Demos and games on nVidia RTX cards use path tracing, while Crytek is using Sparse Voxel Octree Total Illumination (SVOTI, which is more or less SVOGI...invented by nVidia, no less), which uses voxels to approximate scene geometry and then casts cones onto them.

I don't expect to see much of DXR and RTX type of things next gen, maybe for a few crucial things here and there, but I fully expect to see SVOGI. Right about the time this gen kicked in, Epic was building SVOGI as the main lighting system for UE4, but because the consoles were underpowered they dropped it in the end, to re-implement it at a later date (UE4 currently supports it, to my knowledge). Next-gen consoles will definitely have enough juice to use it properly.

Lately, Metro was all the rage on RTX cards, but I was secretly hoping for some company to do exactly what Crytek did: show how close to the "real" thing you can get with a voxel-based approach. Of course, there are things that will not be as good as with true ray tracing, but I think it's a fairly good approximation.

 

Yes, it is much better to get "good enough" for a very small fraction of the cost.

I would expect them to use ray tracing to prepare either pre-baked effects of better quality, or to create cutscenes.


