
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
vivster said:
Checked out the ComputerBase review. Slightly better than my expectations. I might even go for an FE myself since they turned out this good. If I can get my hands on one.

The 2080Ti is good, but the vanilla 2080 isn't that much better than a 1080Ti, and there won't be much of a price difference between them.

Who the fuck expected the 2080 to be much better than the 1080Ti? Also, who the fuck would want a 2080? That's like people buying a 9700K.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Megiddo said:
Is this the general PC gaming thread? Just wanted to say I was a Kickstarter backer and I've been enjoying The Bard's Tale 4 quite a bit. It's not a game for everyone. Performance/optimization is poor and the UI seems pretty buggy, but I think the actual gameplay mechanics are great. If you like old-school dungeon crawlers/RPGs with an emphasis on puzzle solving and you can tolerate optimization/performance issues, then I highly recommend it. I've got 10 hours into it and it's a complete blast from the past. The way they did turn-based grid combat is excellent, and the voice acting in particular is quite impressive for a 'AA' title.

Sure is. Welcome!

HoloDust said:
Honestly, looking at those benchmarks, I think the 2080/Ti are pretty bad given how much time has passed since the 1080 and how they're priced.

Yeah. It's the kind of jump I would expect after 1 year.
Makes you wonder how much of a leap they could have provided if they had dedicated all that extra fluff to rasterization.

vivster said:

Who the fuck expected the 2080 to be much better than the 1080Ti? Also, who the fuck would want a 2080? That's like people buying a 9700K.

There is a fairly large price differential between the 2080 and the 2080Ti.

I am hoping this forces Vega prices down, to be honest, although I doubt it.



--::{PC Gaming Master Race}::--

Pemalite said:
HoloDust said:
Honestly, looking at those benchmarks, I think the 2080/Ti are pretty bad given how much time has passed since the 1080 and how they're priced.

Yeah. It's the kind of jump I would expect after 1 year.
Makes you wonder how much of a leap they could have provided if they had dedicated all that extra fluff to rasterization.

I honestly can't quite wrap my head around why they're pushing RT at this point - I even went down the rabbit hole and recalled Larrabee, thinking maybe they know something about Intel and some potential dGPU that we don't...

It's interesting tech nevertheless (though I'm not so sure about their denoising approach being anything but a short-term solution), I just don't see it as very useful at the moment, so it's kind of a waste of silicon. But given the state AMD is in when it comes to GPUs, they at least don't have to worry about them much.



HoloDust said:
Pemalite said:

Yeah. It's the kind of jump I would expect after 1 year.
Makes you wonder how much of a leap they could have provided if they had dedicated all that extra fluff to rasterization.

I honestly can't quite wrap my head around why they're pushing RT at this point - I even went down the rabbit hole and recalled Larrabee, thinking maybe they know something about Intel and some potential dGPU that we don't...

It's interesting tech nevertheless (though I'm not so sure about their denoising approach being anything but a short-term solution), I just don't see it as very useful at the moment, so it's kind of a waste of silicon. But given the state AMD is in when it comes to GPUs, they at least don't have to worry about them much.

It is a waste of silicon at this point. Nothing can even use it.
Plus... It will still be years before it becomes commonplace in GPUs anyway, so developers have zero/minimal incentive to implement it.

AMD has yet to jump onboard, nVidia is only reserving it for its high-end GPUs... Intel is nowhere to be seen, mobile and consoles are meh... And then we have years' worth of GPUs that are still capable gaming parts as well.

Traction to adopt Ray Tracing will be slow. Shit... It took Tessellation years as well, and that had more GPUs on the market that could leverage it.

Ray Tracing is very much a compute problem, which has typically been an advantage of Graphics Core Next, so it will be interesting to see what path AMD takes... Whether they will build fixed-function units to handle it... Or just include it as part of the current compute hardware that can work with Ray Tracing or Rasterization (pros and cons to each). - Denoising is a necessary evil at this early stage, so it will be interesting to see how things change over the coming years.

But one thing is for sure... A large portion of the RTX cards is entirely useless at the moment... And I would not be buying those GPUs because of those aspects.
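
To put the fixed-function vs. compute question in concrete terms: the hot loop of any BVH-based tracer is millions of tiny ray-vs-box tests per frame, which is exactly the kind of small, regular math you can either run on the existing shaders or bake into dedicated units. Here's a minimal C++ sketch of that "slab test" - purely illustrative, with my own made-up struct layout, and definitely not how nVidia's RT cores are implemented internally (those details aren't public).

    // Illustrative ray vs. axis-aligned bounding box "slab test" used during BVH traversal.
    // Struct layouts and names are my own; real GPU implementations are not public.
    #include <algorithm>

    struct Ray  { float ox, oy, oz; float invDx, invDy, invDz; }; // origin + 1/direction per axis
    struct AABB { float minX, minY, minZ, maxX, maxY, maxZ; };

    bool rayHitsBox(const Ray& r, const AABB& b, float tMax)
    {
        // Entry/exit distances of the ray through each pair of axis-aligned "slabs".
        float tx1 = (b.minX - r.ox) * r.invDx, tx2 = (b.maxX - r.ox) * r.invDx;
        float ty1 = (b.minY - r.oy) * r.invDy, ty2 = (b.maxY - r.oy) * r.invDy;
        float tz1 = (b.minZ - r.oz) * r.invDz, tz2 = (b.maxZ - r.oz) * r.invDz;

        float tEnter = std::max({std::min(tx1, tx2), std::min(ty1, ty2), std::min(tz1, tz2)});
        float tExit  = std::min({std::max(tx1, tx2), std::max(ty1, ty2), std::max(tz1, tz2)});

        // The ray hits the box if its latest entry comes before its earliest exit.
        return tEnter <= tExit && tExit >= 0.0f && tEnter <= tMax;
    }

Whether AMD ends up running that on its compute units or in fixed-function blocks, the operation itself is the same - which is why either path is plausible.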



--::{PC Gaming Master Race}::--

Pemalite said:
HoloDust said:

I honestly can't quite wrap my head around why they're pushing RT at this point - I even went down the rabbit hole and recalled Larrabee, thinking maybe they know something about Intel and some potential dGPU that we don't...

It's interesting tech nevertheless (though I'm not so sure about their denoising approach being anything but a short-term solution), I just don't see it as very useful at the moment, so it's kind of a waste of silicon. But given the state AMD is in when it comes to GPUs, they at least don't have to worry about them much.

It is a waste of silicon at this point. Nothing can even use it.
Plus... It will still be years before it becomes commonplace in GPUs anyway, so developers have zero/minimal incentive to implement it.

AMD has yet to jump onboard, nVidia is only reserving it for its high-end GPUs... Intel is nowhere to be seen, mobile and consoles are meh... And then we have years' worth of GPUs that are still capable gaming parts as well.

Traction to adopt Ray Tracing will be slow. Shit... It took Tessellation years as well, and that had more GPUs on the market that could leverage it.

Ray Tracing is very much a compute problem, which has typically been an advantage of Graphics Core Next, so it will be interesting to see what path AMD takes... Whether they will build fixed-function units to handle it... Or just include it as part of the current compute hardware that can work with Ray Tracing or Rasterization (pros and cons to each). - Denoising is a necessary evil at this early stage, so it will be interesting to see how things change over the coming years.

But one thing is for sure... A large portion of the RTX cards is entirely useless at the moment... And I would not be buying those GPUs because of those aspects.

I'm wondering if that part of the silicon that does all the BHV calculations (IIRC, it's in the RT cores) can be used for something else - I can't say I'm that familiar with all the aspects of voxel octrees and similar approaches, but at least from my layman's POV, BHV seems like something that could perhaps be used for some part of it.

I'm thinking, maybe, sometime in the mid-term future, we'll see GPUs that are mostly built for something like voxels and raytracing... so I'm wondering whether this is perhaps one of those moments in GPU history (like T&L, unified shaders or tessellation) that we'll remember eventually.
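
The reason the two feel related (at least to me) is that both are trees of nested bounding regions that a ray walks from the root down - a BVH with arbitrary boxes, an octree with implicit ones. Just to make the comparison concrete, a rough C++ sketch of the two node layouts; every field here is made up for illustration and doesn't correspond to nVidia's or any real engine's format.

    // Hypothetical node layouts, only to show the structural similarity being speculated about.
    struct BVHNode {
        float boundsMin[3], boundsMax[3];   // explicit, arbitrarily-sized box around the subtree
        int   leftChild, rightChild;        // indices of the two children (-1 = this is a leaf)
        int   firstTriangle, triangleCount; // leaf payload: which triangles to intersect
    };

    struct SparseVoxelOctreeNode {
        // Bounds are implicit: each child occupies one fixed octant of the parent cube,
        // so a node only records which of its 8 octants actually exist.
        unsigned char childMask;  // bit i set = octant i is present
        int           firstChild; // index of the first existing child node
        int           voxelData;  // leaf payload: material/colour
    };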



HoloDust said:

I'm wondering if that part of the silicon that does all the BHV calculations (IIRC, it's in the RT cores) can be used for something else - I can't say I'm that familiar with all the aspects of voxel octrees and similar approaches, but at least from my layman's POV, BHV seems like something that could perhaps be used for some part of it.

I'm thinking, maybe, sometime in the mid-term future, we'll see GPUs that are mostly built for something like voxels and raytracing... so I'm wondering whether this is perhaps one of those moments in GPU history (like T&L, unified shaders or tessellation) that we'll remember eventually.

BHV? Or did you mean BVH, or rather... Bounding Volume Hierarchy?
Either way... nVidia's RT cores are very simple... But there isn't a lot of deep information on them other than that... So to what extent they can be leveraged for non-Ray Traced tasks, I can't say with any definitive certainty.

We are most certainly at one of those "big crossroads" in GPU technology, just like T&L, Unified Shaders, Tessellation, Texture Filtering, Texture Compression, Anti-Aliasing... And one thing is for certain... The GeForce 2080 Ti is new hardware, but the real hardware that will really show what Ray Tracing is all about won't come for a few years yet... And then games will follow later.
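
For anyone wondering what those RT cores actually spend their time on, the publicly described job is basically "walk the BVH, prune the subtrees the ray misses, hand surviving leaves back for shading". A toy CPU-side sketch of that loop in C++ - everything here is invented and heavily simplified (the box test in particular is not the real slab test), so treat it as illustration only.

    // Toy BVH traversal: prune subtrees whose box the ray's bounding interval misses,
    // and report leaf primitives that survive. Purely illustrative, not real driver/HW code.
    #include <cstdio>
    #include <vector>

    struct Node {
        float min[3], max[3];         // node bounding box
        int   left = -1, right = -1;  // child indices; -1 means this node is a leaf
        int   primitive = -1;         // which primitive a leaf refers to
    };

    // Simplified overlap test between the node's box and a box bounding the ray segment.
    bool overlaps(const Node& n, const float rmin[3], const float rmax[3]) {
        for (int axis = 0; axis < 3; ++axis)
            if (rmax[axis] < n.min[axis] || rmin[axis] > n.max[axis]) return false;
        return true;
    }

    void traverse(const std::vector<Node>& bvh, const float rmin[3], const float rmax[3]) {
        std::vector<int> stack = {0};                  // start at the root node
        while (!stack.empty()) {
            int index = stack.back(); stack.pop_back();
            const Node& node = bvh[index];
            if (!overlaps(node, rmin, rmax)) continue; // whole subtree missed: prune it
            if (node.left < 0) {
                std::printf("shade/test primitive %d\n", node.primitive); // leaf candidate
            } else {
                stack.push_back(node.left);            // descend into both children
                stack.push_back(node.right);
            }
        }
    }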



--::{PC Gaming Master Race}::--

HoloDust said:
Pemalite said:

Yeah. It's the kind of jump I would expect after 1 year.
Makes you wonder how much of a leap they could have provided if they had dedicated all that extra fluff to rasterization.

I honestly can't quite wrap my head around why they're pushing RT at this point - I even went down the rabbit hole and recalled Larrabee, thinking maybe they know something about Intel and some potential dGPU that we don't...

It's interesting tech nevertheless (though I'm not so sure about their denoising approach being anything but a short-term solution), I just don't see it as very useful at the moment, so it's kind of a waste of silicon. But given the state AMD is in when it comes to GPUs, they at least don't have to worry about them much.

That's a dangerous stance when it comes to innovation. You have to start at some point. It's basically the perfect time, now that AMD is doing literally nothing. It paves the way for devs to get familiar with the tech and eggs on AMD to provide similar solutions. And come next generation, we can start making actual use of it.

Speaking of wasted silicon, how the fuck does Nvidia not have a feature like ZeroCore yet?



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:

That's a dangerous stance when it comes to innovation. You have to start at some point. It's basically the perfect time, now that AMD is doing literally nothing. It paves the way for devs to get familiar with the tech and eggs on AMD to provide similar solutions. And come next generation, we can start making actual use of it.

Speaking of wasted silicon, how the fuck does Nvidia not have a feature like ZeroCore yet?

I meant that it's wasted silicon because nothing is using it right now... I recognize the chicken-and-egg scenario happening here.

There are also other ways you can implement Ray Tracing whilst not sacrificing rasterization capability; we will just have to wait and see how Intel and AMD respond. (Which could be years from now.)

In fact... Hybrid Ray Tracing + Rasterization has been a "thing" for years now anyway, so it's only natural we continue along that path; it's just that nVidia is willing to "bet" on it by dedicating die area to the specific problem.
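
Just to spell out what that hybrid approach looks like per frame: rasterize as usual, spend a limited ray budget only on the effects rasterization is bad at, then denoise the sparse result. The stubbed-out C++ below is only an outline of the shape of it - the function names are mine, not any engine's or nVidia's actual API.

    // Outline of a hybrid rasterization + ray tracing frame. All functions are empty stubs
    // standing in for real engine passes; names are invented for illustration only.
    void rasterizeGBuffer()        {} // normal raster pass: depth, normals, albedo for every pixel
    void shadeDirectLighting()     {} // classic rasterized lighting, unchanged from today
    void traceReflectionRays()     {} // ray budget spent on reflections launched from the G-buffer
    void traceShadowRays()         {} // occlusion rays toward selected lights
    void denoiseRayTracedBuffers() {} // the "necessary evil": filter the noisy, sparse ray results
    void compositeAndPostProcess() {} // merge raster + RT buffers, tone map, etc.

    void renderHybridFrame()
    {
        rasterizeGBuffer();
        shadeDirectLighting();
        traceReflectionRays();     // typically well under one ray per pixel per effect
        traceShadowRays();
        denoiseRayTracedBuffers();
        compositeAndPostProcess();
    }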



--::{PC Gaming Master Race}::--

Pemalite said:
vivster said:

That's a dangerous stance when it comes to innovation. You have to start at some point. It's basically the perfect time, now that AMD is doing literally nothing. It paves the way for devs to get familiar with the tech and eggs on AMD to provide similar solutions. And come next generation, we can start making actual use of it.

Speaking of wasted silicon, how the fuck does Nvidia not have a feature like ZeroCore yet?

I meant that it's wasted silicon because nothing is using it right now... I recognize the chicken-and-egg scenario happening here.

There are also other ways you can implement Ray Tracing whilst not sacrificing rasterization capability; we will just have to wait and see how Intel and AMD respond. (Which could be years from now.)

In fact... Hybrid Ray Tracing + Rasterization has been a "thing" for years now anyway, so it's only natural we continue along that path; it's just that nVidia is willing to "bet" on it by dedicating die area to the specific problem.

Considering that RT is the single most important graphics advancement since going 3D, I'm willing to sacrifice a bit of silicon for it. It seems like the perfect thing to make dedicated hardware for, seeing how it's so simple and so easy to parallelize.

It'd be cool to have whole GPUs dedicated just to RT, but that's probably too inconvenient to ever reach the mass market. Just imagine all the Gigarays on a chip as big as Turing filled exclusively with RT cores.

Last edited by vivster - on 20 September 2018

If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Pemalite said:
HoloDust said:

I'm wondering if that part of the silicon that does all the BHV calculations (IIRC, it's in the RT cores) can be used for something else - I can't say I'm that familiar with all the aspects of voxel octrees and similar approaches, but at least from my layman's POV, BHV seems like something that could perhaps be used for some part of it.

I'm thinking, maybe, sometime in the mid-term future, we'll see GPUs that are mostly built for something like voxels and raytracing... so I'm wondering whether this is perhaps one of those moments in GPU history (like T&L, unified shaders or tessellation) that we'll remember eventually.

BHV? Or did you mean BVH, or rather... Bounding Volume Hierarchy?
Either way... nVidia's RT cores are very simple... But there isn't a lot of deep information on them other than that... So to what extent they can be leveraged for non-Ray Traced tasks, I can't say with any definitive certainty.

We are most certainly at one of those "big crossroads" in GPU technology, just like T&L, Unified Shaders, Tessellation, Texture Filtering, Texture Compression, Anti-Aliasing... And one thing is for certain... The GeForce 2080 Ti is new hardware, but the real hardware that will really show what Ray Tracing is all about won't come for a few years yet... And then games will follow later.

Yeah, typo - I meant BVH.

As I said (at least from what I understood about it from AnandTech's article on the Turing architecture), the way it searches for what intersects what reminded me a bit of sparse voxel octrees... though, as I said, I'm not too familiar with the principles of either.