vivster said:
RT is exactly why I didn't want to splurge on the 2080 Ti yet. Turing was basically just a prototype with very few RT and Tensor cores. Since they were so few, it was natural to assume that the second generation would have an increase of at least 100%, which would already be huge. Considering the shrink and the increased IPC, it is very easy for me to believe that we'll see a 4x increase in RT performance. I also can't wait to see what they can do with many more Tensor cores and DLSS 3.0. God, I love Nvidia for being so progressive and putting cool stuff into their chips, even if it slightly reduces rasterization performance. RTX Voice is already crazy good, and with the inclusion of hardware-accelerated AI the possibilities are limitless.

I wonder if we'll ever see innovation like that in CPUs. For example using AI to improve multicore usage.

Yeah. Never buy a first-generation product for a specific technology; it will be outclassed and replaced at an insanely rapid pace.

After a few generations they work out the kinks and you get a significantly better product. It happened with Tessellation, Programmable Pixel Shaders, multi-monitor, CrossFire (remember when you needed a master and a slave GPU!?) and so on.

When Tessellation first came out with the Radeon 5870, AMD's implementation was pretty average, good for a first showing... But a Radeon 7970 had upwards of a 4x increase in tessellation performance over the 6970, which in turn had upwards of a 3x increase over the Radeon 5870, so it was moving rapidly. Increases still happened after that, but they weren't as notable in retrospect.

CPUs are an entirely different beast... In the last few years we have gone from 4-core CPUs being your mid-range to high-end PC to 6-12-core parts, mostly thanks to AMD.
It will take time for developers to take advantage of that... Whether "AI" will improve things there is highly doubtful; it takes a fundamental shift in thinking about engine technology to make your code more parallel.
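Just to illustrate the kind of shift in thinking I mean, here's a rough, made-up C++ sketch (not from any real engine; all names are hypothetical) of a game update written as a data-parallel loop. Extra cores only help once the work is structured into independent chunks like this, and restructuring existing engines that way is the hard part:

```cpp
// Minimal sketch of a data-parallel entity update: because each entity's
// update is independent, the loop can be handed to all available cores
// via the standard parallel algorithms (C++17).
#include <algorithm>
#include <execution>
#include <vector>

struct Entity {
    float x = 0.0f, y = 0.0f;   // position
    float vx = 1.0f, vy = 0.5f; // velocity
};

void update_entities(std::vector<Entity>& entities, float dt) {
    std::for_each(std::execution::par_unseq, entities.begin(), entities.end(),
                  [dt](Entity& e) {
                      e.x += e.vx * dt;
                      e.y += e.vy * dt;
                  });
}

int main() {
    std::vector<Entity> entities(100000);
    update_entities(entities, 1.0f / 60.0f); // one simulated frame step
}
```

The awkward bit in a real engine is everything that isn't independent: physics, AI and gameplay code touching shared state. That's why the change is architectural rather than something a compiler or "AI" can bolt on afterwards.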

JEMC said:

I had to look it up, and it turns out that the HBCC described in those slides as the reason for the development of NVCache is from AMD, from their Vega cards. But I don't know if AMD decided to keep it in RDNA or not.

It will be interesting to see if both use something similar or if AMD decided to simply add more VRAM than Nvidia to their cards.

It's mostly for professional cards working with multi-terabyte datasets; it's pretty useless for gaming, really.
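For what it's worth, the basic idea behind HBCC (and presumably NVCache) is demand paging: treat the fast local memory as a cache over a much larger pool and pull chunks in as they're touched. Here's a toy CPU-side C++ sketch of that caching pattern; the sizes and names are made up and this is nothing like the actual hardware/driver implementation, but it shows why it mainly pays off when the dataset dwarfs the memory budget:

```cpp
// Toy illustration of the HBCC/NVCache idea: a small "resident" budget acts
// as a cache over a dataset far larger than itself, paging fixed-size chunks
// in on demand and evicting the least recently used one when full.
// CPU-side sketch only; real VRAM sizes would be orders of magnitude larger.
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>
#include <vector>

constexpr std::size_t CHUNK_BYTES   = 64 * 1024; // tiny "pages" for the demo
constexpr std::size_t BUDGET_CHUNKS = 4;         // tiny "VRAM" budget

class ChunkCache {
public:
    // Returns the requested chunk, loading (and possibly evicting) as needed.
    const std::vector<std::uint8_t>& get(std::size_t chunk_index) {
        auto it = resident_.find(chunk_index);
        if (it != resident_.end()) {
            // Cache hit: move to the front of the LRU list.
            lru_.splice(lru_.begin(), lru_, it->second.lru_pos);
            return it->second.data;
        }
        if (resident_.size() >= BUDGET_CHUNKS)
            evict_one();
        return load(chunk_index);
    }

private:
    struct Entry {
        std::vector<std::uint8_t> data;
        std::list<std::size_t>::iterator lru_pos;
    };

    void evict_one() {
        std::size_t victim = lru_.back(); // least recently used chunk
        lru_.pop_back();
        resident_.erase(victim);
        std::printf("evicted chunk %zu\n", victim);
    }

    const std::vector<std::uint8_t>& load(std::size_t chunk_index) {
        // Stand-in for fetching the chunk from system RAM or SSD.
        Entry e;
        e.data.assign(CHUNK_BYTES, static_cast<std::uint8_t>(chunk_index));
        lru_.push_front(chunk_index);
        e.lru_pos = lru_.begin();
        auto inserted = resident_.emplace(chunk_index, std::move(e));
        return inserted.first->second.data;
    }

    std::unordered_map<std::size_t, Entry> resident_;
    std::list<std::size_t> lru_;
};

int main() {
    ChunkCache cache;
    // Touch more chunks than fit in the budget to force evictions.
    for (std::size_t i = 0; i < BUDGET_CHUNKS * 4; ++i)
        cache.get(i);
}
```

Games rarely have working sets that exceed VRAM by that much, which is why it matters far more for those multi-terabyte professional datasets.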

Zkuq said:
I was considering getting a new graphics card last summer, and while I still haven't exactly got much real need given the games I've played recently, the risk of my GTX 770 running into something it handles poorly is getting greater and greater. Thus getting a new one is sounding quite appealing right now. I really hope the RTX 3060 is going to be priced similarly to the RTX 2060 (or lower), otherwise the decision is going to be between an RTX 2060 and an RTX 3050 or whatever...

You have come this far. Even a 1060 alone would be a big step up... So any mid-range GPU after that will be a significant upgrade.

Chazore said:

That's what irks me the most, tbh. I'm still on my 1440p 165Hz monitor, but I have next to no interest in 4K, mostly due to the lacking HDR support on PC, as well as 4K monitors costing a mint. If I grab a 3000 series GPU, I'd likely expect to do 4K 60fps, but with RT being added into the mix I may as well stick with 1440p, since I'm all about having decent settings and higher fps over another res bump and toned-back settings.

What I really need is a monitor that doesn't have crushed blacks, because that's the biggest gripe I have with this monitor at the moment. 

I think the general PC consensus is that unless you are running a 32" or larger display, the benefits of 4K over 1440p are fairly marginal, but the difference a 120/144/180/200/240Hz display makes can be a significant one.

Personally I am happy with 1440p, dialing up the visuals and keeping that high framerate... My next display will be one with a higher refresh rate and decent HDR. #ChristmasWishlist

Captain_Yuri said:
The main reason I want a 4K monitor is that while my Alienware 34 inch ultrawide 3440 x 1440 G-Sync display is pretty fap worthy, the consoles don't like that resolution, so they just display at 1080p. I doubt that will change with the next gen, so getting a 4K monitor would be awesome. But at this rate, it might be a better option for me to stick with my current monitor for PC gaming and instead buy a 55 inch LG OLED TV and a PS5, since buying both of those will be cheaper than getting a 32 inch 4K 144Hz gaming monitor...

I mean... It took Microsoft a year or two to add 1440p support to the Xbox One X, despite their pre-launch promise and despite being a PC company.
So console support for odd resolutions has always been a bit "shit".



--::{PC Gaming Master Race}::--