
Nvidia Gets SALTY

Conina said:
Both the RTX 2080 and the Radeon VII are underwhelming. I really can't get excited about a $699 MSRP video card with similar performance and launch price to a card two years older (the GTX 1080 Ti should keep up with both and had the same $699 MSRP in March 2017).

Those cards are like 13-14 TFLOPs.
They make the PS4 Pro + Xbox One X look weak.

That's underwhelming? O_O
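For reference, a rough sketch of where those TFLOP figures come from: peak FP32 throughput is roughly shader count x 2 ops per clock (fused multiply-add) x clock speed. The shader counts and boost clocks below are approximate public specs, so treat the outputs as ballpark numbers:

# Back-of-the-envelope peak FP32 throughput: shaders * 2 (FMA) * clock (GHz).
# Shader counts and boost clocks are approximate public specs.
def peak_tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000.0

for name, shaders, ghz in [
    ("Radeon VII", 3840, 1.80),   # ~13.8 TFLOPs
    ("RTX 2080",   2944, 1.71),   # ~10.1 TFLOPs
    ("Xbox One X", 2560, 1.172),  # ~6.0 TFLOPs
    ("PS4 Pro",    2304, 0.911),  # ~4.2 TFLOPs
]:
    print(f"{name}: {peak_tflops(shaders, ghz):.1f} TFLOPs")

By this paper math the Radeon VII is the ~13.8 TFLOPs card; the RTX 2080 is closer to ~10, though Nvidia's architecture is generally considered to get more gaming performance out of each theoretical FLOP.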




And they will be extra salty with the PS5 and Scarlett being AMD-powered again, selling well, and having good performance on the console side.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
And they will be extra salty with the PS5 and Scarlett being AMD-powered again, selling well, and having good performance on the console side.

From their perspective, probably. While PS and Xbox move a lot of units, console chips have very thin margins compared to graphics cards and laptop GPUs.

Mind you, I don't have anything against AMD, but PS and Xbox use their chips not because they're superior to Nvidia's but only because they're cheaper.



deskpro2k3 said:
Cobretti2 said:
Well, I haven't seen the performance of this card, but if what he said is fact, then you can't say he is being salty.

Salty would be downplaying and bagging out something good.

I'm not 100% sure, but I think it is as good as the RTX 2080, yet he is calling it lousy. So does that mean the RTX 2080 is lousy too? When you switch ray tracing and DLSS on, the game looks better but the FPS takes a hit.

The 2080 is lousy, but at least it's a 215W card built on a 12nm node. The Radeon VII is even more lousy given that it needs 300W and it's a 7nm card.

JRPGfan said:

Nvidia: "What its only as good as our RTX 2080 ? bish please.... we got Dlss & Raytracing."

note:
DLSS is a rendering technique like the PS4 Pro uses, kinda like checkerboard rendering (less image quality for more performance).
Also, AMD will have ray tracing too, I suspect.

Yes, he does sound salty.

If I understand the tech correctly, DLSS uses neural networks (that need to be trained per title) to make a higher-resolution image with more detail than what it gets as input - so, not like checkerboarding.
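A minimal sketch of the contrast, using numpy. The nearest-neighbour upscale below is only a stand-in for DLSS's per-title trained network, which is the part doing the heavy lifting in the real thing:

import numpy as np

def checkerboard_fill(frame):
    # Checkerboard-style: half the pixels are "not rendered" each frame,
    # then reconstructed from their (rendered) horizontal neighbours.
    out = frame.copy()
    h, w = out.shape
    missing = (np.indices((h, w)).sum(axis=0) % 2).astype(bool)
    left, right = np.roll(out, 1, axis=1), np.roll(out, -1, axis=1)
    out[missing] = (left[missing] + right[missing]) / 2
    return out

def dlss_like(frame, scale=2):
    # DLSS-style: render the whole frame at a lower resolution, then upscale.
    # Real DLSS replaces this crude upscale with a neural network trained
    # per title to infer the missing detail.
    small = frame[::scale, ::scale]
    return np.kron(small, np.ones((scale, scale)))

frame = np.random.rand(8, 8)           # stand-in for a rendered frame
print(checkerboard_fill(frame).shape)  # (8, 8): full res, half the pixels guessed
print(dlss_like(frame).shape)          # (8, 8): full res from a quarter of the pixels

Both approaches trade rendered pixels for performance; they differ in how the missing pixels get filled back in.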



HoloDust said:
If I understand the tech correctly, DLSS uses neural networks (that need to be trained per title) to make a higher-resolution image with more detail than what it gets as input - so, not like checkerboarding.

Yes, instead of one technique that kinda works for everything, they have an AI optimise how best to do it for each title.
But it's still the same way to go about getting better performance: it comes at the cost of image quality.

It's a better technique than the checkerboard rendering the PS4 Pro does, but it's in a sense the same thing.
(It's just that one works from the top and scales down, while the other works from the bottom and scales up.)

 

About the power load:


It's usually 225+ watts and has short bursts where it can go up to 314 watts.

The Radeon VII will probably be the same; even if they say 300 watts, during gaming it'll likely be ~225 watts too.
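A tiny illustration of why a 300-watt rating and a ~225-watt gaming average can both be true at once (the sample values below are made up purely for illustration):

# Hypothetical per-second power readings (watts) during a gaming session.
samples = [220, 228, 231, 225, 314, 223, 229, 226, 230, 224]  # one transient spike

average = sum(samples) / len(samples)
print(f"average: {average:.0f} W, peak: {max(samples)} W")  # average: 235 W, peak: 314 W
# A board-power rating describes the envelope the cooler and VRMs must
# handle (closer to the peak), not the typical sustained draw.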

Last edited by JRPGfan - on 12 January 2019

freebs2 said:

Mind you, I don't have anything against AMD, but PS and Xbox use their chips not because they're superior to Nvidia's but only because they're cheaper.

It's more that when the PS4 and Xbox One were in the planning stages, devs made it absolutely clear that they wanted the CPU and GPU to be part of the same chip. Intel's (integrated) GPUs are complete garbage, and Nvidia doesn't make x86 CPUs, so that left AMD as the only real option.



JRPGfan said:
 

About the power load:

It's usually 225+ watts and has short bursts where it can go up to 314 watts.

The Radeon VII will probably be the same; even if they say 300 watts, during gaming it'll likely be ~225 watts too.

Does anything about this card scream "hey, I consume less power than it seems!"? My, my.

Besides, anyone can do this peak-consumption stuff. Vega's particularly good at it.

Face it, the thing's dead on arrival. In fact, it is so dead AMD has no business calling it a "live stream", amirite?



[gif]

JRPGfan said:


The Radeon VII will probably be the same; even if they say 300 watts, during gaming it'll likely be ~225 watts too.

Why do you think that?

The Radeon VII will be more efficient than a Vega 64, but it will use the better efficiency to push more pixels and polygons than the Vega 64, not to deliver the same performance with less power consumption:



haxxiy said:
Does anything about this card scream "hey, I consume less power than it seems!"? My, my.

Funny gif :) well done editing job.

Actual card:



Conina said:
JRPGfan said:

The Radeon VII will probably be the same; even if they say 300 watts, during gaming it'll likely be ~225 watts too.

Why do you think that?

The Radeon VII will be more efficient than a Vega 64, but it will use the better efficiency to push more pixels and polygons than the Vega 64, not to deliver the same performance with less power consumption:

I think perf/watt will be massively improved with the Radeon VII compared to the RX Vega 64.

Look at what AMD did with their new Ryzen chips.
They beat out the Intel 9900K (which is 180W+) and do so at 130 watts.

They just had a massive improvement in perf/watt (for their CPU line), and at the same time finally managed to beat Intel in CPU performance.
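For what it's worth, the GPU perf/watt argument can be put in rough numbers. The peak TFLOPs and board-power figures below are approximate public specs; real-world perf/watt depends on actual game clocks:

# Paper-spec efficiency: peak TFLOPs / board power, in GFLOPs per watt.
cards = {
    "RX Vega 64": (12.7, 295),  # (peak TFLOPs, board power in W)
    "Radeon VII": (13.8, 300),
    "RTX 2080":   (10.1, 215),
}
for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.0f} GFLOPs/W")
# RX Vega 64: 43, Radeon VII: 46, RTX 2080: 47

On this paper math the Radeon VII lands only a few percent ahead of the Vega 64, which fits Conina's point that the 7nm gains went into higher clocks and bandwidth rather than lower power, though actual gaming perf/watt can improve by more than the spec-sheet numbers suggest.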