
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

If anyone wants to have a visual orgasm:




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Pemalite said:
Is that a tower CPU cooler on that GPU? JFC.

Can't let Nvidia be the only one to innovate cooling solutions.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
JEMC said:
There's been a lot of rumors regarding the performance of Big Navi, but not a single one about yield problems.

But it doesn't really have to be a very big chip. The Navi 10 chip of the 5700 series is 251 mm², double that and it would still be smaller than the Ampere chip of the 3080. Of course, it's not that simple and the added ray tracing hardware will also take more space.

Yea, that's true. It's just annoying how late it is. If you have a competitive product, what's the reasoning for not stealing your competitor's hype? They had almost a month to prepare since Nvidia did that countdown, and they've had 3-4 weeks since the event to come out with something that gives people a reason to wait for Big Navi. Instead, they're like: go ahead Nvidia, do your thing, I'll just showcase my product a month later. Have all dem sales.

It's not like they even need to commit to a release date. They could have shown it off in September, said it's releasing in November/December, and people would have been fine with that if it's that good.

Well, to be fair, there's nothing AMD could do to stop Nvidia from selling the first batches of their cards. There's demand for them, and even if AMD had a better product, Nvidia's brand power would be enough to still sell them. So why not wait until you have a finished product to launch it? You also get the benefit of making some final tweaks to the BIOSes and the price, ending up with a more competitive product.

By the way, and talking about selling cards, looks like cryptominers are at it again:

Hold your breath: GeForce RTX 3080 Might Become Hard to get - Cryptocurrency Miners Again
https://www.guru3d.com/news-story/hold-your-breath-geforce-rtx-3080-might-become-hard-to-get-cryptocurrency-miners-again.html
The first of the soon-to-be-released RTX 30 series graphics cards is, of course, the GeForce RTX 3080, and something very ominous is happening in Asia; miners have already obtained many cards, and they want them badly, as the mining compute capacity of 115 MH/s is triple that of the RTX 2080.

In the photos below, originating from the Chinese forum Baidu, there are users who actually managed to get hold of several cards (before launch). It's staggering to see, as the photos show RTX 3080 cards piled up in stacks. The fact that the cards have already been sold is weird, but more worrying is that a new mining craze might ignite shortages, which would drive availability for consumers down and prices up.

As you can see in the screenshot, when mining Ethereum the RTX 3080 achieves a compute rate of 115 MH/s, nearly three times that of an RTX 2080, which is capable of 35 to 40 megahashes per second. No values for energy consumption are given, but the 3080 seems theoretically more efficient if one contrasts the TDP of both cards with the hash rate. Mining cryptocurrency in Asia is quite profitable, given the relatively low energy costs.

(Can't post the pics, sorry)

With the increase in compute power that Ampere brings, this time around it could be the Nvidia cards that are the most affected. Or not; I know nothing about crypto mining other than that it's bad for our wallets.
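If you want to sanity-check the article's numbers, here's a quick back-of-the-envelope in Python. The hash rates are the ones quoted above; the 320 W and 215 W TDP figures are the Founders Edition specs (my addition, not from the article), and actual power draw while mining will differ:

```python
# Rough Ethereum mining efficiency comparison based on the article's
# hash rates. TDP figures are the Founders Edition specs; real mining
# power draw (especially with tuned clocks) will differ.
cards = {
    "RTX 3080": {"hashrate_mhs": 115.0, "tdp_w": 320.0},
    "RTX 2080": {"hashrate_mhs": 37.5, "tdp_w": 215.0},  # midpoint of 35-40 MH/s
}

for name, c in cards.items():
    efficiency = c["hashrate_mhs"] / c["tdp_w"]  # MH/s per watt
    print(f"{name}: {c['hashrate_mhs']:.0f} MH/s at {c['tdp_w']:.0f} W "
          f"-> {efficiency:.2f} MH/s per watt")

# Raw speedup: 115 / 37.5 ~ 3.1x, matching the "triple" claim, while
# per-watt efficiency only roughly doubles (~0.36 vs ~0.17 MH/s per watt).
print(f"Speedup: {cards['RTX 3080']['hashrate_mhs'] / cards['RTX 2080']['hashrate_mhs']:.1f}x")
```

So on these assumptions the raw hash rate triples, but the efficiency per watt only roughly doubles.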

Pemalite said:
Is that a tower CPU cooler on that GPU? JFC.

From what I've read in several places, using a CPU cooler on engineering boards isn't uncommon.

Captain_Yuri said:

If anyone wants to have a visual orgasm:

That's cheating. Everything is ten times better (or less bad) with that song.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

So now we know why AMD is waiting. Just let the miners buy up all the Nvidia cards and then be the only card available.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.


JFC, that can't be real. Look at how fucking dusty that thing is.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

Ok, I'm past the first day with my new TV. Verdict: Pretty cool.
At least for now I can say that it was probably not wasted money, especially once I can get some new games running on it. Not sure if I would call it mind blowing but it's definitely within my high expectations.

A few caveats. That optical illusion of a bulging screen is not gone yet. I have to assume that it's an optical illusion, because it wouldn't make sense for the display to have actually warped rows of pixels. The weird thing is I can see it from most angles, but not every angle. It definitely has to have something to do with me staring at a curved display for 5 years. I think you can compare it to the motion aftereffect illusion, where your eyes are so used to motion that they try to apply it to static things.

I actually got a headache yesterday from the extreme contrast. When people talk about HDR and OLED and say "infinite" contrast ratio they are not kidding. I usually love high contrasts on everything because of my terrible eyesight but this was too much. I had to turn down the brightness a notch. Those extremely bright whites on extremely dark blacks are so aggressive. I guess it highlights how bad my previous TV with FALD actually was. The best FALD can't do shit when you want to have a high contrast between adjacent pixels. So yeah, no halo around my mouse pointer anymore. I should add that my living room is constantly dimly lit or completely dark, so the bright whites are especially noticeable. I cannot imagine anyone having an issue with the brightness of the TV unless the sun is blasting directly at it.
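As an aside, the "infinite" isn't really hyperbole; it's the contrast-ratio formula hitting a zero denominator. Contrast ratio is peak white luminance divided by black luminance, and an OLED pixel that's switched off emits essentially nothing. A tiny sketch with illustrative numbers of my own (assumptions, not measurements):

```python
# Contrast ratio = peak white luminance / black luminance (in nits).
# The luminance values below are illustrative assumptions, not measurements.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Return the contrast ratio; 'infinite' when blacks emit nothing."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

print(contrast_ratio(700, 0.05))  # FALD LCD with ~0.05-nit blacks -> 14000.0
print(contrast_ratio(700, 0.0))   # OLED with the pixel fully off  -> inf
```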

The TV works well at 1440p. Neither Nvidia nor Windows gave me any issues. What I can definitely confirm is that HDR will not work when you mirror across 2 displays if one display has no HDR; it works fine when you extend instead. What I'm not sure about is HDR + Gsync. Nvidia says Gsync is enabled, Windows says HDR is enabled, and the TV also says HDR is enabled. Digging a bit through the internet, it seems it is actually possible on some displays to have Gsync + HDR without Gsync Ultimate. I'll be able to properly test this once I get my new PC. For now I'll only be gaming at 1440p on the TV.

I will now set up my new side monitor to see how much my 1050ti likes to power two 4k screens.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

kirby007 said:
JFC thats cant be real, look at how fucking dusty that thing is

Yeah, I noticed the dusty Thermaltake PSU. Another suspicious thing, at least to me, is that it's in a regular PC case. For testing, and for easy access to the card, an open bench like the ones reviewers use would make more sense.

vivster said:

Ok, I'm past the first day with my new TV. Verdict: Pretty cool.

*snip*

I will now set up my new side monitor to see how much my 1050ti likes to power two 4k screens.

Good to see that you're happy with your purchase so far. I agree that the warped feeling is mostly due to you having used a curved display for so long, and that it should go away rather soon.

If you pay attention, I'm sure you'll be able to hear that poor 1050Ti crying for mercy.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

The 1050ti seems to handle it pretty well.

Fair warning: really dim room, therefore terrible picture. The perspective is also misleading; the TV is farther away, and it's actually 4 times as big as the monitor.

Before this week I did not actually have a side monitor, so this is new. And it's also incredibly useful.

Spoiler!


If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Looks amazing




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850