
 

Poll: Zarx changed his avatar again. Thoughts?

Noice - 248 votes (61.39%)
So soon? I just got used to the last one - 14 votes (3.47%)
it sucks - 22 votes (5.45%)
Your cropping skills are lacking - 13 votes (3.22%)
Too noisy, can't tell WTF it even is - 14 votes (3.47%)
Meh - 32 votes (7.92%)

Total: 343
Captain_Yuri said:

Samsung Unveils Odyssey Neo G7 & G8 4K Mini-LED Gaming Monitors: Up To 240 Hz Freesync Premium Pro Support

https://wccftech.com/samsung-reveals-odyssey-neo-g7-g8-4k-mini-led-gaming-monitors-amd-freesync-premium-pro/

"1196 dimming zones locally, NVIDIA G-Sync compatibility and AMD FreeSync Premium Pro certification, 2,000 nits of brightness. While the Samsung Odyssey Neo G7 does not have a release date or pricing, the Odyssey Neo G8 does have a launch date for June of this year and will sell with an MSRP of $1,500."

The prices are certainly much better than last year but still a tad too expensive. The biggest issue is that it's a curved screen. Ultrawide + Curve, you can deal with. 32 inch 16x9 + curve is pretty cringe.

DF Direct Weekly #62: Last of Us Remake PS5, Hitman 3 RT Is Heavy, Starfield Delayed!

According to DF, the reason Hitman's ray tracing is so demanding is because it was supposed to showcase Arc's ray tracing features, and supposedly Arc has more advanced hardware ray tracing than Ampere. But Intel is having the opposite problem to Turing: Turing launched with hardware but no ray tracing games, whereas Arc is launching with games but no hardware.

Ngl, really want a Neo G8. Yeah, the curved 16:9 is meme-worthy and not great for productivity, though it's decent for entertainment and casual use. Sadly this looks to be the only decent offering at this size and at a kind of reasonable price rn, though there are others getting into the mini-LED market. Also disappointing that the Neo G7 doesn't support HDMI 2.1 and is even downgraded to 165Hz.

About Arc, it's pretty much DOA seeing as there is no firm date for a launch outside Asia. Crazy how they've got partners on board and they can't even get the product out lol. A bit embarrassing tbh.

Talking about monitors, Asus has announced its first 500Hz monitor.

https://www.theverge.com/2022/5/24/23139263/asus-500hz-nvidia-g-sync-gaming-monitor-display-computex-2022

Albeit it runs on a 24" 1080p screen. It also raises the question of how many frames or Hz our eyes can actually perceive, considering the limitations of LCD motion handling and tech.
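For a rough sense of why more Hz can still matter on an LCD, here's a quick back-of-the-envelope sketch of sample-and-hold motion blur; the object speed is just a made-up number for illustration.

# Rough sketch: on a sample-and-hold display each frame stays on screen for
# the whole refresh interval, so an object your eye is tracking smears across
# roughly (speed in pixels/second) / (refresh rate in Hz) pixels.

def hold_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate perceived blur width in pixels on a sample-and-hold panel."""
    return speed_px_per_s / refresh_hz

speed = 1920  # assumption: an object crossing a 1080p screen in one second
for hz in (60, 144, 240, 360, 500):
    print(f"{hz:>3} Hz -> ~{hold_blur_px(speed, hz):.1f} px of smear")

# Each jump in refresh rate shrinks the smear proportionally, but pixel
# response times and how well the eye tracks motion eventually dominate.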



hinch said:

Ngl, really want a Neo G8. Yeah, the curved 16:9 is meme-worthy and not great for productivity... but this looks to be the only decent offering at this size and at a kind of reasonable price. Also disappointing that the Neo G7 doesn't support HDMI 2.1 and is even downgraded to 165Hz. About Arc, it's pretty much DOA seeing as there is no firm date. Crazy how they've got partners on board and they can't even get the product out lol. A bit embarrassing tbh.

Talking about monitors, Asus has announced its first 500Hz monitor.

https://www.theverge.com/2022/5/24/23139263/asus-500hz-nvidia-g-sync-gaming-monitor-display-computex-2022

Albeit it runs on a 1080p screen. It also raises the question of how many frames or Hz our eyes can actually perceive, considering the limitations of LCD motion handling and tech.

"This panel also uses a new esports TN"

I am kind of interested in this monitor that just got announced:

MSI MEG 342C QD-OLED with 34″ Ultrawide Panel and 175Hz Refresh Rate

https://tftcentral.co.uk/news/msi-meg-342c-qd-oled-with-34-ultrawide-panel-and-175hz-refresh-rate

Unlike the Alienware QD-OLED, this one doesn't have the G-Sync module, so potentially it could have the HDMI 2.1 that people are looking for. The rest of the specs match the Alienware afaik. The only urghh thing is that it's highly unlikely MSI will offer the burn-in warranty that Alienware does, but still, if priced right, it could be a good alternative.



             

Anime: Haruhi | Nsfw Anime Thread

Yeah I mean serious competitive esports guys who only care for gaming may want it, but can't imagine anyone else wanting a TN panel in 2022 lol.

And oo, that looks good. Most likely using the same Sammy panel as the Alienware. Hopefully it has HDMI 2.1; that should be standard in all high-end monitors 1440p and above. The other extras sound pretty decent as well, like the ambient light sensor, smartphone control and built-in KVM switch. It will be silent too, as opposed to the AW.

Not sure on pricing, but iirc last year's high-end MSI monitors were on the pricier side of the scale. Though these panels (so far) look like they may make for way more affordable products than last year's high-end tech using mini-LEDs. If they price this at under $1,400 I'd call that a W.

Last edited by hinch - on 24 May 2022

What I don't understand is why all the QD-OLED monitors are 34" ultrawides. Why don't they use it for 16:9 ones, which are more popular?



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

What I don't understand is why all the QD-OLED monitors are 34" ultrawides. Why don't they use it for 16:9 ones, which are more popular?

From what I gather from HDTVtest, these 34" panels are cut from the same motherglass as the ones they use in their commercial TV panels, much like LG does with its WOLEDs. I think they would have to retool their equipment and process to produce other sizes.




Also, since all QD-OLED monitor panels are sourced from Samsung and Samsung is only making 34-inch ultrawides, all QD-OLED monitors are 34-inch ultrawides.

If I had to guess, the yields are probably better than going 4K 16:9 due to the lower PPI as well, and since this is their first generation, they didn't want to take too much of a risk. Even their QD-OLED TV sizes are limited to 55-65 inch. Most likely we will see 16:9 QD-OLED monitors next year as the yields improve drastically.
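To put a quick number on the PPI point, here's a tiny back-of-the-envelope check (using the usual 3440x1440 and 3840x2160 resolutions; only the pixel-density math is being illustrated):

# Quick check on pixel density: smaller pixels are generally harder to
# yield, and 4K at 32" is noticeably denser than 3440x1440 at 34".
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'34" 3440x1440 ultrawide: ~{ppi(3440, 1440, 34):.0f} PPI')
print(f'32" 3840x2160 (4K 16:9): ~{ppi(3840, 2160, 32):.0f} PPI')
print(f'55" 3840x2160 (4K TV):   ~{ppi(3840, 2160, 55):.0f} PPI')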



             

Anime: Haruhi | Nsfw Anime Thread

Captain_Yuri said:
JEMC said:

Yeah, that's their whole point. With that said, I hope it backfires and a lot of people will ask them for the repair kit because that doesn't look cheap, especially with shipping.

Captain_Yuri said:

That would certainly be an interesting idea, but idk if AMD will go that route anytime soon since they could sell you a dGPU instead. But maybe later down the line, that would be pretty nuts.

Given the leaked diagram for an Asus board, I doubt AMD will go that route as well.

https://videocardz.com/newz/leaked-asus-x670-prime-pcb-diagram-for-amd-ryzen-7000-cpus-confirms-this-motherboard-has-it-two-chipsets

I expected the two chipsets to be together in the same package, not split up and placed so far away from each other.

Captain_Yuri said:

I just hope those chipsets don't require active cooling like the initial X570 boards. It is also very strange having two chipsets, which suggests AMD is trying to add in additional I/O capabilities but isn't integrating them directly into the CPU. It will be interesting to see how it turns out.

I think they are going to bifurcate PCI-E. PCI-E 4.0 unless you have that secondary chip which enables PCI-E 5.0 links.
I think one chipset will have PCI-E 5.0 for the SSD or GPU, but not both at the same time if I recall.

Not exactly the most elegant of solutions.

JEMC said:

I hadn't thought about the 4 vs. 8 core design of Zen2 and Zen3, thanks for bringing that up.

Should mean an uptick in multi-threaded scenarios that extend past a single CCX, as there is less of a latency/bandwidth hit from cross-CCX communication.

Bofferbrauer2 said:

From what I heard, Zen5 would be a much bigger upgrade, with an architecture overhaul. Zen4 seems more like a stopgap in between Zen3 and Zen5 since not much changes on the CPU side apart from the larger L2 cache.

For all intents and purposes, AMD was just rolling out a new platform and new technologies with Zen4 rather than reinventing the performance wheel: DDR5, PCI-E 5.0, new socket and chipset designs, a new I/O chip (no longer built on 14nm).
Every chip also becomes an APU now, which would have some positive ramifications for workstations, and hopefully for video encode/decode if AMD doesn't castrate that.

I'll make the upgrade, mostly because I want more RAM; 128GB isn't enough for what I am trying to do... And an SSD that exceeds the limits of PCI-E 4.0... Sounds moist.

Last edited by Pemalite - on 24 May 2022

--::{PC Gaming Master Race}::--

hinch said:

From what I gather from HDTVtest, these 34" panels are cut from the same motherglass as the ones they use in their commercial TV panels, much like LG does with its WOLEDs. I think they would have to retool their equipment and process to produce other sizes.

So those 34" panels are leftovers from the OLED sheets used for TVs?  Makes sense.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pemalite said:

I think they are going to bifurcate PCI-E. PCI-E 4.0 unless you have that secondary chip which enables PCI-E 5.0 links.
I think one chipset will have PCI-E 5.0 for the SSD or GPU, but not both at the same time if I recall.

Not exactly the most elegant of solutions.

Certainly would be interesting if that's the reason although I'd prefer just having as many lanes as possible going directly to the CPU instead of going through a chipset.



             

Anime: Haruhi | Nsfw Anime Thread

Captain_Yuri said:

Certainly would be interesting if that's the reason although I'd prefer just having as many lanes as possible going directly to the CPU instead of going through a chipset.

Routing a lot of PCI-E lanes requires a lot of motherboard traces and layers.

And the other issue PCI-E is starting to suffer from is signal attenuation: the faster you make a PCI-E lane, the tighter the tolerances get... so we need repeaters/chipsets closer to the ports themselves.

Conversely... not all PCI-E communication is between the CPU and PCI-E devices anyway, so there is less of a need to push all those lanes onto the CPU.
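As a rough illustration of why the signal-integrity problem gets worse each generation, here's a toy reach estimate. The loss-budget and per-inch loss numbers are ballpark assumptions for the sake of the arithmetic, not figures from the PCI-SIG spec or any particular board:

# Toy sketch: rough trace length you can afford before needing a retimer,
# given a total channel loss budget and a per-inch trace loss. All numbers
# below are assumptions for illustration only.

def max_reach_inches(loss_budget_db: float, loss_per_inch_db: float,
                     connector_loss_db: float = 4.0) -> float:
    """Crude estimate: (budget - connector/via losses) / per-inch trace loss."""
    return (loss_budget_db - connector_loss_db) / loss_per_inch_db

# Assumed ballpark figures: each generation roughly doubles the signalling
# rate, and board-material loss grows with frequency.
scenarios = {
    "PCI-E 4.0-ish": (28.0, 1.0),   # assumed budget in dB, assumed dB/inch
    "PCI-E 5.0-ish": (36.0, 2.0),
}

for name, (budget, per_inch) in scenarios.items():
    print(f"{name}: ~{max_reach_inches(budget, per_inch):.0f} inches of trace")

# Even with a bigger budget, doubling the per-inch loss shrinks usable reach,
# which is why retimers or a chipset sitting closer to the slot start to make sense.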



--::{PC Gaming Master Race}::--