
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
Captain_Yuri said:

Well, I was referring to the RX Vega that's coming out eventually, not the Vega that's already out.

I know, and those are the cards I'm talking about.

If Vega is a success among miners and they buy lots of them, the gaming market share will only shift in favor of Nvidia (because Nvidia's cards aren't as productive for mining as AMD's), and that will make the gap between AMD and Nvidia that we see in Conina's Steam stats grow more and more. And the same will happen with the data that publishers and developers collect through their games when they scan our PCs to find the best setup for the game on our machine (because I think they send that data to the publisher to know, for example, what kind of optimisations/fixes should be given priority).

So yes, if Vega is a success among miners, AMD will make money from their cards as they'll sell all they have, but that will hurt them in the long term, because pubs/devs will focus more and more on optimising their games for Nvidia's hardware, as that's what their data tells them most gamers use.

Nah. If Vega doesn't offer great performance for great value, I doubt the share would increase much at all, tbh. I get what you're saying, but would it even increase that much on Steam's hardware survey if miners weren't a thing and Vega still had meh-worthy performance/value? At least this way, they might make money.

And it's not like failing now means they can't come back if they manage to find some magical pixie dust like they did with Ryzen. We're seeing that just because a company has been dominant doesn't mean the tide can't change. The problem with GPUs, though, is that Nvidia hasn't exactly been slacking like Intel has, and their GPUs do gain noteworthy performance increases from gen to gen, so for AMD it will be much harder.

All I'm saying is this: if Vega really does compete with a 1080/1070 and gets priced a bit lower, then most people would view it as a disappointment and it might not even sell that much, even if miners weren't buying GPUs. But since miners are buying GPUs, it might sell a lot regardless of its gaming performance. Yeah, its Steam stats might not go up very much, but really... assuming the performance isn't as great as it was hyped up to be, would it even gain much market share anyway? I doubt it, since Nvidia can just lower the price of their 1080s/1070s and throw in a game while doing so.

Hopefully Navi will bring AMD back to the board but who knows at this point.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
JEMC said:

I know, and those are the cards I'm talking about.

If Vega is a success among miners and they buy lots of them, the gaming market share will only shift in favor of Nvidia (because Nvidia's cards aren't as productive for mining as AMD's), and that will make the gap between AMD and Nvidia that we see in Conina's Steam stats grow more and more. And the same will happen with the data that publishers and developers collect through their games when they scan our PCs to find the best setup for the game on our machine (because I think they send that data to the publisher to know, for example, what kind of optimisations/fixes should be given priority).

So yes, if Vega is a success among miners, AMD will make money from their cards as they'll sell all they have, but that will hurt them in the long term, because pubs/devs will focus more and more on optimising their games for Nvidia's hardware, as that's what their data tells them most gamers use.

Nah. If Vega doesn't offer great performance for great value, I doubt the share would increase much at all, tbh. I get what you're saying, but would it even increase that much on Steam's hardware survey if miners weren't a thing and Vega still had meh-worthy performance/value? At least this way, they might make money.

And it's not like failing now means they can't come back if they manage to find some magical pixie dust like they did with Ryzen. We're seeing that just because a company has been dominant doesn't mean the tide can't change. The problem with GPUs, though, is that Nvidia hasn't exactly been slacking like Intel has, and their GPUs do gain noteworthy performance increases from gen to gen, so for AMD it will be much harder.

All I'm saying is this: if Vega really does compete with a 1080/1070 and gets priced a bit lower, then most people would view it as a disappointment and it might not even sell that much, even if miners weren't buying GPUs. But since miners are buying GPUs, it might sell a lot regardless of its gaming performance. Yeah, its Steam stats might not go up very much, but really... assuming the performance isn't as great as it was hyped up to be, would it even gain much market share anyway? I doubt it, since Nvidia can just lower the price of their 1080s/1070s and throw in a game while doing so.

Hopefully Navi will bring AMD back to the board but who knows at this point.

Well, of course Nvidia isn't slacking like Intel. They've found ways to use GPUs in science that reward pushing the boundaries of the hardware, unlike with CPUs, which most of the time are already "good enough".

As for Vega competing with Nvidia's 1070/1080, well, that wouldn't surprise me, to be honest. Ever since the first demos using Doom, AMD has compared Vega with the 1080, showing similar results. If someone out there expects Vega 10 to compete with or even beat the 1080 Ti, then the problem doesn't lie with AMD, but with them.

That said, we'll see. Workstation-class products don't perform as well as their gaming counterparts, so the final product could still end up beating the 1080.

And AMD needs to get those cards to gamers, because it's not about gaining ground in market share, it's about trying to minimize the losses they're taking, and will keep taking, in the short term.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Nvidia might be moving to Multi-Chip-Module GPU design
http://www.guru3d.com/news-story/nvidia-might-be-moving-to-multi-chip-module-gpu-design.html
With Moore's law becoming harder to sustain each year, technology is bound to change. At some point it will be impossible to shrink transistors any further, hence companies like Nvidia are already thinking about new methodologies and technologies to adapt. Meet the Multi-Chip-Module GPU design.

Nvidia published a paper that shows how they can connect multiple parts (GPU modules) with an interconnect. According to the research, this will allow for bigger GPUs with more processing power. Not only will it help tackle the common problems, it would also be cheaper to achieve, as fabbing four dies that you then connect is cheaper than making one huge monolithic design.

Thinking about it, AMD is doing exactly this with its Threadripper and EPYC processors, where it basically connects two to four Summit Ridge (Zen) dies with a wide link, Infinity Fabric (they use 64 PCIe lanes per link, out of the 128 available).

Using a GPU with four GPU modules as an example, the researchers recommend three architecture optimizations that minimize the loss from data communication between the different modules. According to the paper, the loss in performance compared to a monolithic single-die chip would be merely 10%.

Of course, when you think about it, SLI is in essence already a similar methodology (if not technology); however, as you know, it can be rather inefficient and challenging in terms of scaling and compatibility. The paper states this MCM design would perform 26.8% better than any comparable multi-GPU solution. If and when Nvidia will fab MCM-based chips is not known; for now this is just a paper on the topic. The fact that they published it, though, indicates it is bound to happen at some point.

 

This is the research paper for those who want to give it a look, even if it doesn't say anything that isn't in the article: http://research.nvidia.com/publication/2017-06_MCM-GPU%3A-Multi-Chip-Module-GPUs
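To put the paper's two headline numbers side by side, here's a tiny back-of-the-envelope sketch. Only the 10% MCM penalty and the 26.8% multi-GPU advantage come from the article; the performance units and everything else are made up purely for illustration:

```python
# Illustrative model of the figures quoted above. Besides the paper's
# 10% penalty and 26.8% advantage, all numbers are assumptions.

def mcm_performance(monolithic_perf: float, mcm_penalty: float = 0.10) -> float:
    """Effective performance of an MCM package that loses `mcm_penalty`
    of a comparable monolithic die's throughput to inter-module traffic."""
    return monolithic_perf * (1.0 - mcm_penalty)

monolithic = 100.0                  # arbitrary units for one huge single die
mcm = mcm_performance(monolithic)   # paper: ~10% behind monolithic -> 90.0

# The paper also claims the MCM design beats a conventional multi-GPU
# (SLI-style) setup by 26.8%, which implies that baseline sits at:
multi_gpu = mcm / 1.268             # ~71.0 in the same units

print(f"monolithic: {monolithic:.1f}, MCM: {mcm:.1f}, multi-GPU: {multi_gpu:.1f}")
```

So, taking the paper at face value, four stitched-together modules would land much closer to an impossible-to-build monolithic die than to an SLI pair.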

I think it's interesting, and it could be the same thing AMD will do with Navi, given the roadmap they revealed last year with the tag "scalability".




Lol. What was that about "stitching together processors"? ^^ I know that was Intel, but it's kinda hilarious.

I'll be looking forward to 2035 when the first games start to officially support multi GPU architectures.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

fatslob-:O said:

Intel's investment is over a two-year period whereas Samsung's is over a one-year period, so not only is Samsung spending twice as much as Intel over the same period, but this doesn't factor in IBM and GF R&D, so Samsung has even more capital to draw from ...

It's great that Intel has started to produce ICs for other chip designers, but it's too little, too late; their days as the leader in transistor technology are over ...


Intel has already invested billions in that plant. And that's just Fab 42.

https://en.wikipedia.org/wiki/List_of_semiconductor_fabrication_plants

Intel is clearly still ahead.


fatslob-:O said:

It wasn't AMD alone who topped Intel with the K8; they had help from Microsoft standardizing AMD64 in their Windows ABI ...

By the time 64-bit OSes, software and games started to gain traction, a lot of users had transitioned away from AMD's K8. Thus its 64-bit extensions often went unused.

fatslob-:O said:

They need to look to Microsoft again so that the DirectX spec changes once more, if they want to be even remotely competitive with Nvidia. AMD needs to seriously streamline more of their graphics hardware extensions into official optional features for DirectX ...

I agree.

 

fatslob-:O said:

The only way I can see Vega competing against Pascal is with Shader Model 6 games, and there are none of them so far. It's just another lost generation for AMD at this point ...

Vega was AMD's "catch-up" part. It wasn't meant to beat nVidia. If AMD wanted to be competitive with Vega, it needed to launch 12 months ago, not a month or two from now.

fatslob-:O said:

The biggest leverage AMD has so far against Nvidia is that Microsoft is on their side so they should use that partnership that they have with Xbox to extend into changing the DirectX spec ...

nVidia is locking in gamers with its proprietary technology. That's an uphill battle to beat.
If you buy a G-Sync monitor, chances are you'll only be buying nVidia GPUs for the life of that monitor.

vivster said:
My PC woes continue, as it just shut off yesterday while I was watching a stream. Yet again it didn't boot up and was stuck before Windows could load. At least this time, after a while, I could conclude that the reason it wasn't booting was USB. Whenever it was stuck, I just removed all USB cables and it booted instantly.
Some months ago one of the USB ports even caused a bluescreen just by plugging something into it.

I'd heard of X99 chipsets and their garbage USB, but now I get to live it.

As for the sudden crash yesterday I don't know what could have caused it. I looked at the event viewer and there was nothing. It just shut off. I don't want to think it's the PSU or CPU again but what else could cause this?

My x79 has been rock solid. (Wish it would actually die so I could have an excuse to upgrade!)

The USB issue seems to be more common than you would think. I have an AMD AM3+ rig and a Socket 775 rig, both hate one of my external drives. They instantly hang when I plug it in. On my x79 board though, it's problem free.

fatslob-:O said:

EUV will have low yields? LOL, wut is this? EUV is just a change to the scanner; if anything, EUV should have better yields because of its higher-resolution imaging ...

EUV should mean less patterning, so it should reduce costs.
Intel is going to forgo EUV, retain higher patterning, and then transition to EUV later. Basically, they don't wish to spend big on that technology now, but later, and will just put up with higher chip fabrication costs in the shorter term.
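The "less patterning means lower cost" point can be made concrete with a toy comparison. Every cost figure below is an assumption for illustration only (real per-exposure costs and pass counts vary by layer and node):

```python
# Rough illustration of the patterning trade-off described above.
# All cost figures here are assumptions, not industry data.

def layer_cost(passes: int, cost_per_pass: float) -> float:
    """Lithography cost for one critical layer patterned in `passes` exposures."""
    return passes * cost_per_pass

# Quadruple patterning with cheaper DUV exposures vs. one pricier EUV exposure:
duv_quad = layer_cost(passes=4, cost_per_pass=1.0)    # 4 DUV exposures
euv_single = layer_cost(passes=1, cost_per_pass=3.0)  # 1 EUV pass, assumed ~3x dearer

print(f"DUV quad-patterning: {duv_quad}, single EUV exposure: {euv_single}")
```

With these made-up numbers the single EUV pass still comes out cheaper per layer, which is the argument in the post; if the EUV exposure cost more than four DUV passes combined, Intel's wait-and-see approach would be the cheaper one.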

Captain_Yuri said:

RAM almost 99% of the time comes with a free lifetime warranty

The lifetime warranty is often a farce anyway. :P
It's for the "lifetime" of the product, i.e. for as long as it's a product sold on the market.

So your DDR3/DDR2 kits with a lifetime warranty? Yeah. That's probably over and done with now. :P

Thankfully, here in Australia, if we were to buy a kit and the "lifetime" warranty ended up only being a month, Australian law would back the consumer. For most goods you can expect 12+ months out of them. Minimum.


Captain_Yuri said:

So all the recent Vega news continues to be disappointing, considering their current Vega GPU sits between a 1070 and a 1080. Granted, this is a workstation-series card, but man...

I expected Vega to drop between the 1080 and 1080 Ti. Landing between the 1070 and 1080 makes JEMC right, I think? (Please don't tell him!) Haha. (I can't win 'em all! :P)

JEMC said:

Taking back 10.4% of the market in just one quarter is... too good to be true? I mean, Ryzen CPUs are proving to be worth purchasing, but gaining so much ground is very, very surprising.

The Enthusiast circles have gone nuts over them. I am not surprised at all.

Steam's statistics are a little bit skewed towards DIY users though.

Ka-pi96 said:
How hard is it to change a graphics card? Before, when my computer needed an upgrade, I've just bought a brand new one. I have no idea how to do any upgrade stuff, so... is it hard?

A monkey can do it.



Killy_Vorkosigan said:
My Steam games have been installed on a 500GB SSD since 2012; worth every penny. I won't ever put my games on an HDD.

I have *all* my games on a 4-terabyte drive, with only the few I play most often on the SSD. Works great.


JEMC said:

I think it's interesting, and it could be the same thing AMD will do with Navi, when they revealed that roadmap last year with the tag "scalability"

AMD was going down this path years ago with their "small die" strategy, where they would use two smaller, highly efficient chips on a card to beat a bigger one. And it actually started to work out for them.



--::{PC Gaming Master Race}::--


Let's go with the news:

 

SALES & "SALES"/DEALS

The original, '90s version of Shadow Warrior is free on GOG right now
http://www.pcgamer.com/the-original-90s-version-of-shadow-warrior-is-free-on-gog-right-now/
Hear that sound? (...) It's the free game klaxon.
This time it's heralding the free-ness of Shadow Warrior Classic. It's the old 1990s Shadow Warrior, so you'll need some patience for the way FPS games used to be, but even if you have no intention of playing it, at least it's free.
The offer linked above is on GOG, which bundles in all applicable add-ons and expansions. It's also free on Steam, though without the aforementioned bonuses.

 

Get Batman: Arkham Knight for 75 percent off today
http://www.pcgamer.com/get-batman-arkham-knight-for-75-percent-off-today/
It's been a couple of years since the last entry in the Arkham series of Batman games. Andy's review says it's not quite as good as the older ones, but it's certainly still worth a look, especially today, as you can get it for 75 percent off over at Bundle Stars.

 

SOFTWARE

Download: GeForce 384.80 Hotfix driver
http://www.guru3d.com/news-story/download-geforce-384-80-hotfix-driver.html
You can now download the GeForce Game Ready 384.80 Hotfix driver released by NVIDIA. This driver fixes a Watch Dogs 2 crash on startup and resolves stutters in VR Simultaneous Multi-Projection apps.

>>Contrary to the usual download links, this time you can get it from Guru3D.

 

MODS/EMULATORS

OpenIV team pulls the plug on the 'Liberty City in GTA V' mod
http://www.pcgamer.com/openiv-team-pulls-the-plug-on-the-liberty-city-in-gta-v-mod/
"The development of OpenIV will be continued as before. OpenIV never supported GTA Online modding and will not support it in the future. Our work will be continued within the Rockstar modding policy," it continued. "Unfortunately, our highly anticipated mod 'Liberty City in GTA V' will not be released because it clearly contradicts with Rockstar modding policy. Liberty City mod is a big loss for us, since it was a huge part of our motivation to push OpenIV functionality."

 

GAMING NEWS

After almost four months, and almost 500K units sold, NieR: Automata has not received a single patch
http://www.dsogaming.com/news/after-almost-four-months-and-selling-almost-500k-units-nier-automata-has-not-received-any-single-patch/
When Square Enix announced that NieR: Automata would be released on the PC, a lot of PC gamers got excited. Normally, the PC version would be, by default, the game's definitive version; however, NieR: Automata was plagued by a lot of issues on PC. Fast forward almost four months, and here we are today without any patch to address the reported issues.

>>That's something vivster already knows.

 

SEGA and Creative Assembly announce A Total War Saga
http://www.dsogaming.com/news/sega-and-creative-assembly-announce-a-total-war-saga/
SEGA and Creative Assembly have shared initial plans for a new class of historical Total War game that will launch under the badge of "A Total War Saga". This new spin-off series will arrive before the next major historical release in the franchise, promising a busy time for historical strategy fans.




The rest of the news:

 

New Overwatch hero Doomfist teased by Blizzard
http://www.pcgamer.com/new-overwatch-hero-doomfist/
Blizzard has released a teaser hinting at new details about Doomfist, the super-powered villain expected to be Overwatch's next playable hero. In an in-universe blog post today, the Times of Numbani revealed that the Talon organization targeted a maximum security prison in order to free one man: "Akande Ogundimu, better known as Doomfist."

 

Activision aims to improve the 'cadence' of Destiny 2 DLC
http://www.pcgamer.com/activision-aims-to-improve-the-cadence-of-destiny-2-dlc/
Destiny is a pretty popular game. But Activision Publishing CEO Eric Hirshberg isn't entirely happy with it. Specifically, he thinks the "cadence" of post-release content has been a disappointment for players. As a result, he said in an interview with GamesIndustry that Activision is taking a more aggressive approach to DLC creation for the sequel.

 

Aztez, a 2D beat-em-up strategy game set in ancient Mexico, is coming in August
http://www.pcgamer.com/aztez-a-2d-beat-em-up-strategy-game-set-in-ancient-mexico-is-coming-in-august/
I got my first look at Aztez way back in 2013, and I was quite taken by it. It promised a mix of turn-based strategy and brutal, bloody 2D brawls—an odd blend, yes—rendered in a stark monochromatic visual style based on 16th-century Mexico and Central America. But after that brief preview, it completely fell off the radar, and I didn't hear another word about it—until, surprisingly, last night, when the two-man studio Team Colorblind announced that it's (almost) finished and will be out on August 1.

 

Digital Extremes reveals Warframe drop rates
http://www.pcgamer.com/digital-extremes-reveals-warframe-drop-rates/
Publishers including Blizzard, Riot, and Perfect World have in recent months revealed the drop rates for their games in order to comply with Chinese law. But none have gone to the extreme, you might say, of Warframe developer Digital Extremes, which has posted an incredibly detailed list of the drop rates for what appears to be nearly every single item in the game.

 

Lords of the Fallen 2 team has been downsized and its scope reduced
http://www.pcgamer.com/lords-of-the-fallen-2-team-has-been-downsized-and-its-scope-reduced/
Speaking to Eurogamer, CI Games' Tomasz Gop – who worked on the first game and was working on the new one – revealed he has been let go from the studio. "I was let go because of a reduction in team, in scope, in budget, in business approach," he said.

 

Kentucky Route Zero continues string of weird videos, promises 'busy summer yet'
http://www.pcgamer.com/kentucky-route-zero-continues-string-of-weird-videos-promises-busy-summer-yet/
Cardboard Computer remains ever tight-lipped about when we can expect KRZ's elusive fifth and final act, but has spent the past year teasing weird game-related, art-inspired videos via its WEVP-TV website.

 

League of Legends RP faces 20 percent price hike in UK following Brexit
http://www.pcgamer.com/league-of-legends-rp-cost-faces-20-percent-hike-in-uk-following-brexit/
Riot has announced that League of Legends players residing in the UK will face a 20 percent price rise on RP (Riot Points) following the country's decision to part ways with the EU.
As of 11.59pm BST on July 25, the new prices will come into play—changes which Riot says are a direct result of a weakened pound (GBP) following last year's Brexit vote.




Chazore, if you still haven't bought the RAM for your next PC, you may want to do it soon:

 

Micron Fab-2 accident might raise DRAM prices further still
http://techreport.com/news/32196/micron-fab-2-accident-might-raise-dram-prices-further-still
Folks looking to jump on a new system but lamenting the high price of memory might want to go ahead and take the plunge. Electronics supply and manufacturing news blog Evertiq is reporting that Micron's Fab-2 in Taiwan halted production at the beginning of this month following a failure in the facility's "nitrogen gas dispensing system."

The failure apparently led to the contamination of wafers and equipment in the facility. According to Evertiq, this will cut Micron Taiwan's production capacity from 125,000 wafer starts to around 60,000. Purportedly, that drop alone makes up around 5.5% of the global DRAM supply for this month, which means the parts that were already in short supply likely aren't getting any cheaper soon.
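The quoted figures can be sanity-checked with a quick bit of arithmetic: if losing roughly 65,000 wafer starts is ~5.5% of the world's DRAM output for the month, that implies a global figure we can back out (treating Evertiq's numbers as given):

```python
# Sanity check on the figures quoted above: Fab-2 drops from 125,000 to
# ~60,000 wafer starts, and that drop is said to be ~5.5% of global supply.

lost_wafers = 125_000 - 60_000   # 65,000 wafer starts lost this month
share_of_global = 0.055          # per Evertiq's estimate

implied_global = lost_wafers / share_of_global
print(f"Implied global DRAM output: ~{implied_global:,.0f} wafer starts/month")
```

That works out to roughly 1.18 million wafer starts per month worldwide, which at least shows the two quoted numbers are internally consistent.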

According to Evertiq, Micron Taiwan (also known as Inotera) is the primary supplier of the LPDDR4 packages that go in the iPhone. The site posits that the delay could affect iPhone shipments along with the PC and server markets that the fab services. Hopefully, Micron can get Fab-2 up and running again soon, because paying for RAM at a higher price per gigabyte now than I did in 2012 is getting old real fast.




Pemalite said:

Intel has already invested billions in that plant. And that's just Fab 42.

https://en.wikipedia.org/wiki/List_of_semiconductor_fabrication_plants

Intel is clearly still ahead.

It's still not in production yet ... (And it won't be up until 2020/2021) 

That's $2.3 billion (Intel) vs ~$3.5 billion (Samsung) per year but Samsung doesn't only rely on itself since there's a symbiotic relationship with IBM and GF ... 

I wouldn't be so certain who is ahead anymore, when it's not clear that Intel will be the first to transition to new scanner technology and a new transistor structure ... (At this point it's starting to look like Intel will lose the EUV race, and it also looks like Intel won't be the first to transition to a new transistor structure, since they'll stick with FinFETs for 7nm while Samsung plans to introduce their MBCFET (multi-bridge-channel FET) technology the year after.)

Pemalite said:

By the time 64-bit OSes, software and games started to gain traction, a lot of users had transitioned away from AMD's K8. Thus its 64-bit extensions often went unused.

It wasn't big at the time for desktop users, but it was a big thing in servers; it was the biggest contributor to how AMD broke the monopoly Intel had on x86 servers at the time ...

Pemalite said:

Vega was AMD's "catch-up" part. It wasn't meant to beat nVidia. If AMD wanted to be competitive with Vega, it needed to launch 12 months ago, not a month or two from now.

nVidia is locking in gamers with its proprietary technology. That's an uphill battle to beat.

If you buy a G-Sync monitor, chances are you'll only be buying nVidia GPUs for the life of that monitor.

Die size says otherwise ... (GP102 at 471mm^2 vs Vega 10 at 484mm^2) 

And while Nvidia may be trying to lock in customers with their proprietary technology, they can't exactly ignore industry standards like HDMI 2.1, which will include adaptive refresh rates in its spec. Nvidia could decide to ignore HDMI 2.1 and keep charging extra for G-Sync, but it would be dumb to have to pay extra on top of HDMI 2.1 for the same functionality you could get without paying anything more. Hopefully Nvidia realizes it's dumb to scam their customers like that and drops G-Sync for good in favour of HDMI 2.1 ...

Pemalite said:

EUV should mean less patterning, so it should reduce costs.

Intel is going to forgo EUV, retain higher patterning, and then transition to EUV later. Basically, they don't wish to spend big on that technology now, but later, and will just put up with higher chip fabrication costs in the shorter term.

Just means that Intel won't have the best performance then ... 

While Samsung might have to pay more for using ASML's NXE:3400B systems than Intel will for its successor, they can at least claim the crown in transistor technology ...



fatslob-:O said:
Pemalite said:

Intel has already invested billions in that plant already. That's just Fab 42.

https://en.wikipedia.org/wiki/List_of_semiconductor_fabrication_plants

Intel clearly still ahead. 

It's still not in production yet ... (And it won't be up until 2020/2021) 

That doesn't mean they haven't already invested billions into that fab... They halted its construction, remember.


fatslob-:O said:

It wasn't big at the time for desktop users but it was a big thing in servers, it was the biggest contributor to how AMD destroyed the monopoly Intel had on x86 servers at the time ...

Can't argue with that.
But servers aren't exactly the stupidly-high-volume segment that other markets are. They do have higher profit margins, though.

fatslob-:O said:

Die size says otherwise ... (GP102 at 471mm^2 vs Vega 10 at 484mm^2)

We both know how far behind AMD is on GPU efficiency right now.
Their chips are slower, consume more energy, and are larger.

Vega was AMD's catch-up part. Polaris was a "stop gap" measure.

The Radeon Group was made into its own separate group; there was a lot of disruption. Hopefully after Navi, AMD switches into a more competitive gear.

****

Also, something interesting about Vega: with its clocks normalized to Fury's, it doesn't seem to have gained much in terms of efficiency.
You would expect some big efficiency gains from the draw-stream binning rasterizer, improved colour compression, caches, improved geometry, ROPs, etc.

So either AMD hasn't enabled the technology in its drivers yet, or Vega has some bottlenecks at play.

http://www.gamersnexus.net/guides/2977-vega-fe-vs-fury-x-at-same-clocks-ipc
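The clock-for-clock methodology behind tests like the one linked above is simple to sketch: lock both cards to the same core clock and compare performance per MHz. The frame rates below are made-up placeholders, not GamersNexus' data:

```python
# Sketch of a clock-normalized ("IPC-style") GPU comparison.
# The fps values are hypothetical placeholders for illustration.

def perf_per_clock(fps: float, clock_mhz: float) -> float:
    """Average frames per second delivered per MHz of core clock."""
    return fps / clock_mhz

# Both cards locked to the same core clock, as in the linked test:
fury_x = perf_per_clock(fps=60.0, clock_mhz=1050.0)
vega_fe = perf_per_clock(fps=61.0, clock_mhz=1050.0)

gain = (vega_fe / fury_x - 1.0) * 100.0
print(f"Clock-normalized gain: {gain:+.1f}%")
```

A near-zero gain at matched clocks is exactly the "no architectural efficiency improvement" result the post describes; a working binning rasterizer should show up here as a clearly positive number.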






I would have liked them to include Frostbite so we could get a taste of an extremely popular deferred renderer, but these are still some interesting data points.


