
Forums - PC Discussion - GTX 1080 unveiled; 9 teraflops

Zappykins said:
eva01beserk said:

No worries here, since consoles perform better within the same power caps. Unless they start adopting revisions every couple of years and screw up the fixed hardware for at least 5 years that consoles have always had. I'm way too worried and excited at the same time for the PS4 Neo.

I think Neo will do very little. Maybe on a few exclusives, but otherwise why would developers spend any time on something that will have a small market?

With another shrink or two, I could see something like this, or maybe a bit more powerful, for the next gen consoles. But they may go to step upgrades, so who knows. Like how PCs are right now, where you can choose your graphics, details, textures, frame rate, etc.

I hope so too. I would want Neo to just be the same APU on 14nm, which will drive energy consumption and price down some. Maybe have full Mantle and DirectX 12 support. Something slightly better for slightly cheaper. Then, in the 2 years remaining, release the full PS5/Xbox 2. 5 years has to be the limit for how short a generation lasts. I can see them then going with at least 5 teraflops next gen. That could be 4K for consoles.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

sc94597 said:
torok said:

I wouldn't say that, and even devs have already said that consoles usually pack more punch than a similar GPU on a regular PC. Third party titles just aren't that much into extreme optimization and such, because it won't generate any kind of financial return. It's not like making a shooter that looks way better than CoD will make you magically outsell it, so why spend more money making a technical masterpiece?

First party studios don't have to worry that much about money, since their titles aren't there just to sell well, but also to be technical showcases that make nice looking trailers to move hardware. Trying to make a showcase AND make money at the same time is risky. Crytek almost went down doing it and they are now focusing on F2P. Epic already said that one of the reasons they sold Gears is that the kind of game they would have to make now would cost 100+ million, and that was too much of a risk.

I would say that. On a GPU-bound game like The Witcher 3, the performance is equivalent between a 750 Ti and a PS4. On CPU-heavy games, the performance is in favor of the PC.

First parties have been mostly irrelevant since the 7th generation started.

What are you comparing relevancy with? Just the GPU, and putting in any CPU? Any RAM? SSD or HDD? I don't want to bring price into this again, but price is the only way to really compare the two, because everything else is not quantifiable in the same way for consoles and PC.

And people who say exclusives don't matter are only kidding themselves. They might not push hardware as much as before, but exclusives do matter, mainly to push the consoles to their limits. Otherwise third party devs will coast on the same year-one games for the rest of the gen.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

Can't wait to see how AMD answers.



Danman27 said:
Can't wait to see how AMD answers.

They won't.

AMD has stated that they aren't targeting the "high end" sector but instead they are focusing on the "mainstream" one. With luck, expect the best AMD card to compete with the GTX 1070.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pascal looks as great as Maxwell did. I won't jump on the bandwagon since my 970 is doing great and I plan to keep it for some time. If you aren't on Maxwell or just want extra power for 4K, then Pascal is a no-brainer.

I actually believe their performance numbers, even if some think they aren't being realistic. Compare Maxwell with GPUs 2 years older and you will see similar gains. Performance is great, price is right. Instant success. One question that remains is whether we will see, like with the 9xx series, a price/performance problem with the 1080. The 980 was a great card, but it wasn't enough better than the 970 to be worth its price. When the 980 Ti arrived, I think the 980 became just a bad deal, trapped between the amazing price/performance ratio of the 970 and the pure power of the Ti model. If you want excellent bang for your buck, the 1070 will be the way to go. If you want more, just hold on for the 1080 Ti.
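To put rough numbers on that "trapped in the middle" point, here's a toy bang-for-buck comparison. The launch MSRPs are the commonly cited ones; the relative-performance index is only a ballpark guess for illustration, not a benchmark:

# Toy bang-for-buck comparison (illustrative numbers only).
cards = {
    #  name         (USD launch MSRP, rough performance index, 970 = 100)
    "GTX 970":     (329, 100),
    "GTX 980":     (549, 117),
    "GTX 980 Ti":  (649, 148),
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price:.3f} performance points per dollar")

# With these ballpark figures the 980 trails both the 970 (value) and
# the 980 Ti (raw power and even value), which is the "bad deal" above.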

However, this isn't painting a good scenario for Team Red. I'm a big AMD fan (FX owner here), but after getting beaten badly by Intel (after the Pentium 4 vs Athlon years), they are starting to lose ground to Nvidia. I love Nvidia cards not only because they normally have fewer compatibility/driver issues, but also because of the extra goodies: ShadowPlay (even if a bit buggy), GeForce Experience, etc. They really make a difference.

I fear a world where AMD is no more and Intel and Nvidia each keep a monopoly of their respective markets and just screw us. The future of AMD is in the hands of Zen, Polaris and Vega. If they screw these up, I think they are done.

sc94597 said:


I would say that. On a GPU-bound game like The Witcher 3, the performance is equivalent between a 750 Ti and a PS4. On CPU-heavy games, the performance is in favor of the PC.

First parties have been mostly irrelevant since the 7th generation started.

A game like this depends on the amount of time/money spent on making it run better. You can't use one gem to claim that the platforms don't perform differently. In a similar way, I could use Arkham Knight to claim that the PS4 beats a 970, something it surely does not. A Witcher 3 video only shows that the PS4 version of W3 runs similarly to the PC version on a 750 Ti, and that's it. In other games you would get other results, because it depends on each port.

It's only reasonable to assume that a closed platform with a closer-to-the-metal API will perform better than similar horsepower on an open platform with more abstract APIs. If your argument were valid, which isn't the case, we could claim that DX12 is useless because it basically tries to bring less abstract calls to DX. If they were already equal, why would MS bother to spend their money developing such APIs? And why would devs spend more time/money working with a less abstract (harder to develop for) variant of an API if they won't get anything in return?
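As a rough sketch of that argument, here's a toy model of per-draw-call CPU overhead. The per-call costs are invented purely for illustration, not measurements of any real API:

# Toy sketch: how per-draw-call CPU overhead caps the number of draws
# that fit into a 16.7 ms (60 fps) frame budget.
FRAME_BUDGET_MS = 1000.0 / 60.0      # ~16.7 ms per frame at 60 fps

apis = {
    "abstract API (high per-call overhead)":  0.020,  # ms of CPU work per draw call (made up)
    "close-to-metal API (low overhead)":      0.004,  # made up
}

for name, cost_ms in apis.items():
    max_draws = int(FRAME_BUDGET_MS / cost_ms)
    print(f"{name}: ~{max_draws} draw calls per frame before the CPU becomes the bottleneck")

# Same GPU, same frame budget: cutting per-call overhead lets the engine
# issue several times more draw calls, which is the whole pitch of
# lower-level APIs.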

You are also incorrect in assuming that the 8th gen changed anything regarding console APIs. You are assuming that they are PCs now, so they should perform similarly. The 7th gen consoles were already closer to PCs, since they used regular GPUs instead of the custom solutions found on previous gens. Changing the CPUs from PPC to x86 actually doesn't make a single difference for anyone except the guys writing the compiler (or actually, just adapting an existing one). CPU optimization is more about how many cores it has, how fast they are, how the cache architecture works, and other characteristics that aren't exactly ISA-dependent.
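If you want a toy illustration of why access patterns (a cache question) matter more than the instruction set, here's a quick sketch; the exact ratio will vary by machine:

import time
import numpy as np

# Same data, same arithmetic; only the memory-access order changes.
n = 10_000_000
data = np.arange(n, dtype=np.float32)
sequential_idx = np.arange(n)            # cache-friendly order
random_idx = np.random.permutation(n)    # cache-hostile order

def timed_sum(indices):
    start = time.perf_counter()
    total = data[indices].sum()          # gather in the given order, then add
    return total, time.perf_counter() - start

_, t_seq = timed_sum(sequential_idx)
_, t_rand = timed_sum(random_idx)
print(f"sequential access: {t_seq:.3f} s, random access: {t_rand:.3f} s")
# The random-order gather is typically several times slower, mostly from
# cache misses -- and that holds on any CPU architecture, x86 or PPC.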

Of course first party titles are relevant, since we are discussing technical aspects. If you are talking about it from a sales perspective, we could forget visuals, because CoD and GTA aren't exactly technical powerhouses. But even in that case, exclusives like Gears, Halo and Uncharted sell pretty well and are relevant games. It's important to bring them into the discussion, since we are trying to verify how much optimization can raise the bar. Just on PS4, games like Driveclub, The Order and Uncharted show how far it can go.

I understand that you have your opinion, but professional developers say otherwise, and their word is much closer to fact than either of us could get. One of the Metro devs put it at roughly a 2x advantage, which is about what you'd expect. The interview is pretty good and detailed, and worth a read just to see the differences between gens and the daily struggle that one of the most competent developers we have faces with budgetary issues (check it here: http://www.eurogamer.net/articles/digitalfoundry-2014-metro-redux-what-its-really-like-to-make-a-multi-platform-game).

Anyway, I think we are waaaaay off-topic and starting to derail the thread (my bad, actually).



JEMC said:

They won't.

AMD has stated that they aren't targeting the "high end" sector but instead they are focusing on the "mainstream" one. With luck, expect the best AMD card to compete with the GTX 1070.

They will, but only in 2017 with Vega. Polaris is more budget-conscious. Anyway, this is just dumb. Giving up the high-end segment didn't help them on CPUs, so why would it be a good strategy for GPUs?

Also, Fury X with 4GB VRAM and a 4K/VR focus was plain stupid.



torok said:
JEMC said:

They won't.

AMD has stated that they aren't targeting the "high end" sector but instead they are focusing on the "mainstream" one. With luck, expect the best AMD card to compete with the GTX 1070.

They will, but only in 2017 with Vega. Polaris is more budget-conscious. Anyway, this is just dumb. Giving up the high-end segment didn't help them on CPUs, so why would it be a good strategy for GPUs?

Also, Fury X with 4GB VRAM and a 4K/VR focus was plain stupid.

In 2017 Nvidia will have the full Pascal chip, and things won't change much.

And the last time AMD did something like this, with the HD 4000 and 5000 series, it worked well for them and increased their market share, which is exactly why AMD is doing it now.

After all, most sales come from the sub $300 segment.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Peh said:
Teeqoz said:

2k=1080p

4k=2160p

 

So 1080p on a 2k monitor is the native resolution

Technically, yes. But I thought he meant something higher than 1080p like 2560x1440. 

Yeah, I meant 1080p. I know 2560x1440 or any odd resolution will look blurry, but since 1080p is exactly 1/4th of 4K, it should look very sharp, yes?
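Quick sanity check on that math (standard 4K UHD vs 1080p):

uhd = (3840, 2160)
fhd = (1920, 1080)

pixel_ratio = (uhd[0] * uhd[1]) / (fhd[0] * fhd[1])    # 4.0 -> exactly a quarter of the pixels
scale_x, scale_y = uhd[0] / fhd[0], uhd[1] / fhd[1]    # 2.0 and 2.0 -> whole-number scaling

print(pixel_ratio, scale_x, scale_y)
# Each 1080p pixel maps to a clean 2x2 block of 4K pixels, so upscaling
# needs no interpolation and stays sharp; 1440p -> 4K is a 1.5x scale
# per axis, which is why it tends to look blurry.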



CGI-Quality said:
vivster said:

You're right, I forgot to account for the heavily increased clock. So it's like 50% then, 60 if we're generous. That's still not enough of an improvement over the Titan X to be able to play 4K/60 across the board. Games aren't getting any less demanding either.

Then tell us: what is enough to play at 4K/60fps, as a constant? The performance hits on the 1080 alone will be far less than previous generations. So, I'm assuming you know the tech specs needed to cover such a feat.

If it takes a smaller performance hit at high resolutions it might be, but I'll believe that when I see benchmarks.
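For reference, here's the back-of-envelope paper-spec math behind that "like 50%" figure, using the commonly cited core counts and rated boost clocks (real-world clocks and game performance will differ):

# Paper FP32 throughput = 2 FLOPs (fused multiply-add) x cores x clock.
def tflops(cuda_cores, boost_clock_ghz):
    return 2 * cuda_cores * boost_clock_ghz / 1000.0   # GFLOPs -> TFLOPs

titan_x_maxwell = tflops(3072, 1.075)   # ~6.6 TFLOPs
gtx_1080        = tflops(2560, 1.733)   # ~8.9 TFLOPs (the "9 teraflops" headline)

print(f"Titan X (Maxwell): {titan_x_maxwell:.1f} TFLOPs")
print(f"GTX 1080:          {gtx_1080:.1f} TFLOPs")
print(f"Paper gain:        {100 * (gtx_1080 / titan_x_maxwell - 1):.0f}%")
# Roughly a third more on rated spec-sheet clocks; the bigger numbers
# people quote assume Pascal boosting well past its rated clock.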



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

JEMC said:

In 2017 Nvidia will have the full Pascal chip, and things won't change much.

And the last time AMD did something like this, with the HD 4000 and 5000 series, it worked well for them and increased their market share, which is exactly why AMD is doing it now.

After all, most sales come from the sub $300 segment.

I'm hoping that, with Vega, they will bring a GPU to compete with the 1080 Ti and the next Titan, so they would only be giving Nvidia a free pass this year.

Of course most sales come from sub-$300 GPUs, but they have to be careful. When they were rocking with the 5000 series, Nvidia didn't have a 330-360 dollar GPU like the 970/1070 that actually delivers serious performance. Those cards are beasts, so AMD has to at least match them. The 970 was a success and AMD fought it well with the 290X/390X.

Well, that kind of proves your point, actually. The 970 was a much more important GPU than the 980/980 Ti and Titan X. Most people don't need more; it's really where the money is. Anyway, I think you need a flagship to avoid the image of being an "inferior" company. The FX CPUs are reasonably competitive with the i5s, but people see AMD as crap because they don't have anything to fight the i7s.