dahuman said:
|
You'd be surprised about Intel Quick Sync though; it has the fastest fixed-function solution while also having better quality than those dedicated GPUs lol. I guess that's an epic fail on Nvidia and AMD's part.
fatslob-:O said:
You'd be surprised about Intel Quick Sync though; it has the fastest fixed-function solution while also having better quality than those dedicated GPUs lol. I guess that's an epic fail on Nvidia and AMD's part. |
That will change once x265 is in full working order; it's still young, but it has beastly potential. Also, read my edit lol.
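To make the comparison above concrete, here is a small sketch of how one might select between the encoders being discussed when driving ffmpeg. The encoder ids (`h264_qsv` for Quick Sync, `h264_nvenc` for NVIDIA, `libx265` for software HEVC) are real ffmpeg encoders; the file names, quality values, and the `encode_cmd` helper are hypothetical examples, and the commands are only built, not executed.

```python
# Sketch: assembling ffmpeg argument lists for the encoders discussed above.
# Encoder names are real ffmpeg encoder ids; file names and quality values
# are made-up examples. Nothing is actually executed here.

def encode_cmd(src, dst, encoder, quality=23):
    """Return an ffmpeg argument list for the chosen encoder (not executed)."""
    quality_flag = {
        "h264_qsv": ["-global_quality", str(quality)],   # Intel Quick Sync (fixed-function)
        "h264_nvenc": ["-cq", str(quality)],             # NVIDIA NVENC
        "libx265": ["-crf", str(quality)],               # software x265 / HEVC
    }[encoder]
    return ["ffmpeg", "-i", src, "-c:v", encoder, *quality_flag, dst]

qsv = encode_cmd("in.mkv", "out_qsv.mp4", "h264_qsv")
x265 = encode_cmd("in.mkv", "out_hevc.mp4", "libx265", quality=28)
```

The trade-off in the thread maps onto the third argument: the hardware paths are fast fixed-function blocks, while `libx265` burns CPU time for (potentially) better quality per bit as the encoder matures.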

dahuman said:
That will change once x265 is in full working order; it's still young, but it has beastly potential. Also, read my edit lol. |
Oh, my post wasn't asking about GCN being able to do crazy physics; no, that's where next-generation APUs like Kaveri, and maybe the PS4's, could come in handy. I was just saying that AMD's purpose in releasing GCN, on the graphics front rather than from a compute standpoint, was to show off forward rendering instead of deferred rendering (where AMD and Nvidia are pretty much on equal footing), and to gain an advantage in games such as Dirt Showdown, whose advanced lighting system moves back towards forward rendering, in order to pull the benchmarks in favor of AMD Radeon cards.
fatslob-:O said:
Oh, my post wasn't asking about GCN being able to do crazy physics; no, that's where next-generation APUs like Kaveri, and maybe the PS4's, could come in handy. I was just saying that AMD's purpose in releasing GCN, on the graphics front rather than from a compute standpoint, was to show off forward rendering instead of deferred rendering (where AMD and Nvidia are pretty much on equal footing), and to gain an advantage in games such as Dirt Showdown, whose advanced lighting system moves back towards forward rendering, in order to pull the benchmarks in favor of AMD Radeon cards. |
lol I just hope we can get to the point where hardware can handle ray tracing as easily as lifting a finger; then the discussion wouldn't even be needed.
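The forward-vs-deferred point in the exchange above comes down to where the lighting cost is paid. A toy cost model (all numbers hypothetical, not measurements of any real engine or GPU) can show why many dynamic lights historically pushed engines toward deferred shading:

```python
# Toy cost model (hypothetical numbers) for the forward-vs-deferred point above.
# Classic forward shading evaluates every light for every rasterized fragment
# of every object; deferred pays a G-buffer fill once, then shades per pixel.

def forward_cost(fragments, lights):
    # every fragment evaluates every light
    return fragments * lights

def deferred_cost(pixels, lights, gbuffer_writes=4):
    # one G-buffer fill (several render targets) + per-pixel lighting
    return pixels * gbuffer_writes + pixels * lights

pixels = 1920 * 1080            # roughly a 1080p frame
fragments = pixels * 3          # assume 3x overdraw in the forward case
forward_total = forward_cost(fragments, lights=64)
deferred_total = deferred_cost(pixels, lights=64)
```

With heavy overdraw and many lights, the forward total dwarfs the deferred one; the "advanced lighting" forward renderers mentioned in the thread (Dirt Showdown-style) claw that cost back by culling lights per screen tile in compute, which is where GCN's compute strength was argued to help.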

dahuman said:
lol I just hope we can get to the point where hardware can handle ray tracing as easily as lifting a finger; then the discussion wouldn't even be needed. |
Hey, I'm all for ray tracing, but unless Nvidia, AMD, and Microsoft support it, we aren't getting anywhere with things like the R2100 and R2500 RTUs (ray tracing units) from Caustic. Plus, those RTUs have pipelines dedicated to ray tracing, not rasterisation.
Edit: We could have had this for next generation, but nooooo, because the developers surveyed by Mark Cerny and Sony said it was too much work for them.
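For anyone following along, the core operation a ray-tracing pipeline accelerates is the ray-primitive intersection test. Here is a minimal, purely illustrative sketch of a ray-sphere test in Python; it has no relation to how the Caustic RTUs actually implement their hardware pipelines.

```python
import math

# Minimal sketch of the core operation ray-tracing hardware accelerates:
# a ray-sphere intersection test (solve the quadratic |o + t*d - c|^2 = r^2).

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# ray from the origin straight down +z toward a unit sphere centered at z=5
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # -> 4.0
```

A real tracer runs billions of these tests per frame against acceleration structures, which is exactly why dedicated pipelines (rather than rasterisation hardware) were being proposed.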
fatslob-:O said:
Hey, I'm all for ray tracing, but unless Nvidia, AMD, and Microsoft support it, we aren't getting anywhere with things like the R2100 and R2500 RTUs (ray tracing units) from Caustic. Plus, those RTUs have pipelines dedicated to ray tracing, not rasterisation. Edit: We could have had this for next generation, but nooooo, because the developers surveyed by Mark Cerny and Sony said it was too much work for them. |
It's more like: a lot of people will be out of jobs once it's a viable solution. Think on that one.

dahuman said:
|
Well then, I guess we can safely say that no games are going to push ray tracing :P Too bad though; it would have been sweet to see GPUs rendering some nice caustics.
DJEVOLVE said:
Sony beat MS, you could say, because of RROD. So don't act like the most known failure was not an issue. This is most likely the only reason Sony caught up. |
Wut...? O_o
I have literally never seen the RROD card played this way... Actually, quite the opposite: every time I see it mentioned, it is either something like "RROD proves that customers forget a mess of gigantic scale, since the X360 sold so well this gen" or (99% of the time) "LOL, the X360 only sold what it sold due to RROD; had it not happened, millions of Xboxes would not have been sold to replace all of the broken ones."
Not sure what led you to this very odd conclusion. I'm seriously amazed.
fatslob-:O said:
Once again your ignorance on the topic of hardware has taken over. Dude, go back to school and learn some actual coding. You realize that the word "tier" in this case stands for a feature that is part of the API, not levels of hardware support. Do you even know how PRTs/tiled resourcing work? Oh, and btw, the update on the hardware was to give it API compatibility, not to support those features. I'm surprised that you don't even know how an API works. |
I'm sorry, you're wrong. There are two distinctions in DirectX: the hardware feature level and the API support level. This is a fact.
The hardware does not need to be at the same feature level as the API in order to support features of the API. This is also a fact.
This is coming from AMD themselves. That is also a fact.
The Xenos processor in the Xbox 360 was a DirectX 9_0 hardware feature level compatible processor, but it still supported DirectX 10 API features. Not every DirectX feature requires hardware support.
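The feature-level-versus-API distinction described above is exactly how Direct3D device creation works: `D3D11CreateDevice` takes an ordered array of requested feature levels and hands back the highest one the hardware supports, while the application keeps calling the newer API. Here is a toy Python model of that negotiation; the numeric levels mirror real D3D feature-level constants, but the function itself is only an illustrative sketch, not the actual API.

```python
# Toy model of the distinction above: the Direct3D runtime (API) version is
# separate from the hardware feature level. D3D11CreateDevice-style
# negotiation returns the highest requested level the hardware supports.
# Level names mirror real D3D constants; the logic here is illustrative only.

FEATURE_LEVELS = ["9_1", "9_2", "9_3", "10_0", "10_1", "11_0", "11_1"]

def negotiate_feature_level(requested, hardware_max):
    """Return the highest requested level the hardware can satisfy, else None."""
    cap = FEATURE_LEVELS.index(hardware_max)
    supported = [lv for lv in requested if FEATURE_LEVELS.index(lv) <= cap]
    return max(supported, key=FEATURE_LEVELS.index) if supported else None

# A DX11 runtime can still drive feature-level 10_1 hardware: the app simply
# gets a lower feature level back while still programming against the DX11 API.
level = negotiate_feature_level(["11_1", "11_0", "10_0"], hardware_max="10_1")
```

This is the mechanism that lets an older part (the Xenos example above) sit below the API's newest feature level while still benefiting from runtime-side features that need no new hardware.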
By the way, I've coded in C++, C#, Visual Basic 6, VBScript, Java, JavaScript, Perl, SQL, and HTML. I've written, from scratch, software to take a server from bare metal to a fully configured and operational box.
What the heck have you done? Until you have something substantial to contribute to this discussion, please stop commenting!
Oh, and by the fucking way, I was on the President's List and Dean's List at my college with a 4.0 in Computer Science, and I can prove it to the moderators if they would like.
You tell me how Sony, who was way ahead of Microsoft by all accounts, would be able to implement DirectX 11.2 when Microsoft revealed features that even AMD wasn't aware of. Then try to explain to me why Microsoft, who had been working on DirectX 11.2, wouldn't have a GPU that fully supports the hardware feature set for DX11.2. Albert Penello asked nearly that same question.
GCN 1.0 GPUs do not support the DX11_1 feature set; GCN 1.1 GPUs do. AMD even stated specifically in the press release that they were proud to be the only GPU manufacturer offering a fully DX11.2-compatible GPU stack in a retail product. The nod was not to the Xbox One but to the Bonaire GPU, the only GCN 1.1 GPU.
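The "tier" language being argued over refers to how D3D11.2 reports optional features: the application queries them at runtime (in the spirit of `CheckFeatureSupport`) and gets back a support tier, separate from the feature level. The sketch below is a toy model only; the caps table is hypothetical and merely illustrates the post's (disputed in this very thread) GCN 1.0 versus GCN 1.1 claim, not actual hardware data.

```python
# Toy capability query in the spirit of D3D11.2's CheckFeatureSupport:
# optional features like tiled resources report a support *tier* rather than
# being implied by the feature level. The caps table below is hypothetical
# and only mirrors the post's (disputed) GCN 1.0 vs GCN 1.1 claim.

HYPOTHETICAL_CAPS = {
    "gcn_1_0": {"feature_level": "11_0", "tiled_resources_tier": 1},
    "gcn_1_1": {"feature_level": "11_1", "tiled_resources_tier": 2},
}

def check_feature_support(gpu, feature):
    """Return the reported tier for an optional feature (0 = unsupported)."""
    return HYPOTHETICAL_CAPS[gpu].get(feature, 0)

tier = check_feature_support("gcn_1_1", "tiled_resources_tier")
```

The point either way: whether a given GPU reports Tier 1 or Tier 2 is a runtime capability query, which is why the API version alone settles nothing in this argument.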
So once again, just be quiet.
You can try to insult me all you want. It won't do any good, because the fact of the matter is that the source material I've linked to multiple times, pointing at specific content, comes from solid sources. So if you want to go down this road, please do.
Adinnieken said:
I'm sorry, you're wrong. There are two distinctions in DirectX: the hardware feature level and the API support level. This is a fact. By the way, I've coded in C++, C#, Visual Basic 6, VBScript, Java, JavaScript, Perl, SQL, and HTML. I've written, from scratch, software to take a server from bare metal to a fully configured and operational box. |
@Bolded Source please ?
BTW, all GCN GPUs support DX11.1, and once again you have no knowledge of what you're talking about. http://en.wikipedia.org/wiki/Radeon_HD_7000_Series
Where's the proof that the Xbone's GPU is based on GCN 1.1? Once again, show me a source. I could easily say that it's based off Cape Verde, but I don't have any source to back that up.
Telling me to be quiet! That's hilarious coming from a man who doesn't know anything about hardware.