Arkaign said:

Did you wake up today and feel like being a disagreeable person just for the sake of being argumentative and assumptive?


It is how I wake up every day. Due to walls in my premises... I can only get out of bed on the wrong side.


Arkaign said:

You make a bunch of replies based on your opinions (900p is horrible, 1080p is dated, etc). Your opinion is valid for you, but sitting way back at a buddy's house looking at a 60" TV, SW Battlefront doesn't look terrible by any means. Your own chart pretty much says this for a 60" TV @ 10'.

And yet, the chart doesn't tell the entire story. You still see benefits from increased resolution; I can go in-depth and technical once again if you so desire.
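To give a rough idea of what such viewing-distance charts are built on, here is a minimal pixels-per-degree sketch (Python; the screen size, distance, and resolutions are illustrative assumptions, not figures taken from the chart itself). The common ~60 px/deg "20/20 vision" cut-off is only a baseline: high-contrast edges, aliasing, and fine text can still reveal differences above it, which is part of why the chart doesn't tell the whole story.

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, distance_ft):
    """Rough horizontal pixels-per-degree for a 16:9 panel viewed head-on."""
    # Screen width derived from the diagonal (16:9 aspect ratio assumed).
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    distance_in = distance_ft * 12
    # Horizontal field of view the screen occupies, in degrees.
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horiz_px / fov_deg

# 60" TV at 10 ft, as in the example above.
for name, horiz_px in [("900p", 1600), ("1080p", 1920), ("2160p", 3840)]:
    print(name, round(pixels_per_degree(60, horiz_px, 10), 1), "px/deg")
```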

Arkaign said:

On your 4K comments, derp, I've SEEN 4K because I've done it with 1070s on a 43" 60hz setup at close range. I simply preferred 1440p @ 100+FPS on Gsync. The smoothness was much better to me than the resolution bump. You basically started an argument on this point to basically agree with me (eg; give me a high refresh 4K/5K whenever that happens, but for now 1440P/100+ >> 2160P/60, and ESPECIALLY 2160/30). It was weirdly located as well because I was quite clearly referring to consoles pushing 4K, when we know to expect a lot of 30fps games.

Far from it.
I didn't start an argument about that point. You probably misconstrued my intention; I merely added my opinion. Take it as you will, but you obviously read it as hostile, which is far from my original intent.

Arkaign said:

Then you make assumptions on why I got the 480, I gifted my 390 to a friend whose card died and I needed a drop-in for an HTPC that rarely sees games other than for guests, not an upgrade. Unfortunately the 480 was a whiny, coil whine mess that had basically zilch for OC.

No. I didn't make any assumptions about why you bought the Radeon 480; I couldn't care less why you bought it. You could use it as a doorstop or as some exotic type of freaky toilet paper.
I merely stated that you need to re-align your expectations for that card relative to AMD's prior cards; it's not a replacement for AMD's high-end offerings.

I am of the firm belief that there is no such thing as a bad card, only a bad price. Coil whine is also not an AMD issue; it's a manufacturer issue, and nVidia cards can exhibit the same problem.

Arkaign said:

Then you make some weird argumentative things about an insanely hypothetical speclist. Of course the human brain doesn't process visual data in 'frames per second', but higher FPS appears smoother, this is quite obvious. What's the point in really arguing about a hypothetical 1024-bit bus, when speaking of a vaporware system with 2TB of memory?

Because you brought it up?

I think you don't understand the point of a forum.
A forum is for people to share and express themselves in writing... but it also allows other people to reply to and break down those expressions.
If that is something you do not agree with, perhaps you need to think about why you are here?
Anyway, no need for the attacks.

Arkaign said:
And where exactly did I state specifically that clock speed was the only thing that determined performance? Uarch, IPC, scaling, all of this has a ton to do with what the end result would be. An 8Ghz Intel Atom gen1 would be 8Ghz trash. An 8Ghz Jaguar would be better, but not ideal. An 8Ghz Haswell would be pretty damned good. Would you have preferred that I state some particular basis for the core design on a hypothetical level? Fine, Custom 8Ghz Icelake Quad, 64-Core on the BIG.little concept (on package 64 ARM-style cores to run OS/etc in the background without notably interfering with primary gaming threads). Until GPGPU gets more mature, AI in particular amongst other things does better on the CPU.

Glad we cleared that up.

Arkaign said:
I am well aware of core loading and how difficult it is to utilize multicore setups to the max, quite frequently 1 main core sees 90%+ usage while the others vary widely. Extremely talented devs with a great grasp of a particular setup can do more, as Naughty Dog did with TLOU on PS3 (some great videos on the making of it showing just how well they loaded the Cell's SPEs).

That's not the point.
You see, there are two ways to approach CPU development.

You can spend transistors to make the CPU wider by throwing more cores at the problem... or you can spend transistors to bolster the performance of fewer cores.
CPUs tend to work on highly serialized tasks that benefit from a few high-performing cores rather than lots of cores, so it makes sense to be conservative with core counts (Intel could build a 100-core CPU if it wanted to...) and focus on fewer, higher-performing cores.

Then you move the highly parallel stuff to a chip that is designed to handle it, like a GPU.
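As a quick illustration of why piling on cores gives diminishing returns for largely serial workloads, here is a minimal Amdahl's-law sketch (the 40% serial fraction is a made-up figure for illustration, not a measurement of any real game):

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on speedup when a fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical workload where 40% of the frame time is inherently serial.
for cores in (2, 4, 8, 16, 100):
    print(cores, "cores ->", round(amdahl_speedup(0.4, cores), 2), "x")
```

Even at 100 cores the speedup stalls below 2.5x in that example, which is why a faster single core often buys you more than a wider one.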

As for Naughty Dog, TLOU, and the PlayStation 3: I could go in depth on its technical underpinnings, but I'll just state that the game looks as good as it does thanks to very smart artistic assets and clever use of those assets.

The PlayStation 3 is also not representative of the hardware we have today; there have been several paradigm shifts since the PS3's launch a decade ago. GPUs are far more powerful, far more flexible, and far more programmable, so there is no need for Cell-styled CPU architectures. (Cell wasn't really a good all-round CPU anyway.)

Arkaign said:
My whole post was meant to be a light-hearted observation and what-if kind of offhand thing, not an aspergers episode. I've worked in IT for over a quarter century, and I wouldn't think to bore the crap out of everyone going into excessive detail about something so utterly meaningless.

As a carer... making fun of the disabled, ill, or frail is not on. Don't do it.

And you have obviously misconstrued my intent as hostile; it wasn't. Lighten up.



--::{PC Gaming Master Race}::--