
Why Sony should also use a Cell Processor for PS5 (x86+Cell coprocessor)

bdbdbd said:
Pemalite said:

IGN... Or from someone who was active in the development itself? Hard decision that.

Besides. Even the IGN link you posted also reinforces the argument that there was no second Cell chip for graphics.

I'm under the impression that the PS3 was supposed to have one Cell processor and no GPU, because that way the system would have been cheap to manufacture; given the way Cell functions, it can work as a GPU. That way the system would have looked more like the PS4 when you compare CPU to GPU performance.

I already provided the evidence that Sony was actually developing a GPU in-house for the Playstation 3. That didn't pan out, of course; it likely came up short against the Xbox 360's Radeon-derived GPU, so they opted for an nVidia solution instead.

And although the Xbox 360 had a shit CPU, Microsoft did choose a great GPU for their console in 2005, which gave the Xbox 360 the legs to compete with the Playstation 3 all generation long.

Conina said:
- a Ryzen based CPU + a modern GPU is probably fast enough to emulate a PS3 (and PS Vita, PS2, PSP and PS1)

Not probably. It is. Playstation 3 emulation works surprisingly well on a Ryzen system.
Sony has more intimate low-level knowledge of Cell, so they would be able to make it far more efficient as well.


bdbdbd said:
klogg4 said:

You clearly don't have a clue how a central processing unit works...

A 3.2GHz GPU would be a beast today, as high-end GPUs run somewhere around 1.5GHz ATM.

Clockspeed is a balancing act.
We *could* have GPUs operating at 3GHz today, but that wouldn't be ideal... It does actually cost transistors to ensure a processing architecture can hit high clock rates... (You need to minimize leakage and such.)
And you do reach a point where you are better off just using a lower clockrate and taking the GPU wider to provide a better performance/power ratio.
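To put a rough number on that trade-off, here's a back-of-the-envelope sketch in Python. The figures are invented purely for illustration (they are not any real GPU), and it leans on the usual dynamic-power approximation (power scales roughly with capacitance x voltage^2 x frequency, and higher clocks tend to need higher voltage):

# Purely illustrative: wide-and-slow vs narrow-and-fast at the same peak throughput.
# Assumes dynamic power ~ C * V^2 * f, and that the higher clock needs a higher voltage.

def peak_gflops(shader_units, clock_ghz, flops_per_unit_per_clock=2):
    # Peak arithmetic throughput: units * clock * FLOPs issued per unit per clock.
    return shader_units * clock_ghz * flops_per_unit_per_clock

def dynamic_power_watts(shader_units, clock_ghz, voltage, cap_per_unit=0.05):
    # Very rough dynamic power model with a made-up capacitance constant.
    return shader_units * cap_per_unit * voltage ** 2 * clock_ghz

narrow = dict(shader_units=1024, clock_ghz=3.0, voltage=1.20)  # narrow and fast
wide   = dict(shader_units=2048, clock_ghz=1.5, voltage=0.95)  # wide and slow

for name, cfg in (("narrow/fast", narrow), ("wide/slow", wide)):
    gflops = peak_gflops(cfg["shader_units"], cfg["clock_ghz"])
    watts = dynamic_power_watts(**cfg)
    print(f"{name}: {gflops:.0f} GFLOPs at ~{watts:.0f} W -> {gflops / watts:.1f} GFLOPs per watt")

Same peak throughput on paper, but the wide/slow configuration wins on performance per watt, which is the point above.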

SegataSanshiro said:
Speaking of cloth simulation: can someone explain what the Dreamcast was doing in Dead or Alive 2, with what at the time seemed like cloth simulation? And on GameCube I remember a greater degree of it in Luigi's Mansion. Was that just warping?

Pretty sure they were actually using proper, albeit rudimentary, physics on the cloth and boobs in Dead or Alive 2 on the Dreamcast. - I could be wrong, and it was a very long time ago... But the reason they would have been able to get away with it is that it was a fighting game, which is relatively light on the CPU; there wasn't a ton of A.I. or scripting going on.

Luigi's Mansion I never played, so I would only be guessing. But from what I can tell it was using cloth physics and mesh/texture warping as cloth seems to have weight when draped over tables and such.

But these implementations were far far far more primitive than what we have today in games, which is entirely expected, but they achieved the results they wanted for the time.



--::{PC Gaming Master Race}::--


The Cell never dies, wow. The power of the Cell in a thread days before 2018. The Cell is from 2005.



Pemalite said: 
Ruler said:

It doesn't just work that way, that you can add in an extra core to a CPU. You either buy an affordable 8-core, or a super expensive 12- or 16-core, looking at Ryzen.

False.

Ahh yes, an 8-core is $300, 12- and 16-core are $800.

The Cell was never well received.

By who? Third parties who want to save money wherever they can? And yet they are still so greedy and put microtransactions into their latest games for this generation. Moral of the story: give them a Cell processor so they can think about that instead of lootboxes.

 

That is completely up to the developer and how the game is bottlenecked. If you are GPU bound, then doubling the GPU performance can mean the difference between 30fps and 60fps, regardless of the CPU being used.

Most developers aren't programming that way though, otherwise we already would have had 60fps games on consoles. There are a few games designed that way, like Knack 1 and Knack 2. The Evil Within 1 and 2 maybe too, I don't know.

If you think you have ample knowledge about Cell. Then please. Describe those "extra graphics" to me.
What effects are you talking about exactly?

 

Well, it would be up to the developers, as the Cell can render almost anything. Rendering shadows and real-time lighting would be a good use. Also calculating destructible environments.

 

You are not even making any sense.
You can have a GPU with more Gflops perform slower than a GPU with fewer Gflops. - Do you wish for me to provide some evidence for this like I have prior in other threads?
Because the GPU industry is littered with examples where this is the case.

Yeah, because the optimisation is bad. AMD has the most powerful hardware right now, with Vega 64 performing at 12 teraflops, but it's still struggling to outperform a GTX 1080 Ti with 10 teraflops.

The WiiU and Switch don't have good graphics.
That doesn't mean the games cannot have great artistic flair. Did you not watch the Digital Foundry breakdown of Zelda's imagery?
There were a ton of graphical sacrifices that were made. 

But because you desire evidence... Here you go.
Sub 720P resolution: http://www.eurogamer.net/articles/digitalfoundry-2017-zelda-breath-of-the-wild-uses-dynamic-resolution-scaling

Drops to 20fps were a thing: http://www.eurogamer.net/articles/2017-03-31-zelda-patch-improves-framerate-on-switch

Poor Texture Filtering: http://www.eurogamer.net/articles/digitalfoundry-2017-the-legend-of-zelda-breath-of-the-wild-face-off

Poor anti-aliasing; the draw distance on things like grass also leaves much to be desired; low-quality shadowing.

AA is a GPU thing, the CPU isn't the bottleneck for this game on the Wii U.

Yeah, but you know what the CPU is inside the Wii U? A 3-core 1.2GHz PowerPC, that's weaker than even the Xenon inside the Xbox 360. Yet look at the games that still came to the system this year.

Obviously the Wii U has other stuff, like more RAM and a faster GPU, but still, it's impressive for a 1.2GHz CPU. And you know most Nintendo games run at 60fps.

And? All CPUs can render. But if you think the rendered video you posted is somehow superior to what RSX or a modern CPU can give us... then you are kidding yourself.

Here is Unreal, which could run on a 300MHz Pentium 2. Think: worse than the original Xbox CPU.
 

Doesn't mean the Cell is great at rendering. (And if the image quality in the video you posted, with no physics, high-quality lighting, shadowing, particles, A.I. and so on, is anything to go by... Eww.)

 

It's a benchmark from 2005-2007, from the PS2 era. What have you expected? But there is even more:

https://www.youtube.com/watch?v=4aOk3IpomA4#t=2m01s

https://www.youtube.com/watch?v=404wWR5mrtU

https://www.youtube.com/watch?v=MRB-zQogLeY

Except, no.
Also... Ram typically has no processing capabilities, so it doesn't actually "speed" anything up.

I demand you provide evidence that Cell would provide the same or better performance than Jaguar when equipped with a powerful GPU and plentiful amount of Ram.

Because the games say otherwise:
Battlefield 4 on PS4 supports more than twice the players in a multiplayer match (64, vs 24 on PS3); that is something that is very much CPU driven, not memory or GPU.
See here: http://www.bfcentral.net/bf4/battlefield-4-ps3/
And here: https://battlelog.battlefield.com/bf4/news/view/bf4-launches-on-ps4-with-64-players/

Yes it does, it enables the highest speed for rendering if they need it. You know I can't give evidence for that, so why do you bring this up? It's not like I can buy a Cell and put a PC video card into it and try it out. But some current-gen remasters show that they are struggling with the Jaguar, as they still cap the framerate to 30fps, like in Skyrim, the Ezio Collection or Resident Evil Remastered on PS4.

And I am pretty sure there are more ports from last gen, and even completely new remasters like Crash Bandicoot, which can't escape 30fps. It's pretty evident at this point that the Jaguars don't make a difference.

Battlefield 4 is a multiplat on PS3, so it's not comparable at all. However, the PS3 did have an exclusive FPS game called MAG that allowed 256 players to play online at the same time. Sony even received an award from Guinness World Records for that.

 

I have PUBG on PC. I have 12 threads. I can assure you, PUBG utilizes them all.

First off, that is a $2000 CPU from Intel. Second, going from 88 to 141fps is only a ~60% increase despite adding 3 times the cores, so it's showing me that more cores for this game aren't making much of a difference. Third, this $2000 CPU you are showcasing is clocked at 3.3GHz, just like the Cell processor.

So this was not my whole argument; my argument was that this game would run better on a dual- or single-core processor clocked at more than 3GHz than on 8-core processors that are only clocked between 1.6GHz and 2.3GHz (PS4 to Xbox One X).

 

But you made the statement that XDR2 is the best. You were wrong.
Here, go brush up on your RAM tech; clearly your information is stuck a decade in the past.

https://en.wikipedia.org/wiki/High_Bandwidth_Memory
https://www.anandtech.com/show/9969/jedec-publishes-hbm2-specification
https://www.anandtech.com/show/9266/amd-hbm-deep-dive

Note: GDDR5X and GDDR6 hit 16Gbps per pin, which is in stark contrast to about 12Gbps per pin for XDR2.
https://www.anandtech.com/show/12186/micron-finishes-gddr6-internal-qualification

Not to mention, you can take DDR2 RAM and, with a wide enough bus, make it faster than GDDR5 or XDR2 anyway.

 

All of these RAM types aren't available in 2019 and probably cost more than XDR2 RAM. Provide evidence that DDR2 can be faster than GDDR5 or XDR2 if you say it.

 

I demand evidence for your baseless conspiracy theory.

https://www.youtube.com/watch?v=ZcF36_qMd8M

Get back to me when they are 4k, 60fps.

All those games listed would look better on PC, running high-end nVidia graphics.

Yeah, they would, but guess what? PS4 exclusives are already running on AMD hardware inside the PS4 and PS4 Pro and look artistically and technically better than anything on PC. Sure, 60fps and 4K is better than 30fps and 1080p, 1440p or checkerboarding, but that doesn't make the game engine or the developers any better. I'd take a game where I can play around with snow over 4K and 60fps, to be honest.

https://www.youtube.com/watch?v=RhxZ3ph2IwU

 

You make it sound like the PC doesn't get any kind of optimization. That would indeed be an ignorant assumption on your behalf.
Here, an AMD driver increased performance by up to 8%.
http://www.guru3d.com/articles-pages/radeon-crimson-driver-december-2016-performance-analysis,1.html

Or how about a 20% increase?
https://techreport.com/review/29357/amd-radeon-software-crimson-edition-an-overview

Or how about the performance increase that DirectX 12 brought to the table, bringing low-level, console-like efficiency to PC?
https://www.extremetech.com/gaming/246377-new-directx-11-vs-directx-12-comparison-shows-uneven-results-limited-improvements

Or how about the performance increase that Vulkan brought to the table, based upon AMD's Mantle tech?
https://www.anandtech.com/show/11223/quick-look-vulkan-3dmark-api-overhead

Or how about the performance gains with Windows 10?
https://www.digitaltrends.com/computing/windows-10-review-gaming-performance/

Or when games release patches to increase performance.
https://www.extremetech.com/gaming/246840-new-ashes-singularity-update-substantially-boosts-ryzen-performance

Don't ever make the ignorant assumption that the PC never gets any kind of optimization.

Yes, of course it does, but so do console games, and it's not just at day one. Like Resident Evil Revelations 2 on PS4, for example, which ran worse than on the OG Xbox One and is now locked at 60fps, or Homefront: The Revolution on all platforms, or MGSV (yes, now the game runs at 60fps everywhere).

Same for PS4 Pro patches like FFXV that had various updates for the PS4 Pro over time. 

 

On PC it's also the case, but it's also often needed in order to have something playable in the first place. But the difference between PC and consoles is that you have a variety of graphics cards and processors; sure, you get a patch that benefits the latest graphics cards and CPUs, but not the older ones most of the time.

But regardless of all that, it doesn't answer my main point that AMD hardware, for both CPUs and GPUs, runs worse than Intel or Nvidia. They have the strongest GPU on the market and it still can't perform at the same level as Nvidia. It's not an issue with consoles; if Nvidia and Intel were really so much better at their job, you would have seen Nvidia or Intel hardware inside consoles.

Last edited by Ruler - on 28 December 2017

TEH POWAH OF THE CELL at certain tasks is faster than the combined cores of the Jaguar. It can do GPU tasks, but I'd prefer it to handle A.I., physics, and BC.



CPU: Ryzen 9950X
GPU: MSI 4090 SUPRIM X 24G
Motherboard: MSI MEG X670E GODLIKE
RAM: CORSAIR DOMINATOR PLATINUM 32GB DDR5
SSD: Kingston FURY Renegade 4TB
Gaming Console: PLAYSTATION 5

Every subsequent comment by Ruler in response to Pema makes me cringe a little more and more.

It's like he's living in an alternative facts universe.

Props to you Pemalite for having that kind of patience with such a person.

When the kind of replies you can expect don't fly higher than this: 

Ruler said:
Pemalite said: 

The Cell was never well received.

By who? Third parties who want to save money wherever they can? And yet they are still so greedy and put microtransactions into their latest games for this generation. Moral of the story: give them a Cell processor so they can think about that instead of lootboxes.


Really... How do you keep the will to reply?

Last edited by Hynad - on 28 December 2017

Ruler said:
Pemalite said: 

False.

Ahh yes, an 8-core is $300, 12- and 16-core are $800.

Clearly 8-core processors don't have to cost $300. *Cough*Xbox One*Cough*Playstation 4*Cough*Dozens of ARM CPUs*Cough*.

You know what makes CPUs cost money? Die size. Jaguar is 3.1mm2 per core.
8 x 3.1mm2 = 24.8mm2.
https://en.wikipedia.org/wiki/Jaguar_(microarchitecture)

The Xbox One's SoC was 363mm2, meaning that the 8x CPU cores are taking up 6.8% of the die space on a 28nm Xbox One chip.
https://www.extremetech.com/gaming/171735-xbox-one-apu-reverse-engineered-reveals-sram-as-the-reason-for-small-gpu

16x Jaguar cores would still be a small cost in the grand scheme of things.
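For what it's worth, here is that arithmetic as a quick Python snippet, using the figures from the links above. This counts core area only (L2 caches and interconnect are excluded, so the real share would be a bit higher):

# Die-area share of the CPU cluster on the Xbox One SoC.
jaguar_core_mm2 = 3.1        # 28nm Jaguar core, per the Wikipedia link above
xbox_one_soc_mm2 = 363.0     # total Xbox One SoC, per the ExtremeTech link above

for cores in (8, 16):
    cluster = cores * jaguar_core_mm2
    share = cluster / xbox_one_soc_mm2
    print(f"{cores} Jaguar cores: {cluster:.1f} mm^2 ({share:.1%} of the SoC)")
# 8 cores -> 24.8 mm^2 (~6.8%); 16 cores -> 49.6 mm^2 (~13.7%)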


Ruler said:

By who? Third parties who want to save money wherever they can? And yet they are still so greedy and put microtransactions into their games for this generation.

Everyone. Even the creators (Sony, IBM and Toshiba) abandoned Cell and went with ARM and x86 for everything.

https://web.archive.org/web/20091031112643/http://www.hpcwire.com/features/Will-Roadrunner-Be-the-Cells-Last-Hurrah-66707892.html


Ruler said:

Most developers aren't programming that way though, otherwise we already would have had 60fps games on consoles. There are a few games designed that way, like Knack 1 and Knack 2. The Evil Within 1 and 2 maybe too, I don't know.

Consoles do have a ton of 60fps games.
Doom, Wolfenstein, Halo 5, Overwatch, Call of Duty, Battlefield, Xenoverse... List goes on.





Again. It is up to the developer and whether they wish to chase 60fps. Not all games will be bottlenecked by the CPU.

Ruler said:

Well, it would be up to the developers, as the Cell can render almost anything. Rendering shadows and real-time lighting would be a good use. Also calculating destructible environments.

The Super Nintendo's CPU could render almost anything. It really depends on how many centuries you wish to wait.

Jaguar does 4 double precision (3 is more likely) flops per cycle.
https://en.wikipedia.org/wiki/FLOPS

3 × 1,600MHz × 8 cores = 38.4 GFLOPs in double precision.
When AVX is in play... Jaguar can do 16 single-precision FLOPs per cycle.

16 × 1,600 × 8 = 204.8 GFLOPs.

Cell reaches 20.8 GFLOPs in double precision, making Jaguar ~85% faster than Cell.
Cell reaches 230 GFLOPs in single precision, making it ~12% faster than a 1.6GHz Jaguar.

https://en.wikipedia.org/wiki/Cell_(microprocessor)#Architecture
https://en.wikipedia.org/wiki/PlayStation_3_technical_specifications

But what about the CPU in the Xbox One X? That is clocked at 2.3GHz... Meaning...
3 × 2,300 × 8 = 55.2 GFLOPs.
16 × 2,300 × 8 = 294.4 GFLOPs.

Meaning that the 2.3GHz 8-core Jaguar is 165% faster than Cell in double precision and ~28% faster in single precision.

Meaning that for rendering, Jaguar soundly beats Cell.
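If anyone wants to check the arithmetic, here it is as a quick Python snippet. These are theoretical peak figures only (the per-cycle FLOP counts, clocks and Cell numbers cited above), not measured performance:

# Peak-FLOP arithmetic from the post above.
def peak_gflops(flops_per_cycle, clock_mhz, cores):
    return flops_per_cycle * clock_mhz * cores / 1000.0

cell_dp, cell_sp = 20.8, 230.0                      # PS3 Cell figures cited above

jaguar_16_dp = peak_gflops(3, 1600, 8)              # 38.4 GFLOPs, double precision
jaguar_16_sp = peak_gflops(16, 1600, 8)             # 204.8 GFLOPs, single precision (AVX)
jaguar_23_dp = peak_gflops(3, 2300, 8)              # 55.2 GFLOPs
jaguar_23_sp = peak_gflops(16, 2300, 8)             # 294.4 GFLOPs

print(f"1.6GHz Jaguar vs Cell, DP: {jaguar_16_dp / cell_dp - 1:+.0%}")   # ~+85%
print(f"1.6GHz Jaguar vs Cell, SP: {jaguar_16_sp / cell_sp - 1:+.0%}")   # ~-11% (Cell ahead here)
print(f"2.3GHz Jaguar vs Cell, DP: {jaguar_23_dp / cell_dp - 1:+.0%}")   # ~+165%
print(f"2.3GHz Jaguar vs Cell, SP: {jaguar_23_sp / cell_sp - 1:+.0%}")   # ~+28%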

Ruler said:

Yeah, because the optimisation is bad. AMD has the most powerful hardware right now, with Vega 64 performing at 12 teraflops, but it's still struggling to outperform a GTX 1080 Ti with 10 teraflops.

Fucking bullshit. It has nothing to do with optimization.
You are still clinging to the false idea that FLOPs are the only thing that is important to rendering a game.

Clearly I need to school you again here.

The Radeon 5870 has 2720 GFLOPs (1600 pipelines × 850MHz clock × 2 instructions per clock).
The Radeon 7850 has 1761 GFLOPs (1024 pipelines × 860MHz clock × 2 instructions per clock).

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_HD_5000_Series
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_HD_7000_Series

You would think the Radeon 5870 would beat the Radeon 7850 right? It has more flops doesn't it?

But guess what? It's slower.
https://www.anandtech.com/bench/product/1062?vs=1076
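Those GFLOP numbers come straight from the usual peak formula (shader ALUs × clock × 2, counting a fused multiply-add as two FLOPs). A quick illustration in Python:

# The GFLOP figures quoted above, from the standard peak formula.
def gpu_gflops(shader_alus, clock_mhz, ops_per_clock=2):
    return shader_alus * clock_mhz * ops_per_clock / 1000.0

hd5870 = gpu_gflops(1600, 850)   # ~2720 GFLOPs (VLIW5, 2009)
hd7850 = gpu_gflops(1024, 860)   # ~1761 GFLOPs (GCN, 2012)

print(f"HD 5870: {hd5870:.0f} GFLOPs, HD 7850: {hd7850:.0f} GFLOPs")
# ...and yet the 7850 benchmarks faster, because FLOPs say nothing about how well
# the architecture keeps its ALUs fed (scheduling, TMUs, ROPs, geometry, bandwidth).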


Bet'cha wish I would stop using this thing called evidence now, huh?

Fact of the matter is, for rendering there are other aspects of a GPU that are important.

Like texture mapping units, which are responsible for resizing, rotating and sampling textures.
https://en.wikipedia.org/wiki/Texture_mapping_unit

Or Render Output Pipelines.
https://en.wikipedia.org/wiki/Render_output_unit

Geometry units, bandwidth, memory capacity, compression capabilities and so on are all part of the GPU as well, which flops doesn't account for.

AMD GPUs perform the way they do because AMD hasn't invested in these aspects to the same degree nVidia has with its GPUs; AMD, for instance, is notorious for having poorer geometry capability compared to nVidia.

https://www.anandtech.com/show/8460/amd-radeon-r9-285-review/3

Ruler said:

AA is a GPU thing, the CPU isn't the bottleneck for that.

False. Anti-Aliasing can be done on the GPU or CPU.

https://software.intel.com/en-us/blogs/2011/07/18/cpu-morphological-antialiasing-mlaa-sample-now-live
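As a toy illustration (this is not Intel's MLAA sample, just a minimal sketch of the idea): post-process anti-aliasing is ordinary image processing, so a CPU can do it by finding hard edges and blending across them.

# Toy CPU "anti-aliasing": detect hard luminance edges and blend across them.
def cpu_edge_blend(frame, threshold=0.5, blend=0.5):
    # Blend each pixel toward its right/bottom neighbour where a hard edge is found.
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h - 1):
        for x in range(w - 1):
            if abs(frame[y][x] - frame[y][x + 1]) > threshold:      # vertical edge
                out[y][x] = (1 - blend) * frame[y][x] + blend * frame[y][x + 1]
            if abs(frame[y][x] - frame[y + 1][x]) > threshold:      # horizontal edge
                out[y][x] = (1 - blend) * out[y][x] + blend * frame[y + 1][x]
    return out

# A hard diagonal "staircase" edge: 0.0 = black, 1.0 = white.
frame = [[1.0 if x > y else 0.0 for x in range(6)] for y in range(6)]
for row in cpu_edge_blend(frame):
    print(" ".join(f"{v:.2f}" for v in row))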


Ruler said:

Yeah, but you know what the CPU is? A 3-core 1.2GHz PowerPC, that's weaker than even the Xenon inside the Xbox 360. Yet look at the games that still came to the system this year.

Difference between us is that I know the tricks they used to achieve what they did.

Ruler said:

Obviously the Wii U has other stuff, like more RAM and a faster GPU, but still, it's impressive for a 1.2GHz CPU. And you know most Nintendo games run at 60fps.

The Wii U's CPU is an out-of-order design, which gives it a significant leg up on the in-order PowerPC designs found in the Xbox 360 and Playstation 3. It does cost extra transistors, but it is worth it if you can afford it.

https://en.wikipedia.org/wiki/Out-of-order_execution
https://arstechnica.com/gaming/2012/11/why-you-cant-read-too-much-into-the-wii-us-slow-clock-speed/

Clockspeed isn't everything, that debate ended when the Pentium 4 came along.


Ruler said:

It's a benchmark from 2005-2007, from the PS2 era. What have you expected? But there is even more.

You missed the point, clearly.
The point was that a 300MHz Pentium 2 from 1998, with a fraction of the performance, ~9 years before the Cell came along in the Playstation 3... was rendering 3D graphics.
It's not a feature exclusive to the Cell; ALL CPUs can render graphics.

Ruler said:

Yes it does, it enables the highest speed for rendering that needs it.

No. RAM has zero processing capability. You need to prove otherwise, as the burden of proof lies with you.


Ruler said:

But some remasters show that they are struggling with the Jaguar, as they still cap the framerate to 30fps, like Skyrim, the Ezio Collection or Resident Evil Remastered.

That's just poor porting. Don't forget either that those remasters tend to operate at higher resolutions and usually have other enhancements such as better lighting, shadowing, draw distances and so on.

That's not all free on hardware you know.



Ruler said:


And I am pretty sure there are more ports from last gen, and even completely new remasters like Crash Bandicoot can't escape rendering at 30fps. It's pretty evident at this point that the Jaguars don't make a difference.

Doesn't help that they are framerate capped.
It's not as simple as just unlocking the framerate to hit 60fps.

Scripting and animations are often tied to framerate on consoles, which has a ton of caveats.
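A tiny, made-up example of why: if a script moves something a fixed amount per frame (tuned for 30fps), simply unlocking to 60fps makes it run twice as fast, while delta-time logic stays correct at any framerate.

# Toy illustration (not from any actual engine) of frame-tied vs delta-time game logic.
def simulate(seconds, fps, speed_units_per_sec=10.0, frame_tied=False):
    position = 0.0
    dt = 1.0 / fps
    for _ in range(int(seconds * fps)):
        if frame_tied:
            position += speed_units_per_sec / 30.0   # tuned for 30fps, breaks elsewhere
        else:
            position += speed_units_per_sec * dt     # framerate independent
    return position

for fps in (30, 60):
    print(f"{fps}fps  frame-tied: {simulate(2, fps, frame_tied=True):5.1f} units, "
          f"delta-time: {simulate(2, fps):5.1f} units")
# Frame-tied logic covers 20 units at 30fps but 40 units at 60fps; delta-time stays at 20.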



Doesn't mean Jaguar is bad, just that this is what the developers chose.

Ruler said:

First off, that is a $2000 CPU.

The cost of a CPU is ultimately irrelevant and doesn't change how many cores a game will fundamentally utilize.

I have a 6-core/12-thread processor and PUBG on PC; do you wish for me to show you some benchmarks of core scaling?
Or was the prior evidence that I provided enough to satisfy you?

Ruler said:

Second, going from 88 to 141fps is only a ~60% increase despite adding 3 times the cores, so that is showing that more cores for this game aren't doing much.

Doesn't change the fact that there are still gains thanks to extra CPU cores.
You are arguing against evidence.

Either pony up the evidence or take a hike.
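For the record, sub-linear scaling is exactly what you would expect anyway. A rough Amdahl's-law sketch (the 88fps baseline is the benchmark figure discussed above; the core counts and parallel fractions are hypothetical, picked only to illustrate):

# Why tripling the cores doesn't triple the framerate: the serial part of each frame doesn't scale.
def speedup(parallel_fraction, cores):
    # Amdahl's law: serial part stays fixed, parallel part divides across cores.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

base_fps, base_cores, more_cores = 88.0, 6, 18   # hypothetical 3x core count
for p in (0.5, 0.7, 0.9):
    gain = speedup(p, more_cores) / speedup(p, base_cores)
    print(f"{p:.0%} parallel: {base_fps:.0f}fps -> {base_fps * gain:.0f}fps "
          f"({gain - 1:+.0%}) from 3x the cores")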

Ruler said:

Third, this $2000 CPU you are showcasing is clocked at 3.3GHz, just like the Cell processor.

Clockspeed is not everything.

There is a reason why people aren't using a 3.8GHz Pentium 4 in 2017: because it's a piece of crap.
https://www.anandtech.com/bench/product/92?vs=118

Ruler said:

All of these RAM types aren't available in 2019 and probably cost more than XDR2 RAM. Provide evidence that DDR2 can be faster than GDDR5 or XDR2 if you say it.

I never said all of them were available. GDDR6 will be coming within the next 6 months.

But HBM2, HBM and GDDR5X most certainly are.
https://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus
https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review
https://www.anandtech.com/show/11262/asus-launches-geforce-gtx-1080-11-gbps-and-1060-9-gbps

As for DDR2 being faster than GDDR5: that is just simple mathematics.
Bandwidth = (Memory Clock × Bus Width / 8) × Memory Clock Multiplier (i.e. GDDR5 is 4x, DDR2 is 2x).

Thus DDR2 @ 800MHz (1600MT/s effective) on a 128-bit bus would have identical bandwidth to GDDR5 @ 400MHz (1600MT/s effective) on a 128-bit bus: 25.6GB/s each.
Increase the DDR2's bus width to 256-bit and it would be faster than that GDDR5 configuration.
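Plugging the numbers into that formula, as a quick sanity check:

# Memory bandwidth = effective transfer rate * bus width in bytes.
def bandwidth_gb_s(base_clock_mhz, multiplier, bus_bits):
    # multiplier: 2 for DDR2 (double data rate), 4 for GDDR5 (quad-pumped).
    return base_clock_mhz * multiplier * (bus_bits / 8) / 1000.0

print(f"DDR2  800MHz x2, 128-bit: {bandwidth_gb_s(800, 2, 128):.1f} GB/s")   # 25.6 GB/s
print(f"GDDR5 400MHz x4, 128-bit: {bandwidth_gb_s(400, 4, 128):.1f} GB/s")   # 25.6 GB/s
print(f"DDR2  800MHz x2, 256-bit: {bandwidth_gb_s(800, 2, 256):.1f} GB/s")   # 51.2 GB/s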


Ruler said:

 https://www.youtube.com/watch?v=ZcF36_qMd8M

I said I demand evidence. Not a video of someone else with an opinion that aligns with your own.

Ruler said:


They are already running on AMD hardware inside the PS4 and PS4 Pro and look artistically and technically better than anything on PC.

Technically? Heck no.
Star Citizen is technically ahead of anything on the consoles; its sheer scale is a testament to that very fact.





Whether something looks better artistically is completely personal opinion and will thus change depending on the individual, making it a completely useless and stupid argument.


Ruler said:

On PC it's also the case, but it's also often needed in order to have something playable in the first place.

Bullshit.
I have 640 games on Steam. If games being released in an unplayable state were the norm... I would know about it.
http://steamcommunity.com/id/Pemalite/

Ruler said:

But the difference between PC and consoles is that you have a variety of graphics cards and processors; sure, you get a patch that benefits the latest graphics cards and CPUs, but not the older ones most of the time.

False. Case in point: AMD's frame-pacing driver, which even benefited completely different VLIW architectures.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/62439-amd-13-8-frame-pacing-driver-tested-benchmarked.html

Ruler said:


But regardless of all that, it doesn't answer my main point that AMD hardware, for both CPUs and GPUs, runs worse than Intel or Nvidia.

AMD may not have the performance crown, but that doesn't mean their hardware is shit.

There is no such thing as a bad GPU/CPU, only a bad price. People still bought AMD FX CPUs despite the terrible performance those chips offered. Why? Price.

Ruler said:

They have the strongest GPU on the market and it still runs worse than Nvidia's, like I have pointed out above. It's not an issue with consoles; if Nvidia and Intel were really so much better at their job, you would have seen Nvidia or Intel hardware inside consoles.

Vega isn't the strongest GPU on the market, for the reasons (and evidence) that I alluded to prior.


Hynad said:


Really... How do you keep the will to reply?

Have you not paid attention over the years? I live for debating tech. :P

Hilarious when someone makes a ludicrous comment and I get bombarded on Steam, Private Message and so on with people asking me to interject.



--::{PC Gaming Master Race}::--

I'm just gonna name-drop the PS3, Sega Saturn, and the Atari Jaguar. The Saturn and the Jaguar had two main processors and were a bitch to develop for if you wanted to be a multiplat developer, or just a developer interested in the platforms in general, compared to the competition that only had a single CPU and was a lot easier to develop for and port games between (the N64 and PS1 for the Saturn; the SNES and Mega Drive for the Jaguar).
Then there's the PS3... Despite it being more powerful than the 360 and Wii, some games ran worse due to the Cell processor's architecture, and developers left their games broken to this day; anyone remember the infamy of Skyrim on the PS3? There was also the price of the console at launch because of "The Power of the Cell". Then, a few years later, the price was dropped and Sony lost so much money on the PS3 that they're lucky their software sales held them above water, and a lot of multiplat games were badly ported.

A Cell processor alongside another CPU is just a horrible idea; look at the history of the industry and how well that worked out.



Pemalite said: 

Hynad said:


Really... How do you keep the will to reply?

Have you not paid attention over the years? I live for debating tech. :P

Hilarious when someone makes a ludicrous comment and I get bombarded on Steam, Private Message and so on with people asking me to interject.

But you are not debating tech here. You are lecturing someone who doesn't want to learn.

As for the second part... I've found myself guilty of that a few times in the past.



Remember when Nintendo used the same CPU three consoles in a row? Those were the days. It was a beast in 2001, but Nintendo couldn't move on until the Switch finally changed that. I have always felt the PS3 is the more modern SEGA Saturn... just one with a company with money to keep backing it. Also, does Ruler not know the 360's CPU was based on the Cell?

Last edited by SegataSanshiro - on 29 December 2017

KLAMarine said:
vivster said:

Did you come to a conclusion after reading what the tech-heads had to say?

Yes. That I'd rather be browsing dirty pictures on the internet than read up on any more of this.

Fair enough. It's best not to click on threads by some authors at all, to keep your sanity. But sometimes it's just nice watching car crashes in progress.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.