Intrinsic said:

But hardware like this didn't exist in 2013 at a price or form factor that could fit in a $400 box. Hell, even a $600 box.

Hardware like this did exist, e.g. the Radeon 7950/7970.

Remember, console manufacturers aren't buying GPUs off the shelf like gamers, so pricing for them is different. They aren't spending cash on expensive copper vapour-chamber coolers with giant blowers.
They aren't including gigabytes' worth of extra RAM.
They aren't including complex, dedicated, over-engineered power delivery systems.
They don't need a dedicated PCB.

They just buy the chip itself, with everything integrated into a single piece of silicon to keep costs low.
They also buy in bulk, so they get a further discount.

Besides, even if the cost was a little higher, console manufacturers have typically eaten the cost initially.

Intrinsic said:

Anything above 1080p, if not 4k, is useless to consoles given that home TVs only support 1080p/4k. Quad HD is not a living room standard. I think in most cases the extra power would be used for more IQ polish, and maybe as an upscaler to get games to output at 4k even though they're built natively at 1080p. Kinda like how a lot of the XB1 games are built at 900p natively but output at 1080p.

Not exactly.

There is a substantial benefit to rendering a game at 4k and downscaling it to a 1080P television. Keep in mind, games are not movies.

In fact, some console games have already been doing something similar (just not this extreme), rendering higher than their output resolution and then downscaling, as this is actually how some forms of Anti-Aliasing work.
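The render-high-then-downscale idea described above is supersample anti-aliasing. A minimal sketch (pure Python, grayscale values, a hypothetical 4x4 "frame"): render at 2x the output resolution, then box-filter each 2x2 block down to one pixel, which softens hard edges.

```python
# Minimal sketch of supersample anti-aliasing (SSAA):
# render at double resolution, then average each 2x2 block
# down to a single output pixel, smoothing jagged edges.

def downscale_2x(image):
    """Box-filter a 2N x 2N grayscale image down to N x N."""
    n = len(image) // 2
    out = []
    for y in range(n):
        row = []
        for x in range(n):
            total = (image[2 * y][2 * x] + image[2 * y][2 * x + 1] +
                     image[2 * y + 1][2 * x] + image[2 * y + 1][2 * x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A hard black/white edge becomes a softened gradient after downscaling.
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(downscale_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 in the output is the averaged edge pixel: that in-between value is exactly the smoothing effect that makes supersampling a form of anti-aliasing.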

Chevinator123 said:
How will a Neo even work? Will we be able to put our old/current PS4 games in and they will just somehow run (presumably with better fps)? Will devs need to put out a patch? Will devs even bother? If past PS4 games don't work, doesn't that mean it's essentially a PS5?

All PS4 titles as they are today, if they aren't updated, will run exactly the same as they do now, even on the new machine.
All new titles will also run on the regular PS4, just with some concessions.

DonFerrari said:

I was waiting for someone like you to come explain =] thanks perma.

And if the chip is 4x stronger than what we had on PS4, shouldn't we have 4k for a game with 1080p if all other aspects are kept the same?

Unfortunately, the entire console didn't receive a 4x increase: there isn't a 4x increase in RAM, memory bandwidth, CPU performance, or HDD/Blu-ray speed, so it's a little more complicated than that.

You won't have your current 1080P games running at 4k.

Simpler titles should have no problem running at 4k resolution, though. For more demanding titles, Quad HD (aka 1440P/2560x1440) should be more than feasible; it's actually the resolution the PC version of the GPU is targeted at. At 1440P you will see some benefit over 1080P, even on a 4k display.
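The pixel counts spell out why "4x the GPU" doesn't simply translate into 4x the resolution: 4K has exactly four times the pixels of 1080p, leaving zero headroom for the other costs that also scale with resolution, while 1440p only costs about 1.78x. A quick sketch:

```python
# Pixel-count ratios between common output resolutions, showing why
# a ~4x GPU uplift is fully consumed by a 1080p -> 4K jump, while a
# 1080p -> 1440p jump leaves budget to spare.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_ratio(a, b):
    """How many times more pixels resolution `a` has than `b`."""
    wa, ha = RESOLUTIONS[a]
    wb, hb = RESOLUTIONS[b]
    return (wa * ha) / (wb * hb)

print(pixel_ratio("4K", "1080p"))     # 4.0
print(pixel_ratio("1440p", "1080p"))  # ~1.78
```

So even with a 4x-stronger GPU, rendering at 4K leaves nothing for better effects, frame rate, or the parts of the console (CPU, bandwidth) that didn't get a 4x bump.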

JRPGfan said:
m0ney said:
How can an APU be as powerful as a dedicated mid-to-high-end video card? Wouldn't it make sense to replace all video cards with APUs then?

That's the future of PCs.

Intel or AMD, using the integrated GPU (that's on the chip).

The reason this never happened was memory bandwidth issues.

System RAM is slow; even DDR4 isn't enough to make it work well at those GPU sizes.

However, Hybrid Memory Cube technology is going to change that when it ends up getting used as system memory.

Nvidia can't make x86 CPUs... so they can't offer an APU that can run Windows.

Which is why they started doing Android stuff... trying to get into smartphones & tablets.

Nvidia fears a future where motherboard manufacturers don't sell a PCIe lane for graphics cards (forcing Nvidia out of the PC market),

where everyone just uses the IGPU from an APU.

No. The reason this has never happened is cost and power consumption.
AMD is actually working on an APU with 200-300W of power consumption which should be able to beat a PS4; it's not going to be cheap though.

As for bandwidth... DDR4 and DDR3 memory can offer more bandwidth than GDDR5; it all comes down to the implementation.

Bandwidth = effective clock speed x (bus width in bits / 8).

Thus 4166MHz DDR4 on a 512-bit bus would provide 266GB/s of bandwidth; that's more than the PS4 or the PS4 Neo. Go figure.
But such a thing won't happen: a 512-bit bus would require far too many PCB layers, and routing that on a motherboard would be a nightmare.
On a 256-bit bus, though, you would have a very respectable 133GB/s.
Modern GPUs are also more bandwidth efficient, employing a myriad of technologies to conserve bandwidth.
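The bandwidth formula above is easy to check. A small sketch, using the 4166 MT/s DDR4 figure from the post and, for comparison, the PS4's GDDR5 (5500 MT/s on a 256-bit bus):

```python
# Peak memory bandwidth from effective transfer rate and bus width:
# bandwidth (GB/s) = rate (MT/s) x (bus width in bits / 8) / 1000

def bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    """Peak theoretical bandwidth in GB/s."""
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

print(bandwidth_gbs(4166, 512))  # ~266.6 GB/s (hypothetical 512-bit DDR4)
print(bandwidth_gbs(4166, 256))  # ~133.3 GB/s (realistic 256-bit DDR4)
print(bandwidth_gbs(5500, 256))  # 176.0 GB/s (PS4's GDDR5)
```

Which is the point being made: the memory *type* matters less than the transfer rate and bus width it is deployed with.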

AMD has also used a technology known as "Side Port" before, where the motherboard manufacturer includes dedicated memory for the GPU on the motherboard in conjunction with system RAM, allowing the integrated GPU to access both.
Intel does something similar with eDRAM/L4 cache.

Dedicated PCI-E is here to stay; it's used for more than just graphics.

torok said:

It will most likely run an overclocked version of the same CPU, no Zen for it. So framerate will probably be marginally better. Visuals will improve.

As they are pushing for parity between all PS4s, games will also be the same, but prettier. Probably 4K upscaling to help sell 4K TVs, as they really need all the help they can get to sell. PS360 is what pushed HDTV for the masses; I think PS4K/XboxSomething will do the same for 4K.

I think you might have your hopes set a little high there in regards to 4k.

You don't need Zen to upgrade the CPU; Jaguar's successors already exist, namely Puma and Puma+. They are more efficient, faster, and cheaper thanks to a denser design.

JRPGfan said:

I thought the rumors were Puma+ cores instead of the Jaguar ones.

Still weak, small cores, but a minor upgrade.

That's just it though, they are only rumors.
I'm hoping for Puma+. They are fully backwards compatible with Jaguar... still slow, just a little less slow.

torok said:
JRPGfan said:

I thought the rumors were Puma+ cores instead of the Jaguar ones.

Still weak, small cores, but a minor upgrade.

I don't know if they will be Jaguar cores and the rumour doesn't clarify it. They could be Puma+ cores aiming for a bit higher performance. It's more likely that they will pair these two instead of using older tech for the CPU and newer for the GPU.

Besides that, AMD will probably push a Puma+/Polaris 10 APU to the market, trying to use DX12 to compensate for the limited CPU punch and creating a low-cost gaming solution.

DirectX 12 isn't happening on non-Microsoft platforms.
Even then, Sony and Microsoft have low-level APIs which are faster than DirectX 12 anyway.

AMD already has a giant APU heading to the PC, it's not using Puma+.

Soundwave said:

I suppose Scorpio could come out in spring/summer 2017 instead, so it wouldn't be quite a year. If it's not coming out by then, the reason is likely that they are using something like HBM2 RAM, which would push net performance even higher, so all the more reason to wait.

There probably are not that many games that will use the Neo's power until 2017 anyway, so I'm kinda content to wait and simply get the better hardware. 

For VR I think I'm going to go with Oculus Rift too. Compatible with PC and better resolution... if you're going to pay like $700 for VR, I figure you may as well pay a little more and get the best possible experience, as VR is not exactly "budget" either way.

HBM2 is likely not to happen, mostly because of cost; consoles are cost-sensitive devices, and you can't expect them to have the best of everything.
Not only that, but these consoles aren't targeting insane resolutions, so such bandwidth would be a waste.

Intrinsic said:

You've got to be joking. The NX, based on its own leaks, isn't even in this conversation since it's rumoured to be about as powerful as the XB1. There are even rumors that state it won't use an x86 architecture.

And as for power, going from a 1.8TF to a 4.2TF GPU, a bump in CPU clock from 1.6GHz to 2.1GHz, and memory bandwidth from 192 to 218GB/s is a significant jump, especially when you consider that the standard is still 1080p@30fps. Even if the XB1 Scorpio comes in at 5.5/6TF you won't notice any marginal difference when looking at 1080p content. It will be like comparing two systems @1080p where one hits 48fps and the other 40fps but both are locked to 30fps.

Also keep in mind that in situations like these, the PS4 will still be the lead platform for devs. Lastly, no matter what Sony or MS does with their upgrades, they are still limited by the current hardware. Games will be built to run on that first, before any added benefits of the newer hardware are considered.

You can't compare the old vs new directly.
For one, the newer GPU can do more work per teraflop; it's more efficient.

Secondly... bandwidth of 192GB/s vs 218GB/s can't be compared, as the new console gains a plethora of bandwidth-saving technologies such as colour compression, which could give it upwards of 50GB/s or more in bandwidth savings (basing that on PC observations).
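The effect of colour compression can be sketched with a toy model. The 40% compressible-traffic share and 2:1 ratio below are illustrative assumptions (not measured figures); they happen to land in the ~50GB/s savings ballpark mentioned above:

```python
# Toy model of effective bandwidth with lossless colour compression:
# if a fraction of bus traffic is compressed at ratio:1 before it
# crosses the bus, the same physical bus delivers more useful data.

def effective_bandwidth(raw_gbs, compressed_fraction, ratio):
    """Effective GB/s when `compressed_fraction` of traffic is
    compressed at `ratio`:1. Derivation: useful data D generates
    D * (1 - f + f/ratio) bytes of physical traffic."""
    return raw_gbs / (1 - compressed_fraction * (1 - 1 / ratio))

# Neo-like 218 GB/s bus, assuming 40% of traffic compresses 2:1:
print(round(effective_bandwidth(218, 0.4, 2.0), 1))  # 272.5
```

Under those assumed numbers the 218GB/s bus behaves like a ~272GB/s one, i.e. roughly 50GB/s of "free" bandwidth, which is why raw GB/s figures across GPU generations aren't directly comparable.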

torok said:

Also mind that GPGPU on PC is very different, because you don't have unified memory. So if you are dealing with data on your CPU (system RAM) and want to pass some task to the GPU, you have to copy it to the GPU's VRAM. That's very slow. Unified memory allows you to offload tasks that wouldn't be worthwhile on a split memory architecture.

So you are correct. A routine like physics that runs on the GPU would be faster on the Neo, so it would depend on whether the CPU tasks of the game can be parallelized or not.


Not entirely accurate.
Starting with Pascal, nVidia implemented Unified Memory support in hardware; can't forget unified virtual addressing either.
Intel and AMD have done similar things with their IGPs/APUs.
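The copy-cost argument above can be made concrete with a toy break-even model. All numbers are assumptions for illustration (PCIe 3.0 x16 at ~16GB/s; the timings are made up): offloading to a discrete GPU only wins when the GPU speed-up outweighs the round-trip copy.

```python
# Toy break-even model for GPGPU offload over PCIe vs unified memory:
# with split memory, the data must cross the bus twice (to the GPU
# and back), so the copy time can swallow the compute savings.

def offload_worth_it(data_mb, cpu_ms, gpu_ms, pcie_gbs=16.0):
    """True if GPU compute time plus round-trip PCIe copy time
    beats just doing the work on the CPU.
    pcie_gbs: assumed PCIe 3.0 x16 bandwidth (~16 GB/s)."""
    copy_ms = 2 * (data_mb / 1024) / pcie_gbs * 1000  # there and back
    return gpu_ms + copy_ms < cpu_ms

# 256 MB of physics data costs ~31 ms just in copies:
print(offload_worth_it(256, cpu_ms=40, gpu_ms=5))  # True
print(offload_worth_it(256, cpu_ms=20, gpu_ms=5))  # False
```

On a console (or a unified-memory GPU) the copy term is roughly zero, which is why small, frequent offloads that make no sense on a discrete PC GPU can pay off there.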

Azuren said:

Don't forget to factor in hardware optimization. It accounts for a lot of increased performance and is essentially why the PS3 was capable of feats like The Last of Us.

The PlayStation 3 didn't manage The Last of Us' level of imagery because of hardware optimization.
It managed it because details like lighting and shadowing were baked into things like the textures; the same went for Halo 4.
They also used simple geometry, then various tricks to make it seem more complex than it really was.
A lot of the expensive effects (like fully dynamic HDR lighting) were also abandoned mid-generation, which freed up resources for a dozen simpler effects that created a more pleasing image overall.

As the generation progressed, more and more effects stopped being dynamic... And then once the new generation started, everything went back to being dynamic, and you can tell by playing them.

GribbleGrunger said:

ME:

Thanks. I'm really not technically minded at all so I didn't have a clue what to search for. Wouldn't the question be: if indeed AMD made those changes internally, would they be available only to chipsets made for Sony or would they be more widely available? Could Sony have patented those changes?

These fall under AMD's new semi-custom strategy - Sony may have highlighted the need, but AMD is still doing the bulk of the engineering. Sony no doubt has engineers of their own, but I think anything would be cross-licensed at most (think Toshiba using Cell for whatever, or Xenos being first to unified shaders but every new GPU after it using them).

Sony gets hardware ahead of current roadmaps out of it, AMD gets part of their R&D funded.

Another answer:

Cerny wanted more engines capable of distributing graphics vs non-graphics work. The idea is essentially getting better utilization and getting closer to the peak performance of the APU. It was no coincidence that the CPU was quite weak so the GPU could also help out with the compute related tasks.

It will also come into play for VR, which is why I've been saying that they will be doubled in Neo/Scorpio.

Unfortunately AMD has the x86 license to worry about when designing APUs, so Sony isn't allowed to get intimate with the chips; they can only pitch ideas.
There would be cross-licensing deals anyway, though.

m0ney said:
How can an APU be as powerful as a dedicated mid-to-high-end video card? Wouldn't it make sense to replace all video cards with APUs then?


Dedicated hardware will always be faster, mostly because you are limited in the number of transistors, and thus the chip size, you can have at any given price point.
If you start eroding away parts of the GPU in order to fit a CPU, chipset, and other logic, you are missing out on potential performance. It's one of the reasons why the Xbox One's chip, despite being almost the same size as the PS4's, is much slower: Microsoft used a chunk of their chip budget for memory.

If you were to take a Polaris 10 GPU and compare it to the PS4's APU, sure, they might have similar performance in graphics tasks, but the PS4's APU will be more expensive to manufacture.
Plus the PS4's CPU is extremely weak anyway.
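The chip-budget point can be put in numbers using the public specs: both consoles use GCN, where peak compute is CUs x 64 shaders x 2 ops per clock (fused multiply-add) x clock. The PS4 spent its area on 18 CUs; the Xbox One fit only 12 because a chunk of the die went to 32MB of embedded memory.

```python
# GCN peak compute: CUs x 64 shaders x 2 ops/clock (FMA) x clock.
# Public specs: PS4 = 18 CUs @ 800 MHz, Xbox One = 12 CUs @ 853 MHz.

def gcn_tflops(cus, clock_ghz):
    """Peak single-precision TFLOPS for a GCN GPU."""
    return cus * 64 * 2 * clock_ghz / 1000

print(round(gcn_tflops(18, 0.800), 2))  # 1.84 (PS4)
print(round(gcn_tflops(12, 0.853), 2))  # 1.31 (Xbox One)
```

Same architecture, similar die size, yet ~40% more compute on the PS4: the difference is almost entirely where the transistor budget went.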




www.youtube.com/@Pemalite