
PS4 Neo GPU Is Point-For-Point A Match For RX 480

Remember, we are getting this a year before Scorpio launches. Half a teraflop will not be much in the grand scheme of things: when you are dealing with consoles of 1 or 2 teraflops, half a teraflop is a lot, but when the consoles are near 6 teraflops it is a much smaller percentage of the overall performance.
The PS4 and Neo will also be the lead platforms. With the Neo out first, you can all but guarantee devs will be more used to coding for it and optimizing for it by the time Scorpio comes out.




Airaku said:
GribbleGrunger said:

Neo GPU Is Point-For-Point A Match For RX 480

Proof? I don't see any confirmation of the PS-Neo specs, nor anything that links to this. That is a pretty big claim you have going on, with no evidence to back it up.


If this is true, then this is the PS5, not a mid-cycle upgrade. It's a bigger jump than PS3 to PS4 was.

In terms of % performance increase... wasn't the jump from PS3 -> PS4 like 10x in most areas?

This will just be a 3x or so increase (if it ends up being 1.84 -> 5.5 teraflops).

 

Pemalite:

We were talking about APUs (iGPUs) and memory bandwidth.

You replied to me about DDR4 potentially being faster than GDDR5.

I assumed you were talking about system memory on motherboards for CPUs (because, you know, APUs and iGPUs).

I know graphics cards can use higher bus widths.

I'm saying that's not going to happen for normal consumer CPUs until we start using Hybrid Memory Cubes.

When that does happen, a large portion of the discrete GPU market will go away.



JRPGfan said:
Airaku said:

Proof? I don't see any confirmation of the PS-Neo specs, nor anything that links to this. That is a pretty big claim you have going on, with no evidence to back it up.


If this is true, then this is the PS5, not a mid-cycle upgrade. It's a bigger jump than PS3 to PS4 was.

In terms of % performance increase... wasn't the jump from PS3 -> PS4 like 10x in most areas?

This will just be a 3x or so increase (if it ends up being 1.84 -> 5.5 teraflops).

I believe the new GPU is more efficient, so in the end it could be roughly 2.5 times faster than the current PS4 GPU if it is clocked at 911MHz (you get 5.5 TF if it is clocked at 1.2GHz). At 5.5 TF, and considering the new GPU is more efficient, it would be over 3 times as fast as the PS4 GPU.

We will see if anything comes out of this E3; I'm so curious :)

But I think the specs are not final yet, and maybe Sony could clock it a bit higher. Just a thought.
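For reference, those figures follow from the usual peak-throughput formula (2 operations per shader per clock). Here is a quick sanity check in Python, assuming the RX 480's 2304 stream processors, a figure the thread itself never states:

```python
# Peak FP32 throughput: FLOPS = 2 ops per shader per clock * shaders * clock.
# The 2304 shader count is an assumption (RX 480 spec), not from this thread.

def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000.0

PS4_TFLOPS = 1.84  # base PS4 GPU

for clock_ghz in (0.911, 1.2):
    tf = tflops(2304, clock_ghz)
    print(f"{clock_ghz * 1000:.0f} MHz -> {tf:.2f} TF ({tf / PS4_TFLOPS:.1f}x PS4)")

# 911 MHz  -> 4.20 TF (2.3x PS4)
# 1200 MHz -> 5.53 TF (3.0x PS4)
```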



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Airaku said:
GribbleGrunger said:

Neo GPU Is Point-For-Point A Match For RX 480

Proof? I don't see any confirmation of the PS-Neo specs, nor anything that links to this. That is a pretty big claim you have going on, with no evidence to back it up.


If this is true, then this is the PS5, not a mid-cycle upgrade. It's a bigger jump than PS3 to PS4 was.

Joke post? The info is from DF, and their previous leak detailing the Neo was based on documentation released to developers and also on developer feedback. The only caveat is that they said some things are still likely to change. But the only better proof would be coming from Sony themselves.

And the second part of your post is hilarious. So you think the jump from PS4 to PS4 Neo (from a 1.8TF to even either a 4TF or 5.5TF GPU) is bigger than the jump from PS3 to PS4? Lol, that's actually funny. Just remember that the PS3 had 512MB of RAM. I'll leave it at that.



Aura7541 said:
DonFerrari said:

We may very soon discover if the new Xbox will have enough traction... there is no guarantee that they will get all games from EA. If their new console plummets they may lose support, but if it sells well enough and/or it's easy to port from the PS counterpart or PC, then they will get games. And you went from MS not allowing themselves to lose because they have a lot of money to a very specific point of them not being without 3rd party support.

What hard lesson have they learned? They seem to be doing just the same as they have always done as a company. And their internal studios are even weaker than they were in the X360 era.

AAA 3rd party support the Scorpio won't have trouble getting. However, the Scorpio is not going to solve Microsoft's struggles with indies and Japanese 3rd party games, especially the latter. Quadrupling the number of teraflops isn't going to convince Japanese developers to make games for the Scorpio, especially when most of their games aren't that graphically intensive. Psycho-Pass: Mandatory Happiness, for example, was a timed exclusive in Japan and yet it's being localized for only the PS4, PSV, and PC. Kingdom Hearts 2.8 is exclusive to the PS4 despite KHIII coming out on both the PS4 and XB1. The additional teraflops are also not going to erase Microsoft's very high minimum print order requirement, which allegedly led to the cancellation of the physical XB1 version of Shovel Knight.

So while the huge power increase is pretty awesome, it doesn't solve the XB1's biggest problems. The console's struggles are attributable more to software than hardware.

No support is guaranteed... the X1 Scorpio should get all western 3rd party devs no problem... but if it completely plummets in sales, it may not keep them.

Kerotan said:
JRPGfan said:

It won't do native 4K (except for watching videos).

It'll either be 1080p 60fps, or, if games already run that, 1440p 60fps, and then upscaled.

Maybe slightly more extras: AA & AF... better-quality effects/shadows, that sort of thing.

 

To do native 4K, you would need a much stronger GPU (even stronger than the rumored Scorpio).

Yeah, I'd much prefer 1440p (upscaled or just native) and 60FPS to 4K and 30FPS.

 

Plus pretty much any open world game should be able to pull off 1080p 60FPS at a minimum. 

My TV is 4K and fps isn't a big issue for me, so 4K 30fps is better than 1080p 60fps... but I could accept 1440p 60fps.

Pemalite said:
Intrinsic said:

But hardware like this didn't exist in 2013 at a price or form factor that could fit in a $400 box. Hell, even a $600 box.

Hardware like this did exist, aka the Radeon 7950/7970.

Remember, console manufacturers aren't buying GPUs off the shelf like gamers, so pricing for them is different: they aren't spending cash on expensive copper vapor chamber coolers with giant blowers.
They aren't including gigabytes' worth of extra RAM.
They aren't including complex, dedicated, over-engineered power delivery systems.
They don't need the dedicated PCB.

They just buy the chip itself, with everything integrated into a single chip to keep costs low.
Then they also buy in bulk, so they get a further discount.

Besides, even if it cost a little more, console manufacturers typically ate the cost initially.

Intrinsic said:

Anything above 1080p, if not 4K, is useless to consoles, given that home TVs only support 1080p/4K. Quad HD is not a living room standard. I think in most cases the extra power would be used to give more IQ polish, and maybe used as an upscaler to get games to output at 4K even though they're built natively at 1080p. Kinda like how a lot of XB1 games are built at 900p natively but output at 1080p.

Not exactly.

There is substantial benefit to rendering a game at 4K and downscaling it to a 1080p television; keep in mind, games are not movies.

In fact, some console games have already been doing something similar (just not this extreme), rendering higher than their native resolution and then downscaling, as this is actually how some forms of anti-aliasing work.
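To illustrate the idea (this is supersample anti-aliasing in miniature), here is a minimal Python sketch using Pillow; the file name stands in for a hypothetical frame rendered at 4K:

```python
# Render high, display low: each 1080p output pixel is reconstructed from
# a 2x2 block of 4K samples, which is what smooths the edges.
from PIL import Image

frame = Image.open("frame_4k.png")                      # hypothetical 3840x2160 frame
downscaled = frame.resize((1920, 1080), Image.LANCZOS)  # high-quality downsample filter
downscaled.save("frame_1080p_ssaa.png")
```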

Chevinator123 said:
How will a Neo even work? Will we be able to put our old/current PS4 games in and they will just somehow run (presumably with better fps)? Will devs need to put out a patch? Will devs even bother? If past PS4 games don't work, doesn't that mean it's essentially a PS5?

All PS4 titles as they are today, if they aren't updated... will run exactly the same as they do now, even on the new machine.
All new titles will also run on the regular PS4, just with some concessions.

DonFerrari said:

I was waiting for someone like you to come explain =] Thanks, Perma.

And if the chip is 4x stronger than what we had in the PS4, shouldn't a 1080p game reach 4K (four times the pixels) if all other aspects are kept the same?

Unfortunately, the entire console didn't receive a 4x increase; there isn't a 4x increase in RAM or memory bandwidth or CPU performance or HDD/Blu-ray speed, so it's a little more complicated than that.

You won't have your current 1080p games running at 4K.

Simpler titles should have no problem running at 4K, though. For more demanding titles, Quad HD (aka 1440p/2560x1440) should be more than feasible; it's actually the resolution the equivalent PC GPU is targeted at. At 1440p you will see some benefit over 1080p, even on a 4K set.

JRPGfan said:

That's the future of PCs.

Intel or AMD, using the integrated GPU (that's on the chip).

The reason why this never happened was memory bandwidth issues.

System RAM is slow; even DDR4 isn't fast enough to make it work well at those GPU sizes.

However, Hybrid Memory Cube technology is going to change that when it ends up getting used as system memory.

Nvidia can't make x86 CPUs... so they can't offer an APU that can run Windows.

Which is why they started doing Android stuff... trying to get into smartphones & tablets.

Nvidia fears a future where motherboard manufacturers don't include a PCIe lane for graphics cards (forcing Nvidia out of the PC market),

where everyone just uses an iGPU from an APU.

No. The reason why this has never happened is cost and power consumption.
AMD is actually working on an APU with 200-300W of power consumption which should be able to beat a PS4; it's not going to be cheap, though.

As for bandwidth... DDR4 and DDR3 memory can offer more bandwidth than GDDR5; it all comes down to the implementation.

Bandwidth = clock speed (in transfers per second) x (bus width in bits / 8).

Thus 4166MHz DDR4 on a 512-bit bus would provide 266GB/s of bandwidth; that's more than the PS4 or the PS4 Neo. Go figure.
But such a thing won't happen: a 512-bit bus would require far too many PCB layers, and routing that on a motherboard would be a nightmare.
But on a 256-bit bus you would have a very respectable 133GB/s.
Modern GPUs are also more bandwidth efficient, employing a myriad of technologies to conserve bandwidth.
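Plugging the post's numbers into that formula as a quick check (a sketch; DDR4-4166 is treated as 4166 MT/s):

```python
# bandwidth (GB/s) = transfers per second (MT/s) * (bus width in bits / 8) / 1000

def bandwidth_gb_s(mt_per_s: float, bus_width_bits: int) -> float:
    return mt_per_s * (bus_width_bits / 8) / 1000.0

print(bandwidth_gb_s(4166, 512))  # ~266.6 GB/s on the (hypothetical) 512-bit bus
print(bandwidth_gb_s(4166, 256))  # ~133.3 GB/s on a 256-bit bus
```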

AMD has also used a technology known as "Side Port" once before: the motherboard manufacturer includes dedicated memory for the GPU on the motherboard, in conjunction with system RAM, allowing the integrated GPU to access both.
Intel does something similar with eDRAM/L4 cache.

Dedicated PCI-E is here to stay; it's used for more than just graphics.

torok said:

It will most likely run an overclocked version of the same CPU, no Zen for it. So framerate will probably be marginally better. Visuals will improve.

As they are pushing for parity between all PS4s, games will also be the same, but prettier. Probably 4K upscaling to help sell 4K TVs, as those really need all the help they can get to sell. PS360 is what pushed HDTV to the masses; I think the PS4K/XboxSomething will do the same for 4K.

I think you might have your hopes set a little high there in regards to 4K.

You don't need Zen to upgrade the CPU; Jaguar's successors already exist, namely Puma and Puma+. They are more efficient, faster, and cheaper thanks to a more densely packed design.

JRPGfan said:

I thought the rumors were Puma+ cores instead of the Jaguar ones.

Still weak, small cores, but a minor upgrade.

That's just it though, they are only rumors.
I'm hoping for Puma+. They are fully backwards compatible with Jaguar... Still slow, just a little less slow.

torok said:

I don't know if they will be Jaguar cores; the rumour doesn't clarify it. They could be Puma+ cores aiming for a bit higher performance. It's more likely that they will pair those two than use older tech for the CPU and newer for the GPU.

Besides that, AMD will probably push a Puma+/Polaris 10 APU to the market, trying to use DX12 to compensate for the limited CPU punch and create a low-cost gaming solution.

DirectX 12 isn't happening on non-Microsoft platforms.
Even then, Sony and Microsoft have low-level APIs which are faster than DirectX 12 anyway.

AMD already has a giant APU heading to the PC; it's not using Puma+.

Soundwave said:

I suppose Scorpio could come out in spring/summer 2017 instead, so it wouldn't be quite a year. If it's not coming out by then, the likely reason is that they are using something like HBM2 RAM, which would net even higher performance, so all the more reason to wait.

There probably are not that many games that will use the Neo's power until 2017 anyway, so I'm kinda content to wait and simply get the better hardware.

For VR I think I'm going to go with the Oculus Rift too. Compatible with PC, and better-quality resolution... if you're going to pay like $700 for VR, I figure you may as well pay a little more and get the best possible experience, as VR is not exactly "budget" either way.

HBM2 is not likely to happen, mostly because of cost; consoles are cost-sensitive devices, and you can't expect them to have the best of everything.
Not only that, but these consoles aren't targeting insane resolutions, so such bandwidth would be a waste.

Intrinsic said:

You've got to be joking. The NX, based on its own leaks, isn't even in this conversation, since it's rumoured to be about as powerful as the XB1. There are even rumors that state it won't use an x86 architecture.

And as for power: going from a 1.8TF to a 4.2TF GPU, a bump in CPU clock from 1.6GHz to 2.1GHz, and memory bandwidth from 192 to 218GB/s is a significant bump, especially when you consider that the standard is still 1080p@30fps. Even if the XB1 Scorpio comes in at 5.5/6TF, you won't notice much difference when looking at 1080p content. It will be like comparing two systems @1080p where one hits 48fps and the other 40fps, but both are locked to 30fps.

Also keep in mind that in situations like these, the PS4 will still be the lead platform for devs. Lastly, no matter what Sony or MS does with their upgrades, they are still limited by the current hardware. Games will be built to run on those first, before any added benefits of the newer hardware are considered.

You can't compare the old vs the new.
For one, the newer GPU can do more work per teraflop; it's more efficient.

Secondly... bandwidth of 192GB/s vs 218GB/s can't be compared directly, as the new console gains a plethora of bandwidth-saving technologies such as colour compression, which could give it upwards of 50GB/s or more in bandwidth savings (basing that on PC observations).
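As a rough illustration of what that would mean, treating the ~50GB/s figure as extra headroom (it is an estimate from PC observations, not a spec):

```python
# Effective bandwidth if ~50 GB/s of colour-compression savings is counted
# as headroom on top of the Neo's raw 218 GB/s. The 50 is an assumption.
base_ps4 = 192            # GB/s, base PS4
neo = 218                 # GB/s, Neo raw
effective_neo = neo + 50  # assumed compression savings
print(effective_neo / base_ps4)  # ~1.4x the base PS4's raw bandwidth
```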

torok said:

Also mind that GPGPU on PC is very different, because you don't have unified memory. So if you are dealing with data on your CPU (system RAM) and want to pass some task to the GPU, you have to copy it to the GPU's VRAM. That's very slow. A unified memory allows you to offload tasks that wouldn't be worthwhile on a split-memory architecture.

So you are correct. A routine like physics that runs on the GPU would be faster on the Neo, so it would depend on whether the CPU tasks of the game can be parallelized or not.


Not entirely accurate.
Starting with Pascal, Nvidia implemented unified memory, and can't forget unified virtual addressing either.
Intel and AMD have done similar things with their IGPs/APUs.

Azuren said:

Don't forget to factor in hardware optimization. It accounts for a lot of increased performance and is essentially why the PS3 was capable of feats like The Last of Us.

The PlayStation 3 didn't manage The Last of Us levels of imagery because of hardware optimization.
It managed it because all the details like lighting and shadowing were baked into things like the textures; the same went for Halo 4.
They also used simple geometry, and then used various tricks to make it seem more complex than it really was.
A lot of the expensive effects (like full dynamic HDR lighting) were also abandoned mid-generation, which freed up resources for a dozen simpler effects that created a more pleasing image overall.

As the generation progressed, more and more effects stopped being dynamic... And then once the new generation started, everything went back to being dynamic, and you can tell by playing them.

GribbleGrunger said:

ME:

Thanks. I'm really not technically minded at all so I didn't have a clue what to search for. Wouldn't the question be: if indeed AMD made those changes internally, would they be available only to chipsets made for Sony or would they be more widely available? Could Sony have patented those changes?

These fall under AMD's new semi-custom strategy: Sony may have highlighted the need, but AMD is still doing the bulk of the engineering. Sony has engineers of their own, no doubt, but I think anything would be cross-licensed at most (think Toshiba using Cell for whatever, or Xenos being first to unified shaders but every new GPU after it using them).

Sony gets hardware ahead of current roadmaps out of it, AMD gets part of their R&D funded.

Another answer:

Cerny wanted more engines capable of distributing graphics vs non-graphics work. The idea is essentially getting better utilization and getting closer to the peak performance of the APU. It was no coincidence that the CPU was quite weak, so the GPU could also help out with compute-related tasks.

It will also come into play for VR, which is why I've been saying that they will be doubled in the Neo/Scorpio.

Unfortunately, AMD has the x86 license to worry about when designing APUs, so Sony isn't allowed to get intimate with the chips; they can only pitch ideas.
There would be cross-licensing deals anyway, though.

m0ney said:
How can an APU be as powerful as a dedicated mid-to-high-end video card? Wouldn't it make sense to replace all video cards with APUs then?


Dedicated hardware will always be faster, mostly because... you are limited in the number of transistors, and thus the chip size, you can have at any given price point.
If you start eroding away parts of the GPU in order to fit a CPU, chipset, and other logic... well, you are missing out on potential performance. It's one of the reasons why the Xbox One's chip, despite being almost the same size as the PS4's chip, is much slower: Microsoft used a chunk of their chip budget for memory.

If you were to take a Polaris 10 GPU and compare it to the PS4's APU, sure, they might have similar performance in graphics tasks, but the PS4's APU will be more expensive to manufacture.
Plus the PS4's CPU is extremely horrible anyway.

I see, Perma... other aspects probably won't go up 4x, as you said; cost would be too high. Any chance of the extra power offsetting the CPU, and 1440p going up to 60fps from 1080p@30fps (or 45 capped at 30)? Or is it much more likely that we get other graphics uses instead of frames?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:

I see, Perma... other aspects probably won't go up 4x, as you said; cost would be too high. Any chance of the extra power offsetting the CPU, and 1440p going up to 60fps from 1080p@30fps (or 45 capped at 30)? Or is it much more likely that we get other graphics uses instead of frames?

I don't see that being the case... the only way anything offsets the CPU is if the dev moves some CPU-based tasks over to the GPU. But as long as the work is on the CPU, the only way you can go up to 60fps from 30fps is if you have a CPU that is twice as powerful.

Better explanation: if the PS4's CPU is being maxed out in a game running at 1080p@30fps, then the only way to run that game at 1080p@60fps on the Neo is if the Neo's CPU is capable of twice the performance of what's in the base PS4.

We can get more stable framerates thanks to the GPU, and lots of eye candy, but as long as the CPU is in a 33ms pipeline... all you get is 30fps.
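The arithmetic behind that "33ms pipeline" point, as a short sketch:

```python
# fps = 1000 / frame time (ms): if the CPU alone needs ~33 ms per frame,
# no GPU upgrade can push the game past 30 fps.

def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

print(fps(33.3))  # ~30 fps: CPU work fills a 33 ms budget
print(fps(16.7))  # ~60 fps: the same work must finish in half the time,
                  # i.e. the CPU needs roughly twice the performance
```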



Intrinsic said:
DonFerrari said:

I see, Perma... other aspects probably won't go up 4x, as you said; cost would be too high. Any chance of the extra power offsetting the CPU, and 1440p going up to 60fps from 1080p@30fps (or 45 capped at 30)? Or is it much more likely that we get other graphics uses instead of frames?

I don't see that being the case... the only way anything offsets the CPU is if the dev moves some CPU-based tasks over to the GPU. But as long as the work is on the CPU, the only way you can go up to 60fps from 30fps is if you have a CPU that is twice as powerful.

Better explanation: if the PS4's CPU is being maxed out in a game running at 1080p@30fps, then the only way to run that game at 1080p@60fps on the Neo is if the Neo's CPU is capable of twice the performance of what's in the base PS4.

We can get more stable framerates thanks to the GPU, and lots of eye candy, but as long as the CPU is in a 33ms pipeline... all you get is 30fps.

I'm not following. What do you mean, the only way to get from 30 to 60 fps is by having a twice-as-powerful CPU?

So let's say (to simplify the argument) that I'm playing Battlefield 4 on PC with processor X and a GTX 670, getting about 60 fps on average at high quality.

If I want to double the fps to 120, should I:

a) get processor Y with double the frequency of processor X, but keep the same gfx card, or

b) get a GTX 970 and keep the same processor X?

 

Which one of these scenarios do you think will double your FPS?

 

Edit: just to help out, here's a reference benchmark for the same graphics card testing the fps output with different CPUs:

http://www.anandtech.com/bench/CPU/1291



Intrinsic said:
DonFerrari said:

I see, Perma... other aspects probably won't go up 4x, as you said; cost would be too high. Any chance of the extra power offsetting the CPU, and 1440p going up to 60fps from 1080p@30fps (or 45 capped at 30)? Or is it much more likely that we get other graphics uses instead of frames?

I don't see that being the case... the only way anything offsets the CPU is if the dev moves some CPU-based tasks over to the GPU. But as long as the work is on the CPU, the only way you can go up to 60fps from 30fps is if you have a CPU that is twice as powerful.

Better explanation: if the PS4's CPU is being maxed out in a game running at 1080p@30fps, then the only way to run that game at 1080p@60fps on the Neo is if the Neo's CPU is capable of twice the performance of what's in the base PS4.

We can get more stable framerates thanks to the GPU, and lots of eye candy, but as long as the CPU is in a 33ms pipeline... all you get is 30fps.

That is why I asked if it's possible to offload enough from the CPU to the GPU, not whether one dev or another will do it, because I'm pretty certain that if it's possible, the likes of Naughty Dog would do it (and I suggested that perhaps it would be 45 capped at 30). The devs will always have to choose what to improve and maximize and what to concede; I'm aware of that.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

setsunatenshi said:
Intrinsic said:

I don't see that being the case... the only way anything offsets the CPU is if the dev moves some CPU-based tasks over to the GPU. But as long as the work is on the CPU, the only way you can go up to 60fps from 30fps is if you have a CPU that is twice as powerful.

Better explanation: if the PS4's CPU is being maxed out in a game running at 1080p@30fps, then the only way to run that game at 1080p@60fps on the Neo is if the Neo's CPU is capable of twice the performance of what's in the base PS4.

We can get more stable framerates thanks to the GPU, and lots of eye candy, but as long as the CPU is in a 33ms pipeline... all you get is 30fps.

I'm not following. What do you mean, the only way to get from 30 to 60 fps is by having a twice-as-powerful CPU?

So let's say (to simplify the argument) that I'm playing Battlefield 4 on PC with processor X and a GTX 670, getting about 60 fps on average at high quality.

If I want to double the fps to 120, should I:

a) get processor Y with double the frequency of processor X, but keep the same gfx card, or

b) get a GTX 970 and keep the same processor X?

Which one of these scenarios do you think will double your FPS?

Edit: just to help out, here's a reference benchmark for the same graphics card testing the fps output with different CPUs:

http://www.anandtech.com/bench/CPU/1291

He is talking about putting in a CPU with double the processing capacity, not necessarily double the frequency.




DonFerrari said:
Intrinsic said:

I don't see that being the case... the only way anything offsets the CPU is if the dev moves some CPU-based tasks over to the GPU. But as long as the work is on the CPU, the only way you can go up to 60fps from 30fps is if you have a CPU that is twice as powerful.

Better explanation: if the PS4's CPU is being maxed out in a game running at 1080p@30fps, then the only way to run that game at 1080p@60fps on the Neo is if the Neo's CPU is capable of twice the performance of what's in the base PS4.

We can get more stable framerates thanks to the GPU, and lots of eye candy, but as long as the CPU is in a 33ms pipeline... all you get is 30fps.

That is why I asked if it's possible to offload enough from the CPU to the GPU, not whether one dev or another will do it, because I'm pretty certain that if it's possible, the likes of Naughty Dog would do it (and I suggested that perhaps it would be 45 capped at 30). The devs will always have to choose what to improve and maximize and what to concede; I'm aware of that.

It is possible to a point.
The reason why we don't do all our processing on a GPU is that GPUs are really bad at working on single large, complex tasks; they are best when working on thousands of small tasks.

CPUs aren't as good as a GPU when it comes to thousands of small tasks, which is why the two are separate, but they do excel at big, complex tasks.

With that said, CPUs have typically been doing some of those "small tasks", like physics calculations, which can be moved over to the GPU. The PlayStation 4 was already doing this, so whether such a thing can continue is another matter entirely.
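A minimal sketch of that serial-vs-parallel distinction, using NumPy's vectorized arrays as a stand-in for wide parallel hardware (illustrative only, not how a console engine is actually written):

```python
import numpy as np

# GPU-friendly: 100,000 independent particle updates in one data-parallel step.
positions = np.zeros((100_000, 3))
velocities = np.random.randn(100_000, 3)
dt = 1.0 / 30.0
positions += velocities * dt  # every element is computed independently

# CPU-friendly: a dependent chain; step i cannot start before step i-1,
# so extra parallel hardware doesn't help.
state = 1.0
for _ in range(100_000):
    state = 0.5 * state + 1.0
```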

JRPGfan said:
Airaku said:

Proof? I don't see any confirmation of the PS-Neo specs, nor anything that links to this. That is a pretty big claim you have going on, with no evidence to back it up.


If this is true, then this is the PS5, not a mid-cycle upgrade. It's a bigger jump than PS3 to PS4 was.

In terms of % performance increase... wasn't the jump from PS3 -> PS4 like 10x in most areas?

This will just be a 3x or so increase (if it ends up being 1.84 -> 5.5 teraflops).

 

Pemalite:

We were talking about APUs (iGPUs) and memory bandwidth.

You replied to me about DDR4 potentially being faster than GDDR5.

I assumed you were talking about system memory on motherboards for CPUs (because, you know, APUs and iGPUs).

I know graphics cards can use higher bus widths.

I'm saying that's not going to happen for normal consumer CPUs until we start using Hybrid Memory Cubes.

When that does happen, a large portion of the discrete GPU market will go away.

As systems are today, you would be right.
But there is still the potential for a 512-bit bus for system RAM to feed an APU; no one has done it because of cost. The number of traces required to support a 512-bit bus results in more PCB layers and thus complexity, which is one of the issues HBM solved by using an interposer.

With that said, AMD is working on a large and power-hungry APU for the HPC market, which is likely to use HBM on a 1024-bit or wider bus.

DonFerrari said:

I see, Perma... other aspects probably won't go up 4x, as you said; cost would be too high. Any chance of the extra power offsetting the CPU, and 1440p going up to 60fps from 1080p@30fps (or 45 capped at 30)? Or is it much more likely that we get other graphics uses instead of frames?

Very possible, but that will depend on the dev and what tasks they offload to the GPU; not everything can be offloaded to the GPU, though, due to architectural and fundamental chip design differences (serial vs parallel processors).

I would think most developers would use the extra horsepower to close the graphics gap between console and PC, though, as it shouldn't really cost them any extra development time; the work was already done for PC.




www.youtube.com/@Pemalite