This is more or less like the system the leaked Microsoft paper showed: depending on demand or use case, different hardware can be active or put to sleep. Isn't this what the Wii U also does by now?
This is what Virtu MVP already aims to do on the PC front. I'm not so sure it does a great job, but the potential is certainly there.
niallyb
Scoobes said:
Doesn't the One X just use the Tegra 3 or Snapdragon chip based on region/model rather than actively switching chips? And the APU and discrete pairings AMD pushes are based on GPUs with the same or very similar architecture (effectively using CrossFire). This patent seems to describe a way of switching between two very different GPU architectures in the background. I don't think anyone else has really done this. |
There was (is) one company that tried it: Lucid, with its Lucid Hydra chip, which allowed you to use cards with different architectures, even mixing cards from Nvidia and AMD at the same time.
If I remember correctly, it had the same problem as SLI/CrossFire: it depends too much on drivers, something that wouldn't happen on a console.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
JEMC said:
There was (is) one company that tried it: Lucid, with its Lucid Hydra chip, which allowed you to use cards with different architectures, even mixing cards from Nvidia and AMD at the same time. If I remember correctly, it had the same problem as SLI/CrossFire: it depends too much on drivers, something that wouldn't happen on a console. |
That is quite interesting and would have been cool if it weren't for the driver issue. Sony could probably still get away with it by saying it's in a specific consumer device/hardware, although that makes the prior art argument far more compelling.
Sounds interesting, and pricey.
e=mc^2
Gaming on: PS4 Pro, Switch, SNES Mini, Wii U, PC (i5-7400, GTX 1060)
KylieDog said: Am I the only one who doesn't give a crap about all these rumours and would rather stay silent until the hard facts are made public? |
No, but those saying so are scarce among all the people turning this into a separate core for OS functions.
And although this is a real patent, keep in mind:
a) Sony has different product lines besides gaming in its portfolio,
and b) most firms patent everything an employee thought up while he was on the loo. Really, an idea might lead to a prototype but no real product, might not even reach the prototype stage, or might be considered too problematic to realize at the moment. But a patent is filed anyway, because patents are directly worth money. If you follow the tech industry: many firms that went bankrupt sold their assets, and the one thing that always fetches money is the patents. They are sold en bloc, in the thousands. The more patents you have, the better.
Scoobes said:
That is quite interesting and would have been cool if it weren't for the driver issue. Sony could probably still get away with it by saying it's in a specific consumer device/hardware, although that makes the prior art argument far more compelling. |
I just don't like vague patents like this which are clearly built on similar ideas by others. I don't see a difference between background switching of different CPU architectures and background switching of different GPU architectures. It's the same power-saving principle, surely?
No biggie, I just wanted to highlight that I thought it was a patent on dangerous ground if challenged (I can see this being an issue for mobile device developers).
slowmo said:
No biggie, I just wanted to highlight that I thought it was a patent on dangerous ground if challenged (I can see this being an issue for mobile device developers). |
I didn't think the mobile CPU switching used different architectures. Aren't they still based on the same ARM designs?
Scoobes said:
I didn't think the mobile CPU switching used different architectures. Aren't they still based on the same ARM designs? |
It's not different, I hate to admit. Check out the link below: they basically built the power-saving core into the Tegra 3 as an additional core, so technically it's one chip and one architecture. While that blows my initial example out of the water, I still think this has been demonstrated previously.
http://androidcommunity.com/nvidias-tegra-3-fifth-core-architecture-will-be-known-as-4-plus-1-20120222/
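The 4-PLUS-1 idea boils down to a simple policy: run light workloads on the low-power companion core and wake the four performance cores only when demand crosses a threshold. Here's a rough sketch of that kind of governor logic in Python (purely illustrative; the names and thresholds are made up, not NVIDIA's actual scheduler):

```python
# Illustrative sketch of a 4-PLUS-1-style cluster-switching policy.
# All names and thresholds here are hypothetical.

COMPANION = "lp_core"        # low-power companion core
PERFORMANCE = "perf_cores"   # the four main cores

WAKE_THRESHOLD = 0.70   # switch up above 70% load on the companion core
SLEEP_THRESHOLD = 0.20  # switch back down below 20% load

def pick_cluster(load: float, active: str) -> str:
    """Choose which cluster should run. The two different thresholds
    give hysteresis, so the system doesn't ping-pong between clusters
    when the load hovers near a single cutoff."""
    if active == COMPANION and load > WAKE_THRESHOLD:
        return PERFORMANCE
    if active == PERFORMANCE and load < SLEEP_THRESHOLD:
        return COMPANION
    return active  # no change

# Example: idle UI stays on the companion core, a heavy burst wakes
# the big cores, and dropping back to idle puts them to sleep again.
state = COMPANION
for load in (0.05, 0.10, 0.85, 0.60, 0.10):
    state = pick_cluster(load, state)
```

The point of the hysteresis gap is the same reason the real chip needs one: switching clusters has a cost, so you only want to do it when the demand change is decisive.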
Would this move make the console more expensive, or could they cut down on RAM and CPU because of this?
Interesting read!