JRPGfan said:

That Nvidia Kool-Aid :p

Just so we're clear: I'm pro-AMD. I've had multiple AMD GPUs in CrossFire for generations.
Don't assume I'm a "fanboy".

JRPGfan said:

The Tegra X1 uses like 30 watts of power to reach the 512 GFLOPS it's able to do in the Shield console.

Without powering a screen or anything extra.

Hell, a hard disk drive is 4-5 watts alone, and the PS4 & XB1 use those.

Even if Nvidia tried to do a 1-teraflop mobile chip, it would still be in the 50+ watt range.

You make it sound like the chips inside the PS4 & XB1 are bad, and they're really not.

Not at the power levels of graphics they do.



The Tegra X1's GPU uses roughly 1.51 watts of power on average in the GFXBench Manhattan benchmark:
http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

Here it is in the Shield TV using 10 watts.
http://www.phoronix.com/scan.php?page=article&item=nvidia-tegra-x1&num=1

And here we see it draw anywhere between 3.6 W and 19.4 W:
http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/9

So how is it a 30-50 W chip again? And remember: Volta is set to double performance again at the same fabrication node and the same power draw, which would put it close enough to an Xbox One.
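Rough paper math on that claim (both figures are the commonly cited theoretical FP32 peaks, not measured game performance):

```python
# Paper FLOPS comparison; theoretical peaks only, not measured throughput.
tegra_x1_gflops = 512      # Tegra X1 FP32 peak
xbox_one_gflops = 1310     # Xbox One GPU FP32 peak

doubled = 2 * tegra_x1_gflops   # a hypothetical "doubles it again" successor
print(f"{doubled} GFLOPS vs {xbox_one_gflops} GFLOPS "
      f"= {doubled / xbox_one_gflops:.0%} of an Xbox One on paper")
```

That comes out to roughly 78% of an Xbox One on paper, before any double-rate FP16 tricks, so "close enough" is defensible.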

The Google Pixel C, with its Tegra X1, is clocked faster than the Nintendo Switch, has a bigger, higher-resolution display with higher power consumption, and still gets over 13 hours of battery life in web browsing.
With all the CPU cores pegged, that drops to 5 hours, which is still better than most other tablets.
http://www.anandtech.com/show/9972/the-google-pixel-c-review/6
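Back-of-the-envelope on what those runtimes imply, assuming the ~34.2 Wh battery capacity commonly listed for the Pixel C (my figure, not taken from the review above):

```python
# Implied average system power from runtime; the 34.2 Wh capacity
# is an assumed figure for the Pixel C, not stated in the linked review.
battery_wh = 34.2

for scenario, hours in (("web browsing", 13), ("CPU cores pegged", 5)):
    print(f"{scenario}: ~{battery_wh / hours:.1f} W average system draw")
```

That's ~2.6 W browsing and ~6.8 W fully loaded, screen included. Nowhere near a 30-50 W chip.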

As for the drives: remember, the Switch is using NAND flash, not a mechanical drive, and its power consumption is significantly lower... And yes, screens do play a big part in this, but Nintendo isn't using a massive high-resolution display (and one would hope it is an LTPS panel), so power consumption isn't going to be as drastic as in other tablets like the Pixel C.
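For a sense of scale, a quick sketch with typical assumed figures (datasheet-order numbers, not measurements of these consoles):

```python
# Typical active-power figures, assumed for illustration only.
hdd_active_w = 4.5      # 2.5" laptop-class HDD, as in PS4/XB1
nand_active_w = 0.3     # eMMC/NAND storage package under load

print(f"HDD draws ~{hdd_active_w / nand_active_w:.0f}x the power of NAND storage")
```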

JRPGfan said:

[chart: GPU comparison at 1920x1080]

This chart is using old review scores, but AMD drivers for the 470/480 have come a long way since release.

And look at that: the most efficient GPU chip at 1920x1080 is an RX 470!

If you are basing efficiency on the graph above, then no, it's really not. That chart is price/performance, not performance per watt, and the two can rank cards very differently, as the sketch below shows.
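Here's a minimal sketch with invented numbers (not benchmark results) showing how the two metrics can rank the same two cards in opposite orders:

```python
# Performance per watt vs performance per dollar; the fps, watt and
# price figures are made up for illustration, not measurements.
cards = {
    "cheap, power-hungry card": {"fps": 60, "watts": 150, "price": 180},
    "pricier, efficient card":  {"fps": 70, "watts": 120, "price": 250},
}

for name, c in cards.items():
    print(f"{name}: perf/W = {c['fps'] / c['watts']:.2f}, "
          f"perf/$ = {c['fps'] / c['price']:.2f}")
```

The cheap card wins on perf/$ (0.33 vs 0.28) while losing on perf/W (0.40 vs 0.58), which is exactly the distinction that chart glosses over.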

As for the Radeon RX 470 specifically:
At load and at idle it uses more power than the GeForce GTX 1060 Founders Edition, and it certainly loses in the performance stakes by a significant margin, given the GeForce 1060 can beat the Radeon RX 480.

But don't take my word for it:
http://www.tomshardware.com/reviews/amd-radeon-rx-470,4703-6.html

JRPGfan said:

Yeah, the RX 470/480 aren't at the top of the list.

Will say this though: the launch drivers for the RX 470/480 weren't the best optimised; newer drivers have seen some pretty decent gains.

It's something Nvidia is usually good about, optimising drivers for launch so their cards review well day one.

Nvidia drivers haven't stopped improving either.

haqqaton said:

After seeing some OpenGL vs Vulkan comparisons, and considering that the Switch supports Vulkan and that Vulkan is a great fit for ARM mobile chips like the Tegra X1, I think we can confidently say that the Switch, even undocked, will be a fair bit better than the Wii U. To be fair, they are comparing Vulkan to OpenGL ES in the videos below, but I think the point stands.

Vulkan is irrelevant here; it's not going to be some magical "secret sauce".
The Wii U has a low-level API that is higher-performance than Vulkan if a developer wishes to build against it.
The same goes for the Xbox and PlayStation platforms.

An API like Vulkan does offer better performance than higher-level APIs like DirectX 11 and OpenGL, mainly by cutting per-draw-call CPU overhead in the driver.
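A toy model of why that matters; the per-call costs below are made up for illustration, not measurements of any real driver:

```python
# Toy model of CPU-side draw submission cost under a high-overhead vs
# low-overhead API. Per-call costs are hypothetical.
draw_calls_per_frame = 5000
api_costs_us = {
    "high-overhead API (DX11/GL-style)": 20,   # assumed microseconds per call
    "low-overhead API (Vulkan-style)":    2,   # assumed microseconds per call
}

for name, cost_us in api_costs_us.items():
    frame_ms = draw_calls_per_frame * cost_us / 1000
    print(f"{name}: {frame_ms:.1f} ms of CPU time per frame")
```

Under these assumptions, 5,000 draw calls cost 100 ms on the high-overhead path, blowing a 16.7 ms (60 fps) frame budget on submission alone, versus 10 ms on the low-overhead path. That overhead reduction is the whole benefit, not secret sauce.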

Wyrdness said:

It's actually around twice Wii U performance when undocked, going by the specs the rumours are giving.

Specs are one thing. Real-world performance is another thing entirely.

The Switch will beat the Wii U undocked; the question is by how much.
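For a rough paper comparison, here's the arithmetic using the leaked clocks doing the rounds (rumoured figures, not confirmed specs) and the commonly cited Wii U estimate:

```python
# Paper FP32 FLOPS under the rumoured Switch GPU clocks; the clocks and
# the Wii U figure are unconfirmed estimates, used only for scale.
cuda_cores = 256               # Tegra X1: 2 SMs x 128 CUDA cores
ops_per_core_per_clock = 2     # one FMA counts as 2 FLOPs, FP32

for mode, mhz in (("undocked", 307.2), ("docked", 768.0)):
    gflops = cuda_cores * ops_per_core_per_clock * mhz / 1000
    print(f"{mode}: ~{gflops:.0f} GFLOPS FP32")

print("Wii U (commonly cited estimate): ~176 GFLOPS")
```

On paper FP32, the undocked figure (~157 GFLOPS) isn't even clearly ahead of the Wii U estimate (double-rate FP16 changes the picture), which is exactly why spec-sheet multipliers don't settle real-world performance.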

Alby_da_Wolf said:
Same CPU speed in both modes is quite obvious, as game logic must always run the same in either mode (at most they could cut it by a few tens of MHz by suspending non-essential, non-gaming services that they possibly chose to run in the background while docked). GPU speed, and so graphics level, is obviously where big power savings can be achieved in portable mode, and this too shouldn't be a matter of argument.
What can really disappoint people, with sound reasons, is the choice of top speed when docked.

The annoying part is... Nintendo could have thrown out the big cores in the Tegra chip (the Cortex-A57s), kept only the slower, more power-efficient Cortex-A53s, and then used that extra TDP headroom to throw more clock rate at the GPU.

They could have saved money as well thanks to the smaller die.
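A sketch of that trade-off with entirely hypothetical power figures, just to show the logic (nothing here is an official number):

```python
# TDP-reallocation sketch; all power figures below are assumptions
# invented for illustration, not measured or official values.
a57_cluster_w = 2.0     # assumed draw of the big A57 cluster under load
a53_cluster_w = 0.7     # assumed draw of the little A53 cluster
gpu_w_per_100mhz = 0.5  # assumed marginal GPU power per extra 100 MHz

freed_w = a57_cluster_w - a53_cluster_w
extra_mhz = freed_w / gpu_w_per_100mhz * 100
print(f"Dropping the big cores frees ~{freed_w:.1f} W, "
      f"worth roughly +{extra_mhz:.0f} MHz on the GPU under these assumptions")
```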




--::{PC Gaming Master Race}::--