HoloDust said:
Not an expert on the subject, but a few things to consider: "To get an idea of what a difference in precision 16 bits can make, FP16 can represent 1024 values for each power of 2 between 2^-14 and 2^15 (its exponent range). That's 30,720 values. Contrast this to FP32, which can represent about 8 million values for each power of 2 between 2^-126 and 2^127. That's about 2 billion values—a big difference." https://devblogs.nvidia.com/parallelforall/mixed-precision-programming-cuda-8/
The current performance king in gaming GPUs, the Titan X: FP32: 10,157 GFLOPS; FP16: 159 GFLOPS. This is of course NVIDIA's way of preventing gaming cards from being bought instead of Teslas for tasks that actually benefit from FP16... but it also shows how little FP16 performance matters in games. Honestly, the last time I recall any talk about FP16 in gaming was some 15 or so years ago. Sure, mobiles have it, but the degradation in quality seems to be quite noticeable.
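The counts in the quoted NVIDIA passage follow directly from each format's mantissa and exponent widths (10 explicit mantissa bits for FP16, 23 for FP32). A minimal Python sketch of that arithmetic (variable names are my own, not from the blog post):

```python
# FP16: 10 explicit mantissa bits -> 2**10 = 1024 values per power of 2 (binade)
# FP32: 23 explicit mantissa bits -> 2**23 = 8,388,608 values per binade
fp16_per_binade = 2 ** 10
fp32_per_binade = 2 ** 23

# Normal exponent ranges (inclusive): FP16 spans 2^-14 .. 2^15,
# FP32 spans 2^-126 .. 2^127.
fp16_binades = 15 - (-14) + 1    # 30 binades
fp32_binades = 127 - (-126) + 1  # 254 binades

fp16_total = fp16_per_binade * fp16_binades  # 30,720
fp32_total = fp32_per_binade * fp32_binades  # 2,130,706,432 (~2 billion)

print(fp16_total)
print(fp32_total)
```

This reproduces the "30,720 vs. about 2 billion" comparison from the quote; it counts normal values only (subnormals, zeros, infinities, and NaNs are excluded).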
Can you give an example? FunFan just gave one that seems to prove otherwise.
I predict NX launches in 2017 - not 2016