
Forums - Sony Discussion - Prediction: PS5 will have the longest lifecycle of any PS console

Kristof81 said:

It will be, simply because by that time (2025-2030) there won't be any new traditional consoles, just subscription-based streaming services. The PS5 very likely won't have a physical disc drive, and it will be used as the default device for Playstation Now until Sony decides to go multiplatform.

So you think the PS5 will only come out in 2025, and that it will feature only digital games/streaming, when physical is still the strongest form factor in 2018?





Just no. People want new things, and hype is generated by releasing a new and better console.



I'll split this across posts since it's getting too long; let me explain a little more.

1 - CrazyGPU said:
Let's assume, as most analysts and people guess, that the PS5 will be released in 2020 or 2021, on a 7nm fabrication process, with enough power for 4K 30 fps gaming.

Permalite said:
Will it though? I think people underestimate how many resources you truly need to achieve 4K 30 fps and an accompanying generational leap in fidelity.


CrazyGPU said:
I said 4K 30 fps; I didn't say it would have an accompanying generational leap in fidelity. I would be glad if it really achieves native 4K at 30-60 fps in all games. What you describe would be a dream, but far from realistic for the PS5. I can't see them achieving a gen leap in fidelity in 3 years with today's hardware, within a console power envelope, at 400-500.
And with 8 TFLOPS, don't expect anything more than medium-quality native 4K graphics.
Again, teraflops are not exact: if AMD radically changes its architecture, or Sony goes with Nvidia, the TF number will change, but I need something for a relative comparison.

---------------------------------------------------------

2 - CrazyGPU said:
Now, in this gen of the console wars, hardware is about teraflops and graphics resolution (I'm not talking about software, exclusives, policies, etc., just hardware).

Permalite said:
No it's not about teraflops.
That's just advertising fluff so that one company's platform looks better than another's, and people who don't know any better take that number and run with it without any understanding of its implications for what the hardware can actually do.

This has happened many times in the past, like with bits or GHz.

And just like "bits", flops aren't an accurate measure of the complete hardware capability of a console.


CrazyGPU said:
Yes, this gen is about teraflops and graphic resolution. Exactly as you said, people take that number and run with it. It's how Microsoft and Sony are fighting the marketing war. They don't speak much about bandwidth, geometry throughput, pixel filtering, texel filtering, or the bottleneck of the CPU involved. That's how they show the world how capable their consoles are: TFLOPS. It doesn't mean it's the only thing that matters, but it's how they present their products. The first thing MS said about the One X was that it was a 6 TF machine. They try to win users' minds, as Intel did with GHz two decades ago.

-----------------------------------------------------------

3 - CrazyGPU said:
If we consider that the PS4 uses close to 2 teraflops for 2 megapixels per frame, then we would need at least an 8-teraflop machine for 8 megapixels.

Permalite said:
Correlation doesn't equate to causation. You are drawing a false equivalence, which is a logical fallacy. Aka. Wrong.


CrazyGPU said:
It's math. If you want to output 2 million pixels on screen, you need 2 teraflops at a fixed quality (meaning without changing anything else, just resolution). Then if you need to output 8 million, you need 4 times that, at the same fixed quality. And that's assuming other things, like bandwidth, don't become a bottleneck; if they do, you need to balance those too. Of course, if you want to implement better AA, shading, lighting, rays, etc., you would need even more power, which would mean even more flops for the calculations.
I'm not being extremely precise: the PS4 is not 2 teraflops, it's 1.84, and you get just a little more than 2 million pixels on screen. But the idea is the same, as when someone says 1000 MB of RAM is 1 GB, which is close to 1024, the right number.
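CrazyGPU's scaling argument can be sketched as a quick back-of-envelope calculation. The figures are the ones quoted in the thread, and this is a simplification that, as the post itself concedes, ignores bandwidth, CPU, and other bottlenecks:

```python
# Back-of-envelope sketch of the "flops scale with pixels" argument.
# Assumed figures (from the thread): PS4 ~1.84 TFLOPS at 1920x1080.
ps4_tflops = 1.84
ps4_pixels = 1920 * 1080            # ~2.07 million pixels
uhd_pixels = 3840 * 2160            # ~8.29 million pixels

# Holding quality per pixel fixed, required FLOPS scale linearly
# with pixel count.
scale = uhd_pixels / ps4_pixels     # exactly 4.0
needed_tflops = ps4_tflops * scale  # ~7.4 TFLOPS for "PS4 quality" at 4K
print(scale, needed_tflops)
```

Which lands close to the thread's 8 TFLOPS ballpark, once the 1.84 is rounded up to 2.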



4 - CrazyGPU said:
The PS5 needs to be 3840 x 2160, a native 4K console. So if we want it to handle the same games with the same shading and lighting as the PS4 (no improvement, just native 4K resolution for the same game), it needs to push 8.3 megapixels per frame.

Permalite said:
It doesn't need to be anything. Like every other preceding console generation, resolution will be completely up to the developer.

The Playstation 4 isn't even a true 1080p console; many games operate at 900p, i.e. 1600x900 rather than the full 1920x1080.

The Playstation 3 had games that ran at sub-720p resolutions.

In fact, the base Playstation 4 supports HDMI 1.4, so in theory it could do 4K 30 fps if Sony/the developer allowed it and the visuals were dialed back.



CrazyGPU said:

Yes, resolution is up to the developer, but how did Microsoft market the Xbox One X, for example? As a native 4K console. If in ten years you had to say what resolution the PS4 handled, you would say 1080p, even though some games don't reach that even at 30 fps; most do. The PS5 will have to run most games at 4K (I'm not talking about Minecraft or Tetris at 8K here), even though some may run at 2K checkerboard, but that will be the exception, not the rule, just like now with the PS4 Pro: some games run at 4K checkerboard and some at a little more than 1080p. I think Sony should make a machine capable of running most games at native 4K 30 fps to differentiate it from the PS4 Pro and take advantage of new TVs. Companies use graphics as a hook now, and it's a better hook than showing off a better CPU or RAM.

--------------------------------------------------------------------------------------------------------


5- CrazyGPU said:

And we can have that by 2019-2020. The Xbox One X is 6 TF and just launched.

Permalite said:
The time frame of when the Xbox One X launched is ultimately irrelevant.
If the technology doesn't exist in the PC mainstream, then it's never coming to a console; it's as simple as that.

As for flops, again: it is a useless metric on its own. You can have a graphics processor with fewer flops outperform a graphics processor with more flops. It's stupid and nonsensical because games actually need more than just flops to render a scene.


CrazyGPU said:
It's not useless, but it's not accurate either. For example, if the PS5 were a 1-teraflop machine, it wouldn't be able to calculate fast enough for 4K resolution. But there is a window. For example, a GeForce GTX 1070 has close to 7 teraflops, but its architecture is fast enough to be compared to an AMD Radeon Vega 56 with 10.5 teraflops. That means a 10-teraflop console without CPU bottlenecks should be able to run today's games at 4K 30 fps at high quality. We have to see if they can put such a capable graphics processor within a console power envelope by 2020. If they swap AMD for Nvidia, 7 teraflops would be enough for 4K 30 fps if the architecture doesn't change, but BC would be more difficult, I guess. And again, bandwidth and the rest of the graphics pipeline should be balanced, or there will be a bottleneck. It will take more than the usual 6 years to have a processor with the flops and everything else necessary to achieve 8K, and that's without increasing the quality of AAA games.
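The "window" described here can be expressed as a rough per-architecture efficiency factor. The factors below are illustrative assumptions chosen so that the 1070/Vega 56 comparison in the post comes out roughly even; they are not benchmark results:

```python
# Hypothetical "useful work per flop" factors per GPU architecture,
# picked so a GTX 1070 (~7 TF) roughly matches a Vega 56 (~10.5 TF),
# as the post claims. Illustrative assumptions, not measured data.
EFFICIENCY = {"pascal": 1.0, "gcn_vega": 7.0 / 10.5}

def effective_tflops(raw_tflops, arch):
    """Raw TFLOPS scaled by an assumed architecture efficiency."""
    return raw_tflops * EFFICIENCY[arch]

gtx1070 = effective_tflops(7.0, "pascal")      # 7.0
vega56  = effective_tflops(10.5, "gcn_vega")   # ~7.0
print(gtx1070, vega56)
```

The point being that raw TFLOPS only compare meaningfully within one architecture, which is also what the post argues.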

----------------------------------------------------------------------------------------------
6 - CrazyGPU said:
The PS3's Cell processor was fabricated on a 90 nm process. That means each transistor was 90 nm in size.


Permalite said:
Are you sure about that? 14nm FinFET is actually based on the 20nm planar process.
Don't fall for the "nm" marketing angle that fabs push; they fluff up the numbers to make themselves look good.

You should try checking out the individual BEOL and FEOL feature sizes at each geometry node sometime; you might walk away a little surprised.
And that is the reason why Intel has typically held the fabrication edge... because even though GlobalFoundries has a "12nm" process, Intel's 14nm+++ likely still has the edge.


CrazyGPU said:
Again, it doesn't matter if it's 12, 10, or 7 nm. We are getting close to the molecular scale for transistors. Intel's 80386 was built on a 1500 nm process in 1985. Fabs are using extreme ultraviolet light and thinking about X-rays. There aren't many shrinks left, and each one is harder and more expensive. Fabs are beyond expensive to build. Moore's Law has slowed down; it's not the year 2000, when you had a new graphics card architecture every year. So the shrinks needed to justify a PS6 will take more time, hence the PS5 will have a longer lifespan.

-------------------------------------------------------------------------------------


7 - CrazyGPU said:
The PS4 Pro's APU was fabricated on a 16 nm process, and most likely the PS5 will be fabricated on a 7 nm process.

Permalite said:

7nm is a given.


CrazyGPU said:
1 nm is about the size of 10 hydrogen atoms, so we are talking about features roughly 70 atoms across for a "7 nm" transistor. They don't have much room left for miniaturization.

Permalite said: Citation needed.


CrazyGPU said:
Atom size. https://hypertextbook.com/facts/1996/MichaelPhillip.shtml

They need new materials; they are studying silicon replacements; they are getting near molecular sizes; they have more quantum mechanics problems; the machinery in fabs is becoming exponentially expensive. They should get around all of it, but it will take longer than ever to make a justifiable PS6.

Of course you can make one in just a couple of years if it's desktop-sized, has 3 high-end graphics cards, and has 1000 W available. But that wouldn't be a console power envelope.
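The "70 atoms" figure above is easy to sanity-check, assuming (per the linked reference) that a hydrogen atom is roughly 0.1 nm, i.e. 1 angstrom, across:

```python
# Sanity check of the "70 atoms" claim.
# Assumption: hydrogen atom diameter ~0.1 nm (1 angstrom).
H_ATOM_DIAMETER_NM = 0.1
process_label_nm = 7                 # the "7 nm" node label
atoms_across = process_label_nm / H_ATOM_DIAMETER_NM
print(atoms_across)                  # -> 70.0
```

Note this takes the "7 nm" label at face value; as Pemalite points out above, actual feature sizes at a given node are larger than the marketing name suggests.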

Last edited by CrazyGPU - on 13 February 2018

I think you'd be surprised if you looked back in 10 years at this thread and realized how much technology has advanced.




8 - CrazyGPU said:
It's taking longer, and getting harder, to develop smaller transistor fabrication processes. And because of quantum mechanics, the performance gains they deliver are also getting smaller. So getting to, let's say, 3 nm will take a really long time and be really, really expensive. If the way we make processors doesn't change, we're going to hit a wall soon.

Permalite said:
DRAM/NAND have taken note of this. So instead of going smaller, they went taller.

More exotic materials like carbon nanotubes, and triple/quadruple patterning, will go some way toward getting us to smaller geometry sizes.



CrazyGPU said:
That's what I said: the way we make processors has to change. We'll see whether it's carbon nanotubes or something else, but it will take time.

-------------------------------------------------------

9- CrazyGPU said:
Even Nintendo said that getting into HD was way more expensive than developing for SD.

Permalite said:
That's just simplified rhetoric. Building a game for SD or HD costs exactly the same; on PC you can take any old game and drive its resolution from 480p all the way to 8K. It's just a resolution change, often exposed through a configuration file if it isn't listed in the game's settings.

It's building the assets that look great at higher definition that costs.


CrazyGPU said:
Developers make new assets for higher resolutions because they need them to look better.
As you said, that's what costs, so the cost goes up.

--------------------------------------------------


10 - CrazyGPU said:
FOR A PS6 TO HAVE A REASON TO EXIST, it should be an 8K console. And even at 8K we'll be seeing diminishing returns in graphics.

Permalite said:
Why should 8k be a requirement for the Playstation 6?


CrazyGPU said:
Why shouldn't it? How would people justify buying a PS6 at the same resolution? If your answer is ultra-quality graphics at 4K, then the console would be able to reach 8K at medium quality, as with many PS4 Pro games now. Take Shadow of the Colossus: 4K 30 fps, or dynamic (around 2K, I guess) at 60. Tomb Raider: 4K checkerboard, or 1080p at high-quality settings. Other components, like better CPUs or RAM, don't consume much power, so they are not a constraint on making a better machine; an 8000-series i7 consumes around 80 W, nothing compared to a 250 W high-end graphics card. And again, companies sell what you can see. That's one of the reasons why VR is so hard to sell: 2 million VR units sold versus 74 million PS4s.

11 - CrazyGPU said:
Of course I could be wrong, but for that, companies need to change the way they make CPUs and GPUs to improve performance faster, or there has to be a VR fever so high that millions of people pay more for beastly consoles and companies fight over 4K per eye at 90 fps. We're not there yet.


Permalite said:
There is a focus on efficiency right now, not brute force with graphics. Consoles are simply at the mercy of the PC's development cadence on this front.



12 - CrazyGPU said:
There also can be a new console with the same resolution, but would it make sense?

Permalite said:
Sure it would make sense. There's more to graphics than resolution.



CrazyGPU said:
I agree with you. Full HD films look much better than today's 4K gaming. But you know, the number sells...
It would make sense for us, but not for Sony's or Microsoft's marketing departments; they would go for 8K. Also, a 4K console with awesome ultra graphics would be capable of 8K at medium quality. If they can reach the number, they are going to market it.



Nice posts, Permalite. I think that even your idea of better graphics, say 4K 60 fps at ultra quality, would have the same hardware demands as 8K, or even more, if you want a generational leap, versus 8K gaming maintaining PS4 quality (I'm not talking about 8K Tetris here). And for that to fit within a console power envelope at a reasonable console price, it will take many years. That's one of the reasons why I think the PS5 will have the longest lifecycle. That doesn't mean Sony can't launch a PS5 Pro, PS5 Ultra, and PS5 Ultimate, upgrading the cash-making machinery before the PS6, hehehe.

I hope I'm wrong and they change the way they make processors; that would bring premium next-gen 4K graphics, or the 8K requirement, closer. But I doubt it for the PS6. And besides, in 20 years we'll probably have streaming gaming all over the place. I will miss the old times then.



RaptorChrist said:
I think you'd be surprised if you looked back in 10 years at this thread and realized how much technology has advanced.

I wish. But think about this. 

2007: BioShock, CoD: Modern Warfare, Crysis. PS3.

2017: CoD: WWII, Wolfenstein II. PS4.

The jump is good, but not awesome.

 

Now 1997: Quake. Half-Life not even out yet, Metal Gear Solid in development. PS1.

2007: Metal Gear Solid 4 in development, Crysis, Modern Warfare.

The jump in graphics is insane.

 

2017

2027: The same games with more resolution and better textures. I think we will have a smaller jump than from 2007 to 2017. Not a real gen leap.

The only thing you will have in Sony's console space is variants of the PS5 and PS4.



CrazyGPU said:

I can't see them achieving a gen leap in fidelity in 3 years with today's hardware, within a console power envelope, at 400-500.

That's your problem. You are only thinking about the next console generation in terms of today's hardware, rather than the hardware we will have in a few years' time.

CrazyGPU said:

Again, teraflops are not exact: if AMD radically changes its architecture, or Sony goes with Nvidia, the TF number will change, but I need something for a relative comparison.

Teraflops are not exact? How about... teraflops are useless unless you actually understand what they represent.

CrazyGPU said:
Yes, this gen is about teraflops and graphic resolution. Exactly as you said, people take that number and run with it. It's how Microsoft and Sony are fighting the marketing war. They don't speak much about bandwidth, geometry throughput, pixel filtering, texel filtering, or the bottleneck of the CPU involved. That's how they show the world how capable their consoles are: TFLOPS. It doesn't mean it's the only thing that matters, but it's how they present their products. The first thing MS said about the One X was that it was a 6 TF machine. They try to win users' minds, as Intel did with GHz two decades ago.

And you fell for it, hook, line and sinker.

CrazyGPU said:
It's math. If you want to output 2 million pixels on screen, you need 2 teraflops at a fixed quality (meaning without changing anything else, just resolution). Then if you need to output 8 million, you need 4 times that, at the same fixed quality. And that's assuming other things, like bandwidth, don't become a bottleneck; if they do, you need to balance those too. Of course, if you want to implement better AA, shading, lighting, rays, etc., you would need even more power, which would mean even more flops for the calculations.
I'm not being extremely precise: the PS4 is not 2 teraflops, it's 1.84, and you get just a little more than 2 million pixels on screen. But the idea is the same, as when someone says 1000 MB of RAM is 1 GB, which is close to 1024, the right number.

It's not math. You are asserting a logical fallacy by taking two different constructs and forcing a relationship between them.

Until you can tell me how exactly single precision floating point relates to resolution, your argument is completely baseless... Because the Playstation 2 operates at 6.2 GFLOPS.
The Playstation 4 is 1,840 GFLOPS. Aka. A 296x increase.

The Playstation 2 typically rendered at 307,200 pixels, whilst the Playstation 4 pushes 2,073,600 pixels. Aka. A 6.75x increase.

Ergo. Flops have no relation to rendered resolution.
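Pemalite's counter-example can be reproduced with the numbers quoted above (assumed render targets: 640x480 for the PS2, 1920x1080 for the PS4); the point is that flops grew roughly 44 times faster than pixel counts between the two consoles:

```python
# Reproducing the PS2-vs-PS4 comparison with the figures quoted above.
ps2_gflops, ps4_gflops = 6.2, 1840.0
ps2_pixels = 640 * 480               # 307,200 pixels
ps4_pixels = 1920 * 1080             # 2,073,600 pixels

flops_ratio = ps4_gflops / ps2_gflops   # ~296.8x more flops
pixel_ratio = ps4_pixels / ps2_pixels   # 6.75x more pixels
print(round(flops_ratio, 1), pixel_ratio)
```

The gap between the two ratios is the extra per-pixel work (shading, lighting, post-processing) that later GPUs spend their flops on.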


CrazyGPU said:
Yes, resolution is up to the developer, but how did Microsoft market the Xbox One X, for example? As a native 4K console.

How Microsoft marketed the console is ultimately irrelevant.
If anyone thought that a console launching in 2017 with mid-range PC hardware was going to do native 4k across the board... Well. They were idiots.


CrazyGPU said:
The PS5 will have to run most games at 4K (I'm not talking about Minecraft or Tetris at 8K here), even though some may run at 2K checkerboard, but that will be the exception, not the rule, just like now with the PS4 Pro: some games run at 4K checkerboard and some at a little more than 1080p. I think Sony should make a machine capable of running most games at native 4K 30 fps to differentiate it from the PS4 Pro and take advantage of new TVs. Companies use graphics as a hook now, and it's a better hook than showing off a better CPU or RAM.

The Playstation 5 doesn't have to do anything. Sony will leverage the best bang-for-buck mid-range hardware for its next-generation console; if it's capable of native 4K, it is capable of native 4K. If it's not, it's not. It's really that simple.

But it most certainly is up to the Developer if they wish to fully leverage that capability.

CrazyGPU said:
It's not useless, but it's not accurate either. For example, if the PS5 were a 1-teraflop machine, it wouldn't be able to calculate fast enough for 4K resolution.

Sure.

CrazyGPU said:

A GeForce GTX 1070 has close to 7 teraflops, but its architecture is fast enough to be compared to an AMD Radeon Vega 56 with 10.5 teraflops.

And do you have an understanding of why that is the case?

A jump from 7 to 10.5 teraflops is a difference of 3.5 teraflops. More than two Xbox Ones combined; that's not just a small divide right there.


CrazyGPU said:

That means a 10-teraflop console without CPU bottlenecks should be able to run today's games at 4K 30 fps at high quality.

The base Xbox One/Playstation 4 can technically do 4k 30fps with visuals dialed back as they both support HDMI 1.4.

CrazyGPU said:

If they swap AMD for Nvidia, 7 teraflops would be enough for 4K 30 fps if the architecture doesn't change, but BC would be more difficult, I guess.

Wouldn't provide next gen fidelity.

nVidia isn't going to happen. Period. Why? Because a single chip solution is cheaper.


CrazyGPU said:

And again, bandwidth and the rest of the graphics pipeline should be balanced, or there will be a bottleneck. It will take more than the usual 6 years to have a processor with the flops and everything else necessary to achieve 8K, and that's without increasing the quality of AAA games.

It is impossible to remove bottlenecks.
A bottleneck can also change depending on the game/scene being rendered and so on.

CrazyGPU said:

Again, it doesn't matter if it's 12, 10, or 7 nm.

Actually it does.

CrazyGPU said:

We are getting close to the molecular scale for transistors.

And that's fine, the great thing about chip design is that they are working around it to various degrees.

CrazyGPU said:

There aren't many shrinks left, and each one is harder and more expensive.

Ironically... NAND/DRAM has sidestepped this issue for the time being; you should look at what they are doing.
Some NAND manufacturers even started producing their NAND at 55 nm rather than, say, 14 nm, and the chips were smaller and offered more capacity.

CrazyGPU said:

Moore's Law has slowed down.

Moore's Law isn't a real law, it was an observation.

CrazyGPU said:

It's not the year 2000, when you had a new graphics card architecture every year.

Even in the year 2000, ATI/nVidia weren't rolling out new GPU architectures every year; they were doing significant, albeit iterative, updates.
I mean, the Radeon 8500 is based on the foundations of the Radeon 7500.

The Radeon 9700's design, thanks to ArtX, laid the foundations for the X800 and X1900 series.

CrazyGPU said:

They need new materials; they are studying silicon replacements; they are getting near molecular sizes; they have more quantum mechanics problems; the machinery in fabs is becoming exponentially expensive. They should get around all of it, but it will take longer than ever to make a justifiable PS6.

Or. If you can't go smaller... You go with bigger geometry sizes and you go taller like NAND/DRAM.

AMD took note of how die shrinks are starting to stall out... And took advantage of that with Ryzen. So instead of making one giant monolithic chip to rule them all... They took smaller chips that are cheaper to manufacture and stitched them together... And because the chips are smaller, they get more workable chips per wafer.

What you are stating isn't intrinsically wrong, but there are tons of ways to get around the problem that manufacturers are looking at.

CrazyGPU said:

Why shouldn't it? How would people justify buying a PS6 at the same resolution?

Better graphics?

CrazyGPU said:

If your answer is ultra-quality graphics at 4K

Better performance?

CrazyGPU said:

That's one of the reasons why I think the PS5 will have the longest lifecycle. That doesn't mean Sony can't launch a PS5 Pro, PS5 Ultra, and PS5 Ultimate, upgrading the cash-making machinery before the PS6, hehehe.

I think in general we will see more iterative console releases that leverage the PC's typical cadence of hardware improvements.
I.e. the Playstation 4 Pro/Xbox One X.

Will that extend the console cycle? I have no idea; we don't have a precedent yet. So let's wait and see what this console generation gives us before asserting hypotheticals.




Pemalite said:

 

CrazyGPU said:
It's math. If you want to output 2 million pixels on screen, you need 2 teraflops at a fixed quality (meaning without changing anything else, just resolution). Then if you need to output 8 million, you need 4 times that, at the same fixed quality. And that's assuming other things, like bandwidth, don't become a bottleneck; if they do, you need to balance those too. Of course, if you want to implement better AA, shading, lighting, rays, etc., you would need even more power, which would mean even more flops for the calculations.
I'm not being extremely precise: the PS4 is not 2 teraflops, it's 1.84, and you get just a little more than 2 million pixels on screen. But the idea is the same, as when someone says 1000 MB of RAM is 1 GB, which is close to 1024, the right number.

It's not math. You are asserting a logical fallacy by taking two different constructs and forcing a relationship between them.

Until you can tell me how exactly single precision floating point relates to resolution, your argument is completely baseless... Because the Playstation 2 operates at 6.2 GFLOPS.
The Playstation 4 is 1,840 GFLOPS. Aka. A 296x increase.

The Playstation 2 typically rendered at 307,200 pixels, whilst the Playstation 4 pushes 2,073,600 pixels. Aka. A 6.75x increase.

Ergo. Flops have no relation to rendered resolution.


OK, I'll go over the teraflops thing again.

I understand that there are other things besides teraflops in the graphics pipeline. In a GPU you have many cores; you have decoders, buffers, the execution units, texture units, etc. Execution units can be 16-bit, 32-bit, 64-bit, SIMD, or other. Then you have to feed the processor: you have different levels of cache, then the bus bandwidth to memory, the type of memory, its frequency, ROPs, and so on. It's complex, and they try to balance the hierarchy to feed the processor. The processor does 32-bit FP operations, and we call each one a flop.

Now, high-end Radeon GPUs have between 10.5 and 13 teraflops, and their graphics chips are still less performant in many cases than Nvidia's GTX 1080, which has 9. Of course FP throughput is not the only thing you can use to compare; you can also talk about peak texture and pixel filtering, peak rasterization rate, or bandwidth.

It's not precise for comparing graphics card performance, and worse if you want to compare different brands and architectures, BUT IT GIVES YOU AN IDEA. And speaking of the same architecture, AMD in this case, we can expect that an 11-13 teraflop AMD graphics card would be able to run 4K at 30 fps.

Now, I'll try to address your PS2 example with something similar.

There is a guy I respect a lot, who I believe understands these things better than almost anybody; he was in charge of all the Unreal engines: Epic Games founder Tim Sweeney.

At his DICE 2012 session he showed a slide with a computational analysis predicting what was needed for next-gen consoles. What did he use for it? Teraflops.

DOOM 1993: 320 x 200 x 30 fps x 6 operations per pixel = 10 MFLOPS.

Unreal 1998: 1024 x 768 x 30 fps x 48 ops per pixel = 1 gigaflop.

Samaritan demo 2011: 1920 x 1080 x 30 fps x 40,000 operations per pixel = 2.5 teraflops.

You can see it here, from 5:46 to 7:56: https://www.youtube.com/watch?v=XiQweemn2_A

And next gen (the PS4) didn't get there; many games didn't run at 1080p 30 fps. He predicted it in 2011.

Now, the difference, as you see, is in the operations per pixel, calculated by the GPU.

Older graphics cards didn't do reflected shadows or light, nor transform and lighting, so you had 6 ops per pixel in the first case.

Then you had GPUs able to calculate light bouncing off a wall: 48 ops per pixel.

In the last case you have light bouncing from an object to the floor and then to your eye; the video explains it. And there is your difference.

To calculate all of that, you need operations. A single-precision flop (what I'm talking about) is one 32-bit floating-point operation.

According to Tim Sweeney, founder of Unreal Technology, you need 40,000 operations per pixel to handle lighting the way GPUs do this gen: 3 bounces of light.

Multiply that by 30 fps and the resolution, and you need 2.5 teraflops for native 1080p. He's not even talking about specific GPUs or anything else, just TFLOPS.

Of course you can tweak your game or implement dynamic resolution or whatever, and modern GPUs manage graphics more efficiently, but there is no magic. I mean, the Switch's Tegra can output 1080p with 0.5 teraflops, but not at the same quality as a PS4 Pro, or with the same AA, ambient occlusion, supersampling, or whatever.

With his formula, keeping 3 bounces of light, for 4K you would need 3840 x 2160 x 30 fps x 40,000 = 10 teraflops.

You now have PC graphics cards that achieve that with 9, others with 12, but it doesn't change much from his 2011 prediction.
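Sweeney's rule of thumb, as quoted above, is just resolution x frame rate x operations per pixel. A minimal sketch of the slide's numbers (the ops-per-pixel figures are the ones cited in the post, not independently verified):

```python
# Sweeney-style back-of-envelope estimate from the DICE 2012 slide:
# required FLOPS = width * height * fps * ops_per_pixel.
def required_flops(width, height, fps, ops_per_pixel):
    """Estimate of the shading throughput needed for a target."""
    return width * height * fps * ops_per_pixel

doom_1993  = required_flops(320, 200, 30, 6)        # ~11.5 MFLOPS ("10 MFLOPS")
unreal_98  = required_flops(1024, 768, 30, 48)      # ~1.13 GFLOPS ("1 gigaflop")
samaritan  = required_flops(1920, 1080, 30, 40_000) # ~2.49 TFLOPS ("2.5 teraflops")
uhd_target = required_flops(3840, 2160, 30, 40_000) # ~9.95 TFLOPS ("10 teraflops")

for label, f in [("DOOM 1993", doom_1993), ("Unreal 1998", unreal_98),
                 ("Samaritan 2011", samaritan), ("4K target", uhd_target)]:
    print(f"{label}: {f / 1e12:.3f} TFLOPS")
```

The exact products come out slightly above the rounded slide numbers, but they match the quoted figures to within rounding.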

Now, do you want a leap from that? 4 bounces of light? Real global illumination? Real next gen? It won't happen with the PS5. 3 years is nothing.

PS: If you don't agree and think Tim Sweeney's approximation is completely wrong, I have nothing else to say to you.

Last edited by CrazyGPU - on 13 February 2018

Makes sense. I wouldn't mind longer life cycles tbh


