MikeB said:

@ HappySqurriel

The demo scene is not about pushing hardware to its limits as much as it is about working around limitations in what you're doing in order to give the appearance of more advanced visuals.


Visual appearance = visuals. The artists explore boundaries and seek workarounds for limitations; at least that's what the Amiga demoscene was like: showing off your coding talent, imagination, and creativity.

Anyway, the primary reason the ever-changing hardware approach of the PC became dominant is that hardware is cheap compared to the cost of optimizing most programs.


The x86 IBM compatible became dominant because IBM had strong relationships with, and a good reputation among, businesses. Its awful media (and thus gaming) capabilities were seen as non-crucial; managers didn't understand the potential of multitasking (like copying and pasting between programs), better multimedia power (like visual and audio feedback), and so on.

Because people used IBM computers at work, many would pick an IBM-compatible Personal Computer when buying a home computer so they could do some work at home as well (even though far more advanced software existed for other systems). Later, marketing played an important role: if there was one Amiga ad, there would be twenty PC ads from various companies.

Much better results were achieved on the Amiga (in both multimedia and professional software) than on 1980s MS-DOS PCs.

 

You can argue the semantics of the demo scene all you want, but you and I both know that just because people could look at a game like Resident Evil: Remake and think it looked similar to an HD console game does not mean it was nearly as demanding as an HD console game ...

 

Now, IBM became so popular and profitable because they convinced companies (rightfully so) that newer, more powerful hardware was cheaper than highly optimized software. A large portion of companies still operate on custom software, and the cost of that software is directly related to the number of man-hours it takes to create. A $100,000 server may look like a massive expense to run a piece of software, but when you're dealing with software that has hundreds of thousands of lines of code (and in many, if not most, legacy business applications, millions of lines of code), that $100,000 is a drop in the bucket compared to optimizing an entire application.