It's a simple enough question, but the answer is something that has long eluded consensus. For some, a "games machine" is something made by Nintendo, or something with "PlayStation" or "Xbox" written on the packaging. For others, it's all about the amount of RAM, the speed of the CPU, and the number of GPU cores they've managed to shoehorn into their LED-encrusted black-and-silver beauty.
For years now - decades, even - these two points of view have divided gamers. On the one hand, the console faithful tout the stability of their platform, the assurance of a five-year life cycle, and the relatively low cost of the hardware. The PC crowd, on the other hand, flaunt the flexibility of theirs: the ability to improve performance at a moment's notice and to cater for new and developing trends in gaming for as long as their screaming wallets will allow.
Now, though, for better or worse, we are beginning to see a real revolution in console gaming. Where once console specifications were fixed and immutable, they have started to become varied and variable. Console manufacturers, it would seem, are taking aim at one of the major strengths of the PC platform: flexibility. But this change comes at a cost, and if not handled well it could end up doing more harm than good.
With the arrival of the Xbox, gamers saw the introduction of a persistent online presence in the form of Xbox Live. Sony soon followed suit and finally delivered a competitive interface with the release of the PS3. Regulated online play, downloadable patches and content, feature-laden firmware updates, social networking... add in upgradable HDDs, USB ports, card readers and wireless networking, and suddenly consoles are looking a lot more PC-like.
As they struggle to compete with the flexibility of the PC experience, however, today's console developers would do well to avoid a few sticking points - or at least tiptoe quietly around them and try their best not to wake them.