Mark these words: computer graphics processors are poised to make a big technological leap in the not-too-distant future, a much bigger one than we've grown used to seeing over the last several years.
Oh yes, good days are ahead for PC gamers. Perhaps the leap will be enough to put some distance between PC and console games, so that for those who own both, it'll actually be worthwhile again to purchase the PC version.
Over the last month, tensions between NVIDIA and Intel have been steadily escalating as verbal mortar shells have been lobbed back and forth between the two camps. Judging by some of the press statements being made, the two sides have all but unofficially declared war upon one another in the graphics processing arena.
With Intel's hype machine going into overdrive to promote Larrabee, the company's upcoming integrated processor-graphics chip, Intel seemingly opened Pandora's box with some statements about the demise of the Graphics Processing Unit (GPU) during the Intel Developer Forum in Shanghai. The most interesting came from Intel senior vice president Pat Gelsinger, who said, in part, "graphics that we have all come to know and love today, I have news for you. It's coming to an end."
While these threats have apparently been scaring off some of GPU market leader NVIDIA's shareholders, they have only emboldened its fiery CEO, Jen-Hsun Huang, who responded with a well-publicized rant at a recent NVIDIA financial analyst day event.
If you haven't been keeping up with this news, this is your cue to start paying attention, because this is only the beginning and things are going to get a lot more heated. Regardless of who "wins" this tech war, one thing is for sure: we, the consumers, are going to reap the benefits as the two struggle to outdo one another.
Perhaps Intel has decided to pursue this "sideways" approach to further developing its processor technology because it feels there's not much more room to go "up" for now. That is, if one considers that its quad-core chips remain impractical for most users (offering little benefit over dual core) and that practical cooling remains a barrier to breaking 4.0 GHz OEM processor speeds. On the other side of the coin, NVIDIA's recent unveiling of its generation 9 GPUs left some enthusiasts feeling a bit underwhelmed; perhaps NVIDIA has been purposely holding off for something big.
Intel's Larrabee is a wildcard at this point, but it has a big gap to close given how Intel's and NVIDIA's discrete graphics offerings have compared over the last ten-plus years. What is almost certain is that NVIDIA will pull off the gloves and unveil a monster the likes of which we have never seen with either its generation 10 or generation 11 cards. Also of note is that both companies seem to be ignoring little AMD, which is developing its own processor with integrated graphics, named Fusion, no doubt making use of its acquisition of ATI. Maybe AMD will come out of nowhere with a sucker punch?
Yes, fellow PC gamers, things are about to get really interesting. Start saving your money and pull up a chair.