Intel Booth and i7 Second Generation Chip [CES 2012]

The last time I would have called myself a “PC Gamer” was probably in the mid-to-late nineties. I remember that, at the time, AMD and ATI were the kings of the gaming world, and my rig reflected that fact. I started as a Mac user (my parents’ first computer was an Apple II), but eventually made the household swap to PC for my gaming needs. The first computer I ever built revolved around an AMD processor, and I never looked back. A fanboy from the jump, I watched AMD far outperform the Intel processors of that generation, and anyone running an Intel system was treated like a leper.

Obviously, times have changed. Now that I’m getting back into PC gaming (more than 10 years later), it seems the tables have turned. Not to offend all the AMD users out there (I am still one by heritage), but it looks like Intel has significantly upped its game. The days when two chips with the same GHz rating would see AMD run circles around Intel (performance-wise), to the point where computer shops I frequented back then had to put “Performs as a ___ by Intel standards” on their shelf notes, are long gone.

While there are still people firmly planted in either camp, and I’m not going to reopen a multi-decade debate with fans of a war I walked out on 10 years ago, it’s undeniable that Intel has come a long way and now has a legitimate claim to being the more efficient processor.

As part of our new focus on PC gaming for the New Year, I stopped in at the Intel booth at CES to check out what they had on display. Of course, the vast majority of their display wasn’t aimed at us gaming folk, showing off everything from laptops to tablets and demonstrating how their newest chips, the second-generation Core i7s, can handle every computing problem under the sun. Still, there were some examples of how the new chip can improve our experiences.

Most notable were two stations, at opposite ends of the booth, showing off examples of gaming at different performance levels. If you checked out previous years’ coverage of CES and Intel, you may remember a demonstration where Intel ran a copy of World of Warcraft using only the CPU for graphics processing. The game ran smoothly and looked decent enough, but everyone was basically of the opinion, “Yeah, but it’s WoW.” World of Warcraft was released in 2004, and while it’s impressive that ANY game can run entirely off the processing power of a CPU, the significance was lost on most of the passers-by.

This year Intel upped the game in a major way: the demonstration was of RAGE. The game looked sexy as hell (it’s a good-looking game, regardless of how it plays) and was smooth as butter, at what appeared to a layman to be a steady 60fps. It was so impressive that plenty of people at the booth, including myself, could hardly believe it was happening. Intel knocked it out of the park this year by showing off a full, modern game running off the processing power of the CPU, handling all the graphics and gameplay entirely. If the processor is that impressive on its own, the nonsense you could get up to with a proper GPU as well is nearly endless.

At the other end was their secondary display of gaming power, the concept of which I just mentioned: a gaming rig with an Intel processor at its core, backed with a dedicated GPU from Nvidia. They were running a copy of Modern Warfare 3 on the system, with all settings maxed out of course, and it looked absolutely flawless. My new test for the stability of gaming PCs has become violently shaking the mouse and looking for any stutter or pop-in in the graphics… With the rig on display at Intel, there was nothing.

Again, I speak from the point of view of someone 10 years removed from the world of PC gaming, but the demonstrations provided at Intel’s booth were nothing short of mesmerizing. The concepts showcased were insanely impressive, and after seeing what the processors they’re building are capable of, I cannot wait to see what software developers come up with to take advantage of them (I’m looking at you specifically, Adobe: more support for multi-core processing, please?).