Back in December, I decided to trade in my hunk-of-junk, six-year-old HP Pavilion PC for a new custom-built PC. Running on an Intel Core i5-4570, 8GB of RAM and a 1TB HDD with Windows 7, I was in PC gaming heaven. I couldn't quite afford a new video card, so my three-year-old Radeon HD5770 went into the PC as a stopgap until I could replace it. It worked out great, pushing most of the PC games I had to high settings.

But then, tragedy struck. I saw graphical artifacts while playing Crysis, but thought nothing of it at the time. Several days later, the card's fans started spinning loudly while my PC was idling, temperatures rising by the second. Even after a quick dusting, the card still ran loud and no longer displayed a picture. It had happened again: another video card died on me. I got the HD5770 as an emergency replacement for my dead GeForce 8800GT back in 2010, and now I had another dead card on my hands. I was amazed the Radeon lasted that long; maybe pushing all those polygons over those two months was a bit hard on the old gal.

So, for the past month I've been playing other games, binging on Need for Speed: Hot Pursuit from 2010 and playing through Call of Duty: Black Ops II on my 360. Annoyed that I couldn't play much on the PC, I decided to test something. Most CPUs these days come with an integrated graphics chip. PC gamers generally don't use it, opting to buy a dedicated video card to do all the heavy lifting for their gaming needs. I figured I'd give my i5's integrated graphics a shot in the meantime. After installing the newest drivers, I tried a bunch of games on Intel's own integrated graphics, the HD4600, and saw the results. Boy, was I surprised.
