Kate is teaching a class in Colorado Springs for 3.5 weeks, allowing me to “live like a single guy” in the evenings. It means I’ll be eating unhealthy food, playing computer games until the wee hours of the night, and generally letting myself slip to a lower standard of hygiene.
Kate bought me a birthday gift last November that led me to build a computer from scratch. She bought World in Conflict, and it is a brilliant game. I played the single-player campaign for maybe 8ish hours and kept hoping it wouldn’t end. Kate did an excellent job picking that one; it is the best real-time strategy game I’ve played. After that, I heeded the end-of-year ruckus surrounding Portal and bought it as part of the Orange Box. Portal was brilliant: short, but perfect. Then I replayed Half-Life 2 and continued on with the new-to-me content in Episode One and Episode Two. They are brilliant as well and did not disappoint.
Being a married computer gamer, I have self-imposed limits on my gaming so that I don’t ruin my marriage or my job. The first rule is to never play a game that doesn’t have a pause feature. This rules out all massively multiplayer online games like World of Warcraft (WoW) or City of Heroes/Villains. Those would be like crack to me, and I know better than to give either a taste. I’m nearly done defending the earth from an alien incursion (in Half-Life 2), and it is about time to pick a new game. I’m leaning towards conquering the universe in Sins of a Solar Empire (though I shall ignore the multiplayer bits, since they have no pause button), or the well-acclaimed classic RPG from 2006, Elder Scrolls IV: Oblivion, plus its sequels.
To read about the building of the PC and all the parts I chose and why, read on…
My computer is:
Motherboard: GIGABYTE GA-P31-DS3L
CPU: Intel E6750 (retail)
Memory: Corsair XMS2 PC2-2600
Video: ATI Radeon HD 3850
I did lots of poring over the Tom’s Hardware VGA charts as well as the CPU charts, looking for sweet spots at price points of about $200. That led me to choose the ATI video card and the Intel chip. I chose that exact motherboard because of an overclocker’s page which indicated it was a wonderful surprise (they took it from the stock 2.66 GHz up to 3.72 GHz). I decided not to allow for a later dual-video-card configuration because, by the time my computer is ready for an upgrade (3-5 years), there are usually all-new video card slot types, and the CPU needs to be beefier to handle game physics. I chose the ATI card because the HD 3850 uses a lot less power than its equivalent NVIDIA card (the 8800 GTS); I wanted something that didn’t turn my PC into an easy-bake oven. NVIDIA has been dominating, but ATI’s new line of HD 3800 cards is quite impressive. All my games (the Orange Box, World in Conflict) have played superbly at my wide-screen monitor’s maximum resolution (1440×900).
I had planned to use my Windows XP OEM license, but it expired last week because changing the motherboard invalidated my operating system (see the “More Information” section at the bottom). After saying very many mean things about Microsoft Vista for many months because of my experiences with it at work, I turned my hypocrite meter up to “ludicrous” and bought Vista. Why? Because DirectX 10 will NEVER ship for Windows XP, and some games are moving towards DirectX 10-only content. So now, mostly against my will, I am a Vista user. I did feel much better about my choice after reading an interview with ATI’s Phil Rogers. He talks about the new driver model in Vista, why it is a good thing, and the progress it will enable. I’m hoping this driver model can someday let the video card’s GPU intermingle with the CPU to provide both physics and graphics, since it is the most powerful piece of the computer. I heard someone talking about a possible future where all computer horsepower is unified by some new technology and doled out as needed. That would be really neat! My inner programmer knows it probably can’t be done on today’s hardware, with its separate CPU/GPU pipelines and dedicated cache memory. That unified idea was presented in a Christmas/New Year’s episode of one of the podcasts I listen to… perhaps Joystiq.com’s? I like those guys, so I’ll give them credit even if I’m wrong.