With five years between the old and new PCs, I had wanted to run PCMark05 and 3DMark06 to compare their performance and see how computing power has advanced since 2003. But the terms on Futuremark's site clearly state that only owners of the Professional editions can publish benchmark results in reviews, and the pro versions cost several hundred pounds each. So instead I did my best to compare the old and new PCs using real applications. Both machines were running Windows XP Professional SP3 with the latest graphics driver available for their respective cards.
The machines stack up like this:
Feature | Old machine | New machine |
---|---|---|
CPU | Intel Pentium 4 2.4GHz (Northwood) | Intel Core 2 Quad Q6600 (G0 SLACR) |
Motherboard | Gigabyte GA-8IHXP | Asus P5K Premium WiFi-AP |
RAM | 512MB Samsung RDRAM PC1066 533MHz | 4GB OCZ Reaper HPC PC2-8500 1,066MHz |
GPU | Gigabyte ATI Radeon 9700 Pro 128MB | BFG GeForce 8800 GT OC 512MB |
Graphics driver | Catalyst 8.8 | GeForce Release 175.19 |
DirectX version | 9.0c | 9.0c |
HDD | Seagate Barracuda 7200.7 160GB | Two Samsung Spinpoint HD753LJ 750GB |
Bear in mind that both machines are running Windows XP Professional 32-bit, so the new machine can only make use of around 3GB of its 4GB of RAM.
World In Conflict is a fast-paced real-time strategy game rendered by a 3D games engine fully loaded with lighting and particle effects. I did play the demo on my old machine, but found the performance a little too slow to persuade me to buy the game. No such problem on the new machine.
Using version 1.009 of World In Conflict at my monitor's native resolution of 1680x1050, I ran the in-game Seaside benchmark three times at each quality preset. For each of the minimum, average and maximum framerates (frames per second) reported by the benchmark, the median over the three runs is shown below.
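For clarity, this is all the "median over three runs" amounts to; a minimal sketch (the framerate numbers are made up for illustration, not taken from the results tables):

```python
from statistics import median

# Three runs of the Seaside benchmark at one quality preset,
# each reported as (minimum, average, maximum) framerates.
runs = [(26, 40, 73), (27, 41, 74), (29, 43, 76)]  # illustrative numbers only

# Take the median of each metric independently across the three runs.
mins, avgs, maxs = zip(*runs)
print("min:", median(mins), "avg:", median(avgs), "max:", median(maxs))
```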
First the results from the old machine, on which World In Conflict would only offer the "Very Low" and "Low" quality presets:
Quality preset | Minimum | Average | Maximum |
---|---|---|---|
Very Low | 14 | 39 | 98 |
Low | 5 | 17 | 52 |
And now results from the new machine, where I skipped the "Medium" quality preset:
Quality preset | Minimum | Average | Maximum |
---|---|---|---|
Very Low | 72 | 124 | 276 |
Low | 55 | 92 | 198 |
High | 27 | 41 | 74 |
Very High | 21 | 35 | 70 |
Five years has made a lot of difference here. The new machine has scored faster (minimum and average) framerates in the High quality preset (which has high-resolution textures, particle and lighting effects, anti-aliasing, etc) than the old machine can achieve in the Very Low quality preset (which looks pretty awful by today's standards). The new machine makes the game playable in the Very High quality preset (which is lustrous), scoring faster rates than the old machine can manage in the Low preset (almost as ugly as the Very Low preset).
Team Fortress 2 is a multiplayer-only first-person shooter that's rife with carnage. While it's not nearly as demanding as newer games such as Crysis or Call of Duty 4, the old machine still struggled to run TF2 smoothly at 1680x1050, and the lack of RAM forced the game to use the hard disk as virtual memory, which led to disk thrashing and in-game stuttering and pauses. (Note that such pauses don't seem to affect framerate counters, but they do make an action game very difficult to play.)
With both machines running the same version of the TF2 client at 1680x1050 with the same graphics settings (the ones "recommended" for the old machine, with most settings on Low), I used Fraps to record the framerate every second for 180 seconds. I did this three times on 2Fort playing as the Pyro, and three times on GravelPit playing as the Demoman. Each 180-second period was unbroken (no level loads or pauses in play). With the three sets of results for each map lumped together, the results for the old machine look like this:
Map / character | Minimum | Mean | Maximum |
---|---|---|---|
2Fort / Pyro | 14 | 29.36 | 47 |
GravelPit / Demoman | 7 | 27.93 | 68 |
And on the new machine:
Map / character | Minimum | Mean | Maximum |
---|---|---|---|
2Fort / Pyro | 37 | 92.77 | 162 |
GravelPit / Demoman | 35 | 82.73 | 262 |
That's more like it. The new PC loads the game quickly and the big chunk of RAM available (3GB in Windows XP 32-bit) means that there's no disk thrashing at all. The new PC achieves higher minimum framerates than the old machine can score on average. The improved performance is extremely noticeable, and it no longer feels like you're trying to move through syrup whenever there's any action.
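Summarising the per-second logs is simple enough; here's a minimal sketch of one way to do it, assuming each Fraps run has been saved as a plain text file with one framerate value per line (the file names are hypothetical):

```python
# Combine three per-second Fraps framerate logs (one value per line)
# and report the minimum, mean and maximum across all of them.
from statistics import mean

log_files = ["2fort_run1.txt", "2fort_run2.txt", "2fort_run3.txt"]  # hypothetical names

framerates = []
for path in log_files:
    with open(path) as f:
        framerates.extend(float(line) for line in f if line.strip())

print(f"min: {min(framerates):.0f}, "
      f"mean: {mean(framerates):.2f}, "
      f"max: {max(framerates):.0f}")
```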
As a test of CPU and hard disk performance, I figured that audio file conversion would be an interesting measure. Using dBpoweramp Music Converter R13 and the same set of audio files on each machine, I measured the time it took to convert 38 FLAC files into Ogg Vorbis and MP3 files. The files came from three albums: one was stoner rock, one was classical music, and the last was jungle music. Nothing like a bit of variety. I used codecs from the dBpoweramp site: the standard Ogg Vorbis codec, the Ogg Vorbis codecs enhanced for processors that support SSE2 and SSE3, and the built-in LAME MP3 codec. All codec options were set to produce VBR output with their default settings. The times on the old machine were:
Codec | Time to convert 38 files |
---|---|
Ogg Vorbis | 1214 seconds |
Ogg Vorbis SSE3 | SSE3 not supported by the Pentium 4 (Northwood) |
Ogg Vorbis SSE2 | 453 seconds |
MP3 LAME | 809 seconds |
And the times for the same process on the new machine:
Codec | Time to convert 38 files |
---|---|
Ogg Vorbis | 167 seconds |
Ogg Vorbis SSE3 | 69 seconds |
Ogg Vorbis SSE2 | 68 seconds |
MP3 LAME | 136 seconds |
That's an incredible difference. Using the new machine, dBpoweramp made full use of the Intel Core 2 Quad Q6600, giving work to all four cores, and it tore through the 38 audio files in well under a sixth of the time the old machine took. The improvement with the LAME codec was slightly smaller, but it still saw a large time saving.
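If you wanted to reproduce this sort of timing without dBpoweramp, a rough command-line stand-in would look something like the sketch below. This is not how dBpoweramp works; it just batch-encodes with oggenc and times the whole job. It assumes a build of oggenc with FLAC input support is on the PATH, and the source directory name is made up:

```python
# Rough command-line equivalent of the timing test, using oggenc rather than
# dBpoweramp. Assumes oggenc (with FLAC input support) is installed and on
# the PATH; the directory name is an arbitrary placeholder.
import subprocess
import time
from pathlib import Path

flac_files = sorted(Path("albums").rglob("*.flac"))  # hypothetical source directory

start = time.time()
for flac in flac_files:
    # oggenc defaults to VBR encoding; -Q suppresses progress output.
    subprocess.run(["oggenc", "-Q", str(flac)], check=True)
elapsed = time.time() - start

print(f"Converted {len(flac_files)} files in {elapsed:.0f} seconds")
```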