There seems to be an unwritten rule that tech professionals should replace their gear about every two years.
Hard drives (the spinny kind) are really only good for about 5 years before they start to fail for various reasons. That’s an average based on general use, and i’ve found it to be true. Newer solid-state drives probably ought to be replaced more often, but it really depends on how much data you write to them.
Likewise, thanks to Moore’s Law, CPUs and graphics cards tend to get faster and more efficient with time. Well, generally, at least.
My home workstation is a 4-core 2.5GHz box with 12GB of memory, about 3TB of storage, and dual monitors. It’s about 6 years old and runs off a 250W power supply.
Recently, i spec’d out a replacement machine (which generally involves replacing everything) for about $2500: a 4-core 3.0GHz box with 16GB of memory. i’d move most of the storage over.
i’m not sure it’s worth upgrading.
For a significant cost, i’ll see a performance improvement of about 17% (going by clock speed alone, my current 2.5GHz box is about 17% off the new 3.0GHz one). i’ll also have a box that uses more electricity (since the newer CPU and graphics card will draw a lot more watts than my current rig does).
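To put that in perspective, here’s a quick back-of-the-envelope sketch (assuming, very crudely, that performance tracks clock speed alone, which it doesn’t exactly):

```python
# Rough comparison using the figures above. Assumes performance scales
# with clock speed, ignoring IPC, memory, and storage differences.
current_ghz = 2.5
new_ghz = 3.0
upgrade_cost = 2500  # USD, the quoted replacement build

# Two framings of the same numbers:
speedup = new_ghz / current_ghz - 1    # new box is ~20% faster
shortfall = 1 - current_ghz / new_ghz  # current box is ~17% slower

cost_per_point = upgrade_cost / (speedup * 100)

print(f"new box is ~{speedup:.0%} faster")        # ~20%
print(f"current box is ~{shortfall:.0%} slower")  # ~17%
print(f"roughly ${cost_per_point:.0f} per point of clock-speed gain")
```

Call it something like $125 per percentage point of clock speed, and that’s before counting the bigger power bill.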
Yeah, so i won’t be able to play the latest 4K-rendered shooter in near photo-realistic chicken-blasting detail. i don’t need a car that can do 200MPH around the Nürburgring, either.
i’m not quite sure how to feel about this. i may swap out the graphics card in my box, but that would probably also mean replacing the monitors with something better than the pair of 21″ 1920×1200 panels i’ve got now, which i still need glasses to read. There’s really no reason to change them out.
Granted, i run Linux at home because that’s what i tend to work in, and support for the newest, most lunatic graphics cards tends to be… iffy… at best. i suppose it’s a lovely way to keep me from building some insane rig just so i can play Beat Hazard Ultra, but that’s my call.
Am i getting old, or has the return on Moore’s Law not really kept pace?