In technology terms, the world changes quickly. New hardware, software, and tools appear all the time, and as companies we want to take advantage of them in hopes of gaining a competitive edge. The problem is, we often don't know whether the promised benefits will ever be delivered.
Often what gets us is a lack of data in the first place. As a field we're still immature at the scientific practice of gathering data about what works, what doesn't, and how effective things are. But even when we have data, for some reason we often choose to destroy it. And with its destruction goes our ability to examine whether future changes made a difference.
For example, one type of tool commonly used by organizations is an incident management/problem management system: something like Serena TeamTrack, MKS Integrity Manager, HP's ServiceCenter, and others. These systems are among the most valuable tools for collecting data we can use to improve future project performance. But the vendors are always offering upgrades, and they're sometimes significant.
More than one organization I've worked with sees the upgrade coming, plans it out, and as part of that process decides not to move over all the old, closed incident data. Rather than re-import it, they just throw it away. And with it, they throw away an enormous amount of data about how the organization functions. Suddenly, a "simple" software upgrade puts you back at square one.
Now you have no history at all, and you need to wait a year or more to accumulate enough data to understand things like seasonal trends, the long-term impact of process decisions, and so on. And while you're waiting to build up that history again, the vendor is going to release another major feature set! And you're going to upgrade again! And you're going to throw your data away... again!
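Even when re-importing old records into the new system isn't practical, there's a middle ground: export the closed incidents to a flat archive file before decommissioning the old database. Here's a minimal sketch of the idea in Python. The table and column names (`incidents`, `opened`, `closed`, `summary`) are hypothetical stand-ins, since the real schema depends entirely on which tool you're running; an in-memory SQLite database stands in for the old system's backend.

```python
import csv
import sqlite3

# Stand-in for the old tool's database. Real schemas will differ;
# the table and column names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (id INTEGER, opened TEXT, closed TEXT, summary TEXT)"
)
conn.executemany(
    "INSERT INTO incidents VALUES (?, ?, ?, ?)",
    [
        (1, "2009-01-05", "2009-01-07", "Login outage"),
        (2, "2009-02-10", None, "Slow reports"),  # still open: migrates normally
    ],
)

# Pull every closed incident -- the records the upgrade plan would discard.
closed = conn.execute(
    "SELECT id, opened, closed, summary FROM incidents "
    "WHERE closed IS NOT NULL"
).fetchall()

# Write them to a plain CSV archive that outlives any one vendor's tool.
with open("incident_archive.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "opened", "closed", "summary"])
    writer.writerows(closed)
```

The point isn't the specific format; it's that a dumb, vendor-neutral archive costs almost nothing to produce during an upgrade, and it preserves exactly the history you'd otherwise spend a year or more rebuilding.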
Don't destroy your data. The technology world at large may move quickly, but if you discard your entire history, you can never learn from it.