The CEO of a startup told me yesterday that their data mining software is so efficient that it cuts the processing time of a terabyte of data by 75-80%. Surely, he said hopefully, the energy savings alone would drive customer adoption.
I don’t think so.
He got me thinking about the current fad for green computing and how vendors can take advantage of it. Then I saw this on a Wall Street Journal blog, talking about the IBM venture group’s focus on sensor networks and software and its importance for energy conservation:
While companies have been slow to adopt so-called green technology, IBM thinks that will change. “We’re well beyond convinced,” says Clark. “We’re betting a lot of money on it.” IBM will presumably make a lot of that money back helping companies integrate their new sensor software with existing systems.
How this plays out
In the Google power paper (see Powering a warehouse-sized computer) they made a number of statements that are at odds with the “reduced power use = wonderful” message of the current stage of the hype cycle.
- The capital cost of provisioning a single watt of power exceeds the cost of 10 years of power consumption. That conclusion didn’t rest on dodgy Net Present Value calculations either, so if anything it understates the impact.
- Data centers are most economically efficient operating at close to 100% of provisioned power.
- The greatest opportunity for power savings comes from reducing the power consumption of idle kit, not from making busy kit more efficient.
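The first bullet is easy to sanity-check with back-of-envelope arithmetic. The figures below are illustrative assumptions, not numbers from the Google paper: a notional $12/watt provisioning cost and a $0.07/kWh utility rate.

```python
# Back-of-envelope: capital cost of provisioning a watt vs. a decade of energy.
# Both figures below are assumptions for illustration only.
PROVISIONING_COST_PER_WATT = 12.0  # $/W capital cost for power + cooling (assumed)
ENERGY_PRICE_PER_KWH = 0.07        # $/kWh utility rate (assumed)
HOURS_PER_YEAR = 24 * 365          # 8,760 hours

def ten_year_energy_cost(watts: float) -> float:
    """Cost of drawing `watts` continuously for 10 years."""
    kwh = watts * HOURS_PER_YEAR * 10 / 1000
    return kwh * ENERGY_PRICE_PER_KWH

print(ten_year_energy_cost(1))    # ~$6.13 to run one watt for a decade
print(PROVISIONING_COST_PER_WATT) # vs. ~$12 just to provision that watt
```

Under those assumed numbers, provisioning a watt costs roughly twice what the watt’s electricity does over ten years, which is why accurate provisioning, not raw efficiency, dominates the economics.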
Why is the Google view important?
Only commodity hardware and OSS-based data centers get a clear view into power issues. They don’t have the massive maintenance contracts, depreciation and system management costs that overshadow power cost in the enterprise.
Some implications for marketers
We’re going to see a lot more dumb comments from marketers attempting to get their company on the green bandwagon. As customers wise up – and plenty of them are already wise to this happy talk – there will be plenty of backtracking and, uh, clarification.
Unless the customer is caught in the “must take out 1 watt for every watt brought in” trap – which isn’t all that common – the real savings come from accurate provisioning. Vendors can help with that just by providing accurate information about their products so customers aren’t forced to over-provision.
As Google noted, the tendency for everyone to “err on the side of safety” in figuring power requirements is expensive. The equipment does it, the codes add to it and everyone adds their own fudge factor.
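Those margins compound multiplicatively, which is what makes them so expensive. A minimal sketch, with all three factors assumed for illustration:

```python
# Each layer of "err on the side of safety" multiplies the others.
# All three figures are assumptions for illustration, not measured values.
actual_peak_fraction = 0.6  # real peak draw as a fraction of nameplate rating (assumed)
code_margin = 1.25          # electrical-code derating margin (assumed)
operator_fudge = 1.2        # the facility operator's own safety factor (assumed)

provisioned_per_actual_watt = (1 / actual_peak_fraction) * code_margin * operator_fudge
print(provisioned_per_actual_watt)  # → 2.5 watts provisioned per watt actually drawn
```

Stack a conservative nameplate rating, a code margin and an operator fudge factor and you can end up paying provisioning capital on 2.5 watts for every watt the kit ever draws.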
Where sensor networks fit
I agree with Mr. Clark that sensor networks and the software to run them will be big. What I don’t see is data centers adding them to save a few dozen kilowatts. How could that possibly be cost-effective?
What I can see is equipment vendors adding an interface to a standard set of parameters. That interface could be used to slip machines into a currently non-existent deep sleep mode – taking them down to 10% of their peak power requirement rather than today’s 50% – while keeping the entire data center at 99% of peak power consumption when busy.
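The fleet-level payoff of such a deep sleep mode is easy to model. The 30% busy fraction below is an assumed figure; the 50% and 10% idle-power fractions come from the comparison above:

```python
def avg_power_fraction(busy_fraction: float, idle_power_fraction: float) -> float:
    """Fleet-average power as a fraction of peak: busy boxes at 100%, idle boxes at idle_power_fraction."""
    return busy_fraction * 1.0 + (1 - busy_fraction) * idle_power_fraction

BUSY = 0.3  # fraction of time machines are busy (assumed for illustration)

today = avg_power_fraction(BUSY, 0.5)       # idle kit drawing ~50% of peak, as today
deep_sleep = avg_power_fraction(BUSY, 0.1)  # hypothetical deep sleep at 10% of peak
print(today, deep_sleep)  # 0.65 vs 0.37 of peak
```

Under those assumptions, average draw falls from 65% to 37% of peak – roughly 43% less energy – without touching the efficiency of busy machines at all.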
The StorageMojo take
There’s a lot more to finding competitive advantage in the green movement than sticking higher capacity disk drives in your same old array. That is just perfuming the pig.
The big win is showing large greenfield data centers how to increase economic efficiency. The power disty folks have a golden opportunity if they sharpen their tools and get to work. The electrical code people will be a bigger problem. Can data centers get some special treatment? We’ll have to wait and see.
Sure, everyone wants a lower power bill. But over the long term the real win comes from re-architecting data centers with an eye towards total economic efficiency. Some of the work is component level, some is box level, but it is the overall system architectures that will be most affected.
Comments welcome, of course.