Is Google preparing another “moon shot”? The barge discovered in the San Francisco Bay this week hints at it, and rampant speculation has Google building a floating data center. Google hasn’t confirmed any ties to the barge, but many observers find it more than coincidental that the company filed for a floating data center patent in 2008.
Since this story broke, the Coast Guard has confirmed that the barge is owned by Google, but a confidentiality agreement prevents it from discussing any other details. Basically, Google has placed a gag order on the Coast Guard. In any case, the operative question is: why would Google be experimenting with an offshore data center?
The Data Center Power Problem
Data centers today have several points of friction, but none is more problematic than power consumption. The most significant recurring operational cost of a data center is electricity, and the price of electrical power has been the bane of data center professionals for the last couple of decades. So any experimentation by Google with alternative data center models would almost certainly include ways to reduce the cost of powering huge farms of servers.
Moore’s Law observes that transistor counts double roughly every two years, which in practice means computing power grows exponentially while its price drops just as dramatically. For the last fifty years, Moore’s Law has held true. In fact, the cost of peripherals such as storage, switches, routers and bandwidth has dropped nearly in lockstep with the cost of CPUs. But while information technology (IT) costs have plummeted, the cost of electricity has not.
Much has been done to reduce the power consumption of chips, power supplies, fans and other server sub-components. But as computing becomes an integral factor in the global economy, the need for energy-hungry data centers and mega data centers has skyrocketed.
Google Knows Power
Today, computing and powering the Internet account for an estimated 10 percent of the world’s electricity consumption, and that share is growing rapidly. Historically, the annualized capital cost of IT equipment made up the majority of data center operating expenses. As the relative cost of IT has dropped, up to 70 percent of the costs of running a data center are now attributable to power or to power-related equipment for cooling, power conversion and power generation.
Google has been a pioneer in data center efficiency. In addition to the power used to run its servers, the typical data center consumes an equivalent amount of “non-computing” or overhead power for tasks such as cooling and power conversion. Google’s data centers? They consume only 11 percent overhead power. In addition to building highly efficient data centers, Google has been a leader in using renewable energy, with 34 percent of its power consumption coming from renewable sources such as wind.
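Put in industry terms, that 11 percent overhead figure corresponds to a Power Usage Effectiveness (PUE) of roughly 1.11, versus roughly 2.0 for a facility whose overhead equals its computing load. The short Python sketch below works through that arithmetic; the kilowatt figures are illustrative assumptions, not Google’s published numbers.

```python
# Minimal sketch of Power Usage Effectiveness (PUE): the ratio of total
# facility power to IT power. The kW figures are illustrative assumptions.

def pue(it_power_kw: float, overhead_fraction: float) -> float:
    """overhead_fraction is non-computing power as a share of IT power."""
    total_power_kw = it_power_kw * (1 + overhead_fraction)
    return total_power_kw / it_power_kw

typical = pue(it_power_kw=1_000, overhead_fraction=1.00)    # overhead roughly equal to IT load
efficient = pue(it_power_kw=1_000, overhead_fraction=0.11)  # ~11 percent overhead

print(f"Typical data center PUE:   {typical:.2f}")    # ~2.00
print(f"Low-overhead facility PUE: {efficient:.2f}")  # ~1.11
```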
The hidden power cost of computing can rival the cost of acquiring the IT equipment itself, and without in-house data center expertise, the total cost of ownership can be taxing. That cost is even more painful when a data center runs at 30 to 50 percent utilization, the industry average. Google’s data centers run at 80 percent.
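To see why utilization matters so much, consider the effective cost of a fully used server-year. The sketch below uses an assumed, illustrative annual cost per server rather than an industry benchmark; only the utilization rates come from the paragraph above.

```python
# Rough sketch: how average utilization inflates the cost of useful work.
# ANNUAL_COST_PER_SERVER is an assumed placeholder (power plus amortized
# hardware), not an industry figure.

ANNUAL_COST_PER_SERVER = 3_000  # dollars per server per year, assumed

def cost_per_useful_server_year(utilization: float) -> float:
    """Dollar cost of one fully utilized server-year at a given average utilization."""
    return ANNUAL_COST_PER_SERVER / utilization

for util in (0.30, 0.50, 0.80):
    print(f"{util:.0%} utilization -> ${cost_per_useful_server_year(util):,.0f} per useful server-year")
```

At 30 percent utilization, the same dollars buy well under half the useful compute they buy at 80 percent, which is where the large cloud operators run.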
Saving Dollars in the Data Center
No doubt, much progress has been made toward running power-efficient computing systems and lowering the associated carbon footprint. Some prognosticators predicted the explosive growth of the Internet would bring our power grid to its knees. It hasn’t. But continued research by forward-thinking companies like Google confirms there is more to be done, such as floating a barge in the bay to use seawater for low-cost cooling or even power generation.
Cloud computing, in fact, offers huge potential savings in both power and overall computing costs. You might not have considered it, but moving to a public cloud or to SaaS (software as a service) applications eliminates a number of costs associated with running your own systems, not least the cost of power. In aggregate, if you were to move your common productivity applications such as email, office suites and customer relationship management to the cloud, you would reap a huge, direct benefit in reduced power consumption.
In a recent study from Lawrence Berkeley National Laboratory, lead author Eric Masanet found that moving just these applications to the cloud could cut their IT energy consumption by up to 87 percent, or about 23 billion kilowatt-hours. The study estimated that businesses run some 4.9 million servers to host these applications; moving them to the cloud would reduce that number to 85,500.
To put that in perspective, that’s enough energy to power the city of Los Angeles for a year.
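For readers who want to check the arithmetic, the back-of-the-envelope sketch below only rearranges the figures quoted above (4.9 million servers, 85,500 servers, 23 billion kilowatt-hours, 87 percent); the derived values are implied by those numbers, not additional findings from the study.

```python
# Back-of-the-envelope check of the LBNL figures quoted above.

on_prem_servers = 4_900_000  # servers hosting these apps today (per the study)
cloud_servers = 85_500       # servers needed after moving to the cloud
energy_saved_kwh = 23e9      # ~23 billion kWh saved
savings_fraction = 0.87      # ~87 percent of current IT energy for these apps

consolidation_ratio = on_prem_servers / cloud_servers
implied_current_use_kwh = energy_saved_kwh / savings_fraction

print(f"Roughly {consolidation_ratio:.0f} on-premises servers replaced per cloud server")       # ~57
print(f"Implied current IT energy use: ~{implied_current_use_kwh / 1e9:.0f} billion kWh/year")  # ~26
```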
For the average business, running data centers or even hosting applications locally can be very inefficient and costly. Such businesses may be better served by letting Google and others do the heavy lifting. But don’t be surprised if your data ends up floating in the San Francisco Bay, or even in space.
Raj Sabhlok is the president of Zoho Corp., which is the parent company of Zoho.com and ManageEngine. Follow him @rajsabhlok.