Green is cool and cool is green at the data centre

Contractor Pascal Perron of the State Group declares victory over reams of dead cabling he helped pull from under the McGill Data Centre. / Photo: Quan Nguyen

By Mark Shainblum

It’s funny: we might regularly rant against global warming on our blogs or Facebook pages, but how often do we think about the Internet’s own carbon footprint? It’s easy to forget that every line of text, every image, every song and every video on every website you’ve ever visited is stored on a real, physical computer. Every time you conduct a Google search, watch a clip on YouTube or, yes, read the McGill Reporter online, you’re spinning a hard drive and crunching data in a power-hungry CPU somewhere in the world.

In the case of the Reporter, the hard drive you’re spinning is in one of the 500 web and data servers in the McGill Data Centre, hidden deep in the bowels of Burnside Hall. Checking your marks on Minerva? Spin. Saving a Word document to your local network? Spin. Approving an enrolment application or updating your faculty website? Spin, spin, spin.

Without data centres there would be no Wikipedia, no Facebook, no Minerva, no McGill library catalogue and, what the heck, essentially no computer network infrastructure at all. So, unless you want to go back to a world of rotary dial phones and 5.25” floppy disks, data centres and their hard drives are a good thing. Unfortunately, they are also (a) extremely complex to maintain, and (b) ravenously power hungry, with (a) usually trumping any attempt to deal with (b).

“Data communications networks are so incomprehensible to mere mortals that energy usage takes a back seat to the ability to get the network up and running,” observed eWeek Magazine’s Eric Lundquist in a recent column. In fact, until the recent spike in oil prices, many commercial data centre operators found it far easier to relocate to jurisdictions with cheap electricity than to undertake the intensive, nose-to-the-grindstone work of cutting consumption.

That was obviously never an option for Quan Nguyen, Associate Director of Systems Engineering at McGill’s Network and Communications Service (NCS). Even if he weren’t concerned about the environment, his 500 servers were stuck in a basement data centre originally designed to hold a single 1960s-era mainframe. Those extinct behemoths filled whole rooms with hulking banks of blinky lights and spinning tape reels, but, Nguyen explains, they didn’t generate the kind of waste heat that endless racks of modern PCs do.

“If we ever lost air conditioning, all those servers would fry within a few minutes,” he says.

Until recently, the centre was cooled by six massive Computer Room Air Conditioners, or CRACs, that were able to cool the whole space down to about 22 degrees Celsius.

“Imagine trying to cool a hot kitchen by opening the fridge,” Nguyen explains. “That’s essentially what we were doing. It was incredibly wasteful of power.”

Cool savings on cooling systems

Moreover, the CRACs couldn’t cope with the hotspots that developed in small areas above and behind the tightly spaced server racks, which limited the number of computers the centre could run at the same time. When Nguyen and his team decided they needed a new set of servers in early 2006, McGill Facilities offered to install another CRAC unit, but Nguyen instead sold them on the merits of a newer, still relatively uncommon approach called in-row cooling, which chills only the computers, not the surrounding air. Nguyen estimates a new CRAC would have cost about $1.5 million in installation and operating expenses, compared with $600,000 for in-row cooling.

“And if we added 80 kilowatts of old-fashioned cooling, the whole room would have been frozen,” he says. “Everything would have been covered in ice, and yet there still would have been hotspots.”
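
For the number-curious, that comparison is easy to rough out. The sketch below uses only the two cost totals Nguyen quotes, plus the 80-kilowatt figure; the electricity rate is an illustrative assumption, not a McGill number:

    # Back-of-the-envelope cooling comparison, using the totals quoted above.
    # Only the electricity rate is an outside assumption.
    CRAC_TOTAL = 1_500_000    # new CRAC: installation + operation (CAD)
    IN_ROW_TOTAL = 600_000    # in-row cooling: installation + operation (CAD)
    COOLING_LOAD_KW = 80      # the added cooling load mentioned above
    RATE_PER_KWH = 0.05       # assumed CAD/kWh; actual Quebec rates vary

    savings = CRAC_TOTAL - IN_ROW_TOTAL
    annual_kwh = COOLING_LOAD_KW * 24 * 365
    print(f"In-row cooling saves ${savings:,} "
          f"({100 * savings / CRAC_TOTAL:.0f}% less than a new CRAC)")
    print(f"Running 80 kW around the clock: {annual_kwh:,} kWh per year, "
          f"about ${annual_kwh * RATE_PER_KWH:,.0f} at the assumed rate")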

Switching to in-row cooling was only one in a series of measures Nguyen and his team undertook to cut the centre’s ever-growing energy demands. Before the in-row units were even installed, specialized contractors were brought in to remove thousands of metres of old cabling left over from the mainframe days. It was a tricky task: live, still-working cables were liberally intermixed with reams of dead ones that hadn’t been connected to anything in decades.

“Cold air sinks, because it’s heavier,” Nguyen says. “So if there are big holes in the floor for useless wiring, the air conditioning units will pump all their output down there.”

Perhaps most important from an energy-savings standpoint, in 2006 Nguyen initiated an aggressive campaign of server virtualization. A virtualized server runs hypervisor software that lets a single physical machine behave as many independent computers, each running its own operating system at the same time.

“On average, you’re only using about five to 10 per cent of a server’s capacity, so virtualization allows us to combine 10 computers into one physical box,” Nguyen says. “That’s a tremendous savings, 10 to one.”
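
The arithmetic behind that 10-to-one claim is simple enough to sketch. In the toy calculation below, the utilization figure sits inside Nguyen’s five-to-10-per-cent range; the per-server wattage is an illustrative assumption, not a McGill measurement:

    import math

    SERVERS = 10           # lightly loaded physical servers
    UTILIZATION = 0.08     # each uses ~5-10% of its capacity
    WATTS_PER_BOX = 300    # assumed draw of one physical server

    combined_load = SERVERS * UTILIZATION     # 0.8 of one host's capacity
    hosts_needed = math.ceil(combined_load)   # the work fits on a single box

    power_before = SERVERS * WATTS_PER_BOX    # 3,000 W
    power_after = hosts_needed * WATTS_PER_BOX
    print(f"Combined load: {combined_load:.0%} of one host")
    print(f"Power draw: {power_before} W -> {power_after} W "
          f"(a {SERVERS // hosts_needed}-to-1 consolidation)")

In practice, a virtualization host is usually a beefier (and hungrier) machine than the boxes it replaces, so the real-world saving is smaller than the raw ratio suggests – but still substantial.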

Some developers and staff members were initially wary about the relatively new virtualization approach, Nguyen admits, especially since no one at McGill had experience running complex applications like Oracle databases inside virtual machines. As with in-row air conditioning, he had to undertake an education campaign to convince them of virtualization’s solidity and safety.

“We have over 100 virtual servers now,” he says. “It took us almost two years to get the new cooling system approved and set up, and if we hadn’t virtualized in the meantime, we would probably have run out of power.”

Dirty, dirty data (and it’s not what you’re thinking)

Weird but true data centre facts:

● Even imaginary people contribute to global warming! According to author and technology guru Nick Carr, maintaining an avatar – a fictional, online version of yourself – on the popular Second Life virtual world consumes 1,752 kilowatt-hours of electricity per year (a figure unpacked in the quick calculation after this list). That’s about as much electricity as the average (non-fictional) Brazilian uses annually.

● A single, medium-sized web server has roughly the same carbon footprint as an SUV that burns 15 litres of gasoline per 100 kilometres of driving.

● In 2007, the Gartner Group estimated that data centres were responsible for 0.5 per cent of carbon dioxide emissions globally. Throw in all other varieties of information and telecommunications technology, and you’re up to 2 per cent, equivalent to the infamously dirty aviation industry.
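
As promised above, here is how the avatar number unpacks. Only the 1,752 kilowatt-hour figure comes from Carr; the rest is plain unit conversion:

    # Expressing Carr's 1,752 kWh/year avatar figure as a continuous draw.
    AVATAR_KWH_PER_YEAR = 1_752
    HOURS_PER_YEAR = 24 * 365    # 8,760

    continuous_watts = AVATAR_KWH_PER_YEAR / HOURS_PER_YEAR * 1000
    print(f"One avatar ≈ {continuous_watts:.0f} W, running non-stop")
    # -> One avatar ≈ 200 W, running non-stop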