“It’s like a ghost town out here,” Lia Crocker says as we ride in her white sedan through the Keystone Industrial Port Complex. A sprawling former U.S. Steel site 20 miles northeast of Philadelphia, the KIPC is home to one remaining U.S. Steel building and a few energy and manufacturing companies, part industrial park and part landfill.
She turns to avoid rain-filled potholes and drives over a parking lot overgrown with weeds. A rusted conversion van sits politely between faded lines, careful to make room for cars that have not come. “See where the brush starts?” Lia says. “That’s the property.”
That’s it? The engine of the technological revolution? The future is a field, and it’s covered in tall grass and rusted metal.
On Monday, May 23, thousands of fans rushed to Amazon.com to purchase Lady Gaga’s new album, “Born This Way,” on sale for 99 cents, and, in the process, crashed the site. Shortly before the servers stalled, customers were asked to save their newly purchased album on Amazon’s “Cloud Player,” a free service that lets customers store their music files remotely on Amazon’s servers, rather than on their own computers. Thousands took advantage of the deal, making the sale one of the biggest commercial pushes for cloud computing.
Amazon isn’t alone in betting on “the cloud.” Earlier in May, Google unveiled Music Beta, a service that lets people store and play back music files online. And Apple announced iCloud at the Worldwide Developers Conference in San Francisco on June 6. Steve Jobs called the product Apple’s “next big insight.”
Cloud computing isn’t new, but it seems poised for a leap forward. Those who use any of the popular online email services—Gmail, Hotmail, Yahoo Mail—have been using cloud computing for years, whether they realized it or not. Their emails are stored remotely on servers, and they access their files through the web. The servers—specialized computers linked together over a network—are housed in data centers, warehouse-like buildings that store and power all of the computing equipment. The significance of Amazon’s Cloud Player or Apple’s iCloud, then, is not the technology, but the sheer amount of data that will soon make itself permanently at home in data centers across the country. Email files are small compared to entire music libraries.
But what is the impact of all of that “free” computing? How much power does it take to load everyone’s information on a network, ready to be retrieved on a whim? Or more importantly, why should we care?
David Crocker has an answer to the last of those questions. David, Lia’s father and boss, is the CEO of Steel Orca, the company that thinks its plot of grass in the middle of a former U.S. Steel site is going to be the next big thing in the industry. The Steel Orca Bucks County Data Center, with plans for construction this fall, will run on 100 percent alternative energy sources and arguably be the world’s “greenest” data center. Why build it? “When people talk about environmental damage, they never talk about computing,” Crocker told me. “Well, computing clouds can cause acid rain too.”
By some estimates, digital technology accounts for 2 to 3 percent of the overall energy used in the U.S. That’s more than the entire airline industry. And it’s only growing. The Cisco Visual Networking Index calculated that the amount of digital information passed over the Internet in 2010 was almost double the amount passed in all previous years combined. By 2015, Cisco predicts, the amount of video crossing the Internet in a single second would take five years to watch.
That’s a lot of information. And, small as the electrons carrying it may be, you still have to store it all somewhere. Alex Wissner-Gross, an MIT researcher with a Ph.D. in physics from Harvard, said that while moving information electronically will always be more efficient than the alternative—“The mass of an electron is about 2,000 times smaller than an atom,” he reminded me—there is a growing demand to understand the impact of computing on the environment. His company, CO2Stats.com, tracks the carbon emissions of websites, which businesses use to measure their overall footprint. Going green has never been more complicated. Which brings us back to the plot of grass, and the data center Steel Orca plans to build on it.
Data centers are the factories of the information age. They come in various sizes: some are built for private use, while others sell space to host other companies’ servers. Steel Orca plans to market its 700,000-square-foot facility—big enough to fit almost four Walmart supercenters inside—as an environmentally friendly alternative to its competitors, the thousands of other data centers around the country.
“The U.S. data center industry is in the midst of a major growth period,” the EPA wrote in a 2007 report to Congress. “The increasing reliance on digital data in our society is driving a rapid increase in the number and size of data centers.” All data centers have their differences, but they contain the same basic components. A tour of the Philadelphia Technology Park, a colocation data center near the Philadelphia Navy Yard, provides a good example of how a modern facility operates. Officially opened in October 2010, the PTP is on the small side at 25,700 square feet. From the outside, it looks like a really big bank — pristine red bricks around large, glass windows with curtains drawn.
The first thing you’ll notice is the security. If a company trusts you with its information, it expects you to go out of your way to protect it. PTP President Corey Blanton used his security card to buzz us into an outer security room with heavy metal doors to begin the tour. Only senior-level staff are allowed to take a key to the facility home. A James Bond-like fingerprint scanner and another buzzer unlock the door in the inner room.
The area inside a data center where the computing infrastructure is held is called “white space,” and in this case, it’s aptly named. Large square slabs of white tile floor cover a room about the size of a gymnasium. On about a third of the floor, tall, black metal cabinets stand in perfect rows, like a cornfield of computing equipment. White noise fills the room: the hum of the servers and the steady whoosh of air conditioning units. It’s 72 degrees and the humidity is below 50 percent. It’s always 72 degrees, and the humidity is always below 50 percent. This is not a room for variables.
As anyone who has held a computer on his or her lap too long knows, computers get hot. When it’s vitally important that the servers continue to function properly, a lot of energy is spent keeping them at a comfortable temperature. Each cabinet holds a handful of servers stacked one upon another, cooled by an air conditioning unit that pumps cool air up from the floor. The air is sucked through the cabinet by a vent in the ceiling on the opposite side, and the steady flow of air is backed up, like everything in the data center, by a second, fully redundant environmental control system. Blanton says everything has an A and a B system. If one fails, the other takes over. With data centers, failure is the ultimate evil.
As Blanton pointed and rattled off the company’s clients, it was easy to see why. “This one is a bank in New York. Over here we have a law firm,” he said, pointing to servers behind the mesh screens of the cabinets.
“This company has a duplicate of all of their records stored in another one of our facilities in Baltimore.”
“That next row has a pharmaceutical company.” “We sold a lot of space to a university hospital.” “Villanova contracted some space with us for their engineering department.”
Education and medicine, law and business. It’s the U.S. economy in 10,000 square feet. Security passcodes, key business information, online retail websites, emails, medical records, all of it depending on a plug that better not fail. It’s why data centers run on what is euphemistically named an uninterruptible power supply, or UPS.
We walked around the rows of cabinets, past one of the giant environmental control boxes facing its twin on the opposite wall, through another locked door and into the power room. This one is loud. Two car-sized metal boxes stand along the left wall. UPS A, meet UPS B. Through another door on the right is the battery room. Each 480-volt wet-cell battery is about the size of a microwave, and there are more than 500 of them neatly stacked and numbered in long lines along the walls. Blanton said the way the data center works is kind of like a laptop. “It’s always running on a battery. Whether you’re charging it or draining it, the battery is always the source of power.”
The data center draws power from a main utility line, just like a house. But it’s not dependent on it the way your average home is. If a storm knocks out power in the area, its UPS units keep humming on the power stored in the room full of batteries until a back-up power unit can kick on. The PTP data center can run for 20 minutes at full capacity on battery power alone. And it only takes four seconds for its diesel generators to kick on.
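The margin those numbers imply can be sketched with some quick back-of-the-envelope arithmetic. The 20-minute runtime and four-second generator start come from the article; the facility load below is a purely hypothetical figure used for illustration.

```python
# Back-of-the-envelope UPS ride-through sketch.
# The 20-minute runtime and 4-second generator start are from the article;
# the facility load is a hypothetical number chosen only to make the math concrete.

facility_load_kw = 500      # hypothetical full-capacity load (assumption)
ride_through_min = 20       # battery runtime at full load
generator_start_s = 4       # time for the diesel generators to come online

# Energy the battery room must hold to cover the full ride-through window.
battery_energy_kwh = facility_load_kw * (ride_through_min / 60)

# Energy actually drawn during a normal outage, before the generators take over.
outage_draw_kwh = facility_load_kw * (generator_start_s / 3600)

print(f"battery capacity needed: {battery_energy_kwh:.1f} kWh")
print(f"drawn before generators start: {outage_draw_kwh:.2f} kWh")
print(f"safety margin: {battery_energy_kwh / outage_draw_kwh:.0f}x")
```

Whatever the actual load, the load term cancels out of the ratio: a 20-minute battery bank holds roughly 300 times the energy consumed during a four-second generator start, which is why failure here is rare by design.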
But there’s a problem in the industry. It’s something everyone talks about, but few have a solution for—many data centers are almost incurably outdated.
Julius Neudorfer, CTO and founder of North American Access Technologies, Inc., has been involved with network and data center infrastructure design since the ’80s. “Anything five years or older is becoming obsolete,” he told me. “Yet, they are going to continue to run.”
The problem, in a nutshell, is that no one anticipated the amount of computing power that would be needed this quickly. “Something that was built in 2006 was probably designed in 2003,” Neudorfer said, “and it was also probably built with a design philosophy that was five years old, maybe from 1999. What you get is data centers that can handle 50 watts per square foot, when newer centers use 100 to 200 watts per square foot.”
Not having enough power doesn’t necessarily doom a data center, but it does pose some serious challenges. “It’s very difficult to upgrade a data center that is currently operating,” Neudorfer said. “If you are ripping out things in a large scale manner while trying to operate the facility, you might be better off shutting it down. The older the data center, the harder it is to save.”
As servers have become more powerful, more efficient and cheaper, data centers have struggled to accommodate the more advanced machines. It’s too expensive to fix the old facilities, so we build new ones instead. And, as in other power-hungry industries, a few builders are trying to do so while incorporating green technology.
David Crocker’s plan to build a data center began as an idea to turn around a tech company he was working for, one that intended to build its own center. Instead, he spun the data center plans off into a new business, Steel Orca, which hopes to build 10 facilities—most in the U.S. and potentially one in Europe. “There is an opportunity for someone to take a leadership role in the industry, and for those putting data in the hands of a third party, we can help reduce their carbon footprint by building an environmentally considerate facility,” he said.
For its first data center, Crocker believes Steel Orca couldn’t have found a better site. The KIPC has been heralded as a success story for alternative energy production. U.S. Steel once employed over 5,000 workers at the site; it now employs about 75. A handful of alternative energy companies, however, have taken advantage of favorable tax breaks through Pennsylvania’s Keystone Opportunity Zone program and relocated to the complex.
The Exelon-Conergy Solar Energy Center boasts one of the largest arrays of photovoltaic solar panels in the country, with 1,700 panels. Exelon also pipes methane out of a landfill owned by Waste Management and converts it into electricity, another green energy source. Together, they produce enough electricity to power 30,000 homes, according to the KIPC.
Gamesa Technology Corporation, a company President Obama recently visited, builds wind turbine components in a 250,000-square-foot manufacturing plant at the KIPC. There is also a biodiesel plant, natural gas energy producers and an international deepwater port.
Most data centers lack convenient access to green power. Solar panels typically produce 10 to 20 watts of power per square foot. Data centers need 100 to 200 watts of power per square foot. To power a facility on solar alone, then, you would need a field 10 times the size of the data center itself. But with so many sources of alternative energy at the KIPC, Steel Orca plans to take advantage of their unique situation.
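The ten-to-one figure falls straight out of the numbers above; the sketch below simply restates the article’s per-square-foot ranges in Python, then applies the result to Steel Orca’s planned 700,000-square-foot building.

```python
# Solar field sizing from the article's figures: panels deliver roughly
# 10-20 W per square foot, while a data center draws 100-200 W per square foot.

solar_w_per_sqft = (10, 20)
datacenter_w_per_sqft = (100, 200)

# Ratio of solar field area to data center floor area at each end of the range.
ratio_low = datacenter_w_per_sqft[0] / solar_w_per_sqft[0]    # 100 / 10
ratio_high = datacenter_w_per_sqft[1] / solar_w_per_sqft[1]   # 200 / 20
print(ratio_low, ratio_high)  # both ends of the range work out to 10x

# Applied to Steel Orca's planned facility, a solar-only supply would need
# a field on the order of seven million square feet.
facility_sqft = 700_000
field_sqft = facility_sqft * ratio_low
print(f"{field_sqft:,.0f} sq ft of panels")
```

That is roughly a quarter of a square mile of panels for one building, which is why a site already surrounded by solar, landfill methane and wind manufacturing is unusual.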
As Lia drives me around the future site of the Steel Orca Bucks County Data Center, we pass large buildings being torn down to the left of the plot. Scrap metal lies twisted in heaps, ready to be collected and sold. We cross an abandoned rail line—a rusted sign states the obvious, “Tracks Out of Service.” It’s hard to see the future.
In the distance, spires rise out of a steam plant and chillers noisily use water from the Delaware River. There are soybeans here, growing by the thousands. Off to the right, an honest-to-goodness mountain of trash is being turned into electricity, with small pipes extracting the methane. Closer to the field, wind turbine blades look like airplane wings, lying on the asphalt, waiting to be hauled onto the interstate.
Above, scattered patches of clouds list across the bright summer sky. The field reflects the heat. There will be no acid rain today.