Bullet Proof!

eBay’s flagship data center is open for business in Utah


When I joined eBay in September 2009, I had the privilege of taking on responsibility for delivering the single largest infrastructure project the company has ever undertaken: a new data center, code-named Project Topaz. At $287 million, it is also the single largest infrastructure investment the company has ever made, and the most complex construction project we have ever taken on: over 1.2 million man-hours of work in just 14 months. But the most important component of Topaz isn't the project itself; it is what it will be used for. It will house eBay’s core businesses: ebay.com, the world’s largest online marketplace with over 90 million buyers and sellers in 32 countries, and PayPal.com, one of the leading ways to pay online, with 81 million registered accounts available in 190 markets and 24 currencies. In 2009, the total worth of goods sold on eBay was over $60 billion; that’s over $2,000 a second. Topaz isn't just a data center, it is the home of our business.

Ok, no pressure there...

As you can see, we live and die by the performance of our data centers. Our buyers and sellers depend on their reliability. Project Topaz is a critical part of the eBay engine. It is the foundation for our business and must be solid, stable, and secure. In a nutshell, it needs to be bulletproof.

On May 4, 2010, we completed Project Topaz on time and under budget, a monumental task considering the sheer volume of work that had to be completed in such a short time frame. The data center, located outside Salt Lake City, Utah, was designed to be concurrently maintainable and fault tolerant. That means we can sustain major impacts to any part of the data center and it will continue to operate. We can isolate and fix any component in the data center without disrupting the engine. Picture that everything has a backup; even the backups have backups. Now, keep in mind that nothing is truly 100% bulletproof, but in terms of a resilient data center, we have built to the highest level possible.
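To give a feel for why layered backups matter, here is a back-of-the-envelope sketch in Python. The 99% single-component availability is a made-up illustrative figure, not an actual Topaz component rating, and it assumes failures are independent:

```python
# Back-of-the-envelope: availability of N redundant copies in parallel.
# The 0.99 figure is hypothetical, not a Topaz rating, and this assumes
# failures are independent -- real-world design is more involved.
def parallel_availability(single: float, copies: int) -> float:
    """The system is down only if every redundant copy fails at once."""
    return 1 - (1 - single) ** copies

for copies in (1, 2, 3):
    print(f"{copies} in parallel: {parallel_availability(0.99, copies):.6f}")
# 1 in parallel: 0.990000   (no backup)
# 2 in parallel: 0.999900   (everything has a backup)
# 3 in parallel: 0.999999   (even the backups have backups)
```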

Now, many think that when you build a data center with this much redundancy, it will be extremely expensive to operate and very inefficient. Quite the contrary. Besides running the data center operations for the company, I’m also responsible for paying the power bill. So the data center must be built like a tank, able to brush off major faults, lower our operating costs, and be extremely efficient. Did I mention that these are goals in my annual performance review? Ok, no pressure there either...

So, put yourself in my shoes for a minute. With these rather challenging deliverables, who would you want to drive the design and delivery of a bulletproof data center? Why, two former tank commanders from West Point, of course! I was blessed to have two very solid guys running with the Topaz ball: Mike Lewis, my Distinguished Architect, who owns data center design and mission-critical engineering standards, and Greg Fennewald, the local Utah data center manager responsible for bringing Topaz online and operating it going forward. And yes, they are former tank commanders. When I saw some pictures of what they did for fun at West Point, I knew they could roll over any barrier in front of them to get the job done. That car had no chance.

I digress. Now, Mike and Greg didn’t do this alone, of course, but they coordinated the logistics and collaborated with our partners to deliver. Data center resiliency was priority one, but an almost equal requirement was efficiency. Remember, inefficiencies affect my budget directly, so efficiency was top of mind for everyone involved. We had to achieve both. The only way to deliver on that is to form true partnerships with the vendors involved in the project, and we had some stellar partners on this one. Skanska led the construction, RTKL did the design, and over forty additional companies worked tirelessly on this project for more than a year. At its peak we had over 700 workers on two shifts to deliver this data center. On top of our partners, a multitude of eBay employees from almost every part of the business worked on the success of Topaz: Tax, Legal, Risk, Finance, Procurement, Product Development, and over 100 people in Operations. It was an incredible effort, and it is impossible to mention and thank everyone involved.

What I am very proud to announce is that we have delivered on all of the challenges in this project. We have built a fault-tolerant, Tier IV-level data center that is 50% less expensive to operate than the average of all the other data centers we lease today. It is also 30% more efficient than the most efficient data center in our portfolio. At a designed PUE of 1.4, it lowers both our economic and ecological costs. We only consume the energy we need, when we need it. Now, I don’t want to get into the religious debate over who has the lowest PUE, but I do want to point one thing out. In the business of online commerce, we have no choice but to build a highly available data center to support our customers. From my perspective, achieving a 1.4 PUE with a hard requirement to meet this level of redundancy is quite an accomplishment. The point is that you can be resilient, efficient, and cost effective if you set your mind to it from the beginning.
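For readers new to the metric, PUE is simply total facility power divided by IT equipment power, so a PUE of 1.4 means 0.4 watts of overhead (cooling, distribution losses, lighting) for every watt of compute. Here is a quick sketch using only the figures in this post; the derived numbers are my arithmetic, not measured values:

```python
# PUE = total facility power / IT equipment power.
# Inputs come straight from the post; outputs are simple derived arithmetic.
DESIGN_IT_LOAD_MW = 7.2   # phase-one server load
DESIGN_PUE = 1.4

total_facility_mw = DESIGN_IT_LOAD_MW * DESIGN_PUE    # power drawn from the grid
overhead_mw = total_facility_mw - DESIGN_IT_LOAD_MW   # cooling, losses, lighting

print(f"Total facility draw at full load: {total_facility_mw:.2f} MW")  # 10.08 MW
print(f"Non-IT overhead:                  {overhead_mw:.2f} MW")        # 2.88 MW
```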

Now, the juicy details. ☺

Phase one is a 240,000-square-foot, two-story building with three 20,000-square-foot rooms to house IT equipment. These three rooms deliver 7.2 megawatts of total server load. We have our own substation capable of delivering up to 30 megawatts of total power. The 60-acre site master plan consists of four phases. This facility is part of Technology Operations' four-year data center strategy, and it is the next step in consolidating our leased data centers spread over three states. This not only helps us reduce our operating costs, it increases reliability. In essence, every server we consolidate into this new data center lowers costs and increases efficiency. On top of that, the site is built to scale, so we are enabling the company to grow as we consolidate. We get the best of both worlds.
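A little arithmetic on those phase-one numbers (my own back-of-the-envelope, not published specs) shows both the density and the growth headroom:

```python
# Derived figures from the phase-one numbers above; the headroom line
# assumes the 1.4 design PUE mentioned earlier.
IT_ROOMS = 3
ROOM_SQFT = 20_000
SERVER_LOAD_MW = 7.2
SUBSTATION_MW = 30
DESIGN_PUE = 1.4

density_w_per_sqft = SERVER_LOAD_MW * 1e6 / (IT_ROOMS * ROOM_SQFT)
headroom_mw = SUBSTATION_MW - SERVER_LOAD_MW * DESIGN_PUE

print(f"Average design density: {density_w_per_sqft:.0f} W/sq ft")      # 120 W/sq ft
print(f"IT load per room:       {SERVER_LOAD_MW / IT_ROOMS:.1f} MW")    # 2.4 MW
print(f"Substation headroom for future phases: ~{headroom_mw:.1f} MW")  # ~19.9 MW
```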

Now let's talk about some of the efficiencies.

First, everything is running at 400V. This means we lose an entire level of transformers and deliver 230V directly to the servers, which yields a 2% efficiency gain through the entire electrical system (a rough sketch of what that gain is worth follows below). The modular busway system (Starline) also allows us to change receptacles in minutes rather than days.

On the cooling side, we have a 400,000-gallon cistern that collects rainwater and will serve as our primary cooling source. We are using a waterside economizer, which allows us to use outside air to cool the data center for more than half the year instead of running expensive chillers. In addition, we are using technology that dynamically matches the power used by the pump and fan motors to the cooling loads, ensuring we only consume the energy needed to support the compute load.

We also have a fully contained hot-aisle design that isolates the heated air from the cold air, plus closely coupled (in-row) cooling units to add capacity where it is needed. That means we can put anything anywhere and still ensure it gets optimum cooling efficiency, even with a mixed workload: we can support everything from racks drawing less than 1,000 watts to dense racks generating more than 30,000 watts of heat (more than an industrial pizza oven). We have created a flexible, scalable, and efficient infrastructure that increases the agility of the company. We also expect Topaz to achieve LEED Gold certification from the US Green Building Council, a great validation of the effectiveness of our efficiency commitment on this project.
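Here is that rough sketch of the 2% electrical gain at full design load. The utility rate below is a hypothetical placeholder I've chosen for illustration, not our actual contract rate:

```python
# Illustrative only: annual value of a 2% electrical distribution gain
# at the 7.2 MW design load. The $/kWh rate is a made-up placeholder.
IT_LOAD_KW = 7_200
EFFICIENCY_GAIN = 0.02
HOURS_PER_YEAR = 8_760
HYPOTHETICAL_RATE = 0.06  # $/kWh -- assumed, not eBay's actual rate

saved_kwh = IT_LOAD_KW * EFFICIENCY_GAIN * HOURS_PER_YEAR
print(f"Energy avoided: {saved_kwh:,.0f} kWh/year")   # ~1,261,440 kWh
print(f"At ${HYPOTHETICAL_RATE:.2f}/kWh: ${saved_kwh * HYPOTHETICAL_RATE:,.0f}/year")
```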

As I mentioned earlier, there are three IT rooms in the data center. Room one is for eBay Marketplace (shown in the picture above), room two is for PayPal, and room three is planned for further consolidation work to support our strategy.

But on May 4th, room three became something completely different: it became the coolest-looking data center in the world!

 

Welcome to CLUB EBAY!

 

Every great project deserves a great party!

We converted the data center raised floor into a Club 51 atmosphere that the Hollywood A-list would feel right at home in, complete with a red carpet walkway. The attendees had no idea we planned to treat them like royalty. But they were royalty! They were responsible for making this project a success. Needless to say, it was one of the most unique company parties in eBay’s history.

We started off the festivities with a video summarizing the magnitude and impact of this project on eBay and its employees.


Once the video finished, Mazen Rawashdeh, VP of Technology Operations, James Barrese, VP of Architecture, and Mark Carges, our CTO, gave their perspectives on the importance of Project Topaz to the success of eBay, PayPal, and our adjacencies. We also had Jakob Carnemark, Senior VP of Mission Critical at Skanska, explain the complexity of the project and how crucial it was to have a full partnership in a construction project of this magnitude. Finally, I was able to bring up Mike and Greg and express our thanks for leading the construction teams to deliver this project.

Then we brought the house down! A crew of local hip hop and break dancers proceeded to interpret the key words of our project: Transformation, Innovation & Leadership. They also helped us become "cool," as you can see in the picture. And finally, after years of planning, approval, construction, commissioning, and verification, we plugged this puppy in and fired it up!

After the power-on ceremony, we conducted tours of the site for all who attended. Ironically, the majority of the people who worked on this project had done so remotely; it was the first time many of them were able to see the massive infrastructure built to house their technology. The tours were complete with iPads showing full 3D models of the data center (BIM) and an interface into the infrastructure applications.

We ended the day with a party at The Depot in downtown Salt Lake City. It was a night filled with games, a live band, and even some performances from eBay employees themselves. I had a chance to sing a rendition of "YMCA" with a motley crew of eBay employees; our version was “Shop At eBay”! As you can see from the photo, everyone got involved. It was quite a performance!

The Topaz launch party was a tribute to those who put their blood, sweat, and tears into this project for more than two years. Topaz is the epitome of collaborative teamwork and innovation: hundreds of people from all areas of the business came together to plan, design, validate, and execute on this incredibly complicated project. It is a testament to the eBay culture and the ability of our people to innovate, lead, execute, and deliver truly exemplary performance. Ultimately, this data center allows us to deliver a better and more reliable experience for our buyers and sellers while keeping our infrastructure costs under control.

Here are some fun statistics about this incredible project:

    • Over 1.2 million man-hours worked, with no lost time due to injury
    • 30-megawatt power substation
    • 60,000 sq ft of usable IT space
    • 57 miles of underground electrical conduit
    • 2 million pounds of copper for underground critical power
    • 295 miles of copper network cable
    • 176,000 feet (about 33 miles) of fiber optic cable, with approximately 43,000 strands of fiber
    • 2,006 tons of steel
    • $10 million in savings through 3D computer modeling (BIM)
    • Over 200,000 measured points, all the way down to individual server power plugs

Needless to say, we are very proud of Topaz and the people who made it possible. Now, stay tuned for even more innovation and efficiency from the eBay engine. We’re just getting started…


 

 

Comments

When will all the data be consolidated?

When will all the data for eBay's core businesses be consolidated onto Topaz? Is there a schedule? Do you anticipate that Topaz will have a positive effect on eBay Marketplace Search once the consolidation process is finished?

Excellent job, all of you

It is always good to be bulletproof, safe, and secure... what are you doing for inside security?

Topaz

Hi


Great to use all that free cool air.


You mention hot aisle containment with in-row cooling, which I assume takes the hot air, cools it, and returns it to the room at ambient temperature?


The photo shows vented floor tiles? Am I missing something?


I can't see the hot aisle containment?


Did you consider that the cooled air returned into the room then takes on heat from lighting, fans, people, the building walls, etc., increasing the ambient air temperature before returning to the server intake?


Did you consider cold aisle containment? If rejected, why?


Keep up the good work. It's hard work breaking the mould.


Jeremy, MD and founder, Dataracks and eCool Solutions.


Boss of a UK company that supplies DC solutions, including server racks and containment.