Inside a multi-million dollar datacentre
The hub
Behind most apps and cloud services lies a datacentre – but just what goes into building one of these hubs of the online world?
Equinix opened the doors to its new $145m (£98m) facility in Slough, England, last week – a multi-storey glass and steel block that will initially pack tens of thousands of servers into a 236,000 square foot hall.
Equinix has invested more than $7bn over the past 15 years in the datacentres that make up its International Business Exchange – recently opening new centres in New York, Toronto, Melbourne, Shanghai and Singapore. This LD6 datacentre is the sixth that Equinix has launched in the London area and one of more than 100 it runs in 33 markets worldwide.
Inside the data hall
Initially, LD6 can house 1,385 server cabinets in its 236,000 square foot data hall, with space for a further 1,385 cabinets to be made available at a later date.
The hall offers shared colocation space, with each customer usually deploying their own machines. Those that require a high-security environment will sit within a cage.
Equinix hasn’t named the customers lined up to use the centre, but said financial services firms, network service providers and cloud service providers have expressed interest.
To maximise the space available for computing infrastructure, the building is designed to need support pillars only down the middle of the hall.
The hall can support machine loads of up to 1.5kW per square metre, with 12kVA of cooling available per cabinet.
Purpose built
The centre is monitored 24/7 and has biometric access controls and a mantrap to prevent unauthorised entry.
A building management system polls the infrastructure every 15 seconds for temperature and humidity readings and reports back to the on-site monitoring centre.
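As a rough illustration of how such a polling cycle works, here is a minimal Python sketch. Only the 15-second cadence comes from the article; the read_sensor function and the reporting step are hypothetical stand-ins for the building management system’s real interfaces.

```python
import time

POLL_INTERVAL_S = 15  # polling cadence stated in the article


def read_sensor(cabinet_id: str) -> dict:
    """Hypothetical stand-in for the BMS sensor interface."""
    # A real system would query a networked temperature/humidity probe here.
    return {"temperature_c": 21.4, "humidity_pct": 45.0}


def poll_loop(cabinet_ids: list[str]) -> None:
    """Poll every cabinet and report readings to the monitoring centre."""
    while True:
        for cabinet in cabinet_ids:
            reading = read_sensor(cabinet)
            # Stand-in for reporting back to the on-site monitoring centre.
            print(f"{cabinet}: {reading['temperature_c']}C, "
                  f"{reading['humidity_pct']}% RH")
        time.sleep(POLL_INTERVAL_S)


if __name__ == "__main__":
    poll_loop(["cab-001", "cab-002"])
```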
The building was designed around the optimal duct width for its fresh-air cooling system.
“In effect we started with the cooling technology and then designed the building around it,” said Equinix UK managing director Russell Poole.
At another of Equinix’s datacentres, it took about two-and-a-half years to fill a colocation space of the size offered by LD6.
Data from above
Data cables run overhead, while power and cooling are provided underfloor.
Fibre cross-connects run in the yellow tray, with copper cross-connects above, and will feed the server cabinets below.
A breath of fresh air
Much of the building is devoted to the machinery that keeps the servers powered on, supplied with data and properly cooled.
The building minimises energy use by cooling servers using fresh, rather than artificially chilled, air.
Air is drawn in from outside and passed over heat exchangers that cool the hot air pulled from the server cabinets. The cooled air is piped back to the data halls and into the cabinets through floor vents, and the process begins again. The system helps keep the server room at a maximum of 22C, and the two air streams never mix, preventing contamination.
Liquid refreshment
This pipe running up the side of the building takes water from a borehole drilled to a depth of 350 metres.
When the outside temperature rises above 20C, water is sprayed onto the heat exchangers and evaporates, removing additional heat from the air circulating through the facility. If the temperature hits 30C, a chilled-water system can be used to extract even more heat.
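Pieced together from the cooling descriptions above, the control strategy amounts to a simple threshold ladder. The minimal Python sketch below illustrates that logic; the 20C and 30C trigger points come from the article, while the function and stage names are invented for illustration.

```python
def select_cooling_stage(outside_temp_c: float) -> str:
    """Pick a cooling stage using the thresholds described in the article.

    Below 20C the fresh-air heat exchangers alone suffice; above 20C
    borehole water is sprayed onto the exchangers for evaporative cooling;
    at 30C and above a chilled-water system extracts additional heat.
    """
    if outside_temp_c < 20.0:
        return "fresh-air heat exchange"
    if outside_temp_c < 30.0:
        return "evaporative spray (borehole water)"
    return "evaporative spray + chilled water"


for temp in (15.0, 25.0, 32.0):
    print(f"{temp}C outside -> {select_cooling_stage(temp)}")
```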
Piped in
By minimising the energy needed to cool the server halls, Equinix can run the building with a Power Usage Effectiveness (PUE) rating of just 1.2 – ahead of the industry average of 1.7. PUE is the ratio of the total energy consumed by a datacentre to the energy used by the IT equipment alone: the closer the figure is to 1, the less energy is spent on cooling and other supporting infrastructure.
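To put the figures in concrete terms: because PUE is total facility energy divided by IT energy, a PUE of 1.2 means roughly 83% of the power drawn reaches the IT equipment, against about 59% at the industry average of 1.7. A short illustrative calculation in Python (the kilowatt figure is invented):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw


# Illustrative only: 1,000kW of servers at LD6's claimed PUE of 1.2.
it_load_kw = 1000.0
total_kw = it_load_kw * 1.2

print(f"PUE: {pue(total_kw, it_load_kw):.2f}")            # 1.20
print(f"IT share of power: {it_load_kw / total_kw:.0%}")  # 83%
```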
The LD6 datacentre holds a gold accreditation under the Leadership in Energy and Environmental Design (LEED) scheme.
Redundant power
Each server is fed by two mains supplies, so that if one fails the other can take over.
Each power line is backed by two uninterruptible power supplies (UPS), able to support the full load of the datacentre for up to eight minutes.
Those eight minutes should provide enough time for the centre’s diesel generators to start producing power. By the time the centre is complete, there should be 32 generators, capable of running the centre for 36 hours.
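The failover chain described above can be read as an ordered fallback: a second mains feed, then UPS batteries bridging up to eight minutes, then diesel generators for up to 36 hours. The sketch below models that sequence; only the timings come from the article, and all names and the selection logic are illustrative.

```python
# Failover chain as described: dual mains feeds, then UPS (up to 8 minutes),
# then diesel generators (up to 36 hours). Names and structure are illustrative.
FAILOVER_CHAIN = [
    ("mains feed A", None),            # primary supply, indefinite runtime
    ("mains feed B", None),            # redundant second feed, indefinite
    ("UPS batteries", 8 * 60),         # bridge, up to 8 minutes (in seconds)
    ("diesel generators", 36 * 3600),  # up to 36 hours (in seconds)
]


def next_source(failed: set[str]) -> tuple[str, int | None]:
    """Return the first power source in the chain that has not failed."""
    for name, runtime_s in FAILOVER_CHAIN:
        if name not in failed:
            return name, runtime_s
    raise RuntimeError("total power loss")


print(next_source({"mains feed A"}))                  # falls back to feed B
print(next_source({"mains feed A", "mains feed B"}))  # falls back to UPS
```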
Power to the building’s cooling, fire suppression and security systems is also backed up by a UPS.
Scaling out
The LD6 datacentre takes the number of facilities on Equinix’s Slough campus to three, alongside the existing LD5 and LD4 buildings. The campus has access to a range of transatlantic cables and is one of the busiest network nodes in the UK, offering latency in the region of 30 milliseconds to New York and 4 milliseconds to Frankfurt.
Equinix’s datacentres house computing and networking infrastructure that serves a significant proportion of the financial services industry – one quarter of European equities trades originate from the Slough campus.
Connected campus
Once phase two of the LD6 build is complete, the three datacentres on the Slough campus will provide more than 388,000 square feet (36,000 square metres) of colocation space, interconnected by more than 1,000 dark-fibre links.
The links provide low-latency connections to more than 100 network and cloud service providers located on the campus, including Microsoft Azure, Amazon Web Services (AWS) and Google Cloud. These high-bandwidth links also allow customers to distribute their IT infrastructure across multiple sites, either to gain greater resilience to outages or to rapidly provision additional capacity when needed.
For more on this story go to: http://www.techrepublic.com/pictures/photos-inside-a-multi-million-dollar-datacentre/10/