All of last week we were integrating a set of server cabinets for Dataupia, and yesterday I took two of the wiremen to the data centre in East London where the servers were to be housed. We spent the day finishing the install, and during several of the hurry-up-and-wait periods I chatted with one of the admins there. They host for several large corporates, including a well-known search engine, and their machine room is enormous. I thought I'd seen big machine rooms in TV facilities, but even the mighty Red Bee looks small compared to this! They have four diverse power feeds (from different providers) and four separate incoming fibre sets (again, totally diverse). I had a good look around, and here are a few observations:
- All the power (16A and 32A feeds, terminating in the cabinets in ceeform ends) starts in the mains room on PowerCon connectors - supposedly because those connectors lock - unusual.
- Despite every bit of equipment having a switch-mode supply (and hence drawing a hefty inrush current at switch-on), every MCB in the mains room was C-rated and double the required capacity (C32s for the 16A circuits, etc.) - I'm sure correctly rated D-curve breakers would be better from a safety and reliability point of view.
- Air conditioning was via the floor - cool air forced out of the raised-floor void and warm air extracted from above. Given that all the kit draws air in at the front, and that cold air is denser than warm air, I'd have thought the TV practice of dropping cold air down the front of the bays was a better configuration.
- All of the techs and admins who saw Clyde and Linus at work marvelled at the numbered cables and the fact that all our Cat6 was cut to exact length - I can't believe the entire internet runs on pre-made patch cords!
- All the fibre I saw was tight-buffered OM1 run in Copex - what technical reason is there for that? Have they never heard of loose-tube cable?!
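To spell out the breaker point above: under IEC 60898 the instantaneous (magnetic) trip band is 5-10× the rating for a C-curve device and 10-20× for a D-curve one, so a C32 and a D16 actually tolerate the same inrush window - but only the D16's thermal element protects a 16A circuit against sustained overload. A rough sketch of that arithmetic (the trip bands and the 180A inrush figure are illustrative assumptions, not measurements from this room):

```python
# Instantaneous (magnetic) trip bands as multiples of rated current In,
# per IEC 60898: B = 3-5x, C = 5-10x, D = 10-20x (assumed nominal values).
TRIP_BANDS = {"B": (3, 5), "C": (5, 10), "D": (10, 20)}

def magnetic_band(curve: str, rating_a: float) -> tuple:
    """Return the (low, high) instantaneous trip window in amps."""
    lo, hi = TRIP_BANDS[curve]
    return (lo * rating_a, hi * rating_a)

def protects_cable(breaker_rating_a: float, cable_rating_a: float) -> bool:
    """Thermal (overload) protection only covers the cable if In <= cable rating."""
    return breaker_rating_a <= cable_rating_a

# The oversized C32 and a correctly sized D16 tolerate the same inrush window...
print(magnetic_band("C", 32))   # (160, 320)
print(magnetic_band("D", 16))   # (160, 320)

# ...but only the D16 thermally protects a circuit wired for 16A:
print(protects_cable(32, 16))   # False - a sustained 25A fault never trips it
print(protects_cable(16, 16))   # True
```

In other words, oversizing a C-curve breaker buys nothing on inrush that the right-sized D-curve doesn't already give, and it gives up the overload protection entirely.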
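And on the cooling point: the "cold air is denser" intuition is easy to check with the ideal gas law, rho = PM/RT. The supply and exhaust temperatures below are illustrative assumptions, just to show the size of the effect:

```python
# Air density from the ideal gas law: rho = P*M / (R*T).
P = 101_325.0    # Pa, standard atmospheric pressure
M = 0.028964     # kg/mol, molar mass of dry air
R = 8.314        # J/(mol*K), universal gas constant

def air_density(temp_c: float) -> float:
    """Density of dry air in kg/m^3 at the given temperature (deg C)."""
    return P * M / (R * (temp_c + 273.15))

supply = air_density(18.0)   # assumed chilled supply temperature
exhaust = air_density(35.0)  # assumed warm exhaust temperature
print(f"{supply:.3f} vs {exhaust:.3f} kg/m^3")  # 1.212 vs 1.146 kg/m^3
```

The chilled air is only about 6% denser, but that is enough for it to sink - which is why dropping it down the front of the bays, rather than fighting gravity by blowing it up from the floor void, seems the more natural arrangement.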
So, all in all, an interesting day - I was gratified that the way we build machine rooms in TV seems more sensible than the way these guys do (and I'm assuming this is a tier-one provider). With all this in mind I signed up for the following at my institute. If you're interested, drop me a line and come along next Tuesday:
IET London, Hammersmith Section
20th November 2007 - DataCentre Design & Build
Talk by Mike Stokes of Symantec

Mike Stokes leads the data centre consulting practice for Symantec in the UK. He has over 10 years' experience in guiding clients through defining how to meet their data centre capacity requirements, covering capacity planning, technical specification, project definition, financial analysis and business justification.
His talk will cover the difficulties currently being experienced by many companies in deciding how to provide adequate space, power supply, connectivity and cooling on a 10+ year planning horizon, for IT architectures that have been changing dramatically every three to five years - a trend which is only expected to accelerate.
This subject should have something of interest for all IT Professionals, IET engineers and managers of IT operations.