Thames flooding flushes out flawed thinking on London data centre location
By Nick Razey, CEO, Next Generation Data
Time and tide wait for no one. Wise words, and all the more so amid the recent flooding experienced up and down the country, with even the Thames bursting its banks just a few miles short of central London.
If it wasn’t for the Thames Barrier, who knows what might have happened? It’s sobering that a fifth of all the 30-year-old barrier’s closures took place in the first two months of 2014. Not surprisingly, this has sparked fresh calls to build a replacement in case the current one fails through overuse.
When it comes to data centre location, these recent events should serve as a warning to any CIO still intent on going with the flow of conventional wisdom by continuing to take data centre space at premium rates in Docklands - entirely on the floodplain - or in the City, which is still close enough to be affected by the inevitable disruption.
But no matter how much more water needs to go under the bridge until the penny drops, the mandate for any top-flight data centre remains the same: to mitigate all possible risk to data by ensuring it is protected and out of harm’s way at all times. That must surely include keeping it beyond the reach of the River Thames.
While the Docklands is well protected against flooding, the Environment Agency accepts that climate change will make existing defended areas more vulnerable over this century. Businesses, government offices and essential services, including data centres, located on or near London’s floodplain are in real and present danger from flooding and the inevitable impact on business continuity, either directly or from knock-on effects such as power outages, communications breakdown and traffic congestion.
Just imagine a tidal surge similar to the one which hit Manhattan following Hurricane Sandy - a stark reminder of the devastation and prolonged disruption entirely possible in a metropolis close to the sea. And let’s not forget, like New York, much of London is exposed to the effects of the sea by sitting on the banks of a major tidal river.
The usual, and increasingly flawed, justifications for why a London location still makes sense for the majority of data centres no longer hold water. Latency of just over a millisecond is now routine over several hundred miles and, nine times out of ten, perfectly adequate for all but the most time-sensitive of financial trading applications.
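The latency claim is easy to sanity-check with a back-of-envelope calculation. The sketch below is illustrative only: it assumes light propagates through glass fibre at roughly two-thirds of its vacuum speed, and it ignores routing detours, switching and serialisation delays, all of which add to real-world figures.

```python
# Back-of-envelope fibre propagation delay.
# Assumption: signal speed in glass fibre is ~2/3 the vacuum speed of light.
# Ignores routing overhead, switching and serialisation delays.

SPEED_OF_LIGHT_KM_S = 299_792                      # vacuum speed, km/s
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # approximate speed in fibre

def one_way_delay_ms(distance_miles: float) -> float:
    """One-way propagation delay in milliseconds over a straight fibre run."""
    distance_km = distance_miles * 1.609344
    return distance_km / FIBRE_SPEED_KM_S * 1000

# A data centre ~150 miles from London (a hypothetical distance):
print(round(one_way_delay_ms(150), 2))  # ~1.21 ms one-way
```

Even at a few hundred miles the propagation delay stays in the low single-digit milliseconds, which is consistent with the article's point that distance alone rarely rules out an out-of-town site.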
Fibre connectivity costs have tumbled to less than £10 per mile, compared with the thousands of pounds of a few years ago, and major carriers and ISPs are connecting directly to carrier-neutral facilities up and down the country. And remote diagnostics are now so intelligent that server hugging really has become a thing of the past.
Added to the above, consider the latest generation of business owners, solutions architects, users and IT service providers. They have been raised on virtualisation and cloud computing and are at ease with the concept of data being stored, processed and transacted at data centres based almost anywhere. Provided, of course, they’re secure, private and resilient.
While only time will tell just how much more vulnerable London becomes to flooding and whether the current Thames Barrier holds up without a hitch for another five years, a decade, or even into the next century, the tide is already turning against the ‘accepted’ London data centre location wisdom conceived in the last one.