

Published December 6th 2011

Time, Care and Data Centres: For When the Apocalypse Arrives

A few short notes on how we do things at Brandwatch with regard to our data centres.

Social media monitoring companies are dependent upon the use of data centres to store all of their data and allow their application to function properly. These expensive but powerful off-site centres provide the grunt for everything we do at Brandwatch.

This week, one of our competitors is scheduled to go offline for three straight days as they relocate their data centre. They will lose all of their data collected before November of last year as they move to their upgraded tier 3 site.

In light of this, we have decided to compile a short description of best practices to employ regarding data centres, and a brief description of how we do things here at Brandwatch.

Best practices

This is exactly what the future will look like

Although we don’t use cloud storage for our own data at Brandwatch, what we offer our clients is essentially cloud storage through our data centres. Our clients don’t have to store anything on their own hard drives unless they wish to export it.

However, many companies similar to ourselves are turning to cloud computing and storage for their own data too. Cloud computing currently accounts for 11% of all data centre traffic and there are now 1.6 zettabytes of the stuff stored in the cloud.

We personally prefer to be responsible for the hardware ourselves, which gives us greater control over our data, so we opted out of this route, though it is a perfectly viable option for many other companies. It may be something we explore as the technology in this area matures.

When we do have to carry out maintenance on our systems, we always make sure the app is down for as short a time as possible. By working over the weekend, usually on a Sunday night, we minimise disruption so that upgrades don’t get in the way of our clients’ work.

A data centre move is understandably a stressful period, but a gradual migration from one site to the other should keep downtime minimal, even if it means extra equipment has to be purchased during the transition.

Part of our data centre in Surrey

If someday we simply must take the app offline for a three day stretch, we would be sure to plan this to occur over a seasonal holiday, like Christmas for example.

When we were selecting our data centre(s), we were careful to select a tier 3 site, as determined by the Uptime Institute. The differences between the tiers can be found here, though we would recommend using at least a tier 3 centre when dealing with this type of data.

We have also factored future expansion into our planning, and we have the room to upgrade our systems should unexpected demands be placed upon them. We check up on our babies every week, keeping our equipment running smoothly and their environment clean.

When something goes wrong

In our scary age of floods, fires, terrorists and imminent nuclear holocausts, contingency plans are essential to data centre planning. As well as checking for realistic hazards like water damage and structural integrity at our primary data centre in Surrey, we also prepare for worst-case scenarios (Terminator).

The worst case scenario

We maintain a secondary site in Greenham Common, located in a former nuclear bunker. This ‘slave’ site is a live replica, meaning it is constantly updated from the ‘master’ cluster in Surrey. This means that when the nukes inevitably rain down in Surrey, we can escape to the haven of Berkshire and continue business as usual, even if all of our new clients are scary metal dictators by that point.

Our systems can fail over entirely from one site to the other within 15 minutes, and if the worst case scenario happens twice (Terminator 2), then we can even run Brandwatch on our own servers in the Brighton office, albeit at a slower pace (Terminator: Salvation).
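For the curious, the failover logic described above can be sketched as a simple priority-ordered health check: if the active site stops responding, promote the next healthy replica in line. This is a minimal illustration under our own assumptions — the `FailoverController` class and site names are hypothetical, not our actual tooling:

```python
class FailoverController:
    """Hypothetical sketch: keep a priority-ordered list of sites and
    promote the highest-priority healthy site whenever the active one fails."""

    def __init__(self, sites):
        # sites are ordered by preference: primary first, then replicas
        self.sites = list(sites)
        self.active = self.sites[0]

    def check_and_failover(self, healthy):
        """Given the set of sites currently passing health checks,
        keep the active site if it's healthy, otherwise fail over."""
        if self.active in healthy:
            return self.active
        for site in self.sites:
            if site in healthy:
                self.active = site
                return site
        raise RuntimeError("no healthy site available")


ctl = FailoverController(["surrey", "greenham_common", "brighton_office"])
print(ctl.check_and_failover({"surrey", "greenham_common"}))   # primary healthy
print(ctl.check_and_failover({"greenham_common", "brighton_office"}))  # nukes hit Surrey
print(ctl.check_and_failover({"brighton_office"}))             # Terminator 2
```

In a real deployment the health check would be an actual heartbeat (ping, replication lag, etc.) and promotion would involve DNS or load-balancer changes, but the ordering logic is the same.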

Also very important to us is that these two sites use different providers, as well as different energy suppliers, in order to guard against any external disasters, bankruptcies or other capitalist issues beyond our control.

We really do go the extra mile to make sure our clients will never have to go without their beloved Brandwatch, but hopefully now you have a clearer picture of just how far that mile is.
