By Kit Smith, Jan 17
As the head of my department so eloquently put it: the silent warrior of any Brandwatch development team is the software tester who, no matter what, ensures that the show must go on.
I am proud to be one of those testers. Though I work as part of a wider development testing team, each of us still has our respective areas of responsibility.
Here’s what a typical day in testing might look like to a casual observer:
1) Sitting on a wobbly exercise ball, bouncing every couple of seconds to the “do ya thang” tune, rhythmically clicking the mouse button, staring at the screen for minutes at a time to find out about the latest smartphone specifications.
This is periodically broken up by navigating to Google Docs and changing a blue colour to green, Guitar Hero-style, and repeating this for hours.
2) Chatting on Skype, where seemingly every other comment is some sort of smiley/facepalm/dance emoticon. Sharing weird gifs from Reddit and occasionally raising eyebrows, followed by an “OMG” expression as if a pet just died.
3) An email sent round with some colourful language and expressions that suggest there’s a lot of horseplay afoot.
4) A desk full of the latest gadgets, empty crisp packets and drinks, and I look like Neo from the Matrix: checking texts/Facebook/Twitter on my awesome smartphone or using my tablet to read a lovely email from my wife.
But if we are to believe the saying "seeing is believing", you would conclude that we don't really get much work done.
In truth, though we're happy and enjoy our day, it's not quite what it looks like. Here's what each of those activities really involves:
1) I click every possible button I can see in the Brandwatch platform, including the trillions of filter combinations and billions of click-throughs, to check for the "expected" results.
Working in a highly unstable code base such as Brandwatch is the best challenge a software tester can get.
Highly unstable doesn’t necessarily mean bad, however.
It's actually a compliment to our developers: it's only "unstable" because of how quickly they're able to push out new features in every sprint, ensuring that Brandwatch is the fastest-evolving social media analysis software available. It's never unreliable once it's released, of course!
Google Docs acts as a workbook for us, and I have to update every step we take individually for that week's release cycle. It's essentially regression testing the whole app, plus testing any new feature, which users expect to arrive bug-free.
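"Trillions of filter combinations" is only slight hyperbole: combination counts explode fast, which is exactly why exhaustive clicking takes a whole release cycle. A minimal sketch of the arithmetic (the filter names here are invented for illustration, not the real Brandwatch filters):

```python
from itertools import combinations

# Hypothetical filter options; the real platform has far more.
filters = ["sentiment", "language", "location", "author", "site", "date"]

# Count every non-empty subset of filters a tester could apply.
# For n filters this is 2^n - 1, so it doubles with each new filter.
total = sum(
    1
    for r in range(1, len(filters) + 1)
    for _ in combinations(filters, r)
)
print(total)  # 63 combinations for just 6 filters
```

Add a seventh filter and the count jumps to 127, which is why we prioritise the combinations users actually reach rather than chasing every permutation.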
2) Skype and IRC chat clients keep us informed of each other's work, making sure we get things done in time with no interruptions. The awesome emoticons are just for morale ;)
The occasional baffled look means we found a bug, which we know will mean we need to test it again in a couple of days and potentially lose some valuable foosball time!
3) The more serious part of our job involves the automated testers more than the manual testers. We have to write acceptance tests for product owners, which are essentially an executable agreement with the project's lead devs about requirements.
At the moment, we have five extremely hard-working testers, supervised by the test manager, Lee McGeever.
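An acceptance test in this sense is just the agreed requirement written as code that can pass or fail. A minimal, hypothetical sketch in pytest style (the `query_mentions` helper and its behaviour are invented for illustration, not the real Brandwatch API):

```python
def query_mentions(mentions, sentiment=None):
    """Hypothetical query helper: filter a list of mentions by sentiment."""
    if sentiment is None:
        return list(mentions)
    return [m for m in mentions if m["sentiment"] == sentiment]

def test_filter_by_sentiment():
    # The agreed requirement: return only matching mentions,
    # keeping their original order.
    mentions = [
        {"text": "love it", "sentiment": "positive"},
        {"text": "meh", "sentiment": "neutral"},
        {"text": "great!", "sentiment": "positive"},
    ]
    result = query_mentions(mentions, sentiment="positive")
    assert [m["text"] for m in result] == ["love it", "great!"]
```

Because the test names the requirement directly, the product owner can read it almost like plain English, and the devs know exactly when the feature is "done".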
4) This is the part I enjoy the most. For Brandwatch website testing, we check compatibility across all possible browsers and devices, such as smartphones, tablets, phablets, laptops and PCs. We get the chance to play with them like we own them, and for a smartphone enthusiast like me, it's like heaven.
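In practice that compatibility work is a matrix: every browser paired with every device class becomes one test run. A small sketch of how such a matrix can be generated (the browser and device names are assumptions for illustration, not our actual test plan):

```python
from itertools import product

# Hypothetical test matrix; the real browser/device lists will differ.
browsers = ["Chrome", "Firefox", "Safari", "IE"]
devices = ["smartphone", "tablet", "laptop", "PC"]

# Each (browser, device) pair becomes one compatibility run.
matrix = list(product(browsers, devices))
for browser, device in matrix:
    print(f"run compatibility suite on {browser} / {device}")
```

Even this toy matrix yields 16 runs, which is why playing with a desk full of gadgets is genuinely part of the job.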
By the end of this year, all manual testers will be fully skilled on the automated framework and will be self-sufficient for any tasks.
We will of course also look to improve the coverage of the stable features and find better ways to present our work, mostly as part of our famous Funky Fridays.
Meanwhile, we will look for more of the latest gadgets with 64-core CPUs and 8K screens, albeit in a parallel universe.