Software Testing World Cup

This summer saw Box UK take part in the World Cup. However, rather than pitting our dribbling and goal skills against Messi and Ronaldo it was our testing prowess that was being challenged, as part of the Software Testing World Cup. This was the second official event for software testing professionals to measure their skills of detection against fellow engineers worldwide.
We at Box UK are constantly looking to improve our expertise as well as learn new things and broaden our experiences, so when we saw the event was open to applicants for the European round our testing team eagerly signed up. While the first competition in 2013 had 16 teams in total, 2014 was to have six preliminary online regional heats with approximately 250 teams in each region, with the final held in November at the annual Agile Testing conference in Berlin.
Ready, set...

As with any competition, or indeed any software project, it was important we all knew the requirements and rules as part of our preparation, so once registered our training plan was masterminded. We scoured the internet for videos of the first competition, and swotted up on the rules. There would be an initial set-up via video streams before a time-boxed testing period, where a purpose-built software application would be released to teams to test.
Like athletes at the 100m Olympic final, we knew we had prepared well for the competition; we may not have been wearing the lycra, but were just as eager. As testers it is always exciting to analyse and experiment with new systems, but we were also keen to compare ourselves with others.
...go!

The event began in earnest and, armed with tea and biscuits as essential investigation fuel, we set to work.
We had three hours. Simple! Three hours to test a software application! That’s ages!
It sounded like a long time, but assuming time is plentiful is often the first mistake when tackling a testing strategy for any project or application. We had no idea until then what this software was, who had built it, or who the ‘customer’ was. Having to familiarise ourselves with the customer, we quickly absorbed the brief and its objectives before devising our plan and test strategy, including delegating roles. At Box UK our testing team has a wide range of skills and specialities, so we knew we could put the right people on the right areas; something that, again, is key on all projects.
Testing best practices

But how were we to track those juicy bugs we hoped to uncover? Issue tracking is as integral to the software development cycle as the discovering of the issues themselves, and all teams have their own preferred methods. We love JIRA, an agile, powerful and flexible tool which enables us to monitor and process not only issues, but whole projects themselves. The competition, however, required us to use the Pronq Agile Manager tracking system, which we had studied beforehand as part of our ‘training’.
Having a standard bug report format was also essential. This is something we strive to do as part of our everyday work, as bug report writing is the crux of any issue getting logged properly and, more importantly, fixed. If a bug is not explicitly described, how can a developer quickly understand and fix it, or another tester or even a project manager make sense of the ticket when needed? Although there are no set rules, clear replication steps written in concise language are key. Screenshots and, if possible, video screen recordings can also help others understand the issue.
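To make that concrete, here is a minimal sketch of what a standard bug report structure might look like. The field names and the `BugReport` class are illustrative assumptions of ours, not the format used in the competition or by any particular tracker; the point is simply that every ticket carries a summary, replication steps, and expected versus actual behaviour.

```python
from dataclasses import dataclass, field

@dataclass
class BugReport:
    """Illustrative bug report structure; field names are our own assumptions."""
    title: str                 # one-line summary of the problem
    steps_to_reproduce: list   # clear, numbered replication steps
    expected_result: str       # what should have happened
    actual_result: str         # what actually happened
    environment: str = "unknown"   # browser / OS / build under test
    severity: str = "medium"
    attachments: list = field(default_factory=list)  # screenshots, recordings

    def as_ticket(self) -> str:
        """Render the report as plain text ready to paste into a tracker."""
        steps = "\n".join(
            f"{i}. {step}" for i, step in enumerate(self.steps_to_reproduce, 1)
        )
        return (
            f"Title: {self.title}\n"
            f"Environment: {self.environment}\n"
            f"Severity: {self.severity}\n"
            f"Steps to reproduce:\n{steps}\n"
            f"Expected: {self.expected_result}\n"
            f"Actual: {self.actual_result}"
        )

# Example usage with a made-up bug
report = BugReport(
    title="Login button unresponsive on second click",
    steps_to_reproduce=["Open the login page", "Click 'Log in' twice quickly"],
    expected_result="A single login request is sent",
    actual_result="Two requests are sent and the session errors",
)
print(report.as_ticket())
```

However the fields are named, the discipline is the same: a reader who has never seen the bug should be able to reproduce it from the steps alone.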
Hunting for bugs

So, knowing the customer, the product and the requirements, we confirmed our strategy and set out to attack the system with an intense bug-discovering assault course…
… we hoped. But just as we were getting started we were hit with a technical issue – half the team could not access the system! An external problem meant that teams could not log in to access the software.
Frustratingly, we had hit the first hurdle. But like many projects, you sometimes come across external technical constraints, so we ploughed on with what we could – using the time resourcefully to put together templates of reports, etc. – and soon enough, we were able to log bugs.
The testing period was frantic and exhilarating. We were raising bugs left, right and centre, with the live stream on in the background for updates. The pressure was on! With such a tight deadline it would have been easy to start logging useless or inaccurate bugs or poorly-written tickets, but we were conscious that bugs had to be relevant; something that applies to real-world projects too.
We also had to submit an overall report that detailed all issues and gave the ‘customer’ a clear understanding of the status of the software, as well as providing prioritised recommendations for potential fixes and solutions. Normally we would have a lot more time to produce this in order to really hone it, so the challenge to write it quickly was huge.
It was over in a blur. We were tired but thrilled. It had been enjoyable, despite the frustration of technical issues, and we felt we had done well in the circumstances. We left late at night with a happy endorphin-filled bug-raising glow, and our nervous wait for the outcome began!
The results

The results were published weeks later, after the judges had analysed all tickets tracked for relevance, importance and accuracy, marking teams on bugs raised as well as the reports submitted. We discovered that we had been placed 68th out of 250 teams – a really great result.
However, the exercise had uncovered more than just bugs, and one of the most intriguing discoveries was the benefit of testing in a team. It was rewarding to work next to another tester, sharing opinions, observing how they log bugs, and seeing how they work. It’s good to learn from others; maybe they use paper checklists, maybe they use spreadsheets, and you can adopt new techniques to improve your own approach. Together, we learned new software skills and new strategies.
It was also interesting to see how the competition worked via video streams and Twitter communications. You could submit questions to organisers, either about the product or technical issues such as those the teams were experiencing with initial set-up. It showed how powerful Twitter can be as an instant communication tool – I had to tweet the organiser myself to highlight our problems. Even if he did read my name as Bob (sadly I couldn’t log a bug against this user error).
While obviously we would have loved to place higher in the ranking, we felt we had performed to a high standard against many teams and thoroughly enjoyed it. More importantly, though, we had learned a lot of new things we could repeat and take into our everyday real-world projects. But I’m keeping my name Sian!