TEST Magazine - December 2010-January 2011


THE EUROPEAN SOFTWARE TESTER | Volume 2 | Issue 4 | December 2010
IN TOUCH WITH TECHNOLOGY | TestExpo Winter 2010 preview – page 36

THE RACE WITH COMPLEXITY – Ian Kennedy on the challenges of testing major IT projects

Inside: Performance testing | Agile methods | Stress testing
Visit T.E.S.T online at www.testmagazineonline.com





Editor: Matthew Bailey – matthew.bailey@31media.co.uk – Tel: +44 (0)203 056 4599
To advertise contact: Grant Farrell – grant.farrell@31media.co.uk – Tel: +44 (0)203 056 4598
Production & Design: Dean Cook – dean.cook@31media.co.uk; Toni Barrington – toni.barrington@31media.co.uk
Editorial & Advertising Enquiries: 31 Media Ltd, Three Tuns House, 109 Borough High Street, London SE1 1NL
Tel: +44 (0)870 863 6930 | Fax: +44 (0)870 085 8837 | Email: info@31media.co.uk | Web: www.testmagazine.co.uk

‘Appertunity Knocks’

We seem to be in the middle of something of a technical revolution. Wait, if I'd come out with that sentence any time in the last 30 years (or perhaps 200 or even 2,000 years) it would probably have been every bit as valid, but we have arrived at a point where mobile 'computing' – for want of a better term – is available to pretty much anyone who can afford the relatively modest price. As we have seen, the model pioneered by Apple Computers with its iPhone and iPad offerings has caught on and the concept seems to have crossed platforms and operating systems. Making the development tools available to all has fuelled a boom in app production, with numerous applications available. They can be helpful, pointless, pretty, ugly, clever and dumb – all of life is here. We can wonder what we did without them and find them liberating and exasperating all at pretty much the same time. Ultimately though, if we don't like them it's as simple as binning waste paper to get rid of them.

Current favourites of mine include Voice Memo and Dictionary (ideal for the busy journalist and, in the case of the latter, perfect for cheating at the crossword), Traffic Eye (which allows you to view what's happening on our roads through any traffic camera in the country), Filtatron (a Moog synthesiser in your phone!) and Glucose Buddy (a handy repository for the diabetic on the go to record their glucose test results). One would imagine all of these have been tested to some extent; I can't help wondering, though, to what extent? And is it really a revolution or just another distraction in our increasingly chaotic and hectic lives? Discuss.

On a completely different subject, T.E.S.T magazine will be attending TestExpo this month (7 December, Plaisterers' Hall, City of London) and we have a preview on page 36 of this issue. I would urge you all to get down there and network with your colleagues from what I have so far found to be a very outgoing and erudite industry.

While it does seem to be a little early – although having said that, the Christmas adverts have been on the telly for some time now – it only remains for me to wish all our readers a very happy Christmas and a prosperous and successful New Year in 2011.

Until next time

Matt Bailey, Editor

Printed by Pensord, Tram Road, Pontllanfraith, Blackwood, NP12 2YA. © 2010 31 Media Limited. All rights reserved. T.E.S.T Magazine is edited, designed, and published by 31 Media Limited. No part of T.E.S.T Magazine may be reproduced, transmitted, stored electronically, distributed, or copied, in whole or part without the prior written consent of the publisher. A reprint service is available. Opinions expressed in this journal do not necessarily reflect those of the editor or T.E.S.T Magazine or its publisher, 31 Media Limited. ISSN 2040-0160







CONTENTS DEC 2010

1 Leader column – 'Appertunity' knocks? What the rise of mobile computing offers the tester.

4 Cover story – The race with complexity. Ian Kennedy on the challenges of testing in major IT projects and how Agile concepts need to be embraced to create a hybrid project delivery and testing methodology that can cope with the ever-increasing levels of complexity.

8 Don't leave quality behind when you go mobile – With mobile devices poised to challenge PCs as the application platform of choice, Pradeep G says quality has to be an integral part of this picture.

12 Quality insurance – John Peake, senior manager quality assurance at a major personal lines insurance intermediary, tells T.E.S.T how the software testing function is helping his organisation to tackle its increasingly complex business.

16 Shipping Forecast – Facilita's Gordon McKeown explains how his company's Forecast performance and load testing tool is having an increasing impact on the performance testing market.

20 Break That – As hundreds of thousands of Take That fans fought to get their hands on tickets for the band's tour, ticketing websites were unable to cope with the spike in traffic. Graham Parsons makes the case for thorough stress testing.

24 Defining testing practices for Agile methodologies – Testers must take the lead in redefining testing when Agile development processes are introduced to their organisation. Peter Varhol shows how testers can achieve this.

28 The online learning experience – Learntesting provides an innovative, highly scalable distribution platform for online and blended training solutions. T.E.S.T speaks to the company's managing director, Mike Smith.

32 Game-changing performance testing – A partnership between Amazon and Soasta has borne fruit in the form of a performance testing tool that is game-changing, according to Vince Vasquez.

34 Have you aligned your testing metrics to the business scorecard? – Sudip Naha describes how a test metrics analysis and decision model (TMAD) helps in answering questions about cost of ownership, product quality and availability for release; continuous improvement in terms of learning and growth; and process quality.

36 TestExpo Winter 2010 – This year's winter TestExpo is focussed on 'Better Quality with Agility'. It takes place on Tuesday 7 December at the Plaisterers' Hall in the City of London.

40 T.E.S.T Directory

48 Last Word – Dave Whalen is having serious problems with his data.



The race with complexity

Ian Kennedy, managing director of British Consulting, talks about the challenges of software testing on major IT projects. He also explores how Agile testing concepts need to be embraced at management level to create a hybrid project delivery and testing methodology that can cope with the ever-increasing levels of complexity in modern IT solutions.


Most software projects fail. This is an inherently startling statement, particularly when placed into the context of a) the amount of money spent on IT projects annually and b) the number of exceptionally well educated and capable people involved in the industry. Even more disconcerting, our success rate is getting worse, not better. From 2006 to 2009, the success rate of IT projects dropped from 35 to 32 percent, while the failure rate rose from 19 to 24 percent (source: Standish Reports 2006, 2009). The remainder – around 44 percent in 2009 – are classed as challenged, meaning they were eventually delivered but late, incomplete or significantly over budget. Now put this in the perspective of a global IT spend in excess of US$2.4 trillion and growing at around 2.4 percent annually. These are sobering statistics for industry and government alike, and even a small improvement would bring an enormous financial benefit.

The larger they come...

While the numbers are a guide at best, they do raise some important observations; we appear to be getting worse at delivering IT projects, not better, despite both the intellectual horsepower devoted to improving the situation and the sheer number of projects undertaken. More interestingly, research emphasises that larger IT projects are more likely to suffer these effects due to their higher levels of complexity; a comment reinforced by the UK National Audit Office, which points out, across various reports, that large complex projects are significantly more at risk of failure than small ones.

Neither does the future look promising for these metrics. The current drive towards event-driven business models, coupled with a growing requirement for 'on-the-fly' process configurability, Software as a Service, and the inevitable migration to the cloud, means this trend is likely to continue. We have learned enormous amounts in the last decade on how to successfully develop and test IT projects. The problem is that we are constantly playing catch-up to these increasing levels of complexity and we appear to be losing the race. Furthermore, as technological advancement continues to accelerate, the level of IT interconnectedness will only increase, creating ever more integrated and complex solutions that will likely outpace advancements in outdated delivery and testing methodologies that are already struggling to keep up.

This is a challenging time to be a CIO, where performance is measured in IT spend, efficiency and value delivered to the business. Moreover, the 'head of test' is looked at as the person who makes it all work, while test managers are firmly at the front line. "Driving business change and behaviour is the ultimate CIO challenge," says Dan West, CIO of ASOS. "Creating value through technology is what the business really want and strive for. Integrating new products into the business landscape is complex with many dependencies; any method or framework that assists that transition to enable business value quicker should be adopted."

So what have we learned and what might we do to turn the tide in our favour?

The benefits of testing

A good place to start is to look at the problems encountered in delivering major IT projects. Any casual 'Google' will reveal a plethora of papers, blogs and articles about why IT projects fail to work and the benefits of thorough software testing. From detailed research papers through to media stories, the response is generally split into four categories of failure: collaboration and leadership; requirements capture and change; supplier management; and of course testing. While only one directly mentions testing, the other three have major impacts on our ability to test within IT projects. Only by looking at the whole picture can we influence the outcome and create the environment needed to allow testing to flourish.

Every major report on IT project management heavily criticises leadership and management, and rightly so. As leaders, our role is to provide direction and inspiration to our team and we can certainly not abdicate responsibility or accountability when things do not go the way we hope. As managers, our role is to provide the planning, execution, monitoring and control needed in projects, as well as a sound duty of care to our teams. If we cannot create and maintain the framework needed for our project to succeed then we are doomed to failure from the outset, as our teams will not have the necessary environment in which to flourish. But which framework should we use? Much research has been undertaken on development and testing methodologies that we can generally classify into one of two camps: sequential methodologies such as Waterfall or the 'V' lifecycle, and iterative methodologies such as DSDM, Extreme Programming, Scrum and, most recently, Agile.

Methodologies

Sequential methodologies such as Waterfall make heavy use of the project management 'Iron Triangle' of cost, scope and schedule. This focus on quality is important but also fundamentally constrains our ability to react quickly to change. This constraint is purposeful and allows disaggregation of scope under a reductionist delivery model that manages problems by segregating elements into smaller and smaller units until they can be managed in isolation within small teams under strict governance and centralised change control. This approach is particularly prevalent on large projects, ie, large infrastructure and government-led projects where a high value is placed on fixity of requirements and where the project will likely run over many years. Corporate and government supply chains are keen proponents of this model as it allows for commoditisation of scope and thereby drives down the financial cost of procurement at the outset by increasing the amount of competitive tendering.

Unfortunately, this approach is flawed as it leads to increasing numbers of stakeholders, interface boundaries, and delivery suppliers, and thereby injects further organisational complexity, and risk, into the project. The over-commoditisation of supplier scope also has a tendency to encourage clients into adopting financially punitive contracts, forcing each supplier to focus on the delivery of their own scope rather than fostering a project environment of inter-supplier co-operation, which is a critical element to overall project success. The resulting impact to collaboration introduces significant risk into the delivery and testing phases of the project in the form of integration problems and bugs, which usually occur right at the end of the lifecycle when timescales are under most pressure and there is little time to fix emerging issues prior to deployment. In summary, this type of approach seeks to control complexity rather than accepting that change is a fundamental part of business, and therefore of any project tasked with delivering to it.

The Agile way

Contrast this with an incremental methodology like Agile (http://agilemanifesto.org) that seeks to address some of these problems by delivering projects in a highly flexible, devolved and iterative manner, based around delivery and testing via a series of functional promotions into the business. Agile teams are taught to be flexible, adaptable, highly cohesive and to have a focus on delivering business value. In a pure Agile team, if you can find one, everyone is multi-skilled and working towards the delivery of fully tested and operable software in short duration sprints of a few weeks at most. Heavy use is made of test automation and exploratory testing to validate changing requirements as quickly as possible, and a customer representative is a key member of the team. Agile projects also have a tendency to utilise documentation and specifications to a lesser extent than more traditional projects – although it is important that we do not confuse this with an active disregard for requirements. Instead, Agile teams utilise a storyboarding technique to promote ideas and are highly focused on face-to-face verbal communication between team members to identify and solve problems quickly with minimum overhead.

While this is an excellent approach with many advantages, it does not scale well into major IT projects, as the proximity between all project participants cannot be maintained and testing, at some point, ultimately needs to occur across multiple teams, ie, integration and acceptance testing. The size of larger IT projects also means that it is very difficult to deliver business functionality out of the project in short duration drops, a key requirement of this type of methodology that is needed in order to maintain focus. There are simply too many people and too many things going on in a large complex IT project to utilise a pure Agile team approach, while simultaneously managing project teams as a cohesive entity and maintaining a grip on all of the requirements. This, along with strict requirements capture and management, is why larger projects tend to adopt Waterfall derivative methodologies.

Capability Based Integration

So if Waterfall is too rigid and Agile is non-scalable, how can we deliver and test large complex projects better? What's needed is a hybrid approach that can apply the flexibility, responsiveness and business value focus of the Agile methodology into a major project where dozens of suppliers and hundreds, or thousands, of people are trying to collaborate to deliver and test software. This requires a methodology that spans not only software development and testing, but also project management, commercial contracts, procurement and the wider business that will utilise the delivered solution. This is needed in order to create the correct collaborative approach that we all intuitively know works best.

One method that has had some success is Capability Based Integration. This is essentially a transparent, integrated programme delivery and testing methodology with testing concepts at its core. It structures the project team, both commercially and organisationally, to deliver working business capabilities rather than individual systems or functions. In doing so it provides an environment for suppliers and team members to work together to achieve objectives collectively in an iterative manner. It achieves this in five main ways:

- Commercial and legal frameworks in which individual suppliers are incentivised to work together for mutual gain.
- Top-down operational requirements mapping, from business KPIs down to individual process steps and IT system functions.
- A risk-based delivery and testing model focused on building up operational business capability in order of priority as early as possible.
- A lightweight and integrated management structure sitting between the business and individual suppliers, managing integrated delivery and testing, whilst assuring project direction is maintained.
- An end client willing to embrace a highly interactive and adaptive project delivery methodology, and to accept that they are a key part of its success rather than a divorced entity that will use the end product when it finally works.

Suppliers are encouraged to adopt an Agile approach to delivering operational functions, ordered by business priority. This enables teams to concentrate on the most important business aspects first before increasing capability in terms of load and complexity. The commercial model facilitates an environment where collaboration may take place across separate teams and where scope change can occur as the project develops without suppliers being unfairly penalised. This is important, as without it the focus of the supplier can only be on the completion of their individual contract, irrespective of the impact to the wider project.

South Africa success

This approach was most notably utilised in the recent King Shaka Airport development in Durban, South Africa, by the iLembe consortium, which was charged with delivering the integrated systems and IT to the Airports Company South Africa (ACSA). The project is one of only a few airport developments that have successfully opened on time at full capacity and is a good example of a large complex IT-driven project fulfilling all project criteria successfully. It has also recently been awarded Overseas Project of the Year 2010 by the Association for Project Management (APM).

As a result of this early success the approach is now gaining interest from several blue chip companies, particularly those sponsoring or delivering major IT projects who recognise that projects are not black boxes that suddenly deliver change, but instead must be highly flexible and adaptable whilst remaining rigorous and business value-focused at the same time. This methodology is still a work in progress. However, it has had success and offers a solution that scales some of the excellent Agile development concepts into large-scale IT projects to create an environment where everyone can collaborate for project success in a constantly changing environment.

Whichever methodologies are used in the future, it is clear that a strategy of sitting tight and relying on existing best practice will leave us in a progressively worse position. New and better ways must be found to successfully test increasingly complex IT projects that need to constantly interact with highly flexible and adaptable businesses that cannot stand still.

Ian Kennedy, managing director, British Consulting – www.britishconsulting.com




Don't leave quality behind when you go mobile

Mobile devices are poised to challenge PCs as the application platform of choice, with 412 million mobile internet devices expected to ship in 2012 compared to 139 million PCs. Businesses in the US alone are expected to spend $11.6 billion on mobile applications that year. Pradeep G, head of mobile testing at Cognizant, says quality is an integral part of this picture.

To tap the massive and growing mobile device market, developers are creating applications that enable everything from mobile banking to location-based advertising. But the commercial success of these applications depends on them working smoothly and securely on a wide variety of handheld devices and networks. Performing testing quickly and cost-effectively greatly expands the market, but the complex hardware and software environment makes such testing vastly more complicated than for applications designed for PCs.


Hardware complexity

With PCs, testers have essentially only one central processing unit platform on which to test applications. Most of the other hardware components that go into a computer, such as disk drives, graphics processors and network adapters, are thoroughly tested for compatibility with those operating systems and pose a minor risk of problems. Their display formats also fall within a relatively narrow range, and input devices (keyboards and mice) are well-known and familiar. But mobile carriers differentiate themselves by offering a dizzying range of handsets, each with unique configurations that can have unpredictable effects on the performance of applications. Handsets are built around a wide variety of processors, running at various speeds with varying amounts of memory, as well as different size screens operating in different resolutions and orientations.

Today's handsets also contain a greater, and a more rapidly changing, variety of hardware than the typical PC. These may include Wi-Fi and Bluetooth network capabilities, a camera, and in more and more cases a GPS receiver and even an accelerometer, which senses movement of the device. Many devices rely on multiple digital signal processors (one to handle voice communications, the other to process audio, video and images), as well as multiple input devices, such as touchscreen and keypad. Each combination interacts in different ways with each other, and with the operating system, to create potential compatibility and performance issues that must be addressed.

The handheld operating system market is more splintered than that for PCs, but is also changing more quickly. Just five years ago, Palm was a dominant operating system for mobile devices, but its influence is rapidly fading in favour of iPhone OS, Android, RIM and others. This means that testers must maintain knowledge of the tools required for an ever-changing cast of operating systems. Such complexity continues throughout the mobile software stack. Various devices might utilise runtime environments ranging from J2ME to the .NET Compact Framework or BREW. Developers can also choose among different rendering standards. Testers must build and execute scripts checking the interaction among the various applications on the handset, as well as between the application and components such as the camera, microphone, charger, Bluetooth and Wi-Fi.

Applications must be tested for their compatibility with any of the networks on which any given device might run. Different carriers use different methods to tunnel their own traffic into the TCP/IP protocol used by the Web, changing how applications receive and transmit data. They also use different Web proxies to determine which sites their users can access, and how they will be displayed. All of these differences can affect the stability, performance or security of an application, and must be tested to assure the end-user experience. Just as mobile operating systems are constantly changing, so are the networks, protocols and other key infrastructure elements. Carriers worldwide are upgrading their networks from 2G to 3G and even to 4G with LTE (Long Term Evolution) networks. Finally, in order to be certified, applications may have to be tested for compliance with industry or carrier-specific standards. Each of these added test requirements increases the complexity, cost and time required to ensure proper performance.

The picture is not that bleak, though. Testing that meets these challenges helps speed an application to market by reducing rework at later stages of the development process. Studies show that finding and fixing defects after production can cost as much as 200 times more than it would have during testing. Proper testing also helps increase the range of devices, carriers and operating systems the application will run on, increasing its profit potential.
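The carrier proxy point made earlier in this section can be made concrete with a small sketch. The Java fragment below is purely illustrative – the proxy host, port and target URL are invented placeholders, not details from the article – but it shows the kind of comparison a tester might script: fetch the same resource directly and through a carrier-style HTTP proxy, then compare the outcomes.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

/**
 * Minimal sketch: fetch one URL directly and via a hypothetical
 * carrier-style web proxy so differences in behaviour can be compared.
 * Host names, port and URL are placeholders, not real services.
 */
public class CarrierProxyCheck {

    static int fetchStatus(URL url, Proxy proxy) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection(proxy);
        conn.setConnectTimeout(10000);
        conn.setReadTimeout(10000);
        try {
            return conn.getResponseCode(); // e.g. 200, 302, 403...
        } finally {
            conn.disconnect();
        }
    }

    public static void main(String[] args) throws IOException {
        URL target = new URL("http://example.com/mobile/login"); // placeholder

        // Direct connection, as a desktop test client would see it.
        int direct = fetchStatus(target, Proxy.NO_PROXY);

        // Same request tunnelled through a carrier-style HTTP proxy.
        Proxy carrier = new Proxy(Proxy.Type.HTTP,
                new InetSocketAddress("proxy.carrier.example", 8080)); // placeholder
        int viaProxy = fetchStatus(target, carrier);

        System.out.printf("direct=%d viaProxy=%d%n", direct, viaProxy);
    }
}
```

In practice the real proxy settings of each target carrier would be substituted, and the comparison would cover response content and headers as well as status codes.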

Emulators: Use with care

Ideally, all mobile application testing would be done on the target device so that every possible interaction among its hardware and software elements, as well as with the carrier's network, could be tested in the most accurate environment. However, acquiring every possible device and performing testing on it is too complex and costly to be feasible. Device emulators – software that simulates the performance of the physical device – are easier to obtain and less expensive than samples of the physical devices. While they can be less accurate than the actual hardware, they can be cost-effective when used appropriately. We recommend using emulators during the initial phases of development, to validate the function of units of source code as they are completed. Early testing can find problems early in the development cycle and thus reduce the time and cost of rewriting. We recommend testing on the more accurate, but more expensive, physical devices for user acceptance testing of the application, and for testing features that interact with the hardware and the network, since these are where otherwise hidden problems are most likely to be found.

Emulators for a wide range of popular handsets are available on the Web, ranging from freeware to more expensive commercial applications. Emulators can be used to test Web applications using the software development kit for a browser, or by packaging the application as a .jar or .sis (platform-specific) file, installing it on the emulated device and testing the application.




Web servers and pages meant to be accessed from mobile browsers can be tested using Firefox browser plug-ins, by loading information about the handset such as user agent details, headers and handset/vendor ID into an .xml file. Using XHTML and WML add-ons to the browser, testers can verify if the Web pages display correctly.
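As a rough companion to the browser-based approach just described, the sketch below shows the same idea in Java: presenting handset identity details (user agent, a UAProf reference and accepted mobile markup types) when requesting a page meant for mobile browsers. All header values and URLs here are invented placeholders rather than details taken from the article or any particular handset.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Minimal sketch: request a page while presenting handset identity details,
 * similar in spirit to loading a handset profile into a desktop browser.
 * Header values and URLs below are illustrative placeholders only.
 */
public class HandsetProfileFetch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/mobile/home"); // placeholder

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Identify as a hypothetical handset rather than a desktop browser.
        conn.setRequestProperty("User-Agent",
                "ExampleHandset/1.0 Profile/MIDP-2.0 Configuration/CLDC-1.1");
        conn.setRequestProperty("X-Wap-Profile",
                "http://example.com/uaprof/example-handset.xml");
        conn.setRequestProperty("Accept",
                "application/xhtml+xml, text/vnd.wap.wml, */*");

        System.out.println("Status: " + conn.getResponseCode());
        System.out.println("Content-Type: " + conn.getContentType());

        // Print the first few lines so a tester can eyeball the markup served.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            for (int i = 0; i < 5; i++) {
                String line = in.readLine();
                if (line == null) break;
                System.out.println(line);
            }
        }
    }
}
```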


Pradeep G, head of mobile testing, Cognizant – www.cognizant.com

Test processes

Many organisations building mobile applications are highly attuned to the latest changes in technology. However, they are often less aware of the need for proper development and test processes. This means that undetected performance issues can cripple their applications, no matter how appealing they may be to consumers. In such organisations, the test group reports to the development group, robbing it of the independence it needs to insist on proper testing. Such organisations should create an independent test unit, or use a third party that can deliver an unbiased assessment of the application.

Because of the emphasis on speed, many mobile applications are developed using RAD (rapid application development) in which multiple versions of the software are quickly developed, assessed by end users and tweaked accordingly. This rapid-fire cycle makes it almost impossible to assess how each change affects performance, stability or security. For this reason, we recommend using a 'V' or modified V-form methodology in which testing is done as each unit of code is developed, thereby resolving problems at the unit level before those units are combined into larger application modules and evaluated by users. The V-model should be used for all core application components such as the phonebook, messaging and Bluetooth stacks. It should also be used for mobile business applications to reflect the more iterative development model used for them. RAD should be used only for prototypes, since adequate testing is virtually impossible using this model.

Test scripts

Finally, developing mobile applications requires changes in how organisations build accurate scripts for user interface, functional and standards compliance testing. User interface test scripts should be based on specification requirements provided by the clients, such as descriptions of how each page should look on each target platform, definitions of how UI components are used on various devices, descriptions of device-specific interactions such as the use of soft keys, and of any required keypad actions that are not obvious.

Functional test scripts should be developed using system or specification requirements provided by the client. These should include elements such as input and output parameters, menu selection criteria, and descriptions of other operations unique to the mobile environment, such as incoming calls.

In building scripts, testers should be aware of certification standards developed by not only individual vendors or carriers, but industry groups. Some of the most important of these come from the Open Mobile Alliance (OMA), to which all major handset vendors comply, covering areas including digital rights management, content provisioning and device management. Carriers may also develop their own test scripts to determine if applications comply with online application stores, and are installable on various mobile devices. This type of testing is called Operator Acceptance Testing.

Delivering a satisfactory user experience

The mobile application market holds massive promise, but even applications that address a popular market niche will fail if they do not download, install and function properly on mobile devices. No matter how 'cool' or timely an application, it will not succeed if it does not deliver a satisfactory user experience. On the other hand, devices and applications that exceed expectations for ease of use, such as the Apple iPhone, can deliver extraordinary success.

Despite the time pressures of mobile development, proper testing is vital to increasing long-term success and differentiating an application in a highly competitive market. This requires taking into account the complexity of the mobile application environment and the needs of end users, as well as careful use of the specialised tools and processes needed to cope with this unique environment.





Quality insurance

John Peake is the senior manager quality assurance for one of the UK's largest personal lines insurance intermediaries, BGL. Here he tells T.E.S.T's Matt Bailey how the software testing function is helping BGL tackle its increasingly complex business.

The BGL Group includes some household names that everyone should be familiar with: comparethemarket.com and BUDGET car, van and home insurance, to name two. The company was founded in 1992 and has grown to become one of the UK's largest personal lines insurance intermediaries. It also works with brands like Post Office, HSBC, Santander, M&S Money, RAC and Auto Trader to offer insurance products to their customers. The BGL Group has 3.5 million customers and operates major contact centre operations (Fusion) in Peterborough, Coventry, Sunderland and Cape Town, South Africa. Headquartered in Peterborough, it currently employs more than 2,150 people.

John Peake is the company's senior manager quality assurance. He has had various roles within testing, including work as a test consultant for large Government accounts as well as test managing programmes in the private sector. He is ISEB Practitioner qualified.

"The BGL Group specialises in the provision of car, van, bike and home insurance and also offers a wide range of supplementary products including personal accident cover, breakdown cover, home emergency assistance, legal protection, travel and pet insurance," explains Peake. "It has six distinct business units: comparethemarket.com – a leading insurance price comparison website; Frontline – specialising in motor and home insurance with more than half a million customers and five brands including Budget, Dial Direct and ibuyeco; Bennetts – the UK's number one for bike insurance; Junction – the affinity business, which enables the Post Office, M&S Money, Bradford & Bingley, Auto Trader, Santander, Barclays and others to offer insurance products to their customers; Fusion – the contact centre business, supporting all the other business units with customer liaison services; and ACM ULR – an end-to-end service for customers making accident claims."




Driving profit – simples!

The effects of the current economic climate are being felt throughout business in the UK; surely BGL has had to adapt to the new leaner and meaner paradigm? "We have been fortunate to be able to continue driving profit and have just announced our 13th consecutive year of profit growth," says Peake. "This has largely been down to the company's consistent innovation and its ability to spot new distribution opportunities. Our price comparison site, Comparethemarket.com, achieved 70 percent growth this year. Because of the growth of price comparison sites, people are increasingly shopping around at renewal. There's a significant opportunity for insurance brands to benefit by engaging their customers so that they decide not to change providers at the end of their policy."

Obviously the rise of the price comparison site has been driven by IT (and advertising!), but what are they actually offering the insurance industry? "They continue to offer opportunities for reaching millions of customers by having a compelling offer," explains Peake. "The opportunity to save money. In addition, new technologies such as m-commerce are likely to pose a significant opportunity for early adopters. One of the key challenges at the moment is the ongoing increase of insurance premiums, led by increasing rates from insurers as they continue to suffer from lower investment returns since the dawn of the recession. Higher rates mean higher premiums for the consumer, so it's down to insurance brands to find ways of lessening the impact on customers."

The rising popularity of price comparison sites makes them a vital tool for insurers right now. "We have established strong relationships with the top aggregators as well as our own Comparethemarket site," says Peake. "In addition, we have used a number of creative advertising and social media strategies over the last year to really promote our own price comparison site to consumers. Our affinity arm, Junction, has established relationships with some of the UK's leading financial services and consumer brands. We now offer insurance products on their behalf and work closely with them to develop consumer focussed marketing campaigns."

Testing at BGL

Clearly, the software testing function at BGL is a crucial one. John Peake explains their role. "At BGL we understand our risk quickly and adapt our testing to deliver the underlying needs of the project. IT remains focused on what we want to achieve from our testing to stop it being seen as an overhead but as a value-add activity," he says.

"As a group we are currently very focused on building quality into every process and not relying on testing to flush everything out. This will reduce the cost of errors but also give testing the ability to achieve higher coverage and success rate. Testing alone does not improve your quality but helps to highlight where there is a quality issue. Using this information from testing is a powerful tool in evolving quality," says Peake.

"A key BGL strength is its agility – our willingness and ability to move quickly in Quality Assurance (QA) can be a deciding factor in success. We are never afraid to challenge what we have done previously to make sure it is valid for the current requirements. Understanding risk and how each test will mitigate against it ensures that the speed of change is kept in balance.

"We are constantly aiming for maximum efficiency from our testing process. Getting the most coverage from one script, ensuring we are testing the right things and combining tests are all day-to-day activities expected from our Quality Assurance Analysts."

Threats and opportunities

Issues like automation and off-shoring have been around for a while in testing and, depending on your outlook, could be viewed as threats or perhaps opportunities. John Peake is upbeat about them though. "Everything has the potential to be an opportunity," he says. "We have recently automated our regression process using an offshore model. Once created it was handed back to BGL to maintain. We selected Quick Test Pro (QTP) to work alongside our investment in Quality Centre. Before we went down the automation route we ensured we were confident in our manual scripts and that the level of detail was adequate for an offshore team to pick up. We also had timings for each script so we could rationalise the benefits. We were very keen not to fall into the high-maintenance trap that can often be associated with these tools. Communicating with the supplier (HP) directly to understand the best way to keep this to a minimum was also key.

"Rather than just handing off the project we kept heavily involved with it all the way and, while we had our challenges, effective communication and ensuring requirements were understood by all stakeholders helped set us up for success. The IT strategy is aligned closely to our business units so we currently have no major offshore plans."

The changing role

We live in a changing world and nowhere is this more keenly felt than in the IT sphere. There is plenty afoot to keep testers on their toes, but John Peake thinks the role of the tester has changed. "One of the big issues for testers is the expectation from the role," he says. "Not too long ago it was seen as a simple case of 'checking if it worked'. Time has moved on and now you need considerable skills to succeed in the testing world. Analysis and process improvements are all day-to-day skills for the modern tester.

"Cloud, virtualisation and mobile technologies are all areas that impact BGL and the testing team. While any new technology is a challenge, being involved as early as possible and working alongside the development team helps build a common understanding and sharing of knowledge. Environments are usually the challenge, and getting all the applications to talk to each other. Integration tests need to be run as early as possible. Once you gain this stability your value increases dramatically from each script run," explains Peake. "Security and load testing are always high on the agenda. Looking to carry these out as efficiently as possible is a constant challenge."

Of course a tester needs to be armed with the correct skills and experience to tackle this ever-changing market. "All the QA team at BGL are at minimum ISEB foundation qualified," says Peake. "We also have several practitioners on the team. Ensuring testers all speak the same 'language' is one of the benefits of everyone having this qualification. The expectation back from those on the team who are practitioner qualified is to drive process improvements and efficiencies. It's not having the piece of paper that matters, but what you can do with it!"

John Peake, senior manager quality assurance, BGL Group – www.bglgroup.co.uk


Shipping Forecast

Facilita's Gordon McKeown explains how his company's Forecast performance and load testing tool is having an increasing impact on the performance testing market.

Facilita has gained an impressive set of clients over the last few years. So what has attracted such technically savvy users as CMC Markets, ASOS, and EADS? We recently caught up with Facilita's managing director Gordon McKeown to ask him about the performance and load testing tool Forecast and how it is having an increasing impact on the performance testing market.

T.E.S.T: What is the history of your company?

Gordon McKeown: Facilita started as a software design and development service company. The founders, who are still at the core of the company, are seasoned software developers. In the mid 1990s the DSS, now the UK Department for Work and Pensions, commissioned us to create a load testing tool for benchmarking servers from suppliers such as IBM and ICL. We developed and supported the software and sold it commercially in a limited way. The software was then developed further (becoming Forecast V2) and sold to several blue chip organisations. The decision was made in 2000 to specialise as a load testing vendor and to create a commercial load testing tool built on the experience we had gained. With Forecast Version 3 (V3) we started to make significant inroads into the UK test tools market. Our product development was and continues to be very intensive. When we created Forecast V4 we carried out a major re-design, incorporating customer feedback, and created a flexible platform for adding functionality, targeting new technology and enabling the testing of difficult user applications. Our ambition from the outset was to create the most effective and sophisticated load testing tool possible and to deliver outstanding support and highly focused complementary services. We are expanding internationally, have recently appointed a reseller for Spain and Portugal and are actively investigating other markets.

T.E.S.T: What is the specialist product area of your company?

GM: Load and performance testing: we are the developers of Forecast, the load testing tool, which we sell direct to end-users or as a managed testing service. We also partner with a select group of companies that provide complementary products. We supply training and consultancy services based around our tool.

T.E.S.T: What specific tools and/or systems set your product offering apart from the competition and why?

GM: Forecast is very scalable and it can simulate large numbers of users and high levels of load. Forecast can target a wide range of technologies, not just web sites. The range of base VU types includes Web, Windows GUI, Citrix, client-side Java, client-side .Net, and socket level. This range, combined with a high level of functionality and good support, means it can be genuinely described as 'enterprise level' and so it directly competes with the small set of high-end tools in the market. Forecast has the features needed for Agile testing as well as 'traditional' testing approaches and integrates well with other tools. What differentiates us? Forecast has a modern, fully object-oriented design that brings great benefits for extensibility and flexibility. Forecast can be extended rapidly to target 'difficult to test' applications. The extensibility features can save a fortune in costs by reducing the effort required to create tests. We provide excellent support. Finally, we have highly competitive pricing and sensible licensing that is not over-restrictive.

T.E.S.T: Explain how the product works.

GM: Forecast simulates a population of users of a System under Test (SUT), measures response times, checks the validity of responses, and monitors SUT behaviour and server and network resources under load. During test execution a central controller (Test Controller) manages a collection of distributed Load Injectors. The number of load injectors can range from one to many hundreds. Each load injector can host from one to several thousand virtual users depending on the target technology and the application. Test Controller monitors both the test and the SUT with a flexible real-time display of data in charts and tables, such as response times, errors and the use of systems resources. The Studio component is used to create test scripts, select test data and to specify tests. A test can be simple or highly complex as required and can be composed of a mixture of VU types executing a mixture of scripts, each individual VU configured with distinct test data. Studio interworks with popular IDEs such as Microsoft's Visual Studio, Eclipse and NetBeans. Test scripts (in a standard, widely used language such as Java, C# or C++) are usually generated automatically by tracing the application. Powerful mechanisms for automatically correlating dynamic data and parameterising input data can be supplemented by user-defined generation rules or by extending the relevant script generator. A powerful statistical analysis and reporting tool (Analyzer) is provided that integrates the data collected during test execution with data imported from additional monitoring tools.
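Forecast's generated scripts are not shown in the article, so the following is only a generic, heavily simplified Java sketch of what any scripted virtual user conceptually does – issue a request, time it and validate the response – rather than Forecast's actual API. The class and method names are invented for illustration.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Generic sketch of what a scripted virtual user does: issue a request,
 * time it, and validate the response. This is NOT Forecast's API - the
 * article only tells us that real scripts are generated in Java, C# or C++.
 */
public class SimpleVirtualUser implements Runnable {

    private final String targetUrl;      // would normally come from per-VU test data
    private final String expectedToken;  // simple response validity check

    public SimpleVirtualUser(String targetUrl, String expectedToken) {
        this.targetUrl = targetUrl;
        this.expectedToken = expectedToken;
    }

    @Override
    public void run() {
        try {
            long start = System.nanoTime();
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(targetUrl).openConnection();
            String body = readAll(conn.getInputStream());
            long elapsedMs = (System.nanoTime() - start) / 1000000;

            boolean valid = conn.getResponseCode() == 200
                    && body.contains(expectedToken);
            System.out.printf("VU response: %d ms, valid=%b%n", elapsedMs, valid);
        } catch (Exception e) {
            System.out.println("VU error: " + e.getMessage());
        }
    }

    private static String readAll(InputStream in) throws Exception {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = in.read()) != -1) sb.append((char) c);
        return sb.toString();
    }

    public static void main(String[] args) {
        // A real controller would start hundreds of these across load injectors.
        new Thread(new SimpleVirtualUser("http://example.com/", "Example Domain")).start();
    }
}
```

A real tool adds the parts that matter at scale: coordinated ramp-up across load injectors, correlation of dynamic data between requests, per-VU test data and centralised collection of results.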





T.E.S.T: Who is a typical user of your products?

GM: Forecast is used by specialist testing teams, developers and performance consultants. Often there is a division of labour so that an individual may only use one aspect of the tool. It is quite common for different people to be responsible for test creation (including scripting), test execution and analyzing the results and creating reports. The tool is well suited to a variety of testing cultures. It has the features needed for Agile testing as well as 'traditional' testing approaches.

T.E.S.T: How do your products help a typical user in their software testing tasks?

GM: Load testing cannot be carried out adequately on a manual basis although some have attempted this. So the primary aim of Forecast is to provide the crucial functionality needed for successful load testing. We then aim to go further and to increase the productivity of our users. One example of this is our successful efforts to decrease the amount of manual script editing required to test highly complex applications. Our tool has a particular appeal to sophisticated users because it is configurable, flexible and interworks well with other tools such as IDEs and monitoring tools. We eschew gimmicks and instead concentrate on good engineering. For instance there is no proprietary scripting language to learn.

T.E.S.T: How does your product help the user to address specific QA problems?

GM: Bugs that only manifest with concurrent users, memory leaks, poorly configured middleware, inefficient application code, poor database indexing, deadlock and wrongly sized hardware are but some of the issues identified by load testing! The tester can apply realistic load to servers that is indistinguishable from a population of actual system users, measure response times, check for errors and monitor the "vital signs" of the servers and network. They can also check how the system will perform if the user population were to increase, how it reacts to severe conditions and how it will fail under stress.

T.E.S.T: Are there any specific support services that you offer for these products, if so what makes them better than the competition?

GM: We offer excellent technical support without excessive bureaucracy. If necessary a Forecast developer will directly deal with a challenging issue. Load testing is a demanding discipline so we offer mentoring as well as more conventional class-room based training.

T.E.S.T: What overall are the benefits of your product compared to the competition?

GM: It has a wider technical reach than most tools and can target Web, client-side Java, client-side .Net, network level messaging, GUI testing, Citrix etc. It is open and flexible. This provides 'future proofing' and not only allows us to evolve the tool rapidly but also enables advanced users to target complex applications more effectively. In many circumstances our approach to test script production ("intelligent generation") can save a lot of time and effort and can result in big savings and better tests. Purchasing a tool isn't just a question of a technical tick list. The support and back up and even the attitude of the vendor to its customers are vital factors that win business for us. Realistic pricing and licensing mean that we represent genuine value for money.

T.E.S.T: Do you have any plans to develop the product in future, if so, how?

GM: We have a very ambitious development programme that is based around tracking technical changes, constant striving to improve the product and, most importantly, user feedback. There is an 'arms race' where we must match the development of technology used to realise applications. This particularly applies to the client-side where we need to emulate real users effectively. For instance, we are improving our approach to Ajax Web clients that use 'push' technology. Other highlights for the near future include enhancements to charting and reporting, better integration with other tools and more comprehensive monitoring capabilities. We are planning to maintain or even increase the pace of our technical innovation because we believe that although Forecast is one of the best tools currently available it can be made even better. So please watch this space!

www.facilita.co.uk






Break That

On Friday 29 October 2010, hundreds of thousands of Take That fans logged onto the internet simultaneously in a bid to get their hands on tickets for the band's latest tour. Although demand was expected to be astronomical, fans were left frustrated as the ticketing websites were clearly unable to cope with the spike in traffic and subsequently failed. Graham Parsons believes that it is totally unnecessary for websites to fail under the weight of heavy user traffic.

In October this year, Take That sold in excess of a million tickets for what turned out to be the fastest selling tour in UK history. Demand for tickets was predicted to be extraordinarily high, so it was surprising to see that, as reputable and experienced ticketing websites opened for business, they found they were unable to cope with what should have been, in essence, a hugely anticipated rush of traffic. Thousands of fans were confronted with error pages from most major ticket retailers. Over the following hours, reputations came under fire as a huge outpouring of frustration was unleashed on forums and across social networking sites at the failure of ticketing websites to keep up with demand.

Passing the blame

In these situations site owners often vaguely blame the 'technology' behind the site for not being able to cope at peak traffic times. In an official Take That statement issued during the rush to purchase tickets, the band said: "Massive demand this morning has caused phone lines and websites to jam as ticket agents have struggled to cope with the number of people trying to buy Take That tickets." With Ticketmaster receiving over 20 million page requests on the morning the tickets went on sale, an official statement added: "We had planned for the demand for Take That tickets to potentially exceed anything we'd ever experienced before at Ticketmaster, and believed we would be able to respond... We have undoubtedly seen an unparalleled level of demand today and whilst hundreds of thousands of tickets have been sold, we know that many of our consumers have experienced frustrating delays in securing their tickets."

The truth, however, is that there is absolutely no technical excuse for the ticketing websites to crash under heavy traffic load, particularly when the traffic levels had been so predictable. Leaving to one side the cynical view that there is no such thing as bad publicity, consumers deserve to know that such events are not an inevitable price to pay for the convenience of buying tickets online. They are, in fact, totally preventable, if only the companies involved were prepared to make use of the latest innovative web performance testing tools. No doubt the IT teams responsible for the underperforming ticket portals will protest loudly at the accusation that they failed to test their sites before the avalanche of fans started hitting the 'buy now' button. So if that was the case, the logical follow-up question is: why do systems that have apparently been properly load tested still fail in these, admittedly, extreme circumstances?

Have a little patience
While sites can crash under heavy user loads for various reasons, this type of failure can generally be prevented if the web application and the hardware or network resources – including the capacity of the internet connection – have been subjected to correct and realistic testing. As many professional testers know, a system may fail to perform under peak load for a number of reasons: underspecified hardware; the capability, or incorrect configuration, of the underlying infrastructure (database management systems, web servers, etc.); or performance defects which exist within the application code itself. Testers will equally know that there are tried and tested methodologies and off-the-shelf testing tools available that should be able to tell them exactly what needs to be done to ensure that applications are fit for purpose, provided that they are given the necessary resources and budgets to do the job correctly. But here we get to the crux of the problem: traditionally, performance testing tools are complex and expensive, requiring specialist knowledge and expertise plus weeks, if not months, of time to test and re-test until all the bottlenecks have been identified and eliminated. Development and Quality Assurance (QA) teams, however, operate in the real world and are usually working against commercially critical deadlines within limited budgets. This can mean that in order to get the application ready in time, corners have to be cut and compromises made. The result is the sort of high-profile failures that loyal Take That fans experienced on 29 October this year.

What can be done?
Over the last few years, new testing tools that slash the set-up and configuration time needed by traditional tools have begun to emerge. This means that there is now no legitimate excuse for testers not to correctly test each user journey, using every possible variable to replicate typical online user behaviours, under the most extreme concurrent user loads. The new tools are designed from the outset to be easy to learn and use, reducing performance testing timescales from weeks to days and enabling correct and realistic testing, regularly and often, during the development process. It is a broad assumption that the development teams behind the ticketing websites had either been ignorant of the new testing paradigm or chose to stick with what they believed were tried and tested processes.

The main difference between the old and the new generation of tools is that testers no longer need to be scripting experts and can focus their available time on the actual testing. One such tool, StressTester, for example, is 100 percent scriptless and includes wizards and embedded video tutorials to guide testers through the configuration process. This means that realistic tests can be set up in just one or two days from first downloading the tool. Obviously, setting up the tests and ensuring that each transaction and user journey can be replicated is just the first stage in the testing process, even if it is traditionally disproportionately time-consuming. To correctly test the application, these transactions need to be put under the realistic simulated loads that replicate the live online environment. This is not just a case of gradually increasing the number of virtual users until the application falls over; the test should also aim to replicate the whole gamut of user behaviour patterns, so that the tester can be sure every simulated user is really generating the load that would be generated by a real user achieving real user objectives. This includes factoring in sleep and think time; provision for all the alternative navigation and input method choices; and requests formatted, timed and threaded as they would be by clients with differing characteristics. With many of the traditional tools this critical task requires specialist tool experts and many man-days of manual 'trial and error' effort to get right. With the new generation of tools the process has been largely automated and simplified, so that it can be completed by individuals with just moderate IT skills in a fraction of the time, and by generating varying load patterns from a range of geographical locations it more accurately reflects the actual operating conditions a website has to cope with in the live environment.

With many traditional tools a major limiting factor is the cost for the number of virtual users that can be generated. This is one of the main reasons why applications are not tested to their ultimate limits, especially when we are talking about the sort of traffic spikes the Take That fans were generating at the peak times. Virtual users in the new tool environment are much more realistically priced, enabling testers to put the application under a realistic load without incurring prohibitive additional costs. With such tools not only widely available but also increasingly adopted by many of the owners of the world's busiest websites, any reluctance to change because of residual scepticism is no longer an excuse for poorly performing sites. The QA managers and testing teams behind the ticketing portals owe it not only to their frustrated and badly served customers but also to the company and shareholders they ultimately work for to keep pace with the latest technology. It is now widely understood, and backed by highly credible market research, that online shoppers will only accept a few seconds' delay in web response times before abandoning their transactions and moving to an alternative site to complete their purchases. Fortunately for the ticketing sites, on this occasion all their competitors were also struggling to cope. But had any of them had the foresight to correctly test their sites, ensuring they could scale to the predicted traffic levels on the day, they would have seen a massive return on their investment at the expense of their rivals.
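To make the idea of realistic simulated load a little more concrete, here is a deliberately minimal sketch of what such tools automate under the hood: a pool of concurrent virtual users walking a scripted journey with randomised think time between pages. It is an illustration in Python only, not an example of StressTester itself (which is scriptless); the target URL, journey steps and user counts are invented for the purpose.

```python
# Minimal illustration of concurrent virtual users with think time.
# The target URL and journey are hypothetical; real tools add ramp-up,
# data variation, geographic load generation and rich reporting.
import random
import threading
import time
import urllib.request

BASE_URL = "http://localhost:8000"                  # hypothetical system under test
JOURNEY = ["/", "/events", "/basket", "/checkout"]  # one invented user journey
VIRTUAL_USERS = 50
RESULTS = []
LOCK = threading.Lock()

def virtual_user() -> None:
    """Walk the journey once, pausing like a real user between pages."""
    for path in JOURNEY:
        start = time.time()
        try:
            with urllib.request.urlopen(BASE_URL + path, timeout=10) as resp:
                resp.read()
                ok = resp.status == 200
        except Exception:
            ok = False
        elapsed = time.time() - start
        with LOCK:
            RESULTS.append((path, elapsed, ok))
        time.sleep(random.uniform(1.0, 5.0))  # think time between pages

threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

failures = sum(1 for _, _, ok in RESULTS if not ok)
slowest = max((elapsed for _, elapsed, _ in RESULTS), default=0.0)
print(f"{len(RESULTS)} requests, {failures} failures, slowest {slowest:.2f}s")
```

A real tool layers ramp-up profiles, data variation, geographically distributed load generation and detailed reporting on top of this basic loop.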

Graham Parsons Chief executive Reflective Solutions www.stresstester.com





Agile testing

Defining testing practices for Agile methodologies

Testers must take the lead in redefining testing when Agile development processes are introduced to their organisation. Seapine Software's solutions evangelist, Peter Varhol, shows how testers can redefine testing for Agile development.

Enterprise application development teams are increasingly adopting Agile software development techniques. In its January 2010 report, 'Agile Development: Mainstream Adoption Has Changed Agility', Forrester Research reports that 35 percent of respondents stated Agile most closely reflects their development process. Agile methodologies enable teams to rapidly and iteratively develop applications with participation from the end user community, resulting in higher quality software delivered more quickly. When Agile development processes are adopted by an organisation, testers must take the lead in defining just how their roles must change to suit the Agile methodology. By understanding Agile processes and the changing roles of testing, testers can define their role as essential to building quality software. Here's how testers can redefine testing for Agile development.

Traditional testing and the transition to Agile
Traditional testing practices have taken their lead from the Waterfall model of software development, where requirements gathering, software design, and implementation follow a deliberate and often lengthy path. In these methodologies, testers focus on developing plans and individual tests to validate the software against the requirements. Testers not only look for obvious software bugs, but also for deviations from the business requirements. In this manner, they ensure both a defined level of software quality and a specific fitness for the original purpose. While developers are translating requirements into specifications, testers are writing test plans and test cases. This documentation can easily run into the hundreds or even thousands of pages. Once the development team starts building the software, testers execute the test plan, running relevant test cases as the features gradually become available. Bugs and deviations from requirements are logged and described, and developers fix bugs as well as write new code.

Agile methodologies work differently because they put a premium on immediate and rapid code development, and less on upfront documentation like test plans. They do this through involvement of one or more product owners from the user community, who work closely with the development team to define and validate the software. These product owners define the functional requirements of the software through less formal user stories, which are high-level scenarios of how the prospective software would be used as part of a business process. The Agile team, including the product owner, developers, and testers, takes these user stories, organises them, and uses them as a basis for designing and building applications incrementally and rapidly. Typically taking two to five user stories at a time, developers will reserve anywhere from a week to a month to implement that subset of stories, depending on the Agile approach used and the complexity of the stories. The features supporting those stories are coded, and at the end of that time those features are working and operational. During coding, the product owner works with developers to answer questions and resolve ambiguities as the stories are translated from business processes to software features.

Because the software features must be complete, operational, and validated by the product owner at the end of each sequence, the software can ideally be deployed to the business at almost any point in time. As a result, there doesn't have to be a year-long development cycle, during which the business needs have likely changed. Applications can be deployed early and often, making some features available to users in the short term, while later features can be easily adjusted in response to user feedback and changing business needs.

Agile and testers – It only makes sense
At first glance, an Agile process seems to leave little for testers to accomplish. The connection between the end user and the developer appears to be direct, so a need to ensure that the final product was what users wanted seems unwarranted. And that direct connection to the users makes midcourse corrections seem seamless and straightforward. Developers can find and fix the bugs, while product owners ensure fitness for purpose. However, independent testing is still vital. Quality is still one of the most important requirements in an enterprise application, and testing is the primary way of verifying quality and assuring that the development process produces quality software. But testers have to substantially change their approach to doing their jobs in order to assume an essential role in Agile development.

Because of their unique position in the application development lifecycle, testers are best equipped to connect business needs with the working application. In particular, testers are in an ideal position to codify user stories, and to determine how to objectively measure their completion. This is a business aspect of testing that is not typically the focus in traditional methodologies. If the product owner is the principal author of the user story, it's likely that he or she is making assumptions about the domain or the business process that are implicit in the story but probably not apparent to the development team. Such assumptions can lead to misunderstandings with the developers, who may not fully comprehend the product owner's assumptions or point of view. Testers are in an ideal position to work with the product owner to identify assumptions and make them explicit, and to clarify any ambiguities in those stories before developers start coding from them.

FIG 1: TESTERS MUST HAVE BOTH TECHNOLOGY-FACING AND BUSINESS-FACING RESPONSIBILITIES IN AN AGILE PROCESS.

Testing to meet user needs
Because testing is accelerated, it must be focused on the needs of the user community, in addition to a technical vision of quality. All test cases should tie back to user stories, rather than formal requirements. Because testers and product owners work together to define the scope of the software, testers are best positioned to determine how to test the stories once they have been implemented. Agile methodologies leave acceptance criteria up to the product owner, but that person is unlikely to be an expert in evaluating software, even against criteria he or she defined. Testers are especially qualified to play a key role here, because it is the essence of the profession.

To facilitate acceptance testing, testers need to be intimately involved with translating user stories into application features. Doing so enables them to understand what the user community intended, and how it was translated into the application. This level of up-front involvement makes getting from user story to acceptance test significantly easier. Tracing the implementation of user stories has the potential to be much faster and simpler, and to require far less overhead. Because developers aren't writing extensive specifications, testers can go straight to building test cases that reflect the particulars of one or more user stories. An acceptance test case should enable the product owner to verify and validate that the user story has been implemented as expected.

Virtually all of these activities can be improved and accelerated with automation. Tracking a story from inception to acceptance test can be done manually, but requires testers to spend time examining documentation to perform that activity. A requirements and test management package will enable testers to record user stories and resulting features, while tracking acceptance test results back to both features and user stories.
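As a hypothetical illustration of how a user story can be codified as an acceptance test, the sketch below expresses the acceptance criteria for an invented story ('a returning customer qualifies for free delivery on orders of £50 or more') as checks a product owner can read and a tester can run. The story ID and the delivery_charge stand-in are made up so the example is self-contained; in practice the test would drive the real application through its UI or API.

```python
# Hypothetical user story US-042: "As a returning customer, I qualify for
# free delivery when my order total is £50 or more." The delivery_charge
# function is a stand-in so the example runs on its own.
import unittest

def delivery_charge(order_total: float, returning_customer: bool) -> float:
    """Stand-in implementation of the feature under test."""
    if returning_customer and order_total >= 50:
        return 0.0
    return 4.95

class TestFreeDeliveryStory(unittest.TestCase):
    """Acceptance checks traced back to the (invented) user story US-042."""

    def test_returning_customer_over_threshold_gets_free_delivery(self):
        self.assertEqual(delivery_charge(50.00, returning_customer=True), 0.0)

    def test_returning_customer_under_threshold_pays_delivery(self):
        self.assertEqual(delivery_charge(49.99, returning_customer=True), 4.95)

    def test_new_customer_always_pays_delivery(self):
        self.assertEqual(delivery_charge(120.00, returning_customer=False), 4.95)

if __name__ == "__main__":
    unittest.main()
```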

Steps to quality software with Agile processes
For testers to take a leadership role in assuring quality in Agile software development and delivery, they have to work with, and at the pace of, the project. This means working with both users and developers in defining the project and ensuring its quality and fitness for the user business needs. Here are the key steps to preparing and executing testing of an Agile project:
1. Get to know the product owner and user community. Understand fundamentally the business goals of the application. Your testing goals serve that group.
2. Break down user stories into prioritised testing requirements and track those requirements to completion. Use automated systems to capture user stories, distil requirements, and trace requirements through to implementation and back to user stories.
3. Translate testing requirements into test cases as early as possible. Work with users and business analysts to ensure that the test cases reflect real business needs. Work with developers to devise technology-facing tests.
4. Automate test cases and test execution so that tests can be rerun automatically, as part of the build process (a minimal sketch of such a build step follows this list). This ensures that any regression is caught and corrected quickly.
5. Track test case execution to ensure the fitness of the application. Be able to report on test execution at any time, so that decisions can be made on application deployment.
6. Trace requirements from inception to delivery to ensure that business needs have been adequately addressed. This may mean adding requirements and tests later in the process, or deleting obsolete requirements.
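The build-process automation mentioned in step 4 can be as simple as a wrapper that discovers the regression suite and fails the build when anything regresses. The fragment below is one minimal way to do that with Python's standard unittest runner; the tests directory name is an assumption about project layout, and most teams would use their build server's native test step instead.

```python
# Minimal build-step sketch: discover and run the regression suite,
# returning a non-zero exit code so the build fails on any regression.
# The "tests" directory name is an assumption about project layout.
import sys
import unittest

def run_regression_suite(test_dir: str = "tests") -> int:
    suite = unittest.defaultTestLoader.discover(test_dir)
    result = unittest.TextTestRunner(verbosity=2).run(suite)
    return 0 if result.wasSuccessful() else 1

if __name__ == "__main__":
    sys.exit(run_regression_suite())
```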

FIG 2: TESTING TAKES ON BOTH BUSINESS-FACING AND TECHNOLOGY-FACING ROLES IN AGILE PROCESSES.

These are the major ways that testers can demonstrate value and make essential contributions to software development and delivery in an Agile process. An equal focus on both the business and the technology, with support for the Agile team plus the development project, is critical in ensuring testers make a complete contribution to project success. And automation of these activities can ensure timeliness and repeatability of testing activities, not only in the current iteration, but future iterations as well.

Peter Varhol Solutions evangelist Seapine Software www.seapine.com






T.E.S.T Profile

The online learning experience
Learntesting provides an innovative, highly scalable distribution platform for online and blended training solutions. T.E.S.T speaks to the company's managing director, Mike Smith.

Online testing training provider Learntesting has now been an independent entity since May this year, when it became a separate organisation from its original parent TSG. The company was focussed on the ISTQB software testing certification scheme, but has now increased the portfolio across general testing and related disciplines. Here, company managing director Mike Smith tells T.E.S.T all about how Learntesting is extending the boundaries of software testing and certification.

TEST: What are the origins of the company; how did it start and develop; how has it grown and how is it structured?
Mike Smith: Learntesting online training was originally developed by Testing Solutions Group (TSG) in 2003, aimed specifically at the ISTQB software testing certification scheme. It separated from TSG in May 2010 to develop as a fully independent business. In 2009, Learntesting launched its new service developed in conjunction with German technology partner, IMC, to provide an innovative, highly scalable distribution platform for online and blended training solutions. This innovation was recognised after only six months of operation when it was selected as a finalist for the 'Learning Technologies Solution of the Year' award (February 2010) by The Institute of IT Training (IITT). The solution supports self-registration, payment and automatic content access. A feature of the Learntesting business model is that it supports both individuals and businesses. While many of the same benefits apply to both, the service is tailored specifically to the needs and value propositions of different customers. The organisation already has over 3,000 registered users – a group that is set to expand rapidly over the next year. Clients range from independent testing contractors to major global IT organisations.

TEST: What range of products and services does the company offer?
MS: The extensive Learntesting product range has been built over a period of many years using the knowledge and experience of highly skilled testing consultants and accredited trainers, plus eight years of online instructional design experience. We also license a limited range of third party content to complement our range. Learntesting content is structured to support three learning principles:
1. Information Acquisition;
2. Knowledge Confirmation;
3. Exam Preparation.
Our portfolio is structured as follows:
• Free content;
• ISTQB courses;
• ISTQB Foundation Exam preparation options;
• ISTQB Advanced Exam preparation options;
• Other testing & requirements engineering courses;
• Agile testing;
• Alumni scheme.
We offer a range of free content including our Testers' Treasure Chests, providing valuable information with three months' free subscription. Anyone can self-register for these. Our full ISTQB courses cover the entire scheme with fully accredited material, plus other options for blended training and smaller packages:

ISTQB Courses
ISTQB Certified Tester Foundation Level (CTFL):
• Foundation – English;
• Foundation – German.
ISTQB Certified Tester Advanced Level (CTAL) – Self-study Accredited Courses:
• Advanced Test Analyst;
• Advanced Test Manager;
• Advanced Technical Test Analyst.
Blended courses – integrating up to three Live Virtual Class components (subject to accreditation):
• Advanced Test Analyst;
• Advanced Test Manager;
• Advanced Technical Test Analyst.
ISTQB Exam Preparation Options:
• Exam simulations;
• Subject matter by chapter;
• Subject matter exercises;
• Questions, answers and detailed analysis;
• Test analyst techniques videos;
• Exam preparation videos.
Other Testing & Requirements Engineering Courses:
• ISEB Intermediate Certificate in Software Testing;
• IREB/IIBA Requirements Engineering Foundation;
• Risk Based Testing;
• Software Test Estimation;
• Managing the Testing Process;
• Pragmatic Software Testing;
• Assessing your Test Team;
• Understanding and Implementing Agile Testing (video).

The portfolio of services has been significantly enhanced during 2010 and has been designed to service the wide-ranging needs of the market by productising content from full courses to exam simulations, allowing our clients and students to buy only what they require to meet their requirements. We also provide free assessments to help the process. We have added a range of specialist non-certificated testing courses plus requirements engineering (IREB), and this product range will continue to expand with our own developed content and selected licensed material. A feature of Learntesting is the use of Live Virtual Class components in some of our courses. Seamless integration of Adobe Virtual Connect Pro into our learning platform enables us to provide some of the best benefits of 'live' training blended into our self-study components.

Online pass rates for certification are higher than corresponding classroom courses – a fact that is surprising to many. Students are able to prepare thoroughly for the exam – study at their own pace, use additional reference material, consult with accredited tutors and only sit the exam when they are ready. We have learnt that good quality online learning is potentially more effective for 'knowledge confirmation' and 'exam preparation' than is usually achieved in the classroom. Our measured pass rates are consistently higher than classroom and market averages and we attribute much of this to the rich quality of 'knowledge confirmation' and 'exam preparation' content. To augment the learning experience for all our students, we have invested in a private e-book library containing 50 testing and testing-related books, including some written specifically to support the ISTQB scheme.

TEST: Does the company have any specialisations within the software testing industry?
MS: There are many reasons why an individual might want to get certified or enhance their knowledge using the products offered by Learntesting. Many employers across the world acknowledge and support the ISTQB software testing certification scheme. Certification at Foundation Level is recognised as one of the basic entry criteria to their own tester career development paths. If Foundation Level provides a platform for 'a common language for testing', with terminology, the fundamental test process and testing in the lifecycle, Advanced Level recognises in-depth knowledge and experience through the roles of Test Analyst, Test Manager and Technical Test Analyst. Learntesting provides a highly effective platform for those seeking ISTQB certification and for those already certified to keep their knowledge up to date. As well as lower cost training compared with the classroom equivalent, the 'total cost of training and certification' is significantly reduced by the ability to study at a time and place to suit the individual. Time out of the office and lost billing time can be minimised without losing any effectiveness – in fact the reverse: it actually increases effectiveness. If testing certification leads to a common language for testing and demonstrates that testers understand and can apply fundamental test principles, practices and techniques, Learntesting offers an increasing range of more specialised courses aimed at a variety of other skills to help testers and businesses increase their effectiveness and competitiveness.

Learntesting also has a policy of 'study until you pass'. We recognise that everyone has different circumstances, so although we set a recommended study period, all full Learntesting branded courses can be extended free of charge on request. Many testers are attempting to pass the ISTQB Foundation Certificate without any formal study. This is a risky policy even for some experienced testers, and Learntesting has deliberately structured its products to allow testers to mitigate the risk of wasting a significant sum of money failing the exam. A structured product portfolio, with a range of content from £15 to £200 plus free self-assessments to check areas of weakness, allows students to choose the level of support they need from Learntesting to mitigate the risk of failing the exam. Learntesting offers businesses 'Flexible Corporate Licensing', providing a one-year renewable, renegotiable, cost-effective licence to any or all Learntesting content for all those involved with software testing, based on the needs and value to the different stakeholder groups including testing professionals, developers and users. Learntesting also offers businesses private course versions with tutor access to monitor their staff's progress and help them manage the completion of their studies.

TEST: Who are the company's main customers today and in the future?
MS: The company has built a global distribution model and sells products via a network of resellers around the world. It features branded portals for training providers who can offer local support and tailored solutions to suit the local needs of their students. The integrated helpdesk allows it to provide an excellent level of support to all students, including administration, accredited tutors and exam booking support. Main distribution portals are in the UK, Europe, Australia and South Korea, and this group is being expanded. We have partnered with a number of exam providers and are able to offer exam vouchers redeemable at over 5,000 test centres in 165 countries around the world, so even if students just need to sit an exam, they can get this from Learntesting, along with some value-added free exam preparation.


This month we are launching an innovative 'Alumni' scheme allowing those already certified to keep their knowledge up to date, comply with the new ISTQB 'Code of Ethics' for testers, plus a range of other benefits. This scheme is available to individuals and also to businesses, which can provide access to all their testing staff as part of the scheme. Qualifying candidates can subscribe for one year at a nominal sum, and as a special introductory offer for TEST Magazine readers there is an even bigger discount available using promotion code TES002 at www.learntesting.com.

TEST: What is the company's commitment to corporate social responsibility, i.e. 'green' issues?
MS: The Learntesting business has a strong commitment to Corporate Social Responsibility, including green issues. It operates as a 'virtual' business, discourages printed matter and its products are delivered almost entirely over the internet.

TEST: What are the future plans for the business; is there anything else you would like to add?
MS: We have recently instituted the Learntesting Alumni Scheme, which we see as supporting the professionalisation of the software testing industry. We take seriously the continual professional development (CPD) of software testers. After all, if we want to be recognised as a profession, we should start behaving like one! In October 2010, the new release of the ISTQB Certified Tester Foundation Syllabus introduced a 'Code of Ethics' which includes: "Software testers shall participate in lifelong learning regarding the practice of their profession." There is no concept of 'recertification' in the ISTQB scheme at Foundation or Advanced Level. The fact that 150,000 individuals have been certified against many different versions of the syllabi over the past 13 years means that one of the fundamental principles the scheme supports is compromised – 'a common language for testing'. Anyone certifying against the current Foundation syllabus can now be speaking a very different language to someone from several years ago.

We have devised an innovative scheme that supports the new ISTQB 'Code of Ethics' and the concept of CPD for testers. The scheme allows anyone already certified to have ongoing access to the very latest materials plus a whole range of other benefits for a nominal sum. There are a number of reasons why the scheme has benefits for both individuals and businesses. Individuals can demonstrate they are serious about software testing and let their employers and clients know that they've invested in keeping their knowledge up to date and compliant with the ISTQB Testers Code of Ethics. For those moving on to Advanced Level study, they can bring themselves right up to date with their knowledge from the latest version of the Foundation course. Businesses can keep their complete testing teams' knowledge up to date, comply with the new Foundation Code of Ethics, and ensure all their certificated staff 'speak the same language' in software testing. Businesses can demonstrate to their staff that they are serious about their continuous professional development, thus providing a good motivational incentive. Businesses can also integrate the Alumni scheme into a larger, flexible annual licence to cover all staff involved with testing:
1. Seeking certification
• Professional testers.
2. Already certified
• Keeping their knowledge up to date;
• Preparing for the next level.
3. Need the knowledge – not the certificate
• Anyone involved with testing, including developers and users.

If you are already a certified tester in either the ISTQB or ISEB certification schemes, you can join the Alumni scheme. Joining for a nominal sum gives you 12 months' access to the following:
1. Access to the Learntesting e-book library containing 50 testing and testing-related books, including some specifically written to support the ISTQB scheme;
2. Latest version of the accredited ISTQB Foundation (CTFL) course – 2010 syllabus;
3. Latest versions of ISTQB Advanced Level courses corresponding to equivalent existing certification (see table below);
4. Any updates to the course material;
5. Additional testing-related content published on a quarterly basis;
6. Discount on purchase of other Learntesting courses;
7. An opportunity to network with other Alumni scheme members.

Learntesting alumni scheme – your qualifying certificates and what you get

Learntesting readers' offer
To encourage uptake within the industry, Learntesting is offering an introductory entry price of:
Foundation Alumni: £25/€30 per annum, per subscriber (+VAT).
Advanced Alumni: £50/€60 per annum, per subscriber (+VAT).
For readers of TEST magazine, there is a special offer reducing this to £15/£30 respectively by ordering by 31st January 2011 using promotion code TES002. Potential Alumni subscribers need to register at www.learntesting.com and provide details of the qualifying certificates to helpdesk@learntesting.com, including:
1. A copy of the certificate;
2. Confirmation that they are the bona fide holder of the certificate.


Performance testing

Game-changing performance testing
A partnership between Amazon and Soasta has borne fruit in the form of a performance testing tool that is game-changing, according to Soasta's Vince Vasquez.

Ten years ago, Tom Lounibos was asked what would happen if his SaaS application suddenly had a large spike in traffic. Would his site crash? Would user response time crawl to a halt? Would his site handle the increased workload without a glitch? In truth, he had no idea, and he couldn't afford the high cost of traditional performance testing to find out. From his personal experience, cloud testing provider Soasta was born. Meanwhile, Amazon launched a limited public beta of its EC2 compute product during the summer of 2006. The company, much better known for selling books online, decided to leverage its operational prowess and extra compute capacity to turn the concept of cloud computing into a reality. In a very real way, Soasta and Amazon grew up together: Soasta with its performance testing application CloudTest, and Amazon with its game-changing cloud computing environment EC2. Together, they are helping companies like Intuit answer the crucial question of whether their Web sites can withstand huge increases in traffic, and generate the information necessary to optimise their architecture and implementation.

Why the fuss about performance testing?
Why has performance testing become such a hot topic? After all, most Web sites probably haven't gone through any rigorous testing, and the internet still keeps ticking away. Many companies use their Web sites for sales and marketing purposes. In fact, a company can spend millions of pounds creating engaging content and running promotional campaigns to draw users to its site. Unfortunately, if the site crashes or response time crawls, all that time, energy and money could be wasted. Worse yet, public perception of the company could be seriously downgraded. It's akin to running an expensive campaign to bring hungry patrons into your restaurant, and then not bothering to serve them.

Many in the US who watched the 2010 Super Bowl probably recall the Denny's Free Grand Slam breakfast commercial. What many didn't know is that 59 million people hit the Denny's Web site during a four-hour period as a result of the ads. Before Denny's launched the Super Bowl campaign, the company had no idea traffic would spike more than 1,700 percent. And it's not just controlled promotions that might flood a Web site. In this era of social media and viral marketing, site traffic can be extremely unpredictable. Create a piece of content that lights up the tweets, and you could see site traffic burst way past original expectations. When the Avatar movie trailer came out, for instance, about two million people hit the site to be among the first to get a glimpse. Traditional events can also spin up the site traffic. For example, not many people send a Hallmark.com Valentine's Day e-card on Feb. 13 or 15 (unless, of course, they want to find themselves sleeping in the proverbial relationship doghouse). Likewise, more than a few of us in the US wait until the last minute to file our taxes.

Prior to working with Soasta, the largest number of concurrent users Intuit had ever simulated for its TurboTax system was 4,000. Talk about waking up to a public relations nightmare if the Intuit site crashed at 10pm on April 15 and thousands of people missed filing their taxes on time because of it. As you can probably imagine, many more than 4,000 people wait until the last minute to file their taxes online. In response to this potential threat, Soasta worked with Intuit over a 33-day period, eventually simulating 300,000 concurrent users on Intuit's production site, while 25,000 people were actually online filing their taxes. This process gave the Intuit development team the data it needed to fine-tune the site, and confidence in its ability to handle the load. And, yes, April 15 came and went without a glitch.

Amazon to the rescue
Of course, you don't simulate hundreds of thousands of users on a few servers sitting under your desk. Large tests require hundreds or even thousands of servers for short bursts of time – a complete economic non-starter in the pre-cloud computing era. Amazon EC2 to the rescue! For Soasta, the decision to partner with Amazon was simple. Soasta started running on EC2 three years ago, and over that time period its engineers gained a tremendous amount of confidence in Amazon's ability to deliver the compute environment necessary for Soasta to run its load and performance testing solution. Because Amazon has mature APIs that provide elasticity fast, Soasta's product is solid and can be ready to simulate massive traffic in a matter of minutes. And Amazon continues to innovate, helping the company to continue to augment its product offering as well. The result is that Americans can now wait until the last minute to file their taxes using TurboTax with the confidence that when they hit the submit button, Uncle Sam will hear from them on time. Likewise, you can send your Valentine's Day e-card on the 14th with confidence that your loved one will get the message.
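The economics described here depend on being able to acquire and release machines programmatically. As a rough, hypothetical sketch (not Soasta's implementation), the fragment below uses the boto3 library to start a batch of load-generator instances for the duration of a test and terminate them afterwards; the AMI ID, instance type, count and region are placeholders.

```python
# Hypothetical sketch: rent load generators from EC2 for the duration of a
# test, then give them back. Requires AWS credentials and the boto3 library;
# AMI_ID, INSTANCE_TYPE and COUNT are placeholders, not real values.
import boto3

AMI_ID = "ami-00000000"        # placeholder image with a load agent installed
INSTANCE_TYPE = "m1.large"     # placeholder size
COUNT = 100                    # servers needed only for the burst

ec2 = boto3.client("ec2", region_name="us-east-1")

def start_load_generators() -> list[str]:
    """Launch the fleet and wait until every instance is running."""
    response = ec2.run_instances(
        ImageId=AMI_ID, InstanceType=INSTANCE_TYPE,
        MinCount=COUNT, MaxCount=COUNT,
    )
    ids = [instance["InstanceId"] for instance in response["Instances"]]
    ec2.get_waiter("instance_running").wait(InstanceIds=ids)
    return ids

def stop_load_generators(instance_ids: list[str]) -> None:
    """Terminate the fleet so the cost stops with the test."""
    ec2.terminate_instances(InstanceIds=instance_ids)

if __name__ == "__main__":
    generators = start_load_generators()
    try:
        print(f"{len(generators)} load generators running")  # drive the test here
    finally:
        stop_load_generators(generators)
```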


Vince Vasquez Soasta www.soasta.com



Testing metrics

Have you aligned your testing metrics to the business scorecard?
Sudip Naha of MindTree describes how a test metrics analysis and decision model (TMAD) helps in answering questions about cost of ownership, product quality and availability for release; continuous improvement in terms of learning and growth; and process quality.

Testing organisations are maturing: many have aligned themselves with maturity models like CMM, and the usage of objective measures for quality management has gained importance. Though structured measurement programmes are also being implemented, there is a lot of scope for improvement, especially in the area of testing. Organisations are no longer questioning "Why do we need testing metrics?", but have matured to asking "What are the testing metrics that we can choose from?" and "How do we use these metrics to see where we are with respect to the project and organisation goals, and how do we attain them?" A web search gives a list of testing metrics that we can implement, and a lot of organisations are probably using many of them. However, most test organisations still face the challenge of collating and articulating their metrics data to evaluate their contribution with respect to financial, customer focus or continuous improvement goals.

Background
A few years ago MindTree launched an organisation-wide quality dashboard and, since then, our test process evangelist team has had a regular influx of project-wise testing metrics. When we reviewed the value of this recurring activity, we noted that the number of teams actually deriving value from the exercise was quite small. The reasons were primarily attributed to the inability to decipher the real picture behind the distorted graphs. Since the review and audit mechanisms were in place, however, we were able to identify the dilemma of the project and test managers in terms of the usage of the quality dashboard. What we learned from these reviews has led to the design of our TMAD model.

The problem
There was a set of 17 metrics defined and tracked in the dashboard, which individual teams as well as the process evangelist team analysed periodically. It was found that the effort spent was not yielding sufficient benefits in terms of taking the required proactive actions based on timely forecasts, or taking decisions on corrective actions. Primarily this was because of:

Lack of appropriate categorisation: It is difficult to take decisions or make forecasts looking at individual metrics. We can often go wrong if we take a metric at face value, while a set of metrics appropriately grouped gives insight into a specific solution.

Limited mapping of metrics to business goals: Most of the time in management reviews the questions are straightforward and linked to the business goals. Examples include "When will the release happen, when will testing be completed?", "Do you see any risks of overruns?" and "How can we reduce the test cycle?" If we can map the metrics to quantitatively support our answers, it would help us to align better with the project and organisation goals.

Limited mapping of metrics to stakeholders: All the metrics may not be relevant to each and every stakeholder. The information that a delivery manager would be interested in is different from that of a group test manager, which in turn would differ for a test lead. Thus, we have to provide the right information to the right person at the right time.

The solution
With review mechanisms already in place, the first thing we did was to extend this objective measurement methodology to ascertain the validity of test metrics and question anomalies better. We identified the questions that need to be associated with each metric and asked them when data seemed to be skewed or when there was a variance larger than the metrics baseline. This has acted as a ready reckoner for analysis and helped in easily identifying the areas where a root cause analysis was imperative. Next, we outlined the decisions that metric data may invoke and the corrective actions that can be taken. As per the Goal-Question-Metric (GQM) model, we identified a set of questions which further refined the goal. The questions to be asked, as well as the decisions or corrective actions to be taken, again depended on the value of the metric – whether it is high or low with respect to a baseline. Let us see how this helps.

Metric: Percentage dismissed defects.
Goal: Reduce rework and improve test quality.
Sample questions – scenario: metric value higher than baseline:
• Is the quality of product and design documentation adequate?
• Is the team aware of the changes to the requirement and scope document?
Answers:
• The change requests have not been formally documented and hence have led to redundant test cases and invalid defects.
• The team did not have any formal knowledge transfer on the new product components.
Decision:
• Have a representative from the testing team in all change control board meetings.

It is important to note that one seemingly good metric does not necessarily provide the true picture, hence decisions should be made based on the correlation between a set of metrics. Therefore, the next step was to come up with a metrics correlation matrix. This additionally helps in forecasting risks and proactively taking corrective actions. For example, if defect fix effectiveness is low, it may lead to a positive schedule variance, since the entire team has to spend additional effort in re-opening, re-fixing and re-testing defects.
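A ready reckoner of this kind is straightforward to mechanise. The sketch below is a hypothetical illustration (not MindTree's TMAD implementation): it flags any metric that strays beyond a tolerance around its baseline and surfaces the GQM questions a reviewer should ask, using the 'percentage dismissed defects' example above. All metric names, baselines, tolerances and the second metric's questions are invented.

```python
# Hypothetical ready reckoner: flag metrics outside their baseline tolerance
# and surface the GQM questions a reviewer should ask. All figures invented.
BASELINES = {
    # metric: (baseline %, allowed variance in percentage points)
    "percentage_dismissed_defects": (5.0, 2.0),
    "defect_fix_effectiveness": (90.0, 5.0),
}

GQM_QUESTIONS = {
    "percentage_dismissed_defects": [
        "Is the quality of product and design documentation adequate?",
        "Is the team aware of changes to the requirement and scope document?",
    ],
    "defect_fix_effectiveness": [
        "Are re-opened defects concentrated in particular components?",
        "Is re-fixing effort already showing up as schedule variance?",
    ],
}

def review(measured: dict) -> None:
    """Print the questions to ask wherever a metric breaks its baseline."""
    for metric, value in measured.items():
        baseline, tolerance = BASELINES[metric]
        if abs(value - baseline) > tolerance:
            print(f"{metric}: {value:.1f} vs baseline {baseline:.1f} - investigate:")
            for question in GQM_QUESTIONS[metric]:
                print(f"  - {question}")

review({"percentage_dismissed_defects": 9.5, "defect_fix_effectiveness": 91.0})
```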

Aligning the metrics to the business goals
The foundation is now set to align the metrics to the project and business goals of the test organisation. These can often be described in terms of:
1. Customer focus: on-time product availability; customer-focused testing ensuring quality; reduced cost of ownership.
2. Financial: effective risk management; expense optimisation; long-term value.
3. Learning and growth: a high-performance culture with a strong review process; best practices towards development of human, information and organisational capital.
Based on these areas and parameters, we can map the metrics and evaluate where we stand with respect to the goals, then take proactive actions to meet them. This mapping also helps stakeholders to analyse the metrics important to them. Whenever there are variances in the high-level metric indexes (the weighted average score of the metrics) with respect to financial, customer focus and continuous improvement goals, we can drill down to the low-level attributes of cycle time, cost, quality, throughput or utilisation to know where we have to improve.
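The high-level metric index described here is essentially a weighted average of normalised metric scores within each goal area, with a drill-down prompted whenever an index drops below target. The fragment below illustrates that arithmetic only; the groupings, weights, scores and the 70-point threshold are invented for the example.

```python
# Hypothetical scorecard roll-up: each goal area's index is a weighted
# average of normalised metric scores (0-100). Weights and scores invented.
SCORECARD = {
    "customer_focus": [
        # (metric, normalised score 0-100, weight)
        ("on_time_availability", 82.0, 0.5),
        ("cost_of_ownership", 64.0, 0.5),
    ],
    "financial": [
        ("risk_exposure", 75.0, 0.4),
        ("expense_optimisation", 58.0, 0.6),
    ],
}

def goal_index(entries) -> float:
    """Weighted average of the normalised scores for one goal area."""
    total_weight = sum(weight for _, _, weight in entries)
    return sum(score * weight for _, score, weight in entries) / total_weight

for goal, entries in SCORECARD.items():
    index = goal_index(entries)
    flag = "  <- drill down into cycle time, cost, quality" if index < 70 else ""
    print(f"{goal}: {index:.1f}{flag}")
```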

The TMAD thought process
TMAD and its thought process are still evolving, and there is one aspect that we need to be mindful of: since there are predefined sets of questions for each of the metrics, teams may not delve deeper into identifying additional areas to increase the scope of the model, and this can hinder improvement. Practitioners have found the model useful for justifying test certificates with quantitative data, taking corrective and preventive actions, and establishing and confirming quantifiable entry and exit criteria and go/no-go decisions. Also, since stakeholders across the test organisation get a consolidated view, some changes that need to be incorporated at an organisation level to support individual projects can also be driven. This ability to forecast risks and take proactive actions is bringing about a change in the current methods of data analysis, making it less effort-intensive and more result-oriented.

Sudip Naha Co-head of Test Lab MindTree www.mindtree.com



Testing events

TestExpo Winter 2010
According to the organiser, TestExpo is the UK's leading exhibition for software testers. If you haven't already made a date in your diary to join your peers at the forthcoming event, TestExpo Winter 2010 'Better Quality with Agility' takes place on Tuesday 7 December at the Plaisterers' Hall in the City of London.

The testing industry faces many challenges as information management efficiency becomes a higher priority for organisations across the globe. Since its launch, TestExpo has grown to become a respected source of testing news and innovation and a forum for debate, discussion and networking amongst testing professionals. TestExpo Winter 2010 combines keynotes and presentations given by experts from leading organisations in the software testing market with an exhibition area where visitors and exhibitors can: network with leading organisations and industry peers; view demonstrations of the latest products and services; learn about new ideas, methods and techniques; challenge the testing pioneers with questions; and benefit from valuable special offers and discounts. The event's theme, 'Better quality with agility', will encourage a timely focus on the need for software testers to achieve new heights of technical proficiency, innovation and speed in order to respond to market pressures, according to the organiser, Sogeti Training.

Keynote speakers
The event will include a full programme of presentations and lectures by some of the most renowned software development teams and testing industry experts:

Ed Hill, UK & Ireland business leader ALM for HP, will join Gary Voller, HP Software ALM solution consultant, to present Automating manual testing – Beyond the oxymoron! Manual testing is slow, resource intensive and costly. This presentation will demonstrate why manual software testing no longer needs to be tedious, inaccurate and time-consuming. The session will feature a live and in-action demonstration of the new capabilities of HP Sprinter.

Daniel Brzozowski, software engineer for business critical software specialist Quotium, will be presenting on the topic of Penetration testing for Web applications. Brzozowski will cover penetration testing, passing on some basic information about web application vulnerabilities and their origins, as well as best practices for ensuring website security is well integrated with the Software Development Life Cycle.

QA and testing experts from IBM and SAP will also present on leading-edge topics.

Andy Buchanan, sales director for Oracle, will be sharing his knowledge about Secure application testing in the cloud – Smart testing for Agile enterprises. Is testing your application taking up more and more of your time, as you face the continuous cycle of upgrades and patches? Are you still relying on tedious manual testing? Do users still complain about application quality or performance? Buchanan explains how you can leverage modern cloud computing concepts to automate functional and load testing while keeping your data secure and reducing the manual testing burden.

Frank Puranik, senior technical specialist at network emulation specialist Itrinegy, will be explaining Why ensuring an application is network-ready is imperative to the modern tester. Initiatives such as Data Centre Consolidation and Virtualisation, Cloud Computing (both public and private), SaaS, IaaS, PaaS, Self Service etc require applications to be delivered over a variety of networks (corporate WAN, MPLS, internet, wireless, 3G, GPRS etc). Puranik explains how to deliver an effective network-related software performance testing service without using the live production network or being a network expert.

Gordon Alexander, consultant at Seapine Software, presents Integrating testers into Agile development. Alexander will examine whether the increasing adoption of Agile development methodologies has left testers behind. He looks at testing and the role of testers in Agile methodologies, with a focus on integrating testers into Agile teams. He'll provide actionable guidelines to assuring application quality by enabling testers to take a leadership role in defining testing for the new paradigm.

Brian Shea, CEO of Sogeti UK, and Ed Hill, UK & Ireland business leader ALM for HP Software, give the keynote presentation Today's trends in software testing & quality. Are your testing processes evolving to accommodate new technologies? Shea and Hill provide an insight into the findings of the World Quality Report 2010-2011, which highlights the current state of application quality and the trends that affect it; and, more importantly, how they apply to the UK market and what this means for your organisation. There will also be a chance to debate all of these issues and more in an open forum.

TestExpo Winter 2010 'Better Quality with Agility' is on Tuesday, 7 December at the Plaisterers' Hall in the City of London. For more information, visit www.testexpo.co.uk.



TestExpo Winter 2010

Testing agility
Shoubhik Sanyal, a Certified ScrumMaster with ten years' experience in test managing Agile projects across new media, online gaming, retail and telecoms, explains why Agile testing is good for you, your business and your career.

For any business operating in competitive markets and challenging economic times, the quality and speed of software delivery can be the difference between success and failure. Now more than ever, business managers are painfully aware of the need to react quickly to market scenarios and so want to 'get up to speed' with the testing tools, applications and processes that are of critical strategic importance. So it's not by chance that this December's TestExpo has the overarching theme 'Better Quality with Agility'. Agile is at an all-time high: the speed, collaboration and relevance of Agile testing methods are the main topics of conversation among testers, while those who already have proven Agile experience are finding themselves very much in demand.

But what's so special about Agile testing? Well, it's not just the specific technical aspects of certain Agile practices but also the bigger picture of what Agile testing brings to QA, the software development lifecycle and to business management. In short, it's the value Agile brings across the entire communication and business chain. Let's start with what it is. Born out of the principles behind the Agile Manifesto (www.agilemanifesto.org), Agile testing changes the fundamental approach to software development. Individuals and interactions take priority over processes and tools; collaboration takes precedence over hard-line negotiation. It's all about a collaborative environment that responds to changes as the process unfolds. Just think how this changes the basic concept of the software tester as some hard-line guard, watching over code and looking for mistakes.

Instead of this staid image of a finger-pointing quality controller, the tester in an Agile environment works in parallel with software development, and in line with business scenarios and objectives. Team dynamics are changed because testing is continuous and part of the overall agenda. There's no afterthought. It changes the notion of a 'test' – it's not a final act to check whether something passed; it is part of the process.

Get the edge
By bringing testing into the process Agile can drive vital efficiency gains. The speed of development coupled with collaboration enables faster and transparent software testing at reduced costs. Proactive, cross-functional Agile teams work together harmoniously, providing Agile businesses with a significant competitive edge over rival firms. So if Agile is this effective, then why not use it everywhere? Because there are ideal scenarios that lend themselves to Agile testing: projects that are 'Agile-sized'. Teams of seven to nine work well, while a team of, say, 25 might work against the team dynamics, requiring development efforts to be split into smaller teams on smaller projects. As for the most appropriate markets, there are some obvious verticals where time-to-market is the competitive driver – retail applications, new media applications, online gaming or betting applications and financial applications, for example. Markets in which there aren't compliance-intensive mission-critical applications embroiled in regulations, but there are rewards for reacting to market demands or opportunities, and fast.

Quick, not compromising



The speed of Agile is also the basis of some misconceptions. Some organisations steer away from it because they equate its speed with a poor-quality or incomplete testing process, which is far from true. Agile may not be suited to every application across every market, but that's not because it is substandard. Like any testing process or methodology, it must be matched appropriately to the application as well as the underlying business drivers. There can also be resistance for other misdirected reasons. For example, managers may be wary of the new world order of Agile, fearful that their roles will change and they'll lose some of their managerial power. Yet once they embrace Agile, these same managers are quickly enthused about how they are able to work hand-in-hand with developers and testers to bring crucial strategic applications to market at a rapid pace. The cost of not considering Agile can be huge, with implications ranging from escalating costs of change to seeing competitors bring applications to market in shorter cycles. There's also the positive knock-on effect of how early adoption can change a team, encouraging its collective ability to think about speed, market pace and overall business benefits.

Market demands aren't wrong
So what skills drive success in Agile testing? Well, the team itself needs to be prepared to adopt a way of working that sees multiple iterations as well as frequent and early testing, in a development cycle that doesn't have the vast amount of documentation and structure typical of non-Agile methods. Yet, project management and team-working skills aside, there are no formal certifications available today that specifically recognise Agile testing skills.

Some training divisions are seeing and meeting a demand for Agile-based skills themselves – not through globally or industry-recognised qualifications, but precisely because there aren't any available yet. This will change, but in the interim, it can be difficult for employers to source talented and experienced Agile testers.

Agile testing transforms the careers of software testers and ultimately benefits businesses reliant on skilled testing partners. Testers gain team, resource management and business skills which allow them to expand the services they are able to offer while delivering greater added value to their employer or clients. The market for quality software testing is growing, enabling switched-on individuals to embrace Agile methods and move into higher-value roles in the testing market, with correspondingly higher rewards. Agile is also very interesting. As an evolving and growing practice, it has a strong community of supporters, a raft of discussions around its future, and is the focus of a number of innovative open source tools. This is why I'm looking forward to catching up with the attendees at TestExpo Winter 2010 – there's so much to discuss and debate. How can Agile be blended with other methodologies? How can we best use Agile in an off-shoring model? What are the best techniques for use in Agile and test automation? How is the test manager's role evolving in an Agile world? In conclusion, Agile empowers software testers, developers, managers and businesses to embrace new ways to work, test, and deliver. You could say I'm biased, but I believe Agile testers are more insightful because they have to understand software development and QA from three very different perspectives – user, developer and business. Agile is hot and if you're not already taking part in the discussion, it's time you did.

Shoubhik Sanyal
Certified ScrumMaster
shoubhiks@gmail.com




Facilita
Load testing solutions that deliver results
Facilita has created the Forecast™ product suite which is used across multiple business sectors to performance test applications, websites and IT infrastructures of all sizes and complexity. With this class-leading testing software and unbeatable support and services, Facilita will help you ensure that your IT systems are reliable, scalable and tuned for optimal performance.

Forecast, the thinking tester's power tool
A sound investment: A good load testing tool is one of the most important IT investments that an organisation can make. The risks and costs associated with inadequate testing are enormous. Load testing is challenging and without good tools and support will consume expensive resources and waste a great deal of effort. Forecast has been created to meet the challenges of load testing, now and in the future. The core of the product is tried and trusted and incorporates more than a decade of experience but is designed to evolve in step with advancing technology.
Realistic load testing: Forecast tests the reliability, performance and scalability of IT systems by realistically simulating from one to many thousands of users executing a mix of business processes using individually configurable data.
Comprehensive technology support: Forecast provides one of the widest ranges of protocol support of any load testing tool.
1. Forecast Web thoroughly tests web-based applications and web services, identifies system bottlenecks, improves application quality and optimises network and server infrastructures. Forecast Web supports a comprehensive and growing list of protocols, standards and data formats including HTTP/HTTPS, SOAP, XML, JSON and Ajax.
2. Forecast Java is a powerful and technically advanced solution for load testing Java applications. It targets any non-GUI client-side Java API with support for all Java remoting technologies including RMI, IIOP, CORBA and Web Services.
3. Forecast Citrix simulates multiple Citrix clients and validates the Citrix environment for scalability and reliability in addition to the performance of the hosted applications. This non-intrusive approach provides very accurate client performance measurements unlike server based solutions.
4. Forecast .NET simulates multiple concurrent users of applications with client-side .NET technology.
5. Forecast WinDriver is a unique solution for performance testing Windows applications that are impossible or uneconomic to test using other methods or where user experience timings are required. WinDriver automates the client user interface and can control from one to many hundreds of concurrent client instances or desktops.



6. Forecast can also target less mainstream technology such as proprietary messaging protocols and systems using the OSI protocol stack.
Powerful yet easy to use: Skilled testers love using Forecast because of the power and flexibility that it provides. Creating working tests is made easy with Forecast's script recording and generation features and the ability to compose complex test scenarios rapidly with a few mouse clicks. The powerful functionality of Forecast ensures that even the most challenging applications can be fully tested.


Supports Waterfall and Agile (and everything in between): Forecast has the features demanded by QA teams like automatic test script creation, test data management, real-time monitoring and comprehensive charting and reporting. Forecast is successfully deployed in Agile ‘Test Driven Development’ (TDD) environments and integrates with automated test (continuous build) infrastructures. The functionality of Forecast is fully programmable and test scripts are written in standard languages (Java, C#, C++ etc). Forecast provides the flexibility of open source alternatives along with comprehensive technical support and the features of a high-end enterprise commercial tool. Flexible licensing: Geographical freedom allows licenses to be moved within an organisation without additional costs. Temporary high concurrency licenses for ‘spike’ testing are available with a sensible pricing model. Licenses can be rented for short term projects with a ‘stop the clock’ agreement or purchased for perpetual use. Our philosophy is to provide value and to avoid hidden costs. For example, server monitoring and the analysis of server metrics are not separately chargeable items and a license for Web testing includes all supported Web protocols.

Services
In addition to comprehensive support and training, Facilita offers mentoring where an experienced Facilita consultant will work closely with the test team either to 'jump start' a project or to cultivate advanced testing techniques. Even with Forecast's outstanding script automation features, scripting is challenging for some applications. Facilita offers a direct scripting service to help clients overcome this problem. We can advise on all aspects of performance testing and carry out testing either by providing expert consultants or fully managed testing services.

Facilita
Tel: +44 (0)1260 298109
Email: enquiries@facilita.co.uk
Web: www.facilita.com


Hays
Experts in the delivery of testing resource

Setting the UK standard in testing recruitment
We believe that our clients should deal with industry experts when engaging with a supplier. Our testing practice provides a direct route straight to the heart of the testing community. By engaging with our specialists, clients gain instant access to a network of testing professionals who rely on us to keep them informed of the best and most exciting new roles as they become available.

Our testing expertise
We provide testing experts across the following disciplines:
• Automated Software Testing: including Test Tool selection, evaluation & implementation, creation of automated test frameworks;
• Performance Testing: including Stress Testing, Load Testing, Soak Testing and Scalability Testing;
• Functional Testing: including System Testing, Integration Testing, Regression Testing and User Acceptance Testing;
• Operational Acceptance Testing: including disaster recovery and failover;
• Web Testing: including cross browser compatibility and usability;
• Migration Testing: including data conversion and application migration;
• Agile Testing;
• Test Environments Management.

The testing talent we provide
• Test analysts;
• Test leads;
• Test programme managers;
• Automated test specialists;
• Test environment managers;
• Heads of testing;
• Performance testers;
• Operational acceptance testers.
Our expert knowledge of the testing market means you recruit the best possible professionals for your business. When a more flexible approach is required, we have developed a range of creative fixed-price solutions that will ensure you receive a testing service tailored to your individual requirements.

Tailored technical solutions
With over 5,000 contractors on assignment and thousands of candidates placed into very specialised permanent roles every year, we have fast become the pre-eminent technology expert. Our track record extends to all areas of IT and technical recruitment, from small-scale contingency through to large-scale campaign and recruitment management solutions.

Unique database of high calibre jobseekers
As we believe our clients should deal with true industry experts, we also deliver recruitment and workforce related solutions through the following niche practices: • Digital; • Defence; • Development; • ERP; • Finance Technology; • Infrastructure; • Leadership; • Public, voluntary and not-for-profit; • Projects, change and interim management; • Security; • Technology Sales; • Telecoms.
We build networks and maintain relationships with candidates across these areas, giving our clients access to high calibre jobseekers with specific skill sets.

Our specialist network
Working across a nationwide network of offices, we offer employers and jobseekers a highly specialised recruitment service. Whether you are looking for a permanent or contract position across a diverse range of skill sets, business sectors and levels of seniority, we can help you.

To speak to a specialist testing consultant, please contact:
Sarah Martin, senior consultant
Tel: +44 (0)1273 739272
Email: testing@hays.com
Web: hays.co.uk/it




Learntesting
There are many reasons for individuals and businesses to increase knowledge and skills and gain industry-recognised certification. However, with growing financial and time constraints, more flexible solutions are needed to realise the benefits.

Learntesting delivers a flexible, innovative online learning service designed to put you in control of all your software testing learning and certification needs. It delivers this unique service via a global network of expert training providers, a range of high-quality testing courses and content, powered by learning technology used by some of the largest businesses in the world.

The Learntesting 'Virtual Learning Environment (VLE)' guides and supports your testing education on an ongoing basis with:
• High quality online learning with a variety of content to suit different learning styles and budgets, including fully accredited ISTQB and ISEB courses;
• 'Live Virtual Classrooms' led by experienced tutors for exercise revision sessions and exam preparation;
• A private library of 50 testing and testing related ebooks, available 24x7;
• Global support with 24x7 access to accredited tutors worldwide;
• Online exam style question papers and answers with detailed explanations;
• Self-registration system with access to a range of free valuable content in the Learntesting 'Testers Treasure Chests';
• Exams available globally.

At Learntesting, we recognize that everyone is different and so provide something to suit everyone according to:
• Personal goals;
• Existing knowledge and experience;
• Budget;
• Study availability;
• Preferred learning styles.
We can provide this because we have invested in an 'industrial strength' solution – scalable from the individual to the largest corporations in the world.

Learntesting provides support for the complementary ISTQB and ISEB software testing certification schemes and also other aspects of software testing:

Certification
• ISTQB Certified Tester Foundation Level (CTFL);
• ISTQB Certified Tester Advanced Level (CTAL):
- Advanced Test Analyst
- Advanced Technical Test Analyst
- Advanced Test Manager
• ISEB Intermediate Certificate in Software Testing.

General Testing
• Agile Testing;
• Test Techniques;
• A Private Library of Testing Books, Templates & Information.

As an independent recognition of achievement, after only six months of operation Learntesting was selected as a finalist for the prestigious 'Learning Technologies Solution of the Year' award from the Institute of IT Training in February 2010.

Visit www.learntesting.com for your nearest Learntesting provider and self-register for free. For more information, please contact info@learntesting.com

www.learntesting.com info@learntesting.com
Learntesting Ltd, 5th Floor, 117-119 Houndsditch, London EC3A 7BT.




Green Hat
The Green Hat difference
In one software suite, Green Hat automates the validation, visualisation and virtualisation of unit, functional, regression, system, simulation, performance and integration testing, as well as performance monitoring. Green Hat offers code-free and adaptable testing from the User Interface (UI) through to back-end services and databases. Reducing testing time from weeks to minutes, Green Hat customers enjoy rapid payback on their investment. Green Hat's testing suite supports quality assurance across the whole lifecycle, and different development methodologies including Agile and test-driven approaches. Industry vertical solutions using protocols like SWIFT, FIX, IATA or HL7 are all simply handled. Unique pre-built quality policies enable governance, and the re-use of test assets promotes high efficiency. Customers experience value quickly through the high usability of Green Hat's software. Focusing on minimising manual and repetitive activities, Green Hat works with other application lifecycle management (ALM) technologies to provide customers with value-add solutions that slot into their Agile testing, continuous testing, upgrade assurance, governance and policy compliance. Enterprises invested in HP and IBM Rational products can simply extend their test and change management processes to the complex test environments managed by Green Hat and get full integration. Green Hat provides the broadest set of testing capabilities for enterprises with a strategic investment in legacy integration, SOA, BPM, cloud and other component-based environments, reducing the risk and cost associated with defects in processes and applications. The Green Hat difference includes:
• Purpose-built end-to-end integration testing of complex events, business processes and composite applications. Organisations benefit by having UI testing combined with SOA, BPM and cloud testing in one integrated suite.
• Unrivalled insight into the side-effect impacts of changes made to composite applications and processes, enabling a comprehensive approach to testing that eliminates defects early in the lifecycle.
• Virtualisation for missing or incomplete components to enable system testing at all stages of development. Organisations benefit through being unhindered by unavailable systems or costly access to third party systems, licences or hardware. Green Hat pioneered 'stubbing', and organisations benefit by having virtualisation as an integrated function, rather than a separate product.
• Scaling out these environments, test automations and virtualisations into the cloud, with seamless integration between Green Hat's products and leading cloud providers, freeing you from the constraints of real hardware without the administrative overhead.

• 'Out-of-the-box' deep integration with all major SOA, enterprise service bus (ESB) platforms, BPM runtime environments, governance products, and application lifecycle management (ALM) products.
• 'Out-of-the-box' support for over 70 technologies and platforms, as well as transport protocols for industry vertical solutions. Also provided is an application programming interface (API) for testing custom protocols, and integration with UDDI registries/repositories.
• Helping organisations at an early stage of project or integration deployment to build an appropriate testing methodology as part of a wider SOA project methodology.

Corporate overview Since 1996, Green Hat has constantly delivered innovation in test automation. With offices that span North America, Europe and Asia/Pacific, Green Hat’s mission is to simplify the complexity associated with testing, and make processes more efficient. Green Hat delivers the market leading combined, integrated suite for automated, end-to-end testing of the legacy integration, Service Oriented Architecture (SOA), Business Process Management (BPM) and emerging cloud technologies that run Agile enterprises. Green Hat partners with global technology companies including HP, IBM, Oracle, SAP, Software AG, and TIBCO to deliver unrivalled breadth and depth of platform support for highly integrated test automation. Green Hat also works closely with the horizontal and vertical practices of global system integrators including Accenture, Atos Origin, CapGemini, Cognizant, CSC, Fujitsu, Infosys, Logica, Sapient, Tata Consulting and Wipro, as well as a significant number of regional and country-specific specialists. Strong partner relationships help deliver on customer initiatives, including testing centres of excellence. Supporting the whole development lifecycle and enabling early and continuous testing, Green Hat’s unique test automation software increases organisational agility, improves process efficiency, assures quality, lowers costs and mitigates risk.

Helping enterprises globally Green Hat is proud to have hundreds of global enterprises as customers, and this number does not include the consulting organisations who are party to many of these installations with their own staff or outsourcing arrangements. Green Hat customers enjoy global support and cite outstanding responsiveness to their current and future requirements. Green Hat’s customers span industry sectors including financial services, telecommunications, retail, transportation, healthcare, government, and energy.

sales@greenhat.com www.greenhat.com




Micro Focus
Continuous Quality Assurance
Micro Focus Continuous Quality Assurance (CQA) ensures that quality assurance is embedded throughout the entire development lifecycle – from requirements definition to 'go live'.

CQA puts the focus on identifying and eliminating defects at the beginning of the process, rather than removing them at the end of development. It provides capabilities across three key areas:

Requirements: Micro Focus uniquely combines requirements definition, visualisation, and management into a single '3-Dimensional' solution. This gives managers, analysts and developers the right level of detail about how software should be engineered. Removing ambiguity means the direction of the development and QA teams is clear, dramatically reducing the risk of poor business outcomes.

Change: Development teams regain control in their constantly shifting world with a single 'source of truth' to prioritize and collaborate on defects, tasks, requirements, test plans, and other in-flux artefacts. Even when software is built by global teams with complex environments and methods, Micro Focus controls change and increases the quality of outputs.

Quality: Micro Focus automates the entire quality process from inception through to software delivery. Unlike solutions that emphasize 'back end' testing, Micro Focus ensures that tests are planned early and synchronised with business goals, even as requirements and realities change. Bringing the business and end-users into the process early makes business requirements the priority from the outset as software under development and test is continually aligned with the needs of business users. CQA provides an open framework which integrates diverse toolsets, teams and environments, giving managers continuous control and visibility over the development process to ensure that quality output is delivered on time. By ensuring correct deliverables, automating test processes, and encouraging reuse and integration, Continuous Quality Assurance continually and efficiently validates enterprise critical software. The cornerstones of Micro Focus Continuous Quality Assurance are:
• Requirements Definition and Management Solutions;
• Software Change and Configuration Management Solutions;
• Automated Software Quality and Load Testing Solutions.

Requirements
Caliber® is an enterprise software requirements definition and management suite that facilitates collaboration, impact analysis and communication, enabling software teams to deliver key project milestones with greater speed and accuracy. It delivers:
• Streamlined requirements collaboration;
• End-to-end traceability of requirements;
• Fast and easy simulation to verify requirements;
• Secure, centralized requirements repository.

Change
StarTeam® is a fully integrated, cost-effective software change and configuration management tool. Designed for both centralized and geographically distributed software development environments, it delivers:
• A single source of key information for distributed teams;
• Streamlined collaboration through a unified view of code and change requests;
• Industry leading scalability combined with low total cost of ownership.

Quality
Silk is a comprehensive automated software quality management solution suite which:
• Ensures that developed applications are reliable and meet the needs of business users;
• Automates the testing process, providing higher quality applications at a lower cost;
• Prevents or discovers quality issues early in the development cycle, reducing rework and speeding delivery.
SilkTest enables users to rapidly create test automation, ensuring continuous validation of quality throughout the development lifecycle. Users can move away from manual-testing dominated software lifecycles, to ones where automated tests continually test software for quality and improve time to market.

Take testing to the cloud
Users can test and diagnose Internet-facing applications under immense global peak loads on the cloud without having to manage complex infrastructures. Among other benefits, SilkPerformer® CloudBurst gives development and quality teams:
• Simulation of peak demand loads through onsite and cloud-based resources for scalable, powerful and cost effective peak load testing;
• Web 2.0 client emulation to test even today's rich internet applications effectively.
Micro Focus Continuous Quality Assurance transforms 'quality' into a predictable managed path; moving from reactively accepting extra cost at the end of the process, to confronting waste head on and focusing on innovation. Micro Focus, a member of the FTSE 250, provides innovative software that enables companies to dramatically improve the business value of their enterprise applications. Micro Focus Enterprise Application Modernization and Management software enables customers' business applications to respond rapidly to market changes and embrace modern architectures with reduced cost and risk.

For more information, please visit www.microfocus.com/CQA




Original Software
Delivering quality through innovation
With a world-class record of innovation, Original Software offers a solution focused completely on the goal of effective quality management. By embracing the full spectrum of Application Quality Management across a wide range of applications and environments, the company partners with customers and helps make quality a business imperative. Solutions include a quality management platform, manual testing, full test automation and test data management, all delivered with the control of business risk, cost, time and resources in mind.

Setting new standards for application quality Today’s applications are becoming increasingly complex and are critical in providing competitive advantage to the business. Failures in these key applications result in loss of revenue, goodwill and user confidence, and create an unwelcome additional workload in an already stretched environment. Managers responsible for quality have to be able to implement processes and technology that will support these important business objectives in a pragmatic and achievable way, without negatively impacting current projects. These core needs are what inspired Original Software to innovate and provide practical solutions for Application Quality Management (AQM) and Automated Software Quality (ASQ). The company has helped customers achieve real successes by implementing an effective ‘application quality eco-system’ that delivers greater business agility, faster time to market, reduced risk, decreased costs, increased productivity and an early return on investment. These successes have been built on a solution that provides a dynamic approach to quality management and automation, empowering all stakeholders in the quality process, as well as uniquely addressing all layers of the application stack. Automation has been achieved without creating a dependency on specialised skills and by minimising ongoing maintenance burdens.

An innovative approach Innovation is in the DNA at Original Software. Its intuitive solution suite directly tackles application quality issues and helps organisations achieve the ultimate goal of application excellence.

Empowering all stakeholders The design of the solution helps customers build an ‘application quality eco-system’ that extends beyond just the QA team, reaching all the relevant stakeholders within the business. The technology enables everyone involved in the delivery of IT projects to participate in the quality process – from the business analyst to the business user and from the developer to the tester. Management executives are fully empowered by having instant visibility of projects underway.

Quality that is truly code-free Original Software has observed the script maintenance and exclusivity problems caused by code-driven automation solutions and has built a solution suite that requires no programming skills. This empowers all users to define and execute their tests without the need to use any kind of code, freeing them from the automation specialist bottleneck. Not only is the technology easy to use, but quality processes are accelerated, allowing for faster delivery of business-critical projects.

Top to bottom quality Quality needs to be addressed at all layers of the business application. Original Software gives organisations the ability to check every element of an application - from the visual layer, through to the underlying service processes and messages, as well as into the database.

Addressing test data issues Data drives the quality process and as such cannot be ignored. Original Software enables the building and management of a compact test environment from production data quickly and in a data privacy compliant manner, avoiding legal and security risks. It also manages the state of that data so that it is synchronised with test scripts, enabling swift recovery and shortening test cycles.

A holistic approach to quality
Original Software's integrated solution suite is uniquely positioned to address all the quality needs of an application, regardless of the development methodology used. Being methodology neutral, the company can help in Agile, Waterfall or any other project type. The company provides the ability to unite all aspects of the software quality lifecycle. It helps manage the requirements, design, build, test planning and control, test execution, test environment and deployment of business applications from one central point that gives everyone involved a unified view of project status and avoids the release of an application that is not ready for use.

Helping businesses around the world
Original Software's innovative approach to solving real pain-points in the Application Quality Life Cycle has been recognised by leading multinational customers and industry analysts alike. In a 2010 report, Ovum stated: "While other companies have diversified, into other test types and sometimes outside testing completely, Original has stuck more firmly to a value proposition almost solely around unsolved challenges in functional test automation. It has filled out some yawning gaps and attempted to make test automation more accessible to nontechnical testers." More than 400 organisations operating in over 30 countries use Original Software solutions. The company is proud of its partnerships with the likes of Coca-Cola, Unilever, HSBC, FedEx, Pfizer, DHL, HMV and many others.

www.origsoft.com solutions@origsoft.com Tel: +44 (0)1256 338 666 Fax: +44 (0)1256 338 678 Grove House, Chineham Court, Basingstoke, Hampshire, RG24 8AG




Parasoft
Improving productivity by delivering quality as a continuous process
For over 20 years Parasoft has been studying how to efficiently create quality computer code. Our solutions leverage this research to deliver automated quality assurance as a continuous process throughout the SDLC. This promotes strong code foundations, solid functional components, and robust business processes. Whether you are delivering Service-Orientated Architectures (SOA), evolving legacy systems, or improving quality processes – draw on our expertise and award winning products to increase productivity and the quality of your business applications. Parasoft's full-lifecycle quality platform ensures secure, reliable, compliant business processes. It was built from the ground up to prevent errors involving the integrated components – as well as reduce the complexity of testing in today's distributed, heterogeneous environments.

What we do
Parasoft's SOA solution allows you to discover and augment expectations around design/development policy and test case creation. These defined policies are automatically enforced, allowing your development team to prevent errors instead of finding and fixing them later in the cycle. This significantly increases team productivity and consistency.

End-to-end testing: Continuously validate all critical aspects of complex transactions which may extend through web interfaces, backend services, ESBs, databases, and everything in between.

Advanced web app testing: Guide the team in developing robust, noiseless regression tests for rich and highly-dynamic browser-based applications.

Application behavior virtualisation: Automatically emulate the behavior of services, then deploy them across multiple environments – streamlining collaborative development and testing activities. Services can be emulated from functional tests or actual runtime environment data.

Continuous regression testing: Validate that business processes continuously meet expectations across multiple layers of heterogeneous systems. This reduces the risk of change and enables rapid and agile responses to business demands.

Multi-layer verification: Ensure that all aspects of the application meet uniform expectations around security, reliability, performance, and maintainability.

Policy enforcement: Provide governance and policy-validation for composite applications in BPM, SOA, and cloud environments to ensure interoperability and consistency across all SOA layers.

Load/performance testing: Verify application performance and functionality under heavy load. Existing end-to-end functional tests are leveraged for load testing, removing the barrier to comprehensive and continuous performance monitoring.

Security testing: Prevent security vulnerabilities through penetration testing and execution of complex authentication, encryption, and access control test scenarios.

Trace code execution: Provide seamless integration between SOA layers by identifying, isolating, and replaying actions in a multi-layered system.

Specialised platform support: Access and execute tests against a variety of platforms (AmberPoint, HP, IBM, Microsoft, Oracle/BEA, Progress Sonic, Software AG/webMethods, TIBCO).

Please contact us to arrange either a one-to-one briefing session or a free evaluation.

Web: www.parasoft.com Email: sales@parasoft-uk.com Tel: +44 (0) 208 263 6005




Spirent Communications
Spirent Communications plc is a global leader in test and measurement products and services. The world's leading organisations rely on Spirent's solutions and expertise to evaluate the performance of their applications, products and network services – because no other company can offer stringent application testing in a realistic network context, embracing any combination of Local Area, Wide Area, fixed line and wireless access scenarios.

From wireline to wireless to satellite, Spirent offers a complete portfolio of solutions to enhance user and customer quality of experience (QoE). They enable today's communication ecosystem as well as tomorrow's emerging enterprises to deploy life-enriching communication networks, devices, services and applications.

Sectors that rely on Spirent testing include: financial services, cloud service providers, major equipment vendors, operators, government organisations and the military.

Broad industry expertise
Spirent is recognised worldwide for its expertise in next-generation communication networks, devices and applications, and its engineers are chosen to advise many of the leading communication standards organisations. It provides the equipment, training and test methodology to help many of the leading technology labs and forums evaluate the performance of emerging technologies. Spirent pioneered the testing of Ethernet networks, IP Telephony and VoIP, VPNs, Triple-Play, 2G and 3G wireless, and Location Based Services. Today, it is helping to test the first deployments of high-speed Ethernet, next-generation Internet, 4G wireless and IPv6 networks in Asia, Europe and North America. Areas of expertise include:
• Enterprise Networks: load testing, system performance, network security;
• Convergence: VoIP, IP VPNs, IPTV;
• Broadband Networking: DSL, Gigabit Ethernet, and IP;
• Next-Generation Internet: IPv6;
• Wireless: CDMA, UMTS, Location Based Services and LTE;
• Satellite Navigation: GPS, GLONASS, Galileo.

The leader in web application testing
Spirent's expertise is utterly critical to the performance and QoE offered by cloud-based applications. Other solutions exist, offering sophisticated application testing, but only Spirent solutions take into account the complexities of the web environment and its potential impact on the service. Whether it is a pizza restaurant remote ordering system, an Inland Revenue tax return, or the BBC's iPlayer, the application needs to serve any combination of multiple users across a range of access technologies – including broadband business connection, home ADSL, GSM, WiFi – each with their own protocols, speeds and typical user behaviour profiles. Unless your test scenario can emulate such a complex mix of protocols and user behaviours, you cannot be certain how the application will perform across the web. With Spirent these 'real life' test scenarios are not only possible, but made easy by a simple graphical user interface, enabling rapid creation of complex and realistic everyday usage models as well as extreme crisis test conditions – without any specialist scripting skills.

Spirent solutions for web application testing
Two factors are key to realistic web application testing – user emulation and network impairment emulation. Spirent Network Impairment Emulators bring the realism of live network conditions into the lab. They enable testers to accurately create the whole range of actual and possible delays and impairments that occur over live networks. Spirent Ethernet Network Emulators are essential for validating and evaluating new products and technologies. Examples include: testing multimedia products and services; storage networks; circuit emulation services; Timing over Packet; disaster recovery; data centre moves; mobile/wireless infrastructure; WAN and TCP acceleration; and multi-play applications. Spirent Avalanche appliances generate the most realistic possible emulation of large user populations for application, web services and content delivery testing. Any combination of users, access technologies, and applications can be easily programmed via a point and click graphical interface – specifying such variables as connections, transactions and sessions per second – then multiplied to emulate massive traffic conditions. Real time performance statistics are presented and summarised into customised test reports. Spirent Avalanche Virtual is the industry's only 'all-in-one' cloud testing solution designed to test and measure the performance, availability, security and scalability (PASS) of virtualised network infrastructures and of applications deployed in the cloud. A software solution that resides in virtualised data centre environments, Spirent Avalanche Virtual lets enterprises and cloud service providers measure the impact of large-scale cloud-based application deployments against their expected quality of experience. Realistic testing based on flexible impairment and usage programming is the only way to ensure a fully developed solution and an increase in quality of experience for the user.

Spirent Communications plc Tel: +44(0)7834752083 Email: Daryl.Cornelius@spirent.com Web: www.spirent.com




the last word... Seriously? Dave Whalen is having problems with his data.

True story... Once upon a time, Cap'n Dave was contracted to test a commercial banking web application. The testing was pretty straightforward: the application allowed you to view transactions, reconcile your accounts, transfer funds, etc. One requirement, based on US federal regulations, was that a customer could make no more than five online transfers between accounts in a month. So late one Friday afternoon, which coincidentally was also the last day of the month, I made five transfers from my test user's checking account to her savings account. Since I wasn't playing with real money, I made five transfers of $100. I attempted a sixth transfer and, as expected, received an appropriate error message. Since Monday was a new month I figured I would try another transfer first thing Monday morning. If the transfer was allowed, the test would pass and I could move on. On Monday I launched the application successfully and was presented with the log-on screen. I entered my test user's log-on but received an error message that my username and/or password were invalid. No worries, probably just a typo. I tried again – same error. It was Monday and the coffee hadn't kicked in yet so I figured I had probably typed something wrong. Third time, I made sure every character was absolutely correct. I clicked the log-on button and once again it failed. Only this time, with three consecutive invalid attempts, the account was locked. I sauntered, head hung low, into my boss's office to relate the embarrassing tale of being locked out of my test account. What he told me knocked me to the floor. Apparently, the woman who owned the account noticed some strange activity and


changed her password. What!!! We were testing with live data and live accounts! Seriously! This can't be true! Sadly, it was true. All of the testers had been testing using accounts that they had created. It was due to sheer laziness on my part that I had decided to use an existing account. I assumed we had a copy of the production database to use for testing. I quickly raised an issue with the test project manager and the CIO – anyone who would listen – that this was completely unacceptable. They didn't get it. They told me to just be sure to reverse any changes I made during testing. I ended the contract on the spot!

There are a number of key concepts related to any successful test effort. Repeatability and consistency are important; they are actually somewhat related. Tests should be 'repeatable' in that we always run the same test, under the same conditions, with the same data, each and every test cycle or when validating a defect fix. Repeatability drives consistency. When we run the same tests, under the same conditions, with the same data, we should always get the same result. Bottom line, we have to do the exact same steps, with the exact same data, and the exact same actions, each and every time we run a test. If not, the test becomes unreliable. But if the tests alter the data, or the data is constantly changing, how can we do that? We lose repeatability and consistency if the data changes with every new test cycle. Sadly, this is way too common. A number of organisations that I've worked with will update the test database with a current copy of the production database at the beginning of each new test cycle. The result is that tests are typically vaguely written because the data is unknown, or pieces may be missed if data doesn't exist to support the test.

We really should define and build test data as part of the test authoring process. With every new test cycle, we rebuild the data with a clean, known set of test data. Use the same data each and every time. How do we create test data? We have a couple of options depending on the time you have:
1. Take a copy of production, modify it to fit your tests, and then save it!
2. Create it from scratch.
Option 2 is rarely a viable option; however option 1 usually works well. As a compromise, you may be able to write test data on top of a 'copy' of production data. In other words, run a script that adds test data to a copy of production data. It's going to take time to create the initial test data set and as time goes by, it may need to be modified. But it saves a lot of time in the long run and increases the reliability and consistency of your test results! Can we always do this? I'd hate to say always, but I'm willing to venture about 90 percent of the time we can. And, although this should be a blinding glimpse of the obvious, never, under any circumstances, test using live data!
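To make the 'run a script that adds test data to a copy of production' idea concrete, here is a minimal sketch in Python. It assumes a sanitised SQLite copy of the production database containing an accounts table; the table name, columns and fixture values are purely illustrative and not taken from any real application.

import sqlite3

# A small, version-controlled fixture: the same known data every test cycle.
FIXTURE_ACCOUNTS = [
    ("TEST-CHK-001", "checking", 1000.00),
    ("TEST-SAV-001", "savings", 500.00),
]

def seed_test_data(db_path):
    """Reset the known test accounts in a *copy* of the production database."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.cursor()
        # Remove leftovers from previous cycles so the starting state is always clean.
        cur.execute("DELETE FROM accounts WHERE account_id LIKE 'TEST-%'")
        # Re-insert the identical fixture rows: same data in, same results out.
        cur.executemany(
            "INSERT INTO accounts (account_id, account_type, balance) VALUES (?, ?, ?)",
            FIXTURE_ACCOUNTS,
        )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    # Point this at the sanitised copy of production – never at the live database.
    seed_test_data("production_copy.db")

Run before every cycle, a script like this keeps the starting data identical from run to run, which is exactly what makes tests repeatable and defect fixes verifiable.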

Dave Whalen
President and senior software entomologist
Whalen Technologies
http://softwareentomologist.wordpress.com



In Touch With Technology

SUBSCRIBE TO T.E.S.T

Simply visit www.testmagazine.co.uk/subscribe or email subscriptions@testmagazine.co.uk

*Please note that subscription rates vary depending on geographical location

Published by 31 Media Ltd Telephone: +44 (0) 870 863 6930 Facsimile: +44 (0) 870 085 8837

www.31media.co.uk

Email: info@31media.co.uk Website: www.31media.co.uk

The European Software Tester


AppLabs is the Independent Testing Partner to some of the World’s Largest Airlines, Banks and Stock Exchanges, Healthcare and Pharmaceutical Companies, Retail Giants and Insurance Providers.

While our customers may have several development partners, they have only ONE Independent Testing Partner, AppLabs.

• Test Center of Excellence;
• ERP Testing Services;
• Application Infrastructure Management;
• Test Automation Services;
• Test Process Consulting;
• Performance Testing Services;
• Security Testing Services;
• Certification Services;
• Usability Testing;
• User Acceptance Testing Services;
• Compatibility and Interoperability Testing Services;
• Internationalization and Localization Testing;
• Embedded Systems Testing;
• Migration Testing;
• System Integration Testing;
• Mobile Device Testing Services;
• Mobile Application Testing Services;
• Game Testing Services.

www.applabs.com

info@applabs.com

