JULY 2016
Cover stories: The legacy of testing · Finding bugs through psychoanalysis
CONTENTS | JULY 2016

COVER STORY: MANUFACTURING SECTOR

NEWS
Software industry news – 5

THOUGHT LEADERSHIP
From information technology to intelligent technology – 10
Why the cloud carries such clout – 12
Banishing misinformation

SECURITY TESTING
Securing code to fight cyber crime – 14

VIRTUALISATION
The new virtual (data) reality

MANUFACTURING SECTOR
Going industrial: a new model of manufacturing

FINANCIAL SECTOR
The legacy of testing – 28

AGILE TESTING
More than just buzz

TEST MANAGEMENT
Finding bugs through psychoanalysis

SUPPLIER PROFILE
A perfect partnership

PREVIEW
Software Testing Conference NORTH 2016 – 46
TEST Magazine | July 2016
EDITOR'S COMMENT

THE FUTURE OF TESTING IS IN YOUR HANDS
CECILIA REHN, EDITOR OF TEST MAGAZINE

I am regularly asked for input and opinion on the testing industry: what is its role today, and where is it heading? I have now spent a year reporting on testing and QA (July 2015 was my first issue of TEST Magazine – how time flies!), and networking with industry specialists, newcomers, veterans and more. In this time, I've hosted and co-hosted quite a few sessions and roundtables on the 'future of testing', and it is clear that there's plenty to be said on the topic.

There is a lot of uncertainty hanging in the air, and a lot of questions. Many wonder whether their jobs are going to be automated away. Others ask whether we're losing an intrinsic 'testing mindset' if we only recruit and value those with technical coding skills. Similarly, what will the new relationship between testers, developers, BAs and other stakeholders look like? Are we transitioning into hybrid roles, and who will define them? Some firms look to promote testers into BAs; in other places, many move from development into test, and then back again. If anything, it seems that a stint in test/QA is tremendously advantageous and valuable, allowing people to slot into other areas of an organisation as and when needed.

This leads into the next set of concerns – what skills should testers and managers be looking to develop? How can we work together to prepare the next generation of talent? And where do we recruit them from? I've had conversations with senior managers emphasising the importance of in-house training; taking an active role in the community to promote IT careers in general; and looking at alternative recruitment pools such as ex-military. Ultimately, if we can all agree that there is no one-size-fits-all tester, then it makes sense that future recruits will come from diverse backgrounds.

It is obvious that there's a lot of innovation in this industry, and since technology does not stagnate, an awareness of new tools, techniques and methods will be critical for all wishing to progress in their careers. Flexibility and a willingness to adapt and keep learning are important, and well-developed soft skills will always be in high demand. I believe that all testers, whether just starting out or with 10+ years under their belt, benefit from networking with their peers, getting out there to learn more about what others are doing, and staying informed by reading industry publications (such as this magazine). There is an abundance of resources out there.

This issue of TEST Magazine is filled with commentary, news and insight to help and inspire you. There are thought leadership pieces on AI and cloud testing; a supplier profile on newly-rebranded Ten10; coverage of what digitalisation means in the manufacturing sector, and much more. There is also an interview with Mike Jarred, Head of Software Testing at the FCA, on p. 28, who argues eloquently that the legacy of testing is information.

I think it's important for all testing managers to consider how their role will continue to evolve, what new areas of responsibility they can take on, and who their future staff will be. We all have questions, and yes, there is uncertainty. But it's important to remember that we have a lot of power too. We all have the opportunity and responsibility to shape the testing/QA function and help promote the role's importance, regardless of what future job titles will be or what team members will look like.

And most importantly: if we're moving away from the title 'tester', do we need to change the name TEST Magazine? ;) I'd love to hear your thoughts on that and everything else! Get in touch @testmagazine or on LinkedIn.
JULY 2016 | VOLUME 8 | ISSUE 3

© 2016 31 Media Limited. All rights reserved. TEST Magazine is edited, designed, and published by 31 Media Limited. No part of TEST Magazine may be reproduced, transmitted, stored electronically, distributed, or copied, in whole or part without the prior written consent of the publisher. A reprint service is available. Opinions expressed in this journal do not necessarily reflect those of the editor of TEST Magazine or its publisher, 31 Media Limited. ISSN 2040-01-60

GENERAL MANAGER AND EDITOR: Cecilia Rehn – cecilia.rehn@31media.co.uk – +44 (0)203 056 4599
ADVERTISING ENQUIRIES: Anna Chubb – anna.chubb@31media.co.uk – +44 (0)203 668 6945; Mostafa Al-Baali – mostafa.al-baali@31media.co.uk – +44 (0)203 668 6943
PRODUCTION & DESIGN: JJ Jordan – jj@31media.co.uk
EDITORIAL INTERN: Jordan Platt

31 Media Ltd, 41-42 Daisy Business Park, 19-35 Sylvan Grove, London, SE15 1PD
+44 (0)870 863 6930 – info@31media.co.uk – www.testingmagazine.com

PRINTED BY Pensord, Tram Road, Pontllanfraith, Blackwood, NP12 2YA

softwaretestingnews | @testmagazine | TEST Magazine Group
cecilia.rehn@31media.co.uk
INDUSTRY NEWS
ENTRIES OPEN TO THE EUROPEAN SOFTWARE TESTING AWARDS 2016 Now in its fourth year, The European Software Testing Awards celebrate companies and individuals who have accomplished significant achievements in the software testing and quality assurance market. The whole industry has been invited to enter 16 categories including the Best Agile Project; Best Use of Technology; Graduate Tester of the Year; and Best User Experience (UX) Testing in a Project.
The final entries will be judged anonymously by the 2016 judging panel, which is comprised of senior heads of testing and QA, who have all been appointed based on their extensive experience in the software testing and QA field. Commenting on her expectations from entrants, Awards judge Delia Brown, Head of Test at Home Office Technology, Test Design and Consultancy Services (TDCS) said: “It obviously depends on the category, but one of the important things for me, hopefully we’re doing it better these days, is putting forward something that is actually supporting business goals. “Enabling the delivery of business goals, and bringing something above and beyond what you would expect, so whether that’s technical innovation or something surrounding people and approach. A project above and beyond the norm.” For those wishing to enter the Awards, the booking deadline is 5 August, with final entries due by 12 August. All finalists will be announced in September. The winners will be announced at the glittering Gala Dinner at Old Billingsgate, London on 16 November.
BULGARIA PASSES OPEN SOURCE LAW Bulgaria's parliament has voted on amendments to the country's Electronic Governance Act, which require all software written for the government to be open source and developed in a public repository. The move means custom software procured by the government will now be accessible to everyone. Article 58 of the Electronic Governance Act states that administrative authorities must include the following requirements: "When the subject of the contract includes the development of computer programs, computer programs must meet the criteria for open‑source software; all copyright and related rights on the relevant computer programs, their source code, the design of interfaces, and databases which are subject to the order should arise for the principal in full, without limitations in the use, modification,
and distribution; and development should be done in the repository maintained by the agency in accordance with Art 7c pt. 18." Bozhidar Bozhanov, advisor to the Bulgarian Deputy Prime Minister, said the move is to prevent vulnerabilities in government websites being left unpatched when a contract expires, and to detect bad security practices earlier. A new government agency is charged with enforcing the law and setting up the public repository. In addition, a public register will also be developed in the next few weeks to track all projects, from inception through to technical specs, deliverables, and subsequent control, Bozhanov said. Existing solutions are unaffected. As part of the same law, all IT contracts are also made public. Bozhanov added that the decision "is a good step for better government software and less abandonware, and I hope other countries follow our somewhat 'radical' approach of putting it in the law”.
A FIFTH OF UK BUSINESSES DO NOT UNDERSTAND DATA PROTECTION REQUIREMENTS From June 2018, any business that offers goods and services to the EU or monitors the behaviour of EU citizens will be subject to the General Data Protection Regulation (GDPR). However, 21% of UK businesses have no understanding of the impending GDPR being introduced, according to research carried out by Delphix. A further 42% in the UK have looked into some aspects of the GDPR but not into the pseudonymisation tools that the legislation recommends. Approximately one in five of those that have studied the pseudonymisation requirements in the GDPR admit that they are having trouble understanding them. "Following the results of the EU referendum, there is confusion about whether the GDPR is still compliant. It's important to remember the UK's exit from the EU won't happen overnight. In the immediate future, the UK will be subject to the same data protection regime as the rest of the EU. In the long‑term the UK will still need to prove adequacy and adopt similar data protection standards to continue trading securely within Europe. As a result, organisations need to focus on getting their GDPR preparations underway," explained Iain Chidgey, VP International at Delphix. "The GDPR defines pseudonymisation
as the process of ensuring data is held in a format that does not directly identify a specific individual without the use of additional information. To address the challenges of a digital age and limit the risk to individuals that have their data breached, the GDPR incentivises organisations to pseudonymise their data at several different points." Currently, just a quarter of data in the UK and Germany is masked, compared to a third in France. Respondents in the UK claimed that the biggest challenges to data masking are that data is sprawled throughout the organisation with little central control (32%) and that it takes too long and delays projects (42%). A further 26% in France and Germany also claimed that data masking tools are prohibitively expensive. As a result of the new legislation, nearly half of data in the UK and Germany will be masked by 2018 (48% and 47% respectively). In France, this figure will be even higher, rising to 60%. The survey also revealed that responsibility for data protection will sit firmly within the C‑suite, but few organisations have appointed a chief data officer or a chief privacy officer. In the UK, 52% listed the CISO or head of IT security as responsible. A further 18% cited the chief data officer or data protection officer followed by the CEO or CIO (17%).
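The pseudonymisation the GDPR describes can be illustrated with a short sketch. This is a hypothetical Python example, not a reference to any particular masking product: direct identifiers are replaced with keyed, deterministic tokens, so masked data sets can still be joined on the token while the secret key is held separately under controlled access.

```python
import hashlib
import hmac

# Secret key held separately from the pseudonymised data set;
# without it the tokens cannot be linked back to individuals.
SECRET_KEY = b"store-me-in-a-key-vault"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic token."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def mask_record(record: dict, direct_identifiers=("name", "email")) -> dict:
    """Return a copy of the record with direct identifiers tokenised."""
    return {key: pseudonymise(value) if key in direct_identifiers else value
            for key, value in record.items()}

record = {"name": "Jane Doe", "email": "jane@example.com", "plan": "premium"}
masked = mask_record(record)
```

Because the tokens are deterministic for a given key, analytics and test data sets masked this way can still be joined across tables without exposing who the rows describe.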
SPOTIFY ACCUSES APPLE OF BLOCKING APP

Spotify is railing against Apple's demand that it use the App Store billing system. The music streaming service argues the Apple rules are intended to damage Spotify, reducing competition to rival Apple Music. "We cannot stand by as Apple uses the App Store approval process as a weapon to harm competitors," Spotify general counsel Horacio Gutierrez wrote in a communication with Apple's lawyers, details of which have since been shared online. The letter states that Apple's actions raise serious concerns under both US and EU competition law. "It continues a troubling pattern of behaviour by Apple to exclude and diminish the competitiveness of Spotify on iOS and as a rival to Apple Music, particularly when seen against the backdrop of Apple's previous anticompetitive conduct aimed at Spotify."

Currently, Apple charges firms up to 30% of income made to use its iTunes billing service. There is no alternative billing system, and if companies don't comply they can be removed from the App Store and lose access to customers with iPhones.

Jonathan Prince, Spotify's Global Head of Communications and Public Policy, spoke recently about Apple, Amazon and Google's internet dominance. "Apple has long used its control of iOS to squash competition in music," he said. He faulted Apple for "driving up the prices of its competitors, inappropriately forbidding us from telling our customers about lower prices, and giving itself unfair advantages across its platform through everything from the lock screen to Siri."

In response, Apple said in a letter: "Our guidelines apply equally to all app developers…and regardless of whether or not they compete against Apple." Apple's lawyer Bruce Sewell added: "We did not alter our behaviour or our rules when we introduced our own music streaming service or when Spotify became a competitor…Ironically, it is now Spotify that wants things to be different by asking for preferential treatment from Apple."

NEW EYE TRACKING SOFTWARE FOR SMARTPHONES

A team of international researchers from the Massachusetts Institute of Technology (MIT), the University of Georgia and Germany's Max Planck Institute for Informatics is developing software that could let you control your smartphone through eye movements. To date, the team has trained software to identify where a person is looking with an accuracy of about a centimetre on a mobile phone and 1.7 centimetres on a tablet. That isn't especially accurate given the overall size of a smartphone screen, and it is still not exact enough for consumer applications, says MIT study co-author Aditya Khosla.

According to Khosla, the system's accuracy will improve with more data. To achieve this, the researchers built GazeCapture, an app that gathers data about how people look at their phones in different surroundings outside the boundaries of a lab. Users' gaze was recorded with the phone's front camera as they were shown pulsating dots on a smartphone screen. This information was then fed into iTracker, which uses the handset's camera to capture a user's face; the software considers factors like the position and direction of the head and eyes to work out where the gaze is focused on the screen.

About 1500 people have used the GazeCapture app so far, Khosla said, adding that if the researchers can get data from 10,000 people they'll be able to reduce iTracker's error rate to half a centimetre, which should be good enough for a range of eye-tracking applications. Other possible uses for the software could be in medical diagnoses, particularly to identify conditions such as schizophrenia and concussion, Khosla said.

Andrew Duchowski, a professor at Clemson University who specialises in eye tracking, thinks iTracker could be "hugely" useful if the researchers can get it working well on mobile devices, though he has raised concerns that the app would need to work quickly and not drain too much battery.
SOUTH KOREA LAUNCHES LOW-COST IoT NETWORK

A new low-cost IoT network in South Korea will allow smart devices to talk to each other. Behind the initiative is phone carrier SK Telecom, which uses technology that will allow it to reach 99% of the country's population. SK Telecom is investing up to KRW100 billion by the end of 2017 to further develop the infrastructure, which it hopes will be a new source of revenue. In a statement it said the price plans are "highly affordable" and cost one-tenth of its current LTE-based IoT services. The new price plan is expected to ease the cost burden on startups and small and medium enterprises.

Besides developing and launching IoT services of its own, SK Telecom will be making multifaceted efforts to vitalise the IoT ecosystem by encouraging the participation of developers, SMEs and startups. To this end, the company plans to set up a comprehensive programme named the 'Partner Hub Program' to nurture partners. Through the programme, SK Telecom will share its expertise and know-how, provide training and conduct joint development and marketing.

"SK Telecom is proud to announce the nationwide deployment of LoRaWAN as it marks the first important step towards realising connectivity between an infinite number of things, going beyond the traditional role of telecommunications centred on connectivity between people," said Lee Hyung-hee, President of Mobile Network Business at SK Telecom. "Going forward, SK Telecom will develop and offer a wide variety of IoT services designed to offer new value for customers, while working closely with partners including SMEs and startups to vitalise the IoT ecosystem." Aside from South Korea, the Netherlands also has a nationwide IoT network.

MICROSOFT PURCHASES LINKEDIN FOR US$26.2 BILLION

Microsoft is buying LinkedIn, in its biggest ever purchase, for US$26.2 billion, and the buy-out has been hailed as "a chance to change how the world works." The deal was announced on 13 June by both companies prior to the market opening on Wall Street. Microsoft has stated that after the acquisition the business-focused social network will "retain its distinct brand, culture and independence." After the announcement LinkedIn's shares rose by 49% while Microsoft's dipped close to 3%. LinkedIn was launched in 2003 and has a recorded 430 million members, valuing each of its members at roughly US$60 under the deal.

Jeff Weiner will remain CEO of LinkedIn, reporting to Satya Nadella, CEO of Microsoft. Reid Hoffman, chairman of the board, co-founder and controlling shareholder of LinkedIn, and Weiner both fully support the transaction.

VIRTUAL REALITY SICKNESS CAN BE TACKLED, SAY RESEARCHERS

Columbia University researchers have found that virtual reality sickness can be solved by a simple field-of-view alteration. They found that, when a person's field of view was restricted during VR, users were happier to stay in a virtual reality for longer periods of time. Some users find virtual reality environments uncomfortable due to the sensory mismatch between what they can see and where they actually are in reality, which causes VR sickness.

The technique of altering users' views was trialled by 30 volunteers and was described by one specialist as "simple and efficient." Steven Feiner, a professor at Columbia University, is said to have suffered from VR sickness himself. Professor Feiner explained that VR has the potential to "profoundly change" how people interact with machinery, information and other human beings. "It is critical that the experience be both comfortable and compelling, and we think we've found a way," he said.
INDUSTRY EVENTS
www.softwaretestingnews.co.uk | www.devopsonline.co.uk

SOFTWARE TESTING CONFERENCE NORTH
Date: September 2016 | Where: York, UK | north.softwaretestingconference.com | RECOMMENDED

STARWEST
Date: 2-7 October 2016 | Where: Anaheim, CA, United States | www.techwell.com

DEVOPS FOCUS GROUPS
Date: 18 October 2016 | Where: London, UK | www.devopsfocusgroups.com | RECOMMENDED

QA&TEST 2016
Date: 19-21 October 2016 | Where: Bilbao, Spain | www.qatest.org

EUROPEAN SOFTWARE TESTING SUMMIT
Date: 15 November 2016 | Where: London, UK | www.softwaretestingawards.com | RECOMMENDED

THE EUROPEAN SOFTWARE TESTING AWARDS
Date: 16 November 2016 | Where: London, UK | www.softwaretestingawards.com | RECOMMENDED

A POST-BREXIT VIEW OF THE UK TECH SECTOR
The tech sector accounts for around 10% of UK GDP, and is one of the fastest growing components of the economy. With a thriving start-up community, including a strong focus on fintech, London is widely renowned as the digital capital of Europe. Before the EU referendum on 23 June, a number of surveys of technology business leaders pointed to a strong preference to remain part of the EU. Read more online.

CONTINUOUSLY EVOLVING: DEVOPS AT ITV
In the past five years, ITV has switched from waterfall to agile development. "It was all manual before, and it wasn't keeping up with the pace of change that we needed. We now have development teams headed by product owners," said Tom Clark, Head of Common Platform at ITV plc. Read more online.

SOFTWARE TESTING FOR RIO 2016 OLYMPIC GAMES
Every IT component of the Olympic Games has been fully tested and assessed by Atos, in partnership with the Rio 2016™ Organising Committee of the Olympic Games, in time for the games to begin on 5 August 2016. The comprehensive Technical Rehearsal is designed to test readiness based on the three busiest days of the Games. Read more online.

DUTCH NATIONAL POLICE USES ONLINE GAMING FOR RECRUITMENT
An online gaming experience and recruitment vehicle, #Crimediggers aims to find skilled developers interested in digital forensics careers. The campaign website has had over 100,000 unique visitors and 53,381 registered players, resulting in over 1042 applications to fill 110 vacancies with the Dutch Police. Read more online.
FROM INFORMATION TECHNOLOGY TO INTELLIGENT TECHNOLOGY

Siva Ganesan, Vice President and Global Head of Assurance Services, Tata Consultancy Services, argues that transitioning into intelligent testing will be the vital ingredient for assuring artificial intelligence.

"The 9000 series is the most reliable computer ever made. We are all foolproof and incapable of error," says the sentient computer HAL in Stanley Kubrick and Arthur C. Clarke's visionary epic, 2001: A Space Odyssey. HAL speaks in a soft monotone – "I'm sorry, Dave" – even as he kills the human crew. "Never send a human to do a machine's job," runs the line from The Matrix, the cyberpunk action classic set in an artificial intelligence (AI) infused future where AI machines had enslaved mankind with a simulated reality.
While Hollywood science fiction has almost always predicted a doomsday scenario in AI-based movies, the outlook isn't quite as bleak or as singularly alarming as it is portrayed to be. AI today is finally coming of age, and is beginning to enrich people's daily lives at work, at home and in many other situations. AI is being used to detect metastasised cancers, manufacture personalised medication and create driverless automobiles; AI is even being used to create art and produce literature! There is no doubt, however, of the need for caution and, as some term it, 'the kill switch' (technology that is meant to stop AI from learning how to stop human
intervention from halting its existence). But on a more fundamental level, as the application of AI has moved into the realm of networked smart devices that affect a staggeringly large number of users and has proliferated into the consumer space, quality assurance (QA) in the AI domain has assumed great significance.
CHANGING PRINCIPLES OF QUALITY ASSURANCE

While the core principles of quality assurance remain the same – product and process quality – the testing approaches and the way we gauge quality will require changes and additions. Newer test approaches to enable intelligent testing – involving smart test tools, additional phases in the test life cycle (with shift left and shift right becoming default test phases) and many other new approaches – will need to be created and codified. For example, as the AI system learns on its own in production, generation of intelligent test cases and test execution of the 'system in production' will be key. Metrics, which are the key indicators used to gauge quality, will also need to cover a broader horizon. For example, the traditional focus on production quality, indicated by defects in production, will no longer be the only indicator of application quality. For a better understanding, let me detail the changing principles using the following validation points, which are checks and
balances for fool-proofing the application of AI technology: 1. Validating the business processes: with the advent of AI, we are talking about replacing human intervention and achieving extreme digitisation. 2. Validating customer experience: consumers look for the benefits of hyper-personalisation in a hyper-connected world, made possible by advanced application of AI.
The future of AI AI is no longer just about developing robots. It is about creating unsupervised machine learning systems that can automate the process of collecting and analysing data in real‑time. This shift is what is causing the pervasive spread of intelligent networked automation across industries. And quality assurance is the vital ingredient in this mix.
1. BUSINESS PROCESS VALIDATION

HOW DO WE IDENTIFY A FAILED BUSINESS PROCESS THAT ENABLES EXTREME DIGITISATION?
In addition to identifying the failures as the AI system is getting developed, we will need to continuously test the intelligent system in production. Continuous testing requires a lot of ‘intelligent end‑to‑end test automation’ as the system in production is learning and improving its performance continuously.
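As a rough illustration of what such a shift-right check might look like, the sketch below (hypothetical names throughout, with a stubbed call standing in for the live endpoint) repeatedly exercises a system in production and flags drift from its expected behaviour:

```python
import random
import statistics

def system_under_test(x):
    # Stand-in for a call to the learning system in production;
    # a real check would hit the live endpoint instead.
    return 2 * x + random.gauss(0, 0.1)

def production_check(samples=100, tolerance=0.5):
    """Continuously exercise the live system and flag drift
    beyond the tolerated error against the expected response."""
    errors = [abs(system_under_test(x) - 2 * x) for x in range(samples)]
    mean_error = statistics.mean(errors)
    return {"mean_error": mean_error, "passed": mean_error < tolerance}

result = production_check()
```

Scheduled regularly, a check of this shape turns production itself into a test phase: the generated inputs act as intelligent test cases, and a failed run becomes a defect signal even when no incident has been raised.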
HOW DO WE MEASURE THE QUALITY OF AN INTELLIGENT SYSTEM IN PRODUCTION?
A failed business process can be traced to a production incident, but if we look at the bigger objective of extreme digitisation – improving processes and increasing productivity – we need additional means to measure quality. Defects will take on a broader definition, as expected results need to cover intelligent handling of unexpected requirements, increased business productivity measures, and so on. This means thinking tightly coupled to business goals, broader definitions of defects, and taking feedback on quality from both IT and the business will become very significant steps in measuring this validation.
2. CUSTOMER EXPERIENCE VALIDATION

HOW DO WE IDENTIFY A GOOD OR BAD CUSTOMER EXPERIENCE?
While a system is being developed, getting user feedback on the application experience will be key. Taking a product testing approach – i.e. making crowdsourced (internal or external) or beta testing a default testing phase – will be one of the ways to identify the application experience.
As the AI system learns on its own in production, generation of intelligent test cases and test execution of the ‘system in production’ will be key
HOW DO WE MEASURE CUSTOMER EXPERIENCE?
This will again require listening beyond production system defects. We need to listen to the whole public domain, including social media. Mature tools that monitor these public channels and feed a robust opinion-modelling algorithm will give meaningful measures of customer experience. Defining the right customer experience metrics, and tracing those measurements to test cases and in turn to use cases, will make it a 360˚ validation.
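To make the idea concrete, here is a deliberately simplified sketch of opinion modelling – a tiny word-list scorer rather than the mature tooling the article has in mind, with all names and word lists invented for illustration – that turns public posts into a single customer-experience measure:

```python
import re

# Toy opinion lexicons; real tooling would use trained models.
POSITIVE = {"love", "great", "fast", "helpful"}
NEGATIVE = {"broken", "slow", "crashing", "hate"}

def score_post(text: str) -> int:
    """Crude post-level opinion score: positive minus negative word hits."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

def experience_index(posts) -> float:
    """Aggregate public posts into one customer-experience measure."""
    scores = [score_post(p) for p in posts]
    return sum(scores) / len(scores) if scores else 0.0

posts = [
    "love the new release, so fast",
    "the app keeps crashing after the update",
    "support was really helpful",
]
index = experience_index(posts)
```

A metric like this, tracked over time and traced back to the test cases and use cases that shaped each release, is one way to give the 360˚ validation the article describes a number to watch.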
SUMMARY

In conclusion, the need for newer definitions of QA methodologies, measurements and smarter tools enabling intelligent testing is at its peak, with disruptive technologies like AI gaining wider acceptance. QA needs to broaden its horizon by aligning the gauging of system quality ever more closely with business goals, and also needs to get deeper into technology to enable intelligent testing of the systems being developed. The increase in intelligent systems and products and the new business models are transforming today's 'information technology' units into 'intelligent technology' units, which will give business the needed competitive edge. There will soon be a need for another enabling unit with systems supporting the end-to-end development, testing and support of AI systems. The scope of QA will only get broader and richer in this context. QA will have to play its dual 'business' and 'technology' roles more prominently in the new context, and every organisation needs to focus on preparing its QA teams for this intelligent future.
SIVA GANESAN VICE PRESIDENT AND GLOBAL HEAD OF ASSURANCE SERVICES TATA CONSULTANCY SERVICES
Siva runs the Assurance Services Unit (ASU) business for Tata Consultancy Services (TCS), which serves a large, diversified customer base globally. With over 25 years in the IT industry, Siva has considerable experience in building relationships ground up and helping customers unlock tremendous value from their existing testing estate.
Editor's note: This is the last in the series by Siva Ganesan. From the next issue onwards, TEST Magazine will feature a new TCS author, Jayshree Natarajan.
WHY THE CLOUD CARRIES SUCH CLOUT
Valéry Raulet, CTO, Testhouse Limited, reviews how, by leveraging the cloud, a company can implement a pay-as-you-go approach to software testing.
For many companies, cloud technologies provide an effective solution for optimising the use of hardware resources and the opportunity to move their capital expenses (CAPEX) to operational expenses (OPEX). Software testing requires environments that are available as and when required, which can be a challenge using physical hardware. In this article, we look at which testing types can take advantage of cloud testing and what solutions are already in place in Microsoft Azure.
MIGRATION TO THE CLOUD

Cloud computing has been in place for over a decade and has slowly made inroads into the enterprise community. For many companies, the advantages of cloud hosting are so compelling that they not only move some of their infrastructure to the cloud (IaaS) and host websites on a cloud platform (PaaS), but also make use of ready-made services (SaaS) such as CRM, ERP, email, and so on. Cloud testing has the ability to leverage
cloud computing for the needs of software testing. The strong selling point is that you don't even have to host your application in the cloud to take advantage of it. As long as your application is internet-facing you can cloud test it, and even that limitation is due to be removed.
TYPES OF TESTING
By using infrastructure as a service (IaaS), you can deploy nearly any supported system to the cloud. IaaS allows you to install any operating system inside a virtual machine, and then any software on top of it, including testing tools. So instead of deploying a test environment on premise, you deploy it in the cloud. However, the benefits of cloud testing are most noticeable with load and performance testing.
LOAD AND PERFORMANCE TESTING
When performance testing an application, it is advisable to run three pieces of software: a monitoring tool such as Visual Studio, a
test controller, and one or more agents to generate the load and simulate users (i.e. virtual users). If you need to simulate thousands of virtual users, you will need to set up multiple machines with enough bandwidth, CPU processing power and memory, and purchase both the software and the rights to simulate that many users. Doing so requires a lot of CAPEX, and your load test environment may not be utilised efficiently unless you are load testing fairly regularly. Alternatively, you can run load tests using a cloud service, which provides several advantages:
• OPEX instead of CAPEX: by paying only when you use the service, you don't need to tie up capital, and the cost follows the usage.
• Environment usage: on premise you would have to purchase and maintain your environment; using a service, you don't need to set up the infrastructure or platform at all.
• Network bandwidth: large load tests need bandwidth that your company may not have and that can be very expensive; smaller companies may not have that luxury.
• Quick provisioning: a load test can start within minutes.
• Real testing: a cloud‑based solution lets you simulate an even larger user load, and the load comes from the internet, mimicking what real users will be doing. You can also simulate load coming from various locations across the globe (Azure provides 16 locations at the time of writing).
With Microsoft Visual Studio, you can switch between on premise and cloud load generation with a single setting.
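The controller‑and‑agents setup described above can be sketched in miniature. The following Python fragment is an illustration only, not part of any Microsoft tooling: the name `run_load_test` and its parameters are invented for this sketch. It spawns a pool of simulated virtual users and records per‑call latencies, which is the basic job a load agent performs before the controller aggregates the results.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def run_load_test(action, virtual_users=50, iterations=10):
    """Run `action` concurrently for a pool of simulated virtual users
    and collect per-call latencies, as a load agent would."""
    latencies = []

    def virtual_user():
        for _ in range(iterations):
            start = time.perf_counter()
            action()  # e.g. an HTTP request to the system under test
            latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        for _ in range(virtual_users):
            pool.submit(virtual_user)

    return {
        "calls": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": sorted(latencies)[int(len(latencies) * 0.95) - 1],
    }
```

A cloud load service does essentially this across many rented machines at once, which is where the bandwidth and provisioning advantages listed above come from.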
FUNCTIONAL TESTING
Your test environment may not be needed constantly. However, when you do need one, it can take several days for IT to provision the infrastructure and deploy the software. With the advent of DevOps, this is no longer acceptable. Environments need to be deployed automatically, and this is becoming the norm. We now see machines and environments as a commodity whose lifecycle has become provision, use and tear down, sometimes within a couple of hours. This approach can also solve problems such as testing various configurations, or reproducing a client's environment to address an issue they have reported. Using cloud and DevOps automation, this can be achieved within minutes.
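The provision‑use‑tear down lifecycle can be captured as a pattern in code. This is a minimal sketch, assuming your cloud API exposes some provision and tear‑down calls (the `provision` and `tear_down` callables here are stand‑ins, not a real SDK): the environment is guaranteed to be destroyed after the test run, even if the tests fail.

```python
from contextlib import contextmanager

@contextmanager
def ephemeral_environment(name, provision, tear_down):
    """Treat a test environment as a commodity: provision it on demand,
    hand it to the test run, and always tear it down afterwards.
    `provision` and `tear_down` stand in for your cloud API calls."""
    environment = provision(name)
    try:
        yield environment
    finally:
        tear_down(environment)
```

Used as `with ephemeral_environment("config-A", provision, tear_down) as env:`, each configuration under test gets a fresh environment that exists only for the duration of the block.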
Another example of how functional testing is simplified is the use of cloud labs for mobile testing. Using Xamarin Test Cloud, it is possible to access over 2000 real devices. Having this capability on premise would probably be prohibitively expensive for most companies.
PERCEIVED ISSUES
For some, the cloud is considered too risky because it cannot be controlled. This usually comes down to the following concerns:
• Availability: the risk of the service not being available is what may prevent some companies jumping ship. However, a cloud infrastructure is no different from a datacentre; Azure, for instance, guarantees an SLA of at least 99.5%.
• Data: hosting data in the cloud is a risk many companies are not ready to take, because data is their most valuable asset. Using desensitised data for your testing removes this impediment.
• Security: Azure is a public cloud, and the perception is that anyone can access the environments. In reality, you can control who accesses what. For instance, IaaS and PaaS resources deployed within a virtual network can be connected through the private peering domain; by using ExpressRoute connectivity with your private network, you get a secure private connection to cloud assets that are accessible only to your company.
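Desensitising test data can be as simple as replacing sensitive fields with deterministic, irreversible tokens, so records keep their shape (and remain joinable across tables) without exposing real values. A minimal sketch, with the function name and field choices invented for illustration; real masking policies vary widely:

```python
import hashlib

def desensitise(record, sensitive_fields=("name", "email")):
    """Replace sensitive values with a deterministic, irreversible token.
    The same input always yields the same token, so masked datasets
    stay internally consistent, but originals cannot be recovered."""
    masked = dict(record)
    for field in sensitive_fields:
        if field in masked:
            digest = hashlib.sha256(str(masked[field]).encode()).hexdigest()[:12]
            masked[field] = f"masked-{digest}"
    return masked
```

Running production extracts through a step like this before they reach a cloud test environment addresses the data concern above without sacrificing realistic test data.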
SUMMARY
With so much potential, cloud solutions are here to stay. Cloud testing provides a cost‑effective solution that does not lock up your CAPEX. Even if your company has not started using cloud services, it probably won't be long before it realises that the risk is outweighed by the breadth of advantages. Microsoft also offers a great way of trying before you buy via the free Azure time included with an MSDN subscription: a fixed monthly amount of use‑it‑or‑lose‑it time that can be spent as you please. A perfect way to dip your toes in the water before you get your head in the cloud.
VALÉRY RAULET CTO TESTHOUSE LIMITED
Having started his professional career as a support engineer at Mercury, Valéry has continued working in the testing domain and has more than 10 years’ experience as a consultant in test automation and performance testing. With a Ph.D. in Computer Science, Valéry is also an accomplished trainer, author and a public speaker.
MORE THAN JUST BUZZ Are you really agile testing or is it just a buzzword you use for management? Sean Rand, Senior QA Engineer, Nephila Capital, explores how agile we truly are, using his own experiences as a temperature gauge.
Agile testing is a topic close to my heart. Over the past few years I have spoken about it at conferences and discussed it with industry leaders, companies and colleagues, trying to establish whether agile practices have really been adopted successfully in workplaces. My finding is that, in more than half of what I hear, they are nowhere near successful, which might be a shock to some. Agile testing has become a real buzzword in development and testing teams, and to this day employers request agile experience. But I ask: why? Surely a proven track record of fast, frequent delivery says more than knowing the principles of agile without having applied them successfully in previous roles? I'm a firm believer that agile testing and development methodology has brought us some great advancements in technology delivery models; however, because the word agile can be all encompassing, it has allowed us to hide behind it at times.
BUILDING AN AGILE TEST TEAM
Whenever I've built test teams in the past, and especially as I grow my team within Nephila Capital, I do not chase candidates who constantly push the words 'agile testing' in interviews without backing them up. I let them describe their delivery methods and workings
with developers/business users/product owners; only then, from their narrative, can I clearly judge whether they are a suitable candidate to work in the highly agile, test‑driven environment we work within. It's easy to say the words agile, delivery, testing, frequent, release, sprint; forming them into a structure that truly works for your industry and business needs is a different story. This leads us to the key fundamentals and principles I have found productive in ensuring the agile testing strategy delivers frequent, fast, on point, tested and accepted features released to production, giving business value via a measurable feature road map:
• Work in sprints, and make sure everyone from the CEO through to the intern knows what sprints are and why they run in pre‑determined time blocks.
• Sprint duration is aggressive: currently two weeks for the delivery of development, testing, user acceptance testing and the push to production. Thinking of taking a holiday? You'll probably miss a sprint, including a push to production; this shows how constantly the feedback loop turns.
• Expect at least 3‑4 releases to the test team during the given two‑week sprint. This every‑other‑day cadence is understood by the whole team.
• Sit tightly coupled. Tester, business user and developer sit a few feet from each other, or at least interact many times during the day remotely via video calls rather than just email.
• Our continuous integration (CI) build server, TeamCity, executes all our automated tests (unit, integration, GUI) for each sprint, feature or bug branch.
• Only green CI builds become release candidate branches; this gives the developer a faster feedback cycle when pushing code, ensuring regression issues haven't been introduced.
• The test team must strive to write automated tests before a line of code from a developer has been written.
• Ensure the business/product owner knows what tests are written upfront; have them write tests with the tester and share those tests with the developer at all times.
• QA is responsible for the quality of the testing and bug finding, but isn't responsible for overall feature quality: that is owned by the agile team.
• Push the limits and let the team know that it's OK to fail, as long as you learn from those shortfalls.
SEAN RAND SENIOR QA ENGINEER NEPHILA CAPITAL
Sean heads up the global QA team for Nephila Capital, a hedge fund manager. Sean leads the test strategy for the firm whilst also being very hands on in testing the big data platforms using a variety of languages and frameworks. He is an active member in the test community and looks to contribute to the Selenium and BDD projects via GitHub later this year.
• If you test the same path twice, automate it the third time: you don't want to waste time on a test path that can clearly be taken care of by an automated test.
• Manual testing is paramount in agile, especially in the form of exploratory testing.
• Ensuring a large regression automation suite is in place is key, allowing the tester to focus on the feature(s) being delivered.
• Finally, a really important notion in agile testing: be a core member of a successful delivery team, not a member of a development team! I prefer to deliver than to develop and never deliver. Food for thought for many of us, I'm sure.
I am fortunate to have worked for companies who have embraced all of the above and don't stand still in the space, but are always adapting and overcoming challenges to get on the front foot.
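The "tests before code" principle in the list above can be sketched in a few lines. This is an illustrative example only: the feature (`apply_discount`) and its acceptance criteria are invented here, not taken from the article. The tester captures the expected behaviour as an executable check first; the developer's implementation then has to make it pass.

```python
# The test is written first, from acceptance criteria agreed with the
# product owner; it fails until the feature exists.
def test_bulk_discount():
    # no discount on small orders
    assert apply_discount(quantity=1, unit_price=10.0) == 10.0
    # 10% off orders at or above the bulk threshold
    assert abs(apply_discount(quantity=100, unit_price=10.0) - 900.0) < 1e-9

# The developer then writes the minimal implementation that makes it pass.
def apply_discount(quantity, unit_price, bulk_threshold=50, rate=0.10):
    total = quantity * unit_price
    return total * (1 - rate) if quantity >= bulk_threshold else total

test_bulk_discount()
```

Shared with the product owner and developer up front, a test like this doubles as the feature's specification, which is exactly the point of writing it first.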
AN AGILE WORK ENVIRONMENT
As an example, I head up the global test team for Nephila, whilst actively being a delivery member, coding integration/GUI tests daily and forming strategic decisions for the team. Being a multi‑billion‑dollar hedge fund, Nephila lives in the financial world. Financial companies are notoriously a different beast of
company, with more regulations and red tape than a supermarket opening, when trying to move with the times. Nothing, however, could be further from the truth within Nephila, and that's why it's such an exciting and cutting‑edge place to work within the agile, technology and financial space. Within Nephila we work as described in the fundamental principles section: we're nimble, we're not scared to change direction, and we don't fear failing on features; instead we go again and become even stronger in our delivery. This is because we are not bound by 900‑page functional specifications that are out of date before they're even finished, giving us as testers no chance of signing off on a product. Thankfully everyone understands the power of being agile and delivering true value to the business frequently, and it makes for a really fantastic work environment. This model ensures the features we build are wanted by the business, relevant and ahead of the curve when pricing multi‑billion‑dollar deals across the funds for investors.
FINAL THOUGHTS
In my opinion, the main reason why agile testing and delivery fails within a company comes down to one keyword: communication. I work with a lot of very highly regarded
technologists, from Microsoft MVPs to Scala and Akka masters; these titles show that technically we have a gifted set of developers across our technology stack. However, a highly effective team is only possible thanks to the excellent communication skills of those team members and the business users. Without those communication skills, I truly believe you can employ people who are very deep in a skillset area but will ultimately fail due to the lack of soft skills. If you have managed to get this far with the article, firstly I'd like to thank you. Secondly, I ask that the next time you're testing a release candidate and find you don't quite understand parts of a feature well enough to test it rapidly, you consider what you would change to avoid being in that situation again, because time is precious when agile testing! No one answer fits all of the above, but if you keep asking what you would change in the process to best fit your environment, it generally comes back to better relationships and communication within the test team and the wider delivery team. I hope this article has triggered some thoughts about your working practices, questioning how to keep pushing forward and ensure faster, tighter test cycles within an agile delivery model. Please don't hesitate to reach out to me with any questions via LinkedIn.
Himagiri Mukkamala, Head of Engineering for Predix, GE Digital, looks at digital transformation within the industrial space, the industrial internet and what this means in terms of software development, security and testing.
Unless you've been living in a cave, you will have heard the phrase 'digital transformation' many times over the past few years – some may say too many. You may even have used it yourself, to describe the way in which companies are striving to change not just their IT systems and communication technologies, but their business processes and practices, to both cope with and gain advantage from the huge surge in data now routinely generated by the modern world. Chances are you'll also have an understanding of how big data, machine learning and the internet of things (IoT) fit into the digital transformation story, and of the many benefits and implications for consumers and businesses alike. You're less likely, however, to have seen or heard much about how all this is affecting manufacturing, power generation, mass transit, health services and other large, industrial‑scale businesses, which for a variety of reasons have come late to the digital transformation table. That, however, is changing, and changing fast, providing a whole new set of challenges for those involved in designing, developing and testing the software involved.
REASONS TO BE CAUTIOUS
HIMAGIRI (HIMA) MUKKAMALA HEAD OF ENGINEERING FOR PREDIX GE DIGITAL
Himagiri (Hima) Mukkamala is responsible for leading the Predix engineering organisation to deliver high quality software iteratively using agile & scrum principles. In his previous role at Sybase, Hima led various groups in the mobile platform organisation and played a significant part in the successful acquisition by SAP.
The fact that the technologies employed by the industrial world lag behind their consumer and enterprise counterparts should come as little surprise. To start with, whereas the typical business infrastructure might get replaced every two to three years and consumer smartphones upgraded annually, the technology behind the average power station or locomotive could easily be 20‑30 years old, even more in some cases. Additionally, such systems are crucially, often strategically, important, making security the number one concern, ahead of any possible benefits likely to accrue from the technologies and processes of digital transformation. Consider, for example, what would happen if power generation equipment or a jet engine were compromised by a denial of service attack. Worse still, what if those systems were held to ransom by extremists using CryptoLocker or some other form of ransomware? The chaos
that would ensue is unthinkable, and it is the main reason why traditional SCADA (Supervisory Control and Data Acquisition) systems are mostly designed to work locally, with so‑called 'air‑gaps' isolating them from the wider internet. The potential benefits of bridging those air‑gaps and connecting industrial systems directly to the internet are, however, growing ever more pressing, as the devices being controlled (assets, in SCADA speak) are engineered to deliver up data beyond that required for simple monitoring and management purposes. The carrot of commercial exploitation of big data is clearly being dangled here, with companies looking to exploit the information they already have and to develop new and valuable machine learning analytics and applications on the back of the greater connectivity. And that, in turn, is leading companies to push old‑style SCADA technology out the door in favour of a more modern industrial internet designed to use cloud technology to communicate with and manage large‑scale assets.
THE SAME BUT DIFFERENT
The aim of the industrial internet is, ultimately, to enable digital transformation on an industrial scale, and you could be forgiven for thinking that it bears more than a passing resemblance to the internet of things (IoT), where edge devices are similarly connected to monitoring and management intelligence in the cloud. The similarities, however, are mostly superficial, not least because of the huge amounts of data involved, even compared to the volumes predicted for some of the big data applications being developed to exploit the IoT. Other differences from the IoT include the complexity of the assets – a locomotive with 300,000 parts versus a smartwatch – and the velocity at which data is streamed from the various sensors on those assets. We're not talking about switching light bulbs on and off here, or monitoring CCTV cameras and other relatively simple applications, but about managing massive assets that can generate gigantic amounts of data. The sensors in just one jet engine,
Figure 1. Originally created to enable developers at GE to build custom APM applications for its own customers, the Predix platform has since been made more widely available and is already being used by over 8500 developers in companies such as Boeing, ConocoPhillips, Pitney Bowes and others to build large‑scale industrial APM systems.
for example, can produce terabytes of data every day, as can the hundreds of turbines in the average wind farm, water pumping stations and so on. More than that, the IoT approach would be to push all this data into the cloud and analyse it there, but that would be far too slow and impractical for a lot of industrial data. Far better to collect and pre‑process the data and run local machine learning analytics on gateways or on the control systems themselves, before making it available in a more digestible format, both for sharing with other gateways and for processing by applications in the cloud. Not only is this more efficient, it also helps address many of the security concerns, as the sensors involved need little in the way of exploitable intelligence, and the assets themselves are not directly accessible. It provides a kind of logical air‑gap, if you like, on top of which it also enables multiple layers of robust security to be put in place, providing protection far greater than is available with vanilla IoT technology.
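The gateway‑side pre‑processing described above amounts to condensing each window of raw sensor readings into a handful of statistics before anything is shipped to the cloud. A minimal sketch, with the function name and record layout invented for illustration (no real gateway API is implied):

```python
def summarise_window(readings):
    """Condense one window of raw sensor readings into the few
    statistics actually shipped to the cloud, rather than sending
    every raw sample. Each reading is a dict with 'sensor' and 'value'."""
    values = [r["value"] for r in readings]
    return {
        "sensor": readings[0]["sensor"],
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": sum(values) / len(values),
    }
```

A window of thousands of high‑frequency samples collapses to one small record, which is what makes cloud‑side processing tractable and keeps the raw stream, and the asset producing it, off the internet.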
A JOINED UP APPROACH
Of course, there's the inertia of the very conservative industrial market to overcome, but it's important not to dismiss the concept of an industrial internet as something that will never happen. Industrial companies may typically be slow to change, but
a growing groundswell of enthusiasm for something better than traditional SCADA solutions is driving the technology forward. However, it would not be ideal for every vendor to go it alone and develop its own implementation employing different, possibly proprietary, communication, security and other technologies. GE believes an integrated edge‑to‑cloud architecture that can enable GE and non‑GE assets and enterprises to participate in this transformation is critical, just as platforms were critical in enabling the consumer and enterprise digital transformations. It is important for developers to have access to platforms, tools and services designed specifically to build industrial‑grade asset performance management (APM) systems, rather than make do with general purpose alternatives. In this respect, existing platforms for enterprise cloud applications fall a long way short of what's required, especially in terms of scalability and security. Hence GE has put its weight behind Pivotal Cloud Foundry, a commercial implementation of the open source PaaS service already proving popular with cloud developers. Not only is the firm making a substantial investment in Pivotal Cloud Foundry, GE is also using it as the base of its own Predix platform, to host highly secure, industrial‑grade APM systems and provide
access to the specialist tools required to better develop, test, deploy and manage the software involved. Through Cloud Foundry, Predix provides access to all the tools needed to develop and deploy cloud‑based applications and analytics, and to do so at scale. It also provides developers with advanced analytics services to help both automate management processes and exploit the information currently locked up in conventional SCADA silos. Other advantages include the ability to work with any development platform and apply a modern continuous development methodology, avoiding the inherent delays and potential for errors that can arise with a cyclic code/test approach. This is particularly important in the industrial world, where minor issues can have potentially catastrophic effects, so Predix encourages a team approach to development with integrated testing and quality assurance at every phase of the process. Security services are also built in and treated as a priority at every stage, together with a strong DevOps approach to enable continuous development, implementation and operation at the industrial scale users of Predix expect to achieve. Sure, the industrial world is still playing catch‑up with the enterprise in its quest to reap the benefits of digital transformation, but it's getting there, and solutions such as Predix will see us achieving that goal sooner rather than later.
A NEW MODEL OF MANUFACTURING Colin Bull, Principal Consultant Manufacturing and Product Development, SQS, examines the great digitalisation of the manufacturing industry.
We envisage the traditional factory floor as forming around one central production line, with workers from all walks of life machining and assembling components into products as they move through the factory's somewhat grimy works. In the modern factory of today, however, the influence of lean and the emergence of the industrial internet of things (IIoT) have had a considerable impact. Gone are the grime and the mass of workers, replaced by a clean and sterile environment housing a mass of robots and automation‑assisted skilled workers, all under the constant observation of a new breed of tech‑savvy engineers.
CHANGE ON THE FACTORY FLOOR
Over the past 10 years or so, the traditional factory floor has undergone major changes, driven by automation and robotic assistance for the worker. However, to continue the transformation and to
become digital requires the convergence of information technology (IT) and operational technology (OT). Following this trend, back‑end IT has become prominent throughout the entire manufacturing process, encouraging traditional factories to become more IT savvy. This is a trend accelerated by the modern consumer's expectation that everyday, mundane objects should be connected and think autonomously. A prime example is the surge in demand for unique, personalised products, particularly within the automotive industry. This has forced factories to move from delivering identical batches to an approach that facilitates the mass customisation of individual products. Not only have manufacturers had to review the digital thread running through their product lifecycle and production strategy, they have also had to facilitate a new era of additive manufacturing. Additive manufacturing refers to the process of 3D printing, building a component from individually manufactured layers, and it places
ever more importance upon the need for manufacturers to embrace and implement digitalisation across the factory floor.
THE ‘SOFTWARE FIRST’ APPROACH
To embrace digitalisation, meet customer demand for connected products and ensure quality assurance is maintained, manufacturers must stop treating software as an add‑on. Traditional hardware manufacturers need to evolve their strategy to reflect changing attitudes and recognise the importance of building hardware around software, rather than bolting software on later down the line. With the IIoT providing an added layer of complexity, shop floor devices will often need to communicate across the supply chain, warehouse, logistics and factory floor through a variety of different platforms, such as 5G, wi‑fi and Bluetooth. This has forced manufacturing companies to think more like IT professionals than manufacturers, encouraging the adoption of a ‘software first’ approach. Companies traditionally built around the physical manufacture of their products already know how to test them from a hardware perspective – testing a braking system on a car, for example. They are not, however, used to testing software processes that integrate with the hardware, gateways, back‑end support systems, other software from their supply chain and external data sources.
PAIN POINTS HINDERING DIGITALISATION
Digitalisation as a whole has presented manufacturers with a host of challenges, particularly those who have been based on brownfield sites for decades. They are likely to have machines upwards of 30 years old that lack the capacity to function in a connected environment; this is an opportunity for production line integration specialists to provide solutions and enable the connectivity of
these older assets. What places even harder constraints upon digitalisation is that it doesn't just happen overnight, and within an environment that runs 24/7, 365 days a year, closing a plant down for days or weeks just isn't an option. If the machines can't run, they can't work, so fear of change has a strong hold on manufacturers' willingness to digitalise. As you can imagine, digitalisation is best managed by those with prospective greenfield sites to build on, as the entire factory can then be designed with the digital thread and the IPisation of products in mind. Such a phenomenon places even more strain upon hardware and software manufacturers. Traditional hardware manufacturers have had to keep pace by morphing themselves into software providers, a drastic change of approach for the industry, requiring a completely new way of thinking and working.
THE FUTURE OF THE MANUFACTURING INDUSTRY
The future of the manufacturing industry lies in the hands of qualified software quality assurers, tasked with ensuring the smooth running of the digitalisation process to meet the demands of the digital environment. Because the factory floor is unlike the working environments seen in common software testing, a specific new skill set is required within the software testing industry. To build that skill set, software specialists will need to partner with production line integration specialists to address the needs of the digital factory environment. Once this has been addressed, the fear of change must be tackled for the manufacturing industry to fully embrace a ‘software first’ approach to the convergence of IT and OT and so transform the factory floor.
COLIN BULL PRINCIPAL CONSULTANT SQS
Colin is the SQS manufacturing domain vertical consultant with a passion for quality enablement of digital transformation in manufacturing. He has over 26 years of industry and consulting experience within automotive, aerospace & defence, high tech and heavy equipment industries.
A PERFECT PARTNERSHIP At the end of 2015, and following a successful relationship spanning many years, Centre4 Testing and The Test People merged. Now rebranded as Ten10, the combined entity is the largest, fastest‑growing, privately‑owned UK software testing company.
The software testing landscape is changing, and suppliers are moulding themselves to meet increasingly complex customer demands. Cecilia Rehn, Editor of TEST Magazine, caught up with Miles Worne, CEO, and Ash Gawthorp, Client Engagement Director, to discuss what this merger means for Ten10's existing and new clients.

The Test People and Centre4 Testing merged last December. Tell us about the background of these firms, and the decision to join forces?
Ash Gawthorp: The Test People was created in 2007 and grew rapidly to become one of the UK's leading testing solutions and consulting firms. We focused on what were then the higher value testing disciplines – performance testing and test automation, as well as strategic test advice delivered in a variety of flexible ways. The merger has been very exciting, allowing us to meet the increasing demands from our clients, as well as positioning us to play an influential role in the market and improve the UK testing industry on a larger scale.
Miles Worne: Centre4 Testing was founded a bit earlier, back in 2004, and was focused on providing flexible contract resourcing. Then, as the market demanded a wider QA offer, the company successfully transitioned from a contract staffing business to a provider of a broad range of testing services.
There had traditionally been a very good, symbiotic relationship between The Test People and Centre4 Testing, with the firms supporting one another through the years. The possibility of merging had first been discussed a couple of years previously; however, at the time the combined business offering was not compelling enough. Both businesses have evolved significantly since those early discussions, and there was a recognition that together we could create a leading position in the UK testing market by providing a wide range of services: consultancy‑led solutions, managed services and cloud‑based testing capabilities. The merger is allowing us to capitalise on the changing trends in the UK testing space, such as focusing more on agile competency, and to provide technical services that can flex according to client need.
Essentially, we took two high performing businesses, and set about combining and complementing our relative strengths. The services, geographical locations and industry segmentations were all very complementary, allowing for an easy and smooth transition. We merged back in December, and we’ve been knee deep in integrating the two businesses since then. We’ve now pretty much completed our integration phase, which is all very exciting!

Tell us about the new name, Ten10.

MW: Our new name, Ten10, actually derives from a motor racing term, ‘ten‑tenths’, which means performing to your maximum potential. We really like this analogy and feel it reflects the ambitions of our talented people, our innovative and rigorous use of tools and technology, and our desire to help our clients embrace technological change and business transformation with total confidence. The analogy supports our objectives as an organisation.
Clients are looking for a faster return on investment – delivered in an iterative fashion – producing value from the start, to allow them to get their software into the hands of their users earlier
As part of the merger we’ve launched a completely new brand identity to support our new name.

As the largest, fastest‑growing, privately‑owned UK‑based testing company, what does this mean for your clients?

MW: As I said earlier, the merger has allowed us to meet the demands of a shifting testing landscape. When we look at the software testing market, we see it as pretty fragmented. You’ve got large players serving only large clients, lots of small and medium sized players serving SMEs, and boutique consultancies that struggle to scale. The way I see it, the testing industry has suffered from poor client experience due to industrialised and rigid service models. In our opinion, this service model is outdated. It’s important for us to deliver flexibility; to support our clients with a breadth of technical testing talent and services, according to individual client need; and to be able to flex and scale these bespoke offerings as desired through the duration of projects and beyond. We want to be seen as a nimble, scalable alternative to the more rigid offerings out there. We also recognised that clients are increasingly demanding more advisory
MILES WORNE, CEO, TEN10
Prior to joining Ten10 as CEO in September 2015, Miles spent six years as Managing Director EMEA & Chief Strategy Officer at Research Now Inc. In his earlier career he worked at Cadbury and Bain & Co, where he held a variety of general management and strategy development roles. Miles also worked in investment banking before studying for an MBA at Columbia Business School in New York.
TEST Magazine | July 2016
SUPPLIER PROFILE
Ten10 actually derives from a motor racing term, ‘ten‑tenths’ which means performing to your maximum potential
services from their testing partner. We are committed to attracting and developing the most talented people in the industry and through our people we provide our clients with a rigorous and creative approach to software testing. For us it’s about collaboration, and working holistically with our clients to enable them to embrace innovation and take on technological leaps with absolute confidence.
ASH GAWTHORP CLIENT ENGAGEMENT DIRECTOR TEN10
Ash was one of the co‑founding directors of The Test People. Ash comes from a background of hardware, software and performance engineering, having graduated with a degree in Microelectronic Engineering. Ash is responsible for the technical direction and innovation within Ten10 and his teams have a heavy focus on research and development.
AG: We’re also investing heavily in R&D, to ensure that our clients receive a progressive testing approach. It’s important to us to stay ahead of the game, so that we can continue to provide the best solution to our clients using the best‑fit tools, approaches and techniques tailored to a client’s needs. By combining the two firms, Ten10 now offers a 350+ strong, UK‑based team. We can offer our clients both on‑ and offsite flexible testing services, with the ability to deliver these at a local level through our offices across the UK, in Leeds, Manchester, London and Brighton.

How is the nature of testing changing?

AG: Testing has changed significantly over the last couple of years, and we see this reflected in how client requirements have changed. Whilst every client is different, there are common trends we see time after time: clients are looking for a faster return on investment – delivered in an iterative fashion – producing value from the start, to allow them to get their software into the hands of their users earlier. In addition, the variety of different platforms organisations deploy software onto has never been greater, especially when you consider the large array of mobile devices in the market – and there’s an ever growing set of frameworks and development environments clients can use to develop that software. On the server side we are testing in ever larger and more complex environments – consuming internal and external services, deployed on a combination of internal infrastructure through to public and hybrid clouds. This raises questions around test data and security not encountered before, as well as increasing the complexity of test environment deployment, configuration and management. We promote testing as early as you can, adding value and promoting software quality across the whole lifecycle.
MW: We hear it from our clients all the time; they are looking for a technically robust testing solution that they can deliver in an integrated fashion. There is also a strong demand for access to testing capabilities as and when required. They want to be met by true subject matter experts, and supported through agile resourcing models and excellent customer service.

AG: We’re seeing a change around the skillsets we’re looking for in people – specifically, reduced demand for siloed roles. The traditional skill spectrum, with an analyst at one end and a developer at the other and discrete roles in‑between, has largely disappeared – we’re experiencing a blending of those roles at the edges. We’re expecting people to maintain their own individual strengths but also to be able to do a little bit of everything. Ultimately we’re a people business; it’s paramount to be able to deliver key personnel ready to take on any jobs that today’s testing work requires.

How has mobile changed the testing market?

AG: We see mobile as a big challenge and a huge opportunity at the same time. Mobile development is continuously evolving, and testing is having to adapt to keep pace – it’s an exciting market to be in. The range of different devices and capabilities, as well as the rate at which new devices enter the market, is incredible. You can have a £50 device or an £800 device; both use the same operating system, but the performance and screen resolution vary tremendously, yet the app is expected to work and perform well across devices at both ends of the scale and everything in between. The smartphone is just one part of it, however – the network needs to be taken into account as well, both wi‑fi and mobile networks. It’s interesting that what were considered exceptional scenarios in the traditional desktop environment – loss of connection, failover, reconnection and resiliency – are the new normal in a mobile environment.
Our clients shouldn’t have to worry about this complexity – we can test against a range of devices used by their users, adding new ones as they enter the market and testing OS and browser updates under a variety of network conditions – ultimately managing the compatibility testing process and working with clients to continually improve it. All these factors conspire to produce a landscape where you need feedback quicker than ever before, with high quality testing at the centre of producing that feedback.

And finally, what does the future hold for Ten10?

AG: As we grow it’s crucial that we continue to attract and develop talented people. We’re currently investing heavily in our internal training academy, which we set up a few years ago. We are also committed to making advancements in R&D. We want to be the most effective partner to support our clients and their progress.
It’s interesting that what were considered exceptional scenarios in the traditional desktop environment – loss of connection, failover, reconnection and resiliency – are the new normal in a mobile environment
MW: We’re focused on continuing to serve the UK testing market from our many regional offices, and informing both existing and new clients of the enhanced capabilities we can bring to the table as Ten10. I think I can speak for everyone at Ten10 when I say that we’re all hugely excited about what the future holds. For more information about Ten10, please visit www.ten10.com.
THE LEGACY OF TESTING

The Financial Conduct Authority regulates 56,000 financial services firms and financial markets in the UK, in order to help protect the UK’s economy, citizens and businesses. TEST Magazine spent some time with the independent regulator to learn about the importance of stakeholder communications and what the legacy of testing is.
FINANCIAL SECTOR

Testing and QA at the Financial Conduct Authority is undergoing a transition. Cecilia Rehn, Editor of TEST Magazine, talked to Mike Jarred, Head of Software Testing at the FCA, about the new changes and why he sees the organisation as a big data company.

Tell us about yourself and your background?

I have not always worked in testing; when I left school, I had aspirations to become a tree surgeon, and I secured a place at the local agricultural college. In the months before my course commenced, I took a job at a motor factors, selling car spares to garages and body shops, helping to man phones, take orders and do stock control. As much as I wanted a job that would allow me to work outside, I was quite happy with what I was doing and decided to stay. My career then progressed in the motor industry, and my last position before moving into IT was as an assistant parts manager for a Jaguar and Rolls Royce dealership. I moved into IT in 1989, when a provider of automotive data needed industry experience in its data analyst roles as it brought its first digital products to market. I fell into testing as they then needed people with the industry expertise and data knowledge to make sure the software applications worked correctly. I developed this role (testing was a less understood discipline in 1989 than it is today!), became increasingly interested in testing, and saw that it was a natural fit for me. I have subsequently worked in the financial, retail, insurance and pharmaceutical & life sciences domains in both senior testing and development roles, and I joined the FCA in 2015 as Head of Software Testing.

You started at the FCA in 2015, how has the first year been?

It’s been really good. I’ve been delighted with the capability of my team; they’re a really experienced group of people who provide me with incredible support in my testing role. There are a lot of things that have changed, which I will explain in more detail.
The way we’re structured in the FCA is that I have a permanent team of 11 senior experienced test managers. My team own the strategic approach to testing in the FCA as well as the standards and the approaches that we ask our suppliers to follow, and we
assure the quality of the work conducted for us by our suppliers. The strategy for testing continuously evolves and is informed by taking a long term view of our portfolio of work, and the changing nature of the technology and delivery models that we need to introduce to effectively support our business. The strategy is also informed by listening to our stakeholders, and a key principle that underscores our strategy is to provide a value for money service. We have a dynamic and varied portfolio of work which we provide testing services for. To illustrate this, we had around 200 people conducting testing at our peak last year, and currently it’s close to 130. We need a flexible provision of testers as our releases vary in size and complexity month by month. The test group in the FCA is part of the Business and Technology Solutions division. Despite the FCA being an organisation that regulates financial services firms, I believe that BTS has more in common with a big data company. We have financial services domain experts in our business, and it is the accumulation and analysis of data – some captured by us, some sent in by the firms we regulate – that enables the FCA to conduct effective regulation. Therefore, on the tech side, we closely resemble a big data organisation.
We have financial services domain experts in our business, and it is the accumulation and analysis of data... that enables the FCA to conduct effective regulation. Therefore, on the tech side, we closely resemble a big data organisation
Tell us about what changes you have implemented and why? What was your first priority?

When I started I felt we were too entrenched in a centre of excellence model, which had standardised the delivery of testing approaches that did not always provide value for money or adequately address risks to the operational service of our applications when changes were made. It felt too prescriptive, with a one‑size‑fits‑all approach. One of my first aims was to evaluate how we engaged with our stakeholders and make sure we were delivering the right service, at the right time, against an agreed set of risks we were looking to mitigate. We now provide testing services to projects through a new mechanism called ‘consultative engagement’, which ensures conversations around risk with project stakeholders take place. The output of this engagement is a risk profile which informs our testing approaches as agreed mitigation for the key areas of concern for the project stakeholders.
MIKE JARRED, HEAD OF SOFTWARE TESTING, FINANCIAL CONDUCT AUTHORITY
Mike Jarred has headed up the software testing department at the Financial Conduct Authority since 2015. Prior to this, he held testing and development roles in the financial, retail, insurance and pharmaceutical & life sciences domains. Mike will be a keynote speaker at The Software Testing Conference NORTH, taking place 27‑28 September in York, UK.
Stakeholders all have different needs and generally speaking, testers need to provide information of value to stakeholder groups so they can understand the value of testing
We have introduced visualisation tools to show the level of risk attached to projects and the commensurate testing effort as mitigation. It is a simple bubble chart which makes it easy to draw comparisons between approaches for similar pieces of work. One of the great benefits of the consultative engagement approach is that it helps with perceptions, as we can demonstrate where testing adds value.

How did the various stakeholders perceive testing?

One of my first priorities was to identify our stakeholders, and learn what they wanted from us. By mapping out our stakeholders it became apparent that there were numerous groups that the test team didn’t fully appreciate had a stake in the efforts of the test group. It was important for us to understand our stakeholders and what they felt testing needed to do. Overall, our stakeholders were happy with the services we provided; however, there was an opportunity to address concerns over the value of the test group and the scope of our services. Stakeholders all have different needs and, generally speaking, testers need to provide information of value to those specific stakeholder groups so they can make an informed opinion
on whether the testing conducted on their behalf represents good value for money. We have addressed this by using, amongst other things, control charts to demonstrate the level of protection against operational failure that the test group provides. If you don’t engage with stakeholders with information – including, in most instances, some data – which supports your narrative of how you help them achieve their goals, negative perception is inevitable as their views become subjective. As I believe Deming said, “Without data, you are just another person with an opinion”.

What other changes have you seen at the FCA?

We have transformed the way we gain confidence that our testing services are helping project teams to deliver great working software. We had a governance model which was checklist based, and which largely focussed on ensuring that test documentation was created and signed off at requisite times for stage gate signoff. This was important, but it did not ensure that the right testing was happening at the right time, and it did not provide confidence in the quality of testing. Now that we carry out consultative engagement, our approach has shifted to one of test assurance, where we can explore what testing is taking place and why. We changed the terms of reference of the test assurance model so that it creates a supportive environment where test managers can come for advice and direction, and the members of the test assurance board can be used for any items that require escalation so we can aid delivery. We have also included some of our stakeholders in the test assurance board in order to improve visibility of our test approaches; in particular, including our operations team in the model has improved the flow of software changes into the production environment. We are also now focussing on defect prevention, and we are calling out the cost of rework, to help ensure we build a culture of quality.
We are mining the data coming out of testing, to help identify areas for improving the way we build software, and running these improvement projects using lean and six sigma tools to demonstrate the value of these improvements.
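The control charts Jarred mentions boil down to a simple calculation: a process mean plus control limits, with points outside the limits flagged for investigation. A minimal sketch follows; the escape counts and the simple individuals‑chart limits are illustrative assumptions, not the FCA’s actual method.

```python
# Sketch of the control-chart idea: defect escapes per release are compared
# against the process mean and three-sigma control limits. The counts below
# are invented purely for illustration.
escapes = [4, 2, 5, 3, 4, 6, 3, 2]  # hypothetical escapes per monthly release

n = len(escapes)
mean = sum(escapes) / n
sigma = (sum((x - mean) ** 2 for x in escapes) / n) ** 0.5

ucl = mean + 3 * sigma            # upper control limit
lcl = max(0.0, mean - 3 * sigma)  # defect counts cannot fall below zero

for release, count in enumerate(escapes, start=1):
    note = "  <- out of control, investigate" if not lcl <= count <= ucl else ""
    print(f"release {release}: {count} escapes "
          f"(mean {mean:.2f}, limits {lcl:.2f}-{ucl:.2f}){note}")
```

A point above the upper limit signals that the release process has drifted, which is exactly the “level of protection against operational failure” argument a test group can take to its stakeholders.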
Finally, as we deliver more software using continuous delivery models and agile engineering practices, we are also implementing a strong strategy to improve test automation and non‑functional testing, which we see as key to unlocking problems in different areas, such as CI and integration testing. I have a great relationship with our application development manager, and we are continuously looking at how we can distribute testing across the development and testing organisation, and employ effective methods to build high quality working software.

The FCA is a regulator, held accountable to the Treasury and to Parliament. How does this affect testing strategy?

We are funded entirely by the firms that we regulate. The FCA is an independent body and we do not receive any funding from the government. As a regulator I am very aware that we need to provide value for money. A core element of my role is to ensure we carry this ethos of value for money and adopt working practices that enable us to save money, or proactively avoid unnecessary cost. The test group keeps a record of where we save money, including tracking the costs of suppliers, direct hires, tooling arrangements and other operational costs. In 2015, the testing department won an internal award for our cost savings efforts, which was great recognition for my team.
The legacy of testing is information. There is a vast amount of information stored inside testing management tools and inside support teams, which can be used to improve the way software is built

What does the future of testing look like at the FCA?

There are a number of initiatives that excite me. We have recently selected Cognizant as our principal supplier for application development and testing services, and this creates an opportunity to ensure effective collaboration in how we build great software. This move supports our ability to deliver software using the continuous delivery model, and it makes sense to review how to distribute testing across the delivery teams, and refresh the tools we use to support delivery.

We’ve also built a testing community, which allows for the exchange of ideas and information internally. We bring in external speakers for mini conferences, to help develop and stimulate our team. I am going to continue to build links between our testing group and the broader testing community, something we started last year, through attending conferences, meet‑ups and forums.

One last thing, how do you see the test management role evolving?

My message to testers everywhere is to evolve your thinking, and to consider the legacy of testing. The legacy of testing is information. There is a vast amount of information stored inside testing management tools and inside support teams, which can be used to improve the way software is built. If you can provide data, analytics and insight to assist in improving software engineering, this can help to reduce rework. Ultimately, rework is waste.

Minimising rework to reduce the operational cost of building software is a conversation which testers can own, and can start with their peers in an organisation and, importantly, with their managers. Detecting defects is valuable, but comes with a cost associated with poor quality. If testers can communicate and demonstrate knowledge of how to create a better product with less rework, then perceptions of our role will rapidly change.
BANISHING MISINFORMATION

Andrew Ayres, Director – UK & Ireland, ROQ, dispels three central myths around test automation.
TEST AUTOMATION
Automation is undoubtedly one of the hottest topics in the testing arena at present, with organisations wrestling with the big questions: When? How? And how much? Some technology vendors and consultancies are describing test automation as a panacea – a potentially confusing standpoint. I’d like to start off by defining what test automation is, and I particularly like the definition volunteered by Kolawa and Huizinga, who state that in software testing terms: “Test automation is the use of special software (separate from the software being tested) to control the execution of tests and the comparison of actual outcomes with predicted outcomes.”1 The authors go on to say that test automation “can automate some repetitive but necessary tasks in a formalized testing process already in place, or add additional testing that would be difficult to perform manually.”1 So far, so good – software is controlling and enhancing the execution of tests previously performed manually, by expensive human beings. When you consider the primacy of software in the digital era, this type of automation can drive significant business value and compelling cost benefits for those who are able to make it work. Amongst other things, release cycles are shortened, customers (internal and external) get software that works sooner, and the costs of testing reduce as capability increases – a ‘royal flush’, right?
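That definition — software controlling the execution of tests and comparing actual with predicted outcomes — can be sketched in a few lines. The function under test and its cases below are invented purely for illustration:

```python
# Minimal illustration of test automation: a harness (separate from the code
# under test) executes cases and compares actual outcomes with predicted ones.

def discount(total):
    """Hypothetical function under test: 10% off orders of 100 or more."""
    return total * 0.9 if total >= 100 else total

# Each case pairs an input with its predicted outcome.
cases = [(50, 50), (100, 90.0), (200, 180.0)]

for given, predicted in cases:
    actual = discount(given)
    status = "PASS" if actual == predicted else "FAIL"
    print(f"discount({given}) -> {actual} (expected {predicted}): {status}")
```

Real frameworks (JUnit, pytest, Ranorex and so on) wrap exactly this loop in reporting, setup/teardown and scheduling — but the core remains: controlled execution plus automated comparison.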
DECISION TIME

Businesses have to make some conscious decisions – some of them requiring deep analysis – to ascertain where automation could play a role in their context, when they should approach it and how much of it they should consider. Often, these are seen as time‑wasting steps rather than necessary precursors, and firms put the cart before the horse – adopting a spray‑and‑pray approach to automation – which seldom works. Three simple myths have given rise to this approach, and each can be dispelled.
YOU CAN AUTOMATE IT ALL... NOW

Certain aspects of testing benefit from the continuous involvement of people – to design the approach, to manage the execution and to report on the progress and implications of the findings. In the main, the principal beneficiaries of software that works are people. People, therefore, are best placed to design the testing approach for any given application – working with the business sponsors to understand the drivers for the enhanced capabilities, and working with the development team (who could be external to the organisation) at the beginning of the application development lifecycle to understand key deadlines and project‑specific nuances. In so doing, test practice leaders will understand the components, constraints and dependencies of the work – and will be able to make logical judgement calls on which aspects of the task could benefit from automation. They are unlikely (I’d hope) to conclude that all of it should be, or could be, automated. Instead (I’m still hoping here), they would split the task into chunks – based on volume, complexity and other factors – arriving at the elements that lend themselves to automation. Their involvement, however, does not stop there – automated tests have to be developed with a ‘human touch’; how else can we expect our behaviour to be accurately emulated (which is all that is really happening) by machines? The point is that this takes time, and these considerations are not arrived at easily. You cannot automate it all, nor can you do it instantly.
Automation should be properly considered…it’s not necessarily the answer to all ills and it cannot be implemented instantly
ANDREW AYRES, DIRECTOR – UK & IRELAND, ROQ
Andrew is Director – UK & Ireland at ROQ, having previously held roles with information technology consultancies in the UK, Europe and the Middle East – working closely with Boards of Directors, CIOs and their teams to make digital transformation a reality. He holds a Post‑Graduate Diploma in Information Management from The Open University and an MBA from Manchester Business School, University of Manchester.
There have been some particularly contentious articles written recently, mainly within the blogosphere – hinting at imminent tester obsolescence. This is not going to happen, but there are certainly changes coming down the track
TECHNOLOGY IS KING

Vendors of all shapes and sizes are promoting the view that their solution is the ‘big round button’ that organisations can simply press and shout: ‘Voila! – We’re automated!’ [APPLAUSE]. But given the human involvement in automation and the project‑by‑project decisions that should be taken, this view is folly. I do have to stress that I am fundamentally not saying that technology is unimportant here – merely that there is no silver bullet, no workaround, no shortcut, however good the solution is. Technology here is just the road – it’s the choice of destination and route that matters most.
TESTERS WILL BECOME EXTINCT

There have been some particularly contentious articles written recently, mainly within the blogosphere, hinting at imminent tester obsolescence. This is not going to happen, but there are certainly changes coming down the track. The testing industry is going through its own period of “creative destruction” – a term first coined by the economist Joseph Schumpeter, which describes “the incessant product and process innovation mechanism by which new production units replace outdated ones.”2 This is happening in all industries – driven by technology advancement – and test automation will mean that machines perform work that humans currently perform. This does not mean, however, that those very people are then ‘surplus to requirements’ – they must be reassigned to other testing activities that are often overlooked due to time/cost constraints, and retrained if necessary to deliver the ‘human touch’ where it relates to automation.

CONCLUSION

The main conclusion to be drawn is that automation should be properly considered: it is not necessarily the answer to all ills, and it cannot be implemented instantly. Those considering it should carefully distinguish between the noise and the hype to find a common‑sense approach, grounded in reality and based on their specific context. I realise that the temptation to jump ahead is powerful, but I also understand the negative effects of doing so.

References
1. Kolawa, Adam; Huizinga, Dorota, ‘Automated Defect Prevention: Best Practices in Software Management’. Wiley‑IEEE Computer Society Press (2007), p. 74.
2. Schumpeter, J., ‘Capitalism, Socialism, and Democracy’, New York: Harper & Bros. (1942).
SECURING CODE TO FIGHT CYBER CRIME

Amit Ashbel, Cyber Security Evangelist, Checkmarx, explains why automated application security testing is the first step in combating cyber crime.
The world is moving at an incredible pace. New technologies are regularly announced and whole ecosystems developed around them, such as the internet of things (IoT). However, with these new developments come security risks to both businesses and consumers; hacking and cyber crime are now widely reported. The first step to combating these increased risks is to secure the application code in order to stop vulnerabilities at the root. Automated application security testing is a vital part of this, but how does automated testing work in practice, and what are the benefits of an automated testing process for developers and businesses?
THE CONNECTED RISK

To understand the risks we are living with today, we just have to look at the IoT: a hugely beneficial technology, but every IoT device created – cars, smart TVs, baby monitors – presents another attack surface for hackers. And with Gartner claiming that we will see over 25 billion connected things by 2020, that’s a lot of attack surfaces. Unfortunately, there is little to no regulation around
creating these devices, so security protocols are often absent. This means that when consumers are prompted to type in email IDs, phone numbers, full names and so on, they are doing so at great peril if there is a flaw in the application layer code. Uncovering and fixing vulnerabilities in the application code is a complicated task if it is not done as part of a structured methodology, using the correct tools and building the correct in‑house skill sets.
MANUAL DEVELOPMENT TESTING

The normal process for software development, or the software development lifecycle (SDLC), has five main stages: design, development (coding), testing, deployment and maintenance. Within this lifecycle, many organisations still employ the manual testing methods that have been in use for decades. This manual process involves three core stages. The first stage is synching, whereby the security teams interview the development teams to ensure they are familiar with the application. The second is reviewing, whereby the whole team reviews the entire
application code in order to draw on everyone’s field of expertise. The third is reporting, when each tester reports their findings. Once the false positives are eliminated, the findings are given to the developers. This is a lengthy process, and developers, who are measured on time, have often moved on to other projects and are by then far less familiar with the application and its code; the vulnerabilities have simply been detected too late. This is the root of the complicated choice between fixing bugs or fixing vulnerabilities.
BUGS VERSUS VULNERABILITIES

IoT products and updates are being released at an ever‑increasing pace, and when security is only thought about late in the development process, as with manual testing methods, there is usually only time to fix either the bugs that could negatively impact the user’s experience or the vulnerabilities that could impact the security of the application – not both. Unfortunately, in the competitive technology marketplace, the vendor often chooses to fix the bug and come back to the vulnerability later rather than delay release. It is this unsecured root application code that presents the risk, and it is why security needs to be built into the development process.
AUTOMATED TESTING PROCESS
Rather than waiting until the development process is finished to test for flaws, automated application security testing is integrated into the development process and provides constant feedback, transforming an SDLC into a secure SDLC (sSDLC). Each change to the code is automatically analysed and developers are notified immediately about any vulnerabilities, so vendors have a much better chance of preventing the vulnerabilities that hackers capitalise on. This also has the added benefit of encouraging better coding practices among the developers and building up their secure coding skills. Static application security testing (SAST) solutions, or white box testing, can be integrated into all types of developer environments, including waterfall, continuous integration and agile/DevOps, which is gaining in popularity for more complex projects. Static code analysis (SCA), a leading SAST approach, can be easily integrated into build automation tools and continuous integration servers, and tools in this space normally have wide programming language support, which helps organisations that use a variety of programming and scripting languages on multiple frameworks.
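To make this concrete, below is a deliberately minimal sketch of the kind of rule a static analyser automates: parse source code into a syntax tree and flag call sites known to be injection risks. This is an illustration only, not any vendor's implementation; real SAST tools apply thousands of such rules with data‑flow analysis on top.

```python
import ast

# Toy rule set: calls that commonly indicate injection risk in Python.
RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str):
    """Return (line, name) pairs for each risky call site in the source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        # A direct call like eval(...) appears as Call(func=Name('eval')).
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in RISKY_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

snippet = "user_input = input()\nresult = eval(user_input)\n"
print(find_risky_calls(snippet))  # [(2, 'eval')]
```

Hooked into a continuous integration server, a check like this runs on every commit, which is exactly how an sSDLC surfaces vulnerabilities while the developer is still familiar with the code.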
ADDED BENEFITS
From an organisational perspective, automated application security testing offers better ROI than manual testing, which consumes far more resources, particularly in scanning and analysing the code. With pen‑testing, for example, a black box manual testing method, the testing is both expensive and time‑consuming, with the added downside that it will only ever be as good as the individual pen‑tester. Automated application security testing takes less time and fewer people to scan the code, and unchanged code does not need to be re‑scanned, which saves more time while avoiding the human error factor of manual testing. It is also better at locating the leading application‑layer vulnerabilities, and more capable of detecting code errors, dead code and other flaws that lead to buggy software.
THE FUTURE FOR APPLICATION SECURITY
Application security (AppSec) is a relatively new sector; according to the recent SANS State of Application Security report, only 26% of respondents consider their AppSec programmes to be 'mature' or 'very mature'. But with 30% of respondents assigning responsibility for security testing to the development team, up from 22% last year, things appear to be progressing in the right direction. The more organisations build security into the SDLC, the more application code will be developed free from vulnerabilities, and therefore safer for both organisations and consumers. The report also highlighted that a majority of organisations (61%) expect spending on AppSec to increase, which is more good news for the sector. With an ever‑increasing number of new and updated applications being released, organisations cannot afford to underestimate the potential risk of every single one. The only way to stop this risk at source is to employ automated application security testing to stop vulnerabilities at the root: the application code. This is the first step in fighting, and beating, cyber crime.
AMIT ASHBEL CYBER SECURITY EVANGELIST CHECKMARX
Amit has been part of the security community for more than a decade, taking on multiple roles over the years, including technical and senior product lead positions. He brings valuable product knowledge, vast experience with a wide range of security platforms, and familiarity with emerging cyber threats and attack trends.
T E S T M a g a z i n e | J ul y 2 01 6
THE NEW VIRTUAL (DATA) REALITY
Ash Ashutosh, CEO, Actifio, reviews why virtualising data for software testing is paramount.
Today's organisations have probably virtualised some, or all, of their servers. They've probably virtualised some, or all, of the network. But how much of their data has been virtualised for use in software development? Although they were once new concepts that seemed outlandish, server and network virtualisation brought a number of benefits to the companies that adopted them and eventually became the de facto standard approach to server and network deployments in today's enterprises. We'll soon be seeing the same for data virtualisation, particularly for development teams, as the benefits become clear to anyone developing and testing software. Software development has become more
and more critical to the success of modern enterprises, and testing is a vital part of the software development process. The time and money spent on testing, and its efficiency and quality, therefore have a direct impact on the quality of a particular software release and the experience of those who use it. When it comes to testing, a test engineer has the following tasks: preparing the test environment, performing the tests and reporting on the results. Preparing the environment often carries huge time and resource costs, and the rigour and quality of testing can often be directly related to the investment made here. The majority of the time spent in the testing process today goes into the preparation of the test environment, rather than into the testing itself. Preparing a physical copy of production data for the test infrastructure, masking sensitive data or re‑creating synthetic data all consume time and resources without directly contributing to the task and quality of the actual testing. This is where copy data virtualisation can contribute: by optimising and automating this process it reduces the time required, the storage needed and the chances of human error. While the economic benefits of adopting copy data virtualisation are far reaching, from increased control of data copies to time and cost savings, it offers software testers a number of key advantages during the development process.
Copy data virtualisation creates a single physical 'golden' copy of production data, from which any number of virtual data copies can be provisioned, literally within minutes, available on demand for testing anytime, anywhere. The result? Testing can take place with full virtual copies of up‑to‑date data at any time: a huge leap compared with traditional approaches.
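The economics behind this can be illustrated with a toy copy‑on‑write model (a sketch under simplifying assumptions, not Actifio's actual implementation): every virtual copy reads through to the shared golden copy and keeps only its own changes, which is why provisioning is near‑instant and the extra storage is tiny.

```python
class GoldenCopy:
    """Toy model of copy data virtualisation: one physical 'golden'
    copy of production data, many lightweight virtual copies."""

    def __init__(self, data: dict):
        self._data = data  # the single physical copy

    def provision(self) -> "VirtualCopy":
        # No data is copied here: provisioning is effectively instant.
        return VirtualCopy(self._data)


class VirtualCopy:
    def __init__(self, base: dict):
        self._base = base   # shared with every other virtual copy
        self._delta = {}    # copy-on-write overlay, private to this copy

    def read(self, key):
        # Local changes win; otherwise fall through to the golden copy.
        return self._delta.get(key, self._base.get(key))

    def write(self, key, value):
        self._delta[key] = value  # never touches the golden copy


golden = GoldenCopy({"customer": "Alice", "balance": 100})
test_env = golden.provision()
test_env.write("balance", 0)               # mutate the virtual copy only
print(test_env.read("balance"))            # 0
print(golden.provision().read("balance"))  # 100 (golden copy intact)
```

Each test run gets its own overlay, so many test environments can share one physical copy without interfering with each other.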
PROTECTED DATA
Virtual copies of data can also be integrated with masking software. This allows engineering and test teams to self‑serve, instantly accessing masked, controlled virtual copies of data; it speeds up development cycles by cutting waiting time, and reduces bugs through the use of high‑fidelity virtual copies of production data. It also ensures sensitive data never leaves the production environment. Additionally, workflows can automate the process, removing human involvement, the potential for error, and the time‑ and capacity‑consuming process of re‑creating and provisioning physical copies of data.
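One common masking approach, sketched below under assumptions of our own (the field names and the `mask_record` helper are illustrative, not any particular product's API), replaces sensitive values with stable pseudonyms, so masked data stays internally consistent across tables while real values never leave production:

```python
import hashlib

# Assumed set of sensitive fields; a real deployment would derive
# this from a data classification policy.
SENSITIVE_FIELDS = {"name", "email"}

def mask_record(record: dict) -> dict:
    """Replace sensitive values with deterministic pseudonyms.

    The same input always maps to the same token, so joins between
    masked tables still line up; non-sensitive fields pass through.
    """
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            token = hashlib.sha256(str(value).encode()).hexdigest()[:8]
            masked[key] = f"{key}_{token}"
        else:
            masked[key] = value
    return masked

row = {"name": "Alice Jones", "email": "alice@example.com", "balance": 100}
print(mask_record(row))
```

Because the pseudonyms are deterministic, a masked virtual copy still behaves like real data under referential‑integrity checks, which is what keeps test fidelity high.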
WIN TIME
Data virtualisation slashes provisioning time for the data required during test and development. Provisioning virtual copies of data of any size takes no more than a few minutes. With the test team able to focus on the testing itself, and spending less time waiting for operations to produce sets of physical test data copies, more testing can be done, more thoroughly, with better quality results in less time. Integration with other automation and testing tools means complete host environments, including operating systems and application code, can all be provisioned along with the required virtual test data.
ASSURED QUALITY
As data virtualisation offers the ability to create snapshots of virtual copies, test engineers can create point‑in‑time virtual copies of data and pass the full state of a system back to developers when a defect is found, making it easier and quicker for an engineer to identify the cause and correct the defect.
REDUCED COST
As virtualised data requires much less storage and physical infrastructure, the cost for any organisation is significantly reduced. On a broader scale, less time spent monitoring and maintaining a physical infrastructure, as well as reduced downtime, means engineering and test teams place much less of a support burden on the operations department. The time freed up by data virtualisation also means the team can focus on more important projects that move the business forward and improve efficiencies across the organisation.
COMPLIANCE
The benefits of copy data virtualisation don't stop here. To comply with standards for data protection, IT departments adopt a number of overlapping technologies, such as software for backup, snapshots, disaster recovery and more. Data virtualisation removes the need for these redundant technologies by creating virtual, on‑demand data copies, and so enables the optimisation of all the above processes. Several virtual test environments started simultaneously on a single physical machine, or group of machines, considerably increase the flexibility of the IT infrastructure and the efficiency of hardware usage. Bringing all of these benefits together, the ultimate result is that copy data virtualisation in a software testing environment allows more thorough testing in a shorter time, at less cost and with added value. This in turn leads to reduced costs, increased speed and faster releases with better quality. It's exactly what any organisation is striving for.

[Figure: clock‑face diagram of virtual data copies provisioned from a single golden copy.]
ASH ASHUTOSH CEO ACTIFIO
Ash is a recognised leader and architect in the storage industry where he has spearheaded several major industry initiatives and led the authoring of numerous storage industry standards. Ash remains an avid supporter of entrepreneurship and is an advisor and board member for several commercial and non‑profit organisations. He holds a degree in Electrical Engineering and a Masters degree in Computer Science from Penn State University.
FINDING BUGS THROUGH PSYCHOANALYSIS Faisal Qureshi, QA Engineer, Amazon, reveals how testers can leverage developer psychology.
A developer's psychology can play a large role in finding defects. A tester can use the developer's persona, mood, current circumstances, environment, relationships with others and intelligence to help guide him or her to where in the code bugs are likely to exist. If multiple developers are working in the same code base, the tester can use such information to scrutinise certain parts of the code more closely depending on which developer was involved, based on his or her psychology and current state of mind. These factors can also suggest what type of defect each developer is likely to introduce.
THE METHOD
To take advantage of the developers' psychology and circumstances while they are coding, the tester must observe the developers closely during development. This is also a function of time: the tester should take note of these factors each day of development and associate them with the code written during that day. For example, a developer may be sick one day, causing lethargy, which may increase the likelihood of finding a bug in the section of code written by that developer during that time. Psychology and circumstances vary from developer to developer, so the tester must build a model for each developer on a daily basis. It is essential to map each developer's state on a given day to the code he or she produced that day. Creating a mapping table will help on both long and short projects, in both agile and waterfall methodologies.
Developer | Psychoanalysis – 1/1/2016 | Code line numbers – 1/1/2016
John | Generally overconfident, had a late night | 50‑100 (Cluster.java)
Marcus | Usually keeps to himself, no unusual behaviour today | 100‑250 (Cluster.java)
Sarah | Usually a happy person, was extremely friendly today | 250‑350 (Cluster.java)
SCENARIOS

THE OVERCONFIDENT DEVELOPER
By determining that a developer is overconfident, a tester can find bugs in his or her code, as the flaws of being overconfident can be transmitted to code. The developer may think “there can be nothing wrong with my code”. As a result, the developer may neglect to test or double check the code. The tester should take advantage of this. The tester should also review the history of the developer introducing defects in previous code. Since the developer may not have tested the code before checking it in, the tester should perform full and comprehensive testing in the new code modules. Furthermore, because the developer thinks he or she is always right, he or she may also neglect to consider edge case scenarios.
THE HAPPY‑GO‑LUCKY DEVELOPER
A happy‑go‑lucky developer can be heedless, and thus introduce careless errors. Due to this disposition, the developer is likely to be less critical, internally or externally. Even if all the code is functionally correct, he or she may overlook other aspects of the software such as performance or security. The tester should therefore perform various types of testing, including penetration testing, performance and load testing, and compatibility testing. From the developer's perspective, if the code is functional, everything else must be "OK".
THE UNHAPPY DEVELOPER
Whether the developer is generally unhappy, or has come out of a poor performance review with code still to write, an unhappy developer is likely to be unmotivated.
Developer | Psychoanalysis – 5/1/2016 | Code line numbers – 5/1/2016 | Bug finding likelihood – 1/1/2016 | Bug type likelihood – 1/1/2016 | Bug finding likelihood – 5/1/2016 | Bug type likelihood – 5/1/2016
John | Generally overconfident, was angry after meeting with the manager | 1‑10 (Mapping.java) | HIGH | Edge cases | MEDIUM | Integration
Marcus | Usually keeps to himself, looked upset today | 10‑50 (Mapping.java) | LOW | Functional (doesn't meet specification) | MEDIUM | Any (untested code)
Sarah | Usually a happy person, saw her reviewing the requirements again | 50‑200 (Mapping.java) | LOW | Performance | LOW | Bug due to unclear documented requirements

Figure 1. An example of a mapping table.
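The mapping table translates naturally into a small data structure. The sketch below is illustrative only: the records mirror Figure 1, and `priority_targets` is a hypothetical helper showing how a tester might rank which code regions to scrutinise first.

```python
from dataclasses import dataclass

@dataclass
class DailyObservation:
    """One row of the developer-to-code mapping table (cf. Figure 1)."""
    developer: str
    date: str
    psychoanalysis: str
    code_lines: str
    bug_likelihood: str  # "HIGH", "MEDIUM" or "LOW"
    bug_type: str

observations = [
    DailyObservation("John", "1/1/2016",
                     "Generally overconfident, had a late night",
                     "50-100 (Cluster.java)", "HIGH", "Edge cases"),
    DailyObservation("Sarah", "1/1/2016",
                     "Usually a happy person, was extremely friendly today",
                     "250-350 (Cluster.java)", "LOW", "Performance"),
]

def priority_targets(entries):
    """Order code regions for review, highest bug likelihood first."""
    rank = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}
    ranked = sorted(entries, key=lambda e: rank[e.bug_likelihood])
    return [(e.developer, e.code_lines) for e in ranked]

print(priority_targets(observations))
# [('John', '50-100 (Cluster.java)'), ('Sarah', '250-350 (Cluster.java)')]
```

Kept up to date daily, a record like this turns soft observations about mood and behaviour into a concrete, reviewable testing plan.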
First, this means the amount of code the developer writes will not be large. Second, it means that he or she will not care too much if the code has errors. Thus, as the developer extends his or her code base, this could be a bug farm. Even when defects are found, and the same developer fixes them, regression testing is vital.
THE INTROVERTED DEVELOPER
An introverted developer will hesitate to communicate with other team members. If this developer has failed to properly review the requirements, and by nature will not clarify them with the other developers, functional defects can arise from not correctly understanding what the code is supposed to do. Furthermore, by potentially not knowing the specifications, other types of defects can manifest, such as not meeting service level agreements.
THE INNOVATIVE DEVELOPER
An innovative developer likes taking risks. He or she will attempt to implement new or bleeding‑edge algorithms. Such algorithms, due to their immaturity, may have inherent defects not yet resolved in early designs or versions. Therefore, the tester must comprehensively test against the expected results in various conditions, as well as test integration with other components. Integration testing in this scenario is vital, as the integration of new algorithms with the existing code base may not be seamless.

NEXT STEPS
As a quality owner, it is incumbent on the tester to relay defect information to the developers, to help mitigate further similar defects from the respective developers. The tester cannot change the developers' personalities or control daily events that could impact quality, but has the responsibility to put in place processes that can help avoid similar defect introduction. Various processes may be put in place to address different developers' psychology. In addition, the tester must maintain a keen awareness of the developers' behaviour and environment, and analyse the code being produced each day in light of the factors discussed here.

FAISAL QURESHI QA ENGINEER AMAZON

Faisal Qureshi currently lives in New York and previously worked at Motorola Solutions as a Senior Test Engineer. He is an ISTQB Certified Tester with a BS (Hons) degree in Computer Science from NYU Polytechnic and a Masters in Computer Engineering from Columbia University.
27-28 SEPTEMBER, THE ROYAL YORK HOTEL, YORK
27‑28 September 2016, The Royal York Hotel, York | #TestingConfNORTH
Following on from the huge success of The National Software Testing Conference, which took place 17‑18 May in London, the same organisers are bringing you the Software Testing Conference NORTH.
The inaugural event will take place on 27‑28 September at The Royal York Hotel in the wonderful city of York. It will ensure our industry colleagues in Scotland, Ireland and England's northern cities have a more localised platform to come together and take advantage of high‑level thought leadership content delivered by key industry figures, while networking, viewing the latest products and services, and enjoying interactive and lively workshops. Speakers at the Software Testing Conference NORTH are testing heads, managers, directors and other individuals handpicked for their exceptional levels of knowledge. Over 80% of the delegates at the recent National Software Testing Conference said the content was good or fantastic, and 95% said the keynote speakers were good or fantastic. The speakers' expertise is paired with technical roundtables covering a host of different topics, interactive Q&A panels, and a market‑leading exhibition, ensuring that all delegates enjoy unparalleled networking and learning opportunities.
north.softwaretestingconference.com
EVENT PARTNER
EXHIBITORS
LANYARD SPONSOR
SUPPORTED BY
north.softwaretestingconference.com
Featured speakers
Sriram Angajala Test Lead Eurostar
Dan Ashby Global Head of Software Quality & Testing, AstraZeneca
Cassy Calvert Head of Testing BJSS
Heather Cumming DE Test Portfolio Lead Virgin Media
Mark Galvin Systems Assurance Manager University of Cambridge
Dan Giles CSI (Continual Service Improvement) Manager Sony CEE
Mike Jarred Head of Software Testing Financial Conduct Authority
Myron Kirk Head of Test CoE Boots
Wolfgang Platz Founder & Chief Product Officer Tricentis
Paula Thomsen Head of Quality Assurance Aviva
Beverley Wells Portfolio Test Lead Virgin Media
Harvey Whitford Head of Quality Assurance & Test Sky Betting and Gaming
MORE TO BE ANNOUNCED
Topics covered ★ Case studies ★ Changing trends ★ Agile ★ Test automation
★ Continuous delivery ★ BDD ★ Performance testing ★ Test data management
★ DevOps ★ User experience ★ Continuous integration ★ Analytics
north.softwaretestingconference.com
In December 2015, The Test People and Centre4 Testing merged to create the UK's leading software testing company. We are excited to introduce our rebranded organisation, Ten10. Expert‑led, flexible and scalable testing solutions: from strategic consultancy through to managed services and staff augmentation. Services include:
• Test Strategy
• Functional Testing
• Automated Testing
• Performance Testing
• Mobile Testing
• Agile Testing
Find out how we can help. Please call +44 (0) 203 697 1444 or email contact@ten10.com for an initial chat about your testing requirements. Offices in Brighton, Leeds, London and Manchester. www.ten10.com