TEST Magazine - Awards Supplement October 2013

Page 1

THE EUROPEAN SOFTWARE TESTING AWARDS

OCTOBER 2013

CELEBRATING TECHNICAL EXCELLENCE

www.softwaretestingawards.com

AWARDS SUPPLEMENT 2013


Passionate about testing. Excited by technology.

Tackling technology testing that others can't. The Test People delivers the most innovative, highly technical and competitive test and performance engineering service available today.

www.thetestpeople.com

Leeds

email: contactus@thetestpeople.com

London

Gibraltar


INTRODUCTION

WELCOME

Hello and welcome to The European Software Testing Awards supplement. After months of anticipation, the TESTA finalists have been announced! The judges had a tough job in going through the many entries and collectively deciding the finalists for each category, but after hours of deliberation they completed their judging duties, and the winners will be announced on November 20th at the Marriott Hotel Grosvenor Square, London.

This supplement offers you the chance to learn more about some of the finalists. The companies featured share their secrets of success, as well as the products and services that caught the judges' attention and have seen them become finalists. For the full list of finalists, please see page 13.

All that is left for me to say is that I look forward to seeing you on November 20th!

All the best,

Sophie-Marie Odum Chair of the judging panel

Headline Sponsor

Supported by

Category Sponsors:



BRANDT TECHNOLOGIES

SIMULTANEOUS PARALLEL TESTING WITH SHADOW TEST STATION AND SORT

Brandt Technologies explains how it supports its clients' needs...

Brandt was established in 2002 and is an international business headquartered in Ireland, with offices in Germany and China. Brandt enables international software publishers in the information technology, antivirus, e-learning and mobile applications sectors to enter global markets. Through its extensive experience of delivering software to global markets, Brandt has developed processes and proprietary technology to support its clients' needs. The challenges faced include faster turnaround times, reduced costs, growing language requirements and maintaining quality across multiple platforms. Brandt's patented Shadow technology has been developed specifically to meet these challenges.

PRODUCTS

The Shadow Test Station, through its innovative user interface, enables the test engineer to carry out the same tests in parallel, in a consistent manner, across a combination of operating systems, languages and browsers. This has proven to accelerate functional and exploratory testing across a number of test scenarios. Shadow has been successfully applied to a variety of testing challenges, ranging from medical applications, antivirus software, enterprise web portals and e-learning to mobile applications. In Brandt's experience with large multilingual localisation projects, the combination of capturing user interface data and efficiently sharing this data with language reviewers has had a significant impact on work practices. Using Shadow, the role of the product or test expert is separated from that of the language expert, resulting in a significant saving in the training and management time of the language expert, and shorter development cycles.

Key features of Shadow Test Station:
- Parallel test environment enabling simultaneous testing.
- Cross-platform support including Windows, Mac OS X and Linux.
- Virtualised and real-world test environments.
- Powerful macro and scripting language for record and playback.
- Android mobile support.
- Automatic synchronisation with SORT.

Key benefits:
- Significant reduction in manual testing.
- Consistent testing methodology.
- Broader test coverage with reduced testing effort.
- Traceability and monitoring of test passes.
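The core idea of a parallel test environment can be sketched in a few lines of Python. The SUT objects below are illustrative stubs, not Brandt's API: they simply record each mirrored action so the effect of one engineer driving many configurations is visible.

```python
# Sketch: one engineer's action mirrored across several systems under
# test (SUTs), in the spirit of Shadow's parallel test environment.
# The SystemUnderTest class is a hypothetical stand-in.

from dataclasses import dataclass, field

@dataclass
class SystemUnderTest:
    """One test configuration (OS, language, browser)."""
    name: str
    log: list = field(default_factory=list)

    def perform(self, action: str) -> None:
        # A real implementation would drive the UI; we just record it.
        self.log.append(action)

def mirror(action: str, suts: list) -> None:
    """Replay a single user action on every SUT."""
    for sut in suts:
        sut.perform(action)

suts = [SystemUnderTest(n) for n in
        ("Win7-EN", "Win7-FR", "Win7-DE", "OSX-EN")]
for step in ("open app", "login", "check about box"):
    mirror(step, suts)

assert all(s.log == ["open app", "login", "check about box"] for s in suts)
```

Each manual step is performed once but lands on every configuration, which is where the reduction in manual effort comes from.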

SORT

Shadow Online Review Technology (SORT) manages and presents the user interface content captured by the product experts (test engineers) during the test process. This content is provided online in near real time for re-use as test cases, test scripts, language review and in regression testing. The captured content can also be re-used for documentation and online demos.
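One re-use of captured screens is regression comparison: a new capture is compared against a baseline and flagged when it drifts too far. A minimal sketch of that idea, treating screens as raw byte buffers (a real system would decode image files first; the threshold is an illustrative assumption):

```python
# Sketch: flag regressions by measuring how much two captured screens
# differ. Screens are modelled as equal-size byte buffers for
# simplicity; the 1% threshold is illustrative only.

def diff_ratio(baseline: bytes, candidate: bytes) -> float:
    """Fraction of bytes that differ between two captures."""
    if len(baseline) != len(candidate):
        return 1.0  # a size change counts as a full mismatch
    if not baseline:
        return 0.0
    changed = sum(a != b for a, b in zip(baseline, candidate))
    return changed / len(baseline)

def is_regression(baseline: bytes, candidate: bytes,
                  threshold: float = 0.01) -> bool:
    return diff_ratio(baseline, candidate) > threshold

old = bytes([10, 20, 30, 40])
new = bytes([10, 20, 30, 41])   # one byte changed: 25% difference
assert is_regression(old, new)
assert not is_regression(old, old)
```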


Key features of SORT include:
- Image and OCR analysis.
- Image comparison for regression purposes.
- Image detection, classification and cropping.
- Integration with Bugzilla and Jira.
- Online test progress monitoring.
- Test script management.
- Parallel bug logging.
- Language translation memory integration.
- An advanced search mechanism to mine the image data.
- Video publishing platform – recorded test cases can be re-purposed as online demos within SORT, or externalised to YouTube or Dailymotion.

With SORT, the QA and language reviewers simply browse through screen data, identify issues and then annotate and log bugs online. SORT removes the need for language reviewers to follow complicated test scripts or install the software under test. The project management dashboard allows you to get your software ready for market quicker by slashing QA/LQA time, to reduce risks by improving test coverage, and to follow test progress in real time. Valuable metrics can be extracted, e.g. the number of issues reported or the time spent per language.

SORT key benefits

1. Reduce costs and time to market
- Reduce review time by up to 50% by moving product expertise away from reviewers to product specialists.
- Delivery of content online in minutes with our synchronisation technology.
- Provide context for translation for improved quality and faster localisation.
- Log bugs across multiple languages and platforms in a single click.

2. Reduce project management overhead
- SORT simplifies project management by offering a centralised way to send notifications to users and monitor their progress.
- No more support to reviewers regarding test script issues or software installation.


- Centralised and searchable access point for all your screenshots.
- Advanced search functionality including image search, feature search etc.
- Built-in metrics such as time spent by user/language per project or number of bugs/language.

3. Improve software quality
- Uniform and consistent test coverage of the software across platforms/languages.
- Remove test script issues and prevent gaps in testing coverage.

- Increase the quality of your translations by providing context to the translators.
- Real-time status reports allow managers to react faster to potential issues.

4. Online product demos
Create simple training videos across languages using SORT and Shadow test scripts. SORT can leverage parts of, or entire, Shadow test scripts and convert them into videos that can be viewed online or published to YouTube.
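Logging a bug "across multiple languages in a single click" amounts to fanning one annotation out into one tracker issue per affected language. A minimal sketch of that fan-out, using a hypothetical tracker client rather than the actual Bugzilla/Jira integration:

```python
# Sketch: single-action, multi-language bug logging. FakeTracker is an
# illustrative stand-in for a Bugzilla/Jira client.

def log_bug_for_languages(tracker, summary: str, languages):
    """File one issue per affected language from a single action."""
    issue_ids = []
    for lang in languages:
        issue_ids.append(
            tracker.create_issue(summary=f"[{lang}] {summary}",
                                 labels=["localisation", lang]))
    return issue_ids

class FakeTracker:
    def __init__(self):
        self.issues = []

    def create_issue(self, summary, labels):
        self.issues.append((summary, tuple(labels)))
        return len(self.issues)  # sequential issue id

t = FakeTracker()
ids = log_bug_for_languages(t, "Truncated button text", ["fr", "de", "es"])
assert ids == [1, 2, 3]
assert t.issues[0][0] == "[fr] Truncated button text"
```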

ANTIVIRUS TESTING – A CASE STUDY

The project was undertaken for a leading antivirus company. The scope of the project required the testing of a suite of applications across multiple operating systems (Windows and Mac), cross-browser testing, mobile testing (primarily Android phones and tablets) and testing across 28 languages. The testing included functional, user interface, internationalisation, localisation and language quality testing. The software under test was developed with a mix of programming languages, having originally been developed by different publishers and acquired by our client over time. Due to the nature of the business, the software is continually updated and requires on-going testing, e.g. virus updates and their validation. Whilst the software testing aspect of the project was carried out internally, the language QA was completed externally through a large, geographically dispersed language team. Installation of the software under test was not an option, so the provision of screen (image) data for language QA was required.

The business case and strategic approach

Prior to working with Brandt, the majority of the application testing was completed in a labour-intensive manual process. This involved a core team of approximately 50 QA engineers and an additional 40 QA engineers at peak times. Prior attempts at applying automation scripting had failed, primarily due to the mix of technologies used to develop the software. The lack of exposed resource IDs and the embedded HTML code within application software developed in C++ and other 4GL languages created issues for other automation approaches. Working with our client, Brandt introduced and developed a test environment based on our patented Shadow technology to reduce the manual QA effort, and an online portal, Shadow Online Review Technology (SORT), to facilitate and manage the language QA effort.

Aims and objectives of the project

The key aims of the project were:
- Reduce cost.
- Reduce time to market.
- Improve product quality.
- Monitor progress (particularly language QA).
- Reduce the effort in providing screen data externally.
- Introduce scripted automation where possible.
- Cross-check bugs found against test scripts (Testlink).
- Reduce duplicate bug entries.
- Greater product coverage (outlier bugs).
- Standardisation and streamlining of the language QA process.

Brandt developed Shadow primarily for test projects of this nature (i.e. cross-platform and/or cross-language and/or cross-browser). Shadow enables a single test engineer to control multiple systems under test simultaneously. This means that a test engineer will carry out a variety of test activities in parallel. These activities include exploratory, functional, user interface, internationalisation and localisation QA, and we categorise the activity as semi-automation, i.e. the task is manual but Shadow automatically interprets the user's actions and maps them across the other systems under test. A Shadow Test Station includes multiple systems under test (SUTs), each system representing one of the many test configurations. As an example, a single test engineer was able to carry out functional testing of the English product running on Windows 7, along with the French, German, Italian and Spanish products, simultaneously. Prior to using Shadow, this activity would have been carried out by four engineers, or a reduced level of testing was completed by striping the test cases across the languages. Whilst the QA engineer carried out the test pass, image data was also captured for language review through the Shadow Test Station and synchronised with SORT automatically. This enabled the customer to separate the product expertise from the language expertise and run the language QA in parallel with the functional/localisation QA effort. In addition to controlling multiple systems under test, the test case management system (Testlink) was also monitored within the Shadow interface. As the engineers progressed through the scripts, these test cases were uploaded to SORT. By capturing both the engineers' progress through the test plan and the image data of the software under test, one can view the progress made against bugs logged. SORT provides a web portal where test engineers, managers and language experts can review content, captured image data and test scripts (recordings).
The system integrates with both Bugzilla and Jira. For this project, Bugzilla was customised, integrated with SORT and used as the bug-tracking platform. Where possible, manual tasks were further optimised, either by creating macro scripts, which were assigned to short-cut keys and combined with the manual testing, or by creating automated scripts through the Shadow script engine. Macro scripts were commonly used to repeat data-entry activities, e.g. login and registration. Once recorded on one system, these scripts could be replayed without edit across the SUTs, irrespective of operating system, device or language. The combination of Shadow and SORT delivered a 50% reduction in manual testing effort, a 100% increase in language review throughput and a time saving of 30% on the overall project schedule.
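The record-once, replay-everywhere idea behind those macro scripts can be sketched briefly. The recorder and the per-SUT fill function below are illustrative, not the Shadow script engine: a macro is just an ordered list of (widget, value) steps that any SUT-specific player can consume unchanged.

```python
# Sketch: record a data-entry macro once (e.g. login), then replay it
# unchanged on every SUT regardless of OS, device or language.
# MacroRecorder and the fill callbacks are hypothetical stand-ins.

class MacroRecorder:
    def __init__(self):
        self.steps = []

    def record(self, widget: str, value: str) -> None:
        self.steps.append((widget, value))

def replay(steps, sut_fill) -> None:
    """Replay recorded steps using a SUT-specific fill function."""
    for widget, value in steps:
        sut_fill(widget, value)

rec = MacroRecorder()
rec.record("username", "qa_user")
rec.record("password", "s3cret")

results = {}
for sut in ("Win7-FR", "OSX-DE", "Android-ES"):
    filled = {}
    replay(rec.steps, lambda w, v, f=filled: f.__setitem__(w, v))
    results[sut] = filled

assert all(r == {"username": "qa_user", "password": "s3cret"}
           for r in results.values())
```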




BRICKENDON CONSULTING

OUR APPROACH TO TESTING

Brickendon Consulting shares its testing solutions, developed by the team, which have been successfully implemented in two major finance projects...

More often than not, performing latency testing is a complex task, especially so in a small team environment with a limited budget. Without doubt, any organisation trading FX needs a robust latency testing function. However, considering the limited resources we face in today's project environment, the latency testing solution needs to be efficient and cost-effective. Brickendon test specialists brought together FTAS (Functional Test Automation Solution) and ELMO (Electronic Latency Monitoring) to create a combined solution for functional and non-functional test automation. Considering their client's shortage of test-specific internal resources, this powerful combination of methodologies was gratefully received.

THREE KEY STRENGTHS OF THE TEST AND QA TEAM

Strategic and innovative test project design
• Comprehensive scoping and test definition
• Clear logic of workflow
• Accurate resource estimation and allocation

Reliable performance in execution
• Effective testing towards key delivery milestones
• Detection and resolution of issues and defects
• Efficient resource management

Leadership in test project management
• Transparent performance management
• Clear and effective communication
• Proactive stakeholder engagement

TEST AND QA INNOVATIONS

Brickendon Solutions have been developed by the Brickendon Test Team. All the solutions have been successfully implemented in two major finance projects in a top-tier investment bank. For more information, or to implement any of the solutions below, please contact Brickendon Consulting.

Time Check Point System (TCPS)

TCPS is an innovative test estimation model, established to estimate and allocate resources accurately in a test project. The solution systematically prioritises and categorises relevant test tasks. Requirements are analysed from a testing perspective, with each task then scored in a matrix on importance and complexity, allowing the test manager to categorise tasks into different priority bands. TCPS significantly improves accuracy in resource allocation and enhances transparency in performance management, from which clients directly benefit. TCPS also helps the management team to handle scope changes efficiently.

Integrated Test Tools Approach (ITTA)

Brickendon test specialists designed the ITTA solution with two clear goals in mind: 1) To centralise the testing assets by integrating available test tools and 2) To increase

efficiency by achieving a lights-out testing mode. ITTA utilises existing test management and test automation systems. These different systems are integrated into a centralised tool, which is then used as a central location for test lifecycle management. ITTA is a consolidated testing methodology, proven to significantly improve efficiency by automating the testing process, e.g. automated defect logging, automated test results recording and requirements mapping. ITTA enables the management team to orchestrate projects more efficiently by centralising the available test tools and allowing test experts to focus on more value-added tasks.

Test Metrics Solutions (TMS)

Using mathematical formulae designed by Brickendon consultants, test and defect metrics are generated to identify where bottlenecks will appear within a project lifecycle. This allows the project team to effectively and efficiently prevent bottlenecks. TMS provides a reliable tool that highlights underlying bottlenecks, which frequently go unnoticed. Variance analysis is key in enabling bottleneck identification during the project lifecycle. The comprehensive metrics provide sufficient information for project teams to collaborate efficiently on test design, execution and defect resolution. TCPS, ITTA and TMS together provide a complete test lifecycle management solution.

FTAS (Functional Test Automation Solution)

FTAS focuses on improving efficiency by reducing the complexity of test automation. The solution had three objectives: to enable automated tests to be implemented with minimal knowledge of scripting; to provide a framework for developers and business analysts to automate testing; and to improve multi-system reconciliation. The main benefit of FTAS is that it helps to simplify testing for the users. The value is added by giving non-technical users the ability to start writing automated tests.
This makes the solution much more easily accessible and greatly expands the benefits to the client.

ELMO (Electronic Latency Monitoring)

ELMO targets the delivery of a robust, real-time but lightweight latency solution. ELMO is implemented on the pricing server and collects timing information for all entrances and exits en route to the end-user application. Performing this function as a real-time latency measurement demonstrates best practice, and this detail separates ELMO from other forms of latency monitoring as a stand-alone innovation. ELMO does not interrupt any existing architecture; no customisation is necessary. The concept behind ELMO can be adapted and applied to essentially all electronic trading platforms.
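The underlying mechanism of entrance/exit latency collection can be sketched simply: timestamp a message at each boundary it crosses, then compute per-hop deltas. The checkpoint names and clock source below are illustrative assumptions, not ELMO's implementation.

```python
# Sketch: ELMO-style latency measurement. A message is stamped at each
# entrance/exit on its route; latency is the delta between consecutive
# checkpoints. Hop names are hypothetical.

import time

def stamp(trace: list, point: str) -> None:
    """Record (checkpoint, time) as the message passes a boundary."""
    trace.append((point, time.perf_counter()))

def latencies(trace):
    """Per-hop deltas in seconds between consecutive checkpoints."""
    return [(b[0], b[1] - a[1]) for a, b in zip(trace, trace[1:])]

trace = []
for point in ("pricing_in", "pricing_out", "gateway_out", "client_in"):
    stamp(trace, point)

hops = latencies(trace)
total = trace[-1][1] - trace[0][1]
assert len(hops) == 3
assert all(delta >= 0 for _, delta in hops)
```

Because only timestamps are appended as the message passes, the measurement rides alongside the existing message path rather than altering it, which matches the "no interruption to existing architecture" claim.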

The nature of the project environment we operate in today is complex; the requirements of each client are becoming more specific and unique. Brickendon test



consultants are encouraged to think differently and challenge the status quo; they provide a highly efficient, methodical and best-of-breed testing service to the industry. The innovation does not stop there. After go-live, the innovation stays with the client organisation via knowledge sharing. Going forward, the efficient and low-maintenance solutions are easily managed by the clients' existing support teams.

Brickendon test consultants have provided value-adding solutions to clients in financial markets for many years. Our areas of performance include:
• Latency testing solutions
• Middleware testing
• E-commerce and algorithmic trading test strategy
• Financial market test design
• Programme test management
• Defect management

INTERVIEW WITH BALA

Balaprasanna Ethirajalu (Bala) is a test manager who joined Brickendon Consulting in October 2012. Currently managing three eFX testing projects within a top-tier European bank, he has designed three innovative solutions: Time Check Point System (TCPS), Integrated Test Tools Approach (ITTA) and Test Metrics Solutions (TMS). He speaks about challenges in the testing world and his innovative methods of tackling them.

Q: What do you think is the most challenging thing in the testing world?
At the moment, testing is considered an additional element of project management; a nice-to-have practice. I think the value of testing is not given due importance; the value of comprehensive testing is not felt or realised. This perception can be changed only when we deliver differentiated testing services, through continuous innovation. I think that's the biggest challenge in testing.

Q: You have mostly worked in the financial services environment; are there challenges in testing specific to this industry?
Yes, because everything has a financial impact in the banking and finance sector. The cost of improperly addressed issues/bugs can be really high. Testing professionals in the financial world need to excel in two unique ways: 1) industry knowledge – especially the financial products and markets – and 2) knowing where issues or defects are likely to occur in the specific environment. The ability to see through the possible scenarios is essential; test analysts should be able to go beyond the thought process of BAs and developers. Having said that, there are always two different aspects for test managers to consider: the technical expertise of testing and excellence in managing the project.

Q: The two different aspects, technical expertise and managerial skills – how do you achieve both?
For me, I see the benefits of employing a methodical approach.
Most projects have aggressive timelines, and therefore resource estimations for a project need to be precise. I have seen many testing projects where estimations were nowhere near reality and this, at times, results in failure. In the finance and banking industry, an inaccurate estimation can have two major consequences: 1) a compromise on the project go-live date or 2) a compromise on product quality. The problem is that both of these outcomes are very costly! However, there was no standardised method to avoid this problem. I wanted a reliable way of estimating resources, and that's how I came up with TCPS. TCPS helps test managers to estimate testing effort in a more comprehensive manner, thereby preventing them from misjudging the scale of a test project.
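The scoring-matrix idea described for TCPS (rate each task, then bucket into priority bands) can be sketched in a few lines. The weights and band cut-offs below are purely illustrative assumptions, not Brickendon's actual model.

```python
# Sketch: TCPS-style task scoring. Each test task is rated 1-5 on
# importance and complexity, combined into a score, and bucketed into
# a priority band. Weights and thresholds are hypothetical.

def priority_band(importance: int, complexity: int) -> str:
    """Bucket a task into P1/P2/P3 from its two ratings."""
    score = importance * 2 + complexity   # weight importance higher
    if score >= 12:
        return "P1"
    if score >= 8:
        return "P2"
    return "P3"

tasks = {
    "FX rate feed validation": (5, 4),   # critical and fairly complex
    "UI colour themes":        (1, 2),
    "Order entry workflow":    (4, 1),
}
bands = {name: priority_band(i, c) for name, (i, c) in tasks.items()}
assert bands["FX rate feed validation"] == "P1"   # score 14
assert bands["UI colour themes"] == "P3"          # score 4
assert bands["Order entry workflow"] == "P2"      # score 9
```

Once every task carries a band, the test manager can allocate scarce resources to P1 work first and quantify the impact of scope changes by re-running the scoring.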

Q: How about ITTA and TMS – how do they help in the project environment?
For ITTA, it is more about efficiency. Normally, test automation can be limited to automating use-cases and functionalities, but ITTA automates the whole testing process. Most test managers spend time collating and presenting reports because the multiple testing tools used in our project environment do not speak to each other. By automating the manual processes using a centralised approach, test managers can focus more on the quality of testing than on administrating the project. It is more efficient, as all reports are gathered in one place and all stakeholders can access them. Another important benefit of ITTA is the 'lights-out' testing; I am a big believer in utilising my tools 100%. I have an excellent testing team – I would like my team to contribute their expertise to build better testing logic, not to spend their time administrating manual tasks that could be automated. For TMS, it's about collaboration and efficiency. When working with developers and BAs on a busy schedule, being on the same page and staying efficient can be tricky; on occasion we look at one issue much longer than we need to – this is a bottleneck. TMS solves this issue as a preventive measure. In its implementation, I have seen it used by a project manager who effectively reallocated resources to prevent bottlenecks where required.

Q: Where do your inspirations come from?
Testing is critical to safeguarding the integrity of a product. I often think about the end-users: maybe the product is for employees of the organisation, or maybe it is for a third party, or both. I like to imagine what the impact on the organisation and the third party would be. The moral responsibility inspires me. I think the testing profession carries an honourable responsibility to ensure the organisation and the third party are not at any loss.
No nasty surprises – any issue or defect would bring disappointment, damage reputation and result in losses. At the same time, there is always a better way to achieve even better results. By thinking outside the box, I solve problems that seemed unsolvable, and trying slightly different things brings me surprising improvements.

Q: Do you have any advice for the younger generation of test specialists?
Please act responsibly. I mentioned how I feel responsible for products that are endorsed by my team. I would say, please don't just take testing as a day job; instead, love testing and be passionate about it. Your passion will naturally improve the quality of service, which will then cascade to a bigger crowd. I think this is how I can contribute to the testing world.

To find out more about how Brickendon Test Specialists approach testing and their success stories, please contact Brickendon Consulting.



Ever feel like a robot when doing manual testing?

Cross Browser

Why not work in parallel instead?

Brandt's Shadow™ technology
One engineer controls multiple systems under test
Proven increase of test productivity by 200% to 400%

Download a free, 30-day trial of Shadow™ at: http://www.brandttechnologies.com/shadow_trial.php

Brandt is a technology company specializing in software development, testing and localization. Brandt's Shadow™ technology offering is proven, scalable and innovative, dramatically improving productivity.

www.brandttechnologies.com


MAVERIC SYSTEMS

END-TO-END ASSURANCE FROM REQUIREMENTS TO RELEASE

Maveric Systems discusses how the lifecycle assurance approach can maximise the value for a business by assuring a quality solution...

Presently, "assurance activities" undertaken while fulfilling a projected business outcome using software solutions do not encompass the entire technology adoption lifecycle. The assurance activity is often fragmented among the various entities providing the solution, with each festering in its own cocoon. There are multiple agencies assuring the various stages between the germination of the need and go-live. Typically, nobody owns responsibility for the entire solution; it is simply expected that each agency performs its role diligently. The information systems department conducts an acceptance test, which is usually limited to positive functionality testing. The system is then subjected to beta testing by the end-users in a test/staging environment. This testing helps organisations uncover some defects, but it has never been accepted as comprehensive or as an alternative to structured quality assurance. When the system goes live and the actual end-users start using it with real data, problems surface. The "business" that is paying for the program still has to put up with a reduced level of service and pacify irate customers and business users, besides possibly incurring a loss of revenue. Recognising this, many organisations now initiate testing by an independent specialist agency. But when the independent testing agency is brought in, it is often constrained by a tight budget and deadline. Secondly, the independent testing agency comes in during the latter part of the solution chain (post-build). If any serious design flaw or requirements error is detected during the independent testing stage, it is assigned to software maintenance, resulting in an ill-fitting patch that makes it work by circumventing the issue rather than solving it once and for all.
The "business" has therefore already paid for producing the defective software and pays again for rectifying defects that should not have been there in the first place. The situation has room for improvement because no single agency owns responsibility for ensuring that the system is defect-free. The lifecycle assurance approach tries to address this problem.

LIFE CYCLE ASSURANCE APPROACH

The life cycle assurance (LCA) approach aims at maximising the value for the "business" by assuring the quality of a solution using a unified approach that spans requirements to release, thus focusing on the projected business outcome in its totality. It does not look at the project as an IT project or as a software development


project, but as a project to improve business efficiency or increase business competitiveness.

LCA APPROACH AIMS AT MAXIMIZING THE VALUE FOR THE ‘BUSINESS’ BY ASSURING QUALITY OF A SOLUTION USING A UNIFIED APPROACH THAT SPANS REQUIREMENTS TO RELEASE

LCA is carried out by an independent agency (like Maveric Systems) which, instead of coming in at the end to carry out the specified testing of the software solution, comes in immediately after the idea for a new solution germinates in the organisation and stays on until the solution is stabilised, delivering the projected business outcome. LCA provides end-to-end assurance by focusing on the overall programme and the business outcomes it is targeted to deliver. In the LCA model, "assurance" begins with requirements. Deficient requirements are possibly the single largest source of software defects and failed projects. The lack of detailed requirements and the absence of a change control system are considered the biggest causes of project failure. We believe that a well-structured requirements definition, elaboration and validation framework is the best route to defect prevention, and delivers the following benefits:
• Ensures that the "right product" is delivered.
• Shortens the development lifecycle by reducing rework and the associated testing effort.
• Potentially saves up to a third of the overall project cost.
Requirements assurance is provided through structured requirements gathering and requirements validation, where bi-directional traceability of requirements is maintained throughout the project lifecycle. The LCA approach ensures the availability of BVPs (Build Validation Packs) well in advance. This allows breathing room for thorough test planning. BVPs enable "business" and IT to drive solution vendors to take much greater accountability for quality, as the entry criteria for testing are much more stringent and clearly defined. Acceptance criteria are validated at an early stage and largely aid the development effort in ensuring that the requirements are defined to deliver the projected business outcome.
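Bi-directional traceability is, at its core, a two-way index between requirements and test cases that can be queried in either direction and mined for gaps. A minimal sketch, with illustrative IDs (not Maveric's tooling):

```python
# Sketch: bi-directional requirements traceability. Every link is
# recorded in both directions, so "which tests cover REQ-X?" and
# "which requirements does TC-Y verify?" are both cheap lookups.

from collections import defaultdict

class TraceabilityMatrix:
    def __init__(self):
        self.req_to_tests = defaultdict(set)
        self.test_to_reqs = defaultdict(set)

    def link(self, req_id: str, test_id: str) -> None:
        self.req_to_tests[req_id].add(test_id)
        self.test_to_reqs[test_id].add(req_id)

    def uncovered(self, all_reqs) -> set:
        """Requirements with no linked test case (a coverage gap)."""
        return {r for r in all_reqs if not self.req_to_tests[r]}

m = TraceabilityMatrix()
m.link("REQ-001", "TC-10")
m.link("REQ-001", "TC-11")
m.link("REQ-002", "TC-12")

assert m.req_to_tests["REQ-001"] == {"TC-10", "TC-11"}
assert m.test_to_reqs["TC-12"] == {"REQ-002"}
assert m.uncovered(["REQ-001", "REQ-002", "REQ-003"]) == {"REQ-003"}
```

Keeping both directions in sync from the moment a requirement is written is what lets gaps (here, REQ-003) surface before build rather than after.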
Regular verification of deliverables produced during each iteration of software development is supplemented


MAVERIC SYSTEMS

by Program Health Checks to identify risks early during program execution. Program Health Checks are conducted by independent agencies (external to the entities involved in project execution) to ensure that any biases are eliminated. They assess the program on various dimensions, including:
• Organisational readiness – governance model, executive support and commitment, and change management process.
• Project management – project plan, project execution, key performance indicators and risk management.
• Operational readiness – change control system, staffing needs, metrics capture and reporting.
• Infrastructure readiness – availability of environments for development, testing and staging.
The Program Health Checks are independent of the quality audits that are carried out to check the adherence of projects to organisational norms and best practices.

BENEFITS THAT ACCRUE FROM THE LCA APPROACH
• The first and foremost benefit that can rightfully be expected from the LCA approach is the certainty of achieving the projected business outcome from a selected solution.
• Another benefit that accrues from the LCA approach is the ability to get the solution "right" the first time. LCA ensures a robust and reliable solution by implementing tighter control at the entry and exit of every critical stage of the solution development cycle.
• LCA ensures that the schedule is subjected to independent quality control, ensuring that all contingencies are considered and provided for in the schedule, avoiding costly schedule slippages.
• LCA, owing to its unified approach, ensures that the solution is robust in all its components, preventing a weak link in the chain and eliminating the unscheduled work stoppages that arise from an unreliable system.
• In LCA, since the focus on assurance starts at the very beginning, both the business and IT teams and any independent testing partner are in a much better position to determine the nature and extent of testing necessary to ensure achievement of the projected business outcome. Also, since solution partners in this model take greater accountability for quality, the extent of testing during system integration and UAT comes down significantly.

CHALLENGES IN IMPLEMENTING LCA

There are a few challenges in adopting the LCA approach in organisations. The first is the paucity of service providers that have the requisite expertise to take up LCA to assist businesses. Most of the organisations carrying out independent quality assurance are rooted in testing alone. LCA is a rather recent concept rising on the horizon and, therefore, testing organisations are yet to acquire the expertise required to assist businesses, especially during the requirements phase. Agile software development methodologies are becoming popular, and it is argued that it is difficult to implement structured quality assurance in such projects. One misconception about agile methodologies is that they are "no documentation" philosophies. They are rather "minimum documentation" philosophies. Therefore, it would not be a major issue if the software development is proposed to be carried out using agile methodologies. As LCA is spread over the entire lifecycle of the solution, it needs unswerving leadership commitment. It is easy to fire the agency providing the lifecycle assurance support in times of a funds crunch or when faced with resistance. Another challenge could come from the agency developing the application software. Agencies specialising in developing a new software product, customising an existing COTS (Commercial Off The Shelf) product by building a layer over it, or enhancing an existing product are likely to resist independent quality surveillance. If the software development is carried out in-house, the resistance can be even stiffer.

FINAL WORDS

LCA is an emerging concept for assuring that IT solutions deliver the projected business outcome to the organisations investing heavily in them. LCA focuses on the "solution", as against the present practice of focusing on the "application software" alone. LCA can significantly improve ROI by reducing direct costs as well as the indirect costs arising from unscheduled work stoppages and excessive software maintenance. All in all, LCA is a new weapon for businesses in maximising the return on their investment in IT solutions.



[Advertisement: Brickendon Excusion Zone — finalist in Testing Innovator of the Year, Best Use of Tools and Best Automation Project]

Historically, the importance of Test and Quality Assurance has been undervalued and overlooked, but this position is rapidly changing. It is increasingly recognised that having a professionally devised test strategy and a highly trained test team is the difference between project failure and success. We understand that the Test and Quality Assurance discipline requires in-depth knowledge of the business and technical domains in order to accurately examine the requirements and the technical build. Brickendon's testing specialists have formulated strategies and delivered value-adding solutions for our clients.

We offer the following Test and Quality Assurance services:
• Strategic Test Consultancy
• Programme Test and Defect Management
• Performance Testing
• Test Solutions and Analysis
• Test Tool and Test Automation Consultancy

Read our case studies at www.brickendon.com or scan this code to contact us now:


THE TEST PEOPLE

AN INNOVATIVE APPROACH TO TESTING

The Test People shares how it got to where it is today...

THE VISION

In 2007, the four founding directors, Gav Winter, Chris Thompson, Ash Gawthorp and Andy Slight, launched The Test People. Having been involved with various testing services organisations, as both employees and customers, the founders were frustrated by the commercial and operational methods used by typical testing services companies. The answer to most testing challenges always seemed to be a combination of "use more people and take more time", whereas in the majority of cases Gav and the team knew that the use of technology and innovation, particularly in the disciplines of test automation and performance engineering, could actually result in reduced headcount and shorter timescales. This, of course, runs counter to the model of most testing professional services businesses, where turnover and profit are based on a combination of headcount utilisation and day rates. Consequently, The Test People was founded to change this approach. The following quote from Gav summarises The Test People philosophy: "We set up with the intention to become the software test organisation of choice for performance, automation and agile projects across the UK. We believe knowledge and innovation is the key to delivering projects and we

are the only company set up to deliver this at scale. This kind of coverage can only assist in our mission to help further software testing as a service and, most importantly, deliver projects for our clients." From day one, The Test People has invested in an in-house R&D capability, led by Ash, both to solve testing problems caused by technology and to innovate, creating solutions to testing challenges that had hitherto been addressed by headcount, or had been left untested because the technology was deemed untestable using current methods and tools. Ash comments: "We take a highly technical and innovative approach to testing, ranging from utilising the latest toolsets to developing bespoke testing solutions for complex test, performance and automation challenges." As the founders of The Test People all had extensive experience of retail operations, financial markets trading and egaming trading, the business was launched to focus on marketing its services to these industry verticals. This focus remains; however, The Test People has acquired numerous clients in other sectors.

THE SITUATION TODAY

Six years on, in 2013, the results achieved by implementing the strategy developed by the founders are incredible. The company has grown to a headcount of more than 100 staff, with revenues for FY2013/14 expected


to be over £7.5 million. The business now has offices in both London and Leeds, and has recently opened its Gibraltar headquarters, specifically to support its growing business in the bookmaking/egaming market. In addition, and to support the growth plans for the business, The Test People has invested in the creation of The TTP Academy, a proven and trusted resource of academic excellence, built specifically to train, accredit and deliver the testers of tomorrow. The Academy produces skilled testers who have had hands-on experience across a range of live business projects, covering functional and user acceptance testing, automation and performance testing. The training covers a broad range of disciplines, such as risk-based testing, test automation, performance testing and performance engineering, and includes architectures, databases, networks and programming. This intensive course, led by world-class trainers with a wealth of experience and knowledge, enables The Test People to provide clients with motivated, bright, young, career-focused people who have mastered the technical and analytical skills required to be productive testers. In addition, The Test People has:
• Been selected by Goldman Sachs for its Business Growth Program 2011 and then the Goldman Sachs Super Growth Program 2013.
• Been chosen to represent high-growth business on Bloomberg TV.
• Won numerous awards, including The Yorkshire VentureFest Innovation Award 2012 and The Yorkshire Business Masters Innovators Award 2013.
• Most recently, been recognised by the Sunday Times Hiscox Tech Track 100 as "the fastest growing technology company in Yorkshire & the North and 44th overall in the UK".

More importantly, The Test People have now built an impressive set of clients across a range of sectors who have been delighted with the services delivered.

Clients include: William Hill, ICAP, Hitachi Capital UK, Gain Capital, NHS Choices and OpenBet.

The following testimonial is from The Test People's longest-standing client, William Hill, which has worked with The Test People since 2007; it summarises the typical client experience: "The Test People have been a key supplier in assisting William Hill deliver their business goals over the past few years. With bright, dedicated people, excellent knowledge of technology and a focus on delivery, we have a two-way partnership that has helped us on multiple projects enhancing William Hill's offering in the wider marketplace. We would recommend The Test People as a name you can trust."

THE FUTURE

The future looks exciting for The Test People. Here, Gav and the rest of the Board set out their vision for the next phase of the company's growth: "We have built a solid foundation based on talent, technology and delivery. We do the right thing by the client; we innovate, we move, we adapt to be the best we can be, but we need to go further.

"The vision is to be the number one testing company of choice in the UK and wider markets.

"Growth is planned in international markets like Gibraltar, the US and Australia, but it's our technology-led strategy of top-quality consultants in all locations, backed by a best-of-breed technical delivery function out of Leeds, that sets us apart in the UK.

"We really are different from the other testing companies. We are both development and testing domain experts, developing technological solutions for testing. We employ as many developers as we do testers.

"Fundamentally, we are a delivery company who use testing and technology as a means to get our clients' projects live faster with the optimum amount of test coverage.

"We are already considerably ahead of the game, but we know that others will seek to emulate us, so we must never lose touch with our core values.

"We promote a culture of innovation, learning and reward. Our experienced team has been carefully selected; the young talent in our company is being prepared for the future, not only by using today's market technology, but by creating tomorrow's.

"Our technical testing academy will be run again in early 2014, giving more young people a chance to change this market for the better, and we are focused on delivering more into the testing community again, strengthening testing as a whole."



The finalists are as follows:

The TechExcel Best Agile Project
• Sopra Group & Student Loans Company
• Datacastle
• Lloyds Banking Group in partnership with Cognizant

Green Testing Team of the Year
• Cigniti Technologies
• Cognizant

The eggPlant Best Mobile Project
• Lloyds Banking Group in partnership with Cognizant
• PokerStars
• The Financial Times Limited
• DMG Media

The Sogeti Best Automation Project
• Maveric Systems
• Brickendon Consulting
• Tata Consultancy Services
• Cigniti Technologies

Best Overall Testing Project – Public Sector
• Knowit
• Sopra Group & Student Loans Company

Best Overall Testing Project – Finance
• MagenTys
• LMAX Exchange
• Exeter Family Friendly
• A1QA

The Capita Best Overall Testing Project – Retail
• Tata Consultancy Services
• John Lewis IT
• Wincor Nixdorf

Young Tester of the Year
• William Gibbons, Sogeti
• Andrew Thompson, Cognizant

The UKTB Testing Manager of the Year
• Chris Comey, Testing Solutions
• Debbie Woelfell, Sopra Group
• Tina Kelly, EGUK

Testing Innovator of the Year
• useMango
• Barry Weston, Sogeti
• Brickendon Consulting

Test Champion of the Year
• Datacastle
• Craig Ian Thomas, Cognizant

The BCS Best Overall Project
• Tech Mahindra
• Chickenwings
• Norton Rose Fulbright
• Aditi Technologies

The Tricentis Best Overall Use of Technology
• Gamesys
• Jaspersoft

Best Use of Tools
• Eleks
• Brickendon Consulting
• Amdocs
• Tech Mahindra
• Brandt Technologies
• Tata Consultancy Services
• Capita IT Professional Services

The Thinksoft Testing Team of the Year
• Proxama
• Lloyds Banking Group in partnership with Cognizant

Most Innovative Project
• The Test People
• ValidSoft UK
• Amdocs
• Gamesys
• Waitrose
• MGM Advantage
• Home Office IT Test Design and Consultancy Services

Testing Management Team of the Year
• A1QA
• Maveric Systems
• Close Premium Finance

We are looking forward to announcing The Borland European Software Testing Award winner and honouring an individual for their Lifetime Achievement on the night, as well as Best Newcomer and the Best Crowd Project.


