
2019 Trends & Predictions

With the way that software is tested, developed and delivered constantly evolving – what will this year hold? We asked several experts in their fields to share their insights about what the software testing and cybersecurity landscape will look like in 2019.

I think it'll be more of the same from 2018. We'll still have people talking about automation, ML and AI (or ANI) replacing testers. And we'll still be talking about how that's all still impossible, based on the difference between asserting expectations and exploring unexpected unknowns.

For me, I'm hoping that we start to do more 'continuous testing'. Not in the 'continuously run your automation in a CI/CD pipeline' sense, but in the sense of conducting investigative testing earlier and throughout the SDLC: testing the ideas for software solutions, testing the artefacts we create and the UX and UI designs, then testing the code design and the code as it's being written, all before we operate and test the software and test in production.

This isn't new by any means, but it's still relatively unknown to many people. So I hope 2019 brings more awareness and allows people to start incorporating this continuous testing into their strategies.

On top of this, I’m hoping that the external communities mix things up a bit. In our workplaces, we encourage people to mix across roles, and we understand and appreciate the value that each team member brings in these different roles. But we don’t see that in the external communities yet.

Each conference is still aimed at a specific group of people – a dev conference for a specific type of programming language, a testing conference for a specific testing mindset, an Agile conference for Agile coaches, etc.

It’d be great to see conferences mix more, and cater to all different kinds of people, bringing folks from different domains and roles together to learn from each other. I’m hoping we can start to break this current mould in 2019 too.

The board, rather than just IT or operations departments, is now realising it has a responsibility to understand cybersecurity and ensure comprehensive procedures are followed.

Many businesses appreciate it might only be a matter of time before they experience a major incident, so a more proactive approach to cybersecurity is necessary. One result will be more investment from firms in solutions that cut through the data and surface the alerts which really matter – so action can be taken quickly.

Continuing skills shortages will mean businesses which lose cybersecurity expertise will be left facing the challenge of operating security systems and determining the real threats within all the noise of day-to-day business-as-usual alerts.

While cybercrime is constantly evolving, many criminals continue to make use of old, 'tested' vulnerabilities. Research we conducted last year on more than 170k Magento websites worldwide found that in every region at least 78% of sites were at 'high risk' from hackers for failing to update security patches. There's no reason to see this changing until there's a shift in the effectiveness of vulnerability management governance.

There's been a shocking lack of investment and interest in the greater environment, and very often that 'less-protected' environment is used as a beachhead to gain access to sensitive areas. The growth of the IoT and the media attention it will gain in relation to security will bring this issue to the fore.

Bug bounties are trying to reinvent themselves in light of emerging startups in the field and not-for-profit initiatives such as the Open Bug Bounty project.

Most crowd security testing companies now offer highly-restricted bug bounties, available only to a small circle of privileged testers. Others already offer process-based fees instead of result-oriented fees.

We will likely see crowd security testing end up as a peculiar metamorphosis of classic penetration testing.

Millions of people lost money in cryptocurrencies in 2018. Many lost it to crypto-exchange hacks or fraud, others were victims of sophisticated spear-phishing targeting their e-wallets, and some simply lost their savings in the Bitcoin crash. People believed in the innate immunity, utmost resistance and absolute security of cryptocurrencies; now those illusions have vaporised. The problem for 2019 is that many victims have irrecoverably lost their confidence in blockchain technology in general. It will be time-consuming to restore their trust and convince them to leverage blockchain in other areas of practical applicability. On the other hand, it’s not all bad: potential future victims are now paranoid and won’t be low-hanging fruit for fraudsters.

Cybercriminals have attained a decent level of proficiency in practical AI/ML usage. Most of the time, they use the emerging technology to better profile their future victims and to reduce the time, and thus increase the effectiveness, of intrusions. Unlike many cybersecurity startups, which often use AI/ML for marketing and investor-relations purposes, the bad guys are focused on its practical, pragmatic usage to cut their costs and boost their income. We will likely see other areas of AI/ML usage by cybercriminals.

We will probably see the first cases of simple AI technologies competing against each other in 2019.

As in most areas of the software industry, the cloud is a hot topic. Many providers are moving their data into the cloud, and as a tester I’ve always had an interest in monitoring and data visualisation, which is now coming in useful for testing distributed systems that are often opaque by nature. A lot of the healthcare-related work I do is data-driven, as is so often the case in the healthcare industry, so one of the biggest responsibilities lies in how to get the most out of the data for testing purposes without crossing the line; the vast majority of people don’t want their data to be seen by anybody apart from medical professionals – much less used for achieving software quality. This raises the challenge of what can be done to create fit-for-purpose test data without raising ethical concerns – potentially there is room for another test tool to do this, through the use of AI or otherwise.
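One plausible direction, sketched below in Python: generating synthetic, fit-for-purpose patient records so that no real patient data ever enters the test environment. The field names, value ranges and record shape here are invented for illustration, not drawn from any real healthcare schema or product.

```python
import random
import uuid
from datetime import date, timedelta

# Illustrative field values - not drawn from any real dataset.
FIRST_NAMES = ["Alex", "Sam", "Priya", "Jordan", "Mei", "Tomasz"]
LAST_NAMES = ["Smith", "Patel", "Garcia", "Nowak", "Chen", "Okafor"]
CONDITIONS = ["hypertension", "asthma", "type 2 diabetes", "none"]

def synthetic_patient(rng: random.Random) -> dict:
    """Generate one synthetic patient record for testing. Every value is
    randomly generated, so the record is realistic in shape but cannot
    identify any real person."""
    birth = date(1930, 1, 1) + timedelta(days=rng.randrange(33_000))
    return {
        "patient_id": str(uuid.UUID(int=rng.getrandbits(128))),
        "name": f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}",
        "date_of_birth": birth.isoformat(),
        "condition": rng.choice(CONDITIONS),
        "systolic_bp": rng.randint(90, 180),  # plausible clinical range
    }

# A seeded generator makes the test data reproducible across test runs,
# so failures can be replayed exactly.
rng = random.Random(42)
for record in (synthetic_patient(rng) for _ in range(5)):
    print(record)
```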

Ultimately, I think the main trends to be seen in the healthcare sector of software development in times to come will be around usability and security, as those are the two areas which both deliver the most quality to users and have the most potential to go wrong.

How these challenges are overcome will come down to working with both the data providers and the general public, as healthcare is a product for everybody.

The bar for public-service expectations in the UK is so low that there is a real opportunity to build a service that smashes those expectations and sets a new precedent for quality in healthcare.

In 2019, it’s software developers and business testers who will see the most change to their roles. To enhance their test productivity, these teams will embrace smart testing tools powered by machine learning and artificial intelligence.

DevOps teams will embrace tools and technologies to boost productivity. This includes more investment in automation across the entire DevOps pipeline, from coding through production – together with cloud and SaaS services that include lab environments, service virtualisation, big-data management and more.

ML and AI tools can facilitate reliable and stable test automation – helping optimise test suites across the entire DevOps pipeline by identifying flaky, redundant and duplicate test cases. The ability to manipulate test data so that decision-makers can validate their software quality on demand will be crucial to teams’ success in the year ahead.
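As a rough, hypothetical illustration of the sort of analysis such tools automate, the Python sketch below flags flaky tests from historical pass/fail records and duplicate tests from identical coverage footprints. The data, thresholds and structure are assumptions for the example, not any vendor's actual algorithm.

```python
from collections import defaultdict

# Hypothetical CI history: test name -> pass/fail results recorded
# against the same code revision (True = pass). Real tools would mine
# this from pipeline logs.
history = {
    "test_login":    [True, True, True, True],
    "test_checkout": [True, False, True, False],  # intermittent -> flaky
    "test_search":   [True, True, True, True],
}

# Hypothetical coverage footprints: test name -> set of covered lines.
coverage = {
    "test_login":    {"auth.py:10", "auth.py:22"},
    "test_checkout": {"cart.py:5", "pay.py:9"},
    "test_search":   {"auth.py:10", "auth.py:22"},  # same as test_login
}

def flaky_tests(history, threshold=0.1):
    """Flag tests whose results vary on identical code: a failure rate
    strictly between threshold and 1 - threshold suggests flakiness."""
    flagged = []
    for name, runs in history.items():
        fail_rate = runs.count(False) / len(runs)
        if threshold < fail_rate < 1 - threshold:
            flagged.append(name)
    return flagged

def duplicate_tests(coverage):
    """Group tests whose coverage footprints are identical - candidates
    for removal as redundant duplicates."""
    groups = defaultdict(list)
    for name, lines in coverage.items():
        groups[frozenset(lines)].append(name)
    return [tests for tests in groups.values() if len(tests) > 1]

print("Flaky:", flaky_tests(history))            # ['test_checkout']
print("Duplicates:", duplicate_tests(coverage))  # [['test_login', 'test_search']]
```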

To keep pace with innovations like AR/VR and the rollout of 5G networks, DevOps teams will need to re-think their schedule, software delivery processes and existing architecture. Automation will be the key – and having the right tools, test environments and labs will be crucial.

Teams will need to spend more time developing test cases to cover innovative features such as 5G networks, AI/ML, AR/VR and IoT. By the end of 2019, all websites will need to comply with strict accessibility requirements, and to make this less painful and less disruptive to the overall pipeline, these tests will need to be automated as much as possible.

Enterprises are losing vital mainframe development and operations skills. Automating processes within the software development cycle helps to mitigate the effects of that lost knowledge. Take mainframe testing, for example. It’s essential in helping find bugs earlier in the development lifecycle. However, unit and functional testing in the mainframe environment have traditionally been manual and time-consuming for experienced developers, and prohibitively difficult for inexperienced developers – to the degree that they skip it altogether.

With automated testing, however, developers are able to automatically trigger tests, identify mainframe code quality trends, share test assets, create repeatable tests and enforce testing policies. When these processes are automated, developers can confidently make changes to existing code knowing they can test the changes incrementally and immediately fix any problems that arise.

Enterprises want to leverage existing investments in people, toolsets and technology to support mainframe system delivery. Central to this effort is constructing an open, cross-platform, mainframe-inclusive DevOps toolchain that empowers developers of all skill levels to perform and improve the processes necessary to fulfil each phase of the DevOps lifecycle. Application Programming Interfaces (APIs) make this open delivery architecture even more extensible by enabling users to leverage existing system delivery automation investments, reducing risk and time and increasing efficiency and velocity throughout the lifecycle.

2019 will see companies advance in their digital transformation efforts by empowering developers to be more product-focused and less project-focused, making development more mindful and directed. KPIs will become critical in keeping digital transformation efforts on track. As the brain drain continues within development ranks, automation will be key in helping developers who are new to the mainframe be successful, by making vital processes less time-consuming and error-prone.

Bug bounties, AI-driven development and Machine Learning will gain momentum in 2019 as the trend continues from past years. AI is the future and it will start making its way towards the mainstream in 2019.

As the use of AI and automation makes existing processes more efficient and scalable, the focus in 2019 will be on bringing AI into the mainstream software development process. We need software systems which can analyse patterns in the huge amounts of unorganised data being generated every day, and which can learn and make decisions from it. So, AI and automation are going to be adopted massively in mainstream software development in 2019.

With the growth of the IoT (Internet of Things) market, AI is becoming the household help. IoT is about connected devices, while AI is about connected intelligence. With the vast amount of data being generated by IoT devices, AI is the solution needed to analyse it and make meaningful decisions from it, which will help improve existing IoT devices in the future.

In 2019, AI is going to be heavily adopted in the cybersecurity industry. AI will play a very significant role in overcoming the human errors that lead to successful cyberattacks, as happened in the case of the Equifax data breach.

AI in cybersecurity will also help identify attacks efficiently by learning from the data already available, improving over time and accurately identifying future attempts by cybercriminals.
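As a toy illustration of that learn-from-past-data loop, the Python sketch below trains a classifier on labelled historical events and scores new ones. The features, numbers and the choice of scikit-learn are assumptions for the example, not a description of any production detection system.

```python
# Toy example of learning from labelled historical data, assuming
# scikit-learn is installed (pip install scikit-learn).
from sklearn.ensemble import RandomForestClassifier

# Hypothetical past events: [failed_logins, mb_transferred, off_hours]
X_history = [
    [0, 1, 0],     # benign
    [1, 2, 0],     # benign
    [0, 3, 1],     # benign
    [12, 500, 1],  # past attack: brute force plus large transfer
    [30, 250, 1],  # past attack
    [25, 900, 0],  # past attack
]
y_history = [0, 0, 0, 1, 1, 1]  # 0 = benign, 1 = attack

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_history, y_history)

# Score new, unseen events; retraining as fresh incidents are labelled
# is how the system improves with the data available.
new_events = [[2, 4, 0], [28, 700, 1]]
print(model.predict(new_events))        # expected: [0 1]
print(model.predict_proba(new_events))  # per-class probabilities
```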

In software development, the big story in 2019 will be machine learning and AI. In the coming year, the quality of software will be as much about what machine learning and AI can accomplish as anything else. In the past, delivery processes have been designed to be lean and to reduce or eliminate waste, but to me that’s an outdated, glass-half-empty way of viewing the process. This year, if we want to fully leverage these two technologies, we need to understand that the opposite of waste is value, and take the glass-half-full view that becoming more efficient means increasing value rather than reducing waste.

Once that viewpoint becomes ingrained in our MO, we’ll be able to set our sights on getting better through continuous improvement, being quicker to react and anticipating customers' needs. As we further integrate and take advantage of machine learning and AI, however, we’ll realise that improving value requires predictive analytics.

Predictive analytics allow simulations of the delivery pipeline based on the parameters and options available, so you don’t have to thrash the organisation to find the path to improvement. You’ll be able to improve virtually, learn lessons through simulations and, when ready, implement new releases that you can be confident will work.
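A minimal sketch of what such a virtual improvement loop could look like: a Monte Carlo simulation of a delivery pipeline in which each stage's duration and failure rate are parameters you can vary before changing anything real. The stage names and numbers below are invented for illustration.

```python
import random
import statistics

# Hypothetical pipeline stages: (name, mean_minutes, failure_probability).
# A failed stage is retried once, costing its duration again.
STAGES = [
    ("build", 10, 0.02),
    ("unit tests", 15, 0.05),
    ("integration tests", 40, 0.10),
    ("deploy", 5, 0.03),
]

def simulate_release(rng: random.Random) -> float:
    """Simulate one release through the pipeline; return total minutes."""
    total = 0.0
    for _name, mean, p_fail in STAGES:
        total += rng.expovariate(1 / mean)  # random stage duration
        if rng.random() < p_fail:           # failure triggers one retry
            total += rng.expovariate(1 / mean)
    return total

def simulate(n_runs: int = 10_000, seed: int = 1) -> None:
    rng = random.Random(seed)
    times = sorted(simulate_release(rng) for _ in range(n_runs))
    print(f"median release time: {statistics.median(times):.1f} min")
    print(f"95th percentile:     {times[int(0.95 * n_runs)]:.1f} min")

# Tweak STAGES (e.g. halve the integration-test time) and re-run to see
# the effect virtually, before committing to any real process change.
simulate()
```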

Progressive organisations, in 2019, will be proactive through simulation. If they can simulate improvements to the pipeline, they will continuously improve faster.

Cloud management and control is going to become vitally important. Before, the mantra was 'move to the cloud at all costs'.

In 2019, the 'at all costs' part will go away, and cloud waste management will become increasingly important to businesses.

I don't see a significant change in global cloud trends in 2019, but I think there will perhaps be an even greater focus on keeping data in-country and specifically 'out of the US', as the geopolitical environment over there, and globally, heats up even more.

Don’t expect AWS to go away, as it is still growing faster than all other cloud providers combined. It is good to have healthy competition, and having Azure, Google Cloud Platform and IBM become more viable is good for the industry. The growth of Azure, specifically, will bring more cloud success in Europe.

More organisations will migrate aggressively to the public cloud and build net-new applications there.

Whether workloads are in the public cloud or on-premises, IT teams will increasingly seek a platform that moves away from 'static' resource allocation to 'dynamic' resource allocation – which means automatic scaling, dynamic routing, functions-as-a-service and other 'serverless' technologies.
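To make 'dynamic' concrete, here is a minimal Python sketch of the proportional scaling rule that autoscaling platforms (such as a Kubernetes HorizontalPodAutoscaler) implement far more robustly on a team's behalf. The thresholds and the metrics/platform hooks are invented for illustration.

```python
import math
import time

# Illustrative parameters - real platforms expose similar knobs.
TARGET_CPU = 0.60   # desired average CPU utilisation per replica
MIN_REPLICAS = 2
MAX_REPLICAS = 20

def desired_replicas(current: int, avg_cpu: float) -> int:
    """Proportional scaling rule: grow or shrink capacity so observed
    utilisation converges on the target, clamped to sane bounds."""
    raw = math.ceil(current * avg_cpu / TARGET_CPU)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, raw))

def control_loop(get_avg_cpu, set_replica_count, current: int) -> None:
    """Poll a metric and adjust capacity - the 'dynamic' allocation that
    replaces static, hand-sized deployments. Both arguments are
    hypothetical hooks into a metrics service and the platform API."""
    while True:
        new = desired_replicas(current, get_avg_cpu())
        if new != current:
            set_replica_count(new)
            current = new
        time.sleep(30)  # scaling interval

# Under load, capacity scales out proportionally:
print(desired_replicas(4, 0.90))  # -> 6
print(desired_replicas(6, 0.30))  # -> 3, scales back in when idle
```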
