SD Times August 2019



AUGUST 2019 • VOL. 2, ISSUE 026 • $9.95 • www.sdtimes.com





Contents

VOLUME 2, ISSUE 26 • AUGUST 2019

FEATURES
Moving to the cloud — page 12
Azure targets innovation over domination — page 18
Low-code for mobile: Toys or tools? — page 30

NEWS
6   News Watch
8   AI Ethics: Early but formative days
20  Foldable technology will force developers to think outside the box
26  SKIL: A framework for humans, not machines
40  Sending encrypted data with sound

AGILE SHOWCASE
42  The new age of Agile: Evolving from teams to the entire business
45  Modernize mainframe apps through Agile practices
47  Becoming agile in the knowledge economy

COLUMNS
48  GUEST VIEW by Dan Saks: Solving the platform puzzle
49  ANALYST VIEW by Arnal Dayaratna: Toward a curated web: Why digital content needs standards
50  INDUSTRY WATCH by David Rubinstein: One platform, no waiting

LEADER PROFILES 2019
25  Tricentis: Automation @ DevOps Speed
29  Taking the LEAD in digital image integration
39  Follow the test pyramid to continuous testing

Software Development Times (ISSN 1528-1965) is published 12 times per year by D2 Emerge LLC, 80 Skyline Drive, Suite 303, Plainview, NY 11803. Periodicals postage paid at Plainview, NY, and additional offices. SD Times is a registered trademark of D2 Emerge LLC. All contents © 2019 D2 Emerge LLC. All rights reserved. The price of a one-year subscription is US$179 for subscribers in the U.S., $189 in Canada, $229 elsewhere. POSTMASTER: Send address changes to SD Times, 80 Skyline Drive, Suite 303, Plainview, NY 11803. SD Times subscriber services may be reached at subscriptions@d2emerge.com.




Instantly Search Terabytes

www.sdtimes.com EDITORIAL EDITOR-IN-CHIEF David Rubinstein drubinstein@d2emerge.com NEWS EDITOR Christina Cardoza ccardoza@d2emerge.com

dtSearch’s document filters support: popular file types; emails with multilevel attachments; a wide variety of databases; web data

SOCIAL MEDIA AND ONLINE EDITORS Jenna Sargent jsargent@d2emerge.com Jakub Lewkowicz jlewkowicz@d2emerge.com ART DIRECTOR Mara Leonardi mleonardi@d2emerge.com CONTRIBUTING WRITERS Alyson Behr, Jacqueline Emigh, Lisa Morgan, Jeffrey Schwartz

Search options include: efficient multithreaded search; easy multicolor hit highlighting; forensics options like credit card search

Developers: SDKs for Windows, Linux, macOS; cross-platform APIs for C++, Java and .NET, with .NET Standard / .NET Core


FAQs on faceted search, granular data classification, Azure and more

CONTRIBUTING ANALYSTS Cambashi, Enderle Group, Gartner, IDC, Ovum

ADVERTISING SALES PUBLISHER David Lyman 978-465-2351 dlyman@d2emerge.com SALES MANAGER Jon Sawyer jsawyer@d2emerge.com

CUSTOMER SERVICE SUBSCRIPTIONS subscriptions@d2emerge.com ADVERTISING TRAFFIC Mara Leonardi adtraffic@d2emerge.com LIST SERVICES Jourdan Pedone jpedone@d2emerge.com

Visit dtSearch.com for hundreds of reviews and case studies, and fully functional enterprise and developer evaluations

The Smart Choice for Text Retrieval® since 1991

dtSearch.com 1-800-IT-FINDS

REPRINTS reprints@d2emerge.com ACCOUNTING accounting@d2emerge.com

PRESIDENT & CEO David Lyman CHIEF OPERATING OFFICER David Rubinstein

D2 EMERGE LLC 80 Skyline Drive Suite 303 Plainview, NY 11803 www.d2emerge.com





6

SD Times

August 2019

www.sdtimes.com

NEWS WATCH

IBM: ‘Red Hat will still be Red Hat’

IBM officially closed on its whopping $34 billion acquisition of Red Hat, bringing together tools and expertise in their quest to become the world’s top hybrid cloud provider, while providing customers greater access to open-source technologies. But one thing is clear: Red Hat will remain Red Hat. That’s the sentiment expressed by Red Hat CEO Jim Whitehurst in a letter to Red Hat’s employees. He wrote: “Since we announced the acquisition, I’ve been having conversations with our customers, partners, open source community members and more Red Hatters than I can count… What struck me most from those conversations was the passion. It’s passion not just for a company, but for what we do and how we do it — the open source way. That’s not going to change.” Nor will Red Hat management, the open-source projects it manages and contributes to, or its commitment to community, he explained. What does remain to be seen is how IBM embraces this new culture of openness and community. At the Red Hat Summit in May, IBM CEO Ginni Rometty said in effect that IBM will be hands-off, leaving Red Hat management, brand and its offices in place.

Top unicorns herd around Python

Python continues its rise toward the top of the programming language charts. A newly released report revealed Python is the top language used by “unicorn” companies in the United States. Unicorns are companies valued at more than $1 billion. The Coding Dojo report is based on the top 25 U.S. unicorns, such as WeWork, Airbnb, Epic Games and Slack. The report revealed Python is used by 20 of the top 25, followed by Java at 19 unicorns, JavaScript at 16, Ruby at 12 and Go at 11. Coding Dojo explained it was surprising to see Go among the top five languages used by unicorns because it only lands at number 16 on the most recent TIOBE Index. It was also surprising to find Kotlin used by eight of the top unicorns. “Kotlin is not common overall; it has not appeared in any past Coding Dojo analysis of top programming languages and is #40 in TIOBE Index’s ranking of top programming languages overall,” the company wrote in a statement.

Npm improves JavaScript security

Npm announced the first major update to npm Enterprise, delivering new security, compliance and developer experience features for working with JavaScript. For security, administrators can now choose a maximum vulnerability level allowed for all in-house JavaScript projects, which will filter out any packages that don’t meet security requirements, the company explained in a post. Also new are vulnerability reports, which provide a detailed analysis of packages that have been acquired from the public registry, as well as improved single sign-on capabilities and user management improvements.
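The gating policy described above can be sketched in a few lines. The severity scale and the sample package data below are illustrative assumptions, not npm Enterprise’s actual interface:

```python
# Sketch of the gating policy: an administrator picks a maximum allowed
# vulnerability level, and packages whose worst known vulnerability
# exceeds that ceiling are filtered out.
SEVERITY_RANK = {"none": 0, "low": 1, "moderate": 2, "high": 3, "critical": 4}

def allowed_packages(packages, max_severity="moderate"):
    """Return only the packages at or below the allowed severity ceiling."""
    ceiling = SEVERITY_RANK[max_severity]
    return [
        pkg for pkg in packages
        if SEVERITY_RANK[pkg.get("worst_vulnerability", "none")] <= ceiling
    ]

catalog = [
    {"name": "left-pad", "worst_vulnerability": "none"},
    {"name": "event-stream", "worst_vulnerability": "critical"},
    {"name": "lodash", "worst_vulnerability": "moderate"},
]
print([p["name"] for p in allowed_packages(catalog)])  # filters out event-stream
```

In practice the registry would supply the vulnerability data; the point is simply that the check is a single comparison against an organization-wide ceiling.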

NativeScript 6.0 supports Vue.js

The latest version of the open-source framework for building native apps is now available. NativeScript 6.0 comes with support for Angular 8 and Vue.js, as well as the ability to shorten development and testing time. Support for Angular 8 includes support for the new rendering engine Ivy, while Vue.js support includes feature parity for new functionality between frameworks.

The release also comes with a number of improvements to experience, performance and stability. According to the team, developers can reuse 70 percent of application code across web, mobile and PWA development.

Microsoft releases quantum kit as open source

Microsoft is looking to open up the quantum ecosystem to experts, developers and researchers with the announcement that its Quantum Development Kit (QDK) is now open sourced. Microsoft hopes quantum computing will be used to help address global challenges such as clean energy solutions and resource-efficient food production. It is already working with Case Western Reserve University to advance MRI scanning for higher accuracy. The original QDK was released 18 months ago and includes the Q# quantum programming language and compiler. The QDK also includes development environment extensions for VS and VS Code, integration with the Jupyter platform, and resources such as simulators and resource estimators to help developers contribute.

MIT researchers create AI programming language, Gen

In an effort to democratize and advance the field of artificial intelligence, MIT researchers have announced a new programming language designed for computer vision, robotics, statistics and more. Gen aims to take away the complexity of equations or having to manually write code, and to enable researchers of all skill levels to create models and algorithms for AI. “One motivation of this work is to make automated AI more accessible to people with less expertise in computer science or math,” said Marco Cusumano-Towner, a PhD student in the department of electrical engineering and computer science at MIT.
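Gen itself is embedded in Julia, and its actual API is not shown here. As a language-agnostic illustration of the workflow such systems automate — write down a generative model, then let a generic inference routine estimate unknowns from data — here is a toy Python sketch that infers a coin’s bias by importance sampling:

```python
import random

# Toy model: a biased coin with a uniform prior over its bias.
def likelihood(bias, flips):
    """Probability of the observed flips given a candidate bias."""
    p = 1.0
    for heads in flips:
        p *= bias if heads else (1.0 - bias)
    return p

def estimate_bias(flips, n_samples=20000, seed=0):
    """Estimate the posterior mean of the bias via importance sampling."""
    rng = random.Random(seed)
    total_weight = 0.0
    weighted_sum = 0.0
    for _ in range(n_samples):
        bias = rng.random()               # draw from the uniform prior
        weight = likelihood(bias, flips)  # weight by fit to the data
        total_weight += weight
        weighted_sum += weight * bias
    return weighted_sum / total_weight

flips = [True] * 8 + [False] * 2  # 8 heads out of 10 flips
print(round(estimate_bias(flips), 2))  # analytic posterior mean is 0.75
```

The appeal of a system like Gen is that the inference half of this program — the sampling loop — is generated automatically from the model, rather than hand-written as above.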

Angular’s Bazel now in opt-in preview

Angular is continuing its efforts to improve developer productivity with the introduction of Bazel as an opt-in preview in Angular CLI 8. Bazel is an open-source tool that automates the building and testing of software. Google has been using Bazel for the last 12 years to build its massive applications such as Gmail. Offering Bazel as an option to developers in the CLI will enable them to use the same workflow but make their builds faster, the team explained. Features of the opt-in preview include production and development builds, hidden BUILD.bazel files for small projects, and unit testing with Karma, Jasmine and Protractor. The opt-in option comes as Google expands its CLI Builders API, which it released two months ago for the Angular CLI. According to the company, the CLI Builders API allows changes to the implementation of some of the built-in Angular CLI commands, such as ‘ng build,’ ‘ng serve’ and ‘ng test,’ and the creation of new ones through ‘@angular/bazel.’

Microsoft turns to Rust for safer code

Microsoft is starting to explore new programming languages to protect against security vulnerabilities. The company revealed it is turning to the systems programming language Rust to help developers build more reliable and efficient software. Microsoft has long turned to languages like C++ and C# in its security efforts. C# has helped protect against memory corruption vulnerabilities, while C++ offers a small memory and disk footprint, a mature language, and predictable execution. Microsoft is looking to gain the benefits of both C++ and C# in one language, and it believes Rust is the answer.

Equifax reaches settlement

Equifax will finally have to pay for its 2017 data breach, which compromised up to 147 million users and exposed sensitive information like credit card numbers, Social Security numbers, names, birthdays and addresses. The Federal Trade Commission (FTC) has revealed Equifax agreed to pay at least $575 million as part of a global settlement with the FTC, the Consumer Financial Protection Bureau (CFPB) and 50 U.S. states and territories. The company could pay up to $700 million as part of the settlement. Equifax will spend up to $425 million on those affected by the breach. This will include free credit monitoring or a $125 cash payment, reimbursement for time, cash payments up to $20,000, and free identity restoration services. In addition, starting next year, Equifax will provide U.S. consumers with six free credit reports a year for seven years.

People on the move

n The FreeBSD Foundation’s founder is making a return as president of the board of directors. Justin Gibbs will replace George Neville-Neil, who stepped down in May of this year. Gibbs is currently a software engineer at Facebook. He has been a member of the FreeBSD development team since 1993, has contributed a number of times to the FreeBSD operating system, and created the FreeBSD Foundation to provide a place for FreeBSD development and community support.

Working remotely is the new norm for developers

Developers have come to expect a flexible work schedule in their careers. In a new report, 86 percent of respondents revealed they currently work remotely in some capacity. Thirty-three percent work from home full time, while 28 percent split their time between working remotely and going into the office. The report also found working remotely can positively impact work-life balance, with 77 percent believing working remotely reduces the stress of commuting. Seventy-five percent stated working remotely gives them the ability to work wherever they want to live, and 70 percent cited the time it provides to run personal errands. Other reasons included the ability to balance work and passion projects, the ability to care for children and/or family members, and the ability to attend family events easily.

n The Agile delivery and enterprise IT company Panaya is getting a new CEO. The company’s board of directors has announced David Binny as chief executive officer. Binny will continue to lead Panaya’s strategic vision, helping companies become more agile, and will continue to shape the company’s solutions. Previously, Binny served as Panaya’s chief product officer, where he was responsible for the company’s product roadmap, technical vision and go-to-market strategies.

Source{d} enables oversight over the entire SDLC

Source{d} has launched source{d} Enterprise Edition (EE), which gives IT executives visibility into codebases, IT teams and processes, and offers the ability to add multiple data sources. Source{d} EE is tailored to provide analysis for large companies with complex SDLC data and engineering organizations, supporting distributed processing across multiple nodes and cross-joining of data sources, according to the company.

n Progress has announced it is promoting its chief marketing officer Loren Jarret to general manager of developer tooling. In her new role, Jarret will be responsible for the overall success of the business, and the continued delivery of developer tools. In addition, she will be in charge of sales, demand generation, engineering, product management, product marketing, customer success, support and delivery relations for Telerik, Kendo UI, DevCraft, Test Studio and Unite UX.





AI Ethics: Early but Formative Days As artificial intelligence and machine learning advance, organizations are grappling with how to put ethical controls in place so their systems don’t run amok

BY LISA MORGAN

AI use is growing rapidly. As with most “new” technologies, organizations are focusing on all the potential opportunities without giving equal weight to the potential risks. While it seems that just about everyone is talking about AI, less discussed, but growing in volume and frequency, is AI ethics, which questions whether categories of AI systems and individual implementations will advance human well-being. Rather than focusing narrowly on whether something can be done, AI ethics also considers whether something should be done.

There’s ‘some amount of unrest’ about using AI, said NYU’s Amy Webb.

It’s already apparent that some things shouldn’t have been done, as evidenced by the outrage over the Cambridge Analytica/Facebook debacle. When Google dissolved its ethics board just one week after its announcement, lots of people had opinions about what went wrong — establishing an ethics board in the first place, the people selected for the panel, whether Google’s attempt to establish the board was a genuine effort or a marketing ploy, etc. Meanwhile, organizations around the globe are discussing ethical AI design and what “trust” should mean in light of self-learning systems. However, to affect AI ethics, it must be implementable as code. “People in technology positions don’t have formal ethics training,” said

Briana Brownell, a data scientist and CEO of technology company PureStrategy. “Are we expecting software developers to have that ethical component in their role, or do we want to have a specific ethics role within the organization to handle that stuff? Right now, it’s very far from clear what will work.”

Where Google erred

Some organizations are setting up ethics boards that combine behavioral scientists, professional philosophers, psychologists, anthropologists, risk management professionals, futurists and other experts. Their collective charter is to help a company reduce the possibility of “unintended outcomes” that are incongruous with the company’s charter and societal norms. They carefully consider how the company’s AI systems could be used, misused and abused, and provide insights that technologists alone would probably lack.

In Google’s case, the central question at issue was whether the members of the ethics board could be considered ethical themselves. Of particular concern were drone company founder Dyan Gibbens and Heritage Foundation president Kay Coles James. Gibbens’ presence reignited concerns about the military’s use of Google technology. James was known for speaking out against LGBTQ rights.

“One of the core challenges when you assemble these kinds of panels is what ethics do you want to represent on the panel,” said Phillip Alvelda, CEO and founder of Brainworks, a former project manager at the U.S. Defense Advanced Research Projects Agency (DARPA) and former NASA rocket scientist. “My understanding is that despite Google’s well-intentioned efforts to engage in a leadership practice of thinking deeply and considering ethics and making that part of the process of developing products, they ended up caught in a fight about what were the proper ethical backgrounds of the people included on the panel.”

Law enforcement solutions company Axon was one of the first companies to set up an AI ethics board. Despite its genuine desire to affect something positive, 42 organizations signed an open letter to Axon’s ethics board underscoring the need for product-related ethics (ethical design), the need to consider downstream uses of the product (ethical use) and the need to involve various stakeholders, including those who would be most impacted by the technology (ethical outcomes). It’s not that AI ethics boards are a bad idea. It’s just that AI ethics isn’t a mature topic yet, so it is exhibiting early-market symptoms.

The whistleblowers are coming

Christopher Wylie, who worked at Cambridge Analytica’s parent company SCL Group, was one of several whistleblowers who disclosed the 2016 U.S. presidential campaign meddling on Facebook. Interestingly, he was also the person who came up with a plan to harvest Facebook data to create psychological and political profiles of individual members. Jack Poulson, a former Google research scientist, was one of five Google employees who resigned over Project Dragonfly, a censored mobile search engine Google was prototyping for the Chinese market. Project Maven caused even bigger ripples when 3,100 Google employees signed a letter protesting the U.S. Department of Defense’s use of Google drone footage. In that case, a dozen employees quit. “I would assume that there are some people who feel like speaking out means their job would be jeopardized. In other cases, people feel freer or motivated to speak out,” said Amy Webb, CEO of the Future Today Institute, foresight professor at New York






University’s Stern School of Business, and author of The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity. “I think the very fact we’ve now seen a lot of folks from the big tech companies taking out ads in papers, writing blog posts and speaking at conferences is a sign that there’s some amount of unrest.” Right now, whistleblowing about AI ethics is in a nascent state, which is not surprising given that most companies are either experimenting with it, piloting it or have relatively early-stage implementations. However, technologists, policymakers, professors, philosophers and others are helping to advance the AI ethics work being done by the Future of Life Institute, the World Economic Forum, and the Institute of Electrical and Electronics Engineers (IEEE). In fact, standards, frameworks, certification programs and various forms of educational materials are emerging rapidly.

What’s next?

Laws and regulations are coming, though not as quickly as the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), which are both designed for the algorithmic age. For now, software developers and their employers need to decide for themselves what they value and what they’re willing to do about it, because AI ethics compliance does not yet exist. “If you’re a company designing something, what are your norms, your ethical guidelines, and your operational procedures to make sure that what you develop upholds the ethics of the company?” said Brainworks’ Alvelda. “When you’re building something that has to embody some sort of ethics, what technologies, machineries and development practices do you apply to ensure what you build actually reflects what the company intends? [All of this] sits in a larger environment where [you have to ask] how well does it align with the ethics of the community and the world at large?”

“One of the core challenges when you assemble these kinds of [ethics] panels is what ethics do you want to represent on the panel?” —Phillip Alvelda, Brainworks

Designers and developers should learn about and endeavor to create ethical AI to avoid unethical outcomes. However, technologists cannot enable ethical AI alone, because it requires the support of entire ecosystems. Within a company, it’s about upholding values from the boardroom to the front lines. Outside an organization, investors and shareholders must make AI ethics a prerequisite to funding, and consumers must make AI ethics a brand issue. Until then, companies and individuals are free to do what they will unless and until the courts, legislators and regulators step in or a major adverse event happens (which is what many AI experts expect). Developers should be thinking about AI ethics now because it will become part of their reality in the not-too-distant future. Already, some of the tools they use, and perhaps some of the products and services they build, include self-learning elements. While developers aren’t solely responsible for ethical AI, accountability will arise when an aggrieved party demands to know who built the thing in the first place.






Moving to the cloud

Though the road can be fraught with hazards, navigating the right path to cloud computing can be transformational for enterprises in all walks

BY CHRISTINA CARDOZA

The winds are shifting in the industry, and enterprises are clear for takeoff to the cloud. It’s no longer a question of whether you should move to the cloud. Nowadays, you need a good reason not to be in the cloud, according to Ken Corless, a principal with Deloitte Consulting in its cloud practice. In a recent IDC worldwide public cloud services spending guide, the research firm predicts public cloud spending will grow from $229 billion in 2019 to about $500 billion in 2023. RightScale also found in its 2019 State of the Cloud survey that enterprises are prioritizing public cloud, with 31 percent of respondents calling it a top priority and companies planning to spend 24 percent more on public cloud this year compared to last year.

“Adoption of public cloud services continues to grow rapidly as enterprises, especially in professional services, telecommunications and retail, continue to shift from traditional application software to software-as-a-service and from traditional infrastructure to infrastructure-as-a-service to empower customer experience and operational-led digital transformation initiatives,” said Eileen Smith, program director of customer insights and analysis at IDC.

The RightScale survey also found that 94 percent of respondents are using the cloud, 91 percent are in the public cloud, and private cloud adoption is at 72 percent. Additionally, 63 percent of adopters are at intermediate to advanced stages in their cloud journeys. Sixteen percent are beginning their first projects and another 13 percent are planning their first projects. Only 8 percent of respondents revealed they had no plans to move to the cloud.

“Organizations see the value and understand the value of the cloud,” said Abby Kearns, executive director of the Cloud Foundry Foundation (CFF). “They are no longer moving to the cloud to check a box because that is what the boss wants. It is more of thinking about the cloud more holistically and around the larger transformation journey.”

The new driving factors behind the cloud As cloud technology has matured and the benefits have been realized, the




reasons for moving to the cloud and how to get to the cloud have changed. According to Corless, a couple of years ago cost would have been the number one driver to moving to the cloud, but today agility and responsiveness are some of the top drivers. For instance, end users expect companies to respond and deliver faster, and the cloud helps facilitate that agility. “What customers find is that they can just get things done more quickly and be more agile because they no longer have to worry about their own infrastructure anymore. Instead, they can focus on getting the applications, getting the analytics and other functionality they want up and running very quickly in cloud environments,” said Dominic Sartorio, SVP of products and development for

Protegrity, an enterprise and cloud data security software provider. Cost, however, will still be a big factor for enterprises moving to the cloud because it has the potential to dramatically reduce overall cost, according to Tej Redkar, chief product officer at LogicMonitor, a SaaS-based performance monitoring platform. “If you look at how much savings a cloud can add, especially when you look at the infrastructure, hardware, and software, you need a lot of resources to maintain and manage all those things, and all that is being taken care of when you are in the cloud. Ultimately, it leads to a cost savings, whether that’s labor costs, licensing costs, or general performance, which translate back into costs,” he said. Whether it is cost, agility or something else, one thing that is certain is moving to the cloud isn’t really about the cloud at all today, according to Deloitte’s Corless. Trends like DevOps, Agile, microservices, machine learning and automation are becoming imperative to the way an organization works and develops, and companies are using cloud as a catalyst to break traditional workflows and transform the business. “The cloud truly makes you a digital company,” added Redkar. “It helps you transform digitally pretty quickly and


helps with the global reach of your solutions.”

Starting the journey

Understanding what is driving your organization to move to the cloud is really going to spark your planning for takeoff, according to CFF’s Kearns. “Is it improving your ROI, improving velocity, decreasing costs, or deploying app workloads in a variety of different regions? Think about what you are trying to gain and then be clear with your teams about what those expectations are,” she explained.

According to LogicMonitor’s Redkar, there are three adoption phases in moving to the cloud: First, any new applications you are building should be built in the cloud. Then, you should move bespoke apps into the cloud and see how they run compared to those on-premises. The third phase is transforming those bespoke applications from an infrastructure service to more of a platform-as-a-service. “For complex enterprises, it is going to take time because there are thousands of apps within the enterprise, and to cover all of those, their dependencies, their data and their dependencies on the network is going to take time,” Redkar said.

Because of the number of services and solutions enterprises will have to move to the cloud, even the most aggressive adopters of the public cloud will continue to have a hybrid cloud strategy for years until they can completely move their footprint out to the cloud, according to Deloitte’s Corless. RackWare’s co-founder and chief architect Todd Matters suggests organizations start with a pilot. “Almost everyone ends up overspending in the cloud, so picking a relatively small pilot could help understand how to use the cloud and how billing works.” Corless also added that companies should start with






things that are less mission-critical because “migration is going to be bumpy and you would rather learn and make mistakes on things that aren’t going to be that detrimental to the business.” According to RightScale’s cloud survey, cloud cost optimization is the number one priority enterprises have this year. As cloud adoption begins to grow, enterprises are struggling with how to manage their cloud spending and optimize cloud strategies to reduce costs wherever possible, the report finds.

The report revealed cloud users underestimated the amount of wasted cloud spend, with respondents citing 27 percent of waste in 2019. Despite an increased interest in cloud cost management, only a few companies actually implement automated policies to address issues. Getting serious early on about deploying software like auto-scaling will help manage costs because it allows cloud adopters to dynamically increase and decrease workloads as needed, Matters explained. Monitoring software also can reduce costs by detecting

underutilized servers or over-provisioned servers that have too much CPU, memory or storage.
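The kind of check such monitoring software performs can be sketched simply. The thresholds, hostnames and metric samples below are illustrative assumptions, not any specific product’s defaults:

```python
# Sketch of a cost-oriented monitoring check: flag servers whose average
# CPU utilization suggests wasted capacity (downsize candidates) or
# insufficient capacity (scale-up candidates).
def classify_server(cpu_samples, low=0.15, high=0.85):
    """Classify a server from a list of CPU utilization samples (0.0-1.0)."""
    average = sum(cpu_samples) / len(cpu_samples)
    if average < low:
        return "underutilized"   # candidate for downsizing to cut spend
    if average > high:
        return "over-committed"  # candidate for scaling up or out
    return "ok"

fleet = {
    "build-agent-1": [0.05, 0.04, 0.08],  # mostly idle
    "api-server-2": [0.55, 0.61, 0.47],   # healthy load
    "batch-node-3": [0.97, 0.92, 0.99],   # saturated
}
for host, samples in sorted(fleet.items()):
    print(host, classify_server(samples))
```

Real monitoring tools would weigh memory and storage alongside CPU and look at longer windows, but the underlying cost logic is this same comparison against utilization thresholds.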

A multi-cloud, multi-platform approach

Most enterprises are also taking a multi-cloud approach to adoption. According to RightScale's cloud report, 84 percent of enterprises have a multi-cloud strategy. "You don't want to put all your sensitive information and applications into the hands of one cloud company," said RackWare CEO and co-founder Sash Sunkara. "It is important to be able to easily move from one cloud to another because the costs of one cloud can significantly change or the reliability can significantly change. We have heard of outages at Amazon, outages at Google. You want to make sure that you can easily move so that no matter what cloud goes away or what datacenter goes down, you are going to be fine and those applications and sensitive data are going to be protected," she said.

Multi-cloud will continue to be an important part of a cloud adoption plan because apps, services and business goals change over time, so cloud adopters need a strategy that lets them move to different clouds based on their current needs, Kearns added.

In addition to multi-cloud, enterprises are also adopting a multi-platform strategy. According to a recent report from the Cloud Foundry Foundation (CFF), 48 percent of respondents are embracing multi-platform with a combination of PaaS, containers and serverless. The report also finds containers are expected to become mainstream within the next year, with 34 percent of respondents using 100 or more containers. According to CFF's Kearns, container usage is rising because containers actually help enterprises along their cloud journeys. "Think of containers as the building blocks to organizations that want to take advantage of these cloud architectures. Containers are still at the very heart of that. They are what allows portability and a constant environment," Kearns said. Deloitte's Corless explained that containers also help ease fears of vendor lock-in, since containers and their workloads are comparatively easy to move.

A Cloud Smart Strategy

The federal government is also realizing the importance of the cloud. Last year, the Office of Management and Budget announced a Cloud Smart Strategy proposal to help agencies migrate to the cloud safely and securely. At the end of June, the White House Office of Management and Budget published the final version of the plan. "To keep up with the country's current pace of innovation, President Trump has placed a significant emphasis on modernizing the federal government," stated Suzette Kent, OMB federal chief information officer. "By updating an outdated policy, Cloud Smart embraces best practices from both the federal government and the private sector, ensuring agencies have the capability to leverage leading solutions to better serve agency mission, drive improved citizen services and increase cyber security."

The strategy focuses on three main areas of cloud adoption: security, procurement and workforce. While it was developed for the federal government, there are a number of things enterprises can take from the strategy for their own migration and adoption initiatives, according to Sash Sunkara, CEO and co-founder of RackWare, a hybrid cloud management platform provider. "It is really giving people the push to say let's move forward on this. This is a technology we have to adopt as a better organization and it gives us guidelines to help us be more secure, modernize and mature," said Sunkara.

Whether an enterprise or a federal agency is moving to the cloud, security is always a top concern. The Cloud Smart Strategy details a risk-based approach as well as a multi-layer defense strategy for securing data in the cloud. It also suggests continuously monitoring for malicious activity. Other security recommendations in the strategy include trusted Internet connections and practicing continuous data protection and awareness: regularly engaging in information sharing, using tools with analytical capabilities and maintaining continuous visibility into solutions.

The workforce section of the strategy covers the need to reevaluate and reskill existing employees for a cloud-based future, as well as how to identify skill gaps and recruit new team members to fill them. This applies to all enterprises, because the cloud forces development teams to focus more on usability and availability than on maintaining datacenters, according to Sunkara.

Lastly, the procurement section aims to help agencies improve their ability to purchase cloud solutions, with advice and common practices to follow such as category management, service-level agreements, and security requirements for contracts. Category management is designed to help agencies buy smarter. It aims to "deliver savings, value and efficiency," "eliminate unnecessary contract redundancies," and meet business goals, the Cloud Smart Strategy stated.

"To be Cloud Smart, agencies must consider how to use their current resources to maximize value: reskilling and retraining staff, enhancing security postures, and using best practices and shared knowledge in acquisitions. Cloud Smart is about equipping agencies with the tools and knowledge they need to make these decisions for themselves, rather than a one-size-fits-all approach," the strategy stated.

Capital One's journey to the cloud

When discussing roadmaps and strategies for cloud migration, Capital One is often cited as a prime example of a large-scale company that successfully moved to the public cloud. "Capital One's CEO will tell you they are a technology company that does banking," said Ken Corless, a principal with Deloitte Consulting in its cloud practice. "Not only are they going for it, but they are rapidly investing in, moving and re-architecting applications to be cloud-capable to cloud-friendly to cloud-native, to be able to take advantage of all the capabilities [cloud] has to offer."

Capital One's cloud-first strategy started in 2015, when it announced all new applications would be built and run in the cloud, and all existing applications would be migrated there. "The basic question was whether it made more sense to devote resources to building and operating our own cloud infrastructure or to move to the public cloud so that we could focus on building and releasing new features and products our customers want. Given the level of customer obsession at Capital One, that question pretty much answered itself," said George Brady, executive vice president and chief technology officer at Capital One, in an AWS case study.

Brady explained Capital One took a different approach to cloud adoption and decided to tackle the hard problems first instead of starting with the easiest ones. This enabled the company to show stakeholders that the journey to the cloud was possible and would provide value. The company also put risk managers, cloud engineers and cloud management tools in place to support the adoption. In addition, it developed an open-source compliance-enforcement engine called Cloud Custodian, which helped detect whether it was violating any policies.

"In order for all stakeholders to feel comfortable with a big change like this one, it really helps to have a long-term view of where you're trying to go, and why," Brady said in the case study. "Right at the beginning, we developed a five-year road map that aligned our use of the AWS Cloud with the company's long-term business strategy. Being able to point to the value we were going to be able to get out of the cloud was key to getting company leaders on board and turning them into cloud champions themselves. Our most effective arguments were around how the cloud supports faster innovation, saving and finding value in more data, faster recovery from failures, and shifting resources from operations to higher value work."

According to Capital One's vice president of cloud strategy Bernard Golden, enterprises should look at Capital One's journey as they begin or continue their own adoption and participate in the cloud-native world. "A number of years ago it looked at banking and concluded it was ripe for disruption. After all, banking is inherently a digital business, so it makes sense that using the most advanced IT practices could give the company a competitive advantage," Golden wrote in a blog post. "So it adopted agile. And DevOps. And decided to go all-in on cloud computing. Along with that, Capital One recognized it needs cloud-native talent, so it recruited heavily to bring new skills in-house. When I look around my office, it feels like a cutting-edge software company."

The road can be bumpy

Even though there is a lot of interest in the cloud, and a lot of enterprises claim to be executing a cloud strategy, there are still a number of things that can go wrong. One of the first steps in moving to the cloud is moving business applications, such as the third-party apps enterprises traditionally have hosted themselves, like Office 365, SharePoint, Slack, Box and Microsoft Exchange, according to Deloitte's Corless. So while it may seem like most enterprises have moved to the cloud, many have only moved to these types of SaaS solutions, which tend to be less disruptive and invasive cloud adoption strategies, Corless explained. This is just a stepping stone to broader cloud adoption.

The hard part comes when you take that to the next level and start moving bespoke applications, the ones enterprises have built themselves, such as consumer-facing apps, to the cloud, according to LogicMonitor's Redkar. "The complexity of those existing workloads becomes a challenge because you can't just lift and shift an application to the cloud," he explained. "You need to tweak it for the cloud, and that tweaking takes time and comes with its own complexities."

Enterprises have to evaluate which workloads can be moved to the cloud and which should remain on-premises. They also need to consider the data itself. According to Redkar, data enterprises have hosted themselves for years, whether in a mainframe or a data warehouse, is going to be extremely difficult to move to the cloud. That is why it is important to hire the right people with the right skills to help facilitate a cloud adoption. "Once you have the right people with the right skill sets in place, then they will make the right decisions on how to run and move apps to the cloud," said Redkar.

However, that can be a challenge in itself, because a culture change also needs to take place to successfully move to the cloud, according to CFF's Kearns. "The technology part is the easy part. Changing your culture, changing your business, changing the way your business works and communicates is the

larger challenge. It isn't just a matter of putting some apps in the cloud, it is really a matter of building that velocity in your organization," she said.

Developers, operators, line-of-business staff, product owners and security experts all need to be put on a single team with a single goal in mind, around not just application development but iteration and ideation, according to Kearns. "The hardest thing to do is build those cross-functional teams and getting that collaboration when you need it," she said.

A cloud culture is one that thrives on change and innovation, according to David Linthicum, managing director and chief cloud strategy officer for Deloitte Consulting. And to make sure the right skills are in place, enterprises should do a skills-gap assessment: decide what technology they are going to be using and what skills they have on staff. This will help assess whether an organization needs to hire or train new employees, Linthicum explained.

While cost is a driving factor for companies moving to the cloud, the cloud is not always cheaper than running on-premises, according to Kearns. Having the right skills within the organization can help teams evaluate which workloads go where, and how to continuously re-evaluate those efforts, she explained.

Ensuring a smooth transition

One major hurdle that cloud adopters continue to deal with is security. There is an ongoing debate in the industry over whether it is more secure to stay on-premises than to move to the cloud. Deloitte's Corless argued there is risk either way: "I believe good world-class security is easier in the public cloud than it is on-premises. I also believe big gaping security issues are also easier on the public cloud."

According to LogicMonitor's Redkar, the cloud itself isn't less secure; the biggest security issue is that the people managing cloud environments are not aware of their exposure in the cloud. Symantec's 2019 Cloud Security Threat Report revealed that while organizations fear moving to the cloud could result in a larger risk of data breaches, it is actually immature security practices that create that risk. "Data breaches can have a clear impact on enterprises' bottom line, and security teams are desperate to prevent them. However, our 2019 CSTR shows it's not the underlying cloud technology that has exacerbated the data breach problem — it's the immature security practices, overtaxed IT staff and risky end-user behavior surrounding cloud adoption," said Nico Popp, senior vice president, cloud and information protection for Symantec.

While tools can help organizations design and implement policies as well as apply protection methods like encryption, tokenization and anonymization, security is actually more about people and processes than about technology, according to Redkar. "Because most of the data used to reside in the enterprise datacenter, there were not many ways to get in. With the cloud, there are different ways to get in, different credentials which if they leak can lead to different levels of security hacks, so making sure developers and engineers understand and embrace the secure lifecycle of the infrastructure is important," he said.
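Tokenization, one of the protection methods mentioned above, is simple to illustrate. The following is a toy sketch of a token vault, not any vendor's product; the class and token format are invented for illustration. The core idea is that the sensitive value never leaves the vault, and downstream systems only ever see an opaque token:

```python
import secrets

class TokenVault:
    """Toy tokenization vault: swaps sensitive values for random tokens.

    Real products add encrypted storage, access control and audit logs;
    this sketch only shows the core idea of keeping the mapping server-side.
    """

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal values stay joinable in analytics.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the original.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
assert t != "4111-1111-1111-1111"                   # token reveals nothing
assert vault.detokenize(t) == "4111-1111-1111-1111" # vault can reverse it
assert vault.tokenize("4111-1111-1111-1111") == t   # stable per value
```

The token is random rather than derived from the value, which is what distinguishes tokenization from encryption: there is no key to leak, only the vault itself to protect.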



According to Protegrity's Sartorio, cloud vendors publish a shared responsibility model, which explains which layers of the stack the cloud vendor will cover, such as the physical protection of its infrastructure, hardware or network. "What they don't cover, and they are very open about this, is their customers' own application workload and the customers' own data. They say you as the customer are responsible for making sure whatever data and whatever applications you are standing up in these cloud environments are protected."

The reason cloud vendors don't protect customer data is that they have no way of knowing what data is coming in and out of their environments, or even of telling whether it is customer data. Further, they don't know how sensitive it is or what regulatory region it is in, Sartorio explained. In order to properly protect data, organizations need to have a set of policies in place, be able to control who sees the data, and take stock of the data that is already there. In addition, implementing a discovery tool can detect if there is any sensitive customer data out in the open that shouldn't be. "The biggest problem is ignorance. You hear about all these embarrassing breaches all the time where someone put something in Amazon S3 that is just out there for people to see because they were ignorant about cloud security," said Sartorio.

If you are not confident about the processes you have in place or your organization's ability to control data in the cloud, you should stay on-premises for now and figure out how to make it work, according to LogicMonitor's Redkar. "Eventually, you are going to need the cloud to help you expand your business more rapidly than your datacenter can, unless there is a regulatory need to stay on-premises," he said. "If you want to compete in the marketplace then you need to adopt the cloud."

Three questions to ask to ensure hybrid cloud compliance
BY JENNA SARGENT

As we continue to exist in a world where data breaches are commonplace, organizations have to ask more questions about how their data is stored. This is especially true when data is being stored with a third party. Companies need to comply not only with their own internal rules, but with external regulations such as the GDPR and, soon, the California Consumer Privacy Act. According to Todd Matters, chief architect and co-founder of cloud company RackWare, there are three questions that organizations should be asking cloud vendors.

Physical regulations
The first important thing to question is what physical regulations apply, Matters explained. It is important to ensure that your cloud provider has appropriate physical safeguards in place, such as controlled physical access to, and surveillance around, the data center. He believes that running data centers has become its own science. In the past, those running data centers just needed to worry about having an air conditioner to keep the facility under a certain temperature. Now, people have to ask questions like: Is the data center in a flood plain or an earthquake zone? The answers to those questions will inform aspects of their disaster recovery plan. "Getting all of that information from your provider and understanding that, at least from a physical perspective, is very important," said Matters.

Infrastructure management
The second important question to ask is how infrastructure is managed. In a data center, personnel need access to certain servers, systems and routers, so it is important to know how those people are monitored and how they access those systems. In addition, there are a number of standards today that didn't exist in the past, such as password standards and two-factor authentication requirements. It is important to understand the compliance requirements for those elements as well.

How cloud providers implement multi-tenancy
The third question to ask is how the cloud provider implements multi-tenancy. A cloud provider offers infrastructure to many different companies, so it is important to ask how it handles that. "It's definitely a worthwhile question to ask your provider. 'How do you handle multi-tenancy? What safeguards do you have in place? How can you ensure that however many companies, it could be dozens or hundreds that are running on the same physical infrastructure, are you guaranteeing that they can't access any of my data?'"

According to Matters, compliance needs to be a joint effort between companies and cloud providers. "Regulation requirements must clearly span provider infrastructure and personnel, as well as IT aspects relegated to cloud users. Most providers have material readily available that describes their compliance strategies. This allows for the simple implementation of basic controls." He also believes that, for the most part, companies are not asking these questions unless they've received an audit failure or citations that could compromise the business.
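The discovery tooling Sartorio describes, scanning for sensitive data sitting in the open, can be sketched as a simple pattern pass. This is a toy illustration, not any vendor's detector; the two patterns are deliberately simplified:

```python
import re

# Toy discovery scan: flag strings that look like credit card or SSN data
# before they land in a public bucket. Real tools use validated detectors
# (checksums, context, classifiers); these patterns are for illustration only.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def find_sensitive(text: str) -> list:
    """Return sorted labels of any sensitive-looking patterns found in text."""
    return sorted(label for label, pat in PATTERNS.items() if pat.search(text))

assert find_sensitive("card 4111-1111-1111-1111 on file") == ["credit_card"]
assert find_sensitive("ssn 123-45-6789") == ["ssn"]
assert find_sensitive("nothing to see here") == []
```

Even a crude pass like this, run against objects before they are made public, addresses the "ignorance" failure mode Sartorio calls out: the breach where nobody knew the sensitive data was there at all.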

Azure targets innovation over domination
BY JAKUB LEWKOWICZ

The Azure development team at Microsoft is working to provide new features to win over more developers, through a heavy investment in tools for migrating legacy platforms to its cloud, and by pushing machine learning, quantum computing and other innovative features. Microsoft hopes its investments will pay off in growing its cloud market share, which currently sits at 16 percent, second only to AWS at 32 percent of the public cloud market, according to the most recent findings available from Canalys. A big reason major companies such as Coca-Cola and Walmart use the Azure platform: They were already Microsoft customers, using its other tools and platforms.

Microsoft recently released better-than-expected earnings, and although Azure's growth rate this year dipped to 64 percent after hovering in the mid-70s during the last three years, the company says it saw more large and long-term Azure contracts in the quarter. Its revenue growth rate is well above the overall market's, so it is gradually gaining market share despite the slowdown.

Over the years, Microsoft has encouraged users of its on-premises software offerings to upgrade to their cloud equivalents, and has introduced a pay-as-you-go plan that enables companies to pay only for the services they use, whenever they use them. It now has data centers in 54 regions around the world, more than any other cloud provider.

"So much around Azure has been around innovating, new capabilities and AI, which are incredibly important, but as cloud has become much more mainstream, and we see mission-critical systems now moving to the cloud, it's really focused in on this migration opportunity and that's where partners have been in the most recent years asking us about," said Julia White, corporate vice president of Microsoft Azure, at this year's Inspire event in July.

How Azure is solving cloud challenges

Because data management is more complicated and more critical to the way businesses run, migration has had to become more sophisticated to address those challenges, according to Ed Anderson, research VP and distinguished analyst at Gartner. To address those concerns, Microsoft last month released Azure Migrate, which includes a new integrated partner experience along with Server Assessment, Server Migration, Database Assessment and Database Migration capabilities. The company said it will make the cloud journey easier for customers because it acts as a central hub, gives users access to Microsoft and ISV tools, and helps tailor those tools to a user's migration scenario.

Microsoft also introduced Azure Lighthouse, a set of capabilities intended to help partners manage environments on behalf of their customers in a scalable way, according to White.

Meanwhile, last month the company ended support for its older on-premises SQL Server 2008/2008 R2 versions, which still have a large customer base, and it is ending all support for Windows Server 2008/2008 R2 at the start of next year. That added pressure leaves companies with the choice of upgrading to newer versions of the servers or rehosting their 2008 and 2008 R2 workloads on Azure for three more years of security updates. "Part of this move is to try to get people to move into that cloud environment and the cloud operating model," said Anderson. "Fundamentally, what Microsoft is trying to do is get people on this path of embracing this continual stream of innovations that comes with the cloud model."

Other Azure capabilities that recently made headlines include the release of Azure Data Box Heavy, part of the Data Box family of offline transfer SSD products, which also includes Data Box (100 TB) and Data Box Disk (up to 40 TB), that Microsoft says will "enable customers to easily and securely move large volumes of data into Azure when network isn't an option." In addition, Microsoft introduced new features for Azure Kubernetes Service (AKS) and a Hyperscale (Citus) option in Azure Database for PostgreSQL to scale out compute, storage and memory resources as needed.
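The Hyperscale (Citus) option scales PostgreSQL out by spreading a table's rows across worker nodes on a chosen distribution column. A minimal sketch of the statements involved follows; `create_distributed_table` is the Citus function for this, while the table and column names here are invented for illustration:

```python
# Sketch of converting a local Postgres table into a Citus-distributed one.
# In practice these statements would be sent over a normal Postgres
# connection (e.g. psycopg2) to a Hyperscale (Citus) server group.

def distribute_table_sql(table: str, dist_column: str) -> str:
    """Build the statement that shards a table across worker nodes."""
    return f"SELECT create_distributed_table('{table}', '{dist_column}');"

statements = [
    # Hypothetical multi-tenant table; tenant_id keeps each tenant's rows
    # co-located on one worker, so per-tenant queries stay single-node.
    "CREATE TABLE events (tenant_id bigint, payload jsonb);",
    distribute_table_sql("events", "tenant_id"),
]

assert statements[1] == "SELECT create_distributed_table('events', 'tenant_id');"
```

Choosing the distribution column is the key design decision: queries that filter on it can be routed to a single shard, while everything else fans out across the cluster.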

Building a compelling platform

However, while market share is indeed a driving factor to compete, it isn't the only thing on Azure's agenda. "If you look at the competitive battle today, of course Microsoft wants to gain share but I think more fundamentally they want to become the dominant platform for more people to build, host and deliver next generation-type applications," Anderson said. "It's that the market will more and more move to these advanced services like artificial intelligence and advanced analytics and capabilities like that."

Microsoft invests in emerging technology

At Inspire, Microsoft announced the release of the Azure Kinect DK, a developer kit with advanced AI sensors for sophisticated computer vision and speech models. Azure Kinect DK offers multiple modes, options and software development kits (SDKs) that Microsoft explained will enable developers and partners to build solutions that understand the world, including the people, places and things around it. The Azure Kinect DK is intended as a next-gen entry in the Kinect line, with superior technology and functionality to the Kinect that was used on the Xbox 360 and Xbox One and disappeared from stores two years ago. The new generation is targeted at developers, according to Microsoft.

Last December, the company launched Azure Machine Learning Services as well as an integration with Power BI, which brings ML to data analysts. According to Tzvi Keisar, senior program manager at Microsoft Azure, the integration helps data scientists who want to automate their ML workflow, and helps a wider audience of business users without advanced data science and coding knowledge to implement AI. Microsoft also launched Azure Databricks, an Apache Spark-based big-data service with Azure Machine Learning integration.

"I don't think Microsoft starts looking at things because the attention gets on it, but they have a big research division and I think what happens is when the market becomes ripe for something, then they deploy it. Sometimes it's fully baked and sometimes it's a little rough around the edges," said Patrick Hynds, former Microsoft regional director and MVP and current CEO of DTS. "I think Microsoft kind of went too fast into platform-as-a-service, but now I think the world is starting to get ready for it. AI and machine learning is something that Microsoft was really pushing in the market."

Hynds said one of the other places where Microsoft is being really forward-thinking is quantum computing. Last month, Microsoft open-sourced its quantum development kit (QDK) to experts, developers and researchers, hoping the quantum ecosystem will help tackle global challenges such as creating clean energy solutions and resource-efficient food production. Microsoft is working on making


Azure the go-to platform for quantum processing and deployment, according to the company. Developers will be able to access quantum processing alongside classical processing in Azure. “Based on what I’m seeing, quantum is really going to be as big a shift as AI is now and big data was. It will be a lot harder to find people who know anything about it,” Hynds said. “Microsoft is already making it easier with the Q# language that you can use to program against their simulators. Amazon seems to be completely absent from that space.”
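Q# programs are typically run first against the simulators Hynds mentions, which just do linear algebra on a state vector. The underlying math is small enough to sketch in plain Python (this is an illustration of what a simulator computes, not Q# itself):

```python
import math

# A single qubit as a 2-entry amplitude vector; |0> = [1, 0].
ket0 = [1.0, 0.0]

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    h = 1 / math.sqrt(2)
    return [h * (a + b), h * (a - b)]

state = hadamard(ket0)

# Measurement probabilities are the squared amplitudes: H|0> gives 50/50.
probs = [amp ** 2 for amp in state]
assert all(abs(p - 0.5) < 1e-9 for p in probs)
```

Real simulators generalize this to n qubits (a vector of 2^n complex amplitudes), which is why classical simulation runs out of memory quickly and actual quantum hardware becomes interesting.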

Microsoft eases cloud fears

While a majority of companies see the benefits of moving to the cloud, some are skeptical of getting rid of all of their on-premises infrastructure, for fear of outages and losing control. "The pace is quickening, so you can't as a developer settle in and say 'OK, I'm going to do the thing I did three years ago,' well, that offering might not be there anymore. It's great because we're getting all the new stuff, but it's going to be challenging for the people who have to maintain old stuff," Hynds said.

To address this, Microsoft offers Azure Stack for organizations that want to maintain a hybrid cloud environment, one that mixes on-premises private cloud with third-party public cloud services and orchestrates between the two. "Azure Stack is sort of an embodiment of the notion that I can take Azure services and I can deploy those in places where I care about, and in some cases it'll be about location specificity," Anderson said. "There's that consistency where you have the service fidelity and all those things that make cloud great, but you'll get a distributed and location-specific aspect that becomes part of those cloud environments as well. That's what the future really holds."

With the prioritization of expanding and fine-tuning security and cloud capabilities, Azure continues to grow.


Foldable technology will force developers to think outside the box

Foldable technology is no longer a proof of concept. The first foldable devices are about to hit the market.

BY CHRISTINA CARDOZA

Huawei's foldable device was first announced at Mobile World Congress in Barcelona this year, and the device is now entering its final testing phase. The company is expected to release it this September. While no official release date has been announced, Samsung is gearing up for the release of the Galaxy Fold, and rumors of other foldable devices are coming from Xiaomi, Motorola and LG. Google is also preparing for a foldable future with support for foldable devices and folding patterns in its upcoming release of Android Q.

With the top device manufacturers looking to make the switch to foldable technology, this is one trend developers won't be able to ignore, at least not for long, according to Eran Kinsbruner, chief evangelist for the web and mobile app testing solution provider Perfecto. "This is definitely something to look out for, and it is going to be a reality. You will see this hybrid of... smartphones and tablets becoming the new innovative mobile phone for the future," said Kinsbruner. "If it was just a minor vendor starting to run with this technology, it might not have been that critical, but when you see almost all the Android device manufacturers joining this trend and developing for this, it is just a matter of time before it reaches mainstream. It won't be too long until Apple joins."

One of the benefits of moving to a foldable future is the ability to provide a better user experience, according to Kinsbruner. For instance, up until now a user would have to switch between apps on a very small screen. With a foldable device, they can unfold their phone to get 8-plus inches of screen, gaining more workspace and the ability to work with several apps at the same time, Kinsbruner explained.

However, this new user experience will be a challenge for developers. Since the space is still very new, developers are now tasked with experimenting and creating new solutions for these foldable devices. The manufacturers themselves have struggled just to build the devices. Earlier this year, Samsung delayed the release of its Galaxy Fold smartphone just days before its launch. The phone was anticipated to be re-released this month, but new reports reveal the device is still not ready. Samsung's mobile chief, DJ Koh, has stated: "It was embarrassing. I pushed it through before it was ready. I do admit I missed something on the foldable phone, but we are in the process of recovery."

Dave McAllister, community manager and evangelist for log monitoring tool provider Scalyr, explained that the benefits of foldable technology may not be right for the mobile space. The driving force behind foldable technology is competitive advantage: device manufacturers want to offer something new that hasn't been done before. The problem, though, is that while users want to see innovation, they also want things to be simple, he explained.

"Imagine someone who is used to their apps and functions on their phone being in a certain location and a certain place. All of a sudden all of that moves and they have to now figure out and remember where everything is depending on if the phone is folded or not. It is a challenge to meet the accessibility standards," said McAllister. "As a developer, we are going to have to think about how we view software development differently in a malleable environment. I do believe the time has come and there is an interesting place for this technology, I just don't think phones are the right place to start."

Rob Enderle, principal analyst at the Enderle Group, does see the potential for this technology in a smartphone, but he thinks some firms are going about it the wrong way. "Instead of a large phone morphing into a tablet, they should be focused on a small phone morphing into a large one. The historical large markets that sustained were foldable phones (StarTac) and



smartphones. Tablets didn’t hold, so they should pick the two popular configurations, not try to turn the smartphone into something (a small tablet) that people don’t seem that excited about now,” Enderle wrote in an email to SD Times.

As manufacturers still try to bring these devices to market, developers now have to think about how their applications will work not only on the device, but in tandem with other applications. On these new displays, one application can be running at the same time as another, and both of those applications could be utilizing the same phone sensors. From a resource planning perspective, developers need to reconsider how their applications use specific sensors and what it will mean for the experience if an app is competing with another for those resources, according to Perfecto’s Kinsbruner.

“The foldable smartphones are not really coming with an extremely significant change to the hardware from a memory and storage consumption,” he said. “The change comes from the mobile application development perspective, having to balance multiple apps running in parallel and in the foreground, consuming network, battery and other resources from the device. This is something that might require a heavy lift from the app developers to optimize the application.”

Additionally, the mobile space and operating systems are already very fragmented, so developers will have to worry not only about creating new foldable versions of their applications, but about getting them to work across different generations of operating systems, Kinsbruner added.

Foldable technology is in its early phases, and there is only a small set of resources available for developers to experiment with and learn about the devices. Samsung provides an app emulator for Android devices and Android Studio that can mimic the fold and unfold activity. Google also provides a foldable emulator in its Android Studio, along with documentation on getting started building apps for foldables. Some of the areas it covers include making apps resizable, new screen ratios and dealing with multiple windows.

“When folded, foldables look like phones, fitting in your pocket or purse. When unfolded, their defining feature is what we call screen continuity. For example, start a video with the folded smaller screen — and later you can sit down and unfold the device to get a larger tablet-sized screen for a beautiful, immersive experience. As you unfold, the app seamlessly transfers to the bigger screen without missing a beat,” Stephanie Cuthbertson, director of product management, wrote in a blog post.

In addition to emulators, Kinsbruner highly recommends getting early access to real devices so developers can start playing around with the technology and see how it looks and works with their apps in folded and unfolded modes, as well as how it works with other operating systems and in parallel with other apps. “Developers will need to train themselves and start exploring with minimal knowledge out there so they are ready and prepared for the time these devices come to the market,” said Kinsbruner.

Another thing to consider will be how to fit testing of the device into existing schedules and pipelines. “This is going to be a disruption for developers in how they incorporate the support for this new platform and app while not risking other commitments and ongoing roadmaps for other features,” said Kinsbruner.

As time passes, Kinsbruner hopes to see the field open up and more tools become available for developers. He says, however, “it will take time because these devices take significant research and development to really simulate every action.” z
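Cuthbertson’s “screen continuity” and the resizable-app guidance come down to one discipline: treat a fold or unfold as a window resize, and keep session state separate from the layout choice. A minimal, framework-free sketch of that idea (the 600dp breakpoint and all names here are illustrative assumptions, not Android SDK constants):

```java
// Sketch of the "screen continuity" idea: when a fold/unfold resizes the
// window, only the layout mode should change; session state (here, a
// playback position) must be carried over untouched.

public class FoldableSketch {

    /** Playback state that must survive the fold/unfold transition. */
    static final class PlaybackSession {
        final String videoId;
        final long positionMs;
        PlaybackSession(String videoId, long positionMs) {
            this.videoId = videoId;
            this.positionMs = positionMs;
        }
    }

    /**
     * Choose a layout for the current window width in dp. The 600dp
     * breakpoint is an illustrative tablet-style threshold only.
     */
    static String layoutFor(int widthDp) {
        return widthDp >= 600 ? "expanded-two-pane" : "compact-single-pane";
    }

    public static void main(String[] args) {
        PlaybackSession session = new PlaybackSession("intro-video", 42_000L);
        String folded = layoutFor(360);    // phone-sized (folded) window
        String unfolded = layoutFor(840);  // tablet-sized (unfolded) window
        // The layout changed; the session did not:
        System.out.println(folded + " -> " + unfolded
                + " at " + session.positionMs + " ms");
        // prints: compact-single-pane -> expanded-two-pane at 42000 ms
    }
}
```

In a real Android app the same separation would run through saved instance state and resource qualifiers; the point is only that state and layout must be decoupled before the first fold event arrives.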



Tricentis: Automation @ DevOps Speed

Remember the old adage: “Speed, Cost, Quality — pick any two”? This tradeoff is no longer an option. With digital transformation initiatives driving DevTest teams to scale Agile and adopt DevOps, all three vectors must be addressed. This requires a fundamental shift in how you approach testing. It’s not a matter of acquiring more or better tools; holistic process transformation is required.

That’s why Tricentis defined “continuous testing” and developed the industry-leading continuous testing platform. Organizations turn to Tricentis to make a fundamental change in their testing process while lowering business risk and accelerating software delivery. Tricentis’ intelligent model-based approach ensures the reliability, flexibility and actionable insights companies need to accelerate and automate the most demanding delivery cycles.

“Continuous testing ensures that the right stakeholders have access to the right information at the right time. You can execute automated testing as part of the software delivery pipeline and obtain immediate feedback on the business risks associated with a software release,” said Wayne Ariola, chief marketing officer at Tricentis.

Tricentis is among the 2019 SD Times 100 for its leadership in the Testing category and has also been recognized as a “leader” by Forrester, Gartner and IDC. Its continuous testing platform enables organizations to achieve more than 90 percent test automation, 5X or better speed increases and more than 80 percent risk coverage. Tricentis’ successful enablement of continuous testing is further reflected in its 80 percent annual growth rate and over 1,600 global customers. In addition, Tricentis enjoys the largest software distribution among global systems integrators.

Scaled Agile and DevOps Require Continuous Testing

Agile success is easier to achieve with a small project that involves just a web interface. However, as Agile projects scale, the underlying architecture and the necessary testing strategies become more complex. While risk mitigation strategies tend to include shift-left and shift-right, what’s really needed is continuous testing throughout the lifecycle, especially as teams advance to DevOps.

“When you scale Agile, you recognize that you need to radically alter the traditional testing processes. When you move to DevOps, those processes must again scale to an entirely different level of automation,” said Ariola. “As you move down the DevOps path, eventually, you conclude that continuous testing must be executed within the contexts of business objectives and business risks — not just ‘siloed application test results.’”


Scaled Agile and DevOps break down when speed and quality imperatives are out of sync. A symptom of that is approaching continuous testing from a tool perspective rather than a process perspective. The former creates process friction while the latter alleviates it.

“You need holistic process transformation to achieve business transformation and DevOps transformation goals,” said Ariola. “A ‘chasm’ that often plagues organizations is the gap between a small team of brilliant, technical people who succeeded with an Agile or DevOps pilot and the larger population necessary to effect scaled Agile or DevOps, which is more diverse.”

For a small project with a simple architecture, Selenium scripts or free Selenium-based tools work well. However, when Agile and DevOps scale, the underlying architecture becomes more complex, necessitating intelligent, end-to-end continuous testing processes.
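The “business risk” framing that runs through these numbers can be made concrete: weight each requirement by its business risk, then measure how much of that weight the automated suite actually exercises. The sketch below is an invented illustration of that arithmetic; it is not Tricentis’ actual algorithm, and the requirement names and weights are made up:

```java
// Illustrative risk-coverage calculation: coverage is measured against
// business-risk weight, not a raw count of tests or lines of code.

import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class RiskCoverageSketch {

    /** Share of total risk weight exercised by automated tests (0..1). */
    static double riskCoverage(Map<String, Integer> riskByRequirement,
                               Set<String> testedRequirements) {
        int total = 0, covered = 0;
        for (Map.Entry<String, Integer> e : riskByRequirement.entrySet()) {
            total += e.getValue();
            if (testedRequirements.contains(e.getKey())) {
                covered += e.getValue();
            }
        }
        return total == 0 ? 0.0 : (double) covered / total;
    }

    public static void main(String[] args) {
        Map<String, Integer> risk = new LinkedHashMap<>();
        risk.put("checkout", 8);      // high business risk
        risk.put("search", 3);
        risk.put("profile-page", 1);  // low business risk
        // Two of three requirements are tested, but they carry 11 of 12
        // risk points, so risk coverage is high even though test-count
        // coverage is only two-thirds.
        double c = riskCoverage(risk,
                new HashSet<>(Arrays.asList("checkout", "search")));
        System.out.printf("risk coverage = %.0f%%%n", c * 100);
        // prints: risk coverage = 92%
    }
}
```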


Tricentis Improves User Journeys and Lowers Business Risk

Consistent user journeys have become more important than individual component outcomes, so software teams need a means of ensuring quality that doesn’t slow value delivery. Using Tricentis, Agile and DevOps teams can measure and govern risk, as well as quickly define and update tests across the architecture. Its model-based testing capabilities propagate a change throughout an entire test suite automatically.

Essentially, Tricentis is a low-code/no-code platform that uses advanced algorithms and automation techniques to speed, simplify and improve testing outcomes. AI and machine learning are enabled in the appropriate places so organizations can react to change faster and avoid the overhead of maintaining test scripts.

“Tools don’t solve the problem of testing at scale; process transformation does,” said Ariola. “Tricentis allows you to focus on the holistic problem of end-to-end continuous testing using a business risk approach. We support over 160 different technologies so you can achieve the resiliency, speed and success you seek at scale.”

Learn more at www.tricentis.com. z
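The model-based point, describing each application control once and letting every test reference that shared model, can be sketched in miniature. This is only the shape of the idea, not Tricentis’ implementation; the logical names and locators are invented:

```java
// Sketch of model-based test maintenance: test steps reference controls
// by logical name; a UI change is fixed once, in the model, and every
// test that uses the logical name picks it up automatically.

import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;

public class ModelBasedSketch {
    // Shared model: the one place that knows how to locate each control.
    static final Map<String, String> MODEL = new HashMap<>();
    static {
        MODEL.put("login.submit", "button#signin");
    }

    /** A test step names the control logically, never by raw locator. */
    static String step(String logicalName, String action) {
        String locator = MODEL.get(logicalName);
        if (locator == null) {
            throw new NoSuchElementException("unknown control: " + logicalName);
        }
        return action + " " + locator;
    }

    public static void main(String[] args) {
        System.out.println(step("login.submit", "click"));
        // The app's markup changes: update the model in one place...
        MODEL.put("login.submit", "button#login-submit");
        // ...and every test step using the logical name follows along.
        System.out.println(step("login.submit", "click"));
    }
}
```

Hand-written Selenium scripts hard-code the locator into every test, which is exactly the maintenance burden this indirection avoids.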



DEVOPS WATCH

SKIL: A framework for humans, not machines
BY JAKUB LEWKOWICZ

The SKIL Framework created by the DevOps Institute focuses on a holistic approach to the human aspect of advancing DevOps rather than the machines involved in it. The DevOps Institute aims to connect people in the DevOps community and provide networking and educational opportunities.

“We really feel very strongly that while automation is obviously a critical success factor for DevOps, it really is the human factor that’s going to make a difference,” Jayne Groll, CEO of the DevOps Institute, told SD Times.

The framework’s name stands for Skill, Knowledge, Ideas and Learning, and it aims to guide people toward the best way to approach upskilling. The DevOps Institute emphasized that there is tremendous demand for the skills necessary for DevOps; based on its Upskilling: Enterprise DevOps Skills Report 2019, the top three are automation skills, process skills and, perhaps most unexpectedly, soft skills.

Groll said that there is a shortage of the skills that are critical to DevOps, and that most of the challenges organizations have are tied to the need for these skills. The SKIL framework aims to address that issue by guiding people to earn certifications that prove they understand the subject, gain the knowledge for comprehensive insight, find opportunities to exchange ideas, and follow a path of continuous learning to achieve mastery.

“We recognized that we needed to create a framework that helps people understand how to transform without being specific to things like ITIL or Agile,” Groll said. “DevOps is really about continuous learning and mastery.”

There are countless ways to address upskilling, including boot camps, massive open online courses (MOOCs), higher education institutions and even some effective YouTube videos on the subject at hand. However, Groll says she hopes the framework will provide context for individuals who are told to

upskill, but are really struggling to find out the best way to go about it. Groll says she also hopes to provide context for organizations so that they can upskill their teams. She adds that the DevOps Institute targets all of these facets in its own work. The four key elements are:

• Skill: Focuses on the career growth of individuals and supplies them with the credentials they need to put on CVs, whether that’s certifications, skill assessments or learning paths.
• Knowledge: Rich content and insight meant to supplement one’s career, including research reports, case studies and books written about the topic at hand.
• Ideas: Opportunities for people to share ideas, including Skillups, Slack channel groups and peer-to-peer networking.
• Learning: Continuous learning, listening to thought leaders give context around DevOps know-how, and discovering learning paths toward mastery. z

GitLab turns its focus to DevSecOps
BY JAKUB LEWKOWICZ

GitLab is taking the next steps in its DevOps initiative with the announcement that it is integrating security into its single application. The company is also releasing auto remediation and security dashboards, and plans to release security approvals in an upcoming update.

“The advantages of a single application are numerous: A single sign-on eliminates the need to request access to each separate tool, context switching is reduced, which improves cycle time, and work is tracked in one place so you don’t have to do detective work to find the information you need,” the company wrote in a post.

According to market research company Forrester, while many companies are moving toward the DevSecOps model, they fail to fully address the issue of security. “Many organizations have succeeded in automating continuous release and deployment for some applications but face increasing risk from lack of governance and fragmented toolchains,” Forrester observed.

GitLab plans to tackle that problem by integrating security into its single application. GitLab’s DevSecOps strategy will feature Static Application Security Testing (SAST) to spot vulnerabilities before deployment and Dynamic Application Security Testing (DAST) to analyze web applications for runtime vulnerabilities and run live attacks against the review app. Container Scanning and Dependency Scanning will also be added to the CI/CD pipeline.

In addition, the app will have Kubernetes-native integrations, multicloud deployment support, feature flags, an Operations Dashboard and Incident Management in 12.1, which will enable companies to effectively manage outages.

“Overall, with security automated throughout the developer workflow and DevSecOps delivered in a single application, we believe companies will continue to advance the way they deliver code, shortening release cycles and focusing on the innovation they will bring to market,” GitLab wrote. z
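In GitLab’s single-application model, enabling these scanners is largely a matter of including CI templates in the project’s `.gitlab-ci.yml`. A minimal sketch follows; the template names reflect GitLab’s documentation from this era and may differ by version, and the DAST target URL is a placeholder:

```yaml
# .gitlab-ci.yml sketch: enabling GitLab's built-in security scanners.
include:
  - template: SAST.gitlab-ci.yml                  # static analysis, pre-deploy
  - template: Dependency-Scanning.gitlab-ci.yml   # known-vulnerable dependencies
  - template: Container-Scanning.gitlab-ci.yml    # CVE scan of the built image
  - template: DAST.gitlab-ci.yml                  # live attacks against a running app

variables:
  DAST_WEBSITE: "https://staging.example.com"     # placeholder DAST target
```

Each template adds jobs to the pipeline whose reports feed the security dashboards the article describes.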



Taking the LEAD in digital image integration

Extracting text from images, reading barcode data, creating text-based PDFs, and cleaning up dirty images are just a few of the common tasks for which developers need great tools. LEAD Technologies, Inc., founded in 1990 and based in Charlotte, N.C., has LEADTOOLS as its foundation product and is the market leader in development tools for digital imaging. The product is a family of comprehensive toolkits that give developers a head start when integrating imaging functionality into desktop, server, tablet, and mobile applications. The product lines are designed around vertical markets and image types such as raster, document, medical, multimedia, and vector. For continuing innovation and leadership, LEAD Technologies has been selected to the SD Times 100 for 2019 in the APIs, Libraries & Frameworks category.

Over the past two years, there have been many major updates to the LEADTOOLS v20 SDK. For example, document library and annotation updates include text redaction, a high-level OMR SDK that does not restrict the user to a specific form design, and new unstructured forms components to recognize and process such things as business cards. Other added capabilities include 3D volume rendering support for the Zero-footprint Medical Web Viewer and expanded Xamarin offerings with a Xamarin Image Viewer, image annotations, and a much-needed Xamarin Camera Control that eliminates the need to write native Android and iOS camera code.

Further, LEAD continues to extend LEADTOOLS to any developer who can access the Internet. With the maturity of cloud services and their definite benefits, organizations are moving more and more critical applications into the cloud. This means that the imaging technology associated with those applications must also move. In July 2018, LEAD released LEADTOOLS Cloud Services, a Web API with the LEADTOOLS SDK at its heart. In April 2019, LEAD released a major update to the Cloud Services that included four new Web APIs based on customer requests: ConvertRedact, Merge, ExtractAAMVAID, and ExtractBusinessCard. Alongside the new Web APIs, LEAD added some additional parameters to existing services: the ability to set which OCR language to recognize, as well as control over the tradeoff between size and quality when outputting to image formats such as PDF, TIFF, and JPEG. On top of the additions, major speed and stability enhancements were also made.

“Since we released LEADTOOLS Cloud Services last summer, the user base has exploded. As users request additional technology to be added for their needs, we have been quick to add to the growing number of methods our services provide,” says Hadi Chami, LEADTOOLS Developer Support Manager.

The company will continue to add new Web API functionality to LEADTOOLS Cloud Services that will enhance and expand its cloud footprint. According to Chami, LEAD is also in the process of expanding the use of machine learning (ML) and computer vision throughout many more features of LEADTOOLS. Chami says, “With nearly 30 years under our belt, we pride ourselves on the ability to adapt and fulfill the never-ending needs of developers as technology grows. Once solely an imaging toolkit, LEADTOOLS has expanded into various related technologies with our Document, Medical, and Multimedia products. We’re excited at the direction technology is taking developers, and you can expect a lot more from us in the coming years.” Learn more at www.LEADTOOLS.com. z
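Cloud imaging APIs of this kind are typically driven by a handful of request parameters: an input file, an output format, an OCR language hint, and the size-versus-quality tradeoff mentioned above. The sketch below only builds such a request’s query string; the parameter names are invented for illustration and are not the actual LEADTOOLS Cloud Services contract, for which the vendor’s documentation is the authority:

```java
// Hypothetical request builder for a cloud document-conversion Web API.
// Parameter names (fileUrl, format, ocrLang, quality) are illustrative
// assumptions, NOT the real LEADTOOLS Cloud Services interface.

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ConvertRequestSketch {

    static String toQueryString(String fileUrl, String outputFormat,
                                String ocrLanguage, int quality) {
        if (quality < 1 || quality > 100) {
            throw new IllegalArgumentException("quality must be 1..100");
        }
        // URL-encode every free-form value so spaces, slashes, etc. are safe.
        return "fileUrl=" + URLEncoder.encode(fileUrl, StandardCharsets.UTF_8)
                + "&format=" + URLEncoder.encode(outputFormat, StandardCharsets.UTF_8)
                + "&ocrLang=" + URLEncoder.encode(ocrLanguage, StandardCharsets.UTF_8)
                + "&quality=" + quality; // size-vs-quality tradeoff
    }

    public static void main(String[] args) {
        System.out.println(toQueryString(
                "https://example.com/scan.png", "pdf", "en", 80));
    }
}
```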



Low-code for Mobile: Toys or Tools?


Mobile developers tend to be skeptical about the effectiveness of low-code tools when they know exactly what native iOS and Android development takes. In fact, some developers are so turned off by low-code platforms that the very mention of them triggers a passionate response.

“Low-code is bad enough, but low-code for mobile is even worse. The speed benefit is a farce with platforms like React Native,” said Dary Merckens, CTO of custom software development firm Gunner Technology. “The device integration is often way, way, WAY worse than React Native as well because so many low-code platforms just use stuff like PhoneGap to wrap a mobile web experience into an app form.”

Since writing custom Swift or Kotlin code can be “brutal” enough that one could argue in favor of low-code platforms on that basis, low-code is still a

THE SECOND OF THREE PARTS

no-go from where Merckens sits. “Heck, writing a PWA is a better option than low-code solutions at this point, but I’m sure low-code companies don’t want people to know that because they’re making bank duping people,” he said.

One size does not fit all

There’s an entire range of low-code/no-code platforms aimed at different audiences, including professional developers, web developers and business power users, which their UIs and features reflect.

“If I define myself by the act of writing code and crafting code, really get into the details and the weeds, I’m going to have an incredible amount of

skepticism that lets someone who’s not as skilled as I am accomplish the same result,” said Jeffrey Hammond, VP and principal analyst at Forrester. “If you look at something like Quick Base, which is being used by business users, the tool is actually building React Native applications, it’s just not doing it from the command line like a professional developer would do.”

So, if native code is the end result, producing the exact same outcomes that could be written manually, what’s the difference? Granular control.

“The benefit of native development is you don’t have any constraints, but then again, you’re having to develop two applications for iOS and Android,” said Mark Runyon, principal consultant for software architecture and development firm Improving. “Most of the low-code I’ve run into used Cordova, which is a hybrid platform. With a hybrid, you’re developing one-shots that are not necessarily as clean and performance-enhanced as a native approach.”

Developers still dislike the platforms, but savvy organizations see time-to-market improvements and an easing of the burden on overtaxed programmers.
BY LISA MORGAN

Experts agree that there is room for low-code and no-code solutions in an enterprise, but say thought should be given to where they’re most likely to provide the best return.

“There are many platforms available and each has their own learning curve. They also have different, often cluttered UIs, with [so] many options to choose from that it can get overwhelming,” said Ahmen El Awadi, lead developer at web development firm Best Response Media. “Businesses with complex app development logic and compliance needs may find it inconvenient to integrate these low-code tools into their existing stack.”

Why a developer would choose low-code

Speed is the obvious benefit of using a low-code platform. When the demand for application development and maintenance exceeds what the development team can deliver, options must be considered. Some of the work might be

outsourced. Alternatively, mobile developers might write a progressive web app (PWA) or use a low-code platform to avoid writing apps for individual platforms. “It’s really an intersection of different technologies with a different approach to building the application and you see lots of different takes on how to do that,” said Forrester’s Hammond. “The reason to pick a low-code tool is to accelerate development efforts.”

More professional developers might be more inclined to use low-code tools if they could see what they’re capable of doing and how they’re doing what they do. Then again, there has been an explosion of developer-oriented productivity enhancements among tools generally over time. Arguably, low-code platforms are just another time-saving alternative. “It’s the same reason people picked up Visual Studio for C and C++. Some continued on page 32 >


< continued from page 31

are happy to accept a higher level of abstraction for speed,” said Hammond. “Business users don’t care about how beautiful the code is, which is why they are most likely to accept low-code tools. They’ll use something like Betty Blocks or Quick Base, and maybe a web developer with PHP or HTML experience will choose Mendix or OutSystems.”

Still, mobile developers aren’t exactly plentiful. According to Forrester surveys, the average number of mobile developers at a large company is only 10 to 15. “That’s enough to build two apps, so you can build your customer-facing application and maybe one more strategic application for employees or business partners,” Forrester’s Hammond said. “Who’s going to build the other 50, 70, or 100 things when you need to mobilize your business processes or you need an app that’s going to support a workgroup? It’s not going to be mobile developers, so as long as we have this supply and demand problem, there’s going to be a market for low-code tools that help people who aren’t professional mobile app developers build apps.”

No-code tools are aimed at business users, while low-code tools are aimed at professional developers or web developers. Neither Forrester nor Gartner recognizes “no-code” tools as a product category. In their view, low-code tools fall along a spectrum of low-code and lower code. The latter is visual, while the former is more likely to support command-line coding where necessary.

“Low-code platforms have not matured enough in the capabilities required to design and develop the native app functionality that mobile users like and have come to expect, whether it’s consumer or mobile apps,” said James Hoshor, senior mobile strategist at technology consulting and reseller Anexinet. “I have yet to see mobile apps in the public app stores or used in enterprises that provide native functionality, support complex business processes [and deliver] the functionality needed by enterprise mobile users today.”

In his view, low-code falls short of native code and mobile app development platforms (MADPs) in several ways, including performance, delivery of native app/device functionality, responsiveness to different form factors, the ability to implement asynchronous operations and the ability to easily perform and manage iOS and Android OS updates.

Glenn Gruber, another senior mobile strategist at Anexinet, said there are other ways to improve efficiencies without using low-code tools, such as SwiftUI and Kotlin, which create more efficient and declarative code, since what used to take hundreds of lines of code can now be done with dozens. “If you’re rebuilding apps that make the business run, I would not go for [low-code] tools,” said Gruber.

App evolution may be a problem

Low-code tools enable organizations to accomplish more in less time. In an era when time to market has become competitive table stakes, organizations — particularly those on the digital transformation path — need to find a way to get everything done.

What isn’t obvious to business users, but what can be blatantly obvious to professional mobile developers, is that low-code applications may be constraining over time if they don’t scale well. Alternatively, an app a business user built with a no-code tool may work fine for workflows, but it isn’t the kind of enterprise-grade app the department or enterprise requires.

In other cases, mobile developers may inherit low-code or no-code apps built by less-skilled people whose intentions exceeded their capabilities, or who became so overwhelmed by their day job they couldn’t continue working on the app.

“What I usually run into is the client reaches a point where they need the app to do something special. I can’t just take the brick they give me and plug that in,” said Improving’s Runyon. “That’s where it gets a little challenging and tricky coming from a low-code perspective.” Runyon sometimes needs to rewrite apps entirely from scratch just to ensure the app is capable of doing everything his client wants it to do.

“I’ve never found a client that’s been able to say, ‘I’m willing to restrain my requirements and I understand that it can’t do this and I’m OK with that,’” said Runyon. “In that case, you could use a low-code tool to do a prototype and tell them if they’re good with it we could continue and wrap it up. If you want it to do these specialized things, it’s going to take three to six months.”

Setting expectations is always critical when laying out the respective costs, benefits, and limitations of the various options, whether the developer is a consultant or an in-house developer. Of course, as most experienced developers have learned the hard way, their clients may not know exactly what they want until they see it, or until they see one set of capabilities that opens their minds to new possibilities.

“I’ve seen use cases in Fortune 500 companies where low-code apps have been built to do one specific thing, they do it well, and it’s perfect for what they need. However, needs tend to evolve over time,” said Runyon. “The devil is always in the details. Low-code is great and it looks good on paper, but when you need an app to do that last 10 percent, it can become really hairy and nasty. You
continued on page 35 >



USF goes all in on low-code
BY LISA MORGAN

What if 70 percent of the people developing apps for your organization never studied computer science? Such is the case at the University of South Florida (USF), where low-code development is central to its development strategy. The impetus for the switch from traditional development to low-code started five years ago, when it became apparent that just about everything USF does could be accomplished using a series of workflows and forms versus custom development or off-the-shelf software. At the time, the university used Appian. Much has changed in the past five years, including who’s developing USF’s applications. What hasn’t changed is the brand of tool they’re using.

“Students are natively mobile, but we’re also using multiplatform low-code for professors and people who are traveling. We like it because it’s code once and use it everywhere, so if you need mobile, we have it, and if you need desktop, we have it,” said Sidney Fernandez, system VP for technology and CIO at USF.

The main benefit of low-code use has been speed. Fernandez said apps can now be developed in two or three months that would have taken years to build before. Moreover, the apps are instantly mobile.

“That’s a big deal for us because we have students who need the same experience whether they’re booking an appointment with a counselor or looking up class notes,” said Fernandez. “We no longer have any custom developers. Everyone is now using low-code with a service bus underneath, which brings everything together.”

Another reason low-code works well for USF is because it hires students for app development. There’s a 70/30 split between students who are not computer science or management information systems (MIS) majors and those who are.

“We train them on understanding the business problem and then it takes them about three months to become decent developers,” said Fernandez. “In addition to improving the speed of development, we’ve also benefitted from architectural simplification, so I don’t need a significant DBA or DevOps presence to make this happen. We’re releasing new solutions every couple of weeks.”

The most astute low-code developers have proven to be mechanical engineering and industrial engineering majors, although the university has had “decent luck” with MIS students since they’re trained from an analyst’s point of view. However, computer science majors are required for the integration part, which typically means pulling data from other systems into the low-code platform. Architects are also still needed to ensure app scalability.

“Four years ago, we didn’t really need architects because our platform was small, but now we’re serving 55,000 students,” said Fernandez. “When you have solutions that depend on each other you want to be careful with the architecture.”

USF uses some of the mobile-specific capabilities of the low-code platform, such as integrating camera functionality into an app so employees can capture pictures of receipts while traveling. USF can also extend the low-code platform if it chooses to do that. Meanwhile, more modern features are being pondered, such as chatbots that are integrated with the platform.

The future is low-code

The rate of technology change generally has not gone unnoticed at USF. In fact,



Alice Wei, head of Innovation and Transformation, sees low-code as a shield against rapid change, not only for business users but for developers as well.

“If you spend six months developing an app and then end up realizing the requirements were incorrect, that’s six months you’ve wasted. It doesn’t matter how beautiful your code is or how well it’s architected, it’s still going to be unusable shelfware,” said Wei.

When USF started using low-code, it expected traditional developers to use it. Not surprisingly, many of them were frustrated by the level of abstraction that obscured what was actually happening, so many of them opted to leave.

“We’ve had to rethink our marketing and recruitment strategy because we know a lot of the traditional computer science students aren’t being trained to think of low-code as the next evolution of programming. Instead, it’s seen as a threat, or as something any kid could do,” said Wei. “They’re missing the point. It’s changing your skill set so instead of fixating on syntax and quality, flow and the right logic, you’re more validating that the software is working and whether it’s what the customer requested.”

In Wei’s view, traditional skill sets are best applied in situations where developers are building something that requires custom code, versus a university environment that’s basically trying to optimize workflows and processes. “We realized we were not building anything original,” said Wei. “In higher ed, the workflows aren’t original enough that you need custom development from scratch. That’s a lot of overhead.”

So far, USF hasn’t wanted to do anything the low-code platform couldn’t handle, other than the data integration and architecture work that are handled by computer science students. “I think things are moving faster than people expect in terms of low-code/no-code systems being able to do relatively complicated things. We wouldn’t have expected [those capabilities] before when we started our coding careers,” said Fernandez.
“A lot of low-code projects tend to fail because they’re niche projects versus transforming the company. It needs to be part of the general enterprise strategy for it to be extremely effective. That’s a critical success enabler.”

< continued from page 32

just need to be aware that you could potentially hit problems with the low-code approach that you would not have with the traditional development approach. If you need to get to market quickly and you can constrain the requirements, low-code may be exactly what you need.”

Increasingly, organizations will find themselves with a mix of traditional developers and non-traditional developers all writing mobile apps, albeit not the same mobile apps. Since speed and efficiency requirements aren’t going away, there’s an argument in favor of an overall portfolio strategy that considers app development more broadly, to eliminate app development inefficiencies that may not otherwise be apparent, redundant app development across departments, and unnecessary friction between developers and the non-developers who build apps.

“[Having a portfolio strategy] is important, especially as things grow. For example, Aetna is a very large Quick Base user. I think they have over 1,000 people building applications inside the organization, and because of that they’ve had to build a center of excellence in the IT organization that helps those folks if they get in over their heads,” said Hammond. “The center of excellence gives them advice, shares best practices and puts together some guidelines and things they should be thinking about as they build their applications. As the investment grows in the organization, that level of governance becomes important.”

Although some low-code apps may be difficult to maintain, Kevin Grohoske, practice director of Business Automation at digital advisory services firm Sparkhound, has observed the opposite. “Most organizations are too overburdened with ongoing maintenance costs to build new solutions. Some organizations may have to dedicate 50% to 70% of their annual budget just to maintain their existing solutions.
Leveraging a low-code/no-code platform can significantly reduce ongoing maintenance costs as the business needs change,” said Grohoske.


However, rather than leading with tools, a sounder strategy is to understand the specific business requirements and then determine whether a low-code/no-code platform can provide the desired feature set. If a platform doesn’t support a “must-have” mobile feature, it is unlikely that the platform can be extended to achieve it. And even if the low-code platform can be extended, doing so would reduce the value of using a low-code/no-code platform in the first place, Grohoske said.

Richard Salinas, managing director of Business Automation at Sparkhound, added that storage can also be a problem. “There’s a limit to the device’s storage with a low-code/no-code app since the underlying platform ‘hosts’ the app,” said Salinas. “A native, purpose-built app has its own storage and can better control its footprint. Low-code/no-code apps are not [appropriate] for every business need since they are typically better suited to single-purpose functions with a low storage budget.”

Times have changed
Organizations have always had power users willing to forge their own solutions because they couldn’t wait for IT to build something, although historically that development has been more desktop-based than mobile. In today’s always-on world, mobile devices have become the devices of choice for staying connected and getting things done quickly, which has driven the demand for hybrid low-code tools.

“One of the biggest differences we have now is we have folks that grew up as digital natives. It’s easier for them to articulate what they want to do and how they want to interact than the previous generation, so I think there’s a larger potential market of folks that could use [low-code and no-code] tools compared to what we saw 20 years ago,” said Forrester’s Hammond. “If they’re willing to find the right components or look for the right code samples, even if I want to use something that needs the camera or the microphone or the GPS, I can accomplish that in practically every low-code tool out there.”


Low-code’s a rapidly rising sector, but will it disappear?
BY CHARLES ARAUJO

Every week, we have one or two “briefing days” in which we schedule as many as eight one-hour discussions with various technology vendors in our broad coverage area that spans everything from cognitive to customer experience platforms — and everything in between. It’s always fascinating to see the mix of companies we talk to week to week. But there is one technology sector that’s always on the list: low-code development platforms.

These platforms, which enable enterprise and mid-market organizations to create bespoke applications using visual development interfaces and little to no coding, are multiplying at an incredible rate. We work with many of the major players in the market and watch this space carefully — and yet, rarely does a week pass without a new company showing up on our radar.

While this remains a nascent market space that is continuing to evolve (the major analyst firms are still fighting over what to call it!), it is clear that low-code is now crossing into the mainstream as more enterprises become comfortable with the low-code mantra and begin to adopt low-code approaches in various forms. After all, a steady stream of new market entrants is the most definitive sign of a growing market space.

There’s a dark side, however, to the rise of the low-code market: it may be rising right out of existence.

Charles Araujo is principal analyst at analyst and advisory firm Intellyx.

Low-code’s rising star
In an article I wrote at the end of 2017 entitled “Is it the end of coding in the enterprise?,” I examined the impact that low-code was having on the nature of software development in organizations, and concluded that hand-coding would eventually go the way of punch cards. The continuing evolution of the space in the year and a half since has only made me more sure of this eventuality.

The critical insight from my research was that it is an organization’s “deep understanding of the market, their algorithms and their proprietary business logic that provided the competitive value” — not its ability to code. Coding is, and always has been, a means to an end. The goal has been to create software that solved a business problem and created some form of competitive advantage in the market. The fact that organizations had to employ people to write code to do so was just the cost and complication of creating that advantage.

The power and flexibility that modern low-code platforms bring to the table enable organizations to realize the benefits they have always sought, but with the advantage of being able to create and change applications more rapidly, and to engage non-coders in the development process. It’s an enticing promise that is winning converts across the enterprise spectrum — particularly as these platforms demonstrate that they can meet ever-more-complex and intricate needs.

The message heard far and wide
While the uptake within enterprises is good news for the growing low-code market, it comes with a potential downside. As enterprise leaders have warmed to this new development approach, vendors from across the technology spectrum have taken notice. Everyone, it seems, got the memo.

It turns out that it wasn’t only the traditional hand-coding process that could benefit from this low-code approach. Across the enterprise technology landscape, there is pent-up frustration about the complexity of managing the technology stack and the difficulty of creating automation. There is also a growing demand to leverage technology across a vast swath of business processes and customer-facing interactions that don’t fall neatly into the traditional definition of application development.

As a result, there has been a rush to embed low-code approaches into every type of technology imaginable. In fact, it’s getting hard to find a technology sector that is not embracing the low-code ethos to one degree or another. And across the board, the reason is the same: organizations need the ability to deploy, manage, and adapt technologies and automation strategies as fast as possible — and the more technical complexity that’s involved, whether in the form of coding or arcane configuration, the harder that becomes.

The enterprise and vendors alike are increasingly seeing the low-code ethos as the pathway to the future. But while that is good news for the enterprise, and a win on the philosophical front for low-code players, it could spell trouble for them in the long run.

Low-code’s focus problem
When the whole world has gone ‘low-code,’ what will it mean to be a low-code platform? Admittedly, we’re years away from this scenario. Most of the enterprise technology stack remains mired in the complexity and rigidity that low-code — in all its forms — aims to alleviate. There’s plenty of work to go around.

Still, the challenge for low-code vendors is that, collectively, they’ve defined themselves based on the means, rather than the end. Ironically, it’s the same challenge that those who would prefer hand coding face as they battle against low-code’s encroachment. It’s hard to defend the means when it’s the end that the organization seeks. No one outside of IT cares how applications are created — they only seek the value the application delivers. And the more control they can get over that process, the better. While low-code platforms may be a better, faster way to deliver that value, they remain only the means.

The risk is that as technology companies that are closer to the point of value realization (e.g., sales, marketing, and customer-facing platforms) or closer to the point of technical pain (e.g., data integration and data science platforms) embrace low-code approaches and, therefore, make it easier for organizations to close the automation gap within their already-defined ecosystems, the places in which organizations need to create bespoke applications will narrow. This same process will also undercut one of low-code’s greatest counterpoints: that customization to these large platforms is expensive and slow. But why would someone create a completely custom application if, instead, they can use a low-code approach to build that application on top of an adjacent platform with all the hooks and connections built in, but with the freedom to customize and change it without fear?

Again, this is not today’s reality. But there is abundant evidence that this is where we’re headed. After all, if what low-code vendors are selling is a better means to an end, why wouldn’t everyone else simply adopt that means?

The Intellyx Take: Value rules the day
If I were back leading an enterprise IT organization today, I would be doing two things. First, I would be selecting and investing in a low-code platform to enable me to speed development and begin to empower my customers to take greater control of their automation needs. At the same time, however, I would be taking a long, hard look at the various platforms I used to run my business — and would begin piecing together an architectural model, built around these platforms, that would allow me to create and sustain the greatest amount of differentiated value for my organization.

That architecture would, undoubtedly, demand a certain degree of ground-up development. But the vast majority of it would most likely run on top of or adjacent to one or more of these core business platforms. The driving force behind this architecture would be the ability to create unique business value and competitive advantage — and every decision I made would pivot around this focal point. I suspect that low-code platforms, at least in their current incarnation, would find themselves relegated to limited use cases simply because they would require me to build too much.

As the pace of change continues to increase and as market demands continually shift, enterprise leaders will be looking for those tools and platforms that enable them to create and sustain competitive value most easily and rapidly — and to build only what is required to do so.


Follow the test pyramid to continuous testing

As software development grows more complex — microservices, containers, APIs — and demand for faster releases increases, testing as usual can be seen as a clumsy, out-of-place step in the delivery process. Organizations that have adopted Agile development and DevOps methodologies know that testing has to keep pace, and are finding that tools that don’t enable things like test automation and continuous testing are hindering their efforts. But as with Agile and DevOps practices, in testing there is no one right way of doing things.

Mark Lambert, vice president of product at Parasoft, said doing some level of test automation from a functional perspective is foundational for continuous testing. “Service virtualization is a key enabling technology for continuous testing. Typically, for you to even be able to take advantage of service virtualization, and really become a true continuous testing practitioner, you have to have some level of test automation in place to start with,” he said. Organizations will usually start with UI testing, he said, but “the sweet spot for test automation ... is actually at the API level.”

Lambert said a continuous testing practice should follow the testing pyramid as defined by Mike Cohn in his book “Succeeding with Agile” and by Agile thought leader Martin Fowler. The base is built on unit tests, which are isolated and faster to run. As you move up the pyramid, the assets you’re testing are more integrated, and tests take longer to perform. The middle of the pyramid is where service integration, or API, tests live; UI testing occupies the smallest tier, at the top of the pyramid.

“If you’re going to do continuous testing, a foundation of unit tests is a must-have, API testing is the next must-have and you want to optimize your use of end-to-end UI tests” at the top of the pyramid, Lambert explained.
“It’s not to say to eliminate [UI tests]; you want to make them as efficient as possible, because you’ve got to worry about maintainability of those tests.”

Having done this, organizations have derived what Lambert described as the first phase of value from continuous testing: earlier identification of regressions. He said the best way to find those regressions is at the API level, because they’re easier to diagnose and reproduce there.

The second phase is to identify any potential integration issues as early as possible. Lambert said, “A key element of that is ensuring your application’s performance characteristics aren’t going outside of your defined service-level agreement, as it’s very easy for development teams to be adding new functionality and introducing performance


issues in their application without them realizing it.

“You’ve got your unit tests running, you’ve got your functional tests running, you’re using service virtualization to run your regression tests more continuously,” he continued. “Now you’ve got the ability to say, let me take a look at my non-functional business requirements — performance and security — take performance tests that you’d normally take at the later stages, apply service virtualization to the bottlenecks in the infrastructure, and then start taking your API tests and execute them continuously under load on your part of the overall system. This is when you really start getting your second phase of value from continuous testing.”

Parasoft, recognized as a Gartner Peer Insights Customers’ Choice for Software Test Automation, has earned its place on the 2019 SD Times 100 by providing a breadth of cutting-edge tools spanning unit, functional, UI and regression tests that help its customers deliver high-quality software that also meets end-user wants and needs. Its solutions cover everything from unit and functional testing of the API and the UI, to test data management, safety and security compliance, change management, and more.

Following the test pyramid, Parasoft offers the C/C++test, dotTEST and Jtest development testing tools, covering such things as static analysis for uncovering deep reliability and security issues (with support for the OWASP, CWE and CERT standards), unit testing, coverage and traceability. SOAtest is Parasoft’s functional testing solution that focuses on validating applications at the API level. To help organizations move from manual testing to automated API testing, Parasoft offers Smart API Test Generator as part of SOAtest, leveraging AI and machine learning to build test scenarios from data relationships extracted from recorded application traffic. Further, the company’s service virtualization technology, Parasoft Virtualize, lets organizations decouple tests from external dependencies for true continuous testing, and enables automated tests to be continuously executed within CI pipelines.

To learn more about Parasoft’s tools and solutions, visit www.parasoft.com.



Sending encrypted data with sound
BY JOE TODD

From providing a simple and low-cost entry system for public transport to facilitating peer-to-peer payments between two unconnected parties, using sound to transfer data can bring unique benefits to many different applications. Although the usability benefits and time-saving capabilities of acoustic data transmission are well understood, the security implications of data-over-sound are lesser known.

On one hand, audio seems more secure than IP-based connectivity, which can be penetrated remotely by hackers. But can information broadcast over soundwaves be protected from nearby eavesdroppers? With this question in mind, let’s take a look at the properties of sound and how industry-standard encryption can be applied to acoustic data transfer to render it secure and safe from the risk of prying ears.

Joe Todd is head of engineering at Chirp.

Acoustic vs. Radio Frequency (RF) security
When weighing the potential of sound as a means for secure data exchange, it is useful to understand its fundamental security benefits and its ability to perform just as securely as other forms of connectivity, such as RF.

Acoustic data transfer enables localized connectivity, which can effectively reduce the area of potential attack. It doesn’t require IP-based connectivity to perform the transmission, which reduces the risk of remote hackers being able to interfere. Ultrasonic transmissions are also beneficial in environments that require secure near-field data transfer in sensitive or RF-saturated vicinities. Because sound doesn’t leak through walls, it cannot be eavesdropped on by listeners in adjacent buildings. This makes it highly suited to areas such as industrial sites or hotel rooms where certain sensitive data must be kept within the confines of one space.

In terms of regulatory considerations, offline acoustic transmissions are compliant with the parameters set by the General Data Protection Regulation (GDPR) and the Children’s Online Privacy Protection



Act (COPPA) information security rulings — an asset that removes further compliance concerns and is particularly valuable for those delivering consumer-facing applications.

Encryption with a shared key (AES)
Encryption makes data unreadable by anyone other than those with the keys to decode it. Networking technology like data-over-sound can provide the transport layer, with an encryption algorithm applied to the data to protect it from nearby listeners during transmission. Depending on the use case, different approaches can be taken.

A common approach is the Advanced Encryption Standard (AES), one of the most widely adopted encryption algorithms due to its proven security for a range of applications. AES was first adopted by the United States government to keep classified information safe, and is used in secure protocols including HTTPS and SSH. AES is particularly suitable here as it doesn’t increase the size of the payload, which is useful for the low-bandwidth channel that acoustic networking provides.

The first step in applying encryption to audio-based transmission is to choose the AES key length (128, 192 or 256 bits) and establish a shared key on both the sender and receiver side. Next, an initialization vector (IV) or a counter needs to be provided. This value must be different for each payload that is encrypted; otherwise, the transmission will not be secure. The way the IV is modified must be known by both parties and be replicable, as the data can only be decrypted with the exact same IV. Finally, the encryption function is called on the data. It returns the encrypted payload, which is the same length as the raw payload. Decrypting an AES ciphertext follows the same process in reverse order.

Encryption with a public/private key infrastructure (RSA)
RSA (Rivest–Shamir–Adleman) is another algorithm used to encrypt data. It is a strong technique for situations in which an individual wants to make a secure transaction with a trusted third party that already holds the public keys — for example, a bank or point of sale. The third party is also able to verify its identity using an RSA signature. Once a message has been encrypted using the public key, it can only be decrypted with a second key, known as the private key.

Time-based keying (TOTP)
As an alternative to encryption, time-based one-time passwords (TOTP) can be used to create throwaway single-use keys. This approach is good for situations in which an individual needs a lightweight way to authenticate using a PIN that is safe from the risk of replay attacks. It is, however, worth noting that this method requires a clock that is roughly synchronized between the two devices involved.

As discussed, there are several distinct options when it comes to transmitting data securely and, as a result, there is no one solution to rule them all. With various encryption options readily available, sound-based connectivity can be equally as secure as the likes of RF-based transmission, with the affordance of giving the user complete control over their encryption approach. With the option to select and build their own approaches to security, developers can be confident that the method used is the correct one for their specific scenario and can successfully enable secure data transmission using sound.



AGILE SHOWCASE

The new age of Agile: Evolving from teams to the entire business
BY DEAN LEFFINGWELL

Dean Leffingwell is the creator of SAFe and cofounder of Scaled Agile, Inc.

In his book, “From Project to Product,” author Mik Kersten describes how business leaders are ill-equipped to solve the problems posed by digital transformation. He points to the 2018 report, Corporate Longevity Forecast, which issues a “gale force warning to leaders: at the current churn rate, about half of S&P 500 companies will be replaced over the next 10 years.” It’s a grim cautioning, but not surprising given the new paradigm of competition, where customers no longer just compare you to your direct competitors but to the best service they have ever received — from any company. If you launch a banking franchise, you’re not just compared to Chase and Wells Fargo; your user experience is measured against Uber and Amazon.

That’s a tall order to fill. And it’s one of the many reasons why every business that wants to survive the storm must become like FedEx, which venture capitalist Marc Andreessen describes as “a software network that happens to have trucks, planes and distribution hubs attached.” Kersten claims — and there’s every reason to agree with him — that “... those who master large-scale software delivery will define the economic landscape of the 21st century.”

The most forward-looking companies are actively future-proofing for this. BMW Group’s CEO expects that in the company’s future, more than half of the staff will be software developers. Okay, we get it — hire thousands of software developers. But is that enough? What does it actually mean to master large-scale software delivery? Based on what’s happening in the field, we’re understanding that it takes a great deal more.

Business agility: the next big move
From an IT perspective, Agile with a capital “A” has served us well. Development teams get it, they usually like it, and when practiced with integrity and commitment, it consistently delivers results like faster time-to-market, improved quality, predictability, and employee engagement. Agile provides a great foundation, for sure, especially when combined with Lean, DevOps, and Lean Portfolio Management.

And now, companies are experimenting with extending Agile beyond its IT roots and applying it to the whole business. The idea of business agility is clearly catching fire, but interpretations of what it is and how to achieve it vary. The underlying meaning, though, is something on which we can probably all agree: business agility is a competitive advantage that helps an enterprise adapt and thrive in the digital age by delivering innovative technical and business solutions in the shortest sustainable lead time.

Key ingredients for achieving business agility
Many organizations struggle to get beyond team-level Agile because they can’t agree on the roadmap. However, if you look at companies that are leading the way in this area, you’ll see a pattern in approach and capabilities:

• From C-level to marketing and HR — everyone adopts Agile
Let’s start with the idea of enterprise-wide practice of Lean, Agile, and DevOps. When every team from every unit — leadership, sales, development, marketing, HR, finance, etc. — works from the same playbook, all have their eyes on the same prize, and work together in cadence and alignment, the entire organization — not just development — is able to continually and proactively deliver high-quality value faster than the competition. Working this way is an emerging trend and requires specialization of principles and practices for the context of the business unit. For instance, you wouldn’t expect marketing and development teams to follow the exact same practices, but there would be strong similarities.

• Build core capabilities
Industry pacesetters are building advanced capabilities in the areas of team and technical agility, DevOps and release on demand, Lean-Agile leadership, Lean systems engineering, and Lean Portfolio Management. Mastering these capabilities creates muscle memory that can go far in bolstering the development and delivery machine, but even that is not enough to avoid being sidelined by faster and more nimble competitors.

• Add organizational agility
Organizational agility provides the business with the capacity to identify and capture opportunities more quickly than its rivals. This is achieved by being able to rapidly evolve strategy, organizational structures, technical and business practices, and people operations. Building organizational agility helps deliver the tangible benefits of better financial results and is an essential chapter in the playbook for achieving business agility.

• Commit to a continuous learning culture
Creating a learning-centered work culture is critical for attracting top talent and giving your workers the tools they need to be successful and grow your business. It seems logical, but it’s far too easy to sacrifice learning and development in favor of short-term wins, and it’s not unusual for an organization to view training as a one-off exercise to fill an immediate need. This is a common cry among many of the enterprises I’ve worked with, who wonder why they struggle to sustain their early wins from ‘going Agile.’ Their counterparts, however — the organizations fully committed to building a continuous learning culture — are seeing something altogether different. When an organization commits to encouraging individuals — and the enterprise as a whole — to continually increase knowledge, competence, performance, and innovation, the results can be dramatic. Deloitte reports that companies with continuous learning cultures enjoy a number of benefits, including:

• They are 46% more likely to be first to market
• They experience 37% higher productivity
• They are 92% more likely to innovate

The implications are clear: business agility — and all that is required to achieve it — is a game-changing approach to business, with significant bottom-line implications that should go far in helping businesses survive and thrive in the 21st century.





Modernize mainframe apps through Agile practices

Agile software development is well understood, and the benefits are clear. In the world of software development for mainframes, the benefits apply as well, but there has been some resistance to adopting Agile practices.

“There’s not necessarily a resistance to do it, but where the resistance comes in is they don’t have the supporting infrastructure to allow them to do that,” explained David Rizzo, vice president of product development at Compuware, adding that if organizations running mainframes don’t have tools and processes in place for Agile development — and many don’t — their paths to faster software delivery, and their ability to move quickly on business opportunities or to remediate issues, will be more difficult. “That’s one of the big issues we see, is that if they’re using outdated tools and outdated process to delivery, they don’t see how they can move faster because they don’t have a means to do it.”

Mainframes have traditionally been used in industries that are highly regulated, such as finance, insurance and health care. But even those organizations are looking to modernize their applications, and are learning that new techniques and new architectures can be applied to a world of COBOL and green-screen applications. Developers today are creating applications for web browsers, mobile devices, tablets, desktops and the cloud. Mainframes, then, are just another endpoint to target — one that comes with the reliability and security that large organizations have come to rely on. According to Rizzo, when you look at the development tooling in the market, the mainframe has become mainstream as far as the available technologies that interact with it.

“From a coding languages perspective, on the mainframe it’s predominantly COBOL, and it’s just another language. I talk to people that are entering the industry, coming out of colleges and universities. They’ve learned seven or 10 different languages in the time they were in school, and putting COBOL in front of them is just learning another syntax, another instruction set, so to speak, that they have to work with. Using things like Jenkins is just standard in the toolkit for developers in large enterprises, and connecting to the mainframe is just another platform.”

Compuware provides the tools that help organizations modernize their legacy applications. Topaz is Compuware’s IDE for modern development, based on the Eclipse Java IDE. Topaz for Total Test is a tool for automated unit testing on the mainframe. And last year, Compuware launched zAdviser, an analysis tool for measuring how well the team is performing.

“We look at developers as high-performing athletes,” Rizzo said. “You always want to know what they’re doing right, and how they can be better. The zAdviser product gives them some insight into that so they can help them to continually improve.”

The metrics collected in zAdviser can quantify quality, velocity and efficiency by looking at such things as how much work development teams are putting into the system, how many stories are getting done in a sprint, how many unit tests they’re creating, and how many automated scripts are created and executed during a given cycle. For instance, Rizzo explained, if you

have more automated tests, and you’re running them more frequently, your quality will improve, and you can draw that conclusion from other metrics that go with it, like how many bugs were created, how many were fixed, and what you are doing to keep that in check.

The zAdviser tool also gives a view into source code changes, such as how many elements are changing and how many are being deployed. “You can get a sense of the impact that that will have when you want to move to production,” he said. “As the old saying goes, if you don’t measure it you can’t

The mainframe has become mainstream as far as the available technologies that interact with it. —David Rizzo

improve it. So, if you start measuring it you start naturally improving it and seeing ways for it to go better.”

Agile and DevOps processes are designed to make software delivery faster and more efficient, while maintaining the highest levels of quality possible. It’s also about using your resources to deliver as many new features or defect fixes as quickly as you can. Compuware is leading the way in helping organizations running mainframes modernize their applications through modern work methodologies, tools and processes.

“Companies are continuing to do Agile,” Rizzo said. “Those that have embraced it, and those that haven’t, are realizing that they need to implement something that will allow them to increase the cadence of their delivery while ensuring they deliver something that is meaningful and valuable to their end users or customers.”
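The kind of metrics Rizzo describes can be roughed out in a few lines. The sketch below is illustrative only; the field names and formulas are assumptions for the sake of the example, not zAdviser’s actual schema or calculations:

```python
from dataclasses import dataclass

@dataclass
class Sprint:
    stories_done: int
    unit_tests_added: int
    automated_runs: int
    bugs_found: int

def velocity(sprints):
    """Average stories completed per sprint."""
    return sum(s.stories_done for s in sprints) / len(sprints)

def quality_trend(sprints):
    """Automated test runs per bug found; a rising ratio suggests the
    'more tests, run more often' effect Rizzo describes."""
    return [s.automated_runs / max(s.bugs_found, 1) for s in sprints]

history = [
    Sprint(stories_done=8, unit_tests_added=20, automated_runs=40, bugs_found=10),
    Sprint(stories_done=10, unit_tests_added=35, automated_runs=90, bugs_found=6),
]
print(velocity(history))       # 9.0 stories per sprint
print(quality_trend(history))  # [4.0, 15.0]
```

Even a toy aggregation like this makes Rizzo’s point concrete: once the numbers are collected sprint over sprint, the trend, not any single value, is what tells a team whether it is improving.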






AGILE SHOWCASE

Becoming agile in the knowledge economy

How do organizations effectively identify, plan and sequence work across the whole enterprise and across product portfolios, especially in the knowledge economy and its new ways of working? And how do you then properly manage and govern that work? These questions are top of mind in an Agile world, and they are questions Rally Software from Broadcom is helping organizations answer via its application life cycle management (ALM) platform.

Christopher Pola, executive advisor at Rally, says, “So you’re abiding by the principles of product flow and the lean, agile enterprise, because the reason why people aren’t getting benefits is they’re still managing the work and the people in a very traditional way. An example of that mindset is efficiency: We value efficiency foremost, right? That means everyone has to be working 100 percent. But when dealing with the knowledge economy and the front line, workers have more knowledge than their management, and there’s so much variability, nothing homogenous in terms of tasks, nothing’s repetitive. Capacity, efficiency and utilization is a very bad metric to run your organization on.”

Enterprises need to think differently about whether they manage the people, process and technology, or the people’s scope and time. They need to allow for variability, cycle time and innovation, and then bring those changes into both planning and execution.

Laureen Knutsen, executive advisor and head of solution architects at Rally, describes an actual customer: “They mentioned that they were doing resource utilization, and the resource utilization people were very proud of themselves because they said they had 100 percent utilization. I asked them how they’d done that. They said, ‘We keep track of every project. We had somebody come up to us and tell us that they finished a project, not just a task, a project, two hours early. So we gave them a two-hour task to fill their time so they were 100 percent utilized.’ I asked them if anyone had told them again that they were done early, and they hadn’t.”

That organization was making people less productive by trying to be 100 percent utilized at a consolidated level. The organization had a few thousand people on its team, and it was trying to manage 100 percent of everyone’s time. That’s vintage resource management, and an upper level of process that needs to change so the team can actually get to true productivity and predictability.

Changing that methodology entails getting down to the fundamentals of where adverse practices first started. Knutsen says that at first you’re guessing and placing bets when you’re strategizing. Then you need to let the teams estimate the work and tell you what they can really get done, and have your teams work on becoming extremely predictable, not just at the team level. When the teams become extremely predictable, your projects will end up becoming more predictable, and your people will get very productive in that type of environment.

Rally visualizes everything from strategy all the way through execution. There are many types of business levels that you can tie to that top-tier strategy, regardless of how long it will take to complete the strategy. You can tie it all the way down to the two-week iterations, breaking it down through whatever steps the company normally takes.

Another challenge that needs to be overcome is that often the people doing the planning are not the people doing the work. Pola says, “First, this is all bad in a knowledge economy because your knowledge workers are the ones that have the skills. So we do that, we put the plans in place. When we manage to go to plan, then any variance from that plan, even if there’s value in changing the plan, we deem to be bad.” The result is managers manipulating spreadsheets to show a good report. Pola adds, “People don’t have the data right.

When dealing with the knowledge economy and the front line, workers have more knowledge than their management. —Christopher Pola

They’re manipulating the data to run their businesses, and it’s not accurate. It’s not what I like to think of, which is a principle beyond budgeting. It’s an open and ethical information system that provides one truth across the organization.”

For Knutsen, it’s evolutionary: continuous improvement, constant change, constant striving to get better at an individual level, a team level, a team-of-teams level, and a leadership level. She says, “It’s really hard to fail at that point, as opposed to companies that are continuing to do it as they’ve always done because that’s all they’ve always done.”

Pola believes the biggest improvements will come when decentralized decision-making becomes the norm, because fundamentally it will help companies realize exponential improvement in cost optimization and innovation, and make for happier people.
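There is a standard queueing-theory result behind Knutsen’s utilization story: in a simple M/M/1 model, the average time a task spends in the system is 1/(mu - lambda), so delay explodes as utilization approaches 100 percent. The back-of-the-envelope sketch below uses that textbook formula; it is not anything specific to Rally’s platform:

```python
def avg_time_in_system(arrival_rate: float, service_rate: float) -> float:
    """M/M/1 queue: W = 1 / (mu - lambda).
    As utilization rho = lambda / mu approaches 1, W grows without bound."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable: utilization is at or above 100 percent")
    return 1.0 / (service_rate - arrival_rate)

service_rate = 10.0  # tasks a team can complete per week
for rho in (0.5, 0.8, 0.95, 0.99):
    w = avg_time_in_system(rho * service_rate, service_rate)
    print(f"utilization {rho:.0%}: average time in system {w:.1f} weeks")
```

The delays come out to roughly 0.2, 0.5, 2 and 10 weeks: pushing utilization from 50 to 99 percent multiplies waiting time fiftyfold, which is the mathematical core of Pola’s argument against running everyone at 100 percent.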




Guest View BY DAN SAKS

Solving the platform puzzle Dan Saks is founder and co-CEO of commerce platform provider AppDirect.

Platforms beat products. Every time. Let me tell you why.

At their most basic level, platforms are digital places where companies, suppliers and customers come together to create value that benefits everyone involved. Platforms connect software developers and customers in entirely new ways, in a manner that simply offering a product, or even a line of products, cannot.

The five largest companies in the world — Apple, Amazon, Alphabet, Microsoft, and Facebook — are all platform companies. Together, they have a market value of almost $4.7 trillion. Beyond pure tech companies, car companies like Jaguar Land Rover and Fiat, manufacturers such as Caterpillar, and publishers such as Houghton Mifflin Harcourt are just a few examples of companies that expand their reach with a digital-first mindset that includes a platform strategy.

Platforms don’t just belong to the giants, either. There are more than 300 “unicorn” startups, companies with $1 billion-plus valuations, which are largely platform-based and add more than $1 trillion to that growing number. While a platform itself doesn’t guarantee success, platforms are well on their way to becoming the solution of choice for companies looking to drive digital innovation, accelerate time to market and deliver a superior customer experience. In fact, more than eight out of 10 executives, 86 percent, say that platforms are the critical factor for success in the digital economy. For software developers, thriving platforms with an ecosystem of third-party products and services can represent a clear path to monetization.

Let’s take a closer look at how the pieces of the platform puzzle fit together.

The power of the platform. Salesforce is probably the most well-known enterprise software platform and developer ecosystem, and it is arguably the most prescient; it recognized the importance of platformization early on. The Salesforce AppExchange launched in 2005 and now offers more than 3,000 applications and components that extend and add value to Salesforce.com’s core products. Intuit is another powerful example of a company that seized

Platforms are well on their way to becoming the solution of choice for companies looking to drive digital innovation.

the opportunity to become more than a provider of products and services. Intuit’s platform, the QuickBooks App Store, serves roughly 50 million customers worldwide, boosting its online ecosystem revenue by 42 percent in the first quarter of this year and helping the company rake in $6 billion in revenue overall in 2018.

The beating heart of a platform company. Platforms may beat products, but a successful platform company starts with a successful product, whether it’s Apple and its iPhone, Salesforce and its CRM, or Intuit and its QuickBooks accounting solution. Here’s a disaster scenario: A company launches a new platform, and no one comes — not suppliers, not customers. To minimize this risk, a platform needs to solve genuine customer problems. Apple, Salesforce, and Intuit all use their platforms as a way to easily deliver new, innovative software to their customers without needing to shoulder the burden of in-house development. Before investing too much effort, it is critical for an enterprise considering a platform approach to understand customer pain points and how a platform can solve them.

An open partnership mindset. It used to be that a winning business strategy meant creating products and building fortress-like walls around them to keep out competitors. No more. Today’s innovators are open to new ways of collaborating to deliver value to customers. Whether it is making part or all of your solutions accessible via open APIs, or partnering with a competitor if the value proposition makes sense for each party, partnering should always be a first resort, not the last.

Nurturing developers. One reason the Salesforce AppExchange ecosystem is so successful is that it gives developers tangible ROI for their investment in the platform. For example, Salesforce.com gives its developers tools that enable them to cut development time by 40 percent compared to traditional methods and cut their time to market by 39 percent.
Intense competition for developer mindshare makes providing a clear path to monetization and ROI critical to creating a thriving developer ecosystem. Implementing an API-based integration layer to connect existing systems to new monetization solutions is one way to achieve this, and often the fastest, most cost-effective way for an enterprise to become a platform leader in the digital economy.
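An API-based integration layer of the kind Saks recommends can start as a thin adapter that reshapes an existing system’s records for a monetization service. The endpoint, field names and schema below are hypothetical, invented purely for illustration:

```python
import json
from urllib import request

def to_listing(product: dict) -> dict:
    """Map an internal product record onto a (hypothetical) marketplace schema."""
    return {
        "sku": product["internal_id"],
        "name": product["title"],
        "price_cents": round(product["price_usd"] * 100),
    }

def build_publish_request(product: dict, endpoint: str) -> request.Request:
    """Prepare the HTTP POST that would push a listing to the platform."""
    body = json.dumps(to_listing(product)).encode("utf-8")
    return request.Request(endpoint, data=body,
                           headers={"Content-Type": "application/json"},
                           method="POST")

req = build_publish_request(
    {"internal_id": "P-100", "title": "Reporting Add-on", "price_usd": 19.99},
    "https://platform.example.com/api/listings")
print(req.get_method(), req.full_url)
```

The adapter function is the whole trick: the legacy system keeps its own record shape, and the integration layer, not the legacy code, absorbs whatever the platform’s API demands.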




Analyst View BY ARNAL DAYARATNA

Toward a curated web: Why digital content needs standards

One of the striking attributes of the contemporary state of digital transformation is the conjunction of the ubiquity of digitization with its incomplete realization in a multitude of consumer and business contexts. On one hand, digital transformation has succeeded in empowering consumers to access data and information about just about any topic—whether it be the weather, news, sports, restaurant tips or parenting advice—from just about anywhere, at any time, on any device. On the other hand, digital transformation has produced a deluge of data that has the potential to confuse end users simply because of the sheer volume and heterogeneity of available data.

Currently, end users of web-enabled devices enjoy an embarrassment of riches with respect to data and information about any topic of interest. This surfeit of data has improved consumers’ ability to make decisions about what products to buy, and has transformed the ways people connect via social networking, chat and video conferencing. For example, increased digitization has empowered end users of technology to compare prices of goods sold in a multitude of venues with a few clicks of the mouse, connect with a relative halfway around the world via a video call, or learn a foreign language from the comfort of their living room.

How can you tell what’s right?

While digital transformation has democratized access to data, it has correspondingly encountered challenges with respect to the delivery of high-quality, curated information that allows consumers to effectively filter through the deluge of data at their fingertips. One of the best examples of this lack of curation involves topics related to health, nutrition and wellness. The worldwide web prominently represents almost every viewpoint under the sun regarding topics such as organic foods, methods of weight loss, stress reduction, causes of cancer and the safety of vaccines.

This lack of curation means that viewpoints that claim that vaccines cause autism, or that cat urine causes schizophrenia, or that organic foods are no different from conventionally grown foods, are represented with a prominence that has the potential to mislead consumers of this information. As such, enterprise portals and web content management technologies that specialize in the delivery of data would do well to consider methods of curating data to identify that which is considered more accurate and reliable, while conversely tagging data that is confirmed to be less credible.

Dr. Arnal Dayaratna is Research Director, Software Development at IDC.

Two solutions, neither great

There are two obvious solutions to the challenge of curating large-scale, unstructured data. One involves establishing a centralized governance authority that determines the truth, falsity and credibility of data points; the other involves a decentralized model whereby individuals with no particular relationship to one another collectively contribute to a standard for whether a given piece of news or data is true, false or somewhere in between.

The obvious shortcoming of the centralized model is deciding who gets to determine what is true and false. Meanwhile, the decentralized model risks over-reliance on hysteria and prejudice that can unduly swing the pendulum on whether a particular piece of information is deemed true or false.

As such, vendors who specialize in the delivery of digital content have their work cut out for them to develop solutions that curate data while nevertheless preserving a diversity of viewpoints. This effort can help rescue users of technology from the contemporary deluge of data and enhance the reliability and relevance of the portals that deliver digital content.
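One way to blunt the hysteria failure mode Dayaratna flags in the decentralized model is to shrink raw vote tallies toward a neutral prior, so that a handful of heated votes cannot settle a claim’s label on their own. The smoothing scheme below is a generic statistical technique, not something any vendor named here actually ships:

```python
def credibility(votes, prior=0.5, prior_weight=10):
    """Fraction of 'true' votes, shrunk toward a neutral prior.
    votes: list of 1 ('true') / 0 ('false') judgments from unrelated raters.
    A small, angry mob barely moves the score; broad consensus does."""
    return (sum(votes) + prior * prior_weight) / (len(votes) + prior_weight)

few_angry_votes = [0, 0, 0]            # three raters, all voting "false"
broad_consensus = [1] * 90 + [0] * 10  # a hundred raters, 90 percent "true"
print(round(credibility(few_angry_votes), 3))  # 0.385, still near neutral
print(round(credibility(broad_consensus), 3))  # 0.864, confidently credible
```

Smoothing does not answer the harder question of who sets the prior, which is the centralized model’s problem all over again, but it illustrates that the two models can be blended rather than chosen between.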

The obvious shortcoming of the centralized model... is who gets to determine what is true or false.





Industry Watch BY DAVID RUBINSTEIN

One platform, no waiting

David Rubinstein is editor-in-chief of SD Times.

Time (and tide) wait for no man, the expression goes. But then there’s latency.

Back in the day, before the Internet, before mobile phones and newer application architectures, people for the most part had some patience. If it took a few seconds for a video to load, we waited. If we went to a website and got a 404 error, that was OK; we’d try back later.

Not today. It’s an instant-gratification world: ‘I want it now, and if you can’t give it to me now, I’ll get it from someone else, or just forget about it altogether.’ And the biggest enemy to this new way of retrieving information is latency.

But latency is not just the time it takes for an application to fulfill your request. For businesses today, it’s also the time it takes for a customer to reach your help desk, the time it takes to find out whether you can get an order delivered by a particular date, or the time it takes to get information from a predictive analytics system quickly enough to prevent a machine from failing.

I was talking recently with Monte Zweben, the CEO of application modernization data platform provider Splice Machine, about this very subject. “These [latencies] all manifest themselves into incredible dollars. It might be outages, it might be customer experiences that are negative, tainting their brands. It all comes down to the fact that they can’t act quick enough.”

And one big area of latency organizations are experiencing is with their legacy applications. As companies look to modernize these applications and the systems they run on, they have three options, according to Zweben: throw away the custom applications that differentiate them from their competitors; use a common, SaaS-like application, in which everybody’s got the same things; or customize the common app, which comes with its own problems. There is a fourth option, which most companies undertake: rewrite the legacy app.
But, to quote the movie Animal House, “that could take years, and cost thousands of lives.”

“You see companies wholeheartedly rewriting entire elements of their portfolios that drive critical business processes, whether that’s customer

The biggest enemy to this new way of retrieving information is latency.

service, whether that’s predictive maintenance, whether that’s inventory optimization, whether that’s fraud detection in financial circles,” Zweben said. “The reason why this is fundamentally about latency is that when they rewrote, they used an architecture that had all the components of trying to build smart, digital processes that were data-driven and using artificial intelligence or machine learning, and these processes required lots of different compute engines to be duct-taped together. And the point is that when you have a disintegrated system that is sort of moving large volumes of data around in order to accomplish this digital transformation, the companies experience latency.”

The systems that run business applications are OLTP systems, and the systems that analyze data are OLAP systems. Data moving from one system to the other creates latency. Zweben explained that machine learning can be used to solve that problem. “An example of that is thinking about bad actors, thinking about people committing fraud, or money laundering, or cyberthreats,” he said. “They’re constantly changing their techniques. You can’t think of IT and applications as, ‘Build your application, go through testing, and deploy it.’ Now, with a smart application, you have to actually be continuously training new models.”

Splice Machine has built a new machine learning workbench that empowers data scientists to keep up with the bad actors or the changing markets and continuously deploy new machine learning models on mission-critical applications. This, Zweben explained, means organizations can reduce the latency between injecting a model into an app and the next time they update it. That’s another kind of latency, one that didn’t exist 20 years ago because intelligent apps didn’t exist 20 years ago. Today, data scientists are driving everything based on running experiments on data that’s constantly changing.
According to Zweben, “What’s different now is the fact that you’re operating on data from multiple sources, in vast volumes, and running experiments in order to converge on better models that need to be injected into the applications.”

“Your estimated wait time is … now reduced,” said no one in customer service, ever. It’s time to deal with latency.
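The continuous-training loop Zweben describes reduces, in outline, to retraining on a fresh window of data and redeploying the result at once. The sketch below is a toy under that framing, and is emphatically not Splice Machine’s workbench API; the “model” here is just a learned flagging threshold:

```python
def train(samples):
    """Stand-in for model fitting: learn a flagging threshold as the
    average score of transactions labeled fraudulent in this window."""
    scores = [s["score"] for s in samples if s["fraud"]]
    return sum(scores) / len(scores) if scores else 0.5

def serve(threshold, txn):
    """The deployed 'model': flag anything at or above the threshold."""
    return txn["score"] >= threshold

# Each cycle retrains on the latest window and deploys immediately, so the
# gap between "model updated" and "model serving" stays small.
windows = [
    [{"score": 0.9, "fraud": True}, {"score": 0.2, "fraud": False}],
    [{"score": 0.8, "fraud": True}, {"score": 0.6, "fraud": True},
     {"score": 0.1, "fraud": False}],
]
threshold = 0.5
for cycle, window in enumerate(windows):
    threshold = train(window)  # retrain on the freshest data
    decision = serve(threshold, {"score": 0.75})
    print(f"cycle {cycle}: threshold={threshold:.2f}, flag 0.75 txn={decision}")
```

The point of the loop is the one Zweben makes: as the bad actors shift (the fraud scores in each window change), the deployed model shifts with them in the same cycle, instead of waiting for the next release.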



