SD Times June 2019


JUNE 2019 • VOL. 2, ISSUE 024 • $9.95 • www.sdtimes.com



Contents

VOLUME 2, ISSUE 24 • JUNE 2019

FEATURES
There’s a diversity problem in the tech industry... and it’s not getting any better (page 8)
The 2019 SD Times 100 (page 16)
GDPR one year later: Slow compliance, lax enforcement (page 22)
Who owns continuous testing? (page 29)
BUYERS GUIDE: For effective DevSecOps, shift left AND extend right. The last of three parts (page 37)

NEWS (pages 6, 21, 34)
News Watch
DevOps Watch
Chances of data leaks are high in mobile apps

COLUMNS
GUEST VIEW by Itzhak Assaraf: A new approach to personal data discovery (page 44)
ANALYST VIEW by Rob Enderle: Qualcomm’s potential hybrid AI revolution (page 45)
INDUSTRY WATCH by David Rubinstein: Of open source, data breaches and speed (page 46)

Software Development Times (ISSN 1528-1965) is published 12 times per year by D2 Emerge LLC, 80 Skyline Drive, Suite 303, Plainview, NY 11803. Periodicals postage paid at Plainview, NY, and additional offices. SD Times is a registered trademark of D2 Emerge LLC. All contents © 2019 D2 Emerge LLC. All rights reserved. The price of a one-year subscription is US$179 for subscribers in the U.S., $189 in Canada, $229 elsewhere. POSTMASTER: Send address changes to SD Times, 80 Skyline Drive, Suite 303, Plainview, NY 11803. SD Times subscriber services may be reached at subscriptions@d2emerge.com.



www.sdtimes.com

EDITORIAL
EDITOR-IN-CHIEF: David Rubinstein, drubinstein@d2emerge.com
NEWS EDITOR: Christina Cardoza, ccardoza@d2emerge.com


SOCIAL MEDIA AND ONLINE EDITORS: Jenna Sargent, jsargent@d2emerge.com; Jakub Lewkowicz, jlewkowicz@d2emerge.com
ART DIRECTOR: Mara Leonardi, mleonardi@d2emerge.com
CONTRIBUTING WRITERS:

Alyson Behr, Jacqueline Emigh, Lisa Morgan, Jeffrey Schwartz
CONTRIBUTING ANALYSTS: Cambashi, Enderle Group, Gartner, IDC, Ovum

ADVERTISING SALES
PUBLISHER: David Lyman, 978-465-2351, dlyman@d2emerge.com
SALES MANAGER: Jon Sawyer, jsawyer@d2emerge.com

CUSTOMER SERVICE
SUBSCRIPTIONS: subscriptions@d2emerge.com
ADVERTISING TRAFFIC: Mara Leonardi, adtraffic@d2emerge.com
LIST SERVICES: Jourdan Pedone, jpedone@d2emerge.com

REPRINTS: reprints@d2emerge.com
ACCOUNTING: accounting@d2emerge.com

PRESIDENT & CEO: David Lyman
CHIEF OPERATING OFFICER: David Rubinstein

D2 EMERGE LLC, 80 Skyline Drive, Suite 303, Plainview, NY 11803, www.d2emerge.com





NEWS WATCH

Microsoft previews Visual Studio Online, Web Template Studio, and releases .NET 4.8

Microsoft is closer to delivering a long sought-after capability for Visual Studio developers: a web-based version of the IDE. Visual Studio Online is intended to let developers remotely debug, perform pull requests or make quick edits from a smartphone or tablet. Developers can also join sessions in Visual Studio Live Share, which Microsoft said is now available, and use the new AI-assisted IntelliCode, which aims to improve the IntelliSense feature in Visual Studio. The Visual Studio and VS Code editors in Visual Studio Online will be hosted in Microsoft Azure, though it doesn’t require use of the cloud service.

The new Microsoft Web Template Studio (WebTS), designed as a cross-platform extension for Visual Studio Code, provides an open-source, wizard-based experience that guides developers through web app creation, generates code, and provides step-by-step instructions. According to the company, WebTS can generate code for front-end and back-end frameworks, pages and cloud services. The project is still in early development and supports one full-stack app path, with React and Node.js; support for Angular and Vue is currently in the works, Microsoft explained.

The company also released .NET Framework 4.8, available for Windows 7 Service Pack 1, Windows 8.1 and Windows 10; it ships as standard with the Windows 10 May 2019 Update. Version 4.8 improves the JIT compiler with bug fixes and code-generation performance optimizations brought over from .NET Core 2.1. On the security front, NGEN images no longer contain writable and executable sections, reducing the surface area susceptible to attacks, and 4.8 adds antimalware scanning for all assemblies, a feature that wasn’t available in previous releases. Additional improvements include updates to the Zlib native compression library and service behavior enhancements for WCF.

Microsoft also announced at its Build conference last month that it will deliver a unified .NET 5 platform next year, using one Base Class Library and set of APIs to bring its various toolchains, including ASP.NET and Xamarin, together into a consolidated framework. Microsoft expects to release .NET Core 3.0 this summer, which will bring in various Windows desktop technologies such as WPF and WinForms, as well as support for C# 8.0.

People on the move

• Docker has announced that Rob Bearden will be its new CEO. Previously the CEO of Hortonworks, Bearden has over 20 years of experience scaling software companies. As Docker’s CEO, Bearden will work to accelerate Docker’s enterprise go-to-market strategy and fuel innovation in technologies and products.

• Ahmad Nassri has joined npm as its new chief technology officer. Nassri will help accelerate npm’s growth, and will work closely with npm co-founder and Chief of Open Technologies Isaac Schlueter. “I’m thrilled to join the company at such an exciting time, expanding our investment in tools and best practices that ensure the continued growth of enterprise and open source communities alike,” Nassri said.

• Todd Scallan is now chief product officer at Plutora. Scallan will lead the product team and accelerate the company’s momentum by implementing the mechanisms, decision-making, and processes needed to achieve growth. Scallan previously worked at Flipboard, Axcient, Interwoven, and Segue Software.

• Snowflake has welcomed Frank Slootman as its chairman and chief executive officer. Slootman was the CEO of ServiceNow until 2017, leading the company from under $100 million in revenue to $1.4 billion, in addition to an IPO.

Node.js 12 is now available

Node.js has been upgraded to version 12, which includes faster startup and better default heap limits, as well as many other upgrades and new features. Version 12 will enter the Node.js Long Term Support (LTS) line in October of this year.

In Node.js 12, the V8 engine is upgraded to 7.4, providing async stack traces, faster calls when argument counts mismatch, and faster JavaScript parsing. Node.js is also introducing TLS 1.3 as its default maximum protocol version. The runtime will now configure the JavaScript heap size based on available memory instead of using defaults set by V8, as it had in previous releases. Also, the default HTTP parser has been switched to llhttp.

Version 12 includes better support for native modules in combination with Worker Threads, which are useful for performing CPU-intensive JavaScript operations, and an updated N-API, which makes it easier to use your own threads for native asynchronous functions.

Red Hat announces OpenShift 4, Enterprise Linux 8

Red Hat announced a rearchitected OpenShift at its Summit conference last month, with version 4 bringing a cloud-like experience. The company said it would be generally available in April.

The release also includes Operator Hub, which makes the concept of operators a first-class citizen in the platform. The hub was created by Red Hat, AWS, Azure and Google as the community place to store operators and make them available to the community. The operator framework, rolled out last May, enables organizations to bring stateful applications into Kubernetes by managing application maintenance, failover and scaling. Further, Red Hat has created a Certified Operator program for enterprises that require a known support path; it certifies that the companies behind an operator are working together for a deeper level of integration.

Among other new features in 4.0 is OpenShift Service Mesh, which is based on Istio and natively integrates on the operations side with Prometheus for monitoring and Grafana for dashboards, and on the developer side with Jaeger for native tracing and Kiali for visualizing which microservices are in the service mesh and how they are connected. Red Hat is also going into beta with serverless technology on the platform, specifically Knative, which provides the plumbing that makes serverless work.

CodeReady Workspaces, which reached general availability in March, is for customers who want a well-defined developer experience. “Customers have said, ‘We want to provide the developer with the IDE, and on the back end we want to wire that in so they can write their code, commit their code, and not have to care about any of the details,’ like where the source code lives, or CI pipelines,” Micklea said. “It’s all just available to them.”

Also, Red Hat Enterprise Linux 8 has been released, introducing Application Streams, a capability that delivers frequently updated languages and frameworks, empowering developers to innovate on the latest versions while maintaining application stability. Red Hat Universal Base Image is also generally available for developers to build Red Hat-certified Linux containers.

Apache Software Foundation announces GitHub integration

The largest open-source foundation, with over 350 open-source projects and 200 million lines of code, has joined forces with GitHub. The Apache Software Foundation (ASF) had been talking about integrating with GitHub since 2016, and it has now completed the migration of its Git service to GitHub.

Prior to the integration, Apache projects could use two different version control services: Apache Subversion and Git. But as the popularity of GitHub grew over the years, a number of projects and communities wanted their code hosted there as well, the ASF explained. Unfortunately, they could only host read-only mirrors on GitHub, which limited the use of GitHub’s tools on those projects, the foundation explained. By integrating with GitHub, the ASF’s projects are hosted on a single platform where they can be reviewed and collaborated on by the 31 million developers who use GitHub worldwide, GitHub explained.

Docker announces update to platform

Docker announced Docker Enterprise 3.0, an update to its container platform with new desktop functionality, developer tooling, a simplified Docker Kubernetes Service, and the ability to run the platform as a managed service.

The new release introduces Docker Applications, a set of tools with which developers can integrate multi-service applications into a single, deployable package, the company said in its announcement. Docker Applications offers application templates, a designer and version packs, and automates the creation of Docker Compose and Kubernetes YAML files. This, the company explained, “makes it possible for flexible deployment across different environments.”

Docker claims its Kubernetes Service (DKS) is the first to integrate Kubernetes from desktop to server, and that it is compatible with Docker Compose, Kubernetes YAML and Helm charts. Other new functionality in DKS, the company said, includes enhanced security, access controls and automated life-cycle management. It also provides customers with the option to use Docker Swarm Services for container orchestration. Finally, Docker is offering Enterprise as a service, on-premises or in the cloud.


Red Hat, Microsoft bring OpenShift to the Azure cloud

Red Hat and Microsoft announced Azure Red Hat OpenShift, a joint Kubernetes solution running in the Microsoft Azure public cloud. The partnership will allow IT organizations to use Red Hat OpenShift Container Platform on-premises and to bring Azure services to those workloads. The service is backed by Red Hat’s expertise in open source and Microsoft’s cloud strengths, the companies said. Together, OpenShift and Azure give organizations a way to bring containerized applications into existing workloads, and then manage and orchestrate those workloads in hybrid environments.

Google unveils new enterprise edition of Google Glass

Google is not giving up on its augmented reality wearable, Google Glass. Despite dwindling interest in the device over the last couple of years, Google is releasing a new enterprise edition in the hope of regaining some momentum. The Google Glass Enterprise Edition 2 comes with an improved platform, a more powerful CPU, enhanced performance, and a new price point. Features include the Qualcomm Snapdragon XR1 platform, a new artificial intelligence engine, an updated camera with improved quality, and the ability to develop and deploy easily with Android support. To support scaled deployments, Glass Enterprise Edition 2 supports Android Enterprise Mobile Device Management.


There’s a diversity problem in the tech industry... and it’s not getting any better



BY JENNA SARGENT

In the past several years, the tech industry seems to have put more effort into promoting and increasing diversity. But are those initiatives actually working? In many respects, it seems that the answer is no; things aren’t actually getting any better.

According to a 2018 report from the National Center for Women & Information Technology, women make up 57 percent of the U.S. workforce but hold only 26 percent of technology-related positions. According to another report, from consulting firm McKinsey and Company, the situation is worse for women of color, with black, Latina, and Native American women making up only 4 percent of roles in the computing workforce (almost none of which are senior leadership roles), despite making up 16 percent of the general population.

And it’s not just larger, established companies that are struggling to be more diverse. Silicon Valley Bank’s Women in Technology Leadership 2019 report confirms the issue persists in startups too. It found that only 56 percent of startups have at least one woman in an executive position, while only 40 percent have at least one woman on their board of directors.

“For every inclusive start-up, there’s one that’s the exact opposite. It will take time, effort and determination to make the shift permanent,” said Laura Baldwin, president of media company O’Reilly.

On March 6, at a hearing of a Congressional subcommittee on consumer protection and commerce, Mark Luckie, a digital media strategist and former manager at Facebook and Twitter, said a common explanation for this lack of diversity is the pipeline: that not enough women and people of color have tech-related degrees. But this isn’t true. There are still more women and people of color graduating than being hired, he explained.

“There is a common refrain in Silicon Valley: ‘We can’t lower the bar.’ This term is widely understood to infer that black, latinx, and women candidates are less qualified. Their hiring would be a token, putting them over more qualified white or Asian male candidates, who are actually often equally or sometimes less qualified,” Luckie said.

When women or people of color are hired, they often face unwelcoming environments. A Harvard Business Review survey found that half of all diverse employees (meaning women, racial or ethnic minorities, and LGBTQ people) experience bias in their day-to-day work. And as a result of daily bias, 52 percent of women are leaving their jobs, according to a report from the Center for Talent Innovation. Their reasons for leaving include “hostile macho cultures,” isolation, and difficulty embodying leadership attributes, the report stated.

According to McKinsey and Company’s report, the number of women in computing has actually decreased over the last 25 years. Frustrations at work may also be combining with opportunities or situations in their personal lives, making it more likely that women choose to leave their tech careers behind.

“In my observations, when women lose opportunities, they actually lose time to build accumulative advantage,” said Urvashi Tyagi, vice president of engineering at American Express. “Over the years, they start to fall behind and don’t see a path to reach their potential. In the first decade of their careers, when a demanding personal circumstance occurs — like the birth of a child, challenging family and relationship dynamics, or the loss of a job — women may be on the edge, ready to scale back or drop out of the workforce.”

Tyagi has noticed an increase in the number of female students when recruiting for tech roles on college campuses. But this progress in the pipeline is negated if those women end up leaving their careers. “This obviously erases some of the progress the industry has made and reduces the pool of female engineers in mid-career and leadership roles.”

Despite reports like these, some within the industry are seeing change. Elizabeth Ferrao, co-founder of the New York City chapter of Women Who Code, said that while she’s not sure the numbers back up the claim that there is change, in the past few years she has seen more awareness on leadership’s part. “I’ve seen a cognizance of leadership to recognize whether your managers are being reflected amongst your employees and those ethnicities are there and those genders are there. So I think there’s definitely an increased cognizance of it,” Ferrao said.

O’Reilly’s Baldwin has also noticed some positive changes within the industry, but admits that there’s still a lot of work to be done, even within her own company. “Even at O’Reilly we have a long way to go regarding our own hiring processes. We do our best to institutionalize our approach toward diversity and inclusion but we have yet to achieve what I would call ‘success.’”

Recognizing that there is still work to be done within your organization can help you get started on making changes from within. A few things Baldwin still believes need work in the tech industry include getting more women and minorities on boards of directors and ensuring equal pay. “Companies really need to try and make diversity a part of their culture from the top down, and then diverse hires will come more naturally as well,” she said.

Why is diversity important?

The impact of not having different backgrounds represented in the tech industry will have lasting effects on our society. The negative effects of technology bias are already being seen. For example, most facial recognition software can almost perfectly identify white faces, but can’t consistently identify people with darker skin tones. This is because those algorithms are trained on data that features primarily white faces.

The consequences of this bias are huge when these systems are used by entities such as hiring managers, insurance companies, or police departments, just to name a few, explained Joy Buolamwini, graduate researcher at the MIT Media Lab, in a TED Talk she gave about bias in facial recognition algorithms. “We’ve seen that algorithmic bias doesn’t always lead to fair outcomes.”

Joy Buolamwini, graduate researcher at the MIT Media Lab, gives a TED Talk about bias in facial recognition algorithms. (Photo: John Werner)

She believes that fixing these biases starts with the people who are writing code. “What can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people.” She recommends that teams be composed of diverse individuals who can check each other’s blind spots, and that teams look at their development practices to ensure that they are developing fair systems. “We’ve used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought,” Buolamwini said.

In addition to the effect on product development, a lack of diversity can really help cybercriminals out. James Slaby, director of cyber protection at cybersecurity company Acronis, listed just a few of the reasons that cybersecurity teams need to be more representative. First, there is a shortage of qualified cybersecurity experts, in part due to competition from government agencies, service providers, vendors, and the criminal sector. When competing with all of these different groups for talent, organizations should be trying to cast as wide a net as possible when hiring. Second, customized malware attacks are among the most common phishing attacks, and a team that is as diverse as possible can help defend against them. For example, a female hacker might be able to more effectively create a phishing email that uses a women’s issue as the hook. According to Slaby, the same is true of other racial, ethnic, religious, gender, sexual preference, or

other groups. “Cyber criminals are always looking for the weakest link in their targets, and now have the same tools used by corporate marketers to measure and analyze the success rate of various tactics,” said Slaby. “If there is a blind spot in your security team due to its homogeneity (e.g., it’s all straight, white, Christian males), the bad guys will find it. Diverse backgrounds and viewpoints offer a better defense against a variety of social-engineering attacks. It doesn’t literally take a thief to catch a thief in the cyber security world, but the ability to think like your attacker is a huge plus, so a diverse team is more likely to be able to counter a multiplicity of psychological tactics.”

Tyagi also believes that companies need to ensure that their leadership roles are filled with more diverse talent, which will in turn attract more diverse candidates. O’Reilly’s Baldwin agrees, saying: “Companies really need to try and make diversity a part of their culture from the top down, and then diverse hires will come more naturally as well.” Another thing that companies can do to improve hiring practices is to have leaders “regularly educate themselves, capture diversity KPIs and monitor progress, to continuously learn and evolve in becoming a more inclusive space for all,” Tyagi explained. Baldwin also recommends that organizations carefully consider the wording of job postings and ask themselves whether certain words are associated with a certain gender.

Another important step in reducing and eliminating unconscious bias is to regularly educate and train employees. “Unconscious bias training should become an extension of the yearly compliance-driven trainings that are mandatory in many companies,” said American Express’ Tyagi.

Write your job postings to attract more diverse candidates

According to job hiring site Monster, the way a job description is written can influence who is going to apply. For example, women tend to apply to a job only when they feel 100 percent qualified; if they see a posting where they meet only 75 percent of the qualifications, they may not apply for the position, Monster explained in a post. Monster recommends listing only the absolute “must-haves” among the requirements, rather than stuffing the description with skills that aren’t necessary for the job. The company also recommends branding your organization as a more diverse one. “Whether it’s your career site, social channels, or how you present yourself at industry events, you want to position your company as being as diverse as possible, and highlight the strength of your diverse workforce,” the company wrote.

Tyagi recommends that rather than showcasing a geeky company culture and cool technologies, companies focus on the problems they are trying to solve and the impact they have on helping people around the world. “Connecting the existence of a company to the value it creates for the customers, the community, and the society at large makes an organization more attractive to women technologists.”

Professional organizations can lift you up when the industry doesn’t

Joining a professional organization, whether internal or external to your company, can be highly beneficial to under-represented minorities (URMs). Elizabeth Ferrao, co-founder of the New York City chapter of Women Who Code, gave a talk at a diversity luncheon at O’Reilly’s Software Architecture conference in New York City in February, where she noted that the benefits of joining a group within your organization include attachment to a strong internal organization, the potential for introductions to leadership, network effects, and personal brand growth. Joining an external group has similar benefits, but the network and connections will be tied to other organizations rather than your own. Along with Estella Madison Gonzalez, Ferrao co-founded the New York chapter five years ago. The original network was started in San Francisco and had been going strong for a few years before the New York chapter formed, but the NYC chapter has now actually surpassed that original network in terms of membership.





American Express’ Tyagi believes that companies should sponsor and promote employee participation in these types of organizations. At American Express, for example, there is a Women in Technology (WIT) community that organizes local events and partners with diversity organizations such as the Helen Keller National Center, Girls Who Code, Black Girls Code, and AnitaB.org. According to Tyagi, WIT also organizes internal events, such as Sip & Chat roundtables, cluster mentoring, and a development series that engages diverse colleagues and helps them grow professionally.

Getting involved with an organization, whether external or internal, can help you build up your network. “I think spending your time on free projects on groups like this — like Women Who Code and Girls Develop It — are really great ways to start that network,” said Ferrao. “To meet the people who will lift you up in a couple of years. And realize that it takes time, it takes effort. And you have to go out of your way and out of your comfort zone. But I do think that there is a pretty large reward for it.” Having a network of similar people will ensure that there are people around you who will lift you up when things are rough, Ferrao explained.

A group of women technologists taking advantage of networking opportunities at the 2018 Grace Hopper Celebration. (Photo: AnitaB.org)

Conferences can help, too

In addition to joining professional organizations, conferences can be a great way for members of under-represented groups to further their tech careers. According to O’Reilly’s Baldwin, conferences are “a natural place to present different approaches, viewpoints and experiences. They are a place where people gather, to learn from each other and share stories.” But if those conferences aren’t a real place for inclusion, those perspectives will be biased, Baldwin explained. O’Reilly, for example, does a number of things at its conferences to be more inclusive, including encouraging speaking proposals from under-represented minorities, planning a diverse speaker lineup, offering support and travel expenses for those speakers, offering a Diversity and Inclusion scholarship program for at least 10 URMs, offering pronoun ribbons and encouraging all attendees to use them to normalize the sharing of pronouns, designating all-gender restrooms, and setting aside religious observance rooms, among other things.

“Most importantly, while the entire organization is behind our efforts, we have dedicated one employee who is assigned to owning our inclusion and diversity programs. Suzanne Axtell has worked tirelessly to hold O’Reilly to the best possible standard and the quality of her effort has made a significant difference for us, the industry as a whole and for our conference attendees,” Baldwin explained.

Unfortunately, URMs are often hesitant to ask their companies to send them to conferences. Baldwin recommends that organizations invest in sending their under-represented employees to conferences and take the burden of asking permission off of those employees. “Communities gather at conferences, and having URMs more visible in these communities supports URMs as experts and active participants. It can also help reduce a sense of isolation some URMs experience, particularly if they’re the only member of an under-represented group at their company or on their team,” Baldwin said.

Conferences for women in tech
Women in Technology Summit, San Jose, CA (6/9/19)
Tech Women Rising, Seattle, WA (6/7-6/9)
Women of Silicon Roundabout, London (6/25/19)
Grace Hopper Celebration, Orlando, FL (10/1-10/4)
Women in Tech Summit West, Denver, CO (10/4/19)
Leadership Training - Women in IT, Toronto (10/21-10/25)
Wonder Women Tech National Conference, Long Beach, CA (11/7-11/9)
Women in Tech Summit Southeast, Raleigh, NC (11/8-11/9)
Tech Up for Women, NYC (11/12)
European Women in Technology, Amsterdam (November)



The 2019 SD Times 100

The software industry is a lot like birds. A technology advancement comes along, and software providers flock to it, hawking their own solutions as they try to differentiate from the pack. After the initial lift these companies get from market momentum, many fall back to earth. The strongest and fastest, though, spread their wings and are carried along by the winds of these new ideas.

The 2019 SD Times 100 recognizes those companies and organizations that are the leaders, innovators and influencers in the software development market. They have flown ahead of the flock with new, innovative projects, by establishing leadership positions, or by influencing how and what we create. This year’s list sees many of the mature companies coming back to roost, but also includes a few fledglings that are just learning to fly on their own. This is reflective of the fact that software development does not stand still. New methodologies, architectures and paradigms are emerging, and the SD Times 100 companies are at the point of the flying V, leading the way for the rest to follow. And the nest, as they say, is history.

APIs, Libraries and Frameworks
API Fortress, Aspose, Jitterbit, Kong, LEAD Technologies, MuleSoft, NodeSource, Postman, TIBCO Software



Big Data and Analytics

Cloudera Confluent Elastic Databricks Kinetica Splice Machine Splunk Tableau Software Talend

Database and Database Management

Couchbase DataStax Datical DBmaestro Melissa MongoDB Neo4j Oracle Progress Redgate Software Redis Labs Tiger Graph



DevOps

Atlassian Chef CircleCI CloudBees HashiCorp JFrog Octopus Deploy OpenMake Software Puppet Rancher XebiaLabs

Value Stream Management

CollabNet VersionOne ConnectAll GitLab Micro Focus Plutora Tasktop

Development Tools

Docker GitHub Gremlin Jama Software JetBrains LaunchDarkly Optimizely Perforce Pivotal Revulytics SmartBear Sparx Systems Split.io Stackery Text Control

Testing

Applitools Eggplant Mobile Labs Parasoft Perfecto Testim Tricentis



Influencers

Amazon Facebook Google Microsoft Red Hat SAFe Scrum.org The Apache Software Foundation The Eclipse Foundation The Linux Foundation The World Wide Web Consortium

Security

Aqua Security Signal Sciences Sonatype Synopsys Veracode White Source

Low Code/No Code

Appian K2 Kintone Kony Nintex OutSystems Quick Base

Performance

AppDynamics Catchpoint Dynatrace Instana LightStep New Relic Stackify




DEVOPS WATCH

New Relic connects distributed DevOps teams BY CHRISTINA CARDOZA

New Relic One offers improved observability, global search and cross-account service maps.

New Relic is extending its platform to better support DevOps teams with the ability to find, visualize and understand data in complex environments. The company announced New Relic One, a new approach to observability with monitoring capabilities, new dashboards, global search and cross-account service maps. "DevOps teams continue to struggle with a fragmented view of their complex environments due to data about customer experiences residing in multiple systems, dashboards, and organizations," said Lew Cirne, CEO and founder of New Relic. "New Relic One solves an important set of problems for our customers today, and it's also our platform for delivering the next decade of innovation to our customers. We're just getting started on our mission to help companies deliver more perfect software faster with New Relic One."

According to the company, New Relic One is designed to be entity-centric, meaning "we connect you to all the data you need to understand your increasingly complex and interdependent systems in the context of your digital business. It's a critical update to the 'application-centric' approach we've always taken, because in complex modern software systems, there is so much more to worry about than just applications," New Relic wrote in a blog post. Entities include apps, hosts, containers, Kubernetes clusters, cloud services, databases and VMs. Cirne went on to explain that the ability to unify data across multiple

xMatters open sources chaos engineering tool BY JAKUB LEWKOWICZ

xMatters has announced the open sourcing of Cthulhu, a chaos engineering tool that enables DevOps teams to design resilient, self-healing services across various infrastructures. According to the company, chaos testing has gained popularity as more organizations move to a distributed systems model. The core features of Cthulhu include cross-platform failure orchestration, which automatically runs random failure scenarios, and version-controllable scenarios so engineers can easily reproduce a vulnerability. Additionally, automated communications allow team members to monitor the evolution of failure experiments through targeted notifications.

"Microservice architecture can provide many benefits in scalability and functional encapsulation, but can also generate complex failure scenarios due to service dependencies. Chaos engineering can help expose these issues before they manifest themselves in production," said Tobias Dunn-Krahn, CTO of xMatters. "Cthulhu exposes critical gaps in the self-healing ability of systems so that engineering teams can continuously re-fortify their applications against failures and keep the business running smoothly." z
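Conceptually, a tool like Cthulhu combines three ingredients: failure scenarios kept as version-controlled data, seeded randomness so a run can be reproduced, and notifications on every injection. The sketch below illustrates those ideas generically in Python; it is a hypothetical illustration, not Cthulhu's actual API, and all scenario names are invented.

```python
import random

# A "scenario" is plain declarative data, so it can live in version
# control and be replayed later to reproduce a vulnerability.
SCENARIOS = [
    {"name": "kill-payment-service", "target": "payment", "action": "kill"},
    {"name": "partition-cache", "target": "cache", "action": "netsplit"},
    {"name": "cpu-burn-api", "target": "api", "action": "cpu_burn"},
]

def run_scenario(scenario, notify):
    """Apply one failure and tell the team what happened (actions stubbed)."""
    notify(f"injecting {scenario['action']} on {scenario['target']}")
    # Real tooling would stop a container, drop packets, burn CPU, etc.
    return {"scenario": scenario["name"], "injected": True}

def chaos_run(seed, notify=print):
    """Pick a random scenario; a fixed seed makes the run reproducible."""
    rng = random.Random(seed)
    scenario = rng.choice(SCENARIOS)
    return run_scenario(scenario, notify)

result = chaos_run(seed=42)
```

Because the scenario list is data and the seed fixes the random choice, the same experiment can be re-run after a fix to confirm the gap is actually closed.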

accounts gives teams a "pan-enterprise view" of relationships and dependencies. "If you get an alert, for example, you don't just need to know that something is spiking. You also need to know, 'What is the thing that is generating that data?' and, 'What is it dependent on?'" Cirne wrote.

Features include:
• Cross-account service maps with the ability to visualize up and downstream dependencies of entities, enabling users to detect the root cause of an incident
• A search and discovery platform for finding entities across the enterprise
• Dashboards that include New Relic Insights and a chart builder for creating queries
• The ability to create business or domain-specific visualizations such as point-of-sale data, real-time telemetry, and performance based on a geographic area
• A unified user experience where users can gain information about their performance data as well as dig deeper into the data

In addition, New Relic One includes monitoring for AWS Lambda, a Kubernetes cluster explorer and distributed tracing global search. z
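A service map of this kind is mechanically useful: given the dependency relationships, answering "what is this alerting entity dependent on?" is a graph walk downstream from the entity. A minimal sketch in Python (the entity names and graph shape are invented for illustration and are not New Relic's data model):

```python
from collections import deque

# entity -> entities it depends on (its downstream dependencies)
DEPENDS_ON = {
    "checkout-app": ["payment-svc", "session-cache"],
    "payment-svc": ["payments-db"],
    "session-cache": [],
    "payments-db": [],
}

def downstream(entity, graph):
    """Breadth-first walk of everything `entity` depends on,
    directly or transitively — the root-cause candidates to
    inspect when `entity` alerts."""
    seen, queue = [], deque(graph.get(entity, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.append(node)
            queue.extend(graph.get(node, []))
    return seen

print(downstream("checkout-app", DEPENDS_ON))
# → ['payment-svc', 'session-cache', 'payments-db']
```

When the checkout app spikes, the walk immediately narrows the investigation to its payment service, cache, and the database behind the payment service.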




GDPR ONE YEAR LATER: Slow compliance, lax enforcement BY JENNA SARGENT

It’s been one year since the General Data Protection Regulation (GDPR) went into effect. The regulation completely changes how organizations need to handle the data of European Union citizens.

The impact of the GDPR, though, has been minimal to this point. Compliance has been slow, enforcement has been lax, and organizations are finding that learning about data origin, residence and use can be hugely daunting and difficult. When it first went into effect, there was a lot of panic among organizations that did business in the European Union (EU), because the fines for not complying can be steep. According to Christian Wigand, a spokesman for the European Commission, fines are determined based on a number of factors, such as how the company protected its data, how it reacted to a data breach, and whether it cooperated with the authorities.

The regulation has had both positive and negative effects on organizations and consumers.

According to a report from DLA Piper, as of February, 91 fines have been issued. They noted that a majority of those fines were relatively low in value. One of the major fines is Google, which was fined €50 million by France, roughly US$56 million. According to a press release from the European Data Protection Board, Google is being fined "for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization."

The regulation lists two different tiers of fines. For less severe violations, a company can be fined up to €10 million or up to two percent of their revenue from the previous financial year. More severe violations can cost a company up to €20 million or up to four percent of their revenue from the previous financial year. According to DLA Piper's report, the lower fines are applied in response to the breach of obligations to the controllers and processors (including breach notification obligations), certification bodies, and monitoring bodies as described in the regulation. The higher fine is applied when there is a breach of the "basic principles for processing including conditions for consent, data subjects' rights, international transfer restrictions, any obligations imposed by Member State law for special cases such as processing employee data, [and] certain orders of a supervisory authority."

According to Wigand, the fines that are collected will go back into the country's budget. How the money is used will be determined by the individual country. The enforcement of the GDPR is the responsibility of national data protection authorities, together forming the European Data Protection Board (EDPB), Wigand explained.

"The only company that I've seen that's a big news story as far as GDPR enforcement was that fine that they imposed on Google," said Matt Hayes, vice president of SAP Business at Attunity, a data management company. "And I believe Google is challenging it, but I haven't seen too much enforcement of GDPR outside of that." According to Hayes, there is a lot riding on the outcome of Google's case if they try to fight the fine. "If Google can fight it and win it, that is problematic. If Google doesn't win it, then I think it's something that a lot of companies will notice."
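The two fine tiers reduce to simple arithmetic: under GDPR Article 83 the applicable cap is the flat amount or the revenue percentage, whichever is higher. A sketch of that calculation (the revenue figures are illustrative only):

```python
def max_fine(annual_revenue_eur, severe):
    """Upper bound of a GDPR fine under Article 83: the flat cap or the
    percentage of the previous year's revenue, whichever is higher."""
    flat = 20_000_000 if severe else 10_000_000
    pct = 0.04 if severe else 0.02
    return max(flat, pct * annual_revenue_eur)

# A company with €2bn in revenue faces up to €80m for a severe violation...
print(max_fine(2_000_000_000, severe=True))   # 80000000.0
# ...while a smaller firm is bounded by the flat cap.
print(max_fine(50_000_000, severe=False))     # 10000000
```

The "whichever is higher" rule is why large companies watch the percentage tier: for any revenue above €500 million, the 4 percent figure exceeds the €20 million flat cap.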

Enforcement leads to compliance

Hayes expects enforcement to pick up soon, if the EU hopes to ensure compliance. If enforcement remains as lax as it is, companies will continue only loosely complying with the law, he explained.



For example, Attunity provides a product that deals with the right to be forgotten. Hayes believes that some of their customers have bought that product but haven't implemented it. "We've seen some companies say just by owning [Attunity's] software we can demonstrate some level of compliance. So they can actually own our software and not implement it." Hayes believes we'll continue to see this slow compliance, unless there is a concrete reason for companies to really speed things up. "I think we're going to see companies say, look we've taken a few baby steps towards GDPR, but the minute that they find out that they're going to be audited, or the minute that there's some enforcement in their industry, they might then decide to tighten the screws up a bit."

According to Gary LaFever, CEO of Anonos, different countries have been taking different approaches to audits. The law is being enforced on a country-by-country basis, rather than a single entity doing it for the EU as a whole. For example, in Italy they are doing audits in conjunction with the tax collectors. "They're going in and they're just doing random audits of people to see if they're in compliance with the GDPR," said LaFever.

But despite the slow enforcement of the law, many organizations have made changes to their data practices, which has led to structural, technical, and cultural changes within organizations. "From an organizational perspective, I've seen a lot more today of teamwork at companies," said Scott Giordano, a privacy attorney, IAPP Fellow of Information Privacy, and vice president of data protection at Spirion, a data security software provider. "Think about legal and compliance, risk management, HR, internal audits — they're all now at the table, whereas previously it really was confined to IT and IT security."

Companies now have to be aware of where all of their data is stored, which is something that a lot of companies struggled with, and still continue to struggle with, explained Tim Woods, vice president of technology alliances at FireMon, a security company. John Pocknell, senior solutions product marketing manager at Quest Software, explained that the person to do that identification is a database administrator (DBA). Having someone in place to oversee data isn't a requirement of the GDPR, although having data protection in place is. Now, Pocknell is seeing U.S. companies start to put people in place whose main responsibility is to protect the company's data. "So even though it's not a requirement, we are beginning to see DBAs become more big in that sort of data protection role." As a result of the GDPR, companies have put more of a focus on accountability when it comes to their data, said Jean-Michel Franco, senior director of product marketing at Talend, a data integration company. There should be someone at every company who is held accountable for how the company is

ARE DATA POLICIES MORE RESTRICTIVE?

Have you noticed stricter policies at work regarding technology or customer data due to GDPR? 57% said yes overall (70% in EMEA, 41% in the US, 61% in APAC).

Do workers feel that their personal data and information is more protected since GDPR enforcement began? 39% said yes, 33% said no, and 6% said it is worse than before.

74% of employees said they had noticed more pop-ups and opt-ins asking for consent regarding their personal data and information since GDPR enforcement began; 32% said these annoyed them, and 1 in 5 said they even hurt their productivity.

Source: Snow Software, GDPR Employee Survey, 3,080 employees across the US, EMEA and APAC


complying with the regulation. “I think this was the most important change and the most impactful change,” said Franco. “Until the company had the DPO (Data Protection Officer), GDPR remained something a little abstract and something a little boring, and a regulation that they didn’t care much about. Once a DPO has been nominated, it changes the way that the enterprise proceeded.” Companies need to decide whether they should hire a data protection officer, explained FireMon’s Woods. If a company suffers a breach, and they’re found guilty of not doing their due diligence — things like not having a data protection officer or not running internal assessments — then the fines could be much higher. “You have to provide them a way to be forgotten, you have to author a right of erasure or elimination or the right to be forgotten, once you have my information and if you don’t provide that and a breach happens, then you’re going to be found at fault,” said FireMon’s Woods. “I think as corporations look at this, it’s going to have them questioning what they are doing from a response perspective,” said Woods. Companies will have to start asking questions like: “‘What am I doing? Are we running data impact assessments? If we don’t have a data protection officer, should we get one now? What are we going to do when a breach occurs? — Not if, but when the breach occurs — Not understanding the significance or how deep that breach may be, but what’s going to be our posture in response to that when we have to notify the DPA, the Data Protection Authority, within our 72-hour quote unquote period? How are we going to be perceived as a company from a readiness position?” Quest’s Pocknell believes that a lot of companies were not prepared for the regulation. Companies may have started on the path, but in light of reports of recent big data breaches from companies outside of the EU, it’s a wake-up call that this is something everyone should be addressing. 
Wigand believes that companies were given sufficient time to prepare


The GDPR doesn't cover everything

According to Matt Hayes, vice president of SAP Business at Attunity, a data management company, the laws are more descriptive than prescriptive. This can cause some headaches for companies who are trying to figure out what they need to do to be compliant. For example, there's nothing definitive in the GDPR about personal data on backup tapes, Giordano explained. But if personal data is restored from that backup tape after it had been deleted, organizations will have to go in and make sure that that data gets redeleted. This probably means organizations need to rethink their backup practices.

Another scenario that's not really addressed by the GDPR is tourism, Giordano explained. "Say that you've got an EU person and they're here on vacation. Are they covered by the regulation? The GDPR doesn't touch that and doesn't really comment on it. Even some post GDPR commentary didn't do a very good job of talking about it." z

for the regulation prior to its implementation in May 2018. The GDPR was adopted in December 2015, and guidelines on how to apply the new rules were published by the commission in January 2018. Overall, the GDPR has led to positive effects for consumers, and negative (and some inadvertent positive) effects for companies. “At HubSpot, we believe that GDPR is a good thing for the sales and marketing industry,” said Jon Dick, vice president of marketing. “It puts our industry down a path that we believe in deeply, a path of being customer first, of abandoning spammy tactics, and of being more inbound. Today to be successful, it’s not just about what you sell, it’s about how you sell. The companies that are doing this well provide an exceptional experience and are transparent about how they’re using your data.” And while overall the GDPR has been good for consumers, most consumers still don’t know much about the GDPR. According to research from HubSpot, only 48.4 percent of consumers in the EU were familiar with

the regulation and 63.9 percent of consumers in the UK were familiar. They also found that EU consumers are less familiar with the GDPR this year than they were in 2018 (32.9 percent in 2018 vs. 26.3 percent in 2019). From the perspective of organizations, there are a few potential negatives to non-compliance. In addition to the financial penalty of not complying, there are also business risks if something were to happen to data that you weren't protecting properly, Quest's Pocknell explained. "You wouldn't want to be that company that bestows personal data into the public domain. Yes, you're going to get a fine, but imagine how much business you're going to lose," he said. Another negative is that a lot of companies have lost a lot of their contacts, Talend's Franco said. Companies that didn't have consent to the data struggled to get consent when they followed up with those users as the law went into effect. On the flip side, some companies used this as an opportunity to connect with their customers and establish



trust by being transparent about why they wanted the data and how it would be used, Franco explained. Gary LaFever, CEO of Anonos, a data company, believes that this data loss could be catastrophic. He mentioned a study by IDC that found that a top five hospitality firm had deleted two decades of customer data. "Now they have no means of tracking historically how new initiatives that they have compare against what they've done in the past," said LaFever. "So particularly for AI and analytics that includes a baseline that includes what's happening today and what's been done in the past, you lose access to that data."

LaFever notes that this doesn’t really apply for highly regulated industries that likely are required to maintain a separate copy of their data for audits or inquiries. But beyond the regulation itself, the process of preparing for it has had a positive impact on organizations. By taking a deep look at their data prac-

Privacy as a service BY DAVID RUBINSTEIN

Many Americans seem resigned to not having control over their data profiles on the Internet. As larger and more sophisticated data breaches are reported in growing numbers, and companies such as Facebook and Google engage in mysterious data activities, technology users are left not knowing who's got their data, or what they're going to do with it.

"It was just a matter of time before Facebook and Google stepped over the line where regulators would no longer be able to avoid the public outcry," said Jon Fisse, CEO and co-founder of a startup privacy-as-a-service platform provider called Atomite. "The idea behind Atomite came to me after observing the reaction of the public to the [Edward] Snowden NSA disclosures. My takeaway was, if the public was stunned by the NSA's tracking techniques, they should have a look at what's happening in the commercial marketplace."

In Europe, where Fisse said privacy is valued more as a constitutional right than it is in the U.S., there is a higher sensitivity as to what happens to EU citizen data. The General Data Protection Regulation (GDPR) put in place by the European Union a year ago was among the first moves to protect the public at large from data misuse. In the United States, he added, consumers are just now coming to place more importance on the right to control their data.

Data is big business, with at least one estimate putting the size of that market at $18.2 billion in 2018. But in mid-January, AT&T announced it would stop selling its customer location data to third-party data brokers without the customers' knowledge. Fisse said mobile service providers, consumer finance companies and automotive OEMs with millions of users are angling for a share of that data market. In the smartphone space, cellular service has become a commodity, and Fisse indicated that providers are looking for alternative revenue streams. "They're saying, we can leverage our huge databases of first-party data to compete with Facebook and Google in the digital ad marketplace, but we need to do so in a way that doesn't alienate our subscribers," he said.

Managing data compliance and sales, though, is not the core mission of these companies. Atomite's platform acts to provide consumers understandable insights into what information is being collected and shared, for what purposes, and for how long. "The consumer can create their permission profiles through a simple and engaging process," Fisse said. With the platform, which is white-labeled to these companies, as users provide more information about themselves and grant more permissions, they are rewarded with points that can be redeemed for products and services on the company's sites. In this sense, "Atomite helps consumers to navigate what are the initial stages of an embryonic consumer data marketplace," Fisse concluded. z

Customers earn points for each item of information they enable an Atomite licensee to put to work for marketing purposes.



tices, companies have been able to make better use of their data, and develop better practices for handling their data — even beyond the requirements of the GDPR. According to Giordano, the GDPR has forced organizations to look at every element of the life cycle of personal data, including “how you’re collecting it, how you’re using it, how you’re sharing it— perhaps most importantly — and then disposing of it,” he said. For example, Talend’s Franco has worked with a company that now has a better understanding of where all of their data is, so they are able to leverage it and use it for analysis. “They took the opportunity to get control over their data.” Franco notes that he believes this to be true only for large corporations. There are a lot of companies out there that only did the bare minimum to be legally in compliance, he explained. FireMon’s Woods believes that companies have begun to reassess how ready they are to respond to a breach. He believes that this is more attributed to the rise of breaches in general, rather than a direct result of the GDPR. The GDPR does, however, put more pressure on companies to respond to breaches faster. According to DLA Piper’s report, 59,000 personal data breaches have been brought to the attention of regulators since the law went into effect, as of February 2019. Those breaches range from minor ones, such as an email accidentally being sent to the wrong recipient, to major attacks that affect millions of people. “I have to believe that GDPR as it related to the EU has definitely motivated people to better prepare themselves in the face of a breach on how they’re going to respond to it,” said Woods. The GDPR gives companies 72 hours from discovery of a data breach to response. According to Woods, those 72 hours are like a second when you have to identify how you’ve been breached, the scope of the breach, such as how many people were affected and how much data has been affected. Not


Legitimate use can be used in place of consent

According to Gary LaFever, CEO of Anonos, there are six legal bases under the GDPR for capturing data. Consent was the most commonly used one in the past, but the GDPR radically changes what counts as valid consent. "If data was collected using the old-fashioned approach to consent, it would not be legal." The GDPR does not have a grandfather clause that would enable companies to keep data that they collected using that old method of consent before the law went into effect. But, according to LaFever, there is another legal basis called legitimate interest processing. If you can prove that you have the proper technical and organizational controls in place that mitigate the risk of data loss, you can use that data under the legitimate interest basis. This is stated in Recital 47 of the GDPR, which says: "Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller." Recital 47 also states that processing data necessary for the purposes of preventing fraud and processing personal data for direct marketing purposes may be considered legitimate interest.

only that, but you have to notify the Data Protection Authority that you've been breached. "So 72 hours is not a lot of time to prepare, I think to understand what the extent of a given breach is." For example, last year it was revealed that Marriott had a breach that goes back to 2014. "I mean how do you assess a breach that goes back that far? How much information has actually been compromised?" said Woods. "So I think probably for the EU, GDPR has had an impact. I think in general for the U.S. and other countries, I think the rise of breaches in general are causing people to better their posture from a breach incident reporting and forensics gathering position."

"A lot of the companies outside of the European Union regardless of whether or not they use data that originated from the European Union are taking a look at where they stand," said Pocknell.

The GDPR is just the beginning. Other data regulations are beginning to pop up now as well. The most noteworthy one is the California Consumer Privacy Act, which was announced last year, and which will go into effect in January 2020. Industry experts expect that other states will soon follow suit with their own regulations. California is paving the way, but more will come soon, FireMon's Woods believes. "I think all the states are following California's lead right now on what they're going to do from a personal privacy perspective to protect users. So no doubt. Everybody's going to be following that." "My guess would be that a year from now we'll probably see two or three or four other high-profile states passing data protection laws," said Attunity's Hayes. z
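The 72-hour notification window the article describes is simple to compute but unforgiving in practice; a minimal sketch of the deadline arithmetic:

```python
from datetime import datetime, timedelta

# GDPR Article 33: notify the supervisory authority within 72 hours
# of becoming aware of the breach.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at):
    """Latest moment by which the Data Protection Authority must be notified."""
    return discovered_at + NOTIFICATION_WINDOW

def hours_remaining(discovered_at, now):
    """Hours left on the clock (negative means the deadline has passed)."""
    return (notification_deadline(discovered_at) - now).total_seconds() / 3600

discovered = datetime(2019, 6, 1, 9, 0)
print(notification_deadline(discovered))                        # 2019-06-04 09:00:00
print(hours_remaining(discovered, datetime(2019, 6, 2, 9, 0)))  # 48.0
```

The clock starts at discovery, not at full forensic understanding, which is exactly why Woods describes those 72 hours as feeling "like a second."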


For effective DevSecOps, shift left AND extend right

BY DAVID RUBINSTEIN

DevSecOps has come to be known by many as the shifting left of security, making it a key part of software development while code is being written, as opposed to trying to put security onto the application after it's completed. This follows the trends of DevOps, which moved operational considerations for applications into development, as well as software testing — though the term DevTestOps hasn't really caught on. And DevSecOps, like many initiatives in their early stages, has awareness but often is not well understood.

"People recognize the term DevSecOps and have a general notion of what it means," said Jeff Williams, co-founder and CTO at Contrast Security. "It means shifting left and automating security somehow, but in practice we're really just at the very early stages of this. I think most folks don't have a very well-formed idea of exactly what they need to do [for] DevSecOps. In fact, I think most people get it dramatically wrong."

Shifting left, of course, puts even more responsibility on developers, who have been trained to write code, and even find and correct errors in code, but who largely have not been trained on security best practices. "People have this idea that shifting left is taking the things you're currently doing and pushing them on to the developers," Williams said. "I call it shitting left. It doesn't work. The way that security works today is largely built around experts, using expert tools. You can't just take those same tools and shove them onto developers who don't have the skills or background to be effective using them, and expect great results. All you're going to do is create a lot of alienated developers who don't do good security. You're probably going to end up hurting security overall."

That could be problematic for organizations where security is a priority, as 73 percent responding to a recent Forrester study said it is. Trusting developers alone with security is the wrong approach, as developers are often the ones introducing insecure code into their applications through the use of open-source components. The Forrester report found there were 17,308 vulnerabilities published in 2018, up 23 percent from a year earlier. It also indicated that a best security practice would be to have developers


Just moving everything onto developers, who aren’t trained in security, could actually hurt your efforts

reduce the attack surface in their applications as they code. "Today's reality is that developers don't code securely," the report stated. "When measured against major industry vulnerability standards, 70 percent of applications fail security testing on the first scan." None of this is to say developers are at fault here. Forrester noted that the top 40 computer science programs in the United States do not require secure coding or application design in their curricula. Yet Williams cautioned against taking too much of a developer-centric view of DevSecOps, and noted that many people trying to do it correctly simply forget about the 'extending right' part of DevSecOps. Particularly in application security, Williams said most organizations don't have any idea of who's attacking them, or what attack vectors they're using to go after them, or what systems they're targeting. Large organizations, he said, are blind at that level and they don't have a way of stopping those attacks or using intelligence of how they're being attacked to drive their security strategy. "It's that kind of feedback loop that's one of the real characteristics of DevOps,"



he explained. “Someone attacks you using a new attack, that should instantly drive changes in the product. But we don’t have that feedback loop, so really, when I think about DevSecOps, it’s about continuing to do what you do now — generate assurance — but extend left and extend right. This idea of shifting left is dumb, and dangerous. It’s unfortunate, but those [advocating shift left] are people who haven’t thought it through very well.”

Doing DevSecOps effectively

Today, many organizations are just taking the DevSecOps name and pinning it on trivial modifications of what they’ve been doing, and it’s not really that, Williams said. “DevSecOps is a fundamental transformation of security the way that DevOps is a fundamental transformation of the way we build software,” he explained. “My friend at Comcast runs their security program and says vendors are putting DevSecOps lipstick on a traditional security pig, because they’re not fundamentally changing how their products work; they’re just kind of taping them onto a DevOps pipeline and going, ‘Yep, we’re DevSecOps! Look!’ ”

Williams went on to say the organizations that are doing DevSecOps effectively are being smart about security across the entire software life cycle. “In dev, what that means is you empower developers to find and fix their own vulnerabilities, fix their own code, and check in clean code,” Williams said. “Seems pretty straightforward, and automation is a big part of that. It’s got to be accurate, because developers don’t have [the] time or skills to deal with inaccuracies. If we can achieve that in dev, there are some really good downstream benefits from that. In CI/CD, both traditional and what we might call QA, in that stage, I think the goal has to be to generate assurance; that what you’re pushing into production has been thoroughly tested and is free of vulnerabilities.”

Traditionally, this assurance came from a big test after the application was complete. So by pushing all that to the left, that final assurance is lost. But if security has been factored in earlier in the process, the big test should find nothing, because tests were done along the way and any vulnerabilities found would have been remediated.

There is no assurance, though, that an effort to do DevSecOps effectively will succeed, because — like Agile, DevOps and Value Stream — the methodologies are not prescriptive. Organizations are usually left to their own devices to determine how they are going to realize the benefits.

“There’s some real value in DevSecOps and I don’t want to see the term get watered down to apply to anything that’s security,” he said. “I think it really does mean something. When I go back to the fundamental principles of DevOps, things like breaking down the work into small pieces to create flow, creating tight feedback loops and creating a culture of innovation and learning, those three things, if you interpret them for security, that’s DevSecOps. So that means breaking security work down to small pieces to create flow; it means creating tight security feedback loops, and it means creating a culture of security innovation and learning.

“Those are the three ways of DevOps and I think they’re essentially the same for security,” he added. “But very few organizations are really focused on that. They’re focused on let’s buy some new tool and plug it into our CI/CD pipeline, and bam! We’re DevSecOps. But that is not how it works. You’re not going to achieve a transition overnight. You’re gonna have to do it piece by piece over the course of years.”

DevSecOps, by its very definition, encompasses the entire stack, from coding, to UI, to the infrastructure it’s running on, and Williams added that the whole stack is turning into software. “If you’re deploying into the cloud, you’ve got a container on top of that, maybe you’ve got an app server running in the container, you’ve got libraries in the app server in the container, and you’ve got trusted code running on top of the app server... but it’s all really software.”

A self-protecting prophecy

Cybersecurity expert Ed Amoroso talks about a model he calls Explode-Offload-Reload. Contrast Security’s Jeff Williams explained: “What that means is as you move from the traditional internal monolithic applications, you need to explode them into pieces, and move each of those workloads into the cloud — that’s off-loading — and then reload means adding those protections back to the stack that runs that code, creating a secure, self-protecting instance in the cloud. Instead of having one big wall, now you’ve got a whole bunch of little walls. It’s not even good to think about walls; it’s really just to secure applications that are able to protect themselves. But I like that description because he’s talking about how organizations can move from a very sort of traditional outside-in approach to security to the future, which is this self-protecting way of doing things.” z — David Rubinstein

Inside out, outside in, perpetual change

To have an effective DevSecOps practice, you have to approach security at each layer of the stack. If you’re running containers, you’ll need to create rules to ensure that the container has no vulnerabilities, is built with the proper defenses, and that it’s being monitored at runtime.

“The old way we used to do that is with what I’ll call an outside-in approach,” Williams explained. “We used to put a firewall around it and scan the shit out of it, and try to see if the whole thing is secure. The problem is, modern architectures are much too complicated for that. I think the effective approach today is to get inside the thing we’re trying to secure. If you’re trying to secure a container, you need to be inside the container asking those questions about security. If you’re trying to secure an app server, you need to be inside the app server. If you’re trying to secure custom code, you need to be inside that custom code. That’s where you have all the information to make a smart decision about whether something is secure or not.”
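The inside-out idea can be illustrated in miniature — purely as a sketch, since real IAST/RASP products such as the ones Williams describes hook the language runtime itself rather than individual functions — with a toy decorator that watches a function’s inputs from inside the running application:

```python
import functools
import re

# Illustrative only: flag classic SQL-injection metacharacters in string
# arguments. The pattern and the find_user function are invented for this
# sketch; a real product would observe the code at the runtime level and
# report full application context instead of raising an exception.
SUSPICIOUS = re.compile(r"('|--|;|\bUNION\b|\bDROP\b)", re.IGNORECASE)

def instrumented(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for value in list(args) + list(kwargs.values()):
            if isinstance(value, str) and SUSPICIOUS.search(value):
                raise ValueError(f"suspicious input to {func.__name__}: {value!r}")
        return func(*args, **kwargs)
    return wrapper

@instrumented
def find_user(username: str) -> str:
    return f"looking up {username}"

print(find_user("alice"))  # normal input passes through unchanged
```

Because the check runs inside the call, it sees the exact values the application code sees — the point Williams makes about having “all the information to make a smart decision.”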



What Williams described is an instrumentation-based approach to security. Contrast Security, he said, doesn’t do container or cloud security. What Contrast does is instrument the application layer so vulnerabilities can be found and so the team can prevent vulnerabilities from being exploited at runtime. “If you zoom out and look at that, you can imagine instrumenting each layer of the stack with the right products, and then that stack is secure. It secures itself. And then you can put that stack wherever you want. If you want to put it internally, in an internal data center, great. If you want to put it in the cloud, great. The security goes with the code. For me, we’re talking about securing everything, and that’s a very DevOps/Cloud/Container kind of way of doing security. It doesn’t matter if you’re rolling out tons of elastic servers or you’re spinning up containers all over the place, because the security goes with the code. Trying to do that kind of protection with an outside-in approach is impossible, because you can never keep the walls up around everything, and you can never scan everything from the outside, because what’s in there keeps changing, moving.”

As Williams said, automation must play a big role in DevSecOps, because automation is what creates the guardrails around your development pipeline, to ensure no bugs or vulnerabilities sneak into the code and get pushed into production. “So you have this automated pipeline that does all that work; that optimizes for the developer the ability to create new functionality and push it into production quickly,” Williams said. “All security, especially application security, has massive scale problems. There are just not enough people to do the work the old way, so you have to automate. Most big organizations, they’re really only doing effective application security on 10 percent of their applications. They only secure the public-facing stuff, or the ones they deem to be critical. They’re not securing all their applications, and it’s a huge risk. The only way to fix that problem is we’ve got to change the economics. We’ve got to figure out a force multiplier, and I believe that is DevSecOps. By empowering developers, we can use the big machinery of software development to do the security work.” z

Why do the same vulnerabilities keep showing up?

Jeff Williams, co-founder and CTO of Contrast Security, created the OWASP Top Ten list, first published in 2003. While he’s proud of the work done, he’s a little disappointed that the list has not changed all that much in 16 years.

“My thought at the time was, we’ll put this Top Ten out, we’ll solve some of these issues and we’ll raise the bar over time to get to a place where application security is a lot better,” Williams said. “It’s hard to believe that it’s almost 20 years later. Part of me is like, they’re difficult to solve because they’re pervasive across so much code everywhere, and some of them are tricky to find. But at the same time they’re also [doing] basic blocking and tackling, like solving SQL injection is not particularly hard. We’ve taken this approach of mostly chasing vulnerabilities and trying to remediate them as opposed to changing the way that we interact with databases. If everyone used prepared statements everywhere, we’d be a lot closer to solving SQL injections. It’s when people write custom queries and concatenate in untrusted data that we get into trouble.”

He said he believes the right path forward is to give developers great automation so they just get alerted whenever they step outside the guardrails that DevSecOps provides. “For me, we’re not going to train our way out of this, we’re not going to pen test our way out of this, we’re not going to static analysis our way out of this. We’re going to have to get really good accurate automation that works instantly if we want to solve this, because the scale of the problem is just staggering.”

Williams went on to note that on average, applications have 27.6 serious vulnerabilities. “If we were an airline, and on average every time you did a safety check there were 27.6 safety problems, nobody would ever leave the ground,” he said. “But we don’t treat it like airline safety. People are a lot... we don’t take it as seriously as we should, as a country or a world. We just don’t. We could do better. We just need the commitment.” z — David Rubinstein

OWASP Top Ten, 2003 vs. 2017

 #   2003                                          2017
 1   Unvalidated Input                             Injection
 2   Broken Access Control                         Broken Authentication
 3   Broken Authentication and Session Management  Sensitive Data Exposure
 4   Cross-Site Scripting (XSS) Flaws              XML External Entities (XXE)
 5   Buffer Overflows                              Broken Access Control
 6   Injection Flaws                               Security Misconfiguration
 7   Improper Error Handling                       Cross-Site Scripting (XSS)
 8   Insecure Storage                              Insecure Deserialization
 9   Denial-of-Service                             Using Components with Known Vulnerabilities
10   Insecure Configuration Management             Insufficient Logging & Monitoring
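Williams’ point about prepared statements is concrete: the vulnerability comes from concatenating untrusted data into the query text, and binding the data as a parameter removes it. A minimal sketch (using Python’s sqlite3 purely for illustration — the table and payload are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

untrusted = "nobody' OR '1'='1"  # classic injection payload

# Vulnerable: concatenating untrusted data into the query text.
# The payload rewrites the WHERE clause and matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + untrusted + "'"
).fetchall()

# Safe: a prepared (parameterized) statement. The driver binds the
# payload as a plain value, so it matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (untrusted,)
).fetchall()

print(vulnerable)  # [('alice',)] — the injection succeeded
print(safe)        # [] — the payload is treated as an odd username
```

The same distinction applies in any language with a parameterized-query API; the injection only exists in the concatenated version.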


A guide to DevSecOps tools

FEATURED PROVIDER

■ Contrast Security: Contrast Security’s Contrast Assess produces accurate results without dependence on application security experts, using deep security instrumentation to analyze code in real time from within the application. It scales because it instruments application security into each application, delivering vulnerability assessment across an entire application portfolio. Contrast Assess integrates seamlessly into the software lifecycle and into the toolsets that development & operations teams are already using. Contrast Protect provides actionable and timely application layer threat intelligence across the entire application portfolio. Once instrumented, applications will self-report the following about an attack at a minimum: the attacker, method of attack, which applications were attacked, frequency, volume, and level of compromise. Protect provides specific guidance to engineering teams on where applications were attacked and how threats can be remediated. Contrast doesn’t require any changes to applications or the runtime environment, and no network configuration or learning mode is necessary.

■ Aqua Security enables enterprises to secure their container and cloud-native applications. The Aqua Container Security Platform protects applications running on-premises or in the cloud, across a broad range of platform technologies, orchestrators and cloud providers. Aqua performs image scanning for known vulnerabilities during the build process, image assurance to enforce policies for production code as it is deployed, and run-time controls for visibility into application activity.

■ CA Veracode creates software that fuels modern transformation for companies across the globe. DevSecOps enables the build, test, security and rollout of software quickly and efficiently, providing software that’s more resistant to hacker attacks. Through automation, CA Technologies helps teams work collaboratively earlier in the DevSecOps process to detect security vulnerabilities in every phase, from design to deployment.

■ CodeAI is a smart automated secure coding application for DevOps that fixes security vulnerabilities in computer source code to prevent hacking. Its unique user-centric interface provides developers with a list of solutions to review instead of a list of problems to resolve. Teams that use CodeAI will experience a 30-50 percent increase in overall development velocity.

■ Synopsys helps development teams build secure, high-quality software, minimizing risks while maximizing speed and productivity. Synopsys provides static analysis, software composition analysis, and dynamic analysis solutions that enable teams to quickly find and fix vulnerabilities and defects in proprietary code, open-source components, and application behavior.

■ Checkmarx provides application security at the speed of DevOps, enabling organizations to deliver secure software faster. It easily integrates with developers’ existing work environments, allowing them to stay in their comfort zone while still addressing secure coding practices.

■ Chef Automate is a continuous delivery platform that allows developers, operations, and security engineers to collaborate effortlessly on delivering application and infrastructure changes at the speed of business. Chef Automate provides actionable insights into the state of your compliance and configurations, with an auditable history of every change that’s been applied to your environments.

■ CloudPassage has been a leading innovator in cloud security automation and compliance monitoring for high-performance application development and deployment environments. Its on-demand security solution, Halo, is a workload security automation platform that provides visibility and protection in any combination of data centers, private/public clouds, and containers. Delivered as a service, Halo integrates with infrastructure automation and orchestration tools along with CI/CD tools.

■ CollabNet offers solutions across the DevOps toolchain. Its solutions provide the ability to measure and improve end-to-end continuous delivery, orchestrate delivery pipelines and value streams, standardize and automate deployments and DevOps tasks, and ensure traceability and compliance across workflows, applications, and environments.

■ CyberArk Conjur is a secrets management solution that secures and manages secrets used by machine identities (including applications, microservices, CI/CD tools and APIs) and users throughout the DevOps pipeline to mitigate risk without impacting velocity. Conjur is the only platform-independent secrets management solution specifically architected for containerized environments.

■ IBM provides a set of industry-leading solutions that work with your existing environment. And of course they work fantastically together: Change is delivered from dev to production with the IBM UrbanCode continuous delivery suite. Changes are tested with Rational Test Workbench, and security tested with IBM AppScan or Application Security on Cloud. IBM helps you build your production safety net with application management, Netcool Operations Insight and IBM QRadar for security intelligence and events.

■ Imperva offers many different solutions to help you secure your applications. Imperva WAF protects against the most critical web application security risks: SQL injection, cross-site scripting, illegal resource access, remote file inclusion, and other OWASP Top 10 and Automated Top 20 threats. Imperva security researchers continually monitor the threat landscape and update Imperva WAF with the latest threat data.

■ JFrog Xray is a continuous security and universal artifact analysis tool, providing multilayer analysis of containers and software artifacts for vulnerabilities, license compliance, and quality assurance. Deep recursive scanning provides insight into your components graph and shows the impact that any issue has on all your software artifacts.

■ Nosprawl integrates with software development platforms to check for security vulnerabilities throughout the entire software development life cycle to deliver verified secure software before it goes into production.

■ Parasoft: Harden your software with a comprehensive security testing solution, with support for important standards like CERT C, CWE, and MISRA. To help you understand and prioritize risk, Parasoft’s static analysis violation metadata includes likelihood of exploit, difficulty to exploit/remediate, and inherent risk, so you can focus on what’s most important in your C and C++ code. Parasoft provides flexible, intelligent dashboards and reports specifically designed for each standard to provide necessary information for reporting and compliance auditing.

■ Qualys is a leading provider of information security and compliance cloud solutions. The Qualys Cloud Platform and apps integrated with it help businesses simplify security operations and automate the auditing, compliance, and protection of IT systems and web applications.

■ Redgate SQL Provision supports database DevSecOps, keeping compliance central to the process. It enables multiple clones of masked databases to be created in seconds, allowing them to be used safely within the development and test process.

■ Perforce helps thousands of global enterprise customers tackle the hardest and most complex issues in building, connecting, and securing applications. Our Klocwork static code analysis tool helps DevSecOps professionals, from developers to test automation engineers to compliance leaders, create more secure code with on-the-fly security analysis at the desktop and integrated into large-scale continuous integration workflows.

■ Signal Sciences secures the most important applications, APIs, and microservices of the world’s leading companies. Our next-gen WAF and RASP help you increase security and maintain site reliability without sacrificing velocity, all at the lowest total cost of ownership.


What does Contrast bring to the table to address DevSecOps?

Jeff Williams, co-founder and CTO, Contrast Security

Contrast is an integration platform for application security. We use an instrumentation-based approach, so we work from inside the running application layer. From there, we support the entire software life cycle with three things.

The first thing is, we help identify vulnerabilities. Typically you want them to be discovered really early in the life cycle, so that’s what we do. As developers are writing their code, they can get instant feedback on the code that they’re writing, they can fix those problems the way they normally could, and they can check in clean code without breaking stride. There is no scanning. I want you to imagine all of your applications — there could be thousands of applications in an enterprise — I want you to imagine them all testing themselves simultaneously, as opposed to having to go to each one and scan it, serially. It’s a very scalable approach to application security, finding vulnerabilities.

The second thing that we do is we analyze open-source libraries for both known and unknown vulnerabilities. So this is really a big deal. There have been a bunch of big breaches related to the use of open-source libraries. Contrast is an effective way of doing that at scale in real-time across the organization, and our big differentiator there is that we can tell you exactly how each of those libraries is being used. Instead of just saying, ‘you’re using that library, therefore you have to replace it,’ we tell you ‘whoa, whoa, that has a vulnerability, but you’re never actually invoking that library, so you’re really not insecure,’ and that can cut the amount of work dramatically. Like, three-quarters of the vulnerabilities those other tools report are false positives, so it really cuts the work.

The last thing that we do is extending right into production. We work there as well. In production, Contrast prevents vulnerabilities from being exploited. We do this from inside the running application, but we can prevent SQL injection, cross-site scripting and expression language injection, and a whole bunch of other classes of vulnerabilities, because we can actually observe them inside the running application. We’re not trying to guess whether they’re being attacked by looking at network traffic or HTTP requests or whatever. We’re actually watching the code run, seeing an exploit attempted, and preventing it from harming the application.

So when you zoom out, we’re protecting the whole application process, from the first line of code all the way through production, all at the application layer. You still need to secure your operating system, your containers and your cloud environment. We don’t do that. We take care of the application layer. z — David Rubinstein

■ Sonatype Nexus IQ enables Nexus Firewall, which stops risky components from entering the development environment. From there, trusted components are stored in Nexus Repository, and can be easily distributed into the development process. Then, Nexus Lifecycle uses Nexus IQ to automatically and continuously identify and remediate OSS risks in all areas of an environment, including applications in production.

■ Sumo Logic simplifies DevSecOps implementation at the code level, enabling customers to build infrastructure to scale securely and quickly. This approach is required to maintain speed, agility and innovation while staying alert for malicious cyber threats.

■ WhiteHat Security has been in the business of securing applications for 17 years. In that time, applications evolved and became the driving force of the digital business, but they’ve also remained the primary target of malicious hacks. The WhiteHat Application Security Platform is a cloud service that allows organizations to bridge the gap between security and development to deliver secure applications at the speed of business. Its software security solutions work across departments to provide fast turnaround times for Agile environments, near-zero false positives and precise remediation plans while reducing wasted time verifying vulnerabilities, threats and costs for faster deployment. z


INDUSTRY SPOTLIGHT

Chances of data leaks are high in mobile apps

BY JEFFREY SCHWARTZ

Most mobile applications contain at least some programming flaws that make them susceptible to leaking data containing personal information. In fact, mobile applications distributed in Apple’s App Store and Google Play are more likely to have at least one hidden bug that can compromise privacy than they are to contain a security vulnerability, where the likelihood is also alarmingly high, according to a recent assessment.

Mobile software penetration testing provider NowSecure determined in that recent study that 90 percent of applications in the U.S. portions of those marketplaces could potentially leak one or more pieces of personal information. The analysis found that personal data can leak within the confines of the device itself, to another application, or in a file sent over a network that a hacker can intercept.

NowSecure conducted the privacy assessment this spring after it added the ability to scan for potential GDPR violations to its mobile app testing engine. The specter of leaking personally identifiable information (PII) could put an organization at risk of violating GDPR, though it doesn’t mean such a violation will or has occurred.

“It’s kind of stunning,” said Brian Reed, NowSecure’s chief mobility officer. “PII can be a lot of things when you think about mobile. PII isn’t just my username and password, it is also the GPS location.”

Reed emphasized that many mobile apps are responsible for sharing a user’s whereabouts, as well as data from a device such as its serial number, International Mobile Equipment Identity (IMEI) or MAC address. “You’d be amazed at how many of these apps just beacon that stuff out that can be picked up by sniffers, or can be intercepted over the network, either the WiFi or the carrier network,” he said.

The finding that 90 percent of mobile apps are at risk of compromising privacy is an exceedingly high figure, but only slightly higher than the percentage that have security vulnerabilities. Last fall, NowSecure performed a similar analysis of mobile apps and discovered 85 percent had one or more security vulnerabilities, where they specifically violated one or more of the OWASP Mobile Top 10 most critical security risks. Reed believes both findings show that securing mobile apps has taken a back seat to web apps. “In the race to build more apps faster, security and privacy are being left behind,” he said.

Consequently, hackers have gravitated to mobile apps as they have become the easier target. “The bad guys have figured out everybody’s hardened their web back end, but they haven’t hardened their mobile apps,” he said.

Perhaps that explains the rise in high-profile breaches affecting mobile apps over the past year. Among them, My Fitness Pal, a subsidiary of Under Armour, last year had to inform 150 million account holders that their information was stolen by hackers. The information surfaced on the dark web back in February, nearly a year after the breach was discovered. A breach of Air Canada’s mobile app last summer affected 1.4 million customers. Earlier this year, hackers were using the McDonald’s mobile app to steal significant amounts of food from the hamburger chain.

One reason securing mobile apps has fallen to the back burner is that it’s hard to do, Reed said. “Whereas in a browser, you can just make a Java call over HTTPS and it sets up a secure SSL and a TLS connection. On your mobile device, with mobile apps, you’ve got to actually write that code or download a library to help you do that.”

Given the fact that mobile apps are so susceptible, organizations building customer-facing mobile apps need to address these issues head-on, according to Reed. The first step is to create DevSecOps processes around licensing, building and designing secure components for mobile apps with reusable network connections and encryption. Next, it’s critical to add continuous testing into the CI/CD process. “Make sure that the code you write every day is getting scanned every day and has that higher bar of security,” Reed said. “And then you should automatically raise the bar of building higher quality software faster, because you’ve got reusable components that are safe.”

Finally, to address the agile nature of today’s continuous approach to mobile application development, automating mobile application security and penetration testing can protect all attack surfaces. NowSecure’s solutions plug directly into their various CI/CD tool chains including CloudBees, CircleCI and Jenkins. Reed said that means “now every build, every day gets an automated security test for any mobile app that they build.” z
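One reusable component Reed’s advice implies is a scrubber that masks identifier fields like IMEI, MAC address and GPS coordinates before any payload leaves the device. A hypothetical sketch — the field names and payload shape are invented for illustration, not taken from any real mobile SDK:

```python
# Illustrative sketch: mask PII-like fields in an outbound analytics
# payload before it is sent over the network. The set of field names
# here is an assumption; a real app would maintain its own inventory
# of sensitive keys.
PII_FIELDS = {"imei", "mac", "gps", "serial", "username", "password"}

def scrub(payload: dict) -> dict:
    """Return a copy of payload with PII-like keys masked."""
    return {
        key: "<redacted>" if key.lower() in PII_FIELDS else value
        for key, value in payload.items()
    }

beacon = {"event": "app_open", "imei": "356938035643809", "gps": "40.7,-73.9"}
print(scrub(beacon))  # {'event': 'app_open', 'imei': '<redacted>', 'gps': '<redacted>'}
```

Routing every outbound payload through one well-tested function like this is the “reusable components that are safe” pattern Reed describes: the check is written once and automatically applied everywhere.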



Who Owns Continuous Testing?

BY LISA MORGAN

THE LAST OF THREE PARTS

Continuous Testing (CT) has been the missing piece of the Continuous Integration/Continuous Delivery (CI/CD) movement, but that’s changing out of necessity. While organizations say they’re “delivering value to customers” at an ever-accelerating pace, customers may disagree with the definition of “value” when software quality doesn’t keep pace. In fact, software quality may actually decrease as delivery cycles shorten.

There are good reasons why more organizations start with minimum viable products (MVPs) and enhance them over time. For one thing, the dizzying pace of business and technology innovation means that companies must get to market fast. Unlike yesteryear, there just isn’t time to strive for perfection over a long period of time without running the risk of missing the target completely.

Speed has become a competitive advantage, but not at the expense of software quality. One school of thought is, “If my users know I’m delivering 100 updates a week, they’ll be less forgiving of that which doesn’t work the first time.” Tell that to the irate customer who can’t get loyalty card benefits, play a game or complete an e-commerce purchase. Granted, the fix may be delivered swiftly, albeit perhaps not swiftly enough to prevent customer churn, because there are always other alternatives and better experiences to be had.

Interestingly, some cultural issues are contributing to the problem. Namely, how testers are viewed and who’s responsible for testing what. Arguably, everyone is responsible for quality in an Agile, DevOps and CI/CD scenario, and yet the effect of an accelerated delivery cycle without an equal focus on quality is that buggy software is released faster.

“When you start to think of testing as a discipline, rather than a role, everyone realizes that testing is part of their role even if testing is not in their title,” said Sean Kenefick, VP, analyst at Gartner. “That’s a big cultural change and it’s been a problem for a lot of organizations. When you define continuous testing, you should define it as four things: test early, test fast, test often, test after.”
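Kenefick’s four-part definition rests on checks cheap enough to run on every change. A minimal sketch of such a test — the loyalty-discount rule is invented for illustration, using only the standard `unittest` module:

```python
import unittest

def apply_loyalty_discount(total: float, points: int) -> float:
    """Hypothetical business rule: 1 cent off per loyalty point, never below zero."""
    return max(0.0, total - points * 0.01)

class LoyaltyDiscountTest(unittest.TestCase):
    # Fast, isolated checks like these can run on every commit ("test
    # early, test fast") and again in the pipeline ("test often").
    def test_basic_discount(self):
        self.assertAlmostEqual(apply_loyalty_discount(10.0, 100), 9.0)

    def test_never_negative(self):
        self.assertEqual(apply_loyalty_discount(1.0, 500), 0.0)

if __name__ == "__main__":
    # exit=False and a fixed argv so the runner doesn't terminate the process.
    unittest.main(argv=["loyalty"], exit=False)
```

Tests this small complete in milliseconds, which is what makes running them on every change — rather than in one big test at the end — economically feasible.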

Making testers first-class citizens
CT isn’t going to deliver the intended results if testers are viewed and treated as second-class citizens. Historically, the view has been that smart people code and other people test, which has never been a productive mindset and is antiquated in light of Agile, DevOps and CI/CD.

“Testing is a noble profession. It has to be valued, which has gone by the wayside,” said Theresa Lanowitz, founder and head analyst at market research firm Voke. “Once Mercury Interactive went under, there was no vendor standing up for testers saying they’re a valuable part of the software engineering process.”

According to Rex Black, president of testing training and consulting firm RBCS, some organizations have tried to get away from having any sort of independent testing team by having developers automate tests. “That’s something that often sounds better in theory than it is in practice, especially if the developers who are to go and create automated tests are only trained in how to use tools and they don’t get any training in how to design tests,” said Black.

Meanwhile, testers have been told that they need to code. “The industry has said to testers, ‘If you still want to have any relevance, you have to become more technical, you have to code in Selenium,’” said Voke’s Lanowitz. “Testers are writing Selenium scripts that are difficult to write, hard to maintain, brittle, bloated and they’re still doing most of the testing at the interface level, so there’s not a healthy mix of tests and we’ve largely ignored the non-functional requirements of performance and security.”

Fundamentally, organizations should have a philosophy about quality and satisfying customers that not only aligns with their customers’ expectations but is also in line with the brand image they’re attempting to create or maintain. They should look to testers to lead the CT effort.

“So many organizations say we’re getting rid of our testers or our testers aren’t playing as much of a role as they have before. Quality should be front and center, but we’ve moved away from it, but I think we’re moving back to it because software has become so defect-riddled over the past several years,” said Lanowitz. “You need to have healthy respect for the people that do testing. Quality has to be built in and you have to make it everyone’s job.”

The shift-left dilemma
The “test early and often” mantra predates the turn of the millennium. Since that time, the shift-left movement emerged, which encourages even more types of testing earlier based on the same old reasoning, which is that it’s easier and cheaper to keep bugs out of the later stages of the software development lifecycle (SDLC). More fundamentally, software quality matters.

Web and mobile development stressed the need to focus on user experience (UX), which hinges on software quality. However, the focus on UX quality can center too much on the UI level when other types of testing are also important.

The belief now is that developers should do more than just unit testing, including API, performance, and security testing. They may also be the ones writing test automation scripts if testers haven’t learned to write Selenium code, for example. However, the shift-left hype is rosier than the reality because many developers aren’t even doing unit testing — so how can they be expected to do other types of testing?

According to Diego Lo Giudice, VP and principal analyst at Forrester Research, only 53 percent of developers are doing unit testing today. Voke’s numbers are even more sobering at 67 percent. “When defects are not remediated properly, that costs you time, money, brand issues and customers,” said Voke’s Lanowitz. “Defects are leaking from sprint to sprint and phase to phase.”

Forrester’s Lo Giudice said developers are spending about one hour per day testing, which isn’t enough. “You need to give them time,” said Lo Giudice. “If you’re doing unit testing, development time almost doubles. The only way to realize what’s going on is to start using tools that make that visible.”

Forrester recognizes three different types of testers: business testers, technical testers and developers. Business testers tend to focus on user acceptance testing and they have the fewest number of tools available. Technical testers use high-level scripting languages or graphical tools and do the lion’s share of functional testing. They don’t necessarily write Java code to automate tests, which the developers might be doing at the functional testing level. Performance tests have been the domain of performance engineers.

“This is a joint venture among product owners, business testers, technical testers, and developers,” said Lo Giudice. “How responsive the UI needs to be is a business requirement. The product owner is incented to give feedback on the performance he’s requiring. The testers might help build a baseline or interact with a performance engineering test center to identify issues that if not tested early on become very costly to do when there is end-to-end performance testing.”

Providing a great user experience isn’t just about high performance and slick UIs. Security testing should also be table stakes and not just in regulated industries, but there’s more to it than application security testing.

“If an organization approaches security purely from an application security
continued on page 41 >


< continued from page 38

point of view, they’re going to fail because that misses two other important elements which are service security and infrastructure/network security,” said RBCS’ Black. “You have to have what security professionals call ‘defense in depth’ which means hardening all the different layers.” Then there are microservices applications and other loosely-coupled architectures that require testing at more than one level. “When you’re dealing with a lot of contracts — interface contracts, service levels, authentications — you have to make sure your object works in isolation with the contracts but somebody somewhere has to have the bigger picture [of] business processes that align a lot of different services made by a lot of different teams,” said Gartner’s Kenefick. Developers are the first line of defense against bugs, in other words.
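The contract checks Kenefick describes, where each service is verified in isolation against the interface contracts it has promised, can be sketched as a lightweight consumer-driven contract test. This is an illustrative sketch only, not any vendor's tool: the `ORDER_CONTRACT` schema and the stubbed service responses are hypothetical.

```python
# A minimal consumer-driven contract check: the consumer declares the
# fields and types it depends on, and a provider response is verified
# against that contract in isolation, before any end-to-end test runs.

ORDER_CONTRACT = {          # hypothetical contract for an order service
    "order_id": str,
    "total_cents": int,
    "status": str,
}

def check_contract(response: dict, contract: dict) -> list:
    """Return a list of violations; an empty list means the contract holds."""
    violations = []
    for field, expected_type in contract.items():
        if field not in response:
            violations.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            violations.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(response[field]).__name__}")
    return violations

# Stubbed provider responses; in practice these would come from the service.
good = {"order_id": "A-1", "total_cents": 4200, "status": "shipped"}
bad = {"order_id": "A-2", "total_cents": "4200"}  # wrong type, missing field

print(check_contract(good, ORDER_CONTRACT))  # []
print(check_contract(bad, ORDER_CONTRACT))
```

Someone still needs the bigger picture across services, as Kenefick notes, but checks like this catch broken contracts long before end-to-end testing.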

The impact of shift left on testers
Assuming more types of testing are shifting left in an organization, where does that leave testers? The division of labor is obvious in a waterfall scenario in which developers throw code over the wall and testers find bugs. With shift left, the division of labor evolves. “Shift left means making sure that you not only have ample and powerful upstream left-hand side defect filters but also good downstream right-hand defect filters,” said RBCS’s Black. Shifting left shouldn’t diminish the value of a tester, it should elevate it in two ways: by reducing the number of bugs that flow from a developer’s desk and enabling testers to spend more time improving testing-related processes. “As a tester, I might make the rest of the team more sensitive to what might need to be tested and why. I still need that skill, that mindset, that approach, the curiosity, and that gut feel of where a problem might be hiding or a technical issue might be hiding and discovering it,” said Forrester’s Lo Giudice. “We need automation and work done in parallel from the beginning and not test as a stage later on.” z
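The brittle, hard-to-maintain Selenium scripts Lanowitz described earlier are commonly tamed with the Page Object pattern, which keeps UI locators in one place so a markup change means one fix rather than edits to every script. The sketch below is illustrative: a stubbed driver stands in for a real Selenium WebDriver, and the page and locators are hypothetical.

```python
# Page Object pattern: tests talk to a page abstraction, not to raw
# locators, so a renamed UI element is a one-line fix in one class.

class StubDriver:
    """Stands in for a Selenium WebDriver in this sketch."""
    def __init__(self):
        self.fields = {}          # locator -> text typed into it
        self.submitted = False
    def type_into(self, locator, text):
        self.fields[locator] = text
    def click(self, locator):
        self.submitted = True

class LoginPage:
    # All locators live here, away from the test scripts themselves.
    USER_FIELD = "#username"
    PASS_FIELD = "#password"
    SUBMIT_BTN = "#login-submit"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type_into(self.USER_FIELD, user)
        self.driver.type_into(self.PASS_FIELD, password)
        self.driver.click(self.SUBMIT_BTN)

driver = StubDriver()
LoginPage(driver).log_in("tester", "s3cret")
print(driver.submitted)  # True
```

The same structure works with real Selenium bindings; the point is that the test script never mentions a locator directly.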

BY LISA MORGAN

Testing has always required tools to be effective, and those tools continue to evolve, with many becoming faster, more intelligent and easier to use. Continuous Testing (CT) recognizes the importance of testing throughout the software delivery life cycle. Given the rapid pace of CT, tests need to run in parallel with the help of automation, though that does not mean that all tests must be automated. The nature of “tools” is evolving in the modern contexts of data analytics, AI and machine learning. Nearly all types of tools, testing or otherwise, include analytics now, but tool-specific analytics provide only a narrow form of value when used in a siloed fashion. CT is part of an end-to-end continuous process, so success metrics need to be viewed in that context as well. AI and machine learning are the latest wave of tool capabilities, which some vendors overstate. Those capabilities are seen as enablers of predictive defect prevention, better code coverage and test coverage, more effective testing strategies and more effective execution of those strategies.
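Running tests in parallel, as the pace of CT demands, can be sketched with nothing more than a thread pool. The three "suites" below are hypothetical stand-ins; a real pipeline would dispatch full test suites to separate runners or browser grids.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in test functions, each returning (suite name, pass/fail).
def test_api():   return ("api", True)
def test_ui():    return ("ui", True)
def test_perf():  return ("perf", True)

def run_in_parallel(tests):
    """Run independent test suites concurrently and gather their results."""
    with ThreadPoolExecutor(max_workers=len(tests)) as pool:
        return dict(pool.map(lambda t: t(), tests))

results = run_in_parallel([test_api, test_ui, test_perf])
print(results)  # {'api': True, 'ui': True, 'perf': True}
```

The wall-clock win comes only when the suites are genuinely independent, which is one reason test isolation matters as much as the tooling.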

Analytics and AI can help
Test metrics aren’t a new concept, although with the data analytics capabilities modern tools include, there’s a lot more that can be measured and optimized beyond code coverage.

“You need to understand your code coverage as well as your test coverage. You need to understand what percentage of your APIs are actually tested and whether they’re completely tested or not, because those APIs are being used in other applications as well,” said Theresa Lanowitz, founder and head analyst at market research firm Voke. “The confidence level is important.”

Rex Black, president of testing training and consulting firm RBCS, said some of his clients have been perplexed by results that should indicate success when code quality still isn’t what it should be. “One thing that happens is sometimes [there is] a unidimensional view of coverage, and I’ve seen this with clients where they say, ‘How could we possibly be missing defects when we’re achieving 100% statement coverage?’ and I have to explain you’re testing the code that’s there, but how about the code that should be there and isn’t?” said Black.

Similarly, there may be a big-screen dashboard indicating that all automated tests have passed, so the assumption is all is good, but the dashboard is only reporting on the tests that have run, not what should be covered but hasn’t been.

Forrester’s Diego Lo Giudice referenced a “continuous testing mobius loop” in which data is generated at each step that can be used to do “smarter things” like deciding what to test next. “If you start using AI or machine learning, you can start to predict where the software may be more buggy or less buggy, or you can predict bugs and prevent them rather than finding the bugs and validating or verifying them,” said Lo Giudice. “On the other hand, it’s part of a broader picture that we call ‘value stream management’ where each type of stakeholder in the whole end-
continued on page 42 >


< continued from page 41

to-end delivery process has a view.” Quality is one of the views. Other views enable a product owner to understand the costs of achieving a quality level, what has been released into production, how long it took and the value it’s generating.

Sean Kenefick, VP, analyst at Gartner, said he’s working on a project that involves looking at AI and its relationship to software testing. Importantly, there are two ways to view that topic. One is testing systems that incorporate AI. The other is using AI as part of the testing effort.

“I think AIs are going to have quite a significant impact on automated testing because they’re going to allow us to solve some really thorny problems,” said Kenefick. “Some of my clients are in the video game space where, unlike an insurance package or a banking package that have right answers determined by banking regulations or GAAP, games are disconnected from the real world.”

Traditional automation tools have done a poor job of bridging the gap between fictional scenarios and the real-life expectations of humans, such as hair flapping in the wind. AI enhancement could help. Automated testing is ripe for AI enhancement from a number of perspectives, such as identifying tests that should be automated and suggesting new types of tests, but again, even with AI it may not be possible or even wise to automate all types of tests.

“We can automate functional testing at multiple levels to some extent, though there are some validation tests which may not be so easy to automate,” said Black. “At some point, we’ll have AIs that can do ethical hacker kinds of tests, but we don’t have that now, so if you want to do a penetration test, you need a human ethical hacker to do that.”

Localization testing tools are using machine learning to test translations. They’re not perfect yet, nor is Google Translate, which is available via an API, so a human is still needed in the loop. Humans also need to be involved in accessibility testing and portability testing to some extent.

“You need to start with what do we need to test, and if you let the test automation determine what your test coverage is going to be, that’s likely to make you sad,” said Black. “I made that mistake when I was an inexperienced test manager and it’s not a good thing. Identify all the things you need to cover and then figure out the best way to cover those things.”

Drive higher value with a test strategy
The inclusion of analytics, AI, and machine learning in tools is arguably “cool,” but the outcomes they help enable and the effectiveness of testing generally can be improved by having an overarching test strategy.

‘When you start to think of testing as a discipline, rather than a role, everyone realizes that testing is part of their role even if testing is not in their title.’
—Sean Kenefick

“This is a very big problem. Software engineering is notoriously fad-driven and hype-driven,” said RBCS’s Black. “I’ve worked with a number of clients who were heavily into automation and if you look at what they were doing, there was no strategy. I ask questions that are pretty obvious like, ‘What are the objectives you’re trying to accomplish? For each one of those objectives, show me how you’re measuring testing and efficiency, show me the business case for your automation. What’s your ROI?’ Without a business case, it’s not a tool, it’s a toy.”

There is a temptation to go to tools first and think later, when the reverse yields better results. Starting with tools results in a test strategy by default rather than test strategy by design.

“I think many people would say [that having a testing strategy] is an old-school approach, but it’s the right approach because if you’re just going with tool A or tool B, you’re trusting your approach to that tool and that tool may not provide what you need,” said Voke’s Lanowitz. “I think you need to take a step back and decide what you’re going to do.”

Ultimately, the testing strategy and execution need to tie together in a way that aligns with business objectives, but more fundamentally organizations have to cultivate a culture of quality in the first place if they want their CT efforts to be stable.

“Understand that nothing we’re doing can be 100 percent. What we’re really doing is trying to minimize our risk as much as possible and mitigate for things we weren’t expecting,” said RBCS’s Black. “Continuous delivery is a process that lives forever. As long as the product is alive, we’re testing it.” z

CT-Related Tools
Diego Lo Giudice, VP and principal analyst at Forrester Research, outlined some of the testing tools needed for a CT effort, which are included below. The list is merely representative as opposed to exhaustive:
• Planning – JIRA
• Version control – GitHub
• CI – Jenkins
• Unit testing – JUnit, Microsoft unit test framework, NUnit, Parasoft C/C++test
• Functional testing – Micro Focus UFT, TestCraft
• API testing – Parasoft SOAtest, SoapUI
• UI testing – Applitools, Ranorex Studio
• Test suites – SmartBear Zephyr, Telerik Test Studio
• Automated testing (including automated test suites) – Appvance, IBM Rational Functional Tester, LEAPWORK, Sauce Labs, Selenium, SmartBear TestComplete, SOASTA TouchTest, Micro Focus Borland Silk Test
• CT – Tricentis Tosca
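Black's warning that 100 percent statement coverage only proves you exercised the code that exists, and says nothing about "the code that should be there and isn't," takes only a few lines to demonstrate. The discount function and its test are hypothetical.

```python
def apply_discount(price_cents, percent):
    # One statement: any single test achieves 100% statement coverage.
    return round(price_cents * (1 - percent / 100))

# This test executes every statement, so a coverage dashboard shows 100%.
assert apply_discount(10_000, 25) == 7_500

# But the validation that *should* be there isn't: nothing rejects an
# out-of-range discount, so this defect is invisible to coverage numbers.
print(apply_discount(10_000, 150))  # -5000: a negative price escapes
```

Coverage measured the code that was written; the defect lives in the code that was never written, which is exactly the unidimensional-view trap Black describes.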



Guest View BY ITZHAK ASSARAF

A new approach to personal data discovery
Itzhak Assaraf is CTO at 1Touch.io.

Why do we seem to emulate the ostrich and put our heads in the sand when it comes to personally identifiable information (PII) discovery? Every other technology throughout history that I can think of has undergone multiple phases of evolution, including anti-virus, perimeter security, cell phones, the car – I could go on. So please forgive me when I say that I find it incredible that the current standard offerings that address PII discovery are still in their infancy compared with the solutions that could be built today. Can it really be that in the area of PII discovery, where the punishments for GDPR non-compliance are so severe, we are stuck with antiquated solutions and methods? We don’t have to be, but it seems that way. Here’s why.

Network element discovery solutions have been around for many years. Crawler software has been around for just as long. Does it really take a visionary like Henry Ford to look at these two solutions and bring them together? Ford recognized that by combining the assembly line with automotive manufacturing he could mass produce automobiles — both technologies had been around for a while in this case, too. Doesn’t it make perfect sense, then, to combine both crawling and network discovery technologies when it comes to PII discovery?

The greatest weakness of current data discovery tech is that you must feed it with the already known or suspected locations of PII. Instead, let’s add a way to automatically find the locations of PII and not rely on human input. In this way, enterprises can create an accurate and constantly updated list of PII locations, wherever they are, regardless of whether anyone knows about them.

Ford’s innovation brought about many other benefits; his suppliers, for example, all benefited through significantly increased orders, and his customers benefited from lower-cost automobiles.
By combining crawling software and network discovery technologies, we can suddenly understand which PII locations are sharing, storing, and processing data, which is an important GDPR compliance requirement. This approach brings many


other benefits, including how to understand and know when PII is shared outside the organization, which is another significant data privacy compliance requirement.

The manual approach basically says to a company, “tell me where your databases and repositories are, and I’ll locate your PII within them.” Not surprisingly, this approach has major flaws. First, it doesn’t address the data repositories of which you are not aware. Repositories are constantly being created, with no efficient way to track these changes. There are major logistical challenges in finding time and resources to organize each data asset’s information into one area.

Second, and perhaps most important, the PII organizations store is constantly in motion. Let’s say I somehow have the resources to pull off the inventory challenge previously stated and can figure out how to organize it. Great, right? Not so fast. Ask yourself, what happens almost immediately when the database containing so-and-so’s information gets copied from a known repository to an unknown one, or when DevOps creates a duplicate repository where we don’t see it or know about it? Now we’re back at square one, over and again.

Already, more than two-thirds of organizations are employing network activity monitoring technologies to understand how personal data is traveling throughout the organization. The reason is that only a true network approach can track how personal data is processed, stored, and shared in virtually real time and give a constantly updated view of where their personal data is being stored and shared within an enterprise. The network approach can also organize an enterprise’s personal data about each customer into a single area. This is especially important for dealing with Data Subject Access Requests (DSARs) and the right to erasure. A network approach offers organizations insight into repositories they didn’t know existed.
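The combined approach argued for here, network discovery feeding a crawler, reduces to two stages: enumerate data locations, then pattern-scan whatever is found. Everything in this sketch is illustrative: the "discovered" repositories are in-memory stand-ins, and the patterns cover only emails and US-SSN-shaped strings, far short of a real PII classifier.

```python
import re

# Stage 1 stand-in: repositories a network-discovery pass might surface,
# including a duplicate nobody registered by hand.
discovered_repos = {
    "crm-db":      ["Alice Doe <alice@example.com>", "order #123"],
    "devops-copy": ["ssn 123-45-6789 on file"],   # the unknown duplicate
}

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(repos):
    """Stage 2: crawl every discovered location and report PII hits."""
    hits = []
    for repo, records in repos.items():
        for record in records:
            for kind, pattern in PII_PATTERNS.items():
                if pattern.search(record):
                    hits.append((repo, kind))
    return hits

print(scan_for_pii(discovered_repos))
# [('crm-db', 'email'), ('devops-copy', 'ssn')]
```

The point of the combination is the first stage: the scan finds the SSN in `devops-copy` only because discovery surfaced a repository no one would have listed manually.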
We live and work in an ever-changing world that requires that our technology evolve and keep up. Clinging to old models, approaches, or to “the way we’ve always done things” extinguishes innovation. Proceeding this way with PII not only makes operating an enterprise unmanageable, it threatens to put it out of business. z


Analyst View BY ROB ENDERLE

Qualcomm’s potential hybrid AI revolution

Qualcomm had a coming-out party for their new AI technology that includes a series of new smart Snapdragon parts with built-in AI capability for smartphones and connected devices. They also surprised with the announcement of their Cloud AI 100 accelerator. Both of these things are interesting by themselves but, together, they have the potential to create something far more powerful: a hybrid AI that has blended resources that operate on both the device and the cloud platform simultaneously. This could be a game-changer because it is only Qualcomm that is anywhere near a critical mass of AIs on smart devices, and while their server efforts are relatively new, the synergy between both products could be a massive advantage in the coming AI wars. Let’s talk about the birth of the hybrid AI.

The problem
Often, we have vendors that create technology where no clear problem is evident. That isn’t the case this time because mobile devices will always be constrained by their form factors and power limitations, while latency and connectivity will constrain server resources. This means a future AI solution depending on either mobile or cloud-based resources exclusively will be inherently flawed. Either the solution, if mobile-based, will never be smart enough, or the solution, if server-based, will be unreliable, bandwidth-intensive, and relatively slow. (I do expect this last to eventually be overcome, but it may take a decade or two, and the performance/bandwidth requirements will continue to increase.)

The potential hybrid solution
But if you can tie the two systems together into a synergistic whole, there is the potential for a hybrid system that lowers the load, and thus the network traffic, going to the cloud while being able to share the decision load and only pushing things up that need extra data or intelligence. The end result should be a system that ties the client and server together into a whole that can both outperform a local solution and work more reliably than a cloud-only solution. In effect, with this potential hybrid, you have the potential for something that is far better optimized for the world of today than anything anyone else is contemplating.
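The division of labor sketched above, answer locally when the on-device model is confident and escalate to the cloud otherwise, reduces to a confidence-threshold router. Both "models" below are stubs and the threshold and return values are invented for illustration; nothing here reflects Qualcomm's actual implementation.

```python
# Hybrid AI routing sketch: serve from the on-device model when its
# confidence clears a threshold; otherwise pay the latency and bandwidth
# cost of a cloud round trip.

def on_device_model(query):
    # Stub: a small model is confident only about queries it handles well.
    known = {"weather": ("sunny", 0.95), "obscure fact": ("?", 0.20)}
    return known.get(query, ("?", 0.0))

def cloud_model(query):
    # Stub for the large cloud-hosted model.
    return f"cloud answer for {query!r}"

def answer(query, threshold=0.8):
    prediction, confidence = on_device_model(query)
    if confidence >= threshold:
        return prediction, "device"     # fast, offline-capable path
    return cloud_model(query), "cloud"  # slower, smarter path

print(answer("weather"))       # ('sunny', 'device')
print(answer("obscure fact"))  # routed to the cloud
```

The threshold is the whole trade-off in one number: raise it and you lean on the cloud (smarter, slower); lower it and you lean on the device (faster, dumber).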

The opportunity = Google advantage
Now this creates an interesting potential opportunity for both Google (Android) and Apple (iOS) to create that blended server/client solution that makes this all work. However, Apple is at war with Qualcomm, so they likely won’t get this technology, and they suck at servers and cloud services, greatly reducing their potential even if they created their own solution. Google, on the other hand, is one of the most powerful cloud players in the market, they already have one of the most advanced AI systems in development, and their digital assistant is already far more advanced than Apple’s Siri. This potentially could become the game-changer that effectively makes the iPhone obsolete. Now “effectively makes” and actually does are two very different terms and outcomes. Apple is far more integrated than Google is, and it has been staffing up both mobile and AI capability, suggesting that Google’s advantage may not last that long. Or, put more crisply, we are at the front end of this race and the outcome is anything but certain.

Rob Enderle is a principal analyst at the Enderle Group.


Carrier option
Now one other thing has occurred to me, and that is that a carrier could specify a hybrid solution that more closely ties their users to their service, making it almost impossible for a customer to switch carriers. Once you have this blended AI uniquely trained to your needs, if that AI existed on the carrier’s servers instead of on Google’s or Apple’s servers, a switch of carriers would mean you’d have to start training over again, and I doubt many will, particularly after years of work, want to abandon the digital assistant that is uniquely theirs. So, this could play out a number of interesting ways. Regardless of who goes first and where this resides (be aware this same model could be used for autonomous cars, smart cities and buildings, and millions of increasingly intelligent devices that will surround us), this idea of a hybrid AI will be a game-changer. This likely will redefine not only the smartphone market but the blended market of ever more intelligent devices. z


Industry Watch BY DAVID RUBINSTEIN

Of open source, data breaches and speed
David Rubinstein is editor-in-chief of SD Times.

When IBM CEO Ginni Rometty joined Red Hat CEO Jim Whitehurst on stage at last month’s Red Hat Summit to declare that IBM would leave Red Hat alone after its acquisition, cheers went up from the developers in the keynote audience. Their fear, of course, was that IBM would somehow change Red Hat and move it away from its open-source roots. The same fears were expressed when Microsoft acquired GitHub, the leading open-source software development platform in the world, but that has worked out fine so far.

When Microsoft drops US$7.5 billion for GitHub, and when IBM plunks down $33.4 billion for Red Hat, it proves that enterprises large and small have embraced open source. It has become big business. In fact, many software providers today are following the model of selling commercial versions of open-source software, which comes with support, vetted updates and the assumption that it is secure. Developers, though, often just grab the components they need from a public repository, unaware of vulnerabilities that might lie within.

In other, not unrelated news, Facebook is leaking more data, as access to the company’s huge data trove was gained through a public AWS database. And in late 2017, credit reporting giant Equifax lost data through a breach in an open-source Apache web server, affecting 148 million users of the service. A simple patch could have prevented that breach.

Furthermore, a year into the General Data Protection Regulation (GDPR) — and with California data privacy laws in place and growing calls for nationwide data protection laws in the United States — organizations are spending a lot of time and big money on understanding how they got a customer’s PII, where it’s stored and how it’s being used. This, at the risk of hefty fines in the case of the GDPR, and likely attached to any federal legislation here in the U.S.

On top of all that, organizations are still fed the line that they must go faster to survive.
To help developers achieve top speed, new software architectures such as microservices, containers, serverless and infrastructure-as-code have risen. Of course, another way developers work faster is to


rely on open-source components to build their applications. This has only added more complexity into IT and business systems, making them harder to test and maintain, and making defects and vulnerabilities harder to find. It also has made them more susceptible to data loss.

What’s it all mean? It means that in today’s software industry, companies are at risk of losing data, losing proprietary intellectual property, and losing business because of these converging initiatives. Taken individually, these initiatives all offer tremendous benefits to developers and their organizations.

Open-source software helps developers work faster and smarter, as they don’t have to ‘re-invent the wheel’ every time they create an application. They just need to be sure the license attached to that software allows them to use the component the way they want. They also need to stay on top of that application, so if the component changes, or an API changes, their application isn’t affected and they are still in compliance.

Data protection is also something organizations must get serious about. While the GDPR only affects users in the European Union, it’s only a matter of time before those or similar regulations are in place in the U.S. and elsewhere. Companies should get a jump on that by doing a thorough audit of their data, to know they are prepared to be compliant with whatever comes down from the statehouses or from Washington, D.C.

On the speed side, the benefits of Agile and DevOps are clear. These methodologies enable companies to bring new software products to market faster, with the result of getting a jump on the competition, working more efficiently and ultimately serving their customers. Unfortunately, these efforts are usually done by different teams of developers, database administrators and security experts.
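Staying on top of open-source components, the discipline whose absence made the Equifax patch failure possible, is largely an inventory problem: compare what you ship against a feed of known-vulnerable versions. The sketch below is a toy: the manifest and advisory data are invented, and a real audit would consume an advisory database such as the NVD rather than a hand-written dict.

```python
# Toy dependency audit: flag manifest entries at or below a version a
# security advisory marks as vulnerable. All data here is invented.

# component -> shipped version, as a comparable (major, minor, patch) tuple
manifest = {"webserver": (2, 4, 49), "parser": (1, 2, 0)}

# component -> highest vulnerable version (assumed fixed in the next release)
advisories = {"webserver": (2, 4, 50), "logger": (2, 14, 1)}

def audit(manifest, advisories):
    """Return components whose shipped version is known-vulnerable."""
    flagged = []
    for component, version in manifest.items():
        bad_up_to = advisories.get(component)
        if bad_up_to is not None and version <= bad_up_to:
            flagged.append(component)
    return flagged

print(audit(manifest, advisories))  # ['webserver']: patch before shipping
```

Trivial as the comparison is, it only works if the manifest is complete, which is exactly the inventory discipline most teams skip.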
If the Equifax and Facebook breaches have taught us anything, it’s that you can’t expect developers to be security experts, and you can’t expect DB admins to understand the ramifications on the business when data is misunderstood. It will take a coordinated approach to IT to achieve business goals while not leaving the company — and its IP and PII data — exposed. z



