SD Times February 2023


FEBRUARY 2023 • VOL. 2, ISSUE 68 • $9.95 • www.sdtimes.com


EDITORIAL

EDITOR-IN-CHIEF

David Rubinstein drubinstein@d2emerge.com

NEWS EDITOR

Jenna Sargent Barron jsargent@d2emerge.com

MULTIMEDIA EDITOR

Jakub Lewkowicz jlewkowicz@d2emerge.com

SOCIAL MEDIA AND ONLINE EDITOR

Katie Dee kdee@d2emerge.com

ART DIRECTOR

Mara Leonardi mleonardi@d2emerge.com

CONTRIBUTING WRITERS

Jacqueline Emigh, Elliot Luber, Caryn Eve Murray, George Tillmann

CONTRIBUTING ANALYSTS

Enderle Group, Gartner, IDC, Intellyx

CUSTOMER SERVICE

SUBSCRIPTIONS subscriptions@d2emerge.com

ADVERTISING TRAFFIC

Mara Leonardi mleonardi@d2emerge.com

LIST SERVICES

Jessica Carroll jcarroll@d2emerge.com

REPRINTS reprints@d2emerge.com

ACCOUNTING accounting@d2emerge.com

ADVERTISING SALES

PUBLISHER

David Lyman 978-465-2351 dlyman@d2emerge.com

MARKETING AND DIGITAL MEDIA SPECIALIST

Andrew Rockefeller arockefeller@d2emerge.com

PRESIDENT & CEO

David Lyman

CHIEF OPERATING OFFICER

David Rubinstein

D2 EMERGE LLC • www.d2emerge.com
Contents • VOLUME 2, ISSUE 68 • FEBRUARY 2023

NEWS
4 News Watch
12 ChatGPT won’t take your job, but it can make you more efficient and knowledgeable
15 Broadcom: 84% of orgs will be using VSM by end of the year
15 Report: Platform engineering to see big boost in 2023

FEATURES
6 How observability removes blind spots for developers
16 How to build trust in AI for software testing
20 Software intelligence is key to writing better code

MARKET FORECAST
22 Time to hide your APIs: Manage the ones you write, and even those you didn’t, to keep them safe from attacks

COLUMNS
24 GUEST VIEW by Dinesh Varadharajan: The layers and phases of effective digital transformations
25 ANALYST VIEW by Jason English: Web3: New scams on the blockchain

Software Development Times (ISSN 1528-1965) is published 12 times per year by D2 Emerge LLC, 2 Roberts Lane, Newburyport, MA 01950. Periodicals postage paid at Newburyport, MA, and additional offices. SD Times is a registered trademark of D2 Emerge LLC. All contents © 2023 D2 Emerge LLC. All rights reserved. The price of a one-year subscription is US$179 for subscribers in the U.S., $189 in Canada, $229 elsewhere. POSTMASTER: Send address changes to SD Times, 2 Roberts Lane, Newburyport, MA 01950. SD Times subscriber services may be reached at subscriptions@d2emerge.com.

GitHub Actions adds new features for CI/CD practices

Required workflows is the first new feature, and it can be used to define and enforce CI/CD practices across multiple source code repositories. By utilizing this feature, teams won’t have to manually configure each repository individually.

Other benefits include the ability to invoke external vulnerability scoring tools, ensure code meets compliance requirements, and ensure code is continuously deployed.

Required workflows get triggered as a status check on open pull requests on the default branch. Merges can’t be completed until the workflow succeeds.

The second feature is configuration variables, which allow developers to store non-sensitive data as reusable plain text variables. Examples of non-sensitive data include compiler flags, usernames, and server names.
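As a rough illustration of how such configuration variables might be consumed, the sketch below assumes a workflow step that maps them into the environment (for example, SERVER_NAME: ${{ vars.SERVER_NAME }} in the step’s env block); the variable names here are hypothetical, not part of GitHub’s announcement.

```python
# Minimal sketch: a build script reading values that a GitHub Actions workflow
# step is assumed to have exported from configuration variables via its env block,
# e.g.  SERVER_NAME: ${{ vars.SERVER_NAME }}  and  COMPILER_FLAGS: ${{ vars.COMPILER_FLAGS }}.
# SERVER_NAME and COMPILER_FLAGS are hypothetical names used only for illustration.
import os

server_name = os.environ.get("SERVER_NAME", "staging.example.internal")  # non-sensitive, plain text
compiler_flags = os.environ.get("COMPILER_FLAGS", "-O2").split()          # reusable across workflows

print(f"Building with flags {compiler_flags} and deploying to {server_name}")
```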

Before this feature was introduced, developers needed to store configuration data as encrypted secrets if they wished to reuse the values in workflows. This made it difficult to retrieve non-sensitive data.

People on the move

• Steve Lucas has joined Boomi as its new CEO. He has been in the industry for 27 years and has held leadership roles at Adobe, SAP, and Salesforce. In addition to his enterprise software experience, he has also served on the boards of the American Diabetes Association and the Children’s Diabetes Foundation.

• Sauce Labs has brought on Clair Byrd as its new chief marketing officer to drive new strategies for growing the company. She was previously CMO at Wing Venture Capital and has held marketing leadership positions at Twilio and InVisionApp.

• David Gardiner has been appointed as Tricentis’ newest executive vice president and general manager of its new DevOps business unit. The team he is leading will focus on providing customers with the necessary tools for engineering quality throughout the development lifecycle. He has previously held roles at SolarWinds, Motive, BroadJump, and Trilogy.

Security platform Kubescape accepted into CNCF Sandbox

Kubescape is an end-to-end security platform for Kubernetes created at Armo, and is the first security scanner under the CNCF umbrella, according to Armo.

“ARMO’s commitment to open source means ensuring Kubescape is free, open and always improving to become the end-to-end open-source Kubernetes security platform of choice,” said Shauli Rozen, co-founder and CEO of ARMO. “I’m proud that Kubescape’s acceptance by the CNCF cements this commitment. ARMO remains dedicated to making Kubescape the best open source Kubernetes security platform, and ARMO Platform the best enterprise version for Kubescape. We strive to provide the best and simplest option for organizations to get the benefits of Kubescape with enterprise-level service support and features, to ensure the most complete security experience.”

Armo will continue to lead development of Kubescape and continue its commitment to making “Kubernetes security a simple and trustworthy DevOps-first experience.”

Key features of Kubescape include risk analysis, security compliance, and misconfiguration scanning.

In addition to this news, the company also announced the launch of Armo Platform, which is an enterprise offering of Kubescape, providing full enterprise-grade support, maintenance and additional features.

Visual collaboration for Gliffy in Confluence Cloud

Gliffy is a technical diagramming solution, and the new collaboration capabilities enable users to invite others to participate on one diagram. Additionally, the diagram owner can monitor the level of access for each individual user, and collaborators can follow each other’s activity during a specific work session.

With this, everyone participating on one diagram can view all the changes that were made in real time, simplifying the process of completing and updating architecture diagrams, flowcharts, and process diagrams.

“Diagramming directly within Confluence has been an important benefit to teams that use Gliffy,” said Charlie Ealy, senior product manager for Gliffy at Perforce Software. “Our team’s goal for real-time collaboration is to provide an added layer of functionality that helps teams minimize the need to manage multiple tools in order to work collaboratively.”

LEADTOOLS v22 adds .NET 7 support

The latest release adds support for .NET 7, eSignatures, OCR Enhancements, Medical Web Viewer updates, and more.

While LEADTOOLS has supported .NET 7 since its launch in November 2022, now all of the LEADTOOLS Document, Imaging, and Multimedia libraries and binaries, as well as fully-sourced demos, are built with .NET 6+ as the target runtime.

Also, programmers that are leveraging the LEADTOOLS Document Viewer in their application are able to view, convert, compose, edit, compare, and now also sign documents. They can easily add electronic signatures to documents and PDFs.

The LEADTOOLS Barcode SDK now has speed and detection optimizations for all QR barcode types, and the AI-powered OCR SDK received recognition optimizations for italic fonts, uppercase and lowercase letters, text line assembling and word construction, and more.

The Medical Web Viewer now has new methods for loading DICOM files that use inputs defined by the DICOM specification. It also includes two new demos: the Watchdog demo that monitors the health of a LEADTOOLS DICOM Listening service and the HL7 Relay demo that can process large volumes of HL7 messages.


Google announces enhanced privacy technologies

Over the last decade, the company has invented a number of different privacy-enhancing technologies (PETs), including Federated Learning, Differential Privacy, and Fully Homomorphic Encryption (FHE).

Three years ago Google open-sourced these technologies to provide broader access, and now the company is making new strides to provide the community with new ways to deploy these technologies.

First, it announced it would be open-sourcing a new project called Magritte. It uses machine learning to detect and blur objects, such as license plates, as they appear in a video. It requires low computational resources and can save time over manually blurring objects in a video.

Next, Google announced improvements to the FHE Transpiler. It optimized circuits to reduce circuit size by 50%. This improves speed and will be useful in industries where there is a need for security guarantees when processing sensitive data.

“PETs are a key part of our Protected Computing effort at Google, which is a growing toolkit of technologies that transforms how, when and where data is processed to technically ensure its privacy and safety,” Miguel Guevara, product manager in the privacy and data protection office at Google, wrote in a blog post.
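To make one of the techniques named above concrete, here is a minimal, generic sketch of the differential privacy idea (adding calibrated noise to an aggregate so that no individual record can be singled out). It is an illustration only, not Google’s open-sourced implementation.

```python
# Illustrative Laplace mechanism, the core idea behind differential privacy:
# noise scale = sensitivity / epsilon, where a smaller epsilon means stronger
# privacy and a noisier answer. This is a generic sketch, not Google's library.
import numpy as np

def private_count(true_count: int, epsilon: float = 0.5, sensitivity: float = 1.0) -> float:
    """Return a differentially private version of a count query."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Example: publish roughly how many users triggered a feature without exposing any one user.
print(private_count(1342))
```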

Meta fined over GDPR violations

The Data Protection Commission (DPC) has announced that two inquiries into the data processing operations of Meta Platforms Ireland Limited (Meta Ireland), revolving around the delivery of its Facebook and Instagram services, have been concluded. Meta Ireland was fined €210 million due to breaches of the GDPR having to do with the company’s Facebook service, as well as €180 million for breaches in connection with its Instagram service.

With this, Meta Ireland has also been given three months to bring its data processing operations into compliance with the standards set forth by the GDPR.

According to the DPC, the inquiries came about from two complaints, both made on May 25, 2018, raising the same basic issue. The Facebook-related complaint came from an Austrian data subject, while the Instagram complaint was made by a Belgian data subject.

Microsoft makes multi-billion dollar investment in OpenAI

Microsoft had previously made large investments in OpenAI in 2019 and 2021, and is OpenAI’s exclusive cloud provider, meaning that all of OpenAI’s workloads are powered by Azure.

Over the past few years, OpenAI has been able to achieve things like building supercomputers powered by Azure, deploying OpenAI technology through the Azure OpenAI service, and incorporating OpenAI’s technology into GitHub Copilot and Microsoft Designer.

Microsoft stated that it will continue investments in developing supercomputing systems to further OpenAI’s research.

The company will also incorporate OpenAI technology into its products, such as Azure OpenAI Service, which provides developers access to OpenAI models to build AI applications.

Earlier this month, The Information reported that Microsoft was interested in integrating ChatGPT into its search engine Bing. They reported that this could take place as soon as March.

According to OpenAI, both companies regularly meet to review and share lessons that inform updates to the systems, future research areas, and best practices for the industry.

The companies didn’t reveal specific dollar amounts, but did say Microsoft was making a “multi-year, multibillion dollar investment.”

Linux Foundation launches Open Metaverse Foundation

The goal of the foundation is to provide a collaborative environment for a variety of industries to build open-source software and standards for an inclusive, global, vendor-neutral, and scalable Metaverse.

“We’re still in the early days of the vision for an open Metaverse, and we recognize that many open-source communities and foundations are working on vital pieces of this iterative puzzle,” said Royal O’Brien, executive director of the OMF. “While the challenges may seem daunting, I’m energized by the opportunities to collaborate with a broad, global community to bring these pieces together as we transform this vision into reality.”

Foundational Interest Groups (FIGs), organized by the OMF, provide targeted resources and forums to identify new ideas, get work done, and onboard new contributors.

The eight FIGs are Users, Transactions, Digital Assets, Simulations and Virtual Worlds, Artificial Intelligence, Networking, Security and Privacy, and Legal and Policy.

Progress updates Telerik, Kendo UI

Progress, provider of application development and infrastructure software, has announced the R1 2023 release of Progress Telerik and Progress Kendo UI, the company’s .NET and JavaScript libraries for app development.

The company stated that these new releases are geared at supporting developers in building better digital experiences across all platforms.

Several new features have been added to Telerik, including Progress Telerik UI for Blazor, which adds new UI components as well as updates such as Data Grid compact mode and adaptive rendering in date and select-type components.

Next, Progress Telerik UI for .NET MAUI now offers new components such as Toolbar, Image Editor, ProgressBar, Accordion, and SignaturePad, and Progress Telerik UI for WinForms adds a new Windows 11 theme in sync with the Windows 11 OS.

With this, Progress also delivers support for .NET 7 across all relevant Telerik UI libraries and tools.


How observability removes blind spots for developers

When changing lanes on the highway, one of the most important things for drivers to remember is to always check their blind spot. Failing to do this could lead to an unforeseen, and ultimately avoidable, accident.

The same is true for development teams in an organization. Failing to provide developers with insight into their tools and processes could lead to unaddressed bugs and even system failures in the future.

This is why the importance of providing developers with ample observability cannot be overstated. Without it, the job of the developer becomes one big blind spot.

Why is it important?

“One of the important things that observability enables is the ability to see how your systems behave,” said Josep Prat, open-source engineering director at data infrastructure company Aiven. “So, developers build features which belong to a production system, and then observability gives them the means to see what is going on within that production system.”

He went on to say that developer observability tools don’t just function to inform the developer when something is wrong; rather, they dig even deeper to help determine the root cause of why that thing has gone wrong.

David Caruana, UK-based software architect at content services company Hyland, stressed that these deep insights are especially important in the context of DevOps.

“That feedback is essential for continuous improvement,” Caruana said. “As you go around that loop, feedback from observability feeds into the next development iteration. So, observability really gives teams the tools to increase the quality of service for customers.”

The in-depth insights it provides are what set observability apart from monitoring or visibility, which tend to address what is going wrong on a more surface level.

According to Prat, visibility tools alone are not enough for development teams to address flaws with the speed and efficiency that is required today.

The deeper insights that observability brings to the table need to work in conjunction with visibility and monitoring tools.

With this, developers gain the most comprehensive view into their tools and processes.

“It’s more about connecting data as well,” Prat explained. “So, if you look at monitoring or visibility, it’s a collection of data. We can see these things and we can understand what happened, which is good, but observability gives us the connection between all of these pieces that are collected. Then we can try to make a story and try to find out what was going on in the system when something happened.”

John Bristowe, community director at deployment automation company Octopus Deploy, expanded on this, explaining that observability empowers development teams to make the best decisions possible going forward.

These decisions affect things such as increasing reliability and fixing bugs, leading to major performance enhancements.

“And developers know this. There are a lot of moving parts and pieces, and it is kind of akin to ‘The Wizard of Oz’: ‘ignore the man behind the curtain,’” Bristowe said. “When you pull back that curtain, you’re seeing the Wizard of Oz, and that is really what observability gives you.”

According to Vishnu Vasudevan, head of product at the continuous orchestration company Opsera, developer interest in observability is still somewhat new.

He explained that in the last five years, as DevOps has become the standard for organizations, developer interest in observability has grown exponentially.

“Developers used to think that they can push products into the market without actually learning about anything around security or quality, because they were focusing only on development,” Vasudevan said. “But without observability, the code might go well at first, but sometime down the line it can break, and it is going to be very difficult for development teams to fix the issue.”

The move to cloud native

In recent years, the transition to cloud native has shaken up the software development industry. Caruana said that he believes the move into the cloud has been a major driver for observability.

He explained that with the complexity that cloud native introduces, gaining deep insights into the developer processes and tooling is more essential than ever before.

“If you have development teams that are looking to move towards cloud-native architectures, I think that observability needs to be a core part of that conversation,” Caruana said. “It’s all about getting that data, and if you want to make decisions, having the data to drive those decisions is really valuable.”

According to Prat, this shift to cloud native has also led to observability tools becoming more dynamic.

“When we had our own data centers, we knew we had machines A, B, C, and D; we knew that we needed to connect to certain boxes; and we knew exactly how many machines were running at each point in time,” he said. “But, when we go to the cloud, suddenly systems are completely dynamic and the number of servers that we are running depends on the load that the system is having.”

Prat explained that because of this, it is no longer enough to just know which boxes to connect; teams now have to have a full understanding of which machines are entering into and leaving the system so that connections can be made and the development team can determine what is going on.

Bristowe also explained that while the shift to cloud native can be a positive thing for the observability space, it has also made it more complicated.

“Cloud native is just a more complex scenario to support,” he said. “You have disparate systems and different technologies and different ways in which you’ll do things like logging, tracing, metrics, and things of that sort.”

Because of this, Bristowe emphasized the importance of integrating proper tooling and processes in order to work around any added complexities.
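As a small, concrete illustration of one of those signals, the sketch below uses the open-source Prometheus client for Python to expose a request counter and a latency histogram; the metric names, port, and simulated work are arbitrary choices for the example, not tied to any vendor mentioned here.

```python
# Minimal metrics sketch using the open-source prometheus_client library
# (pip install prometheus-client). Metric names and the port are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("checkout_requests_total", "Total checkout requests handled")
LATENCY = Histogram("checkout_latency_seconds", "Checkout request latency in seconds")

def handle_request() -> None:
    REQUESTS.inc()                              # count every request
    with LATENCY.time():                        # record how long the work took
        time.sleep(random.uniform(0.01, 0.2))   # stand-in for real work

if __name__ == "__main__":
    start_http_server(8000)  # metrics become scrapeable at http://localhost:8000/metrics
    while True:
        handle_request()
```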

Prat believes that the transition to cloud native not only brings new complexities, but a new level of dynamism to the observability space.

“Before it was all static and now it is all dynamic because the cloud is dynamic. Machines come, machines go, services are up, services are down and it is just a completely different story,” he said.

Opsera’s Vasudevan also stressed that moving into the cloud has put more of an emphasis on the security benefits that observability can offer.

He explained that while moving into the cloud has helped the velocity of deployments, it has added a plethora of possible security vulnerabilities.

The risks of insufficient observability

When companies fail to provide their development teams with high-level observability, Josep Prat, open-source engineering director at data infrastructure company Aiven, said it can feel like regressing to the dark ages.

He explained that without observability, the best developers can do is venture a guess as to why things are behaving the way that they are.

“We would need to play a lot of guessing games and do a lot more trial and error to try and reproduce mistakes… this leads to countless hours and trying to understand what the root cause was,” said Prat.

This, of course, reduces an organization’s ability to remain competitive, something that companies cannot afford to risk.

He emphasized that while investing in observability is not some kind of magic cure-all for bugs and system failures, it can certainly help in remediation as well as prevention.

John Bristowe, community director at deployment automation company Octopus Deploy, went on to explain that observability is really all about the DevOps aspect of investing in people, processes, and tools alike.

He said that while there are some really helpful tools available in the observability space, making sure the developers are on board to learn with these tools and integrate them properly into their processes is really the key element to successful observability.

“And this is where that shift happened and developers really started to understand that they do need to have this observability in place to understand what the bottlenecks and the inefficiencies are that the development team will face,” he said.

Observability and productivity

Prat also emphasized that investing in observability heavily correlates to more productivity in an organization. This is because it enables developers to feel more secure in the products they are building.

He said that this sense of security also helps when applying user feedback and implementing new features per customer requests, leading to heightened productivity as well as strengthening the organization’s relationship with its customer base.

With proper observability tools, a company will be able to deliver better features more quickly as well as constantly work to improve the resiliency of its systems. Ultimately, this provides end users with a better overall experience as well as boosts speeds.

“The productivity will improve because we can develop features faster, because we can know better when things break, and we can fix the things that break much faster because we know exactly why things are being broken,” Prat said.

Vasudevan explained that when code is pushed to production without developers truly understanding it, technical debt and bottlenecks are pretty much a guarantee, resulting in a poorer customer experience.

“If you don’t have the observability, you will not be able to identify the bottlenecks, you will not be able to identify the inefficiencies, and the code quality is going to be very poor when it goes into production,” he said.

Bristowe also explained that there are times when applications are deployed into production and yield unplanned results. Without observability, the development team may not even notice this until damage has already been caused.



“The time to fix bugs, time to resolution, and things like that are critical success factors, and you want to fix those problems before they are discovered in production,” Bristowe said. “Let’s face it, there is no software that’s perfect, but having observability will help you quickly discover bottlenecks, inefficiencies, bugs, or whatever it may be, and being able to gain insight into that quickly is going to help with productivity for sure.”

Aiven’s Prat noted that observability also enables developers to see where and when they are spending most of their time so that they can tweak certain processes to make them more efficient.

When working on a project, developers strive for immediate results. Observability helps them when it comes to understanding why certain processes are not operating as quickly as desired.

“So, if we are spending more time on a certain request, we can try and find why,” Prat explained. “It turns out there was a query on the database or that it was a system that was going rogue or a machine that needed to be decommissioned and wasn’t, and that is what observability can help us with.”
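The sketch below shows the kind of instrumentation Prat is describing, using the open-source OpenTelemetry SDK for Python to wrap a request handler and its database call in spans so the resulting trace shows where the time went. The span names, attributes, and timings are invented for the example.

```python
# Minimal tracing sketch with the open-source OpenTelemetry Python SDK
# (pip install opentelemetry-sdk). Span names and attributes are illustrative.
import time

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("orders-service")

def handle_order(order_id: str) -> None:
    with tracer.start_as_current_span("handle_order") as span:
        span.set_attribute("order.id", order_id)
        with tracer.start_as_current_span("db.query"):         # child span
            time.sleep(0.15)                                    # stand-in for a slow database query
        with tracer.start_as_current_span("render.response"):  # child span
            time.sleep(0.01)

handle_order("A-1001")  # the exported spans reveal that db.query dominates the request time
```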

How to approach observability

When it comes to making sure developers are provided with the highest level of observability possible, Prat has one piece of advice: utilize open-source tooling.

He explained that with tools like these, developers are able to connect several different solutions rather than feeling boxed into one single tool. This ensures that they are able to have the most well-rounded and comprehensive approach to observability.

“You can use several tools and they can probably play well together, and if they are not then you can always try and build a connection between them to try and help to close the gap between two tools so that they can talk to each other and share data and you can get more eyes looking at your problem,” Prat said.

Caruana also explained the importance of implementing observability with room for evolution.


Automation and observability

John Bristowe, community director at deployment automation company Octopus Deploy, emphasized the impact that AI and automation can have on the observability space.

He explained that tools such as ChatGPT have really brought strong AI models into the mainstream and showcased the power that this technology holds.

He believes this same power can be brought to observability tools.

“Even if you are gathering as much information as possible, and you are reporting on it, and doing all these things, sometimes even those observations still aren’t evident or apparent,” he said. “But an AI model that is trained on your dataset, can look and see that there is something going on that you may not realize.”

David Caruana, UK-based software architect at content services company Hyland, added that AI can help developers better understand what the natural health of a system is, as well as quickly alert teams when there is an anomaly.

He predicts that in the future we will start to see automation play a much bigger role in observability tools, such as filtering through alerts to select the key, root cause alerts that the developer should focus on.

“I think going forward, AI will actually be able to assist in the resolution of those issues as well,” Caruana said. “Even today, it is possible to fix things and to resolve issues automatically, but with AI, I think resolution will become much smarter and much more efficient.”

Both Bristowe and Caruana agreed that AI observability tools will yield wholly positive results for both development teams and the organization in general.

Bristowe explained that this is because the more tooling brought in and the more insights offered to developers, the better off organizations will be.

However, Vishnu Vasudevan, head of product at the continuous orchestration company Opsera, had a slightly different take.

He said that bringing automation into the observability space may end up costing organizations more than they would gain.

Because of this risk, he stressed that organizations would need to be sure to implement the right automation tools so that teams can gain the actionable intelligence and the predictive insights that they actually need.

“I would say that having a secure software supply chain is the first thing and then having observability as that second layer and then the AI and automation can come in,” Vasudevan said. “If you try to build AI into your systems and you do not have those first two things, it may not add any value to the customer.”

He said that starting small and building observability out based on feedback from developers is the best way to be sure teams are being provided with the deepest insights possible.

“As you do with all agile processes, iteration is really key, so start small, implement something, get that feedback, and make adjustments as you go along,” Caruana said. “I think a big bang approach is a high-risk approach, so I choose to evolve, and iterate, and see where it leads.”



ChatGPT won’t take your job, but it can make you more efficient and knowledgeable

ChatGPT has been the talk of the developer community ever since it was released to the public as a research preview at the end of last year.

The tool, developed by OpenAI and trained off its GPT-3.5 model, is an AIbased conversational chatbot that is actually quite impressive in its capabilities.

It enables you to ask it nearly any question and then it will generate a response. I’ve made use of it in a number of ways since it came out, such as rewording text for me to share on social media, and even in coming up with interview questions for this very story. It’s easy to understand how it can take on some of the tasks I do every day as a writer; after all, it is a chatbot. But where it’s really impressive is when you consider what it can do in the software development space.

For example, you can ask it to write a piece of code and then continue to ask it to refine what it comes up with until you’re happy with it, and then even ask it to transform that piece of code into another language.

Another sample use case (Fig.1) highlighted on OpenAI’s website is using it as a way to ask why code is not working as expected.

Cody DeArkland, head of developer relations at feature management company LaunchDarkly, said that because it is so good at giving explanations, it can be a really good learning tool. “One of my favorite parts of it is being able to get a deeper understanding about what code is doing,” he said.

ChatGPT remembers previous things you’ve said to it in a conversation, unlike other AI assistants where you ask a question and get a response and then the conversation is done. This enables you to get a response, then ask more questions. You can ask it to explain what it just told you, or even ask it to fix a part of what it gave you if it wasn’t quite what you were looking for.

“I was writing a login component for a project the other night, and I just wanted a quick example that I could then build off of and change for my own uses,” DeArkland said. “So I went into ChatGPT and said, ‘Can I have a login component written in React?’ and it returns that back.”

It may have required some customizations to make it fit his particular use case, but the foundation was there, he explained.

DeArkland said that where this gets really interesting is when you’re looking for something in a framework you’re not familiar with. He could take the React component from the example above and ask it to switch it to Flask within Python instead, for example.

“I’ve been writing code for 10 years now, but there’s still things I’m learning. And when I want to learn more, or I want to understand more about a different part of this, my existing workflow today is go and hit Google and look around and find interesting blog posts or find a StackOverflow guide about it. Being able to use ChatGPT to learn from is something I’ve seen a lot of peers in the industry doing.”

DeArkland sees this as a way to lower the bar to entry for those entering the field of development because it will not just provide you with the code but enable you to ask questions like why the component was chosen, why certain directives were used in the code, or ask it to explain what a line of code does.

One limitation of this, however, is that according to OpenAI’s website, ChatGPT was “fine-tuned from a model in the GPT-3.5 series, which finished training in early 2022.” This means that it doesn’t always know the latest information about the technologies you’re asking about, and it doesn’t know about new technologies either.

According to DeArkland, asking about a technology it doesn’t know much about will result in limited answers because it simply doesn’t have enough data to learn from to give a really good answer.

“I think the timing thing is going to be a challenge, especially in fast-moving industries, where new features, new concepts, and new innovations are coming out. It’s going to have to learn how to manage those. And maybe we have to learn how to manage our expectations for what it knows,” said DeArkland.

ChatGPT isn’t taking your job

When you start playing around with ChatGPT and see what it is capable of, you might start to worry if it could take over the role of developers. It seems that at this point in time, the answer is no.

“There’s always going to be a place for us to understand what our context is, what our business logic that we need to have in there is, what design choices we have, all of those things. Humans are safe; we still have a place in a ChatGPT world,” said DeArkland.

And while it is really good at writing code, that isn’t all a developer does all day. “When you go to college and you learn to write code, all you’re doing is writing code, and that’s all you’re ever evaluated on. But when you get into the real world, that’s only 20% of what you do. The rest is config, design, architecture, etc. ChatGPT can’t help with any of that,” he said.

Marcus Merrell, VP of tech strategy at testing company Sauce Labs, believes that not having an API to access ChatGPT through is another limitation to its usefulness.

He explained that at the end of the day it’s just a chatbot, meaning you still have to copy and paste whatever information it gives you into your code environment.

“I can’t tell it, okay, you zip up all this code and send it to me in a GitHub repo, or go scan that GitHub repo and tell me how I can replicate that, fork it. It can’t do anything yet. And I think that’s what we’re probably all waiting for. That’s when it might really start to threaten some stuff,” said Merrell.

Another reason why it won’t be taking the role of developers anytime soon is because there still needs to be verification that what it’s giving you is what you intended. “ChatGPT is biased to provide an answer, not necessarily a correct answer,” said Kimen Warner, VP of product management at B2B marketing company Drift.

Speaking in the context of marketers, Warner predicts that while there may be adoption of ChatGPT for generating content options, there still needs to be a human in the loop to edit and approve its suggestions.

“The way I think about it is it’s much easier to edit something rather than to create something from scratch,” she said. “And I think that’s what ChatGPT and other language models like it will let you do. So I think that’s probably where we’re going to see adoption for a while.”

It is turning heads, though

While Merrell wasn’t worried about the idea of ChatGPT replacing developers, he said it was the first tool that he really felt he had to pay attention to.

During his career he has evaluated 18 or 20 different low-code testing tools in the market, and he hasn’t felt the need to use any of them to further his career, until now.

“This one’s the first time I felt like I need to pay attention to this. It’s not going to replace my job. But it would help me in getting some things done,” he said.

Merrell recounted a recent conversation with a colleague where he pointed out that the thing that “ChatGPT has generated the most of is hot takes about ChatGPT.”

Fig. 1: A sample use case highlighted on OpenAI’s website, asking ChatGPT why code is not working as expected.

Broadcom: 84% of orgs will be using VSM by end of the year

If you’re a regular reader of SD Times you may have gotten the sense that value stream management (VSM) is really taking off in tech. We’ve increasingly written about it, launched a new website just for value stream news, and even launched a value stream management conference in 2020.

And if that wasn’t enough proof of its growing popularity, a new survey from Broadcom provides numbers to back that up. According to its survey of over 500 IT and business leaders, it is expected that 84% of enterprises will have adopted VSM by the end of 2023. This is up from just 42% in 2021.

According to Broadcom, early adoption of VSM started around four years ago, and within the past two there has been a shift to mainstream adoption. Sixty percent of survey respondents said they will use VSM to deliver at least one product this year.

Being at the “mainstream” level of adoption signifies greater maturity in implementations and will enable companies at that point to address challenges they are facing: supply chain issues, inefficient processes, remote workers, and shifting company priorities.

“You have to make a concerted effort and get everybody on board and change at the right pace for your organization,” said Laureen Knudsen, chief transformation officer at Broadcom. “But you can truly make huge gains in your effectiveness and efficiency and therefore your outcomes if you put these things in place, and you actually take them seriously.”

Looking ahead, Broadcom predicts that increasing customer value will be the number one focus for businesses, with 58% of companies listing it as a priority this year.

Other priorities will be improving product quality (55%), cost reduction (51%), customer growth (50%), increasing profits (49%), and increasing speed to market (45%).

“Customers are much more likely to change, whether it’s banking, or insurance, or airlines, who they use,” said Knudsen. “You’re seeing people kind of looking at everything fresh. And so your customers are looking at, who else can I use? What can I do? Is there a better way for me to do this? So companies are really pivoting to strongly focus on that customer value.”

Broadcom also asked respondents what benefits they are seeing from VSM, and the top five answers were increased transparency (42% of respondents), improved organizational alignment (39%), providing faster solutions to customers (38%), data-driven decision-making (37%), and improved access to data (36%).

“The data clearly shows those with VSM are leveraging more indicators to track customer value, representing more sophistication but also having a greater focus on direct customer feedback such as support and social media,” said Jean-Louis Vignaud, head of ValueOps at Broadcom. “The research also found companies with a VSM practice have significantly better tools and processes to capture and track these customer value metrics than those without VSM.”

Report: Platform engineering to see big boost in 2023

Platform engineering teams are on the rise, with 94% of respondents to Puppet by Perforce’s 2023 State of DevOps report saying that platform engineering is helping them realize the benefits of DevOps more than before.

According to the company, platform engineering is the practice of “designing and building self-service capabilities to minimize cognitive load for developers and to enable fast flow software delivery.”

When implemented properly, platform engineering benefits the whole organization, not just development and IT teams.

About 68% of respondents said that they’ve seen an increase in development velocity since adopting this practice. Forty-two percent said that speed has increased “a great deal.”

Other benefits companies with platform engineering teams are seeing include improved system reliability (60%), greater productivity (59%), and better workflow standards (57%).

Top priorities for platform engineering teams also align with product management responsibilities. These include increasing awareness of platform capabilities (47% of respondents), setting realistic expectations of the role of the platform team within the company (44%), and following industry trends and keeping up with feature requirements (37%).

In an economy where layoffs are becoming increasingly common, those with platform engineering experience might have an edge. According to the report, 71% of respondents said their companies plan to hire someone with that qualification over the next 12 months, and 53% will do so within the next six months.


How to build trust in AI for software testing

The application of artificial intelligence (AI) and machine learning (ML) in software testing is both lauded and maligned, depending on who you ask. It’s an eventuality that strikes balanced notes of fear and optimism in its target users. But one thing’s for sure: the AI revolution is coming our way. And, when you thoughtfully consider the benefits of speed and efficiency, it turns out that it is a good thing. So, how can we embrace AI with positivity and prepare to integrate it into our workflow while addressing the concerns of those who are inclined to distrust it?

Speed bumps on the road to trustville

Much of the resistance toward implementing AI in software testing comes down to two factors: a rational fear for personal job security and a healthy skepticism in the ability of AI to perform tasks contextually as well as humans. This skepticism is primarily based on limitations observed in early applications of the technology.

To further promote the adoption of AI in our industry, we must assuage the fears and disarm the skeptics by setting reasonable expectations and emphasizing the benefits. Fortunately, as AI becomes more mainstream (a direct result of improvements in its abilities), a clearer picture has emerged of what AI and ML can do for software testers; one that is more realistic and less encumbered by marketing hype.

First things first: Don’t panic

Here’s the good news: the AI bots are not coming for our jobs. For as long as there have been AI and automation testing tools, there have been dystopian nightmares about humans losing their place in the world. Equally prevalent are the naysayers who scoff at such doomsday scenarios as being little more than the whims of science fiction writers.

The sooner we consider AI to be just another useful tool, the sooner we can start reaping its benefits. Just as the invention of the electric screwdriver has not eliminated the need for workers to fasten screws, AI will not eliminate the need for engineers to author, edit, schedule and monitor test scripts. But it can help them perform these tasks faster, more efficiently, and with fewer distractions.

Autonomous software testing is simply more realistic and more practical when viewed in the context of AI working in tandem with humans. People will remain central to software development since they are the ones who define the boundaries and potential of their software. The nature of software testing dictates that the “goal posts” are always shifting, as business requirements are often unclear and constantly changing. This variable nature of the testing process demands continued human oversight.

The early standards and methodologies for software testing (including the term “quality assurance”) come from the world of manufacturing product testing. Within that context, products were well-defined, with testing far more mechanistic compared to software, whose traits are malleable and often changing. In reality, however, software testing is not applicable to such uniform, robotic methods of assuring quality.



In modern software development, there are many things that can’t be known by developers. There are too many changing variables in the development of software that require a higher level of decision-making than AI can provide. And yet, while fully autonomous AI is unrealistic for the foreseeable future, AI that supports and extends human efforts at software quality is still a very worthwhile pursuit. Keeping human testers in the mix to consistently monitor, correct, and teach the AI will result in an increasingly improved software product.

The three stages of AI in software testing

AI for software testing essentially has three stages of development maturity:

• Operational Testing AI

• Process Testing AI

• Systemic Testing AI

Most AI-enabled software testing is currently performed at the operational stage. Operational testing involves creating scripts that mimic the routines human testers perform hundreds of times. Process AI is a more mature version of Operational AI, with testers using Process AI for test generation. Other uses may include test coverage analysis and recommendations, defect root cause analysis and effort estimations, and test environment optimization. Process AI can also facilitate synthetic data creation based on patterns and usages.

The third stage, Systemic AI, is the least tenable of the three owing to the enormous volume of training it would require. Testers can be reasonably confident that Process AI will suggest a single feature or function test to adequately assure software quality. With Systemic AI, however, testers cannot know with high confidence that the software will meet all requirements in all situations. AI at this level would test for all conceivable requirements, even those that have not been imagined by humans. This would make the work of reviewing the autonomous AI’s assumptions and conclusions such an enormous task that it would defeat the purpose of working toward full autonomy in the first place.

Set realistic expectations

After clarifying what AI can and cannot do, it is best to define what we expect from those who use it. Setting clear goals early on will prepare your team for success. When AI tools are introduced to a testing program, they should be presented as a software project that has the full support of management, with well-defined goals and milestones. Offering an automated platform as an optional tool for testers to explore at their leisure is a setup for failure. Without a clear directive from management and a finite timeline, it is all too easy for the project to never get off the ground. Give the project a mandate and you’ll be well on your way to successful implementation. You should aim to be clear about who is on the team, what their roles are, and how they are expected to collaborate. It also means specifying what outcomes are expected and from whom.

Accentuate the positive

Particularly in agile development environments, where software development is a team sport, AI is a technology that benefits not only testers but also everyone on the development team. Give testers a stake in the project and allow them to analyze the functionality and benefits for themselves. Having agency will build confidence in their use of the tools, and convince them that AI is a tool for augmenting their abilities and preparing them for the future.

Remind your team that as software evolves, it requires more scripts and new approaches for testing added features, for additional use patterns and for platform integrations. Automated testing is not a one-time occurrence. Even with machine learning assisting in the repairing of scripts, there will always be opportunities for further developing the test program in pursuit of greater test coverage, and higher levels of security and quality. Even with test scripts that approach 100 percent code execution, there will be new releases, new bug fixes, and new features to test. The role of the test engineer is not going anywhere; it is just evolving.

Freedom from the mundane

It is no secret that software test engineers are often burdened with a litany of tasks that are mundane. To be effective, testing programs are designed to audit software functionality, performance, security, look and feel, etc. in incrementally differing variations and at volume. Writing these variations is repetitive, painstaking, and to many even boring. By starting with this low-hanging fruit, the mundane, resource-intensive aspects of testing, you can score some early wins and gradually convince the skeptics of the value of using AI testing tools.

Converting skeptics won’t happen overnight. If you overwhelm your team by imposing sweeping changes, you may be setting yourself up for failure. Adding AI-assisted automation into your test program greatly reduces the load of such repetitive tasks, and allows test engineers to focus on new interests and skills.

For example, one of the areas where automated tests frequently fail is in the identification of objects within a user interface (UI). AI tools can identify these objects quickly and accurately to bring clear benefit to the test script. By focusing on such operational efficiencies, you can make a strong case for embracing AI. When test engineers spend less time performing routine debugging tasks and more time focusing on strategy and coverage, they naturally become better at their jobs. When they are better at their jobs, they will be more inclined to embrace technology.

In the end, AI is only as useful as the way in which it is applied. It is not an instantaneous solution to all our problems. We need to acknowledge what it does right, and what it does better. Then we need to let it help us be better at our jobs. With that mindset, test engineers can find a very powerful partner in AI and will no doubt be much more likely to accept it into their workflow.

Mush Honda serves as Chief Quality Architect for Katalon, an AI-augmented software quality management platform. He is a results-oriented senior engineering leader with over 20 years of experience in software delivery, with a strong background in quality engineering, test automation and DevOps.

Software intelligence is key to writing better code

Development teams are always on a mission to create better quality software, be more efficient, and please their users as much as possible.

The introduction of AI into the development pipeline makes this possible, from software intelligence to AI-assisted development tools. Both can work hand in hand to reach the same goal, but there’s a difference between software intelligence and intelligent software.

AI-assisted development tools are products that use AI to do things like suggest code, automate documentation, or generally increase productivity. Vincent Delaroche, founder and CEO of CAST, defines software intelligence as tools that analyze code to give you visibility into it so you can understand how the individual components work together, identify bugs or vulnerabilities, and gain visibility.

So while these intelligent software tools help you write better code, the software intelligence tools sift through that code and make sure it is as high quality as possible, and make recommendations on how to get to that point.

“Custom software is seen as a big complex black box that very few people understand clearly, including the subject matter experts of a given system,” said Delaroche. “When you have tens of millions of lines of code, which represent tens of thousands of individual components which all interact between each other, there is no one on the planet who can claim to be able to understand and be able to control everything in such a complex piece of technology.”

Similarly, even the smartest developer doesn’t know every possible option available to them when writing code. That’s where AI-assisted development comes in, because these tools can suggest the best possible piece of code for the application.

For example, a developer could provide a piece of code to ChatGPT and ask it for better ways of writing the code.
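A minimal sketch of what that can look like programmatically, assuming the OpenAI Python client with an API key in the OPENAI_API_KEY environment variable; the model choice, prompt, and snippet are illustrative rather than a recommendation.

```python
# Sketch: send a snippet to a large language model and ask for improvements.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

snippet = """
def total(values):
    result = 0
    for v in values:
        result = result + v
    return result
"""

response = openai.Completion.create(
    model="text-davinci-003",   # illustrative model choice
    prompt=f"Suggest a cleaner, more idiomatic way to write this Python function:\n{snippet}",
    max_tokens=200,
    temperature=0,
)
print(response.choices[0].text.strip())
```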

According to Diego Lo Giudice, principal analyst at Forrester, Amazon DevOps Guru serves a similar purpose on the configuration side. It uses AI to detect possible operational issues and can be used to configure your pipelines better.

Lo Giudice explained that quality issues aren’t always the result of bad code; sometimes the systems around the software are not configured correctly, and that can result in issues too. These tools can help identify those problem configurations.

George Apostolopoulos, head of analytics at Endor Labs, further explained the capabilities of software intelligence tools as being able to perform simple rules checks, provide counts and basic statistics like averages, and do more complex statistical analysis.

Software intelligence is crucial if you’re working with dependencies

Software intelligence plays a big role not only in quality, but in security as well, solving a number of challenges with open source software (OSS) dependencies.

These tools can help by evaluating the security practices of a dependency’s development, checking the dependency’s code for vulnerable code, and checking it for malicious code. They use global data to identify things like typosquatting and dependency confusion attacks.

“In the last few years a number of attacks exposed the potential of the software supply chain for being a very effective attack vector with tremendous force multiplying effects,” said Apostolopoulos. “As a result, a new problem is to ensure that a dependency we want to introduce is not malicious, or a new version of an existing dependency does not become malicious (because its code or maintainer were compromised) or the developer does not fall victim to attacks targeting the development process like typosquatting or dependency confusion.”
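One small, self-contained illustration of the kind of check such tools run: flag a requested package whose name is suspiciously close to a popular one, a common typosquatting pattern. The package list and similarity threshold below are made up for the example; real tools draw on global registry data and far richer signals.

```python
# Naive typosquatting check: is this package name a near-miss of a popular one?
from difflib import SequenceMatcher
from typing import Optional

POPULAR_PACKAGES = {"requests", "urllib3", "numpy", "pandas", "django"}  # example list

def possible_typosquat(name: str, threshold: float = 0.85) -> Optional[str]:
    """Return the popular package this name is suspiciously similar to, if any."""
    if name in POPULAR_PACKAGES:
        return None  # an exact match with a known package is fine
    for popular in POPULAR_PACKAGES:
        if SequenceMatcher(None, name, popular).ratio() >= threshold:
            return popular
    return None

print(possible_typosquat("reqeusts"))  # -> "requests"
print(possible_typosquat("mylib"))     # -> None
```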

When introducing new dependencies, there are a number of questions the developer needs to answer, such as which piece of code will actually solve their problem, as a start. Software intelligence tools come into play here by recommending candidates based on a number of criteria, such as popularity, activity, amount of support, and history of vulnerabilities.

Then, to actually introduce this code, more questions pop up. “The dependency tree of a modestly complex piece of software will be very large. Developers need to answer questions like: do I depend on a particular dependency? What is the potentially long chain of transitive dependencies that brings it in? In how many places in my code do I need it?” said Apostolopoulos.

It is also possible in large codebases to be left with unused and out-of-date dependencies as code changes. “In a large codebase these are hard to find by reviewing the code, but after constructing an accurate and up to date dependency graph and call graph these can be automatically identified. Some developers may be comfortable with tools automatically generating pull requests that recommend changes to their code to fix issues and in this case, software intelligence can automatically create pull requests with the proposed actions,” said Apostolopoulos.
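A deliberately simplified sketch of the unused-dependency idea for a Python project: compare what a requirements file declares against what the source tree actually imports. Real tools build full dependency and call graphs, and the mapping between package names and import names is messier in practice; the paths below are examples.

```python
# Simplified unused-dependency check: declared packages minus imported modules.
import ast
import pathlib

def imported_modules(src_dir: str) -> set:
    """Collect top-level module names imported anywhere under src_dir."""
    found = set()
    for path in pathlib.Path(src_dir).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                found.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                found.add(node.module.split(".")[0])
    return found

def declared_dependencies(requirements_file: str) -> set:
    """Read package names from a requirements.txt-style file."""
    deps = set()
    for line in pathlib.Path(requirements_file).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            deps.add(line.split("==")[0].split(">=")[0].lower())
    return deps

# Note: package names and import names do not always match, so treat the
# result as candidates to review rather than a definitive list.
unused = declared_dependencies("requirements.txt") - imported_modules("src")
print("Possibly unused:", sorted(unused))
```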

Having a tool that automatically provides you with this visibility can really reduce the mental effort required by developers to maintain their software.

The software landscape is a “huge mess”

Delaroche said that many CIOs and CTOs may not be willing to publicly admit this, but the portfolios of software assets that run the world, the ones that exist in the largest corporations, are becoming a huge mess.

“It’s becoming less and less easy to control and to master and to manage and to evolve on,” said Delaroche. “Lots of CIOs and CTOs are overwhelmed by software complexity.”

In 2011, Marc Andreessen famously claimed that “software is eating the world.” Delaroche said this is more true than ever as software is becoming more and more complex.

He brought up the recent example of Southwest Airlines. Over the holidays, the airline canceled over 2,500 flights, which was about 61% of its planned flights. The blame for this was placed on a number of issues: winter storms, staffing shortages, and outdated technology.

The airline’s chief operating officer Andrew Watterson said in a call with employees: “The process of matching up those crew members with the aircraft could not be handled by our technology. … As a result, we had to ask our crew schedulers to do this manually, and it’s extraordinarily difficult. … They would make great progress, and then some other disruption would happen, and it would unravel their work. So, we spent multiple days where we kind of got close to finishing the problem, and then it had to be reset.”

While something as disruptive as this may not happen every day, Delaroche said that every day companies are facing major crises. It’s just that the ones we know about are the ones that are big enough to make it into the press.

“Once in a while we see a big business depending on software that fails,” he said. “I think that in five to ten years, this will be the case on a weekly basis.”

What does the future hold for these tools?

Over the past six months Lo Giudice has seen a big acceleration in adoption of tools that use large language models.

However, he doesn’t expect everyone to be writing all their code using ChatGPT just yet. There are a lot of things that need to be in place before a company can really bring all this into their software development pipeline.

Companies will need to start scaling these things up, define best practices, and define the guardrails that need to be put in place. Lo Giudice believes we are still about three to five years away from that happening.

Another thing that the industry will have to grapple with as these tools come into more widespread use is the idea of proper attribution and copyright.

In November 2022, there was a class-action lawsuit brought against GitHub Copilot, led by programmer and lawyer Matthew Butterick.

The argument made in the suit is that GitHub violated open-source licenses by training Copilot on GitHub repositories. Eleven open-source licenses, including MIT, GPL, and Apache, require the creator’s name and copyright to be attributed.

In addition to violating copyright, Butterick wrote that GitHub violated its own terms of service, DMCA 1202, and the California Consumer Privacy Act.

“This is the first step in what will be a long journey,” Butterick wrote on the webpage for the lawsuit. “As far as we know, this is the first class-action case in the US challenging the training and output of AI systems. It will not be the last. AI systems are not exempt from the law. Those who create and operate these systems must remain accountable. If companies like Microsoft, GitHub, and OpenAI choose to disregard the law, they should not expect that we the public will sit still. AI needs to be fair & ethical for everyone. If it’s not, then it can never achieve its vaunted aims of elevating humanity. It will just become another way for the privileged few to profit from the work of the many.” z

Another area to apply shift-left to

Over the last few years, several elements of the software development process have shifted left. Galael Zino, founder and chief executive of NetFoundry, thinks that software analysis also needs to shift left.

This might sound counterintuitive. How can you analyze code that doesn’t exist yet? But Zino shared three changes that developers can make to enable this shift.

First, they should adopt a secure-by-design mentality. He recommends minimizing reliance on third-party libraries, because often they contain much more than the specific use case you need. For the ones you do need, it’s important to do a thorough review of that code and its dependencies.

Second, developers should add more instrumentation than they think they will need, because it’s easier to add instrumentation for analysis at the start than when something is already in production (a brief sketch of this appears after these three recommendations).

Third, take steps to minimize the attack surface. The internet is the largest single surface area, so reduce risk by ensuring that your software only communicates with authorized users, devices, and servers.

“Those entities still leverage Internet access, but they can't access your app without cryptographically validated identity, authentication and authorization,” he said z
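On the instrumentation point, here is a minimal sketch of what baking it in early can look like, using the OpenTelemetry Python API; the span and attribute names are illustrative, and choosing an exporter or analysis backend is a separate, later decision.

```python
# Instrument a code path from day one with the OpenTelemetry API
# (the opentelemetry-api package); names here are illustrative.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def process_order(order_id, items):
    # Wrap the interesting code path in a span so latency and metadata
    # are recorded wherever the service eventually runs.
    with tracer.start_as_current_span("process_order") as span:
        span.set_attribute("order.id", order_id)
        span.set_attribute("order.item_count", len(items))
        for item in items:
            with tracer.start_as_current_span("reserve_inventory"):
                pass  # business logic would go here

process_order("A-1001", ["widget", "gadget"])
```

Because the API package defaults to no-op implementations until an SDK and exporter are configured, hooks like these can ship well before the analysis backend exists.

On the attack-surface point, one concrete way to require cryptographically validated identity is mutual TLS, where the server rejects any client that cannot present a certificate signed by a certificate authority you trust. Below is a sketch using Python’s standard library; the certificate file paths are placeholders.

```python
# Mutual TLS sketch: only clients with certificates signed by our CA get in.
import http.server
import ssl

server = http.server.HTTPServer(("0.0.0.0", 8443), http.server.SimpleHTTPRequestHandler)

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")  # placeholder paths
context.load_verify_locations(cafile="trusted-clients-ca.pem")        # placeholder path
context.verify_mode = ssl.CERT_REQUIRED  # reject any client without a valid certificate

server.socket = context.wrap_socket(server.socket, server_side=True)
server.serve_forever()
```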


Time to hide your API

Manage the ones you write — and even those you didn’t — to keep them safe from attacks

The need for robust API security is growing rapidly in response to the increasing dependence of organizations on APIs for their digital operations.

With 70% of respondents to a report expecting to use more APIs in 2023 than last year, API security faces a heightened challenge, as it comprises only about 4% of the testing efforts at organizations today.

The 4th annual State of the APIs Report collected insights from more than 850 global developers, engineers, and leaders from across the technology community spanning over 100 countries including the US, the UK, Germany, and India.

The increased API usage is especially prominent in telecommunications, which is projected to rise to 72%, up from 59% last year. This is followed by smaller, yet still considerable, increases in the fields of technology and professional services.

Mark O’Neill, VP analyst and chief of research for software engineering at Gartner, correctly predicted in 2021 that by this year, API breaches would be the number one threat vector for web applications.

“Part of the reason for that is because with mobile and web apps, along with any other type of modern application that you’re using, it all involves the use of APIs,” O’Neill said. Gartner research has estimated that by 2025, fewer than half of enterprise APIs will be managed, as explosive growth in APIs surpasses the capabilities of API management tools and “security controls try to apply old paradigms to new problems.”

This vast number of APIs floating around the organization is further complicated by multiple teams building and managing APIs all while using different cloud platforms and frameworks, according to O’Neill.

“When you have different platforms where your teams are building and deploying APIs, there’s no one place to put the gateway, which is a problem for traditional API management solutions,” O’Neill said.

To secure this wide API landscape, many companies have put up multiple gateways in front of their APIs, but that created a new problem: learning how to manage all of these gateways together.

“Many clients have asked us for a federated solution that would work across different API gateways and allow teams to have a single picture of their API traffic and to have a single control plane for management and security, but at the moment, that is a gap in the market,” O’Neill said.

A single federated solution would allow users to set up authentication and authorization schemes across different APIs, ensuring that only the right users have access to the right resources. It also enables administrators to set up rate limiting and other security measures, such as IP white/blacklisting, to protect against malicious attacks.
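To make the rate-limiting piece concrete, here is a token bucket per API consumer of the sort a gateway or control plane enforces before requests reach a backend; the capacity and refill rate are arbitrary example values, not a recommendation.

```python
# Token-bucket rate limiting keyed by API consumer, in miniature.
import time

class TokenBucket:
    def __init__(self, capacity, refill_per_second):
        self.capacity = capacity
        self.refill_per_second = refill_per_second
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()

    def allow(self):
        """Return True if a request may proceed, consuming one token."""
        now = time.monotonic()
        elapsed = now - self.last_refill
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_second)
        self.last_refill = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

buckets = {}

def handle_request(api_key):
    bucket = buckets.setdefault(api_key, TokenBucket(capacity=10, refill_per_second=1.0))
    return "200 OK" if bucket.allow() else "429 Too Many Requests"

for _ in range(12):
    print(handle_request("consumer-a"))  # the last couple of calls get throttled
```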

With such a solution, teams would also gain visibility into API performance and usage, allowing teams to identify and address potential security issues quickly.

A hodgepodge of APIs in use

The other problem APIs present for API management solutions is that there are many different types of APIs in use.

The API jumble often consists of REST, webhooks, WebSockets, SOAP, GraphQL, Kafka, AsyncAPI, and gRPC, if not more.

“If you look at a typical organization that has deployed API management, they may believe that all of their APIs are being managed on one platform,” O’Neill said. “But typically, there are a lot of other APIs that they have that are part of web applications, part of mobile apps, and they’re not managed, they’re effectively under the radar for that organization. And these are the ones that get breached.”

The APIs to watch out for in particular are GraphQL APIs, according to O’Neill. Users can do very wide and deep queries on data, which can also be their downside, because it’s difficult to set up proper access control rules. The complexity of the query can make it hard to predict what data will be accessible.

Additionally, the use of variables in queries can make it difficult to prevent malicious users from exploiting the API. GraphQL APIs are often stateless, which means that security teams need to ensure that all requests are properly authenticated and authorized. These types of APIs are also new, so many organizations are just building up their security teams’ skills around GraphQL and graph APIs in general.
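A self-contained illustration of why query complexity is hard to police: reject any query whose selection sets nest beyond a maximum depth. The brace-counting check below is a naive stand-in; production implementations typically walk the parsed query AST, and the limit shown is arbitrary.

```python
# Naive GraphQL depth check: count nesting of selection sets by braces.
def max_depth(query: str) -> int:
    depth = current = 0
    for ch in query:
        if ch == "{":
            current += 1
            depth = max(depth, current)
        elif ch == "}":
            current -= 1
    return depth

ABUSIVE_QUERY = """
{ user(id: 1) { friends { friends { friends { posts { comments { author { name } } } } } } } }
"""

MAX_ALLOWED_DEPTH = 5  # arbitrary limit for the example
if max_depth(ABUSIVE_QUERY) > MAX_ALLOWED_DEPTH:
    print("rejected: query nests too deeply")
else:
    print("accepted")
```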


Another challenge is to consider where all of your APIs are coming from.

While internal APIs were still the most common API type developers reported working on for their organization, more developers in 2022 reported working on partner-facing or third-party APIs than the year prior. In addition, the SaaS applications that developers utilize also often use their own set of APIs.

The percentage of developers who reported working on partner-facing and third-party APIs grew by almost 5% in 2022 compared to 2021, according to the 2022 State of the API report. This change was even more dramatic with partner-facing APIs in industries like technology, which grew by nearly 10%.

One hotspot of security issues tends to be around the APIs that require access to data: customer data, preferences, and all sorts of account information. Issues also surround APIs that run a function to do something, because often that requires a transaction, so payment information might be at risk, O’Neill said.

“One is the whole area of loyalty cards where you get points for making purchases, traveling, and so on. Those involve many APIs. So you have an API to look up how many points a certain person has or you have an API to spend the points. We’ve seen security breaches where attackers have been able to find people who have accrued many points and then spend those,” O’Neill said. “Often the person is not aware, because they simply were not aware that they were running up all these points in the first place, and then they’re not aware when they get spent.”

Best practices for API security

The first step for ensuring API security is to catalog all of the APIs in the organization and to have an inventory. Often, companies only look at their existing API gateway to see what APIs are registered there, but even multiple gateways don’t paint the complete picture, O’Neill explained.

“The way that we advise people to do this is to see what APIs your business depends on,” O’Neill said. “So those of course can be your own APIs, but they can also be important APIs that you’re consuming from third parties as well. It’s going to be a problem if those APIs suffer a security breach, if they are unavailable, or if they are just simply changing and creating breaking changes. So API discovery is a hard problem, because you have to look in multiple places for the APIs.”

There are also some solutions on the market that enable users to tap into application firewalls in the infrastructure at the CDN level to look at the traffic and see what API calls are happening.

“That approach can in many ways be too late, because those APIs that you’re discovering are already in production. But still, it’s better than not discovering them at all,” O’Neill said. z

Market forecast for 2023

Cyberattacks and data breaches don’t pause with an economic slowdown. When prioritizing security investments, security leaders should continue to invest in security controls and solutions that protect the organization’s customer-facing and revenue-generating workloads, as well as any infrastructure critical to health and safety for those organizations in industries such as utilities, energy, and transportation, according to Forrester in its Planning Guide 2023: Security & Risk.

“API-first is the de facto modern development approach, and APIs help organizations create new business models and methods of engagement with customers and partners. However, security breaches due to unprotected APIs and API endpoints are common, and no single type of tool fully addresses API security,” the guide states.

API management tools address authentication and authorization issues, while API-specific security tools are used for scanning and discovery. Additionally, some security tools extend further to provide runtime protections and microgateways to protect against API attacks. Traditional security tools such as WAFs and bot management solutions are also expanding to cover these attacks, the report added.

Gartner’s O’Neill said that he is seeing large vendors take steps forward in providing strong API protection and acquire some of the smaller specialist vendors that have come along for API protection as well.

According to the 2022 State of APIs report, 69% of developers said that they expect to use APIs more in 2023, while 25% said that they expect about the same. Only about 6% stated that they expect less or they didn’t know. z

Using APIs to increase security

By collaborating through APIs, organizations can become more secure as a whole. One such example occurred in the Open Banking Initiative that started in Europe but has since spread in popularity to North America.

The Open Banking Initiative began in January 2016, when the Competition and Markets Authority (CMA) in the UK issued a directive ordering the country’s nine largest banks to open up their customer data to third-party providers.

Since then, it has become valuable because it has allowed financial institutions to create Open APIs that outside organizations and their third-party developers can leverage, according to MuleSoft in a blog post.

Rather than opening up the APIs to attack, the initiative enabled a secure form of data exchange that accelerates collaboration with outside organizations and has decreased the risks associated with screen scraping, a technique used by programs to extract data from the human-readable output of a computer application.

Screen scraping is insecure because it requires customers to provide third-party aggregators with login credentials, and it also pushes significant traffic to servers with every “scrape.”

Open Banking initiatives offer financial institutions the opportunity to safely collaborate with third-party developers through APIs. Unlike screen scraping, this secure data exchange is API-enabled and does not strain or overload servers. z


The layers of digital transformations

Technology is a tool, not a strategy. When a company undergoes a digital transformation, it is embedding technologies across the business to drive fundamental change, often resulting in increased efficiency and greater agility.

No longer is the goal of digital transformation to become digitally native, but instead to drive real, tangible value for the business.

The pace of change in the digital world is dizzying, and it’s easy to get caught up rubbernecking the competition to spot the latest trends.

At its most basic, digital transformation is a combination of two things: digitization and optimization. Digitization has been around for a while, and just means digitizing information or making data available in a digital format. Optimization (also referred to as “digitalization”) is more process-focused, and involves using those digital technologies to operate systems more efficiently.

Tech talk

Real change occurs by listening, not talking. Oftentimes, CIOs, CDOs and CTOs get swept up in the excitement of a transformation and simply pass down edicts that the company is “going digital” without first determining the strategy or creating KPIs that can properly measure impact. The result? Conflicting plans, varying timelines, and splintered focus. (And a lot of wasted time and money.)

Instead, a digital transformation needs to start by defining business goals and expectations across the leadership team and aligning those goals and expectations with department heads to ensure their vision matches reality. From there, rather than attempt to change an entire organization at once, it’s best to view a digital transformation in layers: what are the areas of importance, how do they stack up against each other, and how can you condense operational needs across departments?

Imagine you’re the CIO of a large consumer brand with ~50K employees across 40+ departments offering hundreds of products to millions of end-customers. Your technology ecosystem would be a mess of legacy mainframes, aging document storage systems, disparate processes, overloaded IT teams, off-the-shelf systems with low user adoption, and an unstable suite of customer-facing digital channels. How do you encapsulate all of this onto a whiteboard and create a digital transformation roadmap?

This is where layers factor in. Trying to find tactical solutions to each of these individual issues would lead to silos, multiple bespoke systems, and fragmented processes. By grouping common issues into layers though, you’ll develop a much more manageable, strategic framework.

Layers and phases

There are six layers to any digital transformation, each condensing key focus areas:

Data: How do you store and retrieve data securely? How do you scale? How do you ensure data integrity and avoid redundancy/duplication?

Application: How do you apply data to run your day-to-day business operations? How do you democratize data access by providing a centralized abstraction and distribution layer?

Process: How do you streamline your business processes?

Experience: How do your internal (employees) and external customers interact with the processes and systems? (UI/UX)

Collaboration: How do cross-functional teams collaborate more effectively to get work done faster?

Intelligence: How do you derive insights from data and apply intelligence into the work that you do?

Each layer must go through its own distinct phases of digitization and optimization to result in a successful (and holistic) transformation. How a company defines their own transformation depends on multiple factors unique to the organization, including budget, available resources, and the capacity for innovation over time. This is what makes true transformation a never-ending journey, the same way customer satisfaction is a never-ending pursuit.

Playing the long game

Those looking to embark on their own digital transformation should be prepared for the realities of working without a finish line. It’s a process of continuous learning, but one that will ultimately create a more cohesive, efficient, and modern operation. z

Guest View
By Dinesh Varadharajan
Dinesh Varadharajan is chief product officer at Kissflow

Web3: New scams on the blockchain

Over the last 5 years, galaxy-brained folks have had time, thanks in part to a pandemic, to dream big about Web3 after catching some inspirational podcasts and YouTube gurus.

What was so intriguing to so many about Web3 anyway? Since nobody could really agree on exactly what it was, it could literally be whatever aspiring entrepreneurs imagined it to be.

Common threads appeared. Blockchain. Bitcoin and Ethereum. DeFi. Decentralization of organizations, infrastructure and data. Freedom from tech giants. Self-sovereignty. Privacy. Opportunity.

All the kinds of ideals that generate charismatic personalities.

Who cares about maturing cloud adoption or better integration standards, when you can explore a whole new economy based on blockchain, cryptocurrency and NFTs? Why wouldn’t tech talent leave standard Silicon Valley-funded confines to live this Web3 dream?

When a space is overhyped and undefined, it encourages the rise of the worst kinds of actors. Web3 never had a chance, with its uncertain crypto-economic roots and the use of blockchain technology, which hasn’t proved adequate for enterprise-class business.

Crypto-Schadenfreude: Sham, bank run & fraud

Nobody enjoyed more of a media darling status in the Web3 world than FTX founder Sam Bankman-Fried, who famously played video games on investor calls and shuffled around the tradeshow circuit in shorts, as he donated millions to “effective altruism” charities and crypto-friendly politicians.

Now Sam’s been arrested and set for extradition from the Bahamas to face charges in the United States, with FTX the most famous failure among several other falling dominoes (Lumen, Celsius, Gemini, on and on) in the crypto rug pull.

It was fun to mock celebrity shill ads, but it’s not funny to see $2 billion in investor deposits disappear into the ether. A lot of VC whales, other DeFi companies and hapless individuals were also duped and parked their funds there too.

There’s no cash reserve regulation or FDIC account insurance in place for crypto, so when buyer confidence eroded, market makers sold, accelerating the ‘rug pull’ effect. The ripples erased as much as $183B or more from the total market cap of cryptocurrencies.

Blockchains looking for solutions

Besides cryptocurrency, the most common term we hear in Web3 discussions is blockchain, which is a distributed ledger technology (or DLT) underpinning Bitcoin, Ether, Dogecoin, and thousands more dogshitcoins.

If cloud was just ‘a computer somewhere else’ then blockchain is more like ‘an append-only database everywhere else’ due to its decentralized consensus mechanism and cryptography.
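That append-only property is easy to see in miniature: each block commits to the hash of the previous one, so rewriting history is detectable. The toy sketch below shows only the chaining; real blockchains layer a decentralized consensus mechanism across many nodes on top of it.

```python
# Toy append-only ledger: each block stores the hash of the previous block.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": previous})

def is_valid(chain):
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
append_block(ledger, "alice pays bob 5")
append_block(ledger, "bob pays carol 2")
print(is_valid(ledger))                    # True
ledger[0]["data"] = "alice pays bob 500"   # tamper with history
print(is_valid(ledger))                    # False: the chain no longer verifies
```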

Though I’m a skeptical analyst, I admit thinking there was some sleeper value in blockchain, if a few properly governed projects came along that could create safer, smoother rails to adoption.

We’ve seen vendors with very nice use cases for distributed ledgers, particularly in multi-party transactions, IP and media rights, legal agreements and audits, and proof of identity, where a blockchain can use a combination of transparency and immutability to provide a decentralized, shared system of record whether nodes are exposed publicly or among permissioned parties.

The Intellyx Take

The main roots of Web3’s failure weren’t about technology; they were about misaligned incentives and the inevitable association of Web3 with crypto and NFT market madness.

I’ve met early participants in the blockchain space with intentions for a better world, with unique computing models and applications particularly well-enabled by decentralization. They weren’t building mansions on islands and taking crypto-bros out on yachts.

Who knows? Once the incentives and risks of easy money are washed out of the market for good, maybe the dream of global access to a new, decentralized internet of applications and value could someday be realized. z

Analyst View
By Jason English
Jason English is Principal Analyst & CMO, Intellyx