

Policy-as-Code in the mix

We have used the term ‘policy-based access’ above with special care. We now have an opportunity to build IT stacks that oversee tiers of applications, data transport layers and services, all architected to behave in ways that we can define and manage as an ongoing software-defined control mechanism. Even where automation is prevalent, prolific and at times unpredictable, this is the world of Policy-as-Code (PoC).

The popularization of enterprise-grade software application development tools designed to empower so-called ‘citizen developers’ has burgeoned over the last half-decade. It is a growth pattern that has run in parallel with the rise of robotic process automation and the general trend to create autonomous controls to oversee database management and deeper system health.

With great power...

While the sum effect of the software automation revolution has been quietly reshaping the way we think about structuring software teams to deliver enterprise applications and data services for modern business, a new responsibility has emerged at both a technology and business level.

In terms of technology, if we’re prolifically automating, we must also be holistically managing - and that means we need to create policy-based access controls that manifest themselves at every tier of the modern automated IT stack. Nobody wants the wrong low-code/no-code apps accessing the wrong systems, integrating the wrong data channels and delivering the result to the wrong (unapproved) users.

In terms of business, there’s an equally impactful new responsibility. Users never previously tasked with self-determining IT system functionality are now being asked to innovate and create. Previously they just turned up for work, used some apps and occasionally went to the coast for a weekend. Today, they have to create the next killer app or risk being usurped by someone else with more creativity. So where does this whole chapter in the information revolution get us and how should we travel the road ahead?

Advocated and advanced by infrastructure software company Progress, among others, PoC is a means of applying a manageable codification layer to an organization’s security and compliance stance. More broadly, PoC is also a means of extending business policy into a more software-defined realm and applying it to the way IT systems, data-centric workflows and application components are able to operate. In a world of automation shortcuts, this is not an antidote; it is more akin to vaccination and the liberal use of digital sanitizer.

As Progress CEO Yogesh Gupta has stated, “When it comes to the effective use of Policy-as-Code, organizations need to elevate themselves to a position where they can operate with policy that is both human-readable and machine-enforceable. It’s about providing a common language that can be used across teams (both between different IT teams, and between business and IT teams) to ensure solid, secure and effective operations.”

Having previously acknowledged that there has always been a gap between what any organization’s IT policy states and what is actually implemented, Gupta and team say that the Progress Chef platform is there to help bridge that gap and form a new policy-based approach to IT operations.
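Gupta’s ‘human-readable and machine-enforceable’ framing can be made concrete with a small sketch. This is not Progress Chef syntax - the policy keys and checks below are invented for illustration - but it shows the core idea: policy declared as data, enforcement carried out by code.

```python
# A toy Policy-as-Code check: the policy is declared as data
# (human-readable) and enforced by a function (machine-enforceable).
# All names here are illustrative, not any vendor's actual syntax.

POLICY = {
    "allowed_ports": {22, 443},          # only SSH and HTTPS may listen
    "require_encryption_at_rest": True,  # storage must be encrypted
}

def check_host(host: dict, policy: dict = POLICY) -> list:
    """Return a list of policy violations for one host's reported state."""
    violations = []
    for port in host.get("open_ports", []):
        if port not in policy["allowed_ports"]:
            violations.append(f"port {port} is not in the allowed set")
    if policy["require_encryption_at_rest"] and not host.get("encrypted_at_rest"):
        violations.append("storage is not encrypted at rest")
    return violations

# A compliant host passes cleanly; a drifted one is flagged.
print(check_host({"open_ports": [443], "encrypted_at_rest": True}))   # []
print(check_host({"open_ports": [443, 8080], "encrypted_at_rest": False}))
```

In real tooling the policy would live in version control and the check would run continuously against live infrastructure state, rather than against hand-built dictionaries.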

“In the new automation arena, we’ll all need to travel with our eyes wide open,” says Kerry Kim, principal product marketing manager at OutSystems. “Next generation use of automation and artificial intelligence (AI) opens up exciting new possibilities for software development. But they also come with new risks. Not all forms of automation and AI are alike. It’s important to understand their capabilities - and their limitations - in order to use them to their best advantage.”

Profound power, profound risk?

Kim proposes that the same can be said for low-code tools. They enable a much larger, more diverse audience to innovate with software. But the risk of this newfound power isn’t just bad software - corrupted data, security breaches and legal jeopardy from exposing customer data are very real possibilities.

“Customers serious about accelerating their velocity and improving agility need to prioritize platforms that don’t compromise on their security and governance models. Using AI not just for speeding things up, but also for identifying security vulnerabilities is a great example of innovation enabling these initiatives. Finding the right balance between productivity and safety is critical to a customer’s success,” says Kim.

He advises that not all low-code tools are alike. So then, going forwards, it will be important to look for high-performance low-code platforms that offer AI assistance, visual programming and reusability for improved productivity, while also offering real-time monitoring and code analysis to ensure security and manageability, and reduce technical debt.

But even with the cautionary caveats expressed so far, do we really have a handle on low-code automation for the medium and long term? At what point can we say that the low-code and indeed no-code cognoscenti have engineered enough steering controls (and indeed braking power) into the platforms they now present to us in the automation revolution vortex? The answer might just lie in our past.

A brief history of connectivity

For Joe Drumgoole, senior director of developer relations at MongoDB, we need to look back at our recent IT engineering history. This reminds us that the traditional model of software in the 1990s was large monolithic applications linked together with technologies like CORBA (Common Object Request Broker Architecture) or COM (Component Object Model) with its ‘D for distributed’ DCOM extension.

“The vileness of both of these approaches gave birth to a modern API-first approach to building systems, starting with web services and finally righting the ship with a pivot into REST [the modern architectural style much-loved for its uniform interface]. Initially, this API-first approach was used to drive mobile adoption, but even in 2002 companies were starting to see the integration possibilities in this new technology. Roll the clock forward to 2023 and we see that a modern business is just a large collection of SaaS cloud services tied together with REST integrations,” explains Drumgoole.

The implications of this coalescing connectivity revolution were (and still are) clear: if a business was built in the last 15 years, it will very typically have had an integration team writing all this ‘glue code’ full time. But it was boring, repetitive, detail-oriented work. This, then, was the genesis of low-code.
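The ‘glue code’ Drumgoole describes can be sketched as follows. The field names and schemas are hypothetical, and the fetch/push steps are injected as plain functions, so the repetitive mapping work is visible without a network.

```python
# The kind of repetitive integration 'glue code' teams wrote by hand:
# pull records from one SaaS API, reshape them, push them to another.
# Schemas here are invented; in production fetch/push would wrap REST
# calls (e.g. requests.get / requests.post against each vendor's API).

def transform(crm_contact: dict) -> dict:
    """Map one vendor's contact schema onto another's."""
    first, _, last = crm_contact["full_name"].partition(" ")
    return {
        "firstName": first,
        "lastName": last,
        "email": crm_contact["email"].lower(),
    }

def sync_contacts(fetch, push) -> int:
    """Fetch, transform, push. Returns the number of records synced."""
    count = 0
    for record in fetch():
        push(transform(record))
        count += 1
    return count

# Exercise the pipeline with an in-memory source and sink.
sent = []
n = sync_contacts(
    fetch=lambda: [{"full_name": "Ada Lovelace", "email": "ADA@Example.com"}],
    push=sent.append,
)
```

Low-code hubs essentially turn this fetch-transform-push pattern into a visual, point-and-click artefact rather than hand-maintained code.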

“As low-code has proliferated we have seen the emergence of low-code hubs. These tools provide a consumer front-end to this twisty-turny maze of back-end connectivity,” Drumgoole continues. “It’s no accident that these tools model themselves on the original low-code tool, Excel. But everyone misses the critical issue in this rush to ChatGPT-ize the world: you can’t build enterprise systems without defining the ‘authority model’ of that enterprise.”

What we mean by authority model here is crucial to answering our central question: it’s all about who decides what matters and what gets automated. In other words, as we set about applying automation to our business, just what parts of which workflows and human workplace functions are we feeding into the machine - and how will we know if we’re being commercially prudent and operationally safe?
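As a toy illustration of an authority model - the roles and workflow names here are entirely invented - the question ‘who may automate what?’ can be reduced to a lookup that every automation request must pass before it runs:

```python
# A toy 'authority model': before any workflow is automated, check
# whether the requesting role is allowed to automate it at all.
# Roles and workflow names are illustrative only.

AUTHORITY = {
    "citizen_developer": {"expense_report", "leave_request"},
    "it_engineer": {"expense_report", "leave_request", "db_failover"},
}

def may_automate(role: str, workflow: str) -> bool:
    """True if this role is authorised to automate this workflow."""
    return workflow in AUTHORITY.get(role, set())

print(may_automate("citizen_developer", "leave_request"))  # True
print(may_automate("citizen_developer", "db_failover"))    # False
```

The point is not the lookup itself but that the mapping is written down at all: encoding who decides what gets automated is the part most organizations skip.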

Speaking from experiences drawn from working with as diverse a customer base as any, MongoDB’s Drumgoole suggests that the real challenge of low-code and automation in general is to allow the capturing and encoding of the complexity of how organizations work when people are not watching. This could be a sweet spot that we’ve yet to really work out, but it presents us with some superb discussion points for the future.

A double-edged sword?

As we chart our progress forwards with automation, lower-code systems, software accelerators and all forms of autonomic self-managing IT systems, the weight of opinion generally flows in one largely positive, upbeat and amplifying direction. But thankfully, there are enough industry analysts and commentators to provide a little balance. Chris Haas, director of product for App Engine at ServiceNow, suggests that low-code/no-code applications have historically been perceived as a double-edged sword, but when used correctly, the first cut can be both the deepest and sweetest.

“It’s a double-edged blade because both IT and business teams using these technologies have to attempt to balance real innovation with app sprawl,” said Haas. “Successful low-code experiences need to create boundaries of governance that can help remove one edge of that sword while sharpening the other for innovation at scale. Especially in an era of IT talent gaps, developer shortages and hiring stalls, low-code is becoming more essential than ever to give employees the tools they need to streamline business processes.”

Almost amusingly perhaps, the real heart of the matter here may not come down to the applications, the so-called citizen developers now getting their hands on shortcut software tooling, or the increased breadth of connections now being constructed across modern organizations’ IT stacks. It might just come down to data. How we as humans get access to different information sets and what we do with them could be what really impacts the surge in current automation and decides who comes out of the next phase alive.

“The heart of the issue is not the proliferation of apps themselves, but rather the data those apps are permitted to access and expose to users,” said Malcolm Ross, deputy CTO at Appian. For Ross to say this, as a C-suite executive at a firm quite indubitably known for its low-code prowess, might well speak volumes. He reminds us that data management in large enterprises has never been easy - and that the boom of digital technologies and tools has created exponentially more data, which has become increasingly fragmented across the organization.

“The goal ahead of us is two-fold: we need to unify that data for better decision-making, while strictly enforcing role-level security controls over who can see what,” said Ross. “But there is an approach that delivers both - and it’s known as a data fabric. This is an essential step for accelerating the pace of business transformation while ensuring data governance.”

In addition to the data issue, Appian’s Ross states that the increased pace of business technology delivery enabled by low-code requires adherence to the fundamentals of the software development life cycle, such as testing and DevOps. It is widely agreed that low-code platforms which don’t include these capabilities are not suitable for the enterprise, hence why he says his firm has concentrated on these areas with the Appian Platform for process automation.

It’s all just visual

Perhaps we all just need to calm down a little, remember where we started and think about what low-code and automation really mean at their core. At its most elemental level, this is a form factor and process that we have seen before. In many ways, this whole low-code discussion is really just a case of enterprise software vendors talking about their own visual development environments. This is the opinion of Francis Carden, VP for intelligent automation and robotics at Pegasystems.

“I think our industry needs to more carefully control and define what low-code is,” says Carden. “Low-code platforms make it easier for apps to get built with more people involved, in near real-time, all the time, to achieve the best results. But we need to remember with some caution that we don’t now just go from the as-was way of building apps (professional coders) to citizens (low-coders) - it’s not about just one or the other, but a collaboration of these and other personas across the enterprise.”

Editorial Analysis

Perhaps it should have been more obvious. Proponents of low-code/no-code software platforms and process automation tools are unlikely to admit to being complicit in the great technology automation innovation swindle.

Nobody was going to hold their hands up and say, “Well, now you mention it, yeah…it is always a concern that we’re putting too much onus and responsibility in the hands of businesspeople and non-technical folk. But quite frankly, software engineers have had just about enough of zany user requirements, so perhaps they should all get a taste of their own medicine for a while and we’ll all see you at the end of the decade to ask you if you’d like to be more reasonable in terms of your software functionality expectations.”

Nobody has said that and nobody is likely to say that, so we had to say it out loud. Not to be deliberately incendiary and contentious, but just to make sure we really do preserve some balance between business and technology departments going forwards.

“Many seem to believe they can just implement low-code and let employees freely run with it to get stuff built. After all, that’s the point of low-code, right? Absolutely not - low-code is not an excuse to cut corners. C-suite leaders like CIOs, CEOs and CTOs need to have an active role in ensuring the downstream teams are creating visible guardrails that ensure whatever’s being built - especially by citizen developers - is continuing to be built effectively and compliantly,” says Carden.

Carden reminds us that software application developers used to be locked in a room with a business requirements specification for months (sometimes years), only to one day emerge with the apps. Anything that was misconstrued, or not interpreted correctly by the coders, led to a not-fit-for-purpose rating, stress and poor or late ‘go-lives’ for the deployment.

So how do organizations ensure that low-code doesn’t go completely off the rails? The Pegasystems VP says that the highest echelons in an enterprise need to be more involved from the outset to ensure trusted partners and processes.

Everything starts as a risk

The resounding message from Carden and team is that everything needs to be considered a risk - until it’s obviously not. “Low-code platform vendors must not shy away from proving that this is the basis of their products, or better. Though we used to put a lot of trust in professional coders, with only some, if any, verification, the future of low-code apps requires us to instill verification of all personas across the low-code model,” concludes Carden.

For all its automation and point-and-click, drag-and-drop efficiency, low-code software application development appears to throw up questions related to responsibility, process compliance and data governance unless it is handled, executed and above all managed diligently. Although we asked at the outset whether automation itself puts too much responsibility in the hands of the business function to innovate and solve problems, opinion on that whole predicament is currently comparatively scant. Although this issue may surface in the weeks and months to come, the industry is still somewhat more preoccupied with agreeing on why low-code has come to the fore, who we empower with it, when it should be used and what its key deliverables should look like in any given successful deployment.

Buzzphrase Bonanza

If we have successfully been able to talk about both sides of the argument for automation, then let’s hope some of the key control caveats tabled here become functional software tools and procedures, rather than some software buzzphrase bonanza that we might liken to the political rhetoric we hear every day on television.

We’ve heard about ‘AI assistance layers’ that will be used to corral and control how automation should be used diligently. We’ve heard about ‘boundaries of governance’ to control where and when automation can be applied. We’ve heard about the need for an ‘authority model’ as a means of providing decision-making power over how automation can be accelerated. Plus of course, we’ve heard about policy control throughout and the use of software-defined as-code platform tools to make this achievable and more closely aligned with the digital DNA of cloud computing composability. With all these insurance layers built into the fabric of automation - and with an inherent understanding that it will always still be a case of garbage in equals garbage out (yes, we know data quality for both apps and MLOps for AI is of paramount importance) - we may just be safe enough to crank the automation overdrive up to 11 and see where it takes us.
