
Software automation and AI in DevOps aren't the fast track to Skynet

Behind every robot lies a good human

Software automation is becoming intelligent, reaching deep into systems and even going "autonomic" to help create self-healing systems.

If we engineer too much automation into our new notion of intelligently provisioned DevOps, then are we at risk of breaking the Internet and causing global nuclear meltdown?

Maybe it’s not quite as dangerous as typing Google into Google (please don’t ever do that), but you get the point.

What is software automation anyway?

Of course software automation is not automatic software as such, but we rarely stop and define it.

In its most basic terms, software automation covers automated elements inside specific software applications, as well as smaller software agents and wider software architectures and/or systems. These are programmed to perform defined automated responses and controls based on the health and wealth of the total system’s operational flow.

It’s not a big leap from automation to DevOps – and DevOps has a good deal of automation inside it. This is why we hear about firms like Chef, an outfit that specifically labels itself an automation for DevOps player.

DevOps automation

DevOps automation in Chef’s universe comes down to intelligence including (but not restricted to) dependency management inside its Chef Delivery product.

If a dependency is a relationship between two (or more) code streams or datasets (or operations or functions) such that one cannot exist, or function correctly, without the other, then Chef Delivery knows which services and applications in the pipeline depend on one another.

It all sounds rosy and peachy, doesn’t it? Operational streams, workflows and data sets that know about each other and have the ability to apply rules to govern the higher level wellbeing of the software application development lifecycle.

With Chef’s notion of dependency management, we know that any code changes must pass local tests and that (and this is the clever bit) related downstream components or services must pass those tests too.
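
To make the gating idea concrete, here is a rough Python sketch - not Chef’s actual API, just an illustration - in which a change is promoted only if its own tests and the tests of every downstream dependant pass (the service names and the dependency graph are invented):

```python
# A minimal sketch of dependency-gated promotion (not Chef's real API).
# A change to one service is promoted only when that service's tests AND
# the tests of everything downstream of it all pass.

# Hypothetical dependency graph: service -> services that depend on it
DOWNSTREAM = {
    "auth-service": ["billing-service", "web-frontend"],
    "billing-service": ["web-frontend"],
    "web-frontend": [],
}

def run_tests(service: str) -> bool:
    """Stand-in for running a service's test suite; True means it passed."""
    print(f"running tests for {service}")
    return True  # pretend everything passes for the demo

def collect_downstream(service: str) -> list:
    """Walk the graph and gather every service downstream of the change."""
    seen, stack = [], list(DOWNSTREAM.get(service, []))
    while stack:
        svc = stack.pop()
        if svc not in seen:
            seen.append(svc)
            stack.extend(DOWNSTREAM.get(svc, []))
    return seen

def promote(changed_service: str) -> bool:
    """Promote only when the changed service and all its dependants pass."""
    to_test = [changed_service] + collect_downstream(changed_service)
    if all(run_tests(svc) for svc in to_test):
        print(f"{changed_service} promoted to production")
        return True
    print(f"{changed_service} blocked: a downstream dependant failed")
    return False

promote("auth-service")
```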

Automation robots rule

In Chef’s view, only service changes that pass are promoted, so deployments to production are always safe. Well, they’re safe if we accept that the software automation robots now ruling planet Earth are always going to perform every action with the health of the human race in mind.

A naysayer’s argument perhaps, but can we not suggest that more DevOps automation means potentially less control and more risk? As we plug more of this automation into mission critical systems (systems such as finance and air traffic control come to mind), are we handing over too much power to the robots?

Ah, no, say the DevOps automation vendors, because this is DevOps automation “intelligence”. Y’see?

Chris Conlin, senior vice president software architect at online degree organisation 2U, helped clarify the argument here, saying: “We obviously pursue automation because of its ability to create increasingly complex infrastructure at a rapid pace, but it can also carry great destructive powers - meaning the ability to quietly repeat mistakes at a faster rate than they can be identified and reversed.”

Conlin continues: “Advancement in automated acceptance testing of infrastructure should closely follow the evolution of intelligently provisioned DevOps. Running tests that might have to actually create and tear down infrastructure fast enough for tolerable continuous integration will be challenging!”

Chef, meanwhile, maintains that use of its automation for DevOps software means that irrespective of whether there are two developers or thousands of staff, teams can create and safely deploy interconnected services on diverse runtime environments including containers. See how Chef said “safely deploy” there? But is safety being discussed openly and bluntly enough in the world of DevOps automation?

Part of Chef’s answer here is a set of enhancements that now feature in the Chef Compliance product. The software has been built to enable automated management of compliance policies that are based on the broadly used Center for Internet Security (CIS) benchmarks.

Users can now implement CIS and other compliance policies as code, called compliance profiles. In addition, Chef Compliance allows users to import Windows policies from the Microsoft Security Compliance Manager. But is all this enough?
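
For a flavour of what “policy as code” means in practice, here is an illustrative Python sketch - Chef’s real compliance profiles are written for its InSpec tooling rather than in Python, and the rules below are simplified stand-ins, not actual CIS controls. Each function encodes one benchmark-style rule, and the “profile” is simply the set of rules evaluated against a host’s configuration:

```python
# Illustrative "policy as code" sketch (not Chef's profile format).
# Each rule function checks one benchmark-style setting; the profile is
# just the collection of rules evaluated against a host's configuration.
# A sample sshd_config stands in for the real file on a real host.

SAMPLE_SSHD_CONFIG = """
# /etc/ssh/sshd_config (sample)
Protocol 2
PermitRootLogin no
PasswordAuthentication yes
"""

def parse_config(text: str) -> dict:
    """Turn an sshd_config-style file into a key -> value mapping."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition(" ")
            settings[key] = value.strip()
    return settings

def rule_no_root_login(cfg: dict) -> bool:
    """CIS-style rule: remote root login over SSH should be disabled."""
    return cfg.get("PermitRootLogin", "").lower() == "no"

def rule_protocol_2(cfg: dict) -> bool:
    """CIS-style rule: only SSH protocol 2 should be permitted."""
    return cfg.get("Protocol", "2") == "2"

if __name__ == "__main__":
    cfg = parse_config(SAMPLE_SSHD_CONFIG)
    checks = [("ssh-disable-root-login", rule_no_root_login),
              ("ssh-protocol-2", rule_protocol_2)]
    for name, rule in checks:
        print(f"{name}: {'PASS' if rule(cfg) else 'FAIL'}")
```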

Are policy controls enough?

The important part here is making sure that changes to infrastructure and applications meet all requirements across the software application development lifecycle. With both Dev and Ops working concurrently (hopefully in unison), the theory is that tools like Chef Delivery will enable users to input, track and test service runtime dependencies with CIS benchmark layers at all times.

As software service complexity continues to spiral inside increasingly intricate applications moving and changing at high velocity, will versioning and testing prior to production deployment keep pace?

Chef explains that it has distilled its customers' successful workflow patterns into Chef Delivery, where service dependencies can be predefined and compliance requirements become part of the development pipeline.

But still, is it safe enough?

Ken Cheney, Chef’s vice president of business development, told us flatly: “Contrary to reports, automation will not lead to the Earth’s destruction.”

Cheney justifies his claim by pointing out that when it comes to dependency management, there is still a “very human” element – the most important element, he says.

“An individual, or team of individuals, needs to describe the dependencies via code, then use those to set the appropriate tests at each stage of the pipeline. Now, once this is done, automation takes over and the pipeline itself automates the testing and management of the pre-set dependencies,” Cheney says.
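
That division of labour - humans declare the stages, dependencies and tests up front, and the automation merely walks the pipeline and enforces them - might look something like this illustrative sketch (the stage names and checks are invented, not Chef Delivery’s actual configuration):

```python
# A rough sketch of the split Cheney describes: the pipeline definition is
# human-authored data; the automation's only job is to run it in order and
# stop at the first failure. Stage names and checks here are hypothetical.

PIPELINE = [
    # (stage name, checks a change must pass to move to the next stage)
    ("verify",     ["lint", "unit-tests"]),
    ("acceptance", ["integration-tests", "dependency-tests"]),
    ("delivered",  ["smoke-tests"]),
]

def run_check(name: str) -> bool:
    """Stand-in for executing one human-authored check."""
    print(f"  running {name}")
    return True  # pretend it passes for the demo

def run_pipeline(change_id: str) -> bool:
    """The automation's role: run every stage in order, halt on failure."""
    for stage, checks in PIPELINE:
        print(f"stage: {stage}")
        if not all(run_check(check) for check in checks):
            print(f"{change_id} halted at stage '{stage}'")
            return False
    print(f"{change_id} passed all stages")
    return True

run_pipeline("change-42")
```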

This, of course, significantly accelerates the process and accomplishes much more than a person, a team, or teams could do manually.

“It is the person or people using the platform that created the dependencies and tests, so the power ultimately rests, and will always rest, with human beings - humans who, with automation, can collaborate with tens or thousands of people much faster and more safely than without automation,” Cheney says.

So there you have it, we can safely automate as many internally complex application structure elements as we like and plug these into DevOps engines and the Earth will not explode as a result. Convinced? ®
