In a recent panel at the Southern California Virtual Cybersecurity Summit, Capsule8 Security Strategist Jason Madey joined moderator Merritt Baer, Principal Security Architect for Amazon Web Services, and fellow panelists Jonathan Knudsen of Synopsys and Carlo Beronio of Attivo Networks to discuss DevOps security and its relationship with the cloud.

The discussion gave leading experts in the field an opportunity to address an ongoing challenge: security trying to keep pace with the rapid feature releases and bug fixes made possible by DevOps. Madey and his fellow panelists discussed what companies need to do to ensure security doesn’t get left behind while bug fixes and feature rollouts stay on schedule. How can DevSecOps become a realistic component of the modern enterprise?

The Old Tropes – Security as a Blocker

To start, panelists discussed the old tropes of DevOps and security – specifically that security is a blocker for development and innovation, acting as a gatekeeper for progress. Jonathan Knudsen spoke to the reality of the situation and how many companies are starting to move beyond this 1.0 view of application security. In the old model, dev teams would make a product and throw it over the wall, where security teams were tasked with catching any and all issues. Often, when the security team identified a security issue, it was too late – the product team was almost done with the process. This conflict has defined application security’s narrative for years, but it’s not necessarily the reality any longer. As Knudsen states, “what we’re seeing now is the transition to 2.0, in which application security integrates with the dev teams, becoming a part of the development cycle. So we talk about DevOps, but what we really mean now is ‘DevSecOps.’”

Of course, this is easier said than done. As Carlo Beronio notes, we’re in the midst of a substantial transformation as many companies move their terrestrial networks to the cloud. The perimeter is still there in some sense, but “there’s a new methodology of understanding how to apply security to these transitioned environments, and the ability to actually leverage your existing toolsets and morph them into those environments becomes critical.”

So what do the changes to security controls look like as many organizations move into the cloud?

Jason Madey discussed how we’re often talking about Linux in these situations and how many organizations are making “significant shifts from the way we did things on-premise to new management consoles and new ways that we’re building and packaging and delivering our applications.” Traditional systems are end-user-centric, not necessarily workload-centric or container-centric, so it’s become vital to find and implement solutions that help gain visibility into these new cloud delivery environments.

 

Evaluating Vendors for Security Controls versus the Alternatives

Because the perimeter is dead, as Beronio notes, and traditional tools don’t offer the visibility needed, how does an organization evaluate vendors for cloud-native protection?

Madey notes that the long-standing model has started to change. In the past, security tools needed to be best-of-breed, and a siloed approach eventually morphed into a single platform working across all systems. Now, however, many organizations operate in unique environments that each serve a unique purpose. “We need to recognize that environments are all completely different, and using one solution across all of them is simply not effective anymore. We need to be more specific, and of course, that’s going to come with plenty of research and market analysis, but we need to get away from having a one-size-fits-all solution and identify tooling that is born and bred for each specific environment.”

For this to work, however, it needs to be implemented correctly. Knudsen notes that “It has to be automated because you don’t ever want to be in this situation where you’re waiting around for a security engineer to push the button to run some tests, and it has to be integrated so that the results that you’re getting out of security testing are actually being fed back into the issue tracker or whatever other processes you’re already using.” The vague outlines for development are consistent across many companies, but the specifics will be unique. Tools need to be flexible enough to adapt to the different styles of development in use.
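The pattern Knudsen describes, security tests that run automatically and feed their results back into the issue tracker rather than waiting on a human, can be sketched as a small pipeline step. The scanner output schema and issue payload below are hypothetical, not any specific tool’s API; a real pipeline would POST the payloads to whatever tracker the team already uses.

```python
def findings_to_issues(findings, min_severity="medium"):
    """Turn raw scanner findings into issue-tracker payloads,
    filtering out anything below the team's triage threshold."""
    rank = {"low": 0, "medium": 1, "high": 2, "critical": 3}
    threshold = rank[min_severity]
    issues = []
    for f in findings:
        if rank.get(f["severity"], 0) < threshold:
            continue  # too minor to file automatically
        issues.append({
            "title": f"[security] {f['rule']} in {f['file']}",
            "labels": ["security", f["severity"]],
            "body": f.get("detail", ""),
        })
    return issues

# Example findings as a CI step might emit them (illustrative schema).
findings = [
    {"rule": "hardcoded-secret", "file": "app/config.py",
     "severity": "high", "detail": "API key committed to source"},
    {"rule": "todo-comment", "file": "app/main.py", "severity": "low"},
]
issues = findings_to_issues(findings)
# A real pipeline would now POST each payload to the issue tracker.
```

The point is the shape, not the code: the scan runs on every build, the filter keeps noise out of the tracker, and developers see security findings in the same queue as every other bug.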

Going beyond this, Beronio notes that “it’s not just best-of-breed but ‘what’s integrating with my traditional workflow?’ How do we integrate a tool that can feed into specific environments, because it’s vital that security understands how dev teams are being compromised and that they understand where data is being placed.” The two most important questions on the security side are “how are users getting compromised?” and “how are attackers using compromised users or credentials to access the rest of the network?”

 

Measuring Performance to Drive Improvement

 

When looking at a truly integrated cloud-native model where companies can obtain economies of scale, what influence does that have on security?

Madey touched on several key points. In a lift-and-shift approach from traditional on-premises infrastructure to cloud environments, little changes. “I’m going to stand up those servers and run those applications, just instead of my closet, it’s going to be Amazon’s closet.” But when companies start to evaluate “truly building, creating and delivering applications from a Cloud-Native perspective, they must also start looking at how to intelligently build out containers, build into modern CI/CD pipelines, adopt cloud-native technologies and make leaner, more performant, and scalable applications.”

Containers are a major example. They allow companies to run applications leaner, scale them faster, ensure less downtime, and positively impact the bottom line. However, they are also newer, and therefore a bigger target for attackers, and traditional tools don’t offer the same level of visibility into them as they do for other environments.

 

Regulatory Considerations for Cloud-Native Environments

A big point of contention for many companies when considering cloud adoption is the regulatory piece. As Madey points out, it’s vital that vendors are transparent, running a clean operation, and that they are consistently dealing with the basic configuration and vulnerability tasks needed to keep your data safe. “We need a level of trust between us and our vendors to continue building and developing and pushing out software in the manner that we are.”

Baer emphasizes this: “Show me an industry that isn’t regulated in some sense or that doesn’t have to interact with regulated entities. We’re all impacted by compliance considerations.” But at the same time, when moving to the cloud, the bottom layers of the stack have now been outsourced to those providers. That means less overhead to maintain audit and compliance documentation for on-premise equipment.

When asked about risk frameworks, Beronio highlighted ISO 27000, NIST, and the MITRE ATT&CK framework, which allow incoming issues to be mapped to the appropriate individuals to handle them. Notably, MITRE has created a framework specifically for Linux and is working on one specifically for containers, offering a more tailored fit for organizations regardless of the environments they run.
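The mapping Beronio describes, tying a detection to a framework entry so it lands with the right owner, can be sketched in a few lines. The ATT&CK technique IDs below are real entries from the Enterprise matrix, but the alert names and the routing table are illustrative assumptions, not any vendor’s taxonomy.

```python
# Alert name -> ATT&CK technique (real IDs from the Enterprise matrix).
TECHNIQUE_MAP = {
    "suspicious_shell": "T1059",    # Command and Scripting Interpreter
    "stolen_credentials": "T1078",  # Valid Accounts
    "web_exploit": "T1190",         # Exploit Public-Facing Application
}

# Technique -> team that owns triage (hypothetical org structure).
ROUTING = {
    "T1059": "host-security",
    "T1078": "identity-team",
    "T1190": "appsec",
}

def route_alert(alert_name):
    """Map an alert to its ATT&CK technique, then to the owning team;
    anything unmapped falls back to a general triage queue."""
    technique = TECHNIQUE_MAP.get(alert_name)
    owner = ROUTING.get(technique, "soc-triage")
    return technique, owner

print(route_alert("stolen_credentials"))  # ('T1078', 'identity-team')
```

The value of a shared framework is exactly this indirection: detections from different tools converge on one vocabulary, so ownership and coverage can be reasoned about in one place.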

 

The Goal of Successful DevOps and Security Integration

Many elements can improve communication between DevOps and security, helping build a better, more responsive cloud-native environment for your organization. Culture is a significant part of this. As Knudsen notes, it’s not only about “finding the most knowledgeable engineers. This is important, but so too is hiring people who communicate and will work closely with your DevOps teams, discussing security in a way they will understand and helping them integrate processes in a way that works without slowing down development.” Leadership matters just as much. Security is traditionally seen as a blocker, but when championed from the top down, it can be fully integrated with DevOps, becoming an organizational priority where everyone is on the same page. It’s about mindset as much as process.

 

Capsule8 is one of Data Connectors’ key partners. Learn more about the company and the services they provide. Want to submit a guest blog post? Contact us.

 
