Prisma sponsored this podcast.
From cloud security providers to open source, trust has become the foundation on which an organization’s security is built. But the rise of cloud native technologies such as containers and infrastructure as code (IaC) has ushered in new ways to build applications, along with requirements that challenge traditional approaches to security. The changing cloud native landscape demands broader security coverage across the technology stack and more contextual awareness of the environment. But how should teams like infosec and DevOps rethink their approach to security?
In this episode of The New Stack Makers podcast, Guy Eisenkot, co-founder and vice president of product at Bridgecrew; Barak Schoster Goihman, senior director and chief architect at Palo Alto Networks; and Ashish Rajan, head of security and compliance at PageUp and producer and host of the Cloud Security Podcast, preview what’s to come at Palo Alto Networks’ Code to Cloud Summit on March 23-24, including the role of security and trust as it relates to DevOps, cloud service providers, the software supply chain, the software bill of materials (SBOM) and the infrastructure bill of materials (IBOM).
Alex Williams, founder and publisher of The New Stack, hosted this podcast.
According to Gartner, companies will deploy 95% of new digital workloads on cloud native platforms by 2025. As adoption grows, so does complexity, which makes environments harder to secure and raises questions around trust. “If you want to really sleep well at night and have trust with your engineering teams, you should find the best way to have a lot of answers to questions like: What open source packages am I using? What kind of infrastructure am I provisioning? What third-party cloud providers am I using? Very early on and every step of the way,” said Goihman.
With serious vulnerabilities like Log4j posing severe risk to many enterprises, the shared responsibility model between cloud service providers and organizations has also tested the trust line. “Companies like Amazon as well as Google Cloud had to come up with services to account for what they are exposing us to. They have a script now that consistently checks for Log4j. So now the question of trust comes in: are you okay with having that script running? And having outages is something that tests the trust boundary,” said Rajan.
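The kind of Log4j check Rajan describes is, at its simplest, a sweep for vulnerable library versions. The sketch below is a simplified, illustrative version of that idea: it flags Log4j jar files whose version string falls in the range affected by CVE-2021-44228 (2.0 through 2.14.1). Real scanners inspect jar contents rather than trusting filenames; the function and regex here are assumptions for illustration, not any provider’s actual script.

```python
# Illustrative sketch: flag log4j-core jars whose version falls in the
# range affected by CVE-2021-44228 (2.x up to and including 2.14.1).
# Filename-based matching is a simplification -- production scanners
# inspect the jar's contents.
import re
from pathlib import Path

LOG4J_JAR = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$")

def is_vulnerable_log4j(filename: str) -> bool:
    """Return True if the jar name looks like a Log4j 2.x release
    affected by CVE-2021-44228 (2.0.0 through 2.14.1)."""
    match = LOG4J_JAR.search(filename)
    if not match:
        return False
    major, minor, patch = map(int, match.groups())
    return major == 2 and (minor, patch) <= (14, 1)

def scan_for_log4j(root: str) -> list[str]:
    """Walk a directory tree and return paths of suspect jars."""
    return [str(p) for p in Path(root).rglob("*.jar")
            if is_vulnerable_log4j(p.name)]
```

A continuously running version of this, as Rajan notes, is exactly the kind of agent customers have to decide whether they trust in their environment.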
While security practices are being sharpened against new threats, there’s still a lot of work ahead to create a world-class security culture, spread awareness and instill strong cyber hygiene. “I think we must be much more thoughtful about how we treat our corporate code repositories. I think we need to do a much better job of knowing what’s in there; what those repositories are connected to; where they are streaming data from; and what environments and environment variables they’re using across the board,” said Eisenkot.
Modern applications are built from many components, some developed in-house and some off-the-shelf, which has left many rethinking cloud security. “Having all those different kinds of assets: infrastructure, code, container images, open source packages, and the workflow of delivery pipelines in the code repository, can give us the full picture of the supply chain. And we can build a software bill of materials of open source packages for infrastructure as code, which is like a runtime bill of materials that helps us to prioritize the bad code,” said Goihman.
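To make the SBOM idea concrete, here is a minimal sketch that inventories the open source packages installed in the current Python environment and emits them in a shape loosely modeled on the CycloneDX JSON format. This is an assumption-laden simplification, not the tooling Goihman’s team uses: a real SBOM generator records licenses, hashes, dependency relationships and far more metadata.

```python
# Minimal sketch: emit a software bill of materials (SBOM) for the
# current Python environment, loosely following the CycloneDX JSON
# shape. Simplified for illustration -- real SBOM tools capture much
# richer metadata (licenses, hashes, dependency graphs).
import json
from importlib import metadata

def build_sbom() -> dict:
    """Return a dict listing every installed package as a component."""
    components = [
        {"type": "library", "name": d.metadata["Name"], "version": d.version}
        for d in metadata.distributions()
    ]
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "components": sorted(components, key=lambda c: c["name"]),
    }

if __name__ == "__main__":
    print(json.dumps(build_sbom(), indent=2))
```

Answering “what open source packages am I using?” from a machine-readable inventory like this, early and at every step, is the kind of visibility the quote above is arguing for.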
As security shifts left and developers become accountable for securing code, many teams are overwhelmed and struggling to keep up with the pace of modern software development. But a complete view of where potential vulnerabilities or misconfigurations exist can help prioritize them and show where a vulnerability fits into the layers of a cloud architecture. “Developers want to produce code, but simply giving them a vulnerability is not good enough, because they don’t have context on what they’re trying to solve and why they need to solve it,” said Rajan.