Security Considerations When Moving to Database as a Service

Attracted by a range of benefits, including ease of management and predictable costs, companies are increasingly shifting mission-critical databases to the cloud. And while some may express concerns about the security of Database as a Service (DBaaS), a managed cloud deployment can bring security benefits as well.
ScyllaDB vice president Tzach Livyatan told The New Stack that for his company, as for any good DBaaS vendor, security is a top priority. Among the key security concerns regarding DBaaS, he said, is the threat of a cross-customer breach — which he strives to avert using tools like strict network access enforcement, encryption in transit, and encryption at rest.
Human error is also a concern, not only in terms of customers using weak passwords or forgetting to use encryption, but also regarding internal mistakes or misconfigurations. “We identify such bugs early, before they make it to production, using code review, automatic testing, and integration testing,” Livyatan said. “We also assume mistakes might happen on production and develop tools to identify and fix such issues ASAP.”
A rogue or compromised employee can also present a threat. To counter it, Livyatan said, it’s crucial to minimize the “blast radius”: the amount of damage an insider breach would be able to cause. “We must ensure that any employee’s power is limited to minimal necessity and quickly find and block anyone who might be compromised,” he said.
Assessing a Provider’s Security
Livyatan said his company uses a range of tools and methods to improve security both in development and in deployment, including static code analysis, edge machine protection, and port and vulnerability scanning, as well as third-party intrusion detection and prevention systems (IDS/IPS), security information and event management (SIEM), single sign-on, and penetration testing. The company works with external security experts and leverages Panorays to evaluate suppliers for security.
Still, Livyatan said, for any potential customer evaluating a DBaaS provider, it can be challenging to assess the security of a service from the outside. Certifications like SOC 2, ISO 27001, ISO 27017 and ISO 27018 are a good start, he said, as is an evaluation by a third-party service like Panorays.
Another way to get a sense of a provider’s security, Livyatan suggested, is to ask for their latest security event report. “An established, seasoned service will have [had] some security event in the last few years,” he said. “Reading the report can tell a lot about the organization’s transparency and security approach. A service with zero security events is actually a red flag for me.”
Once you’re using the service, Livyatan said that RTFM (read the f’ing manual) is a good start. “Every service, ScyllaDB included, has a list of security recommendations. Read and follow them,” he said.
The principle of least privilege is also key — minimize the number of users who can access the system, and limit the privileges of each user and application to the bare minimum required. “Do a routine check on these privileges, making sure they do not grow with time, as they tend to,” Livyatan said.
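That routine check can be as simple as diffing each role’s current grants against an approved baseline and flagging anything that has grown. The sketch below uses hypothetical role names and privilege sets; how you fetch the current grants depends on your database:

```python
# Approved baseline: the minimal privileges each role is supposed to hold.
# Role names and privileges here are hypothetical examples.
BASELINE = {
    "reporting_app": {"SELECT"},
    "ingest_app": {"SELECT", "MODIFY"},
}

def audit_privileges(current: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return the privileges each role holds beyond its approved baseline."""
    excess = {}
    for role, grants in current.items():
        extra = grants - BASELINE.get(role, set())
        if extra:
            excess[role] = extra
    return excess

# Example: ingest_app has quietly acquired AUTHORIZE since the baseline was set.
current_grants = {
    "reporting_app": {"SELECT"},
    "ingest_app": {"SELECT", "MODIFY", "AUTHORIZE"},
}
print(audit_privileges(current_grants))  # {'ingest_app': {'AUTHORIZE'}}
```

Running a check like this on a schedule catches the slow accumulation of privileges Livyatan warns about before it becomes an attack surface.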
Benefits and Risks of Automation
Dan Neault, senior vice president and general manager for data security at Imperva, said it often makes sense to count on automation to handle a lot of the basics of database management. “Certain of the undifferentiated, generic work that doesn’t require perfection and high judgment can be done programmatically with computers — the backup, the patching, certain levels of work,” he said.
Automation also means fewer people touching the systems, reducing the likelihood of human error. “There are enterprise customers out there that say they can’t operate it as securely as the hyperscalers, and I think they’re right,” Neault said.
On the other hand, you do lose one thing that’s central to security when you rely on automation and cloud services. “I’ve had many, many customers over the decades say, ‘The way I found my security issue was I bumped into it when I wasn’t even looking for it,’” Neault said. “You don’t really bump into things in the cloud quite like that — you have to be looking for it.”
So there is a risk, he said, in moving too far away from the older model of database management. You still need someone to keep an eye on things, but while in the past you’d likely have a DBA monitoring your database in detail, a cloud deployment takes away some, but not all, of that requirement. “You need to have somebody who has the judgment of what a DBA used to look for — but in the world of cloud, there’s only a subset of that that matters,” he said.
You can rely on a cloud provider to operate their infrastructure with high SLAs, and to provide some level of visibility into what’s happening in their data stores. Still, Neault said, “the risk analytics, the models, the algorithms, the insights, the correlation of ‘this user from this group looked at this kind of data, in this way, at this kind of scale,’ that’s a deeply specialized area.”
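The kind of correlation Neault describes ultimately reduces to comparing an access event against a behavioral baseline. As a deliberately simplified sketch (real risk analytics are far richer), the following flags a read whose scale is far outside a user’s historical per-session row counts; the data and threshold are hypothetical:

```python
from statistics import mean, stdev

def is_anomalous(history: list[int], rows_read: int, sigma: float = 3.0) -> bool:
    """Flag a read whose scale sits far outside the user's historical baseline.

    `history` is the user's past per-session row counts (hypothetical data);
    `sigma` controls how many standard deviations count as anomalous.
    """
    if len(history) < 2:
        return False  # not enough history to judge
    mu, s = mean(history), stdev(history)
    # Floor the spread at 1.0 so a near-constant history doesn't flag everything.
    return rows_read > mu + sigma * max(s, 1.0)

# A user who normally reads ~100 rows per session suddenly pulls 50,000:
baseline = [90, 110, 95, 105, 100]
print(is_anomalous(baseline, 50_000))  # True
print(is_anomalous(baseline, 120))     # False
```

A production system would layer on the dimensions Neault lists — which group the user belongs to, what kind of data was touched, and in what way — but the baseline-and-deviation idea is the common core.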
Leveraging Data Security Services
Particularly if you’re leveraging a multicloud strategy, Neault said, you need a broader overview than any single cloud database vendor might offer through their own management and reporting. A data security platform like his company’s Data Security Fabric, he said, should be able to drape (pun fully intended) across all the different deployments you need to monitor. That ability to secure data regardless of where it lives becomes crucial as data estates span more clouds, data stores, and deployment models.
The right combination of automation and human expertise, Neault said, is key. Let computers do what computers do well, and let people do what they do well. “Show me the facts, then help me through the analysis — the machine learning, the algorithms, the pattern-matching, the discernment about these thousands of things — and help me make the right calls to action based on known patterns.”
No matter what you may think you’re going to need over time, Neault said, it’s vital to maximize flexibility. “You’re going to have a system that might be running relationally for 20 years, and you realize, ‘Wow, that’s not going to be my future. I’m going to bring it up relationally, and then I’m going to move it around — I’m going to start with a data lake, and after I do that, I might push that into a non-relational system,’” he said. “You have to assume that level of flexibility, because it’s going to happen.”
In the future, Neault said, it’s reasonable to expect that regulators will intensify requirements regarding DBaaS security — but with the right plan in place, that shouldn’t require massive adjustments. “You should really assume that if you design for visibility and security and change, you’re on the right path,” he said. “What do I mean by change? Multiclouds, changing data stores, moving databases around, managing them in different areas for whatever reasons.”
And then any new regulation shouldn’t be a concern. “Imagine how great it would be if everybody thought that way and said, ‘No, let’s make sure it’s all in great shape, just from a pure security perspective’ — because that’s our data,” he said. “We’re all consumers at some level and businesses at some level; it’s our data that’s out there. How secure do you want it? I want it really secure.”