Making a case for how relentless automation can save significant money for the enterprise, Oracle has extended its flagship database’s “autonomous” features to transactional processing.
The self-healing, or autonomous, capabilities of Oracle 18c, released this week, are designed to save users money: they require fewer personnel to operate and don’t consume servers and other resources when they aren’t needed, said Oracle Executive Chairman and Chief Technology Officer Larry Ellison in a presentation Tuesday on the new technology.
“We are now the easiest database in the world to use. There’s nothing to learn and nothing to do,” he said. “There is much less labor associated with running this database, so it’s much lower in cost.” The company claims the software can cut administrative costs by up to 80 percent.
The idea of “autonomous” computing has been around at least since the 1990s, appearing first in IBM’s mainframes and eServer line, noted Charles King, principal analyst at Pund-IT, in an e-mail. “Since then, self-managing/healing systems have become increasingly common and are central to cloud providers’ platforms and operational efficiency,” he noted. Oracle appears to be pushing the concept further by leveraging machine learning algorithms, developed through its extensive operational experience, to autonomously monitor, repair and tune database instances, King said.
Oracle calls its database package “autonomous” in that the software and underlying infrastructure are self-tuning, self-repairing and self-securing. Covering both the database and the underlying servers and networking components, Oracle has developed or adopted software for self-tuning, memory management, automated failover, live vulnerability monitoring and live automatic updating. The package runs on Oracle Exadata integrated servers, is managed by Real Application Clusters for parallel scale-out operations, and is backed by Active Data Guard disaster recovery software. “There are no single points of failure in the system,” Ellison said.
For the tuning element, a machine learning-based component watches over all SQL queries and indexing, looking for usage patterns that could lead to optimizations. It is also sophisticated enough to recognize regressions, cases where an update would cause a performance slowdown, and to plan mitigations for them.
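To make the regression check concrete, here is a minimal sketch of the general idea, not Oracle’s implementation: compare query latency samples taken before and after a candidate optimization, and keep the change only if it does not slow the workload down. The function name and tolerance are illustrative assumptions.

```python
# Toy illustration of regression-aware tuning: accept a candidate
# optimization (e.g. a new index or query plan) only if measured
# latency does not get worse. Not Oracle's actual algorithm.
from statistics import median

def accept_optimization(before_ms, after_ms, tolerance=0.05):
    """Return True if the new plan is no more than `tolerance` slower."""
    return median(after_ms) <= median(before_ms) * (1 + tolerance)

# A candidate change that speeds the query up is kept...
print(accept_optimization([12.0, 11.5, 12.4], [8.1, 7.9, 8.3]))    # True
# ...while one that causes a regression is rejected.
print(accept_optimization([12.0, 11.5, 12.4], [15.2, 14.8, 15.5]))  # False
```

Using the median rather than the mean keeps a single outlier measurement from triggering a spurious accept or reject.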
Oracle first launched the autonomous capabilities earlier this year, with the Oracle Autonomous Data Warehouse service, which offered self-tuning queries for data analysis. This release expands those capabilities to transactional processing, which is the bulk of all database work.
Part of the motivation appears to be reducing the amount of work needed to administer a database system. Most IT professionals and developers know very little about managing database systems, which is what led to the massive popularity of NoSQL databases such as MongoDB and Cassandra. They are very easy to use, Ellison admitted, and that ease of use is what Oracle aimed to beat.
“With an Oracle ‘autonomous’ database the customer need be concerned only with the logical design of the database and the development and maintenance of the application(s),” data management analyst firm WinterCorp elaborated in a research note. The software “aims to deliver a dramatic simplification of the role of the customer, resulting in large reductions of staffing and skill requirements and reduction in total cost to the customer, increased speed to market and other major benefits.”
The software’s self-provisioning and auto-scaling tailor the resources to the particular workload. If there is no workload, no servers will be used (Ellison called this approach “serverless cloud”). If demand spikes, the system automatically adds more resources. This approach can save run-time costs by up to 90 percent, Ellison boasted. He also claimed it can cut costs in half compared with running Amazon Web Services’ Aurora and Redshift databases.
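The scale-to-zero behavior described above can be sketched as a simple sizing policy. This is a hypothetical illustration of the concept, not Oracle’s scaling logic; the function name and capacity figure are made-up assumptions.

```python
# Illustrative "serverless" sizing policy: server count tracks demand
# and drops to zero when the workload is idle, so idle time costs nothing.
def servers_needed(requests_per_sec, capacity_per_server=1000):
    if requests_per_sec == 0:
        return 0                          # idle: no servers, no charges
    # ceiling division, so a demand spike gets extra capacity immediately
    return -(-requests_per_sec // capacity_per_server)

print(servers_needed(0))      # 0 -- "serverless" when there is no workload
print(servers_needed(2500))   # 3 -- capacity added as demand spikes
```

The key design point the article attributes to Oracle is the ability to apply such a policy while the system is running, without taking the database offline.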
“Amazon’s database can’t do that,” Ellison said. “They can’t dynamically add a server when a system is running. They can’t dynamically add network capacity. They can’t dynamically take a server away when it is not running.” With Oracle’s approach, he said, “you only pay for infrastructure when it is used.”
Speed also plays a role in the cost savings. Oracle claims its databases run five to 10 times faster than AWS’ Aurora and Redshift on the same hardware configurations. “These performance advantages translate into dramatically lower costs,” Ellison said. “If we do the same work in half the time, we are half the price.”
In his presentation, Ellison also couldn’t resist pointing out that Amazon still uses Oracle as its primary database, rather than its own in-house databases. “It’s kind of embarrassing when Amazon uses Oracle, but they want you to use Aurora and Redshift,” Ellison said.
In the realm of security, Oracle has arranged its software to be self-updating, with no downtime. This means the database system, its supporting components and the underlying operating system (Oracle Linux) can be upgraded with bug and vulnerability fixes while remaining operational, the company claims. Overall, Oracle guarantees 99.995 percent availability, or less than 2.5 minutes of downtime a month, including upgrades.
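The availability figure checks out arithmetically: 99.995 percent uptime over a 30-day month leaves roughly 2.2 minutes of downtime, consistent with the “less than 2.5 minutes” claim.

```python
# Sanity-check the availability math quoted above:
# downtime = minutes in a 30-day month * (1 - availability)
minutes_per_month = 30 * 24 * 60              # 43,200 minutes
downtime = minutes_per_month * (1 - 0.99995)  # 0.005% unavailability
print(round(downtime, 2))                     # 2.16 minutes per month
```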
Oracle offers the autonomous database either as a public cloud service or as part of its private cloud management service — it is not part of the core 18c offering, King noted. The company offers a free trial of 3,500 hours of use. Currently, the Oracle Cloud supports more than 55 billion transactions a day, across 195 countries around the world, according to the company.
The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Real.