Add It Up: Why Salesforce and Google Bought Tableau and Looker
Salesforce is buying Tableau, and Google Cloud is adding Looker to its family. After reviewing a panoply of studies about business intelligence (BI), analytics and data warehousing, we believe both acquisitions add best-in-class functionality that addresses the needs of business analysts and IT departments alike, and give each buyer a leadership position in high-growth markets that are only tangentially related to artificial intelligence and machine learning.
Twenty-eight percent of IT professionals believe that over the next five years, more sophisticated data integration capabilities will be the focus of business intelligence and analytics software development, according to SharesPost's October 2018 survey. This was a dramatic rise from the 10% who said so in the 2017 survey. Expectations also increased about the attention that will be paid to data visualization capabilities. The notable losers are machine learning integration and predictive analytics. This does not mean that the use of ML and predictive analytics has dropped: the same study found that the use of predictive analytics in BI tools rose from 41% to 67% over the same period. Instead, developers may believe that the predictive power of new technologies will matter less than actually managing and visually exploring the data being analyzed.
The Looker purchase addresses data integration pain points and complements Google Cloud's BigQuery and other data services. As Hyoun Park of Amalgam Insights explains, Looker can support a common data model at scale across hybrid and multicloud environments, and can perform data transformation at the time of query rather than through managed ETL (Extract, Transform and Load) jobs.
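To make the distinction concrete: in the ETL pattern, raw data is transformed up front into a second, materialized table; in the query-time pattern, the transformation is expressed inside the query itself, so results always reflect the latest raw data and business logic. A minimal sketch of the two patterns, using Python's built-in sqlite3 as a stand-in for a warehouse like BigQuery (the table and column names here are hypothetical, and this is not Looker's actual API):

```python
import sqlite3

# Hypothetical raw orders table standing in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, country TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1250, "us"), (2, 900, "de"), (3, 4300, "us")])

# ETL pattern: transform up front into a separate, materialized table.
# This pipeline must re-run whenever the raw data or the logic changes.
conn.execute("""CREATE TABLE orders_clean AS
                SELECT id,
                       amount_cents / 100.0 AS amount_usd,
                       UPPER(country)       AS country
                FROM orders""")

# Query-time pattern: the transformation lives in the query itself,
# so no separate pipeline run is needed to pick up new rows or new logic.
rows = conn.execute("""SELECT UPPER(country)            AS country,
                              SUM(amount_cents) / 100.0 AS revenue_usd
                       FROM orders
                       GROUP BY UPPER(country)
                       ORDER BY country""").fetchall()
print(rows)  # [('DE', 9.0), ('US', 55.5)]
```

Looker's modeling layer takes the second approach at enterprise scale, compiling centrally defined metrics into SQL that runs against the warehouse at query time.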
Salesforce had already addressed the data integration pain point with its purchase of MuleSoft. Tableau, meanwhile, has strong capabilities for self-service, data-driven discovery, and it is a market leader in data visualization, particularly among large enterprise customers. Tableau usage in the SharesPost study almost doubled, going from 18% to 33% compared to the previous year. Other studies show a less rapid increase but indicate Tableau now rivals Microsoft, SAP and offerings from legacy BI/analytics companies.
TDWI’s Fall 2018 survey of business and IT professionals found dissatisfaction with integration, predictive analytics and advanced visualization functionality. Almost half (48%) were at least somewhat unsatisfied with their BI/analytics tools’ ability to do predictive analytics and forecasting, and the same percentage was concerned about the ability to mash up and join different data sources. Users are generally satisfied that they can create and edit basic visualizations, but are less satisfied with the ability to embed interactive graphics and to visually map and transform data. The concern about predictive analytics likely reflects broader worries about the data pipeline and preparation work needed to make analytics projects successful.
Developers and IT departments complain bitterly when these systems are purchased by a business unit but then require regular upkeep by technical staff. BI software is inevitably limited by the need to manually code queries and scripts, yet tools like Tableau and Looker make it easier for business users to handle customizations on a self-serve basis. In this regard, BI and analytics tools have long been at the forefront of low-code and citizen development.
Vendors and Open Source
The trade-off between simplicity for business users and the ability to code customized functionality is common among users of tools that monitor applications and IT systems. Nagios, Zabbix, Grafana and Kibana are all open source projects that developers use to create monitoring solutions, usually with a dashboard as the front end. Companies have sprouted up to monetize these projects, but so have vendors that use some of these components in a larger product offering that is both easy to operate and easy to scale within an organization.
Open source technologies have been very successful in areas related to data management and are central to the stream processing stack. Joseph Jacks, founder and general partner of OSS Capital, believes the next major transformative wave of platforms at the BI application layer will come from commercial open source companies like Redash and Metabase. Apache Superset is another widely used project; it is offered as a managed service by PushMetrics and is explicitly marketed as an open source alternative to Tableau, Looker and Power BI. While we cannot predict the future of “open core” as a business model, we are relatively confident that open source technology will be incorporated into many vendor offerings for years to come.
Additional studies and resources worth reviewing:

- Business Application Research Center’s “Top Business Intelligence Trends 2019”: A survey of 2,679 BI professionals. The study is conducted annually, with a majority of its participants living in Europe. In-depth vendor ratings are only available to paying customers, but a wide range of data is accessible.
- Dresner Wisdom of Crowds Business Intelligence 2019 Report: A long-running, well-respected study. It has detailed macro questions as well as customer ratings of specific vendors.
- Eckerson Group: The research company’s blog is an excellent source of analysis on business analytics, data management and data science.
Feature Image by bernswaelz from Pixabay.