With enterprises across the world making concerted efforts to become data driven, several important disconnects have developed along the way. For example, more than 60 percent of enterprises today expect that their employees are using data to make decisions, but only a third of employees strongly believe their actions are data driven, and even fewer trust the data, according to Improving Business Outcomes with DataOps Orchestration, a recent IDC Analyst Connection, sponsored by BMC. That’s consistent with the 2022 results of a long-running research series from New Vantage Partners, which found that 97 percent of organizations are investing in data initiatives, but only 27 percent feel they have been successful at becoming data-driven organizations.
The disappointments organizations are experiencing, particularly in artificial intelligence and machine learning (AI/ML), are occurring despite the fact that they have access to more data sources and software resources than ever before. So, what’s stopping companies from being more successful in their efforts to become data driven? It comes down to complexity and culture.
One of the complexities has to do with operationalizing data initiatives at scale. If data initiatives can’t be operationalized, then they won’t produce the expected value. To respond to this challenge, the industry is adopting DataOps as a set of practices that will industrialize the operational aspects of data initiatives. An influential and widely used approach to DataOps, which you can learn about in The DataOps Manifesto, addresses the leadership, cultural, and management principles that organizations should embrace to make analytics and related efforts successful. It speaks about the importance of data orchestration, but focuses much more on cultural than technical steps to success.
Wherever you are in your efforts to make better use of data, developing a solid DataOps program is a tangible step you can take today to make the journey easier. Organizations that are not getting the value they expected from their data initiatives should look at their processes for managing data before making any major new investments. Because of the data sources and tooling available, enterprises have incredible freedom in what they can develop to become data driven. But as IDC Research Director Stewart Bond notes in the study, “Freedom without a framework is chaos.”
DataOps provides the framework that enterprises need to control that chaos—if the DataOps program can orchestrate across all data sources, internal and external data users, and every infrastructure component, software asset, and process in between.
Orchestration as an important element of DataOps
The desire to be data driven and the need for DataOps are not new, but the complexity that organizations now face is unprecedented. Here’s an example that illustrates that point. Soon after the COVID-19 pandemic hit, Tampa General Hospital was sharing data about case counts, available ICU beds and ventilators, and other information with dozens of hospitals and other providers across its region to support a coordinated response.
The information was summarized in daily dashboards, produced from data that relied on file transfers and other extract, transform, and load (ETL) operations from multiple health systems across the state of Florida. Control-M orchestrated it all, so hospital and public health officials could make decisions based on comprehensive, up-to-date information. DataOps is the key to keeping complex environments like this functioning reliably, and orchestration is the key requirement for DataOps today.
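To make the shape of such a flow concrete, here is a minimal, hypothetical sketch of the extract, transform, and load pattern described above. The feeds, field names, and totals are invented for illustration; in a real deployment, each extract would pull files from a separate health system, and a workflow orchestrator such as Control-M would schedule the steps and their dependencies.

```python
def extract(feeds):
    """Gather raw daily reports from each source system into one list."""
    return [record for feed in feeds for record in feed]

def transform(records):
    """Aggregate per-facility reports into region-wide totals."""
    totals = {"cases": 0, "icu_beds_available": 0, "ventilators_available": 0}
    for record in records:
        for key in totals:
            totals[key] += record[key]
    return totals

def load(summary):
    """Hand the summary to the dashboard layer (here, just return it)."""
    return summary

# Hypothetical daily feeds from two health systems.
feed_a = [{"cases": 120, "icu_beds_available": 8, "ventilators_available": 5}]
feed_b = [{"cases": 75, "icu_beds_available": 12, "ventilators_available": 9}]

# One daily pipeline run: extract -> transform -> load.
dashboard = load(transform(extract([feed_a, feed_b])))
print(dashboard)  # region-wide totals for the daily dashboard
```

The point of the sketch is that each step is simple in isolation; the operational challenge is running many such pipelines reliably, on schedule, across dozens of source systems.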
Complexity and the need for orchestration are common to businesses, even if they don’t operate at enterprise scale or have inter-enterprise complexity. According to the IDC Analyst Connection, two-thirds of organizations are already using at least ten different data engineering and intelligence tools. Figure 1 below shows some of the leading components of a typical data pipeline. These tools, and the applications that depend on them, are rarely centralized and instead are often spread across multicloud and on-premises infrastructure.
While these tools can be highly functional, they will have limited value in the real world if they can’t effectively work together. The inability to automate across processes demonstrates how complexity can limit the value of data programs. That is why the ability to orchestrate—in addition to automate—is so important now. DataOps addresses this orchestration, along with the human elements related to sharing and collaborating across enterprise functions.
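The distinction between automating a task and orchestrating across tasks can be sketched with a toy dependency graph. Each task below stands in for an automated job in a different tool; the orchestrator's role is deciding when each may run. The task names are hypothetical, and the example uses Python's standard-library topological sorter purely to illustrate the idea.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on. Individually, every task
# is automated; orchestration is the layer that sequences them across tools.
pipeline = {
    "ingest_sales":   set(),                             # e.g., a file transfer
    "ingest_weblogs": set(),                             # e.g., a streaming ingest
    "clean":          {"ingest_sales", "ingest_weblogs"},  # data engineering tool
    "train_model":    {"clean"},                         # ML platform
    "refresh_dash":   {"clean"},                         # BI/dashboard tool
}

# A valid execution order that respects every cross-tool dependency.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

A real orchestrator adds scheduling, retries, SLAs, and monitoring on top of this ordering, but the core job, running the right automated task at the right time based on what finished before it, is the same.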
Some organizations have achieved success by executing completely in the cloud, using Control-M to orchestrate and automate all their data ingestion, analysis, and automated remediation processes. Control-M has also evolved to meet today’s DataOps workflow orchestration needs and can help organizations control potential chaos by injecting automation and orchestration into DataOps, too. As IDC points out, “Data logistics is experiencing a renaissance. Many of the capabilities provided by legacy data management and automation solutions that provided control and governance are being refactored, reimagined, and modernized to accelerate work in the modern data environment and to help rein in the chaos.”
Get more of IDC’s research-backed perspective on how DataOps and orchestration add value to data initiatives in the full IDC Analyst Connection, Improving Business Outcomes with DataOps Orchestration (doc #US49015622, April 2022), and click here to learn more about how BMC is helping companies become a Data-Driven Business.
These postings are my own and do not necessarily represent BMC’s position, strategies, or opinion.
See an error or have a suggestion? Please let us know by emailing firstname.lastname@example.org.