What is data agility?

Your business — every business — runs on data. It is an organization’s most valuable asset, and yet, too often, it fails to live up to its value. The potential of data is virtually limitless, but the practical reality is far less rosy. In truth, data integration specialists and data engineers are unable to meet the increasingly urgent and complex demands for data within the organization and from partners.

This is partially due to the fragmented nature of the typical data infrastructure. The average organization pulls from more than 400 data sources, and 41% of support teams report being slowed down by data silos. Most companies spend more time searching for and preparing data than actually using it, leaving people across the organization struggling to obtain even the most basic data.

The issue isn’t that companies don’t have enough data, quality data, or even the right data — most of the time, they have all of these things. What they don’t have is data agility.

Data agility definition

Data agility is the speed and flexibility to satisfy the data demands of a business quickly, reliably, and at scale, regardless of the underlying data infrastructure (e.g., hybrid, multi-cloud).

Agility is a core component of overall data health — that is, how well an organization's data supports its business objectives. Data can be considered healthy when it is easily discoverable, understandable, and of value to the people that need to use it, and these characteristics are sustained throughout the data lifecycle.

To support true data agility, an organization must have a flexible, scalable ecosystem with end-to-end data management. Only then will they be able to meet the changing demands of the business.

The risks of fragmented data

Data and IT teams that focus on the mechanics of moving data without regard to data health find themselves stuck with a data landfill — a confusing dumping ground of slow and siloed data with brittle data pipelines and no ability to scale efficiently. This has only been exacerbated in recent years with the rise of cloud data warehouses.

Data integration and data science professionals feel this pain most acutely. They are overwhelmed by the demand for data and frustrated with the pace of data delivery. Existing solutions based on hand-coding or a combination of point solutions are time-consuming to build, rely on institutional knowledge to maintain, and must be reinvented for every new integration and project. Requests pile up, putting at risk data delivery, data reliability, and the ability to scale.

Data and IT leaders, such as the chief information officer (CIO) or chief data officer (CDO), also struggle because rigid, brittle, or siloed data flows prevent an organization from achieving the full benefits of any digital transformation efforts. Without a clear, holistic view of organization-wide data, central IT leadership is unable to meaningfully enact organization-wide data policies. Slow and siloed data also impedes regulatory compliance which puts the organization at risk.

Meanwhile, line-of-business (LOB) data experts aren’t getting the raw data they need for informed, timely decision-making. Slow and siloed data holds back the potential of data analytics, putting up barriers that blow deadlines, raise project costs, prevent holistic analysis, or duplicate efforts across lines of business. Technical skills gaps also prevent LOB data experts from maximizing the value of their data quickly.

By putting an emphasis on data agility, data engineering teams gain the speed and flexibility to keep up with the data demands of the business and scale their operations without worrying about continuity or compatibility.

Data agility in action: Customer stories

AB InBev — Data agility never tasted so good

Global beverage and brewing company AB InBev has a diverse portfolio of well over 500 beer brands, including Budweiser, Corona, Stella Artois, Beck’s, Hoegaarden, and Leffe. Since many of the breweries behind these brands operate as independent entities with their own internal systems, integrating the systems and data from acquired companies was a major challenge.

“Our internal customers — data scientists, operations teams, and business teams — were struggling to pull together data from over 100 source systems, analyze it, and make timely decisions on product development, supply chains, marketing campaigns, and more,” explains Harinder Singh, Global Director of Data Strategy & Solution Architecture at AB InBev.

All data management work would have to be done under the AB InBev umbrella. With Talend, the company was able to extract data from over 100 data sources — real-time and batch, cloud and on-premises. Now, internal users spend only about 30 percent of their time gathering data and can spend 70 percent analyzing it.

Data helps the company understand drinker tastes, such as analyzing new consumer demand for low-calorie beers or determining beer preferences by season. Data also helps improve store and bar experiences, supply chain optimization, product development, and more.

UNOS — Saving lives with data integration

In the US, the United Network for Organ Sharing (UNOS) is the private, non-profit organization that manages the nation’s organ transplant system under contract with the federal government. In doing so, UNOS brings together hundreds of transplant and organ procurement professionals and thousands of volunteers.

“We recognized we needed an enterprise tool that could integrate all our different technologies into one pipeline and eliminate the hand-coding,” says Jamie Sutphin, Big Data Services Architect at UNOS. “Using Talend has enabled us to automate the process of integrating systems and processing data as well as reduce the time required for this essential task from 18 hours down to three or four hours.”

It took UNOS only two and a half months to go live with the Organ Offer Report functionality. Transplant centers accessing the report can now see biological data about a specific organ, such as blood type and antigens, as well as the history of what they’ve transplanted over the last three months.

“We can be very productive with a small development team and be very efficient in terms of organizing and building code,” says Sutphin. Summing up the experience UNOS has had with Talend, Sutphin says, “We are very satisfied with Talend — it’s doing exactly what we want. Talend is solving problems we couldn’t solve before deploying it, and integrating systems we couldn’t beforehand. In fact, we have difficulty seeing where we couldn’t apply Talend to a use case because it’s so versatile.”

How can you improve your data agility?

Talend’s low-code, cloud-independent data platform removes the financial and technical barriers to managing data from end to end. With Talend, you can speed up every aspect of the data lifecycle across any environment without the need for hand coding.

Data APIs and API services facilitate fast, secure data sharing both internally and with partners, reducing the load on data integrators and engineers. And Talend’s Pipeline Designer and Stitch enable rapid deployment of robust data flows with less time and technical expertise.
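To make the idea of a data API concrete, here is a minimal sketch of a read-only service that exposes a curated dataset to internal consumers or partners instead of handing out direct access to source systems. The framework (FastAPI), the endpoint, and the sample records are illustrative assumptions, not Talend's own API.

```python
# Minimal sketch of a read-only "data API" (illustrative only; not Talend's API).
# It exposes a curated dataset over HTTP so consumers never touch the source systems.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Customer Data API (illustrative)")

# Stand-in for a governed, curated dataset; in practice this would be a query
# against a warehouse, lake, or pipeline output.
CUSTOMERS = {
    "c-1001": {"name": "Acme Corp", "segment": "Enterprise", "country": "US"},
    "c-1002": {"name": "Globex", "segment": "Mid-market", "country": "DE"},
}

@app.get("/customers/{customer_id}")
def get_customer(customer_id: str) -> dict:
    """Return a single curated customer record, or a 404 if it is unknown."""
    record = CUSTOMERS.get(customer_id)
    if record is None:
        raise HTTPException(status_code=404, detail="Customer not found")
    return record
```

Run the sketch with a standard ASGI server such as uvicorn and a consumer can fetch GET /customers/c-1001 without ever needing credentials to the underlying sources, which is one small way an API layer lightens the load on data integrators and engineers.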

Sign up for a free trial today to see what Talend could do for your organization's data agility.

Ready to get started with Talend?