Gartner Magic Quadrant for Data Integration Tools 2017: The Data Integration Market is Being Disrupted
If you didn’t catch it, the 2017 Gartner Magic Quadrant for Data Integration Tools was published in early August. What was most notable about this year’s report is that many of the legacy vendors in the Leaders’ Quadrant moved down and back, causing the entire competitive landscape to shift.
The definition of a leader is changing, and that caused some significant moves in the Leaders’ Quadrant. Over the last few years, we’ve seen Gartner add and increase the importance of capabilities like the ones below:
- Delivering a combination of data residing on-premises and SaaS applications or other cloud-based data stores and services, to fulfill requirements such as cloud service integration.
- Populating and managing data in a data lake where data is continuously collected and stored in a semantically consistent manner.
- Connecting to, and enabling the delivery of data to and access of data from, big data systems such as Hadoop.
- Analyzing and integrating data during "business moments" (i.e., acting in real time).
- Making metadata management capabilities the center of all integration approaches.
While Informatica and IBM remained in leadership positions, each company took a significant step down and back. We believe this is because older technology platforms are having a harder and harder time keeping pace with new customer needs and available technologies, including big data, machine learning, and real-time and streaming data use cases. Gartner's Magic Quadrant for Data Integration Tools highlights cautions around both companies for pricing, complexity and high total cost of ownership. Similarly, Oracle and SAP both dropped down noticeably on the Ability to Execute axis. Gartner said both are overly focused on their own stacks, with increasingly complex products and pricing/perceived-value issues. Both were also called out for extensive migration and upgrade issues. For SAP specifically, the report adds that it has limited support, services and trained resources in the market.
Talend was the only leader that substantially improved its position on the Ability to Execute axis. This noticeable move underscores how changing market dynamics and customer requirements are impacting vendors and raising the bar for data integration solutions. The historical leaders are losing ground when it comes to new data scenarios, like the cloud, big data and self-service. We believe these new data scenarios are the future of the data integration market. This aligns with Talend’s “Wayne Gretzky” strategy of skating to where the puck will be by becoming a leader in the areas that are most important to the future of the market.
A Key Trend Gartner Didn’t Highlight
Interestingly, the Gartner Magic Quadrant for Data Integration Tools report did not say much about the importance of machine learning. In my view, the use of machine learning in data integration is one of the most important drivers of change within the data integration market. At Talend, we see machine learning being applied in two ways. First, some vendors are using machine learning to make their products better. For example, at Talend, we are using machine learning to improve our data quality components, especially when running at scale on Spark and Hadoop. Those same models also improve continuously because they can monitor decisions made by people in our data stewardship application, so our data quality components get more intelligent over time. The second application is using machine learning to build smart data integration pipelines. A good example of this second scenario is shown in our Talend Big Data Sandbox, where we highlight how to make next-best-offer recommendations based on web click-stream data. We believe that machine learning is already a critical component of data integration and that it will only become more prominent over the next several years.
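To make the next-best-offer idea concrete, here is a minimal sketch (not the Talend Sandbox implementation, and using invented toy data) of one common approach: counting which items are viewed together in the same click-stream session and recommending the most frequent co-views.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Toy click-stream sessions: each list is the sequence of product pages
# one visitor viewed (hypothetical data for illustration only).
sessions = [
    ["laptop", "mouse", "keyboard"],
    ["laptop", "mouse"],
    ["phone", "case", "charger"],
    ["laptop", "keyboard"],
    ["phone", "charger"],
]

# Count how often each pair of items appears in the same session.
co_views = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(set(session), 2):
        co_views[a][b] += 1
        co_views[b][a] += 1

def next_best_offer(item, k=2):
    """Return the k items most often co-viewed with `item`."""
    return [other for other, _ in co_views[item].most_common(k)]

print(next_best_offer("phone"))
```

A production pipeline would compute the same kind of counts over billions of events with Spark and feed a richer model, but the core idea is the same: learn offer affinities from observed behavior rather than hand-coding rules.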
The Future of Data Integration: Bigger Things Lie Ahead
At Talend, we believe that the data integration market is on the brink of being completely reinvented. Historically, data integration has been about moving data from Point A to Point B. That world is becoming increasingly less relevant. In today’s world, there is so much innovation in data platforms, real-time data processing, cloud technologies and machine learning that the challenge companies will face in the future is “How can we continuously and rapidly adopt new technologies?” Once you start to tackle that problem, what becomes obvious is that customers will need to continuously port their existing development work from one data processing platform to another. That is the key to the future of data integration: building smart data pipelines that can easily move from one platform to another, which will help customers ‘future-proof’ their investments within the rapidly evolving world of big data and cloud technologies. As you look to partner with a data integration vendor on the next phase of your digital transformation journey, we recommend you keep the following questions and concepts in mind:
- Think through your future roadmap: will it eventually go multi-cloud? How will the uses for big data change at your company over time?
- Does the vendor you’re considering take a unified platform approach or will you need to buy several different products to support these new, emerging use cases?
- Will the product you buy today also support your business and IT needs of tomorrow or will you need to buy additional products that may introduce complexity, cost and the need for new skills?
- Are you going to have to rebuild everything you’ve already created when the next wave of big data and cloud technology comes along?
As the pace of change in big data and cloud technologies intensifies, staying ahead of the digital transformation curve will require an agile data integration platform that allows customers to plug emerging technologies in and out as they arise.