Analytics for the Masses: Five Things to Consider
Every day we all make thousands of decisions, many of them subconsciously or based on minimal information. And we often make them lightning fast – even when it comes to decisions at work. So clearly we often rely on intuition and gut feeling.
Our intuition may often be better than its reputation. But in professional environments, important decisions should be based on hard facts and careful analysis. Unfortunately, wishes and reality are often worlds apart: studies show that fewer than a third of the companies surveyed made their last major decisions on the basis of systematic data analysis.
Data-driven companies are well aware of the significance of their data and see it as the key foundation for making decisions. For companies like this, systematic data analysis is the true answer to staying ahead of the competition. And that brings us to the topic of “Analytics for the masses”, because in reality decisions are usually made locally, i.e. lots of people in many different departments contribute towards the decision-making process.
The good news is that more and more data is becoming available. With the amount of data growing at an explosive rate (40% each year over the next decade), data-based decision-making should be well served. However, studies show that even now, in the days of high-performance IT, the great majority of decisions are made without sufficiently strong data behind them – sometimes with devastating consequences. That means there are still entrepreneurial opportunities to be had by making better decisions based on data.
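That growth rate compounds quickly. A back-of-the-envelope calculation – assuming, as a simplification, a steady 40% per year – shows that the total data volume would multiply almost thirtyfold within a decade:

```python
# Back-of-the-envelope: compound data growth at 40% per year.
# The 40% figure comes from the text; the steady-rate assumption is ours.
def growth_factor(annual_rate: float, years: int) -> float:
    """Total volume multiplier after compounding annual_rate for `years` years."""
    return (1 + annual_rate) ** years

factor = growth_factor(0.40, 10)
print(f"Data volume multiplier after a decade: {factor:.1f}x")  # ~28.9x
```

Even if the real rate varies year to year, the order of magnitude makes the point: the raw material for data-driven decisions is not in short supply.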
So what is holding us back from making better decisions on the basis of as much of this available data as possible? Quite a lot! Here are five key areas you should pay attention to:
1. Data silos
Companies are highly interconnected organizational structures, and the amount of data they process is increasing exponentially. In spite of that, data silos are still to be found in many companies, so areas of business which really belong together are still viewed separately. Isolated data pools emerge in different specialized departments, covering aspects which may seem to have little to do with each other. Relations between data are often not established, and the variety of structures and inconsistent master data make it difficult to get a coherent overview of the right data on which to base decisions. Novel, often external, data sources (big data, for example, as found in social media) tend to make this challenge even greater.
Overcoming this challenge requires professional tools for data integration which take account of big data and master data management, like those delivered by the Talend Platform. Similarly, an infrastructure for storing data has to be created which implements newer concepts and technologies from the world of big data while complementing the classic data warehouse.
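The inconsistent master data mentioned above is often the first obstacle to joining silos. A minimal sketch of the idea – with entirely hypothetical field names and normalization rules, and nothing Talend-specific – is to canonicalize the shared key before merging records from different departments:

```python
# Minimal sketch: joining records from two departmental silos whose
# customer identifiers are formatted inconsistently. The silos, field
# names and normalization rules here are hypothetical illustrations.
def normalize_key(raw: str) -> str:
    """Canonicalize a customer ID: strip whitespace, drop dashes, uppercase."""
    return raw.strip().replace("-", "").upper()

sales = {"cust-001": {"revenue": 12000}}      # one silo's view
support = {"CUST001 ": {"open_tickets": 3}}   # another silo, same customer

merged: dict[str, dict] = {}
for silo in (sales, support):
    for raw_key, fields in silo.items():
        merged.setdefault(normalize_key(raw_key), {}).update(fields)

print(merged)  # both records land under the single key "CUST001"
```

Real integration platforms of course go far beyond key normalization – matching, survivorship and governance rules are what master data management adds on top – but the sketch shows why a canonical key is the precondition for any coherent overview.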
2. Data quality
“Garbage in, garbage out” unfortunately also applies to decisions based on data – especially when decision-makers don’t have suitable information about the quality of the data they can use. Surveys confirm that data quality is still one of the greatest obstacles to carrying out significant analyses. And in the field of big data, analyses also show that a major amount of effort is still going into cleansing that data.
A multilevel approach is needed to deal with the enterprise-wide issue of data quality. On the one hand, it is essential to create a continuous overview of data quality. On the other hand, any unreliable data must be removed or cleansed. In some cases this happens automatically with a rules-based approach; in other cases, a department's expertise must be utilized, with users getting involved in a manual data-cleansing process. It is important, and clearly makes sense, to establish a comprehensive process to achieve sustainable improvements to data quality.
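The rules-based part of such a process can be pictured as a simple quality gate: each record is checked against a set of named rules, and anything that fails is routed to a manual-review queue rather than feeding analyses directly. The rules and record fields below are illustrative assumptions, not part of any specific product:

```python
# Sketch of a rules-based quality gate. Records failing any rule go to a
# manual-review queue; the rules and fields are hypothetical examples.
import re

RULES = [
    ("email looks valid",
     lambda r: re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", "")) is not None),
    ("revenue is non-negative",
     lambda r: r.get("revenue", 0) >= 0),
]

def check(record: dict) -> list[str]:
    """Return the names of all rules the record violates."""
    return [name for name, rule in RULES if not rule(record)]

clean, review = [], []
for rec in [{"email": "a@b.com", "revenue": 10},
            {"email": "broken", "revenue": -5}]:
    (clean if not check(rec) else review).append(rec)
```

Keeping the rule names alongside the failures is what makes the second, manual level of the process workable: the department expert sees not just that a record was rejected, but why.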
3. Self-service chaos
“Self-service” has succeeded in establishing itself in the field of analytics with tools such as Tableau, Qlik and Spotfire, although it is often driven by user departments wanting to solve their own more local problems. So one of the challenges facing IT departments is to “capture” the tools in circulation throughout the company, to standardize them and then make them available centrally, and in particular to use them to enable power users to make their own decisions based on data.
At the same time, self-service is also becoming more and more important when it comes to data preparation, because the expertise for this is also to be found in the user departments. The challenge of finding a professional self-service tool for user departments to prepare their data with is addressed by solutions such as “Talend Data Preparation”. In the end, it is data quality which benefits most from this.
4. Analytics for the masses – wishes and reality
Studies show that 75% of knowledge workers are not interested in working with an analytics tool, thus defying manufacturers’ advertising efforts. The notional separation between software which is used regularly (like a CRM system) and analysis programs actually impedes the use of data! Analyses are created by power users, not by regular business users.
One remedy for this data divide is promised by “embedded Business Intelligence”. This is all about embedding analysis artefacts in the user’s everyday applications so that they are surfaced automatically – whether in the form of simple charts or as easy-to-use self-services prepared for each application case and context.
5. Culture and qualifications
It is essential to have the right sort of company structure for data-driven analyses and decision-making processes to work. If this is interpreted as a way to disempower decision-makers or if political factors dominate, then it will never be possible to establish this approach sustainably across the board. That is why it is important for management to set an example with their commitment to this new paradigm and show they expect everyone else to commit to it as well. In this context, the decision-maker must have the appropriate skills and a full understanding of the decision-making situation as well as having the actual data and knowing what analysing it can achieve. Of course, not all staff have to become full time “data scientists”. But these days, knowledge workers already have to take up this role occasionally – and have to rely on appropriate support from within the company when necessary.
While big data is more prevalent in the enterprise than ever, there is a lot of potential for using data more effectively. Modern technology can make a significant contribution towards this. The Talend Platform specifically addresses a large number of the problems outlined here, including data integration with big data, data quality, master data management and self-service data preparation.
However, the technology aspect should never mean the significance of company culture and staff training gets forgotten. In the end, sustainable analytics success can only be achieved with an all-encompassing approach in the business. Data-driven companies like GE, Coca-Cola, Uber and Air France are leading the way and are now piling pressure onto the competition. Ultimately, this development is going to impact all sectors and certainly all small and medium-sized companies.
About the author: Dr. Gero Presser
Dr. Gero Presser is a co-founder and managing partner of Quinscape GmbH in Dortmund. Quinscape has positioned itself on the German market as a leading system integrator for the Talend, Jaspersoft/Spotfire, Kony and Intrexx platforms and, with its 100 members of staff, takes care of renowned customers including SMEs, large corporations and the public sector.
Gero Presser did his doctorate in decision-making theory in the field of artificial intelligence and at Quinscape he is responsible for setting up the business field of Business Intelligence with a focus on analytics and integration.