Mastering Data and Self-Development at Lenovo
In 2012, when I worked at Lenovo, the company set out on a journey to create the Lenovo Unified Customer Intelligence (LUCI) platform. The decisions we made about the people and technology involved in that project helped shape my self-development, my relationships with others on the team, and my relationship with executives.
Data management leaders today are still facing a problem that has been around for years:
How do we create systems and processes to move, transform, and deliver trusted data insights at the speed of business?
To explain how we solved this problem at Lenovo, it helps to share a bit of background about myself. I come from a non-traditional background: data and analytics are not where I started.
My first position at Lenovo was as a digital analytics implementation manager, responsible for ensuring that all of the data collected at Lenovo.com integrated with our digital solutions. I used quality assurance programs to establish trust. My web analytics team quickly realized that in order to create the value we wanted, we would need to integrate with many online and offline data sources.
Building Your Data Team & Self-Development
This realization and the understanding that the team needed to build a new kind of platform was the beginning of a multi-year self-development journey. As I began to evaluate our needs against our internal platforms, I realized that none of them were capable of supporting our key requirements.
We needed an analytical platform that supported batch and streaming on 10-plus terabytes per year. We chose Tableau, R, and Python for the analytics layer and leveraged Amazon Web Services cloud databases for the storage layer. But we still needed to make a decision on the data integration layer.
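To make the batch-and-streaming requirement concrete, here is a minimal Python sketch of the two ingestion modes such a platform has to support: loading a whole historical extract at once versus folding records in one at a time with a running state. The data, field names, and functions are invented for illustration; the actual platform ran at a far larger scale on AWS.

```python
import csv
import io

# Hypothetical sample extract; the real feeds were 10-plus terabytes per year.
SAMPLE_EXTRACT = """date,region,orders
2012-01-01,NA,120
2012-01-02,NA,95
2012-01-02,EMEA,80
"""

def batch_load(raw_csv):
    """Batch mode: read an entire extract at once and aggregate it."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["orders"])
    return totals

def stream_load(record_iter):
    """Streaming mode: consume records incrementally, keeping a running total
    and yielding a state snapshot after each record (e.g. for a live view)."""
    totals = {}
    for row in record_iter:
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["orders"])
        yield dict(totals)

if __name__ == "__main__":
    print(batch_load(SAMPLE_EXTRACT))
    for snapshot in stream_load(csv.DictReader(io.StringIO(SAMPLE_EXTRACT))):
        print(snapshot)
```

The point of the sketch is that the same aggregation logic has to work in both shapes, which is what pushed us toward a platform and integration layer that could serve both.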
The 80/20 rule of data management came to mind: I refused to accept that 80% of the time would be spent on data wrangling and 20% spent on analysis. Our program had more than 60 endpoints, and change management needed to occur within one business day. We wanted 30% of our resources focused on data wrangling and 70% focused on business intelligence (BI) and analytics.
To achieve this, we selected Talend for our integration technology, established one-week agile sprints, and leveraged our people to be integrators, implementers, and administrators.
Building the IT and Business Relationship
Organizational support for data integration was decentralized and often leveraged different vendors. It was considered an IT function and put in the background several layers removed from the business. I wanted to grow my team, and the only way I could do this was by creating value with business stakeholders.
At this point, I created data architect roles that would become masters of their domains and cross-trained in others. These roles would be business facing so that architects would be working directly with the stakeholders. They were responsible for architecting, developing, and maintaining their own data solutions.
A single data solution such as the voice of the customer pipeline could have more than 10 data sources, structured and unstructured data, varying volumes and velocities, translation and natural language processing loops, and multiple analytics and visualization outputs.
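A pipeline like that can be pictured, in heavily simplified form, as a few composable stages: normalize each source's records into one shared schema, run the unstructured text through an enrichment step, and emit rows ready for analytics. The sketch below uses a crude keyword tagger as a stand-in for the real translation and natural language processing loops; all source names, fields, and keyword lists are invented.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    """Shared schema every source is normalized into."""
    source: str
    text: str
    sentiment: str = "neutral"

# Toy keyword lists standing in for a real NLP model.
NEGATIVE = {"broken", "slow", "crash"}
POSITIVE = {"great", "love", "fast"}

def normalize(source, record):
    """Map a source-specific record into the shared Feedback schema."""
    if source == "survey":
        return Feedback(source, record["comment"])
    if source == "chat":
        return Feedback(source, record["message"])
    raise ValueError(f"unknown source: {source}")

def tag_sentiment(item):
    """Keyword stand-in for the translation / NLP enrichment loop."""
    words = set(item.text.lower().split())
    if words & NEGATIVE:
        item.sentiment = "negative"
    elif words & POSITIVE:
        item.sentiment = "positive"
    return item

def run_pipeline(feeds):
    """feeds: {source_name: [raw_record, ...]} -> tagged Feedback rows."""
    return [tag_sentiment(normalize(src, rec))
            for src, records in feeds.items()
            for rec in records]
```

The design point is that one architect owning all three stages can change a source schema, the enrichment step, and the output in a single sprint, without handing off between teams.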
Empowering data architects over such a large scope enabled them and the business to move at the pace that was needed for success. Working hand-in-hand, analysts began to understand the data wrangling processes, improving both the performance and quality of these processes.
Most importantly, it helped them understand the value of an efficient data integration team.
Relationships with Executives
Business executives didn’t understand, nor were they interested in understanding, how a good data management practice could help drive the business forward.
The first two to three years in my role were focused on delivering insights more efficiently. We tackled challenges such as a dashboard that required six people copying and pasting in Excel for a month just to give an executive a single monthly view. We got that down to half a person's time, automated, refreshed daily, and with quality checks.
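The shape of that automated refresh can be sketched in a few lines: run a set of quality checks on the incoming rows, and publish only when every check passes. The thresholds, field names, and check list below are invented for illustration; the production process ran through our integration tooling rather than standalone scripts.

```python
import datetime

def quality_checks(rows, expected_min_rows=1, max_age_days=1):
    """Return the names of failed checks (empty list means all passed)."""
    failures = []
    if len(rows) < expected_min_rows:
        failures.append("row_count")
    if any(r.get("revenue") is None for r in rows):
        failures.append("null_revenue")
    today = datetime.date.today()
    if rows and (today - max(r["date"] for r in rows)).days > max_age_days:
        failures.append("stale_data")  # daily refresh means data can't be old
    return failures

def refresh_dashboard(rows):
    """Publish the dashboard figures only when every quality check passes."""
    failures = quality_checks(rows)
    if failures:
        return {"published": False, "failures": failures}
    return {"published": True,
            "total_revenue": sum(r["revenue"] for r in rows)}
```

Because the checks run on every refresh, a bad upstream feed blocks publication instead of silently reaching an executive, which is what made the move from monthly manual reporting to daily automation trustworthy.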
These wins gave us the credibility and momentum to connect data sets in new ways and to experiment with new analytics models. The larger business impacts and analytical wins came after we had a strong data integration and management practice. Today, many of those business executives understand what ETL is and why it’s important for their business.
Among my key learnings from this experience: drive a sense of ownership and business accessibility into the data architect function. The most important was helping my team understand the “why”.
Oftentimes the “why” of a business case is return on investment (ROI). I vigorously enforced that architects and engineers articulate how their actions impacted a business objective, no matter how far removed they were from the problem.
This focus on ROI, understanding the why, empowering technical resources to interface with the business, and giving end-to-end ownership of data processes are, in my opinion, the keys to building a successful data integration practice.