Big Data Will Transform Every Element of the Healthcare Supply Chain
The entire healthcare supply chain has been steadily digitized over the past several years. We have already seen big data used not only to improve patient care but also to streamline payer-provider systems, reduce wasted overhead, predict epidemics, cure diseases, improve quality of life, and avoid preventable deaths. Add to this the mass adoption of edge technologies that improve patient care and wellbeing, such as wearables, mobile imaging devices, and mobile health apps. The use of data across the entire healthcare supply chain is now approaching a critical inflection point, where the payoff from these initial big data investments will be bigger and arrive faster than ever before. As we move into 2017, healthcare leaders will find new ways to harness the power of big data to uncover new areas for business process improvement, diagnose patients faster, and drive better, more personalized, preventative programs by integrating personally generated data with broader healthcare provider systems.
Your Next Dream Job: Chief Data Architect
By 2020, an estimated 20.8 billion connected “things” will be in use, up from just 6.4 billion in 2016. As Jeff Immelt, chairman and CEO of GE, famously said, “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” Mainstream businesses will face ongoing challenges in adopting big data and analytics practices. As the deluge of IoT data continues to flood enterprise data centers, the coveted role on the IT team will no longer be the Data Scientist; instead, it will be the Data Architect. The role is a natural evolution from Data Scientist or business analyst, and it underscores the growing need to integrate data from numerous unrelated, structured, and unstructured sources, which is exactly what the IoT era demands. A company that makes the misstep of tying its IT environment to the wrong big data platform, or of establishing a system that lacks agility, could be paralyzed in this data-driven age. The rise of the data architect also presents two other challenges: a shortage of qualified talent to fill these roles, and the cultural shift required to make the entire company more data-driven.
The AI Hype Cycle and Trough of Disillusionment, 2017
IDC predicts that by 2018, 75 percent of enterprise and ISV development will include cognitive/AI or machine learning functionality in at least one application. While dazzling POCs will continue to capture our imaginations, companies will quickly realize that AI is a lot harder than it appears at first blush and a more measured, long-term approach to AI is needed. AI is only as intelligent as the data behind it, and we are not yet at a point where enough organizations can harvest their data well enough to fulfill their AI dreams.
At Least One Major Manufacturing Company Will Go Belly Up by Not Utilizing IoT/Big Data
The average lifespan of an S&P 500 company has dramatically decreased over the last century, from 67 years in the 1920s to just 15 years today. The average lifespan will continue to decrease as companies ignore or lag behind changing business models ushered in by technological evolutions. It is imperative that organizations find effective ways to harness big data to remain competitive. Those that have not already begun their digital transformations, or have no clear vision for how to do so, have likely already missed the boat—meaning they will soon be a footnote in a long line of once-great S&P 500 players.
The Rate of Technology Obsolescence Will Accelerate
In 2017, we are going to see an increasing number of companies shift from simply ‘kicking the tires’ on cloud and big data technologies to implementing enterprise deployments and deriving significant ROI from them. At the same time, however, the rate of technology innovation is at an all-time high, with new solutions displacing existing ones roughly every 12-18 months. As companies continue to drive new uses of big data and related technologies, the rate of technology obsolescence will accelerate. This creates a new challenge for businesses: the solutions and tools they use today may need to be updated, or entirely replaced, within a matter of months.
The Increased Use of Public Information for Public Good
Under the current administration, the White House has set a precedent for government transparency through data.gov, making hundreds of thousands of datasets available online for public use. Data on its own is inherently dumb, or ‘dirty’; we must apply the algorithmic economy to make sense of it and define action in order to power the next great discovery. In 2017, organizations will find ways to apply machine learning to this public data lake in order to contribute to the greater good. For example, Uber might use this data to determine where accidents frequently occur on the roads and create new routes for drivers in order to augment passenger safety.
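The accident-hotspot idea above can be sketched in a few lines of standard-library Python. The coordinates, grid size, and threshold below are hypothetical placeholders rather than real data.gov records: the sketch simply bins (latitude, longitude) accident reports into grid cells and flags the cells where accidents recur.

```python
from collections import Counter

def accident_hotspots(points, cell_size=0.01, min_count=3):
    """Bin (lat, lon) accident records into a fixed grid and return
    the cells whose accident count meets min_count."""
    counts = Counter(
        # Integer grid-cell coordinates for each accident record
        (round(lat // cell_size), round(lon // cell_size))
        for lat, lon in points
    )
    return {cell: n for cell, n in counts.items() if n >= min_count}

# Hypothetical sample: four reports clustered at one intersection,
# plus two isolated incidents elsewhere
sample = [(37.7749, -122.4194)] * 4 + [(37.80, -122.27), (37.79, -122.28)]
hotspots = accident_hotspots(sample)
print(hotspots)  # one grid cell with 4 accidents
```

A production version would pull real collision datasets and likely replace the fixed grid with proper geospatial clustering, but the shape of the analysis, aggregating public records to steer routing decisions, is the same.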