Analytic Rock Stars: Big Data and the Customer Experience
The buzzwords “big data” and “analytics” have flooded the web since the trend took hold in the digital sphere. They promised to unleash novel insights and help businesses execute innovative strategies. But what these terms really describe is huge volumes of largely unstructured information.
Turning this abundance of zeros and ones into something valuable for your business requires an innovative approach to numbers. Using data and analytics to drive customer experience, for example, is easier said than done.
Obstacles to Applying Big Data Analytics Tools
Many business leaders were initially excited about the potential of data to reform the customer experience. However, this enthusiasm tapers when it comes to figuring out how to effectively analyze and derive meaningful insights from the data deluge.
Several hurdles prevent companies from successfully deploying big data and analytics. According to a 2016 McKinsey & Company Global Survey, 39% of executives said designing an appropriate organizational structure to support analytics activities was the greatest impediment to building an effective data analytics infrastructure.
Other obstacles include securing internal leadership for analytics projects (33%) and constructing a strategy to prioritize investment in analytics (22%).
Despite these challenges, some entities have successfully used analytics to completely redesign the customer experience, increase efficiency and gain a leg up on competitors. Here are four examples that stand out.
L’Oréal Paris Uses Big Data to Personalize Shopping and Educate Consumers
In 2014, L’Oréal Paris launched Makeup Genius, an app that lets consumers virtually try on makeup. L’Oréal USA CMO Marie Gulin-Merle explained how it works: The app uses a scanned image of the user’s face, analyzes upwards of 60 characteristics, then displays in real time how different makeup products would look on the person.
The app also analyzes users’ search and purchasing data to learn their preferences and tailor responses, creating a more personalized mobile shopping experience.
L’Oréal also deployed data and analytics to determine the best engagement strategy when preparing to launch its Maybelline Master Contour makeup line. The company partnered with Google to access search data and gain a better sense of consumers’ contouring questions and concerns.
After analyzing this information, L’Oréal created a series of YouTube videos to target “contour-me-quick” users. The videos, each containing three steps, simplify contouring for women who are interested in the technique but assume it’s too difficult or time-consuming to learn. The data-driven approach enabled the company to provide educational content for more than 9 million viewers and reduce the barrier to purchase.
How the NFL Leveraged RFID Technology to Deepen the Fan Experience
Sports teams, and the National Football League in particular, have long kept stats to inform strategy and create effective plays. With the arrival of high tech, big data and analytics not only change the game, they improve the customer experience by promoting greater fan engagement and better broadcasting.
After a successful pilot in 2014, the NFL in 2015 expanded Zebra Technologies’ MotionWorks Radio Frequency Identification (RFID) sensors to all players and stadiums.
The sensors, embedded in players’ shoulder pads, transmit radio signals to receivers installed throughout each stadium, capturing key vector data points, including a player’s distance traveled, speed and direction. Beginning with the 2017 NFL season, sensors are also embedded in footballs to collect real-time location, speed and rotation data.
Zebra’s algorithms aggregate players’ stats and display them in real time, enabling coaches to tailor plays and lineups to better face competition, as well as improve training. The data and analytics used by the NFL also give fans access to visualizations, statistics and fantasy recommendations that were previously unavailable.
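The vector stats described above reduce to simple geometry over timestamped position samples. Here is a minimal sketch of that idea in Python, using made-up sample data; this is an illustration of the general technique, not Zebra’s actual pipeline or data format:

```python
import math

def player_stats(samples):
    """Compute total distance traveled and peak speed from a list of
    (t_seconds, x_meters, y_meters) position samples (hypothetical format)."""
    total_distance = 0.0
    peak_speed = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        step = math.hypot(x1 - x0, y1 - y0)  # straight-line distance of this step
        total_distance += step
        if t1 > t0:
            peak_speed = max(peak_speed, step / (t1 - t0))
    return total_distance, peak_speed

# Example: a player moves 3 m, then 4 m, in successive 1-second intervals
samples = [(0.0, 0.0, 0.0), (1.0, 3.0, 0.0), (2.0, 3.0, 4.0)]
dist, speed = player_stats(samples)  # dist = 7.0 m, peak speed = 4.0 m/s
```

Real tracking systems sample many times per second and smooth the raw positions before computing these metrics, but the core arithmetic is the same.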
City-Wide Fitness Tracker Makes Chicago the First “Smart City”
Chicago’s Array of Things (AoT) is the first urban sensing project of its kind in the U.S. It comprises a network of interactive modular sensor boxes that collect real-time data on the environment, infrastructure, and pedestrian and vehicle activity to support research and public use.
The first of the sensors were installed in the summer of 2016, with a total of 500 planned for installation by the end of 2018.
The AoT, which has been likened to a city “fitness tracker,” measures factors that affect Chicago residents’ health and commutes, such as climate, air quality, noise, and traffic. According to the project’s official website, the nodes measure temperature, barometric pressure, light, a variety of pollutants including carbon monoxide and nitrogen dioxide, vibration, ambient sound intensity, and pedestrian and vehicle traffic.
The real-time, location-based data is published online. Residents can use the information to track their exposure to certain air contaminants, avoid excessive noise and congestion, and even find the most populated routes at various times of day or night.
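Because the readings are published as structured, location-based records, a resident or developer can filter them with a few lines of code. The sketch below shows the idea with a hypothetical record format and made-up values; it is not the AoT’s actual schema or API:

```python
# Hypothetical per-node readings (field names and values are invented
# for illustration; the real AoT data uses its own schema).
readings = [
    {"node": "N01", "no2_ppb": 12.0, "sound_db": 58.0, "pedestrians": 40},
    {"node": "N02", "no2_ppb": 31.0, "sound_db": 74.0, "pedestrians": 220},
    {"node": "N03", "no2_ppb": 9.0,  "sound_db": 52.0, "pedestrians": 15},
]

def comfortable(reading, max_no2=25.0, max_db=65.0):
    """Keep only nodes below the chosen pollution and noise thresholds."""
    return reading["no2_ppb"] <= max_no2 and reading["sound_db"] <= max_db

quiet_nodes = [r["node"] for r in readings if comfortable(r)]  # ['N01', 'N03']
```

The same filter, pointed at live data and grouped by time of day, is all it takes to route around noisy, congested, or polluted blocks.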
In addition to individuals’ use, organizations, engineers, and researchers can use the data to study urban environments and support urban planning.
Hospital Command Centers Use Data and Predictive Analytics to Manage Patient Flow
Hospitals are looking to NASA’s Mission Control Center as a model as they increasingly rely on data and analytics to manage capacity, better coordinate patient flow, anticipate surges in the emergency department (ED) and prevent bottlenecks.
Efficient patient flow not only makes the hospital run more smoothly, it enhances the patient experience by eliminating long ED wait times, delays in transfers, and other frustrations.
The Johns Hopkins Hospital in Baltimore worked with GE Healthcare Partners, an advisory firm within GE Healthcare, to open its Capacity Command Center in February 2016. The hospital’s 5,500-square-foot center, which includes GE’s proprietary Wall of Analytics, combines systems engineering, experience, big data and predictive analytics to coordinate the flow of 50,000 inpatients and 80,000 ED patients a year.
Hospital decision-makers work in the command center to monitor various algorithms and analytics to guide patient flow.
Since the hospital activated its command center, it’s seen a 60% improvement in the ability to accept complex patients from other hospitals. ED patients are assigned to beds 30% faster after the decision to admit, and they are transferred 26% faster after they are assigned a bed. Transfer delays from the operating room have dropped by 70%, and 29% more patients are discharged before noon. This increased efficiency is a boon to the hospital’s bottom line, as well as the patient experience.
While not easy to implement effectively, the applications for big data and analytics to improve the customer experience are endless. As companies, sectors, and cities continue to look for ways to innovate and compete, big data and customer experience will become even more tightly linked.