If you are starting your journey as a marketer using big data, and wondering how predictive analytics could impact your marketing decision-making process, here are a few ways to think about its specific applications.
The key imperative of embarking on the journey into big data and predictive analytics is "actionable insights that help achieve marketing goals". Predictive analytics using machine learning will help you extrapolate past behaviors, reveal patterns and predict future behaviors.

A. Simple Applications

1. Lead Conversion: Lead conversion is the life-blood of sales, and it is common sense that not all leads have equal potential to convert. Predictive analytics can help you identify, from past conversions, the ideal demographic/behavioral characteristics of a strong lead vs a weak lead. Prioritising leads in this way helps sales teams create a plan for engagement based on immediacy of conversion. Note: weak leads might be dropping off in a consistent pattern that requires a different set of actions to convert them into strong leads.

2. Churn Prevention: Retention is always more profitable than acquisition, and Customer Lifetime Value is the cornerstone of consumer-centric marketing. Therefore, reducing churn, by determining the factors that lead specific segments of consumers to churn and creating plans to proactively prevent it, is one of the most useful applications of predictive analytics. Likelihood to churn can be based on visible factors like complaints lodged, frequency of interaction with the product, time of year, weather or demographic shifts, or on a series of complex, invisible factors like competitive pricing and in-market innovation in features. Based on the reasons for churn, we can create the right 'nurture' programs to address each segment.

B. Strategic Applications
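To make the lead-scoring idea concrete, here is a minimal sketch in Python. It is not a production model, just an illustration of the principle: score a new lead by how often its attributes appeared among past converted leads versus past lost leads. All field names and data are hypothetical.

```python
# Minimal lead-scoring sketch (all field names and data are illustrative).
from collections import Counter

def attribute_rates(leads):
    """Fraction of leads in which each (field, value) pair occurs."""
    counts = Counter()
    for lead in leads:
        for field, value in lead.items():
            counts[(field, value)] += 1
    total = max(len(leads), 1)
    return {k: v / total for k, v in counts.items()}

def score_lead(lead, converted_rates, lost_rates):
    """Sum of (rate among converted - rate among lost) over the lead's
    attributes. Positive suggests a stronger lead, negative a weaker one."""
    return sum(
        converted_rates.get((f, v), 0.0) - lost_rates.get((f, v), 0.0)
        for f, v in lead.items()
    )

# Toy historical data: past converted vs lost leads.
converted = [
    {"channel": "referral", "visits": "high"},
    {"channel": "referral", "visits": "high"},
    {"channel": "search", "visits": "high"},
]
lost = [
    {"channel": "display", "visits": "low"},
    {"channel": "display", "visits": "low"},
    {"channel": "search", "visits": "low"},
]

conv_rates = attribute_rates(converted)
lost_rates = attribute_rates(lost)

strong = score_lead({"channel": "referral", "visits": "high"}, conv_rates, lost_rates)
weak = score_lead({"channel": "display", "visits": "low"}, conv_rates, lost_rates)
```

In practice you would use a proper classifier trained on far richer features, but even this frequency comparison shows how past conversions separate strong leads from weak ones for prioritisation.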
There are more than four applications for predictive analytics (Continuous Multi-Variate Content Testing, Post Trade Programmatic Audit etc.), but this list will help you start the right practices and secure early wins within the organisation.
Assuming you have worked out the right target and the right context, how do you deliver the right message, one that is exactly relevant to the consumer and has a higher certainty of converting?
Given that achieving relevance in a mass-precision setting is a complex task, the following steps can help:

1. Segment Understanding: The inputs into creating programmatic creative start with an understanding of the consumer segments that will be targeted and their specific behaviors/desired behavior changes. Understanding their consumer journeys, and the inherent opportunities for behavior change within each episode of the journey, becomes the opportunity statement that creatives are trying to solve for. The different journeys and triggers per segment create the first level of personalisation.

2. Context of the Segment/Targeting: Context creates the second level of personalisation. Context is information regarding the platform, time, device, location, weather and editorial environment in which the creative will interact with consumers. In particular, understanding the unique platform-specific interactions and preferences of users is key to improving relevance.

3. UX: The third, and sometimes the most important, level of personalisation is the UX design of the creative and its possible iterations based on A/B testing and optimisation. This includes technical considerations like image compression, light-weight animation and reducing custom fonts. Reduction in load time is a parallel goal for optimal UX.

A few examples showcase the simplicity of the creative solutions within programmatic:

1. The Economist – Personalised Display: As The Economist looked to add new subscribers, it wanted to target people who wanted more value/analysis when reading the news. First, it chose content that was most popular with its current user base and understood the context of the articles relative to the user profile. Second, it targeted news websites whose context was similar to the ones preferred by its loyal consumers.
Third, to create relevance it created witty and humorous content and placed it next to the relevant news articles (always linking page context to user profile). The campaign saw The Economist increase prospects by 650,000 at an ROI of 10:1.

2. O2 Refresh – Personalised Video: The goal was to increase conversion for O2's latest Refresh campaign among three segments: early adopters, out-of-contract users, and users with upcoming contract expiration. It used data (device, location) to target its segments with personally relevant messages (recycle value for their current device + upgrade options + popular preferences of users like them + store locators). O2 recycled its TV ads for digital to benefit from precision targeting, and developed programmatic creative that achieved significantly higher results (CTR increased by 128%).

We all thought that programmatic advertising would solve all of marketing's problems: personalization, engagement and boom (!), conversion. Apart from the many problems it faces in execution (transparency, data quality, ad fraud etc.), the issue seems to be a lack of focus: more talk-time is spent on targeting and viewability (these are milestones, not metrics) while programmatic is meant to improve purchase intent and aid conversion.
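The O2 example above can be sketched as a simple rule-based creative assembler. This is a hypothetical illustration, not O2's actual system: segment names, copy and context fields are all invented to show how segment-level and context-level personalisation combine into one message.

```python
# Hypothetical sketch of segment + context creative assembly
# (segment names, copy and context fields are invented).

def build_message(segment, context):
    """Assemble a personalised message from segment- and context-level parts."""
    segment_copy = {
        "early_adopter":   "Be first to the newest handset",
        "out_of_contract": "Your contract has ended - trade up today",
        "expiring_soon":   "Your contract ends soon - lock in an upgrade",
    }
    parts = [segment_copy.get(segment, "Discover our latest offers")]
    # Context layer: device-specific recycle value and nearest store.
    if context.get("device"):
        parts.append(f"Recycle your {context['device']} for credit")
    if context.get("location"):
        parts.append(f"Visit our {context['location']} store")
    return " | ".join(parts)

msg = build_message("out_of_contract", {"device": "Galaxy S4", "location": "Leeds"})
```

Real programmatic creative platforms template this at the ad-server level, but the layering logic (segment copy first, then context enrichments) is the same idea.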
Given that programmatic spend will grow to $50 billion in 2017, improving quality in programmatic has to be an imperative if marketing is to benefit from the efficiencies of automation at scale that it promises. The fault is not with programmatic advertising itself; it is in the rules we preset for programmatic, rules that are less focused on contextual logic and conversion, and more blandly robotic in executing on remarketing principles. As simple as it sounds, 'right person, right time, right content' is anything but. How should we choose the preset rules? Machine learning might help us create these rules by keeping conversion as the key goal, matching time/audience demographics/contextual content/advertising content to goals, and determining the patterns that work and those that don't.

Two patterns that machine learning will map for you:

1. Identify your high-value audiences: using past behaviors, expressed current signals, probability of conversion and average Customer Lifetime Value (CLV).

2. Identify your high-conversion content-context: the golden combinations of context + content that lead to conversions with the high-value audience.

Patterns are not static: they change over the time of year, are influenced by trends and events, and shift with the entry of new audiences and the exit of current customers. Focusing on conversion-oriented goals in programmatic and using machine learning to develop strong conversion models will help sharpen the efforts of targeting, viewability and content creation. One innovation that is allowing transparency and goal conversion to become the focus is 'post-trade programmatic', where the buyer can read 300-400 signals of the delivered impression to determine the accuracy and efficacy of the buy, including targeting and viewability. This then allows machine learning to kick in, analyse delivery vs goals to determine success/failure paths, and optimise the buy every time.
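The second pattern, finding high-conversion content-context combinations, reduces to ranking observed conversion rates per (audience, context, creative) key in the delivery log. The sketch below is illustrative only; log field names are invented, and a real system would use a learned model rather than raw rates.

```python
# Sketch: rank (audience, context, creative) combinations by conversion rate.
# Log field names are invented for illustration.
from collections import defaultdict

def conversion_patterns(impressions, min_impressions=2):
    """Group impressions by (audience, context, creative) and return
    combinations ranked by conversion rate, highest first."""
    shown = defaultdict(int)
    converted = defaultdict(int)
    for imp in impressions:
        key = (imp["audience"], imp["context"], imp["creative"])
        shown[key] += 1
        converted[key] += imp["converted"]
    # Ignore combinations with too few impressions to be meaningful.
    rates = {k: converted[k] / shown[k] for k in shown if shown[k] >= min_impressions}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

log = [
    {"audience": "in_market", "context": "finance_news", "creative": "offer_a", "converted": 1},
    {"audience": "in_market", "context": "finance_news", "creative": "offer_a", "converted": 1},
    {"audience": "broad", "context": "sports", "creative": "offer_a", "converted": 0},
    {"audience": "broad", "context": "sports", "creative": "offer_a", "converted": 0},
]
ranked = conversion_patterns(log)
```

Because patterns drift with seasons, trends and audience turnover, this ranking would need to be recomputed (or the model retrained) continuously rather than fixed as a one-off preset rule.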
Using machine learning to improve programmatic performance is not the only route to better-quality programmatic, but it immediately begins to improve user experience, targeting accuracy, transparency and content choice. The real utility of big data comes to life when we use it to analyse hidden patterns, analyse correlations and predict the possibility of future occurrences. This has direct implications for how marketing can use big data as a powerful tool to enhance the consumer experience, drive efficiency in spend and improve effectiveness.
The promise of machine learning is to duplicate the human experience of learning by extrapolating from past behaviors to predict future behaviors. The difference will be the scale, speed, accuracy and consistency, all of which will improve exponentially. Machine learning, because it works by discovering patterns (vs pre-set algorithms designed to solve a specific problem logically), is especially useful in 'real-time' scenarios (better described as 'right time', given that business cycles vary by category), where the time available to discover the pattern and attempt predictions is extremely limited. It works best when you want to understand what happened in the last week and what is likely to occur in the next. Before going any further, I'd like to step back and underline the key imperative of big data: actionable insights. Insights that are actionable within the business cycle of the predictions. Insights that are deterministic, help generate clear conclusions or hypotheses, and will be accepted by the broader organisation as action-worthy. Here are some pointers for marketers towards helping big data predictions:
Predictive analytics for marketing decisions is an emerging art, enabled by science. While its engine is driven by the past, the true potential lies in taking the leap from predictions to imagining new possibilities. 90% of the big data generated every day is not in the form of numbers but words (unstructured data). What people write, click, share, view and upload every day can reveal real insights about their motivations that help marketers build relevant, conversion-oriented experiences. Specifically, emails, reviews, requests, complaints and FAQs from consumers can reveal insights at scale that marketers can use to observe patterns, capture trends and identify high-risk events.
The process of applying machine learning and natural language processing techniques to text data is complex. Not surprisingly, given that language is about concepts, meaning, inference, nuance and specificity, the process is part science and part art. Let's look at two steps within the modeling process that have a direct linkage to marketing (choosing a modeling technique to run your data, data transformation, test-training data splits etc. are also crucial parts of the process):

Step 1: Feature Extraction. Extraction of meaningful phrases within text, and entity extraction (people, places, products and organisations). There is a wide range of commercial text analytics tools that help you do this, such as IBM Watson AlchemyAPI, Lexalytics and the Microsoft Azure Text Analytics API.

Step 2: Codification. After the key phrases and entities have been extracted, the next step is to create tokens: phrases/entities that signal a key event has occurred. This step requires an in-depth understanding of the category and the consumer to help capture the signal vs the noise. What's meaningful and actionable for one category may be less so for another: red flags for a skin care brand could be words like 'allergy', while 'call drop-offs' would be more worrisome for a telco provider.

Example A: For an insurance provider focused on building a reputation for simple and easy approvals, analysis of consumer reviews, FAQs and call center transcripts might indicate that consumers are dropping out because of the confusing application form they need to fill in. In this case, positive/negative phrases related to the application form become meaningful. Token example: a token built from negative phrases linked to the application becomes the trigger for the company to respond immediately and appropriately to rectify the situation.
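The codification step in Example A can be sketched as a simple token-matching pass over consumer messages. The token list and sample reviews below are invented for illustration; a real deployment would use the phrase and sentiment output of the extraction tools above rather than hand-written substrings.

```python
# Sketch of the codification step: flag messages containing negative
# phrases about the application form (token list is illustrative).

NEGATIVE_APPLICATION_TOKENS = [
    "confusing form",
    "form too long",
    "gave up on the application",
    "cannot complete the form",
]

def flag_messages(messages, tokens=NEGATIVE_APPLICATION_TOKENS):
    """Return messages containing any token (case-insensitive substring match)."""
    flagged = []
    for msg in messages:
        lowered = msg.lower()
        if any(tok in lowered for tok in tokens):
            flagged.append(msg)
    return flagged

reviews = [
    "The confusing form made me give up halfway.",
    "Great claims experience, very fast payout!",
]
alerts = flag_messages(reviews)
```

Each flagged message becomes the trigger for an immediate response, and the token list itself is what needs the continuous human curation described below, since vocabulary and context shift over time.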
Detecting the signal vs the noise for your brand in ways that are actionable will allow you to improve your brand goals in immediate and measurable ways. Lastly, continuous human intervention is needed, as patterns change over time and new vocabulary and contexts emerge. "Big data is about finding correlation, while small data is about finding causation" – Martin Lindstrom.
Marketing is obsessed with harnessing the power of big data, and rightly so. Big data is extremely valuable for understanding patterns at scale, helping brands take advantage of trends, and driving better ROI. But this laudable effort should be done in conjunction with reverence for 'small data'. The role of small data and insights in marketing is different from that of big data, and it provides valuable inputs that big data, by definition, cannot.
Small data and big data are both valuable but different. Using both intelligently, rather than choosing one, should be the way to go. Small data should feed valuable hypotheses and ideas and inspire, while big data should discern patterns at scale, inform and prioritise.