Mobile Passive Behavioral Data: Opportunities and Pitfalls

Posted by Chris Neal

Tue, Jul 21, 2015

By Chris Neal and Dr. Jay Weiner

As I wrote in last week’s post, we recently conducted an analysis of mobile wallet use in the U.S. To make it interesting, we used unlinked passive mobile behavioral data alongside survey-based data. In this post, I’ve teamed up with Jay Weiner—our VP of Analytics, who helped me torture (er, analyze) the mobile passive behavioral data for this Mobile Wallet study—to share some of the typical challenges you may face when working with passive mobile behavioral data (or any type of passive behavioral data, for that matter), along with some best practices for dealing with them:

  1. Not being able to link mobile usage to individuals. There’s a lot of online passive data out there (mobile app usage ratings, web usage ratings by device type, social media monitoring, etc.) that is at the aggregate level and cannot be reliably attributed to individuals. These data have value, to be sure, but aggregate traffic data can sometimes be very misleading. This is why—for the Mobile Wallet project CMB did—we sourced mobile app and mobile web usage from the Research Now mobile panel, where it is possible to attribute mobile usage data to individuals (and to have additional profiling information on those individuals).

    When you’re faced with aggregate-level data that isn’t linked to individuals, we recommend getting some sample from a mobile usage panel (so you can better understand and calibrate your results) and/or running a parallel survey sample so you can make more informed assumptions. This holds true for aggregate search trend data, website clickstream data, and social media listening tools.
  2. Unstacking the passive mobile behavioral data. Mobile behavioral data that is linked to individuals typically comes in “stacked” form, i.e., every consumer tracked has many different records: one for each active mobile app or mobile website session. Analyzing this data in its raw form is very useful for understanding overall mobile usage trends. What these stacked behavioral data files do not tell you, however, is the reach or incidence (e.g., how many people, or what percentage of an addressable market) of any given mobile app/website. They also don’t tell you the session frequency or duration characteristics of different consumer types, nor do they allow you to profile the types of people with different mobile behaviors.

    Unstacking a mobile behavioral data file can sometimes end up being a pretty big programming task, so we recommend deciding upfront exactly which apps/websites you want to “unstack.” A typical behavioral data file that tracks all smartphone usage during a given period of time can involve thousands of different apps and websites... and an unstacked data file covering all of them would quickly become unwieldy. (See the unstacking sketch at the end of this list.)
  3. Beware the outlier! Unstacking a mobile behavioral data file will reveal some pretty extreme outliers. We all know about outliers, right? In survey research, we scrub (or impute) open-ended quant responses that are more than three standard deviations above the mean, we take out records altogether if they claim to be planning to spend $6 billion on their next smartphone purchase, and so on. But outliers in passive data can be far more extreme. In reviewing the passive data for this particular project, I couldn’t help but recall that delightful Adobe Marketing ad in which a baby playing with his parents’ tablet repeatedly clicks the “buy” button on an encyclopedia company’s e-commerce site, setting off a global stock bubble.

    Here is a real-world example from our mobile wallet study that illustrates just how wide the range of mobile behaviors is, even across a limited group of consumers: the overall “average” time spent using a mobile wallet app was 162 minutes, but the median time was only 23 minutes. A very small portion of high-usage individuals (<1% of the total) created an average that grossly inflated the true usage picture of the majority of users. One individual spent over 3,000 minutes using a mobile wallet app. (See the simulated illustration at the end of this list.)
  4. Understand what is (and what is not) captured by a tracking platform. Different tracking tools do different things and produce different data to analyze. In general, it’s very difficult to capture detailed on-device usage for iOS devices... most platforms instead set up a proxy that captures and categorizes the IP addresses the device transmits data to and from. In our Mobile Wallet study, for example, our mobile behavioral data did not pick up any Apple Pay usage, because Apple Pay uses NFC to conduct the transaction between the smartphone and the NFC terminal at the cash register, without any signal ever being transmitted out to the mobile web or to any external mobile app (which is how the platform captured mobile usage). There are a variety of tricks of the trade to account for these phenomena and to adjust your analysis so you can get close to a real comparison, but you need to understand what isn’t picked up by passive metering in order to apply those adjustments correctly.
  5. Categorize apps and websites. Needless to say, people use many different mobile apps and websites, many of which do a variety of things and serve a variety of purposes. Additionally, the distribution of usage across many niche apps and websites is often not useful for meaningful insights work unless these are bundled up into broader categories.

    Some panel sources—including Research Now’s mobile panel—have existing mobile website and app categories, which are quite useful. For many custom projects, however, you’ll need to do the background research ahead of time in order to have meaningful categories to work with. Fishing expeditions are typically not a great analysis plan in any scenario, but they are out of the question if you’re going to dive into a big mobile usage data file.

    As you work to create meaningful categories for analysis, be open to adjusting and iterating. A certain group of specific apps might not yield the insight you were looking for... learn from the data you see during this process, then try new groupings of apps and websites accordingly.
  6. Consider complementary survey sampling in parallel with behavioral analysis. During our iterative process of categorizing mobile apps from the passive mobile behavioral data, we were relieved to have a complementary survey sampling data set that helped us make some very educated guesses about how and why people were using different apps. For example, PayPal has a very successful mobile app that is widely used for a variety of reasons—peer-to-peer payments, e-commerce payments, and, increasingly, “mobile wallet” payments at a physical point of sale. The passive behavioral data we had could not tell us what proportion of different users’ PayPal mobile app usage was for which purpose. That’s a problem: if we were relying on passive data alone to tell our clients what percent of smartphone users have used a mobile wallet to pay at a physical point of sale, we could come up with grossly inflated numbers. As an increasing number of mobile platforms add competing functionality (e.g., Facebook now has mobile payments functionality), this will remain a challenge.

    Passive tracking platforms will no doubt crack some of these challenges accurately, but some well-designed complementary survey sampling can go a long way towards helping you read the behavioral tea leaves with greater confidence. It can also reveal differences between actual vs. self-reported behavior that are valuable for businesses (e.g., a lot of people may say they really want a particular mobile functionality when asked directly, but if virtually no one is actually using existing apps that provide this functionality then perhaps your product roadmap can live without it for the next launch).
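
To make the “unstacking” step in point 2 (and the category bundling in point 5) more concrete, here is a minimal sketch in Python/pandas. The column names, apps, and category map are hypothetical stand-ins for illustration only, not the actual schema of the panel data we used.

```python
# Minimal sketch: "unstacking" a stacked, session-level mobile data file into a
# person-level file. Column names and the category map below are hypothetical.
import pandas as pd

# Stacked form: one row per app/website session, many rows per panelist.
sessions = pd.DataFrame({
    "panelist_id":     [101, 101, 101, 102, 102, 103],
    "app_name":        ["PayPal", "Chase Mobile", "PayPal",
                        "Google Wallet", "PayPal", "Chase Mobile"],
    "session_minutes": [4.0, 2.5, 6.0, 1.0, 3.0, 9.5],
})

# Bundle niche apps into broader categories before unstacking.
category_map = {
    "PayPal": "Mobile wallet",
    "Google Wallet": "Mobile wallet",
    "Chase Mobile": "Mobile banking",
}
sessions["category"] = sessions["app_name"].map(category_map)

# Unstacked form: one row per panelist, one set of columns per category,
# with session counts (frequency) and total minutes (duration).
person_level = (
    sessions.groupby(["panelist_id", "category"])["session_minutes"]
    .agg(sessions="count", total_minutes="sum")
    .unstack(fill_value=0)
)

# Reach/incidence: share of tracked panelists with at least one session
# in each category.
reach = (person_level["sessions"] > 0).mean()

print(person_level)
print(reach)
```

Once the file is person-level, you can join on profiling variables (age, segment, etc.) and compare reach, frequency, and duration across consumer types, which the stacked file can’t give you directly.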
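
And here is a small, purely illustrative example of the outlier problem in point 3. The numbers are simulated (not our study data), but they show how a handful of extreme users can drag the mean far above the median, and how a simple cap at the 99th percentile (winsorizing) yields a more representative “typical usage” figure.

```python
# Simulated illustration: a tiny fraction of extreme heavy users inflates the
# mean far above the median. All values below are made up for illustration.
import numpy as np

rng = np.random.default_rng(42)

typical = rng.gamma(shape=2.0, scale=15.0, size=990)  # most users: brief usage
extreme = rng.uniform(2000, 3500, size=10)            # <1% of users: very heavy
minutes = np.concatenate([typical, extreme])

mean_minutes = minutes.mean()
median_minutes = np.median(minutes)

# One common adjustment: cap values at the 99th percentile before averaging,
# so a few outliers can't dominate the "typical" figure.
cap = np.percentile(minutes, 99)
winsorized_mean = np.clip(minutes, None, cap).mean()

print(f"mean: {mean_minutes:.0f} min | median: {median_minutes:.0f} min | "
      f"winsorized mean: {winsorized_mean:.0f} min")
```

Whether you trim, cap, or report medians instead of means is a judgment call; the point is to look at the full distribution before quoting any single “average.”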

Want to learn more about the future of Mobile Wallet? Join us for a webinar on August 19, and we’ll share our insights with you!

Chris Neal leads CMB’s Tech Practice. He judges every survey he takes and every website he visits by how it looks on his 4” smartphone screen, and has sworn off buying a larger “phablet” screen size because it wouldn’t fit well in his Hipster-compliant skinny jeans.

Dr. Jay heads up the analytics group at CMB. He opted for the 6 inch “phablet” and baggy jeans.  He does look stupid talking to a brick. He’s busy trying to compute which event has the higher probability: his kids texting him back or his kids completing an online questionnaire. Every month, he answers your burning market research questions in his column: Dear Dr. Jay. Got a question? Ask it here!

Want to learn more about combining survey data with passive mobile behavioral data? Watch our recent webinar with Research Now that discusses these findings in depth.

Watch Now!

Topics: Advanced Analytics, Methodology, Data Collection, Mobile, Dear Dr. Jay, Webinar, Passive Data

Upcoming Webinar: Passive Mobile Behavioral Data + Survey Data

Posted by Chris Neal

Mon, Jul 13, 2015

The explosion of mobile web and mobile app usage presents enormous opportunities for consumer insights professionals to deepen their understanding of consumer behavior, particularly for “in the moment” findings and for tracking consumers over time (when they aren’t actively participating in research... which is 99%+ of the time for most people). Insight nerds like us can’t ignore this burgeoning wealth of data—it is a potential goldmine. But working with passive mobile behavioral data brings plenty of challenges, too. It looks, smells, and feels very different from self-reported survey data:

  • It’s big. (I’m not gonna drop the “Big Data” buzzword in this blog post, but—yep—the typical consumer does indeed use their smartphone quite a bit.)
  • It’s messy.
  • We don’t have the luxury of carefully curating it in the same way we do with survey sampling. 

As we all find ourselves increasingly tasked with synthesizing insights and a cohesive “story” from multiple data sources, we’re finding that mobile usage and other data sources don’t always play nicely in the sandbox with survey data. Each has its own strengths and weaknesses that we need to understand in order to use it most effectively.

So, in our latest in a series of sadomasochistic self-funded thought leadership experiments, we decided to take on a challenge similar to what more and more companies will ask of their insights departments: use passive mobile behavioral data alongside survey-based data for a single purpose. In this case, the topic was an analysis of the U.S. mobile wallet market opportunity. To make things extra fun, we ensured that the passive mobile behavioral data was completely unlinked from the survey data (i.e., we could not link the two data sources at the respondent level for deeper understanding or for attitudinal + behavioral modeling). There are situations where you’ll be given linked data, but currently—more often than not—you’ll be working with separate silos and asked to make hay.

During this experiment, a number of things became very clear to us, including:

  • the actual value that mobile behavioral data can bring to business engagements
  • how it could easily produce misleading results if you don’t properly analyze the data
  • how survey data and passive mobile behavioral data can complement one another greatly

Interested? I’ll be diving deep into these findings (and more) along with Roddy Knowles of Research Now in a webinar this Thursday, July 16th, at 1pm ET (11am PT). Please join us by registering here.

Chris leads CMB’s Tech Practice. He enjoys spending time with his two kids and rock climbing.

Watch our recent webinar with Research Now to hear the results of our recent self-funded Consumer Pulse study that leveraged passive mobile behavioral data and survey data simultaneously to reveal insights into the current Mobile Wallet industry in the US.

Watch Now!

Topics: Advanced Analytics, Methodology, Data Collection, Mobile, Webinar, Passive Data, Integrated Data

Superman, the Super Bass-o-Matic, and CMB's EMPACT℠

Posted by Dr. Erica Carranza

Mon, Jun 08, 2015

Introducing CMB's EMPACT℠: A practical approach to understanding the emotional impact of your brand.

Emotions matter in driving consumer choices. 

This is fast becoming a truism—thanks in part to behavioral economics making its way to the mainstream press.  For evidence from your own life, take a moment to think about your favorite brand.  What do you like about it?  What are the products or experiences it provides?  Now think about how those things make you feel.  Or think about the last time you swore off a brand.  Like the last time I bought something from Ikea.  They sold me an extra part they said I would need.  They didn’t deliver the part, then they told me I didn’t really need it.  But they charged me for it, and never credited me despite my investing 3 hours of time in calls with their customer service.  I felt so frustrated, and so angry, that I swore I’d never buy from Ikea again.  NEVER AGAIN!  [shakes fist at sky]  And, to date, I haven’t.  But I digress… The point is that scientific research, marketing research, and conventional wisdom all suggest that, if you’re trying to attract and engage consumers, emotions are an important piece of the puzzle.     

So what’s the best way to understand how your brand or product makes consumers feel, and what role those feelings play in shaping their choices?  Many marketers and market researchers have been wringing their hands over this question.  Which, in turn, has led research vendors to serve up an array of solutions—including some positioned as ways to get at “unconscious” emotions, or to tap into how people feel without having to ask them. I call these “Superman Methods.” 

If Superman wants to know what color your underwear is, he doesn’t need to ask.  He can see it without your saying a word.  He can see it even if you forgot which pair of underwear you chose this morning.  And if you don’t want Superman looking at your underwear, too bad!  HE CAN SEE IT ANYWAY. 

Wouldn’t it be nice if we had Superman-like methods that tapped consumers’ emotions directly, without ever having to ask them how they felt? 

I was witness to many a sales pitch for “Superman Methods” while I was on the client side.  It's hard not to be drawn in by their promise.  But ultimately I was bothered by a few key things:

  • Biometric measures (e.g., skin conductance, facial EMG, brain waves) are often positioned as Superman-style tools.  But even when they do a great job of measuring how good or bad someone feels (as with facial EMG), they don’t provide good measures of discrete emotions.  For example, they can’t tell you if negative feelings are driven by Anger vs. Anxiety, or if positive feelings reflect Amusement vs. Pride. 

  • Facial coding does measure some specific emotions.  But it only gets at the “basic” emotions, which are: Happiness, Surprise, Anger, Sadness, Fear, Disgust, and Contempt. 

    Notice anything about that list?  There is only one positive emotion.  The rest are all negative—except Surprise, which could swing either way.  So unless you’re trying to help Dan Aykroyd sell the Super Bass-o-Matic (for which disgust, anger, and contempt could top the list of consumer reactions), understanding how your product makes people feel would ideally capture more granularity in terms of their positive emotions.

    For example, what about feeling relaxed?  Proud?  Entertained?  Secure?  Indulged?  And even among negative emotions, there is more nuance.  What about feeling frustrated?  Bored?  Disappointed?  Or embarrassed? 

    Consumers’ emotional lives are more complex than what the “basic emotion” faces can reveal—and understanding that complexity can help you find a more direct (and competitively differentiated) route to capturing their hearts.

  • While it’s true that people don’t always know why they do what they do, it doesn’t follow that they don’t know how they feel.  I might not know all the reasons why I choose Seventh Generation for my kids, but I know how its brand promise makes me feel.  And while we can’t always trust the reasons consumers give, isn’t that why we derive importance through experimental designs and predictive models? 

  • Furthermore, how much “Superman Methods” really tap the unconscious—or add value to self-report measures in consumer domains—is debatable.  For example, many scientists question whether the oft-cited Implicit Association Test (IAT) actually measures unconscious associations.  And meta-analyses (including one led by a creator of the IAT) have found that it doesn’t work as well as self-reports to predict consumer preferences. 

What measures like facial coding, EMG, and the IAT do do well is sidestep socially sensitive situations—where people know how they feel but don’t want to tell you.  (The IAT was first developed to study prejudice—a great use case, since people with racist attitudes usually try to keep them on the DL.)  But if you want to know how your brand, ad, or product makes people feel, in most cases you can trust what they tell you.  Especially in a context where they feel comfortable being honest, like an online/mobile survey.  In the hands of a skilled moderator, in-person discussions can also be a great way to uncover emotional reactions, but that method isn’t scalable to large samples. 

At CMB, we do a lot of research that calls for large samples, so we wanted to develop and validate a way to measure how brands/touchpoints make consumers feel that is: practical (e.g., scalable, fast, cost-effective, easy to combine with other measures such as brand perceptions); comprehensive (in terms of the range of emotions measured); robust (leveraging insights from the scientific study of emotion); and systematic (to enable brand comparisons, or track over time).  Oh yeah—and we also wanted results that are clear and compelling.  Because, if you can’t effectively communicate them to people who need to use them, what’s the point? 

Our solution is a survey-based approach to measuring the emotional impact of brands, communications, products, and experiences called EMPACT℠. Curious? Watch our webinar!

WATCH HERE

Erica Carranza is a CMB Account Director with supplier- and client-side (American Express) experience. She is also our resident social psychologist; she earned her Ph.D. in psychology from Princeton University.

Topics: Chadwick Martin Bailey, Emotional Measurement, Webinar, BrandFx

WEBINAR: Using Discrete Choice to Better Position your Brand in a Complex Market

Posted by Amy Modini

Thu, Feb 20, 2014

Please join CMB's Amy Modini and UPMC's Jim Villella today at 12:30pm ET for our latest webinar: Using discrete choice to better position your brand in a complex, changing market.

Is your industry evolving?  In this webinar, you'll learn how UPMC and CMB applied a discrete choice methodology that accounts for a wide range of market factors to estimate shifting consumer preferences, make key product development and marketing strategy decisions, and ultimately position UPMC for success.

The health insurance industry faces an urgent need to prepare for a new competitive market introduced by healthcare reform. UPMC recognized the opportunity to gain competitive strength in the market through innovation and new product development. However, the research supporting these decisions would need to account for a wide range of market changes and influences. To apply a trade-off exercise, UPMC needed to address many challenges, including:

  • New shopping and purchase channels

  • Controlling the effect of discounts and subsidies on price

  • Introduction of entirely new consumer segments for whom purchase behavior is unknown

  • Product optimization for multi-tier offerings

Register here

Did you miss one? All of our webinars are available here

Topics: Healthcare Research, Webinar, Brand Health & Positioning

Looking for Innovation? Consider "Brand-Storming"

Posted by Mark Carr

Thu, Feb 06, 2014

Originally posted in the SMEI blog

At some point all business leaders are challenged to “innovate” in order to grow their company’s bottom line.

Done right, innovation creates value for both the company and the customer through new-to-the-world solutions to needs. It’s logical that products and services are where companies start their innovation efforts because, after all, these are very tangible sources of value. However, brand and marketing can also be powerful drivers of value and differentiation and should not be overlooked as potential anchors for innovation.

Many innovation initiatives begin with a brainstorming session in which a bunch of internal folks sit around and try to generate new ideas for products or services they think customers want. For a fresh take on this process, consider “brand-storming” as the starting point for inspiration.

What is a “brand-storming” session, exactly? Well, in marketing speak, it’s generating innovative ideas for brand extensions, leveraging brand equity (a very valuable asset) to push into adjacent or even totally new product areas.

Start a successful brand-storm with a clear articulation of your brand strategy, brand attributes, and positioning. Then do creativity exercises that apply key brand attributes to new markets or to new solutions for existing customers.

Need to get the juices flowing? Look for examples in the marketplace:

  • Consumer products are the easiest place to start. For example, consider Arm & Hammer Baking Soda’s extension into toothpaste (“clean” and “white”) or Duracell’s introduction of the PowerMat to recharge phones and other devices (e.g., “long-lasting power”).

  • Virgin is probably the poster child for brand-centered innovation, using its well-defined and unique positioning to extend into everything from airlines to cell phones.

All of this is not to say that brand should be the only source of invention. But brand-storming brings a new part of the company to the innovation table and adds another angle for sparking new, powerful ideas for growth. 

In our upcoming webinar we will look at some of the common pitfalls of innovation initiatives and explore how to use “brand” and “brand attributes” as well as innovative go-to-market strategies to unlock growth opportunities in new, unexpected directions. Hope to see you there!

Posted by J. Mark Carr. Mark is co-founder and managing partner of South Street Strategy Group.

Topics: South Street Strategy Group, Strategic Consulting, Product Development, Marketing Strategy, Webinar, Brand Health & Positioning, Growth & Innovation