WELCOME TO OUR BLOG!

The posts here represent the opinions of CMB employees and guests—not necessarily the company as a whole. 


IA CRC - Be The Change

Posted by Julie Kurd

Fri, Oct 25, 2019

Maybe a lack of curiosity CAN kill the consumer insights professional. Speakers at the Insights Association’s Corporate Researchers Conference raised a chorus of voices around the concepts of exploration, trust, and curiosity. With the click of a button, Microsoft’s Anne Sedgwick’s and Anil Damodaran’s voices were transcribed into real-time closed captions as they shared how humans and AI make “a great orchestra.”

Here are some other key takeaways from the conference:

  • Unpacking Curiosity, by Alison Horstmeyer: We live in a VUCA world (volatile, uncertain, complex, ambiguous), began Alison Horstmeyer. She asked each of us to pick a photo and answer key questions. My picture was a bike leaning on a tree on a beautiful autumn day. She asked what happened the minute before this picture was taken, what will happen in the next minute, what the most significant thing in the photo is, and what the key emotions are. Throughout the session, she motivated exploratory behavior in us, asking us to be resilient, curious, and open. Thanks to her exercise, I could see more opportunities to cultivate openness and ideational fluency by continuing to venture out of our boxes through: 1) active exploration, 2) engaged inquisitiveness, 3) openness to experience, and 4) stress tolerance. She described the value of P.R.O.B.E.: Presence (open-ended questions, listening), Reframe (“how might we…”), Openness (“tell me more”), Bravery (resilience), and Experimentation (attempts in learning).
  • Google: “Puppy or Not a Puppy,” by Elizabeth Merrick May: In a world where the market research industry typically runs statistical tests at a 90% confidence level, Elizabeth challenged us with a simple question: puppy or not a puppy? Using this example to describe algorithm training in machine learning, she talked about how, in a world of disruptors and disruption, we need to always think about the payoff. Don’t let the world mire you in decisions with minimal downside. Incrementality requires one set of decisions; leaps require new models. Which is worse to be wrong about: deciding in favor of something that is actually bad, or deciding against something that’s actually good? We can underfit our models (too simplistic to really explain the variance) or overfit them (adding so many variables that we don’t risk excluding anything, which makes the model hard to replicate). She said a typing tool with the fewest questions that still yields ‘accurate enough’ output is the one to go with. She challenged us not to over-define things; after all, there’s a downside to being thorough. She encouraged us not to pursue ‘right’ but instead to pursue the ‘right’ amount of ‘wrong’ by setting a risk-based approach. Although a pup could be a dog or a seal, ultimately, we are looking for the right amount of wrong (a toy sketch of this trade-off follows this list).
  • Taboo Discussions and Peer-to-Peer Self-Moderation, by Melissa Spencer, Merck and Kim Bowers, Brado: Want to know about emotional and functional barriers to diagnosing and treating Alzheimer’s? STDs? Topics that Merck and Brado were noodling on included the elephant in the room…was it possible that the qualitative moderator impeded their authenticity by their very physical presence? Could they possibly launch self-moderated, consumer-to-consumer (C2C) discussions? They tried it. And they spoke about how C2C is messy, but the potential payoff exceeded the risks, so they recruited consumers, and, for Alzheimer’s, they asked that person to recruit a few friends for the ‘friend’ groups. They asked these groups to hold ‘book club’ style sessions in their homes, and to videotape it. For the STD discussion, they found that C2C ‘stranger’ sessions—recruited on a guide, and then brought to a facility—worked best.
  • Influence In the Age of ML, by Eric Solomon: Can you embrace curiosity and the need to experiment? Eric shared the magic that can happen at the intersection of emergent technologies, such as artificial intelligence, and human psychology. If you believe that superintelligence is possible, that intersection shifts the way we tell and consume stories. Eric showed us advertisements created by AI, such as an ad by McCann for Clorets gum. And based on my watching, tweeting, and other behavior, I must have shifted Google’s algorithms, because I got served up the coolest, craziest ad. Does emergent technology disrupt? That girl be a tomboy.
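The false-positive/false-negative trade-off Elizabeth described can be made concrete with a few lines of code. The sketch below is purely illustrative (invented labels and scores, not anything from her talk): it shows how moving a decision threshold in a “puppy or not a puppy” classifier trades one kind of wrong for the other.

```python
# Toy "puppy or not a puppy" illustration with invented data (not from the talk):
# moving the decision threshold trades false positives against false negatives.

labels = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]   # 1 = actually a puppy, 0 = not a puppy
scores = [0.9, 0.8, 0.6, 0.55, 0.5, 0.45, 0.4, 0.3, 0.2, 0.1]  # model's confidence it's a puppy

def errors_at(threshold):
    """Count false positives and false negatives at a given decision threshold."""
    false_pos = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    false_neg = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    return false_pos, false_neg

for threshold in (0.3, 0.5, 0.7):
    fp, fn = errors_at(threshold)
    print(f"threshold {threshold}: {fp} false positives, {fn} false negatives")

# A low threshold calls almost everything a puppy (more false positives); a high
# threshold misses real puppies (more false negatives). Choosing the threshold is
# choosing the "right amount of wrong" for the decision at hand.
```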

PostScript: Jeffrey Henning presented the new Insights Professional Certification program, which will launch in 2020. The IPC is an upcoming @InsightsMRX program, backed by @BurkeInstitute, @CambiarConsults, @ResearchRocks, @Rivainc, and @MRII_UGA, and includes 5 new topic certifications (IPC Analytics, Practitioner, Qualitative, Quantitative, and Specialist). Click here to learn more.


Julie Kurd is the VP, Business Development at CMB.

For more insights, please follow us on LinkedIn, Facebook, and Twitter.

Topics: conference recap, growth and innovation, Market research, Artificial Intelligence, professional development

Selling a Driverless Future: Messaging Strategies for the Autonomous Vehicle Industry

Posted by Chris Neal

Tue, May 07, 2019

Emotions play a key role in the commercial success or failure of emerging disruptive technologies. Most recently, we looked under the hood of the autonomous vehicle (AV) industry to understand the specific emotions that drive or deter widespread adoption.

On the wheels of Tesla’s recent announcement that it plans to operate a fleet of one million self-driving taxis by the end of 2020, I’ll provide more direction for how tech companies and automakers can most effectively convince various consumer segments to embrace this future.

Message Testing: Different Strokes for Different Folks

As part of a recent self-funded research study exploring the link between emotions and the self-driving car industry (download the full report here), I channeled my inner Don Draper and drafted faux ad concepts selling the promise of a driverless future.

Each concept touted a different benefit of autonomous vehicles (safety, convenience, etc.), and respondents were asked to select which would most likely get them to consider a self-driving car.

[Chart: AV message testing results]

I’m still awaiting my Ogilvy Award, but until then, let’s dig into the results of this exercise:

  • Safety is unequivocally the most persuasive message—indicating a creative campaign highlighting the public health and safety benefits of widely deployed AVs may help alleviate some consumer anxiety.
    • People who gravitate toward the potential safety benefits tend to skew 50+ years of age and are more likely than other segments to reject the idea altogether. They also tend to feel more positive towards driving their own car (e.g., feeling energized, proud, and in control).
  • Overall, Productivity/Efficiency isn’t a very compelling message, but is more likely to appeal to Gen Z and Millennials who are often less bound to the idea of owning their own car compared to older generations.
    • Consumers who are drawn to these features are more likely to feel “Efficient,” “Productive,” and “Smart” when imagining themselves in AVs (even before they saw the messages). This is noteworthy because these specific emotions are consistently found to be key drivers of adoption in most of our emerging tech studies.
We then layered on a lift analysis, asking respondents to again rate their likelihood to use an AV after seeing the ad message they had selected as most compelling. Although the overall results from this exercise were underwhelming, it did help move some “Ambivalent” Millennials into the full-on “Accepter” category by touting the Productivity and Efficiency benefits.
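For the mechanically curious, here is a rough, hypothetical sketch of this kind of pre/post lift calculation. The segments and ratings below are invented for illustration and are not the study’s actual data.

```python
# Hypothetical pre/post message "lift" sketch with invented data.
# Each respondent rates likelihood to use an AV (1-5) before and after
# seeing the ad message they picked as most compelling.

from collections import defaultdict

respondents = [
    # (segment, pre_rating, post_rating)
    ("Millennial", 2, 3),
    ("Millennial", 3, 4),
    ("Millennial", 4, 4),
    ("Gen X", 2, 2),
    ("Gen X", 3, 3),
    ("Boomer", 1, 1),
]

def top2_share(ratings):
    """Share of respondents rating 4 or 5 ('likely' or 'very likely')."""
    return sum(1 for r in ratings if r >= 4) / len(ratings)

by_segment = defaultdict(list)
for segment, pre, post in respondents:
    by_segment[segment].append((pre, post))

for segment, pairs in by_segment.items():
    pre_share = top2_share([pre for pre, _ in pairs])
    post_share = top2_share([post for _, post in pairs])
    print(f"{segment}: pre={pre_share:.0%}, post={post_share:.0%}, lift={post_share - pre_share:+.0%}")
```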

[Chart: Lift analysis]

As this exercise indicates—and as is often the case with new tech trying to “cross the chasm”—marketing to the most swayable early adopters rather than the general population can be an effective tactic for gaining traction. Messaging to early adopters will be more nuanced, but when done right, it can encourage them to spread positive word of mouth to more mainstream late adopters.

The Road Ahead: Evolution, Not Revolution

The Don Drapers of the world can only do so much convincing until more people actually experience the technology for themselves.

Fortunately, consumers are getting a taste of increased levels of ADAS (Advanced Driver-Assistance Systems) technology as features like auto braking and lane correction become more common in newer cars.

Further, the less common but also rapidly growing “Level 3” vehicles (e.g., Tesla’s “autopilot” mode) that can go on full autopilot—under certain conditions—can also help consumers overcome the anxiety they have about fully letting go.

At the moment, very few consumers said they’d get into anything other than an autonomous vehicle they could—if need be—take over (i.e., “Level 3”). This sentiment could be problematic for the future of companies like Uber, Lyft, and now Tesla, who aren’t about to let passengers take control when they feel like it.

However, people who own Level 2 or 3 vehicles have much more nuanced attitudes towards this scenario—more commonly anticipating that their primary car will eventually be a Level 4 or fully autonomous Level 5 vehicle. And those who already own Level 2 or Level 3 ADAS vehicles have much stronger positive emotions and fewer intense negative emotions when reacting to being in a fully autonomous car.

Driving Full Circle

This leads me back to my own emotional journey with vehicular automation. Recall that a run-in with a faulty cruise control back in the ‘90s left me extremely wary of technical automation (read here if you missed that story).

In 2018, after decades of avoiding this kind of automation, I got my first real taste of Level 2 assisted driving technology while on a road trip with my son to Washington, D.C. We were stuck in bumper-to-bumper traffic when my phone’s GPS cut out. I took my eyes off the road to futz with the phone when suddenly the car (not me) slammed on the brakes. Turns out I was about to rear-end the car in front of us.

I was shocked, embarrassed, humbled, and relieved. Had it not been for the auto braking, this story would have ended differently (we were only going 30 mph, but you get the idea).

The more I see drivers facedown in their phones at the wheel, the more I wonder if it’s time for us mere mortals to start letting AI take a little more control over our transportation systems. I still have deep anxiety over the prospect of riding in a fully self-driving car, but my emotions towards this possibility are complex and evolving.

With some focused determination, those invested in these efforts can push me—and likely many others—along towards greater openness to a driverless future.

Interested in more?

If you’re interested in learning more about this research or CMB’s methodology, or want a live retelling of my various run-ins with faulty cruise control, check out this webinar.

Watch Now

Topics: technology research, EMPACT, Artificial Intelligence

The Road Ahead: Emotions and The Future of Self-Driving Cars

Posted by Chris Neal

Tue, Apr 16, 2019

We recently published self-funded research exploring the impact of consumer emotions on the emerging autonomous vehicle (AV) industry—the latest in our ongoing analysis of the relationship between emotion and disruptive technology.

As detailed in a previous post, this study revealed many consumers are skeptical of self-driving cars. Further, even the prospect of using this technology generates a negative emotional response.

We’ve measured the emotional activation of hundreds of brands in dozens of industries (learn more about our EMPACT approach here) and have found, by far, the autonomous vehicle category generates the most intense and widespread overall negative emotions—indicating a critical obstacle this industry must overcome.

The first two steps in charting a path forward are:

  1. Understand which specific negative emotions are the most important to deactivate
  2. Understand which specific positive emotions are most critical to activate

Better understanding these emotions can help guide the industry’s marketing efforts and actual customer experiences with this technology.

Overcoming the Right Negative Emotions

Unlike in most industries we analyze, it’s more critical for the AV industry to deactivate negative emotions than it is to activate specific positive emotions, although doing both is obviously important.

Through our emotional gap analysis, we identified “Anxiety,” “Paranoia,” “Hesitancy,” and feeling “Overwhelmed” as the negative emotions where AVs fare worst when compared to how people feel about driving a car themselves:

[Chart: Negative Emotions Activated by AV vs. Car]
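To make the mechanics of a gap analysis like this concrete, here is a minimal sketch comparing the share of respondents who activate each negative emotion for an AV scenario versus driving their own car. The percentages are invented placeholders, not the EMPACT results behind the chart.

```python
# Toy emotional gap analysis with invented shares (not actual EMPACT data):
# for each negative emotion, compare the AV scenario to driving your own car.

av_scenario = {"Anxious": 0.46, "Paranoid": 0.31, "Hesitant": 0.38, "Overwhelmed": 0.22}
own_car =     {"Anxious": 0.14, "Paranoid": 0.06, "Hesitant": 0.09, "Overwhelmed": 0.08}

gaps = {emotion: av_scenario[emotion] - own_car[emotion] for emotion in av_scenario}

# Rank emotions by how much worse the AV scenario fares than driving yourself.
for emotion, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{emotion}: {gap:+.0%} gap (AV vs. own car)")
```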

Anxiety is no surprise here: people fear the prospect of truly letting AI take over and drive the vehicle with no human intervention.

People are also concerned self-driving car systems could be hacked, which explains the significant feeling of paranoia—an emotion common in a lot of emerging technology we study. Anything “smart” (i.e., connected to the internet) could be hacked, and there are always people who are more concerned about this than others.

Feeling “Hesitant” or “Unsure” also comes up a lot in new and disruptive technology categories. With anything truly new and different, people are unsure of whether it’s ready for primetime, or if they should try it.

The emotions around feeling “Hectic” or “Overwhelmed” are largely unique to the AV category. It’s so new and potentially transformative that many people simply can’t process the idea of trusting the technology to get them from point A to point B. It’s overwhelming to really think about the complexity of AV systems, not to mention the myriad road scenarios an AI algorithm will need to be trained well enough to react to.

Positive Emotions Activated by Driving Your Own Car

Positive emotions are also important to driving mainstream adoption of a disruptive technology. This is a unique challenge for the AV industry because people already have many positive emotions activated when driving their own car.  

Not surprisingly, the biggest positive emotional gap between driving your own car and the prospect of getting in an autonomous vehicle is feeling in control.

[Chart: Positive Emotions Activated by AV vs. Car]

The combination of anxiety, paranoia, and losing that feeling of control is a major emotional obstacle for the autonomous vehicle industry’s path to widespread consumer acceptance. We see this in many AI-driven technology categories where life is increasingly automated and data-driven.

This fear of technology running our lives—and the possibility that it might not always do so benevolently—runs deep and has been prominent in popular culture long before the first self-driven test vehicle ever hit the road.

[GIF: “Open the pod bay doors, HAL” (Source: GIPHY)]

There are also significant gaps for feeling “Secure” and “Protected.” As the chart above indicates, people feel a lot more secure and protected when driving their own car than they do about self-driving cars. This feeling of insecurity feeds the high levels of anxiety we see around AVs.

The gap in feeling “Efficient/Productive” is also problematic for the AV industry. In most new technology adoption projects where we run this analysis, that emotion emerges as one of the key determinants of more mainstream consumer adoption. People expect disruptive technologies to make them feel more efficient and productive, but if they don’t truly get that feeling when using the technology, they are unlikely to change their existing habits.

Emotions That Predict Adoption

In addition to a straight gap analysis, we also ran a model to isolate which specific emotions (negative and positive) best predict (on a derived basis) people’s willingness to use autonomous vehicles in the future.

By far the biggest predictors, not surprisingly, are reducing anxiety and increasing feelings of relaxation.

[Chart: Emotional predictors of AV adoption]
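The study doesn’t detail the model behind this chart, so treat the following as a hypothetical sketch of derived importance: a plain logistic regression of stated willingness to use an AV on binary emotion selections, fit on synthetic data. The emotion list, the data, and the model choice are all illustrative assumptions, not the actual method.

```python
# Hypothetical derived-importance sketch: logistic regression of willingness to
# use an AV on emotion selections. All data here is synthetic; this is not the
# model or data behind the chart above.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
emotions = ["anxious", "relaxed", "proud", "secure", "efficient"]

n = 500
X = rng.integers(0, 2, size=(n, len(emotions)))  # 1 = respondent selected that emotion
# Synthetic "truth": anxiety suppresses adoption; relaxation and pride lift it.
logit = -1.5 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 2] + 0.4 * X[:, 3] + 0.3 * X[:, 4] - 0.2
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = willing to use an AV

model = LogisticRegression().fit(X, y)
for emotion, coef in sorted(zip(emotions, model.coef_[0]), key=lambda p: abs(p[1]), reverse=True):
    print(f"{emotion}: {coef:+.2f}")
```

In a real study, these coefficients (or a derived-importance transform of them) would rank which emotions most strongly predict adoption.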

Another emotion that popped in our predictive modelling, which wasn’t evident from the initial data review, was activating emotions around pride. In other words, people who would feel “proud” using an autonomous vehicle are much more likely to actually use one, whereas people who might feel ashamed or embarrassed if their friends or family saw them inside an autonomous vehicle are highly unlikely to hop on board.

This “social identity” element is something we see in many new tech adoption studies through our proprietary consumer-centric approach to measuring the impact of identity on decision-making. Does someone identify as being one of those people who uses an autonomous vehicle, or is that for another tribe altogether? Turns out this tribal identity matters quite a bit for new technologies attempting to cross the chasm.

Feeling “Secure” and “Efficient” also help predict likelihood to adopt the technology, but as we saw earlier, not many people feel these emotions when they think about using an autonomous vehicle.

The Road Ahead*

In my next article, I will share some thoughts and findings from this study on potential paths forward for the industry to overcome these obstacles. You’ll get to see the results when I attempt to play an Ad Man and convince people to reconsider the category. Although it was a humbling experiment, it did reveal additional insights that can help actual creative teams with briefs that include different value propositions linked to specific emotions the industry needs to address.

In the meantime, if you’re interested in learning more about this research or our EMPACT approach, check out this recorded (quick) webinar:

Watch Now

*Sorry again! The puns are just too good to pass up in this blog series.

Topics: technology research, EMPACT, emotional measurement, Artificial Intelligence

It’s Complicated: New Research on Emotion and Autonomous Vehicles

Posted by Chris Neal

Tue, Mar 26, 2019

Like many people, I have a complicated relationship with technology. And when it comes to the increasing level of automation in cars—potentially leading us towards a future where many vehicles will be fully autonomous (i.e., no human drivers)—it gets REALLY complicated.

On one hand, the closest I’ve come to dying was in 1993 at the hands of a faulty after-market cruise control mechanism that terminally accelerated the vehicle I was driving (this is actually a thing, apparently). My sudden panic on the highway precluded a rational decision to put the car in neutral, turn the engine off altogether, and/or use the emergency brake. As we entered the highway off-ramp at 70 mph (brake on the floor) my girlfriend pulled an incredible Dukes-of-Hazzard style 90-degree spin turn that brought the car to heel, and ultimately, a miraculous stop.

I went on to marry that girl, of course.

While this happened back in 1993 (the car, the driver, and my very-questionable-90s hair pictured below)…

[Photo: Chris Neal and the car, 1993]

…it gave birth to decades of irrational anxiety, manifesting in all sorts of habits, including:

  • Never, ever, using cruise controls. Ever.
  • Avoiding elevators and escalators whenever possible (I’ve been mistaken for a Fitbit junkie on many occasions)
  • Seeking out manual transmission cars

Twenty-five years of significant technological advancements later, lots of people now share my emotional anxiety towards vehicular automation.

A new self-funded study we conducted indicates a rumbling emotional backlash towards autonomous vehicle (AV) technology. This is part of an ongoing proprietary analysis of the human emotional dimensions around disruptive emerging technologies, including virtual assistants and smart home technology.

In this study we leveraged our proprietary framework, EMPACT, and tested several AV-related scenarios to better understand consumers’ emotions towards the different use cases of this technology, including:

  • In a city: (a) riding inside an AV and (b) being near an AV
  • On the highway: (a) riding inside an AV and (b) being near an AV
  • Putting your child inside an AV alone
  • Putting an elderly relative inside an AV alone

The high-level results reveal a very steep path ahead for the AV industry in its journey to gaining widespread consumer acceptance.

As indicated in the chart below, a strong majority (~70%) of American adults reject all tested scenarios:

[Chart: Likelihood to Use Autonomous Cars]

…and only 10% or so actively “accept” any of these scenarios.

Interestingly, the prospect of putting an elderly relative into an AV alone was even worse than the thought of one’s child riding solo.

 Understanding the emotional landscape behind these attitudes (i.e., how people feel about these scenarios) helps explain why.

Consumers react more negatively to autonomous vehicle-related scenarios than any other traditional or emerging category or brand we’ve tested to date. We’ve analyzed hundreds of brands across dozens of categories and have found, by far, AVs evoke the highest “net negative emotions”—a combination of valence (how bad) and activation (low to high energy).

[Chart: Autonomous Vehicle Emotional Activation]

For the scenario of hopping into an AV yourself, the net negative emotional activation is -18%, over 3x the net negative emotions generated by driving your own car. This came as a surprise considering plenty of our sample is from major urban areas where traffic continues its relentless march towards Sheer Awfulness.
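EMPACT’s scoring is proprietary, so the figures above can’t be reproduced here. As a loose illustration only, a “net emotional activation” number could be assembled as positive minus negative emotional shares, weighted by intensity; the sketch below uses invented values and an assumed formula, not CMB’s.

```python
# Toy "net emotional activation" sketch with an assumed formula and invented values
# (EMPACT's actual scoring is proprietary and may differ).

# (emotion, share_of_respondents, valence: +1 positive / -1 negative, activation 0-1)
av_emotions = [
    ("Anxious",  0.46, -1, 0.8),
    ("Paranoid", 0.31, -1, 0.7),
    ("Curious",  0.25, +1, 0.6),
    ("Excited",  0.12, +1, 0.9),
]

def net_activation(emotions):
    """Sum of emotion shares, signed by valence and weighted by activation intensity."""
    return sum(share * valence * activation for _, share, valence, activation in emotions)

print(f"Net emotional activation (toy AV scenario): {net_activation(av_emotions):+.0%}")
```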

The prospect of having an autonomous vehicle, say, drive your kid to soccer practice on its own, fares even worse with a -27% net negative activation.

The verdict?

Although there are plenty of obvious benefits to autonomous vehicles, many people are still wary of this technology as it currently generates only ~12-16% net positive emotional activation compared to the ~42% generated by traditional driving.

And when compared to other emerging or disruptive technology categories like smart homes, autonomous vehicles also have a bigger emotional chasm to cross.

Smart Homes, for instance, do have a problem with not activating enough positive emotions (i.e., many people don’t “get” or feel the benefit of them), but at least they don’t have a major negative emotional barrier to overcome like AVs.

To learn more, join me on April 2nd at 2pm ET (11am PT). 

On April 2, I'll be hosting a brief webinar diving into the details of the specific negative and positive emotions that will be key to driving broader acceptance.

I’ll also cover an analysis of messaging that resonates with various consumer segments—uncovering what will compel/deter someone from considering this technology. 

Register Now

Join me for the ride (sorry); I promise it will be interesting, useful, and entertaining.

Chris Neal is CMB’s VP of Tech and Telecom Research, who recently made strides toward reconsidering autonomous vehicles while on a road trip with his son.

Topics: Consumer Pulse, Artificial Intelligence

AI You Can Drive My Car: Anxiety and Autonomous Vehicles at CES

Posted by Megan McManaman

Wed, Jan 16, 2019


In December, The New York Times reported that disgruntled Arizonans were lobbing rocks at Waymo’s autonomous (but not unoccupied) vans. Experts, and the rock-throwers themselves, blamed the attacks on a combination of economic anxiety and safety fears (a woman was struck and killed by a self-driving Uber in Tempe last March). While it’s unlikely any modern-day Luddites attended last week’s CES in Vegas, companies like Intel and Baidu, and even Transportation Secretary Elaine Chao were hard at work addressing consumer fears.

With Congress expected to consider legislation regulating autonomous vehicles, the intense conversation and debate over security and safety will remain front and center. Counting out the projectile-hurling robot-haters (for now), what’s it going to take for average consumers to purchase, ride in, and share the road with these vehicles? That’s the billion(s)-dollar question we set out to answer in our self-funded Consumer Pulse.

We surveyed 2,000 U.S. consumers (thanks to Dynata for providing sample!) and conducted ethnographies and in-depth interviews—including ride-alongs—to identify the segments of the adult U.S. population that have different reactions to and perceptions of a range of assisted and autonomous driving scenarios. We went beyond the typical examination of functional benefits to understand the emotions (both positive and negative) driving and deterring greater acceptance and adoption.

Chris Neal, CMB’s VP of Tech and Telecom, will share the results at the Quirks Event on March 6 at 2:15 pm in Brooklyn.

Want an advance copy of the report this spring?

Click here

Megan McManaman is CMB's Marketing Director; she welcomes our new robot chauffeurs.

Topics: technology research, Consumer Pulse, Artificial Intelligence