WELCOME TO OUR BLOG!

The posts here represent the opinions of CMB employees and guests—not necessarily the company as a whole. 


How to Win Virtual Assistant Rejecters Over

Posted by Chris Neal

Wed, Jun 20, 2018

It seems like every week, tech giants are adding new features to their virtual assistant (VA) tech arsenal. See Google’s new Duplex technology—an AI system for accomplishing real-world tasks by phone. 

While companies are pouring millions into making their virtual assistants smarter and more integrated, most users don’t stray beyond basic functions like asking for the weather.

Learn about the emotional and social identity dimensions keeping people from adopting and using this tech to its full potential, and what brands need to do to win the VA war.

[Infographic: CMB virtual assistant study]

Topics: technology research, Consumer Pulse, emotional measurement, AffinID, Artificial Intelligence

Consumer-Driven Ideas from Leaders at Keds, NYU, and Alibaba

Posted by Julie Kurd

Thu, May 17, 2018

A few hundred of us attended the annual #YaleInsights18 conference to listen to leaders at companies and universities including L’Oreal, Warby Parker, Keds, Alibaba, Yale, and NYU Stern. Three of the sessions are recapped below.

Are you seeding the clouds for earned impressions? Keds CMO Emily Culp talked about the shift from the two-minute “long form” ad to a more integrated digital strategy that incorporates quick five-second video GIFs. This strategy pulls together media impressions from a mix of sources: celebrities (1M+ followers), regional influencers (500k-1M followers), and micro-influencers (1-100k followers). Keds understands that a celebrity can get 150,000 likes on a picture of themselves wearing Keds, a level of brand heat and wholesale interest the Keds website can’t generate on its own. Keds is seeding product with bloggers and consumers, with the goal of getting its shoes into the hands of influential people who will become loyal brand ambassadors. These shoe-wearers in turn create user-generated content. Keds then builds on this digital-native-first strategy by leveraging its website, social media, PR/seeding, email, and its sell-in/sell-through data.

Are you using AI to personalize promotional offers? NYU Stern School Professor Anindya Ghose explained that mobile devices provide atomic levels of behavioral data because they are owned and used by a single person. Specifically, companies can “know” the device’s location with 91% accuracy and within three feet. He says that in addition to location, as researchers, we need to layer in time, context, crowdedness (how crowded the location is), weather, trajectory, social dynamics, saliency and tech mix to really optimize the “in the moment” promotional offers. For example, if I’m walking to the train station during my morning commute, I might be enticed by an offer at a coffee shop I never frequent.  In the evening when I return, I might be more likely to divert my habitual path for a different promotion, but not likely for a coffee. So, depending on the time of day, the offers that need to pop are different.  In fact, commuters are ALWAYS more likely to redeem an offer than non-commuters. 
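
To make the idea concrete, here is a toy sketch of layering contextual signals to pick an “in the moment” offer. Everything here is invented for illustration—the signals, weights, and offers are assumptions, and a real system would learn them from data rather than hard-code them:

```python
# A toy sketch of the idea Ghose describes: layering contextual signals
# (time of day, commuting status, weather) to pick an "in the moment" offer.
# The signals, weights, and offers are all invented for illustration.
from dataclasses import dataclass


@dataclass
class Context:
    hour: int            # local hour of day, 0-23
    is_commuting: bool
    weather: str         # e.g., "rain" or "clear"


def score_offer(offer: str, ctx: Context) -> float:
    """Score an offer for a given context; higher means more likely to redeem."""
    score = 0.0
    if offer == "coffee":
        # Coffee plays well on the morning commute, poorly on the way home.
        score += 2.0 if ctx.hour < 10 else -1.0
    else:
        # Other promotions do better on the evening return trip.
        score += 1.5 if ctx.hour >= 17 else 0.0
    if ctx.is_commuting:
        score += 1.0  # commuters redeem offers more often overall
    if ctx.weather == "rain" and offer == "coffee":
        score += 0.5
    return score


morning = Context(hour=8, is_commuting=True, weather="clear")
print(max(["coffee", "bookstore promo"], key=lambda o: score_offer(o, morning)))
```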

Can you see the human in your consumer data, or are they just IP addresses? Alibaba’s Lee McCabe talked about the future of commerce in a connected world. At Alibaba, they’re focusing on context, convergence, and contact. Alibaba is fully vertically integrated, so it can take data and consumer insight to a whole different level: identifiable, analyzable, and reachable. It has real-person identity, full-dimension analysis (all the sites and businesses it owns share a single sign-on, so it can see across browsing, social, media, purchase, payment, logistics, entertainment, and travel), and measurement across every consumer touchpoint. On 11/11, Singles’ Day (think of it as Black Friday on a global scale), Alibaba sold 100 Maseratis in 18 seconds and 350 Alfa Romeo cars in 33 seconds. In fact, 140,000 brands participated in Singles’ Day across 225 countries/territories, and consumers placed 812M orders that generated $25B in sales in a single day.

How are we all using multi-method and multi-source to grow our brands?  These are a few of the ways.

Yale often posts conference content to its YouTube channel to broaden its reach, and we will update this post if/when those sessions appear.

Topics: consumer insights, conference recap, integrated data, Artificial Intelligence

Emotions Run High with Virtual Assistants

Posted by Chris Neal

Wed, May 09, 2018


The pace of innovation and disruption is accelerating. Just 10 years ago Uber and Airbnb didn’t exist and the iPhone was still a novelty shown off at parties by overenthusiastic tech lovers. Now, we have a hair salon receptionist convinced she’s speaking to a real person when in fact it’s Google Assistant scheduling the appointment. While it might be hard for many of us to remember the last time we took a cab or used a flip phone, change is hardly straightforward, and tech adoption raises critically important questions for brands.

Why do some people resist change while others embrace it? What emotions trigger true acceptance of a new technology and a new way of doing things?  What is that “a-ha” moment that gets someone hooked on a new habit that will be enduring? 

To help understand consumers’ journey with evolving technology, we applied our BrandFx framework to the broad virtual assistant category—measuring the functional, social identity, and emotional benefits that people seek from Siri, Alexa, Google Assistant, Cortana, etc. I shared our findings from the identity aspect here.

And while each of these three benefit types plays a role in adoption and use, the role of emotion is profound.

We asked a lot of people about how they use virtual assistants—from information seeking to listening to music to planning and booking a trip. Then we ran analytics on the overall emotional activation, valence, and specific emotions that were activated during these different use cases. 

Our findings have broad implications for anyone in the virtual assistant category creating marketing campaigns to drive adoption, or product UX teams looking to design customer experiences that will deepen engagement.

Currently, virtual assistants are primarily used as information-seeking tools, basically like hands-free web queries (see Exhibit 1).

[Exhibit 1: Top VA use cases]

Even though virtual assistants are evolving to do some pretty amazing things as voice-based developer communities mature, most people are only scratching the surface with the basic Q&A function. Asking Siri or Alexa for the weather forecast is a fine experience when they’re cooperating, but it can be extremely frustrating when you don’t get the right answer—like getting the current temperature in Cupertino when you live in Boston.

Meanwhile, watching TV or shopping through your virtual assistant turns out to be a much more emotionally rewarding experience, based on the analytics we ran. The problem for the industry as a whole is that these more emotionally rewarding use cases are among the least used VA functionalities today. Teams that market these experiences must motivate more consumers to try the more emotionally rewarding VA use cases that will deepen engagement and help form a lasting habit (see Exhibit 2):

[Exhibit 2: Use, emotional activation, and emotions activated by use case]

Listening to music and watching TV/movies yields high emotional activation in general—specifically “delight.” Our driver modeling shows that feeling “delighted” is one of the top predictors of future usage intent for a virtual assistant product (see Exhibit 3):

[Exhibit 3: Emotions that drive VA usage]

As Exhibit 2 above indicates, using virtual assistants for scheduling and calendaring has moderate overall emotional activation, but is particularly good at activating feelings of efficiency and productivity, the single strongest predictor of use in this category.

[Chart: Emotions that drive VA usage]

Tellingly, however, the scheduling and calendaring function also over-indexes on feelings of frustration because this task can be more complex—currently AI and natural-language processing (NLP) technologies are more apt to get these kinds of requests wrong. 

In general, “frustration” indexes high on more complex use cases (e.g., arranging travel, coordinating schedules, information seeking). This is a warning to the tech industry not to get too caught up in the hype cycle of releasing half-baked code quickly to drum up excitement among consumers. It also helps explain why younger demographics in our analysis actually experienced more frustration with VAs than older cohorts (contrary to my initial hypotheses). 

Younger consumers are attempting to do more complex tasks with virtual assistants, and therefore bumping up against the current limits of NLP and AI more frequently. This is dangerous, because they are the key “early adopter” segments that must embrace the expanding capabilities of virtual assistants in order for the category to become pervasive among mainstream consumers.

Consumers will quickly abandon a new way of doing things if they get frustrated. Understanding and activating the right positive emotions and minimizing the negative ones will be critical as brands continue to vie for the top virtual assistant spot.

Interested in learning more about the emotional dimensions of Virtual Assistant users? Reach out to Chris Neal, CMB's VP of Technology & Telecom.

Topics: technology research, growth and innovation, AffinID, Artificial Intelligence, BrandFx

CES 2018: Virtual Assistant Battle Royale

Posted by Savannah House

Wed, Jan 17, 2018


Last week the 2018 Consumer Electronics Show (CES) wrapped up in Las Vegas and left us feeling excited and invigorated about what’s to come in tech. From talking toilets to snuggle robots, CES 2018 was yet another reminder of how deeply technology has infiltrated every aspect of our lives.

This year, once the world’s largest tech show found its way out of the dark (a rain-induced power outage briefly left the show floor dark), CES was all about virtual assistants.

Alexa vs. Google Assistant

Amazon’s Alexa has dominated the virtual assistant category, claiming 70% of the market share in 2017 and ending the year with strong holiday sales as the most downloaded app on both Apple’s and Google’s app stores on Christmas Day. But this year, Google (which typically keeps a low profile at CES) made its presence loud and clear.

From wrapping the Las Vegas monorail with the words “Hey Google” to erecting a massive playground in the CES conference center parking lot (complete with a giant gumball machine), Google is making it clear that it intends for Google Assistant to be a legitimate contender in the virtual assistant space.

It’s about integration, not separation

Both Google and Amazon used CES 2018 as a platform to announce new partnerships for their virtual assistants. Alexa will soon be found in Toyota cars, Vuzix smart glasses, and Kohler smart toilets. Meanwhile, Google is integrating its smart technology with a slew of products from leading brands like Sony, Lenovo, and Huawei.

If there’s one takeaway from these partnership announcements, it’s that voice assistant technology will not be confined to its makers’ own product lines. Instead, voice assistants intend to be everywhere—plugging into smart glasses, smart earbuds, and smart toilets—underscoring the tech industry’s expectation that voice assistants will continue to play a much bigger role in our digital lives.

Crossing the chasm

It appears Google’s goal at CES wasn’t necessarily to woo tech lovers with its Google Assistant. Rather, it was to show regular people what is possible with virtual assistant technology. This is important because it demonstrates the (potential) ubiquity of this category once thought of as only for early tech adopters.

However, despite pushes to show “regular" people that virtual assistants are meant for everyone, our research indicates that social identity is playing a role in preventing widespread virtual assistant adoption.

As the chart below indicates, people’s ability to relate to the typical user is the biggest driver of virtual assistant usage:

[Chart: Drivers of virtual assistant usage]

However, currently, consumers can’t relate to the typical virtual assistant user, which is keeping them from “crossing the chasm” and becoming regular users themselves.

The virtual assistant category will only grow in complexity as more companies enter the game (let’s not forget about Siri and Cortana). But, while flashy conference displays, exciting partnership announcements, and product demos are all helpful in attracting more consumers, if virtual assistant brands want to achieve more mainstream adoption, the brand and creative teams need to tackle the virtual assistant image problem head on.

Savannah House is the Marketing Manager at CMB, and as a light sleeper, is most excited about the robotic pillow.

Topics: technology research, internet of things, Identity, AffinID, Artificial Intelligence

AI's Image Problem: Who's the "Typical" Virtual Assistant User?

Posted by Chris Neal

Tue, Jan 09, 2018


Every nascent technology and every tech start-up faces the same marketing challenge of “crossing the chasm” into mainstream adoption.  Geoffrey Moore framed this very well in his 1991 classic, “Crossing the Chasm”:

[Chart: Technology adoption curve]

Word of mouth can play a huge role in motivating certain segments to sip the Kool-Aid and make the leap.

With CES 2018—the world's largest gadget tradeshow—happening in Vegas this week, I can't help but wonder: what if mainstream consumers don’t relate to the early adopters of a new technology? What if they think it’s used by people who aren’t part of “their tribe”? Will that prevent them from even considering the new tech? There are countless technology categories that have faced this challenge, for example:

  • certain gaming categories trying to expand beyond 15-24-year-old males
  • consumer robot products to this day
  • social media when it was first introduced
  • Second Life and other virtual worlds

I hypothesized that the virtual assistant (VA) category—and specific brands within it—faces this challenge. Yes, many people have tried and used Siri, but few mainstream consumers are truly using virtual assistants for anything beyond basic hands-free web-queries. To further complicate things, an increasing number of “smart home” products that connect to intelligent wireless speakers in the home (e.g., Amazon Alexa, Google Home, Apple’s forthcoming HomePod) are proving divisive. Some people love the experience or the idea of commanding a smart device while others categorically reject the concept. 

My team and I had the chance to test out a few hypotheses through our Consumer Pulse program, and—voila!—we’ve got some tasty (and useful) morsels to share with you about how social identity is influencing consumer adoption in the virtual assistant space, using our proprietary AffinID℠ solution.

Here’s what we found:

Social identity matters in the virtual assistant space. We studied US consumers (18+)—covering usage, adoption, and perceptions of the virtual assistant category and a deep-dive on four major brands within it: Apple’s Siri, Amazon Alexa, Google Assistant, and Cortana by Microsoft. We covered rational perceptions of the category, emotional reactions to experiences using virtual assistants, and perceptions of the “typical” user of Siri, Alexa, Google Assistant, and Cortana.

We then ran fancy math™ on our data to create a model to predict the likelihood of a virtual assistant “category rejecter” (i.e., someone who has never tried a VA before) to try any one of those assistants in the future. Our analysis indicates that how much a current VA category rejecter relates to their image of the type of person who uses a virtual assistant is the number one predictor of whether they are likely to try the technology in the future:

[Chart: Predictors of likelihood to try a virtual assistant]
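
For the curious, here is a rough sketch of what this kind of driver model can look like under the hood. This is not CMB’s actual model; the data, the three predictors, and the logistic-regression approach are assumptions made purely for illustration:

```python
# A minimal sketch of a "likelihood to try" driver model. This is NOT CMB's
# actual model; the data, scales, and logistic-regression approach are
# assumptions made purely for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical respondent-level data: perception ratings (1-7) plus a binary
# flag for whether a category rejecter says they are likely to try a VA.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "relatability": rng.integers(1, 8, n),
    "clarity": rng.integers(1, 8, n),
    "desirability": rng.integers(1, 8, n),
})
# Fake outcome, loosely tied to relatability so the example has a signal.
df["likely_to_try"] = (df["relatability"] + rng.normal(0, 2, n) > 5).astype(int)

features = ["relatability", "clarity", "desirability"]
X = StandardScaler().fit_transform(df[features])
model = LogisticRegression().fit(X, df["likely_to_try"])

# Standardized coefficients as a rough read on relative driver importance.
importance = pd.Series(model.coef_[0], index=features)
print(importance.sort_values(ascending=False))
```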

Unfortunately for the industry, category rejecters do not find the typical VA user very relatable.

[Chart: AffinID metrics by brand]

As the chart indicates, relatability (the biggest predictor of likelihood to try, as shown above) scores the lowest of the three components of AffinID: relatability, clarity, and desirability. You may ask yourself: “Are scores of 12 to 14 ‘good’ or ‘bad’?” They’re bad, trust me. We’ve now run AffinID on hundreds of brands across dozens of industries, so we have a formidable normative database against which to compare brands. The VA category does not fare well on relatability, and it matters.

Some brands’ VA ads, while amusing, are not very relatable to “normal” mainstream consumers. For example, as my colleague Erica Carranza points out in her recent blog, the Siri ad featuring Dwayne “The Rock” Johnson doing impossibly awesome things in one day (including taking a selfie from outer space) with the help of Siri isn’t exactly a “normal” person’s day. A-grade for amusement on this one, but it plays into an existing perception problem.

Stereotypes about users’ age and income are currently keeping “rejecters” away from the virtual assistant category.

The age gap between rejecters and “typical” virtual assistant users is a social identity construct keeping rejecters out of the category. Current rejecters, not surprisingly, skew older while current heavy VA users, also not surprisingly, skew young.

We uncovered this disconnect with a big predictive model using “match analysis” on a variety of demographic, personality, and interest attributes. For every attribute, we examined whether there was a “match” or a “disconnect” between how a rejecter described themselves vs. how they perceived the typical user of a virtual assistant brand.
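
Here is a simplified sketch of how those match/disconnect flags might be constructed. The column names and attribute bands are invented, and the real study covered many more attributes than the two shown here:

```python
# A simplified sketch of the "match analysis" idea: for each attribute, flag
# whether a respondent's self-description matches how they describe the
# "typical" user. Column names and bands are invented for illustration.
import pandas as pd

respondents = pd.DataFrame({
    "self_age_band":         ["18-34", "35-54", "55+"],
    "perceived_age_band":    ["18-34", "18-34", "18-34"],
    "self_income_band":      ["<50k", "50-100k", "50-100k"],
    "perceived_income_band": ["100k+", "100k+", "50-100k"],
})

for attr in ["age_band", "income_band"]:
    respondents[f"{attr}_match"] = (
        respondents[f"self_{attr}"] == respondents[f"perceived_{attr}"]
    ).astype(int)

# These binary match/disconnect flags then become inputs to a predictive
# model of future consideration, alongside other demographics and perceptions.
print(respondents[["age_band_match", "income_band_match"]])
```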

The two specific perceptions with the greatest ability to predict a rejecter’s likelihood to consider using a brand in the future were an age-range match and an income-range match. For example, if I’m over 35 years old (hypothetically!) and I perceive the “typical” user to be under 35 and higher-income than me…so what? Well, it does matter. For new technologies to achieve mainstream adoption, they must debunk the widespread perception that the typical user is “young” and highly affluent, and show that their product can be used by everyone (think: Facebook). SNL pokes fun at this generational discrepancy.

But in all seriousness, if a virtual assistant brand wants to achieve more mainstream adoption among older demographics, the brand gurus and creative teams working on campaigns need to tackle this head on.

And they must try to do this—ideally—without alienating the original early adopter group that made them their first million (think: Facebook, again…how many Gen Zers do you know who actually use it actively?). I—prototypical 45-year-old suburban dad—can’t imagine using Snapchat, for instance. If Snapchat wanted to get me and my tribe to buy in as avid users*, it would need to convince me that Snapchat isn’t just for teens and early twenty-somethings. Or it would need to launch a different brand/product targeted specifically at my tribe and market it appropriately.

It’s worth noting there are other social identity constructs that help predict whether a non-user of a virtual assistant is likely to try a product in the future. For instance, the few VA category rejecters who perceive the typical (young, affluent) user as being as “responsible/reliable” as themselves are more open to trying a VA in the future than those who do not perceive VA users this way. So, we’re seeing a stereotype that virtual assistant products are for young, affluent professionals living in a major coastal city with no kids to contend with yet, and this is turning some consumer segments off from trying the category in earnest.

Stay tuned to this channel for more on our study of the virtual assistant category. I’ll be covering some key insights we got by applying our emotional impact analysis, EMPACT℠, to the same issue of what virtual assistant brands should be doing to achieve further adoption and more mainstream usage of their products.

*I am more than 95% confident that the Snapchat brand gurus do not want me as an avid user…and my ‘tween daughter would definitely die of embarrassment if I ever joined that particular platform and tried to communicate with her that way.

 

Topics: technology research, EMPACT, Consumer Pulse, AffinID, Artificial Intelligence

Robo-Advisors Aren't Your Father's Financial Advisor

Posted by Lori Vellucci

Tue, Dec 12, 2017

Back in the day, if you had a little money to invest, you called up the brokerage firm your dad used, you talked to his “guy,” and you asked him to invest your money for you. Those days aren’t totally gone, but over the last few years new technology has disrupted the traditional investor-client relationship—resulting in more ways than ever to invest your money yourself.

We all remember the iconic E*TRADE baby from way back in 2013. E*TRADE’s campaign brought the online discount brokerage model for self-directed investors into the mainstream. Since then, more DIY investment platforms have cropped up, each vying for the modern self-directed investor’s business. But one important learning from the DIY trend of the past decade is that even though this model lends itself to independent investing, DIY investors still need some type of investment help.

Robo-advisors: The rise of AI in finance

The first robo-advisor was released in 2008 to help these new investors make smart money choices. For the most part, early DIY investors didn’t have a formal finance background, so robo-advisors offered them portfolio management services and insights that were once reserved for high-net-worth individuals—at a fraction of what a traditional human financial advisor might charge. It was a gamechanger.

Robo-advisor technology continues to shape the financial services industry, with big players like Charles Schwab and Ameritrade each launching their own in the last few years. This growing interest and investment in robo-advisory technology is great for DIY investors and offers a ton of opportunity for traditional financial firms to be on the cutting edge of FinTech.

Given the changing landscape, we wanted a better understanding of investor perceptions of robo-advisor clients.  Through our 2017 Consumer Pulse, we surveyed 2,000 US adults about FinTech, traditional financial services firms, and who they perceived as the technologies' typical user.

Who's using robo-advisors?

[Chart: The typical robo-advisor user]

CMB’s AffinID (a measure of social identity’s influence on consumers) scores for this FinTech offering indicate that all three components of AffinID (clarity, relatability, and social desirability) could stand improvement within the investor community. Relatively speaking, relatability is weakest: people have a clear image of what the typical robo-advisor user is like, and that image is socially desirable, but they don’t view the typical user as part of their “tribe”.

The inability of investors to relate to their image of the typical robo-advisor user sheds light on a potential roadblock. Robo-service providers targeting traditional investors might consider messaging that conveys a typical user more closely aligned with the “traditional investor image”.

What emotions are driving use?

We found that robo-advisor users themselves are driven by feelings of being smart, wise, and savvy; efficient, practical, and productive. Inspiration and motivation are also key emotional drivers for robo-advisor services.

[Chart: Emotions that drive robo-advisor usage]

Why does this matter? It tells us what brands looking to differentiate themselves in a crowded FinTech market could be doing to attract more customers. These emotional drivers could be important messaging elements for those companies looking to court new money from traditional investors.

Are robo-advisors the next "big thing" in FinTech?

[Chart: FinTech adoption curve]

Three quarters of robo-advisor users consider themselves early adopters, in contrast with users of mobile wallets and online-only banking, two technologies that have entered the mainstream. As traditional financial service providers make considerable investments in driving robo-advisor adoption, our findings show that it’s critical to understand both how consumers want to feel and how they perceive and relate to their image of the typical user.

Interested in learning more?

Our comprehensive FinTech study also looked at online-only investment apps, online-only banking, and mobile wallets. Download a sneak peek of our findings from all four in our Facing the FinTech Future series:

Topics: financial services research, Identity, AffinID, Artificial Intelligence, BrandFx

I, for one, welcome our new robot...partners

Posted by Laura Dulude

Tue, Oct 17, 2017

 


Ask a market researcher why they chose their career, and you won't hear them talk about prepping sample files, cleaning data, creating tables, and transferring those tables into a report. These tasks are all important parts of creating accurate and compelling deliverables, but the real value and fun is deriving insights, finding the story, and connecting that story to meaningful decisions.

So, what’s a researcher with a ton of data and not a lot of time to do? Hello, automation!

Automation is awesome.

There are a ton of examples of automation in market research, but for these purposes I'll keep it simple. As a data manager at CMB, part of my job is to proofread banner tables and reports, ensuring that the custom deliverables we provide to clients are 100% correct and consistent. I love digging through data, but let’s be honest, proofing isn’t the most exciting part of my role. Worse than a little monotony is that proofing done by a human is prone to human error.

To save time and avoid error, I use Excel formulas to compare two data lists and automatically flag any inaccuracies. This is much more accurate and quicker than checking lists against one another manually—it also means less eye strain.
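
For readers who prefer a scripted version of that kind of check, here is a minimal sketch of the same idea in Python (my actual workflow uses Excel formulas; the data and column names below are made up):

```python
# Sketch of the same proofing idea in Python rather than Excel formulas:
# compare values pulled from banner tables against values typed into the
# report and flag any mismatches. The data and column names are made up.
import pandas as pd

banner_tables = pd.DataFrame({"question": ["Q1", "Q2", "Q3"], "value": [45, 32, 18]})
report_values = pd.DataFrame({"question": ["Q1", "Q2", "Q3"], "value": [45, 31, 18]})

merged = banner_tables.merge(report_values, on="question", suffixes=("_tables", "_report"))
merged["mismatch"] = merged["value_tables"] != merged["value_report"]

# Anything flagged here gets a human look; the script only points, it doesn't decide.
print(merged[merged["mismatch"]])
```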

As I said, this is a really simple example of automation, but even this use case is an incredible way to increase efficiency so I have more time to focus on finding meaning in the data.

Other examples include:

  • Reformatting tables for easier report population using Excel formulas
  • Creating Excel macros using VBA
  • SPSS loops and macros

I’m a huge proponent of automation, whether in the examples above or in myriad more complex scenarios. Automation helps us cut out inefficiencies and gives us time to focus on the cool stuff.

Automation without human oversight? Not awesome.

Okay, so my proofreading example is quite basic because it doesn’t account for:

  • Correctness of labels
  • Ensuring all response options in a question are being reported on
  • Noting any reporting thresholds (e.g. only show items above 5%, only show items where this segment is significantly higher than 3+ other segments, etc.)
  • Visual consistency of the tables or report
  • Other details that come together to create a truly beautiful, accurate, and informative deliverable.

Some of the bullet points above can also be automated (e.g. thresholds for reporting and correctness of labels), but others can’t. On top of that, automation is also prone to human error—we can automate incorrectly by misaligning the data points or filtering and/or weighting the data incorrectly. Therefore, it’s imperative that, even after I automate, I review to catch any errors—flawless proofing requires a human touch.
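
As one example, here is a minimal sketch of automating a reporting-threshold check like the one described above (the table and the 5% threshold are illustrative):

```python
# Sketch of automating one of the checks listed above: enforcing a reporting
# threshold (e.g., only show items above 5%). The table and threshold are
# illustrative; a real check would run against the actual deliverable.
import pandas as pd

table = pd.DataFrame({"item": ["A", "B", "C"], "pct": [0.12, 0.04, 0.51]})
THRESHOLD = 0.05

violations = table[table["pct"] < THRESHOLD]
if not violations.empty:
    print("Items below the reporting threshold (should be suppressed):")
    print(violations)
```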

When harnessed correctly, automation maximizes efficiency, alleviates tediousness, and reduces error to free up more time for insights. Before you start arming yourself against a robot takeover, remember: insights are an art and a science, and machines haven’t taken over the world just yet.

Topics: quantitative research, Artificial Intelligence, Market Research Automation

AI, AI, AI! What next?

Posted by Brant Cruz

Wed, May 31, 2017

People who know me are well aware I occasionally like to spin a tall tale. The routine is standard: I start with a barely believable premise, and if I see someone taking the bait, I keep adding ridiculous layers until my mark finally figures it out.

The other day started in similar fashion. Chris Neal (a colleague of mine) and I were asked by another colleague if our Silicon Valley clients were chanting this article’s mantra, “Mobile First to AI First.” The real answer isn’t a simple “yes” or “no” (more on that in a bit). But in addition to answering the question, I decided to spin one of my famous yarns. I won’t bore you with the details, but the yarn evolved into me admitting that I was about to strike it rich from investing in an MIT start-up that created AI-based robot leggings. Further, I’d sport those leggings while running the 2018 Boston Marathon as a publicity stunt.

I’m 5’9” (on a tall day) and 275 lbs. (after 24 hours of fasting). 

My only hesitation (according to the story) was that my wife was concerned my heart wouldn’t make it beyond the first mile and was greedily reviewing the details of my life insurance policy. 

Note: When my colleague reads this blog post, it will be the first time he or she realizes I was only pulling his/her leg. 

For the last few days, I’ve been basking in the satisfaction that only those with my genetic mutation feel. But that reflection has made me think–is my tale really that tall? The truth is, while neither Chris nor I hear “AI First” as universally and consistently cited as “Mobile First” was five years ago, AI is permeating strategy discussions at all major tech companies as they become more focused on the business opportunity it represents.

And a lot of them are struggling to answer key questions. Where does AI “live” organizationally? Does it deserve its own category of products/apps, or should it remain a concept that permeates nearly every project across departments? Other foundational questions include who has the subject matter expertise to adequately advise on insights in this category, and how to market something this new (and, to some, scary) to the right audiences in a way that is compelling and easy to understand.

In my own experience, I can say that many consumers are ready for the realization of AI. Based on our recent work with Anki for their amazing robot Cozmo, consumers in millions of US households are excited to use AI in everything from fun to productivity. And, related to my colleague’s enthusiasm for my fictitious running suit, consumers in 8.8 million US households strongly agree with the statement “Tech toys/gadgets/robots make me feel closer to the future I’ve envisioned.”

We’re also wrapping up a self-funded research study examining the barriers to and opportunities for getting coveted groups like Millennials and Gen Z to use Intelligent Personal Assistants (IPAs—think Siri). Needless to say, AI is no longer a peripheral concept—it’s very much on the minds of consumers and brands alike. If you aren’t already, subscribe to our blog so you don’t miss a series of AI-inspired blog articles once we release our study’s findings.

In this context, I guess my MIT “get rich” story really wasn’t too far from believability. It’s possible that engineers at Nike or Under Armour are measuring up some other husky market researcher for a set of robotic leggings for some incredible athletic feat. Regardless, I’m excited about the possibilities–though my tastes tend more towards self-driving lawn mowers. 

Brant is CMB’s ecommerce and Digital Media Practice Leader, and will be co-presenting the aforementioned work with Anki at the Insights Association Northwest Educational Summit in San Francisco on June 8. In his near-future spare time he can be found hiding under his desk, avoiding his previously unsuspecting colleague. 

Are you registered for the Northwest Educational Summit on June 8 in San Francisco? If so, click here to receive our latest webinar and connect with one of our lead researchers.

Not going but still interested in learning about how Anki leverages emotions and identity to adapt, innovate, grow, and stay consumer-centric? Click here!

Topics: growth and innovation, Artificial Intelligence