WELCOME TO OUR BLOG!

The posts here represent the opinions of CMB employees and guests—not necessarily the company as a whole. 


Why We Still Use Facebook Despite Privacy Concerns

Posted by Kate Zilla-Ba

Wed, May 23, 2018


The kids are on Insta and Snapchat, and even as those are a bit dated at this point, the question for Facebook seems to be how to stay relevant. But recent revelations about how Facebook had been using customer data have led to less of a backlash than might be expected. Why?

Complacency.

We humans tend to normalize things. What was scandalous the first time around becomes less so over time, until we just expect it.

Why are we so complacent about Facebook? Well, for starters, we live on it. How many posts are things the poster could google but instead feels the need to ask the town group: when is the next trash pickup, how do I get rid of the dead squirrel in front of my house, and so on. Facebook has become part of nearly 1.5 billion people’s daily lives around the world, suggesting most people have become okay (complacent) with how social media might be collecting, sharing, and using our information.

And this shouldn’t be that surprising. Back in 2010 (and likely earlier), Mark Zuckerberg was clear that he felt privacy was no longer a social norm, saying, “People have gotten really comfortable not only sharing more information and different kinds, but more openly and with more people.”

We’ve come to expect less and less privacy, which is directly associated with the continued growth of Facebook usage—despite what happened with Cambridge Analytica.

We still care about privacy. Just not as much. Maybe if what happened with Cambridge Analytica felt individually targeted and less like a mass invasion of privacy, we’d care more. But it happened to a lot of us, so what’s a user to do? Keep posting and sharing seems to be the net answer. Probably even in the EU, where strong consumer concerns about privacy are driving the EU to set standards for the world with GDPR.

Understanding the psychology of consumers, such as Facebook users, is core to market research. What are they seeking to achieve? How do they express that in attitudes and actions? What barriers exist, and what catalysts motivate them?

In addition to the normalization of information-sharing, users continue to use Facebook because of the functional, emotional, and identity benefits the brand provides. Facebook is a space where users can feel connected to other people (social identity), find and share content that animates them (emotion), and handle practical tasks like selling a couch or getting recommendations for a reliable plumber (functional). Facebook has done an incredible job providing the right balance of these three benefits, which we know are key drivers of customer loyalty and advocacy.

Knowing all we know, we still use Facebook. We’ve normalized making public details about our lives we would’ve considered “private” 10 years ago—and that doesn’t look like it’s going to change anytime soon. For most of us, the benefits Facebook provides outweigh privacy concerns.

Will you share this when I post it? Like it? Love it? (No sad faces please!)

Kate is a FB user who loves to keep up with old friends and family but rarely posts. Her research background helps keep her eyes wide open (or so she thinks) to the privacy she has given away.

Topics: data privacy, social media, consumer psychology, big data, consumer insights

Why the Market Research Industry Must Stand up for the Census

Posted by Athena Rodriguez

Wed, Aug 23, 2017


You might be forgiven if the future of the U.S. Census didn’t make your “list of things to worry about this week”. But a lack of funding coupled with the recent resignation of Census Bureau director John Thompson has put the 2020 census in danger—and the ramifications are deeply concerning.

The U.S. Census Bureau might not get the media coverage of other government entities, but it plays a critical role in our democracy, federal spending, and in the market research industry. As we prepare for the 2020 census, it’s time to start paying attention.

The U.S. Census

As a reminder, the U.S. Census, mandated by the Constitution, is a decennial survey that counts every resident in the United States. The data is used to allocate Electoral College votes and congressional seats by state. In addition, it helps the government determine how to allocate hundreds of billions of dollars in federal funds each year to local communities, paying for infrastructure like schools, hospitals, roads, public works, and other vital programs. The U.S. Census Bureau also administers the monthly American Community Survey, made up of the long-form census questions and sent to about 295,000 households a month. You can read more about the work of the Bureau and how the data are used here.

A New Collection Methodology Puts the Census in Danger

Replicating the 2020 census using the 2010 methodology would cost $17.8 billion, but Congress has mandated that the Census Bureau limit spending to meet the 2010 census budget ($13 billion over ten years).   

To comply, the Bureau hoped to implement a new system that adds online and phone data collection to the existing mail and in-person visits, ultimately keeping costs in line. However, any change in methodology requires rigorous planning and testing to ensure results are accurate and replicable. For example, when moving a brand tracker from phone to web, you typically run tandem data collections (both via phone and online) for the first wave and then compare the results. This testing requires extra work and may initially cost more, but it’s critical to ensure the results from the new methodology are comparable, and it saves money in the long run.
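To make the tandem idea concrete, here’s a minimal sketch of how you might compare the two modes; the counts are hypothetical, and a two-proportion z-test is just one reasonable way to check for a mode effect:

```python
# Minimal sketch of a tandem-wave mode comparison (hypothetical counts).
# The same brand-tracker question is fielded by phone and by web in the
# same wave; we test whether the two modes yield comparable results.
from statsmodels.stats.proportion import proportions_ztest

phone_favorable, phone_completes = 312, 1000  # hypothetical phone wave
web_favorable, web_completes = 287, 1000      # hypothetical web wave

stat, p_value = proportions_ztest(
    count=[phone_favorable, web_favorable],
    nobs=[phone_completes, web_completes],
)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Modes differ; investigate before switching methodologies.")
else:
    print("No significant mode effect detected at the 5% level.")
```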

The scope and costs of the census far exceed my brand tracker example, and given the uncertainty of the census budget, it’s unclear whether the Census Bureau will be able to properly test its new methodology before implementation. If funding isn’t there for testing, the Bureau runs the risk of missing the mark.

The end-to-end census test is still slated for 2018, but the prerequisite field tests that were to run this year have been cancelled. The Bureau hopes to include the areas from the cancelled field tests, but that’s still up in the air.

The US Census and Market Research

The US Census serves as the backbone for all consumer market research. I don’t think it’s an exaggeration to say that here at CMB, we use the census on a weekly basis, if not more often, for designing sampling plans, weighting data, sizing audiences, and recommending who to target. You’d be hard pressed to find a research firm that doesn’t use census information to inform its work.
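As one concrete illustration of those tasks, here’s a minimal post-stratification weighting sketch; both the sample and the census age distribution are invented for the example:

```python
import pandas as pd

# Hypothetical survey sample, skewed young relative to the population.
sample = pd.DataFrame({
    "respondent_id": range(1, 7),
    "age_group": ["18-34", "18-34", "18-34", "35-54", "35-54", "55+"],
})

# Invented census proportions for the same age groups.
census_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Post-stratification: weight = population share / sample share.
sample_share = sample["age_group"].value_counts(normalize=True)
sample["weight"] = sample["age_group"].map(
    lambda g: census_share[g] / sample_share[g]
)

# The weighted age distribution now matches the census benchmarks.
print(sample.groupby("age_group")["weight"].sum() / sample["weight"].sum())
```

If the census benchmarks themselves are off, every weight computed this way inherits the error, which is exactly why an undercount would ripple through so much research.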

To that end, if the census is flawed by an undercount (resulting from a poorly-tested methodology), these errors will be reproduced in most consumer market research studies. As researchers, we’d begin to question the foundation upon which much of our research is built—as would the many businesses that use our services.

The Larger Picture

If the census is underfunded, the undercount would most likely impact areas where residents are harder to reach (think lower socio-economic groups less likely to have internet access, rural populations, transient populations like seasonal workers, etc.). These areas—the very communities that need funding the most—could be deprived of vital federal funds due to disproportionate allocations.

In addition to faulty fund allocation, an underfunded, undercounted census could produce a misrepresentation of seats in our House of Representatives. In this charged political environment where everyone’s vying to be heard, it’s more important than ever to ensure we are properly represented.

What’s Next?

2020 may seem far off, but if Congress doesn’t properly fund the census now, while there’s still time for testing, we run the risk of executing a bad census, one that misrepresents the population, unfairly allocates resources, and undermines the quality and credibility of market research. I strongly encourage the market research community to stand up and make their voices heard to preserve this important institution.

 Athena is a Project Director at CMB who wants to see her daughter grow up in a world where the US Census is accurate. 

Topics: Market research, B2B research, big data

Flying High on Predictive Analytics

Posted by Amy Maret

Thu, Jul 27, 2017

Buying a plane ticket can be a gamble. Right now, it might be a good price, but who’s to say it won’t drop in a day—a week? Not only that, it may be cheaper to take that Sunday night flight instead of Monday morning. And oh—should you fly into Long Beach or LAX? As a frequent traveler (for leisure and work!) and deal seeker, I face dilemmas like these a lot.

The good news is that there are loads of apps and websites to help passengers make informed travel decisions. But how? How can an app—say, Hopper—know exactly when a ticket price will hit its lowest point? Is it magic? Is there a psychic in the backroom predicting airline prices with her crystal ball?

Not quite.

While it seems like magic (especially when you do land that great deal), forecasting flight prices all comes down to predictive analytics—identifying patterns and trends in a vast amount of data. And for the travel industry in particular, there’s incredible opportunity to use data in this way. So, let’s put away the crystal ball (it won’t fit in your carry-on) and look at how travel companies and data scientists are using the tremendous amount of travel data to make predictions like when airfare will hit its lowest point.

In order to predict what will happen in the future (in this case, how airfare may rise and fall), you need a lot of data on past behaviors. According to the Federal Aviation Administration (FAA), there are nearly 24,000 commercial flights carrying over two million passengers around the world every day. And for every single one of those travelers, there’s a record of when they purchased their ticket, how much they paid, what airline they’re flying, where they’re flying to/from, and when they’re traveling. That’s a ton of data to work with!

As a researcher, I get excited about the endless potential for how that amount of historical data can be used. And I’m not the only one. Companies like Kayak, Hopper, Skyscanner, and Hipmunk are finding ways to harness travel data to empower consumers to make informed travel decisions. To quote Hopper’s website: their data scientists have compiled data on trillions of flight prices over the years to help them make “insightful predictions that consistently perform with 95% accuracy”.

While the details of Hopper’s approach are intentionally vague, we can assume that their team is using data mining and predictive analytics techniques to identify patterns in flight prices. Then, based on what they’ve learned from these patterns, they build algorithms that let customers know when the best time to purchase a ticket is—whether they should buy now or wait as prices continue to drop leading up to their travel date. They may not even realize it, but in a way those customers are making data-driven decisions, just like the ones we help our clients make every day.
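Hopper’s actual models are proprietary, so here is only a toy version of the general idea: fit a model to historical fares (synthetic in this sketch), predict the price curve ahead of departure, and flag the cheapest point:

```python
# Toy flight-fare prediction sketch with synthetic data; real systems
# use far richer features (route, airline, seasonality, demand).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic history: fares tend to rise as departure approaches.
days_out = rng.integers(1, 120, size=5000)
fares = 450 - 1.8 * days_out + rng.normal(0, 25, size=5000)

model = GradientBoostingRegressor().fit(days_out.reshape(-1, 1), fares)

# Predict the fare curve over the 60 days before departure.
horizon = np.arange(1, 61).reshape(-1, 1)
curve = model.predict(horizon)
best_day = int(horizon[curve.argmin()][0])
print(f"Predicted cheapest point: {best_day} days out (~${curve.min():.0f})")
```

A real system would also quantify its uncertainty before telling anyone to wait; a wrong “wait” recommendation costs the customer money.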

As a Market Researcher, I’m all about leveraging data to make people’s lives easier. The travel industry’s use of predictive modeling is mutually beneficial—consumers find great deals while airlines enjoy steady sales. My inner globetrotter is constantly looking for ways to travel more often and more affordably, so as I continue to discover new tools that utilize the power of data analytics to find me the best deals, I’m realizing I might need some more vacation days to fit it all in!

So the next time you’re stressed out about booking your next vacation, just remember: sit back, relax, and enjoy the analytics.

Amy M. is a Project Manager at CMB who will continue to channel her inner predictive analyst to plan her next adventure.

Topics: predictive analytics, big data, travel and hospitality research

Big Data Killed the Radio Star

Posted by Mark Doherty

Wed, Jun 29, 2016

It’s an amazing time to be a music fan (especially if you have all those Ticketmaster vouchers and a love of ’90s music). While music production and distribution were once controlled by record label and radio station conglomerates, technology has “freed” music in almost every way. It’s now easy to hear nearly any song ever recorded thanks to YouTube, iTunes, and a range of streaming sources. While these new options appear to be manna from heaven for music lovers, they can actually create more problems than you’d expect. The never-ending flow of music options can make it harder to decide what might be good or what to play next. In the old days (way back in 2010 :)), your music choices were limited by record companies and by radio station programmers. While these “corporate suits” may have prevented you from hearing that great underground indie band, they also “saved” you from thousands of options that you would probably hate.

That same challenge is happening right now with marketers’ use of data. Back in the day (also around 2010), there was a limited number of data sets and sources to leverage in decisions relating to building/strengthening a brand. Now, that same marketer has access to a seemingly endless flow of data: from web analytics, third-party providers, primary research, and their own CRM systems. While most market information was previously collected and “curated” through the insights department, marketing managers are often now left to their own devices to sift through and determine how useful each set of data is to their business. And it’s not easy for a non-expert to do due diligence on each data source to establish its legitimacy and usefulness. As a result, many marketers are paralyzed by a firehose of data and/or end up trying to use lots of not-so-great data to make business decisions.

So, how do managers make use of all this data? It’s partly the same way streaming sources help music listeners decide what song to play next: predictive analytics. Predictive analytics is changing how companies use data to get, keep, and grow their most profitable customers. It helps managers “cut through the clutter” and analyze a wide range of data to make better decisions about the future of their business. It’s similarly being used in the music industry to help music lovers cut through the clutter of their myriad song choices and find their next favorite song. Pandora’s Music Genome Project is doing just that by developing a recommendation algorithm that serves up choices based on the attributes of the music you’ve listened to in the past. Similarly, Spotify’s Discover Weekly playlist is a huge hit with music lovers, who appreciate Spotify’s assistance in identifying new songs they may love.
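The real Music Genome Project scores songs on hundreds of musicologist-rated attributes; the sketch below invents just four, but the core recommendation step can be as simple as cosine similarity between a song’s attribute vector and a listener’s profile:

```python
import numpy as np

# Invented attribute scores (energy, acousticness, tempo, vocals).
catalog = {
    "Song A": np.array([0.9, 0.1, 0.8, 0.7]),
    "Song B": np.array([0.2, 0.9, 0.3, 0.5]),
    "Song C": np.array([0.8, 0.2, 0.9, 0.6]),
}
listener = np.array([0.85, 0.15, 0.85, 0.65])  # profile from past listens

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Recommend the catalog song most similar to the listener's profile.
best = max(catalog, key=lambda song: cosine(catalog[song], listener))
print(f"Next up: {best}")
```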

So, the next time you need to figure out how to best leverage the range of data you have—or find a new summer jam—consider predictive analytics.

Mark is a Vice President at CMB, and he’s fully embracing his reputation around the office as the DJ of the Digital Age.


Topics: big data, advanced analytics, predictive analytics, data integration

Dear Dr. Jay: Can One Metric Rule Them All?

Posted by Dr. Jay Weiner

Wed, Dec 16, 2015

Hi Dr. Jay –

The city of Boston is trying to develop one key measure to help officials track and report how well the city is doing. We’d like to do that in house. How would we go about it?

-Olivia


Hi Olivia,

This is the perfect tie-in for big data and the key performance indicator (KPI). Senior management doesn’t really have time to pore over tables of numbers to see how things are going. What they want is a nice barometer that can be used to summarize overall performance. So, how might one take data from each business unit and aggregate them into a composite score?

We begin the process by understanding all the measures we have. Once we have assembled all of the potential inputs to our key measure, we need to develop a weighting system to aggregate them into one measure. This is often the challenge when working with internal data. We need some key business metric to use as the dependent variable, and these data are often missing in the database.

For example, I might have sales by product by customer and maybe even total revenue. Companies often assume that the top revenue clients are their bread and butter. But what if your number one account uses way more corporate resources than any other account? If you’re one of the lucky service companies, you probably charge hours to specific accounts and can easily determine the total cost of servicing each client. If you sell a tangible product, that may be more challenging. Instead of sales by product or total revenue, your business decision metric should be the total cost of doing business with the client or the net profit for each client.

It’s unlikely that you capture this data, so let’s figure out how to compute it. Gross profit is easy (net sales – cost of goods sold), but what about other costs like sales calls, customer service calls, and product returns? Look at other internal databases and pull information on how many times your sales reps visited in person or called over the phone, and get an average cost for each of these activities. Then, you can subtract those costs from the gross profit number. Okay, that was an easy one.
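Here’s that computation as a minimal sketch, with invented per-client figures and assumed average costs per sales visit and service call:

```python
import pandas as pd

# Invented per-client figures: revenue, COGS, and service activity.
clients = pd.DataFrame({
    "client": ["A", "B", "C"],
    "net_sales": [500_000, 300_000, 150_000],
    "cogs": [320_000, 180_000, 90_000],
    "sales_visits": [40, 10, 5],
    "service_calls": [120, 30, 10],
})
VISIT_COST, CALL_COST = 500, 75  # assumed average cost per activity

clients["gross_profit"] = clients["net_sales"] - clients["cogs"]
clients["service_cost"] = (clients["sales_visits"] * VISIT_COST
                           + clients["service_calls"] * CALL_COST)
clients["net_profit"] = clients["gross_profit"] - clients["service_cost"]
print(clients[["client", "gross_profit", "net_profit"]])
```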

Let’s look at the city of Boston case for a slightly more challenging exercise. What types of information is the city using? According to the article you referenced, the city hopes to “corral their data on issues like crime, housing for veterans and Wi-Fi availability and turn them into a single numerical score intended to reflect the city’s overall performance.” So, how do you do that? Let’s consider that some of these things have both income and expense implications. For example, as crime rates go up, the attractiveness of the city drops and it loses residents (income and property tax revenues drop). Adding to the lost revenue, the city has the added cost of providing public safety services. If you add up the net gains/losses from each measure, you would have a possible weighting matrix to aggregate all of the measures into a single score. This allows the mayor to quickly assess changes in how well the city is doing on an ongoing basis. The weights can be used by the resource planners to assess where future investments will offer the greatest payback.
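A minimal sketch of that weighting logic, with invented performance measures and invented dollar impacts standing in for the city’s real data:

```python
# Composite city score: weight each measure by its estimated net
# dollar impact, then roll everything up into a single number.
measures = {                 # normalized 0-100 performance scores
    "crime_rate": 72,
    "veteran_housing": 85,
    "wifi_availability": 60,
}
dollar_impact = {            # assumed net gain/loss per measure ($M)
    "crime_rate": 40,
    "veteran_housing": 25,
    "wifi_availability": 10,
}

total_impact = sum(dollar_impact.values())
weights = {k: v / total_impact for k, v in dollar_impact.items()}
city_score = sum(measures[k] * weights[k] for k in measures)
print(f"Composite city score: {city_score:.1f}")
```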

 Dr. Jay is fascinated by all things data. Your data, our data, he doesn’t care what the source. The more data, the happier he is.

Topics: advanced analytics, Boston, big data, Dear Dr. Jay