What if Market Research Fails To Grasp Mobile? A Lesson from Retailers.

Posted by admin on Aug 3, 2017 9:42:47 AM

 

[Image: smartphone in a grocery store]

 

What’s in store for stores? Different major retail chains are coming up with markedly different answers as they try to plan and strategize in the face of a landscape that’s been forever changed by e-commerce. Some of their shoppers now click to buy instead of coming to the store, and those who do shop in-store are arriving with smartphones in hand, giving them an unprecedented edge in product and price comparisons.

 

A report on the Top 100 retailers of 2016 from Stores, the magazine of the National Retail Federation, gives a solid overview of how operators of bricks-and-mortar stores are trying to find their bearings in the Smartphone Era. The insights industry itself is no stranger to changes wrought by the fact that phones are the combination tool and toy that U.S. consumers turn to whenever they need information or want to access useful or enjoyable content.

 

The Stores magazine article duly notes the strategic retreat that’s going on in retail, but it focuses mainly on how retailers are imagining the future and what to do about it. Here are some of its most interesting insights. We'll leave it to you to draw any parallel insights about market research.

  • “The country is ‘grossly overstored’ and...a third of the weakest retail locations should be shut down,” one expert told the magazine. “This very painful process will surely take more than five years.”
  • However, the same observer says, strategic scaling back “will also create enormous opportunity for those with the capital and management platforms” to adjust smartly.
  • “Companies should be in the business of creating the future and not simply responding to what pundits and polls think their customers are looking for,” advises another expert who urges taking new initiatives even as retail contracts.
  • The report notes Target’s three-year, $7 billion initiative to position itself for growth by building or remodeling stores, upgrading its supply chain, and developing more private-label merchandise. Meanwhile, it's willing to incur extra current costs and forgo some current profits by budgeting $1 billion in discounts to drive more e-commerce sales.
  • Costco, on the other hand, isn’t planning a special emphasis on e-commerce, which one expert says now accounts for about 4% of its sales. The priority, according to Costco’s chief financial officer, is “in-store, getting members coming in and buying when they can see everything that we have.”
  • Smartphones are the new, transformative fact that all retailers have to reckon with. The product knowledge in-store shoppers bring through the door has “changed dramatically,” says an expert from shopping center owner Starwood Capital Group.
  • But, he adds, that doesn’t have to be a bad thing. “The important thing to remember with e-commerce is that it is a tool to help shoppers do things more conveniently in more places than before” -- and that bricks-and-mortar stores will remain important to mobile shoppers.

One thing that’s certain is that quality consumer data is essential for decision-makers to chart their companies' paths successfully as they face tech-driven turbulence in their markets. And insights professionals need to adapt to the same big changes in technology, or risk being seen as irrelevant or even counterproductive to the decision-making process. The dominance of mobile communications is driving these changes, so researchers need to become mobile-savvy if they're going to adapt. You probably already know this. But how should you start?

 

The first thing you need is a clear understanding that mobile research is not a commodity -- and that there are radical distinctions between the two main approaches. One approach is in-app mobile research, which fully embraces the new realities and opportunities smartphones present. The other is a “mobile optimized” approach that remains wedded to last-generation online panels and methodology. If online data is giving you trouble, so will "mobile optimized" data. That's why you're looking for something else, something that works in our mobile-saturated consumer landscape. Let’s have a productive conversation about it. Just get in touch at solutions@mfour.com.

 

Topics: MFour Blog

There's a Lesson for MR in how Mobile Consumers Are Changing TV Sports

Posted by admin on Aug 2, 2017 10:18:21 AM

 

[Image: boxers]

 

 

Are smartphones a threat to pro athletes’ earnings, their leagues’ profits, and the future of legacy national and local TV outlets that traditionally have paid top dollar for sports’ ability to deliver real-time viewership?

 

The question is gaining urgency from the locker room to the owner’s skybox, not to mention the executive suites of broadcast, cable and satellite TV providers. Their struggle is captured in the title of a new report from Business Insider, “The Digital Disruption of Live Sports: A Deep Dive into the Fall of TV’s Most Lucrative Programming.”

 

Business Insider’s report summary says that live sports have been “traditional TV’s flagship bulwark against digital disruption.” But now that defense “appears to be in trouble” because “new media platforms” are taking audiences elsewhere.

 

Televised sports is hardly the only business that must reckon with the sudden arrival and pervasive appeal of mobile devices as the public’s favored interface with the world of information and communications. Online market research is at a similar inflection point. To borrow Business Insider’s phraseology, online surveys are also “in trouble” because “new media platforms” are undermining their ability to attract and engage a responsive audience.

 

In sports, Business Insider’s key takeaways sound like inescapable challenges to the status quo:

  • “The increasing cost of sports broadcast rights and, accordingly, the higher advertising rates for brands, is making the current live sports business model unsustainable."
  • "With the legacy live sports model in decline, social and digital video platforms are making large strides to acquire sports programming."
  • "Broadcasters will likely be forced to relinquish a slice of the lucrative revenue pie generated by live sports content.”

The parallel challenge that mobile market research poses to online research is clear. The audience (that is, consumers willing to take surveys) has moved to another platform – the smartphone. U.S. adults now spend an average of 2 ¼ hours per day accessing mobile media, rising to more than 3 hours among Millennials.

 

Clearly, businesses that need to know what consumers think, feel, want and do will have to commit their market research to mobile – just as their marketing departments are already, intuitively, committing marketing dollars to mobile. Making that transition is how market research will keep informing business decisions with the most reliable, representative, fraud-resistant and easily obtainable data. So the question concerning mobile research is no longer “will you?” but “will you get it right?”

 

And the answer is that of course you will -- as soon as you start learning about mobile research's best practices, and how they fit your particular needs. The first step is understanding that so-called “mobile optimized” approaches are really cosmetic half-measures that don't get you the data you need. Mobile best practices begin with in-app, offline mobile studies that will move you forward instead of bogging you down in a futile fight to maintain a status quo that's collapsing fast. To learn more, just contact us at solutions@mfour.com.

Topics: MFour Blog

Learn How In-App Mobile Surveys Protect You from Bot Fraud

Posted by admin on Aug 1, 2017 9:34:39 AM

 

[Image: bots]

 

You may know more about R2-D2 and C-3PO, the robot stars of “Star Wars,” than you do about the robots that suck the lifeblood out of online market research by fraudulently impersonating real, flesh-and-blood survey respondents.

 

Fraudsters create bots to feed on the rewards offered to online survey-takers. The longer a bot goes undetected, the more it deceives research providers and clients into believing they are getting real completes from real people. The consequence is the very definition of a double whammy: not only do clients waste money on the reward payments that go to the botmeisters for each fraudulent response, but they also incur the potentially much higher cost of basing analysis and business decisions on the bogus data the bots leave behind.

 

On a more encouraging note, insights professionals are starting to discuss the bot epidemic publicly, and open discussion is the first step toward prevention and cure. It's understandable that the main focus so far has been online research, because bots do their hunting online. But in-app mobile research also needs to be part of the conversation, because it moves the survey-taking process offline, where bots can’t follow.

 

Joe Hopper, founder of Versta Research, has made a strong contribution to the discussion in an article on his company’s blog entitled “How Many Bots Took Your Survey?” The answer, he writes, is “Almost certainly more than you think….if you are purchasing access to survey respondents from panel providers, or from survey software providers…you are probably getting fraudulent data from automated bots or from survey-taker farms.”

 

Hopper sees this as a severe challenge to data reliability and a real danger to survey-based market research. “For our most recent survey,” he wrote, “we sourced sample from the top, most expensive provider in the U.S. market, with all the usual assurances of double opt-in, identity verification, etc. We found fraud (the provider was horrified, as they should be).”

 

Hopper emphasizes the need for perpetual vigilance in monitoring individual survey responses for signs of fraud. He also recommends tracing the IP address and Internet Service Provider from which suspected bots have been launched, then permanently blocking them.
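In practice, that kind of vigilance can be partly automated. Below is a minimal sketch, in Python, of what screening incoming completes against a blocklist of previously flagged IP addresses and providers might look like. The field names, blocklist entries and helper functions are hypothetical illustrations, not any vendor's actual tooling.

```python
# Illustrative sketch only: screen survey completes against a blocklist of
# IP addresses and ISPs previously tied to suspected bot activity.
# Field names and blocklist contents are hypothetical.

SUSPECT_IPS = {"203.0.113.7", "198.51.100.42"}        # example addresses (documentation ranges)
SUSPECT_ISPS = {"ExampleCloudHost", "CheapVPS Inc."}   # hypothetical provider names

def looks_clean(response: dict) -> bool:
    """Return True if the complete passes the blocklist check, False if it should be flagged."""
    ip = response.get("ip_address", "")
    isp = response.get("isp", "")
    return ip not in SUSPECT_IPS and isp not in SUSPECT_ISPS

def quarantine(responses: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch of completes into clean data and flagged records for manual review."""
    clean = [r for r in responses if looks_clean(r)]
    flagged = [r for r in responses if not looks_clean(r)]
    return clean, flagged
```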

 

But the best and simplest bot-fighter is a trustworthy, validated all-mobile panel whose members all take surveys through a native mobile app. In-app research takes place offline, in a safety zone that stands apart from the online realm where bots roam freely. In a properly designed mobile survey, the entire questionnaire loads instantly onto respondents’ smartphones. They answer in the app instead of online, which means there's no need to stay connected to the bot-infested internet. Bots can impersonate online survey-takers, but they don't have smartphones and they can't download a legitimate survey app. MFour’s process begins with each panelist downloading the Surveys on the Go® app, which has been defining and advancing in-app mobile research since its debut in 2011.
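To make the offline distinction concrete, here is a minimal sketch of the general in-app pattern described above: the questionnaire is fetched once, answered locally with no live connection, and submitted later with the device's identifier attached. This is an illustration under assumed names, not MFour's actual implementation.

```python
# Illustrative sketch of the in-app, offline survey pattern described above.
# All names and structures are hypothetical, not any vendor's actual code.
import json
import time
import uuid

DEVICE_ID = str(uuid.uuid4())  # in a real app this would be a stable per-device identifier

def download_questionnaire(payload: str) -> dict:
    """One-time fetch: the whole questionnaire lands on the phone before answering starts."""
    return json.loads(payload)

def answer_offline(questionnaire: dict, answers: dict) -> dict:
    """Responses are assembled locally, with no live web session for a bot to exploit."""
    return {
        "device_id": DEVICE_ID,           # ties the complete to one physical handset
        "survey_id": questionnaire["id"],
        "answers": answers,
        "completed_at": time.time(),
    }

def submit_when_connected(response: dict, upload) -> None:
    """The finished response is uploaded in one burst once connectivity is available."""
    upload(json.dumps(response))
```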

 

To sum up, prospective research clients should always ask panel providers how they’re sourcing panelists and what they’re doing to guard against bots and other types of panel fraud. And when your due diligence turns to the subject of bots, don’t forget to ask how the seller’s approach to panel integrity and data quality stacks up against offline, in-app mobile. To have a productive conversation on this important subject, just contact us at solutions@mfour.com.

Topics: MFour Blog

Test Your Mobile Ads in Real Consumers’ Social News Feeds

Posted by admin on Jul 31, 2017 10:00:54 AM

 

[Image: social ad testing]

 

How much faith should advertisers have in Facebook and other social media platforms?

 

Its huge audience, mostly arriving on mobile, allows Facebook to command a 25% share of the U.S. mobile advertising market, according to eMarketer. Two-thirds of all advertisers place bets with the social media giant, with no real idea of how well those bets will pay off. A new process called Emotional Brand Connections Social Media Ad Testing brings fresh clarity. But first, more on the foggy attempts at metrics EBC Social Media Ad Testing is designed to supersede.

 

The problem of how to assess a social media ad’s effectiveness isn’t specific to Facebook, although its clout puts Facebook in the spotlight. Last year, a series of discrepancies in some of the performance data Facebook had provided to advertisers drew lots of attention and highlighted how hard it is to obtain reliable metrics along the new frontier of mobile social advertising. In response, Facebook increased opportunities for outside auditing of ad performance.

 

But just what is the proper standard for judging whether a mobile ad is working? A recent article from Digiday says that third-party tests have shown that video ads on Facebook often don’t achieve the minimum standard for "viewability" set by the Media Rating Council (MRC). The MRC has determined that video ads should be seen for at least two seconds to be considered “viewable,” with at least half the player screen in view.
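As a rough illustration of how that standard translates into a measurement rule, here is a small sketch that applies the two thresholds the article cites: at least half the player in view for at least two continuous seconds. The function and variable names are ours, purely for illustration, not the MRC's.

```python
# Rough illustration of the MRC video viewability rule cited above:
# at least 50% of the player in view for at least 2 continuous seconds.
MIN_VISIBLE_FRACTION = 0.5
MIN_CONTINUOUS_SECONDS = 2.0

def is_viewable(samples: list[tuple[float, float]]) -> bool:
    """samples: (timestamp_seconds, visible_fraction) readings in time order.
    Returns True if the ad stays >= 50% in view for any continuous 2-second stretch."""
    run_start = None
    for t, fraction in samples:
        if fraction >= MIN_VISIBLE_FRACTION:
            if run_start is None:
                run_start = t
            if t - run_start >= MIN_CONTINUOUS_SECONDS:
                return True
        else:
            run_start = None
    return False

# Example: half-second readings where the ad is at least half in view from t=0.5 to t=2.5.
readings = [(0.0, 0.2), (0.5, 0.8), (1.0, 0.9), (1.5, 1.0), (2.0, 1.0), (2.5, 0.9), (3.0, 0.3)]
print(is_viewable(readings))  # True: the 2-second threshold is met by t=2.5
```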

 

Responding to the article, Facebook argued that people absorb content more quickly on mobile than on desktop, and that the threshold for mobile ads’ effectiveness is in fact markedly lower than the MRC’s standard. “We believe that the value of an ad…is generated the moment an ad comes on screen,” said a statement Facebook gave to Digiday, adding that independent studies have shown that “people can recall mobile news feed content at a statistically significant rate after only 0.25 seconds of exposure.”

 

So what’s a marketer to make of these conflicting assertions about ad metrics? It wouldn’t hurt to take a hint from Ronald Reagan, who famously said "trust, but verify." Trust that social media is where ads need to be. But arm yourself with a trustworthy way to verify that an ad can be seen well enough to achieve the intended responses from the consumers it’s targeted to reach.

 

And that’s where Emotional Brand Connections Social Media Ad Testing comes in. Developed by MFour in connection with Kantar AddedValue, it allows an advertiser to test an ad’s chances of success with the target audience before the ad goes live on social media. Here are the key features:

  • Test recipients are drawn from an all-mobile active panel of more than 1.3 million U.S. consumers. Its representative demographic makeup lets you target any key consumer audience -- including Millennials, Hispanics and African Americans.
  • The process injects an ad into targeted consumers’ social news feeds. It shows up naturally, with no discernible difference from regular ads the audience gets. That means test recipients don't know, at first, that they're part of a test.
  • In its initial stage, the test captures natural passive behavioral metrics: How long did recipients view the ad? Did they turn on a video ad's sound? Did they respond by clicking, liking or sharing? 
  • The process then adds a unique human dimension by allowing marketers to double back and survey the same recipients of the test ad for deeper qualitative insights.
  • In this stage you'll first get natural, unaided responses to measure awareness and recall of the ad and brand. Then you'll seek aided responses in which you identify your ad to respondents and ask them about it in detail.
  • If respondents give the ad high marks for concept, content and recall, the advertiser can proceed confidently, knowing the ad is going to achieve its objectives.
  • If survey respondents don’t respond well to an ad, they’ll give the feedback needed to revise it. Then it can be retested until it’s clear the ad is well-primed to achieve its goals.

The confusion that has so far permeated discussions of how effective mobile social ads really are is no surprise, given how quickly the platform has become dominant. Clearly, the current standard that calls for clocking seconds of "viewability" and measuring how many pixels appear is inappropriate for the age of social media. It's like using stopwatches and rulers. You need something geared specifically to the nature of social media -- something that gives you data and evaluations straight from real mobile consumers who are, after all, the real judges of mobile advertising. For more information, just contact us at solutions@mfour.com.

 

 

Topics: MFour Blog

On the Horizon: ANA Data & Measurement Conference

Posted by admin on Jul 28, 2017 11:08:13 AM

MFour is excited to attend the ANA Masters of Marketing conference September 13 in Florida. Lots to learn!

Topics: Upcoming Events

Meet MFour at September Path to Purchase Conference

Posted by admin on Jul 28, 2017 10:58:11 AM

MFour is exhibiting at the September 26-28 Path to Purchase conference in Chicago. Interested in mobile? Stop by our booth!

Topics: Upcoming Events

Learn How To Fight Duplicate Response Fraud

Posted by admin on Jul 28, 2017 9:48:22 AM

 

[Image: double skulls]

 

Here's your Friday roundup of 3 items from the MFour blog to keep you up to speed on mobile.

 

Ignoring Panel Fraud Gets You in Double Trouble

 

How Slowpoke "Mobile Optimized" Surveys Damage Your Data

 

Social Media Ads Are a Must. Here's How To Get Them Right.

 

And here's a Friday tune to put you in a happy mood for the weekend.

 

Topics: MFour Blog

Here's the Score for In-App Mobile vs. “Mobile Optimized” Surveys

Posted by admin on Jul 27, 2017 9:57:22 AM

 

[Image: mobile speed snail]

Imagine a basketball team that beat a rival 131 to 26. Which would you bet on the next time they played?

In fact, 131 to 26 is the real-life score in the showdown between mobile app use and mobile web use. According to eMarketer, the average U.S. adult smartphone user spent 131 minutes a day accessing content with an app during 2016, and just 26 minutes using the phone to connect to the web. The blowout is expected to get even worse: the estimated engagement score for 2017 is mobile apps, 145; mobile web, 26.

Why does this matter for market research? Because it’s an important distinction insights professionals need to understand as they decide what to do about mobile research. As you plunge into mobile, you’ll need to decide which of two different approaches to use: in-app mobile surveys, or online surveys in which a smartphone user clicks a link to connect to a survey housed on the web. The latter approach is commonly referred to in the industry as “mobile optimized” research.

So why does the scoreboard read mobile apps 131, and “mobile optimized” 26? There are a number of reasons, all of them having to do with how frustrating it is for mobile consumers to use their phones to connect to the web. For now we’ll focus on just one common problem: slow survey downloads.

In basketball terms, mobile apps enjoy a huge advantage in team speed, and it’s reflected in that 131 to 26 usage gap between in-app and mobile web.

It’s a truism that today’s consumers want instant experiences, and that they hate waiting for anything. In-app survey performance gives them the speed they require. “Mobile optimized” surveys too often will leave them twiddling their thumbs while their phones try to connect with an online survey site. When they finally get there – if they wait out a slow download – the waste of time is apt to affect their engagement with the survey.

Soasta, a provider of testing services, examined how load-in delays affect mobile shoppers' engagement when they access retail websites. The data are as relevant to consumer surveys as they are to e-commerce shopping. Being forced to wait is a big turn-off, regardless of the specific task you're trying to accomplish.

  • 53% of mobile visitors to online sites will leave a page that takes longer than 3 seconds to load.
  • 28% won’t return to a slow site.
  • 47% of consumers browse retail sites on their phones, but only 20% use them to complete purchases – with slowed mobile-web transaction speeds a key factor.
  • The “bounce rate,” a strong indicator of engagement, is 51.8% on the mobile web (a user is said to “bounce” if he or she doesn’t click further after reaching a site).

When it comes to speed, the study concludes, “user expectations are extremely high….[and the] user patience threshold is low….even milliseconds can matter.” The report’s takeaway: “You need to understand how real users are experiencing your site, and how even small or intermittent slowdowns could be hurting your business.”

The same dynamic of high expectations and low patience that’s seen among mobile shoppers applies to survey-takers as well. The outcome you’re seeking is a high response rate with high engagement that will yield reliable data without undue angst for researcher and survey-taker alike.

To make up for their lack of efficiency and engagement, online survey suppliers are compelled to sacrifice quality for the sake of volume. They resort to panel-sharing to achieve the completes and demographic quotas a study requires. And for the client, that often means poorly validated respondents who've been recruited in a non-transparent, catch-as-catch-can fashion.

Surveys that don't force respondents to wait, and can target and alert them for location studies by harnessing proprietary GeoIntensity® and GeoNotification® technologies, will naturally achieve far better response rates. That translates to faster projects that achieve better engagement and more accurate and reliable data.

If you’re experiencing angst over how your online surveys are being sourced and whether the panel you get is truly capable of representing what consumers actually think and feel, it could be time to look into in-app mobile. You already know the score: 131 to 26. For details on how innovative in-app, offline mobile research solutions can meet your specific needs, just get in touch at solutions@mfour.com.

 

Topics: MFour Blog

Mobile 101: So Why Bother With Mobile Research?

Posted by admin on Jul 26, 2017 9:36:38 AM

 

[Image: Mobile 101]

 

Why should consumer researchers want to survey people on smartphones instead of desktops?

 

This is one question whose answer is literally within arm’s reach, or at your fingertips – even if you’re sitting in front of a desktop or laptop computer. To talk to consumers, researchers need to find consumers where they are, not where they’d prefer them to be. And here’s where consumers are:

  • Pew Research Center reported that in 2016, 77% of Americans 18 and older owned smartphones, up from 35% five years earlier.
  • Smartphone ownership was 92% for the 18-29 age group.
  • Ownership of the other mobile device category, tablets, rose tenfold between 2011 and 2016, from 5% to 51%.
  • Ownership of desktops and laptops has been static at 78% since 2012, but their role in the digital realm has fallen: comScore reports that in 2016 Americans spent 11% less time with digital content on desktops/laptops than they had the year before.
  • Americans’ use of mobile apps rose 13.9% in 2016, according to eMarketer, which predicts more double-digit growth (10.7%) in 2017.
  • Looking at 2015 usage, comScore found that 58% of all U.S. digital usage time occurred on mobile apps – far exceeding desktops/laptops’ 33% share.
  • Mobile web’s 8% share of digital access time underscored the continuing connectivity issues that plague it, including dropped signals and excruciatingly slow content load-ins. When you hear about “mobile optimized” surveys, “mobile web” is what’s really being sold.
  • Facebook owes its juggernaut status to mobile, which accounts for 85% of its advertising revenue.
  • Intel, whose processors famously powered the desktop revolution, has struggled amid changing times. Last year it announced an 11% staff reduction, cutting 12,000 jobs, while attempting to shift its focus away from PCs to new-generation devices.

You get the picture, and you’ll draw your own conclusions. But resolving to adopt mobile research is just the beginning. Since not all mobile is created equal, you have important choices to make. Stay tuned to this blog for what you need to know, including our periodic Mobile 101 posts. To discuss how offline, in-app mobile capabilities can meet your specific needs, just get in touch at solutions@mfour.com.

Topics: MFour Blog

Don't Get Duped By Panel Duplication

Posted by admin on Jul 25, 2017 9:30:10 AM

 

[Image: panel duplication]

 

In consumer research, double-dipping is a common type of fraud that inflicts a double indignity on clients. When one panelist takes the same online survey twice, the answers have no validity and the data gets tainted. The client is being gamed by dishonest survey-takers, and let down (to put it politely) by a panel provider whose loose recruitment methods allowed the dishonesty to occur.

 

It should go without saying that consumer panels must consist of unique individuals whose collective responses reliably reflect the overall population whose behaviors and attitudes toward a product or experience are under study. But some providers – and, apparently, some research firms and brands -- seem to think that cutting corners on panel security and panel quality is no big deal. Duplicate responses are treated as a reasonable cost of filling sample needs for a price that’s nice and cheap. But what's the cost when unwitting executives are led to rely on analysis and recommendations based on distorted survey data? Brands deserve quality data because they need to make quality business decisions.

 

A common recruitment method called “panel routing” opens the door wide to errors and abuses. It arises from a widespread desperation to fill quotas for online surveys in the face of the boredom or alienation of a public that wants to conduct its personal business conveniently on mobile instead of being glued to a desktop or laptop. The desperation to fill online quotas is evident in the title of one article that recommends panel routing: “Can Someone Please Complete My Survey? Routing Makes it Easier.”

 

“Finding willing survey participants continues to be more and more difficult,” the author wrote, and routing “brings the respondent to the survey or surveys.” Putting it less delicately, routing is a cattle call that recruits survey-takers without vetting them in any serious way. After providing some quick information about themselves, at best, panelists who enter an online routing funnel are sent to the first desktop or "mobile optimized" survey for which it's assumed they might qualify. And since panel routers serve many clients, they often invite respondents to keep coming back indiscriminately for more surveys, creating bias from overuse.

 

In a striking recent post on the GreenBook blog, Brian Lamar, an experienced data quality consultant, said he’s become so tired of sloppy panel recruitment, among other poor practices, that he had to speak out bluntly.  “I see a lot of bad research, unfortunately, both in my day job [evaluating] data quality as well as when I take surveys in my spare time,” Lamar writes. “Respondents routinely answer the same question over and over as they’re routed from sample provider to sample provider. And this bad research isn’t from [the less well-known] companies you would expect – they’re from names all of you have heard of: big brands or big market research companies...It makes me sad, and…you should be sad or angry as well.”

 

The good news is that there is a way forward: offline, in-app mobile survey technology, combined with a unique, dedicated panel whose members take surveys solely on their smartphones. That keeps the entire process out of the uncontrollable online environment. A smooth-functioning survey app such as Surveys on the Go® attracts a large and diverse array of panelists because it meets them in the offline, in-app mobile zone that today’s consumers favor (Americans now average about 2 ¼ hours a day using mobile apps – 3 hours for Millennials). Here are a few of the specific safeguards that in-app, offline mobile provides to prevent duplicate responses and other types of panel fraud:

  • Each mobile device has its own unique identification code, and the survey technology won't allow duplicate attempts from the same phone (see the sketch after this list for the general idea).
  • Panelists suspected of deliberate attempts to game the system will be bounced from the panel for good, along with those who give answers that are thoughtless and mechanical rather than honestly responsive. Data quality is too important to tolerate poor respondents for the sake of quota-filling numbers.
  • Also unique to an in-app mobile panel is your ability to verify its members' engagement simply by checking users’ unsolicited public ratings and reviews of the app at Apple’s App Store and Google Play.
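To show the general idea behind that first safeguard, here is a minimal sketch of device-level deduplication: every complete carries a device identifier, and a second attempt from the same device for the same survey is rejected. This is an illustration under hypothetical names, not the actual safeguard logic of any particular platform.

```python
# Minimal sketch of device-level deduplication: a second complete from the
# same device for the same survey is rejected. Hypothetical names throughout.

class DeduplicatingCollector:
    def __init__(self):
        # (survey_id, device_id) pairs already accepted
        self._seen: set[tuple[str, str]] = set()

    def accept(self, survey_id: str, device_id: str, answers: dict) -> bool:
        """Return True if the complete is accepted, False if it is a duplicate."""
        key = (survey_id, device_id)
        if key in self._seen:
            return False  # same phone already completed this survey
        self._seen.add(key)
        # ...store answers for analysis...
        return True

collector = DeduplicatingCollector()
print(collector.accept("S-100", "device-abc", {"q1": "yes"}))  # True: first attempt
print(collector.accept("S-100", "device-abc", {"q1": "no"}))   # False: duplicate device
```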

There’s a lot more to say about data quality and how it’s endangered by duplicate responses and other forms of online panel fraud. To continue the conversation, just contact us at solutions@mfour.com.

Topics: MFour Blog
