
Meet MFour at September Path to Purchase Conference

Posted by admin on Jul 28, 2017 10:58:11 AM

MFour is exhibiting at the September 26-28 Path to Purchase conference in Chicago. Interested in mobile? Stop by our booth!

Topics: Upcoming Events

Learn How To Fight Duplicate Response Fraud

Posted by admin on Jul 28, 2017 9:48:22 AM

 


 

Here's your Friday roundup of 3 items from the MFour blog to keep you up to speed on mobile.

 

Ignoring Panel Fraud Gets You in Double Trouble

 

How Slowpoke "Mobile Optimized" Surveys Damage Your Data

 

Social Media Ads Are a Must. Here's How To Get Them Right.

 

And here's a Friday tune to put you in a happy mood for the weekend.

 

Topics: MFour Blog

Here's the Score for In-App Mobile vs. “Mobile Optimized” Surveys

Posted by admin on Jul 27, 2017 9:57:22 AM

 


Imagine a basketball team that beat a rival 131 to 26. Which would you bet on the next time they played?

In fact, 131 to 26 is the real-life score in the showdown between mobile app use and mobile web use. According to eMarketer, the average U.S. adult smartphone user spent 131 minutes a day accessing content with an app during 2016, and just 26 minutes using the phone to connect to the web. The blowout is expected to get even worse: the estimated engagement score for 2017 is mobile apps, 145; mobile web, 26.

Why does this matter for market research? Because it’s an important distinction insights professionals need to understand as they decide what to do about mobile research. As you plunge into mobile, you’ll need to decide which of two approaches to use: in-app mobile surveys, or online surveys in which a smartphone user clicks a link to connect to a survey housed on the web. The latter approach is commonly referred to in the industry as “mobile optimized” research.

So why does the scoreboard read mobile apps 131, and “mobile optimized” 26? There are a number of reasons, all of them having to do with how frustrating it is for mobile consumers to use their phones to connect to the web. For now we’ll focus on just one common problem: slow survey downloads.

In basketball terms, mobile apps enjoy a huge advantage in team speed, and it’s reflected in that 131 to 26 usage gap between in-app and mobile web.

It’s a truism that today’s consumers want instant experiences, and that they hate waiting for anything. In-app survey performance gives them the speed they require. “Mobile optimized” surveys too often will leave them twiddling their thumbs while their phones try to connect with an online survey site. When they finally get there – if they wait out a slow download – the waste of time is apt to affect their engagement with the survey.

Soasta, a provider of testing services, examined how load-in delays affect mobile shoppers' engagement when they access retail websites. The data are as relevant to consumer surveys as they are to e-commerce shopping. Being forced to wait is a big turn-off, regardless of what specific task you're trying to accomplish.

  • 53% of mobile visitors to online sites will leave a page that takes longer than 3 seconds to load.
  • 28% won’t return to a slow site.
  • 47% of consumers browse retail sites on their phones, but only 20% use them to complete purchases – with slowed mobile-web transaction speeds a key factor.
  • The “bounce rate,” a strong indicator of engagement, is 51.8% on the mobile web (a user is said to “bounce” if he or she doesn’t click further after reaching a site).

When it comes to speed, the study concludes, “user expectations are extremely high….[and the] user patience threshold is low….even milliseconds can matter.” The report’s takeaway: “You need to understand how real users are experiencing your site, and how even small or intermittent slowdowns could be hurting your business.”

The same dynamic of high expectations and low patience that’s seen among mobile shoppers applies to survey-takers as well. The outcome you’re seeking is a high response rate with high engagement that will yield reliable data without undue angst for researcher and survey-taker alike.

To make up for this lack of efficiency and engagement, online survey suppliers are compelled to sacrifice quality for the sake of volume. They need to engage in panel-sharing to achieve the completes and demographic quotas a study requires. And for the client that often means poorly-validated respondents who've been recruited in a non-transparent, catch-as-catch-can fashion.

Surveys that don't force respondents to wait, and can target and alert them for location studies by harnessing proprietary GeoIntensity® and GeoNotification® technologies, will naturally achieve far better response rates. That translates to faster projects that achieve better engagement and more accurate and reliable data.

If you’re experiencing angst over how your online surveys are being sourced and whether the panel you get is truly capable of representing what consumers actually think and feel, it could be time to look into in-app mobile. You already know the score: 131 to 26. For details on how innovative in-app, offline mobile research solutions can meet your specific needs, just get in touch at solutions@mfour.com.

 

Topics: MFour Blog

Mobile 101: So Why Bother With Mobile Research?

Posted by admin on Jul 26, 2017 9:36:38 AM

 


 

Why should consumer researchers want to survey people on smartphones instead of desktops?

 

This is one question whose answer is literally within arm’s reach, or at your fingertips – even if you’re sitting in front of a desktop or laptop computer. To talk to consumers, researchers need to find consumers where they are, not where they’d prefer them to be. And here’s where consumers are:

  • Pew Research Center reported that in 2016, 77% of Americans 18 and older owned smartphones, up from 35% five years earlier.
  • Smartphone ownership was 92% for the 18-29 age group.
  • Ownership of the other mobile device category, tablets, rose tenfold between 2011 and 2016, from 5% to 51%.
  • Ownership of desktops and laptops has been static at 78% since 2012, but their role in the digital realm has fallen: comScore reports that in 2016 Americans spent 11% less time with digital content on desktops/laptops than they had the year before.
  • Americans’ use of mobile apps rose 13.9% in 2016, according to eMarketer, which predicts more double-digit growth (10.7%) in 2017.
  • Looking at 2015 usage, comScore found that 58% of all U.S. digital usage time occurred on mobile apps – far exceeding desktops/laptops’ 33% share.
  • Mobile web’s 8% share of digital access time underscored the continuing connectivity issues that plague it, including dropped signals and excruciatingly slow content load-ins. When you hear about “mobile optimized” surveys, “mobile web” is what’s really being sold.
  • Facebook owes its juggernaut status to mobile, which accounts for 85% of its advertising revenue.
  • Intel, whose processors famously powered the desktop revolution, has struggled amid changing times. Last year it announced an 11% staff reduction, cutting 12,000 jobs, while attempting to shift its focus away from PCs to new-generation devices.

You get the picture, and you’ll draw your own conclusions. But resolving to adopt mobile research is just the beginning. Since all mobile is not created equal, you have important choices to make. Stay tuned to this blog for what you need to know, including our periodic Mobile 101 posts. To discuss how offline, in-app mobile capabilities can meet your specific needs, just get in touch at solutions@mfour.com.

Topics: MFour Blog

Don't Get Duped By Panel Duplication

Posted by admin on Jul 25, 2017 9:30:10 AM

 


 

In consumer research, double-dipping is a common type of fraud that inflicts a double indignity on clients. When one panelist takes the same online survey twice, the answers have no validity and the data gets tainted. The client is being gamed by dishonest survey-takers, and let down (to put it politely) by a panel provider whose loose recruitment methods allowed the dishonesty to occur.

 

It should go without saying that consumer panels must consist of unique individuals whose collective responses reliably reflect the overall population whose behaviors and attitudes toward a product or experience are under study. But some providers – and, apparently, some research firms and brands – seem to think that cutting corners on panel security and panel quality is no big deal. Duplicate responses are treated as a reasonable cost of filling sample needs for a price that’s nice and cheap. But what's the cost when unwitting executives are led to rely on analysis and recommendations based on distorted survey data? Brands deserve quality data because they need to make quality business decisions.

 

A common recruitment method called “panel routing” opens the door wide to errors and abuses. It arises from widespread desperation to fill quotas for online surveys in the face of the boredom or alienation of a public that wants to conduct its personal business conveniently on mobile instead of being glued to a desktop or laptop. The desperation to fill online quotas is evident in the title of one article that recommends panel routing: “Can Someone Please Complete My Survey? Routing Makes it Easier.”

 

“Finding willing survey participants continues to be more and more difficult,” the author wrote, and routing “brings the respondent to the survey or surveys.” Putting it less delicately, routing is a cattle call that recruits survey-takers without vetting them in any serious way. After providing, at best, some quick information about themselves, panelists who enter an online routing funnel are sent to the first desktop or “mobile optimized” survey for which it’s assumed they might qualify. And since panel routers serve many clients, they often invite respondents to keep coming back indiscriminately for more surveys, creating bias from overuse.

 

In a striking recent post on the GreenBook blog, Brian Lamar, an experienced data quality consultant, said he’s become so tired of sloppy panel recruitment, among other poor practices, that he had to speak out bluntly.  “I see a lot of bad research, unfortunately, both in my day job [evaluating] data quality as well as when I take surveys in my spare time,” Lamar writes. “Respondents routinely answer the same question over and over as they’re routed from sample provider to sample provider. And this bad research isn’t from [the less well-known] companies you would expect – they’re from names all of you have heard of: big brands or big market research companies...It makes me sad, and…you should be sad or angry as well.”

 

The good news is that there is a way forward: offline, in-app mobile survey technology, combined with a unique, dedicated panel whose members take surveys solely on their smartphones. That keeps the entire process out of the uncontrollable online environment. A smooth-functioning survey app such as Surveys on the Go® attracts a large and diverse array of panelists because it meets them in the offline, in-app mobile zone that today’s consumers favor (Americans now average about 2 ¼ hours a day using mobile apps – 3 hours for Millennials). Here are a few of the specific safeguards that in-app, offline mobile provides to prevent duplicate responses and other types of panel fraud:

  • Each mobile device has its own unique identification code, and the survey technology won't allow duplicate attempts from the same phone.
  • Panelists suspected of deliberate attempts to game the system will be bounced from the panel for good, along with those who give answers that are thoughtless and mechanical rather than honestly responsive. Data quality is too important to tolerate poor respondents for the sake of quota-filling numbers.
  • Also unique to an in-app mobile panel is your ability to verify its members' engagement simply by checking users’ unsolicited public ratings and reviews of the app at Apple’s App Store and Google Play.
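The first safeguard above – rejecting a second attempt from the same device – can be sketched in a few lines of Python. This is a purely hypothetical illustration, not MFour's actual implementation; the class, method, and field names are invented for the example:

```python
# Hypothetical sketch of duplicate-response screening by device ID.
# Names and data structures are illustrative, not a real vendor API.

class SurveyResponseGate:
    """Rejects a second completion of the same survey from one device."""

    def __init__(self):
        # Maps survey_id -> set of device IDs that already completed it.
        self._completed: dict[str, set[str]] = {}

    def accept(self, survey_id: str, device_id: str) -> bool:
        """Record and allow the response if this device is new to the
        survey; refuse it if the device has already completed it."""
        seen = self._completed.setdefault(survey_id, set())
        if device_id in seen:
            return False  # duplicate: same phone, same survey
        seen.add(device_id)
        return True


gate = SurveyResponseGate()
print(gate.accept("survey-42", "device-A"))  # True: first completion
print(gate.accept("survey-42", "device-A"))  # False: duplicate blocked
print(gate.accept("survey-42", "device-B"))  # True: different device
```

Because every phone carries a unique identifier, this kind of check is trivial to enforce in-app, whereas an anonymous web link offers no equivalent handle on who is answering.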

There’s a lot more to say about data quality and how it’s endangered by duplicate responses and other forms of online panel fraud. To continue the conversation, just contact us at solutions@mfour.com.

Topics: MFour Blog

How To Tell Whether Your Social Media Ads Really Work

Posted by admin on Jul 24, 2017 9:25:54 AM

 


 

Once upon a time, “Westward, Ho!” defined where the U.S. population – and with it, the U.S. economy – was heading.

 

Today, “Social, Ho!” hasn’t yet become a watchword, but the advertising industry may have to come up with one. As Americans spend increasing daily minutes and hours on social media, advertisers are spending increasing dollars to be right there with them. But just as settlers heading West in the 1800s were venturing into rich but uncharted territory, brands are staking their ad dollars in a still-new and largely uncharted media terrain. They know they need to be in social media, but they’re still struggling to understand what makes a good social media ad. And metrics that show how well those ads are performing remain elusive. But read on, and you’ll see that effective solutions for mobile social ad testing and mobile ad metrics are in reach.

 

First, here’s how high the stakes are:

  • The Interactive Advertising Bureau reported that spending on social ads totaled $16.3 billion in 2016, up 50% over 2015. The lion’s share of those ads were seen on the mobile apps that U.S. consumers overwhelmingly favor as portals to their social accounts.
  • So far this year, advertising’s turn to social shows no signs of slowing. In an article on the latest figures from Standard Media Index (SMI), which tracks ad spending, MediaPost reports that social media “showed soaring 55% gains” in the second quarter of 2017.
  • That’s on top of “robust” growth of 25.9% in the first quarter.
  • Social is soaking up spending previously devoted to other U.S. ad channels. Second quarter growth in overall ad spending was just 3.8%, according to SMI, after 2.8% growth in Q1 – which was the slowest first-quarter growth rate since 2011.

So yes – “Social, Ho!” works as a two-word catchphrase for what’s happening in 21st century media.

 

But the latest SMI figures also underscore the uncharted nature of social media advertising, and the uncertainties that result. Social media video advertising, including YouTube and Facebook, “sank an eye-opening 15%” during the second quarter. MediaPost attributes the decline to “brand safety concerns, along with some measurement issues.”

 

The brand safety concerns arise from automated placement that sometimes embeds a brand’s social advertising alongside incompatible or objectionable content. And measurement has been a puzzle as advertisers search for ways to understand how well social ads generate awareness and drive interest when they land in consumers’ social media news feeds.

 

Like social media itself, in-app mobile research has matured and taken hold in the 2010s, with social and mobile both driven by the rising dominance of the smartphone. Now newly developed mobile research capabilities are finally allowing advertisers to fill in the blanks as they seek ways to optimize social media campaigns and measure their effectiveness.

 

One breakthrough involves testing social media ads in the best way possible – by injecting them into the actual personal news feeds where those ads will appear after the campaign is officially launched. Advertisers can test and hone their concepts with panel members who fit the audience for their brand or product, using a process that collects passive data reflecting awareness of and interaction with an ad, and then surveys ad recipients for detailed, qualitative insights into what’s working or not working.

 

For measurement after an ad campaign is launched, advertisers can now learn which mobile consumers are receiving the ad on their phones, and determine whether these validated recipients fit the audience profile the advertiser is paying to reach. There’s also an opportunity to send verified ad recipients a mobile survey for whatever further insights a researcher may need.

 

The takeaway is that “Social, Ho!” and “Mobile, Ho!” are no longer calls to venture into uncharted country. Now advertisers have a map they can follow to understand how their concepts and content are faring along the rich frontier of social media and mobile devices. To learn more, just get in touch at solutions@mfour.com.

Topics: MFour Blog

Why Email Notifications Spell Doom for Data Quality

Posted by admin on Jul 21, 2017 9:32:29 AM

 

 


 

Here's your Friday roundup of 3 items from the MFour blog to keep you up to speed on mobile.

 

Email Survey Notifications Are the New Horse & Buggy

 

The Sad Facts about "Mobile Optimized"

 

Why Market Researchers Should Watch "The Big Short"

 

And here's a Friday tune to get you bouncing into your weekend.

Topics: MFour Blog

MFour Hires Mike Gaffney To Drive its Growth as Chief Revenue Officer

Posted by admin on Jul 20, 2017 9:34:27 AM

 


 

Mike Gaffney has joined MFour Mobile Research as Chief Revenue Officer, a new position in which he’ll play a key role in driving the company’s rapid growth.

 

Mike will draw on extensive executive experience as he oversees MFour’s sales and marketing teams. He previously helped top technology and advertising companies rapidly expand their sales – most recently as Chief Revenue Officer at Sharethrough, where his five-year tenure coincided with its rise to dominance among native advertising platforms. Previously he was Chief Revenue Officer at Auditude, a video advertising platform, and Vice President of Sales at the pioneering digital advertising exchange Right Media, which was acquired by Yahoo! for $800 million. He began his career at Oracle, then joined Salesforce.com, where he was among its first wave of employees and rose to Vice President. Mike holds a Bachelor’s degree in American Studies from Georgetown University. He and his wife have three children, ages 11 to 15, and recently added a puppy to the family. Mike grew up on Long Island and maintains a long-suffering loyalty to the New York Jets.

Topics: MFour Blog

Don’t Let Online Panels and Data Leave You Shortchanged  

Posted by admin on Jul 19, 2017 10:00:34 AM

 


 

If you’re not sure why you should pay close attention to consumer panel fraud, just think back on “The Big Short.” Nominated for a 2016 Oscar for Best Picture, it told the story of the subprime mortgage meltdown that triggered the Great Recession of 2008-2009.

 

Now it’s market research that’s facing a meltdown, if researchers and clients don’t make it a top priority to know exactly where their data comes from, and how vulnerable it will be if it’s sourced online. It’s this issue that brings to mind “The Big Short,” because the movie is a sobering study of what can happen if an industry gets too complacent or too distracted to be vigilant about data sourcing and data quality.

 

The film tells a complex story that boils down to a pretty basic business mistake: a failure to obtain reliable consumer data before making important decisions. “The Big Short” focuses on investment bankers who bought thousands of individual home mortgages, bundled them into mortgage-backed securities, and resold them to investors. In the rush to exploit an opportunity created by booming home prices – and the belief that they would never come down – many sharp minds on Wall Street failed to pay attention to the basics. As the movie shows, they never obtained or verified on-the-ground facts about the financial qualifications of the real home-buyers taking out the mortgages that served as the foundation for those securities.

 

Suspecting the worst, a hedge fund executive played by Steve Carell heads to Florida to do the ground-level consumer research the sellers and buyers of mortgage-backed securities had fatally omitted. He visits actual homeowners in actual neighborhoods to learn how much they owe on their mortgages, and how much they earn. He’s appalled to find that they have no chance of paying back the loans, making it inevitable that they will default, and that the investment pyramid built on their limited resources will crumble and fall.

 

Today’s consumer researchers and their clients now have to decide whether to risk a Big Shortchange in data quality and reliability. If they opt for online surveys, as most of them have for a generation or more, they’re likely to be buying consumer sample that’s in some ways akin to those mortgage-backed securities. Instead of known respondents who’ve joined a single, carefully-managed panel, buyers of online sample typically get data from a mix-and-match set of respondents who come from a variety of sources. They get verbal assurances that the data is representative, but no transparency about how it was sourced. This leaves the process open to all sorts of errors and abuses that undermine executives' ability to base important business decisions on accurate, reliable data.

 

One risk of relying on an online panel is duplicate respondents. Because many online panel members belong to multiple panels, there’s a real danger that they’ll receive multiple invitations to take the same survey. Two completes from one panelist is in nobody’s interest. Online research also is vulnerable to fraud by survey-taking bots. Experts say that bots are getting better all the time at mimicking real respondents and evading detection.

 

The alternative to online surveys is in-app mobile studies that bring two core benefits: They harness consumers’ love of their smartphones to ensure a diverse, representative panel, and they exploit smartphones’ unique capabilities to validate each response and confirm it’s not coming from a bot or from duplicate survey-takers. Each phone has its own unique ID code and can be located geographically for further assurance against duplicate responses. Meanwhile, fraud-bots that stalk the online realm can’t break into the protected, in-app space.

 

If you want to invest in the most reliable data to inform your client's or company's analysis and decisions, it’s important to understand how legacy online technology and methodology stack up against new-generation in-app mobile solutions. To get in on the conversation, contact us at solutions@mfour.com.

Topics: MFour Blog

How Email Is Wrecking Online Surveys

Posted by admin on Jul 18, 2017 10:46:45 AM

 


 

Email is among the weakest of the weak links endemic to online surveys – and that's why it's important to learn about the in-app mobile alternative, and how it cuts email entirely out of the survey process.

 

Online studies seek respondents by sending emails to potential respondents. This begins a five-step process. First, the recipients must notice the emails in their inboxes. Then they must click to open the email. Third, they have to read the message and recognize it as a survey invitation. Fourth, they have to decide whether they want to take the survey. Fifth, they have to click on a link inside the email’s text to connect with a website where the questionnaire is housed.

 

In-app mobile cuts the process down to two steps. It starts with a push notification that a survey is available. The push comes with a unique audio tone that tells the recipient that this is a survey invitation. The recipient then decides whether to participate. If so, he or she simply opens the app and starts answering the questionnaire, which has been instantly downloaded into the phone.

 

The takeaway: two steps vs. five. One is efficient, driving up response rates and speeding your projects to avoid missed deadlines. The other is just plain cumbersome.

 

Not to mention outdated. A recent study of email’s use in marketing (as opposed to market research) underscores some of the general drawbacks of trying to reach today’s consumers by email.

  • 24% of business-to-consumer marketers in the study by Emma, an email marketing services provider, identified “getting people to open emails” as their biggest challenge, tied with “personalization and targeting.” Targeting is another strength of the in-app, custom panel approach, but we’ll leave that for another day.

The survey numbers were even worse among younger demographics, as represented by responses from email marketers for universities, whose audiences are disproportionately teens and young adults – prospective students, current students, and recent alumni included.

  • 41% of marketers for universities said they “struggle to get people to open emails,” leading the study’s authors to remark that “they’re battling a lot of noise in the inbox.”
  • Another stat worth noting is the average amount of time U.S. consumers spend using mobile apps – 2 hours and 15 minutes a day, according to analytics company App Annie. A separate study by comScore found that younger adults (ages 18 to 34) average about 3 hours a day in-app. Clearly, the app is the comfort zone where today’s mobile consumers can best be reached.

Because panel fraud is such a pressing issue for market research, it’s also important to remember that fraud bots programmed to masquerade as human survey-takers can latch onto email links to online surveys like crocodiles latching onto their prey. The online sphere is where bots are designed to function, and where they can flourish. Mobile-app surveys take place in a safety zone that stands apart from the wilds of the internet -- a place where panelists can be validated and where bots can't intrude.

 

Notifications are just the start of the in-app mobile survey process. To get the full story from start to finish, just get in touch at solutions@mfour.com.

Topics: MFour Blog
