
Sometimes I Hate Market Research

By Tom Webster.

Maybe that's a little strong. Here's what I do take issue with, however: passing off an impressive-sounding sample size as evidence of a valid study. Today I got a pitch about a "social media study" that claimed a sample size of over 2,000, which, if the study were done correctly, would imply a margin of error of just +/- 2.0%. I won't dignify this particular study with a link, but a closer inspection of the methods revealed this:

Sample size: 2322 persons 6-54, Error +/- 2.0%
Persons ages 6-54, 55% male/45% female
Address-based sampling
The study was conducted between January 15 and April 30, 2010.
Data was accumulated through in-person interviews, telephone interviews, social media focus groups and web surveys.

See, if you saw the results from this study and didn't bother to scrutinize the methods, you might be fooled into thinking you had a good piece of research to write about. Sometimes, sample size can fool you, as it does in this case. Consider:

1. The data was accumulated over almost four months, which introduces a longitudinal bias: people interviewed in January were answering in a different world than people interviewed in April. Think opinions about BP have changed over four months?

2. The data was collected multi-modally, using a series of qualitative projects all cobbled together. You can only claim a given sample size for a survey if it is the same survey. Much of this data appears to have been quantitative questions asked in multiple small groups, then simply crunched together. If the sample really were 2,322 persons surveyed in one mode (phone, web survey, etc.), then you could claim a margin of error of +/- 2.0%. If you just add together a bunch of tiny, unrepresentative qualitative samples, all you are really doing is compounding error: pooling can shrink random sampling error, but it does nothing about the selection bias baked into each little group. A typical focus group, for instance, might have 10 respondents. The margin of error on a 10-person survey is, I dunno, about +/- 99% :) (Closer to +/- 31% at the 95% confidence level, if you insist on the math - see the quick sketch after this list.) You can pool those tiny samples together all you want - they don't get any better.

3. 55% male/45% female is another way of saying "convenience sample," or "this is who showed up for the focus groups." The US population is actually slightly more female (roughly 51%).

4. Where were these interviews conducted? It would be pretty hard to do a nationally representative sample using in-person interviews and focus groups. Certainly you'd make Southwest Airlines pretty happy. There is surely an unstated, unknown geographic bias. Probably a pretty sizable one.

5. What's a "social media focus group"? If such an animal exists, it needs some esplainin'.
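For the curious, here's what point #2 looks like in numbers. This is a minimal Python sketch (mine, not from the study in question) of the standard margin-of-error formula for a proportion, z * sqrt(p * (1 - p) / n), which is where claims like "+/- 2.0%" come from in the first place:

import math

def margin_of_error(n, p=0.5, z=1.96):
    # Half-width of the 95% confidence interval for a proportion,
    # assuming a simple random sample - the very assumption this
    # "study" fails to meet.
    return z * math.sqrt(p * (1 - p) / n)

for n in (2322, 100, 10):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")

Run it and you get +/- 2.0 points for n = 2,322, +/- 9.8 for n = 100, and +/- 31.0 for n = 10. And even those figures are the best case: they describe random sampling error only. No formula rescues a convenience sample, because the bias doesn't shrink as you pool more groups.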

Basically, you can't claim this sort of "survey" is representative of anything, even if the sample were 10,000. What's worse, you can't characterize the sample at all, so you can't even refer to the data as "amongst respondents to this survey." Which survey?

So, this is a rare airing of sour grapes here on BrandSavant, but my purpose is to get you to think about numbers when you see them. Surveys like this are done all the time - it's up to you to ask the tough questions before you report on the results or - heaven forbid - act on them.

Tomorrow, a more positive post. :)