Think before you send out that survey


What matters more: what you say, or what you do?

Can we learn from a binary answer? There’s a lot of clarity in a simple Yes or No, in asking whether we do or don’t, in separating good from bad. Sure, the world and people’s sentiments are often less definitive. The world is full of gray, and people tend to shy away from declaring something absolutely black or white. It’s why metrics include scales. It’s been hot, but if I want to know just how hot, I need to measure it.

Similarly, if I want to know more about how you feel, introducing a graded scale lets you tell me something different than forcing you to choose between extremes. The challenge is whether survey questions really get me there.

Surveys are fine instruments when the universe of responses is known, or the scale of measurement is standard. For example, the difference between below 32°F and above it marks freezing, and 3 inches is one inch more than 2 inches. Measures of sentiment, belief or attitude are far more difficult to standardize.

I began my career forecasting elections for CBS News. The survey questions were very direct, and the tabulations equally direct. Not only were the samples very carefully drawn, but the population was also carefully estimated, providing a very solid denominator. Survey responses were voluntary but deliberately framed, whether the pollster asked questions over the telephone or in a personal interview. The results predicted final election results with great accuracy and gave CBS and the other news outlets the advantage of calling winners before the real poll results were officially tabulated. That was the point of their surveys, right? Both candidates and news outlets conduct surveys to get an accurate handle on which way voter sentiments are going. The information is actionable, insightful and thus useful.

OK, some questions are worth asking because they help people warm up and get them to tell you what they really think or what really matters. But too often we want to know about sentiment and are not very good at framing the question. It’s in these cases that the scale matters. Frankly, I’m tired of sifting through survey results that don’t deliver. Surveys by themselves don’t engage someone, and the results often reflect the respondent’s level of engagement rather than their true sentiment on the subject. The more concrete the option, the more valid the results.

I admit I’m a sucker for information, especially when the promise of learning about my prospects’ attitudes, desires, interests, needs and concerns is freely available. But in the end I get what I pay for: yep, absolutely nothing.

Asking gets you where exactly?

Surveys, we hope, offer simple, objective opportunities to ask what we want to know. The large-sample surveys that characterize much of the information gathered by large consulting firms, and in which I’ve been a participant, don’t live up to their promise of cultivating knowledge or insight about customers. In my view the approach is old school. Their ubiquity signals the ease with which surveys can be constructed, distributed and analyzed. We can get more for less; and at the end of the day, very little of it is truly useful or provides real insight.

I wonder why C-Suite executives take part. What’s really in it for them? Do the questions draw them in and increase their attention to the topic? The receipt of results serves as quid pro quo for their input and clues them in on the feelings of their peers. That’s worth something, but again, old-school competitive behavior doesn’t allow you to formulate strategies that deliver economic advantage.

Here are two examples of survey findings, you tell me if I’m off base.

CASE #1

The Wall Street Journal’s CIO blog last week  referenced a new study from McKinsey & Company.

  Roughly 49% of those surveyed said they are currently using Big Data to understand their customers better, while a combined 32% said they are currently using social media to interact with customers or promote their brands.

 

The survey asked 1,500 CEOs, CFOs and CIOs, in the first two weeks of April 2012, about three key business technology trends: big data and analytics; digital marketing and social tools; and flexible delivery platforms. More than half of the respondents reported that the first two had made their corporate strategy agenda’s top-ten priorities. “In fact, for each of the three trends, between 6 and 9 percent say it is their organization’s top corporate priority.”

HMMM … wonder what that really means? The report went on to say that roughly a third of the executives reported their organizations were looking to these trends to build a new business or tap into new profit pools. The kicker for me about the survey and its limited use hit home when McKinsey asked whether their budgets included support for these priorities. I encourage you to take a look if you’re questioning my thesis. McKinsey notes that interesting differences emerged in the findings: “the discrepancy suggests the management team may lack clarity or a consistent concept about the investments needed to support digital business.”

CASE #2

The IBM 2010 Global CEO study discovered that CEOs cited creativity as the most important leadership quality over the next five years.

Further, IBM’s report stated that innovation needs to be a CORPORATE activity, not just an R&D activity.

Are your answers to these questions any different? What would you say is the most important leadership quality for business leaders over the next five years? How do your rankings compare to the ones shown in the graph? I suspect, though I haven’t actually tested this, that average citizens might respond similarly to CEOs because the question is so general. Why?

Improving your information relevance quotient

Have you ever asked a question when you already knew the answer? As in, “Isn’t Coke the best drink on the planet?” I don’t mean to disparage the authors’ and analysts’ objectivity in creating and conducting these surveys, or to criticize the soundness of their methods. But these survey results capture sentiment, when the value I’m seeking comes from behavioral insights best obtained by other methods. In the end, the survey results merely add to the growing pile of information that has limited direct use and grows stale quickly, yet costs time and energy to collect, tabulate, analyze and distribute.

What other evidence would corroborate findings about important CEO qualities? The available data would have to be public and somehow connect to actions undertaken by CEOs. Where do you look? You could mine their public statements or interviews, and you could look for supporting evidence such as investments, resource allocations, new initiatives and the percentage of funding. The good news? Gathering, sifting, reviewing, compiling and delivering the information, with or without analysis, is easier than ever. I’m just surprised that IBM, a very savvy research firm, hasn’t begun to compare the two and shared its findings. Watson is good for more than winning at Jeopardy, right?

In the next few weeks, I hope to try out a series of tools myself. For example, I learned this morning about Attensity. Text analytics systems are the plow horses that increase big data’s yield, put to work by none other than Google to enhance their search algorithm. Do you remember when Google used to share the likelihood that their results matched your search?
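To make the idea concrete, here is a minimal, invented sketch of what keyword-level text analytics does: tally sentiment-bearing words across a batch of texts. The word lists and sample texts are made up for illustration; real systems like Attensity use far richer linguistic models.

```python
from collections import Counter
import re

# Toy lexicons -- invented for this sketch, not from any real product.
POSITIVE = {"love", "great", "excellent", "happy"}
NEGATIVE = {"hate", "terrible", "awful", "angry"}

def sentiment_tally(texts):
    """Count positive and negative keyword hits across a batch of texts."""
    tally = Counter()
    for text in texts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word in POSITIVE:
                tally["positive"] += 1
            elif word in NEGATIVE:
                tally["negative"] += 1
    return tally

tweets = ["I love this product, it's great!", "Awful service, I hate waiting."]
print(sentiment_tally(tweets))  # Counter({'positive': 2, 'negative': 2})
```

Crude as it is, even this level of counting turns unstructured text into something you can chart and compare over time, which is the yield the plow-horse metaphor points at.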

I’m a big believer in not reinventing the wheel, so if you have done this analysis or know of any, send it my way. I’ll keep you posted on what I find.

Are you tired of reading or even seeing trending information that lacks clarification or proper weighting?  If so, I’d love to hear about your experiences and any solutions you are finding promising.

Alternatively, if you do find survey results meaningful or actionable, please share what makes these types of reports valuable to your work, or how they improve the outcomes of your tasks.


Social Media great for insights not prediction



Is it really surprising that on social media, generally speaking, people share more emotionally linked thoughts?

What People Really Want vs. What They Share on Social Media.

For my money this is not much of an insight.  After all, humans, like many other animals, are social creatures. From birth, our lives depend on others. In time, those who bring us along and introduce us to the ways of the world nurture specific beliefs and frame our understanding of the world.  Our connections to others are vital to our survival, happiness and success.

Social media simplifies our ability to share and connect. The social impulse that compels us to take part naturally mirrors underlying, maybe even unconscious, emotions. The result is a natural association between content and intention rooted in sentiment. Following the tradition of anthropology, or design research, self-reported assertions such as our tweets or Facebook updates can prove revealing. Tracking and tallying these qualitative data crumbs outlines a wider system of associations and makes a wonderful addition to descriptive analysis. Whether linked to more traditional demographic variables or not, they reveal characteristics and detect relationships about something or someone; but they are not proportional in their representation.
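Tallying those data crumbs can be as simple as counting which terms co-occur in the same post. This is a toy sketch with invented sample posts, not a real social-media pipeline; it shows the kind of association linkages such counting surfaces:

```python
from collections import Counter
from itertools import combinations

def association_counts(posts):
    """Tally how often each pair of terms appears together in the same post."""
    pairs = Counter()
    for post in posts:
        # Deduplicate and sort so ("coffee", "mood") and ("mood", "coffee")
        # count as the same pair.
        terms = sorted(set(post.lower().split()))
        pairs.update(combinations(terms, 2))
    return pairs

posts = ["coffee monday mood", "coffee mood", "monday mood"]
counts = association_counts(posts)
print(counts[("coffee", "mood")])  # 2
```

Note what the result is and isn’t: it describes relationships among the people who happened to post, but says nothing proportional about any larger population.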

[Infographic: how social media are being used, and how everything is changed by them. (Photo credit: Wikipedia)]

So what’s the problem? Insights don’t scale. The accompanying graphic suggests there’s added value, and maybe there is for the casual observer, but at the moment I’m not convinced.

Problem Re-framed

Last week, I shared lunch with a group of people familiar with both quantitative and qualitative research methods to talk about big data. Design research, like anthropology, focuses on observing very small groups of subjects in natural conditions: watching people as they shop, work, make dinner, commute and so on. The data and analysis skew qualitative. Watching what people do has always proved a more reliable predictor than asking what they think. Researchers long ago discovered the knowing-vs.-doing gap.

For the less statistically inclined: probability sampling is necessary, but not by itself sufficient, to make claims about a larger population. Even with diligence in selecting a random sample to question or observe, the results can still carry bias or large errors if they lean on those who happened to respond or were readily available. All surveys include a margin of error due to sampling. National voter exit polls, for example, sample carefully to keep the margin of error at a 95% confidence interval low, about +/- 3%. (For further information, check out Edison Research on exit polls.) The margin of error on public opinion polls asking what people believe and for whom they plan to vote is wider than that of post-voting surveys taken at the polls.
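For a simple random sample, that margin of error comes from a standard formula: z × √(p(1−p)/n), where p is the observed proportion, n the sample size, and z ≈ 1.96 for 95% confidence. A small sketch (the sample size of 1,067 is just the textbook figure for a worst-case +/- 3% margin, not a number from the article):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate half-width of the confidence interval for a sample
    proportion p from a simple random sample of size n.
    z = 1.96 corresponds to a 95% confidence level."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case is p = 0.5; about 1,067 respondents gives roughly +/- 3%.
print(round(margin_of_error(0.5, 1067), 3))  # 0.03
```

The formula assumes a true random sample; it says nothing about the non-sampling biases, like self-selection, described above, which is exactly why careful sampling matters more than sheer sample size.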

Diary studies illustrate the value of subjective research. Sure, the results are challenging to extend and difficult to scale, as the richness of the data does not lend itself easily to classic systems analysis. But in the hands of an experienced researcher, the subtle presence or absence of contextual cues leads to new insights, a deeper understanding of the situation, or the present circumstances responsible for a behavior. Researchers who isolate those specific cues come closer to understanding our inner nature, and then to developing insights into cause and effect.

Build it and….

The inspiration implied in the phrase “if you build it, they will come” presumes knowledge of what and how to build; that intuition may come from subjective research. Note that the phrase is neither strategic nor predictive of the number or timing of visitors. Contrast anecdotal indicators with an algorithm churning through significant quantities of transactions to find common elements, the correlated information. Observational data offer context, while the algorithm provides the measure of overall significance.


If we’ve learned anything from the work of the behavioral economists, it’s that humans are predictably irrational. Why? The relative strength of an emotion can, but doesn’t necessarily, overcome reason. Contextual elements trigger specific behaviors, as well as unexpected associations and very different behaviors.

We are far from understanding how to successfully integrate expressed wants social media provides with analysis of objective, aggregate data.

As Steve Smith of Pegasus Capital Advisors suggests, there is great power in pushing economic analysis up the value chain. Social media doesn’t create the transaction; its risks center on reputation, which has implications but has yet to disrupt the flow, or more accurately the allocation, of capital.

I’m looking forward to seeing the continuing evolution of social media, and to teams of marketing analysts familiar with statistical sampling helping to chart a new course. It would be great if they could help lead the charge toward a more robust metric of success: one that favors the quadruple bottom line and thus captures environmental, social, cultural (including governance) and economic factors.