Time is money: are you using real-time data wisely?



Are you feeling up to date, in sync with the times? Both individuals and organizations find it challenging to fully leverage technology and integrate the sea of real-time data that surrounds us.

This past week, I attended a local Internet of Things (IoT) conference, only to be reminded how we’ve been kidding ourselves about the human-machine dynamic. When Factiva reported in 2013 that the previous two years had created 90% of the world’s data, it reflected the impact of visibly faster technology and emergent opportunities for those capable of wrangling more data. Similarly, the machine-to-machine exchange of information, and the responses that IoT and the Industrial IoT (IIoT) make possible, will soon surpass all human-generated information.

Information has never proven more valuable to competitive advantage than now. The key is timely mastery and/or the ability to separate meaningful data from noise. Real-time capabilities merely up the ante.

Suddenly, all of the conversations about the real, meaningful difference of Big Data clicked. The challenges I knew and had experienced working with volumes of data are not something everyone experiences, and that’s why I missed the significance of the message. Language can do that. Today’s competitive advantage relies on learning synchronicity between people, and also between people and machines.

Yep, syncing as in coincident timing. Reactions coordinated on the order of the elaborate dance numbers Busby Berkeley made famous will separate winners and losers.

People’s learning rates

People are interesting precisely because we behave inconsistently. These same traits make us effective competitors and efficient information processors. We focus and only selectively pay attention, which means we consciously ignore most information in our midst. Unlike machines, we are slow; few of us possess the capability to process high volumes of complex data at high speed.

How people integrate data remains a bit mysterious. Part conscious and part unconscious, each of our senses connects to a different part of our brain, and the information isn’t always processed with consistency.

Humans create their own reality. For example, our eyes see things differently than what we describe, and not because of language problems. Automatic corrections use depth perception and pre-existing knowledge to flip the image, while sound tends to retain its integrity.

Similarly, information new to us processes differently than updates; and yet endless streaming information can overload and confuse us. Today’s powerful computers don’t experience anxiety or fatigue, though they may overheat or fail.

The natural limits of time and energy challenge people to choose their focal point: when and how to respond to data and perceive opportunities. For example, few of our waking moments and activities require conscious thought. Our body takes care of itself and manages to coordinate the processing of external sensory information with internal demands. This syncing makes possible mindless activities like breathing, eating, walking and resting.

Consciously, our ability to track our time and energy is spotty. Still, unstructured/unplanned moments, especially those that demand little of us mentally, remain ideal, while society frowns on the same characteristics when referred to as idleness. The contradiction reflects the value we attach to purpose, or meaningful use of effort over time that results in tangible output.

Artists create, builders build, analysts compute and chefs cook, for example, by adding their effort over time. They make something, or transform original materials/inputs.

The notion of efficiency also boosts the value of effort by measuring it relative to the output produced over time. Likewise, effectiveness measures the additional value produced relative to the starting inputs. Together, these measures translate into meaningful, consistent tokens of value that permit ready exchange, or wealth accumulation.
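The arithmetic behind these two measures can be sketched in a few lines. The function names and the chef scenario below are invented for illustration, not standard definitions.

```python
# A sketch of the two measures described above: efficiency relates
# output to effort over time, effectiveness relates value added to the
# value of the starting inputs. Names and numbers are illustrative.

def efficiency(output_units, effort_hours):
    """Units of output produced per hour of effort."""
    return output_units / effort_hours

def effectiveness(value_of_output, value_of_inputs):
    """Additional value produced relative to the starting inputs."""
    return (value_of_output - value_of_inputs) / value_of_inputs

# A chef turns $40 of ingredients into 20 meals worth $100,
# using 4 hours of effort.
print(efficiency(20, 4))        # 5.0 meals per hour
print(effectiveness(100, 40))   # 1.5, i.e. 150% value added
```

Both numbers are "tokens of value" in the sense above: they are consistent, comparable and exchangeable across very different kinds of work.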

In this context, the accumulated tokens of value allow us to buy ourselves time to take vacation or be idle, as easily as they buy us time to learn, create and do more.

Machine learning capabilities

This also explains precisely why technological advances prove so valuable: they have progressively reduced the amount of time and effort necessary to perform a task. As a result, we DO spend less time on common, routine activities than was previously necessary. Indoor plumbing saves us the time we spent fetching water, wheeled transportation saves us the time we spent walking, and telecommunications similarly removes the break in communications that once demanded considerable effort to cross the distance by one if not both parties, or the enlistment of a proxy to carry the message on their behalf. Human messengers were replaced first by written notes and letters; then the telegraph dramatically reduced the time between a message’s sending and receipt. Now text messaging and email are displacing telephone and video conferences.

This evolution in communication methods affects not only people’s interaction styles but also their information needs and expectations.

Real-time communications savings and benefits are not equally distributed, and so inefficiencies persist. On one hand, they present a new opportunity to replace the planning and documentation of activities that were essential when communications were primarily indirect and time-lagged. Built-in tracking, boosted transmission capabilities and data recording can both fill in and widen information gaps. Problems associated with incomplete, unsupported or even delayed information always created risk, and they persist, but for new reasons. The flood of data from more sources, both people- and machine-generated, poses new challenges: separating out meaning, predicting, and responding in a timely, relevant manner.

Other opportunities real-time capabilities offer are all around us, assisted by the information collected and transmitted from multiple data sensors scattered across the environment. In fact, it’s how airplanes fly automatically, railroad cars notify switches of their location to open or close crossing gates, motion sensors in buildings adjust lighting levels and air temperatures, and tsunami warning systems save lives.

In general, people are wired to process information in real time. We use an array of body-language cues to understand how to manage a situation and engage with the people in our midst, and yet we do it unconsciously. Planning on paper is a far more conscious activity, time consuming and energy draining. Worse, planning often stops us from activating that unconscious real-time processing. We follow the plan rather than notice the inconsistency, or the more obvious information we may or may not have incorporated. The best example is the step-by-step navigation systems that we know are less than perfect. Have you found yourself using the navigation only to discover it’s asking you to turn onto a one-way street going the wrong way? Or your location is “ahead” of the GPS signal and so you miss a turn?

My point is this. Too many built-in business procedures and processes were designed in the absence of real-time information. To be more relevant and more valuable, people will need to revisit their processes with respect to learning, creating and doing. It will require a shift in attitude, a refocus of needs and an adjustment in expectations. It’s a shift from looking back to partnering with machines that look forward, use more data sources and get to analysis faster.

If you have any examples of success or any challenges I’d love to hear about them.

[i] Mike Hogan, “Big Data of Your Own,” August 2013, www.factiva.com

John Adams, “Be Careful or Big Data Could Bury Your Bank,” January 25, 2013, www.factiva.com


Big 3I competencies: Why are they so darn hard to acquire?


Creating value and organic growth opportunities requires uncovering opportunities often hiding in plain sight. Innovations challenge expectations, including possible returns on the effort. We take for granted what’s under our noses even though it may be exactly where we need to pay closer attention. Understanding how perception affects our preferences makes compensating for it possible. Vigilance helps, especially awareness of value on multiple dimensions. There’s a monetary aspect, and there are ideas we hold near and dear. Both values motivate human behavior, and that’s what makes life interesting. Let’s begin our exploration by looking at traditional expressions of value, after an introduction to the concept of “fundamental attribution,” or first perceptions.

Prior knowledge separates surprise from distraction.  A sudden unanticipated event will jolt our senses. Our sudden vigilant state will recede when we recognize familiar people, or cues, associated with things we know make us happy. Surprise includes circumstances or context that make us expect what comes next and so we relax our guard. The fundamental attribution idea literally draws on internal experience. Stored knowledge takes care of us, finding a fit to situations and environments we meet. That doesn’t mean we pick the best fit. Often familiar,  frequently used ideas come to mind faster. Logical or rational alternatives follow, too late to be useful. That’s where intention, pausing before reacting, offers the pre-frontal cortex time to process. This internal tradeoff makes humans wonderfully complex and predictably irrational.

The trick is to understand how circumstances get people to do what you want and avoid them blowing up in your face.  Psst, the answer goes beyond data analytic competencies, though that’s important.

Perception and preference: the Big What?

Data comes in one flavor, but tastes different to consumers than it does to product and service providers. Every day, more codes and identifiers amplify the specific and ambient details associated with activities such as tracking goods, service use, etc. The convenience, cost and time savings provided by standard identifiers like bar codes, account numbers, social security numbers, email addresses and phone numbers also simplify things for providers up and down the supply chain, catering to our unconscious preferences. Every day, we compromise a little more of our privacy and anonymity in the process.

The sheer volume, velocity and variety of all this raw, “Big” data make navigating the future possible. The tricks require exploring past and present relationships between variables. Predictive models use that deeper understanding of variable relationships and their interactions to create opportunities, control risk-producing conditions and optimize sources of marginal profit. The results enrich our lives, and few of us feel oppressed by this Business Intelligence (BI). Big Brother does exist, but so do Big Sister, Big Doctor, Best Friend, Old Roommate, Big Pen Pal, etc. In other words, the government surveillance that created the old FBI-style dossiers pales next to the knowledge stored about you by your bank, Google, Facebook, Amazon and other retailers. Healthcare regulations and practices preserved the privacy of your information and slowed the migration to electronic medical records. Their failure to keep up with wider digital data practices has also slowed diagnostic advances and cost-saving opportunities.

Real innovations begin with insight, once the province of small tests and strictly the domain of human intelligence. Today Big Insight crowds out the spotlight occupied by BI. Cheap storage and faster processing make data mining possible for anyone, but it is the strategic opportunists with the foresight to be serious players and accumulators who continue to change the world. Recently, GigaOM identified several use cases while highlighting Teradata, the maker of the first terabyte-scale database. The full list is worth reading; I mention only a few.

  1. Steve Jobs’s famous statement that Apple doesn’t do customer research no longer holds true. Teradata named Apple as its first customer to exceed a petabyte of storage. Apple rapidly accumulates transactional information to understand its customers across product groups.
  2. Walmart’s data processing and analytic capabilities go beyond simple sales efficiency. The data helps instruct and educate its suppliers with insights about packaging dimensions as well as shelf-space location, etc.

Intelligence to Influence requires insight

The ongoing arrival of new technologies and embedded tracking codes continues to fuel the race to understand and use real-time ambient data to influence transactions. More data makes it easier to see deeper underlying patterns more clearly. With greater awareness, trends can be spotted and tracked more readily, and the impact of different interventions tested simply and more thoroughly.

Understanding the data requires more than iterative recombination; it takes expertise. With knowledge and experience, patterns can be understood by both people and machines (see earlier post: understanding-aint-believing-and-yes-there-are-economic-consequences). But it takes curiosity to explore different dimensions and generate insights. Here are two different takes:

Luis Arnal of InSitum explains what holds back many of us. Please listen to his complete 2011 Design Research Conference presentation, absent the charming slides. This summary doesn’t do justice to his talk, but I wanted to share some of his key reflections and lessons on the steps to developing insights.

Begin with data, or information records that represent your observations from field research. After collection, the data needs to be categorized and clustered. Begin the analysis process using a simple scatter plot to understand the landscape, or context, of observations relative to the categories selected. Using intuition and prior knowledge, the dimensions you choose to contrast also lead to the direction in which you develop associations between the data points. What, if any, possible connections exist? Using imagination and creativity, lines of connection appear as part of an effort to FIT the dots to a model. Of course the interpretations vary. Time and patience make possible “a fidelity of meaning,” and the underlying pattern comes into focus. The data’s added value suggests patterns that slowly develop into solutions. Insights, Luis explains, contain 30% data, 30% inspiration, 30% perspiration and 10% luck. Insights facilitate the transition from confusion toward resolving the initial problem. They are the links between What Is and What If; they help us imagine how when we don’t or can’t know.
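The first steps of that process can be sketched in a few lines of code. The field observations, the category labels and the two dimensions contrasted below are all invented for illustration; real design research would of course work with far richer data.

```python
from collections import Counter

# Invented field observations, already reduced to categorical records.
observations = [
    {"who": "commuter", "where": "bus stop", "need": "save time"},
    {"who": "commuter", "where": "bus stop", "need": "stay dry"},
    {"who": "student",  "where": "library",  "need": "save time"},
    {"who": "commuter", "where": "train",    "need": "save time"},
    {"who": "student",  "where": "cafe",     "need": "socialize"},
]

# Contrast two dimensions chosen by intuition: who x need.
# The densest cell is a candidate "line of connection" to explore.
pairs = Counter((o["who"], o["need"]) for o in observations)
top_pair, count = pairs.most_common(1)[0]
print(top_pair, count)  # ('commuter', 'save time') 2
```

The code only gets you to the candidate pattern; deciding whether “commuters want to save time” is an insight or a coincidence is exactly the imagination-and-perspiration part Arnal describes.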

A recent HBR article by Thomas Davenport, another worthwhile read, emphasizes a different set of talents and experiences. Particularly helpful for positioning your firm is one of the closing observations about the capabilities housed within your organization and the opportunities they present.

“….their greatest opportunity to add value is not in creating reports or presentations for senior executives but in innovating with customer-facing products and processes….

LinkedIn isn’t the only company to use data scientists to generate ideas for products, features, and value-adding services. At Intuit data scientists are asked to develop insights for small-business customers and consumers and report to a new senior vice president of big data, social design, and marketing. GE is already using data science to optimize the service contracts and maintenance intervals for industrial products. Google, of course, uses data scientists to refine its core search and ad-serving algorithms. Zynga uses data scientists to optimize the game experience for both long-term engagement and revenue. Netflix created the well-known Netflix Prize, given to the data science team that developed the best way to improve the company’s movie recommendation system. The test-preparation firm Kaplan uses its data scientists to uncover effective learning strategies.”

What’s the common denominator linking Davenport and Arnal? Both reference visual thinking, or the conceptual translation of ideas into tangible representations. Again, a mastery difficult to acquire and beyond the bounds of computers, even those as powerful as IBM Watson. I don’t think Siri creates flow charts, but she might learn.

I did, and so can and do others. When hiring for analytics teams I managed, three criteria or competencies were essential: SAS skills (statistical coding); knowledge of the business; and an ability to think through new problems. I never thought to ask someone if they could draw. One of my teams pioneered new strategies to improve profitability. Initially, that meant differentiating credit worthiness. Managing the portfolio, however, required alternative methods to promote profitability by optimizing costs while simultaneously minimizing risks. At the time, the combination of competencies we needed was rare. Above all we needed flexible thinkers to tackle complex problems and create more sustainable solutions. We learned to bet on those who offered two of the three. In time, we came to realize that the third criterion, thinking, was one we couldn’t teach. It became the minimum requirement. In the late ’80s, we sought out academics with conceptual modeling experience and bypassed MBAs. Banking wasn’t the only industry seeking these skills, but we were much more flexible in hiring for them.

Today, the combination of technical skills proving most valuable continues to be found among individuals who have studied complex data and demonstrate visual thinking; again, not MBAs. Not every designer’s capabilities include assembling a sophisticated social network analysis model, but designers sure do a great job of communicating conceptual ideas tangibly.

This post began talking about value. Should the value consumers derive match the value producers derive? Absolutely not. In business, the preoccupation with return on investment makes sense for private equity focused on upside and early exit. This contrasts with Warren Buffett, who grew wealthy thanks to his ability to learn the value of various securities and then buy them for less, a concept at the core of value investing. “Price,” he has said, “is what you pay. Value is what you get.”

Remember the fundamental attribution concept? Buffett’s remarks on value, and his actions, show how easily we mistake motive and behavior. Companies that obsess about cost risk missing key insights. Case in point: the recent rise and fall of JCPenney’s CEO, a man clearly familiar with the power of BIs (insight and intelligence analytics) to achieve innovation. How people interpret observed behavior matters. The more detail and the more attention to context, the better the chances of uncovering key actionable insights. James Surowiecki, a notable observer of the slippery slope of over-reliance on analytics, in a recent New Yorker column shared comments on the widely touted and now vilified Ron Johnson by Mark Cohen, a former C.E.O. of Sears Canada and now a professor at Columbia:

“In most of the retail universe, price is the most powerful motivator,” Cohen said. “This game of cat and mouse with regular, ever-changing discounts is illogical, but it’s one that lots of consumers like to play. Johnson just ignored all that.”

Conclusion?

Playing effectively with Big Data analytics requires an unusual mix of capabilities. More than sheer brute processing power, modeling, imagining and speculating require artistic license. Machines will find patterns of relationship quickly, but it’s not clear they will find the direct relationship between cause and effect. The reasons and thought processes that drive behavior remain domains where humans excel.

It’s hard to believe that the same analysis that led Johnson and his team to create “fair and square” pricing missed recognizing coupons’ significance to their customers. I agree with Surowiecki, who suggests the impact of one fundamental attribution created a rippling effect, producing one error after another. The first error, made by the board in selecting Johnson, created further errors by decision-makers and Johnson himself in choosing to push their half-baked strategy forward prematurely.

What do you think?

 

Prediction and Understanding


All things “new” fascinate us. Of late, the business world’s growing excitement about Big Data and its analytic modeling seems to turn up surprising results in interesting places. Predictions mesmerize us; they offer us control in the midst of uncertainty and fool us into believing we understand things more completely than is possible. The models used to predict an outcome are often confused with the underlying mechanisms responsible for the outcome. Models fuel discovery, and yet we get cocky when we rely wholeheartedly on a built model’s power and accuracy. Risk doesn’t disappear, and its infrequent appearance merely challenges our ability to prepare adequately and, only in hindsight, differentiate the early warning signs. This is what in the trade we call differentiating signal from noise, and it is the focus of Nate Silver’s book.

Power to Predict

In finance or physics circles, the fascination with models is anything but new. Isn’t the primary purpose of analysis and model building discovery, or greater understanding of causal relationships and interactions? Observing the physical properties of planets helps us make sense of their movements and explain other observable phenomena. The notation and models provide insights into other activities and data collected in other settings. These scientific modeling techniques, when introduced into social science, formed the basis of understanding economic behavior and a framework for a series of policies governing everything from the money supply to welfare. Once operating in obscurity, the mathematically trained analysts and modelers now have an impact on society that ripples into ever-widening arenas and is difficult to miss.

Michael Lewis earned his living as a quant on Wall Street. His dual talents manipulating numbers and words led to his successful book Liar’s Poker. Complexity found a voice, and Lewis continued to seek out and tell more stories about the quants in multiple settings. Perhaps it was the popular success of Moneyball that attracted the popular interest. I admit I’m an ardent fan. Michael Lewis, with his wonderful storytelling ability around number problems, shared how the Oakland A’s made the playoffs using statistics for competitive advantage. Among the collected stats, the story revealed those overlooked by scouts that the Oakland A’s valued, making it possible for them to compete effectively against baseball teams with much larger budgets.

In presidential elections, the baseball stats maven Nate Silver demonstrated in 2008 how a command of statistics can improve the quality of election forecasts. By 2012, his success garnered him personal attention as author of the New York Times’ FiveThirtyEight column, further upping the fascination with applied statistics in new arenas.

Leonard Mlodinow, a trained physicist himself, in his sympathetic review of Silver’s new book shares his frustration with statistical shysters. In “The Signal and the Noise,” Silver cites “… studies show[ing] that from the stock pickers on Wall Street to the political pundits on our news channels, predictions offered with great certainty and voluminous justification prove, when evaluated later, to have had no predictive power at all.”

Quality Thinking

Andrew Hacker’s review of Silver in The New York Review of Books caught my attention when he questions James Weatherall’s intention as author of The Physics of Wall Street, exposing a different expectation:

“…the assumption that the quality of our thought can be enhanced by new methodologies.”

Certainly, Hacker’s impressive eloquence helps; but invoking quality in reference to thoughts struck a visceral chord. The variety and range implied by differences in quality intrigue us. They make the world more interesting. At the most basic level, variety compels trade and incites desire for diversity. Frequently, recombining ideas defines innovation, but does either necessarily signify progress, reflect higher-power thinking, or even spread benefits more widely?

Variety in objects or tangible goods naturally reaches its limits, and so too does our tolerance for diverse ideas. In products, declining sales make the limit recognizable in hindsight. In ideas, their displacement provides some evidence of their limited appeal, as in the transition to capitalism in the communist bloc or the return of Islamic fundamentalism in the Middle East.

Does a valued quality suggest our preference associates with a higher ranking of an object or an idea? Naturally, a higher ranking or rating indicates higher preference, especially when done consistently. For example, measuring liquid in litres vs. quarts neither enhances nor detracts from the quality of the liquid; the measure and the liquid’s qualities are independent of one another. In the US, quarts are the culturally preferred volume measure, and the preference persists for numerous reasons, some irrational, but few suggest higher-power thinking.

Of late, I am reading Scott E. Page’s book The Difference. He provides a series of examples to show the added value produced when multiple perspectives and varying rule-based approaches test a situation. Page’s training draws on the work of social scientists in multiple disciplines, and his examples, by design, demand minimal mental arithmetic and can easily be scaled. His fundamental premise challenges higher-order thinking as the ultimate value, treating diversity, flexibility and adaptability as ultimately more useful.

Then again, utility, or use, as an idea continues to prove itself resilient over time and earthly situations in spite of its competitors. I’m OK with some mystery, the unknowns that Nate Silver finds challenging and James Weatherall believes his approach can resolve. Whatever Big Data’s measurement methods, analysis models and their possible recombinations, I’m betting that diverse human preferences for truth will continue to prove self-limiting. That’s what ultimately makes life, and all its diversity, interesting!

Think before you send out that survey


What matters?  What you say?  or What you do?

Can we learn from a binary answer? There’s a lot of clarity in a simple Yes or No, in asking if we do or don’t, in differentiating something good from bad. Sure, the world and people’s sentiments are often less definitive or clear. The world is full of gray, and people tend to shy away from declaring something absolutely black or white. It’s why metrics include scales. It’s been hot, but if I want to know just how hot, I need to measure it.

Similarly, if I want to know more about how you feel, introducing a graded scale allows you to tell me something different than forcing you to choose between extremes. The challenge is whether survey questions really get me there.

Surveys are fine instruments when the universe of responses is “known,” or the scale of measurement is standard. For example, the difference between below 32°F and above it differentiates freezing, and 3 inches is one inch more than 2 inches. Measures of sentiment, belief or attitude are very difficult to standardize.
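The binary-versus-scale difference is easy to see in miniature. The question and response data below are invented for illustration.

```python
# Invented responses to "Was the service good?", first as a forced
# binary choice, then on a 1-5 Likert-style scale.
binary_answers = ["yes", "yes", "no", "yes"]
likert_answers = [4, 5, 2, 4]  # 1 = terrible ... 5 = excellent

yes_rate = binary_answers.count("yes") / len(binary_answers)
avg_score = sum(likert_answers) / len(likert_answers)

# 75% said yes -- but only the scale tells us how enthusiastically,
# and only if the scale itself means the same thing to every respondent.
print(yes_rate)    # 0.75
print(avg_score)   # 3.75
```

The catch, as the paragraph above notes, is the last comment: a 4 from one respondent may mean what a 3 means to another, which is exactly why sentiment scales resist standardization.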

I began my career forecasting elections for CBS News. The survey questions were very direct, as shown here, and the tabulations were equally direct. Not only were the samples very carefully drawn, but the population was also carefully estimated and provided a very solid denominator. Survey responses were voluntary but deliberately framed, whether the pollster asked questions over the telephone or in an in-person interview. The results predicted final election results with great accuracy and gave CBS and the other news outlets the advantage of calling winners before the real poll results were officially tabulated. It was the point of their surveys, right? Both candidates and news outlets conduct surveys to get an accurate handle on which way voter sentiments are going. The information is actionable, insightful and thus useful.

OK, some questions are worth asking because they help people warm up and get them to tell you what they really think or what really matters. But too often we want to know about sentiment and are not very good at framing the question. It’s in these cases that the scale matters. Frankly, I’m tired of sifting through survey results that don’t deliver. Surveys by themselves don’t engage someone, and the results often reflect the respondent’s level of engagement as opposed to their true sentiment on the subject. The more concrete the option, the more valid the results.

I admit I’m a sucker for information, especially when the promise of learning about my prospects’ attitudes, desires, interests, needs and concerns is freely available. But in the end I get what I pay for: yep, absolutely nothing.

Asking gets you where exactly?

Surveys, we hope, allow us simple, objective opportunities to ask what we want to know. The large-sample surveys that characterize much of the information being gathered by large consulting firms, and in which I’ve been a participant, don’t live up to their promise of cultivating knowledge or insight about their customers. In my view the approach is old school. Their ubiquity signals the ease with which surveys can be constructed, distributed and analyzed. We can get more for less; and at the end of the day, very little of it is truly useful or provides real insights.

I wonder why C-suite executives take part. What’s really in it for them? Do the questions draw them in and increase their attention to the topic? The receipt of results serves as quid pro quo for their input and clues them in on the feelings of their peers. That’s worth something, but again, old-school traditional competitive behavior doesn’t allow you to formulate strategies that deliver economic advantage.

Here are two examples of survey findings, you tell me if I’m off base.

CASE #1

The Wall Street Journal’s CIO blog last week  referenced a new study from McKinsey & Company.

  Roughly 49% of those surveyed said they are currently using Big Data to understand their customers better, while a combined 32% said they are currently using social media to interact with customers or promote their brands.

 

The survey asked 1,500 CEOs, CFOs and CIOs, in the first two weeks of April 2012, about three key business technology trends: Big Data and analytics; digital marketing and social tools; and flexible delivery platforms. More than half of the respondents reported that the first two had made it into the top ten priorities on the corporate strategy agenda. “In fact, for each of the three trends, between 6 and 9 percent say it is their organization’s top corporate priority.”

HMMM … wonder what that really means? The report went on to say that roughly a third of the executives reported their organizations looking to these trends to build a new business or tap into new profit pools. The kicker for me about the survey and its limited use hit home when McKinsey turned to ask whether their budgets included support for these priorities. I encourage you to take a look if you’re questioning my thesis. McKinsey notes that interesting differences emerged in the findings: “the discrepancy suggests the management team may lack clarity or a consistent concept about the investments needed to support digital business.”

CASE #2

The IBM 2010 Global CEO study discovered that CEOs cited creativity as the most important leadership quality over the next five years.

Further, IBM’s report stated that innovation needs to be a CORPORATE activity, not just an R&D activity.

Are your answers to these questions any different? What would you say is the most important leadership quality for business leaders over the next five years? How do your rankings compare to the ones shown in the graph? I suspect, but don’t know as I didn’t actually test this, that average citizens might respond similarly to CEOs because the question is so general. Why?

Improving your information relevance quotient

Have you ever asked a question when you already knew the answer? As in, “Isn’t Coke the best drink on the planet?” I don’t mean to disparage the authors’ and analysts’ objectivity in creating and conducting these surveys, or to criticize the soundness of their methods. These survey results represent sentiment, when the value I’m seeking comes from behavioral insights best obtained through other methods. In the end, the survey results merely add to the growing pile of information that has limited direct use and grows stale quickly, but costs time and energy to collect, tabulate, analyze and distribute.

What other evidence would corroborate findings about important CEO qualities? The available data would have to be public and somehow connect to actions undertaken by CEOs. Where do you look? You could mine their public statements or interviews, and you could look for supporting documents such as investments, resource allocations, new initiatives and the percentage of funding. The good news? Gathering, sifting, reviewing, compiling and delivering the information, with or without some analysis, is easier than ever. I’m just surprised that IBM, a very savvy research firm, hasn’t begun to compare the two and shared its findings. Watson is good for more than winning at Jeopardy, right?

In the next few weeks, I hope to try out a series of tools myself. For example, I learned this morning about Attensity. Text analytics systems are the plow horses that increase Big Data’s yield, put to work by none other than Google to enhance their search algorithm. Do you remember when Google used to share the likelihood that their results matched your search?
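As a toy illustration of the kind of work text analytics does, here is a minimal keyword-frequency sketch. Real systems like the one mentioned above do far more (entities, sentiment, relationships); the stopword list and sample text here are invented.

```python
import re
from collections import Counter

# A tiny, invented stopword list; real text analytics systems use
# much larger ones plus linguistic models.
STOPWORDS = {"the", "a", "of", "and", "to", "is", "in", "that", "it", "into"}

def top_terms(text, n=3):
    """Return the n most frequent non-stopword terms in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

sample = ("Big data needs text analytics. Text analytics turns raw text "
          "into signals, and signals into insight.")
print(top_terms(sample))  # [('text', 3), ('analytics', 2), ('signals', 2)]
```

Even this crude count hints at why such systems boost big data’s yield: frequency alone already surfaces what a document is about, before any deeper modeling is applied.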

I’m a big believer in skipping reinventing the wheel, so if you have done this analysis or know of any,  send them my way. I’ll keep you posted on things I find.

Are you tired of reading or even seeing trending information that lacks clarification or proper weighting?  If so, I’d love to hear about your experiences and any solutions you are finding promising.

Alternatively, if you do find the survey results meaningful or actionable, please share what makes these types of reports valuable to your work, or how they improve the outcomes of your tasks.