Doubt in the Data: Why We Need to Think Harder About Data and Data Gathering


The Dangerous Duplicity of Data

By Mihnea Moldoveanu and Martin Reeves

One should be especially careful in using the words “reality,” “actually,” etc., since these words very often lead to statements [without any empirical content].—Werner Heisenberg

As recently as a decade ago, most business decisions were based on very limited data. More recently, an explosion of data, a dramatic decline in the cost of processing power, and advances in machine learning have created the expectation that we will be able to capitalize on a data windfall to revolutionize most aspects of business. The data (from the Latin datum, a thing that is given) on which machine learning and analytics are built are taken as given and incontrovertible. But what managers, data scientists, and social scientists think of as data is in fact not given. It is the outcome of a process of measurement—an interaction between an observer, a technique or apparatus, and a context.

As every seasoned executive knows, asking for data to inform an important decision can set in motion a process in which facts and figures are colored, filtered, obscured, deleted, shaded, clipped, and even fabricated. In practice, the result depends on whom you ask (observer dependence), how you ask (frame dependence), and when and under what conditions you ask (context dependence). Equally, the process by which we ask people to report preferences, emotions, and perceptions can interfere with the underlying state, to the point where inquiries can create rather than report the states they refer to. “Are you happy?” triggers a complex set of considerations about self and others that makes it overly simplistic to interpret “yes” as “s/he is happy.” Research suggests that individual dispositions and propensities are in fact second-person specific (“happy toward whom?”; “signaling happiness in the presence of whom?”).

Business does not have a clear and cogent way of dealing with these limitations of data and measurement; they are usually ascribed to error and noise. But these limitations are precisely the grist of quantum mechanics—one of the most successful predictive theories humans have developed to describe the world. Quantum mechanics has also produced a precise model of measurement built on concepts like indeterminacy, superposition, entanglement, and observer-dependence. This model can be applied in other contexts, to phenomena that do not occur on the space-time scales of quantum mechanics.

Indeterminacy

Indeterminacy refers to the impossibility of jointly measuring certain pairs of variables, such as the momentum and position of a particle, with high accuracy. When we survey social and organizational interactions that create measurements we call data, we find many examples of such Heisenberg-complementary variable pairs.
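For reference, the textbook statement of this limit, included here only as a touchstone rather than as part of the original argument, is the Heisenberg–Robertson uncertainty relation, which bounds how sharply two non-commuting observables can be known at once:

\[
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2},
\qquad
\Delta A \,\Delta B \;\geq\; \frac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle \bigr|
\]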

Say you are trying to measure, at the same time, the motivational force of a person’s emotional state and her awareness of that state. If her awareness of being in a state (“excited”) affects the motivational force of that state (because, for instance, the process of answering a question or filling out an instrument changes the intensity of the state), or vice versa, then we cannot measure both variables jointly with high accuracy. And if both variables are relevant to the person’s propensity to act in a certain way in a given situation, then we face an unavoidable trade-off: the more precisely we pin down one, the less we can know about the other.

Superposition, Nonlocality, and Entanglement

Quantum mechanics describes physical systems as being in superpositions of states (spin-up and spin-down) rather than in discrete states (spin-up or spin-down). Moreover, when we measure the state of one entity (an electron), our choice of what to measure (say, spin along one axis rather than another) can impact the measured state of other entities (other electrons), even if the second measurement is performed so far away that no signal traveling at or below the speed of light could connect the two. That is what physicists mean by “entanglement.”
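In standard notation (added here only as a reference point), a single spin in superposition and a maximally entangled pair of spins can be written as

\[
|\psi\rangle \;=\; \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1;
\qquad
|\Psi^{-}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}{\downarrow}\rangle - |{\downarrow}{\uparrow}\rangle\bigr).
\]

Measuring the first spin of the entangled pair gives a random result, but the second spin’s result is then fixed to be the opposite, however far apart the two are.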

Data-generating processes (producing “measurements”) in organizations can exhibit similar patterns.

Superposition. Humans frequently experience radical ambivalence with respect to internal states and dispositions to act. Research suggests that ambivalence is in fact not “uncertainty about how you really feel”—which could be resolved by clever observation and inquiry—but a superposition of dispositional, motivational, or emotional states. Measurements will collapse this superposition into a single state corresponding, say, to one motivation or another. Whatever we end up declaring to be data—the stuff we feed to predictive algorithms and take as axiomatic—depends on the measurement process we used to “collapse the superposition.”
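A toy simulation can make this concrete. The sketch below is an illustrative analogy rather than the authors’ model, and every name and parameter in it is hypothetical: an ambivalent respondent is represented as a two-component state, a survey framing as a choice of measurement basis, and each answer as a one-shot “collapse” into yes or no.

import numpy as np

# Toy analogy only: an ambivalent respondent is a normalized two-component
# state; a survey framing is a rotated two-outcome measurement basis; each
# "measurement" collapses the state into a single yes/no answer.
rng = np.random.default_rng(0)

# Underlying ambivalent state: weights of roughly 64% "engaged" vs. 36% "disengaged".
state = np.array([0.8, 0.6])

def measurement_basis(angle_rad):
    # A framing is modeled as a rotation of the neutral yes/no basis.
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, s], [-s, c]])  # rows are the two outcome vectors

def survey(state, framing_angle, n_respondents=10_000):
    # Born-rule-style probability of answering "yes, engaged" under this framing,
    # then many independent one-shot collapses into yes/no answers.
    p_yes = float(measurement_basis(framing_angle)[0] @ state) ** 2
    answers = rng.random(n_respondents) < p_yes
    return answers.mean()

print("Neutral framing :", survey(state, 0.0))         # roughly 0.64
print("Leading framing :", survey(state, np.pi / 8))   # roughly 0.94
print("Negative framing:", survey(state, -np.pi / 8))  # roughly 0.26

The point is not that human affect literally obeys quantum mechanics; it is that the framing chosen by the measurer is part of what produces the number that ends up in the dataset.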

Nonlocality. When we try to create composite estimates of the variables that matter to the success of an organization (like “heed” or “openness”) by measuring individual-level variables (like “attention span” and “reward sensitivity”), the responses we get likely reflect nonlocal interactions, even if information flow is strictly classical in nature. Managers behave and respond in ways that depend on underlying “epistemic networks,” which reflect what they think others in their network think, what they think others think they think, and so forth. They often give answers and produce behaviors that shape and adapt to the perceived social context.

Observer-Dependence and Frame-Dependence

As every CEO knows, initiating a “transformation process” is not just hard but treacherous work: the information you need to do the right thing will be highly dependent on the perceptions, incentives, and behaviors of those on whom you rely to provide it. The choice of dimensions and rubrics on questionnaires and on interview and focus-group scoring sheets will generate different response patterns depending on the wording, the sequence of questions, grammatical complexity, and perceived intent and purpose—as well as on the perceived emotional temperature (active/passive, positive/negative, dominant/submissive) of the instrument we use to inquire.

We labor under the comfortingly simple but ultimately dysfunctional illusion of “classical information” and “classical measurement” when we deal with human and organizational phenomena. Insights from quantum epistemology raise powerful doubts about the given-ness of data. But businesspeople need more than reasonable doubt: they need insights and action prompts. How can we leverage “quantum effects” in human organizations? The “quantum epistemology of social phenomena” is in its infancy but already can provide a battery of new questions for those who want to understand and shape the process of measurement.

Contextualizing the “Given.” The examples above demonstrate how important it is to specify the process by which data are generated, and, in particular:

  • What is the purpose of the measurement?
  • What is the measurement “apparatus”?

Calibrating the Importance of Quantum Effects. These effects will be more critical in some contexts than others, leading us to ask:

  • Do the people whose behavior we are measuring interact with each other?
  • Does the measurement involve complementary variables, like perceptions and emotions?

Separating the Separable. The examples also show how important it is to consider the pairwise interactions among different measurements, which suggests that we should also ask:

  • What are the organizational variables whose measurements are likely to interfere with one another?
  • By what mechanism does this interference happen?
  • How do we mitigate it?

Making Heisenbergian Uncertainty Part of the “Interrogator’s Toolkit.” Something is real if it is real in its consequences, argued sociologist David Émile Durkheim in the early 20th century. Every strategist has a little Durkheim floating around in her mind: she knows that even talking about measuring a variable that impacts a decision with economic consequences will itself have economic consequences. These considerations are rarely applied to measurement processes—leading us to ask:

  • How does the decision to measure shape perceptions and motivations?
  • How does the specific way in which we measure shape them?
  • How does the announcement of the results shape subsequent attempts to measure—and the outcomes?

Taking Superpositions Seriously. Just as quantum computation harnesses the phenomenon of superposition to do useful work, taking the radical indeterminacy of affective states seriously can generate useful strategies for intervening in organizations, by asking:

  • What is the likely set of states the system we are trying to measure is “simultaneously” in?
  • What are the ways in which a measurement can “make real” a possible world we want to bring about—or one we desperately wish to avoid?

To create value in today’s fluid and dynamic organizations and business ecosystems, we need to better understand how people’s interactions themselves generate it. We need to reinvent the measurement tools we use to understand organizations, and quantum mechanics provides a promising basis for doing so.

Originally published by Scientific American


The BCG Henderson Institute is Boston Consulting Group’s strategy think tank, dedicated to exploring and developing valuable new insights from business, technology, and science by embracing the powerful technology of ideas. The Institute engages leaders in provocative discussion and experimentation to expand the boundaries of business theory and practice and to translate innovative ideas from within and beyond business. For more ideas and inspiration from the Institute, please visit Featured Insights.
