Cognitive Gadgets Theory Might Change Your Mind—Literally

A book by Cecilia Heyes gives teacher-educators something to think about


The scientific consensus on human cognition goes something like this:

The way we humans function is primarily the product of biological evolution, and this includes the functioning of our minds. Over the course of millennia, our minds have evolved such that we all think in basically the same way, using a combination of innate, automatic processes (“thinking fast”) and more deliberative, analytical approaches (“thinking slow”). Evolution has favored the former over the latter, because thinking fast requires far less mental effort than thinking slow. We can of course acquire new knowledge that expands our cognitive abilities—such as learning to read—but this is taxing and difficult, and only succeeds if we push through some of the evolutionary obstacles in our way. Put simply, our minds are, in the words of Dan Willingham, not built to think.

Virtually every claim in the previous paragraph is debatable, yet on the whole, it captures the view of cognition among scientists today.

Then along came Dr. Cecilia Heyes and her remarkable 2018 book Cognitive Gadgets: The Cultural Evolution of Thinking. If Heyes is right, then the implications for education and, indeed, all of humanity are profound.

What Is Cognitive Gadgets Theory?

Heyes’s central claim, as indicated by the title of her book, is that what makes humans distinct from other animals is that we’ve developed unique mechanisms of thinking—“cognitive gadgets”—that are largely the product of information passed from generation to generation through culture rather than genetic code. Cognitive gadgets, as Heyes conceives of them, are particular ways in which we process information to make sense of our world. Seen this way, cognitive gadgets might be thought of as “mental technologies” we’ve adopted because we’ve found them to be particularly useful in some way.

For example, take “mindreading.” As used by cognitive scientists, mindreading refers not to some Kreskin-esque ability to see the innermost thoughts of other humans but rather to the ability to ascribe mental states to others based on information we can glean from their behavior. When we see someone sobbing, we imagine they are thinking about something painful or hurtful. Alternatively, if they are laughing so hard they are crying, then we imagine they must have heard something funny.

Understood this way, developing the ability to mindread is akin to developing “mental literacy.” Possessing this cognitive mechanism is useful because it allows people to understand, explain, and often predict human behavior. (Indeed, and by way of contrast, Heyes speculates that people with Autism Spectrum Disorder may struggle to link human expressions to inner mental states, which is similar to how people with dyslexia struggle to link written words to abstract concepts.)

How does this connect to education? As Heyes points out, teaching is a salient example of where the cognitive gadget of mindreading proves useful:

Teaching is often described in contrast with other types of social learning as a process in which one agent doesn’t merely permit another to observe their behavior but acts with the intention of producing an enduring change in the mental states—especially the knowledge states—of another… Mindreading allows teachers to represent the extent and limit of a pupil’s current knowledge and, thereby, to infer at each stage in the learning process what the particular pupil must be shown or told to overcome ignorance, correct false beliefs, and build his or her body of knowledge. And, in a complementary way, mindreading by pupils enables them to isolate what it is the teacher intends them to learn and, thereby, to focus their efforts on particular aspects of a to-be-learned skill.

Mindreading, then, is a cognitive gadget that helps us transmit knowledge from one generation to the next. Few cognitive scientists would quarrel with that claim. But Heyes goes further and posits that the ability to mindread and other mental functions are culturally transmitted, rather than biologically inherited.

This is pushing the theoretical envelope. Knowing this, Heyes marshals a variety of empirical evidence to support her theory. Relying on her training as an experimental psychologist, she cites a medley of relatively recent research suggesting that our mental processes are inherited through cultural transmission. Regarding mindreading, for example, she points to a study that found “children in Samoa, where it is considered improper to talk about mental states, develop an understanding of false belief at around eight years of age—four or five years later than children in Europe and North America.” Buttressing this with citations to a few other experiments, she argues that “these studies indicate that we learn about the mind through conversation about the mind.”

The argument is compelling but also technical, and further research is needed, as Heyes herself admits. But even now, the cognitive gadgets theory is particularly important for a perhaps unlikely audience: teacher-educators within schools of education.

The False Dichotomy That Plagues Schools of Education

There’s a palpable skepticism among many of the faculty of America’s schools of education that cognitive science is relevant to the enterprise in which they are engaged, namely, preparing future teachers. Even among the faculty at programs that have voluntarily agreed to work with Deans for Impact to integrate learning science into their preparation process, we typically have to spend a great deal of time making the case that teachers benefit from a scientific understanding of how our minds work. We do this by connecting principles of learning science to other ideas or values that teacher-educators already possess. But very rarely, if ever, do we find faculty who are receptive to cognitive science on its own terms.

How did this come to pass? Dr. Ilana Horn at Vanderbilt University has offered one plausible history: behaviorist theories of learning gave way to cognitive theories, which in turn gave way to sociocultural theories. Sociocultural learning, as described by Horn, means learning as it “happen[s] in interactions in the world.” Since these interactions are shaped by the culture that surrounds us and the languages we speak, learning should be studied using tools of anthropology and linguistics, which stands in contrast to the cognitive focus on the functioning of an individual’s mind. Or as Horn puts it:

Language and culture were not just the setting for development and thinking—some kind of external variable to be controlled for—they were, in fact, fundamental components of these mental processes. This insight meant that, to explain some learning phenomena, researchers needed to do more than describe mental structures.

Cognitive gadgets theory affirms this statement while also requiring that we reimagine the relationship between culture and cognition. Instead of a false dichotomy between internal (cognitive) processes and external (sociocultural) influences, the two are fused into one harmonious model. Of course cultures shape how we think, but only individuals can think, using their individual minds. Neither should be neglected in our quest to understand the processes we use to make our world intelligible.

This nation’s schools of education are not underinvested in exploring the importance of sociocultural factors with teacher-candidates. By contrast, they have—at least until recently—neglected teaching the science of cognition to future teachers. Happily, cognitive gadgets theory underscores the value of understanding the interrelationship between the two.

A Future of Gadgets

Cognitive gadgets theory is just that: a theory. The power of the theory of evolution lies not only in its remarkable ability to explain the world as it exists today but also in its capacity to explain how the world might change in the future. What if cognitive gadgets theory has similar forecasting possibilities?

If it’s true that cultural evolution dramatically shapes how we think (and not just what we think about), and if it’s also true that we shape our culture through deliberate choices, then it’s possible that we can change how we think. Indeed, Heyes says these changes may take place far faster than biological evolution would ever permit:

The cognitive gadgets theory suggests that distinctively human cognitive mechanisms are light on their feet, constantly changing to meet the demands of new social and physical environments… [R]ather than taxing an outdated mind, new technologies—social media, robotics, virtual reality—merely provide the stimulus for further cultural evolution of the human mind.

This is an optimistic take, yet caution is warranted. It’s exciting to imagine that new technologies will help advance our collective cognitive abilities. It seems just as plausible, however, that these technologies might spread detrimental cognitive “mutations,” such as the impulse to troll others on social media. Such mutations wouldn’t necessarily advance humanity’s wellbeing.

We may have the power to shape how our minds work. If so, let us hope we culturally evolve in ways that foster inquiry, empathy, and human flourishing.

Benjamin Riley is CEO of Deans for Impact.
