MIT neuroscientist Earl Miller, BA '85, continues to break new ground in the understanding of cognition, and his research may help us move beyond the limits of the brain's working memory.
By Adam Piore, Photography by Jason Grow
In the rehearsal space of the Boston band What She Said, Earl Miller lays into his bass guitar, plucking out a funky groove. In a black band tee, faded cargo pants and signature newsboy cap, Miller looks like a seasoned musician you'd see in any corner dive bar.
But at his nearby office at MIT, Miller is nothing if not professorial. How could that rocker in the cap be the same bookish academic now gazing solemnly at me across his paper-strewn desk at the Picower Institute for Learning and Memory? The jarring contrast between the two Earl Millers is a fitting way to begin a discussion of the pioneering neuroscientist's work.
After all, some of Miller's biggest contributions to the field over the past 20 years have explored exactly how contrasts like these are possible; how it is, in other words, that human beings, or any other animal with a brain, are able to seamlessly adapt behavior to changing rules and environments. How is it that distinct populations of brain cells, or neurons, are able to work together to quickly summon an appropriate response? How do we know when it's fitting to play a Patti Smith bass line, and when it's time to explain the complex workings of brain waves?
This mental flexibility is so fundamental that it's easy to take it for granted. But there are few functions the brain must perform that are more complex or crucial to survival than recognizing when something has changed and then calling up all the disparate information needed to adapt appropriately.
"Think about what we're doing here," Miller says. "Right now. We're sitting on chairs. We're taking turns talking. This is based on rules. We've learned how to behave in this context we're in right now."
To pull off tasks like these, the brain uses something called working memory. Cognitive psychologists coined the term in 1960 as they tried to explain the fundamental structure of the human thought process.
Try to hold that last sentence in your mind, or memorize a phone number you're about to dial, and you'll have engaged this critical brain system.
Miller has spent the past two decades trying to understand the mechanisms behind working memory, and he believes the key lies in the brain's prefrontal cortex. Insights into this thin layer of neurons at the front of the brain could answer questions that have flummoxed scientists for generations. They might have practical uses, too.
Experts have long known that we have a virtually unlimited capacity to store new long-term memories. Yet there's a limit on how much information we can cram into our working memory.
In studying the prefrontal cortex's functions, Miller and others are coming closer to finally explaining this contradiction. And by solving this riddle, we may find ways to get beyond those limits.
Someday, Miller believes, he'll be able to make us all smarter.
Building the Picture
As postdoctoral fellows at Baltimore's Johns Hopkins University in the late 1950s, David Hubel and Torsten Wiesel set out to solve a long-standing mystery: What happens in the brain when we see objects and shapes?
Every one of us has about 100 billion neurons, separated by gaps called synapses. Neurons talk to each other by passing signals across these spaces. When one neuron's signal is strong enough, it causes the neuron on the other side of the synapse to fire an electrical spike. When that second neuron fires, it passes messages to all the other neurons it's connected to, which can cause those neurons to fire. This sequential firing of neurons allows us to think, to move, and to see.
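For readers who like to see a mechanism spelled out, here is a minimal Python sketch of that threshold-and-fire behavior, using the textbook "leaky integrate-and-fire" abstraction. Everything in it, the equations, the parameter values, the numbers, is an illustrative assumption of ours, not a model from Miller's lab:

    import numpy as np

    # Toy leaky integrate-and-fire neuron: input charges the membrane
    # voltage; crossing the threshold produces a "spike" and a reset.
    def simulate(input_drive, dt=0.001, tau=0.02,
                 v_rest=-0.070, v_thresh=-0.054, v_reset=-0.075):
        v, spike_times = v_rest, []
        for step, drive in enumerate(input_drive):
            # The leak pulls the voltage back toward rest; input pushes it up.
            v += (-(v - v_rest) + drive) / tau * dt
            if v >= v_thresh:              # threshold crossed: the neuron fires
                spike_times.append(step * dt)
                v = v_reset                # ... and resets
        return spike_times

    # A strong, sustained input drives repeated firing over one second;
    # a weaker one never reaches threshold at all.
    print(len(simulate(np.full(1000, 0.020))))   # fires repeatedly
    print(len(simulate(np.full(1000, 0.010))))   # stays silent: 0

The point of the toy model is the all-or-nothing threshold: a neuron either passes the message downstream or it does not.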
After a series of experiments performed on the visual cortex of animals, Hubel and Wiesel argued that it is the consecutive firing of individual, specialized neurons, each responsible for a specific detail in a picture or pattern, that helps us build complex images in our mind's eye. Their work earned them the Nobel Prize in Physiology or Medicine in 1981.
As it happened, Miller entered Kent State University the same year, with dreams of becoming a doctor. That quickly changed when he started working in a neuroscience lab.
"The moment I first dropped an electrode into a slice of brain and heard all these neurons firing away like a thunderstorm, I was hooked," Miller recalls.
As a Princeton University graduate student, Miller studied the inferior temporal cortex, a patch of neurons slightly forward of the visual cortex. Scientists had demonstrated this was the region that knits together a unified image from all the complex individual components Hubel and Wiesel identified. Then it starts the "higher level" processing of the outside world.
By the time Miller earned his PhD in 1990, he was asking the questions that would later define his career: What happens in the inferior temporal cortex after a unified picture emerges? How do our brains tell us what it means?
Miller tried to answer those questions while working in the lab of National Institute of Mental Health neuroscientist Bob Desimone. Miller was looking for neurons that fired only when an animal spotted an item it was storing in short-term memory. Miller and Desimone trained animals to hold a single image in mind, such as an apple, and release a lever when that picture reappeared on a screen.
If the animal remembered the first picture it saw and released the lever, a drop of delicious juice would roll down a tube and into its cage.
The pair noticed that certain parts of the animal brain were inherently sensitive to repetition, regardless of whether it translated into a valued juice reward. Some neurons fired when animals saw a second banana or a second image of trees. It was as if the brain was on automatic pilot, primed to notice repetition without any active effort to do so, even when that repetition had no meaning.
But the pair also discovered a second type of firing pattern. When the animal spotted a picture it was actively holding in its memory, hoping for a juice reward, not only did different neurons fire; those neurons fired far more intensely.
"Something was switching the volume to make these neurons fire, more or less, depending on the nature of the memory," Miller says. "That got me wondering. Who's turning up or down the volume?"
Turn It Up
Scientists have suspected that the prefrontal cortex plays a key role in high-level cognition since the case of Phineas Gage. On Sept. 13, 1848, Gage, who worked in railroad construction, was setting an explosive charge with a tamping iron when the gunpowder detonated, rocketing a metal rod up through the roof of his mouth, into his left frontal lobe and through the top of his skull. The rod landed 75 feet away, coated in pieces of Gage's brain.
Miraculously, Gage survived and could speak, walk and function. But, it was written later, he could no longer stick to plans and lost much of his social grace and restraint.
From studying Gage and others like him, neuroscientists surmised that the frontal lobes performed the brain's "executive functions." They run the business of thinking and processing and directing the spotlight of attention. And yet, nearly 150 years after Gage's famous injury, scientists were still trying to understand how the frontal lobe works.
So, when Miller started his own lab at MIT in 1995, he decided to switch his focus to the prefrontal cortex. By then, some of his peers had already shown that clusters of neurons in lab animals would fire repeatedly in the prefrontal cortex during memory exercises. Their results suggested this region houses our working memory.
To Miller, however, this didn't explain how the executive areas of the brain could "turn up the volume" on memories associated with free juice.
How does the animal know how to do the task? How does the animal know the rules?
"I thought that was the most important thing," Miller says. "I didn't understand why no one was studying it. Context-dependent behavior is what high-level cognition is all about."
In his new lab, Miller designed an experiment that complicated the choice his animals faced. Instead of just showing an animal a picture and training it to respond every time it reappeared, he varied the number of possible responses by adding a second cue.
Miller predicted he'd detect activity in multiple neurons in the prefrontal cortex every time he changed the rule. These neurons, he believed, somehow turned up or down the "volume" of the neurons he'd recorded in other areas of the brain.
Not only was Miller right, but the rule change consistently caused twice as many neurons in the prefrontal cortex to fire as in the simpler experiments, where the task required the animal to just hold a picture in mind.
"That told us something," he says. Perhaps the prefrontal cortex's primary job wasn't short-term memory at all, but learning the rules of the game.
In 2001, Miller published a research review that fundamentally shifted the way many viewed the prefrontal cortex. Miller compared the prefrontal cortex to a railroad switch operator, and the rest of the brain to railroad tracks. The switch operator activates some parts of the track and takes others offline. This model would explain how attention works: how an animal can focus on a picture while suppressing a noise, for instance, and why Phineas Gage had trouble blocking out distractions and focusing on the task at hand.
The theory made intuitive sense. But to some, steeped in the specialized-neuron theories of Hubel and Wiesel, Miller's theory seemed preposterous.
"That's impossible!" Miller recalls one prominent neuroscientist declaring after Miller delivered an invited lecture. "We all know that neurons do one thing. Your problem is you can't figure out what these neurons are doing," the researcher told him.
But Miller has continued to accumulate experimental evidence, as have many other labs, gradually winning scientists over to his idea.
"Neurons are multifunctional," Miller says. "We've shown this over and over again for 20 years."
Wave Change
These days, Miller is taking on another piece of dogma: that neurons primarily communicate by electrical spikes. In recent papers, Miller argues that there's still a lot to learn from the intermittent electrical currents called oscillations, or brain waves.
When we hold an item in working memory, these oscillations move through brain circuits in waves that rise and fall scores of times per second. These oscillations, he argues, are how the prefrontal cortex, that mental "switch operator," stores several items on the cusp of our awareness in working memory, so we can pull them into our conscious minds as needed.
The oscillations aren't enough to make the neurons spike. But the brain waves bind together all the neurons in a circuit with every crest, pushing the neurons so close to their firing point that they're primed to respond to just the slightest extra stimulus.
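To make that priming idea concrete, the same toy neuron from earlier can be given a subthreshold oscillatory drive. In this illustrative sketch (all parameter values are arbitrary assumptions of ours, not measurements from Miller's experiments), the wave alone never fires the neuron, and neither does a weak stimulus alone; but the same weak stimulus arriving near the wave's crest pushes the neuron over threshold, while at the trough it fails:

    import numpy as np

    dt, tau = 0.001, 0.02
    v_rest, v_thresh = -0.070, -0.054
    times = np.arange(0.0, 1.0, dt)

    def spikes(stim_time, osc_amp=0.012, stim_amp=0.020, stim_dur=0.010):
        # An 8 Hz wave that rises and falls but never reaches threshold itself.
        wave = osc_amp * (1 + np.sin(2 * np.pi * 8 * times)) / 2
        v = v_rest
        for step, t in enumerate(times):
            drive = wave[step]
            if stim_time <= t < stim_time + stim_dur:
                drive += stim_amp              # a brief, weak extra stimulus
            v += (-(v - v_rest) + drive) / tau * dt
            if v >= v_thresh:
                return True                    # the neuron fired
        return False

    print(spikes(stim_time=9.9))                  # wave alone: False
    print(spikes(stim_time=0.060, osc_amp=0.0))   # stimulus alone: False
    print(spikes(stim_time=0.035))                # stimulus at the crest: True
    print(spikes(stim_time=0.090))                # stimulus at the trough: False

Neither signal suffices on its own; the crest of the wave is what leaves the neuron "primed."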
This might help answer a question that has long intrigued scientists: How can the human brain store a virtually unlimited number of long-term memories, yet remain severely limited in the information we can hold in our conscious minds at once?
It's a limit most notably characterized by Princeton cognitive psychologist George Miller (no relation) in a 1956 paper, "The Magical Number Seven, Plus or Minus Two." George Miller, who helped coin the term working memory, argued that seven, plus or minus two, is the maximum number of objects most of us can hold in our short-term memory at once. Researchers have since demonstrated the number can vary far more widely and may even be smaller than seven. But no one doubts there are limits. (See sidebar below.)
If working memory is encoded in oscillations, Earl Miller says it would explain these limits, because a single wave can only rise and fall a certain number of times a second. "That means you have to fit in everything you want to juggle in your current conscious mind," he says. "That's a natural limitation in bandwidth."
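As a back-of-envelope illustration of that bandwidth argument: if each remembered item claims one cycle of a fast brain wave, and the whole set must repeat within one cycle of a slower wave, capacity is capped by the ratio of the two frequencies. The nesting scheme here follows Lisman and Idiart's well-known 1995 proposal rather than anything stated in this article, and the frequencies are typical textbook ranges, not Miller's figures:

    # Assumed, illustrative frequencies; capacity = fast cycles per slow cycle.
    gamma_hz = 40        # fast "item" oscillation, roughly 30-80 Hz
    theta_hz = 6         # slow carrier wave, roughly 4-8 Hz
    print(gamma_hz // theta_hz)   # -> 6 items, near "seven, plus or minus two"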
Brad Postle, a University of Wisconsin-Madison neuroscientist, says the idea that something other than the spiking of neurons is important has been "kicking around for a while." Postle himself suggested brain waves may play a role in focusing attention. Still, he believes it's significant that Miller is now arguing the point.
"Having it come out of Earl Miller's mouth almost by definition will bring attention to it," says Postle, who authored a widely used neuroscience textbook that includes many of Miller's earlier experiments. "Earl is kind of a rock star. When he says something, a lot more people notice it."
Now, Miller is focusing on new technologies that might actually enhance working memory capacity.
"If we find a way to stretch the cycle, increase amplitude, make it taller or maybe slow the frequency a little bit, maybe we could increase the capacity of working memory," he says.
So he's planning on experimenting with a technique that uses electrodes placed on top of the scalp to deliver faint pulses of electricity and record the impact. If these pulses are timed correctly, they could change the shape of the brain waves.
It would be a significant technological feat, but Miller thinks it'll work. If he's correct, it could have a profound impact on human performance, literally expanding our brainpower.
Adam Piore is a writer based in Boston, Mass. Excerpted from an article that originally appeared in the October 2016 issue of Discover magazine.
The power of paying attention
When neuroscientist Earl Miller, BA '85, spoke at commencement, he gave the graduates some practical advice that boiled down to one word: focus. "Multitasking ruins productivity, causes mistakes and impedes creative thought," he says. Below, he breaks down why that is, and what you can do about it.
Multitasking is a misnomer
Your brain has limited capacity for simultaneous thought. Humans can only hold a little bit of information in mind at any single moment, but your brain deludes you into thinking you understand more about what's going on around you than you actually do. For example, you probably think you see almost everything in front of you. But you're actually sipping at the outside world through a straw. Your brain can only take information in little snippets, which it combines to give you the illusion of seamless visual perception.
You can't pay attention to two things at the same time. Toggling between tasks requires a series of small cognitive shifts. You may think you're juggling two tasks at once, but actually you're switching back and forth very rapidly. For example, if you interrupt a project to check an incoming email or watch a cat video, when you finally return to the task your brain has to expend valuable mental energy refocusing, backtracking and fixing errors.
Whenever you switch from one thought to another, you cognitively stumble a little bit. Humans have a great ability to change their thoughts from moment to moment, but it comes at a cost. As the cognitive apparatus in your brain reconfigures from one mode of thought to another, you slow down, make more errors and miss things.
You're less likely to think creatively if you multitask. Innovative thinking comes from extended concentration, i.e., the ability to follow links between thoughts. Memory is a big network of associations. Truly deep and creative thoughts come from following the path of this network to new and different places. When you try to multitask, you typically don't get far enough down any path to stumble upon something original because you're constantly switching and backtracking. Multitasking lowers the quality of your thoughts, making them more superficial, less creative and less innovative.
Your brain is ill-equipped to handle sensory overload. In prehistoric days, when the human brain first evolved, the environment was different. Any new information could be critical to survival: a rustling in the bush might mean a tiger is about to leap out. It was adaptive for our brains to seek out and pay attention to new information, and our brains also evolved to focus on one thing at a time. In today's society, however, the ceaseless onslaught of information has the potential to cripple us. What was once an evolutionary advantage has become a distraction.
You may think you're good at multitasking, but you're not. Studies have shown that people who think they are good at multitasking are actually the worst at it. People who multitask a lot do so because they are easily distracted. Then they rationalize it by convincing themselves that they are really good at multitasking.
Improve your ability to focus
Block out a period of time. Think ahead about what you need to accomplish, and plan to focus instead of trying to multitask.
Eliminate as many distractions as possible. Work in a quiet environment. Put away your mobile phone and tablet. Shut off extra computer screens. Turn off email alerts and shut down your email if you have to. Don't try to monotask by willpower alone; it's too hard to fight the brain's craving for new information. Instead, prevent the urge by removing the temptation.
Work on one task at a time for extended periods. Your work quality and productivity will improve if you focus on one task at a time.
Take a short break. If you feel yourself losing focus, walk around a bit. It increases blood flow to the brain and helps restore focus.
Prioritize by doing the most important tasks first. This removes some of the pressure to multitask as deadlines draw near.
Introduce novelty. What we perceive is not a faithful representation of the world. Our brain is constantly interpreting sensory inputs, and if something is familiar, the brain begins to gloss over it. For example, to catch errors when proofing a paper, change the formatting or read it aloud to "wake up" your brain and cause it to pay closer attention.
Put away your cell phone when you drive. Your ability to pay attention to the road while you talk on the phone is another delusion. It is estimated that as many as half of the car accidents in the United States are due to distracted driving. Studies have shown that talking on the phone causes drivers to miss as much as half the things in front of them.
Hands-free headsets don't help much, because it's the cognitive demands of conversation that cause the distraction. (Talking to a passenger is different, because they know when to shut up.)
And if you find yourself focusing intently on a radio program or an audiobook, turn it off. You have a limited pool of cognitive resources. Multitasking while driving is just plain dangerous.
–Earl Miller
Learn more about Professor Miller's research at