
“Limitless” and Buick Ad Get the Brain All Wrong

[ 12 ] March 23, 2011

Bradley Cooper in "Limitless"

Tracing the influence of present-day brain research on mainstream media and the film industry is a fascinating but routinely frustrating endeavor. The hope is that filmmakers and advertising executives will treat their powerful platforms as an opportunity to engage viewers and buyers with real science, and that in the few moments they have to slip in a brain reference, they will make sure what they say reflects current understanding. The optimistic result could be a gradual correction, in the public consciousness, of some prevalent urban legends about the brain.

But more often than not, we see mass media take the easy road– the road that doesn’t require consulting real scientists or taking a genuine interest in scientific accuracy. Instead, the brain is often used for the alluring and impenetrable aura it exudes, to further a fantastical plot or to sell more products. It is one thing to stay entirely in the realm of fantasy– it is another to claim a basis in the real world, get your facts wrong, and lead viewers to believe that what you’re saying is right.

In recent years, we’ve seen Avatar– with some significant gaps in logic, though overall a genuinely intriguing take on a global neural network– and Inception, which, in my opinion, made the mistake of trying to ground its fantastical worlds in how humans actually dream (Ellen Page’s endless scenes of rationalizing how this was all possible) while taking absolutely nothing from modern sleep and dreaming research.

Today there are two more significant mentions of the brain swirling in mass media. One is the trailer for the film Limitless, which opened this past weekend in the U.S. Let’s just deal with the trailer, which first grabbed my attention when a male character says the following to Bradley Cooper:

“You know how they say that we can only access 20% of our brain? This lets you access all of it.”

Cut to flashy digital animations of brain tissue. And so another generation will go on believing that what “they” say about using only a fraction of your brain is correct. This is an incredibly lazy and irresponsible tidbit to include in a trailer that has now been seen by millions of people around the world. From a Scientific American article published in 2008:

Though an alluring idea, the “10 percent myth” is so wrong it is almost laughable, says neurologist Barry Gordon at Johns Hopkins School of Medicine in Baltimore. Although it’s true that at any given moment all of the brain’s regions are not concurrently firing, brain researchers using imaging technology have shown that, like the body’s muscles, most are continually active over a 24-hour period. “Evidence would show over a day you use 100 percent of the brain,” says John Henley, a neurologist at the Mayo Clinic in Rochester, Minn.

In the same vein of mainstream irresponsibility, a new Buick car commercial currently running on all the major networks makes an outrageous claim about the brain’s daily function.

The voiceover begins: “Humans have three thousand thoughts a day. The engine of the Regal Turbo has a hundred and twenty five million thoughts a second.”

I could not believe this ad copy made it through any rational, modern-day chain of command. Where did this tidbit come from?

And it goes on, making a completely impossible comparison between the “thoughts” of a mind, however you quantify those, and the “thoughts” of a car’s engine, attempting to mine the allure of the brain to sell more cars.

One can’t even begin to counter this ad because it’s not clear where their research process began and ended, and where they plucked that number from. If you ask any respectable and humble neuroscientist today, the answer you’d get is that there is no way we can even quantify a single thought, let alone figure out how many we have in a day, with current scientific methods. Not even close.

But if you had to guess, just feel it out, does 3,000 a day– which averages out to somewhere between 2 and 3 thoughts a minute– seem intuitively right to you? If the guy driving that car in the Buick ad is having just 2 or 3 thoughts a minute I pray his car is really having 125 million a second, for his safety and the safety of us all.
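For what it’s worth, the arithmetic behind that range is easy to check. Here is a quick back-of-the-envelope sketch in Python; the 16-hour waking day is my own assumption, not anything from the ad:

```python
# Back-of-the-envelope check of the "3,000 thoughts a day" claim.
thoughts_per_day = 3_000

per_minute_24h = thoughts_per_day / (24 * 60)  # spread over a full day: ~2.1
per_minute_16h = thoughts_per_day / (16 * 60)  # over ~16 waking hours: ~3.1

print(f"{per_minute_24h:.1f} thoughts per minute over 24 hours")
print(f"{per_minute_16h:.1f} thoughts per minute over 16 waking hours")

# And the comparison the ad is making:
engine_thoughts_per_second = 125_000_000
human_thoughts_per_second = thoughts_per_day / (24 * 60 * 60)
print(f"Engine-to-driver ratio: {engine_thoughts_per_second / human_thoughts_per_second:,.0f} to 1")
```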

Perhaps Buick ripped the 3,000 thoughts a day factoid from a quote by A-Rod’s performance coach, Jim Fannin, in a 2004 New York Times article:

“Superstars don’t think like everyone else,” Fannin said. “The average person has 2,000 to 3,000 thoughts a day, and 60 percent of the average person’s thoughts are in chaos. The superstar has 1,100 to 1,300 thoughts a day. They eliminate worry, envy, jealousy, embarrassment and anger. The superstar thinks a lot less and holds a thought longer.” [source]

If A-Rod only has 1,100 thoughts a day, that may make sense. And maybe Jim Fannin now writes copy for Buick ads.

Learning to Walk

[ 5 ] January 27, 2011

I have no memory of learning to walk.  I fill the void with out-of-body images: photographs my parents took and slipped into the plastic sleeves of our family archives.  In one photo, I’m pushing myself up from the floor: arms straight, bottom up, feet on tiptoes, as if in yogic preparation for what’s to come.  In another, I am standing up straight, holding the leg of our china cabinet with one hand and reaching the other outward for balance.  I’m looking down at my feet with great focus. My little slippers show no signs of wear – yet.

Of course I’m not alone; none of us remembers our first step.  All we know is that it happened some time at the beginning of our lives.  As a species, meanwhile, we might assume that our “beginning” happened when we began to walk.  There is no memory – no knowledge – of when exactly this happened.  And so we fill the void with fossils that might suggest a divergence from the four-legged locomotion of our ancestors.

But when paleoanthropologists sift through African rubble and ash, they are not only looking for traces of bipedalism.  They hope that their sweat-drenched labors will yield evidence of something greater, yet infinitely more elusive: consciousness.  What exactly consciousness is still incites debate; its varied definitions begin inside the brain and move outward.  Cognition in the head, language at the mouth, creation with the hands and, finally, some subjective relationship with the rest of the world.  Since not all of these phenomena fossilize, paleoanthropologists can only try to unearth manifestations of conscious – or “modern” – behavior.  Tools, for example.  Or art.

When, then, did humans gain these two great capacities – walking and conscious behavior?  And is there any connection between the two?

One Step for Early Hominids

Lucy

Ardi

In 1974, upright walking got a mascot: a fossil hominid named Lucy, a member of Australopithecus afarensis.  Paleoanthropologist Donald Johanson spotted part of her femur sticking out from the dusty ground in the Awash Valley of Ethiopia, and three weeks later his team had unearthed forty percent of her skeleton.  They could tell from the way her knee bent and her lumbar spine curved that she had spent most of her time on two feet.  They also determined that, at 3.2 million years old, she was the earliest known example of a walking hominid.

Another find in Laetoli, Tanzania a few years later more or less confirmed this chronology.  A team led by Mary Leakey discovered three sets of footprints pressed into volcanic ash and cemented by rainwater.  No knuckle prints preceded the impressions; this was not the path of lumbering primates.  Upright hominids had walked here, too – this time, 3.7 million years ago.

Then, in 1992, archeologists unearthed another skeleton, just forty-six miles from the site where Johanson found Lucy. Ardipithecus ramidus, or Ardi, moved both on all fours and on two feet – and she did so 4.4 million years ago.

Movement & Consciousness
Despite an ever-fluctuating timeline, we know that our ancestors reared up on their hind legs some several million years ago.  More sophisticated endeavors, it turns out, came much later.  In seeking to qualify “consciousness” for a scientific audience, the archeologist Christopher S. Henshilwood defined modern human behavior as “thoughts and actions underwritten by minds equivalent to those of Homo sapiens today.”  In 1991, in a cave on the coast of South Africa, Henshilwood found an example of this: small pieces of ochre, crosshatched and otherwise engraved.  The creators had used tools, certainly, and since the abstract images on the ochre suggest some “arbitrary conventions unrelated to reality-based convention,” perhaps the tool-bearers had symbols in mind.  Maybe they were making art for art’s sake.  In any case, this was modern human behavior at its purest, and its earliest.  Henshilwood estimated the ochre to be 77,000 years old.

Crosshatched Ochre

77,000.  That number never fails to astound me.  Because if hominids began walking around millions of years before they began to resemble Homo sapiens, then walking shaped their consciousness – our consciousness.  Walking makes us think.  It makes us human, not just as upright walkers but as creatures of reason and creativity.  It was, and is, something as fundamental to the human experience as sex, or death.

Contemporary philosopher Alva Noë, in his book Out of Our Heads, writes that the whole of the conscious human experience relies upon movement; that “consciousness is not something that happens in us.  It’s something that we do.”  Noë critiques some cognitive scientists’ tendency to relegate consciousness to the brain alone and unwed it from the world around the body.  In Noë’s mind, the two are inextricable: the former would not exist without the latter.  “Perceptual consciousness” involves accessing objects and visual information by moving a hand or an eye toward them.  Experience, according to Noë, lies in the “two-way dynamic exchange between the world and the active perceiver.”  You can’t think without moving through or towards something.

Timeline of the Hominid Lineage

It is easy to imagine the roots of consciousness, or early modern behavior, developing with the movements of our Homo erectus ancestors.  Early humans were nomads, after all.  They sought food and fertile mates, and when abundance turned to scarcity, they moved on.  “Neither humanity nor its environment was static,” writes Joseph Amato in On Foot.  “Walking was shaped to place and place was shaped to walking.”  Early humans centered their lives on the assumption that they were transient, and that the land they encountered would always be moving past them.

Before I learned to walk, I slept in a crib.  My parents placed their swaddled babe in the middle of the crib each night, only to find come morning that I had somehow moved myself forward.  My father recalls watching me, on several occasions, trying to propel myself through the front of the crib – my head pressed against the wooden slats, my arms and legs performing a determined, dry-land breast stroke.  “If every newborn baby has an appetite for forward motion,” writes Bruce Chatwin in The Songlines, “the next step is to find out why it hates lying still.”  My efforts earned me the title “The Go-Go Girl.”

A Wandering Artist

If movement stimulates thought, then perhaps lots of movement allows the mind to reach new planes of contemplation and creativity.  The Greeks explored this idea with their creation of the god Dionysus (Bacchus, in Latin).  Bringer of wine, pleasure, and the Bacchic euphoria, the god’s epithet is, simply, “The Wanderer.”  He moves with a posse of revelers; in Euripides’ Bacchae, he “urges on the wandering band with shouts and renews their frenzied dancing, as his delicate locks toss in the breeze.”

But Dionysus is more than a party animal; he’s an artist.  What looks like debauchery is actually a kind of feverish groping for inspiration; a successful frenzy climaxes in an ecstatic release of dance, music, and unfettered creativity.  And while everything about Dionysus may be over the top – the booze, the sex, the staying up until all hours – his perpetual state of motion seems to prime him for enhanced activity, creative or otherwise.

The Cult of Dionysus was an affront to the civilized self-restraint championed by Greek city-states (and personified by the god Apollo).  It was also wildly popular among Greek citizens.  They worshiped a god who was, in many ways, much more free than they could ever be.  Dionysus’ ability to reach a state of pure pleasure and existential clarity is certainly enviable.  Is it a coincidence that the Greeks conceived of a figure whose identity was inextricable from – and enhanced by – ceaseless movement?  Perhaps the Greeks, in their infinite wisdom, recognized walking as something that defines being human, and purposefully bestowed a predilection for life-affirming activities on “The Wanderer.”

Flâneur State of Mind

It’s one thing for a god to achieve a state of controlled chaos just by walking around.  But can a mortal?  In 1863, Charles Baudelaire wrote an essay called “The Painter of Modern Life,” in which he reviewed the work of artist Constantin Guys.  In seeking the proper word to praise Guys, Baudelaire layered new meaning onto the word flâneur, the masculine French noun meaning “stroller” or “saunterer.” Baudelaire’s flâneur became a sort of urban synthesis of observer, philosopher, dandy, and artist – a contemplative individual who walks the city in order to experience and understand it.

“For the perfect flâneur, the passionate spectator,” he wrote, “it is an immense job to set up house in the heart of the multitude, amid the ebb and flow of movement, in the midst of the fugitive and the infinite.”  A harbinger of the modern age, this cultural oddity could “be away from home and yet [feel] oneself everywhere at home.”  For an artist, to be a flâneur is to enjoy a penchant for deep perception – to imbue one’s art with the “acrid or heady bouquet of the wine of life.”  Evidently, the only way to do this is to loose oneself from all social and personal mores and become what Ralph Waldo Emerson had, decades earlier, called the “transparent eyeball.”

In the moment of perfect flâneur lucidity, the symbolic infrastructure of the landscape crumbles.  With nothing left to mediate her experience, the artist can finally connect with the earth directly below her feet and, if she’s lucky, represent it more fully.  Perhaps that lucidity is a kind of hyper-consciousness activated by constant movement.

Marchand Abat-Jours | Eugene Atget, 1900

In On Photography, Susan Sontag identifies photography as an “extension of the eye of the flâneur” – a middle-class endeavor that, like the meditative stroll of the flâneur, involves setting out to find something real and exciting.  She cites Atget’s shabby and surreal Paris, Brassaï’s ephemeral images from Paris de nuit, and Weegee’s seamy New York City night scenes as exemplars of the flâneur aesthetic.  “The photographer is an armed version of the solitary walker,” Sontag writes, “reconnoitering, stalking, cruising the urban inferno, the voyeuristic stroller who discovers the city as a landscape of voluptuous extremes.”  What Dionysus, Baudelaire’s flâneur, and Sontag’s urban photographer have in common is that their constant movement through space brings them something in the way of a higher understanding – of themselves, of their surroundings, of their creative capabilities.

After graduating from high school, I embarked upon my own artistic exploits à la flâneur, although I did not know that word at the time.  In Rome, I walked to Trastevere in the pouring rain, and photographed motorcycle headlights illuminating raindrops.  In Bulgaria, I walked through the Saturday bazaar, and photographed a man and a woman below a large poster of old Soviet propaganda: In the arms of mother Russia you will be safe.  In Berlin, a man walking past a graffiti-covered building near Checkpoint Charlie, still bearing its shrapnel wounds.  In Munich, snow-covered tourists squinting upwards at Marienplatz’s Glockenspiel clock tower– an ornate cuckoo clock with moving wooden figures in colorful tights.

Channeling Wanderlust

I walk to understand new places (as much as a stranger can), and only when I think I have succeeded do I capture a frame.  My wanderlust forces me to move; movement, in turn, allows me to instill in my creations the “heady bouquet of the wine of life.”  I am a follower of Dionysus, an epilogue for Sontag.  It doesn’t matter that I – we – can’t remember learning to walk.  What matters is that we walk, unabashedly, on the ground that swallowed our ancestors.  And that we embrace our wanderlust, that igniter of thought and creation.

GALLERY | PHOTOGRAPHS BY ANGELA JANE EVANCIE

____________________________________________

Angela Jane Evancie is a writer, photographer and radio producer based in Burlington, Vermont.  She graduated from Middlebury College in 2009 with a joint degree in English and Geography.  She is spending this year with a Compton Mentor Fellowship in multimedia journalism.

When Memory Starts Working

[ 8 ] September 11, 2010

Stained pyramidal neurons in the prefrontal cortex. Pyramidal cells are the principal labor force of working memory in the PFC.

Working memory is the tool that allows us to navigate an ever-changing world, as we assess what information to ignore, what information to retain for mere seconds, and what information to process as lasting, long-term memories. It is the process at work at this very moment, as you—the reader—move from sentence to sentence: each one, I would hope, building on the previous. Or perhaps your mind is elsewhere—your eyes scanning these lines in rhythm yet your attention lost—then, alas, I have failed. But working memory is still working—you have simply formed a stronger neural coalition with some other brain region than the visual stream carrying the text before your eyes. If that is the case, I beg your return!

Working memory has been studied on all levels of neuroscientific analysis, from behavioral tests to the cellular mechanisms of neurotransmitter release and receptor function at the synapse.  The idea that working memory is contained within one region of the brain is most likely a false assumption— however, the prefrontal cortex (PFC), the region just behind the forehead, is most often correlated with working memory, as well as a host of other executive functions that are uniquely human, such as complex planning, decision making, and the censorship of one’s impulses. Mark D’Esposito, a researcher at the University of California, Berkeley, provides the following definition: “Working memory refers to the temporary retention of information that was just experienced but no longer exists in the external environment, or was just retrieved from long-term memory” (D’Esposito, 2007).

Working memory is a particularly difficult cognitive process to study because of its rapidly transient nature. It is the moment after sensory information has entered the brain, but before consolidation into short-term or long-term memory occurs—it is the moment after you receive directions over the phone, when you begin to recite those steps to yourself in hopes of finding your destination.

Presumably the refinement of this process, and the ability to instantly call upon the vast stores of memory in the brain to solve problems presented by working memory, have been highly adaptive for humans over evolutionary time: perhaps this explains why working memory is thought to be correlated with the activity of the PFC, an area that has expanded so rapidly since we parted evolutionary paths with the apes. As the human brain has tripled in size over the five million years of evolution from our ape ancestors, the PFC has increased in size sixfold. This type of rapid increase in cortical surface area over our evolutionary history is not seen anywhere else in the brain.

The prefrontal cortex first came into focus as a potential seat of working memory when single cell recordings found persistent, sustained levels of neuronal firing in that region during tests that require a monkey to retain information over a short period of time in order to perform goal-directed behaviors (Fuster and Alexander, 1971; Kubota and Niki, 1971). These tasks often require a remembered response to a previously administered stimulus cue, such as an eye saccade to the remembered location of a previously displayed flash of light. The single cell recordings from these early studies revealed that neurons in the PFC seemed to show fast-spiking behavior that was sustained for the period of time that the stimulus location was retained by the monkey.
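To make the structure of these tasks concrete, here is a schematic sketch of a single delayed-response trial; the phases follow the description above, but the durations and firing rates are invented purely for illustration, not taken from the cited recordings:

```python
# Schematic of a delayed-response (oculomotor delay) trial.
# Numbers are invented for illustration only; the point is that the
# elevated rate persists through the delay, after the cue has disappeared.
trial_phases = [
    ("fixation", 1.0, 5),    # (phase, duration in s, illustrative PFC rate in Hz)
    ("cue flash", 0.5, 30),  # brief peripheral light at the location to remember
    ("delay", 3.0, 25),      # cue is gone; rate stays elevated (the memory trace)
    ("saccade", 0.5, 15),    # monkey looks to the remembered location
]

for phase, duration, rate in trial_phases:
    print(f"{phase:>9}: {duration:>3}s, ~{rate} spikes/s")
```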

Figure 1

In humans, these findings have been supported by functional magnetic resonance imaging (fMRI, which measures levels of blood flow throughout the brain) techniques that show pronounced PFC activity during delay tasks (Curtis and D’Esposito, 2003). While these results lead us toward the PFC, figure 1b clearly shows that there are additional regions that yield pronounced fMRI readings during the delayed response task at hand. Other experiments have shown that these coalitions of activation during the retention phase fluctuate depending on whether the task is retrospective (requiring the recall of past sensory experience) or prospective (anticipating future action).

The prefrontal cortex can be divided into three distinct regions that have been shown to correlate with the processing of different types of information. While the distinctions are based largely on superficial fMRI data, the orbitofrontal and ventromedial areas seem to be most relevant to reward-based decision making. The dorsolateral area seems to be critical in making decisions that call for the consideration of multiple sources of information, and the anterior and ventral cingulate cortices appear to activate most saliently when dealing with conflicting sensory data that requires integration (Krawczyk, 2002).

Like neurons in most of the brain, those in the PFC are excited by the predominant excitatory neurotransmitter, glutamate, and inhibited by GABA. While these transmitters are the most widespread and are necessary for PFC activity, other modulatory neurotransmitters have been shown to play a crucial role in working memory. Catecholamines—specifically dopamine and norepinephrine—play direct roles in working memory in humans and animals.

Figure 2

Depletion of catecholamines has been shown to severely impair working memory in rats (Simon et al. 1979), whereas the application of dopamine into the PFC of monkeys performing working memory tasks has been shown to significantly increase spike activity (Sawaguchi 2001).

Yet the relationship between catecholamine levels and working memory ability at the behavioral level is not a simple more-is-better one: an important body of research has established an inverted-U-shaped response curve, suggesting that working memory functions best within an optimal range of dopamine D1 receptor stimulation in the PFC,

Figure 3

with insufficient and excessive stimulation showing dramatic drop-offs in working memory-dependent task performance (Williams and Goldman-Rakic, 1995). Results have suggested that dopamine is involved in the PFC not as a reward mechanism for successful actions but as a modulatory factor that aids, and perhaps causes—at the correct levels of release—the accurate recall of relevant elements of working memory.
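The shape of that inverted-U is easy to sketch with a toy curve. The Gaussian form and the parameter values below are my own stand-ins for illustration, not anything fit to the data in Williams and Goldman-Rakic:

```python
import numpy as np

# Toy inverted-U: task performance peaks at an intermediate level of D1
# stimulation and falls off on either side. All values are illustrative.
d1_stimulation = np.linspace(0.0, 2.0, 9)   # arbitrary units
optimum, width = 1.0, 0.4                   # assumed for the sketch
performance = np.exp(-((d1_stimulation - optimum) ** 2) / (2 * width ** 2))

for level, perf in zip(d1_stimulation, performance):
    bar = "#" * int(perf * 20)
    print(f"D1 stimulation {level:.2f} | performance {perf:.2f} {bar}")
```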

To that effect, a 2005 study showed that distinct molecular processes are at work for memory retrieval lasting seconds versus memories recalled after several minutes (Runyan and Dash, 2005). In this experiment, slice preparations from rats were studied after tasks requiring quick working memory versus longer “short-term” memory retrieval, such as that seen in the experiment discussed above. The results show a clear difference in PKA activity between these two temporal periods of information storage and retrieval: in working memory, PKA action was demonstrated to be detrimental, while in short-term memory, PKA function was necessary to avoid behavioral error. The explanation set forth in the study suggests that over longer periods of memory retention it becomes necessary to select between conflicting internal representations—a process that, on the cellular level, is mediated by PKA activity—while working memory relies more on transient, singular representations with less conflicting information.

While these cellular studies offer insight into the dynamics of PFC circuits, our task is to connect these synaptic behaviors to the behavior of the organism, as far as working memory is concerned. In addition to the action of dopamine, NMDA receptor activity seems to be a crucial component of local PFC circuitry. NMDA receptors are particularly intriguing in the study of working memory because of a unique property: they require depolarization to remove a magnesium block before ions can flow into the cell, furthering depolarization. This delayed-onset behavior has been suggested as a possible mechanism for PFC pyramidal cells to sustain firing after initial excitation, and thus as a possible cellular trace of the storage of working memory.
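That voltage dependence can be written down compactly. The sketch below uses a standard phenomenological magnesium-block term in the style of Jahr and Stevens (1990); it is not drawn from any study cited in this post, just an illustration of why the NMDA conductance only opens up once the cell is already depolarized:

```python
import math

def nmda_unblocked_fraction(v_mV, mg_mM=1.0):
    """Fraction of NMDA-receptor conductance free of Mg2+ block at voltage v_mV.

    Standard phenomenological form (after Jahr & Stevens, 1990); the constants
    are the commonly used values and are assumed here for illustration.
    """
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mV))

# Near rest the channel is mostly blocked; with depolarization it opens up.
for v in (-80, -60, -40, -20, 0):
    print(f"V = {v:4d} mV -> unblocked fraction = {nmda_unblocked_fraction(v):.2f}")
```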

NMDA receptor activation has been repeatedly linked to dopaminergic signaling (Tsukada et al. 2005). One can begin to see how reverberating connections between sensory regions and the PFC could be the telltale sign of working memory: NMDA receptor-mediated, dopamine-modulated activity in PFC pyramidal neurons, together with the inhibition of background noise, would allow specific sensory cues to be retained and then relayed to appropriate, goal-directed outputs, or consolidated into longer-term memories by the hippocampus. A study published last month in the Journal of Neural Systems by researchers at Boston University explains how dopaminergic neurons may keep short-term memories alive in the PFC by firing sharply at first and then maintaining a lower voltage, just enough to prime the cell for a quick re-fire.

As we begin to sort out what neurotransmitters, receptors, and types of neurons are involved in the circuitry of the PFC and thus ostensibly working memory, we can begin to see how cellular events, such as the NMDA-mediated release of dopamine onto pyramidal neurons, can modulate specific synaptic dynamics and lead to the sustained activity that is the hallmark of working memory. Can we make the causative association between the firing of neurons in the PFC—the presumed integrator of various incoming streams—and the behavioral output of the organism? Are there distinct maps in the PFC for spatial orientation, color, or other sensory data that are overlaid in interesting ways, or are these selective response fields ever-shifting, dependent on the incoming stream from the sensory system(s) at play?

Moving from correlation to causation will be the next major challenge for research into the most transient forms of memory we possess. The research has important implications for our understanding of internal representations, as information is moved from sensory inputs to higher cortical regions. If we can narrow down what is necessary and sufficient to explain a given representation of an external stimulus in the briefest of retained intervals, then we could be moving closer to understanding at least a piece of subjective, first-person experience.

Secret Pleasure

[ 0 ] July 21, 2010

When there is a congregation of young people crammed into a music hall in Brooklyn and some of them are wearing flannel shirts—though it is summer—and dark sunglasses—though it is nighttime—it usually means that a postmodern pagan ritual is taking place.  Most likely, a Band You’ve Never Even Heard Of is performing.  But on July 13 at the Bell House, a diverse audience appeared to hear the un-hip (trust me, this is truly a compliment) Yale cognitive psychologist Paul Bloom talk about his new book How Pleasure Works: The New Science of Why We Like What We Like (W.W. Norton and Company.  280 pp.  $26.95).   Professor Bloom called us “the drunkest audience he had ever addressed.”  It was a meeting of the Secret Science Club, which is the second-best secret I have uncovered this month.

Professor Bloom possessed a certain combination of humor and humility that is rare in any person, but especially in an academic.  He began his talk by inviting to his e-mail inbox (paul.bloom@yale.edu) any questions and comments that his time onstage might shortchange.  As seems to be the case with most scientific treatments of artistic phenomena, Bloom began by addressing the words of his former advisor—Steven Pinker—who has essentially called the arts inessential.  Bloom, a “card carrying evolutionary psychologist” whose research focuses on children, argues that art is in fact deep.  We are hard-wired to want to see past the surface of pieces of art to an underlying emotional reality of the human condition.  This would explain our obsession with originality; a painting by Vermeer is considered a masterpiece until it is revealed to be the work of Han van Meegeren, the master forger.  Though the aesthetic is identical, some essence is lacking.  I believe that art must possess certain properties that appeal—albeit accidentally—to our adapted minds.

Those with avant-garde inclinations constantly attempt to make an audience reconsider “What is Art?”  The goal is no longer pleasure, or emotional engagement of any kind.  This amounts to an intellectual exercise.  Despite the fact that all art is inherently intellectual (that is precisely Bloom’s point, that your senses are not enough to explain the experience), these efforts are intentionally extreme.  They are called things like “interesting” or “important” or “experimental” (an interesting scientific staple used differently here in art).  Take—for example—Finnegans Wake by James Joyce.  Although lyrical, the prose has no semantic flow and the book contains hardly a trace of the essential hallmarks of fiction.  But Joyce was a genius writer.  Why?  Independence.  Originality.  He was only trying to write like himself; he mastered traditional forms with Dubliners and A Portrait of the Artist as a Young Man before breaking the mold with Ulysses.  But when far less talented people are hell-bent on being alternative just for the hell of it, their art often becomes essentially boring.

A Portrait of the Scientist as a Young Artist

[ 3 ] July 1, 2010

Drawing of a Purkinje cell in the cerebellar cortex done by Cajal, after using the Golgi stain.

This is the story of how an artistic son grew up to become the father of modern neuroscience.  In 1873, an Italian pathologist named Camillo Golgi stirred the scientific community by managing to expose the brain in a new light—or darkness.  Golgi found that by immersing nervous tissue first in a potassium dichromate solution and then in a silver nitrate solution, one could show a small number of cells—randomly—in a naked, black entirety.  The stain—which Golgi named la reazione nera (“the black reaction”)—was hugely and internationally influential.  From his inky-looking data, Golgi inferred that our brain is composed of a syncytium, or a physically continuous nervous net.  This conclusion supported an already prominent hypothesis: the “reticular theory,” proposed by the German anatomist Joseph von Gerlach in 1871.  (Imagine a structure similar to the enmeshed fingers of your two hands.)  But it turns out to be incorrect, an explanation destined to fall flat atop the scrap heap of once-received wisdom.  Camillo Golgi was awarded a share of the Nobel Prize in 1906 for his spectacular stain, which is still used by investigators today.

In fact, the most important result of la reazione nera occurred half a generation after its invention, when an unknown Spanish academic saw some expert preparations in the private laboratory of a colleague.  The sight incited an insatiable need to see more.  At that moment the ambitious scientist became excited—and started firing.  Santiago Ramón y Cajal—though far less well-known than Charles Darwin—is arguably the second greatest biologist of his era, after the founder of evolutionary theory himself.  He was the first of two Spanish scientists to receive a Nobel Prize, which he strangely shared in 1906 with his wrongheaded rival.  Cajal’s most famous and important discovery was the neuron, a cellular entity proven to be the basic anatomical, physiological, genetic, and metabolic unit of the nervous system [DeFelipe 2006].  In the 1890s, Cajal provided indisputable evidence of distinct cerebral individuality in the form of his portraits of neurons, which finely—finally—revealed the composition of our mysterious mental matter.  As profoundly true as great science and as truly creative as great art, the investigations of Cajal offer rare and precious insights into life and its infinitely small secrets.


Who is a Neuroscientist?

[ 7 ] March 23, 2010

Can artists be called neuroscientists?

There is a trend as of late to ascribe scientific insights to the intuitions of artists. The basic idea is this: whether through literature, visual art, music, or even cooking, artists have predicted—even discovered—concepts that scientists only later arrive at in the lab through very different methods.

Some stick to the metaphorical realm with this line of thought, believing that artists do intuit some profound truths about the human experience that are later supported by hard, scientific data. Others, however, take this relationship to the next level, suggesting that the artists are actually making scientific breakthroughs themselves—a step beyond intuition and into the realm of the scientists, who wield their testable, repeatable, peer-reviewed methodology. This trend of thought, which may satisfy our 21st century interdisciplinary romanticism, should be approached with some caution. Can artists really be considered to have made scientific breakthroughs, beyond the metaphorical level of predicting these discoveries with their art? Can artists be called scientists?

Since launching this site late last year, I’ve been given the book Proust was a Neuroscientist twice as a gift and had it recommended to me several other times, as it certainly seems to strike the same chord we’re attempting to hit with this site. Each chapter in the book makes a case for how an artist of yesterday anticipated a scientific breakthrough of today; as the A-equals-B title indicates, author Jonah Lehrer believes they did this tangibly, not metaphorically.

I am a huge admirer of Jonah Lehrer’s writing, which is always graceful and informative (especially his SEED Magazine feature on The Blue Brain Project, which first piqued my interest and led me to start my own documentary film project about the endeavor). His blog The Frontal Cortex is one of the best neuroscience blogs around today. However, I did read Proust was a Neuroscientist immediately when it was published in 2007, and it left more questions in my mind than the authoritative title suggests it answers. This is not necessarily a negative: it is always pleasing when a book stirs one’s thoughts, especially when it concerns the intersection of neuroscience and art.

My questions about Proust stem mostly from Lehrer’s step beyond the aforementioned metaphorical level of the artist as scientist, and into the realm of the literal. He writes that “We now know that Proust was right about memory, Cezanne was uncannily accurate about the visual cortex, Stein anticipated Chomsky, and Woolf pierced the mystery of consciousness; modern neuroscience has confirmed these artistic intuitions.” Lehrer gives a lot of credit to these artists, and he wants his claims to be taken seriously. He has said that Proust was a Neuroscientist “is about writers and painters and composers who discovered truths about the human mind—real, tangible truths—that science is only now rediscovering.”

For example, Lehrer argues that George Eliot’s novels reject the scientific determinism of the day and affirm a decidedly modern version of free will, infusing her characters with ever-changing, malleable minds. Then, as Lehrer argues in chapter 2, neuroscientists verified this concept more than a century later when they discovered adult neurogenesis in the 1990s. George Eliot had no idea about adult neurogenesis, which involves neural stem cells, growth factors, and all sorts of biological data that were not at her disposal. What Eliot may have done—and what all the artists in Lehrer’s book may have done—is to make an intuitive statement about the human condition. As modern neuroscience begins to unveil concepts like adult neurogenesis, whereby new neurons can be created well into adulthood (thus the “malleable” mind), we will see that many artistic intuitions can be tied to scientific findings.

In fact, all artistic intuitions can be tied to the brain—isn’t that where they came from to begin with? We feel like we can learn and change well into adulthood, then that thought is penned into a novel, and sure enough, we discover its cellular basis years later. This can apply to the full spectrum of human thought and intuition. Processes of the brain are all destined to be linked to scientific observations of the brain. Linking Eliot’s literary insights to those about adult neurogenesis seems to be more based on our current neuro-everything craze than on any actual scientific notion of neural stem cell populations that Eliot happened to intuit.

It is hard to dismiss a book that has opened and will continue to open the door to neuroscience for thousands of readers who may be coming to this material from other backgrounds. However, the danger is still that these readers may give the artists discussed in Lehrer’s text a level of hard scientific explanatory power that they simply do not deserve. Artists and scientists both seek to understand human nature, but they have been doing so with very different methodologies in their different vocations. Just because an artist’s insight into human behavior seems to tenuously line up with a neuroscientist’s discovery of cellular dynamics does not then mean that an artist is a neuroscientist. Artists reveal things that science may never be able to; the reverse is also true.

There are cases where we can make such connections by using sturdier threads than those which Lehrer employs.

Goethe's manuscripts from 1790 contain illustrations of his scientific studies of plants and insects.

Goethe, for example—Germany’s national poet—was also a scientist who wrote about plant morphology and color theory.  He was a true scientist, and his artistic work reflects the deep insights gained through a lifetime of scientific inquiry.

Neuroscientists are investigators of the central nervous system who use the scientific methods of hypothesis, observation, and deduction to generate testable, repeatable results. They focus mostly on cells, neurotransmitters, and proteins, unveiling the mechanisms that, on a massive scale, account for our thoughts and behaviors. If an individual does those things, they are a neuroscientist. Like a neuroscientist, Proust was an investigator of the nervous system; but his tool was the written word, and his methods were subjective and introspective. He was not a neuroscientist, nor were the other household names Lehrer calls upon in his book.

Lehrer has continued his thinking on this subject with a recent blog post entitled “Borges was a Neuroscientist,” in which he quotes from neuroscientist Rodrigo Quian Quiroga’s piece about Borges published in Nature. Quiroga’s article is an appreciation—he admits that “Even without this scientific knowledge, Borges’s intuitive description is sharp.” But by slapping on the “Borges was a Neuroscientist” title, Lehrer seems to once again overestimate the neuroscientific reach that these artists may have had. It is one thing to appreciate a sharp artistic intuition that meshes with a later scientific discovery. Indeed, the best artists seem to be the ones who have penetrated something real in our brain-based existence. It is another thing to keep calling these artists neuroscientists—this, even if metaphorically, even if just to attract attention, is misleading.

I am always delighted to take a ride back and forth across the normally rigid division between the arts and sciences, and Lehrer’s writing takes us on that ride quite gracefully. But his can be an irresponsible grace, as it lends the explanatory power of neuroscience to the intuitions of artists who had barely any sense of cells, synapses, action potentials and ion channels. To call an artist a neuroscientist sounds sexy—a buzzword plucked from an increasingly neuro-centric culture—but that sexiness might fade quickly if we picture the artist elbow-deep in formaldehyde, wielding a micro-pipette—which none of these artists ever did. Being elbow-deep in formaldehyde may be sexy on another level, but we’ll leave that discussion for another time.

The future of the dialogue between the arts and the sciences is exciting, as more and more artists begin to tap the rich reservoirs of scientific findings for subject matter and inspiration, and scientists begin to listen to artists for clues as to the neuroscientific basis of their creative processes. We should remain acutely aware of the possibilities as well as the limits of this dialogue. Proust was a Neuroscientist, while exciting in its interdisciplinary nature, may be more of a neuro-revisionist text than a true dialogue between the arts and sciences.

If anything, Lehrer’s book—and his continued use of the gag—should shift from “Artist X was a Neuroscientist” to “Artist X was a Cognitive Psychologist,” as that would rightly put more emphasis on behavior (often the artist’s own) as evidence rather than on the observation of cells and synapses. But will that really sell?

Let us know what you think in the comments section. Can artists really be considered to have made scientific breakthroughs?
