Damasio, Spinoza and our Current Confusion about Cause and Effect, by Charles M. Saunders

Portrait of Baruch de Spinoza (1632-1677), ca. 1665, by an unknown artist

In this article, Charles M. Saunders considers Looking for Spinoza: Joy, Sorrow, and the Feeling Brain by Antonio Damasio
(Houghton Mifflin Harcourt Publishing, N.Y., 2003)

In 2003, one of our most capable and respected neuroscientists went searching for Spinoza. What Antonio Damasio found is both enlightening and alarming. It is laudable that an empirical scientist had the interest, care, and capability to analyze the sequencing and behaviors associated with what Spinoza terms ‘the Emotions.’ This is clearly a positive development. When our neuroscientist friend recognized that something about emotional response is measurable, he made strides for the entire scientific community. But by focusing his analysis only on Parts 3 and 4 of “The Ethics”, Damasio sidesteps Spinoza’s metaphysics, set out in Parts 1 and 2, while presenting Spinoza as some sort of intuitive materialist. The alarming part in all of this is that Parts 3 and 4 are linked inextricably to Parts 1 and 2, wherein Spinoza insists that our thoughts are as real as our experience. However notable Damasio’s respect for Spinoza’s psychology may be, there is a tremendous distance between his awakening to the import and physical reality of the emotions and an adequate understanding of the full impact of Spinoza’s discovery: that the human mind has the ability to form replications of objects so accurate that these ideas are essentially the same thing as the objects they represent.

This is an astounding claim, and to this day it has been overlooked or dismissed in light of the advances of contemporary science and its ability to “reduce” everything in its purview through observation and measurement. But cause and effect are not observable within the same time and space.

Brain illustration from ‘The Principles and Practice of Medicine…’ by W. Osler, 1904, public domain via Wikimedia Commons

When the neuroscientist-researcher connects electrodes to a patient to monitor brainwaves, there is no question that the observable patterns that emerge are exciting, and that they indicate brain activity related to behaviors corresponding to the patient’s emotional state and mood changes. But to conclude from this that the patterns and their location in the brain somehow indicate the cause of the thinking process is a leap that indicates faulty reasoning and bad science. To draw a conclusion about the source of the thinking process from an electroencephalogram is akin to a person who, standing atop the tallest building in a large city before dawn, observes the pattern of traffic lights below and concludes that the pattern of lights is the cause of the flow of traffic. No matter how many thousands of lights make up the discernible pattern of the flow of traffic, the actual cause of the traffic is not observable. The cause of the traffic resides elsewhere. It originates in the reasons each individual driver leaves home and enters the flow: going to work, driving a friend to the hospital, making deliveries, responding to emergencies, and countless other actions are the actual cause of the traffic, and they are entirely disconnected from one another. There is no common cause to be observed and reported on here.

This analogy demonstrates the confusion inherent in the empirical process. There is no argument about what the scientist sees during the study. But there is a strong argument against what he claims to have observed. If this mistaken insistence that causality must be observable resided solely in speculative neurobiology, the harm might be negligible. Unfortunately for us, this curious misunderstanding of cause and effect permeates most of our scientific theory and practice, including applications in healthcare diagnosis and treatment.

Perhaps one of the most debilitating misapplications of the empirical process lies within the field of genetics and the supposed causal link observable in DNA. Crick and Watson never assigned any causal agency to their brilliant discovery. They clearly understood DNA for what it is: a marker, not a cause. Assigning cause to DNA strands came later, after arrogance and the same faulty reasoning process employed by Damasio came into play. Whether a person suffers from cancer or obesity or a predilection towards baldness, DNA is not the cause of the affliction; it merely marks the presence of the condition. To carry the traffic-light analogy a bit further: no matter how complicated or advanced the light pattern and system-flow technology might be, it cannot be said to be the cause of the traffic, which can only be understood through the individual actions and behaviors that actually produce it. So it is with DNA: it is a marker that notes the presence, not the cause, of disease.

The upshot of all this is that our current empirical/materialist science, which has brought about some of the most significant advances for humans in medicine and other sophisticated technologies, contains a seriously flawed view of cause and effect. By insisting on a research focus confined to the world of external experience, it ignores the rich world of experience’s counterpart and co-equal, the Human Mind. This now outmoded way of explaining our planet and our relationship to it must give way to a more sophisticated view, one that credits the mind as the source and wellspring of every scientific achievement we have ever accomplished, and that recognizes the mind as the most magnificent tool at our disposal for unraveling Nature’s mysteries.

Charles M. Saunders

~ Ordinary Philosophy is a labor of love and ad-free, supported by patrons and readers like you. Any support you can offer will be deeply appreciated!

*All views and opinions expressed by guest writers are their own and do not necessarily express those of Ordinary Philosophy’s editors and publishers

Happy Birthday, Charles Darwin!

A Charles Darwin display at the Kelvingrove Museum, Glasgow, Scotland

Let’s remember and salute Charles Darwin, the thinker who came to understand the basic mechanism by which we and all other species on earth come to be.

Born on February 12, 1809, Darwin was the grandson of Enlightenment physician, poet, and botanist Erasmus Darwin, who posited his own theory of evolution, as had many others who observed its effects but had not successfully formulated a theory to explain how it worked. Given that his father was also a physician, it seemed natural that young Charles would take up the family profession. He studied medicine at the University of Edinburgh (my university!) from the age of 16 to 18. Darwin would have attended classes in the original building on South Bridge, now called the Old College, beautifully designed by Robert Adam (it didn’t yet have the dome it has now). While he loved the excellent science education he received there, Darwin decided being a physician was not for him.

Old College Building on South Bridge, University of Edinburgh, where Darwin attended classes

His father then sent Darwin to Christ’s College, Cambridge, with the idea that he could be a minister instead. Darwin did well at Christ’s College, but it was his pursuits as a naturalist that really captured his imagination and into which he poured his best efforts. After he completed his Bachelor of Arts degree in 1831, he continued his scientific study of animals and geologic formations. When the opportunity arose to travel to South America on the HMS Beagle later that year, Darwin took it, and spent the next five years gathering specimens and making detailed notes of his observations of the natural world. Among the wealth of valuable scientific information he amassed, three things especially gave him much to think about: the appearance of design in the adaptations of living things; fossils of known and unknown animals, sometimes found in the most unexpected places (remains of ancient sea life embedded in rocks at high elevation?!?); and the incredible amount of waste and suffering throughout the natural world, from wasps that lay their eggs in living caterpillars so that the growing grubs devour them slowly from within, to the genocide and slavery routinely practiced against the native people there.

Finches in a Charles Darwin display case at the Kelvingrove Museum, Glasgow, Scotland. The adaptations of finch beaks to food sources provided Darwin a perfect example of how natural selection works to produce the appearance of design.

With his experience broadened, his understanding deepened, and his body strengthened by the rigors of his expeditions, Darwin returned to England a wiser, stronger, more serious man. The first publications of his findings, together with his friendships with influential scientists such as the geologist Charles Lyell, made him famous. Darwin had found his profession. He began to pull together the evidence of his own eyes with the work of other naturalists and scientists to formulate a theory that would explain it all. What would explain a world of living things replete with beauty and waste, some joy and contentment but far more suffering, animals marvelously wrought but more often than not hidden from the human eye by remoteness, by incredibly tiny size, or by time, through extinction? It was the work of Edinburgh’s own self-made geologist James Hutton, popularized and developed by Lyell, that gave Darwin one key to the mystery. Since it had become clear that the earth was indeed ancient, not young as popular interpretations of the Bible would have it, species had plenty of time to adapt and change to their environment as needed, just as the earth itself had plenty of time to become what it is.

Hutton’s Section near the foot of Salisbury Crags, Holyrood Park, Edinburgh, Scotland. On my twice-weekly hikes, I regularly pass by this rock formation. It sparked James Hutton’s realization that the earth must be ancient indeed to give the rocks time to layer, fold, and bend as they do here.

Another key to the mystery was the mass suffering and death Darwin observed. While he mourned it, it was no doubt comforting to realize that it had not been designed into the natural world by a divine mind he was nonetheless bound to worship. Rather, Darwin realized that the living things that could not survive in their environment left behind those better equipped to do so, to reproduce and pass on their adaptations. This realization, this theory of natural selection, Darwin recognized to be explosive as well. It took him about twenty years of careful thought and self-questioning to publish this theory. He knew, for one, that his theory went against people’s natural squeamishness and desire to think of the earth as a friendly home. More than that, Darwin knew perhaps better than anyone what a profound challenge this theory was to orthodox Christianity. But when another naturalist, Alfred Russel Wallace, independently arrived at the same theory, Darwin was galvanized to publish his findings in 1859. His On the Origin of Species went on to become one of the most influential works in the history of thought.

Another Charles Darwin display at the Kelvingrove Museum, Glasgow, Scotland

Darwin’s life is a fascinating one in many more ways than fall within the scope of this piece. To learn more about this husband, father, writer, and restless seeker of truth, I recommend the excellent works I’ve linked to below.

Before that, one more thing: I’ve always hated the term ‘Social Darwinism’ because I think it’s terribly misleading. It refers to the idea that societies can be structured so as to direct evolution in some way, for example, by allowing the weakest or least able, as defined by that society, to die off so that the strongest and most able are the most likely to survive and reproduce. But Darwin did not espouse that idea, nor do scientists now understand him to have implied it. For Darwin, as for those who understand the theory of evolution by natural selection as an explanation of a natural process rather than a policy of action, the reason why human beings have become such a successful species is precisely our capacity for empathy and solidarity. It’s the fact that we care about each other as individuals, that we help each other survive and develop our unique capacities that makes us so adaptable, so creative, so able to get by in such a wide variety of environments. Social Darwinism, then, is contrary to Darwin’s own theories about human evolution. Eugenics, ‘survival of the fittest,’ and other such ideas that later thinkers claimed as part of Darwin’s intellectual legacy are not, in fact, his, or ideas that he would endorse given what he actually wrote. The problem with putting Darwin’s name in the term ‘Social Darwinism’ is that it wrongly implies that it was his idea, and therefore leads many to think of him as a cruel and heartless thinker, responsible for ideas which have caused much suffering and death. He was nothing of the sort.

Charles Darwin’s gravestone in Westminster Abbey, London, England. I was naughty and snuck in a quick photo, though photography is not allowed in the city’s places of worship.

Charles Darwin placard at the Kelvingrove Museum, Glasgow, Scotland

Learn more about this most influential of scientists and thinkers:

Charles Darwin: British Naturalist ~ by Adrian J. Desmond for Encyclopædia Britannica

Charles Darwin: Evolution and the Story of Our Species ~ iWonder at the BBC

Charles Darwin: various articles ~ by Maria Popova for Brain Pickings

Darwin Correspondence Project ~ at the University of Cambridge website

Darwin’s Influence on Modern Thought ~ by Ernst Mayr for Scientific American, November 24, 2009

Darwin Online ~ read Charles Darwin’s books, articles, and other publications online

The Evolution of Charles Darwin ~ by Frank J. Sulloway for Smithsonian Magazine, December 2005

The Origin of the Thesis ~ by Claire Pettitt for The Times Literary Supplement

*A version of this piece was previously published at Ordinary Philosophy

Ordinary Philosophy and its Traveling Philosophy / History of Ideas series is a labor of love and ad-free, supported by patrons and readers like you. Please offer your support today!

Happy Birthday, Morton White!

Morton White in 1981

The world lost Morton White (April 29, 1917 – May 27, 2016) less than two years ago, and I first learned of him through reading his obituary in The New York Times. As I read, I knew this was a man and an approach to philosophy I must learn more about.

White was a philosopher and historian of ideas. According to the Institute for Advanced Study, ‘he maintained that philosophy of science is not philosophy enough, thereby encouraging the examination of other aspects of civilized life—especially art, history, law, politics and religion—and their relations with science’. And as William Grimes put it for The New York Times, his ‘innovative theory of “holistic pragmatism” showed the way toward a more socially engaged, interdisciplinary role for philosophy’.

I studied philosophy with great love and enthusiasm as an undergraduate, yet I found myself then as now just as curious about other disciplines, especially history and the arts, and have often felt that the lines dividing these areas of study are sometimes artificial and even impediments to understanding. Since then, I’ve been pursuing my studies in the history of ideas more broadly, informally for the past several years, formally now at the University of Edinburgh. No doubt, White has influenced the direction my studies in intellectual history will take in ways I’ll learn as I go along, and in many more ways than I’ll ever know.

Learn more about White and his fascinating ideas:

‘Holistic Pragmatism and the Philosophy of Culture’ – chapter 1 of A Philosophy of Culture: The Scope of Holistic Pragmatism (Princeton, NJ: Princeton University Press, 2002), in which White summarizes what his holistic pragmatism is all about

‘Morton White, Philosopher of Holistic Pragmatism, Dies at 99’ – Obituary by William Grimes for The New York Times, June 10, 2016

Morton White 1917–2016 – His memorial page at the Institute for Advanced Study website, June 08, 2016

And you can find his selected bibliography at Wikipedia

*A version of this piece was previously published at Ordinary Philosophy

~ Ordinary Philosophy is a labor of love and ad-free, supported by patrons and readers like you. Any support you can offer will be deeply appreciated!

Photobook: Anatomical Museum, Old Medical School, University of Edinburgh

Doorway to the Anatomical Museum, Old Medical School, University of Edinburgh. The Museum is open about one day a month to visitors who are not medical students. I’m excited to finally discover it today!

Anatomy Lecture Hall, view from near the door, Old Medical School, University of Edinburgh

Anatomy Lecture Hall, view from above, Old Medical School, University of Edinburgh

Downstairs foyer of the Anatomical Museum, Old Medical School, University of Edinburgh. It’s full of interesting skeletons, plaster casts, art, and so on, in a lovely vaulted chamber below the main museum hall.

View in foyer of the Anatomical Museum, Old Medical School, University of Edinburgh

A collection of life masks from men and women of the world, Anatomical Museum collection, Old Medical School, University of Edinburgh

A striking portrait head of Chief Bokani in the Anatomical Museum collection, Old Medical School, University of Edinburgh

Detail of an illustration reproduced from De Humani Corporis… by Andreas Vesalius, 1543, in the hallway to the main display hall. Anatomical Museum collection, Old Medical School, University of Edinburgh

Image of Benjamin Rush hung in the stairwell to the main display hall, Anatomical Museum collection, Old Medical School. Rush attended the University of Edinburgh from 1766 to 1768.

Anatomical Museum, Old Medical School, University of Edinburgh, photo credit Scots Magazine. Photography is not allowed without prior arrangement, since scattered among the collection are human specimens and pieces from private collections for which permissions for general photography have not been granted. Among the many, many fascinating objects here is a large phrenology display; phrenology is now considered a pseudoscience, but it was once a cutting-edge field of research. In this display, I gaze upon the faces, through their life or death masks, of Robert Owen, John James Audubon, the composers Ernst von Weber and Liszt, Robert the Bruce (skull cast), Sir Walter Scott, Jonathan Swift, Samuel Taylor Coleridge, Alexander Pope, William Wordsworth, Samuel Johnson, William Pitt, Oliver Cromwell, Napoleon Bonaparte, Jean-Paul Marat, William Herschel, Voltaire, John Ross, George Combe, George Washington, and many others.

Life mask of George Combe, Anatomical Museum, Old Medical School, University of Edinburgh, 2018 Amy Cools

I was naughty only once, and snuck a picture of the life mask of George Combe. Frederick Douglass was a fan of Combe and wrote glowingly of their meeting. This episode is particularly poignant because phrenology would come to be used to argue for the supposed inferiority of black, Semitic, and other peoples. Evidently, the discipline carried no such association for Douglass in 1846. He would have been confident, I think, that Combe’s research would align with what Douglass knew to be true: the rationality and set of capabilities that all humans share.

Ordinary Philosophy and its Traveling Philosophy / History of Ideas series is a labor of love and ad-free, supported by patrons and readers like you. Please offer your support today!

The Triage of Truth: Do Not Take Expert Opinion Lying Down, by Julian Baggini

Brain illustration from ‘The Principles and Practice of Medicine…’ by W. Osler, 1904, public domain via Wikimedia Commons

The thirst for knowledge is one of humankind’s noblest appetites. Our desire to sate it, however, sometimes leads us to imbibe falsehoods bottled as truth. The so-called Information Age is too often a Misinformation Age.

There is so much that we don’t know that giving up on experts would be to overreach our own competency. However, not everyone who claims to be an expert is one, so when we are not experts ourselves, we can decide who counts as an expert only with the help of the opinions of other experts. In other words, we have to choose which experts to trust in order to decide which experts to trust.

Jean-Paul Sartre captured the unavoidable responsibility this places on us when he wrote in Existentialism and Humanism (1945): ‘If you seek counsel – from a priest, for example – you have selected that priest; and at bottom you already knew, more or less, what he would advise.’

The pessimistic interpretation of this is that the appeal to expertise is therefore a charade. Psychologists have repeatedly demonstrated the power of motivated thinking and confirmation bias. People cherry-pick the authorities who support what they already believe. If majority opinion is on their side, they will cite the quantity of evidence behind them. If the majority is against them, they will cite the quality of evidence behind them, pointing out that truth is not a democracy. Authorities are not used to guide us towards the truth but to justify what we already believe the truth to be.

If we are sincerely interested in the truth, however, we can use expert opinion more objectively without either giving up our rational autonomy or giving in to our preconceptions. I’ve developed a simple three-step heuristic I’ve dubbed ‘The Triage of Truth’ which can give us a way of deciding whom to listen to about how the world is. The original meaning of triage is to sort according to quality and the term is most familiar today in the medical context of determining the urgency of treatment required. It’s not infallible; it’s not an alternative to thinking for yourself; but it should at least prevent us making some avoidable mistakes. The triage asks three questions:

  •  Are there any experts in this field?
  •  Which kind of expert in this area should I choose?
  •  Which particular expert is worth listening to here?

In many cases there is no simple yes or no answer. Economic forecasting, for example, admits of only very limited mastery. If you are not religious, on the other hand, then no theologian or priest can be an expert on God’s will.

If there is genuine expertise to be had, the second stage is to ask what kind of expert is trustworthy in that domain, to the degree that the domain allows of expertise at all. In health, for example, there are doctors with standard medical training but also herbalists, homeopaths, chiropractors, reiki healers. If we have good reason to dismiss any of these modalities then we can dismiss any particular practitioner without needing to give them a personal assessment.

Once we have decided that there are groups of experts in a domain, the third stage of triage is to ask which particular ones to trust. In some cases, this is easy enough. Any qualified dentist should be good enough, and we might not have the luxury of picking and choosing anyway. When it comes to builders, however, some are clearly more professional than others.

The trickiest situations are those where the domain admits significant differences of opinion. In medicine, for example, there is plenty of genuine expertise, but the incomplete state of nutritional science means that we have to take much advice with a pinch of salt, including that on how big this pinch should be.

This triage is an iterative process in which shifts of opinion at one level lead to shifts at others. Our beliefs form complex holistic webs in which parts support each other. For example, we cannot decide in a vacuum whether there is any expertise to be had in any given domain. We will inevitably take into account the views of experts we already trust. Every new judgment feeds back, altering the next one.

Perhaps the most important principle to apply throughout the triage is the 18th-century Scottish philosopher David Hume’s maxim: ‘A wise man … proportions his belief to the evidence.’ Trust in experts always has to be proportionate. If my electrician warns me that touching a wire will electrocute me, I have no reason to doubt her. Any economic forecast, however, should be seen as indicating a probability at best, an educated guess at worst.

Proportionality also means granting only as much authority as is within an expert’s field. When an eminent scientist opines on ethics, for example, she is exceeding her professional scope. The same might be true of a philosopher talking about economics, so be cautious about some of what I have written, too.

This triage gives us a procedure but no algorithm. It does not dispense with the need to make judgments; it simply provides a framework to help us do so. To properly follow Immanuel Kant’s Enlightenment injunction ‘Sapere aude’ (Dare to know), we have to rely on both our own judgment and the judgment of others. We should not confuse thinking for ourselves with thinking by ourselves. Taking expert opinion seriously is not passing the buck. No one can make up your mind for you, unless you make up your mind to let them.

This article was originally published at Aeon and has been republished under Creative Commons.

~ Julian Baggini is a writer and founding editor of The Philosophers’ Magazine. His latest book is A Short History of Truth (2017). (Bio credit: Aeon)

Ordinary Philosophy and its Traveling Philosophy / History of Ideas series is a labor of love and ad-free, entirely supported by patrons and readers like you. Please offer your support today!

O.P. Recommends: Your Brain on the Scientific Method

Old Medicine Bottles, public domain via Wikimedia Commons

In ‘Your Brain on the Scientific Method’, Sara E. and Jack M. Gorman open with a discussion of John Oliver’s recent takedown of scientific sensationalism in the media and its negative impact on the public’s understanding of science and its methods. Just about every day, it seems, there’s a study that comes out which reveals that things science said were bad are actually good for you and vice versa; that some foodstuff, familiar or exotic, was just discovered to be the ‘miracle cure’ for something or other; that some new report or yet another scientist proves or disproves human-caused climate change; and so on.

But there’s a lot more to the story of public misunderstanding of science, the authors say: the reason we often have trouble understanding science and its methods is the same reason why scientific sensationalism is so effective: science is so contrary to the way our brains generally, instinctively work. Find out why in this excellent piece…

Gorman, Sara E. and Jack M. ‘Your Brain on the Scientific Method‘. Oxford University Press blog, May 17th 2016.

‘Scientific Studies’. Last Week Tonight with John Oliver, HBO, May 8th, 2016.

~ Ordinary Philosophy and its Traveling Philosophy / History of Ideas series is a labor of love and ad-free, entirely supported by patrons and readers like you. Please offer your support today!

Anecdote and Evidence

I was engaged in conversation the other day with someone I like very much and whose opinions I respect, yet with whom I often disagree. This is a very good thing: these are the sort of discussions that keep us honest. They force us to confront arguments and evidence we hadn’t considered before. They challenge us to recognize our unjustified assumptions, things we’ve long taken for granted and never thought to question. And over time, they instill in us the habit of forming good-quality arguments that withstand such challenges, and of discarding those that don’t. These are valuable lessons which we don’t learn so readily in discussion with like-minded people, in preaching to the choir, so to speak.

That evening, we were mostly discussing politics, history, and social issues. Over the course of the evening, I found my interlocutor often supported his arguments primarily with anecdotes, as we all often do. Anecdotes are invaluable discussion tools: they illustrate what we mean by taking the argument out of the realm of the abstract into concrete reality, or in other words, they bring the argument to life. But I found that for nearly every anecdote he presented, I thought of one in support of a counterargument. Now it just so happened that some of the topics under discussion were sensitive issues, and since we were in mixed company and everyone was on holiday, I was loath to bring up anything that would cause strong discomfort or hurt feelings, so I held back.

But I wish I had asked him to clarify this crucial detail: did he mean to use these anecdotes as illustrations, or as evidence?

If he was using these anecdotes to illustrate the larger points he was making, well and good. If he was using these anecdotes as evidence of how stated facts or general rules were manifested or broke down in particular circumstances, well and good. And if he was using these anecdotes as evidence of how particular circumstances can give rise to unique results, again, well and good.

Yet the fact that I could easily think of a contradictory anecdote for every one he presented weakened his arguments in my mind as he was making them, in those cases where he was arguing in favor of truth claims about the world as a whole. That’s because he hadn’t made it clear how he was using these anecdotes to support his claims.

We should keep this in mind every time we make an argument: an anecdote, considered on its own, should not be considered evidence when it comes to general rules, facts, or theories.

Generally, we should be hesitant to rely too much on anecdotes when we want to persuade others of the truth of what we’re saying. Why? Well, the world is a complicated place, with innumerable factors to consider when making a judgment on any given situation. So while any one anecdote can show how a particular array of circumstances can lead to a specific outcome, it doesn’t reveal enough about what can happen given another particular set of circumstances, or what usually happens in the world as a whole.

There was a warehouse cat I knew named Stinky, years ago when I worked in a salvage yard and retail warehouse. She was a charmingly decrepit cat, runty and ancient. She purred like an old lawnmower, she had rheumy eyes, and she ate a special diet of soft food because she had no teeth. She had terrible arthritis or some other undiagnosed bone or joint condition, which gave her an oddly rolling gait and caused her to nearly fall over every time she strained her head up and around to the side to look at you. She also left a patch of brown dust everywhere she slept because she could not reach around to groom her body. For all of this, she seemed to have a happy life: she was very affectionate, showed few signs of pain or distress for all her maladies, and was dearly loved and tenderly cared for by all of us who worked there. (My heart still aches with affection when I remember our dear departed little kitty!)

Now, suppose someone were to discuss cats with me, and based on my close acquaintance with Stinky, I were to argue that cats are slow, ungainly creatures with no teeth, that they are dirty animals that don’t groom themselves, that they always weigh less than eight pounds, and that if you were to hear a low rumbling sound, you could bet it’s a cat. My interlocutor would justifiably think I’m a little nutty. When it comes to talking about one cat, an anecdote is very revealing. When it comes to talking about the species cat, not so much. In other words: one cat is an anecdote, but lots of cats are evidence.

While all this might appear obvious, it’s natural for human beings to form beliefs and to argue on the basis of what we’re familiar with: we all have our own sets of experiences from which we draw our ideas about the world. Yet as we grow in knowledge and understanding, it’s important to gather as much information as we can about the world beyond our own experience, since we lead ourselves astray all the time by relying on anecdotes, or in other words, on the limits of our own experience. The anecdote can point us in the direction of where to seek truth, since it reveals facts about the world in that particular time and place, but on its own, it can’t tell us much about larger truths or how the world works as a whole.

Statistics are evidence. Meta-studies are evidence. One study can be considered useful evidence if it’s sufficiently large and well-conducted, but given so many variables in the world and the statistical likelihood of getting skewed results in any one given study, it’s better to rely on meta-studies, or a hypothesis or theory supported by many studies and observations over time.
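To make the statistical point concrete, here is a minimal simulation sketch of my own (it is not from this essay, and all of its numbers are invented for illustration): a single small study can land far from the true effect by chance alone, while pooling many such studies, as a meta-study does, tends to converge on it.

```python
# Toy illustration (hypothetical numbers): why one small study is closer to
# anecdote than to evidence, and why pooling many studies is more reliable.
import random

random.seed(42)
TRUE_EFFECT = 0.2   # the "real" effect we are trying to measure
NOISE = 1.0         # person-to-person variability in each observation

def run_study(n):
    """Return the average measured effect from one study with n participants."""
    return sum(random.gauss(TRUE_EFFECT, NOISE) for _ in range(n)) / n

# One small study: its estimate can easily stray far from the true effect.
print("one small study (n=10):  %+.2f" % run_study(10))

# Pooling thirty such studies: the combined estimate settles near the truth.
pooled = [run_study(10) for _ in range(30)]
print("pooled over 30 studies:  %+.2f" % (sum(pooled) / len(pooled)))
print("true effect:             %+.2f" % TRUE_EFFECT)
```

Run repeatedly with different seeds, the single-study line jumps around far more than the pooled one, which is the essay’s point about meta-studies in miniature.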

Returning to the anecdote with which I began this piece to illustrate my argument: given the evidence of the many discussions we’ve had over time, I have every reason to believe my interlocutor that evening is an intelligent person, well-informed in many ways. Given my confidence in his abilities, I also believe he’s fully aware of the difference between anecdote and evidence. Yet, since our evolved brains naturally think first in terms of our own experience so that we easily fall into the anecdote-belief trap, we need to keep in mind the difference between anecdote and evidence, use them appropriately, and make it clear to ourselves and our partners in discussion how we’re using each of them to support our arguments and why.

How the Brain Works (and Doesn’t) vs Our Justice System

We’re learning new and surprising things about our brains all the time. Psychologists, behavioral economists, and other scientists have sophisticated new tricks to reveal what’s going on inside our skulls, and their findings are publicized more widely than ever before. We reveal what we think we’re thinking through polls and quizzes, we’re ‘tricked’ into revealing what we’re really thinking through rigged puzzles and tests (exposing our biases, misconceptions, etc), we have easy access to massive databases of recorded human thought, and most amazingly, neuroscientists can now peer inside our brains while we’re thinking. And some of what we’re learning shows us that we’ve been wrong about ourselves in some really important ways.

So: do changes in our understanding of how the brain works mean we have to change, even to overhaul, how our justice system works too?

Our justice system! WHAT?!? Why would we want to change something so equitable, so honorable, it’s defined by the word ‘justice’?

Seriously, though: to many, this question is almost too scary to ask. 

Won’t we send the signal that we’re no longer tough on crime by even proposing such a project? Isn’t our justice system pretty good as it is? Occasional ‘bad apple’ cops, perjuring witnesses, and corrupt prosecutors aside, our justice system is founded on the wholesome principle of personal responsibility: if we do wrong, we pay the price. A central feature of our justice system, after all, is the right to a fair trial by a jury of our peers. And as a society, we’ve taken great pains to ensure that everyone charged with a crime can get a fair trial. Our standards of evidence are pretty high: there must be eyewitness testimony and plenty of it. Forensic evidence is carefully collected and thoroughly analyzed, from blood to fingerprints to DNA to the tiniest of hairs and fibers. Everyone is entitled to legal counsel, even if the taxpayers have to foot the bill. Children and the mentally disabled are, properly, not tried as if they have the same level of responsibility as fully capable adults. The convicted have a right to appeal if they present evidence that their trial was unfair or if they can demonstrate innocence. And so on.

Even granting all of these and setting them aside for now, a common objection to the current justice system in general is that the underlying concept of personal responsibility is a myth. We’re not responsible, since everything we think and do is determined by laws of cause and effect. So we have no free will, and if we have no free will, the whole concept of moral accountability, of responsibility for our actions, doesn’t make sense. The justice system ends up, then, having nothing to rightfully judge or punish. Let’s explore this for a moment.

What do we mean by personal responsibility? We mean that it is we who did the thing, we who understand that there are alternatives available to us, and we who could, at least conceivably, have done otherwise. This is true even if our personalities, past experiences, and current circumstances make it unlikely we would have chosen otherwise. It’s reasonable to assign responsibility for actions, since it deters us from making bad choices, and motivates us to inculcate better habits in ourselves. Assigning responsibility does not mean we must ruthlessly punish all who do wrong; it means that we can make reasonable demands of one another as the circumstances warrant, be it punishment, recompense, an apology, or an acknowledgement of responsibility.


Who’s responsible for our actions, then? We all are, so long as we are capable of understanding what we should do and why (whether or not we understood at the time), and so long as we could have chosen to do otherwise. Unless immaturity, injury, or illness makes it impossible, or nearly impossible, to control our actions, all persons who are free to make their own choices can and should be held responsible for those choices. (I argue for this more fully in ‘But My Brain Made Me Do It!’)

By the way, that’s why I disagree with many who think that psychopaths shouldn’t be held responsible for their actions. Although it may be more difficult for the psychopath to do the right thing, to respect the rights of others, it’s still in their power to do so, so long as they are capable of reason. We don’t let other people off the hook just because they don’t feel like doing the right thing; and all people, psychopaths or not, often find the wrong thing to do irresistibly attractive, and the right thing difficult. As long as anyone demonstrates sufficient intelligence to understand that the rules apply to them, it doesn’t matter that they wish it didn’t, or feel like it shouldn’t. After all, psychopathy is characterized by lack of empathy, not intelligence. The impartiality that underlies all morality, as in A owes a moral or legal duty to B, so B owes an equal moral or legal duty to A, is a simple and logical equation that any minimally intelligent adult person can grasp, psychopath or not.

So if the concept of personal responsibility, or moral accountability, is generally a good one, and our society is so committed to creating a fair justice system, what’s the problem with it?

Here’s one of the main problems: many of our laws and practices are based on a poor understanding of how memory works, and an underestimation of how often it doesn’t work. For example, many of our law enforcement tactics are virtually guaranteed to result in unacceptably high rates of false convictions through their tendency to influence or convince suspects and eyewitnesses to remember details and events that never happened. Police interrogators can and do legally lie to their subjects in the attempt to ‘get them to talk’, under the assumption that false information has little to no effect on the person being interrogated other than to coax or scare the truth out of them. Until very recently, we didn’t understand how malleable memory really is, how easy it is in many circumstances to get others to form false memories by feeding them information. Courts all over the United States, even the Supreme Court, have upheld such deceptive techniques as lawful, yet we have mounting evidence to show these techniques have the opposite of their intended effect – unless, of course, the intended effect is any conviction for the crime, not just the true one.

The fact that human memory is as undependable as it is seems counterintuitive, to say the least. It’s true that we can be forgetful, that we trip up on unimportant details such as the color of a thing, or the make of a car, or someone’s height. But surely, we can’t forget the really important things, like what the person who robbed or raped us looks like, or whether a crime happened at all, especially if we’ve committed it ourselves. Yet sometimes it’s the accumulation of small, relatively ‘unimportant’ misremembered details that leads us to ultimately convict the wrong people, and the evidence piling up also reveals that we do, in fact, misremember important things all the time. And people are losing their reputations, their liberty, their homes and money, even their lives, because of it.
 
Before we look at this evidence, let’s explore, briefly, how we think about memory, and what we are learning about it.

Not so long ago, working theories of human memory rather resembled descriptions of filing systems or of modern computers; many still think of it that way. We thought of the brain as something like a fairly efficient librarian or personal secretary, neatly and efficiently storing important bits of information, such as memories of things that happened to us, images of the people and places we know or encounter in significant moments in our lives, and so on, in a systematic way that would facilitate later retrieval, and most importantly, retrieval of reliable information. ‘Unimportant’ memories, or the less significant details of memorable events, were thrown away by forgetting, so that brain power wouldn’t be wasted on useless information. Sometimes we’d find old memories stuffed away in the back of the file or pushed off to a corner somewhere, a little difficult to retrieve after much time had passed, but still accessible with some effort. But important memories were stored more or less intact and discrete from one another, so if we remembered something at all, we’d remember it fairly correctly, or at least, the most important details of it. After all, it wouldn’t make sense for our internal secretary to rip up the memory file into little pieces and stash it all over the place helter-skelter, or to cross out random bits or even doctor the files from time to time, would it? And we certainly couldn’t remember things as if they actually happened to us if they didn’t. It seems all wrong that evolution (or an intelligent designer, if that’s your thing) would give us an inefficient, inaccurate, dishonest, or mischievous keeper of that most cherished, most self-defining component of ourselves, our life’s memories.

But now we know that memory works differently in many ways than we thought. How do we know this? For one, we’re now looking inside the brain as we think, and brain scans show that the process of recalling looks different in practice than we might have expected if some of our old theories were true. But more revealingly, we’re taking a closer look at our cognitive blunders. Like many other discoveries in science, we find out more and more about how something works, homing in on it so to speak, by examining instances where it doesn’t work as it should if our theories, or common sense, were correct.

As we’re finding out that memory doesn’t work as we thought, we’re also realizing how heavily our justice system relies on memory and on first-person accounts of what happened. We place a very high value on eyewitness testimony and confessions in pursuing criminal convictions, again, based on faulty old assumptions about how memory works, and how accurately people interpret the evidence of their senses. Yet as we’ll see, people have been falsely convicted because of these, even when other evidence available at the time contradicted these personal accounts of what happened. As our justice system places undue faith in memory and perception, despite their flaws, in what other ways is it built on wrong ideas about how the mind works?

Neuroscience, the quest to understand the brain and how it works, was founded on case studies of how the mind appeared to change when the brain was injured. Traditionally, the mind was thought of as a unified thing that inhabits or suffuses our brain somewhat as a ghost haunts an old house. If part of the house burns down, the ghost is the same: it can just move to another room. But careful observers, early scientists, noticed that an injury to a part of the brain affects the mind itself. When particular parts of the brain were injured, patients lost specific capabilities (to form new memories, for example, or to recognize faces, or to keep their temper). Sometimes, the personality itself would change, from friendly to unfriendly or vice versa. Or in the case of split-brain patients, they would simultaneously like and dislike, believe and not believe, or be able and not able to do certain things, depending on how you ask, or sometimes just depending on which side of the body you address them from! Gradually, we came to understand that the mind is something that emerges from a physical brain, from the way the brain’s parts work together, and is entirely dependent on the brain itself for its qualities and for its very existence.

And as we discussed, it’s not only neuroscientists with their fMRIs who are revealing how the brain works. The criminal justice system has amassed mountains of evidence showing how often the human brain does its job and helps us ‘get the right guy’, and how often it fails. With advances in technology, such as DNA testing, more sophisticated firearms testing, expanded access to files and records, and many other newly available forensic tools, we’re discovering an alarming number of false convictions, not only of people currently imprisoned but, sadly, of those we’re too late to help. Most of these false convictions are the direct result of basic errors by our own faulty brains.

(I’ve discussed the issue of false convictions for crime more briefly in an earlier piece, stressing the importance of knowing the ways our criminal justice system fails, and pointing to one of the most effective ways society can keep itself informed.)

Two of the main culprits in false convictions are false memory and misperception, the one resulting from errors in recall, the other resulting from bias or the misinterpretation of sense evidence.

For example, let’s consider the case of a rape victim and the man she falsely accused, who later joined forces to work for reform in certain law enforcement practices. For years, Jennifer Thompson-Cannino was absolutely sure that Ronald Cotton had raped her, and her identification of him led to his conviction for the crime. After all, as she said, and as the court agreed, one couldn’t forget the face of someone who had done this, who had committed such an intimate crime inches away from one’s own eyes, could one? Eleven years later, the real rapist was identified when the original rape kit sample was re-tested to confirm a DNA match. Over time, Thompson-Cannino had convinced herself, honestly by all accounts, that Cotton’s face was the one she originally saw; the more she looked at it, the more she ‘recognized’ it as that of her rapist, and the more strongly she associated this recognition with the rest of her memories of the crime. Fortunately, she’s one of those rare people who are able to fully admit such an egregious mistake, and she has joined with Cotton to call on police jurisdictions to change the way suspect lineup identifications are conducted. Thompson-Cannino and Cotton learned a hard lesson about the fallibility of memory, and about how preconceived notions (in this case, that the police must be right) can withstand the rigors of the courtroom and still lead to very wrong conclusions.

But we can imagine that such a mistake must happen from time to time, as in Cotton’s case, when the supposed criminal looks quite a bit like the real one, and when the array of circumstances that led to the whole mix-up is so unusually convoluted. But surely a lot of people couldn’t all remember the same crime, or series of crimes, wrongly, and describe it mostly the same way when questioned separately? Well, that happens too. The McMartin case of the early 1980s was just the first of a string of cases which resulted in hundreds of people being convicted for the rape and torture of small children, usually based entirely on the purported evidence of the ‘victims’’ own memories, or those of their family members. The ‘victims’, mostly young children, told detailed, lurid, horrific stories of events that most people would consider beyond the imaginative scope of innocent children. Over time, as those convicted of the crimes were pardoned, exonerated, or usually just had the charges dropped without apology (some still languish in prison, or are confined to house arrest, or are barred from being with children, including those of their own family), videos and transcripts of ‘expert’ interviews with these children revealed the interviewers leading the children on. Professional psychologists and interrogators were found to be influencing children to form their own false memories, even planting them whole-cloth, rather than drawing out real ones, as most people, including the professionals themselves, believed they were doing. Some of these children, now grown, still ‘remember’ these events to this day, even those who now know they never happened.

Okay, now we’re talking about children, and we all know children are impressionable. But usually it’s adults who give evidence in such important cases, and though they might be fuzzy on the details when it comes to what other people did, they know what they themselves did, right? Well, here again, no, not necessarily. In Norfolk, Virginia, eight people were convicted of crimes relating to the rape and murder of one woman, based on the first-person confessions and testimony of four military men. These men, who had tested sound enough of mind and body to join the Navy, were convinced by police interrogators, one by one, to ‘reveal’ that they and several others had, by a series of coincidences, formed an impromptu gang-rape party that managed to commit a violent crime lasting for hours while leaving little destruction or evidence behind. Although these confessions didn’t match the evidence found at the crime scene, didn’t match one another, or were contradicted by alibis, all were found guilty on the common-sense and legal assumption that sane, capable people don’t falsely accuse themselves. Yet as in the Norfolk case, the case history of our criminal justice system reveals that with the right combination of pressure, threats, assertion of authority, and personality type, just about anyone can be pushed to confess to committing even a terrible crime, and worse yet, become convinced that they did it (as one of the men did in the Norfolk case).

The Central Park Five, as they are often called, were five teenage boys, aged 14 to 16, who were convicted of the rape, torture, and attempted murder of a woman in Central Park in New York City. The justifiably horrified public outrage at this crime, combined with frustration over a rash of other crimes throughout the city, put a lot of pressure on law enforcement to solve the crime in a hurry. These five boys had already been picked up by police officers on suspicion of committing other crimes that night, robbery and assault among them, and when the unconscious, severely beaten woman was found, the police hoped they had her attackers in custody already. After hours of intensive, untaped interrogation, all five eventually confessed, implicating themselves and each other. They were convicted, despite the fact that the blood evidence matched none of them and their confessions contradicted each other in important details.

False memories and false confessions are only two of the ways our fallible brains can lead us astray in the search for truth. Human psychology, so effective at so many things, is also short-sighted, self-serving, and wedded to satisfying and convenient narratives, to a fault. Law enforcement officials in all of these cases were convinced that their theories of effective interrogation were right, and that their perceptions of the suspects were right. The prosecutors were convinced that the police officers had delivered the right suspects for trial. The legislators who made the laws, and the courts that upheld them, were convinced they were acting in the best interest of justice. And as we’ve seen, all of these were wrong.

As was just about everyone involved in bringing Todd Willingham to court, and in condemning him to die for the murder of his own children by arson. By all accounts, Willingham didn’t act as people would expect a grieving father to act, especially one who had escaped the same burning house his children had died in. Yet it was one faulty theory after another, from pop psychology and preconceived notions about what ‘real’ grief looks like, to bad forensics, to a poor understanding of how an exceedingly immature and awkward man might merely appear guilty of an otherwise unbelievable crime, that led to his conviction and execution by lethal injection.

But the problem of false conviction for crime is much, much larger than we might suppose from the cases we’ve considered here: these were all major crimes, and as such were subject to much more rigorous scrutiny than most cases receive. If wrongful convictions are known to happen so often in the case of major crimes, we can reasonably extrapolate a very high number of false arrests, undeserved fines, and especially, false plea deals, in which people innocent of the relatively minor crimes they’re accused of are rounded up, charged, and sentenced. Plea bargaining presents a special problem: suspects are persuaded to plead guilty and accept a lesser sentence than the frighteningly harsh one they’re originally threatened with, and in jurisdiction after jurisdiction, we’re finding that huge numbers of innocent people are sent to prison every year through this method. All of this results from a blind zeal to promote justice, or at least the appearance of justice, in the interest of feeling secure, of more firmly establishing authority, or of fulfilling the emotional need to adhere to comforting social traditions.

So how do we need to change our attitude towards our criminal justice system, in the pursuit of actual justice? With a proper spirit of epistemic humility, a greater concern for those who may have been wrongfully convicted, and a real love of justice itself (rather than the mere show of caring about justice that ‘tough on crime’ posturing too often consists of).

But we are still left with the practical task of protecting ourselves and one another. Positive action must be taken, or crime, unopposed, will run rampant. But that doesn’t mean we should hold on to old ideas and practices just because we like them, because they are familiar and ‘time-tested’ and make us feel safe. This includes the death penalty, which shuts out all possibility of remedying our mistakes.

Here’s one general solution: approach criminal justice as we do science itself, where we accept conclusions based on the best evidence available at the time, but on the understanding that all conclusions are contingent, revisable if better, compelling, well-tested evidence comes along. We need a justice system that assumes the fallibility of memory and perception, and builds in systematic corrections for them.

And we need a system that doesn’t just pay homage to this idea: we need to build one that actually allows for corrections, and not in such a way that it takes years, if ever, to release someone from prison or clear someone’s name when the evidence calls for it. Many would say that the system already works this way: look at how many appeals are available to the convicted, and how many hundreds of people have already been exonerated of serious crimes. But in most circumstances it doesn’t work that way. It takes anywhere from months to several years after actual innocence is established to actually release a wrongfully convicted person from prison. And most of the convicted are not so lucky as to have their innocence provable at all: most cases have no DNA evidence available to test, at least none that would definitively prove guilt or innocence. Even in the rare cases where such evidence is available, it is often never re-tested, since the bar for re-evaluation of evidence is so high. Or the evidence that was available is destroyed after the original conviction and is unavailable for re-examination. Or legal jurisdictions are so determined that their authority remain unchallenged that they make it extremely difficult, if not impossible, for prosecutors and law enforcement officers to be held accountable in any way if they make a mistake, and bend over backwards to make sure such mistakes are never revealed. And so on.

In short: we don’t just need a justice system that brings in science to help out; we need a justice system whose laws and practices emulate the self-correcting discipline of science, which, in turn, is derived from the honest acknowledgement of the limitations of our own minds.

*Listen to the podcast version here or on iTunes

* Also published at Darrow, a forum for thoughts on the cultural and political debates of today

~~~~~~~~~~~~~~~~~~~~~~

Sources and Inspiration

‘About the Central Park Five’ [film by Ken Burns, David McMahon and Sarah Burns], PBS.org.
http://www.pbs.org/kenburns/centralparkfive/about-central-park-five/

Berlow, Alan. ‘What Happened in Norfolk’. New York Times Magazine, Aug 19, 2007.
http://www.nytimes.com/2007/08/19/magazine/19Norfolk-t.html

‘The Causes of Wrongful Convictions’. The Innocence Project.
http://www.innocenceproject.org/causes-wrongful-conviction

Celizic, Mike. ‘She Sent Him to Jail for Rape; Now They’re Friends’. Today News, Mar 3, 2009.
http://www.today.com/id/29613178/ns/today-today_news/t/she-sent-him-jail-rape-now-th

Eagleman, David. ‘Morality and the Brain’. Philosophy Bites podcast, May 22, 2011.
http://philosophybites.com/2011/05/david-eagleman-on-morality-and-the-brain.html

‘The Fallibility of Memory’, Skeptoid podcast #446. Dec 23, 2014.
http://skeptoid.com/episodes/4446

Fraser, Scott. ‘Why Eyewitnesses Get It Wrong’. TED talk, May 2012.
http://www.ted.com/talks/scott_fraser_the_problem_with_eyewitness_testimony

Grann, David. ‘Trial by Fire: Did Texas Execute an Innocent Man?’ The New Yorker, Sep 7, 2009
http://www.newyorker.com/magazine/2009/09/07/trial-by-fire

Hughes, Virginia. ‘How Many People Are Wrongly Convicted? Researchers Do the Math’. National Geographic: Only Human, Apr 28, 2014.
http://phenomena.nationalgeographic.com/2014/04/28/how-many-people-are-wrongly-

Jensen, Frances. ‘Why Teens Are Impulsive, Addiction-Prone And Should Protect Their Brains’. Fresh Air interview, Jan 28th, 2015.
http://www.npr.org/blogs/health/2015/01/28/381622350/why-teens-are-impulsive-...

Kean, Sam. ‘These Brains Changed Neuroscience Forever’. Interview on Inquiring Minds.

Lilienfeld, Scott O. and Hal Arkowitz. ‘What “Psychopath” Means’. Scientific American,
Nov 28, 2007. http://www.scientificamerican.com/article/what-psychopath-means/

Loftus, Elizabeth. ‘Creating False Memories’. Scientific American, Sept 1, 1997. http://faculty.washington.edu/eloftus/Articles/sciam.htm
and ‘The Fiction of Memory’. TED talk, June 2013.
http://www.ted.com/talks/elizabeth_loftus_the_fiction_of_memory?

 
Nelkin, Dana K., ‘Moral Luck’, The Stanford Encyclopedia of Philosophy. Edward N. Zalta (ed.) http://plato.stanford.edu/entries/moral-luck/
 
Perillo, Jennifer T. and Saul M. Kassin. ‘The Lie, The Bluff, and False Confessions’. Law and Human Behavior (academic journal of the American Psychology-Law Society), Aug 24, 2010.
https://www.how2ask.nl/wp-content/uploads/downloads/2011/10/Perillo-Kassin-The-Lie

Possley, Maurice. ‘Fresh Doubts Over a Texas Execution’. The Washington Post, Aug 3, 2014.
http://www.washingtonpost.com/sf/national/2014/08/03/fresh-doubts-over-a-texas-execution/

Robertson, Campbell. ‘South Carolina Judge Vacates Conviction of George Stinney in 1944 Execution’. The New York Times, Dec 17, 2014. http://www.nytimes.com/2014/12/18/us/judge-vacates-convict

Shaw, Julia. ‘False Memories Creating False Criminals’. Interview, Point of Inquiry podcast.
March 2nd, 2015. http://www.pointofinquiry.org/false_memories_creating_false_criminals_with_dr..

‘The Trial That Unleashed Hysteria Over Child Abuse’. New York Times, Mar 9, 2014,
and the video ‘McMartin Preschool: Anatomy of a Panic | Retro Report’.