
Thursday, May 19, 2022

Do You or Someone You Know Suffer From MAH?

[Why did I not publish this in May 2020 when I wrote it? Was I actually going to try to get the MAH Foundation started? No recollection. I'll go on & publish now, worry about whom to send it to later.

Last Update 2020-05-05]

MAH, or Malignant Amygdala Hyperplasia

Landmark study in 2011:

https://www.cell.com/current-biology/fulltext/S0960-9822(11)00289-2

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3092984/

Another study in late 2017

https://www.nature.com/articles/s41562-017-0248-5

Psychological study:

2012

https://royalsocietypublishing.org/doi/full/10.1098/rstb.2011.0268#aff-1

my comment: could also reference hyperthyroidism.

So, perhaps we look at this result as telling us that Conservatism is a side effect of MAH - Malignant Amygdala Hyperplasia? Perhaps it is treatable by drugs, similar to the one I take for BPH? Or perhaps research by someplace like 23andMe could identify a genetic component? Maybe we could finally start building the bright, shiny future we all deserve if we can get rid of Conservatism and its friends, Patriarchy and Feudalism.

I'm going to contact a few of my very rich friends and propose the creation of the MAH Foundation, to promote research into this disorder and into finding a cure for it. Perhaps the scourge of MAH can be wiped out in our lifetimes!

So tempting to call it MAD - Malignant Amygdala Disorder - but I believe MAH is the more correct terminology.

Actually, I think that Malignant => Cancerous. So maybe it's Benign Amygdala Hyperplasia, or BAH. But IMO, it is a cancer on our society. My more technical medical allies will have to make that call. How hard will it be to get this into ICD-10?

Articles discussing:

2010:

https://www.dailymail.co.uk/sciencetech/article-1342239/Brain-study-reveals-right-wing-conservatives-larger-primitive-amygdala.html

https://www.salon.com/2010/12/29/conservative_brains/

2011:

https://www.psychologytoday.com/us/blog/the-human-beast/201104/conservatives-big-fear-brain-study-finds

https://www.discovermagazine.com/mind/your-brain-on-politics-the-cognitive-neuroscience-of-liberals-and-conservatives

2016:

https://www.psychologytoday.com/us/blog/mind-in-the-machine/201612/fear-and-anxiety-drive-conservatives-political-attitudes

https://www.salon.com/2016/06/06/study_liberals_and_conservatives_have_different_brain_structures_partner/

2017:

https://www.dailykos.com/stories/2017/4/9/1651771/-Neurology-Conservative-Amygdala-Fake-News-Liberal-Anterior-Cingulate-Cortex-Rational-Analysis

https://www.washingtonpost.com/news/inspired-life/wp/2017/11/22/at-yale-we-conducted-an-experiment-to-turn-conservatives-into-liberals-the-results-say-a-lot-about-our-political-divisions/

2018:

https://qz.com/1238929/your-political-views-are-influenced-by-the-size-of-your-brains-amygdala/

https://www.dailykos.com/stories/2018/3/7/1747135/-What-makes-a-conservative-may-be-Fear-Itself

Friday, April 13, 2018

It's Getting Better All The Time!

5 years ago, I greatly enjoyed Steven Pinker's book "The Better Angels of Our Nature", subtitled "Why Violence Has Declined", blogged here under the title "Good News! Good News!" So I was really psyched by the release of his follow-up to that book: "Enlightenment Now", subtitled "The Case for Reason, Science, Humanism, and Progress", February 13, 2018, 576 pages. The cover has a blurb by Bill Gates, "My new favorite book of all time". Gates has written a review of the book, if you want something a lot shorter than I'm guessing this review/summary will turn out to be.

Just published, here's an interview with Pinker on the book in the NY Times April 10, 2018.

As I started reading this book, I was not going to write one of my long review/summaries. I really felt like everybody should just GO READ THE BOOK! It is an easy and fun read! But as I continued to read, there were just so many succinct statements of things we all should know but don't know, or, rather, forget that we know, that I started highlighting. So here we go.

The book has 23 chapters in 3 parts: Part I, "Enlightenment", 3 chapters; Part II, "Progress", 17 chapters; and Part III, "Reason, Science, and Humanism", 3 chapters, 1 on each of the 3 topics listed. Like "Better Angels", the book is chock full of charts (75) showing bad things declining or good things increasing - with 1 exception.


Part I has an introduction before Chapter 1. Pinker recounts how, after he gave a talk on "the commonplace among scientists that mental life consists of patterns of activity in the tissues of the brain", a student asked him, sincerely, "Why should I live?" Here is part of Pinker's answer, which gives a good feel for his overall approach.

As a sentient being, you have the potential to flourish. You can refine your faculty of reason itself by learning and debating. You can seek explanations of the natural world through science, and insight into the human condition through the arts and humanities. You can make the most of your capacity for pleasure and satisfaction, which allowed your ancestors to thrive and thereby allowed you to exist. You can appreciate the beauty and richness of the natural and cultural world. As the heir to billions of years of life perpetuating itself, you can perpetuate life in turn. You have been endowed with a sense of sympathy — the ability to like, love, respect, help, and show kindness—and you can enjoy the gift of mutual benevolence with friends, family, and colleagues.

And because reason tells you that none of this is particular to you, you have the responsibility to provide to others what you expect for yourself. You can foster the welfare of other sentient beings by enhancing life, health, knowledge, freedom, abundance, safety, beauty, and peace. History shows that when we sympathize with others and apply our ingenuity to improving the human condition, we can make progress in doing so, and you can help to continue that progress.

"Human flourishing" is a touchstone throughout the book.

Chapter 2 has an interesting title: "Entro, Evo, Info" - referring to Entropy, Evolution, and Information. Entropy, the tendency of all things towards disorder ("shit happens"), must always increase, according to the 2nd Law of Thermodynamics. This provides the background against which all activity, all striving, in the universe must work.

the ultimate purpose of life, mind, and human striving: to deploy energy and knowledge to fight back the tide of entropy and carve out refuges of beneficial order.
The 2nd Law of Thermodynamics only applies "in a closed system". Meanwhile
Organisms are open systems: they capture energy from the sun, food, or ocean vents to carve out temporary pockets of order in their bodies and nests while they dump heat and waste into the environment, increasing disorder in the world as a whole.
Evolution is the song and the story by which organisms, through natural selection, discover ever better ways to fight entropy. Information is how they store those better ways, both in the DNA of their genomes, and, in more advanced organisms such as humans, in their nervous systems, their minds.
Internal representations that reliably correlate with states of the world, and that participate in inferences that tend to derive true implications from true premises, may be called knowledge.

...

language, which allowed them to coordinate their actions and to pool the fruits of their experience into the collections of skills and norms we call cultures.

Pinker emphasizes how these 3 forces are what is real, are what we really have to work with. As opposed to
disembodied forces like karma, fate, spiritual messages, cosmic justice, and other guarantors of the intuition that “everything happens for a reason.”

...

misfortune may be no one’s fault.

...

Not only does the universe not care about our desires, but in the natural course of events it will appear to thwart them, because there are so many more ways for things to go wrong than for them to go right.

In Chapter 3, "Counter-Enlightenment", Pinker talks about some of the things that oppose the Enlightenment: left- and right-wing ideologies; romanticism [which I think I may have encountered in some of my right-wing gunophile acquaintances]; and finally, the Art Nazis.
A final alternative to Enlightenment humanism condemns its embrace of science. Following C. P. Snow, we can call it the Second Culture, the worldview of many literary intellectuals and cultural critics, as distinguished from the First Culture of science.

...

They write as if the consumption of elite art is the ultimate moral good.

...

The idea that the ultimate good is to use knowledge to enhance human welfare leaves people cold. Deep explanations of the universe, the planet, life, the brain? Unless they use magic, we don’t want to believe them! Saving the lives of billions, eradicating disease, feeding the hungry? Bo-ring. People extending their compassion to all of humankind? Not good enough — we want the laws of physics to care about us! Longevity, health, understanding, beauty, freedom, love? There’s got to be more to life than that.


Part II, "Progress", starts with a great quote from my favorite president:

If you had to choose a moment in history to be born, and you did not know ahead of time who you would be—you didn’t know whether you were going to be born into a wealthy family or a poor family, what country you’d be born in, whether you were going to be a man or a woman—if you had to choose blindly what moment you’d want to be born, you’d choose now.
—Barack Obama, 2016
The 1st Chapter of this part is titled "Progressophobia", and follows Chapter 3 in exploring opposition to the Enlightenment and the goal of Progress. My old friend Professor Pangloss is mentioned. [I have downloaded most of Voltaire to my iPad; I may try to actually get around to reading "Candide".] We hear about "the Optimism Gap" - I'm OK, but everybody else sucks. My neighborhood is safe and prosperous, but everyone else's are varying degrees of shitholes [to quote our Dear Leader].

Pinker talks about "mental bugs", or cognitive biases or illusions (tagged in this blog as "cognitive illusions"), as we've met before in the work of Tversky and Kahneman, in particular the Availability heuristic. And, of course, the policies of most news media - "If it bleeds, it leads" don't help things at all.

Hah, I like this example.

The consequences of negative news are themselves negative. Far from being better informed, heavy newswatchers can become miscalibrated. They worry more about crime, even when rates are falling, and sometimes they part company with reality altogether: a 2016 poll found that a large majority of Americans follow news about ISIS closely, and 77 percent agreed that “Islamic militants operating in Syria and Iraq pose a serious threat to the existence or survival of the United States,” a belief that is nothing short of delusional.
Pinker's answer? Science! "The answer is to count."
A quantitative mindset, despite its nerdy aura, is in fact the morally enlightened one, because it treats every human life as having equal value rather than privileging the people who are closest to us or most photogenic.
Pinker talks about the U.N. Millennium Development Goals, most of which had been met by their target date of 2015. (They have been followed by the Sustainable Development Goals, which run to 2030.)
And here is a shocker: The world has made spectacular progress in every single measure of human well-being. Here is a second shocker: Almost no one knows about it.
Pinker lists 3 websites with data detailing this progress. They all look great, but HumanProgress does not have an RSS feed :-( Hah! I took the Gapminder test, and got a measly 54% correct!

The next 15 chapters detail the world's progress, with chart upon chart to back it up. This parallels "Better Angels", but with 5 more years of data. The chapters are:

  1. Life
  2. Health
  3. Sustenance
  4. Wealth
  5. Inequality
  6. The Environment
  7. Peace
  8. Safety
  9. Terrorism
  10. Democracy
  11. Equal Rights
  12. Knowledge
  13. Quality of Life
  14. Happiness
  15. Existential Threats


Chapter 9, "Inequality", was somewhat surprising. Pinker downplays the importance of inequality. He quotes philosopher Harry Frankfurt's 2015 book On Inequality:

“From the point of view of morality, it is not important everyone should have the same. What is morally important is that each should have enough.”
Here he questions a lot of the basis of "1% vs 99%" arguments - this seemed somewhat glib to me:
Readers commit the same fallacy when they read that “the top one percent in 2008” had incomes that were 50 percent higher than “the top one percent in 1988” and conclude that a bunch of rich people got half again richer. People move in and out of income brackets, shuffling the order, so we’re not necessarily talking about the same individuals. The same is true for “the bottom fifth” and every other statistical bin.
These comments hit home with regard to recent US politics:
The international and global Gini curves show that despite the anxiety about rising inequality within Western countries, inequality in the world is declining.

...

Now, it’s true that the world’s poor have gotten richer in part at the expense of the American lower middle class, and if I were an American politician I would not publicly say that the tradeoff was worth it. But as citizens of the world considering humanity as a whole, we have to say that the tradeoff is worth it.

Take that, Trump voters! But he does counterbalance this outlook with yet another thing that we all forget:
globalization, may produce winners and losers in income, but in consumption it makes almost everyone a winner.
Yes, we are already in a post-scarcity economy, now we just need to work on the utopia part.

Pinker gives a nod to Universal Basic Income (UBI).


In Chapter 10, "The Environment", Pinker also strays from the party line. 1st, here's a positive fact to be happy about:

the proportion of the Earth’s land set aside as national parks, wildlife reserves, and other protected areas has grown from 8.2 percent in 1990 to 14.8 percent in 2014—an area double the size of the United States. Marine conservation areas have grown as well, more than doubling during this period and now protecting more than 12 percent of the world’s oceans.
But, he appears not to be a fan of the ideas and arguments of Naomi Klein, whose 2014 "This Changes Everything: Capitalism vs. the Climate" I described as "a clarion call to action". He starts off sarcastically paraphrasing her:
we should not treat the threat of climate change as a challenge to prevent climate change. No, we should treat it as an opportunity to abolish free markets, restructure the global economy, and remake our political system.

He then moves on to some flat-out criticisms, justified(?), of some of her stances:
In one of the more surreal episodes in the history of environmental politics, Klein joined the infamous David and Charles Koch, the billionaire oil industrialists and bankrollers of climate change denial, in helping to defeat a 2016 Washington state ballot initiative that would have implemented the country’s first carbon tax, the policy measure which almost every analyst endorses as a prerequisite to dealing with climate change. Why? Because the measure was “right-wing friendly,” and it did not “make the polluters pay, and put their immoral profits to work repairing the damage they have knowingly created.” In a 2015 interview Klein even opposed analyzing climate change quantitatively

...

Blowing off quantitative analysis as “bean-counting” is not just anti-intellectual but works against “values, human rights, right and wrong.”

Pinker also invokes the Kuznets curve, which Kate Raworth (and Piketty?) describes as being "debunked" - search for Kuznets here.

Pinker is a proponent of nuclear energy (as am I), and is willing to consider climate engineering that is "moderate, responsive, and temporary".

Here is a FFTKAT: the chemical formula for coal is C137H97O9NS. Wow, lots of nasty carbon compared to our friends hydrogen and oxygen, and a nasty sulfur atom as well.
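
To put rough numbers on my "lots of nasty carbon" reaction, here is a minimal back-of-the-envelope sketch (my own, in Python, using standard atomic weights; the only input taken from the book is the formula above):

  # Rough elemental mass fractions for the representative coal formula C137H97O9NS.
  # Back-of-the-envelope only: standard atomic weights, element counts from the formula above.
  atomic_weight = {"C": 12.011, "H": 1.008, "O": 15.999, "N": 14.007, "S": 32.06}
  counts = {"C": 137, "H": 97, "O": 9, "N": 1, "S": 1}

  total_mass = sum(atomic_weight[el] * n for el, n in counts.items())
  for el, n in counts.items():
      print(f"{el}: {atomic_weight[el] * n / total_mass:.1%}")
  # Carbon comes out to roughly 85% of the mass, hydrogen about 5%, oxygen about 7%,
  # nitrogen under 1%, sulfur a percent or two.

So by mass it really is overwhelmingly carbon, with that lone sulfur atom still contributing a percent or two.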


Chapter 12, "Safety", was 1 of my favorites. So many charts showing how much unbelievably safer we are! I have been tweeting 1 of these every few days. Here's the latest:

I've been tagging them with #ItsGettingBetterAllTheTime and #TheFuture. The last few I've added #EnlightenmentNow (duh). Where appropriate, I also add #GovernmentRegulationSavesLives to troll my libertarian friends.

Here we encounter our 1 exception to bad things declining and good things increasing: deaths by poison (solid or liquid) started rising steeply in 1990, going from around 2-3 deaths/100,000 people/year to 12 or so - a 4-5x increase. Pinker was of course surprised by this exception - until he realized that this category mostly (98%) consists of drug overdoses. The Opioid Epidemic is indeed a step backward, in contrast to so many other steps forward.


Chapter 18, "Happiness" discusses, along with other topics, the current mental state of the US and the world.

Psychologists and psychiatrists have begun to sound the alarm against this “disease mongering,” “concept creep,” “selling sickness,” and “expanding empire of psychopathology.” In her 2013 article “Abnormal Is the New Normal,” the psychologist Robin Rosenberg noted that the latest version of the DSM could diagnose half the American population with a mental disorder over the course of their lives.

The expanding empire of psychopathology is a first-world problem, and in many ways is a sign of moral progress. Recognizing a person’s suffering, even with a diagnostic label, is a form of compassion, particularly when the suffering can be alleviated.

We are introduced to "Betteridge’s Law of Headlines: Any headline that ends in a question mark can be answered with the word no." with reference to a study which asked “Is There an Epidemic of Child or Adolescent Depression?” Answer, no.

[Ha ha, the very next day, April 11, there was a headline on page 1 of the Living section of the Lexington Herald-Leader, "Is there a child, teen mental health crisis in the US?", over an article by a syndicated, old-school, spare-the-rod-and-spoil-the-child family psychologist. He, of course, answered "yes". His evidence:

Today's child, by age 16, is five to 10 times - depending on the source - more likely to experience a prolonged emotional crisis than was a child raised in the 1950s. For example,
OK, he's going to tell us about some of those studies, right? Wrong! Instead we get a worthless anecdote:
For example, I do not remember, nor have I ever run into a person my age who remembers a high school classmate committing suicide.
I will be writing a letter to the editor on this, reminding them that, particularly in matters of public health and science, data and evidence are what matter, not opinion and anecdotes.]


Here's another FFTKAT from Chapter 19, "Existential Threats":

about 10 percent of electricity in the United States comes from dismantled nuclear warheads, mostly Soviet.
The final chapter of Part II is Chapter 20, "The Future of Progress". Guess what? It's getting better all the time!
The Enlightenment is working: for two and a half centuries, people have used knowledge to enhance human flourishing.
Except for Trump voters, of course, but Pinker shows that the demographics are in our favor. But it comes back to the same issue as ever - when will the young people start voting? Fingers quadruple-crossed that the #NeverAgain movement will get out young voters in 2018.


Chapter 21 "Reason" opens with a tautology: "Opposing reason is, by definition, unreasonable." Postmodernism, ugh. Pinker quotes philosopher Thomas Nagel, in what seems to me to be a riff on the Liar's Paradox:

The claim “Everything is subjective” must be nonsense, for it would itself have to be either subjective or objective. But it can’t be objective, since in that case it would be false if true. And it can’t be subjective, because then it would not rule out any objective claim, including the claim that it is objectively false. There may be some subjectivists, perhaps styling themselves as pragmatists, who present subjectivism as applying even to itself. But then it does not call for a reply, since it is just a report of what the subjectivist finds it agreeable to say. If he also invites us to join him, we need not offer any reason for declining, since he has offered us no reason to accept.
The more you think about postmodernism and absolute relativism (an oxymoron), the more you realize that they are, well, just silly.
From the most recondite deconstructionist to the most anti-intellectual purveyor of conspiracy theories and “alternative facts,” everyone recognizes the power of responses like “Why should I believe you?” or “Prove it” or “You’re full of crap.” Few would reply, “That’s right, there’s no reason to believe me,” or “Yes, I’m lying right now,” or “I agree, what I’m saying is bullshit.” It’s in the very nature of argument that people stake a claim to being right. As soon as they do, they have committed themselves to reason—and the listeners they are trying to convince can hold their feet to the fire of coherence and accuracy.
This is so relevant to modern times. Facts that impinge on political beliefs are negated, by both the left and the right (but of course the right is worse). Maintaining status and political correctness within your political tribe is more important to people's sense of self and worth than ascertaining facts and truth.
Given these payoffs, endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational after all—at least, not by the criterion of the immediate effects on the believer. The effects on the society and planet are another matter. The atmosphere doesn’t care what people think about it, and if it in fact warms by 4° Celsius, billions of people will suffer, no matter how many of them had been esteemed in their peer groups for holding the locally fashionable opinion on climate change along the way.

...

preposterous beliefs are more effective signals of coalitional loyalty than reasonable ones

I have heard this last with reference to religious beliefs - it's easy to believe something reasonable; it takes real zealotry to believe something that is completely unbelievable.

Pinker references "legal scholar Dan Kahan:"

Kahan concludes that we are all actors in a Tragedy of the Belief Commons: what’s rational for every individual to believe (based on esteem) can be irrational for the society as a whole to act upon (based on reality).
Pinker talks about some other terms which are new to me:
  • a blue lie is told for the benefit of an in-group (originally, fellow police officers) ...
  • But since another part of the human mind keeps a person in touch with reality, as the counterevidence piles up the dissonance can mount until it becomes too much to bear and the opinion topples over, a phenomenon called the affective tipping point ...
  • Most of us are deluded about our degree of understanding of the world, a bias called the Illusion of Explanatory Depth.
Ha ha, I wonder what libertarians have to say about this?
no developed country runs on right-wing libertarian principles, nor has any realistic vision of such a country ever been laid out.
I really liked this idea: that politics should be based more on scientific principles.
A more rational approach to politics is to treat societies as ongoing experiments and open-mindedly learn the best practices, whichever part of the spectrum they come from.

...

Reason tells us that political deliberation would be most fruitful if it treated governance more like scientific experimentation and less like an extreme-sports competition.

As disheartening as our fractious politics and even our views of reality are, Pinker, as would be expected, offers us some hope.
We are not in a post-truth era. Mendacity, truth-shading, conspiracy theories, extraordinary popular delusions, and the madness of crowds are as old as our species, but so is the conviction that some ideas are right and others are wrong.

...

eight in 10 Americans have a positive view of political fact-checking.

This is an insight into yet another cognitive illusion/bias that can help us move forward. Recognizing cognitive biases helps us to compensate for them.
The discovery that political tribalism is the most insidious form of irrationality today is still fresh and mostly unknown.


Chapter 22 is titled "Science"! Yes, science!

What, then, distinguishes science from other exercises of reason? It certainly isn’t “the scientific method,” a term that is taught to schoolchildren but that never passes the lips of a scientist. Scientists use whichever methods help them understand the world: drudgelike tabulation of data, experimental derring-do, flights of theoretical fancy, elegant mathematical modeling, kludgy computer simulation, sweeping verbal narrative. All the methods are pressed into the service of two ideals, and it is these ideals that advocates of science want to export to the rest of intellectual life.

The first is that the world is intelligible.

...

The second ideal is that we must allow the world to tell us whether our ideas about it are correct.

Some bitter medicine for faith-based or magical thinking:
the findings of science imply that the belief systems of all the world’s traditional religions and cultures—their theories of the genesis of the world, life, humans, and societies—are factually mistaken.

...

There is no such thing as fate, providence, karma, spells, curses, augury, divine retribution, or answered prayers

...

Though the scientific facts do not by themselves dictate values, they certainly hem in the possibilities. By stripping ecclesiastical authority of its credibility on factual matters, they cast doubt on its claims to certitude in matters of morality.

Pinker is not a fan of most Second Culture analyses of science.
The result is like a report of a basketball game by a dance critic who is not allowed to say that the players are trying to throw the ball through the hoop.

...

Resisters of scientific thinking often object that some things just can’t be quantified. Yet unless they are willing to speak only of issues that are black or white and to foreswear using the words more, less, better, and worse (and for that matter the suffix –er), they are making claims that are inherently quantitative. If they veto the possibility of putting numbers to them, they are saying, “Trust my intuition.” But if there’s one thing we know about cognition, it’s that people (including experts) are arrogantly overconfident about their intuition.

This was an interesting example of the value of quantification.
The political scientists Erica Chenoweth and Maria Stephan assembled a dataset of political resistance movements across the world between 1900 and 2006 and discovered that three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones. Gandhi and King were right, but without data, you would never know it.
We encountered Dr. Chenoweth before here. I have their book "Why Civil Resistance Works" but have not read it yet.

Pinker discusses the disconnect between the science and humanities departments in modern universities. The humanities departments have been struggling for years, particularly as our universities (and educational systems as a whole) are being transformed into factories for producing corporate wage slaves. Pinker urges them to keep their focus on their very, very important subject matter, but to adopt scientific methods, rather than condemning them.

The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, self-refuting relativism, and suffocating political correctness. Many of its luminaries—Nietzsche, Heidegger, Foucault, Lacan, Derrida, the Critical Theorists—are morose cultural pessimists who declare that modernity is odious, all statements are paradoxical, works of art are tools of oppression, liberal democracy is the same as fascism, and Western civilization is circling the drain.

...

A consilience with science offers the humanities many possibilities for new insight. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, and a forward-looking agenda that could attract ambitious young talent (not to mention appealing to deans and donors). The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanities scholars.

...

The advent of data science applied to books, periodicals, correspondence, and musical scores has inaugurated an expansive new “digital humanities.” The possibilities for theory and discovery are limited only by the imagination, and include the origin and spread of ideas, networks of intellectual and artistic influence, the contours of historical memory, the waxing and waning of themes in literature, the universality or culture-specificity of archetypes and plots, and patterns of unofficial censorship and taboo.


Chapter 23, the last, is titled "Humanism". Here is Pinker's definition:

The goal of maximizing human flourishing—life, health, happiness, freedom, knowledge, love, richness of experience—may be called humanism.
[sarcasm] I will lodge an official protest on behalf of my gunophile friends that he does not mention "lots of guns" in this statement of what defines human flourishing.[/sarcasm]

He references The Humanist Manifesto, 1st published in 1933, now on version III. In my very 1st blog post, I included a link to the principles of the Council for Secular Humanism - that link was dead, so I have refreshed it. I think I like their verbiage a little better.

I enjoyed Pinker's discussion of "the Golden Rule and its precious-metallic variants". I had heard of the Silver Rule, but not "the Platinum Rule, “Do to others what they would have you do to them.”" I tweeted Pinker the Bronze Rule, which somehow I have been referencing lately (I'm not sure where it came from): "Don't Be An Asshole".

This was something really new: a defense of all of our basic (guilty) human pleasures, which have been under attack, what, forever, by various forms of asceticism and puritanism. I for 1 wish I could undo the damage that 11 years of sin-based teaching in Catholic schools did to my mind.

The physical requirements that allow rational agents to exist in the material world are not abstract design specifications; they are implemented in the brain as wants, needs, emotions, pains, and pleasures. On average, and in the kind of environment in which our species was shaped, pleasurable experiences allowed our ancestors to survive and have viable children, and painful ones led to a dead end. That means that food, comfort, curiosity, beauty, stimulation, love, sex, and camaraderie are not shallow indulgences or hedonistic distractions. They are links in the causal chain that allowed minds to come into being. Unlike ascetic and puritanical regimes, humanistic ethics does not second-guess the intrinsic worth of people seeking comfort, pleasure, and fulfillment—if people didn’t seek them, there would be no people. At the same time, evolution guarantees that these desires will work at cross-purposes with each other and with those of other people. Much of what we call wisdom consists in balancing the conflicting desires within ourselves, and much of what we call morality and politics consists in balancing the conflicting desires among people.
This next statement I think reflects how far we have come in the last 50-100-200(?) years. Kind of an Occam's Razor argument.
Even when humanistic movements fortify their goals with the language of rights, the philosophical system justifying those rights must be “thin.” A viable moral philosophy for a cosmopolitan world cannot be constructed from layers of intricate argumentation or rest on deep metaphysical or religious convictions. It must draw on simple, transparent principles that everyone can understand and agree upon. The ideal of human flourishing—that it’s good for people to lead long, healthy, happy, rich, and stimulating lives—is just such a principle, since it is based on nothing more (and nothing less) than our common humanity.
These ideas all seem so easy, so self-evident, don't they? So what opposes them?
The idea that morality consists in the maximization of human flourishing clashes with two perennially seductive alternatives. The first is theistic morality: the idea that morality consists in obeying the dictates of a deity, which are enforced by supernatural reward and punishment in this world or in an afterlife. The second is romantic heroism: the idea that morality consists in the purity, authenticity, and greatness of an individual or a nation. Though romantic heroism was first articulated in the 19th century, it may be found in a family of newly influential movements, including authoritarian populism, neo-fascism, neo-reaction, and the alt-right.
Hah, reading the 2nd part above, I read "manly" instead of "newly". An insightful slip. I do indeed possess a Y chromosome, and, as such, grew up imagining heroism, valor, strife, slaying dragons, and rescuing maidens (who of course were greatly appreciative). Then, I became an adult, and having to actually be responsible and raise 4 children pushed these to a far back burner. I'm sure they were/are still going great guns in my subconscious.

Over the last few years I have occasionally engaged (for as long as I can stomach) with RWNJ/libertarian gunophiles, mostly on FaceBook. I have concluded that many of them are still immersed in fantasies of, with their incredibly fabulous and sexy gun collections, protecting white women, who will of course be so grateful, from the ravening, raping hordes of black-and-brown-skinned subhumans. So the curse of "romantic heroism" is still gumming up the works.

A final argument for humanism:

The Euthyphro argument puts the lie to the common claim that atheism consigns us to a moral relativism in which everyone can do his own thing. The claim gets it backwards. A humanistic morality rests on the universal bedrock of reason and human interests: it’s an inescapable feature of the human condition that we’re all better off if we help each other and refrain from hurting each other. For this reason many contemporary philosophers, including Nagel, Goldstein, Peter Singer, Peter Railton, Richard Boyd, David Brink, and Derek Parfit, are moral realists (the opposite of relativists), arguing that moral statements may be objectively true or false. It’s religion that is inherently relativistic. Given the absence of evidence, any belief in how many deities there are, who are their earthly prophets and messiahs, and what they demand of us can depend only on the parochial dogmas of one’s tribe.
Our new word for the day: necrometrician, presumably a statistician who studies the statistics of death.

So many inspirational thoughts; I will let Pinker conclude with the last 3 paragraphs of the book:

We will never have a perfect world, and it would be dangerous to seek one. But there is no limit to the betterments we can attain if we continue to apply knowledge to enhance human flourishing.

This heroic story is not just another myth. Myths are fictions, but this one is true—true to the best of our knowledge, which is the only truth we can have. We believe it because we have reasons to believe it. As we learn more, we can show which parts of the story continue to be true, and which ones false—as any of them might be, and any could become.

And the story belongs not to any tribe but to all of humanity—to any sentient creature with the power of reason and the urge to persist in its being. For it requires only the convictions that life is better than death, health is better than sickness, abundance is better than want, freedom is better than coercion, happiness is better than suffering, and knowledge is better than superstition and ignorance.


There is so much good information and so much good news in this book. A few times, yes, the words "Pollyanna" or "Pangloss" came to mind, but that is kind of the point of the book. Given that there is a $B industry out there dedicated to spreading bad news, fear, and anger, a little tilt towards the side of good news is a welcome change.

It is an easy and fun read; please put it on your to-read list. Bill Gates will give you a free Microsoft product! (Just kidding).

You knew it would come sooner or later. Here it is. I love this band, which McCartney has been playing with for the last couple of decades.

Monday, November 28, 2016

Tennessee, Land of Crazy People

* * * * * RANT ALERT * * * * *

I am of course extremely depressed by the election result. I am trying to ignore it as much as possible. I'm definitely taking a few months off. I've unsubscribed from maybe 50 political mailing lists - my inbox is much smaller now - and muted most political tweeters for 30 days.

Only time will tell whether this election will be the turning point in history which puts us on a direct line to Hunger Games, or maybe a good correction in the US as the world's only superpower, or who knows what. Meanwhile ...

My wife had some time off work, so we decided to spend a couple of nights (November 17 & 18) in Nashville. We stayed at the Hermitage Hotel downtown - very nice, old detailed architecture that my wife loves. The first afternoon there we walked around downtown and wound up at the Tennessee Bicentennial Mall State Park. We were taking a rest sitting in front of a fountain feature when we were approached by 2 women. They were in their 30s, white, attractive, well groomed and well dressed. The younger 1 was pushing a stroller with a baby in it. Here's the gist of our conversation.

Older of the 2: "Can we talk with you?"

Me: "Sure."

Older of the 2: "We wanted to share stories of faith and spirit with you. We are both ministers."

Me: "Arrggghhh! I'm an antitheist, I definitely do not want to share stories of faith with you."

Younger of the 2: "What's an antitheist?"

Me: "It's someone who believes that religion harms the human race - that we would all be better off if people quit including nonexistent supernatural beings in their decision making. Please go away and leave us alone." I was at this point making "go away" motions with my hands.

Older of the 2: "OK, well then, where are you from?"

My wife: "Lexington, KY"

Older of the 2: "OK. Have a nice day."

These 2 women's affects were both very laid back and spacey - kind of like they were stoners. I guess they were high on religion. My wife and I were both annoyed at this intrusion of unreality into what had heretofore been a pleasant afternoon.

That evening we had dinner in the hotel's restaurant, the Capitol Grille. The dinner was OK but nothing special. We were eating at 6:30 and the place was pretty crowded. Our server told us that it was because there was a 7:30 performance of "Book of Mormon" at the Tennessee Performing Arts Center, which was 2 blocks up the street.

A few tables over, a couple in their late 30s or early 40s came in. The couple was fairly attractive. The woman had very expensive looking thick curly blond hair past her shoulders and was wearing an evening dress. After a while they were joined by another similarly aged and dressed couple. The woman in this couple was a brunette.

My wife and I both noticed that both men seemed to be totally focused on the blond, who was expounding vivaciously. We kind of felt sorry for the brunette. Then, during a lull in other conversations, we heard some of what she was saying: "with Jesus's love", "Ephesians" ...

At that point, I'm like, crap. Are these people all crazy? This is what is important to them???

And I immediately flashed on this behavior as evolutionary sexual selection. I usually characterize sexual selection as "because chicks dig it", but here we have a woman, proudly flashing her peacock's tail of all the brain cycles she apparently can afford to waste on theories of imaginary superfriends. She was doing it in what appeared to be a courtship / mate selection environment. And the males seemed to be eating it up!

A "peacock's tail" is the archetypal example of a feature of a species that is an extravagant waste of energy, which serves to demonstrate how fit the individual is to be able to waste that much energy, implying that they are obviously superior breeding stock.

Years ago, I got really discouraged and cut way back on following evolutionary psychology and cognitive science, as I kept finding that so many of the things we hold dear - our minds, language, music - were all peacock's tails. They were all sexually selected. They may have had some species survival value, but that was overwhelmed by the "because chicks dig it" aspect. So it really is all about who are the cool kids in middle school. I found this very depressing. So I think we can add religion to this list.

I have speculated in the past that religion may have group-selection effects, in that it increases the survival potential of a group by creating a mechanism which allows for the justification of killing other humans. This may be more or less significant than the sexual selection effect.

This was particularly painful after the presidential election. 80% of white evangelical christians voted for Trump, who appears to be mostly amoral and completely lacking in common human decency. So these same people wanting to share "stories of faith and spirit" and expounding on "Jesus's love" probably voted for this evil man. If I ever had any doubts about evangelical christians being full-of-shit hypocrites, they are all gone now. I'll also bet that most of the audience at this performance of "Book of Mormon" were evangelical christians, who would laugh uproariously at this completely blasphemous play. Blasphemy, the victimless crime.

For many of them, the abortion issue overrides all else. This is yet another example of the harm caused by delusional thinking based on religion rather than facts. Driving in the south, you see the billboards with a fetus thinking, "My heart is beating at 14 days." Well, in pretty much every animal with a heart, it starts beating fairly early in development. So what? The question about development that has meaning is, when is there enough brain to support a human personality? The answer to that question is, around 20 weeks, which is around the time that abortion decisions do indeed become harder.

Oh wait, I'm forgetting the soul. Once the zygote? blastula? fetus? has been given a soul by god, then terminating the pregnancy is murder. I believe Steven Pinker has pointed out that this is a totally slippery slope. Who exactly can identify the time of soul implantation? If you take the extreme case and posit that it is at the time of fertilization, you still have problems. The genetic material of sperm and egg does not fuse instantaneously. And once it has fused and formed a zygote, a large percentage of zygotes fail to implant in the uterine wall or otherwise spontaneously abort. So I guess god wasted those souls? Or murdered them? Well, we know god does not have any problem with letting children die in countless horrible ways.

My overall conclusion is, believing in nonexistent beings and nonexistent souls makes it much harder to make rational decisions. It reminds me of my older brother, who was an officer in the US Navy for 20 years. He is a bright guy, but at times seems to be somewhat of a doofus, which I have always attributed to his time in the military. When you belong to an organization where refusing to follow the orders of a superior, no matter how stupid or irrational, can lead to imprisonment or death, I don't believe you form the best problem analysis and decision making skills. Religion is 10x worse than that.

I generally avoid being stridently antitheist. Of me and my 6 siblings, only 1 of us still practices religion, despite having been raised by a devoutly catholic mother. But I have cousins and a few friends who are religious, and, overall, I try to avoid poking sticks in people's eyes. But this election really drove home to me how harmful religion mostly is. The good news is, Pew Research says that 20% of Gen Xers are unaffiliated religiously, and 26% of millennials. So once again, it's on the youngsters to save our old white butts. Come on kids, you can do it!

Thursday, September 08, 2016

Year's Greatest Nightmare Stacks

Out and about with my youngest daughter and her 2 YO son, we went into Carmichael's Bookstore to kill time while waiting for our food order from Ramsi's. I wound up with not 1 but 2 hardcopy books.

1st, in hardback, Charlie Stross's latest Laundry Files novel (#7), "The Nightmare Stacks". I still buy all Charlie's stuff in hardcopy - I pass the Laundry Files books on to my son-in-law.

This one seemed less tired than the last one.

  1. It's a love story, between our vampire Laundry protagonist and our parallel-universe elven antagonist.
  2. Fun stuff, with a magic-based army taking on tanks and planes of the British military.
  3. An interesting, speculative example of how a human subspecies developing language later could cause a very different type of mind to be created.
2nd, in trade paperback, "The Year's Best Science Fiction: Thirty-Third Annual Collection", edited by Gardner Dozois. I look forward to this every year, but, again this year, after having read the Technology Review, Microsoft, and tor.com short story collections, there were 4-5 stories here I had already read. There were also stories that really did not seem to belong here.
  • "The Children of Gal" by Allen M. Steele is your standard, colony planet forgets science and substitutes religion. That is a really old idea, I think the original Star Trek used it at least a couple of times.
  • "Inhuman Garbage" by Kristine Kathryn Rusch is a police procedural with several threads that fail to come together by the end. So what was the point of including them?
The final story, which is often strikingly good, is the 2nd one in the collection by Aliette de Bodard, set in her oriental-themed space opera where humans are grown to become ship-minds. For whatever reason, these stories just don't do much for me. I guess that conceptually it is interesting for a space empire to have an oriental flavor rather than a western one (Ms. de Bodard is French-Vietnamese), but still, I'm kind of like "so what"?

There are some good and different stories, but no real standouts. I'm wondering if I should quit reading the other short story collections? I'm thinking not. I don't mind rereading the duplicates, they are mostly good stories. But 4 or 5 seems like too many.

Thursday, June 02, 2016

Thinking, Fast And Slow

"Thinking, Fast and Slow" by Daniel Kahneman came out in 2011. It summarizes the work he and his longtime, now deceased collaborator Amos Tversky have done on cognitive biases and decision theory. The book is dedicated "In memory of Amos Tversky".

We 1st met these 2 scholars in the book "Misbehaving" by Richard Thaler, with whom they worked for a while. I blogged about that book November 15, 2015. I would recommend reading that post first, as in this post I give short shrift to the topics already covered in the earlier post. The 2 works together give a big picture of behavioral economics, with "Misbehaving" coming in from the economics side, and "Thinking, Fast And Slow" coming in from the psychological side.

The book is 512 pages long. It has an Introduction, a Conclusion, and 38 chapters in 5 parts. It also contains reprints of 2 of their groundbreaking papers:

  1. "Judgement Under Uncertainty: Heuristics and Biases"
  2. "Choices, Values, and Frames"
The 5 parts are:
  • Part I. Two Systems - 9 chapters
  • Part II. Heuristics and Biases - 9 chapters
  • Part III. Overconfidence - 6 chapters
  • Part IV. Choices - 10 chapters
  • Part V. Two Selves - 4 (short) chapters
The book is easy to read. I liked how every chapter ends with a section "Speaking Of {the title/topic of the chapter}", which contains 4-6 simple declarative statements that summarize the ideas of the chapter. There are also many descriptions of simple experiments and exercises that quickly show you what he is talking about.

Kahneman states his purpose in the Introduction:

improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them.

...

My main aim here is to present a view of how the mind works that draws on recent developments in cognitive and social psychology. One of the more important developments is that we now understand the marvels as well as the flaws of intuitive thought.

Some previews of what we will be getting into:
This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

...

our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight.

...

the unfortunate tendency to treat problems in isolation, and with framing effects, where decisions are shaped by inconsequential features of choice problems.

This next statement seems a little glib, maybe, but it makes sense.
The premise of this book is that it is easier to recognize other people's mistakes than our own.


Part I is spent exploring the 2 types of thinking named in the title. "Thinking, Fast" he refers to as "System 1"; "Thinking, Slow" as "System 2".

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
System 1 is our fabulous subconscious intuition, which generally operates in a fraction of a second. But, to operate that quickly, it makes simplifying assumptions and uses heuristics which apparently had evolutionary value, but which are subject to numerous sources of error.

System 2 is our conscious mind, with its powers of reasoning, calculation, and concentration. It also is prone to errors, perhaps the worst of which is that it is lazy, and doesn't question System 1's snap judgements often enough.

One way you can tell when System 2 is engaged is by measuring the dilation of the pupils. When you concentrate, your pupils dilate. Who knew?

the psychologist Eckhard Hess described the pupil of the eye as a window to the soul.

...

[Hess's article] ends with two striking pictures of the same good-looking woman, who somehow appears much more attractive in one than in the other. There is only one difference: the pupils of the eyes appear dilated in the attractive picture and constricted in the other.

System 2 has limited capabilities: apparently you can't use System 2 and walk at the same time:
While walking comfortably with a friend, ask him to compute 23 × 78 in his head, and to do so immediately. He will almost certainly stop in his tracks.
Another thing about the type of activities performed by System 2 is that using will power engages it and burns mental energy.
both self-control and cognitive effort are forms of mental work. Several psychological studies have shown that people who are simultaneously challenged by a demanding cognitive task and by a temptation are more likely to yield to the temptation.

...

The phenomenon has been named ego depletion.

One of System 1's skills is association, as in free association. Association is subject to the priming effect - if you have been talking about eating, you will pick out words for food rather than others; if you have been shown words relating to old age, you will complete a task more slowly, like an old person. The 2nd instance is known as the ideomotor effect.

Reciprocal priming effects also exist. Plaster a smile on your face, you will be happier. Nod your head while listening to a message, you will tend to agree with it; shake your head, and the opposite occurs.

You can see why the common admonition to “act calm and kind regardless of how you feel” is very good advice: you are likely to be rewarded by actually feeling calm and kind.
2 more interesting examples of priming. The 1st example involves money, which is very important in our culture, but perhaps not so much so in other societies.
money primes individualism: a reluctance to be involved with others, to depend on others, or to accept demands from others.
The 2nd example is that after being asked to do something bad, people are primed to want to clean themselves.
Feeling that one’s soul is stained appears to trigger a desire to cleanse one’s body, an impulse that has been dubbed the “Lady Macbeth effect.”
Kahneman talks about cognitive ease vs cognitive strain. System 1 is usually very happy to take the easy path. Some of the ways to create cognitive ease make you almost embarrassed for our minds' foolishness. Given 2 written answers, people who don't know the correct one will choose an answer in bold font over one in normal font.

Manipulating System 1 by creating cognitive ease can create truth illusions. The role that "repeated experience" plays in creating cognitive ease is known as the exposure effect.

the mere exposure effect is actually stronger for stimuli that the individual never consciously sees.
The familiar makes us comfortable; the unusual makes us wary - the root of the urge to conservatism.
Survival prospects are poor for an animal that is not suspicious of novelty.
In summary:
good mood, intuition, creativity, gullibility, and increased reliance on System 1 form a cluster. At the other pole, sadness, vigilance, suspicion, an analytic approach, and increased effort also go together.
Further exploring System 1, Kahneman posits that it makes heavy use of norms.
We have norms for a vast number of categories, and these norms provide the background for the immediate detection of anomalies such as pregnant men and tattooed aristocrats.
System 1 also likes to believe that things happen for a reason. It creates "the perception of intentional causality". It is also "a machine for jumping to conclusions".
When uncertain, System 1 bets on an answer, and the bets are guided by experience. The rules of the betting are intelligent: recent events and the current context have the most weight in determining an interpretation.
System 1 is biased to go with the flow, to believe and confirm; doubting and unbelieving is System 2's job. But, "when System 2 is otherwise engaged, we will believe almost anything". Interesting work by psychologist Daniel Gilbert:
Gilbert proposed that understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it.
Another flaw of System 1 is confirmation bias.
The confirmatory bias of System 1 favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.
Then there is the halo effect.
The tendency to like (or dislike) everything about a person — including things you have not observed — is known as the halo effect.

...

The procedure I adopted to tame the halo effect conforms to a general principle: decorrelate error!

One very useful example of a technique to "decorrelate error" deals with conducting a meeting to discuss a topic. Kahneman recommends having everyone write down their thoughts on the subject before the meeting begins.
The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.
Finally, Kahneman introduces a concept that he returns to many times in the book: WYSIATI.
Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is.
In other words, System 1 will totally react based on the only information available, even when that information has problems. This leads to several biases which are discussed in more detail later in the book: overconfidence, framing effects, and base-rate neglect.

In the chapter "How Judgments Happen", we learn that many basic assessments have been hardwired by evolution into System 1. One example of a judgement heuristic is how we make snap judgements of people based on their faces.

In about 70% of the races for senator, congressman, and governor, the election winner was the candidate whose face had earned a higher rating of competence.

...

the effect of facial competence on voting is about three times larger for information-poor and TV-prone voters than for others who are better informed and watch less television.

One place where System 1 really falls down is in doing integration or addition.
Because System 1 represents categories by a prototype or a set of typical exemplars, it deals well with averages but poorly with sums. The size of the category, the number of instances it contains, tends to be ignored in judgments of what I will call sum-like variables.
System 1 also has a way of making WYSIATI worse:
we often compute much more than we want or need. I call this excess computation the mental shotgun.
Yet another flaw of System 1:
If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.
Hah, here's a fun fact. "Heuristic" comes from the same root as "eureka". Heuristics are what substitute the easy questions for the hard ones. Examples given are the mood heuristic for happiness, and the affect heuristic.


In Part II we look at more heuristics and learn how they become biases. For the rest of the book, one thing that becomes painfully apparent is that the human mind, particularly System 1, is incredibly bad at probability and statistics.

System 1 is inept when faced with “merely statistical” facts, which change the probability of outcomes but do not cause them to happen.
One example of how bad our intuitive judgements of statistics can be comes from scientists themselves. Kahneman and Tversky found it was not at all unusual for scientists to use sample sizes for studies that were way too small. Intuitively, the samples seemed big enough, but if you did the math, you found that random chance could easily overwhelm the statistical conclusions being drawn.
Researchers who pick too small a sample leave themselves at the mercy of sampling luck.
Kahneman refers to this as the law of small numbers.
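To see how much sampling luck a small sample exposes you to, here is a minimal simulation sketch of my own (not from the book): a population that is exactly 50/50 on some question, sampled at different sizes, counting how often the sample comes out lopsided.

```python
import random

# Minimal sketch of the "law of small numbers": small samples produce extreme
# results far more often than large ones. The population is exactly 50% "yes";
# we count how often a sample looks lopsided (>= 70% yes or <= 30% yes).

def lopsided_fraction(sample_size, trials=20_000):
    lopsided = 0
    for _ in range(trials):
        yes = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = yes / sample_size
        if rate >= 0.7 or rate <= 0.3:
            lopsided += 1
    return lopsided / trials

for n in (10, 50, 200):
    print(f"sample size {n:3d}: lopsided samples {lopsided_fraction(n):.1%}")
```

With samples of 10, roughly a third of samples look dramatically skewed even though the underlying population is perfectly balanced; by 200 the lopsided samples essentially vanish.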

[I have had personal experience with how little intuition we have about probability and randomness. 25 years ago, I wrote a complex system to assign candidates to examiners for an oral examination. My customer insisted that the assignment should be random. But random assignment produced clumps of good and bad candidates that would throw the examiners off. I finally convinced them that what they really wanted was balanced assignment, which did indeed greatly improve examiner performance.]
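For illustration only, here is a hypothetical sketch of the distinction I'm describing; the names, scores, and round-robin dealing below are invented, not the actual system I built.

```python
import random
random.seed(1)

# Hypothetical sketch: "balanced" assignment sorts candidates by a prior score
# and deals them out round-robin, so every examiner sees a similar spread of
# strong and weak candidates, instead of the clumps random assignment can produce.

def random_assignment(candidates, n_examiners):
    shuffled = candidates[:]
    random.shuffle(shuffled)
    return [shuffled[i::n_examiners] for i in range(n_examiners)]

def balanced_assignment(candidates, n_examiners):
    ranked = sorted(candidates, key=lambda c: c["score"], reverse=True)
    return [ranked[i::n_examiners] for i in range(n_examiners)]

candidates = [{"id": i, "score": random.gauss(70, 10)} for i in range(24)]

for label, assign in (("random", random_assignment), ("balanced", balanced_assignment)):
    averages = [sum(c["score"] for c in group) / len(group)
                for group in assign(candidates, n_examiners=4)]
    print(f"{label:8s} group averages:", [round(a, 1) for a in averages])
```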

Random processes produce many sequences that convince people that the process is not random after all.

...

We are far too willing to reject the belief that much of what we see in life is random.

Kahneman expands on the law of small numbers.
  • The exaggerated faith in small samples is only one example of a more general illusion — we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify. Jumping to conclusions is a safer sport in the world of our imagination than it is in reality.
  • Statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.
Another almost embarrassing bias is the anchoring effect.
It occurs when people consider a particular value for an unknown quantity before estimating that quantity.
It is a flavor of a priming effect.
If you are asked whether Gandhi was more than 114 years old when he died you will end up with a much higher estimate of his age at death than you would if the anchoring question referred to death at 35.
Kahneman for the 1st time gives advice on how to combat this bias: get System 2 activated by actively pursuing arguments against the anchor value.

Next we have the availability heuristic.

We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.”

...

The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors.

We now learn more about the affect heuristic.
people make judgments and decisions by consulting their emotions: Do I like it? Do I hate it? How strongly do I feel about it?
The availability heuristic combines with the affect heuristic to produce an availability cascade.
the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.
I think we here in the US are now experiencing a potentially disastrous availability cascade: the presidential campaign of Donald Trump. The 1000s of hours of press coverage he has received have everyone's minds primed for more Trump news.

Another place where our ineptitude with probability shows is in our assessment of risks, referred to as probability neglect.

a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight — nothing in between.
A particular flavor of our poor statistical intuition is base-rate neglect. In an experiment in the early 70s, Kahneman and Tversky found that, if asked what field an individual with a stereotypical geek description was likely to be studying, subjects would base their answer solely on the description, and ignore the statistics of what majors are most common. The "base-rate" percentage that should be the starting point for an estimate of likelihood was "neglected" completely. The representativeness heuristic was used instead - the geek description was just too good a match for System 1 to ignore. If the subjects were asked to frown, which engages System 2, they "did show some sensitivity to the base rates."
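As a worked illustration of what taking the base rate seriously looks like, here is a minimal Bayes' rule sketch. The 3% base rate and the likelihoods are invented numbers for illustration, not figures from the study.

```python
# Minimal Bayes' rule sketch with assumed numbers: even if the "geeky" description
# is 4x as likely for a computer science student as for anyone else, a 3% base
# rate keeps the posterior probability modest.

base_rate_cs = 0.03                      # P(CS)              -- assumed
p_desc_given_cs = 0.80                   # P(description | CS) -- assumed
p_desc_given_not = 0.20                  # P(description | not CS) -- assumed

p_desc = (p_desc_given_cs * base_rate_cs
          + p_desc_given_not * (1 - base_rate_cs))
p_cs_given_desc = p_desc_given_cs * base_rate_cs / p_desc

print(f"P(CS | geeky description) = {p_cs_given_desc:.1%}")   # ~11%, not "almost certainly CS"
```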

Another way to combat this bias, and 2 points to remember:

instructing people to “think like a statistician” enhanced the use of base-rate information, while the instruction to “think like a clinician” had the opposite effect.

...

[1.] base rates matter, even in the presence of evidence about the case at hand

...

[2.] intuitive impressions of the diagnosticity of evidence are often exaggerated

The fact that one has to come up with ways to try to engage System 2 is an example of the laziness of System 2.

I think I had heard of this one before, which goes beyond bias to just plain wrong thinking: the conjunction fallacy

which people commit when they judge a conjunction of two events (here, bank teller and feminist) to be more probable than one of the events (bank teller) in a direct comparison.
So if event A by itself has some probability of being true, the probability of event A and a 2nd event B both being true has to be less than or equal to the probability of event A by itself. But if event B helps tell a story, could be causally related to event A, or otherwise makes System 1 happy, System 1 completely ignores this logic.
The uncritical substitution of plausibility for probability has pernicious effects on judgments when scenarios are used as tools of forecasting.
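To make the conjunction rule concrete, here is a minimal sketch with invented counts for the famous "Linda" problem: feminist bank tellers are a subset of bank tellers, so the joint probability can never exceed the single probability, no matter how compelling the story.

```python
# Minimal sketch with invented counts: the conjunction can only be as likely
# as, or less likely than, either of its parts.

population = 10_000
bank_tellers = 150                 # hypothetical count
feminist_bank_tellers = 30         # necessarily a subset of bank_tellers

p_teller = bank_tellers / population
p_teller_and_feminist = feminist_bank_tellers / population

print(f"P(bank teller)              = {p_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")
assert p_teller_and_feminist <= p_teller
```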
I found this next statistical bias to be fascinating. The statistical concept of regression to the mean almost seems a little spooky. It simply says that, for values that cluster around an average, a high value will likely be followed by a lower value, and vice versa. System 1 completely ignores this law, preferring its many heuristics instead. Ha ha, I believe this anecdote about trials.
the statistician David Freedman used to say that if the topic of regression comes up in a criminal or civil trial, the side that must explain regression to the jury will lose the case. Why is it so hard? The main reason for the difficulty is a recurrent theme of this book: our mind is strongly biased toward causal explanations and does not deal well with “mere statistics.” When our attention is called to an event, associative memory will look for its cause — more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.
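Here is a minimal simulation of my own (not from the book) showing regression to the mean arising from nothing but skill plus luck, with no cause required: the top performers on test 1 were on average also lucky, and their luck does not carry over to test 2.

```python
import random
random.seed(0)

# Performance = skill + luck. The top decile on test 1 "regresses" toward the
# mean on test 2 simply because their test-1 luck does not repeat.

skills = [random.gauss(0, 1) for _ in range(10_000)]
test1 = [s + random.gauss(0, 1) for s in skills]
test2 = [s + random.gauss(0, 1) for s in skills]

top_decile = sorted(range(len(test1)), key=lambda i: test1[i], reverse=True)[:1000]
avg1 = sum(test1[i] for i in top_decile) / len(top_decile)
avg2 = sum(test2[i] for i in top_decile) / len(top_decile)

print(f"top decile on test 1: avg {avg1:.2f} on test 1, avg {avg2:.2f} on test 2")
```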
Knowing how our minds mishandle estimates by ignoring regression to the mean gives System 2 a tool to do a better job. But
Following our intuitions is more natural, and somehow more pleasant, than acting against them.

...

Furthermore, you should know that correcting your intuitions may complicate your life. A characteristic of unbiased predictions is that they permit the prediction of rare or extreme events only when the information is very good.

So, despite all of the faults of System 1, it still does some pretty amazing work.


Part III is titled "Overconfidence". We put much more faith in our intuitions via System 1 than we should. But we would probably find life terrifying without some overconfidence. However, there are several places in our society where this overconfidence is passed off as skill or wisdom.

1st we meet, via Nassim Taleb and "The Black Swan", the narrative fallacy

to describe how flawed stories of the past shape our views of the world and our expectations for the future.
We are always looking for the narrative, for the story that makes sense of what is happening. Sometimes this leads us astray. Sometimes there really is no narrative.
The ultimate test of an explanation is whether it would have made the event predictable in advance.

...

The human mind does not deal well with nonevents.

...

Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle.

Our minds try to keep the story going in the past, as well as the present and the future. Again, we all overestimate how good we are at this.
Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events. Baruch Fischhoff first demonstrated this “I-knew-it-all-along” effect, or hindsight bias
Everyone's 20/20 hindsight leads to outcome bias.
Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.
Kahneman does not seem to be a fan of business books.
The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can predict and control the future. These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. We all have a need for the reassuring message that actions have appropriate consequences, and that success will reward wisdom and courage. Many business books are tailor-made to satisfy this need.

...

Consumers have a hunger for a clear message about the determinants of success and failure in business, and they need stories that offer a sense of understanding, however illusory.

...

And even if you had perfect foreknowledge that a CEO has brilliant vision and extraordinary competence, you still would be unable to predict how the company will perform with much better accuracy than the flip of a coin.

Kahneman recounts how he came up with a rating methodology for army trainees. He felt good about it, it seemed logically consistent - and had pretty much 0 predictive value. But when the next group of trainees came through and he applied his methodology again, he felt just as good about it as before, even though he knew that statistically it had been proven worthless. He named this the illusion of validity.

He moves on to look at the predictive powers of stock market workers. As much as he is not a fan of business book authors, he is even less a fan of stock brokers.

my questions about the stock market have hardened into a larger puzzle: a major industry appears to be built largely on an illusion of skill.

...

on average, the most active traders had the poorest results, while the investors who traded the least earned the highest returns.

...

men acted on their useless ideas significantly more often than women, and that as a result women achieved better investment results than men.

...

the key question is whether the information about the firm is already incorporated in the price of its stock. Traders apparently lack the skill to answer this crucial question, but they appear to be ignorant of their ignorance.

Next up for skewering is the pundit class, be they political or economic. Here are the results of a study of 284 pundits, asked to pick 1 of 3 possibilities for several near-future political events.
The experts performed worse than they would have if they had simply assigned equal probabilities to each of the three potential outcomes.

...

Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.

We now meet Paul Meehl.
Meehl reviewed the results of 20 studies that had analyzed whether clinical predictions based on the subjective impressions of trained professionals were more accurate than statistical predictions made by combining a few scores or ratings according to a rule.

...

The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented.

...

In every case, the accuracy of experts was matched or exceeded by a simple algorithm.

I am totally onboard with these conclusions. I have been saying for decades how I would trust an AI to diagnose an illness more than a human physician. And I believe that studies in medical areas have shown that far better results are obtained via working from a checklist rather than relying on a physician's intuition. The TV show "House" always cracked me up. Every episode the brilliant clinician House went through 5-10 wrong diagnoses before finally stumbling onto the right one.

Why are experts inferior to algorithms? One reason, which Meehl suspected, is that experts try to be clever, think outside the box, and consider complex combinations of features in making their predictions. Complexity may work in the odd case, but more often than not it reduces validity. Simple combinations of features are better.
Even when using formulas, simpler is better. From the work of Robyn Dawes:
The dominant statistical practice in the social sciences is to assign weights to the different predictors by following an algorithm, called multiple regression, that is now built into conventional software. The logic of multiple regression is unassailable: it finds the optimal formula for putting together a weighted combination of the predictors. However, Dawes observed that the complex statistical algorithm adds little or no value.

...

formulas that assign equal weights to all the predictors are often superior, because they are not affected by accidents of sampling.
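Dawes's observation is easy to reproduce in a toy simulation. This is my own sketch, not code from the book or from Dawes: in a case where every predictor genuinely matters about equally, regression weights fit on a small sample chase sampling noise, while simple unit weights hold up out of sample.

```python
import numpy as np

# Toy comparison: multiple-regression weights fit on a small sample vs. equal
# ("unit") weights, evaluated by out-of-sample correlation with the outcome.
# Assumes all predictors truly matter equally, which favors neither method a priori.

rng = np.random.default_rng(0)

def simulate(n_train=20, n_test=10_000, n_pred=5, noise=2.0):
    true_w = np.ones(n_pred)                      # every predictor genuinely matters
    X_tr = rng.standard_normal((n_train, n_pred))
    y_tr = X_tr @ true_w + noise * rng.standard_normal(n_train)
    X_te = rng.standard_normal((n_test, n_pred))
    y_te = X_te @ true_w + noise * rng.standard_normal(n_test)

    w_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)   # regression weights
    w_eq = np.ones(n_pred)                                 # equal weights

    corr = lambda w: np.corrcoef(X_te @ w, y_te)[0, 1]
    return corr(w_ols), corr(w_eq)

ols_r, eq_r = np.mean([simulate() for _ in range(200)], axis=0)
print(f"out-of-sample correlation, regression weights: {ols_r:.3f}")
print(f"out-of-sample correlation, equal weights:      {eq_r:.3f}")
```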

But, most people still prefer experts to algorithms. I think this will change, as we realize that our new robot overlords only want the best for us.
The story of a child dying because an algorithm made a mistake is more poignant than the story of the same tragedy occurring as a result of human error, and the difference in emotional intensity is readily translated into a moral preference.
I like that, after showing us the shortcomings of intuition, Kahneman gives us techniques for harnessing our intuition, by combining it with the hard data.
intuition adds value even in the justly derided selection interview, but only after a disciplined collection of objective information and disciplined scoring of separate traits.
So, can we ever trust the intuitions of a skilled expert? Kahneman says yes, depending on the skill.
The answer comes from the two basic conditions for acquiring a skill:
  • an environment that is sufficiently regular to be predictable
  • an opportunity to learn these regularities through prolonged practice
When both these conditions are satisfied, intuitions are likely to be skilled.
But we should also note the converse of this answer:
intuition cannot be trusted in the absence of stable regularities in the environment.

...

Whether professionals have a chance to develop intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.

Kahneman recounts another personal anecdote introducing the outside view vs the inside view. The outside or objective view of, say, a project's success can differ significantly from the inside or subjective view of those involved.
Amos and I coined the term planning fallacy to describe plans and forecasts that
  • are unrealistically close to best-case scenarios
  • could be improved by consulting the statistics of similar cases
"Everybody else failed at this, but we're going to make it work! This time for sure!" But quite often we don't even bother to look at what everybody else has done.
people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.
The treatment for the planning fallacy is called reference class forecasting.

The optimistic bias, which Kahneman says "may well be the most significant of the cognitive biases", is part of what creates the planning fallacy. Chapter 24 is titled "The Engine of Capitalism" - probably everyone who creates a startup or small business does so with help from the optimistic bias.

The optimistic risk taking of entrepreneurs surely contributes to the economic dynamism of a capitalistic society, even if most risk takers end up disappointed.
So where do entrepreneurs go wrong? What cognitive biases undermine their efforts?
  • We focus on our goal, anchor on our plan, and neglect relevant base rates, exposing ourselves to the planning fallacy.
  • We focus on what we want to do and can do, neglecting the plans and skills of others.
  • Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck. We are therefore prone to an illusion of control.
  • We focus on what we know and neglect what we do not know, which makes us overly confident in our beliefs.
Also contributing is the above-average effect:
"90% of drivers believe they are better than average"

...

people tend to be overly optimistic about their relative standing on any activity in which they do moderately well.

Here's a good line about one of those skilled experts we've talked about which I hadn't heard before:
President Truman famously asked for a “one-armed economist” who would take a clear stand; he was sick and tired of economists who kept saying, “On the other hand…”
Some more examples of optimistic bias:
inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid.

...

“clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.”

Kahneman again provides us with a tool to use against these biases. In this case, it is the premortem, from Kahneman's "adversarial collaborator" Gary Klein:
“Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”


Part IV begins with being reintroduced to Richard Thaler's Humans vs Econs. The definition of an Econ, from Bruno Frey:

“The agent of economic theory is rational, selfish, and his tastes do not change.”
Meanwhile we all know that Humans are nothing like Econs.
To a psychologist, it is self-evident that people are neither fully rational nor completely selfish, and that their tastes are anything but stable.
In this part we start to get overlap with Thaler's "Misbehaving" book blogged here. I will move quickly through the overlapping material.

The rational agent model is founded on expected utility theory. Kahneman and Tversky expanded on expected utility theory with their prospect theory, which occupied most of Chapter 5 of Thaler's "Misbehaving". The roots of these theories go back to Daniel Bernoulli in the 18th century.

Kahneman refreshes prospect theory with the ideas of this book.

it’s clear now that there are three cognitive features at the heart of prospect theory. They play an essential role in the evaluation of financial outcomes and are common to many automatic processes of perception, judgment, and emotion. They should be seen as operating characteristics of System 1.
  • Evaluation is relative to a neutral reference point, which is sometimes referred to as an “adaptation level.” ...
  • A principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes of wealth. ...
  • The third principle is loss aversion. ... The "loss aversion ratio" has been estimated in several experiments and is usually in the range of 1.5 to 2.5.
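For reference, here is a minimal sketch of the standard prospect-theory value function. The curvature parameters and the loss-aversion coefficient below are commonly cited textbook estimates, not numbers taken from this book.

```python
# Minimal sketch of the prospect-theory value function: reference-point
# dependence, diminishing sensitivity, and loss aversion. lambda is the
# loss-aversion ratio Kahneman cites as typically between about 1.5 and 2.5.

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain/loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha                 # diminishing sensitivity for gains
    return -lam * ((-x) ** beta)          # losses loom larger, also with diminishing sensitivity

for amount in (100, -100, 1000, -1000):
    print(f"{amount:+6d} -> {prospect_value(amount):+8.1f}")
```

Running this shows a loss of 100 feeling roughly twice as bad as a gain of 100 feels good, which is the asymmetry behind most of what follows.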
We revisit the endowment effect, which was chapter 2 of "Misbehaving". I characterized it there as "a bird in the hand is worth 2 in the bush".

We learn more about loss aversion. In "Misbehaving" this is called "risk aversion". This is one of those things that evolution as "survival of the fittest" explains perfectly.

The brains of humans and other animals contain a mechanism that is designed to give priority to bad news. By shaving a few hundredths of a second from the time needed to detect a predator, this circuit improves the animal’s odds of living long enough to reproduce. The automatic operations of System 1 reflect this evolutionary history. No comparably rapid mechanism for recognizing good news has been detected. Of course, we and our animal cousins are quickly alerted to signs of opportunities to mate or to feed, and advertisers design billboards accordingly. Still, threats are privileged above opportunities, as they should be.
Kahneman agrees that this is the root of the impulse to conservatism.
Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.
Loss aversion shows up in law and business as findings that people are most incensed when companies "break informal contracts with workers or customers". I know I have personally been annoyed when a software product drops support for a feature I liked.

Loss aversion interacts with our poor probability judgement skills, and seems to only be overcome when we are desperate. We seem to be pretty good at understanding "nothing", and also "all", as in "all or nothing". But add a small possibility to "nothing", and we get the possibility effect, which "causes highly unlikely outcomes to be weighted disproportionately more than they 'deserve.'" The same thing happens at the "all" end via the certainty effect.

Outcomes that are almost certain are given less weight than their probability justifies.
So the expectation principle, which goes back to Bernoulli, "by which values are weighted by their probability, is poor psychology."

Quantifying these effects, Kahneman and Tversky "carried out a study in which we measured the decision weights that explained people's preferences for gambles with modest monetary stakes." Here are their numbers:

Next, we get the fourfold pattern, shown below.

The top left cell was discussed by Bernoulli:

people are averse to risk when they consider prospects with a substantial chance to achieve a large gain. They are willing to accept less than the expected value of a gamble to lock in a sure gain.

... the bottom left cell explains why lotteries are popular.

... The bottom right cell is where insurance is bought.

The top right cell they found surprising, and a source of new insights.
we were just as risk seeking in the domain of losses as we were risk averse in the domain of gains.

...

Many unfortunate human situations unfold in the top right cell. This is where people who face very bad options take desperate gambles, accepting a high probability of making things worse in exchange for a small hope of avoiding a large loss. Risk taking of this kind often turns manageable failures into disasters.

Despite its evolutionary value, loss aversion now works against us.
[this is] how terrorism works and why it is so effective: it induces an availability cascade.
Another cognitive bias is referred to as denominator neglect. Given 1 winning chance in 10 or 8 in 100, most people will pick 8 in 100 - more chances to win! They ignore the denominator, which gives the 2nd choice an 8% probability of success vs 10% for the 1st choice.

Here's another one that is almost hard to believe, but it has been borne out by studies. People will be more impacted by "10 out of 100" than by "10%". We can add percentages to probability and statistics as things that our mind, particularly System 1, does not handle well.

The discrepancy between probability and decision weight shown above in Table 4 is characterized as

  • People overestimate the probabilities of unlikely events.
  • People overweight unlikely events in their decisions.
This gives us 2 more cognitive biases, overestimation and overweighting.
The successful execution of a plan is specific and easy to imagine when one tries to forecast the outcome of a project. In contrast, the alternative of failure is diffuse, because there are innumerable ways for things to go wrong. Entrepreneurs and the investors who evaluate their prospects are prone both to overestimate their chances and to overweight their estimates.

...

Ralph Hertwig and Ido Erev note that "chances of rare events (such as the burst of housing bubbles) receive less impact than they deserve according to their objective probabilities."

The discussion of rare events concludes with a scary sentence, particularly when you think about how successful we will be in addressing the climate crisis.
Obsessive concerns ..., vivid images ..., concrete representations ..., and explicit reminders ... all contribute to overweighting. And when there is no overweighting, there will be neglect. When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.
We are given more evidence of the mind's lack of skill at integrating or summing in a discussion of narrow framing - making lots of small decisions - vs broad framing - coming up with a comprehensive plan. Econs would use broad framing most of the time, but real Humans will almost always use narrow framing.

Kahneman gives us another tool to use against this bias: risk policies.

Decision makers who are prone to narrow framing construct a preference every time they face a risky choice. They would do better by having a risk policy that they routinely apply whenever a relevant problem arises.
Next we have a discussion of mental accounts - these were discussed in Chapter 11 of "Misbehaving".
mental accounts are a form of narrow framing; they keep things under control and manageable by a finite mind.
Our mental accounts, where we track the value and cost of various aspects of our life, are of course prone to cognitive biases.
finance research has documented a massive preference for selling winners rather than losers — a bias that has been given an opaque label: the disposition effect ... - an instance of narrow framing.

So you sell a winner, you feel good about yourself - I'm a winner! But when you sell a loser, you are admitting a failure and taking a realized loss right then and there. Rather than doing that, most people will keep the loser - an instance of the sunk-cost fallacy, discussed in Chapter 8 of "Misbehaving". Speaking of a corporate manager who refuses to give up on a losing project, Kahneman states:

In the presence of sunk costs, the manager’s incentives are misaligned with the objectives of the firm and its shareholders, a familiar type of what is known as the agency problem.
On a different note, Kahneman discusses the emotion of regret:
The fear of regret is a factor in many of the decisions that people make ...
Kahneman references the claims of Daniel Gilbert on regret - more advice for us in dealing with our biases.
you should not put too much weight on regret; even if you have some, it will hurt less than you now think.
2 more biases. The 1st deals with "sins of commission" vs "sins of omission". Logically, if the (bad) outcome is the same, what difference does it make if it came about through action rather than inaction? Maybe it gets back to the "first do no harm" principle of medicine, which is yet another flavor of loss aversion.
people expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction.
The 2nd bias is the "taboo tradeoff against accepting any increase in risk". Particularly where children are involved, we don't do a good job evaluating how to address risk in the face of limited resources. We refuse any increase in risk.

Preference reversals are discussed in chapter 6 of "Misbehaving". People will reverse their choices illogically depending on how they are framed. It frequently shows up when options are weighed separately rather than together. Christopher Hsee addressed this with his evaluability hypothesis. Kahneman reminds us again that

rationality is generally served by broader and more comprehensive frames.

...

Broader frames and inclusive accounts generally lead to more rational decisions.


Part V is titled "Two Selves". We are introduced to another dichotomy to go along with System 1/System 2 and Econs/Humans: experienced utility and the experiencing self vs decision utility and the remembering self.

The experiencing self is the one that answers the question: “Does it hurt now?” The remembering self is the one that answers the question: “How was it, on the whole?”

...

Confusing experience with the memory of it is a compelling cognitive illusion

...

What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.

Kahneman describes an experiment that produced results that seem to totally fly in the face of common sense. Subjects underwent 2 slightly painful experiences: having a hand placed in cold water for 60 seconds; and having a hand placed in cold water for 60 seconds, followed by slightly warming the water for 30 seconds. When subjects were then asked which experience they would choose for a repeat performance, they chose the 2nd! They chose more pain rather than less! This shows the dominance of the remembering self, and yet another instance of our minds' inability to integrate or sum: "System 1 represents sets by averages, norms, and prototypes, not by sums." We get 2 more components of our mental makeup:
  • Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end.
  • Duration neglect: The duration of the procedure had no effect whatsoever on the ratings of total pain.
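Here is a minimal sketch with invented pain numbers showing how the peak-end rule plus duration neglect can rank the longer, objectively worse episode as the better memory.

```python
# Invented numbers: episode B repeats episode A and adds 30 extra seconds of
# milder pain, so it contains strictly more total pain, yet its peak-end score
# (the score the remembering self appears to use) is lower.

def peak_end(pain_per_second):
    return (max(pain_per_second) + pain_per_second[-1]) / 2

episode_a = [8] * 60                 # 60 s at pain level 8
episode_b = [8] * 60 + [4] * 30      # same 60 s, plus 30 s at level 4

for name, ep in (("A", episode_a), ("B", episode_b)):
    print(f"episode {name}: total pain = {sum(ep):4d}, peak-end score = {peak_end(ep):.1f}")
```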
Interestingly, these 2 components also work the same for life as a whole.
A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character.

...

In intuitive evaluation of entire lives as well as brief episodes, peaks and ends matter but duration does not.

I'm not sure I agree with Kahneman's last conclusion on this subject. These 2 selves could maybe be characterized by the Zen self vs the non-Zen self. Maybe a Zen master who can really live totally in the here and now contradicts this conclusion. But, I have no knowledge that such a Zen master exists.
Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.
Talking about one's experience of life as a whole, Kahneman introduces us to Mihaly Csikszentmihalyi and his concept of flow - a state that some artists experience in their creative moments and that many other people achieve when enthralled by a film, a book, or a crossword puzzle. Hah, I knew I recognized that name: I mention having read his book "Creativity" in my 3rd blog post, May 6, 2003. Kahneman mentions the studies on overall well-being vs monetary status.
The satiation level beyond which experienced well-being no longer increases was a household income of about $75,000 in high-cost areas (it could be less in areas where the cost of living is lower). The average increase of experienced well-being associated with incomes beyond that level was precisely zero.
My wife doesn't read this blog, so I think it's safe to include this here.

the decision to get married reflects, for many people, a massive error of affective forecasting.

...

Daniel Gilbert and Timothy Wilson introduced the word miswanting to describe bad choices that arise from errors of affective forecasting.

Given what we have learned about our minds' inability to sum over time, it is not surprising that we are not very good at determining what will make us happy, past, present, or future. When people are asked how life is going now, the mood heuristic is ready to jump in and report whatever their current mood is.

We now get our final cognitive bias:

the focusing illusion, which can be described in a single sentence:
Nothing in life is as important as you think it is when you are thinking about it.
...

The essence of the focusing illusion is WYSIATI, giving too much weight to the climate [of CA vs the midwest, in an experiment], too little to all the other determinants of well-being.

...

The mistake that people make in the focusing illusion involves attention to selected moments and neglect of what happens at other times. The mind is good with stories, but it does not appear to be well designed for the processing of time.

One final excerpt that I thought was interesting, on how we acclimate to novelty:
over time, with few exceptions, attention is withdrawn from a new situation as it becomes more familiar. The main exceptions are chronic pain, constant exposure to loud noise, and severe depression.


This was a very enjoyable and informative book. System 1, with its heuristic-based intuitions providing us with the snap judgements that help keep us alive, is a marvel of evolution. Of course, every part of life is a "marvel of evolution". But, we now have a laundry list of things System 1 is NOT good at: percentages, probability, statistics, integration, summation, broad framing.

With regard to economics, we again get the message that "Misbehaving" delivered: that Econs are a complete and utter fiction. I think in "Misbehaving" a statement is made to the effect that "Yeah, there are no Econs, but without them we couldn't have developed economics". So Econs are a simplifying assumption. I continue to have the nagging thought that it is too big of a simplification, such that I question the entire framework of economics.

One of the "killer apps" for economics is predicting boom and bust cycles. I was around for the dotcom bubble in the late 90s, the housing bubble of the mid 00s, and the Web 2.0 (social media) bubble, which is still active. In all 3 of these, I observed the same characteristics. Here are examples of people's (including my) thinking during these bubbles:

  • This is a sure thing.
  • Everybody else is making big $$$.
  • I'm going to miss out.
  • I better pile on.
I think all of these are the indicators of herd mentality. I have yet to come across attempts to incorporate this mammalian (?) behavior into economics. (Hmmm, the wikipedia article says "herd behavior is closely studied by behavioral finance experts in order to help predict future economic crises". So, never mind? No, it is a slim article, the above quote references a single paper, and there has not been a book written on the topic in 100 years).

Another place where I think herd mentality could be the secret sauce is predicting demand. Fashion, fads, and "keeping up with the Joneses", expertly manipulated by the marketing machine, all incorporate herd behavior.

It's funny, my hero (before he became such a Hillary shill) Paul Krugman says he wanted to work on psychohistory like Hari Seldon in Asimov's "Foundation" novels, and economics is what is closest. But psychohistory was all about predicting herd behavior. So where is it really, in modern economics? I guess I'll have to check out that paper.

A final personal note, I find it interesting that I used to do a lot more reading in cognitive science, and found cognitive illusions particularly interesting. I start studying (slowly) economics, and find, in behavioral economics, the integration of those concepts. In another life behavioral economics could be a good field for me.