Friday, July 3, 2015

Our "Life is a Lottery" Culture

One of the things President Obama is famous for is dissing the successful as "life's lottery winners." This parallels his charge that successful businesses don't deserve the credit for their success, saying "You didn't build that." This is politically shrewd, because it taps into two basic weaknesses of human nature: envy and greed. His philosophy, and that of many liberals in general, conveys the message that voters are justified in their envy of success and wealth. Voters are encouraged to demand income redistribution because the wealth of others was not earned.
A fairness doctrine dominates the ethos of liberals and lies at the heart of their increasingly successful efforts to punish success through government regulation and taxation policy. If success is due to circumstance and luck, then it is only fair to push for more equal outcomes in people's efforts to succeed. Voters, perhaps now a majority of them, like the idea of leveling the playing field by taking from the successful and giving to those who have not succeeded.
At the other end of the spectrum, the lottery mentality offers an excuse for those whose lives are mostly marked by failures. Their problem is bad luck, or exploitation by the successful. This is another example of unfairness, perhaps even of the mendacity of those who have won life's lottery. No longer do we have to blame failure on bad choices and decisions or on laziness. Poverty or antisocial behavior can't be blamed on freely chosen actions. Those who fail can find scapegoats and blame them.
Politically, liberal politicians have found the guarantee of success: convince people they are victims and pledge to punish the perpetrators and restore fairness. As Jeff Bergner put it in his essay, "One cannot build a cult of victimhood on the soil of personal responsibility."
It is not just craven politicians who reject personal responsibility. Many brain scientists have arrived at the same conclusion from highly flawed experiments that seem to show that a person makes choices and decisions well before the moment the person claims to have made them. This research has been interpreted to mean that the unconscious mind makes the decision, which only later becomes known consciously. Thus, the decision cannot be made freely. It is driven unconsciously by our genes and past learning experiences.
Thus, scientists provide support for redistributionist and retributionist social policies, because life success is not based on merit. We now have defense lawyers arguing "diminished capacity" for criminal clients. We even have clerics who argue from original sin and predestination perspectives that we can't help our bad morals.
I am writing a book on "Making a Scientific Case for Free Will." So far, despite my track record of science-publishing success, no publisher will touch it. Has free will become politically incorrect?
So why try? If you succeed, you won't get credit. You will get diminished, perhaps even punished. If you fail, you won't get blame. In fact, you will get compensated. This is what life in America is coming to in this new age of "life's lottery."


Source for quote: Bergner, Jeff. 2015. The fame of life. The Weekly Standard, June 29, pp. 25-26.

Source for critique of free-will research: Klemm, W. R. 2010. Free will debates: simple experiments are not so simple. Advances in Cognitive Psychology 6: 47-65.

Thursday, June 25, 2015

Where We Place Blame Depends on What Is True

When we do something that others say is wrong, do we blame ourselves and repent? That depends on whether we agree it was wrong. In other words, the issue becomes one of what is a fact or true. Was what we did really wrong, or just some busybody's opinion?
Honoring truth is a value. People are taught values, and how and what they are taught has an especially strong impact on children. Both children and adults are prone to fuzzy thinking in general, but this becomes especially problematic when it comes to the pursuit of honorable behavior. In a New York Times piece, Justin McBrayer, a college philosophy professor, explains how fuzzy thinking about morals is leading children to think there are no moral facts, just moral opinions, which they are free to accept or reject without blame. He notices that this moral ambiguity seems to be increasing among college students. He also observed that the issue was captured in a sign in his child's secondary school, which defined fact and opinion as follows:

Fact: Something that is true about a subject and can be tested or proven.
Opinion: What someone thinks, feels, or believes.

When McBrayer Googled definitions on the web, he found them all to be similar to the sign in the school. The implications of such definitions are quite damaging for honorable behavior. They breed a situational-ethics mentality in which opinions need not be based on fact or truth, and thus opinions are fungible. Such definitions mean that there are no moral truths, because none of them can be proved to be true. Moral claims are believed to be mere opinions. For example, murder is therefore not immoral, just illegal, because a majority of people held the opinion that it was wrong and passed a law outlawing it. No one person's opinions are any more valid than anybody else's. These definitions lead to a moral ambiguity that is systematically encouraged by Common Core standards, which include the deceptively innocuous requirement that children learn to "distinguish fact, opinion, and reasoned judgment in a text." Links to the flawed definitions occur in lesson plans and quizzes.
The reality is that we think opinions are wrong if we do not hold them. Often, our attachment to our own opinions is welded by self-interest and emotion rather than by reason and evidence. What reason and evidence we employ is used to argue for an existing opinion rather than to evaluate alternative opinions. But if one uses reason and evidence to examine a range of opinions, the opinion we finally accept as true is more likely to be true. We excuse ourselves from thinking hard to seek absolute truth if we accept the claim that no moral truths are absolutely truer than others. How convenient. Now we don't have to take blame for what otherwise would be moral failures. This could have been the rationale for the statement of Jesus, "Ye shall know the truth, and the truth shall make you free," and for the paraphrase of several statements that are equivalent to "Seek ye the truth and you shall find it."

The pedagogical challenge is for schools to place more emphasis on evidence-based thinking. Too many teachers impose their own opinions as authoritative and true without compelling supporting evidence. This conveys the message that it is o.k. for students to do likewise with their own opinions. Science courses have special value because they require students to consider and value evidence for conclusions about the nature of the physical world. That is the mindset students should use for conclusions outside the realm of science.
The problem is not just limited to uneducated kids. McBrayer sees moral relativism all the time among colleagues in academia. Just what is the problem with these definitions and this mindset? For one, truth does not have to have proof in order to be true. Some things can be true even though proof is not yet available. For example, the theory of evolution has overwhelming supporting evidence, but many people hold the opinion that it is not true because it cannot be definitively proved. McBrayer points out that some things that have been "proved" turn out later to be wrong. Proof is a feature of our mental life, and if proof is required for facts, then facts become person-relative. You can have your truth, and I can have mine. How then do we refute the counter-argument that says, "You are entitled to your opinion, but not to your own facts"?
The second flaw in the definitions is that students are being taught that claims are either facts or opinions: they can't be both. Common Core quizzes for example require students to sort claims into one or the other category. There are demonstrable facts that certain people steadfastly refuse to believe, as well as beliefs about certain "truths" that are manifestly not based on evidence or fact. The main point, however, is the reality that a fact can be true and believed at the same time.
Unfortunately in school curricula, students are taught that value claims are opinions, neither true nor based on fact. In an online fact vs. opinion student worksheet, McBrayer found that children were expected to classify the following behaviors as mere opinions:

  • Copying homework assignments is wrong.
  • Cursing in school is inappropriate.
  • All men are created equal.
  • It is wrong for people under age 21 to drink alcohol.
  • Vegetarians are healthier than people who eat meat.
  • Drug dealers belong in prison.

Our culture may be producing a whole generation that thinks there are no moral facts and thus no world view about honor can be true. Thus, no one can be blamed for violating moral values. We are left with the unavoidable problem, however, that adult life presents us with moral dilemmas wherein we must acknowledge certain moral values as facts. How, for example, can we be outraged when rioters destroy the property of innocents if this is not viewed as a moral fault, as dishonorable behavior? Or with McBrayer's example, how can we be outraged at the murder of cartoonists, if such murder is morally neutral? Indeed, we can rationalize it as o.k., because the cartoonist was extremely offensive. To protect people from crimes against humanity, we must acknowledge the reality that certain moral truths are indeed facts. As a society, we are challenged to think through the evidence that supports each of many competing moral claims to determine which claims are true. We abdicate that responsibility by believing that nothing is true that has not been proved. Unfortunately, it is easier to abdicate moral commitments than to live an honorable life.
Even when we acknowledge that certain things are right and others wrong, we seem to be living in a devolving culture where blame is something you place on others or on uncontrollable outside forces. How long will it be until blame is no longer politically correct, where we can't hold anybody responsible for anything? Maybe that time is coming soon, as witnessed by the numerous recent scandals and failures in government agencies where nobody is held accountable.
There is much more to be said about honesty, and I am working on a book about truth and falsity. In the meanwhile, I recommend the book by Dan Ariely, "The (Honest) Truth About Dishonesty."

For more about Dr. Klemm's writings, see his web site and his blog on Improve Learning and Memory.


McBrayer, Justin P. (2015). Why our children don't think there are moral facts. New York Times. March 1. Accessed June 25, 2015.

Saturday, February 28, 2015

Blaming the Victim

"What did we do to make them hate us so much?" America and Western civilization in general are under attack by Islamic terrorists. Apologists assert that we brought this wrath upon ourselves by our historical sins (like the Crusades) and therefore we must not only apologize but also make excuses for terrorism and take blame for their dysfunctional governments (created by colonialism) and their poverty. So what if our people get berated, bombed, and beheaded. We had it coming.

Yet such self-hatred fails to recognize that the Crusades were conducted to take back the Western countries and Christian holy sites that Muslims had conquered. Colonialism was a problem, but America had no colonies in the Middle East or Africa. Dysfunctional governments seem to be the rule in Muslim countries, but we don't make the rules there. Muslims do.

Moreover, how do sins of the past equate to sins of the present? The Christian church has gone through many epochs of reform. Radical Islam does not seek to reform itself. It harkens back to Islam's original and literal scripture in the seventh century.


"So what that we don't have immigration papers? Our country is poor, yours is rich, and we have a right to violate your laws because you stole our country and made it poor." Norteamericanos are blamed for banana republics and historical government dysfunction in Latin America. Today, open-border activists argue that the U.S. deserves the chaos of illegal immigration because of our misbegotten riches, historically stolen from Latin American countries, which now have a right to seize our entitlements and jobs from our own poor.

Only Mexicans can claim we took part of their country, but it wasn't originally their country. It had belonged to Spain before Mexico rebelled against Spanish rule. Texas in 1836 was a province of Mexico, and Texans seceded because of mistreatment and restricted political rights by the Mexican government. Mexico refused to grant Texans their freedom and independence, apparently unwilling to honor the wish of Texans to do the same thing that Mexico had done to Spain little more than a decade earlier. California, Arizona, and New Mexico territory were purchased in 1848 from Mexico with cash and forgiveness of debt owed to Americans.

It is not clear what natural resources we stole from Central and South America. On the contrary, we created NAFTA trade agreements that help their economies. We transferred to Venezuela and Mexico the oil technologies that help to prop up their economies. And the countries that are complaining about our not taking their emigrants have much stricter immigration restrictions than we do. Talk about hypocrisy!

"Your ancestors were racists and held slaves. Therefore, we are entitled to affirmative action, reverse discrimination, and even reparations." The ancestors of most whiteys in America weren't even here when slavery existed. They were in Europe and Asia and came after the Civil War. And among those who were here, 359,528 Union soldiers died to free the slaves. Where is the condemnation of the blacks in Africa who captured fellow blacks and ran the slave trade? American blacks are free to return to Africa — interesting that they don't want to. Around the world, blacks want to come to America, not leave it.


These and the other examples of blaming the victim that could be cited are political or religious. In these areas, rational and unemotional discussion is usually futile. So let us consider other examples.

Defense lawyers in general often use "blame the victim" tactics when they know their defendant is guilty. Soft-hearted juries are swayed by clever excuses and specious argument. I bet they teach this stuff in law school.

Among the many examples of blaming the victim are rape victims who, their assailants charge, were dressed provocatively and thus were "asking for it." The excuse is that what a woman wears or says, where she goes, or what she does can make her responsible for the crime committed against her. No amount of saying no, or of physical resistance, seems to nullify the excuses. Lawyers try to gain sympathy for rapists by pointing out how unfair it is to tempt men. Indeed, this idea is probably the basis for the Muslim requirement that women cover their whole body, and even their face, when in public.

A related example is domestic violence, typically inflicted by males on their spouses or girlfriends. The victim is blamed for enabling the violence by refusing to end the relationship. While reluctance to leave can sometimes develop because the woman is emotionally weak and dependent, often she has little choice, whether for economic reasons or from justifiable fear for herself or her children if she dares to leave. A common explanation for why women do not leave a battering relationship is Seligman's theory of "learned helplessness." The helplessness is taught by the abuser. In some locales, the victim also faces a high tolerance of wife beating by police and legal systems.

Chronic victims may be inviting perpetuation of their abuse when they believe they are victims of outside forces beyond their control. When helplessness has been learned, it is hard to generate the will and initiative to use internal psychological resources to overcome the abuse. Chronic victims commonly have a low sense of self-worth, a low sense of efficacy, and feelings of shame and guilt, and they may even believe that they deserve to be punished. The task of recovery from victimization is to take responsibility, moving from helplessness to accountability and from hopelessness to optimism. Seligman makes the point that optimism can be learned too.

Then there is the death fatwa on Salman Rushdie, whom former President Jimmy Carter says invited the attempts on his life by writing a "blasphemous" book. Freedom of speech, according to Carter, seems to apply only for speech that does not offend. He does not elaborate on where one draws the line where murder is warranted for verbal offense. Maybe it just applies to religious challenges.

Bullying is often blamed on the person being bullied for all manner of reasons, such as being gay, ugly, fat, too smart, in the wrong minority group, some negative or annoying personality trait, being weak or too sensitive, and so on. How does the bully justify his own negative traits? Well, of course, that question never arises because bullies pretend to feel superior and are condescending.

Whistleblowers and investigative reporters are often blamed and disparaged when they disclose embarrassing or criminal actions of others. Their colleagues often shun them. Perhaps shunners feel inferior and shamed by their own lack of courage to do the right thing. If those of us on the sidelines don't speak up for the true victims, maybe nobody will. Those who victimize others need to be held responsible, not excused.

Why do people blame the victim? One reason is that people want to believe that life is fair (it is not), and therefore unfairness is hard to accept as a cause of victimization. The victim must deserve her state. Another reason is that making excuses for others provides a rationale for excusing ourselves. If we can lift the burden of personal responsibility on others, we can justify doing it for ourselves. Thus, we don't have to face our own weaknesses. Excuse-making is profound cowardice.

Dr. Klemm's latest books are:

Mental Biology, The New Science of How the Brain and Mind Relate and
Improve Your Memory For a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine


Wednesday, November 26, 2014

Patience May Indicate Free Will

They say that patience is a virtue. It may also support the notion of free will. A person may defer action, as in pursuing a reward for example, because of a free choice to delay. This possibility lies at the heart of a new study in a long series of studies that began in the 1980s that have tested the notion that free will might not exist, that it is an illusion.

The prior experiments, widely interpreted to indicate that free will does not exist, demonstrated that neuron activity in a movement-control area of the cerebral cortex accelerated prior to a conscious decision to press a button. Scientists interpreted this to mean the decision was made unconsciously, prior to the conscious realization that a decision had been made. I have challenged this interpretation on the grounds of both the scientific methods used and a misinterpretation of the cognitive neuroscience (see the critique in Klemm, 2010, cited above).

In the new study, by Masayoshi Murakami and colleagues in Portugal, this electrical marker of decision-making was examined in rats in a different paradigm. Highly thirsty rats were trained to wait in place after hearing a sound cue until a second, "go-for-it" sound gave them access to water. If the rats showed the required restraint, they received a larger water reward.

As the rats waited, the electrical "decision-making indicator" grew in magnitude and reached a threshold at the point where rats lost patience and went for the reward. The progressive increase in neural firing is interpreted as a well-known "integrate-and-fire" mechanism, wherein activity grows until a threshold for action is reached.

But they also found a second class of neurons whose firing could predict the rate at which the integrating neurons added up toward threshold. This observation of preceding regulatory control enables a new interpretation of the original experiments on illusory free will. The "integrate-and-fire" population of neurons may not be making the actual decision but rather reflecting an earlier decision-making process elsewhere that regulates the integration toward the action threshold.
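The "integrate-and-fire" idea above can be illustrated with a toy simulation. This is a sketch only: the drift rates, threshold, and noise level are invented for illustration and are not taken from the Murakami study. Input accumulates noisily toward a threshold, and a stronger regulatory drive, playing the role of the second neuron class, shortens the waiting time before the action is triggered.

```python
import random

def time_to_threshold(drift, threshold=1.0, noise=0.05, dt=0.001, max_t=10.0):
    """Simulate a noisy integrate-and-fire accumulator.

    drift stands in for the rate set by the 'regulatory' population;
    returns the simulated waiting time (seconds) until accumulated
    activity crosses threshold, or max_t if it never does.
    """
    x, t = 0.0, 0.0
    while t < max_t:
        # accumulate input plus a small Gaussian noise term each time step
        x += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
        t += dt
        if x >= threshold:
            return t
    return max_t

random.seed(0)
# A stronger regulatory drive reaches threshold sooner: the rat
# "loses patience" earlier. Average over 20 simulated trials each.
fast = sum(time_to_threshold(2.0) for _ in range(20)) / 20
slow = sum(time_to_threshold(0.5) for _ in range(20)) / 20
print(fast < slow)  # higher drift yields a shorter waiting time
```

On this toy model, the waiting time to act is not "decided" by the accumulator itself; it is set upstream by whatever determines the drift, which is the reinterpretation the Murakami findings suggest.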

Decision-making is complicated, even when it just involves pressing a button. I try to explain all this in my new book from Prometheus, "Mental Biology. The New Science of How the Brain and Mind Relate."


Murakami M, Vicente MI, Costa GM, & Mainen ZF. (2014) Neural antecedents of self-initiated actions in secondary motor cortex. Nature Neuroscience, 17(11), 1574-82. PMID: 25262496

Saturday, October 5, 2013

Why We Hate to Accept Blame

Why is it so hard to take responsibility for our errors? Of course, the obvious and somewhat glib answer is that our ego gets in the way. But there is more to it than that. Recent experiments show that one previously unrecognized basic cause involves our sense of agency, that is, our sense of being responsible for what we choose to do (1).

Only recently have scientists started to spend much effort on understanding human agency (2). I discovered this to my dismay when recently asked to write a book chapter on the subject. It is hard to write knowingly on any subject when not much is known. Of course, agency and sense of agency are two different things, and here we are concerned with how people perceive how much control they have over their lives. Given the growing dependency on government in this country, a likely prediction is that more and more people will surrender to the belief that they can’t do much about their lives. Research has shown that a person’s sense of agency depends on how many options they have to choose from (3). Options shrink when government, or anything else, constrains your range of choices. Moreover, another study showed that people believe that they have more self-control than others (4), which might explain why so many politicians treat the public as helpless and in need of a government nanny.

Anyway, on the point of accepting blame for one’s failings or mistakes, it seems that people claim ownership of their actions more readily when the outcomes are positive. Negative outcomes from their deeds are associated with less ownership and sense of responsibility. The most recent experiments had a primary focus on our sense of time in association with voluntary actions. The experimental design was based on prior evidence that the perceived estimate of time lag between when we do something and when we think we did it is an implicit index of our sense of ownership. Investigators asked people to press a key, which was followed a quarter of a second later by negative sounds of fear or disgust, positive sounds of achievement or amusement, or neutral sounds. The subjects were then asked to estimate when they had made the action and when they heard the sound. Timing estimation errors were easily measured by computer.

Subjects sensed a longer time lag between their actions and the consequences when the outcome (the sound) was negative than when it was positive. The interpretation is that with positive outcomes, subjects sensed a more direct connection between their button press and its consequence. With negative outcomes, subjects wanted to put more distance (time, in this case) between what they did and the outcome. This seems like a rather indirect way to assess sense of agency, but we must await a cleverer and more direct way to measure it.
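The timing measure can be sketched with a toy calculation. The millisecond error values below are hypothetical, invented purely for illustration; only the quarter-second delay comes from the experiment as described. The perceived action-to-outcome interval is reconstructed from the subject's two timing-estimation errors.

```python
# The actual delay between key press and sound is fixed (250 ms here);
# what varies is the subject's *perceived* interval, reconstructed from
# how far each reported time drifted from the true event time.
ACTUAL_DELAY_MS = 250

def perceived_interval(action_error_ms, outcome_error_ms):
    """Perceived action-to-outcome interval.

    action_error_ms: drift of the reported action time (positive = reported later)
    outcome_error_ms: drift of the reported sound time
    """
    return ACTUAL_DELAY_MS + outcome_error_ms - action_error_ms

# Hypothetical numbers: with a positive sound, the action is reported later
# and the sound earlier, so they "bind" together; with a negative sound
# the reports spread apart.
positive = perceived_interval(action_error_ms=15, outcome_error_ms=-20)
negative = perceived_interval(action_error_ms=-10, outcome_error_ms=25)
print(positive, negative)
```

On these made-up numbers, the positive outcome yields a shorter perceived interval (215 ms) than the negative one (285 ms), mirroring the pattern the study reports: negative outcomes push the action and its consequence apart in subjective time.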

In any case, such experiments support our intuition that ownership of what we do can be affected by whether or not things turn out well. In other words, we have a self-serving bias to take more credit for when things turn out well than when they don’t. When they turn out badly, we want to insulate ourselves from responsibility and put blame elsewhere. Of course, we probably already knew that, but now we have objective experimental ways to study and perhaps manipulate sense of agency. Parents and social pressures don’t always succeed in teaching people to accept blame when it is due. This problem is likely to continue to get worse as more and more children lack responsible parents or even a father and a stay-at-home mom in the home. Maybe there are more systematic ways to train people to recognize how the consequences of their deeds affect their sense of responsibility.

1. Yoshie et al. (2013). Negative emotional outcomes attenuate sense of agency over voluntary actions. Current Biology.

2. David, N. (2012). New frontiers in the neuroscience of the sense of agency. NCBI. Retrieved Oct. 2, 2013 from

3. Barlas, Z., and Obhi, S. S. (2013). Freedom, choice, and the sense of agency. Frontiers in Human Neuroscience. August 29. doi:10.3389/fnhum.2013.00514

4. Pronin, E., and Kugler, M. B. (2010). People believe they have more free will than others. PNAS, 107(52), 22469-22474.

Wednesday, September 18, 2013

Politically Correct Science

It’s not just climate scientists and their global warming. Behavioral scientists have their own biases. The August 30, 2013 issue of the premier journal Science has a research report claiming that poor people stay poor because being poor “impairs their cognitive capacity.”[1] That is, being poor is so challenging and stressful that poor people have exhausted their mental capacity to regulate their impulses and make wise choices. The editors deemed the research so impeccable that they also ran a companion news story to publicize the findings to lay audiences.

The research yielded two basic, independent findings:

  1. In one group of ordinary people sampled at a shopping mall, inducing negative thoughts about finances impaired performance on two thinking tests in poor, but not well-off, participants. Median household incomes varied from $20,000 to $70,000 per year.
  2. Testing of farmers at two stages of the harvest cycle showed that thinking performance before the harvest, when they were relatively poor, was impaired relative to performance afterwards, when they were relatively rich.
The conclusion was that people do not choose to be poor and that they become trapped in poverty because being poor impairs their thinking and decision-making. Blame for people being poor is put on the state of being poor itself. Nowhere, neither in the research report nor in its news summary, did anyone consider an alternate explanation for poverty in the U.S.:

Poverty of able-bodied, normal people can be a choice.

In terms of social policy, the politically correct implication is that you can help lift people out of poverty by making them less poor through gifts, tax exemptions, and subsidies. Interestingly, the writers avoided this obvious inference, no doubt because it would irritate people who believe in the importance of personal responsibility. Instead, the writers focused on the need for government to ease the mental challenges of poor people to a level commensurate with their impaired thinking capacity. For example, the authors suggested that government should reduce “cognitive taxes” in addition to monetary taxes for the poor. Specific policy suggestions included providing help with “filling out forms, planning, and reminders” to help the poor access government services and welfare.

Not mentioned is the current U.S. policy of spending $67 million for “navigators” to help poor people sign up for the Affordable Care Act. Who knows, that idea may well have come from these authors, two of whom were high-status Ivy League professors.

But I have to ask, wouldn’t these millions of dollars be better spent helping the poor get a work ethic and make better life choices? Instead, our government gives welfare payments and subsidies worth more than the poor can earn by working. And the poor don’t have to work to get the welfare. At the same time, politicians and bureaucrats push for an amnesty program for illegal aliens who are willing to do the work that our poor citizens refuse to do. Tell me, how is a government policy of promoting more dependency going to help anybody? If such government policy were not so deliberate, it would be insane.

[1] Mani, A. et al. (2013) Poverty impedes cognitive function. Science. 341: 976-980.