More and More Little Wins

Since I read Nudge by Richard Thaler and Cass Sunstein a few years back, I’ve been happily surprised by how quickly the idea of “nudges” is spreading around the world. In a recent New York Times piece, David Brooks catalogues many successful nudges, notably in places like Kenya and Zambia. David Cameron is a noted supporter of using the gleanings of behavioral economics to get citizens in the UK to “do good by default.”

The way nudges work is that governments and organizations set up “choice architecture” such that the default option–or an easy option–has a socially beneficial outcome. A well-known nudge is making the default option in organ donation “yes.” (In the past the default option was nearly always “no organ donation.”) A more whimsical one is to put some kind of target–say a picture of a fly or a seashell–inside men’s urinals to induce them to aim better.

The most important finding of behavioral economics is that humans often do not make rational decisions…but they are “predictably irrational” (in the words of scholar Dan Ariely). Scientists like Amos Tversky and Daniel Kahneman pioneered studies showing the subtle biases and decision-making “errors” that humans make in certain situations. But just as those biases sometimes lead us astray, they can be used to nudge people, often without their noticing, toward prosocial decisions while still leaving individuals the freedom and control to choose for themselves.

Brooks’ examples from Africa were most intriguing to me:

“Too many people die in auto accidents. When governments try to reduce highway deaths, they generally increase safety regulations. But, also in Kenya, stickers were placed inside buses and vans urging passengers to scream at automobile drivers they saw driving dangerously.”

“In Zambia, hairdressers were asked to sell female condoms to their clients. Some were offered financial incentives to do so, but these produced no results. In other salons, top condom sellers had a gold star placed next to their names on a poster that all could see. More than twice as many condoms were sold. This simple change was based on an understanding of the human desire for status and admiration.”

Now these behavioral-economics-inspired nudges are not going to end malaria or cure cancer, but this kind of clever policymaking can have an impact. Nudges like these can get well-meaning programs–like the female condom scheme in Zambia–to perform better. And while I don’t think that a sticker encouraging Americans to yell at drivers would work in our culture, I do like how the Kenyan government encouraged its citizens not to stand for dangerous behavior. At their best, nudges get people to make small, prosocial decisions at the grassroots level. Like the improvements in life that this blog chronicles, nudges bubble up from the bottom and make the world a better place.

How Not to Be Ignorant

Hans Rosling is probably my favorite optimist, both because he bases his views on huge data sets AND because he’s a hoot. Here’s his latest TED Talk. Take his quiz. I bet you’re dumber than a chimp.

http://www.ted.com/talks/hans_and_ola_rosling_how_not_to_be_ignorant_about_the_world

“The More We Know…

…the greater we find our ignorance.” –Gardiner G. Hubbard, the first President of the National Geographic Society

True?

This is a popular sentiment. As I looked for similar quotes I stumbled upon this one from JFK: “The greater our knowledge increases, the more our ignorance unfolds.”

Regardless of the truth of the sentiment, this rhetorical form–I’ll call it the “The More…The More” statement–is seductive. It sounds so strong, kind of like Leia saying to Grand Moff Tarkin, “The more you tighten your grip…the more star systems will slip through your fingers.” But taken at face value, I don’t accept that ignorance increases as knowledge does.

But I can accept the idea that more knowledge makes finding meaning more complex, maybe harder. This, I think, is what Kennedy and Hubbard were saying, in fairly elegant terms meant for rhetorical impact, not logical soundness.

The utter disappearance of the Malaysia Airlines jet is tragic. The failed search for it is a big fat metaphor for human limitations in the era of Big Data. It reminds me of The Onion’s “World’s Largest Metaphor Hits Ice-Berg: Titanic, Representation of Man’s Hubris, Sinks in North Atlantic.”

This symbolism of human limitations is explored in today’s New York Times article by Pico Iyer, “The Folly of Thinking We Know.” Iyer’s piece is a good meditation on our weaknesses and blind spots. And I always love hearing mention of the Overconfidence Effect, our persistent tendency to think we know more than we really do (and with great confidence, no less).

I’m most interested in hearing your thoughts on this. Does all this information make us smarter yet dumber? Are we more informed but less wise?

Traffic Deaths Will Plummet

As you know, I have a rosy–maybe rose-colored–vision of the future. At the top of my “getting better” futuristic wish list is the self-driving car. It’s funny, but many people think that self-driving cars will make driving more, not less, dangerous. Why?

I love neuroscience, and finding, understanding, and (I hope) avoiding cognitive biases is a micro-hobby of mine. One of my “favorite” biases is the Illusion of Control: “the tendency for people to overestimate their ability to control events” (Wikipedia). One of the most common examples of the illusion of control is the fact that so many people feel safer driving than flying. Driving feels safer because one has control. However, if you’re flying in a commercial jet and the ailerons fail (exTREMely unlikely), you have no control. Buckle up and assume the position. This powerlessness makes people feel less safe.

However, we should be grateful that we have no control. Safety procedures for commercial aircraft have been so routinized, regulated, and automated that in some years no one dies in a commercial aviation crash in the US. In 2012, 34,080 people died in auto accidents in America (Wikipedia). In that same year, there were no fatal commercial airline crashes in the US. According to the New York Times, “the death risk for passengers in the United States has been one in 45 million flights.”

So why not do for automobiles what the airlines have done for their jets? As you have probably experienced in your (or someone else’s) car, new safety gadgets keep coming with each new model year. Contrast that with previous decades. When I was a child, my mom plopped me in the front passenger seat, and when she braked suddenly, she put out her arm so that I wouldn’t do a face-plant on the dashboard. I guess she could’ve made me buckle my seatbelt, but hardly anyone did that in the 1970s. Nowadays many systems and alarms come standard, such as the rear-view camera that makes backing up safer, blind-spot sensors, anti-lock brakes, etc., etc., etc.

But the real jump in safety will come when we take our hands off the wheel and our feet off the pedals. Humans make consistent and persistent perception and judgment errors. On Rock Creek Parkway, a particularly dangerous road in Washington, DC, I have had two accidents at one merge and three at another. (This clearly reflects on my poor driving skills, but it reflects on human driving skills, too.) All have been fender benders and no one was hurt, but a machine wouldn’t have made the mistakes I made once, let alone five times. In every one of those crashes, I saw the car in front of me accelerate, checked my blind spot to make sure I could go, and accelerated–but unbeknownst to me, the car in front of me had decided to stop. Bang! My fault.

A recent CNN article explores some of the coming safety innovations, including cars that “learn” and communicate, external air bags, laser headlights, and self-parking and self-driving cars.

And when this all comes to pass, my insurance rates will plummet like the fatality rates. (Hopefully before my next accident.)

Moneyball Crime Fightin’

Instinct. Guts. Street Smarts. Experience. These are qualities of a great cop, right? But does a belief in heroic crime fighting get in the way of fighting crime?

In an intriguing TED talk, Anne Milgram, former Attorney General of New Jersey, demonstrates how she used smart statistics to zero in on the real problem crimes and criminals in the Garden State.

Her methods produced remarkable results. Like so many people today who are successfully solving our most intractable challenges, she broke the problem down by asking essential questions and then followed the data to the really bad crimes. It didn’t surprise me that she found that too much time, focus, energy, manpower, and money were being spent on low-level drug crimes, and not enough on gangs, violent crime, and crippling corruption.

People seem to dislike this statistical approach because it lacks the street-smart, gritty glamor of the gumshoe cop on the beat–think Hill Street Blues. But what we think is most important and what is most important are often different, even in the eyes of “experts.” The Overconfidence Effect and the Illusion of Control are cognitive biases that veteran detectives, successful stockbrokers, doctors–and experienced teachers like me–possess whether they (or we) admit it or not. These biases lead “experts” to believe that all they see is all there is (to paraphrase Daniel Kahneman) and to trust that their experience and wisdom are right. Big Data, especially now that we have the computing power to crunch it, can help us get a more realistic picture of the real problems of the world. But I hope we are as skeptical of Big Data as we are of experts. Big Data has blind spots, too.