Richard H. Thaler and Cass R. Sunstein, “Nudge: Improving Decisions About Health, Wealth, and Happiness”

Richard Thaler is a star of the first magnitude in behavioral economics. His book Nudge is very well known in its own right, but the author is even better known, and you will keep running into him in other books on behavioral economics. A few links to get acquainted, and then straight to the book itself: Richard Thaler, video, video (this one has a simply fantastic example about lines painted on a road), the book's blog.

So, here we go.

Amazon.com: What do you mean by “nudge” and why do people sometimes need to be nudged?
Thaler and Sunstein: By a nudge we mean anything that influences our choices. A school cafeteria might try to nudge kids toward good diets by putting the healthiest foods at front. We think that it’s time for institutions, including government, to become much more user-friendly by enlisting the science of choice to make life easier for people and by gently nudging them in directions that will make their lives better.

– How does this book differ from other books on behavioral finance and behavioral economics?

– In that the author(s) propose ways of using human weaknesses against those very weaknesses.
If you recall, Dan Ariely's book is called “Predictably Irrational,” which means that by using knowledge of human errors (by predicting the errors) you can prepare an antidote to those errors in advance.
Part 1. Humans and Econs (the rational beings from the models)
This part briefly, but with taste, makes the case that in many real-world situations people are not built for rational action. A person has two decision-making systems (a slow one and a fast one), and wherever speed is required instinct takes over, and instinct can misread the cues of the modern world. The notion of a nudge is also introduced here, along with the context in which it has to be applied for it to work. The author then introduces us to choice architecture and the elements that make up the environment in which we make our choices.
Part 2. Money.
Here we get to specifics: the errors, and the nudges we may need. Again, a nudge is not a carrot handed to us for a job well done. It can simply be an intuitively designed door whose very look tells you which way to move it, push away from you or pull toward you. This section covers savings, investing, credit, and pensions. I won't retell it in detail; see the quotes further down.
Parts 3 and 4: health care and freedom.
I haven't gotten to these parts. Not because they are uninteresting, far from it. It's just that at the moment public-policy problems interest me somewhat less than the problems of specific people working with financial markets. But do read them.
Part 5.
Twelve more nudges, and counterarguments about influencing people's choices.
To wrap up, let me describe the example I mentioned above, the one about the road. So, there is a road with a curve where hardly anyone slows down, even though they should. Here is what was done. Lines were painted on the road (in white paint), at first at equal distances from one another (lines perpendicular to the road, like a pedestrian crossing), say 50 m apart. Closer to the curve the distance between the lines was made smaller, then smaller still, and finally very small.
Now imagine you are driving down this road. You see the lines flash by at equal intervals. Then the lines start flashing by faster, and it feels as though that is because you are speeding up. In response you instinctively slow down. So, by exploiting the naivety of the first, intuitive decision-making system, you have been tricked into reducing speed as you enter the curve.
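Just to make the mechanism concrete, here is a minimal sketch in Python with made-up numbers (the speed and the stripe spacings are my own assumptions, not figures from the book): at a constant speed the time between stripes shrinks along with the spacing, so the stripes flash by faster even though the car never accelerates.

```python
# Toy illustration with assumed numbers: constant speed, narrowing stripe spacing.
speed_mps = 20.0                            # constant speed, about 72 km/h (assumed)
spacings_m = [50, 50, 50, 30, 20, 10, 5]    # gap before each stripe, shrinking toward the curve

for i, gap in enumerate(spacings_m, start=1):
    dt = gap / speed_mps                    # seconds until the next stripe
    print(f"stripe {i}: gap {gap:>2} m -> {dt:.2f} s between flashes")
# The interval drops from 2.5 s to 0.25 s while the speed never changes,
# which the fast, intuitive system reads as "I am going faster".
```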
Below are the quotes I marked while reading.

1. Carolyn is what we will be calling a choice architect. A choice architect has the responsibility for organizing the context in which people make decisions.

2. There are many parallels between choice architecture and more traditional forms of architecture. A crucial parallel is that there is no such thing as a “neutral” design.
3. A nudge, as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid. Nudges are not mandates. Putting the fruit at eye level counts as a nudge. Banning junk food does not.
5. If you look at economics textbooks, you will learn that homo economicus can think like Albert Einstein, store as much memory as IBM’s Big Blue, and exercise the willpower of Mahatma Gandhi. Really. But the folks that we know are not like that. Real people have trouble with long division if they don’t have a calculator, sometimes forget their spouse’s birthday, and have a hangover on New Year’s Day. They are not homo economicus; they are homo sapiens.
6. That research has raised serious questions about the rationality of many judgments and decisions that people make. To qualify as Econs, people are not required to make perfect forecasts (that would require omniscience), but they are required to make unbiased forecasts. That is, the forecasts can be wrong, but they can’t be systematically wrong in a predictable direction. Unlike Econs, Humans predictably err. Take, for example, the “planning fallacy”-the systematic tendency toward unrealistic optimism about the time it takes to complete projects. It will come as no surprise to anyone who has ever hired a contractor to learn that everything takes longer than you think, even if you know about the planning fallacy.
7. It seems reasonable to say that people make good choices in contexts in which they have experience, good information, and prompt feedback-say, choosing among ice cream flavors. People know whether they like chocolate, vanilla, coffee, licorice, or something else. They do less well in contexts in which they are inexperienced and poorly informed, and in which feedback is slow or infrequent-say, in choosing between fruit and ice cream (where the long-term effects are slow and feedback is poor) or in choosing among medical treatments or investment options. If you are given fifty prescription drug plans, with multiple and varying features, you might benefit from a little help.
8. These two figures capture the key insight that behavioral economists have borrowed from psychologists. Normally the human mind works remarkably well. We can recognize people we have not seen in years, understand the complexities of our native language, and run down a flight of stairs without falling. Some of us can speak twelve languages, improve the fanciest computers, and/or create the theory of relativity. However, even Einstein would probably be fooled by those tables. That does not mean something is wrong with us as humans, but it does mean that our understanding of human behavior can be improved by appreciating how people systematically go wrong.
9. How We Think: Two Systems. The workings of the human brain are more than a bit befuddling. How can we be so ingenious at some tasks and so clueless at others? Beethoven wrote his incredible ninth symphony while he was deaf, but we would not be at all surprised if we learned that he often misplaced his house keys. How can people be simultaneously so smart and so dumb? Many psychologists and neuroscientists have been converging on a description of the brain’s functioning that helps us make sense of these seeming contradictions. The approach involves a distinction between two kinds of thinking, one that is intuitive and automatic, and another that is reflective and rational. We will call the first the Automatic System and the second the Reflective System. (In the psychology literature, these two systems are sometimes referred to as System 1 and System 2, respectively.)
10. If you are a television fan, think of Mr. Spock of Star Trek fame as someone whose Reflective System is always in control. In contrast, Homer Simpson seems to have forgotten where he put his Reflective System.
11. In fact, there is a great collection edited by Tom Parker titled Rules of Thumb. Parker wrote the book by asking friends to send him good rules of thumb. For example, “One ostrich egg will serve 24 people for brunch.” “Ten people will raise the temperature of an average size room by one degree per hour.” And one to which we will return: “No more than 25 percent of the guests at a university dinner party can come from the economics department without spoiling the conversation.”
12. Their original work identified three heuristics, or rules of thumb-anchoring, availability, and representativeness-and the biases that are associated with each. Their research program has come to be known as the “heuristics and biases” approach to the study of human judgment. More recently, psychologists have come to understand that these heuristics and biases emerge from the interplay between the Automatic System and the Reflective System.
13. This process is called “anchoring and adjustment.” You start with some anchor, the number you know, and adjust in the direction you think is appropriate. So far, so good. The bias occurs because the adjustments are typically insufficient.
14. Anchors can even influence how you think your life is going. In one experiment, college students were asked two questions: (a) How happy are you? (b) How often are you dating? When the two questions were asked in this order the correlation between the two questions was quite low (.11). But when the question order was reversed, so that the dating question was asked first, the correlation jumped to .62.
15. When “availability bias” is at work, both private and public decisions may be improved if judgments can be nudged back in the direction of true probabilities. A good way to increase people’s fear of a bad outcome is to remind them of a related incident in which things went wrong; a good way to increase people’s confidence is to remind them of a similar situation in which everything worked out for the best.
16. The third of the original three heuristics bears an unwieldy name: representativeness. Think of it as the similarity heuristic. The idea is that when asked to judge how likely it is that A belongs to category B, people (and especially their Automatic Systems) answer by asking themselves how similar A is to their image or stereotype of B (that is, how “representative” A is of B).

17. A less trivial example, from the Cornell psychologist Tom Gilovich (1991), comes from the experience of London residents during the German bombing campaigns of World War II. London newspapers published maps, such as the one shown in Figure 1.3, displaying the location of the strikes from German V-1 and V-2 missiles that landed in central London. As you can see, the pattern does not seem at all random. Bombs appear to be clustered around the River Thames and also in the northwest sector of the map. People in London expressed concern at the time because the pattern seemed to suggest that the Germans could aim their bombs with great precision. Some Londoners even speculated that the blank spaces were probably the neighborhoods where German spies lived. They were wrong. In fact the Germans could do no better than aim their bombs at Central London and hope for the best. A detailed statistical analysis of the dispersion of the location of the bomb strikes determined that within London the distribution of bomb strikes was indeed random. Still, the location of the bomb strikes does not look random. What is going on here? We often see patterns because we construct our informal tests only after looking at the evidence. The World War II example is an excellent illustration of this problem. Suppose we divide the map into quadrants, as in Figure 1.4-a. If we then do a formal statistical test-or, for the less statistically inclined, just count the number of hits in each quadrant-we do find evidence of a nonrandom pattern. However, nothing in nature suggests that this is the right way to test for randomness. Suppose instead we form the quadrants diagonally as in Figure 1.4-b. We are now unable to reject the hypothesis that the bombs land at random. Unfortunately, we do not subject our own perceptions to such rigorous alternative testing.
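The point about inventing the test after seeing the data is easy to play with in code. Below is a toy simulation under my own assumptions (100 uniformly random strike locations on a unit square and a plain chi-square count against equal quadrant frequencies), not the analysis referenced in the book; it simply shows that the same data produce a different statistic depending on how the quadrants are drawn.

```python
# Toy simulation: the same random strike pattern, scored two ways.
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.random(100), rng.random(100)       # 100 uniformly random "strike" locations

def quadrant_counts(x, y, diagonal=False):
    """Strikes per quadrant, with quadrants either axis-aligned or rotated 45 degrees."""
    if diagonal:
        u, v = (x - 0.5) + (y - 0.5), (x - 0.5) - (y - 0.5)   # rotated coordinates
    else:
        u, v = x - 0.5, y - 0.5
    return np.array([((u >= 0) & (v >= 0)).sum(), ((u >= 0) & (v < 0)).sum(),
                     ((u < 0) & (v >= 0)).sum(), ((u < 0) & (v < 0)).sum()])

for diagonal in (False, True):
    counts = quadrant_counts(x, y, diagonal=diagonal)
    expected = counts.sum() / 4
    chi2 = ((counts - expected) ** 2 / expected).sum()        # crude chi-square statistic
    print("diagonal" if diagonal else "straight", counts, f"chi2 = {chi2:.2f}")
```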

18. Mostly, though, there is thankfully nothing to worry about, except for the fact that the use of the representativeness heuristic can cause people to confuse random fluctuations with causal patterns.
19. Unrealistic optimism is a pervasive feature of human life; it characterizes most people in most social categories. When they overestimate their personal immunity from harm, people may fail to take sensible preventive steps. If people are running risks because of unrealistic optimism, they might be able to benefit from a nudge. In fact, we have already mentioned one possibility: if people are reminded of a bad event, they may not continue to be so optimistic.
20. The combination of loss aversion with mindless choosing implies that if an option is designated as the “default,” it will attract a large market share. Default options thus act as powerful nudges. In many contexts defaults have some extra nudging power because consumers may feel, rightly or wrongly, that default options come with an implicit endorsement from the default setter, be it the employer, government, or TV scheduler. For this and other reasons, setting the best possible defaults will be a theme we explore often in the course of this book.
21. Framing. Suppose that you are suffering from serious heart disease and that your doctor proposes a grueling operation. You’re understandably curious about the odds. The doctor says, “Of one hundred patients who have this operation, ninety are alive after five years.” What will you do? If we fill in the facts in a certain way, the doctor’s statement will be pretty comforting, and you’ll probably have the operation. But suppose the doctor frames his answer in a somewhat different way. Suppose that he says, “Of one hundred patients who have this operation, ten are dead after five years.” If you’re like most people, the doctor’s statement will sound pretty alarming, and you might not have the operation.
22. Framing works because people tend to be somewhat mindless, passive decision makers. Their Reflective System does not do the work that would be required to check and see whether reframing the questions would produce a different answer. One reason they don’t do this is that they wouldn’t know what to make of the contradiction. This implies that frames are powerful nudges, and must be selected with caution.
23. Our goal in this chapter has been to offer a brief glimpse at human fallibility. The picture that emerges is one of busy people trying to cope in a complex world in which they cannot afford to think deeply about every choice they have to make. People adopt sensible rules of thumb that sometimes lead them astray. Because they are busy and have limited attention, they accept questions as posed rather than trying to determine whether their answers would vary under alternative formulations. The bottom line, from our point of view, is that people are, shall we say, nudge-able. Their choices, even in life’s most important decisions, are influenced in ways that would not be anticipated in a standard economic framework.
24. At the beginning of the dangerous curve, drivers encounter a sign painted on the road warning of the lower speed limit, and then a series of white stripes painted onto the road. The stripes do not provide much if any tactile information (they are not speed bumps) but rather just send a visual signal to drivers. When the stripes first appear, they are evenly spaced, but as drivers reach the most dangerous portion of the curve, the stripes get closer together, giving the sensation that driving speed is increasing (see Figure 1.5). One’s natural instinct is to slow down. When we drive on this familiar stretch of road, we find that those lines are speaking to us, gently urging us to touch the brake before the apex of the curve. We have been nudged.
25. As with Supreme Court Justice Potter Stewart’s “I know it when I see it” adage about pornography, temptation is easier to recognize than to define.
26. None of this means that decisions made in a cold state are always better. For example, sometimes we have to be in a hot state to overcome our fears about trying new things. Sometimes dessert really is delicious, and we do best to go for it. Sometimes it is best to fall in love. But it is clear that when we are in a hot state, we can often get into a lot of trouble.
27. For most of us, however, self-control issues arise because we underestimate the effect of arousal. This is something the behavioral economist George Loewenstein (1996) calls the “hot-cold empathy gap.” When in a cold state, we do not appreciate how much our desires and our behavior will be altered when we are “under the influence” of arousal.
28. Recent research in neuroeconomics (yes, there really is such a field) has found evidence consistent with this two-system conception of self-control. Some parts of the brain get tempted, and other parts are prepared to enable us to resist temptation by assessing how we should react to the temptation. Sometimes the two parts of the brain can be in severe conflict-a kind of battle that one or the other is bound to lose.
29. The cashew problem is not only one of temptation. It also involves the type of mindless behavior we discussed in the context of inertia. In many situations, people put themselves into an “automatic pilot” mode, in which they are not actively paying attention to the task at hand. (The Automatic System is very comfortable that way.)
30. Since people are at least partly aware of their weaknesses, they take steps to engage outside help. We make lists to help us remember what to buy at the grocery store. We buy an alarm clock to help us get up in the morning. We ask friends to stop us from having dessert or to fortify our efforts to quit smoking. In these cases, our Planners are taking steps to control the actions of our Doers, often by trying to change the incentives that Doers face. Unfortunately, Doers are often difficult to rein in (think of controlling Homer), and they can foil the best efforts of Planners. Consider the mundane but revealing example of the alarm clock. The optimistic Planner sets the alarm for 6:15 A.M., hoping for a full day of work, but the sleepy Doer turns off the alarm and goes back to sleep until 9:00. This can lead to fierce battles between the Planner and the Doer. Some Planners put the alarm clock on the other side of the room, so the Doer at least has to get up to turn it off, but if the Doer crawls back into bed, all is lost. Fortunately, enterprising firms sometimes offer to help the Planner out.
31. More formal versions of these strategies are easy to imagine. In Chapter 16 we will encounter the Web site Stickk.com (of which Karlan is a cofounder), which gives people a method by which their Planners can constrain their Doers. In some situations, people may even want the government to help them deal with their self-control problems. In extreme cases, governments might ban some items (such as heroin use, prostitution, and drunken driving). Such bans can be seen as pure rather than libertarian paternalism, though third-party interests are also at stake.
32. One interesting example of a government-imposed self-control strategy is daylight saving time (or summer time, as it is called in many parts of the world). Surveys reveal that most people think that daylight saving time is a great idea, primarily because they enjoy the “extra” hour of daylight during the evening. Of course, the number of daylight hours on a given day is fixed, and setting the clocks ahead one hour does nothing to increase the amount of daylight. The simple change of the labels on the hours of the day, calling “six o’clock” by the name “seven o’clock,” nudges us all into waking up an hour earlier.
33. Alarm clocks and Christmas clubs are external devices people use to solve their self-control problems. Another way to approach these problems is to adopt internal control systems, otherwise known as mental accounting. Mental accounting is the system (sometimes implicit) that households use to evaluate, regulate, and process their home budget. Almost all of us use mental accounts, even if we’re not aware that we’re doing so.
34. According to economic theory (and simple logic), money is “fungible,” meaning that it doesn’t come with labels. Twenty dollars in the rent jar can buy just as much food as the same amount in the food jar. But households adopt mental accounting schemes that violate fungibility for the same reasons that organizations do: to control spending.
35. You can also see mental accounting in action at the casino. Watch a gambler who is lucky enough to win some money early in the evening. You might see him take the money he has won and put it into one pocket and put the money he brought with him to gamble that evening (yet another mental account) into a different pocket. Gamblers even have a term for this. The money that has recently been won is called “house money” because in gambling parlance the casino is referred to as the house. Betting some of the money that you have just won is referred to as “gambling with the house’s money,” as if it were, somehow, different from some other kind of money. Experimental evidence reveals that people are more willing to gamble with money that they consider house money.4
36. David Gross and Nick Souleles (2002) found that the typical household in their sample had more than $5,000 in liquid assets (typically in savings accounts earning less than 5 percent a year) and nearly $3,000 in credit card balances, carrying a typical interest rate of 18 percent or more. Using the money from the savings account to pay off the credit card debt amounts to what economists call an arbitrage opportunity-buying low and selling high-but the vast majority of households fail to take advantage.
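For a sense of scale, a quick back-of-the-envelope calculation with the round figures from the quote (simple annual rates, ignoring taxes, minimum payments, and the need to keep some cash liquid):

```python
# Rough numbers from the quote: ~$3,000 of card debt at ~18%, savings earning ~5%
# (simple annual rates; everything else is an assumption for illustration).
card_balance, card_rate = 3_000, 0.18
savings_rate = 0.05

interest_paid   = card_balance * card_rate      # cost of carrying the balance for a year
interest_earned = card_balance * savings_rate   # what that cash earns if left in savings instead

print(f"Interest paid on the card:        ${interest_paid:,.0f} per year")
print(f"Interest earned on the same cash: ${interest_earned:,.0f} per year")
print(f"Net cost of not paying it off:    ${interest_paid - interest_earned:,.0f} per year")
```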
37. that one of the most effective ways to nudge (for good or evil) is via social influence.
38. Social influences come in two basic categories. The first involves information. If many people do something or think something, their actions and their thoughts convey information about what might be best for you to do or think. The second involves peer pressure. If you care about what other people think about you (perhaps in the mistaken belief that they are paying some attention to what you are doing-see below), then you might go along with the crowd to avoid their wrath or curry their favor.
39. We can see here why many groups fall prey to what is known as “collective conservatism”: the tendency of groups to stick to established patterns even as new needs arise.
40. The Spotlight Effect. One reason why people expend so much effort conforming to social norms and fashions is that they think that others are closely paying attention to what they are doing. If you wear a suit to a social event where everyone else has gone casual, you feel like everyone is looking at you funny and wondering why you are such a geek. If you are subject to such fears, here is a possibly comforting thought: they aren’t really paying as much attention to you as you think.
41. We are also greatly influenced by consumption norms within the relevant group. A light eater eats much more in a group of heavy eaters. A heavy eater will show more restraint in a light-eating group.
42. Priming. Thus far we have been focusing on people’s attention to the thoughts and behavior of other people. Closely related work shows the power of “priming.” Priming refers to the somewhat mysterious workings of the Automatic System of the brain. Research shows that subtle influences can increase the ease with which certain information comes to mind. Imagine playing a word-association game with Homer Simpson and you will get the idea. Sometimes the merest hint of an idea or concept will trigger an association that can stimulate action. These “primes” occur in social situations, and their effects can be surprisingly powerful. In surveys, people are often asked whether they are likely to engage in certain behavior-to vote, to lose weight, to purchase certain products. Those who engage in surveys want to catalogue behavior, not to influence it. But social scientists have discovered an odd fact: when they measure people’s intentions, they affect people’s conduct. The “mere-measurement effect” refers to the finding that when people are asked what they intend to do, they become more likely to act in accordance with their answers. This finding can be found in many contexts. If people are asked whether they intend to eat certain foods, to diet, or to exercise, their answers to the questions will affect their behavior. In our parlance, the mere-measurement effect is a nudge, and it can be used by private or public nudgers.

43. The nudge provided by asking people what they intend to do can be accentuated by asking them when and how they plan to do it. This insight falls into the category of what the great psychologist Kurt Lewin called “channel factors,” a term he used for small influences that could either facilitate or inhibit certain behaviors.

44. The three social influences that we have emphasized-information, peer pressure, and priming-can easily be enlisted by private and public nudgers.
45. The key point here is that for all their virtues, markets often give companies a strong incentive to cater to (and profit from) human frailties, rather than to try to eradicate them or to minimize their effects.
46. Suppose you are told that a group of people will have to make some choice in the near future. You are the choice architect. You are trying to decide how to design the choice environment, what kinds of nudges to offer, and how subtle the nudges should be. What do you need to know to design the best possible choice environment?
47. Benefits Now-Costs Later. We have seen that predictable problems arise when people must make decisions that test their capacity for self-control. Many choices in life, such as whether to wear a blue shirt or a white one, lack important self-control elements. Self-control issues are most likely to arise when choices and their consequences are separated in time.
48. Even hard problems become easier with practice. Both of us have managed to learn how to serve a tennis ball into the service court with reasonable regularity (and in Sunstein’s case, even velocity), but it took some time.
49. Generally, the higher the stakes, the less often we are able to practice. Most of us buy houses and cars not more than once or twice a decade, but we are really practiced at grocery shopping. Most families have mastered the art of milk inventory control, not by solving the relevant mathematical equation but through trial and error. *
50. Even practice does not make perfect if people lack good opportunities for learning. Learning is most likely if people get immediate, clear feedback after each try. Suppose you are practicing your putting skills on the practice green. If you hit ten balls toward the same hole, it is easy to get a sense of how hard you have to hit the ball. Even the least talented golfers will soon learn to gauge distance under these circumstances. Suppose instead you were putting the golf balls but not getting to see where they were going. In that environment, you could putt all day and never get any better.
51. When feedback does not work, we may benefit from a nudge.
52. stimulus response compatibility. The idea is that you want the signal you receive (the stimulus) to be consistent with the desired action. When there are inconsistencies, performance suffers and people blunder.
53. Consider, for example, the effect of a large, red, octagonal sign that said GO.
54. Expect Error. Humans make mistakes. A well-designed system expects its users to err and is as forgiving as possible. Some examples from the world of real design illustrate this point: • In the Paris subway system, Le Metro, users insert a paper card the size of a movie ticket into a machine that reads the card, leaves a record on the card that renders it “used,” and then spits it out from the top of the machine. The cards have a magnetic strip on one side but are otherwise symmetric. On Thaler’s first visit to Paris, he was not sure how to use the system, so he tried putting the card in with the magnetic strip face up and was pleased to discover that it worked. He was careful thereafter to insert the card with the strip face up. Many years and trips to Paris later, he was proudly demonstrating to a visiting friend the correct way to use the Metro system when his wife started laughing. It turns out that it doesn’t matter which way you put the card into the machine! • Another automobile-related bit of good design involves the nozzles for different varieties of gasoline. The nozzles that deliver diesel fuel are too large to fit into the opening on cars that use gasoline, so it is not possible to make the mistake of putting diesel fuel in your gasoline-powered car (though it is still possible to make the opposite mistake). The same principle has been used to reduce the number of errors involving anesthesia. One study found that human error (rather than equipment failure) caused 82 percent of the “critical incidents.”
55. Give Feedback. The best way to help Humans improve their performance is to provide feedback. Well-designed systems tell people when they are doing well and when they are making mistakes. Some examples: • Digital cameras generally provide better feedback to their users than film cameras. After each shot, the photographer can see a (small) version of the image just captured. This eliminates all kinds of errors that were common in the film era, from failing to load the film properly (or at all), to forgetting to remove the lens cap, to cutting off the head of the central figure of the picture.
56. Incentives. Our last topic is the one with which most economists would have started: prices and incentives. Though we have been stressing factors that are often neglected by traditional economic theory, we do not intend to suggest that standard economic forces are unimportant. This is as good a point as any to state for the record that we believe in supply and demand. If the price of a product goes up, suppliers will usually produce more of it and consumers will usually want less of it. So choice architects must think about incentives when they design a system. Sensible architects will put the right incentives on the right people.
57. We have sketched six principles of good choice architecture. As a concession to the bounded memory of our readers, we thought it might be useful to offer a mnemonic device to help recall the six principles. By rearranging the order, and using one small fudge, the following emerges. iNcentives. Understand mappings. Defaults. Give feedback. Expect error. Structure complex choices. NUDGES. With an eye on these NUDGES, choice architects can improve the outcomes for their Human users.
58. The decisions they do make will differ from those of Econs in two ways. First, they will be unduly influenced by short-term fluctuations, and second, their decisions are likely to be based on rules of thumb.
59. The lesson from the story of Vince and Rip is that attitudes toward risk depend on the frequency with which investors monitor their portfolios.
60. As Kenny Rogers advises in his famous song “The Gambler”: “You never count your money when you’re sittin’ at the table, / There’ll be time enough for countin’ when the dealin’s done.”
61. Even the most sophisticated investors can sometimes find the decision about how to invest their money daunting, and they resort to simple rules of thumb. Take the example of the financial economist and Nobel laureate Harry Markowitz, one of the founders of modern portfolio theory. When asked about how he allocated his retirement account, he confessed: “I should have computed the historic covariances of the asset classes and drawn an efficient frontier. Instead … I split my contributions fifty-fifty between bonds and equities.”2
62. Markowitz’s strategy can be viewed as one example of what might be called the diversification heuristic. “When in doubt, diversify.” Don’t put all your eggs in one basket. In general, diversification is a great idea, but there is a big difference between sensible diversification and the naive kind. A special case of this rule of thumb is what might be called the “1/n” heuristic: “When faced with ‘n’ options, divide assets evenly across the options.”3 Put the same number of eggs in each basket.
63. In a revealing study, university employees were asked how they would invest their retirement money if they had just two funds to choose from.5 In one condition, one of the funds invested entirely in stocks, the other in bonds. Most of the participants chose to invest their money half and half, achieving an asset allocation of 50 percent stocks. Another group was told that one fund invested entirely in stocks and the other “balanced” fund invested half in stocks and half in bonds. People in this group could also have invested 50 percent of their money in stocks by putting all their money in the balanced fund. Instead, they followed the 1/n rule and divided their money evenly between the two funds-ending up with mostly stocks. People in a third group were given a choice between a balanced fund and a bond fund. Well, you can guess what they did.
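The arithmetic behind the three conditions is worth making explicit. A small sketch of the 1/n rule (my own illustration, not code from the study): given the share of each offered fund that sits in stocks, splitting money evenly across the menu pins down the final stock exposure.

```python
# The 1/n ("divide evenly") heuristic applied to the three menus described above.
def one_over_n_stock_share(stock_fractions):
    """Overall share in stocks when money is split evenly across the offered funds."""
    n = len(stock_fractions)
    return sum(f / n for f in stock_fractions)

menus = {
    "stock fund + bond fund":     [1.0, 0.0],   # condition 1
    "stock fund + balanced fund": [1.0, 0.5],   # condition 2
    "balanced fund + bond fund":  [0.5, 0.0],   # condition 3
}
for name, fractions in menus.items():
    print(f"{name}: {one_over_n_stock_share(fractions):.0%} in stocks")
# -> 50%, 75%, and 25% in stocks: the menu, not the employee's risk preference,
#    ends up driving the allocation.
```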
64. We can take some general lessons from this analysis. When markets get more complicated, unsophisticated and uneducated shoppers will be especially disadvantaged by the complexity. The unsophisticated shoppers are also more likely to be given bad or self-interested advice by people serving in roles that appear to be helpful and purely advisory. In this market, mortgage brokers who cater to rich clients probably have a greater incentive to establish a reputation for fair dealing. By contrast, mortgage brokers who cater to the poor are often more interested in making a quick buck.*
65. We have covered a number of topics in this chapter, but the unifying message is simple. For mortgages, school loans, and credit cards, life is far more complicated than it needs to be, and people can be exploited. Often it’s best to ask people to take care of themselves, but when people borrow, standard human frailties can lead to serious hardship and even disaster. Here as elsewhere, government should respect freedom of choice; but with a few improvements in choice architecture, people would be far less likely to choose badly.
