Pursuit of Truthiness

my gut tells me I know economics

Archive for the ‘rationality’ Category

We Need Better Experts- and a Better Public


Experts are often wrong. Non-experts are even more often wrong.

These two statements are both true and important but it is hard to keep both of them in mind at the same time. Experts and members of the public both misunderstand this, and each ‘side’ can err by being either too humble or too arrogant:

[Figure: a 2x2 grid of expert failure modes- experts and the public can each err by being too arrogant or too humble]

COVID has provided many examples of all 4 failures. The expert institutions that were supposed to handle this best, the WHO and CDC, kept being arrogantly wrong throughout the first quarter of 2020- saying that there was no human-to-human transmission, that travel bans were unnecessary, that masks don't work, and in the case of the CDC producing a botched test kit while preventing the use of alternative tests. Throughout this time, much of the public, and even experts outside the institutions, humbly deferred.

At the same time, many people who had figured out what was really happening were too humble to speak out, or too scared, or simply didn’t think of it. In the second quarter of 2020, expert institutions gradually figured things out, but much of the public still arrogantly dismisses their advice.

There are a few reasons this isn’t easy to get right.

Generalizability- Sure, in early 2020 on COVID you would have been better off listening to the Bay Area technologists I follow on Twitter than to the WHO and CDC- but is that generally the best way to get health advice? How many people are capable of listening to both groups and evaluating their specific arguments to figure out who is right?

Bias- experts really are biased, but it isn't always obvious in which directions this matters, and members of the public have their own biases. For instance, are the public health agencies overstating the risk of COVID in order to get more funding and power, or understating it so as not to embarrass the governments that fund them? Both sound plausible in the abstract, and you could say either in order to justify your own biases. In this case I do think they understated risks due to political pressure, but it's also possible they simply weren't as smart as we, or they, thought they were.

Expert, compared to whom?- For quasi-experts, it's important to keep in mind what your audience knows and where their information will come from if not from you. Many economists get annoyed and don't answer when regular people ask them about the stock market- "I'm not an expert, that's not what economics is really about"- but I think this is a bad response. We're not true experts, but we should know much more than the average person, and be glad for the opportunity to share the basics (save through your 401(k) and put it in low-fee diversified index funds) and point them to the true experts. On the other hand, if you think your quasi-expertise means you can beat the market day-trading, you're likely to have a bad time.
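As an aside, here is a back-of-the-envelope sketch (all numbers invented for illustration) of why "low-fee" is doing real work in that advice: the same 7% gross return compounded for 30 years, under an index-fund-style fee versus an active-management-style fee.

```python
# A back-of-the-envelope illustration with invented numbers: fee drag
# compounds, so a seemingly small fee difference eats a large share of
# the final gain.
def final_balance(gross_return=0.07, fee=0.0005, years=30, start=10_000):
    """Grow a starting balance at (gross return - fee), compounded yearly."""
    return start * (1 + gross_return - fee) ** years

print(f"index fund (0.05% fee):  ${final_balance(fee=0.0005):,.0f}")  # ~$75,000
print(f"active fund (1.00% fee): ${final_balance(fee=0.01):,.0f}")    # ~$57,000
# The 0.95-point fee gap erases roughly a quarter of the 30-year gain.
```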

How can members of the public know who are the real relevant experts? This is often far from obvious. Should I trust epidemiologists at the CDC more or less than those at universities? When evaluating potential treatments should I most trust virologists, epidemiologists, medical doctors, or someone else? If I want to know how COVID will affect the economy should I ask epidemiologists, economists, or someone else? If economists, who or which subfield? For forecasting COVID cases, is the relevant expertise domain knowledge like epidemiology or is it general forecasting ability?

Getting experts to be less biased and to have the appropriate level of confidence in their abilities and predictions is vital. So is getting members of the public to know who the relevant experts are, and how much to defer to their judgement in various situations- yet none of this is explicitly taught.

A parting thought, paraphrasing Garett Jones' twist on an old William Buckley quip- "I'd rather be governed by the first 2,000 names in the Boston phone book than by the faculty of Harvard, but I'd rather be governed by the faculty of MIT than either."

Related reading:

Inadequate Equilibria, Eliezer Yudkowsky: all about this issue. "the single most obvious notion that correct contrarians grasp, and that people who have vastly overestimated their own competence don't realize: It takes far less work to identify the correct expert in a pre-existing dispute between experts, than to make an original contribution to any field that is remotely healthy."

“inside a civilization that is often tremendously broken on a systemic level, finding a contrarian expert seeming to shine against an untrustworthy background is nowhere remotely near as difficult as becoming that expert yourself. It’s the difference between picking which of four runners is most likely to win a fifty-kilometer race, and winning a fifty-kilometer race yourself. Distinguishing a correct contrarian isn’t easy in absolute terms. You are still trying to be better than the mainstream in deciding who to trust. For many people, yes, an attempt to identify contrarian experts ends with them trusting faith healers over traditional medicine. But it’s still in the range of things that amateurs can do with a reasonable effort, if they’ve picked up on unusually good epistemology from one source or another.”

Myth of the Rational Voter, Bryan Caplan: Argues that the public is rationally ignorant of issues (like public policy and politics) where more knowledge would not personally benefit them. But also shows that experts have systematically different political views than the general public, and showcases a statistical method to "correct" for this and show what economists would think on various policy issues if their income level and general political views matched that of a typical person.
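As a sketch of what that kind of correction involves (fabricated data and coefficients, not Caplan's actual method or numbers): regress stated opinions on an is-economist indicator plus income and ideology, then predict the average opinion with the economist flag switched on for everyone.

```python
# Sketch of an "enlightened preferences"-style correction on synthetic data.
# Everything here (variables, effect sizes) is invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
income = rng.normal(50, 15, n)          # thousands of dollars
ideology = rng.normal(0, 1, n)          # standardized left-right score
economist = rng.binomial(1, 0.05, n)    # 1 = trained economist
# Opinion on a 1-5 scale, e.g. "trade harms the economy" (5 = strongly agree)
opinion = (3.5 - 1.2 * economist + 0.01 * income
           + 0.2 * ideology + rng.normal(0, 1, n))

X = sm.add_constant(np.column_stack([economist, income, ideology]))
fit = sm.OLS(opinion, X).fit()

# Counterfactual: the public keeps its actual income and ideology,
# but the economist indicator is switched on for everyone.
X_cf = X.copy()
X_cf[:, 1] = 1.0
print("raw public mean opinion:       ", round(opinion[economist == 0].mean(), 2))
print("'as-if-economist' mean opinion:", round(fit.predict(X_cf).mean(), 2))
```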

Better to remain silent and be thought a fool…


“The worst thing that can happen to a good cause is, not to be skillfully attacked, but to be ineptly defended.” ― Frédéric Bastiat

Though Bastiat wrote in the 1800s, this point (like his other main points) still seems woefully under-appreciated today. So often I hear people defending one sort of idea by pointing out the weak character or arguments of the idea's opponents.

While this itself borders on fallacious reasoning, it seems to be simply how people work. Because of this, we should all consider from time to time whether the best thing we can do to advance our own ideas is to simply stay quiet, at least until we have thought more.

Dostoevsky has a great illustration of this idea through the character of Semyonovitch in Crime and Punishment:

Andrey Semyonovitch was an anæmic, scrofulous little man, with strangely flaxen mutton-chop whiskers of which he was very proud. He was a clerk and had almost always something wrong with his eyes. He was rather soft-hearted, but self-confident and sometimes extremely conceited in speech, which had an absurd effect, incongruous with his little figure. He was one of the lodgers most respected by Amalia Ivanovna, for he did not get drunk and paid regularly for his lodgings. Andrey Semyonovitch really was rather stupid; he attached himself to the cause of progress and “our younger generation” from enthusiasm. He was one of the numerous and varied legion of dullards, of half-animate abortions, conceited, half-educated coxcombs, who attach themselves to the idea most in fashion only to vulgarise it and who caricature every cause they serve, however sincerely.

Though Lebeziatnikov was so good-natured, he, too, was beginning to dislike Pyotr Petrovitch. This happened on both sides unconsciously. However simple Andrey Semyonovitch might be, he began to see that Pyotr Petrovitch was duping him and secretly despising him, and that “he was not the right sort of man.” He had tried expounding to him the system of Fourier and the Darwinian theory, but of late Pyotr Petrovitch began to listen too sarcastically and even to be rude. The fact was he had begun instinctively to guess that Lebeziatnikov was not merely a commonplace simpleton, but, perhaps, a liar, too, and that he had no connections of any consequence even in his own circle, but had simply picked things up third-hand; and that very likely he did not even know much about his own work of propaganda, for he was in too great a muddle….

Written by James Bailey

January 11, 2016 at 6:30 pm

Ignoring Economics: Tactics for Beginners and Advanced Practitioners


President Obama called for an increase in the minimum wage to $9 in last night’s State of the Union speech. A lot of economists will take this as a personal affront, wondering how people still think this is a good idea after we explain in every MicroEcon 101 class how it will backfire and result in poor people losing their jobs and losing non-wage benefits. If you are determined to support a minimum wage, you could simply ignore all these arguments, but this beginner tactic will leave you looking ignorant.

A more advanced tactic for not having to change your mind about the minimum wage allows you to know two things instead of none. You can know the Econ 101 arguments, and also know about Card and Krueger's 1994 empirical study showing how the minimum wage might not affect unemployment. Pull out your pocket copy of Card and Krueger's paper whenever someone brings up the topic.

Be careful, though, not to take this whole "acquiring new information" thing too far. Remember that your goal isn't to understand how the world works, but rather to keep the beliefs you started with. Don't develop a general rule of looking at the academic literature on a subject: this would lead you to do things like read other papers about the minimum wage, but the vast majority suggest problems with it. Don't decide that David Card and Alan Krueger are the most trustworthy economists- this would mean you need to take their other work seriously, and then you would have to change your mind about immigration or occupational licensing. Remember, reasoning works by starting with a conclusion you like, and then looking for information that supports it. Otherwise you might have to admit you were wrong!

Obviously this is my poor attempt at a joke. More seriously, as a researcher I worry that even when people do seem to be interested in your work, it is only because it confirms their prior beliefs. Alan Krueger is a great econometrician and managed to become an advisor to the President. This could be a great opportunity for his work to inform which policies to choose, but instead his work is either ignored or used as a decoration for policies that would be pushed anyway. So, despair.

Written by James Bailey

February 13, 2013 at 1:22 pm

Were the Roots of the Global Financial Crisis in our Business Schools?


No. Well, maybe, the proposition does sound like there is something to it, but after reading “The Roots of the Global Financial Crisis are in our Business Schools” I am actually much less convinced of this thesis than I was when I read the title. This is because the article throws a bunch of shit on the wall hoping some will stick, without even considering that some of the shit might be contradictory. But I think it is a good example of how populism that sounds reasonable at first will end up providing its own contradictions.

Two of the main arguments are that people in finance are too self-interested and care too much about profit maximization (and that they learned each of these in business school), and so don’t consider morality or what is good for society generally. Attributing the financial crisis to individual selfishness or to profit maximization might each sound good individually, but can’t work together, which is the whole point of principal-agent theory. Are people acting in their own best interest (work as little as possible, make as much money for yourself as possible, be neutral about what all this means for the firm), or are they acting in the firm’s shareholders’ best interest and trying to maximize profits? Given the actual history of the financial crisis, it seems more realistic to blame selfishness, since many of the firms involved spectacularly failed to maximize their profits, while the compensation structures based on short-term performance allowed many individuals to make a killing in the process of sinking their firms.

Another main bogeyman of the piece is Milton Friedman and the Chicago school of economics, who are claimed to have taken over business education (I wish!). They are blamed for teaching that individuals are rational and selfish utility maximizers, something which was actually in the textbooks of Alfred Marshall and Paul Samuelson well before the ascent of Chicago. More importantly, the very existence of many departments in business schools and financial companies demonstrates that the people involved do not believe in the standard economic model. How much marketing and advertising is based on the premise that the audience is rational? How many investment companies could justify their existence and huge salaries and fees if they really believed in the Efficient Markets Hypothesis, and believed that their clients were also rational and believed the EMH? Basically just index funds, which don't have the huge salaries and fees anyway. As an economist, it is very hard for me to believe that my kind has had a huge influence over business schools or business practice. Though I may be a bit biased as a member of a department which just left the business school, and to its credit this argument is more empirically wrong than actually contradictory.

The article also blames Milton Friedman specifically for arguing that a corporation’s sole goal should be to maximize profits and shareholder value, rather than also caring about “corporate social responsibility”. What corporate goals should be is up for debate, but I think it is hard to say that the problem was firms being too profit focused. Again, the problem was that many of these firms turned spectacular losses and failed, or would have failed without bailouts. There are many possible reasons for this, which have been discussed to death: compensation structures, previous bailouts and moral hazard, lack of regulation or poorly structured financial regulation, a giant pool of investment-seeking money caused partly by overly loose monetary policy in 2003-4, misregulation and massive subsidies for the housing sector- all these could have encouraged excessive risk-taking that led to giant losses. But it is hard to argue that firms lost money because they were focused on making money. Furthermore, some of the firms which exploded worst were actually known for “corporate social responsibility”.

But suppose the financial crisis was caused by the standard economics model and the business schools- what do we do about it? This article provides further evidence that an argument jumps the shark when its proposed solution is “more Holistic approaches”. This is the epitome of something which sounds vaguely nice but is actually meaningless. Even better, the article calls for “the rejection of the ideology of rational knowledge in favor of one that gives greater weight to experience, or to spirituality”. I couldn’t make this stuff up.

So, the people actually making this argument made themselves into strawmen. What would a better version look like? Is it possible to "steelman" the argument? I would start with the observation that students who have taken economics classes are more likely to defect in the repeated prisoner's dilemma, and their selfishness in this case leads them to worse outcomes than those who haven't taken economics. Then discuss the importance of cooperation in business, especially as work is done by larger teams. Finally and most importantly, look for empirical evidence that firms act differently based on the college major of their management, and see if firms with more business majors and MBAs actually take more risks and act less socially responsibly.
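On that first observation: a toy repeated prisoner's dilemma (standard textbook payoffs, not taken from the studies themselves) shows concretely why reflexive defection backfires once the game repeats.

```python
# Toy iterated prisoner's dilemma with the standard payoff matrix.
# Illustrates (with assumed payoffs) that two conditional cooperators
# outscore an unconditional defector over repeated play.
PAYOFF = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=100):
    """Repeated game; each strategy maps the opponent's history to a move."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_b), strategy_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # cooperate, then mirror
always_defect = lambda opp: "D"

print(play(tit_for_tat, tit_for_tat))      # (300, 300): sustained cooperation
print(play(always_defect, tit_for_tat))    # (104, 99): one cheap win, then ruin
print(play(always_defect, always_defect))  # (100, 100): mutual defection
```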

One major advantage of blaming business schools is that it can give new life to an old argument. After any financial crisis or other economic problem people tend to blame "greed". Economists will often reject this argument by asking, "why was there suddenly more greed now? The crisis started suddenly but human nature changes slowly if at all". But one possibility is that business schools have in fact been teaching people to be greedier. I'm sure one could make a reasonable case for this; but as we have seen, the worst fate for a position is not to be adeptly attacked, but to be ineptly defended.

Written by James Bailey

November 25, 2012 at 2:03 pm

Does Politics Actually Make Us Dumber?


It is sometimes said that while talking about politics the average person loses 15 IQ points. You could look for evidence of this effect with a classic priming experiment: people are randomly assigned to answer questions about either current political issues or something innocuous before taking an IQ test. See if the people who recently had politics on the brain did worse than those who didn't.

This test may already have been done inadvertently by someone trying to figure out which political groups have higher IQ scores, since the same experiment could also provide evidence on that front.

Bonus: If the experiment finds the predicted effect, calculate what the lengthening US election cycle may be doing to average effective IQ (and therefore GDP, etc.).
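If someone ran the experiment, the headline analysis would be a simple two-group comparison; here is a sketch on made-up data (which assumes, purely for illustration, that the rumored effect is real).

```python
# Sketch of the analysis on fabricated data: a two-sample t-test comparing
# IQ scores of a politics-primed group against an innocuously-primed control.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(100, 15, 200)  # answered innocuous questions first
primed = rng.normal(95, 15, 200)    # answered political questions first
                                    # (a 5-point deficit is assumed, not known)

t, p = stats.ttest_ind(primed, control, equal_var=False)
print(f"observed gap: {control.mean() - primed.mean():.1f} IQ points, "
      f"t = {t:.2f}, p = {p:.4f}")
```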

Written by James Bailey

August 12, 2012 at 4:49 pm

Explaining the Puzzle of Congressional Popularity


There is a seeming paradox in the fact that the US Congress is extremely unpopular (its current approval rating is 17%), while most individual members of Congress are reasonably popular (approval ratings more in the 50% range, with incumbents extremely likely to be re-elected). People like each of the parts but hate the whole.

The simplest way to resolve this paradox is to say that people are irrational, and as an economist I am ashamed to say that this was always my reaction when I heard these facts. But there is a good reason for the usual economist’s assumption of rationality: saying people are irrational often serves as a curiosity-stopper. You see something puzzling, but you just say that people are weird and dumb and you can stop thinking about it. But often it doesn’t take much more thinking to realize how people could be rational after all. Here are some possibilities in this case:

1. Different congressional districts have voters with different political beliefs. Congresspeople should usually have beliefs closer to their own district than someone representing another district would. Voters in Philadelphia should like their representative better than Congress as a whole because their rep is liberal while Congress is moderate.

2. An important specific case of (1) is that people know their representative is trying to bring them pork, while literally no other person in Congress is doing so; in fact the other Congresspeople are all trying to redirect pork away from my district and towards their own.

3. Congresspeople campaign and advertise heavily in their own districts, but very little in other districts. Congress as a whole does essentially no advertising (except, I suppose, putting up signs beside ARRA projects).

4. We may simply like individuals more than groups; perhaps you could call this a kind of irrationality. Certainly people dislike “corporations” but like almost every individual corporation.  Then again, some things probably poll better collectively- the military, the Supreme Court.  This is an interesting question in its own right.

I wonder if political science papers have succeeded in determining the importance of each explanation (and what other explanations they have advanced). One could get data on political beliefs of politicians and their districts to see how unpopular diverging from your district makes you (or see if Congress as a whole is more popular in more moderate districts). You could examine how much popularity congresspeople get from bringing more pork home (or being seen trying to do so). You could get at the individual-vs-group question by asking people what they think of specific congresspeople in other districts.
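The first of those tests is simple enough to sketch; on synthetic data (all variable names and effect sizes invented) it would look something like this.

```python
# Sketch of the first proposed test on synthetic data: regress a
# representative's approval on the ideological distance between the rep
# and their district. Effect sizes are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 435                                        # one observation per House seat
district = rng.normal(0, 1, n)                 # district ideology score
rep = district + rng.normal(0, 0.5, n)         # reps roughly track districts
distance = np.abs(rep - district)              # ideological divergence
approval = 55 - 10 * distance + rng.normal(0, 5, n)  # percent approving

fit = sm.OLS(approval, sm.add_constant(distance)).fit()
print(fit.params)  # intercept near 55, slope near -10 if divergence costs votes
```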

Written by James Bailey

June 14, 2012 at 12:57 pm

Politics is the Mind-Killer. Are Dark Arts the Solution?


Politics tends to make people much dumber, or at least much worse at discovering the truth.  There is a good reason for this, and Eliezer Yudkowsky said it well:

Politics is an extension of war by other means.  Arguments are soldiers.  Once you know which side you’re on, you must support all arguments of that side, and attack all arguments that appear to favor the enemy side; otherwise it’s like stabbing your soldiers in the back – providing aid and comfort to the enemy.

This is one of the reasons that politics is not about policy.  I wish politics were really all about figuring out what policies are best for everyone and implementing them.  But in reality people are not strongly attached to policies.  It is fun to see everyone change positions on the merits of the filibuster when there is a new majority party in the Senate.  Recently we have been treated to the spectacle of the Heritage Foundation and Mitt Romney disavowing the health reform that strongly resembles the policies they advocated and implemented, because this one was passed by Evil Democrats.  Similarly, all the Democrats who claimed to feel very strongly about the Wars on Afghanistan, Iraq and Terror when the Big Bad Republicans were doing it forgot all about this once their guy was elected and continued the same policies.  Sociologist Fabio Rojas found that attendance by Democrats at anti-war demonstrations fell by more than half after Obama was elected; ironically, this is when the protests may have actually had a better chance at changing policy.

Glenn Greenwald (the currently-ignored conscience of the Democratic Party, or indeed the US) made some great points in this vein recently in a controversial (1700+ comments) article:

Then there’s the inability and/or refusal to recognize that a political discussion might exist independent of the Red v. Blue Cage Match. Thus, any critique of the President’s exercise of vast power (an adversarial check on which our political system depends) immediately prompts bafflement (I don’t understand the point: would Rick Perry be any better?) or grievance (you’re helping Mitt Romney by talking about this!!). The premise takes hold for a full 18 months — increasing each day in intensity until Election Day — that every discussion of the President’s actions must be driven solely by one’s preference for election outcomes (if you support the President’s re-election, then why criticize him?).

You should probably just read the whole Greenwald article, it is a better version of this post.  Anyway, I’ve been thinking about this a lot as we start election season in the US.  How can we be involved in both truth-seeking and politics when they don’t naturally mix?

Two great public intellectuals, Paul Krugman and Tyler Cowen, have been openly discussing the merits of their very different rhetorical strategies.  Krugman has a more political style, implying and often outright saying that people who disagree with him must be idiots.  Cowen is more intellectual and abstract.  Cowen implies that Krugman’s style lacks virtue and integrity:

“The issue is not that Krugman changed his mind (I’ve done that plenty, Alex too).  The issue is that Krugman a) regularly demonizes his opponents, including those who hold Krugman’s old positions, and b) doesn’t work very hard to produce the strongest possible case against his arguments….. There is a kind of hallelujah chorus for Krugman on some of the left-wing economics blogs.  The funny thing is, it’s hurting Krugman most of all”

This is a new chapter in an ancient debate; it reminds me of Socrates calling out the Sophists.  Socrates was a purist who thought we should seek truth and the good, while the Sophists realized that their rhetorical Dark Arts could win them money and influence.  Indeed, Krugman’s response is essentially that his style brings him influence, and you can’t argue with success:

I realized that I also wanted to say something in response to the concern trolling, the “if you were more moderate you’d have more influence” stuff. Again, this amounts to wishing that we lived in a different world. First, there is no such thing in modern America as a pundit respected by both sides. Second, there are people writing about economic issues who are a lot less confrontational than I am; how often do you hear about them? This is not a game, and it is also not a dinner party; you have to be clear and forceful to get heard at all.

Basically, Krugman is saying "If only you knew the Power of the Dark Side".  But is this strategy really so powerful?  I may not be representative, but it certainly loses me.  I can't help but notice how Krugman is trying to reframe the debate (and distort Cowen's argument) in almost every sentence.  Cowen isn't concern trolling because he practices what he is preaching; and he advocates a moderate tone, not moderate positions.  Krugman sets up false dichotomies: being civil means you aren't clear and forceful, you can only be respected by one of two sides.  This last may be the most crucial: Krugman assumes that all discussion takes place where there are opposing sides (and only two of them!) and no respect across them.  This puts him squarely in the arguments-are-soldiers camp.

But is Krugman’s strategy effective in general?  He certainly has a large audience, which is valuable (in terms of income and status).  It is hard to say whether this results in much influence on policy though.  It is hard in general to determine the effect of individuals on policy, but I can’t think of a single issue where Krugman was a leader in getting a policy changed.  Cowen previously argued that most intellectuals, including Krugman (and himself), have little real influence.

Can we become more influential through the use of Dark Arts in general, and ridiculing the other side in particular?  I think the jury is still out here.  Milton Friedman was influential while being unfailingly civil and assuming the best of his opponents, and I think that this helps his work stand the test of time.  It is easier for people to adopt your position if this doesn't mean they were being idiots before; as Brad DeLong said (ironically, since he is guilty of the same vice) "No, Paul, No! You Don't Slaughter the Returning Prodigal Son, You Slaughter the Fatted Calf!!"  The easiest way to convince people is not to change their mind, but to convince them they agreed with you all along; this is easier if you don't call them names.

Even if the Dark Arts do confer the power to influence and persuade, it is likely that they erode the ability to find the truth.  In theory you could have a persuading public persona and a truth-seeking private persona with different beliefs.  But to reduce cognitive dissonance people must start believing their own propaganda.  Further, the best way to learn is often through open, honest debate that cuts to the heart of the issues; you don’t learn by beating up strawmen.  The power of the real Dark Arts, just as in so much literature and mythology, comes with a great cost.

Update: One big thing I missed in this post is the need to know your audience; there is no one form of argument that is most convincing to everyone.  My implicit assumption (and that of Kantoos, who made me realize this) is that you should target the median reader out there, just as politicians should target the median voter.  But of course, in primary season you do not target the median of the whole electorate.  Krugman is in a perpetual primary season.  He is not trying to convince the average reader that his “side” is better, but to educate, entertain and radicalize those already on his “side”, by noting that the other side is not even worth considering.  This should have been obvious since he named his book and blog “Conscience of a Liberal”, but I do catch on eventually.

Written by James Bailey

January 5, 2012 at 12:06 am

Berkeley Students Are So Conservative


They, like most people, are small-c Burkean conservatives about life in general.  They have a strong status quo bias, but rather than admit this like Edmund Burke, they feel compelled to invent reasons why status quo things are good.  Berkeley psychology prof Seth Roberts said “Most of my students, for better or worse, were very conformist. My conclusion…. is that the reasons we give for our beliefs have roughly zero correlation with the actual reasons and shouldn’t be taken seriously (e.g., argued with).”  Robin Hanson said the same about George Mason students:

  • Ask random colleges student random policy questions and they will feel compelled to come up with opinions.
  • Ask them for reasons for those opinions and they’ll feel compelled to come up with such reasons.
  • Such opinions strongly tend to support the status quo – mostly whatever is, is assumed good.
I am thinking along similar lines today after discussing organ markets with my students.  Students say that legal markets in human organs would be bad mainly because they would lead to organ theft.  Even supposing there would be more organ thefts, it is hard to imagine there would be enough to outweigh the deaths of 9000 Americans every year caused by our current ban on organ sales.  If people were used to a functioning market in organs, I have to think they would be horrified by someone saying we should ban organ sales and consign thousands to death in order to reduce theft, just as it would seem crazy to ban day-laboring to protect laborers from employers who stiff them after a day's work (stealing is already illegal!).  It is easier to think there must be a good reason for the status quo, that we live in the best of all possible worlds and aren't doing something horrible.  Indeed, there is more right with the world than wrong with it; there is a reason status-quo-biased people continue to survive and thrive.  Further, it is dangerous to think that those who disagree with you must do so out of some ignorant bias; call this the "bias bias".

In general though, if we are trying to figure out the truth, we have to fight pro-status-quo bias more often than its opposite.  The reason for this is wired into our brains: our dominant trait is to rationalize, not reason.  One part of our brain is dedicated to coming up with a reason for anything, whether it makes sense or not.  In extreme cases, paralyzed people can come up with all sorts of reasons to explain why they aren't really paralyzed; their brain is acting as an apologist for what is done, not a reasoned truth-seeker (Seriously, check out that link- it is way more interesting than my post, even if you have heard of the phenomenon before).

I am optimistic about getting people who think of themselves as non-conformist or politically liberal to consider new ideas by telling them they are being conservative conformists.  Put name-calling to good use!

Written by James Bailey

October 17, 2011 at 3:00 pm

Placebos, Utilitarianism and Truth


I finally got around to reading Predictably Irrational, and the chapter on placebos got me thinking.  The chapter describes how some surgeries were found to be no more effective than “placebo surgery”, when doctors told patients they would do the surgery, gave them anesthetics and made incisions but didn’t actually perform the part of the surgery that was supposed to be effective.  The usual response when a treatment is proven to be no more effective than a placebo is to stop doing it, or to claim the study was flawed.

But if a placebo is effective (and they are often quite effective), perhaps we should continue giving them.  If placebos require false belief on the part of the recipient, to what extent is it ok for the scientific and medical establishment to deceive people, or at least not expend effort discrediting placebos?

I know this isn’t exactly a novel question, but I haven’t put much thought into it and the answer is not obvious to me.  Like many other who think of themselves as “rationalists”, I am mostly a utilitarian but I put a value on truth that is likely out of proportion to that which can be justified on purely utilitarian grounds.  My modus operandi is to be truthful without even making utilitarian calculations, and even if I made them and they pointed to deception I would likely decide to be a single-issue deontologist.

This tension goes back to the beginning of both utilitarianism and classical liberal truthiness, since JS Mill helped come up with both ideas.  He tried to square the circle and argue that there was no conflict.  Today people acknowledge the conflict but I have not read a good solution to it.  I believe Robin Hanson and Eliezer Yudkowsky have said something like “the conflict exists, I take the side of holding truthfulness as a value in itself but I cannot fully defend this position.” (except for mundane dishonesty)

I guess that’s where I am now too.  However, I do wonder if rationalists should spend so much effort trying to convince people that, say, homeopathy is quackery.  If people turn to homeopathic remedies in lieu of modern medicine when there is a real treatment available, that is certainly bad.  However, in the areas where modern medicine does little better than a placebo, homeopathy is likely to provide a much cheaper placebo.

This issue comes up in economics as well.  Some macroeconomic remedies may return the economy to prosperity by fooling people.  Rational expectations theory argues against this by saying that the government is incapable of fooling markets.  However, provided that it could, economists face a dilemma where telling people the truth about what government policy is doing could make the country poorer.

This conflict comes up in politics all the time.  Is it ok to use dishonest tactics to get better policies adopted?  Like end-justify-the-means problems generally, much of the problem is due to the fact that everyone considers their own ends to be worthy, but for many reasons their ends would not in fact increase total utility.

This is part of why I say err on the side of truth, but I cannot really defend this position.

Written by James Bailey

August 31, 2010 at 5:03 pm

Who needs rationality?


It seems that irrationality is everywhere.  Economics long assumed that people are perfectly rational, but now behavioral economics is the hottest sub-field and Daniel Kahneman got a Nobel prize for showing how people make biased decisions.  Even so, there is no shortage of criticism of the remaining reliance on rationality in economic models.  Books like Nudge: Improving Decisions About Health, Wealth, and Happiness and Predictably Irrational: The Hidden Forces That Shape Our Decisions become bestsellers.  A smart, earnest community of people who want to become more rational has grown up around websites like Less Wrong and Overcoming Bias.

In the face of all this, Fast and Frugal: The Tools of Bounded Rationality (a part of this book) argues that people are actually excellent decision-makers.  The author concedes that people rarely act as perfect Bayesians who do constrained optimization of their utility function.  However, he claims that the heuristics that ordinary people use actually perform very well in most environments.  These heuristics work more quickly and with less information than the corresponding "rational" method.  The chapter includes criticisms of many of the studies showing people are biased, as well as other studies showing examples of heuristics with good predictive power.
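To give a flavor of these heuristics, here is a sketch of "take-the-best", a well-known heuristic from this literature: compare two options one cue at a time, in order of cue validity, and let the first cue that discriminates decide, ignoring all remaining information. (The cities, cues, and validities below are invented.)

```python
# Sketch of the take-the-best heuristic: decide which of two options scores
# higher on a criterion using only the first discriminating cue.
def take_the_best(option_a, option_b, cues):
    """cues: list of (cue_function, validity); cue_function maps option -> 0/1.
    Checks cues from most to least valid; the first difference decides."""
    for cue, _validity in sorted(cues, key=lambda c: -c[1]):
        a, b = cue(option_a), cue(option_b)
        if a != b:                       # first discriminating cue decides
            return option_a if a > b else option_b
    return None                          # nothing discriminates: guess

# Toy question: which of two (made-up) cities is larger?
cities = {
    "Springfield": {"has_airport": 1, "has_university": 1},
    "Shelbyville": {"has_airport": 0, "has_university": 1},
}
cues = [
    (lambda c: cities[c]["has_airport"], 0.8),      # more valid cue checked first
    (lambda c: cities[c]["has_university"], 0.6),
]
print(take_the_best("Springfield", "Shelbyville", cues))  # -> Springfield
```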

Some questions stand out to me:

1.  To what extent are markets an environment in which these heuristics perform well?  As we find out what heuristics people actually use, economists can build models based on them; to some extent this is being done already with the general idea of “bounded rationality”.  If some markets are not a friendly environment for these heuristics, how can public regulation or entrepreneurial information-providers improve them?

2.  What are the arbitrage opportunities?  The investment funds of behavioral economists have not been especially successful as far as I know, but perhaps the fund that really figures out how people apply a heuristic will be, or perhaps there is another market where this would work.  Of course, casinos and credit card companies seem to have done this already.

3.  In some situations (in markets or otherwise) there are returns to cognitive diversity.  People with less common heuristics or who are proper rationalists may succeed because they are doing something different rather than because their strategy is strictly dominating per se.  Of course, the reverse may also be true in other situations and heuristics could pull everyone to a bad equilibrium.

4.  Heuristics may work well for people in most environments, but our environments are becoming increasingly different from those we evolved to succeed in.  More and more decision-makers are not human, and the question of how to program them is a very different one from the question of how humans should think.

HT: this Less Wrong post; it is good that they consider arguments which seem to undermine much of their project.  Of course, for all the people working on AI the computer exception is a major one.

I realize my writing style is trying to imitate Tyler Cowen's.  This attempt is mostly unconscious and is doomed to failure as I can never be so succinct.

Written by James Bailey

January 26, 2010 at 2:09 pm