Whether you’re getting it off your chest, venting, expressing yourself, airing your feelings or “just being honest,” the truth about honesty is that honesty is not always the best policy.
What’s more, continuing on the path of full disclosure can actually put a permanent closure on your relationships!
The reality is we don’t need knives or guns to mortally wound those closest to us. Words cut like knives and it’s easy to bury your relationship with the verbal cuts of a “truthful” tongue.
The truth is honesty is often a veiled form of self-indulgence.
What do I mean by self-indulgence?
In a nutshell, when feelings build up, it’s frustrating to “sit” on them. And, of course, it feels damn good to release them. That feel good sensation is a form of gratification. It’s like taking an emotional poop, which provides an instant release of pressure. But when we dump emotional turds on others, we are flushing our relationships down the toilet.
It’s a sad fact that our education at home and in school doesn’t include teaching us how to manage our angry feelings. Since intimate relationships trigger negative feelings, this means most of us are mistreating the people we love most by lashing out and even verbally killing those we supposedly love in various overt and symbolic ways.
In my book Till Death Do Us Part (Unless I Kill You First), soon to be republished by Hay House under the title Kiss Your Fights Goodbye: Dr. Love’s 10 Simple Steps to Cooling Conflict and Rekindling Your Connection, I talk about what I call Fight Traps, which are the dysfunctional ways that humans act out anger. These Traps consist of Open Warfare, such as Name Calling, Character Assassination, Put Downs and Sarcasm, to name but a few, and Secret Warfare, such as Silent Treatment, I Forgot, Recruiting Allies, and so on.
The point here is there’s a continuum of dumping that ranges from outright physical violence on the one end of the spectrum to far subtler forms of aggression—honesty being the subtlest of all forms of assault.
While we may feel temporarily relieved when we shoot off rounds of honesty, we pay a terrible price for this temporary satisfaction, as we harm our relationships and our own self-esteem (you can’t feel proud of yourself when you misbehave).
The good news is you can make the decision to change the way you handle your angry feelings: to consider what you say before you speak, to ask yourself how the other person will feel before you say or do x, y, or z, and to consider whether what you intend to say or do will be helpful and constructive to the other person and your relationship.
It's also important to remember that anger is never the primary emotion. When we become angry, it’s because we feel other more basic and vulnerable feelings such as hurt, sadness and fear.
Making the decision to get beneath the veneer of anger and speak from the most vulnerable part of your emotional core is the ticket. When you honestly speak from this place, you arouse a feeling of empathy rather than antipathy. This one simple shift is your secret to turning conflict into connection by fostering a truly intimate and loving relationship that is based on the right kind of honesty.
by Brian Tomasik
First written: 7 Sept. 2013; last updated: 7 Jan. 2018
I personally value honesty on emotional grounds, such as the tendency of honesty to build strong intimacy with others. But I also think honesty is an essential policy for altruists to adopt in their outward-facing work. Some of the reasons include improved credibility, capacity to make promises, protection against lie detectors and surveillance, common-sense heuristics, concession to other value systems, and guarding against overconfidence. While it seems clear that outright lying to advance an altruistic cause is almost never wise, there remain shades of gray in deciding when and how much omission of facts can be justified.
--Dogberry, Much Ado About Nothing, Act 3, Scene 3
Personal feelings on honesty
When I was in 7th grade, my sister and I purchased a new video game. Historically I had tended to be the first one of us to play through the video games that we bought, so this time my sister decided she wanted to play the new game first. I enjoyed watching her do so, but I was itching to play it myself and had a hard time waiting. So, sometimes when my sister wasn't home, I plugged in the game and played it on my own. It was fun, but at one point I made progress that my sister hadn't made herself. The next time she played the game, she found that part already accomplished. "Wait, I didn't do this," she said, puzzled. I replied: "Oh, yes you did. Don't you remember?" In a nervous silence, I hoped she would take my word and move on. Instead, she saw through the deception: "You've been playing this, haven't you?" My face flushed red, and I had to admit that she was right. Not only had I played the game without her permission, but I had lied about it too. I felt really embarrassed for a long time afterward.
This highlights a general phenomenon I find with lying: Getting found out is so embarrassing that usually I'd rather just tell the truth and be done with it than tell a lie and have some risk of being discovered later. I find holding a lie to be emotionally draining; it's not worth having the sword of Damocles hang over your head. And it's also a lot of extra work: You have to maintain two histories of reality to refer to depending on the occasion (or N histories for N-1 different lies). As Eliezer Yudkowsky observes in "Protected From Myself":
The honest Way often has a kind of simplicity that transgressions lack. If you tell lies, you have to keep track of different stories you've told different groups, and worry about which facts might encounter the wrong people, and then invent new lies to explain any unexpected policy shifts you have to execute on account of your mistake.
There's a further personal cost to deception, which is a kind of opportunity cost: You forgo the chance to engage fully intimately with others. When I struggle with an emotional situation, often the best thing I can do is come to friends and tell them about it. Sharing everything that's going on allows for catharsis, in addition to being a fruitful way to get advice on next steps. The act of baring yourself to others opens the gateway for emotional intimacy more strongly than almost anything else, and with that intimacy come the comfort, security, and joy of closeness between people. Dishonesty makes this more difficult, and moreover, when you have intimacy, you have less need to be dishonest: You don't need to hide your problems because you can share them with others who love you unconditionally and will help you through whatever you did wrong. In Christianity, this looks like confessing your sins to Jesus. Of course, it works with regular people too. And often it helps to confess not just to your close friends but also to whomever you wronged: Admitting what you did openly is often the first step toward reconciliation.
I suppose some people share emotional intimacy with a few friends but then remain more distant with most of the rest of the world and might be willing to lie in the latter case. Perhaps they've learned the hard way that being emotionally intimate with too many people can lead you to get burned. I've personally been fortunate to have grown up in environments where openness about my experiences was almost uniformly welcomed, and as a result, I'm not shy about sharing deep thoughts and feelings. Relatedly, I show little embarrassment adopting unusual stances or even seeming to make a fool of myself in public, such as by stopping to move worms off the sidewalk. These traits help me to be unembarrassed by openness, which reduces my need to lie. I can see that not everyone will feel the same way about being so open on a personal level. Still, I think the general principle of honesty is not dependent on also being willing to be emotionally open; it's just that emotional openness can make honesty easier.
Honesty as an effective policy
So far I've explained why I incline toward honesty on a personal level, but I haven't shown why honesty is also valuable if someone has a very different emotional disposition. Moreover, even if honesty feels good, maybe utilitarians are still obligated to violate it for practical reasons?
Here are some ways in which not being dishonest or misleading can be very important purely from an altruistic standpoint:
Risk of being exposed. In a world where people are tempted to lie for short-term advantage, other people respond by social punishment of deception. At a macro scale this happens through news exposés of scandals, watchdog organizations that hold politicians accountable for past statements, advocacy groups that launch campaigns against mendacious companies, and prosecution of lying under oath. On a smaller scale, gossip, local news, and social networks serve to bring lies to the surface. In addition to potential direct legal punishments, the long-term and often more severe impact is damage to public relations (PR). This is why it's often the case that "honesty is the best PR."
What about cases where you know you'd never get caught? First of all, there are many examples where people thought they'd never get caught and then did. If those who genuinely care about truth-telling admit an unflattering truth even when it seemed no one would know, this earns them major credibility points.
Also note that the risks of being caught lying may be higher for altruists. If you're an individual who commits a crime, the upper bound on the cost is some period of punishment for yourself. If you're an advocate for a cause who commits the same crime, you may tarnish your whole movement. In addition, altruists are often held to higher standards of integrity than, say, politicians or salesmen.
Credible commitments. A rule-utilitarian approach of always being honest in spite of seeming temptations to the contrary in particular situations may prove more robust in the long run, since a person adopting this stance can credibly commit to keeping promises in situations where it really matters. A friend once told me: "Utilitarians should become virtue ethicists." While not strictly true, there's a deep insight contained in this claim.
Eliezer Yudkowsky, "Prices or Bindings?":
There's a proverb I failed to Google, which runs something like, "Once someone is known to be a liar, you might as well listen to the whistling of the wind." You wouldn't want others to expect you to lie, if you have something important to say to them; and this issue cannot be wholly decoupled from the issue of whether you actually tell the truth. If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
Simplicity. Again quoting "Protected From Myself":
We don't live in a righteous universe. And so, when I look over my history, the role that my ethics have played is so important that I've had to take a step back and ask, "Why is this happening?" The universe isn't set up to reward virtue--so why did my ethics help so much? [... One reason is that] simplicity is powerful enough to explain a great deal of the positive influence [...].
Countersignaling. The fact that you're willing to acknowledge unflattering facts can actually boost your credibility because it shows you as that much more trustworthy. Often people make a good impression and present only the favorable information as a way to distinguish themselves from those who aren't sophisticated enough to hide the ugly truths or who have too many ugly truths for them all to be hidden. However, those who don't need to worry about being confused for criminals can adopt a countersignaling approach: Being completely honest about their flaws as a way to distinguish their integrity above those who put on a good show. GiveWell is an example of this: On a prominent page of their website they discuss their mistakes, including a major astroturfing scandal in 2007 that other organizations might instead have tried to keep quiet. That GiveWell makes these details available gives some confidence that they are interested in truth rather than self-promotion.
Of course, sometimes countersignaling doesn't work, and these are cases in which you'd probably want to push toward non-deceptive omission of unflattering details rather than calling them out in the open. Whether and how much this is dishonest may depend on the circumstances.
Lie detectors. As lie detectors become more accurate in the future, there may be a greater premium placed on those who tell the truth and have told the truth historically. This is a rather speculative point, so I consign further discussion to the appendix to avoid giving the impression that my argument in this essay depends on lie-detector technology being widely used in the future.
Surveillance. Already many things that we say and do are captured electronically, from Internet communications to recordings by hidden cameras, and we should expect this trend to continue going forward.
Common-sense heuristics. Common sense usually regards honesty as important. We should be pretty confident we know better than this heuristic before deciding to act duplicitously. Many people think they know better than common sense only to find out otherwise.
From Eliezer Yudkowsky's "Ethical Injunctions":
If you've truly understood the reason and the rhythm behind ethics, then one major sign is that, augmented by this newfound knowledge, you don't do those things that previously seemed like ethical transgressions. Only now you know why.
Someone who just looks at one or two reasons behind ethics, and says, "Okay, I've understood that, so now I'll take it into account consciously, and therefore I have no more need of ethical inhibitions"--this one is behaving more like a stereotype than a real rationalist. The world isn't simple and pure and clean, so you can't just take the ethics you were raised with and trust them. But that pretense of Vulcan logic, where you think you're just going to compute everything correctly once you've got one or two abstract insights--that doesn't work in real life either. [...]
I am completely unimpressed with the knowledge, the reasoning, and the overall level, of those folk who have eagerly come to me, and said in grave tones, "It's rational to do unethical thing X because it will have benefit Y."
Compromise with other value systems. Deontology, virtue ethics, religious ethics, and other views place high value on honesty even in cases where telling the truth seems to have bad consequences. By acting honestly, we make a concession to people who hold these value systems, as a result of which they may be more willing to grant concessions to us in return. (Of course, there may be exceptions, like if Nazis knock on your door and ask if you're hiding Jews in your attic.)
Combatting tragedy of the commons. The social costs of lying may be considerable, and if so, lying could be seen as a sort of tragedy of the commons: Holding everyone else's actions fixed, you may gain advantage by lying, but since everyone reasons this way, everyone is worse off as a result. From an altruistic perspective, there's potentially some value in being part of the movement against this trend.
Truth seeking benefits from truth telling. Altruistic arguments for lying sometimes go like this: "I know that policy X is best to reduce suffering. But most people don't agree with policy X because they're biased and uninformed. Therefore, it's altruistically required that I lie about the arguments for policy X so that more people will support it, so that far fewer animals will suffer." But there's a problem: Is it true that most people don't accept policy X because they're irrational and stupid? Or is there actually a good reason for their hesitation? Taking epistemic modesty seriously suggests that maybe you should be glad that others are opposing policy X, as a guard rail against your prematurely doing something that, if you reflected more, you'd realize was harmful. Ultimately your values benefit by knowing the truth, and it's easier to reach the truth when you exchange truthfully. You'd better be pretty darn sure you're right and others are wrong before you deceive them (or even omit crucial information) in order to advance your position. Rather than deceiving others, you should use the disagreement as an opportunity to test your arguments, and let the best arguments win.
Now, this point only goes so far. It's possible that others disagree with you on policy X because they actually are more irrational, or alternatively because of a difference in values rather than a difference in facts. To return to the Nazi example, if policy X is "you shouldn't look in our attic," you are right to tell the Nazis that no one is there, rather than admitting that Jews are hiding and second-guessing whether you're wrong to want to keep them hidden. There are many other, less dramatic examples where you predictably have values in opposition to those of other people, and if you honestly admit a fact, others may use that information to do exactly what you oppose. These cases are tricky, and they may sometimes constitute grounds for at least omission of information. But I also think we should treat these cases as relatively rare, in part as a way of counteracting our own bias to think we're more right than we likely are, and in part because of the other positive spillover effects of a more open policy.
Intellectual intimacy. Just as being honest with those personally close to you can forge a rich emotional connection, so being close with intellectual peers can build bridges of mutual transparency with them. This allows you to learn more action-relevant insights than if they didn't trust you as a source of useful information and revealed less to you.
Shades of gray
It seems fairly plausible that outright lying is rarely a good idea, with exceptions like the Nazi case. The harder questions are what to do with various gray areas. If you make a mistake, should you necessarily admit it openly, or is it okay to keep some things to yourself? If there's an argument or datum that would make someone oppose your position more, are you obligated to state it? How should we handle information hazards?
I tend to think it's good to bring up information against your position when it's relevant, although you aren't obligated to go out of your way to publicize it. It's also good to bring up your failings in contexts where people need to know that information, but it's okay to keep quiet otherwise. In general, the case for revealing something is stronger when short-term harm would result if it weren't revealed, but that criterion alone is not sufficient, because this kind of "ends justifies the means" thinking breaks down in the case of lying, where we need a rule stronger than preventing expected harm to ensure long-term good consequences. Lying for seemingly altruistic reasons is about preventing expected harm in your eyes, but in practice it seems not to be a good idea. Omission is less drastic, but similar sorts of tradeoffs between avoiding short-term harm and committing to a long-term valuable policy remain.
No one is perfect
Lying doesn't make you a bad person. I think it's good to try to be honest as much as possible, with an emphasis on those cases that matter most altruistically, but people differ in how hard this is, and everyone probably wishes s/he had done better at some point or other. We do the best we can.
Why does lying persist?
An interesting objection to my arguments for honesty is the following outside-view perspective: Lying is very common in society, so it must be serving well those who do it. Indeed, a number of animals lie, like the mimic octopus; even brainless plants can be deceptive.
Certainly honesty is not always the best policy, but it becomes more favored when
- there are repeated interactions with other agents
- agents have powerful tools to assess your veracity
- society places a high premium on truthfulness
- you're aiming to improve the world rather than just, say, make money or have sex.
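The first two conditions can be made concrete with a toy expected-value model (my own illustration; the function name, payoffs, and probabilities are assumptions for the sketch, not from the essay): a lie pays a bonus each round, but once detected it permanently destroys trust, so honesty pulls ahead as interactions repeat and detection tools improve.

```python
# Toy model: expected payoff of lying vs. honesty over repeated interactions.
# An honest agent earns a steady payoff; a lying agent earns a per-round bonus
# but, once caught, loses all future business with a reputation-tracking partner.

def expected_payoff(rounds, lie=True, honest_value=1.0, lie_bonus=0.5, detect_prob=0.2):
    """Expected total payoff over `rounds` interactions.

    An honest agent earns `honest_value` every round. A lying agent earns
    `honest_value + lie_bonus` per round, but each round the lie is detected
    with probability `detect_prob`, after which all future payoffs are zero
    (no one deals with a known liar).
    """
    if not lie:
        return honest_value * rounds
    total = 0.0
    survive = 1.0  # probability of not yet having been caught
    for _ in range(rounds):
        total += survive * (honest_value + lie_bonus)
        survive *= 1 - detect_prob
    return total

# One-shot interaction: lying wins (expected 1.5 vs. 1.0).
print(expected_payoff(1, lie=True), expected_payoff(1, lie=False))
# Fifty repeated interactions: honesty wins decisively.
print(round(expected_payoff(50, lie=True), 2), expected_payoff(50, lie=False))
```

With these numbers the liar's expected total is capped near (honest_value + lie_bonus) / detect_prob = 7.5 no matter how many rounds are played, while the honest agent's total grows without bound. Raising `detect_prob` (better tools for assessing veracity) lowers the liar's cap further, matching the second condition above.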
As an illustration of the last point, a skillfully dishonest sexual partner could in some ways be attractive because, if the trait is heritable, then offspring produced by that partner should benefit from similar skill.
Changing the world requires persuading people of your position, and hence credibility matters more there than in the realm of private consumption. Of course, even in the realm of persuasion, there are plenty of sticky memes based on lies. In the long run, though, these should be exposed, and there may be backlash against them as a result of the deception they entailed. For example, those who become vegan due to the promise of a miracle diet will likely be disappointed. Lying seems particularly likely to backfire if your target audience is well educated and skeptical, which are attributes that describe many effective altruists.
Anyway, this is not a complete response to the outside-view objection, and there remains work to be done in assessing whether it has merit and, if not, why not.
Not all lies are bad
In this piece, I'm mainly arguing against the idea held by some naive utilitarians that the ends justify the means, so lying to advance your altruistic agenda is sometimes called for. I'm not arguing that all forms of lying in life are always bad. Context matters, and rigid adherence to any single principle is probably unwise.
Some examples where not being 100% truthful seems acceptable or even desirable:
- White lies. I think it's fine to use white lies, at least in moderation. The classic example is that if you're asked "Does this dress make me look fat?" you're allowed to say "No." I don't often need to use white lies, but when I do, it's usually to avoid making someone upset ("Did you like my paper?" "Yeah, it wasn't bad.") or when the question is not necessarily meant literally (if someone asks "Is it a bother for you to help with this?" I might say "No" even if it is somewhat of a bother because I mean that I'm willing to help and don't want the person to feel bad for asking).
- Understood imprecision. The quotes I used in the story about my sister's video game at the start of this piece were obviously not exact. I recall some exchange like that happening, but I don't know if it played out exactly as I described. I'm not even 100% sure I was in 7th grade at the time, but that's my best guess for the year. Sometimes the last update date on my essays is off by a few days. And so on. Readers can expect these sorts of imprecision, and storytellers have poetic license. Some errors just don't matter enough to fix.
- Private details. If a friend asks you to keep personal details secret, it's important to do so, even if you might need to hide this knowledge in other contexts through omission or maybe mild active deception.
- Cases where government is oppressive. For example, drug laws in the USA are tyrannical. Possession of marijuana, which in many ways is less harmful than alcohol, can get you arrested. Narcing on your friends for enjoying some weed because of a theoretical commitment to honesty is a jerk move.
Fairness in argumentation
In 2012, I wrote a forum post arguing that the expected value of the far future is plausibly negative even for utilitarians who think moderate amounts of happiness can morally outweigh suffering. While I do think it's the case that many plausible future scenarios could be quite negative in total value even for standard, non-suffering-focused utilitarians, my presentation of the argument was somewhat lopsided, in that I omitted mention of plausible ways in which the future could also contain enormous amounts of positive value. After discussing the matter with a friend, I decided to add a section mentioning positive future scenarios.
Many of my articles are written with a clear point of view, but with the exception of the example from 2012 mentioned above, I usually try to avoid making a case using the propaganda tactic of omitting counterarguments or evidence that disagrees with my position. Instead, I try to lay out the case as openly as I can, including considerations that go against my position. By making my moral values explicit, I'm able to argue for what I believe in exactly the way that I think about the issue, without trying to engineer my argumentation for people who have different values than mine. (That said, there are still cases in which people accuse me of not being sufficiently explicit about my idiosyncratic moral values. In cases where this criticism applies, it's probably because I don't like repeating myself from one essay to the next. Anyone who wishes to look up my moral values can find them.)
I think it helps to write as if you're a principled truth seeker who aims to "let the chips (arguments/data) fall where they may". Having some emotional distance from the topic can help to reduce bias, although too much emotional distance can also be harmful insofar as emotions often serve as important error/bullshit detectors. And you can still infuse your arguments with your moral values as much as you want, as long as you're not gerrymandering the empirical side of your argument in an effort to bolster a conclusion favorable to your values.
Reducing the inclination toward one-sided presentations of arguments is likely to be a case in which honesty pays dividends, because what you've written is intended to be public and verifiable, which means anyone can check your arguments and call you out for being biased in your presentation of information.
Weird ideas and taboos
Many of the ideas on my website are unusual to say the least, and sometimes others are inhibited from advancing similar arguments because they don't want to seem "weird." I'm fortunate to not mind appearing weird, but not everyone is so comfortable with it, and many don't have the luxury of it. I have a job in which I can say anything I want about philosophical topics without repercussion, but more mainstream public figures can't.
Sometimes my friends encourage me to eschew making controversial statements to avoid upsetting certain allies. While I usually steer clear of invidious language and don't provoke needless confrontation, I also don't shy from saying what I think. One reason is that I selfishly don't like to keep my mouth shut and feel better being open. Another reason is the meta-principle that I would like to see more people express themselves this way. And then there are the further benefits as discussed above, like countersignaling your commitment to truth over dogmatism. That said, I realize that it's not possible for everyone to be a gadfly, and I admire those who take a more moderate approach to outreach so as to avoid turning too many people away. There's also great value in those who can bridge large inferential distances in many small steps. And there are times when moderation in one's message avoids provoking excessive anger.
In general, social taboos against iconoclasm are a double-edged sword. They sometimes serve a useful function, such as by inhibiting people from inclining toward tempting but harmful stances like racism. On the other hand, they can inhibit the open expression of new ideas just because those ideas go against the grain. If we want to encourage more honesty, one thing that would help is more tolerance of those different from what we're accustomed to.
Should movement-builders eschew weird topics toward the beginning?
A common tradeoff in efforts to build a movement is between having widespread appeal by focusing on relatively mainstream topics, versus promoting the messages that are actually most important to get across. Say you think risks of future computational suffering are most important to make progress against, but you think they sound weird to most people. In order to maintain high status and attract lots of followers, you talk mostly about topics that are more widely known, like robots displacing human jobs and risks of superintelligence causing human extinction. You think it's important to first make a name for yourself as an organization and later on begin to introduce some of your more radical ideas.
I typically see movement-builders opt for appearing mainstream at the beginning, with the intention of eventually introducing more unusual ideas down the road. Given that many people apparently pursue this approach, the evidence from peer beliefs seems to push in its favor. However, at an object level, I much prefer being up front and focusing on the ideas that matter more to me, without worrying about how they'll be perceived. Some of the reasons for this parallel the arguments for honesty in general in this piece. Here are a few particular criticisms of appearing mainstream:
- Lack of transparency: Not revealing your true intentions for an organization is deceptive. It's not fair to your donors and supporters if you change course down the road, away from your initial message toward your true agenda.
- Misplaced effort: Becoming well known for discussing mainstream altruism topics can take significant time and effort that could have gone toward making direct progress on the actual problems you wanted to solve.
- Goal displacement: Once you begin discussing a topic, that topic tends to perpetuate itself, because people will continue to associate your organization with it and will continue to bring it up. At a psychological level, it's also easy for subgoals to stomp supergoals: i.e., to get so absorbed in the mainstream topic that you actually come to focus on it too much for too long.
- Reaching the wrong audience: If you're actually most worried about future suffering, then marketing your message to people who primarily seek to prevent the extinction of carbon-based Homo sapiens isn't bringing in the optimal audience. If you want to make progress on problem X, it's generally best to talk openly about problem X. That's the cleanest way to make connections with the people most interested in your topic.
- Closed-minded audience: In general, people who reject your unusual ideas out of hand are less likely to challenge other beliefs. Those who aren't turned off by unorthodox views are more likely to be critical thinkers and hence make better collaborators. (Of course, the kind of unorthodoxy I have in mind here is mostly about moral questions and the implications thereof. It's often good to maintain a filter against crank factual theories, and critical thinkers are often quick to dismiss pseudoscience.)
- Unorthodox ideas can be exciting: Mainstream topics may have broad appeal, but the market for discussing them is already somewhat saturated. In contrast, unusual ideas that haven't been raised before can sometimes generate more enthusiasm, at least from a core group of very valuable people.
- Less agility: In general, I shy away from long-term plans for how to accomplish a goal. If you think something is important, you should go explore it further right now. Don't wait for some future moment when the climate will be more ripe for action. Part of this impulse comes from seeing people (including my past self) construct elaborate future plans that become irrelevant once conditions change. For instance, suppose you spend three years building a movement of people worried about superintelligence killing humans, and then you realize that a topic unrelated to superintelligence is actually most important. Now your movement doesn't contain the right kind of people to become interested in your new top priority. If instead you had focused on your original priority more directly, you would have found this out sooner and would have had less organizational baggage when switching.
Ghostwriting is the norm for books and speeches by politicians, celebrities, and executives, and it's somewhat common in other contexts as well. Whether ghostwriting is seen as permissible varies by setting. For example, it's frowned upon in academia:
we now come to the question of scholarly works and other products of research where there is a presumption of originality with the named author (one of the defining characteristics of scholarship), and for which the original thought itself is the source of value. Here, we do expect the named author to actually be the author. [...] in scholarly publishing our reliance on original authorship makes this unethical and undermines trust in the scholarly record and the system of recognition and rewards on which the scholarly community has long been based. If an author is not really the original author, that system is rendered meaningless.
Most commentators I've read seem to consider ghostwriting acceptable if and only if it's relatively transparent to readers that the piece has been ghostwritten. For example, most people realize that politicians don't write their own speeches, so there's no deception involved in ghostwriting there.
That said, survey data suggest that people differ on whether undisclosed ghostwriting is wrong:
We asked blog readers about the permissibility of having a PR person draft blog content on behalf of the executive without disclosure. We also noted that, in this scenario, the original ideas came from the executive and he or she reviewed, edited and gave final approval of the content.
The survey results showed that corporate blog readers were split in their opinions about whether this level of PR support without disclosure was acceptable. Among readers of politicians’ blogs and nonprofit blogs, there were more people who disapproved of this level of PR assistance without disclosure than those who approved of it.
I generally dislike ghostwriting because either it has no effect or it's unfair and deceptive:
- If other people know that you didn't write the material, then ghostwriting accomplishes nothing that simply crediting the helper as a coauthor wouldn't.
- If other people don't know you didn't write it, then ghostwriting is deceptive and creates pressure for others to do likewise in order to compete.
I prefer to give credit for authorship liberally, especially since this gives authors more of a stake in the content and helps "a thousand flowers bloom" by letting other people establish themselves as writers and public intellectuals in their own right. I like this quote from Ralph Nader: "the function of leadership is to produce more leaders, not more followers." In any case, I would find an organization or movement more compelling if it had a lot of voices producing content than if most of the material was produced by a single figurehead.
A counterargument to this is that there can be increasing marginal returns to fame: The more famous you are, the more other media outlets want to invite you on, and thus, fame snowballs. Likewise, if you've had a successful first book, perhaps it's easier to market your second book. This is an interesting argument worth exploring further, but it seems plausible that the same effect could be true for a cause and not just for a person. For example, if one person talks about an issue, then media outlets and publishers are more willing to have someone else address the same issue, and the issue snowballs in notability. It's not obvious that this snowball effect is stronger when it's just one person championing the cause. Indeed, having more independent voices may diminish the impression that the cause is a cult with a single crackpot leader.
The above discussion is an empirical matter worth sorting out further. But regardless of the conclusion of those arguments, I still lean against ghostwriting. From my perspective, dishonesty anywhere is a threat to one's credibility everywhere. If you don't mention that you use ghostwriters and people find out, then people may begin to wonder what else you're hiding.
On the other hand, I can understand the opposing viewpoint as well: A lot of other people use ghostwriting too, and one has to pick one's battles. Maybe opposing ghostwriting is not the most important battle to fight.
Many of the observations in this essay are unoriginal and have their origins in the wisdom of many past generations. More proximately, Eliezer Yudkowsky and Carl Shulman both made a big difference to my views on honesty.
Appendix: Lie detectors
Present-day polygraphs are widely criticized as insufficiently accurate to be useful, though these criticisms center on using the test as a decisive criterion rather than as just one more piece of Bayesian evidence. Some studies suggest that polygraphs are fairly accurate:
Probably the most comprehensive look at polygraph accuracy is a 2003 report from the National Academy of Sciences. [...] Their analysis of the 30 most recent polygraph data sets showed an overall accuracy of 85 percent, and an analysis of seven field studies involving specific incidents showed a median accuracy of 89 percent.
And despite the criticism, polygraphs are routinely employed in law enforcement and intelligence.
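The "one more piece of Bayesian evidence" framing can be made concrete with a quick calculation. A minimal sketch, assuming the reported 85% accuracy applies symmetrically to liars and truth-tellers (and using an illustrative 10% prior, which is my assumption, not a figure from the studies):

```python
def posterior_lying(prior, sensitivity, specificity):
    """Bayes' rule: P(lied | test flags deception)."""
    # Total probability that the test flags deception:
    # true positives on liars plus false positives on truth-tellers.
    p_flag = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_flag

# Assume the polygraph flags 85% of liars (sensitivity) and
# clears 85% of truth-tellers (specificity). With a 10% prior,
# a flagged result raises the probability of lying to about 39%.
print(posterior_lying(prior=0.10, sensitivity=0.85, specificity=0.85))
```

The point is that an 85%-accurate test is substantial evidence but far from proof: it shifts a 10% prior to roughly 39%, which is exactly why treating the result as a decisive criterion overstates what the test can do.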
While polygraphs are not widely used among civilians today, lie-detection technology should become significantly more advanced in the coming decades. In principle, certain techniques like fMRI lie detection should be able to become extremely accurate:
While a polygraph detects changes in activity in the peripheral nervous system, fMRI has the potential to catch the lie at the 'source'. [...] Using this method, an initial 2005 study on individuals (not group averages as previous studies) without pattern recognition and automation showed that lies can be distinguished 78% of the time. That statistic has risen, in one study, to 100% when predicting a lie in an individual when baseline lie/truth levels were closely studied with training from pattern recognition technology (machine learning). fMRI does rely upon the individual remaining still and safeguards in the analysis such that the questions can not be gamed by the participant (G. Ganis 2010). Studies have been done on Chinese individuals and their language and cultural differences did not change results. To show the robustness of this fMRI technology, a study (S. Spence 2011) was done that showed fMRI lie detection / truth verification technology worked even in a group of 52 schizophrenic patients, 27 of whom were experiencing delusions at the time of the study.
Once lie detectors reach very high levels of accuracy, will they be widely used? Quite possibly not. Maybe people will object on privacy grounds. Maybe there will be a "right to lie" movement that considers lie detectors an invasion of human rights. And so on. But it's also conceivable that eventually society will embrace the technology and consider it a positive tool for holding each other accountable. Or maybe lie detection will be used in a more Orwellian fashion, but even in that case, you'd want to be prepared.
"Okay," you might say. "In such a society, I would tell the truth. But that doesn't mean I need to tell the truth now. I'm not under a lie detector at the moment." This is correct, but under a future lie detector, you could be asked: "Did you lie in the past?" Unless you erase your memories of having lied (and also erase your memories of erasing your memories), the old lies would come to the surface at that point. ↩
- One way to think about white lies is the following: What would the person prefer you to say if s/he knew the details of the situation? Sometimes there's something a person would rather not know. The person in the bad dress, for instance, might prefer to remain blissfully ignorant rather than hear an honest answer; in that case a white lie does her a favor, and if she found out, she'd be glad you told it. If, instead, the person cares whether the dress actually looks bad for some important external reason, then you shouldn't lie.