Research gives edge to fessing up
Harsher consequences for people who stay mum on past, experiments suggest
In a job interview or on a first date, everyone wants to make the best impression possible. So naturally, choosing to volunteer only information that puts you in a good light, while avoiding talk about anything that might be unflattering, seems like a pretty foolproof strategy. Well, think again.
When directly questioned, people who withhold negative or embarrassing information about themselves give a far worse impression of their overall character and trustworthiness than those who come clean right away, according to new research by Harvard Business School (HBS) faculty.
Across a series of experiments, participants were asked to choose between two people — one who gave truthful but embarrassing answers to questions about drug-taking, bad grades, and sexually transmitted diseases, and one who refused to answer. Over and over, those who “fessed up” were preferred and viewed more positively, even after it was revealed that the withheld information was less unpleasant than what the truthful answerers had admitted.
Withholding an answer does more damage than most of us even realize, say the paper’s authors, assistant professor Leslie K. John, doctoral student Kate Barasz, and Michael I. Norton, the Harold M. Brierley Professor of Business Administration at HBS. The Gazette spoke with John about the findings.
GAZETTE: What did you set out to find?
JOHN: The first thing we set out to do was to test whether there is an effect — that you view people who withhold information with disdain. The way we tried to show that is to give people a choice where we say, “Please choose which of two people you’d rather go on a date with.” One, whom we call The Revealer, has divulged a lot about him- or herself, and the things they’ve divulged are extremely unsavory, like that they frequently forget to tell a partner that they have an STD. The other dating option is what we call The Hider. When they do reveal information, they reveal information that’s just as unsavory, but for some questions [they] opted out of answering. The situation we set up is that The Hider is only — at worst — as bad as The Revealer, so it’s very surprising that, again and again, people would rather date The Revealer … [when] there’s a really good chance that The Hider has better attributes.
GAZETTE: While withholding information is not necessarily the same as hiding it, your findings show that it’s perceived as the same and somehow indicative of underlying character flaws. Why is it viewed so harshly?
JOHN: First, I think that it’s important to delineate the boundary of this effect. It would be wrong to conclude from this research that we should all go out and tell everybody our deepest and darkest secrets. That would be creepy and weird. But what it is saying is that in situations where disclosure is expected — for example, when you’re posed a direct question — it’s expected that you’re going to answer. Where disclosure is expected, when someone does not disclose, we view them with contempt. One explanation we find support for is that people have a preference for disclosers, and for good reason. We know that when you disclose information about yourself, this is a key way of developing intimate bonds with others, which is a fundamental human need. And so, by that logic, it follows that when someone doesn’t disclose information, we tend to like them less and become untrusting of them.
When I first started this work, I thought that it was just about the specific information that was missing: that people were inferring, “Oh, if they’re not divulging it, it must be the worst possible.” But again and again, we found that does not account for the effect. In one experiment, we asked people, “Who would you rather hire?” There are two possible candidates, and on their job application they were asked, “What is the worst grade you’ve ever gotten on an exam?” One candidate’s answer was “F.” The other person opted out of answering that question. Of course, they’d rather hire The Revealer. But the really interesting thing is we also asked two additional questions: “On a scale from zero to 100 percent, what do you think each candidate’s worst grade was?” and “Who do you think is more trustworthy?” What we found was that people think The Revealer is more trustworthy, but, interestingly, people think The Hider’s lowest score is significantly higher than that of the person who revealed. So it’s really not driven by inferences about the undisclosed information; it’s a trait-level judgment.
GAZETTE: Why do people tend to avoid disclosure? Don’t they realize the consequences?
JOHN: That’s a great question. Most of the paper focuses on observers’ judgments of people who abstain. But another important finding is, we ask, “If you’re trying to make the other person like you, if you’re trying to get hired, if you’re trying to be asked out on a date, do you make the right decision of whether to disclose or to hide?” It turns out that most people think that the best thing to do in this situation is to withhold the information, when we know from the other studies that they’d make a far better impression if they just came clean. So at a surface level, the effect is driven by the fact that when you’re deciding whether to disclose, you focus more on the risks of disclosing. This is conjecturing a bit beyond the data, but I think one of the reasons might have to do with the way we learn and the type of feedback we get. When you divulge something really unsavory, you typically get immediate and pretty visceral feedback. If you’re on Facebook and you say something, people will comment and they’ll call you out. Or, if you say something face-to-face, you just look at the person’s face and it’s shock and awe. The risks are very salient, so we learn — appropriately — that we should be guarded about disclosing this stuff. But what we don’t get feedback on is when we fail to divulge information — the trust hit is not so salient. You can’t really see that on someone’s face, their perception of you. So because we don’t see the trust hit of withholding, we underweight it when we make decisions.
GAZETTE: How did you test to see if observers were being truthful when they claimed they didn’t penalize those who came clean?
JOHN: It comes back to understanding when we’re likely to get this effect and when we’re not. We’re likely to get it when there’s a job application and there’s an explicit question and you opt out of answering. Then it’s salient. Like if you’re asked, “Have you ever done drugs?” on a job application and you opt out of answering, that’s obviously very salient. Contrast that to where you’re in a job interview and you have done drugs, but you’re not even asked about it, so you don’t volunteer it. When you fail to volunteer it, the interviewer isn’t going to notice that you didn’t divulge, so at least in the short run, a better strategy is to not volunteer it.
The other thing you’re speaking to is a broader and important question when you do the kind of research I do. Basically, talk is cheap, and so in some of these experiments, maybe people would react differently if they actually had skin in the game. What we did to address that potential concern is an experiment — it’s a classic paradigm from experimental economics called the trust game. We pair people up, Player A and Player B. We give Player A $5 and say, “You can give as much or as little of this money to Player B as you like.”
Whatever amount of money Player A transfers to Player B is tripled. So if I decide to give $2 of my $5, Player B gets $6. Player B can then decide what to do with the $6. Player B can choose to send some back to Player A if he wants to or not. It’s called the trust game because, as Player A, if you trust Player B to be fair and give you half of the spoils, then you’re going to maximize your payout if you trust them with all $5. But if you don’t trust them, you’re not going to give them any money, so this is nicely incentive-compatible: if you appropriately trust the person, then you’re going to make more money for yourself. When Player A is deciding how much money to transfer, we vary whether Player A has been paired with a Player B who answered a set of questions or with one who withheld answers. Some Player Bs are randomly assigned to be Hiders, and some are Revealers. As Player A, when I decide how much, if any, money to transfer to Player B, I know whether I’ve been paired with a Hider or a Revealer. We find that when Player As are paired with Hiders, they transfer less money to the Hiders — because they trust them less — than when they’re paired with Revealers. And because they give less money to the Hiders, they get less money back.
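For readers who want to see the payoff structure concretely, here is a minimal sketch of the trust game as John describes it. The $5 endowment and the tripling rule come from the interview; the specific return fractions below are hypothetical choices used only to illustrate why trusting more can pay off.

```python
# Minimal sketch of the trust-game payoffs described above.
# The $5 endowment and tripling rule come from the interview;
# the return fractions used in the examples are hypothetical.

def trust_game(endowment: float, amount_sent: float, return_fraction: float):
    """Compute final payoffs for Player A and Player B.

    Player A sends `amount_sent` (between 0 and `endowment`); the transfer is
    tripled on its way to Player B, who sends back `return_fraction` of what
    they received.
    """
    assert 0 <= amount_sent <= endowment
    assert 0 <= return_fraction <= 1

    received_by_b = 3 * amount_sent              # the transfer is tripled
    returned_to_a = return_fraction * received_by_b

    payoff_a = endowment - amount_sent + returned_to_a
    payoff_b = received_by_b - returned_to_a
    return payoff_a, payoff_b

# Example from the interview: A sends $2 of $5, so B receives $6.
# If B (hypothetically) returns half, A ends with $6 and B with $3.
print(trust_game(endowment=5, amount_sent=2, return_fraction=0.5))  # (6.0, 3.0)

# If A fully trusts B with all $5 and B splits the $15 evenly,
# both players end up better off.
print(trust_game(endowment=5, amount_sent=5, return_fraction=0.5))  # (7.5, 7.5)
```

The second example shows the incentive-compatibility John mentions: an A who correctly trusts a fair B walks away with more money than one who holds back.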
GAZETTE: Do people have disclosure double standards?
JOHN: It seems to be the case. When we put people in the role of judgers, like when you judge someone who doesn’t disclose, you think they’re unsavory. However, when you’re in the position yourself to disclose, or to hide, you actually think you should hide the information.
GAZETTE: What should people take away from this research?
JOHN: When you’re being asked explicitly to divulge information and you know it is not particularly savory, we caution people to reconsider the natural instinct to withhold, because they might be better off just coming clean and sharing that information. I think it also cautions us against being unwarrantedly harsh on those who don’t disclose information.
This interview was edited for clarity and length.