Thursday, March 17, 2016

Paying People to Quit

As launch week for the new book Under New Management draws to a close, I have the honor of hosting a guest post by author David Burkus. Under New Management has been wildly successful this week, and I encourage you to get a copy today.

Paying People to Quit: The Cost of an Unengaged Hire
Possibly the most counterintuitive management practice to appear in recent years is paying people to quit their jobs. Not only do some leaders find it beneficial to company performance, but research suggests these incentives may even have a positive effect on the people who stay.

One benefit of paying people to quit is obvious: it screens out people who would probably end up quitting anyway. In a purely logical world, as soon as people figured out that they'd made a bad decision in coming to work at a company, they would leave. However, humans are not logical creatures. We're subject to a cognitive glitch that makes it difficult to quit the things we start. Economists often refer to this as the "sunk cost fallacy." Sunk costs represent the time, money, or effort we've already invested in a course of action. The money has already been spent, and there's no getting it back whether we continue down the same course or break away and go our separate ways.
Rationally then, the moment we realize we’ve made a mistake, we should change our course of action. But we don’t do that. In one of the original studies on sunk costs, Hal Arkes and Catherine Blumer (both of Ohio University at the time) asked undergraduate students to envision the following scenario and make a choice:
Assume that you've spent $100 on a ticket for a weekend ski trip to Michigan. Several weeks later you buy a $50 ticket for a weekend ski trip to Wisconsin. You think you'll enjoy the Wisconsin ski trip more than the Michigan ski trip. As you're putting your just-purchased Wisconsin ski trip ticket in your wallet, you notice that both trips are for the same weekend! It’s too late to sell either ticket, and you cannot return them. You must use one ticket and not the other. Which ski trip will you choose?

Surprisingly, the majority of students chose the more expensive Michigan trip even though they expected the Wisconsin trip to be more fun. Despite the fact that the full $150 was already spent and couldn't be recouped, students were influenced by how much each trip had cost, and that led them to make the less enjoyable choice. We're biased toward throwing more money or effort at a less enjoyable — or doomed — cause if we've already put significant money or effort behind it. Jobs are no different.

It takes time to find a job, and if, once you're hired, you suddenly realize the job isn't right for you, your sunk costs exert pressure to ignore that realization and stay. Offering a quitting bonus can help offset the sunk costs building up in the mind of a future underperformer.

For both the employee and the employer, sunk costs make it difficult to end a doomed relationship. Companies that pay people to quit are acting rationally and ignoring sunk costs. They realize they can't head off a future problem by investing more time and money in someone who isn't a good fit. When a company pays an employee to quit, it's often doing so in the belief that even if the employee accepts the offer, the company is getting a good deal. By giving the employees most likely to be disengaged the option to leave, companies save a lot in the long run. According to research from the Gallup Organization, disengaged employees are less productive, more likely to steal from their employer and skip work, and more likely to negatively influence customers and other employees.

At companies that have implemented this policy, only about two to three percent of people who get the offer take it. When people stay, not only does the company get to keep the money, but it might even end up with a more engaged and productive employee. So what happens to everyone who stays? The answer to that question points to the second reason why paying people to quit works: cognitive dissonance.

"Cognitive dissonance" is the term psychologists use to describe the discomfort you feel when two ideas conflict in your mind, as well as your attempts to reconcile them. The theory of cognitive dissonance was first proposed by Leon Festinger, a social psychologist who worked at a variety of universities, from MIT to Stanford.

Jack Brehm, another social psychologist, built on Festinger's theory with a phenomenon he labeled "post-decision dissonance." Brehm theorized that after we make certain decisions, we modify our beliefs to strengthen the validity of those decisions. In a famous experiment, Brehm asked 225 female students to rate a series of common household appliances. The students were then asked to choose between two of the appliances they'd rated to take home as a gift for participating. Brehm followed up with the students and asked them to complete a second round of rating the same appliances. Oddly, the students' ratings had changed. In the second round, most of the participants rated the appliance they'd chosen as a gift higher than they'd rated it in the first round, and likewise rated the rejected item lower than they had before.

The same dynamic applies to a quitting bonus: employees who turn down the offer have made an active, visible choice to stay, and post-decision dissonance leads them to value their jobs more highly afterward. So while it may seem counterintuitive, offering disengaged or unsuitable hires the opportunity to self-select out can lead to greater engagement and productivity from the employees who remain, as well as increased profitability for the company as a whole.

David Burkus is the author of the new book Under New Management. He is the host of the Radio Free Leader podcast and an associate professor of management at Oral Roberts University. Please visit his website to learn more.
