Friar Laurence has the best of intentions. But in William Shakespeare’s Romeo and Juliet, the friar’s well-intended interventions do not yield happy results.
Laurence’s initial good intention is to end the destructive conflict between two warring families. That’s why he agrees to marry Romeo and Juliet, though their romance is both sudden and perilous. Later, the friar recognizes their true love and gives Juliet a potion so she can fake her death and slip away to be reunited with Romeo. Neither scheme works. The unintended consequence of Laurence’s intervention is the deaths of the star-crossed lovers.
When government actors translate their own good intentions into policy interventions, the consequences may not be so tragic. But all too often, they produce unintended and adverse consequences for many affected parties.
A famous example was the initial roll-out of child-safety caps for pharmaceuticals. The intention was to reduce the incidence of accidental poisonings. And the new caps did so in many cases. But the intervention to frustrate the efforts of children to gain access to pills also frustrated the efforts of adults, especially seniors and those with disabilities, to open the containers. Some ended up leaving the caps off their medications altogether, which actually made it easier for kids to gain access and poison themselves.
For others, the regulation produced what scholar Kip Viscusi termed a “lulling effect.” Bringing home bottles with so-called childproof caps, these adults weren’t as careful about where they stored their meds. Their children noticed, got curious, and learned to manipulate the caps.
Other safety rules have produced a similar effect, a form of what is called “moral hazard.” In response to regulations requiring safety features on automobiles, for example, some motorists felt so reassured that they drove more recklessly.
Pointing out the unintended consequences of state action is not to argue against all efforts to combat harms through regulations or expenditures, of course. As a limited-government conservative, I recognize there are situations in which consumers have no practical way of discerning what risks they may bring on themselves — by purchasing food from a restaurant, for instance. Can we watch it being prepared? Can we be reasonably certain that the stomach ache we got was the result of what we ate for dinner, what we ate for lunch, or something else entirely?
The gap between intentions and consequences is nothing more than a reflection of our human nature. We are risk-calculators. But we are not computers. Instead of performing flawless calculations based on carefully curated data, we rely on traditions, rules of thumb, social cues, and time-saving assumptions.
These decision rules actually serve us well most of the time. When they don’t, the intervention of others can certainly help us make better choices. Depending on the context, however, such an intervention can also distract us, remove a self-protective incentive, or provoke a self-destructive backlash (human beings are social creatures, yes, but that doesn’t mean we automatically welcome being told what to do or treated as children).
Consider what happened when 11 states responded to the Great Recession by restricting employer access to the credit reports of prospective employees. Many of those who had lost jobs got behind on their mortgage payments or other debts. State lawmakers argued that to allow employers to request credit reports constituted “kicking people when they were down.”
As a team of three economists found in a recent study of the policy, however, the real-world effects deviated significantly from the stated intention. Forbidden from using the data to distinguish between applicants and thus reduce the risk of a problematic hire, some employers simply cut back on new hires, particularly in communities with a disproportionate number of economically fragile residents with subprime mortgages. Indeed, delinquency rates for subprime loans rose in states where credit-check bans were enforced.
While Left and Right may differ on when government should intervene, we may at least agree that nudging is better than shoving, and that regulations that enhance information are likely superior to those that suppress it.