Substitution, mass confusion, clouds inside your head
Hirshleifer (2008) proposed a psychological attraction theory of regulation--the idea that regulation results from psychological biases on the part of political participants (voters, politicians, media commentators). Let's see if we can put a couple of aspects of Hirshleifer's framework to work in light of calls to limit and regulate the energy sector in the wake of the British Petroleum (BP) oil spill.
Salience and vividness effects. Politics can be viewed as a struggle for attention. Constraints on information processing influence political debate. Political competitors seek mechanisms that will make their positions plausible, understandable, and memorable.
Research has identified factors that make stimuli easy to retrieve. Attention is drawn to salient stimuli, those that stick out compared to others in the environment. Attention is also drawn to vivid stimuli, such as stories about personal experiences and emotionally arousing information (Nisbett & Ross, 1980). Moreover, people exhibit a 'negativity bias': a distaste for losses as measured from an arbitrary reference point (Kahneman & Tversky, 1979).
Disasters, of course, play right into the hands of political participants with an agenda. To my knowledge, we've never had a deepwater oil spill like this before (which is likely the chief reason no one has been able to stop it). Stories of tar balls on the beach and dead wildlife tug at emotions.
As such, voters are eager for politicians to 'do something'--if only to create the illusion of initiative in correcting the problem.
In-group bias and scapegoating. People tend to prefer members of their own group to outsiders, a phenomenon known as in-group bias. Moreover, people engage in self-serving attribution bias: the belief that in interactions with others, we are right and they are wrong. Group-serving interpretations of attribution bias can result in antagonism toward other groups (Beck, 1999).
The animosity that various politically minded environmental groups hold for industries such as Big Oil is pretty apparent. Accidents provide a prime stage for the 'we're right; they're wrong' production.
BP becomes a scapegoat for those seeking sweeping reform. Scapegoating means blaming the visible, disliked, and relatively vulnerable--in this case, to support regulation meant to avert future misconduct, whether or not there was any villainous behavior.
Overconfidence. It has been argued that the most robust finding in all of psychology is that people are overconfident. Overconfidence is the belief that one's personal abilities are better than they really are (Hirshleifer, 2008: 864). People consistently express confidence that regulatory regimes can avert disaster. Yet disasters in heavily regulated processes persist. The Space Shuttle program, a government-run initiative, has seen two catastrophic failures since its inception. Financial markets have experienced various meltdowns over the past few heavily regulated decades.
Like all individuals, regulators think they are better than they really are. Can the oil industry be regulated by bureaucrats in a manner that reduces the chances of an extreme event? Theory suggests that people think so ex ante, but historical data suggest otherwise ex post.
Availability cascades. Extreme events such as disasters gain widespread public attention in intense bursts. Tversky and Kahneman's (1973) 'availability heuristic' suggests that people judge the importance of a phenomenon by their ability to recall examples of it. The more people talk about an event or problem, the more important it seems, creating a self-reinforcing cycle that can be labeled an 'availability cascade' (Kuran & Sunstein, 1999).
As such, news media amplify the availability of threats selectively. In an availability cascade, as public opinion swings toward one position, evidence becomes increasingly one-sided in favor of that position. Research suggests that people fail to account for the one-sidedness of evidence, even when that one-sidedness is explicit (Brenner, Koehler, & Tversky, 1996). Consequently, during an availability cascade based upon a perceived threat, political pressure for government to 'do something' to mitigate the threat becomes irresistible.
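The self-reinforcing cycle at the heart of an availability cascade can be made concrete with a toy simulation. This is purely my illustration--the reinforcement model, its parameters (`gain`, the starting `concern` level), and the saturation rule are all assumptions, not anything from Kuran and Sunstein (1999):

```python
# Toy availability-cascade model (illustrative assumption, not from the
# cited literature): media coverage tracks current public concern, and
# coverage in turn raises concern, so the loop feeds on itself until
# concern saturates near its ceiling.

def cascade(rounds=10, concern=0.05, gain=0.9):
    """Return the level of public concern after each news cycle."""
    history = [concern]
    for _ in range(rounds):
        coverage = gain * concern                 # media amplify what people already discuss
        # talk raises concern in proportion to remaining headroom, capped at 1.0
        concern = min(1.0, concern + coverage * (1.0 - concern))
        history.append(round(concern, 3))
    return history

print(cascade())
```

Even starting from a small level of concern, each cycle's coverage feeds the next cycle's concern, which is the one-sided amplification dynamic described above.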
All of this helps explain not just the regulatory regime sprouting from the current oil spill, but the larger phenomenon of why people consistently cede power to political entities in the form of regulation--even when regulation is costly and prone to failure.
Beck, A.T. 1999. Prisoners of hate: The cognitive basis of anger, hostility, and violence. New York: HarperCollins.
Brenner, L., Koehler, D., & Tversky, A. 1996. On the evaluation of one-sided evidence. Journal of Behavioral Decision Making, 9: 59-70.
Hirshleifer, D. 2008. Psychological bias as a driver of financial regulation. European Financial Management, 14: 856-874.
Kahneman, D. & Tversky, A. 1979. Prospect theory: An analysis of decision under risk. Econometrica, 47: 263-291.
Kuran, T. & Sunstein, C. 1999. Availability cascades and risk regulation. Stanford Law Review, 51: 683-768.
Nisbett, R. & Ross, L. 1980. Human inference: Strategies and shortcomings of social judgment. Englewood Cliffs, NJ: Prentice-Hall.
Tversky, A. & Kahneman, D. 1973. Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5: 207-232.