The following is a guest post from Joshua Hochschild, associate professor of philosophy and Dean of the College of Liberal Arts at Mount St. Mary’s University.
I read Nicholas Carr’s The Shallows, and so I was interested to see David Cloutier’s review of Carr’s latest book on the Commonweal site. Carr writes about the cultural significance of automation, and Cloutier sees in it evidence that certain kinds of technological change make us dumber, in particular dumber about moral language:
The missing language is a language of limits, modesty, appropriateness, reasonableness, and the like. It is the traditional language of the virtue of temperance, but leavened by prudence and wisdom. Carr’s book illustrates the need for such a language, but also the difficulty of finding it. We’d like to imagine that technology will threaten us in a Frankenstein or Robocop sort of way. But what if technology’s power to destroy us is more like the slow, sure creep of the problems of the high-functioning alcoholic, instead of the dramatic conflict of the destructive or homeless drunkard?
This sounds right, but I wonder if it is radical enough. It strikes me that we are not just missing the language of some virtues, but the very language of evaluating actions as capable of displaying and cultivating (or not) virtue. Automation makes us forget that we are moral agents.
Cloutier’s reflections on the subtlety of Carr’s critique of technology (neither all bad, nor all good) reminded me of an essay by Russell Hittinger, which (drawing on ideas from Christopher Dawson) argues that technology changed the moral landscape of the modern West. Hittinger writes:
Modern technologies are not only “labor saving” devices. A labor saving device, like an automated farm implement or a piston, replaces repetitive human acts. But most distinctive of contemporary technology is the replacement of the human act; or, of what the scholastic philosophers called the actus humanus. The machine reorganizes and to some extent supplants the world of human action, in the moral sense of the term. Hence, the policy of mutual assured destruction supplants diplomacy; the contraceptive pill supplants chastity; the cinema supplants recreation, especially prayer; managerial and propaganda techniques replace older practices and virtues of loyalty, etc. … [There is] a new cultural pattern in which tools are either deliberately designed to replace the human act, or at least have the unintended effect of making the human act unnecessary or subordinate to the machine.
If Hittinger is right, we aren’t just missing the language of particular virtues, we are also missing some of the basic vocabulary of human action — the deep threat of technology isn’t that it tempts us to bad action, but that it disguises the nature of action itself, and so enables us to avoid even thinking about good vs. bad action.
C.S. Lewis makes a very similar point about technology in The Abolition of Man – again, not that technology makes us bad, but that it can disguise from us the very possibility of evaluating our actions in terms of good and bad. He gives three examples, which might have seemed trivial to his audience: the airplane, the “wireless” (i.e. radio), and the contraceptive. (Isn’t it interesting that the current season of Downton Abbey has the latter two making an intrusion into the Crawley family, disrupting established patterns?) While each of these may seem to liberate us, or give us power, in fact we adapt our lives to them in such a way that they turn out to have power over us, without the restraint of a moral law. C.S. Lewis’s “men without chests” are moral deformities, not because they are so wickedly disposed, but because they lack the fullness of even being able to make moral judgments.
I assign these two texts – Hittinger’s essay and Lewis’s short book – when I teach moral philosophy. They are the first two readings of the semester, and the third reading is another text that identifies a problem at the very root of moral evaluation: John Paul II’s Veritatis Splendor. The beginning of that encyclical explains that it is not intended to defend any particular moral teaching of the Church, but the very possibility of moral judgment (and hence the Church’s standing to speak about morality at all). “It is no longer a matter of limited and occasional dissent, but of an overall and systematic calling into question of traditional moral doctrine” (sect. 4). True to John Paul’s consistent focus, his strategy is to make clarifications about anthropology – until we understand the human person, we can’t understand action and how it can be evaluated.
Of course, one could say that the conceit of Veritatis Splendor is that theoretical clarifications might address the problem; the insights of Carr, Hittinger, and Lewis suggest that the problem is rooted more in disordered cultural patterns than in bad philosophy. But John Paul II is attentive to the way in which ideas are shaped by culture, as evidenced especially by his attention to social and political life (sections 98ff.). Americans especially need to take to heart his warning about democracy’s tendency to slide from respect for the rights and dignity of each individual into a lax reluctance to formulate moral judgments, “which would remove any sure moral reference point from political and social life.” That relativism, John Paul argues, is in fact a concession to the last-ditch principle of order, that might makes right: tyranny or totalitarianism, not true democracy.
But my opening reflections suggest that an even greater threat than democracy-cum-relativism is a technological culture that hides from us our own moral agency. In other words, what needs to be revived is not (or is not only) the language of particular virtues, but of genuine human action. Hittinger says that our culture prizes above all else “the machine insofar as it promises an activity superior to the human act.” If that is right, then the threat of automation isn’t the bad things it tempts us to do, but its ability to hypnotize us into thinking we don’t even rise to the status of moral agents.