Speech can be powerful, and the human impulse to censor words, images, and ideas that are thought to be wrong and dangerous is understandable. There are, of course, the classic free-society responses—education of our youth to be critical thinkers; “more speech” in the marketplace of ideas, where good ideas supposedly drive out bad ones. But these solutions lack the emotional satisfaction and broad “quick-fix” appeal of censorship.
Among the more “realpolitik” arguments, some critics of filters emphasize that they create a false sense of security because smart youngsters can circumvent them, and because they “underblock” – that is, they fail to identify and suppress many “bad” sites. But these practical arguments cede too much ideological territory to the advocates of censorship, and lead the public to conclude that if only we can improve filtering technology, the solution to our worries will be at hand.
This is where reports “from the trenches” such as Lynn Sutton’s come in. By describing the experiences of ordinary students and teachers, Sutton demonstrates the negative impact of filters on research, discovery, and curiosity—the essential elements of education. Stories like those that Sutton recounts have the potential to persuade local communities that simple training in Internet safety serves all of us better than a filtered Internet.
Meanwhile, as FEPP’s report concluded, there are steps that schools and libraries subject to CIPA, as well as companies and parents that want to filter, can take to reduce both the bias and absurdity of filtering products.
First, they should understand the differences among products and choose filters that easily permit both overall disabling and the unblocking of individual sites.
Second, they should activate only the minimum necessary blocking categories, rather than accepting the filter’s default settings. For schools and libraries, this means activating only the “pornography” or similar filtering category, since CIPA requires blocking only of obscenity, child pornography, and “harmful to minors” material. Each of these legal categories requires that the targeted material contain “prurient” or “lascivious” sexual content.