December 13, 2002

Censorware, "filtering", and the imperatives of control

[I've sent this message around a few places in discussion about the Kaiser Family Foundation censorware study]

One censorware aspect the Kaiser report does not discuss is that, in order for the control to be effective, sites such as language translators, privacy sites, anonymity protections, the Google cache, the Wayback Internet archive, etc. tend to be banned. Otherwise, such sites act as a "LOOPHOLE" (to use N2H2's terminology) in censorware's control. This is a structural, architectural issue. Whether you consider this bad, good, or not a horribly high cost, it is factually a deep problem of censorware which is not going to go away through configuration. Take a look at my (sadly under-publicized) work, e.g. (a small sketch of the loophole structure follows these links):

BESS's Secret LOOPHOLE: (censorware vs. privacy & anonymity) - a secret category of BESS (N2H2), and more about why censorware must blacklist privacy, anonymity, and translators
http://sethf.com/anticensorware/bess/loophole.php

BESS vs The Google Search Engine (Cache, Groups, Images) - N2H2/BESS bans cached web pages, passes porn in groups, and considers all image searching to be pornography.
http://sethf.com/anticensorware/bess/google.php

The Pre-Slipped Slope - censorware vs the Wayback Machine web archive - The logic of censorware programs suppressing an enormous digital library.
http://sethf.com/anticensorware/general/slip.php
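
To make the structural point concrete, here is a minimal, purely illustrative sketch. The hostnames and the filtering logic are invented for this example, and real censorware is far more elaborate, but the architecture is the same: a blacklist keyed on the destination site never sees that site's name when the same pages are fetched through a translator, cache, or anonymizer, so the vendor's only recourse is to ban the intermediary itself.

    # Purely illustrative sketch -- all names here are invented.
    from urllib.parse import urlparse

    BLACKLIST = {"porn.example.com"}   # the site the censorware wants to ban

    def is_banned(url):
        # A blacklist keyed on the destination hostname.
        return urlparse(url).hostname in BLACKLIST

    # Direct request: caught.
    print(is_banned("http://porn.example.com/page.html"))   # True

    # Same pages fetched through a translator or a cache: the filter only
    # sees the intermediary's hostname, so the ban no longer applies.
    print(is_banned("http://translator.example.net/translate?u=http://porn.example.com/page.html"))  # False
    print(is_banned("http://cache.example.org/search?q=cache:porn.example.com/page.html"))           # False

    # The only way to close this loophole at the blacklist level is to ban
    # the translator, cache, archive, or anonymizer itself.

That last comment is the whole point: the intermediary gets banned not for its own content, but for what it lets people reach.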

Very broadly, the Kaiser study found that the more blacklists that are used, the more inaccurate bans there are. Viewed basically in terms of what censorware is - a bunch of blacklists - this should be clear.

That is, fundamentally, a censorware program is a collection of blacklists. Each blacklist has some accurate entries, and some wildly inaccurate ridiculous entries. If you use several blacklists, you get the accurate entries, and then all the wildly inaccurate ridiculous entries contained in all those several blacklists. Simple.

From this point of view, it's no surprise that several blacklists have, in combination, a much higher number of wildly inaccurate, ridiculous entries than a few blacklists. Roughly, more blacklists means more silliness, and fewer blacklists means less silliness. There's no special magic to "configuration" there. The less of the censorware you use, the fewer of its baleful effects you get.
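
As a toy illustration of that arithmetic (the entries and counts below are invented for this example, not taken from the Kaiser data): combining blacklists is a set union, so every wrongly-listed entry from every list survives into the combined result.

    # Invented example entries -- not real blacklist data.
    list_a = {"porn1.example", "porn2.example", "quail-hunting-safety.example"}
    list_b = {"porn2.example", "porn3.example", "breast-cancer-info.example"}
    list_c = {"porn1.example", "sex-education.example"}

    combined = list_a | list_b | list_c   # "use more of the filtering"

    ridiculous = {"quail-hunting-safety.example",
                  "breast-cancer-info.example",
                  "sex-education.example"}

    print(len(combined))               # 6 sites banned in total
    print(len(combined & ridiculous))  # 3 wrongly banned -- every bad entry
                                       #   from every list is still there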

And Kaiser didn't find that censorware bans all the porn sites either! At heart, it's not difficult to get a big list of porn sites. It's really not. But what "benefit", other than the political, is there in merely making the outright porn-searchers work a little harder, while randomly denying some people the information they need, and denying everyone such tools as language translators, Google caches, etc.?

I don't think this is a simplistic opposition to "filtering". But it is saying there is no magic - there's not going to be any configuration that makes all the naughty stuff go away while leaving only the nice stuff, or even gets most of the way there. The best PR the censorware companies ever did was to have the word "filtering" attached to their blacklists, because that channels all the discussion into a focus on the supposedly worthless material, and far away from all the imperatives involved in controlling what people are forbidden to read.

By Seth Finkelstein | posted in censorware, infothought | on December 13, 2002 11:53 PM (Infothought permalink)
