September 04, 2011

Wikimedia/Wikipedia Image Filter "referendum" results

There's been a so-called Wikimedia/Wikipedia Image Filter "referendum" with results announced now (The Wikimedia Foundation owns Wikipedia and other projects, such as their "Commons" image repository). Much "discussion". "So-called", as it wasn't really a referendum. It was obviously intended as more of a rubber-stamp for what the Wikimedia higher-ups have decided to do anyway about an ongoing problem with "controversial" images (they said it, not me: "The Board of Trustees has directed the Wikimedia Foundation to develop and implement a personal image hiding feature ... The feature will be developed for, and implemented on, all projects."). The little people got to "vote" on advice for the developers! ("To aid the developers in making those trade-offs ...").

I keep telling myself stay out, stay out, stay out, as no good will come from any commentary I make. I have very bad memories of hurting my life from censorware activism. Between ankle-biting wikicultists on one side, and wiki-porn-porn-porn complainers waving bloody heads on the other, I can lose from both ends (i.e. Wikipedia fanatics have incentives in social approval for attacking me just on general principles because I'm a critic, while being insufficiently moralistic is never an easy pundit position). But I don't intend to argue the censorship-related issues. Other writers can do that. This post is about the incredible circus around the event.

One reason Wikipedia truly fascinates me is that, contrary to very deliberate public relations, "Inside, Wikipedia is more like a sweatshop than Santa's workshop". There's distilled group dysfunction on display. And since so much of the interaction is documented (not everything, but a large amount), one can see all the factors much more visibly than elsewhere. Here, one can trace the various political forces at work.

The powers-that-be find sexual material controversies to be embarrassing. They're very clear about the thinking involved, for example regarding some images "in various categories and sub-categories around" "Female toplessness" and "Nude women":

Thirdly, it is our belief that the presence of these out of scope images in Commons is potentially dangerous for the Foundation and Community because they reduce the overall credibility of the projects as responsible educational endeavors, and thus call into question the legitimacy of the many images of sexual content and "controversial" sexual content that must remain on Commons for the projects to fulfill their mission. And, although not our primary motivation in making this recommendation, it must be noted that they are offensive to many people, men and women alike, and represent with their inclusion a very clear bias, and point of view that of woman as sexual object. They are far from neutral. We would never allow this point of view untrammeled and unreflexive presence on any Wikipedia site as a clear violation of NPOV we should not allow it on Commons either.

To put it mildly, this runs into conflict with prominent sentiments among a major demographic of wiki editors: single young men. I don't think I need belabor the obvious differences of opinion involved.

So, what to do when there's outside social pressure but you don't want to alienate a major source of free labor? It's time to start contortions about "filtering". This was a censorware argument I'd seen many times before, and it went down the same path.

Have fun, Wikimedia Foundation folks. I don't envy you. Running a cult is not all PR puff-pieces and back-scratching among elites. Sometimes you have to actually deal with the uncomfortable fact that the "community" isn't completely dedicated to doing unpaid work exactly as you desire.

By Seth Finkelstein | posted in wikipedia | on September 04, 2011 02:40 AM (Infothought permalink)



Hang on, they've decided the best response to sexist behaviour is make it the personal responsibility of victims of sexism to filter it out? That's...special. That's like everything we've been talking about since the 1960s rolled up into a special ball of fail.

Posted by: thene at September 4, 2011 08:55 AM

Seth, My guess would be they are trying to build a more consumer-friendly brand and get some better integration.

Posted by: David Burt at September 5, 2011 06:10 PM