Nick Carr has an essay "Is Google Making Us Stupid?", where he applies a certain effects-of-tech-on-humans framework to Google. I know Nick is a very smart and learned man, so I read his thoughts carefully. I suspect there will be a certain amount of noise in reaction to what he wrote, as some with less regard for him will take away a superficial impression and go into standard techno-utopian rants against that ("Luddite!" is a tip-off you're reading one of these).
However, I'll try to outline what I found unsatisfying, by talking a bit about some of the meta-issues (I have to spell out I'm deliberately doing this, otherwise the widely-varying contexts tend to make it look like I'm *only* talking about myself).
When I read articles such as the above, I'm very aware that there is indeed a science/humanities "Two Cultures" divide. And I'm on one side of it (science) while many pundits are on the other (humanities). One basic way to tell the difference: when science types can extend "themselves" through technology, they think "This is cool! Wonderful! Great! More!", while humanities types angst about "How has the basic nature of our essential souls been corrupted?". Note this angst-ing generally applies only to technology they haven't grown up with - for example, you don't see a lot of articles bemoaning how the telephone disembodies us into ghostly vocal presences. Of course, the more intelligent humanities types, like Nick, know this history, and it's clear especially towards the end of his piece. But they write the angst-filled articles all the same.
To demonstrate, here's a paragraph shot through with those themes (my interpolations are in the brackets):
Still, their easy assumption that we'd all "be better off" if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling [tech: "Neat!", lib-arts: "Scary!"]. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized [tech: "Yeah!", lib-arts: "My soul!"]. In Google's world, the world we enter when we go online, there's little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed [tech: "Math rules!", lib-arts: "Poetry rules!"]. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive. [tech: "Humans are machines!", lib-arts: "Humans are divine!"]
So I've often wished there was more support for what I call "technology-positive social criticism". By which I mean that criticism of techno-hype and marketing hucksters often seems to end up couched in a certain type of fogeyism (which alienates tech types) because there's no other power-center supporting that criticism. I sometimes don't want to alienate those who write in this fogeyist idiom. But it's a struggle.

By Seth Finkelstein | posted in google | on June 09, 2008 11:59 PM (Infothought permalink)