October 20, 2008

Google, Wikipedia - seeing RE-INTERMEDIATION in action!

One of the points I've tried to advocate (pretty much futilely) against the web evangelists who blather on about the buzzword "disintermediation" is that they are talking nonsense. My counter-buzzphrase is "There is re-intermediation": new centralization (new gatekeepers), new centers of power.

Nick Carr is now making this point, and getting it heard, in The centripetal web:

Technorati just couldn't compete with Google's resources. But it wasn't just a matter of responsiveness and reliability. As a web-services conglomerate, Google made it easy to enter one keyword and then do a series of different searches from its site. ... Google offered the path of least resistance, and I happily took it. ... I thought of this today as I read, ... a report that people seem to be abandoning Bloglines, the popular online feed reader, and that many of them are coming to use Google Reader instead. ...

By coincidence, Philipp Lenssen just posted about Google Now Allows Sites to Serve Content to Them While Showing a Registration Box to Non-Google Users, noting one implication:

the barrier for competing search engines, existing and future ones, being raised... because Google may now be offered a key by some sites, something which the same site may not bother implementing for the new engine on the block (if that other engine would also suggest a first click policy). If this policy would ever become wide-spread, the next Larry and Sergeys of today writing a web crawler would face a lot of new dead ends: "Google exclusive" crawl territory, a place where newcomers need to ask permission first.
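To make the mechanics concrete, here's a rough, hypothetical sketch (my own naming, not Google's or any publisher's actual code) of how a site might implement that kind of selective access: the full article for Google's crawler, or for a visitor arriving from a Google results page, and a registration box for everyone else.

    # Hypothetical "first click free"-style gate (invented names, not real code
    # from Google or any publisher). Full content for Google's crawler and for
    # visitors clicking through from Google; a registration box otherwise.
    def should_show_full_article(user_agent, referrer):
        is_googlebot = "Googlebot" in (user_agent or "")
        came_from_google = "google." in (referrer or "")
        return is_googlebot or came_from_google

    def handle_request(user_agent, referrer, article_html, registration_html):
        if should_show_full_article(user_agent, referrer):
            return article_html       # crawler or Google visitor sees the content
        return registration_html      # everyone else hits the wall

The asymmetry is Lenssen's point: a new search engine's crawler fails that first check, so unless the site bothers to add it to the privileged list, it never sees the content at all.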

One of the biggest examples of re-intermediation (driven by Google) has of course been Wikipedia, and Nick Carr observes in his post:

One of the untold stories of Wikipedia is the way it has siphoned traffic from small, specialist sites, even though those sites often have better information about the topics they cover

I actually argued this point in an academic discussion thread about a legal case, where I pointed out the process in action: links and attention to a poorly-fitting Wikipedia article supplanted the attention and ranking of specialists who were experts on the topic (n.b. I don't mean me, but real lawyers who were on top of the legal issues). The dream of blogging was that such specialists would supplant the superficial "MSM" ("Mainstream Media"), but instead we're just getting the potentially worse Wikipedia. I just ended up getting flamed; maybe Nick Carr will do better (centralization of critics? 1/2 :-)).

By Seth Finkelstein | posted in google | on October 20, 2008 08:09 AM (Infothought permalink)


Comments

Could we perhaps classify 're-intermediation by a publicly funded (free) computerised intermediary' as disintermediation?

After all, file-sharing has disintermediated the relationship between producers of material for public distribution and the public who would have it distributed to them.

Certainly, folk are always wondering if they can slip in some advertising, if not control the distribution 'channel', but it seems pretty disintermediated to me.

The back channel needs disintermediating too, i.e. public funding of publicly owned works without going via a publisher. We'll get there.

Posted by: Crosbie Fitch at October 20, 2008 08:54 AM

As I asked in the CT thread, do you have evidence on what would be at the top of Google searches if it weren't Wikipedia? It seems to me that, in the cases I've seen, the effect of Wikipedia is
(1) to bump a lot of rubbish/promotional/quasispam links down one spot in the top 10, and make the top result an article that is at least adequate
(2) to bump the first worthwhile search result from, say, 25th to 26th.

Posted by: John Quiggin at October 20, 2008 06:32 PM

John Quiggin: How would I prove what Google would look like if it made different weighting factor choices in its algorithm? All I can say is that it's not obvious at all that the result of removing the Wikipedia giant juice-sucker would be the same as the current ranking minus Wikipedia. That is, between the two possibilities:

1) Wikipedia sucks away the ranking and attention from better specialist results
2) There are no better specialist results, Wikipedia is really the best

Both #1 and #2 could look the same in the search results: it wouldn't be that the specialist results sit at second, third, etc., because the very hypothesis is that Wikipedia is taking away their attention and links, leaving only the rubbish/spam, which has a different source of links/ranking.

Part of what I was trying to do at CT was to point out that #1 is in fact plausible, given the repeated actions of people turning to Wikipedia as it's-a-good-quick-reference.
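To make the abstract point a little more concrete, here's a toy sketch (invented link graph, invented names, and a bare-bones PageRank-style calculation, nothing like Google's actual algorithm) of how #1 plays out: in one world the bloggers' links go to the generalist article, in the counterfactual world the same links go to the specialist page, and the resulting orderings differ by more than just deleting one entry.

    # Toy PageRank-style calculation (pure Python, invented data) to show why
    # "current ranking minus Wikipedia" is not the counterfactual ranking.
    def pagerank(graph, damping=0.85, iterations=50):
        nodes = list(graph)
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            new = {n: (1 - damping) / len(nodes) for n in nodes}
            for n, outlinks in graph.items():
                if outlinks:
                    share = damping * rank[n] / len(outlinks)
                    for target in outlinks:
                        new[target] += share
                else:
                    # page with no outgoing links: spread its weight evenly
                    for target in nodes:
                        new[target] += damping * rank[n] / len(nodes)
            rank = new
        return rank

    # World 1: the bloggers all link to the generalist article ("wiki");
    # the spam page gets its links from a couple of link-farm pages.
    world1 = {
        "blog1": ["wiki"], "blog2": ["wiki"], "blog3": ["wiki"],
        "farm1": ["spam"], "farm2": ["spam"],
        "wiki": [], "specialist": [], "spam": [],
    }

    # Counterfactual world 2: the same three blogger links go to the specialist.
    world2 = {
        "blog1": ["specialist"], "blog2": ["specialist"], "blog3": ["specialist"],
        "farm1": ["spam"], "farm2": ["spam"],
        "wiki": [], "specialist": [], "spam": [],
    }

    for name, world in (("world 1", world1), ("world 2", world2)):
        ranks = pagerank(world)
        print(name, sorted(ranks, key=ranks.get, reverse=True))

In world 1 the specialist page ends up at the bottom, below the spam page, so crossing "wiki" off that list still leaves the spam on top; in the counterfactual world 2 the specialist is the top result. That's the sense in which #1 and #2 look the same from the outside.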

Posted by: Seth Finkelstein at October 20, 2008 07:26 PM