Saturday, November 28, 2009
This is actually a very troubling story. I certainly agree that the images of our first lady in question are offensive. I have not looked at any of them and I don't intend to, but I'm willing to stipulate that they would offend any person of good will. What is troubling is that Google has evidently taken steps to eliminate or marginalize the images on the Web. This illustrates the great power that a monopoly search provider has. It's a dangerous thing, and I'm not sure the average person understands it.
Google and the other major search engines operate by applying complex mathematical algorithms to the Web. Google's famous PageRank algorithm crunches a lot of numbers about how many links run into and out of websites and how important those links are: a link from a heavily linked-to site carries more weight than a link from some minor website. In addition, Google applies hundreds, perhaps thousands, of other measures, closely guarded secrets, to its ranking methodology. A whole industry, Search Engine Optimization, has grown up around trying to outgame Google into ranking one's website higher than one's rivals'.
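The core idea is simple enough to sketch. What follows is a toy power-iteration version of PageRank, nothing like Google's actual production system; the graph, damping factor, and iteration count are all illustrative assumptions:

```python
# A minimal sketch of the PageRank idea: a page's score is fed by the
# scores of the pages linking to it. The damping factor and iteration
# count here are conventional textbook choices, not Google's parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start everyone equal
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A toy web: "hub" is linked to by everyone, so it ranks highest.
toy_web = {
    "hub": ["a"],
    "a":   ["hub", "b"],
    "b":   ["hub"],
}
scores = pagerank(toy_web)
print(sorted(scores, key=scores.get, reverse=True))  # "hub" comes first
```

The point of the sketch is that the neutral version rewards nothing but the link structure itself; everything else Google layers on top is invisible to us.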
Whether and how prominently web pages show up in a Google search is a matter of great economic and cultural importance. You can think of Google as a kind of weak AI that determines to what our collective attention is directed when any given topic comes up. Search technology and the resources on the Web itself are going to get more and more sophisticated and permeate more aspects of our lives as time goes on. Being at the forefront of the kind of attention Google controls will only become more important.
If Google uses a neutral algorithm, one can see a kind of fairness in that. The Web is a spontaneous order, and if it spontaneously decides that this news story is the most important, or best at explaining some topic, we tend to think that's the wisdom of crowds, or actually better, since the algorithm is more sophisticated than simple majority voting. But what happens when the designer of the weak AI is willing and able to tweak the brain of the Web so that its attention is not directed toward things it decides are unworthy, or dangerous, or just offensive, and is directed toward things somebody decides we should be paying attention to? This rigging of the marketplace of ideas is troubling partly because it's impossible to foresee all of its implications, and also because it seems a bad idea to lodge so much power in the hands of people with the unique brand of political correctness, geeky self-satisfaction and grandiose ambition that so characterizes the sages of Mountain View. To paraphrase William F. Buckley, I don't want to be governed by them, even if only mentally and partially, any more than I would want to be governed by the faculty of Harvard University. Or anybody else, for that matter. I am naive enough to aspire to self-government, at least as to the several cubic centimeters inside my skull.
By demoting offensive images of the First Lady, Google is doing the same thing to us that it did to the subjects of the Chinese Communist Party by excluding any unfortunate reminders of Tiananmen Square from searches of that phrase in Google China. Up popped super-happy photos of the Square as a tourist attraction rather than of the day it was soaked with the blood of young freedom fighters. The Google line then was that they were just following the laws of the country they were in. But no law says Google cannot rank highly offensive pictures of the First Lady. Or rather, there is such a law, and Google just made it. Or maybe it just seems that way, the pictures having been lost somewhere after the long tail of results disappears into the murk. The only way we would know which it is would be if Google deigned to tell us.
It's easy to say "so what," and in this case we are not losing anything valuable. But Google could just as easily be manipulating results on searches about health care, global warming and a hundred other topics. No doubt they would say they are not. And maybe they aren't. We just have to take their word for it, I guess. Even if such spinning is not a big deal now, you have to try to imagine a world in which information technology is as far beyond what we have now as what we have now is beyond the card-catalog-based libraries of my college days. In a sense, Google is part of our extended brains. I use it constantly to refresh my memory, look for half-intuited connections, see if I am on the right track. As the technology gets more sophisticated, and it is already very sophisticated indeed, all this will be spinnable. In fact, it already is.
Consider an example lawyers may relate to. Google Scholar just came out with a feature that allows one to search and rank cases and law review articles. If you haven't tried it, you should. It is crazy good, and Google is obviously using a highly enhanced version of PageRank to do its ranking. I know it's highly enhanced because simple PageRank applied to a database of legal cases does a lousy job. You will see that the ranking is far, far better than what you will get from Lexis or Westlaw. Cases and articles (you can do both at once) come up in something much closer to a "natural" order of importance. That's good, in the sense that it makes research easier. In Lexis or Westlaw, results just come up in reverse chronological order, with higher jurisdictions on top; beyond that the ordering is painfully arbitrary, which makes research time-consuming, not to mention infuriating to anyone used to Google.

But now consider the power that Google would have in a world where everybody did their legal research on Google. Its ranking would become influential in determining which legal precedents were more important as a matter of law. Just by putting some cases more directly in your face than others, the search algorithm is deciding, to overstate it somewhat, what direction the law should go. As long as the algorithm doing the choosing is secret, and the adjustments that are made to it are secret, nobody even knows what this direction is. As a matter of coding, it would be child's play to tell the algorithm to weight privacy rights more or less, for example. This is a version of the tyranny of law clerks, where clerks influence their judges, or associates their partners. By managing information, you manage decisions. You exercise power. It is no exaggeration to say this is in tension with the rule of law.
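To see how little code that "child's play" would take, here is a purely hypothetical illustration. The cases, citation counts, topic tags, and the `privacy_weight` knob are all invented; the point is only that one hidden parameter can quietly reorder what a researcher sees first:

```python
# A hypothetical ranker with one invisible thumb on the scale.
# Everything here is invented for illustration: the case names, the
# citation counts standing in for link-based relevance, and the knob.

def rank_cases(cases, privacy_weight=1.0):
    """Sort cases by a base relevance score, with one adjustable knob
    that boosts or buries anything tagged as a privacy case."""
    def score(case):
        s = case["citations"]              # stand-in for relevance
        if "privacy" in case["topics"]:
            s *= privacy_weight            # the hidden adjustment
        return s
    return sorted(cases, key=score, reverse=True)

docket = [
    {"name": "Smith v. Jones", "citations": 120, "topics": ["contract"]},
    {"name": "Doe v. Acme",    "citations": 100, "topics": ["privacy"]},
]

# Neutral setting: Smith v. Jones leads on raw citations.
neutral = rank_cases(docket)
# Nudge the knob and the privacy case jumps to the top, with no visible
# change to the results page except the order itself.
tilted = rank_cases(docket, privacy_weight=1.5)
print([c["name"] for c in neutral])   # Smith v. Jones first
print([c["name"] for c in tilted])    # Doe v. Acme first
```

Nothing on the results page would announce that the knob had been turned; only the order changes, and only the people holding the source code would know why.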
When West drops cases into its key number system, it may be exercising discretion, but at least it is transparent, and no one would mistake the ranking of their search results for an order of importance, or for anything more sophisticated than jurisdiction and chronology.
The only solution that suggests itself to me is open-source search, where the code that decides what gets pulled up is transparent to those able to understand it, who can then sound the alarm if something fishy is going on, not to mention improve the technology. I would be the first to admit, however, that this approach has many problems as well. In the meantime, I will just observe that having one opaque monopoly be our principal brain augmentation is a dangerous place to be.