Duke University’s Gerry Canavan is teaching my novel Down and Out in the Magic Kingdom in a class on utopias and he conducted an interview with me on the subject for the course:
CD: I based Whuffie at the time more on Slashdot’s Karma, and I don’t know that Facebook has an exact analogue to it. I guess Facebook has this thing where you can see who has the most inbound links, who has the most friends, and you can “digg” up yourself by getting more of those.
I think that in general we have a pathological response to anything we measure. We tend not to measure the thing we care about; we tend to measure something that indicates its presence. It’s often very hard to measure the thing that you’re hoping for. You don’t actually care about how many calories you eat; you care about how much weight you’re going to gain from the calories you eat. But as soon as we go, oh, well, calories are a pretty good proxy for weight gain, we start to come up with these foods that are incredibly unhealthy but nevertheless have very few calories in them. In the same way, Google doesn’t really care about inbound links because inbound links are good per se; Google cares about inbound links because inbound links are a good proxy for “someone likes this page; someone thinks this page is a useful place to be, is a good place to be.” But as soon as Google starts counting that, people start finding ways to make links that don’t actually serve as a proxy for that conclusion at all.
GDP is another good example. We don’t care about GDP because GDP itself is good; we care about GDP because the basket of indicators that we measure with GDP is a proxy for the overall health of the society—except as soon as you start measuring GDP, people figure out how to make the GDP go up by doing things like trading derivatives of derivatives of subprime subderivatives of derivatives, which actually does the reverse of what we care about by undermining the quality of life and the stability of society.
So I think that one of the biggest problems that Google has, taking Google as probably the best example of someone trying to build a reputation currency, is that as soon as Google gives you any insight into how they are building their reputation system it ceases to be very good as a reputation system. As soon as Google stops measuring something you created by accident and starts measuring something you created on purpose, it stops being something that they want to measure. And this is joined by the twin problem that what Google fundamentally has is a security problem; they have hackers who are trying to undermine the integrity of the system. And the natural response to a problem that arises when attackers know how your system works is to try to keep the details of your system secret—but keeping the details of Google’s system secret is also not very good because it means that we don’t have any reason to trust it. All we know when we search Google is that we get a result that seems like a good result; but we don’t know that there isn’t a much better result that Google has either deliberately or accidentally excluded from its listings for reasons that are attributable to either malice or incompetence. So they’re really trapped between a rock and a hard place: if they publish how their system works, people will game their system; if they don’t publish how their system works it becomes less useful and trustworthy and good. It suffers from the problem of alchemy; if alchemists don’t tell people what they learned, then every alchemist needs to discover for themselves that drinking mercury is a bad idea, and alchemy stagnates. When you start publishing, you get science—but Google can’t publish or they’ll also get more attacks.
So it’s a really thorny, thorny problem, and I elide that problem with Whuffie by imagining a completely undescribed science fictional system that can disambiguate every object in the universe, so that when you look at something and have a response to it, the system knows that the response is being driven by the color of the car and not by the car, or the shirt and not the person wearing it, or the person wearing it and not the shirt, and also knows how you feel about it. So it can know what you’re feeling and what you’re feeling it about. And I don’t actually think we have a computer that could do that; I don’t think we have Supreme Court judges or Ph.D. philosophers that can do that.