Tuesday, December 28, 2010

Algorithmic Meta-Rationality and Meta-Algorithmic Rationality


You are a monkey and you need to eat equal amounts of fruit, nuts and meat. A human researcher is presenting you with fruit and nuts, one in each hand, and you have to pick one. What do you pick and how do you pick it? The most obvious solution is to keep track of how much fruit, how many nuts and how much meat you've already eaten, and pick whatever you've eaten least of. This is wrong!

You see, one of the most obvious reasons you're eating food is to fuel your brain. And your brain takes a huge load of food to fuel it. So if you use a complicated algorithm, one which requires you to keep track of what you eat every single day, then obviously you're going to need more brainpower to implement it and so you're going to need more food to fuel that brainpower.

Let's recap. An algorithm for picking food that requires more food just to decide what you're going to eat is a horrible algorithm. That algorithm may do its job on the level of picking food to eat, but it doesn't do the job of helping to keep you alive! So while it seems to be rational on the object level, it's entirely irrational on the meta-level.

Can you guess what algorithm the monkey's actually going to use? It's a simple one: A is better than B, B is better than C, and C is better than A. Rock, paper, scissors. And the best part of the algorithm is that while it seems "irrational" since it violates transitivity (A > B > C > A, so "logically" A > A), it actually does the fucking job of making sure you eat equal amounts of A, B and C in the long term.

That's one crafty monkey.
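A minimal sketch of that cyclic rule in Python (the food names and the uniformly random pairing of offers are my own assumptions for the sketch, not details from any actual monkey study): a monkey that only knows "A beats B" for three ordered pairs still ends up with a near-even diet, no bookkeeping required.

```python
import random

# Cyclic preference: fruit beats nuts, nuts beat meat, meat beats fruit.
PREFERS = {("fruit", "nuts"): "fruit",
           ("nuts", "meat"): "nuts",
           ("meat", "fruit"): "meat"}

def choose(a, b):
    """Pick between two offered foods using the cyclic rule alone,
    with zero memory of past meals."""
    return PREFERS.get((a, b)) or PREFERS.get((b, a))

def simulate(trials=30000, seed=0):
    rng = random.Random(seed)
    foods = ["fruit", "nuts", "meat"]
    counts = {f: 0 for f in foods}
    for _ in range(trials):
        a, b = rng.sample(foods, 2)  # researcher offers a random pair
        counts[choose(a, b)] += 1
    return counts

if __name__ == "__main__":
    # Each count lands near trials / 3: equal long-term intake, bought
    # with a three-entry lookup table instead of a running tally.
    print(simulate())
```

The point of the sketch is that `choose` keeps no record of past meals at all; the even split falls out of the cycle plus the randomness of what gets offered.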


You are a hairless monkey living in a giant monkey hive with all the other monkeys. Another monkey comes up to you asking you to donate money to save a little birdie from an oil spill. You give him $80. Yet another monkey comes up to you asking you to donate money to save a whole pod of dolphins from fishing nets. You give him $80. Does this mean you think one little birdie is worth as much as a whole pod of dolphins? NO!

What it means is that you're just one hairless monkey living in a GIANT MONKEY HIVE WITH ALL THE OTHER MONKEYS. You've off-loaded some of your moral decision-making onto other people. Because that makes it fast and efficient, saving your time so you can do more important stuff like LIVE YOUR OWN LIFE. So what you're actually doing is living your life rather than pondering useless fucking crap like the exact dollar value of a pod of dolphins. Now that sounds pretty rational, doesn't it?

But the best part is still to come. Because the fact is you're just one member of a connected society. You're not a sick right-libertarian fuck of a "rugged individualist" who tells bald-faced English lies like "there is no such thing as society". And what that MEANS is that you can depend on the REST of society to process exactly how much a pod of dolphins is worth compared to a little birdie. What's actually happening is that you're relying on the charities' volunteer-based systems to ensure that a pod of dolphins gets 1000x as many volunteers (and thus donations) as the single bird.

The individual parts of a distributed algorithm don't need to make any sense on their own, or even at all, for the algorithm to do its fucking job.
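The distributed-donation point can be sketched as a toy calculation (every number here, the flat $80 and the 1000-to-1 collector ratio, is a hypothetical illustration, not data):

```python
# Toy sketch: every donor gives the same flat amount, but bigger causes
# recruit proportionally more collectors, so the AGGREGATE allocation
# tracks relative worth even though no individual donor computes it.
FLAT_DONATION = 80  # dollars handed over by any single hairless monkey

def total_raised(collectors):
    """Total a cause receives if each collector nets one flat donation."""
    return collectors * FLAT_DONATION

# Hypothetical collector counts per cause.
causes = {"one oiled birdie": 1, "pod of dolphins": 1000}

totals = {name: total_raised(n) for name, n in causes.items()}
# The dolphins end up with 1000x the birdie's total, yet every single
# donor treated the two collection tins identically.
```

No donor in this sketch ever prices a dolphin against a bird; the ratio lives entirely in how many collectors each cause fields.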

Humans are surprisingly rational sometimes. And just because some egotistical morons like the typical mainstream economist or a narcissist like Yudkowsky claim we aren't rational doesn't mean they're right. What the fuck do they know? Anyone who aspires to be an Übermensch barely qualifies as a human being. And how much can a non-human really understand about human beings?

A Valid Psych Experiment

So yeah, to finish my earlier post about how poking holes in psych experiments is so fucking easy: the FIRST criterion for any psychological experiment aiming to measure humans' moral decision-making is to set it up so that the subject is the only person who can possibly help. The moment you use money or words or any other proxy to set up the moral problem, you automatically have a fraudulent experiment.

If you want to measure how much effort people will put into saving X number of human lives, it's simple! You strand the person by a lake with no one else in sight, then you have a drowning victim calling out to them for help. Then you repeat the experiment with TWO drowning victims calling out for help. Then with THREE drowning victims. And you measure whether they'll put in three times the effort to save three people as they would to save one.

But see, this kind of experiment would take effort to set up. The experimenter would have to put in sweat and effort. Not just sitting back on their ass in some campus office. And that's the reason it isn't done - because psychologists are lazy-ass motherfuckers. And they're perfectly willing to lie because they're convinced they're going to get away with it. And the best part? They're right.
