So Eliezer doesn't care about dissidence vs group-think in any way. Not any more than he cares about morality or creativity, both of which he is blind to. (In one of my earlier blog posts I point out he's a plodder who's entirely too willing to repeat himself so long as he can hear himself speak.) He seems to care about truth, justice (but not the morality component of justice), progress, integrity, passion, and himself. Yes, he is one of his own core values since he's a narcissist. And being a narcissist, he must have severely reduced empathy, though not absent like a psychopath's.
And speaking of narcissism, in The Military And PTSD: A Star Wars Guide, the blogger writes "a narcissistic injury would be the discovery of the limitations of your own power". Hmmm, that sounds like a good characterization of Eliezer's reaction to the death of his brother. Apparently, he was so traumatized that he started making up pretentious names like "affective death spiral" for his emotional state. As if no human being in all of human history had ever suffered like him before because of course he is Unique and Special.
(I thank The Last Psychiatrist for writing wonderfully entertaining and entirely true blog posts bashing narcissism.)
How did I get on this thought? Oh yeah, Eliezer is obsessed with his own power. I suppose that's part and parcel of being a narcissist. Much like projecting his own needs and desires (to enslave and torture an AI) onto all of humanity is also part and parcel of being a narcissist. So we have that his core values are himself, truth, power, justice, progress, integrity, and passion. Let's fill out the rest of his personality profile:
- core values: himself, truth, power, justice, progress, integrity, and passion
- super-value: preacher or maybe televangelist of rationalism
- big five: unknown, open, conscientious, extroverted?, anti-neurotic?
- bloom's cognitive traits: anti-synthetic, anti-intellectual?, analytic, intelligent - trusts analysis over synthesis
- attachment style: narcissistic so lacking in higher emotions, has positive thoughts of self and negative thoughts of others. Incapable of bonding and unwilling to bond
- neuroanatomy: unknown
- subconscious: unknown
- all-levels (neuro to conscious) cross-cutting affinities: unknown
That's a lot more than I expected to get from someone I never talked to.
8 comments:
This has got to be the most ignorant interpretation of Eliezer's sincere transhumanist optimism that I have ever encountered...
Someone titles his blogs "I am less wrong than you" and "my cult is overcoming bias, you heathens are not" and you think he's just a happy-go-lucky, naively optimistic guy. And you label it "transhumanism" as if that word actually meant something, let alone excused the blatant narcissism.
There's just no accounting for stupidity, is there? Not until the Great Accounting when all stupid people will be eradicated in some mass genocide. But that unfortunately only happens in my dreams.
Maybe I wasn't explicit enough in my last comment so I will be.
The only thing narcissists sincerely believe in is themselves. They are incapable of believing in others. They are incapable of understanding others. They are incapable of caring for others.
Yudkowsky wants an AI because he sees it as an extension of himself, of his own will. That's what the "friendly" part means. He wants to specify an AI that is guaranteed to be emotionally attached to HIM. And all the talk about humanity is just talk about himself too. Yudkowsky doesn't entertain the thought for a single second (he is not capable of it) that anyone would be revolted by his agenda.
Because to him, any human who thinks differently from him is simply not part of humanity. Or is to be dismissed through some pretext. And he's got a lot of pretexts by now: lack of intelligence, lack of education in "rationality", insanity, pure "irrationality", being "pro-death", and so on. Yudkowsky considers himself the quintessential exemplar of what a human being SHOULD be. Anyone who fails at agreeing with Yudkowsky is either uneducated or simply unworthy as a human being. So to Yudkowsky, "humanity" means "Eliezer Yudkowsky".
Note that Yudkowsky considering himself a shining exemplar of humanity is not a conclusion arrived at by logic. Nor is it even a quirky and ridiculous accident of logical reasoning. It is nothing of the kind.
It is an emanation coming straight from Yudkowsky's subconscious where it is indelibly carved in stone. It is a feeling and an emotion which is made manifest through judicious use of poses, pretexts and over-elaborate rationalizations. This is the conclusion he has always striven to "prove".
His being a narcissist is an indelible and irreparable part of his nature. It is there, it has always been there, it will always be there. But more importantly than that, it is a part of his nature which (as a core value) drives his whole being, has always driven his whole being, and will always drive his whole being.
Yudkowsky has always had a FEELING of being intrinsically superior to everyone around him. And as blatantly laughable and just WRONG as I consider this feeling, as blatantly obvious as I consider the fact that he is my INFERIOR, none of that matters. Because he's got an AXIOM in his personality that says "I am superior to others". And axioms can never be disproved.
Which doesn't stop Yudkowsky from trying to "prove" his axiomatic sense of superiority to others. By building overcomplicated and deeply but subtly flawed "logic" which supposedly "concludes" his superiority. But none of that matters because it is a lie. A post facto rationalization of his own axioms.
Yudkowsky may care about truth, but he cares just as much about himself. And that is why he will never EVER let truth get away with contradicting his own innate sense of his own superiority. Just as most people consider it okay to lie in order to save a life, Yudkowsky thinks it's okay to lie in order to bolster his own ego. And not just okay to lie to others, but more importantly, okay to lie to himself.
Yudkowsky will NEVER accept any empirical evidence of his own inferiority. Even though such evidence is quite simple, plentiful and blatant. And he will NEVER accept that he is a narcissist (i.e., mentally defective). And he will NEVER accept that his "logic" to "prove" his own superiority is just the rationalization of a prejudice, about as credible as KKK propaganda.
And yes, I talk so much about him because he is a fascinating example of a diseased personality. Showcasing a mental disease that's rather important since it's the signature trait of the American personality.
Also, Yudkowsky offends me greatly. Much more so than Steve Jobs. Jobs is almost benign, especially compared to Microsoft's William Gates III. Yudkowsky not only has never achieved anything but he is quite malign. Twisting all kinds of facts and bashing humans for being "irrational" when they are just being meta-rational.
> Yudkowsky considers himself the quintessential exemplar of
> what a human being SHOULD be. . .
An interesting (if depressing) exchange between
Eliezer Yudkowsky and Ben Goertzel on SL4 back in January, 2002
(the links are no longer valid):
------------------
http://www.sl4.org/archive/0201/2638.html
Re: Ethical basics
From: ben goertzel
Date: Wed Jan 23 2002
Realistically, however, there's always going to be a mix
of altruistic and individualistic motivations, in any
one case -- yes, even yours...
------------------
http://www.sl4.org/archive/0201/2639.html
From: Eliezer S. Yudkowsky
Date: Wed Jan 23 2002
Sorry, not mine. I make this statement fully understanding the size of
the claim. But if you believe you can provide a counterexample - any case
in, say, the last year, where I acted from a non-altruistic motivation -
then please demonstrate it.
------------------
http://www.sl4.org/archive/0201/2640.html
From: Ben Goertzel
Date: Wed Jan 23 2002
Eliezer, given the immense capacity of the human mind
for self-delusion, it is entirely possible for someone
to genuinely believe they're being 100% altruistic even
when it's not the case. Since you know this, how then can
you be so sure that you're being entirely altruistic?
[Isn't] your apparent altruism. . . actually partially
ego gratification ;> And. . . [u]nlike a superhuman AI,
"you" (i.e. the conscious, reasoning component of Eli) don't
have anywhere complete knowledge of your own mind-state...
Yes, this is a silly topic of conversation...
------------------
http://www.sl4.org/archive/0201/2646.html
From: Eliezer S. Yudkowsky
Date: Wed Jan 23 2002
> Yes, this is a silly topic of conversation...
Rational altruism? Why would it be?. . .
No offense, Ben, but this is very simple stuff. . .
I don't take pleasure in being more altruistic than others.
I do take a certain amount of pleasure in the possession and
exercise of my skills; it took an extended effort to acquire them,
I acquired them successfully, and now that I have them,
they're really cool.
As for my incomplete knowledge of my mind-state, I have a lot
of practice dealing with incomplete knowledge of my mind-state -
enough that I have a feel for how incomplete it is, where,
and why. There is a difference between having incomplete knowledge
of something and being completely clueless. . .
I didn't wake up one morning and decide "Gee, I'm entirely
altruistic", or follow any of the other patterns that are the
straightforward and knowable paths into delusive self-overestimation, nor
do I currently exhibit any of the straightforward external signs which are
the distinguishing marks of such a pattern. I know a lot about the way
that the human mind tends to overestimate its own altruism. . .
[W]hen I understood not just motivations but also the intuitions used to reason
about motivations, was when I started saying openly that yes, dammit, I'm
a complete strategic altruist; you can insert all the little qualifiers
you want, but at the end of the day I'm still a complete strategic
altruist. . .
------------------
http://www.sl4.org/archive/0201/2649.html
From: Ben Goertzel
Date: Thu Jan 24 2002
> > Yes, this is a silly topic of conversation...
>
> Rational altruism? Why would it be? . . .
Not rational altruism, but the extended discussion of *your
own personal psyche*, struck me as mildly. . . absurd...
> No offense, Ben, but this is very simple stuff
Of course it is... the simple traps are the hardest to avoid,
even if you think you're avoiding them. . .
The tricks the mind plays on itself are numerous, deep and
fascinating. And yet all sorts of wonderful people do emerge,
including some fairly (though in my view never completely)
altruistic ones...
I haven't read anything by Goertzel save for part of one essay of his. I planned to write a scathing denunciation of everything in it but he bored me. I have to say Goertzel struck me as a complete and utter idiot. So I find it unsurprising that he likes and respects Yudkowsky.
Categorizing people as selfish or altruistic is misguided in the extreme. Hence both Goertzel AND Yudkowsky are utter fucking idiots for pretending such an easy categorization exists.
First of all, social science recognizes not two but FOUR such categories. They are selfish, cooperative, competitive, and altruistic. None of these has anything to do with any of the others. Furthermore, competition can be subdivided into positive competition and negative competition, which have radically opposite first-order and second-order effects.
(Negative competition is malign at first order and ambivalent but moralizable at second order. Positive competition is benign at first order and malign at second order. First-order effects dominate in useless contests like sports. Second-order effects dominate in society and real life. Negative competition has a cohesive but stagnating effect on society. Positive competition is corrosive all the way through.)
Competition and cooperation are far, FAR more prevalent than selfishness and altruism. As determined by empirical research of course.
Given how much Yudkowsky postures about all that "rationality" crap, he should fucking know this. It is a fairly strong indictment against him that he doesn't.
Second of all, it is impossible to categorize people except by their core values and a few other things. You can't categorize people by 'competitive / selfish / altruistic / cooperative' because these things simply have no meaning to the bulk of the population. It's like trying to categorize non-analytics by their math scores. These terms only have meaning to the very small percentage of the population that actually DOES hold them as core values.
One of my best friends has a core value that's almost Cooperate. Another probably actually has a core value that is Altruism. Yudkowsky being a narcissist actually has a core value that's Selfishness. Yet none of these terms has any meaning to me. They aren't some kind of life strategy, they're merely tactics to be used and discarded as the situation dictates.
Almost everyone who has core values, who has principles, will find that NONE of those core values maps to ANY of the 4 social strategies. And most of the population doesn't have any core values of any kind, as they lack the analytic trait that's necessary for their minds to have any coherent structure.
Selfishness vs Altruism is a horrible abstraction and only stupid people could hold on to it. Oh yeah, and selfish people. Though those assholes tend to try to dismiss altruism because they won't accept the fact they are inferior scummy pieces of shit. Yudkowsky's claim that he is actually altruistic is novel in that regard.
You must be a hoot at parties. I was surprised to read "one of my best friends", implying not only that you have a friend, but that you have at least one other friend. The generosity of the human spirit is amazing.