There is an obvious problem with Utilitarian moralities. I don't know whether Utilitarians see it as a problem, because they're pretty dumb; they may well be dumb enough to count it as a plus. Here I will explain the problem and prove that it is indeed a problem: an unavoidable, irrefutable and fatal one. The problem is 'morality inversions by weight of numbers'.
What it boils down to is that you have some mechanism for multiplying the small positive benefits of an evil act while putatively avoiding any multiplication of its enormous costs. Say, for example, you videotape a real, live torture-rape snuff film. For the next century, millions of sadists will be able to enjoy the experience vicariously, while anyone disturbed by the event will simply avoid it. All it costs is a single life. Intuitively, this is immoral. For idiots like Eliezer Yudkowsky it is possibly, probably, obviously moral.
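The arithmetic behind such an inversion can be sketched in a few lines. Every number here is a hypothetical placeholder chosen for illustration, not anything any Utilitarian has actually computed:

```python
# Naive aggregate-utility calculation for the snuff-film example.
# All figures below are invented for illustration.

viewers = 1_000_000          # sadists who enjoy the film over a century
utility_per_viewer = 0.001   # a tiny positive utility for each viewer
victim_cost = -500.0         # one enormous negative utility for the victim

aggregate = viewers * utility_per_viewer + victim_cost
print(aggregate)  # 500.0: positive, so naive summation labels the act "moral"
```

The point is structural: any fixed negative cost, however enormous, is swamped once the multiplier on the trivial benefits grows large enough. That is the inversion.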
There are many problems with this particular morality inversion. Firstly, morality is an abstract hypothetical system, not a concrete calculation. Treating it as a concrete calculation, as morons such as Yudkowsky do, is wrong from the get-go and will only ever yield wrong answers.
Secondly, morality is an ought, and oughts are second derivatives of wants: they are what we WANT TO WANT. And we don't want a world in which snuff (at least the non-consensual kind, since there was an interesting court case of consensual cannibalism in Germany a year or so ago) is considered moral. We don't want this, and we don't want to want this. It's an obscenity. As a result, snuff can't be how the world ought to be, so it can't be moral. Obscenities generally can't be moral; that's what it means to be an obscenity.
Thirdly, and most grievously, the concept of a person is ill-defined for an AI, or for any society that includes people who can temporarily bifurcate (copy themselves and then merge their memories back). How many votes do you get if you clone yourself 20 times? In such societies, only moral systems that are completely independent of weight of numbers can produce well-defined decisions. And since such societies are our future, it behooves any future-oriented person to toss Utilitarianism by the wayside.
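The clone problem can be made concrete with a toy tally. The names, preferences, and numbers are all invented for illustration; the only point is that a head-count aggregation flips its verdict under duplication:

```python
# Toy example: the aggregate verdict on some act, where one actor can
# temporarily clone themselves. All names and utilities are hypothetical.

def aggregate_utility(population):
    """Naive head-count aggregation: sum each person's utility for the act."""
    return sum(population.values())

# Before cloning: the act is net-negative.
society = {"alice": -3.0, "bob": -3.0, "carol": 1.0}
before = aggregate_utility(society)   # -5.0

# Carol bifurcates into 20 copies, each carrying her preference.
for i in range(20):
    society[f"carol_clone_{i}"] = 1.0
after = aggregate_utility(society)    # 15.0

# The moral verdict flips purely by weight of numbers.
print(before < 0, after > 0)  # True True
```

If the copies later merge back into one person, there is no principled answer to how many "persons" were ever counted, which is exactly why the aggregation is ill-defined.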
And that's not even the biggest problem with Utilitarianism, since the whole concept of 'utility' is itself ill-defined.
The upshot of all this is that morality inversions are not "cool" or "deep" or a sign of "overcoming bias". They are WRONG. Persecution of a minority doesn't become a good idea just because the minority is small enough and the majority wants it badly enough. That would be absurd. That would be ANTI-morality. And anyone who sets aside these numerous deep flaws in order to appear elite or philosophical is just a blatant idiot: a poseur, not a philosopher.
This makes independence from weight of numbers the third fundamental property which any moral system must have in order to be coherent and well-defined. The first two are consistency across actors (different people applying the same moral system can't disagree on whether an act is moral or immoral) and consistency across order of application (the same outcome must result regardless of who acts to apply morality first).