Showing posts with label unix. Show all posts

Sunday, May 29, 2011

Engineers Are An Inferior Form of Life

Engineers lack the capacity to synthesize original ideas, which is one of the two pillars of Judgement. Which means engineers are amoral and incapable of discerning good from evil without a market survey. Which will of course return bogus answers.

That's the abstract explanation. Let us examine a concrete example. Let us look at how software engineers decided to handle data.

The History of Unix and Windows

Long ago, software engineers looked at data on hard disks and they asked themselves: what dimensionality is yon data, what is its nature? And they determined it was one dimensional (bytestreams) and they patted themselves on the back and saw that it was good.

Then software engineers looked at data on CRT monitors and they asked themselves: what dimensionality is yon presentation, what is its nature? And they determined it was TWO dimensional (bit-BLOCKS) and they scratched themselves on the head and they said "we have a problem".

But lo, a brave engineer came forth and said: we shall Reduce the dimensionality of the CRT so that it maps the one-dimensional data into two dimensions. Look, it is simple, Cantor did this! And the engineers toiled for a day and a half and they named their creation the Command Line Interface and they patted themselves on the back and saw that it was good.
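
For the unbelievers, here's a minimal Python sketch (all names invented for illustration) of what that Cantor trick amounts to: a terminal treats the 2D screen as a 1D byte stream wrapped at a fixed width, so positions in the stream and cells on the screen are in trivial bijection.

```python
# Illustrative sketch: the screen as a wrapped 1D stream.
WIDTH = 80  # assumed terminal width

def cell(index):
    """Map a position in the 1D stream to a (row, col) screen cell."""
    return divmod(index, WIDTH)

def position(row, col):
    """Map a (row, col) screen cell back to a stream position."""
    return row * WIDTH + col

# Round-tripping shows the two views carry exactly the same information.
assert cell(position(3, 17)) == (3, 17)
```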

Then some evil-doers (called Lusers) made a Feature Request, and they asked: we want to embed this directory data inside of other directory data, can we have this? And the engineers saw that this Feature Request was Easy. So they toiled half a day and they named their creation (now with fractional dimensionality) the Filesystem and they patted themselves on the back and saw that it was good.
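
The "fractional dimensionality" of the result is easy to see in a sketch: a path is still a flat 1D string, but each slash descends one level of a tree. This toy Python model (the helper is invented for illustration, not any real filesystem code) builds the nesting from paths alone.

```python
# Illustrative sketch: nested directories as a tree encoded in 1D strings.
def insert_path(tree, path):
    """Insert a slash-separated path into a nested-dict tree."""
    node = tree
    for part in path.strip("/").split("/"):
        node = node.setdefault(part, {})
    return tree

fs = {}
insert_path(fs, "/home/alice/photos")
insert_path(fs, "/home/alice/docs")
assert fs == {"home": {"alice": {"photos": {}, "docs": {}}}}
```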

Then some evil-doers (incomprehensible unfathomable aliens called Artists) said: look, this is all well and good but I have these things called PHOTOS, what the fuck do Command Lines and Bytestreams have to do with my Photos? And the engineers now had a Real Problem.

But the Engineers' procedures had "worked" so well they had produced something 5% as usable as Symbolics' Lisp Machine or Smalltalk OS, so they said they might as well get on with it and produced Windows. And they patted themselves on the back and saw that other people were calling them delusional fuckers. And they were very surprised.

The History of Lisp Machines and Smalltalk

Needless to say, the approach of genuine systems researchers and designers to data was ... a bit different. For one thing, these weren't engineers. They were experimentalists. And they were either capable of original thought themselves, or if they weren't personally capable of creativity, they were at least capable of recognizing that genuine creativity was something to be cherished, not something to be scorned, crushed and dismissed by claiming it didn't fit a Market Need.

The Researchers' and Designers' approach to data was thus: okay so we've got an encoding of data, but we forgot about it because it's not even remotely relevant. We've got a presentation of data, that's almost relevant but mostly it's misleading so we'll actively put it out of our minds. What we need to do is figure out the NATURE of the DATA ITSELF.

For starters, what dimensionality does this data naturally exist in? Oh, it's K-dimensional fractal data where K varies arbitrarily, hmm that's interesting. Okay, so it looks like we'll need a couple of transformations to encode the data and a completely different family of transformations to present it.

And lo the Researchers approached some engineers and the engineers said: say what? I don't fucking get what you're talking about! What are these "objects" you're talking about? Why would anybody need this? What is this "idea space" you keep talking about?! This is absurd and inefficient! Only hardware exists. ONLY HARDWARE!

Then the researchers scratched their heads and said unto themselves: what we need is to buy some engineers and if they don't DO AS WE SAY then we will FIRE THEIR ASS. And lo this was done and through natural selection the Researchers and Designers finally got some engineers that had faith in their Word, and Lisp Machines and Smalltalk were both invented. And this was magnificent.

And to this day, still the Engineers maintain that Unix and Windows are "good" because they refuse to shut the fuck up and do as they're told!

Thursday, May 19, 2011

Genera, Termkit, Unix: what a fucking joke Unix is!

I've been going through info on Genera and Stanislav's site, where some commenter mentioned LoseThos, possibly as a joke.

It's worth watching the first few minutes of the first video, especially if you're familiar with LISP or Smalltalk, because then you immediately understand what a sick joke Unix's conceit, its pretense that it "supports C", really is.

Unix doesn't support C. In fact, Unix supports nothing. Unix is a sick fucking joke of an OS.

Take this project for instance. Termkit is just trying to reproduce one small feature of Dynamic Windows on Symbolics' Lisp Machines. See Using A Lisp Machine 1 as proof.

Except that Termkit provides a distinctly inferior version of the objectedness found in Dynamic Windows. And it sure as fucking hell didn't take the Symbolics team a year to program it! Actually, I predict that like many other similar projects to improve Unix, Termkit will die quietly.

What is Unix? It is a sick fucking joke. An undead monstrosity shambling along from the dinosaur age when interactivity didn't exist. Unix was created for batch processing and was never meant to work with anything better. And it has failed to adapt to newer hardware.

Sunday, February 27, 2011

Proof That Unix Programmers Are Retarded Morons

It doesn't get any more retarded than being unable to name colours. Like confusing silver for grey, or roses for chestnuts, or violets for royal purple, or bamboo green for every other forest green.

And the worst part of it is that HTML sucks and was obviously made by retards, so Unix programmers are losing a colour-naming contest with retards. Because they think glowing CRT colours are reasonable standards.
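
If you want it in black and white, the hex values come straight from the CSS named-colour table, and they prove the point: "darkgray" is lighter than plain "gray". A small Python check (the brightness helper is a crude invented stand-in, not a colourimetric formula):

```python
# CSS/X11 named colours with their specified hex values.
NAMED = {
    "gray":     0x808080,
    "darkgray": 0xA9A9A9,  # yes, "dark" gray is the LIGHTER one
    "silver":   0xC0C0C0,
}

def brightness(rgb):
    """Crude brightness: average of the three 8-bit channels."""
    r, g, b = (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF
    return (r + g + b) / 3

assert brightness(NAMED["darkgray"]) > brightness(NAMED["gray"])
```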

I realize now that 'retarded' and 'moronic' are inaccurate descriptions of Unix programmers - inbred is better. The only colours they ever dealt with were from CRT screens, so they thought it reasonable to name colours based solely on their own inbred concerns.

Unix programmers are inbred morons who don't know anything outside of their own little pathetic Unix world.

Tuesday, December 28, 2010

Why Linux Is Decrepit Donkey Crap

Nobody seems to realize how truly ancient Linux is. We're living in the internet age when 5 years is a long time and 10 an eternity, yet Linux is over 40 years old. It wasn't known by that name 40 years ago but that hardly matters since it's hardly changed since. Linux dates back to a prehistoric era when computer dinosaurs roamed the Earth and stern patriarchs ruled the home. And it shows!

Dictatorship

Linux has 40 years' worth of mistakes and missed opportunities accumulated in a gigantic pile of crap. The first mistake was refusing to use a capability security model in favour of intrinsically broken access control lists. Oh sure, the morons at the time didn't know ACLs were intrinsically broken. What they DID know was that ACLs appealed to evil totalitarian power-hungry pieces of shit. Far from taking the hint, they considered it a plus! Which is why Linux's model of users is still based on Fascism.
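
For readers who've never seen the difference, here is a toy Python model of the two styles (all classes and names invented purely for illustration, not any real kernel API): an ACL re-checks your identity on every access, while a capability is an unforgeable reference whose mere possession is the permission.

```python
class ACLFile:
    """Ambient-authority style: every access re-checks who is asking."""
    def __init__(self, data, allowed):
        self.data, self.allowed = data, set(allowed)

    def read(self, user):
        if user not in self.allowed:
            raise PermissionError(user)
        return self.data

class Capability:
    """Capability style: holding the reference IS the permission."""
    def __init__(self, data):
        self._data = data

    def read(self):
        return self._data  # no identity check; possession suffices

acl_file = ACLFile("secret", allowed={"root"})
cap = Capability("secret")
assert cap.read() == "secret"   # whoever holds cap can read, full stop
try:
    acl_file.read("alice")      # alice is not on the list
except PermissionError:
    pass
```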

Stalin would approve of a unique special super-user that has total control over every other user's existence. Everything from creating users to parceling out resources to controlling which groups exist and who belongs to them. Approve? That's putting it mildly. Stalin could only DREAM of having this much power. Yet moronic programmers don't think twice about the dictatorship that's hard-coded in this supposedly "multi-user" OS.

Apparently, not only do most people think totalitarian dictatorship's a fine solution to political problems, they also think it's the first and best solution. Something de rigueur and hardly worth mentioning. My mind boggles.

Low Level

The second mistake was becoming wedded to an intrinsically decrepit low-level programming language. I'll remind you that LISP had already been invented by then so it was clear that C was horrifically low-level. That too was considered a plus! But it didn't stop there since Ken Thompson made his inability to grasp higher-level abstractions blatant when he went on to create Plan 9 at Bell Labs.

Plan 9 is "what Unix should have been". It is an abject failure even by Thompson's rather miserable standards (processes aren't really files in Plan 9 and can't be copied using file tools). In 1980, Xerox PARC released Smalltalk-80. Did Ken Thompson learn anything from it? Fuck no. Throughout the 80s and even in the late 90s when object-orientation was the buzzword of the day, the whole concept of objects was completely beyond Ken Thompson. You see, he was still obsessed with FILES.

The fact that Linux failed to stick to its paradigm of everything-is-a-file when the need for a graphical system came along never made Thompson think twice about whether this pathetic files+bytestream filters paradigm was good enough. The fact that Smalltalk was awe-inspiringly beautiful apparently never crossed his mind either. No, he spent over a decade trying to redeem his failure. Trying to prove that he wasn't some overpaid loser that got lucky and struck it rich.

Which just goes to show: living in the past doesn't pay off, it only proves you're a loser has-been incapable of moving on. Ever heard of "moving on" Ken? Bah, never mind. Alan Kay never learned to move on after his failure to create a programming language for children either. Hint: it can't be done because half the population lacks analysis and can't master programming at any age.

No Graphics

That's another thing: Linux grew up long before 2D graphics was a gleam in anyone's eyes. And it never, ever integrated it. X Window is a piece of crap that's officially "not part of the operating system" despite the fact that it has to run as super-user with the ability to crash the machine at will. This was considered a plus!

That's right, just because a major piece of the programmatic base of the machine violates ALL OF the fundamental principles of the OS, that's no cause for concern. Just say it's "not part of the OS" and you can pat yourself on the back for fixing the problem. And who cares if they don't integrate with each other?

I mean, it's not like anyone would ever want the window they're using to be given absolute priority by the machine! It's not like Linux is suddenly an interactive operating system just because it's suddenly got graphics. Who cares if windows suddenly become unresponsive? Batch processing and teletypes are so much better! It's not like users want to actually be in control. It's not like they want the machine to be responsive.

More than 25 years later, and Linux still can't handle 2D graphics correctly. How many years has it been since 3D graphics became commonplace? At the rate it's stagnating, Linux will have integrated 3D graphics in about 100 years. That's 50 years for the first half-wit systems developer to grasp what 3D is good for in operating systems, and another 50 years for Linux to get it. So yeah, in the year 2110, Linux might be fit for use by our generation.

Metaphors Make Bad Models

Let's not even go into the "desktop" and "office" metaphors. Because the whole notion of using, let alone relying on, metaphors in systems design is vile, repulsive and disgusting. It's the kind of crap that ought to get your design credentials revoked for life. Yet again, software developers consider it a plus! Then again, software developers have always been pretty contemptuous of everyday users.

Fucking "icons", as if users were weak-minded retarded morons who can't possibly learn what an object's representation means without it superficially resembling something completely different they're already overly familiar with. You'd almost think users were as retarded as software developers. That'll be the day. When users spend all of their time on projects, 90% of which fail to deliver any value to anyone, then we can start comparing users to developers.

Fucking technological obsession, as if there weren't perfectly good buttons on the keyboard to use in concert with a button-less mouse. Ever heard of synergy?! It's a word. If you don't know what it means then go back, back to the 80s! But no, a mouse is a Holy Technological Artifact and a keyboard is a Completely Separate Holy Technological Artifact, and never the twain shall meet in the mind of the users. Why, it would be sacrilege!

Ever wonder why there has to be a metaphor of the physical mouse inside of the user's model of their workspace? I did and the answer is obvious: because software developers are too fucking stupid to understand "the user's model of their workspace". In fact, software developers are too fucking stupid to understand that what's inside the computer isn't their workspace!

Proof: you only have one mouse, so you "should" have only one mouse pointer, right? Wrong! You should have two. That way you can swap between them at will. You leave one somewhere then swap and move the other one to the other end, then hit some key (on the keyboard) and all the objects between your pointers get selected. It is fucking obvious when you think about it. It is fucking obvious when you're unhappy with drag-selection being modal and bother to spend just a few hours coming up with something better!
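
The two-pointer idea is simple enough to sketch in a few lines of Python (a toy model with invented names, not any real toolkit): park one pointer, move the other, and everything inside the rectangle they span gets selected with a single keypress.

```python
def selection_rect(p1, p2):
    """Normalised bounding box spanned by the two pointers."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def select(objects, p1, p2):
    """Return the names of objects inside the two-pointer rectangle."""
    left, top, right, bottom = selection_rect(p1, p2)
    return [name for name, (x, y) in objects.items()
            if left <= x <= right and top <= y <= bottom]

icons = {"a": (10, 10), "b": (50, 40), "c": (200, 300)}
# Pointer one parked at the origin, pointer two moved to (60, 60):
assert select(icons, (0, 0), (60, 60)) == ["a", "b"]
```

Note that nothing here is modal: both pointers persist, so the selection can be adjusted from either end at any time.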

Yet it's equally "obvious" to an OS developer that what's inside your workspace really belongs to them. They're programming software drivers for one mouse so in their atrophied minds, it's obvious there can only be one pointer. Never mind what's in your mind, because you don't matter!

Conclusion

It's amazingly difficult to convey exactly how ancient, how capricious, how fucked up, and how totally dysfunctional this piece of crapware called Linux is.

It's vile, incoherent, inconsistent, contradictory, meaningless, arbitrary, senseless, ad hoc, unprincipled, bloated, arrogant, dictatorial, fetid, rotten, corrupt, and just plain evil. And that's when it's working, when its laughable insecurity (called "security" for some incomprehensible reason, I suppose it sounds better) hasn't been cracked open like an eggshell, your computer hasn't crashed, and your data hasn't been corrupted.

And do you want to know why? It's because software developers are egotistical pricks. They have the same cognitive type as engineers, physicists and economists. But they have fewer limitations imposed on their work. They don't answer to physical reality, they don't answer to their peers thirsting for power & knowledge, and they don't answer to rich & powerful masters wanting to oppress the poor. They answer only to themselves. And that's why Linux is a pile of decrepit unusable crap.

To a software developer, running amok having "fun" is called hacking ... and it's a plus!


A warning to the people
The good and the evil
This is war

Saturday, November 20, 2010

My Life Projects

When I hit 20 my life projects were:

  • two SFF trilogies
  • a formal theory of morality
  • a formal theory of mind
  • integrating and modeling my own mind
  • designing an ideal OS

The first was abandoned pretty quickly since I couldn't write happy characters. And then about a year ago I discovered extraordinary writers are all non-analytic. So far, Yudkowsky is the only exception.

The next two are more or less done. Any further progress on them has been shelved because it wouldn't affect my life or the real world. I haven't read an AI researcher I didn't come to despise so telling them how to construct a mind sounds like a bad idea. Even assuming they would listen which they never would.

The second-to-last is done to the point where any further progress wouldn't even be visible. For something with no definable end goal, it's as done as done can be.

The last project would affect my life directly, indirectly, affect the whole world, and eventually transform it utterly by undermining hierarchical media and teaching direct democracy. It's the ultimate example in leveraging meta levels.

Design work is substantially complete. The holdup has always been implementation. First because analysis isn't my strong point, second because I end up regressing every time.

That is, I need some tool to do something, discover that all the tools presented as candidates are hopelessly inadequate, and end up having to learn a whole new subject domain to build the tool. The new subject domain is invariably something I dislike and resent learning which causes an enormous holdup until I feel comfortable in it.

I am at the point of working on an OpenGL framework because I need one to build a weird 3D engine, because I need that to build a 3D UI paradigm, because I need a UI for the first application that's going to take advantage of my OS. It's a good thing I'm going to be publishing from the top down: UI first with very little OS, then the application, then the rest of the OS.

Yes, it takes developing software across five levels of abstraction (3D, engine, objects, UI, social) in order to redefine human-computer interaction, in order to displace Unix, teach real democracy, undermine top-down media, and redefine software licenses, in order to transform the software industry, in order to transform the real world's politics and economics. And it takes having every single level of that hierarchy in mind simultaneously.

I started a blog on design a few years ago intending to explain this whole process but there didn't seem much point.

Needless to say, I have more projects now.

Thursday, July 10, 2008

Firefox Is For Cunts

Case study: after some browsing I casually tossed my keyboard aside in such a way that the F1 key got pressed continuously by a cable. By the time I noticed what was going on, Firefox was trying to open 50+ help tabs. Which of course slowed it to a crawl.

I killed Firefox, then tried to edit the profile.js file. gEdit promptly froze on me as well. And examining the file with OpenOffice made me give up in disgust. So I tried to reopen Firefox, hoping it would not try to open all of those tabs, or would at least let me close them as fast as they were opening. And of course, it froze my computer.

So I try to reboot my computer but control-alt-delete doesn't work and I have to do a hard reboot. And when I get back into my computer it demands to know whether I want to restart the last session of Firefox (hell no) or start it with a clean slate (fuck no). So I hit ESCape which somehow doesn't cancel but defaults to clean slate. Okay no panic, last time I did that I was able to kill Firefox and get back my previous session. So I proceed to do this and I find out that profile.js has been completely wiped clean. Not a trace of the original dozen tabs I had open exists anywhere. They aren't in history, they are nowhere.

Now Firefox and Unix weenies will dismiss it all as a bunch of "accidents" or, even worse, claim it was my fault - it's always the user's fault as far as incompetent programmers are concerned. But there are at least two dozen principles of systems design that have been fucked up the ass in this case.

And what's tying it all together? Massive arrogance. Massive overweening I-know-what's-best-for-you-even-though-I-hold-you-in-utter-contempt arrogance. This "accident" could not have happened except that at every single step of the way, the programmer decided that he knew what was "best" for the user and decided to inflict it on them.

Even Internet Explorer, which is barely usable, didn't have that much arrogance. Whenever IE crashes, which is often, it's feasible to recreate your list of open windows by going through your history. But not Firefox, because the history doesn't keep anything as simple as the pages that have been opened (programmer model) or the pages that are open at that moment in time (user model). Instead, the programmer decided to be "smart" and keep only the pages that have been opened by user action. So now those tabs which I initially opened an unknown number of weeks ago are nowhere in history.

And it's like this all the way down the line. Why did Firefox freeze my computer? Because Unix system programmers are morons incapable of comprehending the user model of scheduling (the main interface window has absolute priority and applications inherit resources from open windows) and also incapable of sticking to the batch programming model which Unix's incompetent designers built into it initially. They had to get "smart" and fuck it up.
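
The "user model of scheduling" described above can be sketched in a few lines of Python (a toy model, every name and number invented for illustration): the focused window gets the lion's share of the machine unconditionally, and everything else splits the remainder.

```python
def schedule(windows, focused, total_cpu=1.0, focus_share=0.8):
    """Give the focused window absolute priority; split the rest evenly."""
    others = [w for w in windows if w != focused]
    if not others:
        return {focused: total_cpu}
    shares = {focused: total_cpu * focus_share}
    for w in others:
        shares[w] = total_cpu * (1 - focus_share) / len(others)
    return shares

shares = schedule(["editor", "browser", "indexer"], focused="editor")
assert shares["editor"] == 0.8  # the window in front always wins
```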

This travesty is the direct result of programmers' addiction to adding "features" over the users' dead bodies. Features which proceed to interfere with basic functionality in ways that make it unreliable or non-existent. Because programmers are mindless robots incapable of comprehending good versus evil. Like the mad engineers working on atomic weapons, warplanes, biological weapons and anti-children mines, they only care about getting a shiny new device. Not something constructive for the world.