Showing posts with label software.

Wednesday, June 17, 2015

Criticism of AO3's User Interface

A few observations on UI design prompted by a comment I read on reddit.
Look at Archive Of Our Own for a good example. It has a much, much easier interface than fanfiction.net, its tagging system is rich and has a lot of advantages, it has nice features like letting you group stories into collections, you can reply to reader comments with proper discussion threads, etc. 

On AO3 you can't do social tagging as you could on Delicious more than a decade ago. There is only dictatorial tagging, with the author as sole dictator. As a reader, you also can't cut out insignificant tags, whether they're so general as to be meaningless or so unique as to be unguessable. So those useless tags clutter up your attention and screen space while browsing, slowing everything down. Even though detecting insignificant tags in software is trivial.
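Trivial as claimed: a sketch of insignificant-tag detection by frequency alone. The thresholds, the function name, and the sample tag data are all invented for illustration; a real archive would tune them against its corpus.

```python
from collections import Counter

def significant_tags(tagged_works, min_count=2, max_fraction=0.8):
    """Keep only tags that are neither near-universal nor near-unique.

    tagged_works: list of tag sets, one per story.
    min_count: tags used fewer times than this are unguessably rare.
    max_fraction: tags on more than this fraction of works are meaningless.
    """
    counts = Counter(tag for tags in tagged_works for tag in tags)
    limit = max_fraction * len(tagged_works)
    return {tag for tag, n in counts.items() if min_count <= n <= limit}

works = [
    {"Fluff", "Angst", "Fic"},
    {"Angst", "Fic", "Slow Burn"},
    {"Fic", "Slow Burn", "My Very First Story Ever!!"},
    {"Fic", "Angst"},
]
# "Fic" is on every work (meaningless); the one-off tag appears once (unguessable).
print(sorted(significant_tags(works)))  # ['Angst', 'Slow Burn']
```

That's a frequency count and two comparisons; no machine learning required.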

You also can't combine search terms with logical operators; you can only nest searches in a narrow way. You don't even have access to the simple NOT operator. And finally, the whole paradigm of search is entropic, because it annihilates the lattice structure inherent in browsing rather than filtering that structure and collapsing its sparse branches.
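The missing operators amount to a few lines of set algebra. A hypothetical sketch (the search function and the data are invented for illustration):

```python
def search(works, all_of=(), any_of=(), none_of=()):
    """Filter works by tag sets: AND, OR, and the NOT operator AO3 lacks."""
    results = []
    for work in works:
        tags = work["tags"]
        if not tags.issuperset(all_of):      # AND: every required tag present
            continue
        if any_of and tags.isdisjoint(any_of):  # OR: at least one of these
            continue
        if not tags.isdisjoint(none_of):     # NOT: none of these
            continue
        results.append(work)
    return results

works = [
    {"title": "A", "tags": {"Angst", "Slow Burn"}},
    {"title": "B", "tags": {"Fluff", "Slow Burn"}},
    {"title": "C", "tags": {"Angst", "Major Character Death"}},
]
hits = search(works, all_of={"Angst"}, none_of={"Major Character Death"})
print([w["title"] for w in hits])  # ['A']
```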

The comment system offers no levels of privacy and no way to aggregate comments or readers in any useful discussion-oriented manner. That's because AO3 isn't about discussion or debate or criticism or anything like that; its model is strictly pairbonding and attentional. That's why "gift stories" exist on AO3. Everything is about readers giving attention to authors, and possibly authors giving attention to readers and to each other. And that is IT. AO3 is functioning at the level of a human infant.

And because every comment is public, it's impossible to hide anything, which means strangers can scrutinize your interactions and apply their values to them. And which values get applied in any large open community? Gaian values of doing nothing, surviving at any cost, and giving no offense to anyone for any reason whatsoever. Anyone who critiques others heavily is labeled an antagonist, a "troll", anathema to the rule of giving no offense, and is driven out of "the community", which matters more than any abstraction like "good storytelling" or improvement or truth or critique. Gaians run every open forum in existence, and most closed ones. It's easy to ban people as "offensive" when everyone can see all of their output and you just have to wait for an asshole to report being offended.

Some features I'd like:

* For readers: a feature to auto-recommend stories based on the currently viewed story or your pattern of likes, sort of like Amazon does.

Extremely difficult to implement; it requires machine learning. Unlike cutting out insignificant tags, which only requires an asshole to think about it. (I say asshole because there are virtually no systems designers anywhere, so you rely on assholes for all your software needs. Enjoy your frustration.)
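For what it's worth, there is a crude non-ML approximation: nearest neighbors by tag overlap (Jaccard similarity). This is nowhere near real collaborative filtering, and everything here, names and data alike, is invented, but it shows the shape of the thing:

```python
def jaccard(a, b):
    """Similarity of two tag sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(current, catalog, top_n=3):
    """Rank other stories by tag similarity to the one being viewed."""
    scored = [(jaccard(current["tags"], other["tags"]), other["title"])
              for other in catalog if other["title"] != current["title"]]
    scored.sort(reverse=True)
    return [title for score, title in scored[:top_n] if score > 0]

catalog = [
    {"title": "A", "tags": {"Angst", "Slow Burn", "Canon Divergence"}},
    {"title": "B", "tags": {"Fluff", "Slow Burn"}},
    {"title": "C", "tags": {"Angst", "Slow Burn"}},
    {"title": "D", "tags": {"Crack"}},
]
# Viewing story A: C shares two tags, B shares one, D shares none.
print(recommend(catalog[0], catalog))  # ['C', 'B']
```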

And in any case, the "features" paradigm of design is Gaian. A feature is a property of something that has a specific purpose: X is for Y. It's an incredibly narrow way of thinking that invariably produces broken, kludgy software that's impossible to understand. Look at C++: it's a Gaian "evolved" programming language which absolutely no one understands, which everyone claims can do anything, and which in practice can do nothing. Every program written in C++ is intrinsically broken, precisely because of its many, many, many "features". Brokenness is an intrinsic property of Gaianness, or of any biological, ecological, or evolved system. The only property those systems have is survival, by cravenly appealing to their audiences or neighbors.

Genuine design works on the meta-level by COHERING features and ERADICATING features that don't fit within higher principles. Or even merely eradicating features whose benefit doesn't clear a certain minimum: the cost of the feature taking up the user's and every programmer's mental space (so they can learn to avoid it, as they inevitably must with broken crap). Undesigned "evolved" systems, with features added and added and added and none ever taken away because some asshole depends on them, are intrinsically broken, because 90% of their features go unused by 90% of the population, and 50% go unused by 99%. And it's always a different 1% that wants each tiny feature and will whine about it. And Gaians will cravenly say there is a "demand" for each feature and that it would take "effort" to remove them.

* For authors: a way to upload chapters and have them published automatically at preset dates/times.
See, instead of having a "feature" that does this, such as a script or something, you could just have a versioned object system that symmetrically allows you to read and write both the past and the future. So instead of a "scheduled" story, you just have a story that's written in the future. But nobody does systems design, least of all "architects"; everyone is enamored of their little "features", which are really kludges.
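A minimal sketch of what "a story written in the future" could mean: a versioned store where every write carries a timestamp, and reads only see versions whose timestamp has arrived. Scheduling then isn't a feature; it falls out of the read rule. (All class and method names here are invented.)

```python
import time

class VersionedStory:
    """Chapters are (visible_from, text) versions; a read at time t sees
    only versions with visible_from <= t. "Scheduled publishing" is just
    a write whose timestamp lies in the future."""

    def __init__(self):
        self.versions = []  # list of (visible_from, chapter_text)

    def write(self, text, visible_from=None):
        ts = visible_from if visible_from is not None else time.time()
        self.versions.append((ts, text))

    def read(self, at=None):
        now = at if at is not None else time.time()
        return [text for ts, text in sorted(self.versions) if ts <= now]

story = VersionedStory()
story.write("Chapter 1", visible_from=100.0)
story.write("Chapter 2", visible_from=200.0)  # written "in the future"
print(story.read(at=150.0))  # ['Chapter 1']
print(story.read(at=250.0))  # ['Chapter 1', 'Chapter 2']
```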

The advantages of writing in the future are obvious: no more deadlocks ever, but rather a solid transaction-oriented system. The advantages of writing in the past are difficult for lesser minds to fathom, but they are far, far more extensive. They boil down to: equity. If designed correctly, writing to the past can be done without the possibility of unending time wars.

But of course, there is that problem of "if designed correctly" and the lack of systems designers on Earth. At least recognized, job-holding ones with significant whip power. It doesn't help that few systems designers understand systems design or can teach it, so you mostly end up with assholes with pretensions of grandeur, like the inventors of Unix and C. Both of them fifty-year-old dinosaurs that really should have every single contributor to them summarily put to death.

I am not prepared to forgive the existence of C or Unix or C++.


So in summary, AO3 enforces the values of brokenness and inequity while superficially appealing to values such as freedom, which in practice amount to an anarchistic hatred of all social organization.

Monday, March 16, 2015

Functional vs Object-Oriented Programming

Long ago I said that functional and OO were opposites in a way, pointing to the fact that functional is verb-oriented whereas OO is noun-oriented. Well, today I have discovered the relation they have to each other. Functional is a gutless paradigm, and OO is total.

They have the exact same relation to each other as deontology vs consequentialism, and for the exact same reason. Deontology is obsessed with obeying rules about actions regardless of consequence (state) and regardless of context, even when those rules appear blatantly insane and the consequences are insufferable. The question is WHY? Why would anyone do such a thing?

Obviously, deontology was invented by gutless people to deal with a universe they can't bring themselves to even comprehend. See no evil, hear no evil, speak no evil. The mantra of the gutless insane fuck. These people make up rules to deal with the universe and then they ... *hope for the best*. Even when they're facing the very worst, when they're facing an immoral unjust crazy and evil universe. Even in such a situation, they blind their eyes to the truth and hope for the best.

If you're going to blind and deafen and mute yourself to the actual state of the universe (state, get it?) then you have to blind and deafen and mute yourself to any consequences of your actions (future state). And while you're at it, you might as well blind and deafen and mute yourself to ALL state, because state is painful, isn't that right? If you have no coherent notion of state then you can't USE state (or nouns) in your methodology. You must resort to referential transparency (and verbs) as a last-ditch method, no matter how insane it is.

The opposite of deontology is consequentialism. Consequentialism is about NOT blinding yourself to the actual state of the universe, be it so harsh or vile or nauseating or evil. Consequentialism is about understanding the state of the fucking universe, no matter how disgusting it may be, because only then can you ameliorate it. Only then can you make it less bad. Only then can you make it LESS harsh, LESS vile, LESS nauseating, and LESS evil. Consequentialism is about lessening badness.

And object-orientation? It is about understanding fucking reality. Especially the fact that we live in a STATEFUL universe. A universe where objects clobber their past versions, where objects have side effects, and where objects clobber other objects. THAT ... is ... THE UNIVERSE. Object-orientation is about fucking reality, and functional programming is about being gutless and weak and living in a never-never fairyland full of sugar plums and fairies.
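The contrast, stripped to a toy (names invented; Python here plays both roles): the stateful style clobbers its own past version in place, the referentially transparent style refuses to and returns a new value instead.

```python
# Stateful / object-oriented: the account clobbers its own past.
class Account:
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount  # side effect: the old balance is gone

# Referentially transparent: no state is touched, a new value is returned.
def deposit(balance, amount):
    return balance + amount  # the old balance survives, unchanged

acct = Account(100)
acct.deposit(50)
print(acct.balance)  # 150; the 100 no longer exists anywhere

old = 100
new = deposit(old, 50)
print(old, new)      # 100 150; both versions coexist
```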

Functional programming is despicable.

And logic programming.

And declarative programming.

Inferior tools for emotionally inferior minds.

Tuesday, January 27, 2015

No Interfaces Worthy Of The Name

Between self-sufficient concepts such as cars or Class Car on the one hand and large-scale models on the other, there are interfaces. Not interactions or relationships, but interfaces. The problem is that the software industry is extremely impoverished in interfaces, and as a result dealing with it is excruciating torture to me, because interfaces are what I'm best at and what I love. MVC is a good example of an interface that is debilitatingly painful to me, being hopelessly broken and low-level.

Think about how many concepts there are for large-scale structure. Patterns, architectures, frameworks, libraries: that's four categories already. Then think how many concepts there are for small-scale self-sufficient structure; probably hundreds. And then count the concepts for interfaces that one would actually be willing to use (so command and instruction and function don't count).

Events? But events are broken and not first-class, so they aren't real. Object-capabilities? Disgustingly low-level and broken. Object? Maybe, but that really counts as small-scale structure, or as a non-interface. So there's message passing, inheritance, polymorphism? That one doesn't count. Delegation. Cloning vs instantiating, subclassing. Oh yes, aspects vs crosscutting, those are nice. Agents? Not really. Actors? Hmm, maybe, maybe not. Probably not. Meh, probably yes, but the problem is I just don't give a damn, since it's about distribution and concurrency.

So there are no first-class events, there are no first-class dependencies, and aspects aren't in any language I know. Transformational programming seemed in its infancy when I first heard about it, and I've never heard anyone mention it since. Namespaces suck rocks, so they're broken. Naked Objects? Oh yeah, there's some guy who implemented it as a library or framework in Java; good for him, honestly, but it doesn't count. Especially with the implementation being so kitsch and primitive rather than thorough and comprehensive. I mean, where's the IDE built on naked objects? Nowhere.

There are remote message sends, proxy objects, doesNotUnderstand:, and NullObject; those are another four interface concepts. So that makes what? Ten? An even dozen? Twenty? It doesn't matter how many there are, because here's the sick thing: they're enumerable. And they're not categories of things either; they're discrete instances of interfaces.

The software world forms an uncanny-valley-type field to me. There's large-scale structure, and there's small-scale structure, and there are no bridges between them.

I don't think I'm the only one who loathes debugging or reverse-engineering with a passion. But I do think I'm the only one who understands why. The tools are worthless because the concepts needed to even minimally support asking "where did this bug come from?" and "how do I use this?" don't exist in software.

Sunday, January 18, 2015

Conversation On Secure Multiplexing

I drew some insights into the execution stack from TUNES. More of them than from the whole exokernel thing.

The main and only insight from the exokernel was that secure multiplexing is independent of abstraction. You can have ONLY secure multiplexing, enabling you to present something that looks exactly like the bare resource you're multiplexing. That insight fueled Xen and other hypervisor-style virtualization.

The only problem with it is that it's a lie. Secure multiplexing is an abstraction by itself. You run into the limitations of the abstraction if you push it, exposing the underlayer's existence, at which point the abstraction starts to fray and reveal its nature. For example, it becomes clear that there ARE other OSes running on top of the hypervisor because there's "missing time". And then it becomes obvious that hiding the guests from each other and permitting no way to cooperate or interact is a choice of abstraction.

Joe B: fuck, I comprehend nothing

Okay, say you've got a CPU. Now the traditional way to multiplex it (slice and share it) is with a scheduler. The problem is that OS schedulers look nothing like CPUs; they're higher-level. What people managing a cloud ideally want is to present CPUs, bare and naked, and tell everyone to fuck off, because hey, there's your CPU, your problem.

Now they don't want those CPUs to be REAL CPUs, because that's not scalable. But they also don't want them to interact, so that one asshole customer can't bring the whole business to its knees. They want no-stick teflon quarantine isolation from each other. Better than quarantine: they want everyone stuck in their own reality with no way to guess that it's a virtual reality.

multiplexing = slicing and sharing
secure multiplexing = teflon nostick compartmentalized quarantined isolated slicing and sharing
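A toy version of those definitions: one resource (here just "who gets to run a step"), sliced round-robin among clients who each see only their own state and cannot observe each other. Everything here is invented for illustration.

```python
class SecureMultiplexer:
    """Round-robin slicing of one resource among isolated clients.

    Each client is a generator; it sees only its own state. The
    multiplexer decides who runs; no client can observe or touch
    another's slice."""

    def __init__(self):
        self.clients = []

    def attach(self, client_gen):
        self.clients.append(client_gen)

    def run(self, slices):
        log = []
        for i in range(slices):
            client = self.clients[i % len(self.clients)]
            log.append(next(client))  # give this client one slice
        return log

def counter(name):
    n = 0
    while True:
        n += 1
        yield f"{name}:{n}"  # each client counts in its own private reality

mux = SecureMultiplexer()
mux.attach(counter("alice"))
mux.attach(counter("bob"))
log = mux.run(4)
print(log)  # ['alice:1', 'bob:1', 'alice:2', 'bob:2']
```

Note that each client's count advances only on its own turns: from the inside, each one looks like it has the whole machine, which is exactly the "missing time" tell described above.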

If you're a bank, you give out gold. But you want to give out virtual gold tokens that function just like actual gold, and you want to give out as much as people will buy without collapsing your business. You don't want to give out REAL gold, because most of it is just going to sit unused in people's homes rather than being consumed in jewelry and electronics. And if people are only going to trade the tokens, then they only need to be pseudo-real enough for the purposes of trading. The virtual gold tokens need to look and feel real when they're being tested by a buyer, and at no other time. That is money.

Any questions? Or is this too primitive?

Joe B: no, this is perfect

Well, the exokernel folk tried to pull the same stunt as gold => money, but with CPU+memory, or in general 'comp hardware'. The only problem is that nobody pretends that money ACTUALLY IS gold. Nobody tries to melt money down to make jewelry. Nobody tries to electroplate anything with it. So what these guys were doing was ... debasing.

They were debasing CPU+memory+hardware and saying "it's just as good as the real thing!!" The problem with that is that inevitably they'd run into someone trying to treat it EXACTLY like the real thing (i.e., someone who bought into the propaganda), who then tries to use the debased gold to electroplate something ... and feels cheated because it doesn't work.

So with an exokernel, if you have a really high load on the CPU, with many operating systems, you come to have missing time, and the whole mockery of it being teflon and no-stick comes crashing down. Now, it's not a problem if the cloud provider's admins keep watch on resource utilization and add more physical computers in time ... but those admins can't pretend to themselves that it's JUST AS GOOD AS real physical computers.

And if you're going to have something that's intrinsically different from physical computers, then why not do away with some of the problems of it? So the exokernel folk's attitude that their project was somehow purer and better than everything else is just a lie.

What does the Unix scheduler provide as an execution abstraction? It provides processes. C processes, to be specific. GemStone provides Smalltalk processes, or even Smalltalk images. The C processes *ARE* images; they're just dumb-as-fuck images ...

So what is the exokernel lesson? The REAL lesson? At any time, at any point in the stack of abstractions, you can insert a circular loop from a node (layer) to itself, presenting a facsimile of that layer higher up. And if you understand that then the whole exokernel project is revealed as limited in scope because it was providing ONE such circular loop among the one to two dozen layers of abstraction found in a typical operating system.

Joe B: what is this layer, and how does it loop on itself? is it the physical computer, which loops by resources being added to it?

It's any layer. You can take ANY layer and make it loop in on itself. The loop forms a layer.

Say you've got a hard disk. It presents blocks. So you can partition it, and now you have four hard disks which also present blocks. And if you're smart, you can make those partitions flexible.
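The disk example as code: a block device presents blocks; a partition consumes that interface and presents the same interface again. That's the circular loop, and because the facsimile is faithful, the loop composes: a partition of a partition works too. (The interface is invented for the sketch.)

```python
class Disk:
    """The underlying layer: presents numbered blocks."""
    def __init__(self, num_blocks):
        self.blocks = [b""] * num_blocks

    def read(self, n):
        return self.blocks[n]

    def write(self, n, data):
        self.blocks[n] = data

class Partition:
    """The loop: consumes the block interface, presents the block interface."""
    def __init__(self, device, start, length):
        self.device, self.start, self.length = device, start, length

    def read(self, n):
        assert 0 <= n < self.length  # isolation: can't reach outside the slice
        return self.device.read(self.start + n)

    def write(self, n, data):
        assert 0 <= n < self.length
        self.device.write(self.start + n, data)

disk = Disk(100)
p1 = Partition(disk, 0, 50)
p2 = Partition(disk, 50, 50)
p2.write(0, b"hello")            # block 0 of p2 is block 50 of the disk
print(disk.read(50))             # b'hello'
nested = Partition(p1, 10, 10)   # a layer looped onto the looped layer
nested.write(0, b"deep")
print(disk.read(10))             # b'deep'
```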

Say you've got a monitor with one framebuffer. Well, you can partition the monitor and present multiple framebuffers, and those are now called windows. Or you can have multiple monitors present as one framebuffer.

You generally need some OTHER resource mixed in with the first one in order to fake the first resource.

gold + paper = paper money

If you could completely supplant the underlying resource, you would do away with it and it would be called a change of technology.

How many different sockets does TCP allow? And they all run over a single physical copper wire. The phone company uses multiplexing to provide virtual circuits instead of real circuits.

Richard: you got what I said about OSI, right? about how SOCKS is just a circular loop of a layer?
Joe B: oh yes. I got the words, not the concept. I'd have to learn the OSI model first.
Richard: SOCKS provides a sideband and extension to the layer below but it really does nothing else. Much like barebones secure multiplexing provides a sideband, although the exokernel tried to pretend the sideband didn't exist.

application layer (protocols used by applications, supposedly close to humans)
V
transport layer (virtual circuits)
V
network layer (packets)
V
link layer (0s and 1s to the next computer)
V
physical layer (physical connectors, physical cables, electrical voltages, radio frequencies)

Joe B: okay, that makes sense

In the fibersphere model there are no packets, and the virtual circuits are close enough to real circuits that they're fused into the link layer. Too bad we have no fibersphere, because it might have been resistant to wiretapping: you'd need to own a substantial fraction of the world's computing resources to wiretap everybody. Not even to interpret or analyze the traffic, JUST to wiretap.

So, the OSI model provided two additional layers beyond the above, and both of them were sidebands off of the application layer and the transport layer. SOCKS takes virtual circuits and provides ... virtual circuits, plus some proxying and crypto. The so-called presentation layer took in application stuff and provided ... different application stuff. MIME took text and provided images, both of them application-layer objects.

The fact that these two layers sit BESIDE the application and transport layers really confused the dumbasses who made OSI (which means moralists, since this was a standard). They thought: since SOCKS takes in virtual circuits, we'll just ignore that it also provides virtual circuits; we'll focus on the other stuff it provides and call it a higher layer. And as for the presentation layer: since nothing is closer to humans than applications, by definition, it follows by stupidity that presentation must be below applications, and let's ignore the facts to the contrary.

Joe B: yeah, I stalled at trying to distinguish application from presentation

An email is an application object; the application layer provides for emails. Well, MIME took emails and provided images, and that's exactly how Gmail attachments work. They just hide the MIME, as clients should have done from the start but didn't.
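The MIME point is easy to see with Python's standard email package: the "image" rides inside the message as base64 text, at the same application layer as the email itself. (The image bytes here are fake.)

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "an application object"
msg.set_content("The email body: application layer.")
# MIME extends the same layer: an "image" becomes base64 text in the message.
msg.add_attachment(b"\x89PNG fake image bytes", maintype="image",
                   subtype="png", filename="pic.png")

wire = msg.as_string()         # everything, image included, is just text
print("base64" in wire)        # True: the image travels as encoded text
print(msg.get_content_type())  # multipart/mixed
```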

Basically, those two layers are extensions of existing layers rather than separate layers in themselves. Extensions which aren't accepted enough to be considered part of the same layer, or weren't at the time OSI was made. Hence the session and presentation layers belong on the same level as transport and application ... just beside them.

Joe B: so… a loop layer is one that can take in the same entities that it can provide?

It's basically a type of extension of the layer. It's aware of the other layer and the other layer isn't aware of it.

Joe B: hmmm

Joe B: is this design, or is this analysis? well it's both. it's awesome, lol.

It's the kind of high-level analysis that fuels systems design, NOT normal design. It's part of the majestic overlayer that has until now been entirely missing. This is lesson 4?


  • definitions / thinking
  • manipulating datasets
  • injecting values

Wednesday, November 26, 2014

What Is An OO Language?

People who understand Smalltalk make disparaging comments about how Java is Smalltalk minus minus. Something that is literally and historically true as Java was explicitly and deliberately invented as a crippled broken-down version of Smalltalk. A version of Smalltalk made more entropic to appeal to retards who were using still more entropic languages. Because when you want pigs to play with a diamond, why not coat it in mud so it resembles what they're used to? Hard to argue with that logic.

Now as I was saying, people who understand Smalltalk make disparaging comments about Java. And they understand that Java is not at all OO. Contrary to what the cretins say, it isn't true that Smalltalk is "the purest OO language". Smalltalk is not pure, it is highly impure. Smalltalk is crap and is the crappiest of OO languages. Smalltalk is the absolute bare minimum of what an OO language is. And since Java is inferior to the bare minimum, then logically it isn't OO at all. BUT, that doesn't actually explain what an OO language is and why Smalltalk is one and Java is not.

This rabid lethal epidemic of ignorance is what enables cretins such as this guy to compare Java and Smalltalk and Self without ever realizing that "one of these does not belong" much like a monkey does not belong with a man and a woman. So let us dispel the ignorance and talk about what actually makes up OO. Which is of course not classes as the majority of (entirely retarded) people claim. Rather it is objects. And by objects we mean independent dynamic contexts.

Now, the fact that classes aren't objects in Java is bad. The fact that there exist non-object primitive types in Java is bad too. But the fact that, as far as scoping is concerned, objects simply do not exist in Java and are totally irrelevant? That's a deal-killer. No objects in Java <=> Java not object-oriented. And now let's turn to one of the most intrinsic and yet most blatantly, externally obvious properties OF objects, so that everyone can behold the knowledge that Java has no objects and bask in Enlightenment. The Enlightenment that even LISP manages to be OO and Java never will be.

Fermions vs Bosons


Objects in reality are made up of FERMIONS: half-integer-spin particles which obey the Pauli exclusion principle. Bosons are integer-spin particles which do not obey the exclusion principle, and therefore stack on top of each other and FORM NO STRUCTURES. Fermions <=> exclude each other <=> form structures <=> form objects. Bosons <=> stack on top of each other <=> form no structures <=> do not form objects. Bosons are light and radio waves; fermions are planets and stars and idiots who lionize Java.

Now, in Smalltalk and in Self and in LISP, there exist dynamic contexts which EXCLUDE EACH OTHER. They DO NOT STACK. And in Java, those same "dynamic contexts" STACK ON TOP OF EACH OTHER. In Java, an instance of a class can freely play with any variables of any other instance of the same class. Why? Because instances do not matter, because they aren't real, because they don't exclude each other, because they stack in the same volume. In physical reality, you can stack an infinity of bosons in the same volume until the whole volume collapses into a black hole. In Java, you can stack an infinity of instances of a class into the exact same namespace until Java runs out of memory and collapses into itself.

There are no objects in Java because there is no matter in Java because there are no fermions. This is why everyone who's ever so much as played with Smalltalk or Self or LISP has grasped intuitively the feeling that objects in those languages are more "concrete" and more "real". Because they are LITERALLY more physical than the insubstantial ungraspable bosonic crap pseudo-matter which is all you can find in Java. In OO languages, objects have SUBSTANCE, whereas in Java they do not. In OO languages, objects take up VOLUME, whereas in Java they do not. In OO languages, objects PERSIST, whereas in Java they do not. And since classes aren't real in Java, it follows the fact that Java classes DO exclude each other can't matter at all.

In Smalltalk, everything is REAL. Everything is made of REAL objects and REAL matter. Objects have volume, and they jostle each other if you try to make one object reach into the innards of another. It is indeed possible to make them do that, but only by doing surgery, not like a holographic projection passing through you. You can FEEL the resistance against doing this. And classes are even MORE real, because all classes are objects too. You can OFTEN ask classes "you class, give me your name and ID" and "you class, are you class ThisNameIsMine?", and the browser constantly asks classes for their parents and children. And you CAN ask a class for allInstances. And that's the least of what you can do.

So: Smalltalk, LISP, and Self ==> OO + real + objects + matter. Java, C++ ==> dead crap + fake + insubstantial + ectoplasm. Also, OO <=> Good, and Java <=> Bad. The reason Java and C++ prevailed and OO lost is that most people are retarded, brain-damaged idiots incapable of grasping OO. Just as their incapacity for grasping Goodness is the reason we have capitalism and coal and disease and poverty and wars and death. To the retards, Bad is "Good Enough". This is the Worse Is Better crowd.