Monday, December 23, 2013

Information vs Entropy

What is the difference between information and entropy?

Nobody will tell you because as usual the scientific community is made up of inept worthless morons so THEY. DON'T. KNOW. In fact, they will tell you that information and entropy are synonyms despite the fact that they are never used interchangeably but rather are opposites.

Just like the "learned" philosophers will tell you that ethics and morality are synonymous despite their never being used interchangeably and often as complements to each other. Entropic shit-spewers, the whole lot of them, with only single-digit exceptions!

Well, I will tell you. Because I am not a douchebag driven to hide how much he doesn't know to maintain "credibility". Nor am I a douchebag driven to dress up and cryptify what he knows in order to build up "credibility" amongst a crypto-priesthood of like-minded "brothers".

What It Is

bits + ISA = information

bits - ISA = entropy.

What is ISA? ISA is INFORMATIONAL SYNTAX AFFINITY. It's basically, THE INFORMATION YOU LIKE. That's right bitches! The difference between information and entropy is subjective! Something they barely let on in communication theory when they start talking about signal vs noise. Something they LIE ABOUT in thermodynamics when they claim it's about microstates vs macrostates!

What It Isn't

Physicists claim that the motions of the aluminum atoms in your hard drive are microstates, and that the arrangements of aluminum atoms in your hard drive that correspond to the 0s and 1s which your computer reads are macrostates. And, this is the important bit, they claim that the microstates are garbage (entropy) whereas the macrostates are useful (information). In other words, physicists claim that the difference between entropy and information is the META-LEVEL you're interested in. THEY ARE LYING!

Counter-example: how many of you have garbage files on your computer? Ancient ZIP files, corrupt files that can't be played, duplicate and truncated textfiles, automated log files nobody ever looks at, useless "temporary" files that have accumulated, porn you never bother to look at any more, bookmarks you never go to? How many of you have entire FOLDERS' worth of that crap?

And yet this ALL OCCURS AT THE SAME LEVEL AS THE USEFUL FILES! Proving that the distinction between information and entropy has NOTHING to do with meta-levels. But actually, we'll get back to this because the retards were more wrong than could be imagined.

Building Blocks Of The Universe

Some, and I stress, some exceptional physicists have grasped that bits are one of the fundamental building blocks of the universe. The other building blocks are energy and dimension. Although dimension may, MAY, be optional if you successfully reduce it down to ... bits!

Are there other fundamental building blocks of the universe? Why, yes there are! These are all of them,

  • 0. math = symbols + rules
  • 1. energy, the substrate of physical existence
  • 2. dimension, a kind of information, maybe
  • 3. information, symbols given physical existence
  • 4. value, meta-circular loops

Note how time isn't in there. That's because time is just a dimension along which information is conserved. That's it. And since time isn't fundamental and computation is just 'math occurring in time' ... but I digress.

The unfamiliar building block of the universe in there is 'value'. Which is nonetheless startlingly familiar to anyone who's skimmed GEB: An Eternal Golden Braid. I say skimmed since that book was useless to anyone. Meandering, winding and always avoiding making its fucking point! Probably because its author was too stupid to put his point into the kind of rigor which mathematicians prefer and he was afraid of looking stupid. Credibility is the death-knell of science.


Now, I already said that information is intrinsically subjective. You might think you need something like brains to have information at all. But you actually don't. You need neither brains nor computation nor time to have information. What you DO need is a container with bits in it, and for those bits to form meta-circular loops. In other words, for the bits to talk about what the other bits look like!

When a container chock-full of bits is ALSO chock-full of meta-circular loops, then the space of possible information is vastly decreased. When that happens, the bits in the container can be COMPRESSED. And when bits are compressed that's the same thing as if they didn't exist. Look it up, information theory says this most explicitly.
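You can watch this happen on any computer. The sketch below is my own illustration (the container sizes and the repeating pattern are arbitrary choices): a container whose bits largely describe each other compresses down to almost nothing, while a container of patternless bits barely shrinks at all.

```python
import random
import zlib

random.seed(0)

# A "container" full of structure: every byte is predictable from its
# neighbors, i.e. the bits talk about what the other bits look like.
structured = bytes(i % 4 for i in range(4096))

# A container with no such loops: bits that say nothing about each other.
noise = bytes(random.randrange(256) for _ in range(4096))

structured_size = len(zlib.compress(structured))
noise_size = len(zlib.compress(noise))

# The structured container shrinks to a tiny fraction of its 4096 raw
# bytes; the noisy one ends up as big as it started (or bigger).
print(structured_size, noise_size)
```

Note that both containers hold exactly 4096 bytes going in; only the one full of loops comes out smaller.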

So, the more meta-circular loops there are in the container, or the stronger those loops are (the more fully they describe the container's bits), the more VALUE there is in the container, and the fewer bits there actually are. And ISA is a very primitive form of value, because it refers to what a computing machine such as a brain likes, in terms of bits, on a purely syntactic level.

That is, what it likes to see in terms of density and spatial arrangement of bits. Does this brain like for the container to be about half-full of 1s? Almost completely full of 1s? Almost completely full of 0s? Should those 1s and 0s move in time or not? Should they ...? That's ISA.
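As a toy rendering of that idea (the function, its name, and the linear falloff are all my own invention, not anything from the literature): score a bit-string by how closely its density of 1s matches what a particular observer likes.

```python
def isa_score(bits, preferred_density=0.5):
    """A toy 'informational syntax affinity': how closely the density of
    1s in a bit-string matches what this particular observer likes.
    Returns 1.0 for a perfect match, falling toward 0.0 as density drifts."""
    density = sum(bits) / len(bits)
    worst_case = max(preferred_density, 1 - preferred_density)
    return 1.0 - abs(density - preferred_density) / worst_case

half_full = [1, 0] * 8   # density 0.5 -- exactly what this observer likes
all_ones = [1] * 16      # density 1.0 -- as far from its taste as possible

print(isa_score(half_full))  # → 1.0
print(isa_score(all_ones))   # → 0.0
```

A different observer, constructed with a different `preferred_density`, assigns different scores to the very same bits, which is the whole point: ISA is subjective.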

Information And Entropy Are Opposites

Bits + ISA = Bits + Value = Bits + more compression = Fewer actual bits = Information

Bits - ISA = Bits - Value = Bits + zero compression = Maximum actual bits = Entropy
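The pair of equations above can be sketched as a crude classifier, on my own reading (the 0.5 cutoff and all the names here are mine, picked purely for illustration): compress the container, count the actual bits left over, and label a low residue information and a residue near the raw size entropy.

```python
import random
import zlib

def actual_bits(container: bytes) -> int:
    # Bits remaining after compression -- the "actual bits" of the text.
    return len(zlib.compress(container)) * 8

def classify(container: bytes) -> str:
    raw_bits = len(container) * 8
    residue = actual_bits(container) / raw_bits
    # Heavy compression: the bits describe each other, so information.
    # No compression: maximum actual bits, so entropy.
    return "information" if residue < 0.5 else "entropy"

random.seed(1)
patterned = b"0101" * 1024                                 # full of loops
noise = bytes(random.randrange(256) for _ in range(4096))  # no loops

print(classify(patterned))  # information
print(classify(noise))      # entropy
```

The classifier is observer-free only because the ISA is baked into the compressor: zlib "likes" repetition, which stands in here for the brain's syntactic tastes.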

And *THAT* children is why Information and Entropy ARE FUCKING OPPOSITES!

Because while "information" theory and computation theory and physics all talk about BITS ... they never, EVER talk about information OR entropy. Because the concepts of information and entropy exist ABOVE the level of bits. And because scientists and academic researchers are TOO FUCKING UNCREATIVE to synthesize concepts above the level they're working on. Being worthless stupid idiotic hacks.

Information and entropy REALLY ARE opposites and academics are simply too stupid to understand the concepts so they MISUSE the words information AND entropy to refer to ... BITS.
