Three of the central claims of my onticology are 1) that objects are always composed of other objects, 2) that objects exist at all levels of scale, and 3) that objects are negentropic in that they both resist dissolution and perpetually face the problem of dissolution or entropy. I draw the first two claims from Graham’s thought. The second claim is the thesis that corporations such as the Coca-Cola corporation, for example, are no less objects than quarks, acorns, or stars, and that electrons are no less objects than rocks. These claims commit me to the thesis that objects are emergent or self-organizing. If it is true that objects are always composed of other objects, then it follows that I need some account of self-organization or emergence that allows us to think the transition from a mere aggregate to a genuine object. This boundary, of course, will be fuzzy (all of the paradoxes the Greeks encountered surrounding the question of when grains of sand become a heap will arise here, because there will be, drawing on Husserl’s concept, fuzzy essences).

Questions of self-organization, in their turn, will be deeply related to questions of entropy, because the formation of an object refers to the emergence of order in the world. It will be recalled that entropy is a measure of probability. The more uniformly probable it is that a particle (or other object) will be found at any particular place in, for example, a chamber, the higher the entropy the system possesses. The more concentrated the probability of finding the particles in some particular region of the chamber, the lower the entropy of the system. The video below provides a nice visual illustration of entropy:

As time passes the entropy of the system increases because it becomes equally probable that any given particle of the system will be located at any particular place in the chamber. In other words, the particles spread until there is a uniform probability of finding a particle at any particular place in the chamber.
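To make this concrete, here is a minimal sketch, not from the post, of particles diffusing through a chamber divided into cells: the Shannon entropy of the position histogram starts at zero (all particles crowded into one cell) and climbs toward its maximum as the distribution approaches uniformity. The cell count, particle count, and hopping rule are all illustrative assumptions.

```python
import math
import random

def entropy(counts):
    """Shannon entropy (in bits) of a histogram of particle positions."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

random.seed(0)
BINS = 10             # chamber divided into 10 cells (illustrative)
N = 1000              # number of particles (illustrative)
positions = [0] * N   # all particles start in cell 0: zero entropy

def step(positions):
    """Each particle randomly hops one cell left or right; walls block."""
    return [min(BINS - 1, max(0, x + random.choice([-1, 1])))
            for x in positions]

def histogram(positions):
    counts = [0] * BINS
    for x in positions:
        counts[x] += 1
    return counts

h_start = entropy(histogram(positions))   # 0.0: perfectly ordered
for _ in range(200):
    positions = step(positions)
h_end = entropy(histogram(positions))
# h_end approaches log2(10) ≈ 3.32 bits, the maximum-entropy state
```

Running the walk longer only pushes the histogram closer to the uniform, maximum-entropy distribution; nothing in the dynamics ever spontaneously re-concentrates the particles, which is exactly why the emergence of an object calls for a further explanation.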


The relationship between entropy and objects here becomes a little clearer. An object is an entity that has a very low degree of entropy because its elements (smaller-scale objects) are arranged in a particular patterned organization that maintains itself over time. The emergence of an object is thus accompanied by a decrease in entropy. In The Allure of Machinic Life John Johnston theorizes a beautiful way of thinking about this in terms of the information theory of Claude Shannon and the cybernetics of Heinz von Foerster. Here I hope I’ll be forgiven for quoting at length. Johnston writes that,

[f]or von Foerster, a self-organizing system [what I call an object] is one whose “internal order” increases over time. This immediately raises the double problem of how this order is to be measured and how the boundary between the system and the environment is to be defined and located. The second problem is provisionally solved by defining the boundary “at any instant of time as being the envelope of that region in space which shows the desired increase in order.” For measuring order, von Foerster finds that Claude Shannon’s definition of “redundancy” in a communication system is “tailor-made.” In Shannon’s formula,

R = 1 – H/Hₘ,

where R is the measure of redundancy and H/Hₘ the ratio of the entropy H of an information source to its maximum value Hₘ. Accordingly, if the system is in a state of maximum disorder (i.e., H is equal to Hₘ), then R is equal to zero: there is no redundancy and therefore no order. If, however, the elements in the system are arranged in such a way that, “given one element, the position of all other elements are determined”, then the system’s entropy H (which is really the degree of uncertainty about these elements) vanishes to zero. R then becomes unity, indicating perfect order. Summarily, “Since the entropy of the system is dependent upon the probability distribution of the elements to be found in certain distinguishable states, it is clear that [in a self-organizing system] this probability distribution must change such that H is reduced.”

The formula thus leads to a simple criterion: if the system is self-organizing, then the rate of change of R should be positive (i.e., dR/dt > 0). To apply the formula, however, R must be computed for both the system and the environment, since their respective entropies are coupled. Since there are several different ways for the system’s entropy to decrease in relation to the entropy of the environment, von Foerster refers to the agent responsible for these changes in the former as the “internal demon” and the agent responsible for changes in the latter as the “external demon.” These two demons work interdependently, in terms of both their efforts and results. For the system to be self-organizing, the criterion that must be satisfied is now given by the formula. (55 – 57)
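Von Foerster’s criterion can be sketched numerically. The snapshot distributions below are invented purely for illustration; the sketch only shows that the redundancy R = 1 – H/Hmax rises (dR/dt > 0) as probability mass concentrates in fewer states.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """von Foerster's order measure R = 1 - H/Hmax."""
    h_max = math.log2(len(probs))
    return 1 - shannon_entropy(probs) / h_max

# A toy "self-organizing" trajectory (invented numbers): probability
# mass over four states concentrates into one state over time.
snapshots = [
    [0.25, 0.25, 0.25, 0.25],   # t0: maximum disorder, R = 0
    [0.55, 0.15, 0.15, 0.15],   # t1: order beginning to emerge
    [0.85, 0.05, 0.05, 0.05],   # t2: highly ordered
]
r_values = [redundancy(p) for p in snapshots]

# The self-organization criterion dR/dt > 0: R increases between snapshots.
deltas = [r_values[i + 1] - r_values[i] for i in range(len(r_values) - 1)]
```

A full application in von Foerster’s sense would also track the environment’s entropy alongside the system’s, since, as the quoted passage notes, the two are coupled; this sketch covers only the system side.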

We thus get two extremes. In the case of maximum entropy, where R is equal to zero, we have no object at all, but only a mere aggregate of objects. Where, by contrast, the position of each sub-object is perfectly determinate, such that entropy is equal to zero, we would have what might be called an “absolute object”: an object from which entropy has been completely banished.
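The two extremes can be checked directly against the redundancy formula; the four-state distributions here are merely illustrative stand-ins for an aggregate and an “absolute object.”

```python
import math

def redundancy(probs):
    """R = 1 - H/Hmax for a discrete distribution over element positions."""
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return 1 - h / math.log2(len(probs))

aggregate = [0.25, 0.25, 0.25, 0.25]  # every position equally likely: H = Hmax
absolute  = [1.0, 0.0, 0.0, 0.0]      # one position certain: H = 0

r_aggregate = redundancy(aggregate)   # 0.0 — a mere aggregate, no order
r_absolute  = redundancy(absolute)    # 1.0 — an "absolute object", perfect order
```

Every actual object falls strictly between these two poles, with 0 < R < 1, which is what leaves room for the gradations discussed below.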

Of course, the interesting question resides not so much in the equation that allows us to determine the entropy embodied in an object or the degree of order it possesses, but rather in the mechanisms by which this self-organization takes place. These mechanisms will differ depending on the sorts of objects we’re investigating. They will not be the same for rocks, stars, electrons, giraffes, academic disciplines, political groups, nations, and so on. All of these entities, it goes without saying, are negentropic in their own way and stave off their entropy in a variety of ways. However, it’s also interesting to note that because we have a variety of gradations between absolute non-objects (mere aggregates) and absolute objects, we also get the beginnings of an account of why objects change, and of how new objects emerge, as a result of the entropy they continue to harbor within themselves and the entropy they encounter in their environments. Here we encounter the order-from-noise principle, where noise at the heart of objects, as Melanie Doherty has recently argued, might function as the impetus for both the dissolution of certain objects and the emergence of new objects.
