
Sunday, April 10, 2005

philobabble

In the process of procuring for a friend a 1999 college paper of mine from my class on space-time, I decided to indulge myself in a waft of unintelligible writing from my other papers, adjacently kept in six years of abeyance. I find it remarkable how I now find the jargon from philosophy so completely alien when years ago I swam in these terms and wrote about them freely. In my defense, Google presently seems to know of only 222 pages mentioning the Covering Law Model of Explanation and only 557 containing corpuscularity. I, however, have little excuse for forgetting Logical Positivism, which appears in 76,900 pages in the Google corpus and even has its own wiki entry. Another esoteric term I almost never use and had nearly forgotten, supervenience, has a healthy 30,200 hits on Google. Below is the text from my "Models of Explanation" class paper:
03/04/99 “Models of Explanation”
It could be argued that the most powerful tool we as intelligent beings possess is our capability to reason through logic in order to explain the happenings in our world. Ironically, it then becomes quite a task to explain such a tool, since we will have to assume that our ability to deduce and explain is powerful enough to be self-applicable. For now, however, let us assume we can explain explanation. One model that seemingly does exactly that is the Covering Law Model of Explanation. In essence, this model uses a miscellany of reasoning taken from corpuscularity and positivism to disassemble our complex understanding of explanation into a series of simpler causal events that take on a true or false value in existence.

Applying this model, an explanation for why a vending machine gives soda would be broken into several mathematical tokens, each representing an ongoing event that influences the outcome in some way. A very rudimentary view would lead one to conclude that putting seventy-five cents into the coin slot while the sodas are not empty and the power is on and the mechanical components are working would result in the machine giving a soda. In any combination of events, a mathematical equation could take all the initial states and calculate a corresponding outcome. This very mathematical approach, however, would only be acceptable to a full-fledged determinist, or would otherwise be limited in its range of application, such as being applicable only to non-living things. Aside from this model of explanation’s dependence on determinism, its dependence on reduction again reduces its audience. Most probably, several other factors affect the mechanical condition of the vending machine and the number of sodas remaining as well as the power being on. Recursively, multiple factors would need to be monitored as potential variables in order to truly reduce any complex object. This requirement of infinite knowledge is often an argument against any form of corpuscularity.
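To make the deterministic picture above concrete, here is a minimal Python sketch; the function name, coin threshold, and inputs are illustrative inventions of mine rather than anything from the paper:

```python
# A toy Covering Law-style model: the outcome is fully determined by
# the truth values of the initial states. All names are illustrative.

def dispenses_soda(cents_inserted: int,
                   sodas_remaining: int,
                   power_on: bool,
                   mechanics_working: bool) -> bool:
    """Deterministic 'explanation' of the vending machine outcome."""
    return (cents_inserted >= 75
            and sodas_remaining > 0
            and power_on
            and mechanics_working)

# Any combination of initial states maps to exactly one outcome.
assert dispenses_soda(75, 10, True, True) is True
assert dispenses_soda(75, 0, True, True) is False   # sodas empty
assert dispenses_soda(50, 10, True, True) is False  # too few coins
```

The recursive objection in the paragraph above amounts to noting that each boolean input here (power, mechanics, stock) would itself need such a function, and so on without end.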

A usual retort to theoretical challenges imposed on a theory is to provide empirical evidence in support of the theory, and history comes down in favor of the Covering Law Model of Explanation. Newtonian physicists and chemists often reduced complex real-life occurrences into a simple matrix of variables, yielding a determined answer. Even in “An Explanation of Hunger,” a scientific paper written by Cannon and Washburn, the two authors sought to explain why humans get hungry, and in doing so fervently applied the Covering Law Model of Explanation. A similar endeavor to understand hunger was conducted by another scientist prior to the attempt by Cannon and Washburn. With knowledge of the previous unsuccessful attempt at showing that hunger causes stomach contractions, Cannon and Washburn sought further meticulous analysis of the issue, with the starting intuition that perhaps stomach contractions cause hunger and not vice versa.

By monitoring all the known macro-variables, which were referral of pain, chemicals secreted by the stomach, volume of substance in the stomach, and smooth muscle lining, Cannon and Washburn were able to deduce that the contraction of the smooth muscle lining that made up the stomach walls was what produced hunger. The very nature of the experiment, focusing on components that could affect hunger, shows the experimenters’ usage of reduction. They manage to stay non-contradictory by sidestepping the recursive problem of reduction: they adopt holistic macro-variables and use statistics instead of a deterministic framework. With this logic, whenever a person’s stomach walls contract, there exists a probability that the person will suffer from hunger pangs. However, no attempt is made to recursively explain when these macro-variables will be true or when they will be false. This room for ambiguity serves, in part, to preserve the somewhat implicit randomness of any complex system and the likelihood of supervenience being a more comprehensive model of the hierarchy in complexity than corpuscularity.
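The statistical turn described above can be sketched in the same style as before; the 0.8 conditional probability and every name below are hypothetical, chosen only to contrast a probabilistic macro-level rule with a deterministic one:

```python
import random

# Hypothetical illustration: instead of "contraction => hunger" as a
# deterministic law, the macro-variable model asserts only a
# conditional probability. The 0.8 figure is invented for the sketch.
P_PANG_GIVEN_CONTRACTION = 0.8

def feels_hunger_pang(stomach_contracting: bool) -> bool:
    """Statistical rather than deterministic macro-level 'law'."""
    if not stomach_contracting:
        return False
    return random.random() < P_PANG_GIVEN_CONTRACTION

# No attempt is made to explain recursively *when* the macro-variable
# (contraction) itself becomes true; it is simply observed as input.
pangs = sum(feels_hunger_pang(True) for _ in range(10_000))
print(f"Observed pang frequency: {pangs / 10_000:.2f}")  # ~0.80
```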

In fact, Mischel initially reasoned out the holistic alternative that challenges the Covering Law Model of Explanation. The problems with both corpuscularity and determinism are corrected in Mischel’s holistic view, whereby complex things are not treated as the sum of their individual components. Although classical Newtonian mechanics interlaces reduction throughout many of its theorems, it should be noted that most of the problems under scrutiny by these theorems concern non-living physical things in ideal conditions where no unknown variable influences the system. Because of the limited scope of simplified Newtonian physics, reduction works. However, for any truly natural event, and especially for any living being, the predictions based upon the micro-components would deviate from the actual results so quickly that no similarity between prediction and actuality would ever be noticed. Indeed, if a group of people were to enter a subway, the driving forces behind why each of them entered may differ vastly from one another. No generalization could ever be achieved unless a law were made for every person, essentially creating a rule for every exception.

In an excerpt from “The Origins of Intelligence of Children,” by Jean Piaget, the power that the Covering Law Model of Explanation can give is observed. Although, as stated earlier, reduction has its many problems that holism appears to resolve, scientists repeatedly maximize the utilization of reduction for their benefit. Piaget initially tries to observe and record each invariant that could have any significant impact on his subjects, and then he goes about creating a system of simultaneous equations, essentially a matrix of all recorded variables. By seeing the deviance of the output from a small initial change in one of the variables, and after several iterations of experimenting, a conclusion can be made about what each variable does within its normal environment. Again, empirically, conclusions assimilated from the environment appear to predict the actual end-results accurately.
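One way to read the “matrix of all recorded variables” is as a linear system; the following is a rough sketch, assuming NumPy is available and with entirely invented coefficients, of perturbing one recorded variable and observing the deviance of the output:

```python
import numpy as np

# Hypothetical "matrix of recorded variables": a linear system
# A @ x = b, where each row encodes one observed relation among
# the variables. All coefficients below are invented.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([5.0, 10.0, 9.0])

baseline = np.linalg.solve(A, b)

# A small initial change in one variable: perturb one observation
# and see how far the solved outputs deviate.
b_perturbed = b + np.array([0.1, 0.0, 0.0])
perturbed = np.linalg.solve(A, b_perturbed)

print("baseline :", baseline)
print("perturbed:", perturbed)
print("deviance :", perturbed - baseline)
```

Iterating this perturbation over each variable in turn is the experimental loop the paragraph describes.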

The philosophical issue pertinent now is whether empirical corroboration is enough to offset problems in the fundamental theories. The dichotomy between reliance on empirical proof and logical proof is of concern. A simplified analogy would be to drop a feather and a lead ball and thereby “show” that objects do not undergo equal and uniform acceleration due to gravity. This is utter nonsense, yet all the empirical evidence would corroborate this invalid hypothesis unless someone were to remove every confounding variable, including air friction, for a perfect experiment. Therefore, empirical evidence gathered amid many distracting outside influences cannot always be reliable, since induction has many of its own theoretical flaws.
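A quick numerical sketch makes the point; all masses, drag coefficients, and step sizes below are invented, but they show how air friction lets the data “corroborate” the invalid hypothesis until the confounder is removed:

```python
# Rough illustration with invented parameters: with air friction, the
# feather's fall diverges wildly from the lead ball's; remove the drag
# term (the "perfect experiment") and both objects fall identically.

G = 9.81  # m/s^2

def fall_velocity(mass: float, drag: float, t_end: float = 2.0,
                  dt: float = 0.001) -> float:
    """Euler-integrate dv/dt = g - (drag/mass) * v."""
    v = 0.0
    t = 0.0
    while t < t_end:
        v += (G - (drag / mass) * v) * dt
        t += dt
    return v

print("lead ball, in air:", fall_velocity(mass=5.0, drag=0.05))
print("feather,   in air:", fall_velocity(mass=0.005, drag=0.05))
print("lead ball, vacuum:", fall_velocity(mass=5.0, drag=0.0))
print("feather,   vacuum:", fall_velocity(mass=0.005, drag=0.0))
```

In air the feather quickly hits its terminal velocity while the lead ball barely notices the drag; in vacuum both print the same velocity, restoring the valid hypothesis.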

In the case of Newtonian physicists and many scientists using reduction in order to simplify and then to conquer problems, the focus of study is always a simple organism, be it biological or inanimate. Firstly, a simple focus of study inherently supposes that the actions performed are deterministic, or pseudo-deterministic. A baby’s natural instinct for milk, as was studied by Piaget, would fall under this category of pseudo-determinism because, although the baby is a living complex entity, the study is not about the baby. The focus of study, in this case, would be on the code programmed into its body, or natural instinct, which is not subject to free will. If, however, the Covering Law Model of Explanation were to try to explain why a person decided to enter a subway, large clashes in philosophy would occur, since a free-willist would not accept any theory that treats humans as deterministic. For studying human actions or any other complex indeterminate system, a holist’s approach, such as Mischel’s, would be necessary, where supervenience, not corpuscularity, would be the underlying axiom.

Another freshman-year paper worth reposting is one titled "Reductionist's Belief in the Observation Basis". I cannot recall another time in the six years since when I have alluded to Hilbert or Gödel. My lack of doing so is perhaps an indication that, unless I make a conscientious effort otherwise, my most enlightened years are behind me. In any case, below is the text from that elliptical class paper of mine:
03/17/99 “Reductionist’s Belief in the Observation Basis”
Straining to cope with the sudden burst of information rushing through time, with information pleading to be processed as quickly as the next information can take its place, a newborn has the universe materialize around itself. With no earthly law other than the a priori knowledge of motion, everything else, every nuance of our meticulous training and thinking, the foundation of all modern understanding, is left as an unprecedented event for the distant future. The future of this newly created realm, however, remains an unintelligible concept, as elusive to decipher as the creation of the present. The self encompasses everything, and the universe appears as merely an extension of the limbs. Indeed, nothing starts off as a priori knowledge, and all accumulated knowledge may only be a gaudy tower of Lego blocks where the corpuscles of reality are deceptive inferences of the sensations.

Logical positivism has a long life, spotted with illness, yet healthy for the most part; its longevity allows it to remain an incessantly recurring philosophical view. However, if the universe is a game, as mathematics was thought to be by Hilbert, could there be some savior, an anti-positivist counterpart to Gödel? Perhaps, through this game we play, where we role-play in our most sentient dream, we may discover truths of the underlying reality, assuming there is one. Therefore, with the needed assumptions, truths can be derived merely from observation. Furthermore, although a newborn may appear to learn holistically, as the universe takes on a more rigid and developed form, cognitive thinking sharpens like a lens focusing. This sharpening of cognitive thought from the sensory data can be seen as one’s adoption of reductionism. Although the focused picture may vary in appearance depending on the defined corpuscle, the re-blurred images will all be identical again. Essentially, from a functionalist’s standpoint, the macroscopic object reconstituted solely from its corpuscles will always be the same regardless of what its corpuscles are. As long as the observation basis continues to be used, there will be inherent truths discovered through reductionism, and these inherent truths, holistic in nature, will correlate in a non-contradictory manner.

This weekend has seen me mired in nostalgia. Yesterday, I wanted to immerse myself in '80s video games by loading the save files from my 386. Alas, it was not to be. I found my 386 hidden in the study room but could not locate its power cables or monitor.

Revisiting the past has surprisingly been therapeutic for me. When I was a latchkey kid at age twelve, I mucked around with the CMOS configuration and nearly rendered the only-months-old 386 inoperable. Fortunately, with some desperate prayers and panicky corrective changes, the machine came back to life before my parents returned home. However, ever since then, I harbored the suspicion that the new settings were making the 386 far slower than it once was. When I was 23, I somehow remembered this episode and decided to put my limited experience with computer hardware to the test. The 386 at that time was still plugged in - albeit collecting dust - and awaited being turned on. Interestingly, this 386 demanded more of its user in configuring its CMOS than a modern machine does. Aside from the ubiquitous settings regarding whether to shadow video RAM, the geometry of the hard disks, and the caching policies, there lay many other settings concerning architectural and design intricacies whose understanding would require an investment in a computer architecture course. I reconfigured the CMOS employing what I knew, and, miraculously, the 386 sped up considerably! The feeling of correcting a ten-year-old wrong is an indescribable coupling of euphoria and serenity.
