In the common usage, a system is generally said to be "under control" whenever there is some way to act on the system to produce a predictable outcome. This is a perfectly sensible definition, but it makes clear that control is really a matter of degree. Does "outcome" refer to the ultimate behavior of the system, or only certain of its moving parts? Does "predictable" mean that outcomes are foreseeable one second into the future, or one year into the future, or indefinitely into the future?
There are other interesting questions one can ask that may be particular to a system or a class of systems: When control-actions are put together serially or in parallel (or in that awkward hybrid of the two sometimes known as "concurrency"), are their cumulative results foreseeable, so that large, structured actions can be composed of smaller ones? Do control-actions have an outcome that is constant with time, or does their behavior change, albeit in a predictable way? To what degree is control of the system susceptible to irregularities of the environment or noise in the input?
Degree, though, seems to be an essential aspect of the notion of control. The things we are typically concerned with controlling are macroscopic and complicated. In those rare instances where we unequivocally succeed in controlling a physical phenomenon (a working machine is one instance of such a success), the behavior of the thing may be quite steady and predictable, yet still susceptible to abrupt, unexpected failures or malfunctions. Friction, cross-talk, ambient vibrations, waste heat, leaky gates, freak accidents, and all their like loom threateningly in the background of any working order. Though such entropic forces can never be eliminated, a successful machine (at least in all current conventional senses) has a design that somehow subsumes them.
The world is a vast and inconceivably complicated place. That anything is predictable or understandable at all is something of a miracle. That humans can produce even dim understanding or very modest instances of control is more miraculous still. In this sense, any given thing in the world, from a pebble to a space shuttle to a low-pressure trough to a working farm to an ocean, analyzed in full, contains a volume of information that is completely beyond the comprehension of even the most brilliant human mind. In essence, the information density of even the tiniest, simplest objects renders them wholly immovable to the human mind.
The idea of control thus hearkens back to Archimedes using a lever to move the whole world. Control is a particular arrangement that gives an agent (i.e. something with a minded purpose) leverage enough to move the world from one understandable condition to another. This suggests a (very) slightly more formal idea of control as a specific kind of proportion: a thing is "controllable" when there is some arrangement by which a relatively low-bandwidth input yields a comparatively (very) high-bandwidth output, i.e. a state of the system that is foreseen.
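(A toy illustration of that proportion, in Python and entirely my own invention rather than anything rigorous: a one-bit input steering a noisy process into a foreseen band.)

    import random

    # A "room" whose temperature drifts with random disturbances -- a noisy,
    # high-bandwidth phenomenon -- gets steered into a narrow, foreseen band
    # by a control signal that is exactly one bit wide.
    temperature = 15.0   # degrees; arbitrary starting condition
    setpoint = 20.0      # the foreseen condition we want the system to occupy

    for step in range(200):
        heater_on = temperature < setpoint          # the entire input: one bit
        temperature += 0.5 if heater_on else -0.1   # heating vs. ambient cooling
        temperature += random.uniform(-0.3, 0.3)    # noise the controller never sees

    print(f"final temperature: {temperature:.1f}")  # reliably lands near the setpoint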
This is not necessarily a new way of looking at things, by any stretch. (After all, we can even give old Archimedes some credit, not to mention such big names as Boltzmann or Wiener, who first began asking the modern versions of such questions.) Control is essentially a means of mapping some relatively coarse vision of a complex phenomenon onto its extremely fine-grained reality, and doing so in a way that is suitably robust and structure-preserving. The transistor is the classic example of this, whereby something as complex as a semiconductor can be made to act like a trivial logic function. This view is suggestive of certain interesting avenues of investigation. One has to wonder, for instance, whether establishing a regime of control, i.e. designing a machine or proving its properties, is something like playing Michael Barnsley's "Chaos Game": the designer asks and answers, over and over again, how do the small things resemble the big things in this picture?
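(For the curious, a minimal sketch of the Chaos Game itself, again in Python; the three corner points and the iteration count are arbitrary choices of mine.)

    import random

    # Barnsley's Chaos Game: pick three corners, then repeatedly jump halfway
    # toward a randomly chosen corner. The cloud of visited points converges
    # on the Sierpinski triangle -- a large structure whose small pieces
    # resemble the whole.
    corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
    x, y = 0.25, 0.25   # arbitrary starting point

    points = []
    for _ in range(50000):
        cx, cy = random.choice(corners)
        x, y = (x + cx) / 2, (y + cy) / 2   # move halfway toward the chosen corner
        points.append((x, y))

    # 'points' now traces the attractor; plot it with any plotting library
    # to watch the self-similar structure emerge from a trivially simple rule.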