Why tech progress doesn't stagnate: An evolutionary view
In the comments to the post on genius below, the topic of the stagnation of progress came up. A post is more appropriate for a response, so here it is. I confess that I haven't studied the work of historians and philosophers of science, so the outline of what I'm going to say may well have been suggested already. In fact, it's pretty obvious if you devote a little time to thinking about it, so I assume the gist actually has been proposed before, but as I'm otherwise occupied as far as reading, studying, and thinking go, I'm not going to look into it. (A cursory Google search didn't turn anything up, so it's probably buried in a book or in the body of a journal article.)
To briefly reiterate my thoughts: while there may be a grain of truth to the idea of progress plateauing, it's always premature to suggest that we've approached an asymptote for good in some area -- physics, music, what have you. The reason is simple: you can never tell when a huge revolution is about to happen, so for all we know, yet another one will occur -- when, who knows? -- even if we won't live to see it. So I'd agree with a weaker version of the complaint, one that simply notes that we've apparently run out of bold new ideas for the time being, and that only time will tell whether our progeny will discover or create something that no one had dreamed of before. I gave some examples of this in the comments.
A separate idea is that some fields don't stagnate as much as others -- for instance, as much progress as mathematicians have made in the past 2500 years, look at how little was done from roughly the ascendancy of the Roman Empire until roughly the proto-Renaissance of the 14th C. Alfred North Whitehead, a 20th C. thinker, quipped that philosophy was "a series of footnotes to Plato" -- but that exactly characterizes the field of geometry up until the 18th C., when Euler sowed the seeds of the generalization of geometry now known as topology. It was only in the 19th C., with the discovery of non-Euclidean geometries, that anyone found a way of doing geometry that didn't obey all of the rules Euclid had laid down 2000 years before. Moreover, one of the four foundational branches of mathematics -- calculus and analysis -- wasn't even developed until the mid-17th C. An inchoate working out of probability began around the same time, while statistics is almost entirely a 20th C. invention. So even though big changes do eventually occur once more, it seems to take a very long time for them to happen.
Why don't we put that into picture form to get a better feel for the claim?
Above is (what's supposed to be) an exponential decay function, which is what someone means any time they use the phrase "diminishing returns." As is plain to see, the effect that the nth discovery has on the maturity of the field -- whether that be the development of mathematics, the understanding of natural laws, or the elaboration of musical forms -- decreases (monotonically) as n increases. At the beginning, Euclid writes The Elements, and each work on geometry after that contributes increasingly less understanding to the field. Bach fleshes out most of the potential of the fugue, and each composer's contribution to the form afterward fills in increasingly smaller gaps left by the pioneer. Eventually, another Euclid or Bach comes along and charts out previously unseen areas, and the process repeats -- perhaps it ultimately will grind to a halt as science and art begin to place too strenuous a demand on human cognition, but I don't believe that either (more on that at the end).
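For concreteness, here's a minimal numerical sketch of that curve. The rate constant k is an arbitrary number I picked for illustration, not something read off the graph:

```python
import math

def maturity(n, k=0.5):
    """Cumulative maturity of a field after n discoveries:
    a saturating curve that approaches 1 but never reaches it."""
    return 1 - math.exp(-k * n)

def contribution(n, k=0.5):
    """Marginal effect of the nth discovery: the increment in maturity
    it adds over what the first n-1 discoveries already provided."""
    return maturity(n, k) - maturity(n - 1, k)

# Each discovery contributes a fixed fraction less than the one before,
# so the sequence of marginal contributions decays monotonically.
gains = [contribution(n) for n in range(1, 6)]
assert all(a > b for a, b in zip(gains, gains[1:]))
```

The point of the sketch is just that under this model the ratio between successive contributions is constant (e^k here), so the nth worker in a mature field is filling a gap exponentially smaller than the one Euclid or Bach filled.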
Now, here's how I think technological progress is different:
The axes have the same scale, so I didn't just "zoom out" from the graph above. Essentially, the logic is the same as above, but the cycle repeats far more quickly, such that it is hardly noticeable that progress ever stagnates. Now, some people do complain about technological progress stagnating, such as those who whine that the nth incarnation of their iPod hasn't changed all that much over the past five years, but such "stagnation" is below the detection threshold of anyone not profoundly afflicted by ADHD. The reason for the faster cycle is pretty simple: most technology is designed to out-perform someone else's technology, as summarized by the common phrase "arms race." As an aside, most groundbreaking technological innovation is done at the level of monopolies (Bell Labs before AT&T was broken up), national governments (the Department of Defense), or state-funded "private" bodies (MIT), not by firms competing in the market, so don't read too much into the market analogy. The basic point is that geometers and composers aren't threatened by an imminent menace, such as an invading army, so their urge to out-do others must derive from less reliable qualities such as trait competitiveness, spite, and so on.
In other words, while all geniuses have shown an indefatigable work ethic, some of the lesser figures may have had roughly genius levels of general intelligence and nuttiness, but simply lacked the time commitment required to make Newtonian contributions. Surely if it were a matter of life and death, though, as in the case of those charged with technological progress, even someone who might not have an extremely high intrinsic work ethic would have their feet put to the fire by the prospect of being conquered, for example.
Those familiar with the lingo of evolutionary biology will have noticed that I've hinted at an analogy, so I might as well make it explicit. In the more "pure" fields -- the arts and sciences -- progress reflects the spreading of something to everyone in the field: an understanding of nature, the conventions of the fugue, and so on. This is like a group of alleles spreading toward fixation in a population, such as lactose tolerance among peoples who have raised dairy-producing animals. The exponential decay model is also borrowed from evolutionary biology, especially the work of H.A. Orr, who has argued convincingly that adaptation of a population to its environment obeys such a model. If you don't have access to his journal articles, here's an intuitive argument I came up with to help me remember the gist of his more formal statements:
Imagine that you're a little kid passing the time dangling by your arms from a tree branch (so you're at an equilibrium state), and that a freak environmental change like an earthquake or gale-force wind knocks you off your branch. Also imagine that your flexible hand is the population, and that it is trying to latch onto an irregularly shaped branch to keep you from falling into oblivion. Just getting your palm in the right place does most of the job, and then two flexings of your fingers do most of the rest, although you still need to make many more minute adjustments before your hand fits the contour of the branch (nearly) perfectly. In this analogy, a discrete movement of your hand is like a favored allele being substituted at some locus. (I hope you'll forgive the digression I've made from my central point here, since this is one of the more fascinating ideas I've read about recently.)
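If the hand-on-a-branch image still feels vague, the same point falls out of a toy simulation. To be clear, this is my own back-of-the-envelope sketch of an adaptive walk toward a fixed optimum, not Orr's actual model, and every number in it is arbitrary:

```python
import random

def adaptive_walk(start=10.0, proposals=2000, seed=1):
    """Toy adaptive walk toward a fixed optimum at 0.
    Random 'mutations' (steps) are proposed; only those that bring
    the population closer to the optimum are 'substituted' (kept).
    Returns the sizes of the accepted improvements, in order."""
    rng = random.Random(seed)
    pos = start
    gains = []
    for _ in range(proposals):
        step = rng.gauss(0, 1.0)
        if abs(pos + step) < abs(pos):
            gains.append(abs(pos) - abs(pos + step))
            pos += step
    return gains

gains = adaptive_walk()
# The early substitutions (the palm) do most of the work; the later
# ones (the minute adjustments of the fingers) are fine-tuning.
early = sum(gains[: len(gains) // 2])
late = sum(gains[len(gains) // 2 :])
assert early > late
```

The reason the gains shrink is geometric: once the hand is close to the branch, a random movement of the same typical size is both less likely to help and, when it does help, helps by less.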
The case of technological progress, however, is more like frequency-dependent selection: your hand is making movements not only to fit a static branch, but in reaction to the series of movements made by a nearby person's hand -- maybe there are lots of people in the tree, and one or more of them is using their hand to try to shove you out to make room for them. Now you're engaged in a vicious cycle that is a matter of life and death. If you left a video camera recording the events in these two different trees, the results of the former would look pretty boring since the action would only be dictated by whether a hurricane or earthquake chanced to pass through the area during the time period, and many would complain that "nothing's happening." The latter's movie, however, would be so action-packed that, again, only the incurably jaded would lose interest.*
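The two videotapes can also be compared numerically. Again, this is only my own toy sketch with arbitrary numbers: the "moving target" stands in for the rival's counter-moves, and the quantity measured is how much adjusting is still going on late in the recording:

```python
import random

def late_motion(moving_target, rounds=500, seed=2):
    """Adapt toward an optimum that is either static (the lone tree)
    or kept moving by a rival's counter-moves (the crowded tree).
    Returns the total accepted movement in the final quarter."""
    rng = random.Random(seed)
    pos, target = 10.0, 0.0
    motion = 0.0
    for t in range(rounds):
        step = rng.gauss(0, 1.0)
        if abs(pos + step - target) < abs(pos - target):
            if t >= 3 * rounds // 4:
                motion += abs(step)
            pos += step
        if moving_target:
            target += rng.gauss(0, 1.0)  # the rival keeps changing the game
    return motion

# Against a static branch, late-stage movement dwindles toward zero;
# against a rival, there is always something left to do.
assert late_motion(False) < late_motion(True)
```

That asymmetry is the whole argument in miniature: the static-optimum tape goes quiet once the hand has settled, while the frequency-dependent tape stays "action-packed" indefinitely.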
That's what must be causing people to complain that one type of progress seems to have ground to a halt -- the human mind is incapable of handling large stretches of time (a special case of the difficulty with big numbers in general), as any society's pre-scientific historical texts amply demonstrate. So, it's just a matter of having to wait longer for "the next big thing" in the arts and sciences.
There is another complication, namely that you can always out-perform your enemy, but you may or may not be able to understand more and more of nature or invent ever more original musical forms. Since that's a bit outside the scope of this musing, I'll just link to a previous post I wrote on this topic last year. Briefly, the original objection can be stated as an analogy to visual perception: the human eye is only so discerning when it comes to color, so even if every once in a great while a freak were born with 10 types of cone cells (as opposed to the typical 3), that person could still perceive only so much of the electromagnetic spectrum. But that's fine, since we've designed instruments like spectroscopes that supersede our eyes. The recent proof of the four-color map theorem and the elaboration of the properties of some 248-dimensional symmetrical thingie that I don't understand were both done by computer. This isn't as satisfying on a gut level, since the computer's proof just exhausted a huge number of mutually exclusive cases rather than presenting a conceptually elegant solution. But hey, more knowledge is more knowledge, right? So don't worry.
*This is distinct from Stephen Jay Gould's notion of punctuated equilibrium in two ways: first, the basis for Orr's claim is pretty rigorous, while Gould was making touchy-feely verbal arguments only. Second, Gould was making claims about the entirety of a species' evolution, while as I read it, Orr's claim is that adaptation to correct for a particular environmental disruption follows an exponential decay curve -- surely it's conceivable that many such disruptions operate independently of others, such that there is always pioneering adaptation going on, rather than all disruptions being concentrated into a much narrower window of time.