Kluge

Chapter 1, Remnants of History

A kluge is a clumsy or inelegant — yet surprisingly effective — solution to a problem.
...the building of a Kludge... is not work for amateurs. There is a certain, indefinable, masochistic finesse that must go into true Kludge building. The professional can spot it instantly. The amateur may readily presume that 'that's the way computers are.'
What's really amazing — in hindsight — is that most people probably didn't even realize it was possible to do better.
... often comes from understanding not just how things are, but how else they could have been.
Nature is prone to making kluges because it doesn't "care" whether its products are perfect or elegant. If something works, it spreads. If it doesn't work, it dies out. Genes that lead to successful outcomes tend to propagate; genes that produce creatures that can't cut it tend to fade away; all else is metaphor. Adequacy, not beauty, is the name of the game.
Although no reasonable scholar would doubt that natural selection can produce superlatively well-engineered functional designs, it is also clear that superlative engineering is by no means guaranteed.
Underpinning such examples is a bold premise: that optimization is the inevitable outcome of evolution. But optimization is not an inevitable outcome of evolution, just a possible one. Some apparent bugs may turn out to be advantages, but — as the spine and inverted retina attest — some bugs may be genuinely suboptimal and remain in place because evolution just didn't find a better way.
Natural selection, the key mechanism of evolution, is only as good as the random mutations that arise.
If a given mutation is beneficial, it may propagate, but the most beneficial mutations imaginable sometimes, alas, never appear.
If no immediate change leads to an improvement, an organism is likely to stay where it is on the mountain range, even if some distant peak might be better.
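The mountain-range metaphor is hill climbing from computer science: a greedy search that accepts only immediate improvements stops on the first local peak it finds. A minimal sketch in Python (my illustration, with a made-up fitness landscape, not an example from the book):

    # Greedy hill climbing: accept a step only if it improves fitness.
    # The climber halts on the first local peak, even though a taller
    # peak exists elsewhere on the landscape.
    def fitness(x):
        # Made-up landscape: a low peak at x = 2, a taller one at x = 8.
        return max(3 - abs(x - 2), 6 - abs(x - 8))

    def hill_climb(x, step=1):
        while True:
            best = max((x - step, x, x + step), key=fitness)
            if best == x:      # no immediate change is an improvement...
                return x       # ...so the search stays put
            x = best

    print(hill_climb(0))  # -> 2: stuck on the local peak, never reaches 8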
In the final analysis, evolution isn't about perfection. It's about what the late Nobel laureate Herb Simon called "satisficing," obtaining an outcome that is good enough. That outcome might be beautiful and elegant, or it might be a kluge. Over time, evolution can lead to both: aspects of biology that are exquisite and aspects of biology that are at best rough-and-ready.
History has a potent — and sometimes detrimental — effect because what can evolve at any given point is heavily constrained by what has evolved before.
Natural selection therefore tends to favor genes that have immediate advantages, discarding other options that might function better in the long term.
In short, evolution often proceeds by piling new systems on top of old ones.
In the same way, living creatures' continuous need to survive and reproduce often precludes evolution from building genuinely optimal systems; evolution can no more take its products offline than the human engineers could, and the consequences are often equally clumsy, with new technologies piled on top of old.
...when we see perfection, we often can't tell which of many converging factors might have yielded an ideal solution; often it is only by seeing where things went wrong that we can tell how things were built in the first place.
...remnants of the past that don't make sense in present terms — the useless, the odd, the peculiar, the incongruous — are the signs of history.

Chapter 2, Memory

In general, we pull what we need from memory by using various clues, and when things go well, the detail we need just "pops" into our mind.
Contextual memory may have evolved as a hack, a crude way of compensating for the fact that nature couldn't work out a proper postal-code system for accessing stored information, but there are still some obvious virtues in the system we do have.
Contextual memory has its price, and that price is reliability. Because human memory is so thoroughly driven by cues, rather than location in the brain, we can easily get confused. 
Relying on context works fine if the circumstance in which we need some bit of information matches the circumstance in which we first stored it — but it becomes problematic when there is a mismatch between the original circumstance in which we've learned something and the context in which we later need to remember it.
Another consequence of contextual memory is the fact that nearly every bit of information that we hear (or see, touch, taste, or smell), like it or not, triggers some further set of memories — often in ways that float beneath our awareness.
Lacking proper buffers, our memory banks are a bit like a shoebox full of disorganized photographs: recent photos tend on average to be closer to the top, but this is not guaranteed.
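The contrast is easy to make concrete. In this sketch (my illustration, not the book's), cue-driven recall returns whichever stored item best overlaps the current cues, with recency breaking ties; that is exactly how a near-miss cue can pull up the wrong memory.

    # Sketch of cue-driven retrieval, as opposed to postal-code lookup.
    # Whatever best overlaps the current cues wins; ties go to the most
    # recently stored memory, like the photos near the top of the shoebox.
    memories = [  # (cues, content), in storage order
        ({"kitchen", "morning", "keys"}, "keys on the counter"),
        ({"kitchen", "evening", "keys"}, "keys in the coat pocket"),
    ]

    def recall(cues):
        return max(enumerate(memories),
                   key=lambda im: (len(im[1][0] & cues), im[0]))[1][1]

    print(recall({"kitchen", "keys"}))  # -> "keys in the coat pocket"
    # With no time-of-day cue, recency decides, right or wrong.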
Or consider another common quirk of human memory: the fact that our memory for what happened is rarely matched by memory for when it occurred.
Instead of simply trying to recall when something happened, we can try to infer this information. By a process known as "reconstruction," we work backward, correlating an event of uncertain date with chronological landmarks that we're sure of. ... Reconstruction is, to be sure, better than nothing, but compared to a simple time/date stamp, it's incredibly clumsy.
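For comparison, the time/date stamp the passage wishes for is nearly free in software; this sketch (mine, not the book's) records a timestamp at storage time, so "when did it happen?" becomes a lookup rather than an inference.

    # With a stamp recorded at storage time, "when?" is a lookup,
    # not a reconstruction from chronological landmarks.
    from datetime import datetime

    events = {}  # event -> timestamp

    def remember(event):
        events[event] = datetime.now()

    remember("met the new neighbor")
    print(events["met the new neighbor"])  # exact time, no inference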
In no small part this is because we evolved not as computers but as actors, in the original sense of the word: as organisms that act, entities that perceive the world and behave in response to it. And that led to a memory system attuned to speed more than reliability.
The Harvard psychologist Dan Schacter, meanwhile, has argued that the fractured nature of memory prepares us for the future: "A memory that works by piecing together bits of the past may be better suited to simulating future events than one that is a store of perfect records." Another common suggestion is that we're better off because we can't remember certain things, as if faulty memory would spare us from pain. ... The notion that the routine failures of human memory convey some sort of benefit misses an important point: the things that we have trouble remembering aren't the things we'd like to forget.
What we remember and what we forget are a function of context, frequency, and recency, not a means of attaining inner peace.
Similarly, there is no logical relation between having a capacity to make inferences and having a memory that is prone to errors. In principle, it is entirely possible to have both perfect records of past events and a capacity to make inferences about the future.

Chapter 3, Belief

Our subjective impression that we are being objective rarely matches the objective reality: no matter how hard we try to be objective, human beliefs, because they are mediated by memory, are inevitably swayed by minutiae that we are only dimly aware of.
Realizing the limits of our own data sampling might make us all a lot more generous.
For example, people tend to prefer social policies that are already in place to those that are not, even if no well-founded data prove that the current policies are working. Rather than analyze the costs and benefits, people often use this simple heuristic: "If it's in place, it must be working."
I describe the latter system as "deliberative" rather than, say, rational because there is no guarantee that the deliberative system will deliberate in genuinely rational ways. Although this system can, in principle, be quite clever, it often settles for reasoning that is less than ideal. ... Conversely, the reflexive system shouldn't be presumed irrational; it is certainly more shortsighted than the deliberative system, but it likely wouldn't exist at all if it were completely irrational. Most of the time, it does what it does well, even if (by definition) its decisions are not the product of careful thought. Similarly, although it might seem tempting, I would also caution against equating the reflexive system with emotions. Although many (such as fear) are arguably reflexive, emotions like schadenfreude — the delight one can take in a rival's pain — are not. Moreover, a great deal of the reflexive system has little if anything to do with emotion; when we instinctively grab a railing as we stumble on a staircase, our reflexive system is clearly what kicks in to save us — but it may do so entirely without emotion. The reflexive system (really, perhaps a set of systems) is about making snap judgments based on experience, emotional or otherwise, rather than feelings per se.
Evolution gave us the tools to deliberate and reason, but it didn't give us any guarantee that we'd be able to use them without interference.
...acquiring an abstract logic is not a natural, automatic phenomenon in the way that acquiring language is. This in turn suggests that formal tools for reasoning about belief are at least as much learned as they are evolved, not (as assumed by proponents of the idea that humanity is innately rational) standard equipment.
Rational man, if he (or she) existed, would only believe what is true, invariably moving from true premises to true conclusions. Irrational man, kluged product of evolution that he (or she) is, frequently moves in the opposite direction, starting with a conclusion and seeking reasons to believe it.
Superficially, one might think of perception and belief as separate. Perception is what we see and hear, taste, smell, or feel, while belief is what we know or think we know. But in terms of evolutionary history, the two are not as different as they initially appear. The surest path to belief is to see something.
...our legal system may be designed around the principle of "innocent until proven guilty," but our mind is not.
Experimental evidence bears this out: merely hearing something in the form of a question — rather than a declarative statement — is often enough to induce belief.
Why do we humans so often accept uncritically what we hear? Because of the way in which belief evolved: from machinery first used in the service of perception. And in perception, a high percentage of what we see is true (or at least it was before the era of television and Photoshop).
the linguistic world is much less trustworthy than the visual world.
The dictionary defines the act of believing both as "accepting something as true" and as "being of the opinion that something exists, especially when there is no absolute proof." Is belief about what we know to be true or what we want to be true? That it is so often difficult for members of our species to tell the difference is a pointed reminder of our origins.
Evolved from creatures that were often forced to act rather than think, Homo sapiens simply never evolved a proper system for keeping track of what we know and how we've come to know it, uncontaminated by what we simply wish were so.

Chapter 4, Choice

And often, the closer we get to conscious decision making, a more recent product of evolution, the worse our decisions become. 
In the clear-eyed arithmetic of the economist, a dollar is a dollar is a dollar, but most ordinary people can't help but think about money in a somewhat less rational way: not in absolute terms, but in relative terms.
What we think of — what we summon into memory as we come to a decision — often makes all the difference.
Evolution built the ancestral reflexive system first and evolved systems for rational deliberation second — fine in itself. But any good engineer would have put some thought into integrating the two, perhaps largely or entirely turning over choices to the more judicious human forebrain (except possibly during time-limited emergencies, where we have to act without the benefit of reflection).
Instead, our ancestral system seems to be the default option, our first recourse just about all the time, whether we need it or not. We eschew our deliberative system not just during a time crunch, but also when we are tired, distracted, or just plain lazy; using the deliberative system seems to require an act of will. Why? Perhaps it's simply because the older system came first, and — in systems built through the progressive overlay of technology — what comes first tends to remain intact. And no matter how shortsighted the ancestral system is, our deliberative system (if it manages to get involved at all) inevitably winds up contaminated by it. Small wonder that future discounting is such a hard habit to shake.
... the attraction of the visceral.
Hunger, lust, happiness, and sadness are all factors that most of us would say shouldn't enter into rational thought. Yet evolution's progressive overlay of technology has guaranteed that each wields an influence, even when we insist otherwise.
where we feel certain that something is wrong but are at a complete loss to explain why — "moral dumbfounding."
What we have instead falls between two systems — an ancestral, reflexive system that is only partly responsive to the overall goals of the organism, and a deliberative system (built from inappropriate old parts, such as contextual memory) that can act in genuinely independent fashion only with great difficulty.
... the reflexive system is better at what it does than the deliberative system is at deliberating.
When people make effective snap decisions, it's usually because they have ample experience with similar problems.
... our best intuitions are those that are the result of thorough unconscious thought, honed by years of experience.

Chapter 5, Language

To be perfect, a language would presumably have to be unambiguous (except perhaps where deliberately intended to be ambiguous), systematic (rather than idiosyncratic), stable (so that, say, grandparents could communicate with their grandchildren), non-redundant (so as not to waste time or energy), and capable of expressing any and all of our thoughts.
Most of us rarely notice the instability or vagueness of language, even when our words and sentences aren't precise, because we can decipher language by supplementing what grammar tells us with our knowledge of the world. But the fact that we can rely on something other than language — such as shared background knowledge — is no defense.
With no nation-state invested in the success of Esperanto, it's perhaps not surprising that it has yet to displace English (or French, Spanish, German, Chinese, Japanese, Hindi, or Arabic, to name a few) as the most prevalent language in the world.
... idiosyncrasy often arises in evolution when function and history clash, when good design is at odds with the raw materials already close at hand.
In the hodgepodge that is language, at least three major sources of idiosyncrasy arise from three separate clashes: (1) the contrast between the way our ancestors made sounds and the way we would ideally like to make them, (2) the way in which our words build on a primate understanding of the world, and (3) a flawed system of memory that works in a pinch but makes little sense for language.
It's probably no accident that language evolved primarily as a medium of sound, rather than, say, vision or smell. Sound travels over reasonably long distances, and it allows one to communicate in the dark, even with others one can't see.
The vocal tract, in contrast, is tuned only to words. All the world's languages are drawn from an inventory of 90 sounds, and any particular language employs no more than half that number — an absurdly tiny subset when you think of the many distinct sounds the ear can recognize.
What "twists" your tongue, in short, is not a muscle but a limitation in an ancestral timing mechanism. ... Why such a complex system? Here again, evolution is to blame; once it locked us into producing sounds by articulatory choreography, the only way to keep up the speed of communication was to cut corners. Rather than produce every phoneme as a separate, distinct element (as a simple computer modem would), our speech system starts preparing sound number two while it's still working on sound number one.
Words in computer languages are fixed in meaning, but words in human languages change constantly; one generation's bad means "bad," and the next generation's bad means "good."
Human languages are idiosyncratic — and verging on redundant — inasmuch as they routinely exploit both systems, generics and the more formal quantifiers. ... The split between generics and quantifiers may reflect the divide in our reasoning capacity, between a sort of fast, automatic system on the one hand and a more formal, deliberative system on the other. Formal quantifiers rely on our deliberative system (which, when we are being careful, allows us to reason logically), while generics draw on our ancestral reflexive system. Generics, on this argument, are essentially a linguistic realization of our older, less formal cognitive systems.
But, as we have seen time and again, what is natural for computers isn't always natural for the human brain: building a tree would require a precision in memory that humans just don't appear to have. Building a tree structure with postal-code memory is trivial, something that the world's computer programmers do many times a day. But building a tree structure out of contextual memory is a totally different story, a kluge that kind of works and kind of doesn't.
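Why is the tree trivial with postal-code memory? Because each node can store the exact addresses of its children, which is what references give a programmer for free. A minimal sketch (my illustration; the book names no code):

    # With address-like references, a tree is trivial: every node
    # records exactly where its children live, so traversal never
    # depends on cue-based recall.
    class Node:
        def __init__(self, label, children=()):
            self.label = label
            self.children = list(children)  # direct "addresses" of subtrees

    # (the dog) (chased (the cat)) as an explicit tree
    sentence = Node("S", [
        Node("NP", [Node("the"), Node("dog")]),
        Node("VP", [Node("chased"),
                    Node("NP", [Node("the"), Node("cat")])]),
    ])

    def leaves(node):
        if not node.children:
            return [node.label]
        return [word for child in node.children for word in leaves(child)]

    print(" ".join(leaves(sentence)))  # -> "the dog chased the cat"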
Perhaps the biggest problem with grammar is not the trouble we have in constructing trees, but the trouble we have in producing sentences that are certain to be parsed as we intend them to be. Since our sentences are clear to us, we assume they are clear to our listeners. But often they're not; as engineers discovered when they started trying to build machines to understand language, a significant fraction of what we say is quietly ambiguous.
In Robert Louis Stevenson's words, "The difficulty of literature is not to write, but to write what you mean."
Put together all these factors — inadvertent ambiguity, idiosyncratic memory, snap judgments, arbitrary associations, and a choreography that strains our internal clocks — and what emerges? Vagueness, idiosyncrasy, and a language that is frequently vulnerable to misinterpretation — not to mention a vocal apparatus more byzantine than a bagpipe made up entirely of pipe cleaners and cardboard dowels. In the words of the linguist Geoff Pullum, "The English language is, in so many ways, a flawed masterpiece of evolution, loaded with rough bits, silly design oversights, ragged edges, stupid gaps, and malign and perverted irregularities."

Chapter 6, Pleasure

My dictionary defines happiness as "pleasure" — and pleasure as a feeling of "happy satisfaction and enjoyment." As if that weren't circular enough, when I turn to the word feeling, I find that a feeling is defined as "a perceived emotion" while an emotion is defined as a strong feeling.
... the neural hardware that governs pleasure is, like much of the rest of the human mind, split in two: some of our pleasure (like, perhaps, the sense of accomplishment we get from a job well done) derives from the deliberative system, but most of it doesn't. Most pleasure springs from the ancestral reflexive system, which, as we have seen, is rather shortsighted, and the weighting between the two systems still favors the ancestral.
... our pleasure center wasn't built for creatures as expert in culture and technology as we are; most of the mechanisms that give us pleasure are pretty crude, and in time, we've become experts at outwitting them.
... but also with more modern compulsions, like addiction to the Internet. This compulsion presumably begins with an ancestral circuit that rewarded us for obtaining information. As the psychologist George Miller put it, we are all "informavores" ... and it's easy to see how ancestors who liked to gather facts might have outpropagated those who showed little interest in learning new things.
We would probably all be better off if we were choosier about what information we sought, à la Sherlock Holmes, who notoriously didn't even know that the earth revolved around the sun.
... organisms that sought out environments in which they had a measure of control would outcompete those that left themselves entirely at the mercy of stronger forces.
More generally, modern life is full of what evolutionary psychologists call "hypernormal stimuli," stimuli so "perfect" they don't exist in the ordinary world: the anatomically impossible measurements of Barbie, the airbrushed sheen of a model's face, the fast, sensation-filled jump cuts of MTV, and the artificial synthesized drum beats of the nightclub. Such stimuli deliver a purer kick than anything could in the ancestral world.
Video games aren't just about control; they are the distillation of control: hypernormal variations on the naturally rewarding process of skill learning, designed to deliver as frequently as possible the kick associated with mastery. If video games (produced by an industry racking up billions of dollars in sales each year) strike some people as more fun than life itself, genes be damned, it is precisely because the games have been designed to exploit the intrinsic imprecision of our mechanism for detecting pleasure.
Music likely also taps into the sort of pleasure we (and most apes) derive from social intimacy, the enjoyment we get from accurate predictions (as in the anticipation of rhythmic timing) and their juxtaposition with the unexpected and something rather more mundane, the "mere familiarity effect" (mentioned earlier, in the context of belief). And in playing musical instruments (and in singing), we get a sense of mastery and control.
As long as something is a constant, we can learn to live with it. Our circumstances do matter, but psychological adaptation means that they often matter less than we might expect.
Our lack of self-understanding may seem startling at first, but in hindsight it should scarcely seem surprising. Evolution doesn't "care" whether we understand our own internal operations, or even whether we are happy. Happiness, or more properly, the opportunity to pursue it, is little more than a motor that moves us. The happiness treadmill keeps us going: alive, reproducing, taking care of children, surviving for another day. Evolution didn't evolve us to be happy; it evolved us to pursue happiness.
Had our brains been built from scratch, the instruments that evaluated our mental state would no doubt behave a little like the meters electric companies use, which are instruments that we can inspect but not tinker with.
... we do everything in our power to make ourselves happy and comfortable with the world, but we stand perfectly ready to lie to ourselves if the truth doesn't cooperate.
A robot that was more sensibly engineered might retain the capacity for deliberative reason but dispense with all the rationalization and self-deception. Such a robot would be aware of its present state but prepared, Buddha-like, to accept it, good or bad, with equanimity rather than agony, and thus choose to take actions based on reality rather than delusion.
Deliberative prefrontal thought is piled on top of automatic emotional feelings — it doesn't replace them. So we've wound up with a double-edged kluge: our id perpetually at war with our ego, short-term and long-term desires never at peace.
... the nucleus accumbens, which assesses reward, matures before the orbital frontal cortex, which guides long-term planning and deliberative reasoning. ... Ideally, our judicious system and our reflexive system would mature at comparable rates. But perhaps because of the dynamics of how genomes change, biology tends, on average, to put together the evolutionarily old before the evolutionarily new.

Chapter 7, Things Fall Apart

There can be little doubt that the human brain too is fragile, and not just because it routinely commits the cognitive errors we've already discussed, but also because it is deeply vulnerable to minor malfunctions and even, in some cases, to severe breakdown.
The fact that even the best of us are prone to the occasional blunder illustrates something important about the neural hardware that runs our mental software: consistency just isn't our forte.
The more that's on our mind, for example, the more likely we are to fall back on our primitive ancestral system. ... An ideal creature would be endowed with an iron will, sticking, in all but the most serious emergencies, to carefully constructed goals. Humans, by contrast, are characteristically distractible, no matter what the task might be.
My guess is that our inherent distractibility is one more consequence of the sloppy integration between an ancestral, reflexive set of goal-setting mechanisms (perhaps shared with all mammals) and our evolutionarily more recent deliberative system, which, clever as it may be, isn't always kept in the loop.
The standard tack in evolutionary psychiatry, the branch of evolutionary psychology that deals with mental disorders, is to explain particular disorders (or occasionally symptoms) in terms of hidden benefits.
The depression theory initially seems more promising; as the authors note, it might well be better for the low man on the totem pole to accede to the wishes of an alpha male than to fight a battle that can't be won. Furthermore, depression often does stem from people's sense that their status is lower, relative to some peer group. But does the rest of the social competition theory even fit the facts? Depression isn't usually about accepting defeat, it's about not accepting it.
It's true that many disorders have at least some compensation, but the reasoning is often backward. The fact that some disorders have some redeeming features doesn't mean that those features offset the costs, nor does it necessarily explain why those disorders evolved in the first place. What happy person would volunteer to take a hypothetical depressant — call it "anti-Prozac" or "inverse Zoloft" — in order to accrue the benefits that allegedly accompany depression?
Just as cars run out of gas, the brain can run out of (or run low on) neurotransmitters (or the molecules that traffic in them). We are born with coping mechanisms (or the capacity to acquire them), but nothing guarantees that those coping mechanisms will be all powerful or infallible. A bridge that can withstand winds of 100 miles per hour but not 200 doesn't collapse in gusts of 200 miles per hour because it is adaptive to fail in such strong winds; it falls apart because it was built to a lesser specification. Similarly, other disorders, especially those that are extremely rare, may result from little more than "genetic noise," random mutations that convey no advantage whatsoever.
The bitter reality is that evolution doesn't "care" about our inner lives, only results.
In a creature empowered to set and follow its own goals, it's not clear that anxiety would serve any useful function.
What occasionally allows normal people to spiral out of control is a witch's brew of cognitive kluges: (1) the clumsy apparatus of self-control (which in the heat of the moment all too often gives the upper hand to our reflexive system); (2) the lunacy of confirmation bias (which convinces us that we are always right, or nearly so); (3) its evil twin, motivated reasoning (which leads us to protect our beliefs, even those beliefs that are dubious); and (4) the contextually driven nature of memory (such that when we're angry at someone, we tend to remember other things about them that have made us angry in the past). In short, this leaves "hot" systems dominating cool reason; carnage often ensues.
Sad memories stoke sadder memories, and those generate more that are sadder still.
"... there is a particular kind of pain, elation, loneliness, and terror involved in this kind of madness... When you're high it's tremendous. The ideas and feelings are fast and frequent like shooting stars... But, somewhere, this changes. The fast ideas are far too fast, and there are far too many; overwhelming confusion replaces clarity... madness carves its own reality."
In short, many aspects of mental illness may be traced to, or at least intensified by, some of the quirks of our evolution: contextual memory, the distorting effects of confirmation bias and motivated reasoning, and the peculiar split in our systems of self-control. A fourth contributor may be our species' thirst for explanation, which often leads us to build stories out of a sparse set of facts.
One setback does not a miserable life make, yet it's human to treat the latest, worst news as an omen, as if a whole life of cyclical ups and downs is negated by a single vivid disaster. Such misapprehensions might simply not exist in a species capable of assigning equal mental weight to confirming and disconfirming evidence.

Chapter 8, True Wisdom

Every kluge also underscores what is fundamentally wrong-headed about creationism: the presumption that we are the product of an all-seeing entity. Creationists may hold on to the bitter end, but imperfection (unlike perfection) beggars the imagination. It's one thing to imagine an all-knowing engineer designing a perfect eyeball, another to imagine that engineer slacking off and building a half-baked spine.
The German chemist Ernst Fischer mused that "as machines become more and more efficient and perfect, so it will become clear that imperfection is the greatness of man."
From the perspective of brute rationality, time spent making and appreciating art is time that could be "better" spent gathering nuts for winter. From my perspective, the arts are part of the joy of human existence. By all means, let us make poetry out of ambiguity, song and literature out of emotion and irrationality.
That said, not every quirk of human cognition ought to be celebrated. Poetry is good, but stereotyping, egocentrism, and our species-wide vulnerability to paranoia and depression are not. To accept everything that is inherent to our biological makeup would be to commit a version of the "naturalistic fallacy," confusing what is natural with what is good.
... people often fail to take into account the amount of data they've used in drawing their conclusions.
A recognition of our klugey nature can help explain why our late-evolved deliberative reasoning, grafted onto a reflexive, ancestral system, has limited access to the brain's steering wheel; instead, almost everything has to pass through the older ancestral, reflexive system. Specific contingency plans offer a way of working around that limitation by converting abstract goals into a format (if-then, basic to all reflexes) that our ancestral systems can understand.
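A loose illustration of the if-then format (my framing, not the book's): an implementation intention is literally a condition-action pair, the same shape as a table of reflexes, which is why the ancestral system can execute it without deliberating at decision time.

    # Contingency plans as condition -> action rules: the abstract goal
    # "eat better" recast in the reflex-shaped if-then format.
    plans = {
        "dessert menu arrives": "order coffee instead",
        "passing the vending machine": "keep walking",
    }

    def react(situation):
        # Reflex-style lookup: no weighing of options in the moment.
        return plans.get(situation, "no plan; deliberate (and maybe lapse)")

    print(react("dessert menu arrives"))  # -> "order coffee instead"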
If we want to reason by emotion alone, fine, but if we prefer rationality, it is important to create "winning conditions" — and that means, for important decisions, adequate rest and full concentration.
... irrationality often dissipates with time, and complex decisions work best if given time to steep.
... we have the luxury to take the time to reflect, and it behooves us to use it, compensating for our vulnerability to the vivid by giving special weight to the impersonal but scientific.
Reserve your most careful decision making for the choices that matter most.
In the information age, children have no trouble finding information, but they have trouble interpreting it. The fact (discussed earlier) that we tend to believe first and ask questions later is truly dangerous in the era of the Internet — wherein anyone, even people with no credentials, can publish anything.
For example, nearly half of all consumers (or 46.1%) in the study assessed the credibility of sites based in part on the appeal of the overall visual design of a site, including layout, typography, font size, and color schemes.
... metacognition, or knowing about knowing.

Footnotes

As a statistical matter, small changes thus appear to have a disproportionately large influence on evolution.
Ambiguity comes in two forms, lexical and syntactic. Lexical ambiguity is about the meanings of individual words; I tell you to go have a ball, and you don't know whether I mean a good time, an elaborate party, or an object for playing tennis. Syntactic (or grammatical) ambiguity, in contrast, is about sentences like Put the block in the box on the table, whose structure could be interpreted in more than one way. Classic sentences like Time flies like an arrow are ambiguous in both ways; without further context, flies could be a noun or a verb, like could be a verb or a preposition, and so forth.
In a perfect language, in an organism with properly implemented trees, this sort of inadvertent ambiguity wouldn't be a problem; instead we'd have the option of using what mathematicians use: parentheses, which are basically symbols that tell us how to group things.
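A worked arithmetic analogy (mine, not the book's): absent a precedence convention, the bare string 2 + 3 × 4 is ambiguous in just this way, and parentheses are what kill the ambiguity:

    (2 + 3) × 4 = 20
    2 + (3 × 4) = 14

Spoken sentences arrive with no such brackets, so listeners must guess the grouping.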
The art of improvisation is to invent what in hindsight seems surprising yet inevitable.