Fred Field
California State University, Northridge
e-mail: fredric.field@csun.edu
Abstract
The language faculty is, simply put, all of the neural and
biological mechanisms involved in the uniquely human ability to
do language. Some consider it a biological endowment, a gift
from our Creator, here by design and hard-wired into our brains.
Others believe that it emerged purely by chance, through the
accumulation of accidental mutations over millions of years.
The implications of these two mutually
exclusive viewpoints are immense and sweeping. Both basic positions
can appear far-fetched and fanciful when followed to their logical
conclusions, and both involve comprehensive belief systems in
order to account for the amazing complexity and diversity of human
language(s). This paper looks at some of the empirical evidence
so that we can see what is at stake and what the current state
of our knowledge suggests about its origins. Is language there
by design, or did it evolve? Is one position more reasonable than
the other?
1. Introduction
Humankind's fascination with the spoken word is far
from new, probably because it has been fairly obvious since the
dawn of history that human beings are the only ones on this planet
who have this remarkable ability. As a consequence, the study
of language and the language faculty has been around for at least
two thousand years, perhaps much longer. In our early history,
it was simply assumed that it was a gift from the gods (or God).
As far back as the fifth and fourth centuries BC, Plato (427-347)
pondered the origins of words. He may have been the first or,
at least, among the first to discuss whether the links between
language forms (words) and their referents (meanings) reflect
some sort of natural, objective reality, or are merely the result
of convention, associations created by a particular culture and/or
language (Bloomfield 1933:4).
Some of the issues Plato brought up are still far from settled
in the minds of linguists. In fact, many of the generalizations
made by ancient Greek scholars regarding the nature of language
(e.g., word classes such as Noun, Verb, and Adjective)
have not been seriously challenged or improved upon until fairly
recently, when linguists broadened the scope of investigation
away from a strictly European bias and began to grapple with the
diversity of indigenous languages of Africa, the Americas, Asia,
and Australia. Perhaps not until the 18th century A.D. did philosophers
revisit questions of the origins of language. Since then, speculation
has included the "bow-wow" theory (man's attempt to
imitate noises), "ding-dong" theory (natural sound producing
responses), and "pooh-pooh" or "yo-he-ho"
theory (exclamations and aggressive crying out). Sadly, not much
has really changed in the discussion of origins, though the terminology
has grown considerably more sophisticated.
The topic itself invites a great deal of speculation. If language
is indeed unique to humans, it seems a reasonable question to
ask: where has it come from? Awards have been given for the best
answers. Divine origin, evolutionary development, even language
as a human invention have all been proposed. Frustration is the
suspected outcome, particularly because of the obvious conflict
between actual written evidence and the popular assumptions
of humankind's origins in a very remote past. "For
these reasons, scholars in the latter part of the nineteenth century,
who were only interested in 'hard science,' ridiculed, ignored,
and even banned discussions of language origin. In 1886, the Linguistic
Society of Paris passed a resolution 'outlawing' any papers concerned
with the subject" (Fromkin & Rodman 1998:52). This was
because of the purely speculative nature of the arguments.
Spurred on by the recent work of such linguists as Derek Bickerton
and Steven Pinker, there seems to be renewed interest, but discussion
is typically positioned within a particular context, evolutionary
theory. As a consequence, the discussion also finds renewed respect
and credibility due to its association with respected members
of the scientific community. This may be of some concern to those
who hold any sort of alternative view of the origins of the universe
and that of humankind. If such an evolutionary account appears
weak or fails, divine origin does not necessarily become convincing
by default. "There is no way to 'prove' or 'disprove' the
divine origin of language, just as one cannot argue scientifically
for or against the existence of God" (Fromkin & Rodman
1998:53). For a claim to be considered a viable hypothesis or
theory, there is the scientific issue of falsifiability. Perhaps
needless to say, neither creationism nor evolution is falsifiable.
Undaunted by the magnitude of the task, this paper seeks to propose
a reasonable alternative by posing a simple question (and giving
a reasonable response): Which scenario, evolution or conscious
design, offers the most likely or reasonable view of the origins
of language and the language faculty? In other words, how likely
is the amazing order that we find in language and the language
faculty to be the result of random processes of genetic mutations,
even given a rational account of natural selection? Given the
combination of complexity, power, and order, how likely or probable
is such a "language organ"? And, as biologically based
(programmed into our DNA) and species specific (with no continuity
to any known species), what are the probabilities that something
like language and the language faculty have evolved by chance
to belong only to human beings?
In the following section, I attempt to provide a brief and very
general overview of the two most common ways in our culture to
view the world.
2. Worldviews: seeing the world with a trained eye
One thing that seems certain is that we typically look at the
universe and address the "big questions" with an eye
that is already trained to see certain things, and the origins
of the species and our unique capacities are certainly among those
questions. Ultimately, our assumptions about the origins and nature
of the universe shape how we view that universe and everything
in it. We instinctively put together some organizing principle(s)
to make sense of all the bits of information that surround us,
even if that is just tacitly accepting the view we've inherited
from our parents, teachers, or other community figures. If it
was good enough for them, then it must be good enough for us.
This may be a dangerous assumption: can 50 million Germans be wrong?
The answer is an emphatic, "Yes!" To be fair (and an
"equal-opportunity" critic), we all need to check our
assumptions from time to time in the light of the Truth that we
accept as certain, irrespective of any particular point of view.
The world is immense, and it is amazing. Most of us, I believe,
have views of the world, our worldviews, that help us explain
to ourselves why things are the way they are. For example, if
I believe that God created the heavens and the earth, then the
amazing order and systematicity in the universe can easily be
attributed to God's skillful design. The order I see is a consequence
of the fact that God is rational, and everything He creates, He
creates with a purpose and a design. The universe operates according
to organized principles and laws, so, when I watch my body heal
after an injury or sickness, I marvel at how God designed the
human body. We see biological and physical laws such as gravitation,
and look to the Lawgiver for the explanation.
Another way to view the world, accepted, it seems, by the majority
of the people of Western Culture, excludes the existence of any
kind of god, particularly the God of the Bible. As a consequence,
all the amazing order that we see is merely an accident, a result
of mere chance. Its evolving form is based on mathematical probabilities
and a principle known as "the survival of the fittest."
All matter and forms of life adapt to their circumstances. Somewhere
along an evolutionary path, life leapt into existence with no
real cause, no real explanation. Out of chaos, has come order,
and out of pure matter has come Mind (consciousness). Taken to
its extreme, humankind is ultimately an accident, and we, as human
beings, are responsible to nothing or no one higher (or lower)
on any kind of scale of Being. There is no design, and there are
no laws. Otherwise, we'd have to explain where the design and
laws come from.
These two fundamental positions are most certainly mutually exclusive.
If one is to be logically consistent, there doesn't seem to be
any middle ground. The person who will not believe in a God-Designer
is driven to evolution or, by some "leap of faith,"
dives into a belief system that is at once both speculative and
based on myth. Likewise, if a person finds evolutionary theories
difficult to accept, then some sort of supreme Being must be faced.
Perhaps, the only possible ground between the two positions, irrespective
of the question of design, is a general assumption of order, which
seems to be obvious and a point that either side can accept without
serious self-contradiction. Of course, any conclusions or explanations
are typically reconciled with underlying assumptions of causes,
origins, and so on. To illustrate the nature of the exclusivity
of the positions, Pinker, perhaps the leading expert on language
and the mind, rightly makes the following remarks on the "complex
design" of such human organs as the eye (emphasis his in
each instance):
Natural selection is not just a scientifically respectable alternative to divine creation. It is the only alternative that can explain the evolution of a complex organ like the eye. The reason that the choice is so stark--God or natural selection--is that structures that can do what the eye does are extremely low-probability arrangements of matter. By an unimaginably large margin, most objects thrown together out of generic stuff, even generic animal stuff, cannot bring an image into focus, modulate incoming light, and detect edges and depth boundaries. The animal stuff in an eye seems to have been assembled with the goal of seeing in mind--but in whose mind, if not God's? The very special power of natural selection is to remove the paradox. What causes eyes to see well now is that they descended from a long line of ancestors that saw a bit better than their rivals, which allowed them to out-reproduce those rivals. The small random improvements in seeing were retained and combined and concentrated over the eons, leading to better and better eyes. The ability of many ancestors to see a bit better in the past causes a single organism to see extremely well now. (Pinker 1994:360-361)
If guided by a basic belief in the various theories of evolution,
one likely assumes that the elements, the basic building blocks
of the universe, are merely there and, more or less, always have
been there in one form or another. They have not come from anywhere
specific, and they are not going anywhere either, except along
some sort of evolutionary path. There may have been a beginning,
and it is likely there will be some kind of end--at least to our
universe. However, what came before and what may come after
will be forever a mystery. The order to all that we see is phenomenal,
perhaps epiphenomenal, literally and figuratively. It must be
stated unequivocally that there is no design to it, that is, unless
one takes the rather irrational leap to a belief that evolutionary
processes themselves possess some sort of conscious characteristics
enabling them to act in specific and organized ways (evolution
selects), or that something--some type of impersonal force--or
someone is behind the obvious order (like the Wizard of
Oz).
What the evolutionary researcher sees will conform to these basic,
underlying principles, those that she/he has already accepted
beforehand. Consciously or unconsciously, the observed facts must
fit into a systematic worldview so that the researcher can interpret
what they mean. Otherwise, nothing can be "known,"
and all data appear to be disconnected. It also follows that one
may speak of the "facts of evolution" because certain
details are presumed to be true, for instance, that humans
and the great apes (e.g., gorillas and chimpanzees) have a common
ancestor. Obviously, this must be assumed because it is not provable
in a laboratory, and no one claims that it could be proved or
disproved, for that matter. Scientists are the "experts,"
and on their authority, normal, untrained people learn to trust
their conclusions. Despite the apparent optimism, the appeal to
the authority of science is, nevertheless, a logical fallacy. In other
words, just because a particular scientist (however that may be
defined) says that a particular statement or proposition
is a fact does not mean that it is unquestionably a fact. The history
of science is full of claims that have been proved false. Science
must self-correct. We don't have to go very far back to find examples,
for instance, regarding "proof" of the spontaneous generation
of life, allegedly proved by observing fly larvae emerging from
garbage, that is, before it was known how flies actually reproduce.
Of course, we cannot assume that all the conclusions of
every modern scientist are false, either (another logical fallacy),
but we must, nevertheless, discern between what is knowable and
falsifiable and what is not. It should be added immediately and
emphatically that no proponent of design refutes the utility
or validity of the scientific method. What is open to debate
are the conclusions one comes to on the basis of worldview or
belief and the presuppositions that each and every one of us brings
to the table as we examine the evidence.
To illustrate where evolutionary thinking can lead regarding the
origins of our species and the kind of confidence it engenders,
the contention is that, by a natural process of selection among
randomly occurring mutations, simple forms of life have evolved
into those that are increasingly complex, which, presumably, are
better suited to their respective environments. Our species has
developed as a product of these processes of complexification.
As a consequence, our consciousness--our awareness of ourselves
and others like us, along with the totality of our cognitive abilities,
whether imagined or real--is the complex result of biochemically-based
reactions contained in amazingly ordered neural networks. Thus,
a vast array of interconnecting neurons in the brain allows us
to make "mental" associations, however they may be configured,
which somehow effectively create the impression that we are aware,
that we possess and organize real knowledge of our surroundings,
experience a wonderful panoply of emotions, and so on, all in
response to various sensory stimuli emanating from our environment.
In his widely acclaimed book Consciousness Explained,
the philosopher Daniel Dennett states, rather optimistically:
Human consciousness is just about the last surviving mystery. A mystery is a phenomenon that people don't know how to think about--yet. There have been other great mysteries: the mystery of the origin of the universe, the mystery of life and reproduction, the mystery of the design to be found in nature, the mysteries of time, space, and gravity. These were not just areas of scientific ignorance, but of utter bafflement and wonder. We do not yet have the final answers to any of the questions of cosmology and particle physics, molecular genetics and evolutionary theory, but we do know how to think about them. The mysteries haven't vanished, but they have been tamed. They no longer overwhelm our efforts to think about the phenomena, because now we know how to tell the misbegotten questions from the right questions, and even if we turn out to be dead wrong about some of the currently accepted answers, we know how to go about looking for better answers. (Dennett 1991:21-22)
It is implicit in such statements that evolutionary thinking
is the right way to frame the right questions, and one is left
to imagine what is meant by "misbegotten." Assuming
the self-correcting nature of science, it is noteworthy that Dennett
does admit to the possibility that "we" could be dead
wrong, but he will not admit to the possibility that the misbegotten
questions could produce fruitful lines of inquiry. And, I wonder
at how "we" have tamed the mysteries of space and time,
and of the "design to be found in nature."
In stark contrast, a worldview based on the existence of God and
His revelation of Himself (the Bible) clearly puts the same evidence,
or observable facts, into an entirely different perspective.
The starting point in a Biblical worldview is that there is a
designer, one who has built into the universe basic, fundamental
physical and biological laws, certain universals (laws
that are uniformly applied and never random) that explain
(an important word in all branches of science) how all of the
interrelated elements and structures function like "clockwork".
If we assume design, we have the intellectual freedom to
note cause-and-effect relationships and, as a consequence, to
formulate principles and laws within dynamic physical and
biological systems, and, perhaps, within certain social and psychological
domains, as well. One can reasonably account for the fact that
there is a reason behind the order, and that it won't change overnight
or over millions of years (trees bearing seed after their
own kind, human children born of human parents, etc.). We can
conclude that the accumulation of facts and the principles drawn
from the representative samples (examples in nature) that we observe
in the physical world, for instance, constitute a body of knowledge
that is objectively true, that is, when the observed facts are
consistent with the principles revealed in scripture (e.g., a
design to nature).
Evolutionary theory, therefore, claims to account for the order
that we see in the universe, and it applies the concept of natural
selection to the evolution of our species. The assumption, of
course, is that all life began at a point in time out of some
sort of primordial soup of elements, and that complex organisms
have evolved out of simple organisms. While this claim is untestable,
the available laboratory evidence demonstrates that organisms
do not become more complex. Granted, they may evolve or change,
but the changes do not involve the creation of additional,
novel genetic information. Two parent animals may produce an offspring
that is not necessarily identical to either parent, but the genetic
information that the offspring possesses is derived from the parents
nonetheless. The work of Gregor Mendel with varieties of peas
illustrated the concept of recessive traits, that certain genetic
information may be present in the parents but not directly realized
in their progeny. His work illustrated that changes are not random
(in the mathematical sense); cross fertilization may introduce
"new" genetic information into a species, but this information
is not necessarily novel.
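Mendel's monohybrid cross can be sketched computationally. The following is an illustrative sketch, not taken from the source; the allele labels (A for the dominant allele, a for the recessive) are conventional textbook notation. It shows how two heterozygous (Aa) parents yield the familiar 1:2:1 genotype ratio purely by recombining alleles the parents already carry:

```python
from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Count offspring genotypes from a one-gene (monohybrid) cross.

    Each parent contributes one of its two alleles; every offspring
    genotype is a recombination of the parents' existing alleles --
    no allele appears that neither parent carried.
    """
    offspring = ["".join(sorted(pair)) for pair in product(parent1, parent2)]
    return Counter(offspring)

# Two heterozygous (Aa) parents give the classic 1:2:1 ratio.
print(dict(cross("Aa", "Aa")))  # {'AA': 1, 'Aa': 2, 'aa': 1}
```

Recessive traits fall out of the same table: an Aa x Aa cross produces aa offspring even though neither parent expresses the recessive trait, which is exactly the pattern Mendel observed in his peas.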
In the case of mutations in humans, for example, there is a price
to pay. Genetic abnormalities typically produce weaker (in the
biological sense), not "superior" offspring. With respect
to genetic mutations, much of the work in humans and other animal
species concerns genetically inherited diseases. To my knowledge,
no mutations in DNA have been found to produce positive changes,
however they may be defined. Nevertheless, this is a very great
gray area intrinsically open to a wide range of interpretation.
In plant species, the focus of gene research is the elimination
of disease producing mutations, for example, to breed disease-resistant
strains of plants (crops) and so on. Mutations in animal genes
(e.g., those caused by lesions and so on) do occur. Nevertheless,
the outcome of these mutations is an empirical question for researchers
in genetics, and, if it can be proved in the laboratory that random
mutations produce novel and genetically stronger species
or races of animals, then that needs to be made public (to people
of all worldviews) and open to scrutiny.
Genetic manipulation, which obviously includes questions of ethics
and the current lack of knowledge of long-term consequences of
such manipulation, has apparently been successful with respect
to plant species. It may be assumed that genetic engineering can
accomplish similar results in animal species. Nevertheless, manufactured
changes are clearly the result of outside manipulation in controlled,
laboratory environments. And, of course, design is explicit. When
a researcher tinkers with the DNA of any species, she/he plays
a causative role. The long-term results may be accidental (i.e.,
not fully predicted), but they are certainly not random.
Under normal, natural circumstances, mutations do not create entirely
new species, just different sorts of the same species. Even regarding
races within species, adaptation (the preservation of preferred
or advantageous genes) within a specific environment simply means
that "selection" has resulted in the elimination of
those genes that weaken and not the spontaneous generation ex
nihilo of novel DNA. In addition, mutations occur to existing
materials, and the consequences appear to be within limits.
Speculation about the origin of our species in Darwinian terms
relies on the concept of selection plus the great time depths
required to make the necessary changes, say from reptiles to mammals,
and from the first mammals to diverse kinds of mammals, including
the biological order of primates. The idea of a common ancestor
that existed millions of years ago is critical for understanding
the wide diversity within this particular order. For example,
it is commonly held that gorillas, monkeys, and humans split off
from a shared lineage at different times, though there is not
universal agreement as to which split occurred first. Based on
the analysis of DNA, chimps and humans share about 99% of the
same genetic information. But, Seidenberg (1987:fn. 31) points
out that the 1% difference involves a minimum of 40 million base-pairs
(sets of bonded nucleotides on opposite strands of DNA).
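The arithmetic behind Seidenberg's footnote is easy to check. The sketch below assumes a genome on the order of 3 to 4 billion base pairs (round figures I am supplying, not values given in the source); one percent of that is indeed in the tens of millions of base pairs:

```python
# Back-of-the-envelope check of the "1% difference" figure.
# The genome sizes below are assumed round numbers, not taken
# from Seidenberg (1987).
genome_low = 3_000_000_000    # assumed ~3 billion base pairs
genome_high = 4_000_000_000   # assumed ~4 billion base pairs
difference = 0.01             # the commonly cited 1% human-chimp gap

low = int(genome_low * difference)
high = int(genome_high * difference)
print(f"1% corresponds to {low:,} - {high:,} base pairs")
# 1% corresponds to 30,000,000 - 40,000,000 base pairs
```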
Bickerton (1990:1) states: "Yet if you consider our respective
natures, you would never expect the gap between us and the apes
to be as vast as it is. We share with the chimpanzee perhaps as
much as 99 percent of our genetic material, and our common ancestor
may be as little as five million years behind us." Bickerton
contrasts the magnificent accomplishments of our species--the airports
(and airplanes), buildings, roads, bridges, and other monuments
to our engineering prowess--with those of other species, and admits
to the unlikelihood that one can logically "prove" evolution.
He adds the following comments on the cognitive abilities
of humankind contrasted with those of our nearest genetic
relatives:
These vast differences, qualitative as well as merely quantitative, between our species and those that are closest to it pose no problem for those who believe, as many still do, that we result from a unique act of creation, a supernatural irruption into the natural scheme of things. For those who do not believe this, and who find overwhelming the evidence that we developed, as all other species did, through the natural processes of evolution, these differences must remain puzzling indeed. That evolution, over all-but-infinite time, could change one physical organ into another, a leg into a wing, a swim bladder into a lung, even a nerve net into a brain with billions of neurons, seems remarkable, indeed, but natural enough. That evolution, over a period of a few million years, should have turned physical matter into what has seemed to many, in the most literal sense of the term, to be some kind of metaphysical entity is altogether another matter. So, on the face of it, both sides seem to be left holding beliefs rather than theories: the one side, belief in a special creation, the other, belief in a no-less-miraculous transmutation of matter into mind. (1990:2)
Situated somewhere in the dialog of origins, this paper looks
specifically at language and the language faculty, to see what
light available empirical evidence can possibly shed on
the question of design. Which proposed theory fits the facts better?
Perhaps, both are equally defensible (or indefensible). If design
is assumed, then all evidence of order is merely the expected
consequence. If it is not, questions arise of the ability of random
mutations to generate order and how evolutionary processes can
be reasonably construed. From the Biblical perspective, the systematic
nature of language and the language faculty have a cause, and
it is not based on probability (which can cause nothing).
Chance is not an entity that can do anything. From the
evolutionary standpoint, there must be some sort of force--a hand
that rolls the dice--that acted to produce an effect (there are
no effects without a cause), even if it merely acted upon existing
elements within the limitations imposed by basic possibilities
and/or probabilities.
Just as particular principles guide a researcher to look for homologies
and analogies according to presumed patterns of evolution, the
Biblical worldview guides research by pointing out where to look
for patterns and correspondences. It motivates efforts to gather
more information about such areas as language and the mechanisms
behind it, to increase our understanding of who (or what) we are,
and where we might stand in relation to an orderly cosmos. The
evolutionary viewpoint motivates efforts to demonstrate how order
comes out of chaos, and how complex organisms evolve from simple
ones. The Biblical viewpoint motivates efforts to show how the
amazing order evident in the cosmos is the natural consequence
of design.
3. Human language
Among the many fields and sub-fields that make up the study
of language as we know it today, there are a number of commonly
held principles that are consistent with views that assume design.
At their base is another one of those wonderful chicken-and-the-egg
conundrums. What came first, the language faculty (the human capacity
for language), or language? This may not be as obvious as it looks.
First of all, can we say that languages exist apart from their
users? And, what makes any system of communication a language,
and what defines it as human? Are all linguistic systems
known as human languages a consequence of the ways the
speaker's mind/brain is configured, brain architecture, ignoring
for the moment the implied architect?
One scenario, the one we could call the egg, suggests that
languages are entities unto themselves, the products of human
societies. So-called dead languages are not really dead
at all. It is simply that their speakers are no longer living--they
died, not their language(s). Latin and Ancient (Biblical) Hebrew
are wonderful linguistic systems full of expressive power and
beautifully constructed grammars. In fact, Latin is often held
up as one of the most elegant and powerful languages ever spoken
and/or written in history, the part we know. What is left of it
is a family of languages full of irregularities and characteristics
that have accrued from outside (non-Latin) sources; the modern
varieties pale by comparison. Modern Israeli Hebrew systematically
differs from its ancient predecessor. To the purist (who wants
to preserve the pristine nature of a language), as soon as those
pesky, impertinent, and unpredictable humans started to
speak Hebrew again, it began to change, deteriorate, and degenerate.
The other scenario, the chicken one, suggests that human
language mirrors brain architecture and provides just about the
best possible window into the inner workings and circuitry of
the brain. It often constitutes one of the key elements of our
attempts to map the brain and identify where certain functions
occur or perhaps are located. This view of the mind/brain also
fits into evolutionary views, which assume that the brain itself
is the product of accumulated changes that have taken place over
"all-but-infinite time", changes that have somehow become
encoded into our DNA. When humankind's brain reached a certain
point in its development, language was possible. From the evidence
available so far, Homo sapiens is the only species that has a
brain developed in such specific ways. In one worldview, the mind/brain
came first, and language somehow gradually developed. But, from
the other, the mind/brain and the language faculty were constructed
simultaneously (God spoke to Adam in the garden), with purpose
and design. Language is a gift, and it appears to be the one characteristic
that goes beyond the mere physical. It has given our species the
ability to communicate, to create huge bodies of literature of
all kinds (from technical to spiritual), and generally to reflect
upon itself and questions such as the ones we are asking here.
Four characteristics of the egg
Within linguistics, there are at least four commonly held principles concerning the nature of language: (1) Every language is, in principle, infinite. As Wilson (1975:176) states:
The great dividing line in the evolution of communication lies between man and all the remaining ten million or so species of organisms. The most instructive way to view the less advanced systems is to compare them with human language. With our own unique verbal system as a standard of reference we can define the limits of animal communication in terms of the properties it rarely--or never--displays. Consider the way I address you now. Each word I use has been assigned a specific meaning by a particular culture and transmitted to us down through generations by learning. What is truly unique is the very large number of such words and the potential for creating new ones to denote any number of additional objects and concepts. This potential is quite literally infinite.
To illustrate, there is a finite set of digits--ten, from 0
to 9--yet there is no largest number. This is due to a principle
known as recursion: each element can recur without limits.
One can always append another digit to any number to produce
a larger one. The same applies to language: there is no longest
sentence in any language because there is no limit to the
number of nouns, verbs, and so on that are allowable in a single
utterance, and no limit to the number of possible sequences (e.g.,
phrases and clauses) constructed by the syntax of an individual
language. This is true of every human language, English included.
All that needs to be done to expand a single sentence in English
is to add the words and or that, for instance, and
then keep right on going: "This is the house that Jack built...".
Despite the fact that one might be able to list the individual
words of a language in a dictionary, it would be silly
to even imagine trying to list the possible sentences in any language.
The possibilities are infinite. Every day, we produce and
understand sentences that we have neither heard nor spoken before.
This is possible only because language is systematic.
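The principle of recursion described above can be made concrete with a short sketch. The sentence frame and embedded clause below are hypothetical choices of my own, in the spirit of "This is the house that Jack built"; the point is only that a rule which applies to its own output yields a new, longer, still-grammatical sentence at every depth, so no longest sentence can exist:

```python
def noun_phrase(depth):
    """A noun phrase may contain a clause that itself contains a
    noun phrase -- the rule calls itself, i.e., recursion."""
    if depth == 0:
        return "the house that Jack built"  # base case
    # Recursive case: embed another noun phrase inside a clause.
    return "the house that stood by " + noun_phrase(depth - 1)

# Each increase in depth yields a distinct, longer sentence.
for d in range(3):
    print("This is " + noun_phrase(d) + ".")
```

Because `depth` can be any non-negative integer, the set of sentences this one rule generates is unbounded, even though the rule itself and the vocabulary it draws on are finite.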
Second, in addition to being, in principle, infinite, (2) all
human languages are equally complex or challenging for any child
to learn, and this goes beyond the desire merely to say something
politically correct and to defend so-called minority languages.
The speech systems of the world exhibit an amazing diversity in
the ways they represent words, phrases, and/or human thought.
Irrespective of their potential differences, every bilingual has
at least the intuition that whatever can be expressed in one language
can be expressed in the other, even if it must be put in somewhat
different terms. The Inuit (Eskimo) languages, for instance, consist
of extremely long and complex words that take entire sentences
to translate into English. Nevertheless, no one would say that
English is less developed or evolved, or that what can be expressed
in an Inuit language cannot be put into English with a little
effort. At the other extreme of language types are those such
as Vietnamese, Khmer, and the Chinese languages which structure
words in ways that make English appear complex. They have no distinctions
of Tense on verbs and do not express singular and plural on nouns.
Yet, these languages are no easier to learn than Inuit languages.
By that it is meant that a child born into an Inuit-speaking family
acquires an Inuit language just as quickly, efficiently, and effortlessly
as one born into a Vietnamese-speaking family acquires Vietnamese
or another in an English-speaking family acquires English. (Learning
a second or subsequent language as an adult is another matter.)
The moral of the story is that human babies learn human languages
at basically the same rate, with the same degrees of success,
quite unconsciously (through observation), and without the aid
of a private tutor or teacher (no textbooks or exams, either).
Third, along with the concept of complexity, linguists often draw
attention to the fact that (3) there are no primitive languages
out there, only primitive people and cultures. Calling one
society "advanced" and another one "primitive"
is a rather subjective judgment anyway, based on perceptions of
culture, not intelligence or genetics. The people of the Kalahari
Desert or of, say, Borneo, or the remotest of villages in the
outback of any continent may be considered primitive by Western
technological standards, but they speak languages that are easily
as grammatically complex as English, maybe even more so. At no
time and in no place has a primitive language been found,
one that appears to be less evolved or developed than any language
spoken today or throughout recorded history. Despite film fantasies
of cavemen and cave women, there are no "ooga-boogas"
out there.
Fourth, and shedding light on chicken and egg alike, (4) wherever
humans have been found to exist, there is language. In other
words, chickens and eggs go together. Nothing has been uncovered
to suggest that human language or the capacity for language
has evolved in any way from earlier stages, from simple to increasingly
complex structures, either with regard to grammatical structures
or the neural structures necessary to sustain language. This is
interestingly expressed in the so-called Paradox of Continuity,
that there is no previously existing genetically-based biological
system in any other species out of which language could have evolved
(Bickerton 1990:7-8). This observation is continually reinforced
by the unconscious ways in which all children acquire languages,
a fact that has led many linguists to agree that language or the
knowledge of language (how to do it) is part of our (species-specific)
biological endowment, an ability that is inherently there and
not likely to change. There is something about human language
that enables it to be acquired by human children efficiently,
with little conscious effort. There is something about the human
mind/brain that allows children to acquire human language.
Now for the chicken
Turning, once again, to the chicken, every normal child (i.e., one born without some sort of physical abnormality or condition that can limit the brain's functioning) acquires his/her native
or first language naturally and in a predictable order, beginning
with linguistic sounds (or gestures) in general and progressing
to the specific sounds and sound patterns (or signs) of the language
they are acquiring. After sounds come words. Child acquirers then
pass through grammatical stages from one-word utterances to two-word
combinations, and eventually recreate the grammatical characteristics
of the adult speech in their environments. All of this is accomplished
without the benefit of any kind of special training or instruction,
and without a trainer or instructor. Note, too, that it is likely
that the majority of the people alive today use more than one
language on a daily basis. Those who learn several languages
typically keep them separate and use each of them skillfully and
in their appropriate social contexts. For instance, some languages
are used in the home, with parents and relatives, while other
languages may be used in business settings or in other more official
(formal) settings (school, government, etc.).
Babies acquiring English or any other human language seem to have
a built-in, internal syllabus, an order in which their language(s)
are learned. Some proceed faster than others, but all go through
similar stages in acquisition and get to basically the same place: complete
and total mastery of their native languages. To highlight the
unconscious aspects of native-language acquisition, few people
recall acquiring their first languages, though we often remember
trying to learn one as adults. I probably made all the same boo-boos
as every other child acquiring English and said things like feets
and pusketti, but I certainly don't recall ever doing them.
Then again, I may have been quite precocious. I also grew up with
two languages in the home, Spanish and English, and I don't remember
acquiring either of them. Nevertheless, taking German courses
at a community college is something that I will never forget.
It was time-consuming, boring (a "dirty" word to many
in language instruction), and I was embarrassed at how my voice
sounded on tape (yes, the instructor made us speak into a tape
recorder and listen to how we sounded in German (ugh!). Despite
the discomfort, I thought I was doing quite well (I got good grades)
until the first few minutes I was in Germany and came face to
face with native speakers who did not slow their speech down for
my benefit, as my German instructor had done.
The capacity for native language acquisition is truly remarkable.
Apparently, some sort of developmental program begins to run around
the child's first birthday, with or without input, when the first signs of spoken language, or babbling, begin. Children born deaf make oral, babbling sounds just as their hearing counterparts do,
though their babbling gradually fades away in the absence of the
feedback hearing children experience. Deaf children will then
spontaneously construct visual utterances (perhaps part of what
may develop into what is known as home sign) to communicate
if their parents and/or caregivers are hearing and not familiar
with a visual language system. Importantly, those who are exposed
to visual language, e.g., American Sign Language (ASL), from birth
go through the same kinds of developmental steps of language acquisition
as those who are acquiring a spoken language.
To illustrate the natural progression, researchers specializing
in the native acquisition of sign language describe a phenomenon
known as pronoun reversal, when a signing child reverses
the direction of a pointing gesture representing the pronoun "you"
to refer to "me." Signing children may assume that the
linguistic function of a gesture pointing towards them (at "me")
and away from the person with whom they are "talking"
(e.g., Mommy), refers to them, and obviously not the signer herself.
So, the child may use that same sign pointing away from him- or herself (technically "you") to refer to "me," the one who is usually pointed at in that way.
The child reverses the gesture to mean "me" or first
person, despite the fact that the sign (form) means you
(second person) in ASL. The confusion seems understandable from
the child's viewpoint. What is noteworthy is that this is also
reported among some hearing children when mastering the pronoun
system of English. For instance, when a young child says, "You
want some candy," he or she may very well mean "I want
some candy." This may escape the grasp of the parent entirely,
who naively answers, "No, thank you. I don't want any candy."
Until this stage is passed, there may be some awkward moments
and children frustrated with their parents' inability to understand
"plain English."
It is interesting to note that signed language systems such as ASL are human languages, too, with the same creative and expressive
properties, grammatical complexities, and so on as those that
are typically spoken. It may seem remarkable to those unfamiliar
with deaf language users that many native or proficient signers
subsequently learn spoken languages (perhaps more than one) in
addition to their signed language(s). There is no reason to assume
that the capacity for language is absent from a person just because
a particular perceptual ability (or input device such as hearing
or vision) allowing access to particular kinds of linguistic information
is impaired (Hoff-Ginsberg 1997:298-306).
The literature on language acquisition abounds with evidence of
systematicity and order. Regarding the basic building blocks of
language, individual speech sounds are typically learned in a
predictable order. For instance, all vowels are generally learned
before all consonants, and particular consonants are mastered
before others (e.g., stops such as the sounds for the letters
p, b, and m are learned before the th-sound of the word
three or the r-sound of red). Nouns (names
of persons, things, etc.) are learned before prepositions (e.g.,
in, on, or under). Even with respect to specific word endings, such as the grammatical (inflectional) suffixes of English, the -s used to form plural nouns (the -s of toys) is typically acquired long before the verbal suffix -s, for example the -s on loves in the following: "Fred loves Cathy." With respect to word order and the sequences
in which sentence structures are acquired (syntax), declarative
sentences (statements) precede questions, either yes-no
types or questions using the so-called wh-words such as
who, what, where, when, and why,
and so on. Affirmative sentences also precede negatives, e.g.,
sentences such as "Fred speaks Spanish" are typically
acquired before a sentence such as "Fred does not speak Mandarin."
All aspects of grammar are acquired in some kind of order or sequence
in all the languages of the world, whether these languages are
written or not, acquired in the home, or learned at school.
As the work of Eve Clark points out (e.g., Clark 2003), children
are active participants in the acquisition processes. It used
to be thought that we are all born with brains that were essentially
blank slates, with no inherited or innate knowledge. Recent research
has spoken definitively against that view (e.g., Karmiloff-Smith 1993). It
is abundantly clear that children learning English, for instance,
develop specific strategies to get new words. They pay attention
to specific places in a parent's or caregiver's speech to locate
new words for new concepts, for instance discovering new words
in adult speech at the end of an utterance, "Look at the
__________" (Clark 2003:68). It is also clear that it is
not always the gaze of the parent that determines where the child's
attention goes. In fact, it is typically quite the opposite. Whatever
draws the child's attention (e.g., an elephant or clown at a circus),
draws the attention of the parent or caregiver. The child often
is the one who initiates shared attention or gaze (Clark 2003:32).
Of course, the question that naturally follows is, "Why?"
Why do children demonstrate such systematicity in acquisition?
Current theory suggests that complexity and learnability, based
on the processing demands that each individual language places
on the mind/brain of the child acquirer or adult learner play
significant roles (see, e.g., Field 2004; Pienemann 1998, 2000).
In essence, the elements of language that are simple (e.g., names
like nouns) are learned/acquired first, followed by those that
are more complex (e.g., inflectional suffixes), aspects of language
that require greater and deeper kinds of knowledge of the language.
This is also one reason why some words such as nouns and verbs
are so easy to learn in a new language, and why more subtle and
intricate grammatical characteristics and patterns may take so
long to acquire and seem so difficult and complicated.
What language is not
It may be worthwhile to consider what this remarkable cognitive
skill is not. Those in the cognitive sciences note that
language is neither THOUGHT nor INTELLIGENCE per se. Even
though we often talk to ourselves or encode our thoughts
into language, we use our minds (an interesting concept in itself)
for functions that are not linguistic. We can do many things simultaneously
without mentally discussing it with ourselves, as any experienced
automobile driver will confirm. This is significant because implicit
in most conceptions of consciousness is some form of intelligence,
the Sapiens part of the scientific name of our species.
It may be difficult for some of us to think of consciousness without
the presence of language, but it is there nevertheless. The next
time you change stations on your car radio while driving in traffic
on an interstate, you'll notice that you do so with little or
no discussion. In fact, you may be talking to someone else at
the same time. We all engage in multitasking, and
I think it's fairly obvious that we don't lose consciousness when
we do. We all know how to think without language, e.g., when we
are busy doing such skilled behaviors as typing, playing a musical
instrument, or engaging in a sport.
Despite our intuitions, there is no strict equivalence of intelligence,
on the one hand, and language, on the other, a point linguists
like to make when speaking of the autonomy of language and the
language faculty. It is illustrated most clearly by the great
apes, who are very clever indeed and, therefore, intelligent.
Nevertheless, they cannot attain to human language. It is also
true that one cannot equate language with cognition, the totality
of our ability to know things: how to see, how to calculate,
how to walk upright, how to do language, and so on. It is only
one aspect of our inherited abilities as humans. Evidence for
the important distinction between cognition and language comes
from those with Down Syndrome, who may have significant mental
retardation, yet whose language abilities develop relatively normally,
and those with Williams syndrome, who have severe cognitive deficits
but who appear to have extraordinary linguistic abilities, nonetheless.
We turn now to a more detailed discussion of language and the
language faculty. Because language and the language faculty are
warp-and-woof, in a sense, the discussion of the nature of language
will continue to be relevant.
2: Complexity and power - neural mechanisms of language
One of the advantages that modern linguistics has gained over
the last half century or so is the very rapid development of technology.
For instance, technological advances in brain imaging have allowed
researchers to get close-up views of what seems to be going on
in the mind/brain as it goes about its normal functioning. Computerized
brain-imaging machines can generate graphic displays of entire
brain structures and pinpoint areas within a single millimeter
of a particular slice of brain tissue (Sylwester 1995). Brains
can be monitored for electro-chemical activation through a number
of methods, including CAT scans (computerized axial tomography-making
images of specific slices of tissue used to produce three-dimensional
representations of X-rays), PET scans (positron emission tomography,
using radioactive materials to chart blood flow), and MRIs (magnetic-resonance
imaging that produces images of a cross-section of an object or
bodily organ). The electroencephalogram (EEG), in use for over
half a century, reports patterns of the electrical transmission
of information; rapid advances in this type of technology have
produced such machines as BEAM (brain electrical activity mapping),
which more precisely records brain activity and can generate a
graphic representation of the cerebral cortex.
Recent models of language processing use some of the concepts
inferred from actual brain structure. Various connectionist models,
for example, describe "spreading activation" along the
pathways of the brain. These models have been severely criticized,
mostly for the failure of computer models to generate anything
close to human language. Nevertheless, they are based on the idea
of neurons and neural pathways (dendrites, axons, neurotransmitters,
etc.). According to Gee (1992:36-37), the brain contains a total of approximately 10^11 (one hundred billion) neurons, each having at least 1,000 connections. The number of possible connections is, therefore, 10^14, or 100,000,000,000,000
(one hundred trillion). On this view, each individual connection
receives signals that vary in intensity; they are "weighted"
according to the importance, appropriateness, or correctness of
the signal. (The relative intensity of different signals along
neural connections can be illustrated by tests involving the recognition of words. All likely candidates "light up" in the
brain until the so-called recognition point is reached, at which
point, the signals of excluded candidates gradually fade. The
connection to the correct form is preserved at a maximal intensity.)
Gee suggests that a conservative estimate of ten degrees of activation,
or weights, multiplies the basic possibilities by a like number,
meaning that 1 quadrillion sorts of signals are possible in such
a neural environment.
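Gee's figures are easy to check with simple arithmetic. The snippet below only multiplies out the numbers quoted in the text (10^11 neurons, at least 1,000 connections each, ten degrees of activation); it is not a model of the brain.

```python
# A quick check of the connectivity arithmetic attributed to Gee (1992).
# All figures come from the text; the code just multiplies them out.

neurons = 10**11            # one hundred billion neurons
connections_each = 1_000    # at least 1,000 connections per neuron
weights = 10                # a conservative ten degrees of activation

connections = neurons * connections_each
signal_states = connections * weights

print(f"{connections:,}")    # 100,000,000,000,000 (one hundred trillion)
print(f"{signal_states:,}")  # 1,000,000,000,000,000 (one quadrillion)
```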
A more traditional account of brain architecture (of linguist
Noam Chomsky and others) is that it is modular, consisting
of interlocking (interdependent, yet autonomous) regions. Language
is, therefore, one of these modules, sub-divided into separate
sub-modules for pronunciation, syntax, and so on. Recent work
in brain injuries, e.g., into various aphasias such as Wernicke's
or Broca's aphasia (a dysfunction resulting from strokes, lesions,
and so on), and the split-brain work of researchers such as Roger
Sperry (who won the 1981 Nobel Prize) seem to support a position
perhaps somewhere between the two extremes, between a strictly
connectionist approach, on the one hand, and a strictly modular
approach, on the other. It seems clear enough that particular
areas of the brain have specialized functions that can be located
(more or less) and affected by stimulus, as those illustrated
in studies of the role of the corpus callosum in grand mal seizures
associated with forms of epilepsy. Particular probes in certain
areas trigger mental images (that can be quite realistic) in patients
who experience such surgical procedures without any type of anesthetic.
Irrespective of how the debate between modularity and connectionism
may turn out (both can be partially correct), it seems clear that
the substance of the mind/brain is a vast array of neurons connected
in various ways. The fact that it functions with such remarkable
precision is a testimony to its structure, and, regardless
of one's viewpoint concerning origins, the term "architecture"
certainly appears applicable. We think abstract thoughts, perceive
and sense the things and people around us, move about, create
works of art, feel and express various emotions, compile memories
of our thoughts of events and different kinds of sensory information,
compose our thoughts into coherent messages, and communicate with
others like ourselves, all by means of this amazing organ. It
is a complex universe of connections and activity in its own right.
Most linguists and scholars in other branches of the cognitive
sciences would agree that language and the language faculty reflect
the cognitive abilities that every human being shares--our
repertoire of innate abilities (e.g., to walk, see, think, make
basic mathematical computations, and talk), however the term innate
may be defined. Whether it is autonomous and domain specific (there
is a specific aspect of the mind/brain devoted to its operation
and function that is independent of our other cognitive abilities)
or domain general as part of our overall, general cognitive abilities,
humankind is "born" with the capacity for language.
We all know how to do it, whether we can discuss it intelligently
or not. You may not be a linguist, theoretical psychologist, or
an English teacher, but you know how to speak or sign a language.
People are remarkably efficient language users, to be sure. In
the normal, everyday use of language, we have to access the correct
meanings for the forms (words and so on) we hear and read, and
find appropriate forms for the concepts we wish to express. We
comprehend and produce thousands of words accurately and efficiently
on a daily basis. There are various estimates of the number of words contained in our heads, our vocabularies or mental lexicons.
Some have put the number that we can use more-or-less accurately
(our production lexicon) at around 30,000. Based on recognition,
one study estimated the vocabulary size of the average Oxford
undergraduate at 75,000 words. The rate of access and retrieval
in normal spoken conversation is around 120-150 words a minute,
with spurts that may reach double that rate.
This means that out of the tens of thousands of words in our heads,
we are pulling out from two to four or five per second. And, we
seldom make mistakes. A study done on a body of 200,000 spoken
words in English found 86 word-choice (lexical selection)
errors and 105 other slips of the tongue (Garnham, Shillcock,
Brown, Mill, and Cutler 1982). Taken together, this amounts to
an error rate of about one per thousand, which would be equivalent
to making one mistake or slip while speaking non-stop as fast
as you can for three minutes, eight to ten minutes if you take
your time. This kind of efficiency and speed in processing cannot
be the result of blind luck or chance. It has prompted more than
one researcher to stand back in awe. Willem Levelt, a well-known
researcher and author, writes: "Reading this literature may
create the misleading impression that felicitous lexical access
is a matter of good luck rather than of exquisite design"
[emphasis mine] (Levelt 1992:2).
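The error-rate arithmetic above can be recomputed directly from the figures reported for the Garnham et al. (1982) corpus; the words-per-minute rates are those given in the text, and the time estimates follow from them.

```python
# Checking the error-rate arithmetic: 86 lexical-selection errors plus
# 105 other slips of the tongue in a 200,000-word spoken corpus.

corpus_words = 200_000
errors = 86 + 105                       # 191 errors in total

words_per_error = corpus_words / errors
print(round(words_per_error))           # ~1047, i.e., about one per thousand

# How long it takes to utter ~1,000 words at the rates cited in the text:
for wpm in (300, 150, 120):             # top speed, fast, leisurely
    print(wpm, "wpm:", round(1000 / wpm, 1), "minutes")
```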
There are numerous theories about how we actually do this, despite
the fact that no one yet knows for sure what a word could
possibly look like as represented in the brain, or what you actually
know when you know a word. Most approaches begin with the
idea that particular forms (sequences of sounds in spoken language
or gestures in signed language) are associated somehow in the
brain with information about their (a) meanings (both in relation
to a word's immediate context and at the so-called global level
in relation to the individual's knowledge of the world); (b) other
forms (e.g., words in English that may have various forms such
as foot-feet, go-went-gone, electric-electricity,
differ-different-difference, and so on); and (c) positions
in which the forms are allowed or required to occur in an utterance
(whether a word can be the subject of a sentence, a verb, adjective,
and so on). Add to this the physical aspects: the complex and subtle
acoustic properties of individual sounds (e.g., the difference
between the p-sounds of "pit" and "spit",
the aspiration) and the articulatory mechanisms that need
to interact to produce them (the brain, lungs, and various muscles
in and around the oral cavity).
We can even guess what a word will be before we hear or
read it. It could be the sound of the word to complete a rhyme,
its meaning to complete a joke, or the simple fact that there
is only one correct possibility grammatically (e.g., part of speech)
or semantically (according to its meaning). Many of us know the
experience of anticipating the next word in a camp song, rhyme,
or, joke that we have never heard before. For example, you should
be able to complete this ____________. Apparently, there is something
about the mind/brain that allows it to locate appropriate word
forms for meanings and the reverse, meanings for words, that gives
us the ability to fill in the blanks and predict what is coming
next.
Considering the speed with which words are encountered in the
speech stream and the sheer number of possible meanings they could
have, there must be an equally rapid and efficient process that
enables us to correctly identify them. In view of the fact that
we typically recognize a word at its so-called uniqueness point,
the point at which it differs from all other words that start
with similar sounds, the recognition process is most likely based
on initial sound sequences, scanning from left-to-right, that
is, in the order the individual sounds are uttered (Hawkins 1994:4).
For example, contrast connect, connecting, and connection, three words that share much of the same phonetic information, up to a point. None can occur in the same grammatical context (they
all belong to different word classes), so the ability to ANTICIPATE
which one is being uttered in the speech stream or written on
a page is built into the grammar of English. (Which one of those
three words would correctly complete the following:"In order
for electricity to flow from this outlet to my computer, a clear
__________ must be made.") Additional portions of a word
(middles and ends) may be important for full and accurate identification
in certain instances. (Parenthetically, reading alphabetic or
syllabic writing systems requires matching visual or orthographic
representations with meanings, most likely circumventing the need
for sound-meaning correspondence in a great number of circumstances.
For example, all proficient readers of English know that "right"
and "rite" sound the same when spoken, but their different
meanings in any kind of context would make sounding the word out
unnecessary. Reading specialists are increasingly aware of humankind's
ability to read.)
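The notion of a uniqueness point can be illustrated with a toy lexicon. The function and word list below are hypothetical, and real spoken-word recognition operates over sounds rather than letters, but the left-to-right scanning logic is the same.

```python
# A minimal sketch of the "uniqueness point": the position at which a
# word, scanned left to right, diverges from every other word in a
# (toy, illustrative) lexicon.

def uniqueness_point(word, lexicon):
    """Return the 1-based position at which `word` stops sharing a
    prefix with any other entry; returns len(word) if it remains
    ambiguous to the end (as 'connect' does next to 'connecting')."""
    others = [w for w in lexicon if w != word]
    for i in range(1, len(word) + 1):
        prefix = word[:i]
        if not any(o.startswith(prefix) for o in others):
            return i
    return len(word)

lexicon = ["connect", "connecting", "connection", "collect", "candle"]
print(uniqueness_point("candle", lexicon))      # diverges early, at "ca"
print(uniqueness_point("connection", lexicon))  # diverges late
```

The later the divergence, the more of the word the hearer must take in before recognition, which is why middles and ends of words matter in certain cases.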
Linking forms and meanings: the perfect fit of language and mind
To give a clearer picture of just how the basic building blocks
of language seem to optimize the ways that the whole system works,
the number of sounds (the amount of phonetic information) needed
for efficient identification and retrieval of word forms will
depend on the number of possible meanings to be retrieved. Therefore,
the larger the number of possible meanings that need to be organized
and cataloged in our mental lexicons (in principle, unlimited
with respect to nouns, for example), the greater the amount of
detailed information we need to have for every word in order to
identify it. In other words, a form or label (i.e., a word)
needs to have enough phonetic information associated with it to
have the uniqueness required to pull up a corresponding and equally
unique meaning.
This is very much like the requirements of telephone numbers for
individuals and license plates for cars. The mathematical possibilities
represented by the symbols themselves stand in direct proportion
to the number of items that need to be identified. Seven digits
(555-1234) can successfully identify ten million individuals. Add to that three-digit area codes, and you multiply the possibilities by another thousand (ten billion individual telephones). (Add another digit anywhere in the sequence, and you multiply the possibilities by ten.) The more people that own telephones or cars, the more
information that needs to go into a telephone number or on a license
plate.
This also suggests that an efficient system will use labels that
possess optimal amounts of phonetic information for the
purpose of economy-not too little and not too much (Field 2002).
On the one hand, a system composed of forms with too little phonetic
substance (too few symbols) would produce a large number of words
that sound the same (homophones), or that sound so similar that the hearer would have trouble distinguishing them to recover the (one and only) intended meaning. (A lot of people would have
to share the same phone number.) On the other hand, too much information
would make retrieval inefficient, too. Too many unwarranted digits
regarding telephone numbers would make the job of remembering
someone's number (linking a particular number with the appropriate
person) unnecessarily tedious. If I had, say, a telephone number
with 35 digits, I would have a tough time remembering it.
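The trade-off in the telephone-number analogy can be made concrete with its own arithmetic. The helper name `capacity` is invented for illustration; the digit counts are those used in the text.

```python
# Label capacity: with d digit positions and ten symbols per position,
# 10**d distinct labels are possible.

def capacity(digits, symbols=10):
    """Number of distinct labels of a given length."""
    return symbols ** digits

print(f"{capacity(7):,}")   # 10,000,000 -- a seven-digit local number
print(f"{capacity(10):,}")  # 10,000,000,000 -- with a three-digit area code

# Too few digits means many subscribers share a number (homophony);
# too many makes labels needlessly hard to remember:
print(f"{capacity(35):.0e}")
```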
Sequences of sounds (amalgams of individual linguistic sounds),
sometimes referred to as phonological addresses, possess
a number of characteristics typical of telephone numbers, for
instance in the form of patterns or sequential constraints (only
certain sounds can occur in sequence in a particular language),
that act like prefixes or area codes for ease of recognition.
(Word-forming affixes such as the prefixes and suffixes of English
act in similar ways to enhance the optimal running of the system.)
In effect, systematic constraints allow a system to work at peak
efficiency--and this certainly smacks of design and not
just the results of randomly occurring processes of mutation.
If the system was invented by humankind, we are very clever, indeed.
But this begs the question: was there a first human who stumbled
on this amazing ability, or did it arise as a collective effort,
with a purpose, among a society of humans? Regardless, if a human
invented this ability, then it is presupposed that the capacity
for language was already there in his/her head, and we are once
again back at the chicken-and-egg situation.
Looking further into the retrieval process and how organized it
is, consider the amount of phonetic information we have to deal
with on a moment-by-moment basis with just the language I'm using
now. Concerning the amount of detailed phonetic information available
for the construction of words, English has between 6,500 and 7,000
syllables (Levelt 1992:17). (In a very non-technical sense, syllables
are rhythmic units or beats in a word that consist of combinations
of consonant and vowel sounds peculiar to the specific language.
The word syllable, for example, has three syllables.) Taking
the lower figure (for the purpose of discussion), as many as 6,500
unique meanings can be linked to individual, unique forms or syllables,
with no homonyms and no synonyms (words with the same or similar
meanings, e.g., couch, sofa, davenport, divan,
and chesterfield).
The number of possible two-syllable words soars well into the millions (6,500 times 6,500, or 42,250,000 to be exact), which can be added to the original number of one-syllable possibilities for a total of 42,256,500. The number of possible three-syllable word forms is larger still: 6,500 cubed, or 274,625,000,000, bringing the cumulative total to 274,667,256,500. The word improbable has four syllables. This amounts to a starting point of over 270 billion possible phonological addresses (three-syllable words) for an ever-expanding list of entities that need to be named.
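These totals can be verified directly; the short check below simply recomputes them from the 6,500-syllable estimate cited from Levelt.

```python
# Verifying the syllable combinatorics: with ~6,500 distinct syllables,
# how many unique word forms of one, two, and three syllables exist?

syllables = 6_500

one = syllables
two = syllables ** 2
three = syllables ** 3

print(f"{two:,}")                # 42,250,000 two-syllable forms
print(f"{one + two:,}")          # 42,256,500 forms up to two syllables
print(f"{one + two + three:,}")  # 274,667,256,500 forms up to three
```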
It may not be obvious to the non-specialist, but these bare possibilities
are multiplied by three additional characteristics of pronunciation.
One (1) is called STRESS, how loudly a particular syllable is
pronounced (stress goes on the nucleus of the syllable, typically
a vowel). For example, in Spanish, one two-syllable combination,
pa-pa, can be pronounced normally with stress on the first
vowel "a" to mean either el papa, "the Pope,"
or la papa, "potato." Putting the stress on the
second vowel changes the meaning of the word. That is, papá
means "father." Second, vowels can be contrasted also
by (2) PITCH (the melody or music that accompanies vowels, how
high or low they are on a musical scale), for example in languages
like Mandarin. One basic syllable, for example ma, can
refer to such concepts as "mother," "hemp",
or "horse," depending on the tonal qualities that are
given to the vowel. In Vietnamese, ma with additional tones
specific to that language can also mean "ghost," "young
rice plant," and "grave," in addition to "mother"
and "horse." The word for "mother," má,
with a rising tone, can also mean "cheek." So, each
form can have more than one meaning, as well. Third, in other
languages, e.g., the First Nation (indigenous) language of Mexico
called Mexicano (Náhuatl), vowels are contrasted by (3)
DURATION (how long a vowel is pronounced). For instance, the contrast
of long o: and short o produces two separate words,
to:ca "to bury" and toca "to follow,
pursue," respectively. With respect to stress and duration,
the number of possible syllables is doubled, and, with tone, the
number of syllables is multiplied by the number of tones a language
possesses. In Mandarin there are four distinct tones, Thai has
five, and Vietnamese has six.
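The multiplying effect of these suprasegmental features can be sketched as follows. The function and the 6,500-syllable base are illustrative assumptions, not claims about any particular language:

```python
# Illustrative sketch: stress and duration each double the number of
# contrastive syllables, while tone multiplies it by the number of
# tones. The base inventory of 6,500 is the hypothetical figure above.
def contrastive_forms(base, stress=False, duration=False, tones=1):
    total = base * tones
    if stress:
        total *= 2
    if duration:
        total *= 2
    return total

print(contrastive_forms(6_500, tones=4))      # Mandarin-like: 26,000
print(contrastive_forms(6_500, tones=6))      # Vietnamese-like: 39,000
print(contrastive_forms(6_500, stress=True))  # stress contrast: 13,000
```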
Needless to say, a language that possesses such raw potential
in the formation of words has the capacity to refer to an enormous
number of concepts with no resultant combinations (labels) that
sound alike to cause any possible confusion. If we add prefixes
and suffixes and other word-formation strategies such as compounding
(two words joined together to create a third, e.g., hotdog
in English) and so on, which are all restricted by language-specific
word-formation processes or rules, the potential is clearly inexhaustible
in principle. This is exactly the requirement that needs to
be met: the possibilities are infinite, and the language faculty
puts them to use with remarkable speed and efficiency (as far as
we can tell), a far cry from the so-called language of Nim Chimpsky,
discussed in the next section. This is all the more amazing when we
consider the incredible diversity of the world's languages and
the uniformity of the language faculty, and the basic fact that
there seems to be no limit to the number of languages one human
being can know, that is, outside the limitations of time and opportunity.
3. Language evolution
Over the years, attempts to link language and evolution have
encountered a number of challenges. From the historical perspective,
specialists have estimated that one very important language, Proto-Indo-European
(PIE), a theoretical reconstruction of what may have been the
original ancestor or progenitor of the Indo-European family of
languages, was spoken about 4,500-6,500 years ago, not a terribly
long time into the past. The fact that modern languages seem to
have devolved from their relatively ancient predecessors
has caused considerable consternation among those linguistic pioneers
who earnestly sought to establish a logical progression in their
development, from simple to increasingly complex forms and structures.
Referring to the proposed links between language and species that
are woven into evolutionary theories, Bickerton (1990:260) comments: "the
only aspect of Darwin's ideas on language that interested Darwin's
contemporaries was his comparison of the evolution of species
with the evolution of languages-a connection between 'language
and species' that will emphatically not be pursued in these pages!"
There is the rather obvious problem in linking the two, one that
leads into the kinds of social Darwinism that characterized
the racist beliefs of the Nazis in Germany during the early to
middle parts of the twentieth century.
Seemingly ignoring the speculative nature of this line of inquiry,
some linguists have proposed, according to the most conservative
estimates, that the origin of human language took place approximately
35,000 years before the present (or BP), corresponding to the
time that Homo sapiens is believed to have begun its great migrations
into various regions of the globe. The consensus, however, appears
to be more at 100,000 years before the present, when Homo sapiens
allegedly first appeared, with the time frame espoused by some
anthropologists reaching from one to five or six million years.
These estimates are made despite the fact that the earliest written
records, the writings of the Sumerians, date back only to around
3200 B.C.E., obviously very recent history (cf. Hock & Joseph
1996:71). One can't help but wonder what could have occurred between
5,000 and 5,500 years ago that would trigger such a significant
development. A number of very tough questions arise as a consequence: Are
the languages spoken today which have no writing systems (there
are many) less developed? Are the brains of their speakers less
evolved? The answer is an obvious and emphatic "No!"
If one does assume that all human beings are fundamentally "equal"
in development, what can account for this amazing consistency, the
fact that no one race is more (or less) evolved than any other?
Ape language studies
There have been a number of interesting attempts at establishing
links between the language faculty and evolutionary principles.
Looking for homologues with respect to the structures of basic
articulators (the parts of the human body apart from the brain)
has proved to be somewhat troublesome (e.g., regarding the positioning
of the larynx). Examining communicative behaviors has yielded
similarly meager results; no continuity has been found at all
between ape communication and human language. It is true that
certain parrots can mimic human speech sounds, but no one
would call that "language." Looking to our closest genetic
relatives (those with the closest resemblances in DNA), the great
apes became the likely candidates for the investigation of the
potential for language. Once again, the results have not
been promising.
Various studies have been conducted, and many have even appeared
on public television, although their numbers have dropped off
significantly in recent years (perhaps to zero). These studies-the
most famous of which were done with a gorilla, Koko, and two different
chimpanzees, Washoe and Nim Chimpsky-focused on attempts to teach
apes some sort of sign language. The reasoning behind these attempts
was twofold. (1) Chimp DNA is as close to human DNA as it apparently
gets, and if the language capacity is encoded into our DNA, maybe
it is there in the chimp's in some form or degree (attempting
to resolve the issue of a complete lack of continuity of communicative
behaviors from ape to human-or from human to ape, for that matter).
(2) The great apes may not be able to vocalize language,
but they seem to demonstrate the manual dexterity necessary for
signing. So, maybe they have some linguistic ability even though
they don't have the ability to speak.
The first attempts were intriguing and caught the eye of the academic
community. There is no question that these animals are intelligent,
and that they are able to communicate in some ways among themselves
in the wild and across species to humans. The disappointment lay
in the realization that what they are capable of learning
is not language in the human sense. Taking the well-known
example of Nim Chimpsky, what was revealed by the chimp's behavior
illustrates both the quantitative and qualitative aspects of language.
Raised from birth by human caregivers, Nim learned about 125 signs-which
is a very real accomplishment for any animal. But, his signing
fell short in more than a quantitative measure. Qualitatively,
only 12% of his utterances were spontaneous; the rest occurred
in response to prompting or when he wanted something (food, drink,
or attention). As much as 40% of his utterances were simple imitations
of the signs of his trainer, more an indication of conditioned
behavior than evidence of a biological endowment. In contrast,
all human children initiate conversations and are creative
with language almost from the beginning, the onset of language,
all characteristics that Nim failed to exhibit.
According to Seidenberg (1986), a harsh critic of past ape language
research, "At first glance, studies of ape language seem
to be premised on denial of the obvious. Humans acquire and use
natural languages, and lower primates do not. This irrefutable
fact would seem to constrain a priori what might be learned by
training apes to perform 'linguistically'" (30). It was hoped
that these projects would provide information about the behavior
of other intelligent species, contrasts between humans and other
species, the origins and evolution of human language, and relationships
between linguistic and nonlinguistic intelligence.
Evolution provides no basis on which to anticipate particular behavioral similarities, in terms of language or otherwise. Evolution is a theory of speciation, not of behavioral continuity. As a consequence, comparisons of behavior need to be interpreted in the context of a theory of behavioral similarity, not merely in terms of evolution. This point has not been sufficiently appreciated in the ape language literature. The problem is that general evolutionary facts are sometimes used in order to establish behavioral similarities. The apes exhibit complex behaviors that are ambiguous at best. The interpretation of these behaviors is assisted by appeals to evolution, leading to the conclusion that an ape's behavior corresponds to that of a human because apes and humans descended from a common ancestor. However, this reasoning is entirely circular. In absence of an explicit theory as to how particular behaviors evolved, evolutionary facts such as common ancestry provide no basis on which to mediate comparisons of behavior (Seidenberg 1986:30-1).
In other words, evolution applies to physical characteristics
and not to behavioral characteristics. And, even though the apes
demonstrated complex behaviors, making an appeal to evolution
to explain apparent similarities is not adequate. Ape behaviors
could be interpreted in a number of different ways, and apparently,
none of the explanations proved conclusively that the apes showed
true linguistic abilities.
Seidenberg (1986) also points out that chimps may be thought to
be less highly developed than humans; however, if we take the
position that chimpanzees and humans have evolved from a common
ancestor, they are equally evolved along the evolutionary
continuum, "...they simply evolved in a different manner"
(32). All attempts to infer anything from the behavior of chimpanzees
to that of a common ancestor that may have existed millions of
years ago are speculative at best. Nevertheless, the research
was looked at as possibly contributing to the knowledge of our
species and to our origins.
The researchers generally had a couple of strategies for finding
the information they wanted in their study of apes. The first
(which is still taking place in current studies) was to identify
the natural communicative behaviors of lower primates that may
share important properties with human language. Despite the fact
that the natural behaviors of apes showed interesting characteristics,
it was not very likely that lower primates would be shown to exhibit
a natural system of communication that would resemble human language
to any significant degree.
The second strategy was to train apes--and here, the operative
word is "train." Chomskyans have argued convincingly that
children do not learn language in any kind of behaviorist sense
(i.e., according to the views of B. F. Skinner and others). Therefore,
in the complete absence of the apes learning/acquiring linguistic
skills/language in the wild, it was clear that the apes needed
a little help. Maybe they had LINGUISTIC CAPACITIES (as a result
of the similarities of DNA structure, etc.), but they just never
had the opportunity to acquire LINGUISTIC SKILLS--it isn't their
fault that language isn't in their environment. Maybe their full
capacities cannot be developed in a jungle. Also, if it could
be proved that apes could learn language, this would have
the effect of refuting Chomsky's claims that language is acquired
naturally through observation, or that it is species specific
and a biological endowment. This would also reinforce the behaviorist
views of Skinner and others that prevailed in language teaching
before the advent of Chomskyan linguistics. "Construed in
this manner, the ape language experiments could only provide a
test of how much linguistic behavior could be acquired through
application of the precepts of Verbal Behavior [the well-known
book by Skinner]," according to strict behaviorist principles
and not what an ape is capable of doing in a natural environment
(Seidenberg 1986:33).
Consequently, child language acquisition studies and the studies
of ape behaviors have very different goals, backgrounds, and contexts.
What every human child does naturally, unconsciously, and
without prompting was being compared to what primates might
be able to accomplish with intensive training. In a kind of
wishful thinking, the hope seemed to be that apes might have the
capacity to produce or comprehend at least some kind of linguistic
communication, but this capacity remains unexpressed in a natural
environment. Perhaps laboratory conditions can provide a way of
realizing this capacity, and maybe part of the problem is that
apes only lack the articulatory mechanisms to produce speech.
This logic is inconsistent, however. Apes lack part of the neuro and motor-physiology that support speech. The sign language researchers proposed to overcome this limitation by exploiting the apes' natural ability to gesture. This effort would only succeed if they were capable of using the alternate modality. But if apes possess this capacity, the explanation for the fact that they fail to naturally express their linguistic capacity is wholly lost (Seidenberg 1986:33).
The ape studies would never be able to explain why apes don't
express any sort of linguistic abilities on their own. And, they
would only show that apes had linguistic abilities that had somehow
gone unexpressed, that is, if they succeeded in using a manual system in
ways that approximated human language--which they completely failed
to do.
It appears that apes (in fact, all other species) lack the same
capacity for computation or grammar that every single human being
expresses. For instance, even in imitative behavior, the clever
apes simply cannot copy a series of behaviors beyond a
very limited set. They don't seem able to store complex patterns
of behavior in their ape brains. If specific behaviors are presented
A + B, then apes can replicate ABAB patterns. However, if the pattern
is AABBAA, then imitation stops. Apes are not capable of AABBAABB
(and so on) patterns, which require perceiving, remembering, and
then performing more complex sequences (Fitch 2003). They simply
lack the cognitive capacity for syntax, something that every child
exhibits during the second or third year. Moreover, if they were
indeed able to use different modes of linguistic communication
(manually instead of vocally), as non-hearing children do, then
what can inhibit this expression besides the total absence of
linguistic ability? "It is interesting to note in
this regard that the natural communication of lower primates is
not primarily gestural. On their own, they seem to make little
of their opportunity to use their hands for communication"
(Seidenberg 1986:33). They may exhibit genetically-based calls
and screeches, but they don't show the capacity for human language.
What the apes seemed to learn
It seems simple enough to see what the apes did not learn,
but what did they truly learn? They learned tasks that they could
perform (the instrumental function of signing) in a lab context.
These tasks were in essence non-linguistic in nature because the
behaviors (manual gestures) had consequences; the apes received
rewards for their signing, illustrating the typical results of
operant, conditioned behavior, not language acquisition. Despite
their successes, they did not learn the symbolic function of language.
Based on the empirical evidence, it simply is not possible to
determine if the sign for banana, for example, represented
the linguistic concept of "banana" the way it does for
human children (based on perceptual properties of the class of
objects known as bananas), or if the manual sign was merely a
non-linguistic task that the apes had to perform in order
to get one.
The apes also seemed to learn how to imitate the teacher's input.
They could form many signs, but they could not string them together
to make contrasts. With Nim, his MLU (mean length of utterance,
or the length of his "sentences") was essentially flat,
restricted to one sign with a very small number of two-sign combinations.
Perhaps most importantly, the ordering of the elements
of ape signing (or their syntax) was of no effect and, therefore,
meaningless. To illustrate how significant this is, putting words
in a different order in English changes the meanings of sentences
(e.g., "John hit Larry" versus "Larry hit John").
Nim's utterances qualified only as a kind of word salad. To further
confuse the issue, he would sometimes guess what an appropriate
response would be to a question by rapidly making clusters of
signs (in random order). In place of language-like syntax, there
was a great deal of repetition of a very small number of signs,
typically, me, you, and names of participants and
food objects and actions such as "eat." He apparently
succeeded only in learning how to get the things he wanted.
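Nim's flat MLU can be made concrete with a toy calculation. The sample utterances below are invented for illustration and are not drawn from the actual Nim corpus:

```python
# Toy MLU (mean length of utterance) calculation: total signs divided
# by number of utterances. All sample data here are invented.
def mlu(utterances):
    return sum(len(u) for u in utterances) / len(utterances)

nim_like = [["banana"], ["me", "eat"], ["eat"], ["Nim"]]
child_like = [["want", "more", "juice"], ["mommy", "go", "car", "now"]]

print(mlu(nim_like))    # 1.25 -- essentially flat
print(mlu(child_like))  # 3.5  -- and a child's MLU keeps growing
```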
Another important aspect of the apes' gesturing behaviors was
their complete lack of creativity and spontaneity, and their passive
receptivity to the trainers' prodding. Trainers could mold their
hands to approximate an appropriate sign to accomplish a specific
task, but this simply cannot be done with children. Remember,
deaf children naturally babble with their hands in the absence
of aural feedback to their oral babbling.
Perhaps the most serious blow to the entire line of research was
that the apes' signs were interpreted by their trainers. The researchers
would interpret the meanings of the apes' gestures as correct
when they were ambiguous at best, so it seemed that close enough
counted. From the academic perspective, this is a very serious
charge. It amounts to discrediting the validity of the research.
Ape signing, when shown to users of American Sign Language (ASL)
was not understood as ASL, despite the claims that the apes were
in fact learning ASL. Seidenberg theorizes that the television
documentaries of Koko and Washoe gained wide acceptance because
the utterances shown were selected in advance for filming and
represented the apes' best efforts. Their performances did not
reflect what they did on a regular basis. With videos of Washoe,
the signing was also accompanied by running commentaries by the
researcher of what the ape was "saying," including intonation
and emphasis that increased the impression that the chimp was
indeed "talking."
4. Implications of evolutionary thinking
Currently, a number of scholars express concern about Darwinian principles applied to language, perhaps not so critical of evolutionary principles per se as of what may be termed neo-Darwinian linguistics. In a biting Discussion Note in a recent issue of Language, the official journal of the Linguistic Society of America (LSA), Michel DeGraff, an MIT syntactician, discusses the "traditional" view of creole languages in Darwinian terms, based on a perceived simplicity of structure (DeGraff 2003).
In a nutshell, the literature on pidgin and creole languages has
changed a great deal in recent years. One traditional approach
is based on the theoretical position that creoles develop from
pidgin varieties, and that they are greatly reduced (read: inferior)
versions of their parent languages, in many cases, European colonial
languages such as French, English, Portuguese, and so on. Current
research has challenged such claims based on empirical evidence
(e.g., no Caribbean creole has an attested pidgin language in
its ancestry). Most current approaches begin with the idea that
second/subsequent language acquisition (SLA)--under very specific
and unfortunate circumstances--was the key process involved but
only up to a point, after which speakers expanded the linguistic
range of the so-called "creole" with different sorts
of innovations, that is, grammatical and lexical (vocabulary)
expansion by means of the mechanisms of language creation (change
and evolution). Original native languages, of course, played a
large role, as well, via transfer of prior linguistic knowledge,
in particular, through the unconscious application of native pronunciation
patterns, and prior grammatical and semantic knowledge of their
native tongues.
It should be noted, too, that there is no basis whatsoever for
the assumption that the Africans who had been forcibly kidnapped
and brought to the so-called New World were monolinguals,
speakers of only one African language. There is a great likelihood
that they were speakers of several languages. Most of the very
negative views of the slaves' attempts at speaking the so-called
master's language were the results of Europeans' (white,
native speakers of the European languages) perceptions of the
speech of the Africans. To the prejudiced ears of these native
speakers, the slaves sounded crude and uneducated, and their speech
was highly accented, which is precisely the expected outcome of
anyone being forced to learn any language under the horrific conditions
placed upon the slave population. In the view of many of the Europeans,
Africans did not appear to be capable of learning the more
civilized European languages, a view which fit conveniently with
current racist beliefs.
According to DeGraff, a native speaker of Haitian (Creole French),
there has been an "imperialist construction of political,
cultural, and racial hegemony, making it impossible to view Caribbean
Creole languages as being on a genealogical or structural par"
with their European counterparts. He goes on to argue convincingly
that Haitian is as close to European French as French is to Latin
and that Jamaican Creole English (the "patwa") is as
close to English as English is to Proto-Germanic. He argues that
there was no break in transmission, as claimed by several linguists
(see, e.g., Thomason & Kaufman 1988, Thomason 2000), that
the European languages were learned in some (non-native) fashion.
DeGraff also states that any objective view of the syntax of Haitian
reveals its inherent complexity, rivaling that of any so-called
"normal" language. He does not discount the role of
the "mother tongue" in SLA, and that the Africans quite
likely transferred aspects of their native languages to the non-native
learning of French, but he fiercely resists the idea that the
African's native language(s) were somehow inferior and the speakers
themselves somehow linguistically or cognitively inferior by implication.
In this last point, DeGraff is joined by the vast majority (I
would hope, all) of the community of linguists.
Another example of recent criticism of a Darwinian perspective
in linguistics comes from Frederick Newmeyer, past president of
the LSA. In a review article, Newmeyer (2003) reviews three books,
one of which is Language in a Darwinian Perspective (henceforward
LDP), by Bernard H. Bichakjian. Newmeyer's candid evaluation includes
the judgment that LDP is "bizarre," and that it applies
"defective argumentation in an attempt to explain non-existent
'facts'." He does refer to other evolutionary thinkers in
positive ways, but his criticism of this specific book seems focused
on the apparent lack of empirical evidence, the assumption that
languages "advance" and become more functional (a term
that Bichakjian does not carefully define, according to Newmeyer),
and the claim that there is a directionality to change along
evolutionary lines (from simple to complex), something that has
been argued against for over a century.
Neither DeGraff nor Newmeyer can be accused of arguing from a
Biblical perspective, yet both reject specific implications of
Darwinism in some way. To DeGraff, it is the implication that
Africans and their languages are somehow inferior to their European
counterparts, and to Newmeyer, it is the long-abandoned idea of
a unidirectional progression to the evolution of particular languages.
DeGraff cites the French linguist, Lucien Adam (an ironic twist),
who classed languages into "natural" (languages spoken
in the wild by savages) and "civilized" languages,
those, for example, used in civilized, European cultures. No one
has to point out the cultural bias in such statements, but it
is worth noting that in the case of Creole languages, we are,
in fact, discussing spoken languages. We are not comparing
literatures that took centuries to develop with oral traditions
that may have emerged within a single generation. In the case
of the evolution of European languages, the empirical evidence
simply does not sustain any kind of belief in any sort of evolutionary
perspective.
To illustrate the inherent dangers in making strict correlations
between language change and the hypothetical evolution of species,
the following quotation comes from a two-volume set entitled Time
Depth in Historical Linguistics, published by the McDonald
Institute for Archaeological Research in Cambridge, England. It
was written by a husband and wife team (she a linguist, he a geneticist).
Rates of change of the sort outlined above are typically predicated on known dates of species split, which themselves are calculated with reference to external, fossil evidence. However, once such rates are known they can be used to hypothesize split dates for lineages where no fossil evidence is available, such as human-chimp divergence. If these splits took place in the very distant past, the method is likely to work well, and the dates achieved should be robust and credible. But when we are dealing with relatively recent speciation events, as is probably the case for humans and chimps, errors reflecting the stochastic operation of mutation and substitution may be significant. Nei (1985) shows that measures of alpha hemoglobin differences between humans, chimpanzees and gorillas generate a preferred tree grouping humans most closely with chimps, but that the standard errors involved are greater than the time-depths proposed to the branching points. In consequence, it is not possible to state securely which of the three species separated from the others first. Inevitably, the significance of the standard errors will be even greater when dealing with divergence among human populations, when the time since common ancestry is far less again. It is possible to minimize these errors by combining values for a large range of proteins, rather than focusing on one alone, but the confidence intervals will still be large: Nei (1985, 42) calculates time of separation for Caucasoid and Negroid populations, based on 85 different loci, as 113,000 years plus or minus 34,000 [equaling a range of 79,000-147,000 years]. Alternatively, it may be preferable over shorter time depths to base calculations on a system which mutates at a faster rate, which is true of mitochondrial DNA (Cann et al. 1987), or mini- and micro-satellites (Armour et al. 1996; Bowcock et al. 1994; Jorde et al. 1997).
Since mitochondrial DNA mutates at an overall rate of 2-4 per cent per million years, approximately 20 times faster than autosomal DNA (Cann et al. 1987), it follows that standard errors over shorter time depths will be greatly reduced, although they will not be negligible. Even based on over 1000 sequence comparisons, the date of the human-chimp split can only be estimated at 4.7 million years with the confidence limit of 0.5 million (Gagneux et al. 1999). (McMahon & McMahon 2000:66-7)
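The point about standard errors can be illustrated with the quoted figures; this is simply interval arithmetic on the numbers McMahon & McMahon cite, not new data:

```python
# Interval arithmetic on the quoted estimates: a point estimate plus
# or minus its error term yields a range of dates, not a single date.
def interval(estimate, error):
    return (estimate - error, estimate + error)

# Nei (1985): 113,000 +/- 34,000 years (as quoted above)
print(interval(113_000, 34_000))      # (79000, 147000)
# Gagneux et al. (1999): 4.7 million +/- 0.5 million years
print(interval(4_700_000, 500_000))   # (4200000, 5200000)
```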
It must be immediately stressed that neither of these authors
intend to make racist or potentially hateful comments, but the
implications seem quite clear. To state that the one race split
off from another assumes a common ancestor. It is explicit that
chimpanzees and humans had a common ancestor, an issue not questioned
in evolutionary circles. Nevertheless, the point is made that
the two races of human beings sharing a common ancestor with chimps
and gorillas now also differ in evolutionary terms. Could this
be linked to concepts of intellectual and linguistic inferiority
that have been explicitly made by linguists who assume Darwinian,
evolutionary thinking? It is a good and necessary thing that people
like DeGraff and Newmeyer stand up to argue against such commonly
held misconceptions. One is left to wonder, nonetheless, what
is offered in contrast that could effectively and permanently
cure the misconceptions. It still seems reasonable to assume that,
if our species has evolved from a common ancestor, then
it will continue to evolve. And, if we allow for a common
ancestor among the various species of primates, then one can logically
assume that the species called homo sapiens may branch further
into competing species. The implications of this are staggering.
Unless we are prepared to argue that all chimps, gorillas, and
humans are either the same or equal in their development and
in their cognitive abilities, it seems reasonable to conclude
that members of the Negroid and Caucasoid "populations"
will show cognitive differences (dissimilarities), as well. This
certainly seems to imply a hierarchy of some kind.
When it comes to racism, both sides need to be reminded of the
prominent role racism and slavery have played in Western Civilization
and in the history of the so-called New World. We should all be
alarmed by anyone who purportedly holds to any kind of philosophy
or worldview that attempts to rationalize this evil. To be honest,
both sides of the design-evolution debate appear to be guilty.
In addition to racist scientists (and linguists), there are those
who have distorted the content and intent of the Bible to legitimize
racist beliefs, e.g., in the American South, past and present.
Forced conversion to Christianity was even seen as a justification
for slavery, that the poor, "childlike" Africans would
never have heard the Gospel unless Europeans had snatched them
away and made them slaves. It was claimed that, in slavery, the
kidnapped millions would possess "true freedom," and
that it was the Africans, not the Portuguese and other slave traders
who received the greatest benefit of slavery (Raboteau 1978:96).
While the institution of slavery continued for centuries in the
Americas, it was (and is) an unquestioned evil. I'm sure that
this view is shared by the vast majority of adherents to either
side of the evolution-design debate. No amount of zeal in "spreading
the Gospel" can possibly justify the kidnapping of some seven
million people, as many as one-fourth of whom died in transit
as a result of the brutal conditions. No appeal to science can
possibly justify the human experimentation (e.g., the twin experiments)
of Mengele and his ilk, perpetrated by Nazi Germany. A calloused
indifference to human life can in no way be derived from even
a cursory reading of the Bible, nor can the belief in the superiority
of one race over another. Nevertheless, certain logical conclusions
of evolutionary thinking need to be continually evaluated. If
one is philosophically opposed to the racist practices of certain
religious sects or groups, one must be equally opposed to similar
beliefs held by state-sanctioned and supported groups, irrespective
of their prestige or legal status.
5. Discussion
There is a great gulf between human language and animal communication.
As a result, one must wonder how it is that every member of humankind
has this capacity while none of the other ten million (or so)
species does. If chimps and humans are equally "evolved"
or developed (having a common ancestor, as evolutionary thinking
would have us believe), then why is it that no other primate species
also has it, at least in some form? All things being equal, the
premise for the Planet of the Apes series of films seems
quite reasonable assuming the accepted beliefs of evolutionary
thinking. Furthermore, applying the principle of "equal development"
to a wider sample, one must ask why classes of reptilians, that
have presumably been around longer than their mammal counterparts,
have not also developed some sort of language skills. This is
the stuff of science fiction, not true science. Apparently, we
are to assume that the mathematical probabilities are 1 in 10,000,000
that any species will develop language, and humans are the winners.
We just happen to have the optimal biological makeup, and inhabit
the optimal planet with the optimal environment in the optimal
solar system in the optimal galaxy.
The knowledge of language sets us apart from all other species
known to us so far (or at least all those we can agree that we know of). It is
part of our unique heritage, our biological endowment. Its significance
may be difficult to appreciate because so many aspects of it are
taken for granted. We are simply unaware of what we do when
we do language. From a Biblical perspective, it is an aspect
of our humanity that links us to the infinite. It allows us to
gain and express real knowledge, to learn of our respective environments
and ourselves, and ultimately to reach many things and ideas that
are beyond our individual capabilities. The obvious coherence
of language and the language faculty raises significant questions
about how random mutations could possibly account for such profound
and effective order.
The predictability of language places its acquisition and use,
in perception and performance, well within the realm of design: design
is a reasonable assumption. Human language and its users are
perfect fits, compatible in every respect, just as the ideal software
is systematically compatible with its hardware (wetware, in this
case). When a structure functions with such astounding speed and
efficiency as that intrinsic to language, the term intelligent
design certainly seems appropriate. From the Biblical viewpoint,
design is characteristic of the entire universe, and only in a
stable universe can an outcome be completely assured and anticipated.
Jet airplanes can fly because certain physical laws exist. If
you doubt this, then you are not very likely to accumulate a lot
of frequent-flyer miles. We can observe the principles of biology,
as well, and marvel at the complex design of the eye, the ways
the mind and body heal after injury, and so forth. We can witness
the evolution of viruses and other simple life forms as they adapt
to specific environments (the most adaptable survive). Their evolution
is orderly and gradual, not random and sudden. We can interpret
this as evidence of design that encompasses natural change. We
may not know enough to be able to predict particular improvements
or changes, but the fact of our ignorance does not preclude design.
It does not mean that results are random.
From the evolutionary view, the systematic nature of language
may be even more amazing, perhaps even startling, simply because
the language faculty and all human languages are believed to have
somehow evolved through impersonal processes of selection, as
a blind watchmaker makes choices with little understanding of
possible short- and long-term consequences. In pursuing evolutionary
principles, one must conclude that our own consciousness is merely
an evanescent phenomenon, illusory and fleeting at its base. If
that truly reflects ultimate reality, on what basis do we trust
our own logic? Logic presupposes rules, the existence and
power of Reason, and the orderly arrival at some form of truth,
whether that truth is based on observation (reached through empirical
hypotheses a posteriori) or on a mental construct that
is simply presupposed and reached by means of an existing consensus
a priori, which, in the present academic environment, seems
to characterize the belief in evolution. The alternative to a
rational universe is one that lacks sense. Where does that leave
human consciousness, thought, and cognition? My belief cannot
alter facts, only my interpretation of them. From the Biblical
perspective, God exists whether I, you, or any other human being
believes it or not. Ultimately, from the perspective of the evolutionist,
all beliefs are in essence illusions, including the belief in evolution.
One thing is fairly clear about the two opposing viewpoints: there
is little common ground upon which the agnostic can stand and little
room for straddling fences. The lines are drawn. Judging from
past discussions and the history of the debate, each side resists
the other, to the point of hurling its fair share of invective.
It may not be possible to be completely unbiased, but to take
a stand, there is an important caveat: to hold a position, one
must be open to its implications. It may not be a matter of choosing
the path of least resistance: the most likely, pleasant, or politically
correct alternative. Nor should it be the more pragmatic solution,
which is to purchase some sort of "fire insurance" and
become a religious person or "believer" to avoid the
eventual outcome of unbelief set down in Scripture. Much has been
said and written about the hope that accompanies belief and the
nihilism at the end of atheism. It has done little to settle the
seemingly endless controversies.
As far as faith is concerned, both stances require generous amounts.
Both may seem hard to believe at times. Authors from the evolutionary
side can probably afford to be light-hearted and even entertaining
in their presentations. They certainly are not driven to defend
the consensus of the scientific community. Its orthodoxy is merely
taken for granted within its stated domains. Nevertheless, faith
and faithfulness, in and of themselves, should never exclude exercising
critical judgment, inquiring after evidence (to prove or disprove),
asking questions for clarification, or even questioning the validity
and authority of conclusions drawn in accordance with particular
worldviews. This is extremely important when positions governing
how we are to interpret reality are set up as dogma that should
not be questioned, whether they are based on Scripture or on the
word of scientists. The potential for arrogance and misuse of
authority appears to be inherent to either side of the debate.
We all need to examine why we do or do not believe one position
or the other and use the cognitive abilities we possess to arrive
at a satisfactory decision, one that may tip the balance and allow
for real explanation. And, we certainly need the individual courage
to follow our views to their logical conclusions.
REFERENCES
Arbib, M. 2003. "From monkey-like action recognition to
human language: An evolutionary framework for neurolinguistics."
Paper presented to the departments of linguistics and psychology
at the Max Planck Institute for Evolutionary Anthropology, Leipzig,
Germany, 7 July, 2003.
Bellugi, U., P. Wang, and T. Jernigan. 1994. "Williams syndrome: An
unusual neuropsychological profile." In Atypical cognitive
deficits in developmental disorders: Implications for brain function,
S. Froman and J. Grafman (eds.), 23-56. Hillsdale, NJ: Lawrence
Erlbaum.
Bickerton, D. 1990. Language & Species. Chicago: University
of Chicago Press.
Bloomfield, L. 1933. Language. New York: Henry Holt.
Bynon, T. 1977. Historical Linguistics. New York: Cambridge
University Press.
Cahn, S. 1971. A New Introduction to Philosophy. New York: Harper
& Row, Publishers.
Churchland, P. M. 1989. A Neurocomputational Perspective: The
Nature of Mind and the Structure of Science. Cambridge, MA: The
MIT Press.
Churchland, P. S. 1986. Neurophilosophy: Toward a Unified Science
of the Mind/Brain. Cambridge, MA: The MIT Press.
Clark, E. 2003. First Language Acquisition. Cambridge,
U.K.: Cambridge University Press.
Comrie, B. 1989. Language Universals and Linguistic Typology.
Chicago: University of Chicago Press.
DeGraff, M. 2003. "Against creole exceptionalism." Language
79.2.391-410.
Dennett, D. 1991. Consciousness Explained. Boston: Little,
Brown and Company.
Edwards, J. 1994. Multilingualism. London, U.K.: Penguin
Books.
Field, F. 2002. Linguistic Borrowing in Bilingual Contexts.
Philadelphia and Amsterdam: John Benjamins Publishing Company.
Field, F. 2004. "Second language acquisition in creole genesis: The
role of processability." In Creoles, Contact and Language
Change: Linguistics and Social Implications, G. Escure &
A. Schwegler (eds.). Amsterdam: John Benjamins.
Fitch, T. 2003. "The evolution of spoken language: A comparative
approach (The descent of the larynx)." Paper presented at
the Institute Seminar of the Max Planck Institute for Evolutionary
Anthropology, Leipzig, Germany, 6 June, 2003.
Fromkin, V., and R. Rodman. 1998. An Introduction to Language
[sixth edition]. Orlando, FL: Harcourt Brace College Publishers.
Garnham, A., R. Schillcock, G. Brown, A. Mill, and A. Cutler. 1982.
"Slips of the tongue in the London-Lund corpus of spontaneous
conversations." In Slips of the tongue and language production,
A. Cutler (ed.), 351-273. Berlin: Mouton.
Gee, J. P. The Social Mind: Language, Ideology, and Social Practice.
New York: Bergin and Garvey.
Greenberg, J. H. 1974. Language Typology: A Historical and Analytic
Overview. The Hague: Mouton.
Hawkins, J. 1994. A Performance Theory of Order and Constituency.
Cambridge, U.K.: Cambridge University Press.
Hock, H. & B. Joseph. 1996. Language History, Language Change,
and Language Relationship. New York: Mouton de Gruyter.
Hoff-Ginsberg, E. 1997. Language Development. Pacific Grove,
CA: Brooks/Cole Publishing Company.
Holm, J. 1988. Pidgins and Creoles, Volume 1: Theory and Structure.
Cambridge: Cambridge University Press.
Karmiloff-Smith, A. 1992. Beyond Modularity: A Developmental
Perspective on Cognitive Science. Cambridge, MA: The MIT Press.
Lakoff, G. 1987. Women, Fire, and Dangerous Things: What Categories
Reveal about the Mind. Chicago and London: University of Chicago
Press.
Levelt, W. J. M. (ed.). 1992a. Speaking: From Intention to Articulation.
Cambridge, MA: The MIT Press.
Levelt, W. J. M. 1992b. "Accessing words in speech production: Stages,
processes and representations." In Speaking: From Intention to
Articulation, W. J. M. Levelt (ed.), 1-22. Cambridge, MA: The
MIT Press.
McMahon, A. & R. McMahon. 2000. "Problems of dating and
time depth in linguistics and biology." In Time Depth
in Historical Linguistics, Volume 2, C. Renfrew, A. McMahon,
and L. Trask (eds.). Cambridge, U.K.: Oxbow Books.
Newmeyer, F. 2003. Review article: Chomsky, On Nature and Language;
Anderson & Lightfoot, The Language Organ: Linguistics as
Cognitive Physiology; Bichakjian, Language in a Darwinian
Perspective. Language 79.3.583-599.
Pienemann, M. 1998. Language Processing and Second Language
Development: Processability Theory. Amsterdam: John Benjamins.
Pienemann, M. 2000. "Psycholinguistic mechanisms in the development
of English as a second language." In Language Use, Language
Acquisition, and Language History: (Mostly) Empirical Studies in
Honour of Rüdiger Zimmerman, I. Plag & K. P. Schneider
(eds.), 99-118. Trier: Wissenschaftlicher Verlag Trier.
Pinker, S. 1994. The Language Instinct. New York: Harper
Collins Publishers.
Raboteau, A. 1978. Slave Religion. New York: Oxford University
Press.
Seidenberg, M. 1986. "Evidence from great apes concerning
the biological bases of language." In Language learning and
concept acquisition. Norwood, NJ: Ablex.
Sylwester, R. 1995. A Celebration of Neurons: An Educator's
Guide to the Human Brain. Alexandria, VA: ASCD.
Thomason, S. & T. Kaufman. 1988. Language Contact, Creolization,
and Genetic Linguistics. Berkeley, CA: University of California
Press.
Thomason, S. G. 2001. Language Contact: An Introduction.
Washington, D.C.: Georgetown University Press.
Posted 2/17/04