How does language work? What are the smallest meaningful language units? How does the brain interpret speech?
In The Language Instinct, Steven Pinker aims to inspire readers to appreciate the unique qualities of human language. He explains how language works by describing the infinite combinatorial system, the mechanics of speech, and the brain processes involved in making sense of speech.
Keep reading to learn how language works.
How Language Works
How does language work? To answer that, we’ll begin by describing the elements of human language that make it both complex and precise. Pinker writes that, to understand the basics of human language, we must first understand its infinite combinatorial system—the characteristic that lets us transform a finite number of sounds into infinite sentences based on a set of grammar rules called syntax. We’ll also explain the physical mechanics of speaking and the neurological processing that enables people to interpret speech.
Infinite Combinatorial System
Pinker writes that the infinite combinatorial system in human language is an underappreciated design element: It’s ingenious because it enables endless creative expressions based on a relatively small set of basic units. (Shortform note: Zellig Harris was the first linguistic researcher to describe the “discrete combinatorial system” in language, although Pinker was the first to use this exact phrasing. Discrete combinatorial systems also occur in other areas, such as music, which uses a set of notes and rhythmic notation to create complex musical compositions.)
The elements of this system include phonemes, morphemes, words, and phrases—all of which are universal in human languages. Phonemes are the smallest units of speech: the individual sounds in a language that create differences in meaning, like the “d” sound in the word “dog.” Note that a single letter can stand for more than one phoneme; the “a” in “cat” represents a different vowel sound than the “a” in “father.”
Pinker explains that we combine phonemes into morphemes, which are the smallest meaningful language units—for example, root words such as “hope” or prefixes and suffixes like “un-” or “-less.” By combining morphemes, we can create many different types of words, like “hopelessness,” “hoped,” and “hopefully.”
Then, we can combine words into phrases, and phrases into highly complex sentences. To tie words and phrases together logically, languages use syntax. Syntax includes the structural rules of language, which determine things like the order of the subject, verb, and object in a sentence. Using the combinatorial system and syntax, we can precisely indicate the what, where, when, why, and how of an event.
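The multiplicative growth behind this combinatorial system can be sketched with a toy example (ours, not the book's): even a handful of morphemes and one simple combination rule generate many word forms.

```python
# Toy illustration of combinatorial growth: 8 morphemes yield 18 word forms.
# (Illustrative only -- real morphology has rules that block some combinations.)

prefixes = ["", "un"]                 # 2 options (including "no prefix")
roots = ["hope", "help", "harm"]      # 3 root morphemes
suffixes = ["", "less", "ful"]        # 3 options (including "no suffix")

# One combination rule: prefix + root + suffix.
words = [p + r + s for p in prefixes for r in roots for s in suffixes]

print(len(words))          # 2 * 3 * 3 = 18 candidate word forms
print("hopeless" in words) # True
```

Because each new unit multiplies rather than adds possibilities, a modest vocabulary plus syntax rules for combining words into phrases and sentences yields an effectively unbounded set of expressions.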
Diverse Methods of Combining Linguistic Elements
The infinite combinatorial property manifests in diverse ways in different languages. One of the main structural differences among languages is morphology: the process of combining morphemes to create larger words.
For example, Spanish has a fusional morphology: a single suffix fuses several grammatical meanings at once. In “hablo,” the “-o” ending simultaneously marks first person, singular number, and present tense, which is why the two words “yo hablo” (“I speak”) can be reduced to the single word “hablo.” On the other hand, polysynthetic languages like Mohawk combine many morphemes—signifying pronouns, prepositions, direct objects, and verbs—into an entire sentence expressed as one word. For example, the word “sahųwanhotųkwahseʔ” means “she opened the door for him again.” In these types of languages, it’s difficult to identify phrases, and the distinction between a word and a sentence is obscured.
Mechanics of Language
Pinker writes that, for humans to actually speak, six different body parts have to physically coordinate: the larynx, soft palate, tongue body, tongue tip, tongue root, and lips. Each phoneme represents a specific configuration of these body parts.
Adding to this complexity, we often drop phonemes and blend them together for convenience when we speak—a process called coarticulation. There’s also no distinct gap between each word when we speak. So, as someone interprets speech, their brain is constantly parsing the audio input, separating it into discrete words, and processing the meaning of words based on memory and context.
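The segmentation problem this describes, carving a continuous stream with no gaps into discrete known words, can be illustrated with a toy sketch (ours, not Pinker's) that greedily matches the longest known word at each position.

```python
# Toy sketch of word segmentation: split a gap-free "speech stream"
# into words from a known vocabulary using greedy longest-match.
# (Illustrative only -- the brain also uses context, prosody, and memory.)

VOCAB = {"the", "dog", "chased", "a", "cat"}

def segment(stream, vocab=VOCAB):
    """Split a string with no spaces into known words, or return None."""
    words, i = [], 0
    while i < len(stream):
        for j in range(len(stream), i, -1):  # try the longest candidate first
            if stream[i:j] in vocab:
                words.append(stream[i:j])
                i = j
                break
        else:
            return None  # no known word starts here: parsing fails
    return words

print(segment("thedogchasedacat"))  # ['the', 'dog', 'chased', 'a', 'cat']
```

Even this simplified version shows why segmentation needs backtracking in harder cases: a greedy match can commit to the wrong word and force the listener, like Pinker's reader, to reparse.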
Information Processing in the Brain
Pinker explains that, to process speech, humans not only sort out the individual words but also parse the words into noun phrases, verb phrases, and prepositional phrases. We logically link the phrases, use our short-term memory to keep track of multiple phrases, and interpret the most likely meaning of each word as we go along.
Combining Word-By-Word Interpretation and Cultural Nuance
Understanding sentences is partly a modular process because we group words into phrases, but we also interpret the most likely meaning of each word as we go along. Sometimes, if we initially interpret the wrong meaning of a word, we have to backtrack and try interpreting the sentence again with a different plausible word meaning.
Pinker points out that, in addition to quickly choosing a word meaning based on the context, people rely on subtext, humor, sarcasm, and metaphor to understand what other people are really saying. This is partly due to our desire to adhere to social norms, like being polite.
Pinker asserts that the combination of these skills—grouping phrases by type, identifying a word’s meaning from context, and incorporating cultural nuance—is what makes the human approach to interpreting language highly sophisticated and difficult to replicate. He claims that without these uniquely human advantages, AI will never come close to interpreting language with the same accuracy as humans.
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Steven Pinker's "The Language Instinct" at Shortform.
Here's what you'll find in our full The Language Instinct summary:
- How language is an innate ability—not an element of culture
- A look at unique qualities of human language
- How slang enhances a language, rather than diminishing it