You can tell an Aussie the minute they open their mouth, though not all of them don the full bogan like Paul Hogan.
In the documentary, The Sounds of Aus, the Australian accent is seen as the most significant identifier of the Australian cultural DNA, and is arguably the most Aussie thing about our mates across the ditch.
The doco looks at the story of the Australian accent: how it came about, how it has evolved over two hundred years of colonial and cultural history, and how and why Australians speak the way they do.
In an article for news.com, Nadine Hayes writes:
I never thought too much about this subject until a Swedish friend of mine said she imagined our vernacular was concocted by a bunch of drunk sailors. I hope she wasn’t referring to Captain Cook, I can’t imagine him standing with a bunch of tinnies in his budgie smugglers shouting, “Ahoy mates there’s Botany Bay, reckon that’s a great place for a barbie!”
Captain Cook and the First Fleet actually brought with them, in sober fashion, settlers with a mixture of English and Irish accents.
It’s thought the settlers’ children were the ones who created the unique accent we recognise today.
Some find the Aussie accent amusing, others find it downright annoying. I mean why is it that our inflections go up at the end of every sentence? Is it because we’re too insecure to make that statement? Perhaps we’re looking for affirmation? Are we trying to be inclusive? Maybe in our Aussie bizarreness, it’s all of these things.
I know – it’s because the sun’s too bright.
And what’s with asking questions that don’t need answering, like “how great was the weather today?” People just look at you like, what the hell am I supposed to do with that?
We’re a pretty laidback bunch, and this is reflected in the laziness of our language. We often forget Rs and Gs at the end of words, like ‘rippa’ and ‘fishin’. Let’s face it, we don’t open our mouths much and that’s not to avoid eating the flies.
There is no such thing as the perfect Australian accent, and our accent doesn’t vary as greatly from state to state the way it does in other countries.
We do as a nation tend to abbreviate just about everything though, with a little ‘bogan speak’ thrown in.
If you don’t believe me here’s some of our faves …
I came across an article published back in October 2016 that discussed how native-English speakers are the “worst communicators”. It’s definitely an interesting read.
How well do you communicate?
“A lot of native speakers are happy that English has become the world’s global language. They feel they don’t have to spend time learning another language,” says Chong. “But… often you have a boardroom full of people from different countries communicating in English and all understanding each other and then suddenly the American or Brit walks into the room and nobody can understand them.”
Can you speak a foreign language?
I have met a number of foreign residents here in Korea who have not bothered to learn Korean, or at least haven’t learned beyond the very basic expressions. I also have Korean friends who get confused by some English idioms or certain words within a context.
NOOBSian Stuart Semmel of Yale University has passed along two new (to me) NOOBs. The first is the verb "liaise," a back-formation from the French noun "liaison," which originally meant a sauce-thickening agent (who knew?) but has since come to refer to a close (sometimes intimate) connection between two people or organizations. The OED describes "liaise" as […]
I was approached by the website, Grammarcheck.net, to share their infographic on common word mistakes people make. It’s easy to confuse words like “compliment” and “complement”, or “affect” and “effect”. These words sound similar but have different meanings. So, without further ado, enjoy this handy reference! Source: http://www.grammarcheck.net
Exactly 950 years ago today, a crooked monarch called Harold, with a false claim to the English crown, suffered a Norman arrow through his eye. The missile penetrated his brain and caused almost instant death. William the Conqueror thus won the Battle of Hastings, a town that spent most of the next millennium failing to rise above the level of toilet. It still has 50 years to put this right, and local estate agents tell me it is “on the up”. So by 2066 it may make it to Motorway Service Station. We can but hope.
On a brighter note, the military exchange led to linguistic development and quite a bit of shagging. Today, some 45% of Brits have Norman DNA. This beats the DNA ravishing inputs from Eriks and Hermanns into a poor second place at 30%. But the links between Norman French and Anglo-Saxon language forms far outweigh…
AMERICA’S Supreme Court allows them to be banned from public spaces, and permits heavy fines for their improper handling, making rare exceptions to the protections of the constitution’s Bill of Rights. Guns? Only in a saner world. The weapons in question are swear-words, and readers who agree that they are objectively dangerous will want to stop reading at this point, as Johnson does not share the court’s view.
The Federal Communications Commission may warn or even impose six-figure penalties on a broadcaster that allows even a “fleeting” expletive on air, as when Bono, a singer, told an awards-show audience that winning was “fucking brilliant”. A mother in South Carolina was arrested for shouting “Stop squishing the fucking bread!” at her family. (Witnesses said she shouted at her children; she said it was at her husband.) A North Augusta city ordinance includes in its definition of disorderly conduct “any bawdy, lewd or obscene words…while in a state of anger, in the presence of another”.
As with guns, attitudes towards swearing vary widely. Big majorities of New Zealanders rate words like shit and balls as “acceptable”. The French are blasé about their “c-word”, con. Japanese has insults, and of course words for genitalia and excretion. It even has special polite registers that must be mastered to avoid offence. But it has no real taboo words.
This and more is the focus of a delightful new book, “What the F?”, by Benjamin Bergen, a cognitive scientist at the University of California, San Diego. Despite the regional variation, there are four near-universal sources of swear-words: religion, sex, bodily wastes and slurs. As befits Mr Bergen’s discipline, the core of the book is about swearing in the mind itself. On hearing the bluest of blue words, people’s heart rates speed up, and their palms begin to sweat. Their concentration on tricky tasks can be severely disrupted. Merely being told to free-associate with the word faggot (frocio in Italian) made experimental subjects less willing to allocate funds for an HIV centre in a subsequent simulation. But Mr Bergen criticises bans or fines, arguing that education about the harm slurs can do is more effective.
Some swearing is hard to stop. Automatic swearing—the kind that happens when your hammer meets your thumb—seems to have its own brain circuitry: Mr Bergen tells the tale of the French priest who lost all language ability but the words je (I) and foutre (fuck). Reflexive swearing seems to be routed through a part of the brain that is evolutionarily older, and may be analogous to the circuitry that causes calls of fear or surprise in other animals. Swearing can increase pain tolerance.
Though taboos are everywhere, they change over time. English law forbade swearing by the deity in plays in 1606; this means that Shakespeare’s later plays see the drop-off of “zounds” (“by His wounds”) and the like. The Victorian era was notorious for sexual prudery. Today, it is slurs that pack by far the biggest punch. A survey in 2000 found that British respondents rated wanker as more unacceptable than nigger, but a 2016 study found the reverse. And words like cripple and retarded, formerly unimpeachable medical terms, have become unusable in polite company.
Swear-words in English tend to be short with hard-sounding consonants, especially k and g. But there is nothing strictly taboo about curse-words’ sounds; truck and punt are not taboo. Nor do the referents alone make a word taboo: copulate and vulva aren’t unmentionable to little ears. But when children see their parents cringe at the use of their sweary synonyms, they quickly pick up how powerful they are. Taboo words, ultimately, are those that people treat as taboo, the treatment itself giving them their force.
It would be better to take a more lighthearted view. Cuss-words can no more be wished or censored out of existence than colour-terms or animal words. A widely reported article in 2011 in Pediatrics, a medical journal, claimed that merely hearing swear-words made children aggressive, but this conclusion was based on a long string of debatable assumptions that Mr Bergen unpicks with gusto. Studying swearing is a way of studying human nature itself. “Strong Language”, a group blog by language experts, “Holy Sh*t”, Melissa Mohr’s book on the history of profanity, “In Praise of Profanity” by Michael Adams of Indiana University, or Mr Bergen’s own fine book would all be good places to start.
The language of authors who suffered from dementia has a story for the rest of us.
By Adrienne Day September 29, 2016
In the early 1990s, Iris Murdoch was writing a new novel, as she’d done 25 times before in her life. But this time something was terribly off. Her protagonist, Jackson, an English manservant who has a mysterious effect on a circle of friends, once meticulously realized in her head, had become a stranger to her. As Murdoch later told Joanna Coles, a Guardian journalist who visited her in her house in North Oxford in 1996, a year after the publication of the book, Jackson’s Dilemma, she was suffering from a bad writer’s block. It began with Jackson and now the shadows had suffused her life. “At the moment I’m just falling, falling … just falling as it were,” Murdoch told Coles. “I think of things and then they go away forever.”
Jackson’s Dilemma was a flop. Some reviewers were respectful, if confused, calling it “an Indian Rope Trick, in which all the people … have no selves,” and “like the work of a 13-year-old schoolgirl who doesn’t get out enough.” Compared to her earlier works, which showcase a rich command of vocabulary and a keen grasp of grammar, Jackson’s Dilemma is rife with sentences that forge blindly ahead, lacking delicate shifts in structure, the language repetitious and deadened by indefinite nouns. In the book’s final chapter, Jackson sits sprawled on grass, thinking that he has “come to a place where there is no road,” as lost as Lear wandering on the heath after the storm.
Two years after Jackson’s Dilemma was published, Murdoch saw a neurologist who diagnosed her with Alzheimer’s disease. That discovery brought about a small supernova of media attention, spurred the next year by the United Kingdom publication of Iris: A Memoir of Iris Murdoch (called Elegy for Iris in the United States), an incisive and haunting memoir by her husband John Bayley, and a subsequent film adaptation starring Kate Winslet and Judi Dench. “She is not sailing into the dark,” Bayley writes toward the end of the book. “The voyage is over, and under the dark escort of Alzheimer’s, she has arrived somewhere.”
In 2003, Peter Garrard, a professor of neurology with expertise in dementia, took a keen interest in the novelist’s work. He had studied for his Ph.D. under John Hodges, the neurologist who had diagnosed Murdoch with Alzheimer’s. One day Garrard’s wife handed him her copy of Jackson’s Dilemma, commenting, “You’re interested in language and Alzheimer’s; why don’t you analyze this?” He resolved he would do just that: analyze the language in Murdoch’s fiction for signs of the degenerative effects of Alzheimer’s.
Researchers believe cognitive impairment begins well before signs of dementia are obvious to outsiders.
Prior to his interest in medicine, Garrard had studied ancient literature at Oxford, at a time when the discipline of computational language analysis, or computational linguistics, was taking root. Devotees of the field had developed something they called the Oxford Concordance Program—a computer program that created lists of all of the word types and word tokens in a text. (Token refers to the total number of words in a given text, and the type is the number of different words that appear in that text.) Garrard was intrigued by the idea that such lists could give ancient literature scholars insight into texts whose authorship was in dispute. Much as a Rembrandt expert might examine paint layers in order to assign a painting to a forger or to the Old Master himself, a computational linguist might count word types and tokens in a text and use that information to identify a work of ambiguous authorship.
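The type/token counting described above is simple to sketch in code. This is a toy illustration of the idea, not the Oxford Concordance Program itself, and the tokenizer is deliberately crude:

```python
import re
from collections import Counter

def concordance_counts(text):
    """Count word tokens (total words) and word types (distinct words)."""
    words = re.findall(r"[a-z']+", text.lower())  # crude tokenizer: letters and apostrophes
    counts = Counter(words)
    return {"tokens": len(words), "types": len(counts), "counts": counts}

stats = concordance_counts("The cat sat on the mat and the cat slept")
print(stats["tokens"], stats["types"])  # 10 2
```

(The example sentence has 10 tokens but only 7 types, because "the" and "cat" repeat. A list of each type with its frequency — `stats["counts"]` here — is essentially the concordance list such programs produced.)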
Garrard had the idea to apply a similar computational technique to books by Murdoch. Alzheimer’s researchers believe cognitive impairment begins well before signs of dementia are obvious to outsiders. Garrard thought it might be possible to sift through three of Murdoch’s novels, written at different points in her life, to see if signs of dementia could be read between the lines.
Scientists believe Alzheimer’s disease is caused by cell death and tissue loss resulting from an abnormal buildup of plaques and tangles of protein in the brain. Language is impacted when the brain’s Wernicke’s and Broca’s areas, responsible for language comprehension and production, are affected by the spread of disease. Language, therefore, provides an exceptional window on the onset and development of pathology. And a masterful writer like Murdoch puts bountiful language in high relief, offering a particularly rich field of study.
The artist, in fact, could serve science. If computer analysis could help pinpoint the earliest signs of mild cognitive impairment, before the onset of obvious symptoms, this might be valuable information for researchers looking to diagnose the disease before too much damage has been done to the brain.
Barbara Lust, a professor of human development, linguistics, and cognitive science at Cornell University, who researches topics in language acquisition and early Alzheimer’s, explains that understanding changes in language patterns could be a boon to Alzheimer’s therapies. “Caregivers don’t usually notice very early changes in language, but this could be critically important both for early diagnosis and also in terms of basic research,” Lust says. “A lot of researchers are trying to develop drugs to halt the progression of Alzheimer’s, and they need to know what the stages are in order to halt them.”
Before Garrard and his colleagues published their Murdoch paper in 2005, researchers had identified language impairment as a hallmark of Alzheimer’s disease. As Garrard explains, a patient’s vocabulary becomes restricted, and they use fewer words that are specific labels and more words that are general labels. For example, it’s not incorrect to call a golden retriever an “animal,” though it is less accurate than calling it a retriever or even a dog. Alzheimer’s patients would be far more likely to call a retriever a “dog” or an “animal” than “retriever” or “Fred.” In addition, Garrard adds, the words Alzheimer’s patients lose tend to appear less frequently in everyday English than words they keep—an abstract noun like “metamorphosis” might be replaced by “change” or “go.”
Researchers also found the use of specific words decreases and the noun-to-verb ratio changes as more “low image” verbs (be, come, do, get, give, go, have) and indefinite nouns (thing, something, anything, nothing) are used in place of their more unusual brethren. The use of the passive voice falls off markedly as well. People also use more pauses, Garrard says, as “they fish around for words.”
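Counting markers like these is straightforward in principle. The sketch below uses the low-image verb and indefinite-noun lists quoted above; a real system would also lemmatize (so "went" counts as "go") and tag parts of speech, which this toy version skips:

```python
import re

# Word lists taken directly from the markers described in the article
LOW_IMAGE_VERBS = {"be", "come", "do", "get", "give", "go", "have"}
INDEFINITE_NOUNS = {"thing", "something", "anything", "nothing"}

def marker_rates(text):
    """Fraction of tokens that are low-image verbs or indefinite nouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return {"low_image": 0.0, "indefinite": 0.0}
    low = sum(w in LOW_IMAGE_VERBS for w in words)
    indef = sum(w in INDEFINITE_NOUNS for w in words)
    return {"low_image": low / len(words), "indefinite": indef / len(words)}

print(marker_rates("He would go and get the thing, then do nothing"))
```

A rising rate of these marker words across successive texts by the same author is the kind of trend the researchers looked for; on its own, a single rate means little without a baseline.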
In their 2005 paper, Garrard and colleagues point out that the assessment of language changes in Alzheimer’s patients was based in many cases on standardized tasks such as word fluency and picture naming, the kind of tests criticized for lacking a “real-world validity.” But writing novels is a more naturalistic activity, one done voluntarily and without knowledge of the disease. That eliminates any negative or compensatory response that a standardized test might induce in a patient. With Murdoch, he and his colleagues could analyze language, “the products of cognitive operations,” over the natural course of her novel-writing life, which stretched from her 30s to 70s. “I thought it would be fascinating to be able to show that language could be affected before the patient or anyone else was aware of symptoms,” Garrard says.
For his analysis of Murdoch, Garrard used a program called Concordance to count word tokens and types in samples of text from three of her novels: her first published effort, Under the Net; a mid-career highlight, The Sea, The Sea, which won the Booker prize in 1978; and her final effort, Jackson’s Dilemma. He found that Murdoch’s vocabulary was significantly reduced in her last book—“it had become very generic,” he says—as compared to the samples from her two earlier books.
The Murdoch paper by Garrard and his colleagues proved influential. In Canada, Ian Lancashire, an English professor at the University of Toronto, was conducting his own version of textual analysis. Though he’d long toiled in the fields of Renaissance drama, Lancashire had been inspired by the emergence of a field called corpus linguistics, which involves the study of language through specialized software. In 1985, he founded the Center for Computing in the Humanities at the University of Toronto. (Today Lancashire is an emeritus professor, though he maintains a lab at the University of Toronto.)
What he discovered astounded him: Agatha Christie’s use of vocabulary had “completely tanked.”
In trying to determine some sort of overarching theory on the genesis of creativity, Lancashire had directed the development of software for the purpose of studying language through the analysis of text. The software was called TACT, short for Textual Analysis Computing Tools. The software created an interactive concordance and allowed Lancashire to count types and tokens in books by several of his favorite writers, including Shakespeare and Milton.
Lancashire had been an Agatha Christie fan in his youth, and decided to apply the same treatment to two of Christie’s early books, as well as Elephants Can Remember, her second-to-last novel. What he discovered astounded him: Christie’s use of vocabulary had “completely tanked” at the end of her career, dropping by about 20 percent. “I was shocked, because it was so obvious,” he says. Even though the length of Elephants was comparable to her other works, there was a marked decrease in the variety of words she used in it, and a good deal more phrasal repetition. “It was as if she had given up trying to find le mot juste, exactly the right word,” he says.
Lancashire presented his findings at a talk at the University of Toronto in 2008. Graeme Hirst, a computational linguist in Toronto’s computer science department, was in the audience. He suggested to Lancashire that they collaborate on statistical analysis of texts. The team employed a wider array of variables and much larger samples of text from Christie and Murdoch, searching for linguistic markers for Alzheimer’s disease. (Unlike Murdoch, Christie was never formally diagnosed with Alzheimer’s.)
The Toronto team, which included Regina Jokel, an assistant professor in the department of Speech-Language Pathology at the University of Toronto, and Xuan Le, at the time one of Hirst’s graduate students, settled on P.D. James—a writer who would die with her cognitive powers seemingly intact—as their control subject. Using a program called the Stanford Parser, they fed books by all three writers through it, focusing on features like vocabulary size, the ratio of vocabulary size to the total number of words used, repetition, word specificity, fillers, grammatical complexity, and the use of the passive voice.
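A few of the surface features in that list can be illustrated with a minimal pure-Python sketch. This is a toy, not the Toronto team's pipeline: the Stanford Parser additionally builds grammatical parse trees (needed for complexity and passive-voice measures), which this version makes no attempt at:

```python
import re
from collections import Counter

def text_features(text):
    """Naive surface features of a text sample: token count, vocabulary
    size, type/token ratio, and repetition measured as the share of the
    single most frequent word."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    top = counts.most_common(1)[0][1] if counts else 0
    n = len(words)
    return {
        "tokens": n,
        "vocabulary": len(counts),
        "type_token_ratio": len(counts) / n if n else 0.0,
        "repetition": top / n if n else 0.0,
    }

varied = text_features("Quick brown foxes jumped over seven lazy sleeping dogs")
flat = text_features("the thing and the thing and the thing and the thing")
print(varied["type_token_ratio"], flat["type_token_ratio"])
```

A richer vocabulary yields a higher type/token ratio; real studies also normalize for sample length, since the raw ratio falls as texts get longer.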
“Each type of dementia has its own language pattern, so if someone has vascular dementia, their pattern would look different than someone who has progressive aphasia or Alzheimer’s,” says Jokel. “Dementia of any kind permeates all modalities, so if someone has problems expressing themselves, they will have trouble expressing themselves both orally and in writing.”
To the researchers, evidence of Murdoch’s decline was apparent in Jackson’s Dilemma. A passage from The Sea, The Sea illustrates her rich language:
The chagrin, the ferocious ambition which James, I am sure quite unconsciously, prompted in me was something which came about gradually and raged intermittently.
In Jackson’s Dilemma, her vocabulary seems stunted:
He got out of bed and pulled back the curtains. The sun blazed in. He did not look out of the window. He opened one of the cases, then closed it again. He had been wearing his clothes in bed, except for his jacket and his shoes.
It seems that after conceiving of her character, Murdoch had trouble climbing back inside of his head. According to Lancashire, this was likely an early sign of dementia. “Alzheimer’s disease … damages our ability to see identity in both ourselves and other people, including imagined characters,” Lancashire later wrote. “Professional novelists with encroaching Alzheimer’s disease will forget what their characters look like, what they have done, and what qualities they exhibit.”
The Toronto team’s “Three British Novelists” paper, as it came to be called, influenced a number of other studies, including one at Arizona State University. Using similar software, researchers examined non-scripted news conferences of former presidents Ronald Reagan and George Herbert Walker Bush. President Reagan, they wrote, showed “a significant reduction in the number of unique words over time and a significant increase in conversational fillers and non-specific nouns over time,” while there was no such pattern for Bush. The researchers conclude that during his presidency, Reagan was showing a reduction in linguistic complexity consistent with what others have found in patients with dementia.
Brian Butterworth, a professor emeritus of cognitive neuropsychology at the Institute of Cognitive Neuropsychology at University College London, also “diagnosed” Reagan in the mid ’80s, years before Reagan was clinically diagnosed with Alzheimer’s disease. Butterworth wrote a report comparing Reagan’s 1980 debate performance against then-President Jimmy Carter with his performance against Democratic presidential nominee Walter Mondale four years later.
“With Carter, Reagan was more or less flawless, but in 1984, he was making mistakes of all sorts, minor slips, long pauses, and confusional errors,” Butterworth says. “He referred to the wrong year in one instance.” If one forgets a lot of facts, as Reagan did, Butterworth says, that might be an effect of damage to the frontal lobes; damage to the temporal lobes and Broca’s area affects speech. “The change from 1980 to 1984 was not stylistic, in my opinion,” Butterworth says. Reagan “got much worse, probably because his brain had changed in a significant way. He had been shot. He had been heavily rehearsed. Even with all that, he was making a lot of mistakes.”
Thanks in part to the literary studies, the idea of language as a biomarker for Alzheimer’s has continued to gain credibility. In 2009, the National Institute on Aging and the Alzheimer’s Association charged a group of prominent neurologists with revising the criteria for Alzheimer’s disease, previously updated in 1984. The group sought to include criteria that general healthcare providers, who might not have access to diagnostic tools like neuropsychological testing, advanced imaging, and cerebrospinal fluid measures, could use to diagnose dementia. Part of their criteria included impaired language functions in speaking, reading, and writing; a difficulty in thinking of common words while speaking; hesitations; and speech, spelling, and writing errors.
The embrace of language as a diagnostic strategy has spurred a host of diagnostic tools. Hirst has begun working on programs that use speech by real patients in real time. Based on Hirst’s work, Kathleen Fraser, a Ph.D. student, and Frank Rudzicz, an assistant professor of computer science at the University of Toronto, and a scientist at the Toronto Rehabilitation Institute, who focuses on machine learning and natural language processing in healthcare settings, have developed software that analyzes short samples of speech, 1 to 5 minutes in length, to see if an individual might be showing signs of cognitive impairment. They are looking at 400 or so variables right now, says Rudzicz, such as pitch variance, pitch emphasis, pauses or “jitters,” and other qualitative aspects of speech.
Few of us are prolific novelists, but most of us are leaving behind large datasets of language, courtesy of email and social media.
Rudzicz and Fraser have co-founded a startup called Winterlight Labs, and they are working on similar software to be used by clinicians. Some organizations are already piloting their technology. They hope to capture the attention of pharmaceutical companies, which could use their program to quickly identify the best candidates for clinical trials, typically an expensive and laborious process, or to track people’s cognitive states once they’ve been clinically diagnosed. They also hope one day to be able to use language as a lens to peer into people’s general cognitive states, so that researchers might gain a clearer understanding of everything from depression to autism.
Lust and other researchers agree, however, that the idea of using language as a biomarker for Alzheimer’s and other forms of cognitive impairment is still in its early stages. “We ultimately need some kind of low-cost, easy-to-use and noninvasive tool that can identify someone who should go on for more intensive follow-up, much as a cuff on your arm can detect high blood pressure that could be indicative of heart disease,” says Heather Snyder, a molecular biologist and Senior Director of Medical and Scientific Operations at the Alzheimer’s Association. “At this point we don’t have that validated tool that tells us that something is predictive, at least to my knowledge.”
Howard Fillit, the founding executive director and chief scientific officer of the Alzheimer’s Drug Discovery Foundation, says language is a valid way to test for Alzheimer’s disease and other forms of dementia. “If someone comes in complaining of cognitive impairment, and you want to do a diagnostic evaluation and see how serious their language problem is, I can see that [such software] would be useful,” he says. But he says the language analysis would have to be performed with other tests that measure cognitive function. “Otherwise,” Fillit says, “you might end up scaring a lot of people unnecessarily.”
One of the main reasons Garrard undertook the Murdoch study in the early 2000s was he saw her novels as a kind of large, unstructured dataset of language. He loved working with datasets, he says, “to see whether they tell a story or otherwise support a hypothesis.” Now, with computer programs that analyze language for cognitive deficits on the horizon, the future of Alzheimer’s diagnosis looks both beneficial and unnerving. Few of us are prolific novelists, but most of us are leaving behind large, unstructured datasets of language, courtesy of email, social media, and the like. There are such large volumes of data in that trail, Garrard says, “that it’s going to be usable in predicting all sorts of things, possibly including dementia.”
Garrard agrees a computer program that aids medical scientists in diagnosing cognitive diseases like Alzheimer’s holds great promise. “It’s like screening people for diabetes,” he says. “You wouldn’t want to have the condition, but better to know and treat it than not.”
Adrienne Day is a Bay Area-based writer and editor. She covers issues in science, culture, and social innovation.
The building blocks of understanding are memorization and repetition.
By Barbara Oakley Illustration by Sam Falconer September 15, 2016
I was a wayward kid who grew up on the literary side of life, treating math and science as if they were pustules from the plague. So it’s a little strange how I’ve ended up now—someone who dances daily with triple integrals, Fourier transforms, and that crown jewel of mathematics, Euler’s equation. It’s hard to believe I’ve flipped from a virtually congenital math-phobe to a professor of engineering.
One day, one of my students asked me how I did it—how I changed my brain. I wanted to answer Hell—with lots of difficulty! After all, I’d flunked my way through elementary, middle, and high school math and science. In fact, I didn’t start studying remedial math until I left the Army at age 26. If there were a textbook example of the potential for adult neural plasticity, I’d be Exhibit A.
Learning math and then science as an adult gave me passage into the empowering world of engineering. But these hard-won, adult-age changes in my brain have also given me an insider’s perspective on the neuroplasticity that underlies adult learning. Fortunately, my doctoral training in systems engineering—tying together the big picture of different STEM (Science, Technology, Engineering, Math) disciplines—and then my later research and writing focusing on how humans think have helped me make sense of recent advances in neuroscience and cognitive psychology related to learning.
In the years since I received my doctorate, thousands of students have swept through my classrooms—students who have been reared in elementary school and high school to believe that understanding math through active discussion is the talisman of learning. If you can explain what you’ve learned to others, perhaps drawing them a picture, the thinking goes, you must understand it.
Japan has come to be seen as a much-admired and emulated exemplar of these active, “understanding-centered” teaching methods. But what’s often missing from the discussion is the rest of the story: Japan is also home of the Kumon method of teaching mathematics, which emphasizes memorization, repetition, and rote learning hand-in-hand with developing the child’s mastery over the material. This intense afterschool program, and others like it, is embraced by millions of parents in Japan and around the world who supplement their children’s participatory education with plenty of practice, repetition, and, yes, intelligently designed rote learning, to allow them to gain hard-won fluency with the material.
Teachers can inadvertently set their students up for failure as those students blunder in illusions of competence.
In the United States, the emphasis on understanding sometimes seems to have replaced rather than complemented older teaching methods that scientists are—and have been—telling us work with the brain’s natural process to learn complex subjects like math and science.
The latest wave in educational reform in mathematics involves the Common Core—an attempt to set strong, uniform standards across the U.S., although critics are weighing in to say the standards fail by comparison with high-achieving countries. At least superficially, the standards seem to show a sensible perspective. They propose that in mathematics, students should gain equal facility in conceptual understanding, procedural skills and fluency, and application.
The devil, of course, lies in the details of implementation. In the current educational climate, memorization and repetition in the STEM disciplines (as opposed to in the study of language or music), are often seen as demeaning and a waste of time for students and teachers alike. Many teachers have long been taught that conceptual understanding in STEM trumps everything else. And indeed, it’s easier for teachers to induce students to discuss a mathematical subject (which, if done properly, can do much to help promote understanding) than it is for that teacher to tediously grade math homework. What this all means is that, despite the fact that procedural skills and fluency, along with application, are supposed to be given equal emphasis with conceptual understanding, all too often it doesn’t happen. Imparting a conceptual understanding reigns supreme—especially during precious class time.
The problem with focusing relentlessly on understanding is that math and science students can often grasp essentials of an important idea, but this understanding can quickly slip away without consolidation through practice and repetition. Worse, students often believe they understand something when, in fact, they don’t. By championing the importance of understanding, teachers can inadvertently set their students up for failure as those students blunder in illusions of competence. As one (failing) engineering student recently told me: “I just don’t see how I could have done so poorly. I understood it when you taught it in class.” My student may have thought he’d understood it at the time, and perhaps he did, but he’d never practiced using the concept to truly internalize it. He had not developed any kind of procedural fluency or ability to apply what he thought he understood.
There is an interesting connection between learning math and science, and learning a sport. When you learn how to swing a golf club, you perfect that swing from lots of repetition over a period of years. Your body knows what to do from a single thought—one chunk—instead of having to recall all the complex steps involved in hitting a ball.
In the same way, once you understand why you do something in math and science, you don’t have to keep re-explaining the how to yourself every time you do it. It’s not necessary to go around with 25 marbles in your pocket and lay out 5 rows of 5 marbles again and again so that you get that 5 x 5 = 25. At some point, you just know it fluently from memory. You memorize the idea that you simply add exponents—those little superscript numbers—when multiplying numbers that have the same base (10⁴ x 10⁵ = 10⁹). If you use the procedure a lot, by doing many different types of problems, you will find that you understand both the why and the how behind the procedure very well indeed. The greater understanding results from the fact that your mind constructed the patterns of meaning. Continually focusing on understanding itself actually gets in the way.
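The add-the-exponents rule mentioned above is easy to check mechanically. As a small illustrative sketch (the particular exponent pairs are arbitrary examples, not from the essay):

```python
# The rule: for a common base, multiplying the numbers adds the exponents,
# i.e. 10**a * 10**b == 10**(a + b).
for a, b in [(4, 5), (2, 3), (0, 7)]:
    assert 10**a * 10**b == 10**(a + b)

# Once the rule is fluent, you read 10**4 * 10**5 as 10**9 directly,
# with no need to multiply out 10,000 * 100,000.
print(10**4 * 10**5 == 10**9)  # True
```

Verifying the rule on a handful of cases like this is the computational analogue of laying out the marbles: after enough repetitions, the pattern is simply known.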
I learned these things about math and the process of learning not in the K-12 classroom but in the course of my life, as a kid who grew up reading Madeleine L’Engle and Dostoyevsky, who went on to study language at one of the world’s leading language institutes, and then to make the dramatic shift to become a professor of engineering.
As a young woman with a yen for learning language and no money or skills to speak of, I couldn’t afford to go to college (college loans weren’t then in the picture). So I launched directly from high school into the Army. I had loved learning new languages in high school, and the Army seemed to be a place where people could actually get paid for their language study, even as they attended the top-ranked Defense Language Institute—a place that had made language-learning a science. I chose Russian because it was very different from English, but not so difficult that I could study it for a lifetime only to perhaps gain the fluency of a 4-year-old. Besides, the Iron Curtain was mysteriously appealing—could I somehow use my knowledge of Russian to peer behind it?
After leaving the service, I became a translator for the Russians on Soviet trawlers on the Bering Sea. Working for the Russians was fun and engrossing—but it was also a superficially glamorous form of migrant work. You go to sea during fishing season, make a decent salary while getting drunk all the time, then go back to port when the season’s over and hope they’ll rehire you next year. There was pretty much only one other alternative for a Russian language speaker—working for the National Security Agency. (My Army contacts kept pointing me that way, but it wasn’t for me.)
I began to realize that while knowing another language was nice, it was also a skill with limited opportunities and potential. People weren’t pounding down my door looking for my Russian declension abilities. Unless, that is, I was willing to put up with seasickness and sporadic malnutrition out on stinking trawlers in the middle of the Bering Sea. I couldn’t help but reflect back on the West Point-trained engineers I’d worked with in the Army. Their mathematically and scientifically based approach to problem-solving was clearly useful for the real world—far more useful than anything my youthful misadventures with math had led me to imagine.
So, at age 26, as I was leaving the Army and casting about for fresh opportunities, it occurred to me: If I really wanted to try something new, why not tackle something that could open a whole world of new perspectives for me? Something like engineering? That meant I would be trying to learn another very different language—the language of calculus.
With my poor understanding of even the simplest math, my post-Army retraining efforts began with not-for-credit remedial algebra and trigonometry. This was way below mathematical ground zero for most college students. Trying to reprogram my brain sometimes seemed like a ridiculous idea—especially when I looked at the fresh young faces of my younger classmates and realized that many of them had already dropped their hard math and science classes—and here I was heading right for them. But in my case, from my experience becoming fluent in Russian as an adult, I suspected—or maybe I just hoped—that there might be aspects to language learning that I might apply to learning in math and science.
What I had done in learning Russian was to emphasize not just understanding of the language, but fluency. Fluency of something whole like a language requires a kind of familiarity that only repeated and varied interaction with the parts can develop. Where my language classmates had often been content to concentrate on simply understanding Russian they heard or read, I instead tried to gain an internalized, deep-rooted fluency with the words and language structure. I wouldn’t just be satisfied to know that понимать meant “to understand.” I’d practice with the verb—putting it through its paces by conjugating it repeatedly with all sorts of tenses, and then moving on to putting it into sentences, and then finally to understanding not only when to use this form of the verb, but also when not to use it. I practiced recalling all these aspects and variations quickly. After all, through practice, you can understand and translate dozens—even thousands—of words in another language. But if you aren’t fluent, when someone throws a bunch of words at you quickly, as with normal speaking (which always sounds horrifically fast when you’re learning a new language), you have no idea what they’re actually saying, even though technically you understand all the component words and structure. And you certainly can’t speak quickly enough yourself for native speakers to find it enjoyable to listen to you.
This approach—which focused on fluency instead of simple understanding—put me at the top of the class. And I didn’t realize it then, but this approach to learning language had given me an intuitive understanding of a fundamental core of learning and the development of expertise—chunking.
Chunking was originally conceptualized in the groundbreaking work of Herbert Simon in his analysis of chess—chunks were envisioned as the varying neural counterparts of different chess patterns. Gradually, neuroscientists came to realize that experts such as chess grand masters are experts because they have stored thousands of chunks of knowledge about their area of expertise in their long-term memory. Chess masters, for example, can recall tens of thousands of different chess patterns. Whatever the discipline, experts can call up to consciousness one or several of these well-knit-together, chunked neural subroutines to analyze and react to a new learning situation. This level of true understanding, and ability to use that understanding in new situations, comes only with the kind of rigor and familiarity that repetition, memorization, and practice can foster.
As studies of chess masters, emergency room physicians, and fighter pilots have shown, in times of critical stress, conscious analysis of a situation is replaced by quick, subconscious processing as these experts rapidly draw on their deeply ingrained repertoire of neural subroutines—chunks. At some point, self-consciously “understanding” why you do what you do just slows you down and interrupts flow, resulting in worse decisions. When I felt intuitively that there might be a connection between learning a new language and learning mathematics, I was right. Day-by-day, sustained practice of Russian fired and wired together my neural circuits, and I gradually began to knit together chunks of Slavic insight that I could call into working memory with ease. By interleaving my learning—in other words, practicing so that I knew not only when to use that word, but when not to use it, or to use a different variant of it—I was actually using the same approaches that expert practitioners use to learn in math and science.
When learning math and engineering as an adult, I began by using the same strategy I’d used to learn language. I’d look at an equation; to take a very simple example, consider Newton’s second law, f = ma. I practiced feeling what each of the letters meant—f for force was a push, m for mass was a kind of weighty resistance to my push, and a was the exhilarating feeling of acceleration. (The equivalent in Russian was learning to physically sound out the letters of the Cyrillic alphabet.) I memorized the equation so I could carry it around with me in my head and play with it. If m and a were big numbers, what did that do to f when I pushed it through the equation? If f was big and a was small, what did that do to m? How did the units match on each side? Playing with the equation was like conjugating a verb. I was beginning to intuit that the sparse outlines of the equation were like a metaphorical poem, with all sorts of beautiful symbolic representations embedded within it. Although I wouldn’t have put it that way at the time, the truth was that to learn math and science well, I had to slowly, day by day, build solid neural “chunked” subroutines—such as the one surrounding the simple equation f = ma—that I could easily call to mind from long-term memory, much as I’d done with Russian.
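The kind of “playing” described above can be sketched as a few quick numerical experiments. This is only an illustration of pushing numbers through f = ma; the values are made up:

```python
# Newton's second law: f = m * a.
def force(m, a):
    return m * a

# If m and a are both big, f comes out big:
print(force(1000, 50))   # 50000

# If f is big and a is small, solving f = m * a for m
# shows the mass must be enormous:
f, a = 50000, 0.5
print(f / a)             # 100000.0

# And the units match on each side: kg * (m/s^2) = newtons.
```

Each variation is like conjugating the verb in a new tense: the same equation, put through its paces until its behavior is familiar.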
Time after time, professors in mathematics and the sciences have told me that building well-ingrained chunks of expertise through practice and repetition was absolutely vital to their success. Understanding doesn’t build fluency; instead, fluency builds understanding. In fact, I believe that true understanding of a complex subject comes only from fluency.
In other words, in science and math education in particular, it’s easy to slip into teaching methods that emphasize understanding and that avoid the sometimes painful repetition and practice that underlie fluency. I learned Russian not just by understanding it—understanding, after all, is facile, and can easily slip away. (What did that word понимать mean?) I learned Russian by gaining fluency through practice, repetition, and rote learning—but rote learning that emphasized the ability to think flexibly and quickly. I learned math and science by applying precisely those same ideas. Language, math, and science, as with almost all areas of human expertise, draw on the same reservoir of brain mechanisms.
As I forayed into a new life, becoming an electrical engineer and, eventually, a professor of engineering, I left the Russian language behind. But 25 years after I’d last raised an inebriated glass on the Soviet trawlers, my family and I decided to take the trans-Siberian railway across Russia. Although I was excited to take the long-dreamed-of trip, I was also worried. I’d barely uttered a word of Russian in all that time. What if I’d lost it all? What had those years of gaining fluency really bought me?
Sure enough, when we first got on the train, I spoke Russian like a 2-year-old. I’d grasp for words, my declensions and conjugations were all wrong, and my formerly near-perfect accent sounded dreadful. But the foundation was there, and day by day, my Russian improved. And even with my rudimentary Russian, I could handle the day-to-day needs of our traveling. Soon, tour guides were coming to me for help translating for the other passengers. When we finally arrived in Moscow, we hopped in a taxi. The driver, I soon discovered, was intent on ripping us off—heading directly the wrong way and trapping us in a logjam of cars, where he expected us ignorant foreigners to quietly acquiesce to an unnecessary extra hour of meter time. Suddenly, Russian words I hadn’t spoken for decades flew from my mouth. I hadn’t even consciously known I knew those words.
Underneath it all, when it was needed, the fluency was there—and it quickly got us out of trouble (and into another taxi). Fluency allows understanding to become embedded, emerging when needed.
As I look today at the shortage of science and math majors in this country, and our current trend in how we teach people to learn, and as I reflect on my own pathway, knowing what I know now about the brain, it occurs to me that we can do better. As parents and teachers, we can use simple, accessible methods for deepening understanding and making it useful and flexible. We can encourage others and ourselves to try new disciplines that we thought were too hard—math, dance, physics, language, chemistry, music—opening new worlds for ourselves and others.
As I discovered, having a basic, deep-seated fluency in math and science—not just an “understanding”—is critical. It opens doors for many of life’s most intriguing jobs. Looking back, I realize that I didn’t have to just blindly follow my initial inclinations and passions. The “fluency” part of me that loved literature and language was also the same part of me that ultimately fell in love with math and science—and transformed and enriched my life.
Barbara Oakley is a professor of engineering at Oakland University, Rochester, Michigan, and most recently the author of A Mind for Numbers: How to Excel at Math and Science (Even If You Flunked Algebra). She is also co-instructor, with Terrence Sejnowski, the Francis Crick Professor at the Salk Institute, of one of the world’s largest online courses, “Learning How to Learn,” on Coursera.