Curriculum Upsidedownia

Frequently, these days, I’m reminded of Edward Lear’s whimsical illustration titled Manypeeplia Upsidedownia. Depicting an imagined botanical species, the drawing shows a half-dozen characters suspended upside-down from a flower’s bending stem. A product of the Victorian golden age of nonsense, Lear’s fanciful drawing increasingly strikes me as all too realistic, too true to be good.

We seem to have landed in a new, darker era of nonsense, one in which we take our follies seriously, and act upon them. We shoot first, and aim later.

On the domestic front, folly is fully in evidence in the implementation of the Common Core Curriculum. Now adopted in 46 states, this federal effort imposes uniform standards on what subjects are taught in American schools and how, with students’ performance measured by extensive testing. Imposed on states as a corollary to the Race to the Top initiative, the Curriculum is the fruit of a process tainted by politics, vested interests and a lack of transparency.

For an initiative so oriented to students’ test results, the program has been put in place without itself being rigorously tested—making America’s 50 million secondary school students pedagogic guinea pigs.

Focused on developing critical thinking in students, the program’s design, implementation and potential costs reflect a conspicuous lack of reflection. If the program’s goal is to enhance reasoning skills, the curriculum’s developers would fail their own test.

Oriented to career development, the Common Core Curriculum emphasizes skill sets over content, and nonfiction texts over literature. The imposition of this one-size-fits-all, top-down approach recalls the 20th-century imposition of the whole language (also known as “whole-word”) reading technique, which replaced phonetic reading—sounding out the words—with whole-word, “look-and-say” recognition. The unproven method has produced a steady decline in American reading scores and overall literacy.

The test-centric No Child Left Behind program resulted in half the nation’s schools receiving a failing grade. The Common Core cure? Create tests that are considerably tougher, longer and vastly more expensive.

It’s irrational and cruel to impose uniform standards on schools whose budgets, resources and environments vary so dramatically. How is an inner-city child supposed to compete with his affluent suburban counterpart, blessed with a wealthier school district, educated parents, and tutors and test coaches on call?

If leveling the educational playing field is our goal—and it is a laudable one—it would make sense to first align how much money is spent per student, before applying uniform standards of teaching and achievement.

The Common Core standards were developed by academics and testing experts, with little or no input from teachers and parents. Many of the curriculum’s consultants have ties to testing companies: indeed, David Coleman, the Curriculum’s chief architect, now heads the College Board.

In an ever-changing world, common sense would propose a broad range of educational approaches rather than a single one. In education, as in gardens, a monoculture is doomed to decay and eventual failure.

A vast educational experiment, the Core Curriculum has been implemented without empirical evidence of its value, designed by a flawed process, and imposed hurriedly without consulting the very people most affected: students, teachers and parents.

In the future, American students might do well to study the Upsidedownia Curriculum as a textbook example of what critical thinking is not.

 

As seen in The San Francisco Chronicle

The Whole Language Problem

How reading is taught is a matter of national urgency. The ability to read is the first building block of education: the key that opens the door to all later learning.

When it comes to how our public schools teach our children to read, a failed technique — whole language, or “whole word” — continues to prevail over phonics, the teaching method that has been a proven success for millennia.

American literacy statistics make for troubling reading. Today’s average American reads at an 8th grade level. The literacy rate for Americans is about 75 percent, leaving 25 percent functionally illiterate, scarcely able to read street signs.

If you are reading this, you likely belong to the 15 percent of Americans who read at the highest level — that of a college undergraduate.

Designed by 19th-century German psychologists to improve educational efficiency, whole word became widely adopted in the U.S. in the 1930s. Phonics reading manuals, with words broken down into syllables, were jettisoned in favor of the infamous Dick and Jane books.

Alarm bells rang at the beginning of the 1950s, when the military saw a dramatic drop-off in reading ability among recruits. At first, military brass suspected draftees were faking it, trying to appear less literate to avoid service in the Korean War. In actuality, rejection rates due to illiteracy rose from less than 3 percent in World War II to approximately 17 percent during the Korean War, to 21 percent during the Vietnam War.

The first salvo was fired in 1955 with the publication of Rudolf Flesch’s wildly popular “Why Johnny Can’t Read,” and another war began, this one between the proponents of phonics and those of whole word. Despite dozens of studies showing that phonics is vastly more effective, public school educators have largely stuck with whole word — with disastrous results.

Whole language advocates embrace the notion that children learn reading as intuitively and effortlessly as they do spoken language. That’s no less absurd than saying you can learn to read music if you can hum a tune. They belittle phonics — with its alphabet, syllables, drills and instruction — as a brittle, artificial technique that only gets in the way of the student’s “natural” quest for meaning. But, in fact, it takes effort — even courage — to learn to read well. That’s why grammar and vocabulary were once thought best learned “by heart.” But the scolds of the whole word movement gave that the dreary label “by rote.”

Whole language learners are required to memorize hundreds, and eventually thousands, of words by the way they look, as if they were Egyptian hieroglyphs or Chinese ideograms. If they come across a word they don’t know, they can’t go to a dictionary, because spelling, like the alphabet, is viewed as an afterthought in the loopy land of whole language. Since they are never taught to decode, they’re stuck in a state of ignorance until the nurse-like teacher spoon-feeds them new words.

Yet phonics — a method that goes back at least to ancient Rome — teaches children to read letter by letter, syllable by syllable, phrase by phrase: what we used to call “learning your ABCs.” Once learned, phonics breaks down the code of written language, providing students with a toolkit that allows them to decode — to unlock meaning — and thus truly learn any of the million-odd words in the English language, a skill enhanced with each reading experience.

Whole language should be tossed on the scrapheap of history. It doesn’t work, and it will never work. It’s imperative for America’s civil life and culture, and our country’s international competitiveness, to return to teaching reading with phonics, and abolish the poisonous pedagogy known as whole language.

George Ball is chairman and CEO of the Burpee Company, as well as vice chairman of The Orme School, a college preparatory boarding school in Arizona.

 

As seen in The Detroit News

Winter Sweetness: Guest Blog by Nick Rhodehamel

Last week we had our first real freeze. What wind there was in our little vegetable garden flapped the stiff, unyielding leaves of the tomato and black kale plants. By noon, the temperature had risen above freezing. The tomato vines were droopy, and the fruits that remained on them were water-soaked, with drops of water beginning to form on their surfaces. The kale seemed no different than before.

If you’re in central Indiana, the “living history museum” Conner Prairie Farm is well worth seeing. Among its permanent features is Prairietown, a recreated pioneer community set in 1836. Prairietown has several homes that represent the various people who would have been found in and around such a community; there are, for instance, a prosperous family from Kentucky and a family of hardscrabble pioneers. There is also a blacksmith shop, a pottery shop, an inn, a doctor’s office, and a schoolhouse. “Historic interpreters” dressed in period clothing perform first-person impressions of the people of Prairietown and interact with visitors in character. At Christmas time, Conner Prairie Farm presents a special program that is set on Christmas Eve. In the evening, visitors go from house to house observing how different households celebrated the night before Christmas. The Kentuckians prepare for a dinner party complete with fine crystal glasses and a multiple-course meal. In the windowless cabin of subsistence pioneers, the husband is away and the wife, with babe in arms, prepares for winter and treats visitors to a demonstration of the art of sausage making. That and the few vegetables they could preserve were apparently what they ate during the long winter. Life was tough for a lot of people on the frontier, and winter was particularly grim.

Our tomatoes were killed outright by the freeze. Ice crystals formed within their cells, and that’s always lethal. They had survived frost earlier, and of course frosts too can kill plants. But it’s not frozen cells that get them; it’s dehydration and the resulting cell membrane damage. Plants have spaces between and outside their cells (the apoplast) that allow air to diffuse in and out, carbon dioxide to be taken up for photosynthesis, and oxygen to be released to the atmosphere. The apoplast contains water too, and as temperatures fall, ice can form in these intercellular spaces. This intercellular ice causes water to flow out of the neighboring living cells into the intercellular spaces, where it too freezes. As the amount of intercellular ice increases, more and more water flows out of cells, membranes rupture, and cells die.

Our kale, on the other hand, was fine; it’s quite cold tolerant, as are brassicaceous plants in general and many “greens”. These hardier plants differ from their more tender cousins, like tomato, in that they adapt to cold temperatures by accumulating soluble sugars and other low-molecular-weight compounds inside their cells. These act as antifreezes that inhibit cellular ice crystal formation and dehydration, and they stabilize delicate cell membranes. For the gardener, cold acclimation is a boon: not only can you continue to harvest fresh into winter, but your produce has a sweeter, richer flavor. Some crops, such as parsnip and turnip, certainly can be eaten in fall but are really best after they’ve spent a winter under snow.

Lots of people find winter a forbidding time. The nights are long, and the days can be cold and gray; the late spring snows make winter seem eternal. If you’re among them, don’t despair. It’s not 1836, and your winter diet need not be as grim as a subsistence pioneer’s. With a little planning and a root cellar, or a corner of your garage or basement that remains above freezing, you can eat your own produce all winter long. In addition to what you leave in the ground, many crops are easily stored; carrot, cabbage, Brussels sprouts, onion, leek, shallot, beet, potato, and winter squashes are examples, but by no means is that an exhaustive list. And you can always freeze (corn and broccoli do well) or can (tomato) the fruits of your garden.

Indeed, winter is time for designing your next garden and planning the sowing and planting schedules. The Sumerians reckoned that this process—this specific and manual forethought—was the basis of civilized society: making something out of nothing, or a lot out of a little. Please see Samuel Noah Kramer’s classic The Sumerians for an illustration of the first known “farmer’s almanac” from about 2000 B.C.

At Burpee we offer our new version of last year’s app, ‘Garden Time’, which answers the question “When?”, the anxious moment for all new gardeners. Very soon, Burpee and The Cook’s Garden will have their websites up with all their new 2014 blockbuster varieties of vegetables, flowers, bulbs, herbs and perennials.

So settle into a comfy chair and start gardening!

Pining for Pines: Guest Blog by Nick Rhodehamel

Just north of St. Ignace and the bridge over the Straits of Mackinac, the sign for the Mystery Spot looked pretty much as it did the last time I saw it. There seems to be a Mystery Spot just about everywhere. I know there are at least three in California alone. Mystery Spots purport to be the sites of “gravitational anomalies”, where you can witness balls rolling uphill and people standing effortlessly at fantastic angles. My guess is that what you see has more to do with clever engineering and our tendency to accept as real what our senses tell us. I am certain too that in any visit to a Mystery Spot, there’s nothing anomalous in the lightening of the weight of your wallet.

If you’ve driven around the top of Lake Michigan, you may have noticed that the forest crowds the highway; trees dominate the landscape, and the mix of trees is different from farther south. All along the route, we’d seen stands composed of conifers (mainly pines) mixed with common deciduous species. And before the novelty of the drive was long gone, my wife wondered why there are “so many pines up here”. She meant “conifers”, in general, and that characterization is not terribly quantitative, but it’s a good observation.

To be sure, countless things influence what grows where (and with what). On a small scale, you could search for answers in differences in soil, topography, or any number of other variables, but on the grand scale of the Upper Peninsula of Michigan, the quick-and-dirty short answer is the way to go. As political commentator James Carville might say, “It’s the climate, stupid”.

The climate of any area is shaped mostly by the air masses that grace it and the physical characteristics those air masses possess. In the central part of North America, three primary air masses dominate: Subtropical, Arctic, and Continental. North Pacific and North Atlantic air only infrequently have an effect. Warm air from the Atlantic Ocean collects moisture as it moves across the Caribbean; this subtropical air then flows up the Mississippi Valley and, reaching the middle of the country, heads east. Arctic air masses that form over the Beaufort Sea are cold and dry. They flow south over Canada and turn east, also in the middle latitudes. When these two air masses meet, the cold Arctic air sinks under the warm, moist subtropical air, forcing it to rise and cool and often lose its moisture to precipitation. The third air mass develops in the Great Basin area. Prevailing westerly winds drive it east over the Rocky Mountains, which comb out most of the little moisture it held. This Continental air mass often acts to separate the two contentious air masses. When that happens, dry conditions prevail.

The interplay among these air masses mainly determines the climate in the central part of the continent. One clear feature is a climatic gradient that runs roughly east-west from Minnesota and approximately through the central parts of Wisconsin and Michigan and, east of the Great Lakes, through Pennsylvania, New York, and the less mountainous parts of New England. South of the gradient, conditions are relatively warmer and drier. To the north, the climate is cooler and moister; the winters are longer and somewhat more severe, and snow usually remains on the ground all winter. Most precipitation in the north falls in summer, when it is more consistent than in the south, where summer droughts are common.

This climatic gradient marks the southern boundary of the “North Woods” (the “North Country”, the “Great North Woods”, or the “Laurentian Mixed Forest Province”, whatever), which extends north through parts of the Canadian provinces Ontario, Quebec, and the Maritimes. It is a vast area that was covered by ice during the last glaciation, and its physical features reflect this pedigree in lakes, swampy depressions, outwash plains, and barrens. Elevations are relatively low, a few hundred feet above sea level to a little more than the 2,300 feet of remnant mountain ranges that in their days would have rivaled the Alaska Range but have been worn down by this last glaciation, other past glaciations, and mostly eons of wind and rain and freezing and thawing.

The North Woods lies between the boreal forests of Canada and Alaska and the ancient deciduous broadleaf forests that radiate from the vicinity of the Ozark Mountains in Arkansas and Missouri and, in the southeast, from the southern Appalachian region. It is transitional between these two forest zones, having components of both. Around 15,000 years ago, there were no trees, only the Laurentide Ice Sheet. Then, as the glaciation waned, conifers began to colonize the area in successive waves. Boreal species such as tamarack, spruce, and balsam fir arrived about 13,000 years ago; these probably grew at first in the soil on top of the retreating ice. They were followed by white pine and hemlock, both from the southeast. Toward the end of the conifer migrations, the flowering trees began arriving. Oaks and elms first appeared around 11,000 years ago, then maples a little later, and finally the American beech.

The Upper Peninsula, where we were, has always seemed wild and remote to me. It’s a difficult place to make a living, even in the best of times, as evidenced by the number of stores along our route, selling pasties, gas, liquor, and sundries, that have closed their doors. In prehistoric times, the populations of native peoples were much greater farther south. Logging, and later mining, brought in lots of men. In the old days, those industries were practiced rapaciously and all but decimated the forest and landscape. When logging reached its predictable conclusion in about 1930, there was nothing left to hold the men who worked the forests, and they drifted off. Now logging and mining are conducted on a smaller scale in a more controlled way, and fewer workers are needed. Farming has always been an iffy proposition here; summer is sometimes called 3 months of bad sledding. Those who tried farming, maybe castoffs of the logging days, often gave up, and then they too decamped to more favorable climes.

The forests have grown back since the logging days, though not exactly as before. Nearly 300 years ago, when the French came here to trap beaver, convert the native peoples, and search for ingots of pure copper, they found the forests mature (late successional) and dominated by conifers, mostly white pine and hemlock. Deciduous species dominate now, in both mature and juvenile forests. Those French would not disagree with my wife’s observation, but they might be surprised at what she called “so many pines”.

Where Have All the Flowers Gone?

Where have all the flowers gone? American cities, proud hubs of the arts, increasingly lack the very soul of culture: the flower. The original earthly joy, flowers bestow what our urban spaces are most in need of: beauty, romance, delirious color, fragrance, and a panoply of extraordinary forms.

Our urban centers, meanwhile, are reveling in a vegetable renaissance. Vegetable gardens large and small are sprouting in our cities: in backyards, window boxes, and repurposed warehouses, and on rooftops and balconies.

Farmers’ markets offer up splendorous harvests of fresh produce, cooed over by urban vegetable aficionados who, a while back, likely didn’t know the difference between radishes and radicchio. Heirloom vegetables, standard pre-World War II market varieties, must fairly blush at the lavish attention – and prices – accorded them.

The Dawning of the Age of the Vegetable, unfortunately, converges with the Decline and Fall of the Flower. Not so long ago, our cities were abloom with florists, a species fast going the way of record stores. Restaurant tables flared with flora. Women wore corsages. Men’s suit lapels boasted boutonnieres. Guests arrived bearing bouquets. Homes were festooned with flowering potted plants.

Compared with European metropolises, America’s cities are strikingly unfloriferous – and poorer for it. Vegetable gardens now abound; flower gardens are few. Vegetables are delectable and nutritious; we admire them, we respect them, but we do not love them. Flowers engage our senses and spirits in ways even the ripest, juiciest heirloom tomato cannot.

The author Iris Murdoch observed, “People from a planet without flowers would think we must be mad with joy the whole time to have such things about us.” If only flowers were about us.

The flower is the crown jewel of botanical creation. Without flowers, there would be no seeds, no fruits or vegetables, no life. Humankind has coevolved with flora: We domesticated flowers; flowers – the first plants cultivated without a utilitarian purpose – have domesticated us.

The ultimate symbols, flowers are prettily strewn throughout poetry, song, legends, and stories. They bloom in paintings, architecture, ceramics, textiles, photographs: every form of visual pleasure. Enshrined in the trajectory of our lives, flowers signify birth, youth, romance, and marriage; at death, they promise new life. In our anhedonic, pixelated digital age, we need flowers more than ever.

Unfortunately, commercially available flowers are mostly poor in quality, limited in selection, and grown abroad under shockingly unsustainable conditions. Most imported blooms fly in from South America on pollution-spewing jets. Grown in greenhouses staffed by exploited workers, flowers are sprayed with pesticides, fungicides, and herbicides, and picked too young.

On arrival, newly imported flowers are gassed with ethylene to hasten ripening – sacrificing buds, leaves, and richness of color. That’s why your supermarket-bought blooms look dead on arrival.

The burgeoning crop of urban gardeners will provide an extraordinary service by cultivating flowers, serving “locaflors” as well as locavores. Flowers, enduring symbols of renewal and rebirth, await their urban renaissance.

As seen in The Philadelphia Inquirer

Burpee CEO Reblooms Urban Agriculture

In his keynote speech at the Urban Agriculture Conference in New York City, organized by The Horticultural Society of New York, George Ball, Burpee Chairman and CEO, told leading-edge urban gardeners and rooftop farmers to “stop and smell the cut flowers”.

Most urban agriculture projects consist mainly of vegetables and herbs, with occasionally a few flowers on the side. Ball, a 35-year veteran of both the cut-flower and the vegetable business, urged participants at the May 16 conference, held at the Kimmel Center of New York University, to meet the great potential, as well as the pent-up demand, of fresh cut flowers, which have almost vanished from urban homes, parties and other public and private events.

“Think of cut flowers as an endangered species,” quipped Ball. “If you grow flowers on a 1- to 2-acre farm or garden, you not only serve customers who have not been pleased for over 30 years, but you also avoid regional competitors and government regulators in the fresh vegetable business.”

Ball went on to discuss the attractiveness of a cut-flower urban farm to employees as well as customers. “You will have volunteers line up early every morning to work on a seasonal, outdoor cut-flower farm—vegetables don’t have that kind of deep and universal attractiveness.”

Ball added that the contemporary flower industry is dominated by huge exporters from countries 4,000-6,000 miles away, whose flowers are picked “green,” when the buds are not fully pigmented (much as a tomato is picked green), and shipped by air-polluting jumbo jets to wholesalers, who keep them in storage for up to a week. Finally, they are distributed to an ever-decreasing number of retail florists. “Today most florists are gift shops with a small cooler in the back filled with pale-colored flowers from Asia, South America or the Middle East,” Ball said. “Consumers have fewer choices in flowers than they have in vegetables at a supermarket.”

Ball also pointed out the latest research at Rutgers University by Jeannette Haviland-Jones showing that fresh flowers in the home alleviate mild depression and other mood disorders. “So long as the flowers are proportionate to the room—not too many, not too few—they transform the space into a place of happiness,” added Ball.

“Vegetables are fuel for our body, but flowers connect with the deepest parts of our spirit.”

The Urban Agriculture Conference at the Kimmel Center was attended by over 300 urban gardeners and city farmers from across the nation.

 

The above is a copy of a press release that went out last week. Your comments are welcome! And please pass along to other bloggers if you have a chance. Thank you.

Antidepressants? Grow Your Own!

A few years back I witnessed an unforgettable sight. Having just led a contingent of Asian visitors around Burpee’s Fordhook Farm floral display gardens, I noticed one man remained standing outside the garden, rocking back and forth, his eyes closed. Concerned, I asked him if everything was okay. “I … am … happy,” he replied simply, lost in rapture. In his honor, we have named that garden the “Happiness Garden”.

I mention this episode because our country is right now in the midst of an epidemic of unhappiness. If Walt Whitman were to hear America singing today, he might hear a low, moaning blues rather than “strong melodious songs.” America has a case of the blues—and one of epidemic proportions.

Depression is one of America’s leading health problems, accounting for half the costs in mental health today. America’s new Great Depression exacts a high cost, with lost productivity and medical expenses adding up to $83 billion annually.

Since 1988, the use of antidepressants by Americans 18 and older has increased fivefold. Ten percent of Americans are currently on antidepressants, filling 245 million prescriptions designed to boost their mood at a cost of $8 billion. Still another 20 percent of Americans who are depressed receive no treatment.

Antidepressants do not work for all, and are apt to lose their efficacy over time. Better than nothing, you might say? Studies have shown that for one-quarter of patients, a sugar pill proved more effective than the prescription medicine.

What accounts for this tidal wave of depression? Along with genetic factors and one’s biology, experts point to environmental factors, the stress of modern life, the rapid pace of technological change, and our increasingly sedentary modern lifestyle.

Other observers believe that depression is currently overdiagnosed. Another school of thought contends that what’s causing the depression epidemic is … antidepressants. The notion is that while the medications can be effective in the short term, they can worsen symptoms over the long term due to neuroadaptation on the part of the brain.

Lifestyle factors surely play a part, and depression is frequently coupled with being overweight. Americans currently spend seven hours a day in front of a computer or TV screen—an activity, or non-activity, that surely does little to boost their physical or mental health. Mesmerized before their screens, they sacrifice social and family life to a solitary realm offering little in the way of challenge or stimulation.

So what’s a depressed person to do? I’m happy to note that there’s light at the end of this depressing tunnel. It’s recognized that milder cases of depression can be treated in a number of ways for which no prescription is required, including yoga, meditation and a daily dose of mild exercise.

The best clinic for treating depression might be right outside your door: the garden. According to Sir Richard Thompson, the president of the Royal College of Physicians, working in the garden frequently proves a more effective antidote than expensive pharmaceuticals.

The garden provides visual stimulation, mood-boosting sunlight—and a realm of effects you won’t find sitting before your computer screen: fragrance, flavor, color, beauty—not to mention a harvest of the freshest herbs and vegetables and vase-ready blooms. And scientists have found that Mycobacterium vaccae, a benign soil bacterium, has antidepressant effects.

Being in the gardening business, I meet thousands of gardeners a year. You cannot imagine a group of more spirited, upbeat, enthusiastic and contented people. They are invariably fit, with blooming color in their cheeks and sunlit sparkle in their eyes. Whether men or women, young or old, urban, suburban or rural, their therapy is the same: the Happiness Garden.


A slightly different version of this essay appeared as an op/ed piece in The Philadelphia Inquirer on May 24, 2013.

The Price Is Lice: Guest Blog by Nick Rhodehamel

Note: Every once in a while we here at the old bloggie limp, or shuffle, over to the stove and brew up a nasty, filthy, strong pot of coffee. The ensuing, almost hallucinatory, stimulation allows us to publish “monster” blog posts. This is one.

Oh, dear readers, you may recall from your early childhood the bowl-haircut boys—this was usually due to parents’ extreme frugality—but you may remember as well the rare boy who would disappear between class breaks and return a few days later with a bald head and a hang-dog look on his face. Worried exchanges would take place well out of his earshot—so lovingly thoughtful were we children back then—and the “L-word” was introduced to us. Around the time of the JFK assassination, by my clock.

So, welcome to a “Back To The Future” moment. Here’s a future no one wants to back into, so to speak. However, Nick is on the beat. And I’m an amateur entomologist. I just love bugs. Happy reading!  

During late winter this year, head lice were epidemic at my children’s school. That’s not to say that the place was exactly lousy with lice. But for a few weeks, children with lice were found often enough to worry my wife and me.

Frequently, when one infested child was identified, others were found. Usually, we were able to infer who the children were by who was sent home. Often, the same two siblings, who were coincidentally in both of our children’s classrooms, were sent home.

The school immediately informed us of newly discovered (or rediscovered) infestations. And when they did, we made our children strip in the subfreezing garage, searched through their hair like frantic chimps, and envisioned the ballooning water bills when we would be driven to wash and rewash bedding. We felt fortunate that they remained louse-free.

The word “louse”, or variations of it, is firmly rooted in our language. Figurative use as a pejorative dates from the late 14th century: “lousy weather” or “louse of a brother-in-law” (whatever) are examples. “Lousy with” (swarming with) is an American usage from the 1840s. But these are expressions we use without thinking, just as we use “lock, stock, and barrel”. Lice infestations today seem somewhat novel and unusual. They were once more visceral and immediate.

During most of human history, lice infestation was a ubiquitous experience. Lice have been found with Egyptian mummies. Or take, for instance, the Vikings, though nearly any group would do. The Vikings, as they were exploring and exploiting the world, carried among their accoutrements, in addition to arms and amulets, finely crafted combs with closely spaced teeth. During that period, most people wore their hair long, and combs were used as much for removing head lice as for making hair look beautiful.

The historical novel Lord Grizzly tells the story of the American mountain man Hugh Glass, who travels up the Missouri River in 1823 with a party of beaver trappers. Hugh is attacked by a she-grizzly with a cub. Though he succeeds in killing the bear with his knife, he gets pretty well torn up and suffers a displaced fracture of his leg. His companions find him, sew him up with sinew, dig his grave, and wait for him to die. But Hugh won’t die. It’s hostile Indian country, and after watching over him for several days, they abandon him, taking all food and weapons with them.

When Hugh finally comes around, he manages to realign the bone in his leg and to crawl nearly the entire 200 miles back to Ft. Kiowa, which his party had left 3 months earlier. During his crawl, Hugh comes across an ancient, dying Arikara woman abandoned by her tribe, whose members Hugh knows would kill him on sight. Out of kindness, Hugh gives her water and cooks her food. He then buries her when she dies. No good deed goes unpunished; the dead woman gives Hugh an unwelcome gift—lice.

Unlike my wife and me, Hugh knows lice. He removes his buckskin clothes, puts them on an ant mound, and the ants feed on the lice. Soon the ants are crawling on Hugh too. When they begin to bite, he knows that he is clean. Hugh calls the lice “graybacks”, a term no longer in use that was common among Civil War soldiers, who were also tormented by lice. Not too long ago, nearly everyone would have gotten lice at one time or another.

Almost certainly my wife and I naively overreacted. The Centers for Disease Control and Prevention says that head lice infestation, or pediculosis, is a common occurrence. Firm data on head lice incidence in the United States are not available, but 6 to 12 million infestations are estimated to occur each year among children 3 to 11 years of age, who are more likely to contract lice than older children or adults. Some studies suggest that girls get head lice more often than boys, probably because of more frequent head-to-head contact. Head lice are insects that neither fly nor jump; transfer to another person is largely passive. Head lice feed on human blood; they are obligate human parasites, meaning they need humans (and humans only) to complete their life cycle. They can live no more than 36 hours apart from their host and prefer to lay their eggs (nits) on hair. There is no evidence that head lice transmit disease. Children diagnosed with head lice need not be sent home early from school and can return to class after appropriate treatment has begun. Successful treatment should kill all crawling lice, but nits may persist after treatment. No big deal.

But when I was a child, I don’t remember anyone getting head lice. None of my siblings ever had them, and others tell me the same. What happened? Are head lice more common today?

Apparently, they are. Reasons for this are not certain, but societal changes surely contributed. Forty or so years ago, classroom behaviors began to change. Children no longer worked alone at individual desks but spent more time in small groups and moved around to different work areas, increasing contact among children and coincidentally promoting infestations. And as the work force expanded and both parents (and single parents) went to work outside the home, more children attended day care and after-school programs. This too increased potential contact with infested children and, in turn, infestation incidence.

People, of course, want head lice about as much as they want a sharp stick in the eye. As the incidence of head lice infestations increased in schools, the demand for effective louse treatments grew. And a lucrative industry was spawned. One current estimate is that various louse shampoos, sprays, and rinses bring in $150 million a year.

Many of these treatments are based on pyrethroid-type compounds (originally isolated from chrysanthemums). For a time, the treatments delivered eradication in one application, as promised, but no longer. Resistance in pest populations resulting from extensive use of single or similar compounds is an old story. And it’s a big problem too. Consider multiple-drug-resistant tuberculosis or resistance in such common bacteria as Staphylococcus and Streptococcus (“flesh-eating bacteria”). So why not lice?

This is the age of the New Normal, and that’s any suboptimal situation that we seem unable to correct: the jobless recovery, events such as the Boston Marathon bombings, and (I read) for the fast food industry, price cuts and stagnant sales. I suggest we add head lice infestation to this assemblage.

Bye, Bye Buckthorn: Guest Blog by Nick Rhodehamel

Here in the upper Midwest, it’s been a long winter. The ground has been continuously under snow since December 19. That snowfall closed down the city and, to children’s delight, schools too. It was the fourth-greatest snowfall on record, with 19.2 inches recorded near my house. I told my own children to take note: this is an unusual storm; you won’t see one like this again anytime soon. But, as it turned out, we had three more substantial snowstorms. They weren’t as big as the first one, but they brought the same heavy, wet snow that covered all the spruces and pines (and everything else) so that we looked like the Canadian Rockies.

It was a winter wonderland. I liked it. Of course not everyone did, even some in my own family, and most people are now absolutely sick to death of it. I’m ready for spring too, but with this week’s high pressure and cold air, the snow hangs on—not a crocus, not a daffodil in sight. But with your thumbnail, scratch off a little of the bark of a common buckthorn twig, and you will find bright green tissue ready to go.

Around here, buckthorn’s all over. When we first moved to our house last summer, we admired the mature landscaping. Only gradually did it dawn on us how overgrown the place was. When we looked carefully, we saw emerging from—whatever it was—here an old crabapple and there a mature hawthorn. Then everywhere we looked, we saw lilacs, dogwoods, various types of fern, and smooth hydrangea, all peering from beneath, around, and through the overgrowth. There was even a towering old tamarack that I had somehow failed to see. The “whatever it was” was buckthorn. It was then that I got interested in buckthorn eradication.

Buckthorn, a native of Eurasia and North Africa, was introduced to North America as an ornamental shrub in the early 19th century. Why it was thought a plant of merit, I can’t say for sure. It certainly takes no effort to cultivate (but why would you want to?) and will flourish into a confluent hedge in no time. Birds like its fruit. The bark and fruit also were used as a purgative (though apparently results could be violent). In the early 19th century, much of medicine focused on shocking the body back to health. For that reason, buckthorn’s—let’s say—cleansing attributes may have been valued, and perhaps its specific name (cathartica) was an optimistic forecast of buckthorn’s medicinal effects. Who knows?

But the problems with buckthorn must have been apparent soon after it escaped cultivation. It aggressively invaded forests, woodlands, meadows, fields, and roadsides. And it aids the spread of a disease that was important back then—crown rust of oat. In summer, the undersides of buckthorn leaves are marked by small, orange, somewhat powdery protruding pustules. These are fruiting bodies of the fungus that causes crown rust. Buckthorn is required for the fungus to complete its life cycle (and infect oats). In the early part of the last century in the upper Mississippi Valley, where oat was a major crop, concerted efforts were made to contain and eradicate buckthorn expressly to spare that crop. But buckthorn’s presence today in all but the southernmost tier of US states and farthest north Canadian provinces demonstrates how unsuccessful those efforts were.

Buckthorn spreads like mad in almost any environment it colonizes. It freely grows in full sun to deep shade. It grows rapidly, tolerating various soil types and varying moisture and drought conditions. It produces an abundance of fruit that is attractive to birds, who spread it far and wide. If a buckthorn thicket is cleared, exposing soil that had been in deep shade, the shed seeds quickly germinate and initiate a new thicket. And cut stumps readily sprout. In spring, buckthorn has a leg up on its deciduous peers by leafing out earlier; in fall, those leaves continue to photosynthesize well after most plants have senesced. There is also evidence that buckthorn changes soil carbon and nitrogen dynamics, facilitating the elimination of leaf litter and invasion by exotic European earthworms and ultimately making the soil less suitable for native plants. These changes in the soil may persist even after buckthorn has been removed.

So what to do this spring, when December is still in the air and on the ground? Why, continue my buckthorn eradication project. I began it last summer in a small way, cutting a few buckthorns that were obviously in the way. But it quickly ballooned into a major undertaking. My procedure is simple: plants that are small enough I pull out of the ground; otherwise, I cut them at the soil line and treat the stump with Roundup. One application of Roundup is not usually enough, though, so I reapply it when the stump begins to sprout.

From an area of probably well less than half an acre, I have removed literally hundreds of buckthorns, ranging in size from small seedlings to real trees close to a foot in diameter at the soil line. The real work comes in disposing of the cut brush. I’m lucky though: I drag it down to the road by my house, make a small effort at consolidating the pile, and the city ultimately takes it away (it’s all part of my municipal services bill).

I get a certain amount of satisfaction in removing the buckthorn. It’s made a remarkable difference in opening up the property. It probably doesn’t increase the resale value—any more than remodeling a kitchen does, though. But I’m not moving anyway. I have roughly a third more to clear, and it’s easier now than in summer, when it’s hot and the buckthorns have leaves. But this is a long-term project. When I’ve removed all the buckthorn, I won’t be done. For years, I anticipate, buckthorn will continue to come up from roots and seed. I’ll keep you posted.

Growing Concern

Let me be frank. I oppose “growth” and object to the “growing economy”; I take exception to “growing” companies. These terms—used chronically and uncritically by politicians and pundits—leave me vexed and perplexed. Why? Because I am convinced that “growing” is precisely what economies don’t do. They might increase or expand, but they grow not.

At some point in the 20th century, “growth” grew into shorthand for an increase or expansion in the amount of goods and services produced by the economy. “Grow” is now used as a transitive verb, as in Paul Hawken’s manifesto, “Growing a Business.”

Even as a metaphor, “growth” has its limits, soon apparent in the absurd, oxymoronic terms “flat growth” and “negative growth”. A governor of a large state recently declared he wanted to “grow” the size of his economy’s “pie”. Block that metaphor!

I hear your protests. “Grow” is a figure of speech, a metaphor, Mr. Ball! Why put this harmless butterfly of a phrase upon a syntactical wheel?

My animus derives partly from my role as the head of a “growing business,” a 136-year-old firm specializing in plants and seeds, things that really grow. Our long-term motto, “Burpee seeds grow,” is not a metaphor, but a statement of fact.

The indiscriminate use of “grow” and “growth” has profound implications for how we view our economy. The anthropomorphism of the “growth” term, endowing business and financial statistics with animate, living qualities, is imprecise to the point of being delusional. The notion of a “growing economy,” I suppose, puts color in the cheeks of pale, sexless numbers.

The use of “grow” or “growing” as a synonym for expand, increase, develop and enlarge is largely a 20th century creation, or growth, one that should be pruned from the English language.

Adam Smith, in his Wealth of Nations (1776), makes multiple references to “growing” and “growth”, but he is referring to seed (of all things!), grain, fleece or timber, not economies. Thomas Malthus, the unheeded prophet of the limits of growth, in his An Essay on the Principle of Population (1798), likewise applies the terms to, simply, things that grow.

The Oxford Dictionary of English Etymology defines growth as “show[ing] the development characteristic of living things.” The word “grow”, I might add, derives from the Indo-European word for “grass”. Tell me, is the economy growing like grass?

“Grow” is one of many terms that have migrated from agriculture into business. We speak of “seed money”, “hedge funds”, “yields”, and “plants.” “Share”, as in stocks, derives from the “shearing” of sheep and, thus, a unit of raw wool. No surprise that the rise of agriculture back in 8,000 B.C., give or take a millennium, brought economies into being.

Not so many centuries ago, agriculture pretty much constituted the economy: economic growth and the growth of plants and animals were inextricably bound. The ancient Mesopotamian currency, the shekel, introduced around 3,000 B.C., was based on a weight of barley: 180 grains, to be precise. Seed capital, you might say.

I fear one result of using our “growing” terminology is that we ourselves cease to grow, because we’ve stopped thinking. Try this thought experiment: if the economy is growing, is the invisible hand of the market growing with it?

Talk of the “growing economy” is a pathetic fallacy: projecting our wishes and feelings onto external phenomena, resulting in angry skies, brooding mountains and roses that art sick. The effect belongs in poetry, not in the economy, nor in economics, that properly dismal science.

The notion of economic “growth” is pure magic realism. It’s as if we imagine that cheek-puffing zephyrs propel clouds, autumn leaves gaily cartwheel across the lawn, and water sprites dance in our water glass.

The economy represents and involves numbers and statistics—and precision. Innumeracy is now a serious issue in this country. The myopic, hazy, lazy thinking behind our talk of “growth” appears to be shared by our children, whose math skills are perilously close to those of their peers in debt-ridden Spain, Portugal and Greece. Unless we start taking our economic numbers seriously, the problem, not the economy, will only grow.