Online With Dr. Faust

“The hardest thing to see is what is in front of your eyes,” Goethe said. After going online this morning, I have seen what is right in front of my eyes, and I rather wish I hadn’t.

Have I been asleep, blind, or both? How else to explain my previous failure to perceive the proliferation of creatures that are half-human, half-machine?

Compensating childhood memories return: John Glenn fully clad in his Mercury spacesuit armor, almost indistinguishable from his cockpit, seamlessly part of his spectacular, shiny vessel. The Age of Heroes. Now, cyberspace is being expanded, penetrated and poked at by anyone.

And anyone is everywhere.  I am surrounded by people attached to machines, and vice-versa.  Having no handheld telecommunications device, I experience a post-modern solitude, alone in the midst of people using their devices. I see them, but they do not see me: their minds are elsewhere.  They are each in a different place, distinct time zone, far from the here and now which keeps me company.

I see that individuals are not, as they suppose, using technology, but are themselves appendages of technology, consumers in the process of being consumed, hunters captured by the game.

When I see a post-human clutching a cellular device, or aglow before a computer screen, I instantly imagine the person vanishing into the device head-first, their legs wriggling helplessly as from the jaws of a lion.

People talk about the “singularity” – a post-human future convergence where man and machine morph into one.  But that moment has arrived, and the post-humans with it.

Thinking of Mr. Goethe, I am reminded of his play, Dr. Faust.  The tragedy tells the story of the eponymous scholar and magician who enters into a pact with the Devil, exchanging his soul for boundless worldly knowledge and limitless personal experiences.

Sounds like the Internet, doesn’t it? Offering an endless supply of information, services and ways to communicate with fellow post-humans? Like the Devil, the spiders on this vast web prey on our human foibles: our curiosity, desire for gossip and titillation, our voyeuristic tendencies. It offers, if one so desires, a Faust-like omniscience plus a diabolical cloak of anonymity in which we can become anyone, or say anything, with no concern – much less responsibility – for the consequences.

Online, we feel unusually free. Yet we are slaves. Our horizons are delimited by algorithms that tailor what we see according to our past behavior. The Internet user feels he is on a mountaintop, the world his to survey, but is instead on a treadmill of feedback loops. Even without the NSA getting involved, every time we log on we sacrifice our privacy, rendering ourselves prized data for marketers.

The statistics on Internet usage are startling. Americans spend five hours, nine minutes on the Internet each day, in addition to four hours, 31 minutes watching television; add it up and the average yearly media diet equals 147 24-hour days, more than a third of the year. I’m reminded of the French expression, “It’s one thing to go into a whorehouse; it’s another thing to never come out of it.”
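For readers inclined to check that figure, here is a minimal back-of-the-envelope sketch in Python, using only the daily totals cited above:

```python
# Rough check of the media-diet arithmetic cited above.
internet_hours = 5 + 9 / 60      # 5 hours, 9 minutes online per day
tv_hours = 4 + 31 / 60           # 4 hours, 31 minutes of television per day
daily_total = internet_hours + tv_hours    # about 9.67 hours per day
days_per_year = daily_total * 365 / 24     # expressed as full 24-hour days
print(round(days_per_year))      # prints 147, roughly 40% of the year
```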

Each time we use the Internet, we sacrifice our time, our perception, our senses, ourselves. Since 80% of human communication is non-verbal, we become fractions of our social selves. With the Internet, we have migrated into a sensory deprivation chamber: a zone where we are stripped of physicality, the human touch, voice and gaze, fragrance, dimensions, weather, spontaneous dialogue.

And what of boredom?  And the dreams and insights that follow?  It is often overlooked that both Dr. Faust and the Devil lost – and won – in the bargain.  Attention and awareness – and even their gauzy gaps and spongy pauses – are the soul of our relationships and personal development: the sunshine that brings us to life.

For those who seek a respite from media saturation, the ultimate antidote is the garden.  In your yard is a realm of beauty, color, fragrance, a panoply of forms, dimension, authenticity, and truth.  The gardener is attuned to the life of plants, the seasons, sunlight, the earth, weather, bees, butterflies and hummingbirds.  In creating and nurturing a garden, you can see (and taste) the results of your efforts.  The garden is a place to connect with nature, ourselves and each other: the ultimate social network – vividly and easily there right in front of your eyes.

Winter Solitaire: Guest Blog by Nick Rhodehamel

Not every evening, but most evenings during the last month, the Canada geese have flown over. They seem to be heading for the big lake and open water. That’s what I think anyway.

Sometimes they drift over in easy flight and large flocks so high their calls sound distant. During fog, though, they fly low and barely reach 100 feet. Sometimes they come in ragged, leaderless, motley troops that seem undisciplined, their movements rushed. Their calls are cacophonous, raucous, and complaining. Other times, there are only single birds or perhaps triplets whose cries, maybe I imagine, sound plaintive. I consider that they’ve been separated from their friends or are searching for lost relatives.

When they came by during the snow storm, I thought they needed to get on with it; it’s a long way to Tennessee or wherever they’re going. And the open water, marshes, and corn fields where they beef up and stage for the long flights are vanishing quickly.

Do you ever dream of flying? There is a documentary by German filmmaker Werner Herzog called (in English) The Great Ecstasy of the Sculptor Steiner. The film covers ski-flying world championship events in the early 1970s and focuses on the competitor Walter Steiner, who is a woodcarver. In ski flying, one tries to maximize aerodynamic lift by body and ski position to remain aloft and cover the greatest possible distance. From the film, you understand that Steiner is in a class alone. The competitors are all elite athletes who have trained their entire lives for these events. But no one can touch Steiner. In (then) Yugoslavia in 1974, even after shortening his takeoff run to reduce his speed, Steiner breaks the world ski-jumping record with a distance of 179 meters (587 feet). On camera, he complains bitterly that the judges are pressuring him to do more jumps to see how much farther he can go but that really they want to see him go splat. If a jumper flies beyond the slope of the hill to the flat run-out area, that’s what would happen.

To capture Steiner in flight, often at low light levels, the footage is shot with very high-speed film; the images are grainy and sometimes almost pixelated. In the slow-motion sequences of Steiner flying, he looks like nothing so much as a Canada goose in flight.

That same evening that the geese flew over in the snow, standing still as a stump and drinking a small glass of whiskey in the grove of hemlocks next to the house while the snow and night fell around me, I saw a coyote. Even in the dim light, he was unmistakable, trotting deliberately along, looking this way and that. I thought he was probably making his way to the wooded ravine behind me, where I’d seen them and their tracks before. Up the hill he came, and only when he was within 50 feet did he pause for a moment, sensing that something was different or wrong. Then he changed course and, in the same business-like way, traversed along the side of the hill and around my woods and disappeared from sight.

My wife was raised on the rim of the LA basin where at night coyotes came down from the wild hills to see what they could see. One night they took her kitten, and she’s never forgiven them for that. I didn’t tell her about this one.

But later that night, or maybe early the next morning, somewhere between sleep and wakefulness, I saw him again. This time he was not alone. What was in his mouth wasn’t clear, but I knew it was a cat. Freshly killed, I thought, judging from the way it swung limply as he came up the hill toward me. The scene changed, and now it was the cat that was full of life and grown far bigger than it had been in the coyote’s mouth. The coyote was shrunken and looked like some Native American headdress or a crumpled, incompletely shed insect exoskeleton. The cat, now wearing the coyote and looking like some shape-shifting skin-walker, stopped and stared up at me.

I woke. Wrong cat to mess with, I thought. I thought also that I wouldn’t tell my wife about the dream either.

Curriculum Upsidedownia

Frequently, these days, I’m reminded of Edward Lear’s whimsical illustration titled Manypeeplia Upsidedownia. Depicting an imagined botanical species, the drawing shows a half-dozen characters suspended upside-down from a flower’s bending stem. A product of the Victorian golden age of nonsense, Lear’s fanciful drawing increasingly strikes me as all too realistic, too true to be good.

We seem to have landed in a new, darker era of nonsense, one in which we take our follies seriously, and act upon them. We shoot first, and aim later.

On the domestic front, folly is fully in evidence with the implementation of the Common Core Curriculum. Now adopted in 46 states, this federal effort imposes uniform standards on what subjects are taught in American schools and how, with students’ performance measured by extensive testing. Imposed on states as a corollary to the Race to the Top initiative, the Curriculum is the fruit of a process tainted with politics, vested interests and a lack of transparency.

For an initiative so oriented to students’ test results, the program has been put in place without itself being rigorously tested—turning America’s secondary school students into 50 million pedagogic guinea pigs.

For a program focused on developing critical thinking in students, the Common Core’s design, implementation and potential costs reflect a conspicuous lack of reflection. If the program’s goal is to enhance reasoning skills, the curriculum’s developers would fail their own test.

Oriented to career development, the Common Core Curriculum emphasizes skill sets over content, and nonfiction texts over literature. The imposition of this one-size-fits-all, top-down approach recalls the adoption of the whole language (also known as “whole-word”) reading technique in the 20th century, which replaced phonetic reading, sounding out the words, with whole-word, “look-and-say” recognition. The unproven method has produced a steady decline in American reading scores, and in overall literacy.

The test-centric No Child Left Behind program resulted in half the nation’s schools receiving a failing grade. The Common Core cure? Create tests that are considerably tougher, longer and vastly more expensive.

It’s irrational and cruel to impose uniform standards on schools whose budgets, resources and environments vary so dramatically. How is an inner-city child supposed to compete with his affluent suburban counterpart, blessed with a wealthier school district, educated parents, and tutors and test coaches on call?

If leveling the educational playing field is our goal—and it is a laudable one—it would make sense to first align how much money is spent per student, before applying uniform standards of teaching and achievement.

The Common Core standards were developed by academics and testing experts, with little or no input from teachers and parents. Many of the curriculum’s consultants have ties to testing companies: indeed, David Coleman, the Curriculum’s chief architect, now heads the College Board.

In an ever-changing world, common sense would propose a broad range of educational approaches rather than a single one. In education, as in gardens, a monoculture is doomed to decay and eventual failure.

A vast educational experiment, the Core Curriculum has been implemented without empirical evidence of its value, designed by a flawed process, and imposed hurriedly without consulting the very people most affected: students, teachers and parents.

In the future, American students might do well to study the Upsidedownia Curriculum as a textbook example of what critical thinking is not.

 

As seen in The San Francisco Chronicle

The Whole Language Problem

How reading is taught is a matter of national urgency. The ability to read is the first building block of education: the key that opens the door to all later learning.

When it comes to how our public schools teach our children to read, a failed technique — whole language, or “whole word” — continues to prevail over phonics, the teaching method that’s been a proven success over millennia.

American literacy statistics make for troubling reading. Today’s average American reads at an 8th grade level. The literacy rate for Americans is about 75 percent, leaving 25 percent functionally illiterate, scarcely able to read street signs.

If you are reading this, you likely belong to the 15 percent of Americans who read at the highest level — that of a college undergraduate.

Designed by 19th-century German psychologists in order to improve educational efficiency, whole word became widely adopted in the U.S. in the 1930s. The phonics reading manuals, with words broken down into syllables, were jettisoned in favor of the infamous Dick and Jane books.

Alarm bells rang at the beginning of the 1950s, when the military saw a dramatic drop-off in reading ability among recruits. At first, military brass suspected draftees were faking it, trying to appear less literate to avoid service in the Korean War. In actuality, rejection rates due to illiteracy rose from less than 3 percent in World War II to approximately 17 percent during the Korean War, to 21 percent during the Vietnam War.

The first salvo was fired in 1955 with the publication of Rudolf Flesch’s wildly popular book “Why Johnny Can’t Read,” and another war began, this one between the proponents of phonics and those of whole word. Despite dozens of studies showing that phonics is vastly more effective, public school educators have largely stuck with whole word — with disastrous results.

Whole language advocates embrace the notion that children learn reading as intuitively and effortlessly as they do language. That’s no less absurd than saying you can learn to read music if you can hum a tune. They belittle phonics — with its alphabet, syllables, drills and instruction — as a brittle, artificial technique that only gets in the way of the student’s “natural” quest for meaning. But, in fact, it takes effort — even courage — to learn to read well. That’s why grammar and vocabulary were once thought best learned “by heart.” But the scolds of the whole word movement gave it the dreary label, “by rote.”

Whole language learners are required to memorize hundreds, and eventually thousands, of words by the way they look, as if they were Egyptian hieroglyphs or Chinese ideograms. If they come across a word they don’t know, they can’t go to a dictionary, because spelling, like the alphabet, is viewed as an afterthought in the loopy land of whole language. Since they are never taught to decode, they’re stuck in a state of ignorance until the nurse-like teacher spoon-feeds them new words.

Yet phonics — a method that goes back at least to ancient Rome — teaches children to read letter by letter, syllable by syllable, phrase by phrase: what we used to call “learning your ABCs.” Once learned, phonics breaks down the code of written language, providing students with a toolkit that allows them to decode — to unlock meaning — and thus profoundly learn any of the million-odd words in the English language, a skill enhanced with each reading experience.

Whole language should be tossed on the scrapheap of history. It doesn’t work, and it will never work. It’s imperative for America’s civil life and culture, and our country’s international competitiveness, to return to teaching reading with phonics, and abolish the poisonous pedagogy known as whole language.

George Ball is chairman and CEO of the Burpee Company, as well as vice chairman of The Orme School, a college preparatory boarding school in Arizona.

 

As seen in The Detroit News

Winter Sweetness: Guest Blog by Nick Rhodehamel

Last week we had our first real freeze. What wind there was in our little vegetable garden flapped the stiff, unyielding leaves of the tomato and black kale plants. By noon, the temperature had risen above freezing. The tomato vines were droopy, and those fruits that remained on the vines were water soaked, with drops of water beginning to form on their surfaces. The kale seemed no different from before.

If you’re in central Indiana, the “living history museum” Conner Prairie Farm is well worth seeing. Among its permanent features is Prairietown, a recreated pioneer community set in 1836. Prairietown has several homes that represent the various people who would have been found in and around such a community; there are, for instance, a prosperous family from Kentucky and a family of hardscrabble pioneers. There is also a blacksmith shop, a pottery shop, an inn, a doctor’s office, and a schoolhouse. “Historic interpreters” dressed in period clothing perform first-person impressions of the people of Prairietown and interact with visitors in character. At Christmastime, Conner Prairie Farm presents a special program that is set on Christmas Eve. In the evening, visitors go from house to house observing how different households celebrated the night before Christmas. The Kentuckians prepare for a dinner party complete with fine crystal glasses and a multiple-course meal. In the windowless cabin of subsistence pioneers, the husband is away and the wife, with babe on arm, prepares for winter and treats visitors to a demonstration of the art of sausage making. That and the few vegetables they could preserve were apparently what they ate during the long winter. Life was tough for a lot of people on the frontier, and winter was particularly grim.

Our tomatoes were killed outright by the freeze. Ice crystals formed within their cells, and that’s always lethal. They had survived frost earlier, and of course frosts too can kill plants. But it’s not frozen cells that get them; it’s dehydration and the resulting cell membrane damage. Plants have spaces between and outside their cells (the apoplast) that allow air to diffuse in and out, carbon dioxide to be taken up for photosynthesis, and oxygen to be released to the atmosphere. The apoplast contains water too, and as temperatures fall, ice can form in these intercellular spaces. This intercellular ice causes water to flow out of the neighboring living cells into the intercellular spaces, where it too freezes. As the amount of intercellular ice increases, more and more water flows out of cells, membranes rupture, and cells die.

Our kale, on the other hand, was fine; it’s quite cold tolerant, as are brassicaceous plants in general and many “greens”. These hardier plants differ from their more tender cousins, like tomato, by adapting to cold temperatures and accumulating soluble sugars and other low-molecular-weight compounds inside their cells. These act as antifreezes that inhibit cellular ice crystal formation and dehydration, and they stabilize delicate cell membranes. For the gardener, cold acclimation is a boon: not only can you continue to harvest fresh into winter, but your produce has a sweeter, richer flavor. Some crops such as parsnip and turnip certainly can be eaten in fall but are really best after they’ve spent a winter under snow.

Lots of people find winter a forbidding time. The nights are long, and the days can be cold and gray; the late spring snows make winter seem eternal. If you’re among them, don’t despair. It’s not 1836, and even if you were a subsistence pioneer, your diet need not be too tough during winter. With a little planning and a root cellar or a corner of your garage or basement that remains above freezing, you can eat your own produce all winter long. In addition to what you leave in the ground, many crops are easily stored: carrot, cabbage, Brussels sprouts, onion, leek, shallot, beet, potato, and winter squashes are examples, but by no means is that an exhaustive list. And you can always freeze (corn and broccoli do well) or can (tomato) the fruits of your garden.

Indeed, winter is time for designing your next garden and planning the sowing and planting schedules. The Sumerians reckoned that this process—this specific and manual forethought—was the basis of civilized society: making something out of nothing, or a lot out of a little. Please see Samuel Noah Kramer’s classic The Sumerians for an illustration of the first known “farmer’s almanac” from about 2000 B.C.

At Burpee we offer a new version of last year’s app, ‘Garden Time’, which answers the question “When?”, the anxious moment for all new gardeners. Very soon, Burpee and The Cook’s Garden will have their websites up with all their new 2014 blockbuster varieties of vegetables, flowers, bulbs, herbs and perennials.

So settle into a comfy chair and start gardening!

Pining for Pines: Guest Blog by Nick Rhodehamel

Just north of St. Ignace and the bridge over the Straits of Mackinac, the sign for the Mystery Spot looked pretty much as it did the last time I saw it. There seems to be a Mystery Spot just about everywhere. I know there are at least three in California alone. Mystery Spots purport to be the sites of “gravitational anomalies”, where you can witness balls rolling uphill and people standing effortlessly at fantastic angles. My guess is that what you see has more to do with clever engineering and our tendency to accept as real what our senses tell us. I am certain too that in any visit to a Mystery Spot, there’s nothing anomalous in the lightening of the weight of your wallet.

If you’ve driven around the top of Lake Michigan, you may have noticed that the forest crowds the highway; trees dominate the landscape, and the mix of trees is different from farther south. All along the route, we’d seen stands composed of conifers (mainly pines) mixed with common deciduous species. And before the novelty of the drive was long gone, my wife wondered why there were “so many pines up here”. She meant “conifers”, in general, and that characterization is not terribly quantitative, but it’s a good observation.

To be sure, countless things influence what grows where (and with what). On a small scale, you could search for answers in differences in soil, topography, or any number of other variables, but on the grand scale of the Upper Peninsula of Michigan, the quick-and-dirty short answer is the way to go. As political commentator James Carville might say, “It’s the climate, stupid”.

The climate of any area is shaped mostly by the air masses that grace it and the physical characteristics those air masses possess. In the central part of North America, three primary air masses dominate: Subtropical, Arctic, and Continental. North Pacific and North Atlantic air only infrequently have an effect. Warm air from the Atlantic Ocean collects moisture as it moves across the Caribbean; this subtropical air then flows up the Mississippi Valley and, reaching the middle of the country, heads east. Arctic air masses that form over the Beaufort Sea are cold and dry. They flow south over Canada and also turn east in the middle latitudes. When these two air masses meet, the cold Arctic air sinks under the warm, moist subtropical air, forcing it to rise and cool and often lose its moisture to precipitation. The third air mass develops in the Great Basin area. Prevailing westerly winds drive it east over the Rocky Mountains, which comb out most of the little moisture it held. This Continental air mass often acts to separate the two contentious air masses. When that happens, dry conditions prevail.

The interplay among these air masses mainly determines the climate in the central part of the continent. One clear feature is a climatic gradient that runs roughly east-west from Minnesota and approximately through the central parts of Wisconsin and Michigan and, east of the Great Lakes, through Pennsylvania, New York, and the less mountainous parts of New England. South of the gradient, conditions are relatively warmer and drier. To the north, the climate is cooler and moister; the winters are longer and somewhat more severe, and snow usually remains on the ground all winter. Most precipitation in the north falls in summer, and it is more consistent there than in the south, where summer droughts are common.

This climatic gradient marks the southern boundary of the “North Woods” (the “North Country”, the “Great North Woods”, or the “Laurentian Mixed Forest Province”, whatever), which extends north through parts of the Canadian provinces of Ontario, Quebec, and the Maritimes. It is a vast area that was covered by ice during the last glaciation, and its physical features reflect this pedigree in lakes, swampy depressions, outwash plains, and barrens. Elevations are relatively low, a few hundred feet above sea level to a little more than the 2,300 feet of remnant mountain ranges that in their day would have rivaled the Alaska Range but have been worn down by this last glaciation, other past glaciations, and mostly eons of wind and rain and freezing and thawing.

The North Woods lies between the boreal forests of Canada and Alaska and the ancient deciduous broadleaf forests that radiate from the vicinity of the Ozark Mountains in Arkansas and Missouri and, in the southeast, the southern Appalachian region. It is transitional between these two forest zones, having components of both. Around 15,000 years ago, there were no trees, only the Laurentide Ice Sheet. Then as the glaciation waned, conifers began to colonize the area in successive waves. Boreal species such as tamarack, spruce, and balsam fir arrived about 13,000 years ago. Probably these initially grew in the soil on top of the retreating ice. They were followed by white pine and hemlock, both from the southeast. Toward the end of the conifer migrations, the flowering trees began arriving. Oaks and elms first appeared around 11,000 years ago, then maples a little later, and finally the American beech.

The Upper Peninsula, where we were, has always seemed wild and remote to me. It’s a difficult place to make a living, even in the best of times, as evidenced by the number of stores along our route selling pasties, gas, liquor, and sundries that have closed their doors. In prehistoric times, the populations of native peoples were much greater farther south. Logging, and later mining, brought in lots of men. In the old days, those industries were practiced rapaciously and all but decimated the forest and landscape. When logging reached its predictable conclusion in about 1930, there was nothing left to hold them, and the men who worked the forests drifted off. Now logging and mining are conducted on a smaller scale in a more controlled way, and fewer workers are needed. Farming has always been an iffy proposition here; summer is sometimes called 3 months of bad sledding. Those who tried farming, maybe castoffs of the logging days, often gave up, and then they too decamped to more favorable climes.

The forests have grown back since the logging days, though not exactly as before. Nearly 300 years ago when the French came here to trap beaver, convert the native peoples, and search for ingots of pure copper, they found the forests mature (late successional) and dominated by conifers, mostly white pine and hemlock. Deciduous species dominate now, in both mature and juvenile forests. Those French would not disagree with my wife’s observation, but they might be surprised at what she called “so many pines”.

Where Have All the Flowers Gone?

Where have all the flowers gone? American cities, proud hubs of the arts, increasingly lack the very soul of culture: the flower. The original earthly joy, flowers bestow what our urban spaces are most in need of: beauty, romance, delirious color, fragrance, and a panoply of extraordinary forms.

Our urban centers, meanwhile, are reveling in a vegetable renaissance. Vegetable gardens large and small are sprouting in our cities: in backyards, window boxes, and repurposed warehouses, and on rooftops and balconies.

Farmers’ markets offer up splendorous harvests of fresh produce, cooed over by urban vegetable aficionados who, a while back, likely didn’t know the difference between radishes and radicchio. Heirloom vegetables, standard pre-World War II market varieties, must fairly blush at the lavish attention – and prices – accorded them.

The Dawning of the Age of the Vegetable, unfortunately, converges with the Decline and Fall of the Flower. Not so long ago, our cities were abloom with florists, a species fast going the way of record stores. Restaurant tables flared with flora. Women wore corsages. Men’s suit lapels boasted boutonnieres. Guests arrived bearing bouquets. Homes were festooned with flowering potted plants.

Compared with European metropolises, America’s cities are strikingly unfloriferous – and poorer for it. Vegetable gardens now abound; flower gardens are few. Vegetables are delectable and nutritious; we admire them, we respect them, but we do not love them. Flowers engage our senses and spirits in ways even the ripest, juiciest heirloom tomato cannot.

The author Iris Murdoch observed, “People from a planet without flowers would think we must be mad with joy the whole time to have such things about us.” If only flowers were about us.

The flower is the crown jewel of botanical creation. Without flowers, there would be no seeds, no fruits or vegetables, no life. Humankind has coevolved with flora: We domesticated flowers; flowers – the first plants cultivated without a utilitarian purpose – have domesticated us.

The ultimate symbols, flowers are prettily strewn throughout poetry, song, legends, and stories. They bloom in paintings, architecture, ceramics, textiles, photographs: every form of visual pleasure. Enshrined in the trajectory of our lives, flowers signify birth, youth, romance, and marriage; at death, they promise new life. In our anhedonic, pixelated digital age, we need flowers more than ever.

Unfortunately, commercially available flowers are mostly poor in quality, limited in selection, and grown abroad under shockingly unsustainable conditions. Most imported blooms fly in from South America on pollution-spewing jets. Grown in greenhouses staffed by exploited workers, flowers are sprayed with pesticides, fungicides, and herbicides, and picked too young.

On arrival, newly imported flowers are gassed with ethylene to hasten ripening – sacrificing buds, leaves, and richness of color. That’s why your supermarket-bought blooms look dead on arrival.

The burgeoning crop of urban gardeners will provide an extraordinary service by cultivating flowers, serving “locaflors” as well as locavores. Flowers, enduring symbols of renewal and rebirth, await their urban renaissance.

As seen in The Philadelphia Inquirer

Burpee CEO Reblooms Urban Agriculture

In the keynote speech of the Urban Agriculture Conference in New York City, organized by The Horticultural Society of New York, George Ball, Burpee Chairman and CEO, told leading-edge urban gardeners and rooftop farmers to “stop and smell the cut flowers”.

Most urban agriculture projects consist mainly of vegetables and herbs, with, occasionally, a few flowers on the side. Ball, a 35-year veteran of both the cut flower and the vegetable businesses, urged the participants at the May 16th conference, held at the Kimmel Center of New York University, to meet the great potential, as well as the pent-up demand, for fresh cut flowers, which have almost vanished from urban homes, parties and other public and private events.

“Think of cut flowers as an endangered species,” quipped Ball. “If you grow flowers on a 1- to 2-acre farm or garden, you not only serve customers who have not been pleased for over 30 years, but you also avoid regional competitors and government regulators in the fresh vegetable business.”

Ball went on to discuss the attractiveness of a cut flower urban farm to employees as well as customers. “You will have volunteers line up early every morning to work on a seasonal, outdoor cut-flower farm—vegetables don’t have that kind of deep and universal attractiveness.”

Ball added that the contemporary flower industry is dominated by huge exporters from countries 4,000-6,000 miles away, whose flowers are picked “green” when the buds are not fully pigmented (much as a tomato is picked green) and shipped by air-polluting jumbo jets to wholesalers who keep them up to a week in storage. Finally, they are distributed to an ever-decreasing number of retail florists. “Today most florists are gift shops with a small cooler in the back filled with pale-colored flowers from Asia, South America or the Middle East,” Ball said. “Consumers have fewer choices in flowers than they have in vegetables in a supermarket.”

Ball also pointed out the latest research at Rutgers University by Jeannette Haviland-Jones showing that fresh flowers in the home alleviate mild depression and other mood disorders. “So long as the flowers are proportionate to the room—not too many, not too few—they transform the space into a place of happiness,” added Ball.

“Vegetables are fuel for our body, but flowers connect with the deepest parts of our spirit.”

The Urban Agriculture Conference at the Kimmel Center was attended by over 300 urban gardeners and city farmers from across the nation.

 

The above is a copy of a press release that went out last week. Your comments are welcome! And please pass along to other bloggers if you have a chance. Thank you.

Antidepressants? Grow Your Own!

A few years back I witnessed an unforgettable sight. Having just led a contingent of Asian visitors around Burpee’s Fordhook Farm floral display gardens, I noticed one man remained standing outside the garden, rocking back and forth, his eyes closed. Concerned, I asked him if everything was okay. “I … am … happy,” he replied simply, lost in rapture. In his honor, we have named that garden the “Happiness Garden”.

I mention this episode because our country is right now in the midst of an epidemic of unhappiness. If Walt Whitman were to hear America singing today, he might hear a low, moaning blues rather than “strong melodious songs”. America has a case of the blues—and one of epidemic proportions.

Depression is one of America’s leading health problems, accounting for half the costs in mental health today. America’s new Great Depression exacts a high cost, with lost productivity and medical expenses adding up to $83 billion annually.

Since 1988, the use of antidepressants by Americans 18 and older has increased fivefold. Ten percent of Americans are currently on antidepressants, filling 245 million prescriptions designed to boost their mood at a cost of $8 billion. Another 20 percent of Americans who are depressed receive no treatment at all.

Antidepressants do not work for all, and are apt to lose their efficacy over time. Better than nothing, you might say? Studies have shown that for one-quarter of patients, a sugar pill proved more effective than the prescription medicine.

What accounts for this tidal wave of depression? Along with genetic factors and one’s biology, experts point to environmental factors, the stress of modern life, the rapid pace of technological change, and our increasingly sedentary modern lifestyle.

Other observers believe that depression is currently overdiagnosed. Another school of thought contends that what’s causing the depression epidemic is … antidepressants. The notion is that while the medications can be effective for the short term, they can worsen symptoms over the long term due to neuroadaptation on the part of the brain.

Lifestyle factors surely play a part, and depression is frequently coupled with being overweight. Americans currently spend seven hours a day in front of a computer or TV screen—an activity—or non-activity—which surely does little to boost their physical or mental health. Mesmerized before their screens, they sacrifice social and family life to a solitary realm offering little in the way of challenge or stimulation.

So what’s a depressed person to do? I’m happy to note that there’s light at the end of this depressing tunnel. It’s recognized that milder cases of depression can be treated in a number of ways for which no prescription is required, including yoga, meditation and a daily dose of mild exercise.

The best clinic for treating depression might be right outside your door: the garden. According to Sir Richard Thompson, the president of the Royal College of Physicians, working in the garden frequently proves a more effective antidote than expensive pharmaceuticals.

The garden provides visual stimulation, mood-boosting sunlight—and a realm of effects you won’t find sitting before your computer screen: fragrance, flavor, color, beauty—not to mention a harvest of the freshest herbs and vegetables and vase-ready blooms. And scientists have found that Mycobacterium vaccae, a benign soil bacterium, has antidepressant effects.

Being in the gardening business, I meet thousands of gardeners a year. You cannot imagine a group of more spirited, upbeat, enthusiastic and contented people. They are invariably fit, with blooming color in their cheeks and sunlit sparkle in their eyes. Whether men or women, young or old, urban, suburban or rural, their therapy is the same: the Happiness Garden.


A slightly different version of this essay appeared as an op/ed piece in The Philadelphia Inquirer on May 24, 2013.

The Price Is Lice: Guest Blog by Nick Rhodehamel

Note: Every once in a while we here at the old bloggie limp, or shuffle, over to the stove and brew up a nasty, filthy, strong pot of coffee. The ensuing, almost hallucinatory, stimulation allows us to publish “monster” blog posts. This is one.

Oh, dear readers, you may recall from your early childhood the bowl-haircut boys—usually a product of parents’ extreme frugality—but you may remember as well the rare boy who would disappear between class breaks and return a few days later with a bald head and a hang-dog look on his face. Worried exchanges would take place well out of his earshot—so lovingly thoughtful were we children back then—and the “L-word” was introduced to us. That was around the time of the JFK assassination, by my clock.

So, welcome to a “Back To The Future” moment. Here’s a future no one wants to back into, so to speak. However, Nick is on the beat. And I’m an amateur entomologist. I just love bugs. Happy reading!  

During late winter this year, head lice were epidemic at my children’s school. That’s not to say that the place was exactly lousy with lice. But for a few weeks, children with lice were found often enough to worry my wife and me.

Frequently, when one infested child was identified, others were found. Usually, we were able to infer who the children were by who was sent home. Often, the same two siblings, who were coincidentally in both of our children’s classrooms, were sent home.

The school immediately informed us of newly discovered (or rediscovered) infestations. And when they did, we made our children strip in the subfreezing garage, searched through their hair like frantic chimps, and envisioned the ballooning water bills when we would be driven to wash and rewash bedding. We felt fortunate that they remained louse free.

The word “louse”, or variations of it, is firmly rooted in our language. Figurative use as a pejorative dates from the late 14th century: “lousy weather” or “louse of a brother-in-law” (whatever) are examples. “Lousy with” (swarming with) is an American usage from the 1840s. But these are expressions we use without thinking, just as we use “lock, stock, and barrel”. Lice infestations today seem somewhat novel and unusual. They were once more visceral and immediate.

During most of human history, lice infestation was a ubiquitous experience. Lice have been found with Egyptian mummies. Or take, for instance, the Vikings, though nearly any group would do. The Vikings, as they were exploring and exploiting the world, carried among their accoutrements, in addition to arms and amulets, finely crafted combs with closely spaced teeth. During that period, most people wore their hair long, and combs were used as much for removing head lice as for making hair look beautiful.

The historical novel Lord Grizzly tells the story of the American mountain man Hugh Glass, who travels up the Missouri River in 1823 with a party of beaver trappers. Hugh is attacked by a she-grizzly with a cub. Though he succeeds in killing the bear with his knife, he gets pretty well torn up and suffers a displaced fracture of his leg. His companions find him, sew him up with sinew, dig his grave, and wait for him to die. But Hugh won’t die. It’s hostile Indian country, and after watching over him for several days, they abandon him, taking all food and weapons with them.

When Hugh finally comes around, he manages to realign the bone in his leg and to crawl nearly the entire 200 miles back to Ft. Kiowa, which his party had left 3 months earlier. During his crawl, Hugh comes across an ancient, dying Arikara woman abandoned by her tribe, whose members Hugh knows would kill him on sight in a heartbeat. Out of kindness, Hugh gives her water and cooks her food. He then buries her when she dies. No good deed goes unpunished; the dead woman gives Hugh an unwelcome gift—lice.

Unlike my wife and me, Hugh knows lice. He removes his buckskin clothes, puts them on an ant mound, and the ants feed on the lice. Soon the ants are crawling on Hugh too. When they begin to bite, he knows that he is clean. Hugh calls the lice “graybacks”, a term no longer in use but once common among Civil War soldiers, who were also tormented by lice. Not too long ago, nearly everyone would have gotten lice at one time or another.

Almost certainly my wife and I naively overreacted. The Centers for Disease Control and Prevention says that head lice infestation, or pediculosis, is a common occurrence. Firm data on head lice incidence in the United States are not available, but 6 to 12 million infestations are estimated to occur each year among children 3 to 11 years of age, who are more likely to contract lice than older children or adults. Some studies suggest that girls get head lice more often than boys, probably because of more frequent head-to-head contact. Head lice are insects that neither fly nor jump; transfer to another person is largely passive. Head lice feed on human blood; they are obligate human parasites, meaning they need humans (and humans only) to complete their life cycle. They can live no more than 36 hours apart from their host and prefer to lay their eggs (nits) on hair. There is no evidence that head lice transmit disease. Children diagnosed with head lice need not be sent home early from school and can return to class after appropriate treatment has begun. Successful treatment should kill all crawling lice, but nits may persist after treatment. No big deal.

But when I was a child, I don’t remember anyone getting head lice. None of my siblings ever had them, and others tell me the same. What happened? Are head lice more common today?

Apparently, they are. Reasons for this are not certain, but societal changes surely contributed. Forty or so years ago classroom behaviors began to change. Children no longer only worked alone at individual desks but spent more time in small groups and moved around to different work areas, increasing contact among children and coincidentally promoting infestations. And as the work force expanded and both parents (and single parents) went to work outside the home, more children attended day care and after school programs. This too increased potential contact with infested children and in turn infestation incidence.

People, of course, want head lice about as much as they want a sharp stick in the eye. As the incidence of head lice infestations increased in schools, the demand for effective louse treatments grew. And a lucrative industry was spawned. One current estimate is that various louse shampoos, sprays, and rinses bring in $150 million a year.

Many of these treatments are based on pyrethroid-type compounds (originally isolated from chrysanthemums). For a time, the treatments delivered eradication in one application, as promised, but no longer. Resistance in pest populations resulting from extensive use of single or similar compounds is an old story. And it’s a big problem too. Consider multiple drug-resistant tuberculosis or resistances in such common bacteria as Staphylococcus and Streptococcus (“flesh-eating bacteria”). So why not lice?

This is the age of the New Normal, meaning any suboptimal situation that we seem unable to correct: the jobless recovery, events such as the Boston Marathon bombings, and (I read) price cuts and stagnant sales in the fast food industry. I suggest we add head lice infestation to this assemblage.