Free Range

Common Food Additives May Be Making Us Fat

It could be more than just the calories in that creamy ice cream that’s packing on the pounds.

Science has given wary consumers another reason to avoid some of those tongue-twisting ingredients listed on the packaging of countless products in the average American grocery store. In a study published this week in Nature, researchers say they’ve found evidence that two commonly used emulsifiers in processed foods may be linked to the rise in obesity and to certain chronic digestive disorders.

The team of researchers, led by two scientists at Georgia State University, wanted to see what impact the synthetic emulsifiers polysorbate-80 and carboxymethylcellulose might be having on the trillions of bacteria that make up the gut microbiota and are essential for healthy digestion. Both emulsifiers are found in a slew of products—most notably ice cream and other frozen dairy desserts. But they can also crop up in everything from canned soup and salad dressing to frozen entrées and cream cheese (and even sunscreen and hemorrhoid cream, but we won’t think about that).


The scientists fed the emulsifiers to mice at doses comparable to what the average person might consume. They found that the gut bacteria of the mice given the emulsifiers were altered in a way that made the animals’ digestive tracts more prone to inflammation—which is linked to the onset of metabolic syndrome, a group of common obesity-related disorders that can lead to type 2 diabetes as well as heart and liver disease. In mice genetically predisposed to inflammatory bowel disease, the changes to their gut bacteria appeared to trigger that disorder.

That would seem significant, as public health experts have struggled to fully explain the alarming spike in obesity rates in America and other developed countries. Overeating and a relative lack of physical activity are widely cited as leading factors, but many experts argue that those issues alone are not enough to explain the obesity epidemic and the proliferation of related health problems.


“The dramatic increase in these diseases has occurred despite consistent human genetics, suggesting a pivotal role for an environmental factor,” Benoit Chassaing, one of the study’s lead researchers, said in a statement. “Food interacts intimately with the microbiota, so we considered what modern additions to the food supply might possibly make gut bacteria more pro-inflammatory.”

We may not give much thought to the estimated 100 trillion organisms that call us home, but when it comes down to it, we’re more bacteria than human. Bacteria outnumber our own cells 10 to one, and scientists are increasingly coming to understand that messing with all those tiny organisms may be causing a host of big problems. A blockbuster essay published in The New York Times last year by science writer Pagan Kennedy looked at research suggesting our overreliance on antibiotics may be linked to obesity, while in a study published last September, scientists found a link between artificial sweeteners, another popular food additive, and changes in the gut bacteria of mice.

  • Food
  • Go Nuts! Study Finds Feeding Babies Peanut Products May Prevent Allergies

    Keeping infants away from such foods could be partly to blame for the allergy epidemic.

    Whack! That’s the sound of whiplash for millions of pediatricians across the country as they dramatically rethink everything they’ve been telling anxious parents for years about peanut allergies. Total avoidance of peanuts, it seems, might not have been the best approach to take.

    New research published Monday in The New England Journal of Medicine finds that babies between four and 11 months old who were regularly fed food containing peanuts were significantly less likely to develop peanut allergies later on. What’s more, these infants were selected because they were considered to be at high risk for peanut allergy, though the study excluded babies whose tests showed they were already allergic to peanuts.


    In a separate editorial published in the Journal, two pediatricians not involved with the study said the research “clearly indicates that the early introduction of peanut dramatically decreases the risk of development of peanut allergy” and that the results “make it clear that we can do something now to reverse the increasing prevalence of peanut allergy.”

    In the past two decades, the number of American children who are allergic to peanuts has more than quadrupled, a phenomenon that has puzzled scientists and transformed the humble PB&J from a lunchtime staple to a lunchroom pariah at day care centers and schools.

    Ironically, all that peanut avoidance—at least when it comes to very young children—could itself be helping to drive the peanut allergy epidemic.

    Keeping babies away from peanuts (or more specifically, food containing peanuts, because no one in their right mind would try to feed a baby whole nuts) “could have been in part responsible for the rise in peanut allergies we have seen,” Dr. Gideon Lack, professor of pediatric allergy at King’s College London and leader of the current study, told The New York Times.

    Lack got the idea for his study in 2000. He was giving a talk in Israel and asked doctors in the audience how many of them had patients with peanut allergy and was surprised to see only a few hands go up. “In the U.K., if you had asked that question, every single member of the audience would have put up their hand,” he told the Times. Lack was further intrigued to learn that Israeli parents often feed their young children snacks such as Bamba, which is made from peanut butter and corn.

    To put his hypothesis to the test, Lack recruited 530 infants between the ages of four and 11 months who, because they already suffered from severe eczema or were allergic to eggs, were at a higher risk of developing peanut allergy. In a randomized trial, the parents of half the children were instructed to feed their babies at least six grams of peanut protein per week spread out over three or more meals, while the parents of the rest of the children were told to avoid peanuts.

    When the children turned five, they were given another allergy test. Remarkably, less than 2 percent of the children who regularly ate peanuts tested positive for peanut allergy, while almost 14 percent of children in the no-peanut group did.
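    For the arithmetically inclined, the gap is even starker in relative terms. Here’s a quick sketch using the article’s rounded figures (the trial’s exact numbers differ slightly):

    ```python
    # Risk arithmetic from the rounded figures above (approximations,
    # not the trial's exact numbers).
    peanut_group = 0.02     # ~2% of peanut eaters tested allergic at age 5
    avoidance_group = 0.14  # ~14% of peanut avoiders tested allergic

    absolute_reduction = avoidance_group - peanut_group        # 12 points
    relative_reduction = absolute_reduction / avoidance_group  # ~0.86

    print(f"Absolute risk reduction: {absolute_reduction:.0%}")  # 12%
    print(f"Relative risk reduction: {relative_reduction:.0%}")  # 86%
    ```

    In other words, by these rounded numbers, early exposure cut the risk of developing peanut allergy by roughly 86 percent.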

    The results are almost certain to cause leading pediatric organizations to reconsider their recommendations regarding kids and peanuts. As recently as eight years ago, the American Academy of Pediatrics was telling parents whose kids might be particularly susceptible to developing peanut allergy to withhold peanuts until the age of three. The academy changed its tune in 2008, but it did not go so far as to tell parents to start feeding their babies peanut-based foods. With this new study, pediatricians may now start recommending that parents feed peanut products to their young children.


    But for those parents with children older than 11 months who have dutifully been keeping their kids away from peanuts, don’t break out the Skippy just yet.

    “If you’re a parent sitting at home with your child looking at them saying, ‘Well, gee, they didn’t eat peanut yet. Maybe I should run to the cupboard and get some peanut butter for them,’ it could be a little dangerous because if you do that and the child has a bad allergic reaction, you would be at home and have a problem,” Scott Sicherer, who advises the American Academy of Pediatrics on allergies, tells NPR.

    Instead, Sicherer recommends that parents who think their child might be allergic get them tested and consult with their pediatrician on the best way forward.

    For a whole new generation of babies, though, the future looks nutty indeed.

  • Food
  • Sorry, Paleo Eaters: Your Diet Is Pretty Much Made Up

    Today’s ‘caveman’ menu doesn’t look anything like what humans were consuming at the time.

    Bread and pasta lovers tired of hearing from their carnivorous friends on the Paleo diet about the evils lurking in that plate of spaghetti can take heart: Here’s yet another academic rebuttal to the Paleo fad that will make you feel better about not trying to eat like a caveman.

    “Reconstructions of human evolution are prone to simple, overly tidy scenarios,” writes Ken Sayers, who studies primate and human evolution at Georgia State University. “Like much of our understanding of early hominoid behavior, the imagined diet of our ancestors has also been over-simplified.”


    Sayers’ opinion piece, which appeared over at The Conversation on Tuesday, is a follow-up to research he and Kent State University anthropologist C. Owen Lovejoy published in The Quarterly Review of Biology last December. The pair looked at the fossil record as well as chemical and archaeological evidence to try to determine what some of our earliest ancestors ate. They also incorporated a bit of “optimal foraging theory,” which uses mathematical models to predict how certain animals would find food in the wild based on various parameters.

    The crux of their findings? What our hirsute forebears ate 6 million to 1.6 million years ago was no doubt entirely different from what modern-day fans of the Paleo diet eat. The diet varied so much depending on place and circumstance that it’s almost impossible for anthropologists to generalize what early hominids, who adapted to living in a range of environments, ate on a regular basis.

    “Hominids didn’t spread first across Africa, and then the entire globe, by utilizing just one foraging strategy or sticking to a precise mix of carbohydrates, proteins and fats,” Sayers writes. “We did it by being ever so flexible, both socially and ecologically, and always searching for the greener grass (metaphorically), or riper fruit (literally).”

    True, our ancestors weren’t kicking back around the fire noshing on Doritos and Dr. Pepper. And while anthropologists have labored to point out that our modern ideas about what our distant forebears were eating are based more on myth than on science (see evolutionary biologist Marlene Zuk’s Paleofantasy), most nutritionists at least cautiously endorse the Paleo diet’s rejection of highly processed foods that have been stripped of vital nutrients, the rampant consumption of which has been linked to a host of chronic, diet-related ills.

    But as Sayers’ research suggests, anyone trying to approximate the diet of our primitive ancestors would likely have to start by replicating a sort of feast-and-famine cycle with respect to various food groups—because foraging in the wild means dealing with seasonal flux in the availability of different types of food. It would also mean dramatically expanding our notion of what’s edible.


    “[P]lants’ underground storage organs (such as tubers), sedges, fruits, invertebrate and vertebrate animals, leaves and bark were all on the menu for at least some early hominids,” Sayers writes, noting that while evidence shows that hominids 2.6 million years ago were eating antelope, the question of whether the animals were hunted or scavenged “is hotly debated.”

    That our earliest ancestors may have been subsisting on a diet of bugs and bark, or the prehistoric equivalent of roadkill, flies in the face of what is the oft-unstated, perhaps even subliminal, attraction of the Paleo diet, particularly among men: the image of the lean, fleet, proud early hunter, perfectly attuned to his natural environment, gorging on a feast of freshly killed beast—which, for today’s Paleos, apparently equates to eating plenty of bison steak.

    “[T]he idea that our more ancient ancestors were great hunters is likely off the mark, as bipedality—at least before the advance of sophisticated cognition and technology—is a mighty poor way to chase game,” Sayers writes. “The anthropologist Bruce Latimer has pointed out that the fastest human being on the planet can’t catch up to your average rabbit. Another reason to be opportunistic about food.”

    Instead of thinking of ourselves as somehow prehistorically aligned with the mighty hunters of the African plains—lions, say, or cheetahs—we might do better to look to the foraging habits of other animals for clues to how our ancestors ate. Sayers cites research that has found the back teeth of hominids were “bunodont,” that is, “low with rounded cusps,” suggesting our distant cousins were perhaps more omnivorous, like bears, which have similar teeth, as do pigs.

    But eating like a hog in the name of good health is a lot less attractive than gorging on bacon.

  • Food
  • Your Food Waste Could Be Turned Into the Strongest Material on Earth

    Scientists in the U.K. are working to transform garbage into the thinnest, toughest substance ever discovered.

    Here’s one for the annals of gee-whiz futuristic-fabulous technology: Researchers in the U.K. are working on a process to make graphene, arguably the most heralded new substance discovered in a generation, from an unlikely source—your expired leftovers.

    Just a single atom thick, graphene is the thinnest material ever discovered. Counterintuitively (at least to the non-chemists among us), it is also among the strongest—about 300 times stronger than steel. As if that weren’t enough, it’s also the best conductor of electricity on Earth.


    Like graphite (remember pencils?), from which it was first derived, graphene is made up of carbon. Because carbon is among the most abundant elements on the planet—as in the building block of all life-forms—graphene can theoretically be obtained from a slew of organic sources. Like, say, your old coffee grounds and that container of leftover pad thai you didn't get around to finishing.

    Scientists working with PlasCarb, a project based at the Centre for Process Innovation in the U.K., are seeing if they can’t transform a fraction of the 1.3 billion tons of food we waste every year around the world, according to the United Nations, into the next-gen wonder material.

    How are they doing it?

    Well, it’s complicated, at least to anyone (like me) who last encountered the periodic table in high school. As The Guardian reports, the process first converts the food waste to biogas using anaerobic digestion. Then that biogas, which consists primarily of methane and carbon dioxide, is transformed into graphitic carbon and hydrogen using “an innovative low-energy plasma reactor.” It’s from the graphitic carbon that the graphene is derived.

    The project is in the second year of a three-year run. Researchers are preparing to embark on a trial project that will transform 150 tons of food waste into 25,000 cubic meters of biogas, and they still need to figure out whether the entire process can be commercially scaled and made economically viable.
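    Those trial figures imply a rough conversion rate, which is easy to check. A minimal sketch using only the numbers reported above (real-world yields would vary with the waste stream):

    ```python
    # Biogas yield implied by the PlasCarb trial figures cited above.
    food_waste_tons = 150  # food waste fed into anaerobic digestion
    biogas_m3 = 25_000     # projected biogas output, in cubic meters

    yield_per_ton = biogas_m3 / food_waste_tons
    print(f"~{yield_per_ton:.0f} cubic meters of biogas per ton")  # ~167
    ```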

    That’s probably OK, because scientists around the world are still trying to figure out what to do with graphene. Since it was first obtained in a lab at the University of Manchester a little more than a decade ago (an achievement that would win scientists there the Nobel Prize), graphene has captivated research engineers around the world. As John Colapinto detailed recently in The New Yorker, some 3,018 graphene-related patents were filed in 2011. By the dawn of 2013, that number had jumped to more than 8,400.


    Ultralight aircraft…new biotechnologies to help victims of paralysis…ultrafast superconductors. The possibilities for graphene to remake our world of stuff seem endless. Bill Gates has even invested in the development of a superlight, superstrong graphene-based “super condom.”

    But getting those innovations through R & D and to market has proved a challenge for a host of reasons (see Colapinto’s look at why Silicon Valley isn’t likely to become Graphene Valley anytime soon).

    Nevertheless, graphene is the sort of OMG substance that someone, sometime soon, is going to find amazing uses for—and if we could use our trashed food to make it, so much the better.

  • Food
  • Innovation & Technology
  • The Best and Worst Cities to Work in a Restaurant

    New analysis finds glaring discrepancies in how tipped restaurant workers are treated across the country.

    Would you rather be a server in New York or in San Diego? Before you answer, you’d do well to consult a recent analysis that shows just how unfairly our nation’s outdated and byzantine minimum-wage system treats the workers who take care of the rest of us when we dine out. When it comes to service industry jobs, the difference between a living wage and barely getting by is, much like real estate, all about location.

    PayScale, which collects data on salaries and wages across the country, crunched the numbers regarding what servers bring home in 15 cities with a bustling restaurant industry. Taking both tips and the local minimum wage into account, PayScale came up with an hourly income figure for each market, and the range is pretty staggering.


    While tipped servers pull in a relatively decent $21.50 per hour in San Francisco, their peers in Houston have to make do with a measly $11.60. Despite its reputation as a fine-dining capital of the world (and one of the most outrageously expensive cities to live in), New York pays its servers a middling $15.30 per hour, while wait staff in San Diego take home $17.50.

    The discrepancies are more glaring when you consider the degree to which servers in different cities must rely on our demonstrably ridiculous system of tipping to make ends meet. One bad night of tips for a server in, say, Houston, makes a lot more difference than for one in Seattle.

    That’s because in Seattle, only 43.5 percent of a server’s hourly wage comes from tips, while in Houston, tips account for more than 75 percent. Servers in Miami, Philadelphia, Kansas City, and San Antonio also depend on tips for more than 70 percent of their hourly wage.
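    The underlying arithmetic is simple enough to sketch. Note that treating the $2.13 federal tipped minimum (discussed below) as a Houston server’s base wage is an illustrative assumption, not a figure from PayScale:

    ```python
    # Share of a server's hourly income that comes from tips.
    def tip_share(hourly_total: float, base_wage: float) -> float:
        """Fraction of total hourly income attributable to tips."""
        return (hourly_total - base_wage) / hourly_total

    # Houston: $11.60/hour total (PayScale). The $2.13 base wage is an
    # assumption -- the federal tipped minimum, which Texas allows.
    print(f"Houston: {tip_share(11.60, 2.13):.0%} from tips")  # ~82%
    ```

    That back-of-the-envelope figure lands comfortably above the 75 percent threshold PayScale reports.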

    It’s no surprise, really, that wait staff in Seattle, Minneapolis, and various cities in California are the least dependent on tips to make ends meet. They’re in states that are among the relative few—just seven in all—that have dispensed with the pitifully inadequate double standard that is the “tipped minimum wage” and require restaurant owners to pay their employees the same minimum wage as other workers.

    Currently, the minimum that employees who receive at least $30 per month in tips can be paid under federal law is a measly $2.13 per hour—where it’s been stuck for almost 20 years. Back in 1996, when Congress raised the federal minimum wage, it royally screwed over tipped workers by freezing the tipped minimum wage at a flat dollar amount rather than keeping it pegged to a percentage of the full minimum wage.

    While a number of states have raised the tipped minimum wage or abolished it altogether, 17 states still allow restaurants to pay their tipped staff the paltry $2.13 an hour.

    That would be laughable were it not of such dire consequence for a growing number of Americans. The poverty rate for tipped workers is 12.8 percent, almost twice as high as for the rest of the population, according to a report last year from the Economic Policy Institute. Meanwhile, the full-service restaurant sector has grown more than 85 percent since the early 1990s, compared with just 24 percent growth in the overall private sector—meaning more people than ever are relying on tips to make a living.

    Recently, New York had the opportunity to become the eighth state to dispense with the tipped minimum wage, but the state wage board caved to industry pressure, recommending instead that New York’s tipped minimum wage be increased from $5 to $7.50 per hour. “The continuation of subminimum wages for tipped workers is a gift to an industry that has been kowtowed to for too long,” the editorial board of The New York Times howled. “It smacks of legalized wage theft, and it is unworthy of a state that regards itself as progressive.”


    No matter. The restaurant industry still complained, calling the wage hike “outrageous and unprecedented,” in the words of Melissa Fleischut, president of the New York State Restaurant Association, according to The Associated Press.

    Fleischut raised the specter of job losses and cut hours for tipped workers, which has long been the bedrock of the hospitality industry’s argument for maintaining the status quo.

    But as the report from the Economic Policy Institute found, employment in the leisure and hospitality industry from 1995 to 2014 actually grew faster in states where tipped workers were paid an equal minimum wage—43.2 percent—versus the 39.2 percent growth in states that have a lower minimum wage for tipped workers.

  • Food
  • Health by Stealth: McDonald's Secretly Cuts Calories but Not at Home

    If fast-food mainstays can be made healthier abroad, why aren't the same changes made in the U.S.?

    Is the secret to making fast food healthier to keep healthier fast food a secret? Halfway around the world, it appears McDonald’s is trying to find out.

    At franchises across India, the chain has reduced the amount of sodium in its sauces and buns by 10 percent and in its fries by 20 percent, according to Reuters. During the last six months, it’s cut the calories in its sauces by 30 to 40 percent. Rather than make a big to-do about its new better-for-you fare, McDonald’s has instead opted for a quieter approach—so quiet that the company has pretty much kept mum about the changes.


    Instead, the strategy here seems to be to sneak the healthier changes past the taste buds of consumers—and it may very well be working.

    As Reuters reports, “Loyalists interviewed in Delhi, Kolkata and Mumbai said they did not detect any difference in taste.”

    “I order from McD’s at least twice a month and think it tastes pretty much the same,” Rahul Dutta, a marketing exec based in New Delhi, tells the news service.

    The rise of fast-food chains in countries such as India—along with the general creep of diet trends related to the West’s penchant for sugar-, salt-, and fat-laden processed foods—has coincided with a rise in diet-related ailments such as obesity and diabetes. It would seem the world’s largest fast-food chain may be trying to head off more criticism that it's contributing to such ills.


    Yet what about in the U.S.? It’s not like we don't have an obesity epidemic of our own, and as Reuters points out, the burger giant is facing increasing competition at home from chains such as Panera and Chipotle, which are perceived by health-conscious consumers to offer healthier food.

    Could a similar health-by-stealth strategy work for McDonald's here? Don't bet on it.

    Yes, the restaurant industry has long been bracing for federal regulations that will require large chains to prominently post calorie counts on their menu boards. Last fall the Food and Drug Administration announced its final rules, which apply not only to restaurant chains but to a host of other food outlets, including vending machines, movie theaters, and grocery stores that sell prepared food. These rules are set to take effect later in the year, but McDonald’s has been posting calorie counts since 2012.

    It would seem the push to make calorie counts more public, coupled with Americans’ stated demand for healthier options, has led restaurant chains to focus on new menu offerings—from McDonald’s wraps to Pizza Hut’s Skinny Slice—instead of making standbys healthier. A study conducted at the Johns Hopkins Bloomberg School of Public Health and published last October found that new menu items debuting in 2013 at major chains, including McDonald’s, contained an average of 60 fewer calories—a drop of 12 percent—than those introduced the year before.
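    Those two figures also let you back out the baseline, a handy sanity check:

    ```python
    # If a 60-calorie drop equals a 12 percent drop, the implied average
    # for new menu items the year before is simple division.
    calorie_drop = 60
    fraction_dropped = 0.12

    baseline = calorie_drop / fraction_dropped
    print(f"Implied 2012 average: {baseline:.0f} calories")                # 500
    print(f"Implied 2013 average: {baseline - calorie_drop:.0f} calories") # 440
    ```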


    But that's for new food. Big restaurant chains are notoriously wary of tinkering with their most popular, tried-and-true menu items, and McDonald’s, which recently sent its CEO packing after suffering a string of lackluster quarterly reports, would appear to be in no position to mess with its stable of fast-food icons like the Big Mac and the Quarter Pounder.

    It’s ironic, in a way: By quietly reducing the amount of sodium and calories at its American restaurants, McDonald’s could arguably have a far greater impact on public health at large, but that’s what makes such a move unlikely. After all, McDonald’s is a relative newcomer in India—the first outpost there opened less than 20 years ago. Indians, by and large, did not grow up eating McDonald’s; there are fewer than 400 McDonald’s stores in all of India, compared with more than 14,000 in the U.S. Thanks to deep-seated cultural and religious norms, the chain doesn't serve beef on the subcontinent, making the menu altogether different. Thus, Indians’ expectations of how a McDonald’s sandwich “should” taste are more malleable. 

    Meanwhile, back in the U.S., whatever “lovin’ it” we may be doing at McDonald’s may very well have stagnated into more of a love-to-hate relationship.

  • Food
  • Buying Organic to Avoid Pesticides? Science Confirms You Have the Right Idea

    Researchers are trying to understand how all the chemicals in our food might be harming us in the long run.

    Eat organic produce, and you end up exposed to less pesticide.

    That would seem like a no-brainer: Certified organic produce, after all, is by definition grown without the use of synthetic pesticides. Nevertheless, new research published Thursday came to that common-sense conclusion through scientific rigor, adding to the growing body of evidence that should make you feel better about shelling out more for organic apples. Perhaps more important, it contributes to the larger effort by scientists to come up with ways to evaluate people’s long-term exposure to certain pesticides and better understand what that exposure may be doing to our health. The researchers also found some concerning evidence that, contrary to both popular belief and federal regulatory standards, organic produce might not be altogether free of pesticides.


    For the study, which appears in the journal Environmental Health Perspectives, a team of researchers led by scientists at the University of Washington reviewed data collected from more than 4,400 participants as part of another long-term health study—you know, one of those where a bunch of information is collected from a range of people who are then followed for a certain period of time.

    In this case, all the participants were over 45 and came from racially diverse backgrounds. The researchers first analyzed the participants’ responses to a dietary questionnaire, focusing specifically on answers related to how much fruit and vegetables they consumed and how often that produce was organic. Then they tested participants’ urine for exposure to organophosphate pesticides, one of the most widely used classes of pesticides in U.S. agriculture.

    Not surprisingly, when the researchers compared people who reported eating organic produce most often with those who said they almost never did, they found significantly lower amounts of lingering pesticides in the former group. Interestingly enough, this only held true when the amount of produce consumed was roughly equivalent. In other words, people who ate less produce overall, even if it wasn’t organic, showed less pesticide exposure than people who reported eating more organic produce.

    “On its face, this finding is counterintuitive and perhaps even concerning, as it might suggest that organic produce is not actually free of [organophosphate] pesticides,” the authors write. “However, we hypothesize that this reflects the difference in total produce consumption among these groups. This study did not include a group of individuals who exclusively ate organic produce, and it is difficult to know exactly how much of a participant’s diet is organic when they report that organic produce is ‘often’ eaten.”
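    The authors’ hypothesis is easier to see with a toy model. The residue figures below are invented purely for illustration; the only point is the shape of the math:

    ```python
    # Toy model of the confound: urinary metabolite levels scale with total
    # servings eaten times the residue per serving. All numbers are invented.
    RESIDUE = {"conventional": 1.0, "organic": 0.2}  # assumed relative residue

    def exposure(servings: float, organic_fraction: float) -> float:
        per_serving = (organic_fraction * RESIDUE["organic"]
                       + (1 - organic_fraction) * RESIDUE["conventional"])
        return servings * per_serving

    # A light eater of conventional produce can show LESS exposure than a
    # heavy eater whose produce is mostly organic...
    print(exposure(servings=2, organic_fraction=0.0))  # 2.0
    print(exposure(servings=8, organic_fraction=0.8))  # 2.88

    # ...but at equal consumption, organic always comes out lower.
    print(exposure(5, 0.8), "<", exposure(5, 0.0))     # 1.8 < 5.0
    ```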

    Thus, while vegetable haters may be inclined to one takeaway here (Don’t want to be exposed to pesticides? Don’t eat produce), the healthier line appears to be that if you’re worried about long-term pesticide exposure, buying organic is the way to go.

    How worried should you be about the organophosphates that might be lurking in your crisper? Honestly, no one knows for sure.

    The Environmental Protection Agency was concerned enough about acute exposure to organophosphates to ban their household use back in 2000. But this class of pesticides, derived from nerve gas produced as a chemical weapon during World War II, continues to be sprayed extensively on crops. It’s yet another blind spot in our federal regulatory system that the danger such chemicals pose to human health has generally been determined based on adult exposure to significant quantities over a relatively short period of time. The EPA’s view that long-term, low-level exposure is “safe” is based more on faith than on science.


    It’s exceptionally difficult from a scientific standpoint to calculate the risks of such exposure, and as it turns out, that’s the primary aim of the current study—not to give you a warm glow when you opt for organic salad greens. Public health researchers are looking for a scientifically valid way to evaluate a person’s long-term exposure level, which may then be used in other research trying to determine the health risks that might be associated with such exposure.

    In the meantime, organophosphates have been linked to certain cancers and have been shown to disrupt the endocrine system, according to the Pesticide Action Network. The chemicals have also been linked to neurodevelopment issues in children, including autism. In a paper published in 2012 by the National Institutes of Health, David Bellinger, a professor of neurology at Harvard Medical School, made the rather jaw-dropping (and media-savvy) calculation that America’s collective IQ has dropped by almost 17 million points owing to exposure to organophosphates. The decline is part of what he and other crusading public health advocates call a “silent pandemic” of toxins that are wreaking havoc on the neurological development of unborn children.

    While others have charged Bellinger, et al., with scaremongering, the fact remains: If long-term, low-level exposure to such pesticides has yet to be definitively proved to cause harm, neither has it exactly been proved to be safe.

  • Food
  • This Bite-Size Documentary Reveals the Enormity of America's Food-Waste Problem

    See how one community is trying to make sense out of the insane system that supplies what we eat.

    Even for those relatively well versed in the issues surrounding our culture’s dismal epidemic of food waste, it's a shocking scene: A dump truck unceremoniously tips an entire load of what appears to be perfectly edible produce into a landfill.

    The moment occurs just a few minutes into Man in the Maze, one of five films to nab a top nod (and $10,000) at this year’s Sundance Institute Short Film Challenge. The documentary is having its world premiere, so to speak, this week on the website of Tucson's Arizona Daily Star.

    Which is appropriate, as the bite-size documentary focuses on the efforts of locals in nearby Santa Cruz County to make sense of a food system that is anything but sensible.


    Our guide here is Gary Paul Nabhan, a sort of Renaissance man vis-à-vis the local food movement: writer, activist, academic, farmer, “wild foods forager and pollinator habitat restorationist.” Nabhan made a name for himself back in the 1980s when he cofounded Native Seeds/SEARCH, a nonprofit dedicated to preserving the rich agricultural legacy of the indigenous peoples of the Southwest.

    More recently, he's turned his attention to food waste, a tangible issue in the borderlands. As he informs us, 25 to 30 percent of the produce Americans consume comes up from Mexico through the border towns, trucked by thousands of semis along a “big food superhighway.” Yet an unconscionable amount ends up in the dump.

    “If Florida tomato prices drop on a certain day,” Nabhan offers by way of example, “120,000 pounds might be thrown in a landfill just because of pricing.”

    Such waste is galling when you consider that economically disadvantaged Santa Cruz County suffers from some of the highest rates of child food insecurity in the state.

    The wanton waste would be thoroughly depressing were it not for the rest of the eight-minute film, which touches on the inspirational ways the community is coming together to create a more just and equitable food system. Borderlands Food Bank, for example, rescues between 30 million and 40 million pounds of produce each season, saving it from landfills and redistributing it to area residents who have limited access to affordable fresh produce.


    The fascinating work of Native Seeds/SEARCH to preserve the seed stock developed by indigenous groups in the Southwest and adapted for the region’s arid climate gets due attention as well, as does the group’s equally important efforts to reestablish locals’ connection with the land. Worth noting is that despite the Southwest’s reputation as suitable for growing nothing but cacti and sagebrush, the chronicles of early European explorers record native communities farming a dizzying array of drought-tolerant crops—a precedent that we might do well to learn from as we confront climate change.

    Short as it is, Man in the Maze provides plenty of food for thought, including Nabhan’s parting words: “Food is a sacrament; food is what binds us together. It behooves all of us, whether it's for health reasons or because we care about the land or because our faith requires us to care about the people most marginalized by our broken food system, to heal that food system. That’s the only way we’re going to heal our economies, our bodies, and our land.”

  • Food
  • Another Reason to Loathe Factory Farms: Massive Air Pollution

    Two new lawsuits charge the EPA with looking the other way as Big Ag unleashes toxic gases.

    You probably already know that the giant factory farms on which the majority of American livestock are condemned to live out their miserable lives aren’t exactly paradise for the animals—and you’re also likely aware that the enormous amount of waste these industrial-scale operations generate can contaminate local water supplies. But the fact that factory farms emit so much pollution that the air in some rural communities is more dangerous than the air in America’s most polluted cities? That may come as news. And the Environmental Protection Agency has done virtually nothing to stop it.

    Now the EPA is being forced to answer in court for its inaction. Two lawsuits filed Thursday by a coalition of groups—including the Environmental Integrity Project and the Humane Society of the United States, as well as a number of grassroots organizations—contend that the agency has shirked its duty by ignoring petitions filed as long as six years ago that called on it to regulate air pollution from factory farms.


    “Animal factories subject millions of animals and farm workers to highly toxic levels of air pollution on the farm, and also release huge amounts of these toxins into the environment,” Jonathan Lovvorn, a lawyer with the Humane Society, said in a statement. “EPA’s failure to address these impacts should be alarming to anyone that cares about animal welfare, worker safety, human health, environmental protection or the preservation of rural communities.”  

    Lovvorn’s choice of the words “animal factories” rather than “factory farms” is telling. Even as these operations have come to dominate the American farmscape, with an estimated 20,000 now housing billions of chickens, hogs, and other animals, the public—or Washington, at least—has chosen to focus on the “farm” half of the oxymoronic nomenclature. By holding onto an association with the red barn and the open pasture, the term integrates these outsize operations seamlessly into our collective romanticized notion of bucolic rural America. But for those Americans who have to live next door to them, the emphasis decidedly shifts to the “factory” part of the equation. “When the emissions are at their worst, we have had to leave our home for days at a time,” says Rosie Partridge, whose family farm in Iowa is surrounded by more than 30,000 hogs within four miles. “The ammonia and hydrogen sulfide are so strong that my husband has trouble breathing.”

    Indeed, a 2011 analysis by the Environmental Integrity Project of data collected during an EPA study (one funded by the livestock industry, no less) found a majority of factory farms surveyed releasing more than 100 pounds of ammonia on an average day, which in any other industry would trigger pollution reporting requirements. Some factory farms “emitted thousands of pounds on their worst days.”

    That’s not all. Fine particle pollution, which can cause respiratory and heart disease, also exceeded federal limits at a number of sites, while levels of hydrogen sulfide released by some large hog and dairy operations were comparable to those emitted by oil refineries. Factory farms have also been shown to be major sources of methane and nitrous oxide, both of which are potent greenhouse gases. Just last week, a new study showed that both antibiotics and drug-resistant bacteria are also blowing in the foul wind.


    “EPA has acknowledged the harmful impacts of factory farm air pollution for over a decade yet is still failing to act on the problem,” Tarah Heinzen, an attorney for the Environmental Integrity Project, said in a statement.

    The foot-dragging can no doubt be traced to the powerful agriculture lobby in Washington, which in 2008 managed to convince the Bush administration to exempt factory farms from most pollution reporting requirements, even as the amount of waste generated by these operations has swelled to more than 300 million tons a year—three times the amount of waste produced by people.

    “Factory farm air pollution harms public health, the environment and rural quality of life,” Heinzen says. “Yet EPA is looking the other way while citizen pleas for action collect dust on the agency’s shelf.”

    With the courts now involved, the plaintiffs are hoping those pleas will finally be answered.

  • Food
  • Don’t Call It Stinky Stout—Brewing Beer From Sewage Water Is an Idea Worth Toasting

    Sure, this idea from a wastewater treatment company may sound gross, but the plan really makes a lot of sense.

    When news of a bid by a Portland-area wastewater treatment company to turn recycled sewage into beer made headlines this week, it didn’t take long for at least one commenter to offer some cheeky suggestions for labels: Naturally Yellow, Distinktive Brew, Organic Beer, Second Time Around, Pissa Beer... But what’s surprising here isn’t that anyone would want to do such a thing—it’s how behind-the-curve hip, earthy, compost-happy Portland seems to be when it comes to turning toilet water into something well worth drinking.

    While this marks the first time the state has considered allowing residents to drink treated wastewater, utilities and regulators elsewhere in the country are well ahead of Oregon.


    Although it at first may seem inevitable that toilet-beer would emerge from a city whose unofficial slogan is “Keep Portland Weird”—at the very least it sounds like a sendup straight out of Portlandia—perhaps the notion of drinking something associated with human waste is causing a stir in 2015 because of the Pacific Northwest’s rain-soaked reputation. If we were talking about turning sewage into, say, sunlight, the city may very well have emerged on the cutting edge.

    In the Southwest, residents would gladly take some of Portland’s wet, gray weather—anything for a bit of rain. As it is, nearly a third of the country (including a significant part of southern Oregon) is enduring a prolonged period of moderate to extreme drought, a situation that experts say may only be exacerbated by climate change. Thus, parch-prone cities in the U.S. and around the world are focusing their attention on taking water that was once flushed and forgotten and turning it into ultra-pure drinking water.

    Clean Water Services, located in Hillsboro, Oregon, just west of Portland, might get permission to supply treated wastewater to a group of local home brewers so that they, in turn, can produce small batches of novelty beer for special events. But it depends on the company’s ability to jump through a lot of regulatory hoops.

    Thus far, as Oregon Public Broadcasting reports, Clean Water Services has managed to get the green light from the state health authority. Now it must secure approval from the Oregon Department of Environmental Quality, which is holding a public hearing on the proposal next month. Even if that all goes well, the company “will still need additional state approvals for an amended Recycled Water Reuse Plan before the brews are cleared for drinking,” according to OPB.

    In bone-dry California and Texas, those debates over health and regulatory considerations are over—toilet-to-tap is increasingly a thing. In November, San Diego’s city council voted unanimously to advance a $2.5 billion plan to recycle wastewater, with an eye toward supplying about a third of the city’s water needs by 2035. The vote signaled a stunning turnaround in public opinion: A decade ago, only one in four San Diegans favored turning wastewater into drinking water, according to Fox News. By 2012, three in four did.

    Farther north, in Orange County (hardly a bastion of progressivism), the water utility has been transforming sewage into tap water since 2008. This year, it’s on track to expand its recycling operation from 70 million gallons per day to 100 million—enough to quench the thirst of about a third of the county’s population. Municipalities ranging from El Paso, Texas, to Fairfax County, Virginia, have also launched wastewater recycling programs.
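    A rough per-person check suggests those figures hang together. The county population used here is an outside assumption, not a number from the reporting:

    ```python
    # Sanity check on the Orange County numbers. The ~3.1 million population
    # is an assumed mid-2010s estimate, not a figure from the article.
    gallons_per_day = 100_000_000
    county_population = 3_100_000
    people_served = county_population / 3  # "about a third of the county"

    print(f"~{gallons_per_day / people_served:.0f} gallons per person per day")
    # ~97 -- in the ballpark of typical U.S. per-capita water use.
    ```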

    “It’s a watershed moment right now. We’re seeing widespread acceptance of these technologies,” Mike Markus, general manager of the Orange County Water District, told CNN last year. “As the shortages become more extreme and water supplies are cut, it has raised awareness that we need to find alternative resources.”


    So why not wastewater beer too? The process described by Clean Water Services in Oregon appears to be more or less the same one used in Orange County and elsewhere: a three-stage system whereby sewage goes through “ultra-filtration” followed by reverse osmosis and then exposure to UV light and oxidation, which kills off any remaining bacteria. The result, advocates say, is drinking water that’s cleaner than what most people get out of their kitchen faucets. That the treated water is often released back into the groundwater supply only to be collected and treated again is largely political (and unnecessary)—the trip down into the aquifer and back up again is essentially to mollify a squeamish public that fails to understand the basics of hydrology.

    “It’s the same water now as when dinosaurs walked the earth,” Melissa Meeker, executive director of the advocacy group WateReuse, told CNN. “It’s about understanding the water cycle and how we fit into it. Once people think about it, they become more open-minded.”

    To wit: rivers. If you live anywhere that gets any portion of its water supply from a river, you’re likely already drinking “recycled wastewater,” as the treated sewage from towns upstream gets cycled through your own municipal water system.

    Thus, in the running competition to come up with a name for Portland’s newfangled brew, here’s a suggestion: Just call it beer.   

  • Food