Free Range

World Leaders May Do Something About the Worsening Superbug Problem

New research shows that MRSA can be transmitted from infected poultry to consumers.

Are governments around the world getting serious about tackling antibiotic abuse in the livestock industry, or are we going to allow countless people to die in the name of cheap meat?

It’s a question that only sounds dramatic if you haven’t been following the terrifying rise of antibiotic-resistant superbugs that’s been tied to the rampant overuse of antibiotics on factory farms.

As if on cue, just as world leaders were convening at the United Nations General Assembly this week to discuss the issue, researchers announced they had discovered a new superbug in Denmark linked to poultry. It’s a strain of MRSA, an antibiotic-resistant type of staph bacteria more commonly associated with hospitals.

While MRSA infections have been on the rise, food-borne transmission has been, and remains, rare. Yet the researchers note it’s alarming that the new strain appears to have been transmitted from infected birds not to poultry workers but to consumers, through either the handling or the consumption of infected meat.

As one of the study’s authors said in a statement, “Our findings implicate poultry meat as a source for these infections. At present, meat products represent only a minor transmission route for MRSA to humans, but our findings nevertheless underscore the importance of reducing the use of antibiotics in food-producing animals as well as continuing surveillance of the animal-food-human interface.”

It’s a warning that scientists and leading medical organizations around the world have been sounding for years—and until now, it seems it has largely fallen on deaf ears. In the U.S., for example, the overwhelming majority of antibiotics—70 percent or more—continue to be given to livestock. Not to treat animals that are sick, mind you, but to prevent illness, often on overcrowded factory farms.


But at a U.N. summit on Wednesday, more than 190 member nations signed a landmark declaration promising to do something about the burgeoning crisis of antibiotic resistance. It’s only the fourth time in the U.N.’s 70-year history that the General Assembly has taken up a health-related issue, following summits on HIV, Ebola, and noncommunicable diseases such as obesity.

“I think the declaration will have very strong implications,” Dr. Keiji Fukuda of the World Health Organization told NPR. “What it will convey is that there’s recognition that we have a big problem and there’s a commitment to do something about it.”

NPR reported that a similar resolution concerning HIV that the U.N. passed in 2001 is generally credited with spurring action to combat the pandemic, bringing attention and a wave of money devoted to treatment and prevention. Countries targeted with such aid have seen AIDS-related deaths drop by 45 percent since 2004.

There’s reason to be skeptical about the U.N. declaration on antibiotic resistance. For starters, it’s nonbinding, and it doesn’t set any firm targets for countries to reduce the use of antibiotics.

That sort of toothless posturing should sound familiar. After all, an “action plan” unveiled by the White House last year to combat antibiotic resistance in the U.S. set targets for cutting the amount of antibiotics prescribed by doctors to people—but it failed to do the same for the agriculture industry, which continues to dose chickens, cows, and pigs with the same drugs. The Food and Drug Administration has called on the industry to stop feeding animals antibiotics on a regular basis to promote growth, but the agency continues to allow the drugs to be given as a “preventative measure”—more or less allowing factory farms to keep doing what they’ve long done. Only one state, California, has passed a law restricting antibiotic use on farms.

In short, when it comes to protecting the efficacy of some of the most important lifesaving drugs known to humankind, the U.S. is forced to rely on an ad hoc collection of environmental and public health groups to do what regulators won’t. Naming and shaming has helped push America’s biggest restaurant chains into eliminating antibiotics from their supply chains. The hope is that by convincing huge buyers like McDonald’s and Subway to go antibiotic-free, the entire livestock industry will be forced to change.

How’s that going? In a report issued earlier this week, just two chains—Panera and Chipotle—got an A for taking meaningful action to reduce antibiotics in the poultry, beef, and pork they sell. The majority of restaurant chains got an F.

  • Food
  • Fast Food Isn’t Doing Enough When It Comes to Antibiotics

    A new report lauds the promises chains have made to remove drugs from their meat supply chains, but more action is needed.

    Considering the swirl of headlines these past couple years touting this or that restaurant chain’s vow to go antibiotic-free when it comes to meat, you could be forgiven for thinking that the industry has come a long way in doing its part to tackle one of the most potentially devastating health crises of our time.

    Simply put, it hasn’t.

    That’s the takeaway from a report released Tuesday by a coalition of nonprofit environmental and public health advocacy groups—including the Natural Resources Defense Council and Friends of the Earth—that follows up on a similar report from last year.

    Or, at least, that’s my glass-is-more-than-half-empty takeaway. The coalition no doubt deserves tremendous credit for attempting to do something—anything—about the escalating crisis of disease-causing antibiotic-resistant bacteria in America while the laggards at the Food and Drug Administration more or less do nothing. Still, the group puts a more positive spin on things. Compared with last year, the report points out, twice as many of the nation’s top 25 restaurant chains received a passing grade for working to eliminate or reduce the use of antibiotics in the meat they sell. So progress has been made, but what does that really mean?

    For starters, that means only nine scored the equivalent of a D or higher. A paltry four scored either an A or a B. Meanwhile, 16 chains—among them big names like Starbucks, Burger King, Olive Garden, Domino’s, and Jack in the Box—were given an F, which means they have taken “no action to reduce the use of antibiotics in their supply chains,” according to the current report.


    You’ve probably read the dire warnings about antibiotic-resistant infections from the experts at the Centers for Disease Control and Prevention, the World Health Organization, the American Medical Association, or really, pretty much anywhere else that matters when it comes to public health. If, like me, you want to avoid the kind of antibiotic-riddled factory-farmed meat that’s fueling the growing epidemic of drug-resistant superbugs, there are but two big chains whose food you can eat in good conscience: Panera Bread and Chipotle.

    No surprise there. Those were the only chains to receive an A grade on the coalition’s report last year.

    Beyond the A-graded pair, the seven other chains that received a passing grade represent a range of effort that is liable to befuddle the average consumer. Both Subway and Chick-fil-A are given respectable B’s, but because the coalition’s criteria favor robust, public-facing commitments to reduce the use of antibiotics in meat rather than the implementation of those commitments, you’re not likely to find much antibiotic-free meat at either chain today. While Subway gets kudos for promising to end the use of antibiotics across its entire meat supply—it’s the largest chain by far to make such a sweeping commitment—the company won’t do so until 2025, nine long years away. Only in the area of chicken has Subway made much progress. Even so, you’ve only got a two-in-three chance of being served antibiotic-free chicken at Subway, and even less of one at Chick-fil-A, where it’s just one in four.

    Meanwhile, at McDonald’s, which scored a C+, you have a 100 percent chance of noshing on McNuggets made with chicken raised without antibiotics that are important to human medicine. But the country’s biggest burger chain remains frustratingly vague on its commitment to switching entirely to antibiotic-free beef and pork, as do any number of other chains that scored a passing grade.

    This is where it all starts to feel ridiculous. I shouldn’t even be writing this column because in all honesty, we shouldn’t be relying on an ad hoc group of nonprofit organizations to defend the efficacy of one of the most important classes of medicine known to humankind. Why? Because, to be blunt, the Food and Drug Administration should be doing its bleeping job.

    Despite ever more scary developments worthy of a Hollywood medi-scare drama—such as the discovery in the U.S. last spring of a gene that can easily confer resistance in bacteria to one of our last remaining antibiotics of last resort—the FDA continues to allow the livestock industry to ply animals with copious amounts of antibiotics, including many of the same drugs we rely on to fight infections in people. More than two-thirds of all the antibiotics used in the United States are used in animals—not to treat animals that are sick, mind you, but simply to prevent disease in what are often the abysmally filthy, overcrowded conditions of your average factory farm.

    So yes, by all means, support those restaurant chains that have taken meaningful action to curb antibiotic abuse—including the smaller chains that the authors of the new report give a shout-out to, such as Au Bon Pain, Noodles & Co., and Papa Murphy’s. But let’s not fool ourselves into thinking that when it comes to protecting something as important as humanity’s last line of defense against a host of infectious agents, we should be putting our faith in the good intentions of a handful of restaurant industry CEOs.

  • Food
  • American Honey: Same Great Taste but Now With More Weed Killer

    FDA documents show that the herbicide glyphosate is finding its way into honey too.

    What in the world to do now that glyphosate, the most heavily used weed killer in the world—a probable human carcinogen, no less—is showing up in everything from breakfast cereal to eggs?

    The public interest group U.S. Right to Know announced this week that it has obtained documents showing the Food and Drug Administration has found residues of glyphosate in samples of American honey. Glyphosate is the active ingredient in Monsanto’s Roundup herbicide, the use of which has increased 15-fold since the company introduced its line of Roundup Ready crops genetically modified to withstand the chemical some 20 years ago.

    Yet despite the skyrocketing use of glyphosate, federal regulators have been pressing the snooze button when it comes to dealing with Big Ag’s chemical onslaught. It was only this year that the FDA agreed to start testing samples of U.S. food for the presence of glyphosate, spurred by growing public unease about an herbicide that Monsanto and other chemical makers have long insisted is safe—but that the International Agency for Research on Cancer declared last year likely can cause cancer in humans.

    That it took a Freedom of Information Act request from a nongovernmental watchdog group to get some answers from the FDA on its testing only begins to point to the government’s dysfunctional approach to regulating glyphosate—or, more aptly, not regulating it.


    The newly obtained documents include the testing results for three honey samples, which contained glyphosate in concentrations of 22 parts per billion, 41 parts per billion, and 107 parts per billion. That’s a small number of samples, but an FDA scientist lamented in an email, “One of the issues I found is that it is difficult to find blank honey that does not contain residue. I collect about 10 samples of honey in the market and they all contain glyphosate.”

    How do the test results square with the level of glyphosate that federal regulators legally allow in honey? Well, as with any number of foods, the feds haven’t bothered to set a tolerance level for glyphosate in honey—something the FDA scientist testing honey pointed out to Chris Sack, who oversees such pesticide residue testing at the agency. As Sack responded, “You are correct that honey has no tolerance listed for glyphosate, but there are good reasons for that.... In recent re-evaluations of glyphosate exposure and toxicity, [the Environmental Protection Agency] has confirmed that glyphosate is almost non-toxic to humans and animals. So, while the presence of glyphosate in honey is technically a violation, it is not a safety issue.”

    But the EPA’s assessment of whether glyphosate causes cancer was removed from that agency’s website almost as soon as it was posted last year, and it hasn’t reappeared. The EPA continues to push back the date by which the public might expect it to weigh in on the simmering controversy, most recently suggesting that Americans who are growing ever more wary of glyphosate might have to wait until next spring for answers.

    Meanwhile, independent tests by environmental and public health groups have found glyphosate residues in a range of foods. Many, like honey, are unexpected places to find the chemical because it’s not directly used in the food’s production—suggesting Americans may be consuming far more glyphosate than thought. For example, a study of store-bought breakfast foods by the Alliance for Natural Health turned up glyphosate in items such as dairy-based coffee creamer, organic eggs, and whole wheat bread. Consumer advocates have filed suit against companies such as PepsiCo and Post, claiming that products like Quaker Oats and Shredded Wheat shouldn’t be marketed as “all-natural” if they contain glyphosate residue.

    All of which points to perhaps the only way to effectively put the skids on glyphosate’s two-decade march to dominance. Just this week, Bayer announced plans to buy Monsanto for $66 billion. If the feds have been reluctant to scrutinize Monsanto’s claims that glyphosate is perfectly safe, we probably can’t expect them to take on a corporate behemoth as large as a combined Monsanto-Bayer. But big food makers, with billions of dollars in sales at stake, might prove a formidable counterforce if U.S. courts start to rule that foods tainted with glyphosate can’t be hawked as “pure” or “natural.”

  • Food
  • We May All Have to Cut Carbs Thanks to Climate Change

    Rising global temperatures will harm wheat harvests—and poor countries will be hardest hit.

    It seems that as the planet heats up, more of the world may be forced to go gluten-free.

    Remember when climate change was commonly called “the greenhouse effect,” which seemed to suggest that at the very worst we might all end up living in a kind of perpetual summer surrounded by lush greenery? It would be a little humid, maybe, but might otherwise resemble a verdant, abundant, postindustrial Garden of Eden? Oh, those were the days.

    On the contrary, climate experts and agricultural scientists have long warned that climate change will likely wreak havoc on the global food supply. A new study appears to offer some of the most convincing evidence to date on the serious effect global warming could have on one of the world’s most important crops, wheat.

    More than 50 scientists based around the world—from China to the EU to the U.S.—participated in the research, the results of which were published Monday in the journal Nature Climate Change. The team found that each 1 degree Celsius increase in global temperature would cause worldwide wheat production to fall by between 4 percent and almost 6.5 percent. The Intergovernmental Panel on Climate Change, the leading international scientific body on the issue of global warming, projects that global temperatures will rise between 2 and 6 degrees Celsius by the end of the century.

    All told, worldwide wheat production hit nearly 735 million metric tons last year, a record high that the 2016–17 harvest is expected to surpass. A loss of 4 percent—on the conservative end of the estimate—would equate to 30 million metric tons of wheat, while the 13 percent decline that might occur if global temperatures rise by 2 degrees Celsius would equate to a staggering 95 million metric tons. That’s almost double the entire current output of the United States. Such losses are not the direction we need to be going, especially given that the world population is expected to hit 9 billion by the middle of this century, spiking global food demand by 60 percent, according to the United Nations.
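    For readers who want to check the arithmetic, here is a minimal sketch. The production figure and per-degree loss range come from the study as described above; the 13 percent scenario assumes the high end of the per-degree range applied to a 2-degree rise.

    ```python
    # Back-of-the-envelope check of the wheat-loss figures cited above.
    world_wheat_mt = 735.0        # world production last year, million metric tons
    loss_per_degree_low = 0.04    # study's low estimate per 1 deg C of warming
    loss_per_degree_high = 0.065  # study's high estimate per 1 deg C of warming

    loss_at_1c = world_wheat_mt * loss_per_degree_low       # ~29 Mt ("30 million")
    loss_at_2c = world_wheat_mt * loss_per_degree_high * 2  # ~96 Mt ("95 million")

    print(round(loss_at_1c), round(loss_at_2c))
    ```

    The small gaps between these results and the article’s round figures are just rounding.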


    Adding insult to injury, given the broader social injustice of climate change: The study predicts that countries in warmer regions will experience a more significant drop in wheat production, while those in cooler regions will fare better. Warmer, often poorer countries with lower emissions have long complained that they bear more of the burden of climate change than wealthier, heavier-polluting countries. The current study predicts, for instance, that an increase of 1 degree Celsius would mean a 3 percent decrease in wheat yields in China and an 8 percent decrease in India.

    What’s the silver lining in all this? Not much for the layperson warily eyeing rising sea levels and worrying about the future of bread. But for the scientists involved, the study represented something of a breakthrough in that it employed three separate methods—two model simulations and a rigorous statistical analysis—all of which produced essentially the same results.

    “This means we’re closer to more precisely predicting crop yields and their response to climate change worldwide, but we have shown this only for wheat so far,” Senthold Asseng, a professor of agricultural and biological engineering at the University of Florida and a lead author of the study, said. “It’s the first time that a scientific study compared different methods of estimating temperature impacts on global crop production. Since the different methods point to very similar impacts, it improves our confidence in estimating temperature impact on global crop production.”

    Good news for science. Probably not so great for dinner in the 22nd century.

  • Food
  • China Is Finally Addressing Animal Welfare Issues

    New standards for the transportation and slaughter of birds are a first for the world’s second-largest poultry producer.

    Is the world’s second-biggest producer of poultry meat getting serious about chicken welfare?

    For the first time, China has released official recommendations for the humane slaughter of chickens, specifically for farms in Shandong province, which leads the country in poultry production, raising 20 percent of domestic birds. As the English-language website People’s Daily Online reported, the standards provide detailed steps for handling poultry from transportation to slaughter and provide clear recommendations for minimizing the suffering of the animals, including limiting transport to a maximum of three hours and adopting the EU practice of anesthetizing birds before they are killed.

    The recommendations are just that—they’re not mandatory—and they address the slaughter of chickens, not how they’re raised. Nevertheless, animal welfare advocates are praising the move as “a step in the right direction,” as Jeff Zhou of the nonprofit Compassion in World Farming told The New York Times.


    Yet it’s unlikely that the guidelines were a product of a spasm of concern on the part of Chinese officials over the well-being of chickens. As the Times notes: “In China, where factory farming practices and a lax enforcement of food safety codes have contributed to one food scandal after another, there is a business incentive to treat animals better.”

    Animal welfare issues are a rising concern among Chinese consumers, as they are in countries around the world. But the more pressing concern is the ability of Chinese poultry producers to compete on the world market. With evidence of the birds’ “violent deaths” remaining visible even after processing, and with the meat from stressed, abused birds often of poorer quality than that from humanely treated animals, poor animal welfare is hurting exports. Only the U.S. produces more chicken meat than China, yet while America also ranks No. 1 in poultry exports, China lags in fifth place.

    Thus, the country’s move toward improved animal welfare practices may well be part of a longer-term strategy to appeal to conscientious consumers abroad, including in the U.S. Since 2013, the only chicken from China permitted to be sold here is chicken that has been raised and slaughtered in the U.S. (or Canada), processed and cooked in China, then shipped back to the U.S. again. If you think that seems patently absurd, you’re not alone. Even a chief lobbyist for the American poultry industry thinks the trade “probably doesn’t make any economic sense.”

    Most trade watchers believe that the bizarre arrangement was an incremental move toward overcoming China’s ban on U.S. beef. Sure enough, in May the U.S. Department of Agriculture announced that it had conducted a follow-up audit on China’s inspection system for slaughtered poultry and found the system met U.S. standards, meaning we’re one step closer to accepting chicken raised in China for sale in the U.S.—and one step closer, perhaps, to caring more about how chickens in China are raised.

  • Food
  • If the Fight for $15 Wins, Fewer Americans Will Go Hungry

    Raising the federal minimum wage should be a major policy proposal for fighting food insecurity.

    Want to find a way to dramatically reduce hunger in America? How about bumping up the federal minimum wage to $15 an hour?

    It would seem like common sense: A significant number of the more than 17.2 million households in the U.S. that are food insecure include at least one member who works, and many of those workers are employed in low-wage jobs. So raising the minimum wage would seem a surefire way to boost the grocery budgets of those households.

    Remarkably, putting more money in the pockets of low-wage workers has often been overlooked in the debate over how to end the unconscionable yet seemingly intractable problem of hunger in the richest country on Earth. Among the 20 recommendations put forth by Congress’ blue-ribbon National Commission on Hunger earlier this year, not one included raising the minimum wage—even as the commission admitted that the economic trends of the past half century “have contributed to fewer well-paying job opportunities for those without postsecondary education.”

    It has been almost seven years since America’s lowest-wage workers got a raise—and arguably a lot longer than that if you adjust for inflation. At $7.25 an hour, the federal minimum is about $3.60 less in today’s dollars than what minimum wage workers were making 50 years ago.


    Thus, the only surprise to emerge from an economic analysis released Thursday by The Century Foundation on the effect raising the minimum wage would have on the hunger crisis is that its conclusions should come as any surprise at all. As the report’s author, economist William M. Rodgers III, a fellow at the nonpartisan progressive think tank, writes, “A minimum wage increase can make a major impact on hunger among families with working members.”

    What does Rodgers mean by “major”? By gradually raising the federal minimum wage to $15 per hour over the next seven years—a proposal put forth by Rep. Donald Norcross, D-N.J., that has failed to catch fire in the Republican-controlled House—more than 1.2 million families would achieve food security, according to Rodgers’ analysis. Among them would be some of the most economically vulnerable, including a significant number of single-parent and minority households. In a climate in which federal assistance to the poor, including food stamps, has increasingly come under political attack, raising the income of the working poor would allow those programs to focus on serving the food-insecure households with the most dire need.

    Yet even as the movement to boost the minimum wage to $15 has scored big victories in certain cities and states across the country, including Seattle, New York, and California, increasing the federal minimum wage remains more or less a nonstarter in D.C. Rodgers sees marrying the wage issue to the fight against hunger as critical: “Shifting the minimum wage debate’s focus to food security provides advocates with a concrete rationale for increasing the minimum wage,” he writes. “It provides [a] simple, yet very tangible outcome for policy makers and the public to observe. Minimum wage policy helps anti-hunger advocates to address their concerns that, in the current fiscal climate, low-wage workers are unable to receive adequate public support to meet basic needs.”

  • Food
  • Long Hours May Soon Pay Better for California Farmworkers

    A bill passed by the state legislature would make overtime pay kick in sooner for farm employees.

    California’s long-struggling farmworkers, who labor in harsh conditions to harvest more than a third of all vegetables grown in the U.S. and two-thirds of our fruits and nuts, are one big step closer to getting the bump in pay they deserve.

    On Monday, the California Assembly passed a measure that would require that farmworkers be paid the same overtime as other hourly workers in the state. Under current law, farmworkers are eligible for time and a half only after 10 hours of work in a day or 60 hours in a week; the new bill, which the California Senate has already passed, brings those thresholds in line with the state’s broader wage laws, which require overtime pay for work in excess of eight hours in a day or 40 hours in a week. The legislation now heads to the desk of Gov. Jerry Brown, who has not publicly said whether he’ll sign it.

    On the face of it, the bill would seem to do no more than erase a nonsensical—and patently unfair—division between farmworkers and other hourly laborers in California. Yet it proved to be one of the more controversial pieces of legislation introduced this session, and its passage came after the measure’s primary sponsor, Assemblymember Lorena Gonzalez, D–San Diego, resorted to some extraordinary procedural maneuverings to bring it back from all-but-certain legislative death.

    Labor advocates praised the measure, with the United Farm Workers, for example, pointing out that farmworkers have been excluded from receiving fair overtime pay for almost 80 years.


    But as you might expect, California’s powerful agriculture industry sees the situation differently.

    Although the overtime bill gives growers seemingly more than enough time to comply with the new overtime pay regulations—until 2022 for large farms and until 2025 for farms with 25 employees or fewer—the state’s ag lobby has resorted to the same overdramatic, sky-is-falling arguments employed by any industry battling similar wage legislation. Tom Nassif, president and CEO of Western Growers, professed himself “extremely disappointed” in the move by the California Assembly. He warned that the bill would place California farms “at an even further competitive disadvantage internationally and with other states” while at the same time hurting farmworkers because growers would inevitably be forced to cut back on the number of hours given to workers.

    It’s an argument that’s been taken up by some local growers as well. Jeff Merwin, a third-generation farmer and president of the Yolo County Farm Bureau, told Capital Public Radio in Sacramento this week that if the overtime bill becomes law, he wouldn’t be able to pay his farmworkers time and a half for more than 40 hours a week. The farmworkers, Merwin said, are “being fed this utopian line about how it’ll be great. You’ll get all this overtime because they’ll pay it—they’ll pay it. How?”

    “I’m not the Grinch here,” Merwin continued. “All I’m saying is it’s economics 101. If the money’s not there, you can’t pay it, and the money is not there.”

    It would seem a compelling point coming from a grower on the front lines. But is it accurate? Philip Martin, an agricultural labor economist at the University of California, Davis, has crunched the numbers and found that while growers receive, on average, about 28 percent of the price consumers pay at the supermarket for fresh fruits and veggies, they pass on only about a third of that (roughly 9 percent of the retail price) to farmworkers, as National Geographic reported earlier this year. That makes hired farmwork one of the lowest-paid occupations in the country, with most farmworkers earning less than $20,000 per year—below the federal poverty level for a family of three, according to federal labor statistics.

    A lack of specific employment data makes determining the ultimate cost of California’s farmworker overtime bill to consumers almost impossible, according to Martin. But in a separate analysis of the impact of raising the minimum hourly wage in California to $15 an hour—including for farmworkers—the economist found that the wage hike, a seemingly far bigger change than the new overtime standards, would raise the grocery bill of the average American family by a whopping $1.76 per month.

  • Food
  • If GMO ‘Labels’ Are Buried in QR Codes, Few Consumers Will See Them

    A new survey found that just 15 percent of Americans have used their smartphones to scan grocery items in the last year.

    When was the last time you used your smartphone to scan a QR code in the grocery store?

    If you answered “Never,” “Can’t remember,” or—not unreasonably—“What the heck is a QR code?” you’re by no means alone. A recent survey from the Annenberg Public Policy Center at the University of Pennsylvania found that just 15 percent of Americans said they had scanned an electronic code to find information about a product’s nutrition or ingredients in the past year.

    If all this is starting to sound more than a little confusing, well, that’s kind of the point. In what amounts to an enormous thumbing of the nose at the American public by politicians and industry, consumers who want to know whether a food product contains genetically modified ingredients will now more than likely have to scan the product to find out. That’s despite a new federal law on the books that, for the first time since genetically engineered crops hit the consumer market in the early 1990s, mandates the disclosure of GMO ingredients.

    This is in almost diametric opposition to what an overwhelming majority of Americans—time and again, in multiple polls—have said they want. Upwards of 90 percent of the American public believes products made with GMO ingredients should be labeled—labeled with clearly worded text printed directly on the package. That’s what Vermont’s first-in-the-nation GMO labeling law mandated.

    Pretty straightforward, right?

    But now Vermont’s commonsense, consumer-friendly law has been superseded, just a month after it went into effect, and replaced by the more lax federal law. While the food industry is sighing with relief, consumer advocates are fuming.

    Yes, the United States finally has a federal law that requires food makers to label products made with GMO ingredients. But here’s the rub: That “label” can be a QR code, requiring shoppers to scan it to determine whether or not the product contains GMO ingredients.

    How many shoppers are likely to do that? Just 40 percent, according to the Annenberg survey. Let’s face it: That may be wildly optimistic. Remember, a scant 15 percent of those surveyed scanned a product in the past year. Who really wants to be standing in the middle of the grocery aisle scanning one product after another just to find out if there’s anything in the cart that’s been made with GMO ingredients?

    What makes the federal GMO labeling law that much more ridiculous is that it not only allows food makers to essentially hide GMO information from smartphone-toting consumers behind a confusing QR code but also allows those companies to ignore a huge swath of the non-smartphone-toting population. As the Environmental Working Group noted last year in its takedown of the not-so-smart “smart label” touted by food makers, “More than 40 percent of consumers—especially low income, less educated and elderly consumers—don’t have phones that can scan QR codes.”

    Far from being the sort of no-nonsense mandatory GMO label consumer advocates have long fought for, the federal label is only “mandatory” if you think that printing the surgeon general’s warning on, say, six in 10 packs of cigarettes, or posting stop signs at only about half of busy intersections, constitutes “mandatory.”

    So why would big food makers and their allies in Congress push for such a demonstrably asinine label? It becomes clear when you look at further results from the Annenberg survey. Some 90 percent of corn, soy, and other major crops in the U.S. are genetically modified—and the food industry estimates that 75 to 80 percent of food products contain GMO ingredients. But when asked how much genetically modified food they’d eaten in the last week, a third of Americans surveyed said they consumed “not much or none at all,” while another third said they didn’t know. Yet nearly half said they would be less likely to purchase a food if they found out it contained GMO ingredients—giving the food industry a multimillion-dollar profit incentive to keep that GMO info as obscure as possible.

    As William K. Hallman, a visiting scholar at the Annenberg Public Policy Center, puts it, “Without mandatory labeling, consumers are unlikely to recognize that many of the foods they buy have genetically modified components.”

    The ‘Bloody’ Battle to Make Veggie Burgers Great Again

    Will the next generation of vegetarian patties be able to disrupt their meat counterparts?

    The “bloody” battle to build the best next-generation veggie burger is heating up this week, with Impossible Foods’ own Impossible Burger set to debut on Wednesday at one of David Chang’s Manhattan restaurants, Momofuku Nishi. The burger is the first product to come to market from the much-hyped, much-VC-funded Impossible Foods, which was founded five years ago by Stanford biochemist Patrick Brown to develop plant-based alternatives to meat and dairy products.

    It seems Chang was champing at the bit to become an early adopter of Impossible Foods’ burger technology. Eater reports that the Momofuku chef sought out the company after hearing about the Impossible Burger a year ago. “I was genuinely blown away when I tasted the burger.... The Impossible Foods team has discovered how to reengineer what makes beef taste like beef,” he told Eater.

    The quest to create a plant-based burger that cooks, smells, tastes, feels, and even seems to bleed like real beef has become something of a Holy Grail for a cadre of headline-grabbing food-tech start-ups led by Impossible Foods and its rival, Beyond Meat. The latter debuted its Beyond Burger at a handful of Whole Foods in Colorado and Washington, D.C., in May. Some locations sold out of the raw patties, slyly placed near the meat counter, within an hour.

    For its part, Impossible Foods says it has plans to bring its meatless burgers to as-yet-unannounced restaurants in San Francisco and Los Angeles “soon.” It will make its faux beef available in grocery stores sometime after that.


    Among the attributes that have garnered oohs and aahs from early tasters of both companies’ plant-based patties is their sanguineous ooze—that gross-if-you-think-too-much-about-it-but-nevertheless-trademark trait of your classic all-American all-beef burger. Whereas Beyond Meat achieves the trompe l’oeil effect with beet juice, Impossible Foods mimics blood with heme, an iron-containing molecule abundant in hemoglobin but also produced by a few plants, thus, presumably, keeping it all vegetarian.

    There would appear to be a dazzling pot of gold waiting for whichever company succeeds in creating a viable veggie burger that is able to pass a blind taste test alongside a run-of-the-mill ground beef patty. But the take-it-slow approach adopted by both Beyond Meat and Impossible Foods in unveiling their burgers suggests there’s plenty of risk in rushing to market—even as it belies the tens of millions of dollars in venture capital funding both companies have received (including from Bill Gates). After all, the American burger is an icon, and the veggie burger market is littered with scores of would-be imitators whose most salient achievement has been to make most consumers wish they were holding the real thing in their hands.

    The new generation of veggie burgers would appear to be a whole different ball game, and they’ve attracted breathless reviews beyond David Chang’s. On the one hand, as Americans are growing more conscious of the disastrous health and environmental consequences of our meat obsession, who among us wouldn’t welcome a guilt-free alternative that tastes—and “bleeds”—the same as the flame-broiled patties on which we were raised?

    Yet for all the wide-eyed wonder inspired by bleeding veggie burgers, it remains to be seen just how well they might be received given another dynamic at play in our current food moment. If more of us are going more vegetarian in our diets, we’re also more suspicious of food that’s been processed beyond all recognition—and it’s hard to imagine anything more unrecognizably processed than plant matter that’s been engineered to bleed.

    The New ‘Ugly’ Apples Sold at 300 Walmart Locations Are More Than Perfect

    The country’s leading grocery retailer is trying something that’s good for the environment and, perhaps, our egos.

    Could learning to love a little imperfection in our fruits and veggies help us love the little imperfections in ourselves?

    The burgeoning “ugly produce” movement got a big boost this week when Walmart announced it was launching a pilot program to sell less-than-perfect apples at 300 of its stores in Florida. Of course, when the country’s largest grocer decides to hop on the latest food-trend bandwagon—as Walmart did when it committed to expand its offerings of organic products or locally grown produce—it pushes said trend solidly into the mainstream.

    There’s a lot of good to be said for championing fruits and vegetables that would never make their way into your favorite glossy food magazine. For starters, farms throw out a staggering amount of perfectly edible produce each year—some 20 percent of their harvest—just because those fruits and veggies don’t conform to retailers’ specifications for how produce should look. We’re talking about everything from crooked carrots to the weather-dented apples Walmart plans to sell. All of it contributes mightily to America’s food-waste problem, in which an estimated 40 percent of the nation’s food goes uneaten, much of it rotting away in landfills.

    Wasted food means wasted water, land, and agrochemicals. But beyond the outsize environmental benefits of cutting down on food waste, the move to sell more ugly produce stands to be mutually beneficial for consumers and retailers. Such produce is often sold at a (sometimes steep) discount, meaning better access to affordable fruits and vegetables at a time when a nation in the grip of an obesity epidemic is being encouraged to fill half of every plate with produce. On the business side of the equation, it allows retailers to make money on produce that otherwise would never have made it onto the delivery dock.


    Yet it must be said that retailers didn’t dream up those Miss Universe–style standards for perfect peaches and pears for no reason—which is also why Walmart’s embrace of ugly produce, as with similar moves by Whole Foods and a handful of regional grocery chains, so far has been confined to the testing phase. The big question: Will picky consumers buy in?

    Here’s where the topic of bruised apples intersects with the larger one of our collective bruised egos. Among the more interesting non-food-related reads to cross my path this week was Heather Havrilesky’s trenchant takedown of the myth of the supremely self-assured millennial. Havrilesky is the advice-dispensing columnist behind “Ask Polly” over at New York magazine’s The Cut. Rather than being spoiled and entitled, as we so often hear, Havrilesky writes, “What I discover in my email in-box each morning are dispatches from young people who feel guilty and inadequate at every turn and who compare themselves relentlessly to others.” Let’s face it: It’s not just millennials. Anyone who spends any amount of time immersed in our digital-centric media-saturated culture can relate to what Havrilesky calls the “pervasive subconscious longing” that “tells us that no matter what our circumstances might be, we should be dressing like fashion bloggers and vacationing like celebrities and eating like food critics and [having sex] like porn stars.”

    In other words, we’re bombarded by images of a certain kind of perfection, often artificially generated to appeal to our innermost desires but simultaneously leaving us feeling unsatisfied, left out, and wanting. Whatever satisfaction might be had by finding the “perfect” anything is often eclipsed by the nagging thought that there might just be something more perfect to be found. Just think of shoppers endlessly rooting around in a bin of peaches searching for the most picture-perfect, perfectly ripe peach.

    In that context, that Walmart’s damaged apples will be sold under an “I’m Perfect” label becomes kind of profound, as if we might all do well to take that little produce sticker and wear it proud, like preschoolers donning those Chiquita stickers from their breakfast bananas. When we do, we should keep Havrilesky’s prescription in mind: “The best version of you is who you are right here, right now, in this fucked-up, impatient, imperfect, sublime moment. Shut out the noise and enjoy exactly who you are and what you have, right here, right now.” If that’s while you’re savoring the sweet deliciousness of a weather-beaten apple, so much the better.
