Sunday November 19, 2017

You’ve seen the headlines:

As Farmer Goes to Market has cautioned before, much of the mainstream media, including the traditional grocery retail trade press, often relies on "willingness to pay" predictions like those to urge grocery retailers to stock such items lest they miss out on increased sales or margin opportunities. Yet they often miss the one important, obvious question: How well does that kind of research predict whether shoppers will actually follow through on that promise?

Says Oklahoma State ag economist Jayson Lusk, who has studied the structure and reliability of willingness to pay studies for more than a decade, “Among the staunchest criticisms…is the fact that people tend to overstate the amount they are willing to pay for improvements in a public good or an increase in quality of a private good.” The research evidence is widespread, he says, for this “hypothetical bias”—the idea that talk is cheap when study subjects don’t actually have to sacrifice anything to get what they say they would hypothetically like to have.

Now, a new study published in the journal Agricultural Economics adds to the evidence that current willingness-to-pay studies pay too little attention to psychological effects when people promise to pay more for such foods--in this case, organic foods. Past studies have predicted willingness-to-pay premiums for organic food that range wildly, from a low of around 2 percent to a high of more than 500 percent.

This Agricultural Economics study enrolled 233 college students from an agricultural university in China, where organic food sells, on average, for about 130 percent of the price of conventional food in the cities, to test whether those students' willingness to pay could be manipulated by how the test was structured.

The researchers tested two potentially confounding factors. First, they rushed subjects' decisions on how high a premium they would pay for organic pork, tomatoes and milk by giving them varying, limited amounts of time to complete the questionnaire. Second, they distracted subjects during the willingness-to-pay questions by requiring them to complete simple math problems at the same time.

The study authors found no significant differences in subjects' stated willingness to pay when they were put under time pressure to make a decision, but strong evidence that willingness to pay can be easily influenced by mental distraction while making a decision. Students distracted by the math calculations lowered their stated willingness to pay by 5.5 percent for organic meat, 6.4 percent for organic tomatoes and 34.1 percent for organic milk.
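To see what declines of that size would mean at the shelf, here is a purely illustrative calculation (not from the study itself): it applies the reported percentage drops to a hypothetical baseline willingness to pay of 1.30 times the conventional price, borrowing the average urban price ratio for organic food in China mentioned above as the assumed starting point.

```python
# Illustrative arithmetic only (not the study's data or method): apply the
# reported declines in stated willingness to pay (WTP) under mental distraction
# to a hypothetical baseline WTP of 1.30 times the conventional price.
baseline_wtp = 1.30  # assumed stated WTP as a multiple of conventional price

reported_declines = {  # proportional drop in stated WTP under cognitive load
    "organic meat": 0.055,
    "organic tomatoes": 0.064,
    "organic milk": 0.341,
}

for food, decline in reported_declines.items():
    distracted_wtp = baseline_wtp * (1 - decline)
    print(f"{food}: stated WTP falls from {baseline_wtp:.2f}x "
          f"to {distracted_wtp:.2f}x the conventional price")
```

Under those assumptions, the distraction effect on milk alone would wipe out nearly the entire organic price premium.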

What’s it mean? Past psychological work has shown people quickly and predictably lose self-control when they face such so-called “cognitive load.” That is, when our brains are overworked, we lose the ability to concentrate on a single task. When that overload occurs, people typically default from making decisions based on motives shaped by social norms, tangible rewards and rational thought, toward making decisions based on preferences that evolved gradually through early learning and experience. In this case, the researchers theorized that willingness to pay for organic food can be shifted in the same way when consumers are distracted. In China, they write, such "implicit" motives may be particularly powerful because many consumers' underlying "implicit" beliefs about their food evolved at a time when widespread hunger and malnutrition posed a real and constant threat. In contrast, acceptance of organic food, which many researchers have identified as associated with social status and wealth, requires conscious and purposeful decision-making.

At the least, this latest study should remind us that willingness to pay research can be manipulated or biased and that the confounding factors should always be considered when interpreting the results, the authors write. 

Gardeners may feed the world, but not cost-effectively

You likely heard the term coined at the start of this lingering economic downturn: the "recession garden." Like the so-called victory gardens of the 1940s, today's private backyard and shared inner-city vegetable gardens are hailed as the new means to make fruits and vegetables affordable for low-income consumers, improve the health of those who tend the gardens and eat their harvest, and make America's hungry more "food secure."

Does it work?

Not so much, it turns out. A new study published by Oregon State University's state-wide coordinator of gardening volunteers shows that while backyard gardening is a great hobby and a good way to introduce the fruit and vegetable-averse to the benefits of produce, as an economic enterprise, it's a poor second to the commercial system that includes the food retailer.

Author Gail Langellotto conducted an extensive review of the scientific literature to find a total of four journal articles and two blogs reporting 10 different sets of data on costs and yields for 11 vegetable gardens. The original authors in those studies estimated the dollar value of garden yields, based on the cost per pound for each crop at a local grocery store, and then netted out the final value of that produce by tracking the reported material and supply costs for the gardens. Most authors also reported the number of hours worked in the garden and the fair market labor costs associated with these hours. If no labor rate was quoted, Langellotto calculated labor costs using the minimum wage for the year the study was published. After adjusting all those studies' costs and values to current prices in order to make an accurate comparison across all studies, she calculated the difference between yield and cost to estimate the net value of each garden.
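The accounting Langellotto applied to each garden can be sketched in a few lines. This is a minimal sketch of the bookkeeping described above; the function name and every figure in the example are invented for illustration, not drawn from her data.

```python
# A minimal sketch of the net-value bookkeeping described above:
# net value = (pounds harvested x local grocery price per pound)
#             - material and supply costs
#             - labor hours x wage rate (minimum wage if no rate was reported).

def garden_net_value(yields_lb, prices_per_lb, material_costs,
                     labor_hours, wage_rate):
    """Net dollar value of a garden: produce value minus materials and labor."""
    produce_value = sum(yields_lb[crop] * prices_per_lb[crop]
                        for crop in yields_lb)
    labor_cost = labor_hours * wage_rate
    return produce_value - material_costs - labor_cost

# Hypothetical single-garden example (all figures invented):
yields = {"tomatoes": 60, "lettuce": 20, "beans": 15}        # pounds harvested
prices = {"tomatoes": 2.50, "lettuce": 1.80, "beans": 2.00}  # $/lb at the store
net = garden_net_value(yields, prices, material_costs=120.0,
                       labor_hours=50, wage_rate=7.25)
print(f"Net value: ${net:.2f}")  # negative once labor is priced in
```

With labor priced at even the federal minimum wage, this invented garden loses money, which previews the pattern her review found.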

Gardens are not profitable if you count labor costs

Her results: Overall, gardens were profitable--but only if the labor to tend them was free. Excluding labor costs, gardens yielded an average $678 worth of fruits and vegetables, over and above the costs of irrigating and buying seeds, starts, soil and other materials. However, when labor costs were included, the net value of home vegetable gardens fell to an average loss of $81 per garden. Those values also varied widely around the averages, by roughly $499 to $515 per garden.

Naturally, "most people do not hire help to tend their vegetable garden," Langellotto comments. She therefore concludes that her fellow Agriculture Extension professionals can confidently recommend vegetable gardening as a way to save money on fresh fruit and vegetable purchases, "...particularly if household members (rather than hired help) maintain the garden."

However, it's important to temper her recommendation with two points:

  1. Many farmers through history have gone broke using precisely the logic she employs; that is, relying on under-valued family labor as a "free" resource. Your own labor and that of family members may not carry direct costs, but they do come at an "opportunity cost." That is, time spent gardening is free labor when it replaces time spent on the couch, but it's a real cost if it takes away from time that could be devoted to a part-time job.
  2. It's precisely the poor who could benefit most from such garden savings who in fact have the least time to spend on growing their own food, this recent study showed. If they have no other employment opportunities, investing that labor in gardening may be time well spent. But if any wage-paying opportunity is available, the poor would be dollars ahead to take the job and leave the food growing to the professionals.
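The opportunity-cost argument in point 1 can be put in rough numbers. A hedged sketch: it reuses the $678 average gross value reported above, but the hours and the alternative wage are invented assumptions, not figures from the study.

```python
# Opportunity-cost sketch (assumed figures): gardening "pays" only if its
# implied hourly return beats the wage those same hours could earn elsewhere.
produce_value = 678.0    # average gross savings reported above, labor excluded
hours_in_garden = 100.0  # hypothetical hours spent tending the garden
wage_available = 7.25    # hypothetical part-time wage the gardener forgoes

implied_hourly_return = produce_value / hours_in_garden
print(f"Gardening returns ${implied_hourly_return:.2f}/hour")
print("Garden" if implied_hourly_return > wage_available else "Take the job")
```

At these assumed numbers the garden returns $6.78 an hour, just under minimum wage, so the rational move is the paying job; shrink the hours or raise the yield and the comparison flips.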

Is food quality really getting worse?

"Our food is killing us," proclaims the publicity flack for the new food documentary Fed Up, produced by An Inconvenient Truth producer Laurie David and starring former Today Show host Katie Couric. "Not all food, but almost everything we buy in the grocery store that is not a vegetable or a piece of fruit."

"I got an e-mail out of the blue from Katie Couric," producer David recalls for the Boston Globe. "She basically said, 'I’ve been working on the issues of food and diet for 30 years and I’m completely baffled as to why the problem keeps getting worse.' She asked, 'Would you consider doing An Inconvenient Truth on food with me?'"

Even the more measured scientific researchers warn us: Poor nutrition is contributing to at least four out of the 10 most common causes of American deaths, giving us heart disease, cancer, strokes and type 2 diabetes. But if any inconvenient truth exists surrounding this issue, Minnesota economics professor Timothy Beatty may have stumbled upon it in a new study published in the American Journal of Agricultural Economics. In it, Beatty, USDA economist Biing-Hwan Lin and grad student Travis Smith apply sophisticated economic modeling to two decades' worth of food consumption data from two nationally representative USDA surveys, the Continuing Survey of Food Intakes by Individuals and the National Health and Nutrition Examination Survey. By applying the modeling tool known as "stochastic dominance," which is particularly well suited to studying diet quality, where exact thresholds between “good” diets and “poor” diets are fuzzy, the economists directly contradict the contention that American diets are getting worse.

In fact, they've actually improved over the past 20 years.

"Conventional wisdom maintains that the quality of the American diet has been deteriorating for at least the past two decades," Beatty writes. "In contrast, we document a previously unknown pattern of improvement in U.S. dietary quality. We find statistically significant improvements for all adults over the period 1989 to 2008, at all levels of dietary quality."

For any level of dietary quality, they demonstrated, more Americans had higher scores on the Dietary Guidelines for Americans Healthy Eating Index for the years 2005 through 2008--the most recent period they studied--than they did in 1989 through 1991--the earliest period. Analysis of the survey data shows the entire population has shifted gradually but systematically toward a more nutritious diet. And in another challenge to the modern conventional wisdom, Beatty showed that the improvements in nutrition occurred not only in affluent households, but also in low-income households, which he defined as 1.85 times the poverty level. Although the researchers did find that higher-income individuals consistently have higher dietary quality than low-income individuals, they also found some evidence the gap is shrinking.
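The stochastic-dominance idea behind that comparison can be illustrated with a toy sketch. The scores below are invented, not the authors' data or code: one distribution first-order dominates another when, at every score threshold, it leaves a smaller share of the population at or below that threshold, which is the "more people have higher scores at every level" pattern described above.

```python
# Toy sketch of first-order stochastic dominance for diet-quality scores.
# Distribution B dominates A if B's empirical CDF lies at or below A's
# at every threshold, i.e. fewer people stuck at or below each score.

def empirical_cdf(sample, x):
    """Share of observations in `sample` at or below threshold x."""
    return sum(1 for s in sample if s <= x) / len(sample)

def first_order_dominates(b, a, grid):
    """True if distribution b first-order stochastically dominates a."""
    return all(empirical_cdf(b, x) <= empirical_cdf(a, x) for x in grid)

# Invented Healthy Eating Index scores (0-100) for two survey waves:
scores_1989 = [38, 45, 50, 55, 60, 62, 70]
scores_2008 = [42, 48, 55, 58, 64, 68, 75]  # each quantile shifted upward

grid = range(0, 101)
print(first_order_dominates(scores_2008, scores_1989, grid))  # True
```

The appeal of the method is exactly what the authors note: it needs no arbitrary cutoff between a "good" and a "poor" diet, because dominance must hold at every threshold at once.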

How does the research team explain such "counter-factual" results?

Their modeling suggests fully half the improvements in diet can be attributed to changes in demographics; that is, the population in the last 20 years has grown older, more educated and more ethnically diverse, all of which are associated with better eating. An additional 10 percent of the change is attributable to food manufacturers' altering the dietary composition of foods: removing saturated fats, trans-fats, sugar and sodium. The remaining 40 percent of causes could not be distinguished between the two factors, so could be related to either or both. Changes in food formulation helped explain considerably more of the improvement in dietary quality for low-income individuals than it did for higher-income individuals, the researchers noted.

Feed vs. food is a false choice

Yet again, last week, the public was treated to the perennial accusation that global agriculture starves people because commodities are "wasted" by being fed to livestock (or ethanol-driven cars) rather than to humans. In response to a widely reported study by a New York professor claiming beef production was worse for the environment than any other commodity, University of Leeds professor Tim Benton told London's The Guardian that saving the planet was only one good reason to give up U.S. beef. "Another recent study implies the single biggest intervention to free up calories that could be used to feed people would be not to use grains for beef production in the US,” Benton said.

It's a common perception, say researchers who just authored a report for the Council for Agricultural Science and Technology. The belief that commodities grown for livestock subtract from the supply of human food is held by many in the supply chain, from retailers to policymakers to consumers. But in their recently released 16-page issue paper, available here, the team of scientists and economists from several universities reviews the facts and offers a bit of science-based information to help consumers decide whether the issue really should inform their meat-purchasing decisions:

• Large areas of land across the globe are either incapable of supporting the production of human food crops or environmentally unwise to use that way. For instance, more than 70 percent of all agricultural land in the world, according to the Food and Agriculture Organization of the United Nations, is better suited for grazing than for tilling. Terrain, soil type and climate render the majority of that land currently used for grazing unsuitable for cultivating vegetable-based foods for human consumption. Only animal agriculture, which takes advantage of the forages grown on those lands, can effectively convert the product of those lands into usable food for humans--in the form of meat and milk products.

• The assumption that cutting back on the number of livestock raised in the world will increase the availability of commodities for human food only holds true if the same cereal crops are interchangeable between animal feed and human food. Granted, on a regional basis, this may be true of certain livestock systems. But when you look at the “feed vs. food” competition on a global scale, the CAST researchers say, livestock diets include a considerable quantity of crops and by-products from human food, fiber, and fuel production that are not suitable as human food because they're unsafe, poor quality, indigestible or not culturally acceptable. Those crops are best suited--and sometimes only suited--for being converted into higher quality foods through animals. And in fact, many otherwise valueless by-products from human food and fiber production are in effect recycled through animals, which actually reduces competition between humans and animals for crops. In the long run, that reduced waste maximizes land use efficiency and decreases the environmental impact of food production.

Food or feed? Feeds commonly used in U.S. livestock farming

Feed type | Examples | Can humans eat it?
Forage crops | Pasture grasses, alfalfa, clovers, hays, silages (grass or crop based) | No
Grains | Corn, wheat, barley, millet, sorghum, triticale, oats | Yes
Plant proteins | Soybean meal and hulls, cottonseed, safflower meal, canola meal, peanut meal | Only partially
Grain byproducts | Distillers grains (wet and dry), corn gluten, wheat bran, straw, crop residues | Only partially
Vegetable byproducts | Apple pomace, citrus pulp, almond hulls, pea silages | No
Vegetable byproducts | Waste fruits and vegetables | Only partially
Food industry byproducts | Bakery waste, cannery waste, restaurant waste, candy, potato chips | Only partially
Sugar industry byproducts | Molasses (cane, beet, and citrus), beet pulp | Only partially
Animal byproducts | Meat and bone meal, tallow, feather meal, bloodmeal, poultry litter | Only partially
Dairy byproducts | Milk, whey products, casein | Only partially
Seafood byproducts | Fish and seafood meal and oils, algae | Only partially
Additives | Vitamins, minerals, probiotics, antibiotics, yeasts, flavors, enzymes, preservatives | Only partially

• Livestock production is an important component in the economies and societies of both developed and developing countries. In the developing world, animals often serve as an important means of accumulating portable and convertible capital. And in some areas of the world, livestock continues to serve as a considerable source of draft power within the smallholder operations, which make up the majority of global food production.

All foods have an environmental cost, the report notes, whether of plant or animal origin. Does animal agriculture use resources and have a measurable environmental footprint? Yes. But at the same time, the benefits accrued to society by livestock worldwide are substantial in terms of economic and efficient nutrition and in terms of wealth and cultural standing. Modern agriculture continues to improve the environmental footprint and economic cost of animal production through improved productivity, by using high-tech animal nutrition, genetics and management. Consumers need help to understand the issue is not as black-and-white as groups like PETA make it out to be, the authors say.

The truth about menu nutrition labeling

Despite a temporary postponement of the U.S. Food & Drug Administration's plans to require menu items at chain restaurants be labeled for nutritional content, the agency continues moving forward with the proposed regulation. The National Grocers Association--with the backing of some midwestern Congressmen who introduced federal legislation in November to specifically exempt grocery stores and other related businesses--has raised several important questions about how the regs should apply to grocers: Did Congress intend to cover them? Can service departments of groceries rightly be considered restaurants? Can independent supermarkets be considered chains? And what will the paperwork requirement do to grocers?

But amid those questions about the practicality and implementation of the regulations, a larger question appears to be going unanswered, even unasked: Do menu labels even work to meet the stated goals of making consumers healthier?

Some recent research casts that answer in doubt.

London School of Economics economist Matteo M. Galizzi examines the new school of thought behind such practices as menu labeling under the umbrella of "behavioral economics," including interventions like financial incentives to get healthy, sin taxes, nudges and information labels. When it comes specifically to information labels in that overall matrix, he writes, it's not difficult to argue that information-based policies sound perfectly logical, whether you look at them through the new lens of behavioral economics or through the traditional lens of conventional economics: The more information, the better, when it comes to making better decisions and plans. Right?

“Yes and no," Galizzi writes. "But mainly no.”

How can so logical a theory as informational labeling fail? He considers three reasons:

  1. Merely providing more information may be perfectly effective in raising awareness, he says. But that awareness doesn't necessarily lead to significant and sustained change in behavior. Whether it's full calorie-specific labeling or simple red-light/yellow-light/green-light schemes, direct evidence on the effectiveness of both systems is relatively scarce, his review of the research demonstrates. Giving shoppers pure calorie and nutritional information has been shown to make only a minimal or modest impact on food purchases and behavioral change. And giving them traffic-light-type information tends to produce only a substitution effect, the research says: Consumers tend to avoid the really bad foods, but they switch only to a less-bad "yellow-light" food rather than going fully to the healthiest options. And when they do switch, they tend to switch to healthier options within the same categories, rarely altering the overall structure of their diet radically enough to change their health, he says.
  2. Mere information release can actually trigger unintended consequences. Not only is simple nutritional or calorie labeling unlikely to have beneficial effects, it's been shown to cause inadvertent behavior that completely offsets the possible benefits. A 2012 study in Philadelphia, for example, that aimed to shift consumers toward zero-calorie drinks by one of several methods found that giving shoppers calorie counts on beverage choices actually caused consumption of sugar-sweetened drinks to increase, not decrease. Another study, from 2006, found similar results with low-fat labels: Labeling products as low-fat caused a 50 percent, 84-calorie increase in overall food consumption.
  3. Information like menu labeling only works when it's built on a real understanding of consumer behavior--a tough job. For example, one reason labeling fails is that subjects underestimate the true calorie content of "healthier" snacks by as much as half and thus feel less guilty about eating too much of a not-so-healthy food. Similar failures to predict consumer attitudes confound portion-size control, he notes. Consumers given identically large portions of spaghetti ate significantly more when it was labeled “regular” than when it was labeled “double-size" in a 2013 study. And a "health-halo" effect also confounds many labeling intentions, he notes. When asked to rate the taste and caloric content of yogurts and crisps, subjects estimated that the food labeled “organic” had significantly lower calories than food labeled “regular,” even when the foods were identical. That effect implies high-fat, high-sugar foods labeled organic would give consumers license to eat more than is healthy for them.
