Thursday March 22, 2018

Why do farmers use so much atrazine?

When the U.S. Environmental Protection Agency released a 500-page draft report in June arguing that the nation's second most popular weed killer poses risks to aquatic plants, fish and wildlife, farmer groups across the nation criticized the agency's call for tighter limits on use of that pesticide, atrazine. The U.S. Corn Growers argue EPA's new recommendations are excessively cautious, are based on science even the agency's own advisors say is flawed, and contradict more than 7,000 previous scientific studies that have found atrazine safe. According to Decatur farmer and Corn Growers President Larry Mussack, the new EPA usage limits would cut average field application rates to about one cup per acre, a level that would make the herbicide virtually useless for controlling weeds in a large portion of the Corn Belt. EPA would thus effectively eliminate the pesticide from the market.

Right now, farmers apply an estimated 36,000 tons of the chemical in about 200 different registered products every year to kill and control weeds in corn, sorghum and other crops. Why do farmers like Mussack use so much atrazine?

  • Despite being used for almost 60 years in the United States, atrazine remains one of the most reliable herbicides on the market, especially in killing weeds that are resistant to other herbicides. Excessive weed populations in a crop like corn compete with the crop for water, nutrients and sunlight and can directly reduce the amount of crop harvested from a given amount of land.
  • It is cost-effective, particularly for conservation tillage--fields that don't get the full plowing traditionally used--in order to help protect soil and water. Farmers can't simply stop using weed killers, so because atrazine is used on most of the country's corn acres, removing it from the market or severely limiting its use would mean switching to other, probably more expensive, alternatives. That means costs for farmers and food buyers would increase. A 2012 study reported by University of Chicago economists predicted farming without atrazine would add up to $59 per acre in productivity losses or additional production costs. While $59 per acre may not sound like much, in a business that lives or dies on tight margins--just like retail grocers--the unavailability of atrazine could make the difference between turning a profit and taking a loss. In some cases it could make a crop nonviable: A 2007 economic analysis in Kansas found that without atrazine, growing grain sorghum there would be economically impossible.
  • It's versatile. Farmers would likely have to resort to more complicated herbicide mixtures, because some of the alternative products don't control as wide a variety of weeds as atrazine does, at least not as effectively. The one atrazine product would have to be replaced by two or more different products to get the same weed control.

  • Atrazine has actually allowed farmers to use fewer chemicals, because of that effective versatility. Banning atrazine will not lead to less herbicide use; it will likely lead to more, and to more use of chemicals with even higher potential to impact the environment than atrazine.

  • In the bigger picture, use of atrazine actually improves the nation's water quality. Chemical weed killers like atrazine make no-till agriculture practical, and no-till farming dramatically lessens the amount of soil that washes away during storms, carrying the nutrients and sediments that clog streams and lakes. Estimates say no-till agriculture reduces soil erosion by as much as 90 percent compared to heavy tillage. No-till farming practices made possible by herbicide use also create habitat for wildlife. One of the alternatives to atrazine, were it to be banned, would be a return to mechanical tilling. A 2013 study in Wisconsin, where state law limits atrazine use in certain areas to protect against possible groundwater contamination, found that those limits caused farmers to begin plowing up formerly minimum-till or no-till acres, a change that can be assumed to increase soil erosion.
  • They view the case for abandoning atrazine as built on alarmism. Evidence some environmental advocates see as clearly demonstrating a need to stop using atrazine on farms, farmers and ag scientists see as a leap: an assumption that the low doses EPA found in water and soil somehow add up to more harm than research studies have actually demonstrated. EPA's draft report has already been met with calls for an outright ban, on the argument that the mere fact the chemical can make its way into the nation's waters means it must be banned--the rationale Europe used to ban its use. That kind of all-or-none thinking about pesticide residues, as longtime critic Dennis Avery put it almost 20 years ago, is an example of the "modern world waging a bizarre new war against its own success." Modern agriculture, he says, which has prevented famine, reduced cancer, saved soil and preserved wildlife habitat equal to more than 15 million square miles, is being attacked for doing so by using technology like atrazine to make that miracle possible. "Conservation tillage is the most powerful weapon against soil erosion that farmers have ever found," he wrote. "The weed killers permit them to quit plowing, and keep their soil in place with cover crops and crop residue. Without conservation tillage the world will face a huge topsoil crisis in the 21st century. With it, we have the most sustainable farming in 10,000 years."

The Nebraska Corn Growers is urging farmers and other stakeholders to enter public comments on EPA's proposal. If you'd like to tell the retail grocer's side of the story, you have until Aug. 5 to comment on the proposal.

Why do farmers get paid not to grow crops?

The U.S. Department of Agriculture announced in early May that Nebraska comes in third nationwide in the amount of farm-ground acreage the government pays farmers to leave unplanted and idle. Behind only Iowa and Washington in the program, this state will idle almost 775,000 acres total, an area roughly 1.5 times the size of all of Lancaster County.

To some, it has raised the perennial question: Why does the government pay farmers to not grow crops?

As is often the case, Internet mythology about food and farming has overshadowed some of the facts. The U.S. farm program began during the Great Depression of the 1930s as a limited safety net to help support the income of farmers who were being driven from the land by the thousands. In its early and following years, it did provide some forms of "set-asides," in which farmers were subsidized to reduce the supply of crops in order to make them more scarce and thus hold their prices up. And some vestiges of that supply-control mentality do exist even today, most notably in voluntary programs in which farmers of various commodities either pool funds to buy out other farmers to keep supplies down or voluntarily restrict production per farm.

However, with the passage of the most recent federal Farm Bill, in 2014, the U.S. government--for the most part--got out of the business of paying farmers directly to support crop prices. Instead, the farm program transitioned to turning ag commodity prices back over to the market, letting supply and demand determine price, and then making aid available to help farm owners adjust to that market, most notably in the form of subsidized insurance against crop losses in bad years.

The only real remaining government program that still pays farmers not to plant is the U.S. Conservation Reserve Program. The more than 30-year-old program pays landowners an annual rent over a contract period of 10 to 15 years to leave land the government considers environmentally sensitive out of crop production. It also makes cost-sharing funds available to help landowners pay for conservation improvements like planting grasses and trees and protecting streams and rivers. The United States currently enrolls about 23.4 million acres in CRP--down from a high of about 36 million acres 10 years ago, and near the maximum the Farm Bill will allow the government to pay for by 2018.

The government's goal in paying farmers an average of about $94 per acre in Nebraska is to return that environmentally vulnerable ground, which in many cases shouldn't have been plowed in the first place, to a more natural state by replanting land cover that was removed during normal cropping. In the process, the CRP hopes to improve water quality, prevent soil erosion and restore wildlife habitat. And in some cases, CRP is still recognized to play a smaller, but still important, role in giving farmers a source of income for protecting those acres and discouraging them from selling them for non-ag uses like housing developments and golf courses.

Are taxpayers getting their money's worth?

Nationwide, the CRP has been credited with reducing soil erosion by nearly 224 million tons a year, or about 6.8 tons per acre enrolled. By reducing erosion of soil that often carries excess nutrients into nearby waters, the CRP has also significantly improved those waters. Research estimates it has cut the amount of the polluting nutrient nitrate by 90 percent, sediment and herbicide by 50 percent, and phosphorus by as much as 30 percent in some farm regions. And by converting row cropland into native grasslands and trees, CRP has provided nesting cover, wintering habitat, and plant and insect feed to numerous wildlife species.

A 2012 review of selected CRP economic benefits by a pair of Oregon State rural social scientists estimated that enrolling an acre of land in the CRP improved the value of that acre by about $58 per year, in 2011 dollars. Naturally, most of that value accrues to the owner of the land. However, annual benefits from reduced soil erosion and increased recreational opportunities amounted to another roughly $49 per acre. The researchers cited studies estimating that only about 10 percent of that $49 goes back into the pocket of the landowner, with the remaining 90 percent accruing to society as a whole. They suggest those figures demonstrate that, although the performance of the CRP could be improved, the average national CRP rental cost of $52 per acre in 2011 delivers benefits that outweigh its costs to taxpayers.
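The per-acre arithmetic behind that conclusion can be sketched as follows, using the 2011-dollar figures cited above (this is an illustrative back-of-the-envelope, not a reproduction of the researchers' model):

```python
# Rough per-acre CRP benefit/cost sketch, using the 2011-dollar figures
# cited in the review above. Illustrative only.
landowner_value = 58.0   # annual land-value benefit to the owner, $/acre
public_benefits = 49.0   # erosion-reduction + recreation benefits, $/acre
owner_share = 0.10       # share of the $49 accruing to the landowner
rental_cost = 52.0       # average national CRP rental payment, $/acre

society_share = public_benefits * (1 - owner_share)  # portion flowing to the public
total_benefit = landowner_value + public_benefits    # combined annual benefit
net_to_taxpayers = total_benefit - rental_cost       # benefit left after the rent

print(f"Benefit to society at large: ${society_share:.2f}/acre")
print(f"Total annual benefit: ${total_benefit:.2f}/acre")
print(f"Benefit net of rental cost: ${net_to_taxpayers:.2f}/acre")
```

On these figures, the combined $107-per-acre benefit exceeds the $52 rental payment by a comfortable margin, which is the basis of the researchers' benefits-outweigh-costs conclusion.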

Here's why American farmers haven't gone organic

"The challenges of the 21st century demand a fundamental rethink of agriculture that takes environmental harm into account," says the Organic Consumers Association, the advocacy group for universal organic farming. "Promising methods and technologies like organic are in the vanguard of that effort. We cannot afford to move toward the future without such technologies."

So far, OCA's stated goal of converting 30 percent of American agriculture to organic by the start of this year has fallen short--by about 30-fold. Despite apparent strong consumer interest in organic food, some evidence consumers are willing to pay more for it, and better crop prices for organic farmers compared with prices for conventional crops, organic's share of U.S. tillable acreage has grown to only about 3.1 million acres, according to the most recent USDA survey data--less than 1 percent of the country's total tillable acres. Why aren't farmers playing along? A couple of recent studies give some meaningful indications of why they don't believe they can afford the technology.


Productivity still falls far short

Author and crop-technology consultant Steven Savage conducted an in-depth analysis of USDA data sets on both conventional and organic crop production from 2008 and 2014, ultimately making 370 different comparisons of organic and total data for the same crop in the same state where the organic production represented at least 20 acres. His analysis covers 80 percent of U.S. crop acreage, Savage says. He found that in 84 percent of the comparisons, organic crop yields were lower than conventional yields, most in the range of 20 percent to 50 percent lower. In addition, of the 9 percent of cropping cases where organic was more productive than conventional, most were hay or silage crop systems--that is, feed for animals, not people.

Higher prices still not worth the uncertainty?

One of the issues hampering widespread adoption of organic production, particularly for the "big three" farm crops that fill the lion's share of U.S. acres--corn, soybeans and wheat--is that data from cropping experiments suggesting farmers can make more money from organic don't play out consistently in the real world, as Savage's analysis demonstrates. USDA economists William McBride and Catherine Greene attempt to get to the bottom of this contradiction in a new study from USDA.

They note several experimental studies have found that some conventional farms could in fact earn higher returns if they transitioned to organic production, yet adoption of the organic approach among U.S. field crop producers remains extremely low. The problem, they say, is that the economic analyses used with the experimental research have primarily examined only operating or variable costs while excluding the economic costs of resources such as land, labor and capital. Their study's findings, which adjusted for those hidden costs, showed the economic costs of organic production were roughly $83 to $98 per acre higher than conventional for corn, $55 to $62 per acre higher for wheat, and $106 to $125 per acre higher for soybeans.

Still, the remaining mystery is why organic's relatively higher crop prices don't attract more farmers, who could expect to earn greater returns despite the higher costs if they transitioned to organic. The USDA researchers suggest even the higher returns aren't enough to encourage farmers to take on the risk organic brings. Organic field-crop production is particularly challenging compared with conventional production in achieving effective weed control and crop yields. But costs are also incurred to obtain and then maintain organic certification. Before an operation is certified to sell organic crops, the cropland must be managed organically for a minimum of 36 months, meaning a farm must endure roughly three years of selling crops--raised at organic's higher costs--into a market paying only the relatively lower conventional crop prices. But the bigger risk may follow that period, they suggest: The sunk production costs of the organic transition depend on continuing high prices for organic crops, which are not guaranteed long into the future. That uncertainty may be the risky cloud that keeps the typical farmer from adopting the movement on anything other than small plots.


Why do ranchers still brand cattle?

Q Seriously, I know Nebraska is a big cattle-ranching state, but this is not the wild, wild West anymore. Yet you still see cattle with brands burned into their hides. Why?

A You're right. Branding young calves every spring carries remnants of the ranching heritage of cattle raising. But the practice also has an important purpose on today's ranches--so important that, according to USDA estimates, almost half of all cattle and calves across the nation are still branded.

For more than 500 years, burning a distinguishing mark into the hide of large animals like cattle has been a common, easily visible means of distinguishing their ownership. That system of identification was--and still is--especially important in the western U.S. states, where cattle still graze on large areas of publicly owned lands. On those large tracts, it's not uncommon for cattle owned by one person to get mixed into herds owned by somebody else. Branding makes a quick and permanent way to separate those cattle out and return them to home herds.

But in Nebraska, almost all our grazing land is privately owned, so that kind of mixing is less common. In Nebraska's case, although the purpose of branding remains to identify the owner of a herd, that identification is predominantly to help prevent modern-day cattle rustling.

How big is the problem? The Nebraska Brand Committee, a division of state government tasked with overseeing cattle brand registrations and enforcement, reports that for the five-year period ending in 2013, the committee's hired criminal investigators brought 15 felony convictions over more than 400 cattle valued at $338,000, with an additional three cases pending over another 113 head. And although not all were stolen, the committee also reported it recovered 810 cattle for their owners between July 2015 and March 2016, valued at nearly $1.25 million, by using their brands.

The recent record-high cattle prices have made cattle a lucrative target for thieves. Unlike the case with most stolen goods, which are typically sold for pennies on the dollar, if thieves can escape the eyes of brand investigators, those animals can usually be sold for full market value. That black market has led several states to tighten up their penalties and beef up their own brand-inspection systems. In December, Kansas formed a new office to investigate livestock theft. In Oklahoma, penalties for cattle theft now rival criminal assault, and a team of special rangers targets rustlers. Like 13 other states, Nebraska has brand inspection laws, requiring cattle sold in the western two-thirds of the state to be branded and that brand to be inspected by one of the agency's brand inspectors before cattle can be sold or hauled out of the region.

Branding remains the only meaningful, permanent means to ensure cattle remain identified by owner. Plastic ear tags, the second most common method of identifying cattle, can be cut off or lost during grazing, leaving livestock auctions with little or no means of knowing that an animal has been stolen. Although branding is still done the old-fashioned way, using irons heated in a fire pit or barrel, other techniques have also been introduced, including electric branding irons and freeze branding, which uses extreme cold to kill the cells in the hide that produce pigment, rendering the brand in white.

You can see all the brands registered in the state of Nebraska at the Brand Committee's online directory.

No pesticide is safe anymore?

USDA’s latest Pesticide Data Program report on samples collected in 2014 found, again, that America’s food supply is safe, showing pesticide residues well below even the more sensitive levels that could pose health risk for infants and children. However, the news that pesticide residue levels were at or below EPA-deemed safe levels in all but 38 out of 10,619 fresh fruit and vegetable, baby food, salmon, oat and rice samples tested was not reassuring to anti-technology advocates like the U.S. Right to Know coalition.

That group and others questioned why USDA doesn’t test fresh produce for all pesticides, in particular glyphosate, the broad-spectrum weed killer whose use has gone up since row crops were genetically modified to resist its action. USDA argues because glyphosate isn’t widely used in fresh produce, there’s no good reason to go to the expense of testing for it in those particular foodstuffs covered by the USDA report.

True to form, the Environmental Working Group joined the criticism of glyphosate in early February, claiming increased use of the pesticide and increases in allowable levels of residues by regulators—in response to scientific studies showing those levels are safe—are posing an unacceptable risk to consumers. EWG breathlessly reports the herbicide has showed up in “samples of honey, soy sauce, infant formula and even breast milk.”

But once again, the activist group repeats the same critical error it makes annually in its highly publicized, but logically suspect, “Dirty Dozen” list of the most pesticide-contaminated produce. Its annual report, based on the same USDA data, links pesticide food residue data with toxicological profiles for each chemical. The end result is a litany of common pesticides, most showing up as residues on any of 93 different foods listed, ranging in incidence from none at all to as high as nearly 90 percent.

But in all the alarming facts and figures, EWG fails to enter the most important discussion of all: Are the residues relevant?

Farmer Goes to Market re-analyzed some of EWG’s data on incidence to paint a more accurate picture of the real issue: exposure. We found that in every case, the amount of produce an adult or child would have to eat daily to exceed the minimum safety thresholds established by the U.S. Environmental Protection Agency is far beyond anyone's physical capability, let alone desire.


[Chart: Pounds of strawberries eaten per day]

[Chart: Pounds of potatoes eaten per day]
EWG’s repeated habit of equating presence with danger is not only inaccurate, it borders on harmful exploitation if it drives consumers away from the healthy foods they most need.

As water contamination expert Dr. Shane Snyder, research and development project manager for the Southern Nevada Water Authority, told Congress in 2008, "Are we going to make decisions based upon our ability to find contaminants, or based upon protection of public health? I am not a policy-maker; I am a scientist. However, I can tell you with absolute certainty that, if we regulate contaminants based upon detection rather than health effects, we are embarking on a futile journey without end. The reason is simple: Decades ago, we could only detect contaminants at parts per million levels. Years ago, we advanced to parts per billion. We are now able to detect compounds at the parts-per-trillion level, and are breaching the parts-per-quadrillion boundary in some cases."

Although Snyder's testimony concerned pharmaceutical residues found in drinking water, the point applies equally to fear-mongering based on pesticide residues in food. If we insist on scaring people away from food based simply on our ability to detect a trace compound, we not only risk reducing grocery profitability by scaring shoppers away from one of the highest-margin areas of the store; we are also on our way to making meaningful regulation based on realistic risk impossible.
