
Inside EPA’s
Risk Policy Report
An exclusive weekly report for scientists interested in environmental
policymaking and policymakers interested in science
Vol. 21, No. 44 - November 11, 2014
After EPA Review, Advocates, Industry Spar Over Neonicotinoid Data
In the wake of an EPA analysis finding negligible benefits from a controversial pesticide product, industry officials and environmentalists are sparring over neonicotinoids' efficacy, with industry officials promising additional data for a broader EPA analysis and environmentalists arguing companies should have given regulators the information years ago.
EPA is taking comment until Dec. 22 on the agency's Oct. 15 analysis that found neonicotinoid-treated seeds largely fail to improve soybean yields, a report environmentalists say bolsters their long-standing calls for prohibition of the systemic pesticides, which move into plants' pollen, nectar and stems and which, advocates say, harm pollinators.
But the pesticide producers coalition Growing Matters is pushing back on EPA's efficacy analysis, calling for a more thorough review and releasing the first installments in a series of 15 reports they say show the value of neonicotinoids to farmers and the economy. Relevant documents are available on InsideEPA.com. See page 2 for details.
"Neonicotinoid seed treatments provide convenience, safety improvements for workers, less disruption to
continued on page 6
ACC Sees GOP Wins Clearing Path For TSCA Reform, But Hurdles Remain
American Chemistry Council (ACC) President Cal Dooley says GOP midterm election gains clear a path to advance a Toxic Substances Control Act (TSCA) reform bill, as outgoing Senate environment panel chair Barbara Boxer (D-CA) will no longer be able to block the bill in a Republican Senate, though major hurdles remain, including how to end a fight over the bill's preemption of state toxics programs and how the 2016 elections might affect the bill.
During a Nov. 18 conference call with reporters, Dooley noted that outgoing Senate Environment & Public Works Committee (EPW) Chairman
continued on page 12
EPA Weighs Major Data Questions In Plan For Reviewing SO2 Air Standard
EPA plans to address major data policy questions in its plan for reviewing the sulfur dioxide (SO2) national ambient air quality standard (NAAQS), including how to assess and incorporate first-time air monitoring data collected by states following the last review and other information that could inform whether EPA revises the standard.
The final integrated review plan (IRP) released Nov. 10 is an early part of the agency's Clean Air Act-mandated five-year review of the SO2 NAAQS, which the agency last updated in 2010. At that time EPA established a new one-hour standard at a level of 75 parts per billion (ppb) and revoked the two
continued on page 10
EPA Advisors Supportive Of 'Risk-Weighted' VOC Drinking Water Regulation
EPA advisors are generally supportive of the agency's novel strategy to regulate carcinogenic volatile organic compounds (cVOCs) in drinking water as a group by using an approach based on risk-weighted factors, despite some concerns about policy and technical hurdles in using this approach, which is costlier than traditional methods.
"The risk-based method is the way to go," National Drinking Water Advisory Council (NDWAC) member Bob Vincent said at the panel's Nov. 6 meeting, where EPA officials outlined findings from a working group tasked with implementing the agency's landmark 2010 plan to regulate 16 cVOCs in
continued on page 8
IN THIS ISSUE . . .
Advocates, Industry Seek Changes To EPA-FDA Fish Consumption Advisory ..................................... page 3
Advocates Detail Opposition To EPA’s Novel Dioxin Cleanup Proposal ............................................... page 5
Observers Downplay Data Losses From EPA Reducing Air Monitor Network .................................... page 11
Industry Urges EPA Advisors To Press For Reconsideration Of TMB Analyses .................................. page 14
Advocates Oppose Delay In Federal Pollinator Plan As EPA Seeks Input
Environmental groups are criticizing an EPA-led federal pollinator task force for delaying development of a strategy
to comply with President Obama’s memo on protecting honey bees and other pollinators, thereby preventing any changes
from occurring before the spring planting season begins, although federal officials are holding public comment sessions
to inform the plan’s development.
EPA and the U.S. Department of Agriculture (USDA) are leading the federal Pollinator Health Task Force in crafting
a national strategy to protect pollinators, and the task force had been slated to complete the strategy to better understand,
prevent and recover from pollinator losses before the end of the year.
But a White House official Oct. 22 told a meeting of the North American Pollinator Protection Campaign in Washington, DC, that the December deadline for completing the strategy has been postponed by several months.
Environmental groups Center for Food Safety (CFS) and Friends of the Earth say in separate statements that the delay
precludes any new pollinator protection efforts from taking effect before the upcoming growing season.
“We are disappointed that the White House is already missing its deadlines,” CFS says in an Oct. 23 statement. “As
beekeepers, scientists and environmentalists continue to stress how urgent our current pollinator crisis is, it’s imperative
that our nation’s leadership be doing all it can to respond.”
Friends of the Earth, in an Oct. 22 statement, says, "The weight of the science tells us that we must take immediate
action to suspend systemic bee-killing pesticides before the spring planting season begins.”
Obama created the federal pollinator task force with his June 20 memo that seeks to stem massive declines in
pollinators by improving habitat, assessing how pesticides and other stressors contribute to pollinator declines and taking
action where appropriate, including a call for EPA to assess potential risks of neonicotinoid pesticides to pollinators.
Environmentalists have said the administration’s plans, which largely call on federal agencies to improve bee habitat,
do not go far enough because they stop short of calling for specific regulatory restrictions and other measures advocates
are seeking to reduce the potential risks of pesticides to bees.
While the federal strategy has been delayed, EPA and USDA are seeking public comment through a pair of listening
sessions to inform development of the plan. In a Nov. 5 Federal Register notice, EPA says the task force is seeking
comment on best management practices, including how to mitigate pesticide risks to pollinators, as well as on improving
pollinator habitat, research and public-private partnerships that seek to improve cooperation between beekeepers and
pesticide applicators.
The meetings are scheduled for Nov. 12 in Arlington, VA, and Nov. 17 in Riverdale, MD.
EPA and USDA are working to stem massive declines in honey bee populations seen since 2006, and have named
pesticides as one of several factors in the declines, with others including poor nutrition, shrinking habitat, and parasites
such as the varroa mite. Industry officials have said neonicotinoid pesticides do not harm pollinators when used properly
and have focused on the varroa mite as a primary culprit in the declines. — Dave Reynolds
Hot Documents Available on InsideEPA.com
Subscribers to InsideEPA.com have access to hundreds of policy documents, including draft regulations
and legislation, as well as a searchable database of daily news stories and documents. The documents listed
below are in addition to the background documents referenced throughout this issue. For more information
about Risk Policy Report, or for a free trial, call 1-800-424-9068.
• Advisors Struggle To Refine Advice To EPA On TMBs IRIS Assessment
• Advocates Oppose EPA's Proposed Dioxin Cleanup
• EPA Finalizes Cuts To Chemical Speciation Network
• EPA Releases Final Integrated Plan For SO2 NAAQS Review
• Panel Supports EPA Plan To Regulate Groups Of Water Contaminants
• Pesticide Industry Issues Reports On Neonicotinoids' Benefits
• USGS Says Wetlands Create 'Ideal Conditions' To Convert Methylmercury
USGS Report Calls For Balancing Wetlands’ Benefits With Mercury Concerns
Environmental managers should weigh the ecological and water quality benefits of wetland construction and restoration against data that show wetlands promote the conversion of inorganic mercury into methylmercury, a potent neurotoxin that accumulates in fish and other aquatic organisms, a new U.S. Geological Survey (USGS) report says.
The Oct. 14 report says that while predator fish in one-fourth of the nearly 300 streams sampled contained methylmercury at levels exceeding EPA’s fish tissue mercury criterion for the protection of human health, 0.3 parts per million,
concentrations of the ubiquitous metal were highest in “wetland-dominated landscapes, particularly in coastal plain
streams of the Southeastern United States.”
“Across the United States, methylmercury concentrations in fish and stream water generally were highest in undeveloped areas with abundant wetlands, which provide ideal conditions for methylmercury production. In contrast, methylmercury levels in largemouth bass from urban streams were the lowest of all land uses and land covers studied,” says the
report, “Mercury in the Nation’s Streams—Levels, Trends, and Implications.” Relevant documents are available on
InsideEPA.com. See page 2 for details.
The report summarizes selected stream studies conducted by USGS since the late 1990s, while also drawing on work
from other sources, and USGS says its studies “provide the most comprehensive, multimedia assessment of streams
across the United States, and yield insights about the importance of watershed characteristics relative to mercury inputs.”
“Three key factors determine the level of mercury contamination in fish—the amount of inorganic mercury available
to an ecosystem, the conversion of inorganic mercury to methylmercury, and the bioaccumulation of methylmercury
through the food web,” the report says. Wetlands, which have limited dissolved oxygen and abundant organic matter, can
increase the conversion of inorganic mercury to methylmercury, USGS says.
Coal combustion is the predominant source of the mercury found in fish, with emissions deposited in lakes and rivers from the atmosphere. High levels of methylmercury in fish tissue remain the leading cause of state-issued fish consumption advisories.
Although regulation eliminating mercury from many products and waste streams has resulted “in about a 60-percent
decrease in emissions in the United States since 1990,” USGS says, further reductions in mercury pollution to the air are
necessary.
“Methylmercury production in wetlands and other aquatic ecosystems generally increases with increasing sulfate,
which can be contributed by anthropogenic sources, such as emissions from coal burning. Thus, decreasing sulfate
emissions, in response to implementation of the Clean Air Act, are expected to cause decreasing methylmercury concentrations in some areas of the United States,” according to the report.
The report says the United States must participate in a global strategy to reduce mercury emissions. While Eastern
states will benefit from reductions in domestic pollution, Western states are more susceptible to pollution from other continents.
“[E]mission controls will provide smaller benefits in the Western United States, where reduced domestic emissions
may be offset by increased emissions from Asia. Implementation of the recently adopted U.S. Mercury and Air Toxics
Standards and worldwide Minamata Convention goals should lead to reductions in both U.S. and global mercury emissions,” the report says.
Furthermore, USGS says, monitoring programs, which focus mostly on methylmercury concentrations in fish tissue,
should be re-evaluated in order to link mercury levels to their sources.
“Given the complexities of mercury emissions, transport pathways, and ecological factors that influence the extent of
methylmercury contamination in fish, a multimedia monitoring approach is critical to track the effectiveness of management actions intended to reduce mercury emissions and resulting environmental mercury levels,” according to the report.
Advocates, Industry Seek Changes To EPA-FDA Fish Consumption Advisory
Industry groups and public health advocates are presenting competing recommendations to a federal advisory panel
on fish consumption, with industry pressing for EPA and the Food and Drug Administration (FDA) to encourage pregnant
women and children to eat more fish and others questioning FDA’s net effects model that is the basis for the agencies’
latest advice.
At issue is guidance on fish consumption that EPA and FDA jointly issue to women of childbearing age and children.
FDA first deemed such advice necessary in the mid-1990s because of concerns about the levels of methylmercury in
seafood, FDA official Phil Spiller said during the Nov. 3-4 meeting of FDA’s Risk Communication Advisory Committee
in Silver Spring, MD.
But that and subsequent advice, eventually issued jointly by EPA and FDA, led many women to stop eating fish, a shift that carries its own health risks: the beneficial oils and fatty acids in fish boost fetuses' and young children's healthy neurological and eye development.
In the late 2000s, FDA began efforts to build a model including both risks and benefits of fish consumption, in an
effort to understand what Spiller calls “net effects” and better inform the advisory. But a draft version of the model,
released in 2009 for public comment and peer review, was roundly criticized by many environmental and public health
groups for underestimating mercury risks. And a leaked version of EPA’s comments on FDA’s model raised similar
concerns (Risk Policy Report, April 21, 2009).
FDA released the finalized version of the model along with the draft updated fish consumption advice in June. FDA
has not sought further peer review, and the model has not been published in the scientific literature. The update suggests
that women eat a minimum of eight ounces and a maximum of 12 ounces of fish per week. The advice is
based on the model, Spiller told the committee.
Spiller explained that the model was developed to assess the net effects, positive and negative, of eating fish.
As the agency developed the model, Spiller explained, it became apparent that "adverse effects, with a greater dose, just
kept getting worse” but “beneficial effects are not linear.” Instead, they plateau. “What you will get after the plateau is
just more methylmercury and no more benefit,” Spiller said. He added that the model “is very supportive of the 8-12
ounces [of fish per week] we’re trying to get people to eat.”
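As a purely stylized sketch of the plateau logic Spiller describes, and not FDA's actual net effects model, the snippet below pairs a saturating benefit term with a risk term that rises in proportion to dose; every parameter value in it is hypothetical.

```python
# Stylized illustration of the "net effects" logic described by Spiller: benefits
# of fish consumption saturate while methylmercury-related risk keeps rising with
# dose. All numbers are hypothetical and are NOT taken from FDA's model.

def net_effect(oz_per_week: float,
               benefit_cap: float = 5.0,      # hypothetical maximum benefit (IQ points)
               half_saturation: float = 4.0,  # hypothetical intake at half the maximum benefit
               risk_per_oz: float = 0.1):     # hypothetical risk slope per weekly ounce
    """Notional net effect (benefit minus risk) for a weekly fish intake."""
    benefit = benefit_cap * oz_per_week / (half_saturation + oz_per_week)  # plateaus
    risk = risk_per_oz * oz_per_week                                       # keeps rising
    return benefit - risk

if __name__ == "__main__":
    for oz in (0, 4, 8, 12, 16, 20, 30):
        print(f"{oz:>2} oz/week -> net effect {net_effect(oz):+.2f}")
```

Under these made-up numbers the net effect flattens around eight to 12 ounces per week and then declines, the qualitative pattern Spiller describes: past the plateau, additional intake adds methylmercury without adding benefit.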
National Fisheries Institute (NFI) representatives at the meeting warned the committee that American women eat far
too little fish, putting their children at risk of missing the benefits of fish consumption. They urged the committee to make
the next advisory more positive to try to encourage women to eat more fish. And they argued that FDA’s model supports
exceeding some of the limits the agencies wrote into their guidance: the 12-ounce per week overall maximum and a six-ounce per week maximum for Albacore tuna.
“The media and doctors have created a nation of low seafood eaters,” Rima Kleiner, a dietician with NFI, told the
committee Nov. 3. “It is time for the government to step up to the plate and get the advice right. Pregnant women do not
consume [fish] amounts anywhere near enough to be at risk [of mercury exposure], and they’re not consuming enough to
benefit their babies.”
Kleiner argued that the “six ounce [weekly] limit on Albacore tuna isn’t supported” by the model “and should be
removed.” Kleiner urged the committee to “encourage [women] to eat a variety of fish and not the four species” included
in the agencies’ list of fish for pregnant women to avoid. “The real risk in eating seafood is not eating enough,” Kleiner
said.
And a contractor to NFI, Lori Davis, a statistician at Impact who reviewed the model, told the committee Nov. 3 that
the agencies’ advisory does not “reflect the positive aspects of the report. . . . Net benefits do not end at 12 ounces of fish
per week. . . . The evidence may not support a 12 ounce per week limit.”
Davis also said that “according to the modeling in the net effects report, there is not a good reason to support a limit
of six ounces [of tuna] per week. . . . The report gives irrefutable evidence that the benefits of fish consumption is very
high. The vast majority of pregnant women do not [eat enough fish]. Encouragement is the most important message.”
But environmentalists and public health advocates at the meeting urged the agencies to re-evaluate the model,
arguing that it contains numerous uncertainties and that epidemiology studies published in the past decade show health
effects in children exposed prenatally to levels of methylmercury experienced by American consumers.
“Models are not [ready for use in this way]. The model is a really useful tool for analytical [purposes] but it does not
produce facts,” Edward Groth III, a scientist retired from Consumers Union, told the committee Nov. 3. “It’s true there is
a lot of new evidence of benefits, but there is also an enormous amount of new evidence on risks in the past decade and
FDA has ignored it.”
In written comments to the committee, Groth raises concerns about the data used as the basis for the FDA model,
noting that the meta-analysis of studies used to form the dose-response curve for methylmercury exposure risk is confounded with benefits from fish consumption. There is no newer meta-analysis to replace the original, Groth says in a
Nov. 7 interview.
But the biggest failing of the model, in Groth's opinion, is that it does not account for differences among humans and how
they are likely to be affected differently by exposure to methylmercury. There is nothing that addresses this uncertainty in
the model or the advice, Groth argues.
The model also looks at benefits and impacts on IQ points, but not other neurological effects that can be seen with
mercury exposure, Groth said, such as attention deficit hyperactivity disorder. “It’s a blind spot in the model. It means
you need to consider the public health effect” more broadly.
Similarly, Amy Kyle, an associate professor of public health at the University of California at Berkeley, urged the
committee to suggest caution to FDA in its reliance on the model in crafting the fish advisory. Kyle noted that the model
layers various uncertainties on top of one another, and she urged the committee to remember that the most sensitive
population to methylmercury risk is the developing fetus. “As you think about risk communication,” ask yourself, ‘Do we
really know the risk?’ I think you’ve heard concerns about underestimates. Also, there is a tremendous amount of modeling and estimating . . . Uncertainty is characterized well in some of the specific elements . . . but the overall study
[uncertainty] is not.”
As one example, Kyle pointed to the model's estimate that it would protect 99 percent of children, leaving
1 percent unprotected. “We don’t usually accept that level,” Kyle said in a Nov. 7 interview. “But they also say it could be
as high as 50 percent [of children that wouldn’t be protected by the calculations]. That would be very high.”
In her remarks to the committee, Kyle also said that the analysts’ decision to translate all the various neurological and
developmental effects seen into impact on IQ points made her “a bit queasy.” In the interview afterward, Kyle explained
that the different effects “don’t always track,” that the effects are sometimes motor effects, sometimes language effects, or
something else. The effects “are not all on the same curve,” she said. “They’re not on the same trajectory. It’s a limitation
of the study.”
Both Kyle and Groth also criticized FDA for deciding to use the model as the basis for the fish advice without
publishing the model in a peer reviewed journal. Still, Groth indicates that there are ways in which the model is useful
and can be helpful in informing the advisory. In his written comments, Groth outlines several exercises he recommends
that FDA staff perform to better inform the fish advisory.
Some could be accomplished in the model’s current form, “because what is of interest is not the quantitative results
per se, but rather the ability to compare results of different scenarios . . . The model allows us to compare what might
happen if the government offered different versions of fish consumption advice, and (some, most or all—the model can
be flexible about that) pregnant women followed it,” Groth writes in undated comments to the FDA advisory committee.
Groth suggests FDA run additional scenarios beyond the eight it describes in the modeling paper, to consider issues such
as risks and benefits to those who don’t comply with the advice, or if the advice were altered to recommend only those
fish containing less than 0.06 parts per million methylmercury.
Groth also suggests ways to improve the model, such as incorporating uncertainty factors to account for differences
between individual humans and their responses to methylmercury exposure and trying different dose-response curves in
the model from other, newer studies than the current 2007 meta-analysis that is the basis for the model. — Maria Hegstad
Advocates Detail Opposition To EPA’s Novel Dioxin Cleanup Proposal
Environmentalists are reiterating their opposition to EPA’s proposed plan for cleaning up dioxin from a Michigan
river floodplain, arguing in comments to EPA that the site-specific plan’s novel cleanup standards are based on faulty
assumptions, fail to consider cumulative exposures and are inadequate to protect human health and the environment.
The proposed cleanup goals “are much too high to be protective” and fail to “take into account the already high
dioxin body-burden in” area residents, the Lone Tree Council, a Michigan environmental group, says in Oct. 10 comments. Relevant documents are available on InsideEPA.com. See page 2 for details.
In comments prepared by the consulting firm Environmental Stewardship Concepts, LLC, Lone Tree Council argues
EPA’s Aug. 12 proposed plan for cleaning up the Tittabawassee River Floodplain inappropriately focuses on non-cancer
rather than cancer health risks. The group also protests the limited information that is the basis for the plan, with particular concern about its inclusion of research from the site's responsible party, Dow Chemical Company.
EPA, which is working with the Michigan Department of Environmental Quality (DEQ) on cleaning up the overall
Saginaw-Tittabawassee River and Bay site, took comment on the proposed plan for cleaning up contaminated floodplain
soil through Oct. 14. The floodplain cleanup is being closely watched by environmental groups who say EPA’s handling
of the site could set a precedent for how the agency implements its non-cancer risk estimate for dioxin, crafted in the
agency’s 2012 Integrated Risk Information System (IRIS) assessment.
Dioxin is a category of persistent and bioaccumulative compounds created inadvertently through industrial incineration processes, as well as through trash burning and forest fires. It was a contaminant of the herbicide Agent Orange used during the Vietnam War.
Environmentalists have long urged EPA to strengthen dioxin cleanup requirements and generally praised a 50 parts
per trillion (ppt) limit EPA floated following the agency's February 2012 IRIS non-cancer risk assessment of 2,3,7,8-tetrachlorodibenzo-p-dioxin, the most toxic form of the compound. That limit was significantly more stringent than the
1,000 ppt limit EPA set in 1998.
The IRIS assessment set an oral reference dose (RfD) — or amount below which EPA expects no adverse health
effects if ingested daily for a lifetime — of 0.7 picograms per kilogram bodyweight per day (pg/kg-day). The 2012 IRIS
assessment of dioxin’s non-cancer risks was part of a reassessment of dioxin’s health risks that agency staff has been
working on for decades, though IRIS has yet to complete the cancer portion of that assessment.
The proposed cleanup plan for the Tittabawassee River floodplain soil also relies on the 2012 non-cancer RfD. But
EPA and DEQ also considered studies of how contamination is absorbed into the bloodstream and tissues after a person is
exposed in their efforts to derive site-specific non-cancer risk values. The agencies’ August document on the site-specific
standards also notes other factors that may limit exposures, including that dioxin levels vary widely in the river floodplain
and cold weather often limits exposures to contaminated soil because the ground is frozen and people spend less time
outside.
After EPA announced the proposal this summer, environmentalists told Inside EPA the plan’s proposed cleanup
standards of 250 ppt in residential areas and 2,000 ppt in other land areas, such as farms, parks, commercial properties
and a wildlife refuge, showed EPA floating significantly weaker cleanup standards than the 50 ppt standard the agency
estimated in 2012 and which industry groups have claimed is flawed and overly stringent (Risk Policy Report, Aug. 19).
The proposed cleanup goals are based on protecting against non-cancer risks because EPA has not yet issued the
cancer values for dioxin. But in the document supporting the proposed cleanup, EPA says the site-specific cleanup levels
based on the 2012 non-cancer RfD are expected to be protective of cancer risks. The agency also says that development
of cancer risk information “will take some additional time, and no projected completion date is available.”
Dow, the site’s responsible party, declined a request seeking the company’s comments on the floodplain soil cleanup,
referring the request to EPA. A spokesman for EPA's Region 5 also declined a request to provide the public comments submitted to
the agency, but said the Region would provide a “responsiveness summary” when it is completed.
In a statement to Inside EPA, the Dow spokesman said, “we remain committed to resolution of this issue and will
continue working collaboratively with the EPA, DEQ and the community.”
In the Oct. 10 comments, Lone Tree Council argues there is insufficient evidence to merit deviating from a long-standing conservative default oral soil bioavailability factor, or relative bioavailability (RBA), of 1, which assumes 100
percent of dioxins present in contaminated soil could interact with an animal or human that ingested the soil, causing
harm.
EPA’s August document on calculating the site-specific standards shows EPA set an RBA of 0.43 for use with EPA’s
2012 non-cancer RfD, and that the agencies considered a Dow study of RBA of dioxin in soil in crafting the site-specific
standards.
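For readers unfamiliar with how a relative bioavailability factor enters this kind of calculation, the sketch below applies a generic soil-ingestion screening equation rather than the agencies' actual site-specific derivation; only the 0.7 pg/kg-day RfD and the RBA values of 1 and 0.43 come from the documents described above, while the body weight and soil ingestion rate are assumed defaults for illustration.

```python
# Generic soil-ingestion screening sketch -- NOT the agencies' site-specific
# derivation. Only the RfD (0.7 pg/kg-day) and the RBA values (1 and 0.43) come
# from the documents discussed above; the intake assumptions are illustrative.

RFD_PG_PER_KG_DAY = 0.7     # EPA's 2012 non-cancer oral reference dose for dioxin
BODY_WEIGHT_KG = 15.0       # assumed young-child receptor
SOIL_INGESTION_G_DAY = 0.1  # assumed incidental soil ingestion rate

def allowable_soil_ppt(rba: float) -> float:
    """Soil concentration (pg/g, i.e. ppt) at which the daily ingested dose equals the RfD."""
    return RFD_PG_PER_KG_DAY * BODY_WEIGHT_KG / (SOIL_INGESTION_G_DAY * rba)

print(f"default RBA = 1.00 -> about {allowable_soil_ppt(1.0):.0f} ppt")
print(f"site RBA    = 0.43 -> about {allowable_soil_ppt(0.43):.0f} ppt")
# Lowering the RBA from 1 to 0.43 raises the allowable soil concentration by a
# factor of 1/0.43 (roughly 2.3), which is the crux of the bioavailability dispute.
```

The point of the sketch is the scaling: an RBA of 0.43 raises the soil concentration that corresponds to the same ingested dose by a factor of roughly 2.3 relative to the default assumption of full bioavailability.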
But the Lone Tree Council says “the assumptions regarding the relative bioavailability are not appropriate and at
least one is illogical to the point of being arbitrary and not based on any empirical data.” Additionally, the group says that
the few studies EPA cites to support use of a weaker bioavailability factor in setting cleanup goals are inconclusive and "have small sample sizes, and are largely funded by Dow, for which there is an obvious conflict of
interest.”
Lone Tree Council also says agency risk assessors should have considered risks from inhalation exposures, and that
the agencies’ proposal includes no discussion of ambient levels of dioxin, despite years of releases in the area. Additionally, the advocates urge EPA to strengthen its assessment of oral risks to account for bioaccumulation in plants and
animals, including livestock.
The group says, “A high number of uncertainties exist within the risk assessment process at this site, and thus, the
most conservative default assumptions should be used.” — Dave Reynolds
Advocates, Industry Spar Over Neonicotinoid Data . . . begins on page one
beneficial insects (and less chance of secondary pest resurgence), and pesticide resistance management through less
reliance on older broader-spectrum alternatives,” the industry contractor AgInfomatics says in an Oct. 22 statement.
“These other values need to be integrated into the EPA assessment.”
Among the reports the industry officials have promised to release this year is a study of how neonicotinoid-treated
seeds improve soybean yields that will provide EPA with additional data to support a more thorough assessment,
AgInfomatics has said.
But a source with the Center for Food Safety (CFS) argues in an interview with Inside EPA that industry should have
provided efficacy data long before EPA’s Oct. 15 analysis. Companies should have provided the information before the
products were put on the market through EPA’s Federal Insecticide, Fungicide and Rodenticide Act (FIFRA) registration
process, and certainly after EPA's pesticides office in 2013 requested efficacy data from neonicotinoid registrants.
“EPA has asked for that information for years, why was the industry holding the information back?” the CFS source
says, adding the long-awaited data is “probably not very reliable.”
Also, CFS on Oct. 21 filed a Freedom of Information Act request with EPA seeking information "related to past,
pending or future pollinator field tests” that would produce data on “any conditional product registrations” of
neonicotinoid pesticides.
An official with AgInfomatics tells Inside EPA that studies of pesticide benefits are complex and time-consuming.
Growing Matters, the industry consortium led by Bayer CropScience, Syngenta and Valent U.S.A. Corporation, hired
AgInfomatics to study neonicotinoid benefits in June 2013, the source says, adding that the report on the efficacy of
neonicotinoid-treated seeds in soybean production will be released before the end of the year.
EPA is taking comment on the Oct. 15 efficacy analysis to inform the agency’s ongoing registration review of
imidacloprid, clothianidin and thiamethoxam, which is expected to take several more years. The review will consider
whether the three neonicotinoids perform their intended function without causing unreasonable adverse effects to human
health or the environment, according to the agency’s Oct. 22 Federal Register notice seeking public comment. And EPA
has said the evaluation could lead to label restrictions or product bans.
EPA and the U.S. Department of Agriculture are working to stem massive declines in pollinators seen since 2006 and
have named pesticides as one of several factors, with others including poor nutrition and parasites, including the varroa
mite.
EPA, in August 2013, moved quickly to strengthen labeling requirements for foliar applications of neonicotinoids
after the pesticides were suspected in bee kills following pesticide sprays at several sites in Oregon, and agency officials
said they were also working to address potential risks to bees from treated seeds, but that additional measures would take
longer.
Whether neonicotinoid-treated seeds, a staple of modern agriculture, increase crop yields has been a central question
in environmentalists’ push for EPA to restrict neonicotinoid use and speed reviews of the controversial systemic pesticides (Risk Policy Report, Sept. 30).
In a July 22, 2013, letter, EPA asked registrants of the neonicotinoids imidacloprid, dinotefuran, clothianidin and
thiamethoxam to provide efficacy data “that describes the movement and concentration of active ingredients and major
degradates in plant structures, fluids and tissues."
While the industry report countering EPA’s efficacy finding for neonicotinoid-treated seeds in soybean production
has yet to be released, industry officials have called the agency’s analysis “preliminary,” and noted it relied on limited
data on a single criterion: whether the seeds improve farmers' yields.
The first installments in the series of industry reports detail a host of benefits, including reduced spraying of older
more harmful pesticides and savings in farmers’ time and expense in checking fields for pests, as well as increased yields.
And in the Oct. 22 statement, AgInfomatics says neonicotinoid-treated seeds improve soybean yields by 3 percent
compared to untreated seeds.
In one of several reports released Nov. 5, the contractor cites survey results from interviews with hundreds of farmers
in the United States and Canada who say treated seeds increase flexibility and convenience for farmers while reducing
human health and environmental risks.
“Neonicotinoid seed treatments were the most highly valued insect management practice in North American corn,
soybeans and canola with a total farmer value of $1.4 billion,” Growing Matters says in a statement announcing the
contractor’s study.
A second report, which considers how a potential ban of neonicotinoids would affect farmers’ pest management
practices, says changes that would result, such as more frequent applications, as well as the training and equipment
required for spraying alternative pesticides, would increase total costs for North American farmers by an estimated
$848 million per year.
The industry study also acknowledges that neonicotinoid-treated seeds are often planted prophylactically,
before determining pests are present, and says the seeds are not needed in nearly one quarter of fields where they are
planted.
The report, “The Value of Neonicotinoids in North American Agriculture: Estimated Impact of Neonicotinoid
Insecticides on Pest Management Practices and Costs for U.S. Corn, Soybean, Wheat, Cotton and Sorghum Farmers,”
projects that 77 percent of fields where neonicotinoids are planted would have to be treated with older more harmful
pesticides.
The remaining 23 percent of fields now treated with neonicotinoids would be farmed with non-chemical options,
including higher density seeding to offset losses, according to a Nov. 5 statement from Growing Matters, announcing the
study. The study also notes that in 10 percent of the untreated acres, farmers working without neonicotinoid-treated seeds
would incur costs checking for the presence of the original target pest, and emphasizes that more harmful pesticides
would be sprayed on the majority of fields that are now planted with treated seeds.
“Despite the lower acreage treated, the total volume of insecticides used in these crops would actually increase —
driven primarily by growers needing to rely on older chemicals, which require more frequent applications,” according to
the Growing Matters statement. “Across the major commodity crops evaluated, the study found that each pound of
neonicotinoid lost would be replaced by nearly five pounds of these older chemicals.”
Though EPA’s Oct. 15 analysis focuses exclusively on treated soybean seeds, the efficacy finding is similar to those
in CFS’ March 24 report, “Heavy Costs: Weighing the Value of Neonicotinoid Insecticides in Agriculture,” which found
that nearly all corn seeds and roughly half of soybeans planted in the United States are treated with neonicotinoids, a
massive increase since the late 1990s, when EPA first began approving neonicotinoid products.
The advocates’ report also argued that seed treatments fail to improve crop yields, threaten bees, and called for a
suspension of all uses of neonicotinoids as seed treatments in order to protect pollinators. — Dave Reynolds
SUBSCRIPTIONS:
703-416-8500 or
800-424-9068
[email protected]
NEWS OFFICE:
703-416-8541
Fax: 703-416-8543
[email protected]
Publisher: Jeremy Bernstein ([email protected])
Managing Editor: Maria Hegstad ([email protected])
Associate Editor: Dave Reynolds ([email protected])
Production Manager: Lori Nicholson ([email protected])
Production Specialists: Daniel Arrieta, Michelle Moodhe
Risk Policy Report is a service of InsideEPA.com and is published every Tuesday by Inside Washington
Publishers, P.O. Box 7167, Ben Franklin Station, Washington, DC 20044. Subscription rates: $500 per year
in U.S. and Canada; $550 per year elsewhere (air mail). © Inside Washington Publishers, 2014. All rights
reserved. Contents of Risk Policy Report are protected by U.S. copyright laws. No part of this publication may be
reproduced, transmitted, transcribed, stored in a retrieval system, or translated into any language in any form
or by any means, electronic or mechanical, without written permission of Inside Washington Publishers.
Advisors Supportive Of 'Risk-Weighted' VOC Rule . . . begins on page one
drinking water as a group rather than individually.
But some NDWAC members expressed concerns that although the benefits appeared to outweigh the costs to water
systems, the approach would cause confusion among water system managers and lead to lower levels of risk overall,
echoing prior concerns from water utility sources that there would be some potential technical and policy hurdles in
implementing the group or risk approach.
However, even skeptical NDWAC members acknowledged that regulating cVOCs as a group was the way forward. "I
do think it’s a good approach, and does make a lot of sense and brings in unregulated contaminants, as opposed to if we
were to wait for five different MCLs on all of these,” said one member.
While EPA has set drinking water standards, or maximum contaminant levels (MCLs), for some VOCs, others
included in the group of 16 are currently unregulated.
“We feel that looking at a group of contaminants allows the water system to make the best long-term decisions on
capital investment,” said Lisa Christ, chief of EPA’s Office of Ground Water and Drinking Water’s Risk Management
Division, at the Nov. 6 meeting. “So rather than consider contaminant A and regulation A and ask, ‘What am I going to do
about that?’ and some years later, contaminant B, and what will I do . . . this is an opportunity to make a long-term
investment.”
The Safe Drinking Water Act (SDWA) requires EPA to regulate contaminants that may be health risks and that
may be present in drinking water supplies. Under the law, the agency first sets a non-enforceable public health goal
known as a maximum contaminant level goal (MCLG), which is the maximum amount of a contaminant in drinking
water at which there is no known or anticipated adverse effect on public health. The enforceable MCL is set as close to
the MCLG as possible but can take treatment costs into consideration and sometimes result in a less stringent standard
than the MCLG.
The EPA workgroup has been considering two approaches for developing the group MCL, focusing on tests done on
the contaminants 1,2,3-trichloropropane, vinyl chloride and trichloroethylene (TCE). EPA has existing MCLs for vinyl
chloride and TCE but not 1,2,3-trichloropropane.
The first approach, known as the “feasible level” approach, aims to set the MCL “as close as feasible” to the MCLG
by adding the minimum reporting level (MRL) in micrograms per liter for each member of the cVOC group to be
studied so that the group MCL is the total of all MRLs. A lab would therefore measure the concentration of each individual cVOC studied and then add them, Christ explained. MRLs refer to the lowest level analytical
methods can accurately measure.
The second approach, which the workgroup calls the “risk-weighted feasible level addition” approach, would
multiply the MRLs for each cVOC by its unit risk factor and total the values.
“This results in an overall risk level for the group that cannot be exceeded,” the workgroup explains in slides
presented at the meeting. “The unit risk is then divided by the total risk to derive the risk ‘weight.’” Relevant documents
are available on InsideEPA.com. See page 2 for details.
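As a rough arithmetic sketch of the two approaches, the snippet below computes a group limit both ways; the MRL and unit risk values are placeholders invented for illustration rather than EPA's figures, and the closing compliance comparison is an assumption about how the weighted sum might be applied, since the workgroup slides do not spell that step out.

```python
# Rough sketch of the two group-MCL approaches described by the workgroup.
# The MRL and unit-risk values below are placeholders for illustration only,
# not EPA's actual numbers.

mrls = {          # minimum reporting levels, ug/L (hypothetical)
    "1,2,3-trichloropropane": 0.03,
    "vinyl chloride": 0.02,
    "TCE": 0.50,
}
unit_risks = {    # unit risk factors, risk per ug/L (hypothetical)
    "1,2,3-trichloropropane": 8e-4,
    "vinyl chloride": 4e-5,
    "TCE": 1e-6,
}

# Approach 1 ("feasible level"): the group MCL is simply the sum of the MRLs.
feasible_group_mcl = sum(mrls.values())

# Approach 2 ("risk-weighted feasible level addition"): multiply each MRL by its
# unit risk and total the values, giving an overall risk level for the group.
group_risk_level = sum(mrls[c] * unit_risks[c] for c in mrls)

# Per the workgroup slides, each contaminant's "weight" is its unit risk divided
# by the total risk for the group.
weights = {c: unit_risks[c] / group_risk_level for c in mrls}

# Assumed compliance check (not spelled out in the article): a system's measured
# concentrations, combined the same risk-weighted way, stay under the group level.
measured = {"1,2,3-trichloropropane": 0.01, "vinyl chloride": 0.01, "TCE": 0.30}  # ug/L (hypothetical)
system_risk = sum(measured[c] * unit_risks[c] for c in measured)

print(f"feasible-level group MCL: {feasible_group_mcl:.2f} ug/L")
print(f"group risk level: {group_risk_level:.2e}  system risk: {system_risk:.2e}  "
      f"compliant: {system_risk <= group_risk_level}")
print("risk weights:", {c: round(w, 3) for c, w in weights.items()})
```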
The risk-weighted approach would mean $450,000 in annual costs to a water system serving around 21,000 people,
and $526,000 in annual benefits, according to workgroup estimates. The feasible level addition approach would mean
$268,000 in annual costs and $11,000 in annual benefits.
“It’s not quite as straightforward as just simple addition,” Christ said of the risk-weighted approach. “It’s gonna
require a little bit more effort. However, EPA would be the ones providing the risk weights. We wouldn’t expect a water
system or a state to work through all of the math, so we can provide a straightforward equation and perhaps even spreadsheet tools.”
Among the advantages of the approach, the workgroup said in its presentation slides, is that it would “account for
risk variation across a group of contaminants with unit risks that vary by several orders of magnitude," and that systems exceeding the group MCL would install treatment to reduce the contaminants in the group with the highest
health risks. However, Christ acknowledged that although the workgroup believes the benefits of this approach outweigh
the costs, new cVOCs added to the group in the future could change the group MCL, as could changes in cancer slope
factors.
The feasible level approach, on the other hand, is more straightforward and familiar to water systems, but does not
take into account health risk variations between different cVOCs, and may require systems to install treatment for less
risky members of the group, resulting in minimal health benefits, according to the workgroup.
And Eric Burneson, of EPA’s Office of Ground Water and Drinking Water, said the risk-weighted approach “makes a
lot of sense.”
“This is an innovative way to address unregulated contaminants,” Burneson said at the Nov. 6 meeting. “It brings in
more unregulated contaminants than if we were to wait for five different [individual] MCLs. We will continue to try to
make this approach work.”
However, Burneson said, “that doesn’t mean we can suspend parts of the [Safe Drinking Water Act] that make sure
we have to make sure the benefits justify the costs; it still has to pass through a rigorous proposal and promulgation process."
Drinking water utilities are currently gathering data on the occurrence of seven VOCs, and that information will
inform whether the agency ultimately decides upon the risk-weighted approach for setting the group MCL. The VOCs
monitored under the unregulated contaminant monitoring rule are 1,2,3-trichloropropane, 1,3-butadiene, chloromethane
(methyl chloride), 1,1-dichloroethane, bromomethane (methyl bromide), chlorodifluoromethane (HCFC-22) and
bromochloromethane (halon 1011). — Amanda Palleschi
Industries Move To Deter California From Targeting Workplace Chemicals
A major California industry organization has launched an effort to discourage the toxics department from considering
chemicals used in the workplace for possible regulation under the state’s green chemistry program, arguing that such
compounds are already adequately regulated by agencies that oversee workplace safety.
The industry group — the California Manufacturers & Technology Association (CMTA) — fears that the department
will “sweep” chemicals used in commercial and industrial settings into the broader green chemistry program, also known
as the safer consumer products regulation, resulting in potentially steep additional compliance costs and new
restrictions or bans on chemicals.
CMTA is calling on its member companies to help the organization develop examples of actual products that fall
within categories of chemicals being targeted by the Department of Toxic Substances Control (DTSC), to “show how
their use in industrial settings is subject to existing regulations that effectively preclude exposures which might otherwise
justify DTSC intervention,” according to an Oct. 23 blog on CMTA’s website.
The examples would be shared with DTSC officials at a forthcoming meeting.
CMTA’s effort follows an Oct. 20 meeting of DTSC’s Green Ribbon Science Panel, at which members discussed the
department’s draft “2015-2017 Priority Product Work Plan.” The plan identifies priority products from seven categories
of goods that the department may target for regulation over the next three years. The categories are: beauty/personal care/
hygiene; clothing; building products; household/office furniture/furnishings; cleaning products; office machinery consumable products; and fishing and angling equipment.
DTSC in March announced the first three “priority products” and chemicals that it will initially target under the
program. The chemicals and products are: unreacted diisocyanates in spray polyurethane foam systems used in building
insulation; the flame retardant tris(1,3-dichloro-2-propyl) phosphate, or "TDCPP," found in children's foam-padded
sleeping products; and methylene chloride in paint and varnish strippers and surface cleaners.
Some academic members of the science advisory panel recommended during the Oct. 20 meeting that DTSC include
chemicals handled in the workplace under the regulations, in addition to consumer products sold at retail.
For example, Julie Schoenung, with the University of California-Davis, recommended that DTSC give significant weight to chemical exposures not only for workers who handle finished consumer products but also for those who help manufacture the
chemicals that ultimately wind up in products. The current work plan is unclear on this topic and DTSC should revise the
language to provide clarity and certainty, she said.
Megan Schwarzman, with the University of California-Berkeley, added that this is especially important for workers
who handle chemicals used in textile finishes, such as those in wrinkle-resistant clothes that contain significant levels of
formaldehyde. These workers are handling the chemicals before they are “treated” and “cured” for application to the
retail products and therefore are subject to "potentially very high exposures," she said.
DTSC officials appeared receptive to the recommendations, prompting concerns from CMTA. “Given sustained pressure on DTSC from certain activist groups to regulate chemical use in the workplace, it is unlikely DTSC will
act on its own initiative to narrow the scope of current and future Priority Product listings,” the CMTA blog states.
“Absent information and advocacy to the contrary, any use restrictions intended for consumer products sold at retail are
likely to be swept into commercial and industrial settings.”
CMTA is hoping to collect the real-world examples of how onerous and unnecessary such regulation would be in
advance of DTSC’s issuance of a final work plan and proposed rulemaking, in order to present them to DTSC Interim
Director Miriam Ingenito and department staff, the blog says. Given uncertainty about DTSC’s time frame for advancing
the rules and work plan, “we would need to initiate this effort in early November and schedule a meeting with DTSC no
later than early December,” the blog adds.
DTSC officials have said that they plan to initiate a rulemaking to adopt a priority products list in early 2015, which
will take about a year to finalize. The rulemaking will include priority product “listing” language, supporting documents,
an economic and fiscal impact statement, an external scientific peer review and a review by the state’s Environmental
Policy Council.
The department’s safer consumer product alternatives regulation went into effect last October. The program is viewed
by many experts as a possible national model for chemical policy reform and for addressing potentially problematic
chemicals in consumer products.
Under the regulations, DTSC will require manufacturers to study whether replacing chemicals of concern with
alternatives is feasible. This is known as the alternatives assessment process, considered a key part of the program. DTSC
also has the authority to ban certain chemicals found in products if it deems that action necessary.
The department plans to release a draft guide on conducting alternatives assessments in early 2015.
EPA Weighs Data Questions In SO2 NAAQS Plan . . . begins on page one
existing primary standards because they would not provide additional public health protection given a one-hour
standard at 75 ppb. SO2 is one of a group of gases known as sulfur oxides (SOx). The plan is available on InsideEPA.com. See page 2
for details.
The IRP outlines how EPA intends to conduct the review, which will include scientific assessments on the risks
of SO2 emissions to public health, and eventually a policy assessment (PA) listing options for revising the NAAQS.
EPA will also seek input from its Clean Air Scientific Advisory Committee (CASAC) on whether to update the
standard.
Based on input received from CASAC and the public earlier this year, EPA’s IRP presents the current plan and
specifies the schedule for conducting the review and the major policy-relevant science issues that will guide the review
and EPA’s eventual rulemaking.
EPA will follow the IRP with a science assessment in February 2015 and a risk and exposure assessment in July next
year. The agency plans to issue a first draft PA in April 2016, with a notice of proposed rulemaking on the NAAQS in
October 2018 followed by a final rulemaking it will issue in July 2019, the IRP says.
The new plan builds on the “substantial” body of work done during the course of the last review and takes into
account more recent scientific information and air quality data now available to inform the agency, using information
developed since the last broad scientific review in 2008 that informed the prior NAAQS update.
Some of the major policy-relevant issues in the new IRP that were also considered in previous reviews include to what extent new information has altered scientific support for the occurrence of health effects as a result of short- and long-term exposure to SOx in the ambient air, and what the air quality relationships are between short- and long-term exposures to SO2. Those are important factors for EPA in deciding whether the existing NAAQS is adequate to protect public
health as required under the Clean Air Act, or should be tightened or weakened.
The new IRP expands on the question of relationships between short- and long-term exposures by asking to what
extent five-minute monitoring data collected since the last review can be used to further characterize the relationship
between five-minute peaks and longer-term average concentrations of one hour, three hours or 24 hours.
During the last review, "considerable" weight was placed on substantially limiting health effects associated with five-minute peak SO2 concentrations, the IRP says. As a result, as part of the final 2010 rulemaking, EPA required for the first time state reporting of either the highest five-minute concentration for each hour of the day, or all 12 five-minute concentrations for each hour of the day, so that the additional data could be used in future reviews to evaluate the extent to which the one-hour SO2 NAAQS at 75 ppb protects against five-minute peaks of concern.
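As one hedged illustration of how that relationship could be examined with the newly reported data, the snippet below compares each hour's maximum five-minute SO2 value with its one-hour average; the file name, column names and pandas-based approach are assumptions for illustration, not part of EPA's plan, and the 200 ppb benchmark is a placeholder rather than an agency value.

```python
# Illustrative look at the 5-minute vs. 1-hour relationship the IRP asks about.
# File and column names are assumptions; the 2010 rule only requires states to
# report either each hour's maximum 5-minute value or all 12 values per hour.
import pandas as pd

# Assumed layout: one row per 5-minute SO2 reading, in ppb, with a timestamp column.
df = pd.read_csv("so2_5min.csv", parse_dates=["timestamp"])

hourly = df.set_index("timestamp")["so2_ppb"].resample("1h").agg(["mean", "max"])
hourly["peak_to_mean"] = hourly["max"] / hourly["mean"]

# How often does a 5-minute peak of potential concern (200 ppb is a placeholder
# benchmark, not an EPA value) occur in hours whose average still meets 75 ppb?
flagged = hourly[(hourly["max"] >= 200) & (hourly["mean"] <= 75)]
print(f"hours with a >=200 ppb five-minute peak but a <=75 ppb hourly average: {len(flagged)}")
```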
A question on whether new information changes conclusions from the previous review regarding the effects of SOx
on susceptible populations is also expanded in the new IRP. Under the new plan, EPA will consider whether new data
alters the agency’s understanding of human lifestages, as well as populations, that are particularly at increased risk for
experiencing health effects associated with exposure to SOx. The IRP asks whether there is new information to “shed
light" on the nature of the exposure-response relationship in different at-risk lifestages and/or populations, and whether there is
new or emerging evidence on health effects beyond respiratory effects in asthmatics, children and the elderly that suggests
additional at-risk populations and lifestages should be given increased focus in this review.
The new IRP also says EPA will assess what evidence is available from recent studies focused on specific
chemical components within the broader group of SOx to inform EPA’s understanding of the nature of exposures that are
linked to various health outcomes. EPA will also review to what extent health effects are associated with SOx exposures,
as opposed to one or more co-occurring pollutants, or "co-pollutants," meaning other air pollutants such as ozone.
Questions surrounding co-pollutants and whether they can create uncertainties in, or confound, conclusions on the risks
of SO2 exposure arose during a June EPA workshop held to inform the SO2 NAAQS review.
During that workshop, agency officials asked academics, scientists and other participants to provide feedback on the
questions that will shape the SO2 NAAQS review. Some of the questions EPA asked during the workshop sought input on
SOx atmospheric chemistry, ambient concentrations and exposure and to what extent the preliminary materials appropriately characterize personal exposure to ambient SO2.
If the evidence suggests it might be appropriate to revise the current standard, EPA will evaluate how it might revise
the standard, focusing on how the scientific information and assessments inform decisions regarding the basic elements
— indicator, averaging time, form and level — of the primary SO2 NAAQS.
According to the plan, areas of uncertainty that remain in EPA’s understanding of several policy-relevant issues from
the 2010 review include statistical relationships between five-minute concentrations and longer averaging times, including the extent to which these longer averaging times can limit five-minute concentrations of concern identified from
controlled human exposure studies, and understanding the range of ambient concentrations in which we have confidence
that the health effects observed in epidemiological studies are attributable to SO2. — Lea Radick
Observers Downplay Data Losses From EPA Reducing Air Monitor Network
State officials and other observers are downplaying the potential adverse impacts on emissions data collection
following EPA’s decision to scrap funding for 44 air monitoring sites in its chemical speciation network (CSN) that assess
the components of air pollution, saying the remaining network will provide adequate coverage.
EPA is paring back the CSN due to funding cuts and other resource constraints, and observers say the monitoring site
closures make sense given those limits. “Nobody likes to lose a monitor, but these are tough times and something’s got to
give,” said George Allen of the Northeast States for Coordinated Air Use Management (NESCAUM) in a Sept. 19
interview with Inside EPA. “[T]his is probably [the] most reasonable” of all options, he said.
Similarly, another state source said EPA’s hands were “tied to some degree” with the monitor site closures and the
agency “had to make budget cuts somewhere.” The source, whose state is not losing any sites from the defunding, says
the approach EPA is taking “doesn’t completely remove data from geographical areas.”
A source with the Health Effects Institute, a research group that receives funding from EPA and the auto industry, says they would be "surprised if this would dramatically change [the] nature of information available." The source, however, adds
that “hopefully this does not mark [the] beginning of later disinvestment.”
EPA’s Beth Landis detailed the CSN changes in a presentation to a recent ambient air monitoring conference, noting
that the site closures were scaled back from an initial plan of 53 down to the 44 closures. Other changes include reduced
monitoring frequency at three sites, and ending the measurement of fine particulate matter (PM2.5) mass effective last
month. Relevant documents are available on InsideEPA.com. See page 2 for details.
EPA established the CSN in 1997 following the agency’s decision that year to establish the PM2.5 national ambient
air quality standard (NAAQS) at 65 micrograms per cubic meter (ug/m3) over 24 hours and 15 ug/m3 on an annual basis.
Monitoring began in 2000 at 13 pilot sites and has expanded over the years to a total of 198 sites that collect aerosol
samples over 24 hours.
The CSN is designed to help regulators determine compliance with EPA’s PM2.5 air standard, and to break down, or
speciate, the individual components of PM, which comes from many different sources of air pollution. Speciation has long
been a goal of air officials because it could help identify which industrial sources contribute the greatest amount of
pollution toward PM, and to target controls or regulations on those sources.
But the large scale of the CSN required major resources from states and EPA, and the agency began an assessment in April 2013 to “create a CSN network that is financially sustainable going forward,” to “redistribute resources to
new or high priorities from those of low-priority or low-benefit,” to “extract more value from the existing network” and to
“fully leverage the value of other existing networks,” according to Landis’ presentation.
The assessment sought to cut 30 percent from the current network cost of approximately $6.7 million, bringing base network spending down to roughly $4.7 million, and to reinvest 10 percent, for a net reduction in total spending of about 20 percent. The 44 sites selected for defunding are located in 19 states.
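One way to reconcile those figures, assuming the 10 percent reinvestment is measured against the original network cost rather than against the savings (the presentation summarized here does not spell that out), is the rough Python sketch below; the variable names are illustrative and the numbers are only the ones cited above:

    network_cost = 6.7e6                   # approximate current CSN cost cited above
    savings = 0.30 * network_cost          # the 30 percent cut, roughly $2.0 million
    base_network = network_cost - savings  # roughly $4.7 million left for the base network
    reinvested = 0.10 * network_cost       # assumed: 10 percent measured against the original cost
    net_cut = (savings - reinvested) / network_cost
    print(round(base_network), round(net_cut, 2))  # -> 4690000 0.2, i.e. about a 20 percent net reduction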
Pennsylvania is slated to lose six CSN sites; North Carolina five; Ohio four; Kentucky, Michigan and Tennessee
three; Alabama, Georgia, Iowa, Missouri, New Jersey, South Carolina and Wisconsin two; and Delaware, Florida,
Indiana, Minnesota, Washington and West Virginia one each.
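As a quick consistency check, the state-by-state breakdown does sum to the 44 sites and 19 states cited above; the short sketch below simply tallies the counts reported here, using postal abbreviations for brevity:

    site_losses = {"PA": 6, "NC": 5, "OH": 4,
                   "KY": 3, "MI": 3, "TN": 3,
                   "AL": 2, "GA": 2, "IA": 2, "MO": 2, "NJ": 2, "SC": 2, "WI": 2,
                   "DE": 1, "FL": 1, "IN": 1, "MN": 1, "WA": 1, "WV": 1}
    print(len(site_losses), sum(site_losses.values()))  # -> 19 44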
As of January next year, the sites recommended for defunding will no longer receive laboratory analysis funding, but
their speciation monitors may continue to operate if other funding sources are provided.
When EPA indicated in May during a webinar that it was planning to end funding for some low-value emissions
monitoring sites, some observers weighed in with concerns.
One participant said EPA was only targeting sites with high concentrations of PM chemical components and that low-concentration sites should be included, too. Another participant asked whether the Supreme Court’s April ruling that
reinstated EPA’s Cross-State Air Pollution Rule emissions trading program would have an impact on the network. At the
time, EPA’s Landis said she had not heard yet whether the ruling would affect the network.
“No one likes to see a site close, but analytics costs have skyrocketed,” said NESCAUM’s Allen in the interview.
NESCAUM is a nonprofit association composed of air quality agencies in the Northeastern United States.
The biggest change to result from the CSN assessment, Allen said, is EPA’s defunding of 44 sites, which he says were
identified because they are “relatively redundant with other measurements being made” from other existing networks,
such as the Interagency Monitoring of Protected Visual Environments (IMPROVE) emissions network.
IMPROVE was launched in 1985 to establish current visibility conditions in national parks and wilderness areas, track changes in visibility and determine why visibility is impaired in those areas, in support of the regional haze program.
EPA is weighing cost-cutting measures for IMPROVE similar to the cuts to the CSN.
For the CSN monitoring reductions, “It’s more of a thinning,” the state source added, noting that between the CSN
and IMPROVE network, EPA “still has pretty good geographic coverage.”
While the decision to defund the sites will result in having less information to work with, the source notes that some
of the money saved “could potentially be reinvested in equipment and methods.”
Reinvestment options the agency is considering include investigating new analytical techniques, adding new sites in areas with emerging air quality issues and adding new measurement parameters.
EPA will continue to receive feedback regarding reinvestment options until Dec. 31, and it plans to determine the use
of available funds for reinvestment by Jan. 31, Landis said in her presentation to the August air monitoring conference in
Atlanta, GA. — Lea Radick
ACC Sees GOP Wins Clearing Path For TSCA Reform . . . begins on page one
Boxer impeded progress on an existing bipartisan TSCA reform bill introduced by current EPW ranking member Sen.
David Vitter (R-LA) and the late Sen. Frank Lautenberg (D-NJ). Following Lautenberg’s death, EPW member Sen. Tom
Udall (D-NM) worked with Vitter on his bill, S. 1009.
Dooley noted that Sen. James Inhofe (R-OK) is set to resume his role as EPW chairman in the GOP-led Senate when
the 114th Congress convenes early next year — potentially boosting prospects for TSCA reform.
EPA Administrator Gina McCarthy has repeatedly said that she favors updating the 1976 law. Proponents of overhauling TSCA say it is necessary because the law does not have a workable system for how EPA should regulate existing
chemicals that might be of concern because of their human health impacts.
With Inhofe leading the environment panel, ACC is “confident” that “we could see the committee act relatively soon
and have bipartisan legislation within the next year that could pass a floor vote,” said Dooley.
Other industry sources, however, say it is crucial to begin building bipartisan support for any reform bill early in the new legislative session, before the 2016 election complicates the bill’s prospects.
One industry source says that given the Republican gains in Congress, one scenario is a strong push for S. 1009 in
the first nine months of the next session because “later runs into Presidential election politics,” adding that assuming
Inhofe takes over as chairman of EPW, “using this proposal as a starting point makes sense.”
Boxer may also still retain a major role in TSCA reform negotiations if she becomes EPW’s ranking member next
year. Should she stay as the top Democrat on the committee, environmentalists are hopeful that the change in EPW
leadership from Vitter to Inhofe might boost bipartisan negotiations between the two senators.
“Inhofe is a pragmatic legislator with a long history of getting laws passed. I’m optimistic he can work with Boxer to
craft TSCA legislation to ensure chemicals are safe,” according to one environmentalist.
Similarly, one state regulator expresses optimism about the pending leadership changes, claiming Boxer and Inhofe
have a “collegiate relationship” which could open the door to more productive negotiations.
As EPW chair, Boxer said she would not allow movement on S. 1009 until a series of her concerns with the bill were addressed, including its provisions preempting state chemical laws, which she feared could disrupt existing California programs such as the state’s green chemistry rules, as well as protections for vulnerable populations and other issues.
Vitter, in turn, revised S. 1009 earlier this year, with revisions that included amended language addressing EPA’s authority to act on chemicals it finds do not meet the law’s “safety standard,” but he indicated at the time that the preemption provision remained a sticking point for Boxer and that he did not expect to win her support.
The gridlock escalated in September when Vitter criticized Boxer for what he said was a premature release of the revised bill. Boxer released not only Vitter’s revised version of S. 1009 but also her own changes to the updated measure, which included dropping the provisions that she said would preempt state programs and tightening the bill’s safety standard (Risk Policy Report, Sept. 23).
Boxer argued that Vitter’s revised proposal fell short of what is needed to reform the decades-old chemical safety law and that she was “strongly committed” to efforts to reform TSCA.
But Vitter said he would return to supporting an earlier version of the bill that lacks some of the compromises
outlined in the revisions, characterizing Boxer’s actions as a “press stunt/temper tantrum.”
A Vitter spokesman told Inside EPA at the time, “We’ve had very fruitful discussions but it hasn’t produced an
agreement before the election. Therefore, Sen. Vitter will go back to S.1009 as introduced and see where we are after the
election.” Those elections ensured Republican control of the Senate with at least 52 seats at press time, and the GOP also
made gains in the House, where Rep. John Shimkus (R-IL) has been leading TSCA reform efforts.
Shimkus, chair of the House Energy & Commerce Committee’s environment panel, floated a draft reform bill,
but Democrats circulated a red-lined version seen as widening the gap between the two parties (Risk Policy Report,
June 10).
A House source says that TSCA reform will remain a priority for the energy committee’s environment panel next
Congress, and that Shimkus “hopes to build off the momentum gained through hearings and work” on the draft bill he
floated in the 113th Congress, but that there is “no specific timeline for introduction at this point.”
ACC’s Dooley said during the Nov. 10 call that he is optimistic the GOP-led Congress will advance TSCA reform, saying “[w]e hope to wrap up passage of TSCA reform by 2015,” but added that if a bill is not passed by then, he is not “overly concerned” that the 2016 presidential election would be an impediment to passage.
However, the industry source says “the House is quite fluid and how the Shimkus version can be aligned with Vitter-Udall is unclear but presumably a conference committee could make it happen.”
Although ACC’s Dooley is hopeful that the Republican-led Congress will make it easier to move TSCA reform
legislation than in the divided 113th Congress, sources note that major hurdles remain.
The industry source suggests that Vitter might again be willing to make some concessions. “I think Vitter will return
to the table as he was already making conciliatory gestures having condemned Senator Boxer’s ‘outing’ of the Udall-Vitter proposal, but seemed willing to re-engage in the debate not long after the dust-up.”
But the issue of whether and how the legislation would preempt states’ efforts to address chemicals still needs to be resolved, a second industry source says. “We have a lot of work to do to get to critical mass” of support.
The initial draft of S. 1009 would have allowed EPA to preempt any state rules limiting or banning the use of
chemicals in certain applications once EPA has made a safety determination.
The legislation would also have barred states from imposing new restrictions on chemicals identified as “high-priority” by EPA once the agency publishes a schedule for assessing the safety of those substances.
It also contained language that some observers have said could preempt toxic tort claims in state courts for harms caused by chemicals that EPA has deemed to be safe — an outcome strongly opposed by the American Association for Justice, which represents trial lawyers, as well as by many states and most environmentalists.
The state source says that preemption remains a “huge hurdle” and that they would “hate to see” Vitter revert back to
the language in S. 1009, because discussions had progressed since the bill was introduced.
Though S. 1009 currently has 13 Democratic and 13 Republican cosponsors, the second industry source says more Democrats would likely be needed to head off a possible filibuster from Boxer if the GOP tries to move a TSCA reform bill, adding, “it’s not going to be an easy conference between the House and Senate.”
“The goal is a credible federal program,” the source says, adding that “if we don’t get more support from states and [non-governmental organizations (NGOs)]” for a TSCA reform bill, which many in industry view as key to gaining consumer confidence that chemicals are safe, then the efforts will fail. — Bridget DiCosmo
Industry Presses Reconsideration Of TMB IRIS Analyses . . . begins on page 14
presented public comments to the committee. Nancy Beck, senior director of regulatory science policy at ACC, urged the
committee to clearly communicate its recommendations to EPA and sought to remind panelists of the importance of
presenting the agency with a consensus report. “There appear to be many places in the report where the panel is not in
consensus,” Beck said. “We appeal to the panel to try to find consensus.”
Vincent Cogliano, the acting IRIS director, also sought clarification from the committee on its recommendations. He
pressed the committee to elucidate when panelists felt IRIS staff must make a change to the assessment and when they felt
staff might consider a change. Cogliano also noted that the first draft of the report contained a series of different recommendations to improve the physiologically based pharmacokinetic (PBPK) model used to calculate the risk estimates. He
also sought help from the panel in determining which change panelists felt was best and whether another peer review of the
model would be needed after the change — a decision that would add to the cost and time of the assessment.
Hayes, a modeling expert, replied that the changes were different ways of addressing the issues with the model. But
he suggested a simpler fix, noting that in this case, a similar result could be obtained by using a less complex benchmark
dose model rather than a PBPK model. The suggestion would obviate the need for tinkering with the PBPK model or
seeking further peer review, Hayes said. — Maria Hegstad
Industry Urges EPA Advisors To Press For Reconsideration Of TMB Analyses
The chemical industry is touting a recent decision from EPA’s pesticides office as it urges the Science Advisory Board (SAB) to recommend that the agency’s research office reconsider its ongoing toxicological assessment of the human health risks of three trimethylbenzenes (TMBs), a recommendation on which the SAB panel is struggling to reach consensus.
The issue stems from EPA’s Integrated Risk Information System (IRIS) draft assessment of three TMB isomers,
1,2,3-TMB, 1,2,4-TMB and 1,3,5-TMB, the first assessment to be peer reviewed by SAB’s new Chemical Assessment
Advisory Committee (CAAC). The panel reviewing the draft assessment has struggled since its first meeting to reach
consensus on how best to assess the chemicals’ risks. EPA in its draft used data on one of the isomers to extrapolate and
assess the risks of another isomer. But IRIS staff also deemed it inappropriate to use studies of a larger group of chemicals containing TMBs, the C9 aromatic hydrocarbons, in the assessments (Risk Policy Report, June 24).
During Nov. 5 and 7 conference calls to discuss a first draft of SAB’s review report, committee members again
struggled to determine whether and how to use the C9 studies, and whether EPA should extrapolate data from one TMB
isomer to assess the risks of another. Industry representatives urged the panel to reach consensus on their recommendations to the agency and also again pressed the group to urge EPA to include the C9 studies in the assessment.
David Adenuga, a toxicologist with ExxonMobil Biomedical Sciences, speaking for the chemical industry association American Chemistry Council (ACC), pointed to a recent decision from EPA’s pesticides office that he argued further supports industry’s request that the IRIS program include the C9 studies in the TMBs assessment. Adenuga provided SAB with a Sept. 26 Federal Register notice announcing the Office of Pesticide Programs’ (OPP) final rule exempting complex C9 aromatic hydrocarbon inert ingredients from the general requirement that the agency establish a tolerance for a safe level of residue that can remain on food products post-application.
“The EPA/OPP determines that there is a reasonable certainty that no harm will result from aggregate exposure to
pesticide chemical residue that may result from all anticipated dietary exposures and all other exposures for which there
is reliable information, including through drinking water and residential exposure (for example exposure via inhalation
during refueling and transport of gasoline — for which the EPA/IRIS office has determined as the largest exposure point
source for trimethylbenzenes),” Adenuga told the panel. “The conclusions reached in this final rule are very pertinent as it
addresses some of the issues that are the basis for some of the conclusions reached in the IRIS assessment.”
Adenuga added that among the key points for SAB’s and IRIS’ consideration is that the OPP rule determined that
“[t]he C9 aromatic studies could be used to address hazard assessment for trimethylbenzenes, as the individual constituents were structurally and toxicologically similar.” Relevant documents are available on InsideEPA.com. See page 2 for
details.
Adenuga noted that OPP’s risk estimates for the C9 aromatic group are significantly less strict than those in the draft IRIS assessment, with OPP setting a reference dose (RfD) of 1.5 milligrams per kilogram of bodyweight per day (mg/kg/day), “a value that is at least 2 orders of magnitude higher than the values proposed in the IRIS assessment of TMB.” A reference dose is the maximum amount EPA estimates an adult can ingest daily over a lifetime without experiencing related adverse health effects. OPP also calculated a reference concentration (RfC), an analogous value for inhalation, of 5 milligrams per cubic meter of air (mg/m3).
By contrast, the draft IRIS assessment includes an RfC of 0.05 mg/m3 for each of the three isomers.
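Adenuga’s “two orders of magnitude” characterization can be checked against the inhalation values with simple arithmetic; the sketch below is only an illustration using the figures reported above, not part of either assessment:

    opp_rfc = 5.0      # OPP reference concentration for the C9 aromatic group, in mg/m3
    iris_rfc = 0.05    # draft IRIS reference concentration for each TMB isomer, in mg/m3
    print(round(opp_rfc / iris_rfc))   # -> 100, a factor of 100, i.e. two orders of magnitude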
“To conclude, we believe that there is a need to return to the drawing board to understand why there is such a huge
disparity in the EPA IRIS and EPA/OPP assessments of practically the same substances,” Adenuga said.
The SAB panel, however, during a three-day meeting June 17-19 seemed largely to deem the C9 studies too far afield from the TMBs to provide information useful for calculating risk estimates. But in the first draft of its report the panel does recommend that EPA reconsider its decision to calculate an RfD and RfC for 1,3,5-TMB based on toxicological studies of 1,2,4-TMB, though it remains unclear whether that recommendation will remain in the final report.
SAB CAAC members disagreed over the recommendation again during the Nov. 5 conference call. “I thought we
didn’t have consensus on deriving [1,3,5-TMB’s RfC] from 1,2,3-TMB and 1,2,4-TMB,” said Bob Howd, a toxicology
consultant with Tox Services.
Helen Goeden, a senior toxicologist with Minnesota’s Department of Health, replied that some CAAC panelists felt
that 1,3,5-TMB was “sufficiently different” that its RfC should be calculated with a 1,3,5-TMB-specific study. “But I thought
the majority felt [the extrapolation] was ok,” she said, as long as EPA provides a better explanation of why it felt it
appropriate to do so.
Panelists raised similar comments regarding whether EPA should extrapolate 1,3,5-TMB’s RfD from 1,2,4-TMB or
calculate the RfD based on a 1,3,5-TMB-specific study.
“It seems to me we’re walking into the path of a contradiction,” said panelist Sean Hayes, president of the consulting firm Summit Toxicology. “If we agree all congeners are equivalent, why not use the C9 studies?”
But Gary Ginsberg, a toxicologist with the Connecticut Department of Public Health, replied, “The C9 isomer studies are only about 50 percent TMBs. I don’t know how one usage [of extrapolating] buys into endorsing the other.”
The continuing struggle for consensus among the panelists raised concern for another ACC representative who also
continued on page 13