Understanding the President’s Cancer Panel Report

May 10, 2010

The May 6 report of the President’s Cancer Panel is significant for several reasons. First, the things it says: Environmental contaminants probably cause many more cancers than are commonly acknowledged. The mix of contaminants people are exposed to, especially children, makes the risks much higher than those from any one source. We need to warn people about the steps they can take to reduce their exposure, and to carefully study chemicals and technologies that can affect human well-being before they are released for general use. Specific contaminants discussed in the report include bisphenol A, formaldehyde, radon, radiation from cell phone usage, and a range of other carcinogens and endocrine disruptors. The report urges systematic monitoring of environmental contaminants, exposure levels in humans, and health outcomes. It endorses a precautionary approach to contaminants that can affect human health.
[By the way, there are things you can do to reduce your own and your children’s exposure: choose food, household chemicals, toys and play spaces that have few chemicals—read labels, look through http://householdproducts.nlm.nih.gov/, don’t use plastics in your microwave oven, spend less time at the mall (flame retardants on all of those fabrics), wash new clothes a couple of times before wearing them, use a filter for your tap water, limit time with a cell phone held next to your head or on a belt, limit CT scans and other uses of x-rays, and have your house checked for radon.]
In the United States we do not employ a precautionary approach. Quite the contrary, our laws specify that industry is free to deploy new substances and technologies unless there is a compelling reason to suspect people will be harmed. The big test for effects on health is conducted by exposing millions of people.
Consider the example of neurotoxins. The EPA maintains a list of known neurotoxins, two hundred and one in all. Only three have been thoroughly investigated for presence in the environment, exposure levels and health effects on people. All three—PCBs, lead, and mercury—have been found to harm people, particularly children, and have become targets of programs to reduce their release. We do not take strong action on the others, or drastic action on mercury, because of the high burden of proof required under our laws—and because programs to do so are opposed by big contributors to Congress.
The top scientists who study neurotoxins, endocrine disruptors and other widespread contaminants believe they are responsible for serious health problems, including asthma, autism, and behavioral disorders in children.
We do not routinely conduct studies of exposure to environmental contaminants, as the President’s Cancer Panel advocates. When such tests are conducted, we find people harbor dozens of toxic chemicals at levels that threaten their health.
The Report warns we will not beat cancer as long as we continue to expose people to thousands of carcinogens.

Here is what the Report does not say. It is presented to a public, and to a public policy community, that is debating how to think about, and design policies for, risks to human health. It is widely accepted that public policy should pay some heed to risk. Given scarce budget dollars, we are under an ethical obligation to spend money wisely. More should be spent on the risks we are more likely to encounter, and from which we are more likely to suffer dire consequences.
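The allocation rule described here can be made concrete with a small sketch. The hazard names, probabilities, severity scores, and budget below are invented for illustration; the point is only the rule of proportioning scarce dollars to expected harm (likelihood times severity).

```python
# Illustrative sketch (not from the Report): rank hypothetical hazards by
# expected harm = likelihood of exposure x severity of consequences,
# then split a fixed budget in proportion to expected harm.

hazards = {
    # name: (annual likelihood of harmful exposure, severity score 0-10)
    "hazard_a": (0.10, 9.0),   # rare but severe
    "hazard_b": (0.40, 2.0),   # common but mild
    "hazard_c": (0.02, 10.0),  # very rare, catastrophic
}

budget = 1_000_000  # dollars, hypothetical

expected_harm = {name: p * sev for name, (p, sev) in hazards.items()}
total = sum(expected_harm.values())
allocation = {name: budget * harm / total for name, harm in expected_harm.items()}

for name in sorted(allocation, key=allocation.get, reverse=True):
    print(f"{name}: expected harm {expected_harm[name]:.2f}, "
          f"budget share ${allocation[name]:,.0f}")
```

Under these made-up numbers the rare-but-severe hazard and the common-but-mild one end up with similar shares—which is exactly the kind of comparison risk-based budgeting is meant to force.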
That argument has won the day in the world of public policy. We need to understand how to do it right. Industry insiders and advocacy groups generally argue for risk-based decision making, but against the systematic testing and monitoring advocated in the Report. Why?
Consider an example from the world of food safety. Let us agree that our knowledge of risk should not come from the incidence of illness due to foodborne pathogens. The point is to prevent illness and death, without conducting field experiments that sicken and kill people.
An industrialized food system generates new risks all the time, through the invention of new processes and chemicals. Most risks from these innovations are going to be unknown, such as in the recent news story about salmonella in hydrolyzed vegetable protein (HVP—which is really MSG produced by means other than fermentation). HVP is sold as an ingredient for other products. Most of those subsequent products will be heated enough to kill salmonella—but perhaps not all of them. The food finally served to consumers may or may not be cooked just before eating. So the risk is, to be precise, unknown.
Opponents of testing have a point. If a company that buys HVP tests it and finds contamination, it is obligated to report the result to the FDA. The company that sent them the HVP must then track down all the other shipments that might also have been contaminated, and perhaps recall all of them, before enough information is available about the extent of contamination (note: the FDA lacks the power to compel a recall in this situation). The recall will have cost food processors and sellers a lot of money. And, in that situation, the risk to eaters is unknown. From a regulatory view, we are right back where we were, with the addition of a lot of bad feeling about government regulation. Is this an argument for not testing—unknown risk if you do test, unknown risk if you don’t?
The question points to an absurdity. Of course we are better off knowing that a particular shipment of HVP is contaminated. It sets into motion a search for the contamination, a fix, and dissemination of the knowledge acquired so that other food processors can improve, based on experience. The two states of unknown risk are not at all equivalent.
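The search set into motion by a positive test can be pictured as a walk over shipment records, finding every lot downstream of the contaminated one. The lot names and links below are hypothetical, not drawn from the actual HVP recall:

```python
from collections import deque

# Hypothetical shipment records: lot -> lots produced from it downstream.
# A positive test on one lot triggers a search for everything it touched.
shipments = {
    "hvp_lot_1": ["soup_mix_a", "bouillon_b"],
    "soup_mix_a": ["retail_case_1", "retail_case_2"],
    "bouillon_b": ["retail_case_3"],
}

def affected_lots(start, graph):
    """Breadth-first walk of downstream lots from a contaminated source."""
    seen, queue = set(), deque([start])
    while queue:
        lot = queue.popleft()
        if lot in seen:
            continue
        seen.add(lot)
        queue.extend(graph.get(lot, []))
    return seen - {start}

# A recall would have to cover every lot this search turns up.
print(sorted(affected_lots("hvp_lot_1", shipments)))
```

The expense the opponents point to is visible even in this toy: one positive test on the source lot implicates every downstream product, whether or not each one was actually contaminated.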
A system of testing, in food safety and in environmental contaminants, will reduce risks to health. Right now we don’t know by how much. Opponents of testing are right that testing introduces financial risk. But to not test destroys our ability to make decisions based on risk. It introduces a bias against public health, a bias in favor of making money at the expense of public health. We are a capitalist country, but to oppose testing is a dodge.
The European Union is now running its REACH program (Registration, Evaluation, Authorisation and Restriction of Chemicals), which should produce, over the next ten years, data on the likely health impacts of the 100,000 or so chemicals now used in industrial, food, and other applications. The Report and other studies insist that something like REACH is required here. There will be arguments in the US over whether we will use EU data on risks from chemicals. Right now we are completely unprepared to cope with this impending flood of data.
If we want a risk based decision system we need to know about risks. It is within our power to start a REACH-like testing of chemicals, and we need it to support risk based decision making. To advocate risk based decision making without the testing is a dodge.
That is the context of the President’s Cancer Panel report. We will not make significant progress against cancer until we adopt the precautionary approach to exposure. The political forces that oppose it will fight it. They will probably win.
The report is available online at http://deainfo.nci.nih.gov/advisory/pcp/pcp08-09rpt/PCP_Report_08-09_508.pdf. Its official title is 2008–2009 Annual Report, President’s Cancer Panel, and the title page reads Reducing Environmental Cancer Risk: What We Can Do Now.
P. Grandjean and P.J. Landrigan, “Developmental neurotoxicity of industrial chemicals,” The Lancet, published online November 8, 2006, DOI:10.1016/S0140-6736(06)69665-7.
In addition to the Report, see Government Accountability Office, Biomonitoring: EPA Needs to Coordinate Its Research Strategy and Clarify Its Authority to Obtain Biomonitoring Data, April 2009, GAO-09-353; and Government Accountability Office, Testimony of John B. Stephenson before the Subcommittee on Environment and Hazardous Materials, House Committee on Energy and Commerce, April 25, 2007, Perchlorate: EPA Does Not Systematically Track Incidents of Contamination, GAO-07-797T.
See Is It In Us? Chemical Contamination of Our Bodies, Commonweal Biomonitoring Resource Center and the Body Burden Work Group, 2008; and Bobbi Chase Wilding, Kathy Curtis, Kristen Welker-Hood, Hazardous Chemicals in Health Care: A Snapshot of Chemicals in Doctors and Nurses, Physicians for Social Responsibility, 2009.
A comparison of the US and EU approaches to controlling toxic substances is in Government Accountability Office, Chemical Regulation: Comparison of U.S. and Recently Enacted European Union Approaches to Protect against the Risks of Toxic Chemicals, August 2007, GAO-07-825.
See also, for example, Toxicity Testing in the 21st Century: A Vision and a Strategy, Committee on Toxicity Testing and Assessment of Environmental Agents, National Research Council, 2007; and Science and Decisions-Advancing Risk Assessment, Committee on Improving Risk Analysis Approaches Used by the U.S. EPA, National Research Council, 2009.
See FDA Science and Mission at Risk, Report of the Subcommittee on Science and Technology, Prepared for FDA Science Board, November 2007, pp. 45-51.

Time To Stop The Madness

May 6, 2010

The President’s Cancer Panel has issued its annual report. You can read it at http://deainfo.nci.nih.gov/advisory/pcp/pcp08-09rpt/PCP_Report_08-09_508.pdf. This year the Report examines the relation between cancer and environmental exposure. Many people have arrived at this conclusion before, but this is the top of the medical profession verifying it: They found “the true burden of environmentally induced cancer has been grossly underestimated.” (from letter to President, front material) And they say we must adopt a precautionary approach to the approval of new chemicals and other forms of environmental exposure.
As the report says, there is much we do not know about environmental exposure to chemicals and electromagnetic radiation—but we know enough to act. (p. 97) A new chemical needs to be studied and found to be safe before we approve it for uses that will end up in our bodies. We need to study the effects of chemicals and other environmental risks in current circulation. We need to begin systematic biomonitoring to understand the links between the contaminants we store in our bodies and the incidence of cancer. At current rates, 41% of us will get a cancer diagnosis at some point. We will not beat cancer as long as we continue to expose people to thousands of carcinogens.
They don’t use these words, but reading the report confirms the sentiment: time to stop the madness.

Europe is fun to visit

March 10, 2010

I was looking at numbers from the US Centers for Disease Control and the European Union’s Food Safety Authority. Using assumptions that are very generous to US public health outcomes, here is what I found: Compared to the EU, in the US your chance of dying from a foodborne illness (fbi) is 100 times greater, your chance of being hospitalized for fbi is about 600 times greater, and your chance of getting sick from fbi is about 1,250 times greater. The numbers are, with less skewed assumptions, probably more than twice as bad.
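As a sketch of how such a comparison works: raw case counts have to be converted to per-capita rates before the two jurisdictions can be compared. The case counts and populations below are invented placeholders, not the actual CDC or EFSA figures.

```python
# Hypothetical illustration of the comparison described above: convert raw
# case counts to per-person rates before comparing across jurisdictions.
# The counts below are made up; they are not the CDC or EFSA numbers.

us_cases, us_population = 76_000_000, 300_000_000
eu_cases, eu_population = 120_000, 500_000_000

us_rate = us_cases / us_population  # illnesses per person per year
eu_rate = eu_cases / eu_population

print(f"US rate is {us_rate / eu_rate:,.0f}x the EU rate")
```

The real work, of course, is in the assumptions behind the counts—how each agency defines and detects a case—which is why the post above flags the assumptions as the decisive step.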

Mmmm, tainted tomatoes….

February 25, 2010

This story illustrates a big problem. http://www.nytimes.com/2010/02/25/business/25tomatoes.html?ref=us
The incentives run against your interests, the system is set up to hide such misdeeds, and we seldom check. My dad actually told me about such practices back in about 1962, when the company he was working for went into producing tomatoes. In today’s complex food system, this is unacceptable. The basic problem has been around for a while, as noted here: http://www.plu.edu/~olufsdw/unpleasant.htm

Important heads up from Steven Grossman

February 22, 2010

Check his description of the importance of the upcoming appointment of the head of FDA’s Office of Regulatory Affairs, at http://www.fdamatters.com/?p=832

How Should We Think About This?

February 19, 2010

Here is a good opportunity to look at how you think. Read the editorial about the possibilities of pain-free cows. The argument is well-crafted. Think about it.

Pain-free cows editorial: http://www.nytimes.com/2010/02/19/opinion/19shriver.html

Chemicals Testing, Information, and Public Health

February 16, 2010

            Most people will find this topic uninteresting.  But it is a big deal, and will be one of the biggest changes in regulatory politics over the next decade. 

            In the US, regulatory authority over chemicals that may affect human well-being is spread over different agencies.  The FDA has responsibility for most food, but food producers use a vast array of substances that are regulated by other agencies.  USDA sets standards for most chemicals used in meats.  Standards for water, pesticides, fungicides, and industrial chemicals are set by the EPA.  Since the Toxic Substances Control Act was passed in 1976, which is when we started keeping track, over 80,000 new industrial chemicals were introduced into our environment.  We have thorough testing results on only a few of them. 

            This matters because these chemicals get into us, and affect our health.  When people take the trouble to get tested for a wide range of industrial chemicals they may be exposed to at work, at home, or through unknown exposure, they find their bodies contain a soup of contaminants.[1] 

            The chemicals that are allowed in our food are also problematic.  Consider the ambiguity between the definition of an industrial chemical and a component of food.  What qualifies as a natural flavor, for example?  Can both be “natural”?  Is a combination of different chemicals, however produced, not found in nature but tasting just like something in nature, a natural flavor?  Is it a natural flavor because it is labeled as such, and tastes like lemon, even though it has no lemons in it and is mass produced in China (as is most of what you think is lemon flavored)?[2] 

            Many chemicals not intended for the food chain find their way into it, via water, air, or surprising means of transmission. Things sprayed on plants may, from the perspective of one who eats those plants, actually be in them. In a now infamous case, the bits of beef carcasses that used to be waste processed into pet food are now treated with NH3—which perhaps sounds better than ‘ammonia’—and added to ground beef. The ammonia was defined by USDA as a treatment, not an additive.[3]

            Of the tens of thousands of new chemicals introduced into the food chain by these various means, almost none are actually tested for safety. Among the suspected neurotoxins, for example, only three out of more than two hundred have been adequately investigated, and in each case the suspicions were confirmed.[4] In other words, 100% of the suspected neurotoxins that have been adequately tested have confirmed the fears. Yet we do not aggressively investigate the remaining suspects.[5] Once we do start testing, the work to be done in setting standards for chemicals testing may qualify as one of the labors of Hercules. For example, there are no required ethical standards, or oversight, for testing pesticides on human subjects.[6]

            The testing of products for human consumption that is done is subject to error, of course, but sometimes the errors lean so far in one direction that one could easily conclude the entire system suffers from a bias against public health.[7]

            The EU empowered its European Chemicals Agency (ECHA) to administer its new REACH standards for chemicals (Registration, Evaluation, Authorisation and Restriction of Chemical substances).  The legislation passed in 2007 and the active implementation begins in 2010.  Chemical companies will be obligated to conduct extensive safety tests on new chemicals and for those currently in use. 

            While the US approach to chemicals is quite different, both it and the EU countries are affected by new developments in testing technologies that promise a dramatic increase in data and useful test results. For example, testing the effects of chemicals on humans has required extensive and time-consuming animal tests. Cost pressures contributed to the development of in vitro cell tests. Yet individual cells present so many targets for the many chemicals recently introduced into our environment that results are hard to sort out. It would help if scientists could identify a limited number of genes that signal whether a cell would be harmed by a wide range of chemicals, and if more detailed tests of blood components could quickly identify whether individuals were responding in particular ways to chemical exposures. The new types of tests could essentially be automated, which means tests of individual chemicals would become much cheaper. Such tests are being evaluated right now.[8]

            Between the new regulatory requirements and the new testing techniques, regulators expect an avalanche of data that they are presently unprepared to organize and use. A recent report from the FDA Science Board’s Subcommittee on Information Technology found that, with the possible exception of new drug review, the agency is not now prepared to deal with today’s volume of information, and that rather dramatic upgrades of capabilities are in order.[9] The tone of the report was that since this is a minimal requirement for the FDA to do its job, of course lawmakers will provide the needed resources. This report came two years after a major FDA Science Board report identified information technology as a critical inadequacy and warned that without an urgent fix the agency would not be able to do its job.[10]

            Both the European REACH and US approaches rely heavily on private companies to finance and report the tests of their chemicals.  There are, of course, perverse incentives at work here.  Companies that stand to make a lot of money upon getting a ‘safe for consumption’ label on their product will be doing the research.  To address this, rules for testing, and testing protocols, need to be clearly specified.  Plus, the data need to be “harmonized,” meaning that everyone needs to use similar reporting codes so that we will be able to make fairly consistent decisions across chemicals. 
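A toy example of what “harmonized” reporting codes buy you: two labs describing the same finding in different local vocabularies can only be compared after translation into a shared schema. Every field name and code below is invented for illustration (the CAS number shown happens to be formaldehyde’s, one of the contaminants discussed in these posts); none of it is an actual REACH schema.

```python
# Sketch of "harmonization" under stated assumptions: the field names and
# codes below are hypothetical, not a real regulatory schema.

# Two labs reporting the same kind of result under different local codes.
lab_a_record = {"substance": "CAS-50-00-0", "result": "POS", "assay": "ames"}
lab_b_record = {"chem_id": "50-00-0", "outcome": 1, "test": "AMES_II"}

# A shared vocabulary maps each local code to one harmonized code.
RESULT_CODES = {"POS": "positive", 1: "positive", "NEG": "negative", 0: "negative"}
ASSAY_CODES = {"ames": "bacterial_mutagenicity", "AMES_II": "bacterial_mutagenicity"}

def harmonize(record, substance_key, result_key, assay_key):
    """Rewrite one lab's record into the shared schema."""
    cas = record[substance_key]
    if cas.startswith("CAS-"):
        cas = cas[len("CAS-"):]
    return {
        "cas_number": cas,
        "result": RESULT_CODES[record[result_key]],
        "assay": ASSAY_CODES[record[assay_key]],
    }

a = harmonize(lab_a_record, "substance", "result", "assay")
b = harmonize(lab_b_record, "chem_id", "outcome", "test")
print(a == b)  # only after translation are the two reports comparable
```

The design point is that the mapping tables, not the labs’ habits, carry the authority: consistent decisions across chemicals require that everyone’s codes resolve to the same shared vocabulary.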

In the US, such rules were voluntary in our recent past.  Any attempt to do what must be done will be labeled as a dramatic increase in government power. 

REACH is an attempt to make such rules compulsory in Europe, but the US is not yet ready to take this step.  Companies in Europe have to demonstrate their new chemicals are safe, while in the US it is up to the EPA, chronically underfunded, to investigate and prove if a chemical is unsafe, and then pursue regulation in a way that is ‘least burdensome’ to reduce risk to acceptable levels.[11]

            In both the European and US approaches the stakeholder model is used to write and to evaluate the rules for testing.  This introduces some interesting issues into the regulation of chemicals.  Complexities of the stakeholder model will be the subject of a subsequent entry. 

[1] See Is It In Us? Toxic Trespass, Regulatory Failure & Opportunities for Action, The Commonweal Biomonitoring Resource Center and the Body Burden Working Group, 2007, available at http://cdn.publicinterestnetwork.org/assets/zC9kECTDtAMQl5kgCDFA4w/Is-it-in-US-Report.pdf. See also Bobbi Chase Wilding, et al., Hazardous Chemicals In Health Care, Physicians for Social Responsibility, 2009, available at http://www.nursingworld.org/DocumentVault/OccupationalEnvironment/Hazardous-Chemicals-In-Health-Care.aspx.

[2] For an illuminating description of the natural flavor business, see Raffi Khatchadourian, Annals of Science, “The Taste Makers,” The New Yorker, November 23, 2009, p. 86+. 

[3] See the last page of Reference Document: Anti-microbial Interventions for Beef, Center for Food Safety, Department of Animal Science, Texas A&M University, May 2009, available at http://haccpalliance.org/sub/Antimicrobial%20Interventions%20for%20Beef.pdf. 

[4] See, for example, P. Grandjean and P.J. Landrigan, “Developmental neurotoxicity of industrial chemicals,” The Lancet, published online November 8, 2006, DOI:10.1016/S0140-6736(06)69665-7. 100% of the suspected neurotoxins we have thoroughly investigated have turned out to be toxic, and programs are underway to control them. Yet beyond those three investigations, hundreds more await analysis.

[5] This one example is repeated for many contaminants in our foods. One book that tracked down the leads on many chemicals is Robyn O’Brien, with Rachel Kranz, The Unhealthy Truth: How Our Food Is Making Us Sick—and What We Can Do About It (NY: Broadway Books, 2009).

[6] See Christopher Oleskey, et al., “Pesticide Testing in Humans: Ethics and Public Policy,” Environmental Health Perspectives, Volume 112, Number 8, June 2004, pp. 914-919.

[7] See, for example, Amir Miodovnik and Philip J. Landrigan, “The U.S. Food and Drug Administration Risk Assessment on Lead in Women’s and Children’s Vitamins Is Based on Outdated Assumptions,”  Environmental Health Perspectives Volume 117, Number 7, July 2009, pp. 1021-2. 

[8] For a quick summary of several dozen new approaches to chemical testing, see Toxicity Pathway-Based Risk Assessment: Preparing For Paradigm Change, program for a meeting at the National Academy of Science, Washington, DC, May 11-13, 2009.

[9] FDA, Science Board Subcommittee Review of Information Technology, August 2009, quite difficult to find on their website but available at http://www.fda.gov/downloads/AdvisoryCommittees/CommitteesMeetingMaterials/ScienceBoardtotheFoodandDrugAdministration/UCM182872.pdf. 

[10] FDA Science and Mission at Risk, Report of the Subcommittee on Science and Technology, Prepared for FDA Science Board, November 2007.

[11] Government Accountability Office, Chemical Regulation: Comparison of U.S. and Recently Enacted European Union Approaches to Protect against the Risks of Toxic Chemicals, August 2007, available at http://www.gao.gov/cgi-bin/getrpt?GAO-07-825.

Paying for Food Safety

February 2, 2010

Paying for a Difficult Job
An earlier post briefly described the ambitious agenda facing the new Deputy Commissioner for Food, Michael R. Taylor. It included these lines describing his responsibilities: “inspection and ensuring compliance with rules, coordination with state and local agencies who do much of the work of food safety, responses to “incidents” that hurt people, getting a handle on imported items that find their way into food, sponsoring and organizing the scientific and technical research needed to design a safe food system, including animal feed and veterinary care, and building the information network that helps to knit all of these together.”
Others have looked at the costs of building a food safety system that can do all of this adequately. In the most comprehensive outside analysis of FDA responsibilities and capacity, the authors concluded that the FDA is critically underfunded in its science-based regulation and decision making, so much so as to put public health at risk. (FDA Science and Mission at Risk, Report of the Subcommittee on Science and Technology, Prepared for FDA Science Board, November 2007.) The Congressional Budget Office estimate of the costs of implementing H.R. 2749, which falls short of the more complete goals of FDA Science and Mission at Risk, came to almost half a billion dollars a year for the first few years of transition, and about a billion dollars per year thereafter. (CBO Cost Estimate, H.R. 2749, July 24, 2009.)
The Obama administration has now released its budget for FY2011. As described in more detail in the agency budget documents submitted to Congress, FDA requested about $220 million in the form of fee-based inspections, plus additional spending over current spending which brought the total food system upgrade request to about $318 million in additional money for 2011. (see http://www.fda.gov/downloads/AboutFDA/ReportsManualsForms/Reports/BudgetReports/UCM199447.pdf, p. 15, for a brief summary.) I spent a little time looking at the documents, and only found about $290 million in budget increases in the documents released by OMB. (see their detailed HHS budget description, at http://www.whitehouse.gov/omb/budget/fy2011/assets/hhs.pdf. The first three pages are the FDA budget summary (the pagination starts at p. 461).) You will see the $220 million on 09.02 Food Registration and Inspection User Fee, on the third page (page # 463) of this document. The fees are “proposed.”
I found no separate discussion of the resources devoted to upgrading the FDA’s information technology, a critical resource in the needed upgrades. It could be that a separate item is included for this, but I searched in vain in the detailed agency appendices and in the Analytical Perspectives that accompany the Budget.
Judging just by the number referred to here, it does appear the Obama administration is taking food seriously, at about two-thirds to three-quarters of the resource level implied by the panel of experts that assembled FDA Science and Mission at Risk. I for one will find it interesting to read how far down that road this new budget authority will take us.

FDA’s New Deputy Commissioner for Food

February 2, 2010

The Appointment of Michael R. Taylor
The FDA has created a new position, Deputy Commissioner of Food, and has appointed Michael R. Taylor to fill it. The new office is supposed to unify a focus on food safety among the Center for Food Safety and Applied Nutrition (CFSAN), the Center for Veterinary Medicine (CVM), and the foods-related activities of the Office of Regulatory Affairs (ORA), plus other staff offices that support them.
This is a significant consolidation of FDA resources toward a mission focused on food safety. The idea has been out there for a while—for example, in May of 2009 the group Trust for America’s Health released a report advocating precisely this move, arguing it is an important interim step in the creation of a single food safety agency. Yet even if all goes well this is going to take a year to bear significant fruit. The several food-related units within FDA have many items on their now-shared plate: inspection and ensuring compliance with rules, coordination with state and local agencies who do much of the work of food safety, responses to “incidents” that hurt people, getting a handle on imported items that find their way into food, sponsoring and organizing the scientific and technical research needed to design a safe food system, including animal feed and veterinary care, and building the information network that helps to knit all of these together. The new Deputy Commissioner’s office is supposed to employ risk-based criteria for the difficult budget and other resources decisions to make all of this happen.
It will be a difficult job.
Mr. Taylor clearly has the experience to take on this job, and some of the most knowledgeable people in the food safety world (like Marion Nestle and Steven Grossman) are saying this is very good news. Mr. Taylor’s published writing and congressional testimony may tell us what to expect—these will be the subject of a subsequent note.
This is all good news. But do not read too much into it. I have two reservations. First, we still have the split responsibility for food between the FDA and USDA. The FDA mission includes much less of the industry cheerleader responsibilities shouldered by USDA, but there is enough wiggle room in the mission language to let a pro-business conservative appointee take the agency in a very different direction. This is the second reservation. A food safety system should not depend on a particular party winning an election. A subsequent blog will compare different models of locking in a mission at an institution like FDA.

Mr. Taylor has written and testified about food safety issues, which provides a record that may tell us what to expect.
During his time at Resources for the Future (RFF) he co-edited (with Sandra A. Hoffmann) Toward Safer Food: Perspectives on Risk and Priority Setting (RFF, 2005), and contributed a short summary chapter. The topic of risk analysis is inherently contentious—in general, advocates of a precautionary approach, as found in several western European countries, do not like it, and advocates of using the business bottom line as a factor in safety issues endorse it. [Disclosure: I personally advocate the use of risk analysis, and am convinced it needs to be part of a precautionary approach—but now seldom is.]
It makes sense to go after the biggest threats to human well-being, and to pursue the policies that produce the most improvement for scarce available dollars. Yet risk analysis comes with a technical vocabulary and measurement techniques that tend to limit participation in debates to those with the skills and experience to speak about and decipher the approach. What I saw throughout Toward Safer Food was a model of analysis that treated technical innovations as mostly exogenous variables, as the economists call them, which respond to market forces. Take the example of concentrated animal feeding operations (CAFOs). They produce meat that costs less per pound and brings animals to slaughter many days faster. Markets gave us CAFOs. And yet, they produced public health consequences that have to be paid for through public policy and increased consumer caution—they have killed people. One of the things markets do is to allow some folks to seize the benefits of new technologies while avoiding the costs the new practices impose on others (indeed, many are created for precisely this reason). Risk analysis should include the likely effects of proposed food system technologies to vet them for things like the production of new super-germs, the need for more drugs in animals during production, and so on. None of the authors of the chapters in Toward Safer Food appear to embrace the more robust modeling needed to make risk analysis vigorously support public health.
Another important piece by Mr. Taylor is Harnessing Knowledge to Ensure Food Safety, coauthored with Michael Batz (Food Safety Research Consortium, 2008). The report focuses on the food safety information infrastructure needed right now. [The 9-page executive summary is available at http://www.thefsrc.org/FSII/Documents/FSII_Exec_Summary.pdf. It is worth reading.] Briefly, we now have a chaotic cloud of food safety information, with no widely shared standards for how to develop research, test ideas about food safety, or collect information at state and local governments, in private industry, and in a dozen national government entities. We have to build an information system that supports public health. There are many good ideas in here, as you can see in the executive summary. One key feature of the approach advocated by Mr. Taylor is “the stakeholder model” (TSM), in which the interested parties discuss challenges and prospects and develop approaches for improvement. Representatives from government, industry, consumer groups, and anyone else with a recognized stake get together (in this report, the gathering is called a council) to seek consensus among their perspectives. Like the topic of risk analysis, this is a highly contentious idea. TSM has been widely adopted by state and local governments in environmental policy, to cite one example, because of the political pressures against regulation. Plans that emerge from stakeholder committees or councils emphasize self-financing elements and voluntary compliance. Sometimes a stakeholder has a perspective that harms public health. This fact needs to be part of the institutional design. [Another disclosure: my view on this is probably influenced by personal experience, as described in http://www.plu.edu/~olufsdw/unpleasant.htm.]
A third publication, Stronger Partnerships for Safer Food (coauthored with Stephanie D. David, from GWU School of Public Health and Health Services, 2009) describes the intergovernmental maze of food safety organizations we will have to integrate into an effective system. Nineteen recommendations describe the needed changes, and it is a compelling case (see the executive summary at http://www.rwjf.org/files/research/20090417foodsafetysummary.pdf).
Among the more difficult features are a clear congressional mandate to the Dept. of Health and Human Services (HHS), an HHS secretary committed to food safety as a main agency priority, and a significant increase in intergovernmental funding. One frequently mentioned feature of the current system is the required visual inspection of all animal carcasses, which does not detect the most dangerous sources of contamination. If those inspections were cut out, perhaps $200 million or so could be saved—at some cost to public health. But this goes at most halfway toward funding the required changes needed at FDA alone, which needs to be supplemented by large increases in intergovernmental funding for state and local food agencies. (The budget estimates I refer to here come from sources like the Congressional Research Service and Congressional Budget Office, not the publications mentioned above.)
Observers of the health care debate over the last half year must be forgiven for their profound skepticism that these obstacles can be overcome.
I don’t wish to sound entirely negative. There is much in the latter two documents that needs to be put into an effective food safety agency. But more is needed, in my view. To repeat: A food safety system should not depend on a particular party winning an election. The reservations noted above are all things that can be dramatically altered by a shift in party control of Congress or the White House. A subsequent blog will compare different models of locking in a mission at an institution like FDA.

See the posts on Carl Anderson’s blog

January 27, 2010

I will have some short posts on Carl Anderson’s blog about the appointment of Michael Taylor as Deputy Commissioner for Food.

