Computational Toxicology Seen as Promising, But Challenges Remain to Realize Potential


By Pat Rizzuto, Chemical Regulation Reporter (source: http://news.bna.com 33 CRR 938)


Computational toxicology could help researchers and regulators determine which chemicals are a priority for further scrutiny, a senior scientist with the National Institute for Occupational Safety and Health said Sept. 22. Computational toxicology also could be used to obtain information on a chemical intended to substitute for another compound of concern, said D. Gayle DeBord, chief of NIOSH's Biomonitoring and Health Assessment Branch. “Testing substitutes ahead of time would be helpful,” she said, because workers are sometimes harmed by the replacement chemical. DeBord was among the participants in a two-day workshop, Computational Toxicology: From Data to Analyses to Applications, hosted by the National Academies.


The workshop was convened by the National Institute of Environmental Health Sciences (NIEHS) to examine the different approaches to computational toxicology and how federal agencies might use data generated by such studies. Computational toxicology merges toxicology, biology, and computer science to predict chemical hazards.


William Farland, a senior vice president at Colorado State University and the former deputy assistant administrator for science at the Environmental Protection Agency, said EPA could use computational toxicology to screen new chemicals and determine whether manufacturers should provide additional data before the agency decides whether a new chemical can be sold in the United States.


Treye Thomas, a toxicologist with the Consumer Product Safety Commission, said the agency is exploring the use of computational toxicology as a tool to help evaluate safety, but at present it would not base any regulatory decisions on results from computational toxicological studies.


Lauren Zeise, chief of the reproductive and cancer hazard assessment branch of California's Environmental Protection Agency, said federal agencies need more funding if they are to develop a strategic approach to fostering the field of computational toxicology.


Christopher Portier, associate director of NIEHS, said scientists have to focus on myriad computational and analytic challenges to understand what the data from a computational toxicity study would mean. It took the National Toxicology Program, which NIEHS manages, about one year to resolve a problem it was having with its high-throughput screens, Portier said. High-throughput screens, which can test dozens of chemicals simultaneously, are used in computational toxicology. “We are not discussing those technical difficulties openly and honestly with the regulatory community that would use the data,” Portier said.


Despite such challenges, George Daston, a toxicologist at the Procter & Gamble Co., said he is optimistic about the future of computational toxicology. “We are all desperate for a new and better way to assess toxicity for human health,” Daston said. He added that tools merging the power of computers with biological information are poised to provide that new approach.


 October 2, 2009