Thinking one step ahead: strategies to strengthen epidemiological data for use in risk assessment.

Arnold C - Environ. Health Perspect. (2014)

Risk assessment is a cornerstone of environmental health research and policy making... A commentary in this issue of EHP presents a set of recommendations and guidelines to help researchers more effectively characterize uncertainty in epidemiological findings... “As with any kind of scientific question, it’s important to know how certain we are of our data,” says Thomas Burke, director of the Johns Hopkins Risk Science and Public Policy Institute, who was not involved with the commentary. “We can’t ever fully eliminate uncertainty, but we can describe it and put bounds around it with statistics.”

Experimental data have traditionally formed the basis for most human health risk assessments, but regulators are increasingly recognizing the value of epidemiological data for this purpose. “Different types of studies, like toxicology studies in animals and epidemiological studies in humans, can help compensate for each other’s inherent weaknesses,” says Michael Dourson, director of Toxicology Excellence for Risk Assessment, a public health organization located in Cincinnati, Ohio...

This system enables policy makers to rate the quality of epidemiological data and how well study findings can be generalized to larger populations... It also allows them to weigh the uncertainties from different studies according to the quality of the research, yielding more accurate and nuanced risk assessments... For authors, applying the system to their own work can point to areas where the uncertainty would benefit from further analysis...

These methods can transform the discussion of uncertainty from its usual qualitative form into a quantitative measurement... This allows scientists to clearly communicate their results and the accompanying uncertainties in the numbers-driven language of policy makers. “You need to communicate what you’ve done, and you’ve got to be able to state your results in a way that managers can get their head around,” Dourson says...

The authors of the commentary recommend several more techniques to present data more clearly and accurately... Among others, they suggest the use of directed acyclic graphs as a way to visualize the sometimes complex relationships among confounders... They also emphasize the need to distinguish between correlation and causation when describing study results, to ensure scientists and policy makers don’t draw incorrect conclusions about risk.
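
To make the DAG suggestion concrete, the following is a minimal sketch, not drawn from the commentary itself, of how such a graph might be encoded with Python's networkx library. The exposure, outcome, and confounder names are hypothetical placeholders.

import networkx as nx

# Minimal sketch (not from the article): a directed acyclic graph laying out
# assumed causal relationships among a hypothetical exposure, confounder,
# and outcome.
dag = nx.DiGraph()
dag.add_edges_from([
    ("air_pollution", "asthma"),          # hypothesized exposure -> outcome
    ("socioeconomic_status", "air_pollution"),
    ("socioeconomic_status", "asthma"),   # affects both exposure and outcome
    ("smoking", "asthma"),
])

# By definition a DAG contains no cycles; this check guards that assumption.
assert nx.is_directed_acyclic_graph(dag)

# Variables that are ancestors of both the exposure and the outcome are
# candidate confounders to consider adjusting for.
confounders = nx.ancestors(dag, "air_pollution") & nx.ancestors(dag, "asthma")
print(sorted(confounders))  # ['socioeconomic_status']

Writing the assumed relationships down this way forces the analyst to state them explicitly, which is the point the commentary's authors make about visualizing confounding.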

Figure: Directed acyclic graphs (DAGs) can be an effective way to visualize relationships between the variables in a study. © Joseph Tart; Tomislav Pinter/Shutterstock

