of uncertainty factors. Furthermore, ADME data could be used to explore the dose–response curve at the lower extreme of intakes to set safe lower levels as part of a risk–benefit analysis [16]. The construction of biological models for dose–response curves and CSAFs would need different gradations of intakes from those traditionally used for toxicologic studies.
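To make the use of such adjustment factors concrete, the following is a minimal numerical sketch, in Python, of how a chemical-specific adjustment factor derived from ADME data might replace a default uncertainty subfactor when deriving a guidance value. The point of departure, the 1.8-fold kinetic variability, and the nutrient itself are hypothetical; the splitting of the default 10-fold human-variability factor into equal kinetic and dynamic subfactors follows the general IPCS convention and is not prescribed by this paper.

```python
# Illustrative sketch: deriving a health-based guidance value with a
# chemical-specific adjustment factor (CSAF) in place of a default
# uncertainty subfactor. All numbers below are hypothetical.

# Under the IPCS convention, the default 10-fold factor for human
# variability splits into toxicokinetic (10**0.5 ~ 3.16) and
# toxicodynamic (10**0.5 ~ 3.16) subfactors; ADME data can replace
# the kinetic default with a measured, chemical-specific value.
DEFAULT_HK = 10 ** 0.5   # default human kinetic subfactor
DEFAULT_HD = 10 ** 0.5   # default human dynamic subfactor

def composite_uf(csaf_kinetic=None, csaf_dynamic=None):
    """Combine subfactors, substituting CSAFs where data exist."""
    hk = csaf_kinetic if csaf_kinetic is not None else DEFAULT_HK
    hd = csaf_dynamic if csaf_dynamic is not None else DEFAULT_HD
    return hk * hd

point_of_departure = 9.0  # mg/day; hypothetical NOAEL for a nutrient

# Defaults only: composite UF = 10, UL = 0.90 mg/day.
print(f"UL with defaults: {point_of_departure / composite_uf():.2f} mg/day")

# With ADME data showing only 1.8-fold kinetic variability, the
# composite UF falls to ~5.7 and the UL rises to ~1.58 mg/day:
# a less conservative but better-justified value.
print(f"UL with kinetic CSAF: "
      f"{point_of_departure / composite_uf(csaf_kinetic=1.8):.2f} mg/day")
```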
In some instances, high nutrient intakes have already been associated with phenomena that correspond with stage 2 or 3 markers for adverse health effects. Examples are metabolic interactions among nutrients (such as those among iron, zinc, and copper) in situations in which imbalanced intakes compromise the specificity of individual metabolic pathways [1].
The value of using markers in nutrient risk assessment would be enhanced if the totality of the evidence supporting the biological validity of each marker could be explicitly evaluated in the context of overall causation, incorporating: the strength of the association; consistency across all lines of evidence; specificity; temporal relationship; a demonstrable relationship between intake and a functional or health-effect response for the marker (i.e., for a homeostatic stage 2 marker, an indication of an adaptive phenomenon rather than a linear response that might reflect exposure rather than a specific adaptive health effect [5]); and plausibility, coherence, and experimental support from other sources (e.g., animal models [17]). It is doubtful, however, whether much information is currently available to support potential markers for nutrient risk assessment.
These principles are also relevant to the early detection of inadequate intakes. Some initiatives have considered whether a common approach to assessing nutrient deficiency and excess is possible. This has been explored for essential trace elements [2] and as a risk–benefit analysis for micronutrients in general [16]. Recently, a human health dose–response risk assessment was used to explore the dual response curve risk assessment for copper [11], and a spectrum from copper deficiency to copper toxicity was compiled from data on humans and animal models. The exercise provided an opportunity to explore several approaches to dose– or intake–response modeling, including the benchmark dose and categorical regression (a benchmark-dose calculation is sketched below). Existing data allowed for these approaches, but the development of a biologically based dose–response risk assessment and of CSAFs was limited by the quality and amount of the data [11].
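For readers unfamiliar with the benchmark-dose approach mentioned above, the following Python fragment is a purely illustrative sketch, not the analysis carried out in [11]: it fits a logistic intake–response model to invented quantal data and solves for the intake producing a 10% extra risk over background. All doses, response fractions, and parameters are hypothetical.

```python
# Minimal sketch of benchmark-dose (BMD) estimation on hypothetical
# quantal data: fit a logistic intake-response model, then solve for
# the intake giving 10% extra risk over background.
import numpy as np
from scipy.optimize import curve_fit, brentq

# Hypothetical intake groups (mg/day) and the observed fraction of
# subjects showing the adverse-effect marker.
dose = np.array([0.0, 2.0, 4.0, 8.0, 16.0])
frac = np.array([0.02, 0.05, 0.12, 0.40, 0.85])

def model(d, background, slope, ed50):
    """Logistic dose-response with a background rate."""
    return background + (1 - background) / (1 + np.exp(-slope * (d - ed50)))

params, _ = curve_fit(model, dose, frac, p0=[0.02, 0.5, 8.0],
                      bounds=([0, 0, 0], [0.5, 5, 50]))

def extra_risk(d):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p_bg = model(0.0, *params)
    return (model(d, *params) - p_bg) / (1 - p_bg)

BMR = 0.10  # benchmark response: 10% extra risk
bmd = brentq(lambda d: extra_risk(d) - BMR, 1e-3, 16.0)
print(f"BMD10 ~ {bmd:.2f} mg/day")
# A regulatory analysis would also report a lower confidence limit
# (BMDL), e.g., via bootstrap or profile likelihood, rather than the
# point estimate alone.
```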
Many of the individual studies reviewed during hazard identification and characterization in this exercise had been designed to demonstrate the effects of prolonged exposure to a single measured, and usually very high or very low, concentration of copper in the diet. These studies were not designed to generate intake–response curves or to examine risk. Most simply reported the copper content of the diets fed to animals and gave no indication of actual intakes; these had to be estimated, albeit imperfectly, from animal weights and data from other reports on animal food consumption. After such reports were excluded during the systematic literature search, the residual database was very scant. This experience cautions against expecting that nutrient risk assessment can soon be addressed through a biologically based response approach. The situation for amino acids, where systematic metabolic studies using tracers are improving the generic understanding of how kinetic and dynamic studies apply to homeostasis, may be more encouraging [3–5] (a simple kinetic calculation is sketched below).
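As an indication of what such kinetic studies measure, the sketch below fits a single-pool (mono-exponential) tracer model to invented post-bolus enrichment data to estimate a rate constant, pool size, and flux. Real amino acid tracer work uses multi-pool models and corrections omitted here; every number is hypothetical.

```python
# Illustrative single-pool tracer kinetics: after a bolus dose, the
# isotopic enrichment of the pool decays mono-exponentially, and the
# fitted parameters yield pool size (by isotope dilution) and flux.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 4, 6, 8])                  # hours after bolus
enrich = np.array([4.1, 3.6, 2.8, 1.7, 1.0, 0.6])   # mole % excess

def one_pool(t, e0, k):
    """Mono-exponential decay of tracer enrichment."""
    return e0 * np.exp(-k * t)

(e0, k), _ = curve_fit(one_pool, t, enrich, p0=[4.5, 0.3])

dose_umol = 350.0               # hypothetical tracer dose (umol)
pool = dose_umol / (e0 / 100)   # pool size by dilution (approximate)
flux = k * pool                 # turnover (umol/h) = k x pool size
print(f"k = {k:.2f}/h, pool ~ {pool:.0f} umol, flux ~ {flux:.0f} umol/h")
```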
Summary and conclusions<br />
P. J. Aggett<br />
There is a need for a transparent model for nutrient risk assessment that would enable key elements of nutrient metabolism and function, and gastrointestinal and systemic adaptive phenomena in response to excess intakes (i.e., above the “physiological requirements”), to be identified and used as markers of excessive exposure. Such a biologically based dose–response model for determining ULs for nutrients could also be used to explore lower levels of intake and thereby enable lower reference intakes to be set; the two response curves together bound an acceptable range of intakes (illustrated below).
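The idea that deficiency and excess define a bounded range of acceptable intakes can be pictured with a toy calculation: the risk of inadequacy falls with increasing intake while the risk of excess rises, and reference values sit where both risks are acceptably low. The curves, parameters, and the 2.5% risk cut-off below are entirely hypothetical.

```python
# Toy dual intake-risk picture: risk of inadequacy falls with intake,
# risk of excess rises, and the acceptable range lies between them.
import numpy as np

intake = np.linspace(0.1, 20, 400)  # mg/day (hypothetical nutrient)
risk_deficit = 1 / (1 + np.exp(3.0 * (intake - 2.0)))    # falls with intake
risk_excess = 1 / (1 + np.exp(-1.5 * (intake - 12.0)))   # rises with intake

# Intakes where both tail risks stay below an (arbitrary) 2.5% cut-off.
ok = intake[(risk_deficit < 0.025) & (risk_excess < 0.025)]
print(f"intakes with <2.5% risk at either tail: "
      f"{ok.min():.1f}-{ok.max():.1f} mg/day")
```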
Nutrient hazard identification and characterization is an iterative process. It needs to be supported by a complete compilation and review of the available literature and data, i.e., an evidence-based systematic review with predefined search and summary strategies and transparent criteria for rating, including, and excluding individual studies and their data. As with the risk assessment of non-nutrient chemicals, published systematic reviews may be useful. However, they should provide a means of accessing primary data and of rating their quality, and the bases of their systematization should not be allowed to prejudice the nutrient hazard identification and characterization. The critical intermediate outcome of this process is the agreed selection of an adverse health effect from which a UL, as a health-based guidance value, can be derived for the protection of public health. The approach proposed in this paper for identifying the adverse health effect, namely, the use of health effects and markers that occur earlier, or at lower intakes, on the pathogenic response curve than classic toxicologic adverse effects, necessitates specific data search strategies that make greater use of ADME, biokinetic, and biodynamic data, and it will probably require specific research. The advantage of using adverse health effects, or markers thereof, that occur at such lower intakes is that the necessary research should be feasible in human participants. The disadvantages at present are the overall paucity of data, the nonsystematic and opportunistic nature of most of the relevant data, and the fact that most systematic data are derived from animal models.