
Setting upper levels for nutrient risk assessment

not been fully validated, there is probably a need for research to develop and validate markers appropriate for this nutrient risk assessment strategy.

A continuum as a basis for identifying adverse health effects for this strategy in nutrient hazard identification and characterization is illustrated in figure 3 [16]. Steps 4–7 in the figure reflect the increasing severity of adverse health effects; the workshop [1] agreed that phenomena occurring at level 3, namely, at levels of intake that were assumed to be not far in excess of the homeostatic range, could indeed be surrogates, or predictive markers, of adverse health effects. Furthermore, it concluded that

when data are available, the optimal endpoint for use in setting a UL would be an effect at step 3 and possibly step 2, with steps 4–7 reflective of clinical phenomena. Step 2 may be applicable in some cases in which sufficient information is available to suggest that changes outside the homeostatic range that occur without known sequelae would be relevant as surrogates for an adverse health effect [1].

This is emphasized in figure 3 by showing, against the background of the ranked effects, a hypothetical cascade of markers or effects arising from exceeding a safe level, or conceptual UL, of intake. These figures show how a critical control point analytical approach to the available information could be used in nutrient hazard identification and characterization; it provides a strategic structure for an evidence-based systematic review and categorization of the data and potential markers, and for the determination of gaps in insight into the pathophysiology and uncertainty.

FIG. 3. The range and cascade of effects and of markers: opportunities for identifying critical markers of adverse health effects. Each circle represents a hypothetical marker, and the dark circle represents the marker at a critical point for the subsequent cascade of adverse health effects. The spectrum of health effects is from Renwick et al. [16]

The use of such markers would be an important innovation in nutrient risk assessment. Their introduction would be amenable to the NOAEL, BMD, and categorical regression approaches to hazard characterization. Using markers of earlier effects of excess intakes would be expected to increase the size of the database for hazard identification and characterization, although they probably will not improve the quality of the database. In particular, the use of markers at steps 2 and 3 of figure 3 needs to be backed by confidence in their validity and quality.
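To make the hazard-characterization approaches named above concrete, the sketch below illustrates, under clearly hypothetical assumptions, how a candidate UL might be derived either by dividing a NOAEL by a composite uncertainty factor or by a benchmark-dose (BMD) style fit to quantal dose–response data. None of the numbers, and neither helper function, comes from the workshop report; real BMD modelling relies on maximum-likelihood fits and confidence limits rather than this simplified least-squares probit.

```python
# Illustrative sketch only: two common routes to a candidate UL.
# The intakes, response rates, and uncertainty factor below are hypothetical
# placeholders, not data from the workshop report.

import numpy as np
from scipy.optimize import brentq, curve_fit
from scipy.stats import norm


def ul_from_noael(noael_mg_per_day: float, uncertainty_factor: float) -> float:
    """Classical approach: divide the NOAEL by a composite uncertainty factor."""
    return noael_mg_per_day / uncertainty_factor


def bmd_from_probit(doses, response_rates, bmr=0.10):
    """BMD-style approach: fit a simple probit dose-response curve and find the
    dose producing an extra risk equal to the benchmark response (bmr)."""
    doses = np.asarray(doses, dtype=float)
    rates = np.asarray(response_rates, dtype=float)
    probit = lambda d, a, b: norm.cdf(a + b * d)
    (a, b), _ = curve_fit(probit, doses, rates, p0=(-2.0, 0.05))
    background = norm.cdf(a)                        # response at zero intake
    target = background + bmr * (1.0 - background)  # extra-risk definition
    return brentq(lambda d: norm.cdf(a + b * d) - target, 0.0, 10 * doses.max())


# Hypothetical inputs (mg/day)
print(ul_from_noael(50.0, uncertainty_factor=2.0))                            # 25.0
print(bmd_from_probit([0, 10, 25, 50, 100], [0.02, 0.03, 0.08, 0.20, 0.55]))
```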

With these issues in mind, the workshop emphasized that biomarkers comprised two classes: “factors” that represent “an event…directly involved in the process of interest and are causally associated with the adverse health effect” [1] and “indicators” that represent correlated or associated effects and events that have not been shown to be part of the causal pathway. Thus, a biomarker that is part of the causal pathway can be regarded as being “predictive” of an adverse health effect; however, some “predictive” biomarkers might not be causal. In this regard, the workshop appreciated that biomarkers can be diagnostic in that they indicate adverse health effects relevant to nutrient risk assessment, for example, liver damage, but as such these could still be categorized as factors or indicators according to their perceived role in the pathogeneses involved. Thus, nutrient risk assessors may have available a portfolio of biomarkers that could be used as surrogates for adverse health effects, in that such markers can be typified as being causally associated with the adverse health effect; diagnostic of the adverse health effect; and predictive of, but not causally associated with, the adverse health effect [1].
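The three-way typology just described can be mirrored in a small data structure; the sketch below simply tags each candidate marker in a portfolio with one of the surrogate categories the workshop distinguished. The marker names, the step numbers, and the idea of filtering to steps 2–3 are illustrative assumptions, not part of the source.

```python
# Hypothetical sketch of a biomarker "portfolio" tagged with the three
# surrogate categories distinguished in the text. Marker names are invented.

from dataclasses import dataclass
from enum import Enum, auto


class SurrogateRole(Enum):
    CAUSAL_FACTOR = auto()         # causally associated with the adverse effect
    DIAGNOSTIC = auto()            # diagnostic of the adverse effect itself
    PREDICTIVE_INDICATOR = auto()  # predictive but not shown to be causal


@dataclass
class Biomarker:
    name: str
    role: SurrogateRole
    step: int  # assumed position on the figure 3 continuum (e.g., 2 or 3)


portfolio = [
    Biomarker("hypothetical enzyme activity change", SurrogateRole.CAUSAL_FACTOR, 3),
    Biomarker("hypothetical liver-damage marker", SurrogateRole.DIAGNOSTIC, 5),
    Biomarker("hypothetical correlated metabolite", SurrogateRole.PREDICTIVE_INDICATOR, 2),
]

# Candidate endpoints for setting a UL: markers at steps 2-3, as the text suggests
candidates = [m for m in portfolio if m.step in (2, 3)]
```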

Overall, markers, whether they are functional, chemical, or morphological, would need to meet the quality criteria of being biologically valid and reproducible, of known specificity and sensitivity, and methodologically or analytically valid and reproducible. A recent consideration of the use of markers as surrogate endpoints in the justification of claims of reduced risk of disease is particularly relevant to these issues [17] and emphasizes the need for markers to be ethically and practically feasible if they are to be used in systematic studies in populations. The workshop appreciated that these criteria provided a basis for characterizing the uncertainty and variability associated with markers at any stage in the above ranking, but particularly those at stages 2 and 3, where the best chances of improving the database through human research exist.
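As a concrete reminder of what “known specificity and sensitivity” amounts to in practice, the short calculation below works through an invented 2×2 table comparing a hypothetical marker with a reference diagnosis; the counts are purely illustrative.

```python
# Hypothetical 2x2 table for a candidate marker versus a reference diagnosis.
# Counts are invented for illustration.

true_positives = 45   # marker positive, adverse effect present
false_negatives = 5   # marker negative, adverse effect present
true_negatives = 180  # marker negative, adverse effect absent
false_positives = 20  # marker positive, adverse effect absent

sensitivity = true_positives / (true_positives + false_negatives)  # 0.90
specificity = true_negatives / (true_negatives + false_positives)  # 0.90

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```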

Theoretically, a biologically based or metabolic dose–response model would be applicable to all nutrients and should or could derive from the compilation and acquisition of new data on absorption, distribution, metabolism, and excretion as the basis of information on biokinetics and biodynamics. In essence, this resembles the use and derivation of chemical-specific adjustment factors (CSAFs) to improve the specificity
