Recent technological developments have fundamentally changed the way in which biology is studied. In 2001 the first draft of the human genome sequence was published. Since then significant inroads have been made towards characterising biological processes at the molecular level: defining which genes are induced or repressed (the transcriptome), what proteins are produced (the proteome) and how these proteins influence metabolic processes (the metabolome). New insights have also been revealed about the way in which the environment can modify these molecular events and the way in which genotype may predict phenotype and influence responsiveness to environmental factors.
Nutritional genomics is a promising new research area and, as a young science, the way in which it is defined is still a topic for debate. For many researchers nutritional genomics is closely associated with ‘personalised nutrition’, an emerging concept in which the diet of an individual is customised, based on their own genomic/genetic information, to optimise health and prevent the onset of disease. In this context nutritional genomics is largely concerned with elucidating the interactive nature of genomic, dietary and environmental factors and how these interactions impact on health outcomes. One aim is to understand how diets and dietary components affect gene expression and how these changes in turn influence protein expression and modification, metabolism and ultimately health (nutrigenomics). A second aim is to understand how genetic variation modifies an individual's physiological response to diet and how this may influence parameters of health or disease (nutrigenetics). Alternatively, the topic of nutritional genomics may be considered more generally, a broader definition being the application of high throughput genomics (transcriptomics, proteomics, metabolomics/metabonomics) and functional genomic technologies to the study of nutritional sciences and food technology (van der Werf et al. 2001). If viewed from this wider perspective then the impact of nutritional genomics on the food industry is considerable, with applications from ‘farm to fork’ anticipated (see Fig. 1). Indeed, in a number of instances these applications are no longer just a vision but are now approaching reality. In this review the relevance of nutritional genomics to the food industry will be considered and examples given of how this science area is starting to be leveraged for economic benefit and to improve human nutrition and health.
Nutritional genomics and agriculture
Plant biotechnologists were the first group within the food industry to seriously adopt and implement genomic science. Indeed, in 1999 the term ‘nutritional genomics’ was used to describe the successful manipulation of plant biosynthetic pathways to improve human health (DellaPenna, 1999). Agricultural interest in genomics has now expanded to the livestock industry. Although transgenic approaches to altering the genetic makeup of meat-producing animals are unlikely to be acceptable to many consumers, opportunities do exist for agricultural researchers to use marker-assisted selection to exploit natural variation. In a broad sense agriculture-based nutritional genomics is concerned with defining how to grow and feed domesticated crops and livestock to maximise their own health, resilience and yield, as well as modifying their composition to improve nutritional qualities that are important for human health. Applications for nutritional genomics therefore include the identification of dietary signals that boost immunity, eliminating the need for antibiotic use in animal feed, as well as the development of crops or animal produce with increased levels of healthful phytochemicals.
Domesticated food crops
The combination of genomics and molecular biology has created a new way for scientists to generate plant varieties, one that offers wider functional scope and greater precision than conventional plant breeding methods. Although in the first instance GM plants were produced for purely agro-economic reasons, to achieve resistance to herbicides or to pathogens (mainly insects or viruses), more recently the technology has been focused on improving the nutritional qualities of plants and enhancing human health.
A good example of the nutritional benefits that can be achieved through genetic modification is Golden Rice, a rice strain engineered to produce β-carotene (pro-vitamin A) in the endosperm and designed as a biotechnology solution to the problem of vitamin A deficiency. Vitamin A deficiency is a significant public health problem in many parts of the developing world, particularly Africa and South East Asia where rice is a primary food staple. It is the leading cause of preventable blindness in children and may increase the incidence and severity of infectious diseases. In a proof of concept study, scientists showed that it was possible to establish an entire biosynthetic pathway de novo in rice endosperm, enabling the accumulation of pro-vitamin A (Ye et al. 2000). In the original strains pro-vitamin A accumulation was relatively limited, supplying only 15–20 % of the recommended dietary allowance for vitamin A. However, recently Golden Rice 2 has appeared; this new variety accumulates pro-vitamin A at levels more than 20-fold higher than those of the original and could deliver up to 50 % of a child's vitamin A requirements (Paine et al. 2005).
A further example of using genetic modification to improve nutritional traits is the development of transgenic tomatoes. In one such study a gene from Petunia encoding chalcone isomerase, an important flavonoid biosynthesis enzyme, was over-expressed in tomatoes. Processing of the resultant high-flavonol tomato fruit generated a tomato paste containing levels of flavonols that were 21-fold higher than those of paste manufactured from standard fruit (Muir et al. 2001). Flavonols are known to exhibit antioxidant properties and consumption of flavonol-rich foods has been linked to improvements in health, particularly cardiovascular health (Fisher & Hollenberg, 2005).
Despite the nutritional benefits offered by GM food and crops the public reaction to this technology has been poor, particularly in Europe. Indeed, the progress of GM crops through the European Union regulatory system largely halted in 1998 pending a review of all regulations pertaining to the release of GM organisms and the marketing of GM products (Hails & Kinderlerer, 2003). Given this, some plant biotechnologists have sought alternative ways of using genomic information to deliver crop improvements. This has led to the development of Targeting Induced Local Lesions IN Genomes (TILLING), a technique that shows considerable promise as a non-transgenic way of improving domesticated crops (Slade & Knauf, 2005). Essentially TILLING is a reverse genetics technique. Rather than engineering a new cultivar with specific characteristics, technologists identify variants with the desired genetic characteristics by high throughput screening of large natural germplasm collections or novel, chemically mutagenised populations. Although originally developed for use in Arabidopsis thaliana, a diploid model organism, TILLING has also been successfully applied to a range of other plant species (soyabean, maize, romaine and iceberg lettuce, rice, peanut, castor), including those with more complex genomes, such as durum (tetraploid) and bread (hexaploid) wheat (Slade & Knauf, 2005).
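To make the screening step concrete, the sketch below illustrates, in Python, the kind of variant scan that underpins TILLING: amplicon consensus sequences from a mutagenised population are compared against a reference gene sequence and individuals carrying EMS-type transitions (G→A, C→T) are flagged for follow-up. The sequences, line identifiers and the simple position-by-position comparison are illustrative assumptions, not a published TILLING pipeline.

```python
# A minimal sketch of the screening logic behind TILLING: given a reference
# sequence for a target gene and amplicon consensus reads from a chemically
# mutagenised population, flag lines carrying EMS-type point mutations.
# All sequences and line identifiers are hypothetical illustrations.

REFERENCE = "ATGGCTCAGTTCGGA"  # reference allele of the target gene (toy example)

# Amplicon consensus per mutagenised line (hypothetical data)
population = {
    "line_001": "ATGGCTCAGTTCGGA",   # wild type
    "line_002": "ATGACTCAGTTCGGA",   # G->A at position 4: candidate mutant
    "line_003": "ATGGCTCAGTTTGGA",   # C->T at position 12: candidate mutant
}

EMS_TRANSITIONS = {("G", "A"), ("C", "T")}  # classic EMS-induced changes

def find_induced_mutations(reference, sample):
    """Return (position, ref_base, new_base) for EMS-type substitutions."""
    return [
        (i + 1, r, s)
        for i, (r, s) in enumerate(zip(reference, sample))
        if r != s and (r, s) in EMS_TRANSITIONS
    ]

for line_id, seq in population.items():
    hits = find_induced_mutations(REFERENCE, seq)
    if hits:
        print(line_id, "->", hits)  # candidate for phenotypic follow-up
```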
Livestock and animal produce
Selective breeding has been applied for thousands of years to improve desirable characteristics (e.g. disease resistance) in domesticated animal species. With the advent of genomic technologies efforts are now underway to understand the genetic basis of commercially important traits. In animals quantitative trait loci, multiple genomic regions identified as important in determining the variation of a complex phenotype, have been identified for a range of economically important traits including growth, fatness, fertility, milk production/composition, meat quality and health (Harlizius et al. 2004; Gordon et al. 2005). The molecular basis of these phenotypes is complex and for the most part remains uncertain, with only a small number of genes in quantitative trait loci identified to date. However, as knowledge builds and the genetic variations underlying these traits are more comprehensively delineated opportunities may emerge for stock improvement through marker-assisted selection. Indeed, genotyping services (www.cogenics.com) to test for markers associated with scrapie susceptibility and meat tenderness are already being offered to the agricultural community.
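As a hedged illustration of how marker-assisted selection might use such information in practice, the following Python sketch ranks candidate breeding animals by a simple additive score over a panel of trait-associated markers. The marker names, effect sizes and genotypes are hypothetical; real breeding programmes estimate these effects from QTL mapping data.

```python
# A minimal sketch of marker-assisted selection, assuming a hypothetical panel
# of markers with additive effects on a trait such as meat tenderness. Marker
# names, effect sizes and genotypes are illustrative, not real QTL data.

# Effect of each favourable allele on the trait, in arbitrary units
MARKER_EFFECTS = {"marker_A": 0.8, "marker_B": 0.5, "marker_C": 0.3}

# Genotypes: number of favourable alleles (0, 1 or 2) per marker per animal
candidates = {
    "bull_17": {"marker_A": 2, "marker_B": 1, "marker_C": 0},
    "bull_23": {"marker_A": 1, "marker_B": 2, "marker_C": 2},
    "bull_31": {"marker_A": 0, "marker_B": 0, "marker_C": 1},
}

def marker_score(genotype):
    """Sum favourable-allele counts weighted by estimated marker effects."""
    return sum(MARKER_EFFECTS[m] * count for m, count in genotype.items())

# Rank candidate sires by predicted genetic merit for the trait
ranked = sorted(candidates.items(), key=lambda kv: marker_score(kv[1]), reverse=True)
for animal, genotype in ranked:
    print(f"{animal}: score {marker_score(genotype):.2f}")
```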
The past decade has also seen a concerted effort to map the genomes of all major farm animal species. Initially efforts were focused on pigs, cattle and chicken. However, medium-density genetic linkage maps are now also available for horses, goats and several fish species (Harlizius et al. 2004). More recently, the sequencing of farm animal genomes has begun. The first draft of the chicken (Gallus gallus) genome was published in 2004 and a genetic variation map, based on a comparison of the sequences of three domestic chicken breeds (a broiler, a layer and a Chinese silkie) with that of the red jungle fowl, the predecessor of the domestic chicken, has also been developed (Hillier et al. 2004; Wong et al. 2004). Despite the lack of detailed sequence maps, new insights into how genetics influence commercially important traits have started to emerge. For example, the genes PPARGC1A, DGAT1 and FASN have been shown to influence the fat content of milk (Pareek et al. 2005; Weikard et al. 2005; Roy et al. 2006); RYR1, MSTN, PRKAG3, IGF2 and CLPG are all associated with muscle growth, the major determinant of the performance of meat-producing animals (Gordon et al. 2005); and polymorphisms in the prolactin and prolactin receptor genes have been linked to broodiness in chickens (Jiang et al. 2005). However, for some of these genes, particularly those linked to musculature, negative side effects have also been observed. In pigs the high muscle growth associated with the RYR1 and PRKAG3 genes has been linked to reduced meat quality, and the MSTN gene is associated with a reduction in stress tolerance, calf viability and female fertility (Harlizius et al. 2004; Gordon et al. 2005).
In addition to studying direct gene effects, agricultural researchers have also started to explore the way in which genetic factors may interact with diet. Although this field of research is still very much in its infancy there is considerable excitement about what the future may hold, particularly the impact such interactions may have on the nutritional value of food products (meat, butter, milk, cheese and eggs, etc.). For example, supplementation of dairy cattle diets with plant oils high in linoleic and/or linolenic acid is known to increase milk levels of cis-9, trans-11 conjugated linoleic acid, a putative anti-cancer agent. Furthermore, striking differences between individual animals in the degree of conjugated linoleic acid enrichment have been observed and may, in part, be the result of genetic factors (Lock & Bauman, 2004). A second example is the production of eggs and milk containing increased amounts of n-3 fatty acids compared with conventional products. Diets rich in n-3 fatty acids, such as DHA and EPA, have long been considered beneficial to human health, particularly in relation to CVD (Bautista & Engler, 2005). However, the n-3 content of Western diets tends to be relatively low. As a consequence, opportunities to enhance the n-3 fatty acid content of many foods are being explored. Supplementation of cattle feedstuffs with fish oils, fish by-products and marine algae is known to increase DHA and EPA levels in milk (Lock & Bauman, 2004), whilst poultry fed flaxseed produce eggs containing high levels of α-linolenic acid, a plant-derived n-3 fatty acid (Bourre, 2005). Whether this enrichment can be improved through genotype selection is still to be determined. However, genetic strain has been shown to influence the incorporation of n-3 fatty acids into the yolk of chicken eggs, suggesting that it may be possible (Steinhilber, 2005).
Nutritional genomics and food processing, food safety and quality assurance
Moving beyond agriculture to other parts of the food chain, researchers are now using genomic technologies to drive improvements in food processing, food safety and quality assurance. Applications include the use of ‘DNA fingerprinting’ to check the authenticity of food ingredients and to verify the composition of processed food products, as well as the development of molecular-based diagnostics to guide food processes, predict the shelf-life of fresh produce and detect microbiological contamination.
Food processing
A relatively new application of genomic technologies is the discovery of ‘process markers’, informative molecular markers that may be used to guide industrial processes or improve supply chain management. For example, the manufacture of tea is a complex process, involving multiple steps (withering, ‘fermentation’, heat processing and drying) that can influence both the aroma and taste of the final product. Defining the molecular basis of these processes should help improve current manufacturing techniques and may also highlight alternative, perhaps more scalable, manufacturing approaches. In terms of supply chain management, molecular markers may prove useful for predicting and improving the shelf-life of fresh produce. Broccoli, a popular green vegetable, has a notoriously short shelf-life, yellowing and losing turgor within days of harvest. Researchers studying the way in which genetic and environmental factors influence post-harvest performance have revealed that the processes occurring in harvested broccoli are highly comparable to those that occur during normal leaf senescence (Page et al. 2001). Genomic approaches have been used to identify and characterise a range of senescence-enhanced genes in the model plant species Arabidopsis, allowing a putative network of signalling pathways to be proposed (Buchanan-Wollaston et al. 2003). As understanding of the molecular basis of plant senescence continues to develop it should enable the rational design of growth and harvesting techniques to improve post-harvest crop performance. Indeed, knowledge gained from leaf senescence studies has already been used to manipulate the post-harvest characteristics of broccoli and lettuce (Henzi et al. 2000; Chen et al. 2001; McCabe et al. 2001).
Food safety
To date, the use of genomics in food safety has concentrated on two main areas: the safety evaluation of food components (van Ommen & Groten, 2004) and the detection of micro-organisms that may cause food spoilage or be hazardous to human health (Abee et al. 2004). Safety evaluation of food components is concerned with both hazard identification (whether a food component causes an adverse health effect) and hazard characterisation (the level of exposure required to elicit an adverse health effect). These data are then used to determine the acceptable daily intake of a particular food or food chemical. Although hazard analysis is clearly important, gathering appropriate data can be costly and time-consuming, requiring detailed toxicological experimentation in animals, often on an empirical basis.
Genomic technologies can offer a number of benefits when conducting toxicological evaluation. Firstly, their high throughput nature means that it is possible to analyse multiple tissues in a timely and cost-effective manner. Secondly, by applying transcriptomics, proteomics and metabolomics/metabonomics the full range of biological responses, from gene expression through to cellular functions, can be studied. Thirdly, and perhaps most importantly, mechanistic understanding is not a prerequisite for good experimental design. In fact, genomic technologies may usefully be applied as hypothesis-generating as well as hypothesis-testing tools. As knowledge grows on the gene, protein and metabolic changes induced by particular xenobiotics it should be possible to derive mechanistic insights for new compounds by comparing their profiles with existing data. Furthermore, such comparisons may prove useful in highlighting potential toxicities early on, enabling more targeted toxicological analysis. The utility and challenges of using genomic technologies in toxicological assessment have recently been reviewed by a number of authors (Battershill, 2005; Heijne et al. 2005; Lindon et al. 2005; Reynolds, 2005).
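As a rough illustration of this profile-comparison idea, the Python sketch below ranks reference compounds by the correlation between their expression signatures and that of a new test compound, offering a first hint at a shared mechanism. The compound names, signatures and the use of Pearson correlation are illustrative assumptions, not a specific published method.

```python
# A minimal sketch of mechanistic inference by profile comparison: rank known
# compounds by the similarity of their expression signatures to that of a new
# compound. Signatures are hypothetical vectors of log fold changes.

import numpy as np

reference_profiles = {
    "compound_A (oxidative stress)": np.array([1.2, -0.4, 0.8, 2.1, -1.0]),
    "compound_B (PPAR agonist)":     np.array([-0.3, 1.9, -0.2, 0.1, 1.4]),
    "compound_C (genotoxic)":        np.array([0.9, -0.1, 1.1, 1.8, -0.7]),
}
new_compound = np.array([1.0, -0.3, 0.9, 2.0, -0.9])  # signature of test compound

def similarity(a, b):
    """Pearson correlation between two expression signatures."""
    return float(np.corrcoef(a, b)[0, 1])

# The best-correlated reference compound suggests a candidate mechanism
ranked = sorted(reference_profiles.items(),
                key=lambda kv: similarity(new_compound, kv[1]), reverse=True)
for name, profile in ranked:
    print(f"{name}: r = {similarity(new_compound, profile):+.2f}")
```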
A second application of genomics to food safety is the control and detection of food-borne micro-organisms. Traditional means of controlling microbial spoilage and safety hazards in foods include freezing, blanching, sterilisation, curing and the use of preservatives. However, the developing consumer trend for ‘naturalness’, as indicated by the strong growth in sales of organic and chilled food products, has resulted in a move towards milder food preservation techniques. This raises new challenges for the food industry.
The first bacterial genome sequence was completed over a decade ago. Now genome sequences are available for many of the microbes that cause food-borne diseases, including Listeria monocytogenes, Yersinia enterocolitica, Escherichia coli, Clostridium botulinum A, Campylobacter jejuni and various Salmonella species (Abee et al. 2004). This information, combined with comparative genomic techniques, may help microbiologists to identify unique, strain-specific genetic signatures which can be used for detecting and typing microbial contamination. For example, a comparison of the genome data of ninety-seven strains of C. jejuni has recently been reported (Taboada et al. 2004). Gene divergence in C. jejuni is largely restricted to a small number of genomic ‘hotspots’. Many genes located at these variability loci are divergent across multiple strains. However, a large number are also unique to a single strain. Genes that display a high degree of intra-species variability represent good targets for genotype detection purposes. Genetic-based detection of microbes offers significant benefits over conventional detection methodology, particularly from a human health perspective. Firstly, multiple species and strains, including virulence markers, can be analysed simultaneously. Secondly, a much more rapid turnaround of test results can be achieved. Indeed, current PCR-based methods can produce results in 24–48 h, whilst advances in direct DNA detection techniques, i.e. methods that do not require target amplification by PCR, should facilitate the development of robust, portable systems, reducing analysis time from hours to minutes as well as enabling ‘in the field’ testing. For example, one such system, currently under development at Integrated-Nanotechnologies (www.integratednano.com), can directly detect the binding of a target DNA molecule to sensors on a microchip surface. Detection involves metallisation of the bound target, creating a highly conductive DNA wire which links the previously isolated sensors.
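The comparative step described above can be made concrete with a small sketch. Assuming a hypothetical gene presence/absence table derived from genome comparisons, the Python below pulls out genes confined to a single strain, the natural candidates for strain-specific detection or typing assays; the strain and gene names are invented for illustration.

```python
# A minimal sketch of the comparative-genomics step: given a gene
# presence/absence table across strains, find genes confined to a single
# strain, which make good candidates for strain-specific detection assays.
# Strain and gene names are hypothetical, not real C. jejuni data.

presence = {
    "gene_0001": {"strain_A", "strain_B", "strain_C"},  # conserved: poor target
    "gene_0912": {"strain_B"},                          # unique to strain_B
    "gene_1347": {"strain_A", "strain_C"},              # shared: ambiguous
    "gene_2050": {"strain_C"},                          # unique to strain_C
}

def strain_specific_genes(presence_table):
    """Map each strain to the genes found in that strain and no other."""
    targets = {}
    for gene, strains in presence_table.items():
        if len(strains) == 1:
            (strain,) = strains
            targets.setdefault(strain, []).append(gene)
    return targets

for strain, genes in strain_specific_genes(presence).items():
    print(f"{strain}: candidate detection targets {genes}")
```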
Genomic technologies can also help scientists to derive a better understanding of the life cycles of bacteria. Defining the mode of action of food-borne bacteria and the mechanisms that confer ‘stress resistance’ should enable more rational design of food preservation techniques. In addition, this information can be used to pinpoint areas of the food chain that are most susceptible to microbial contamination.
Quality assurance
DNA identification is now being applied within the food industry as a means of authenticating plants, animals and packaged food products.
One application of genetic analysis is the authentication and control of conventional animal identification systems. National disease monitoring and eradication programmes depend heavily on conventional animal identification, usually with ear tags. Protecting the integrity of these programmes is vital to their success. However, conventional methods are open to fraudulent practices such as tag swapping and may be further compromised by animal theft and smuggling. In contrast, DNA is largely unalterable and is an integral part of the animal. These properties not only enable the verification of live animals but also allow complete traceability of foodstuffs throughout the length of the food chain.
A second application is the molecular authentication of food ingredients and packaged food products. Consumers rely heavily on food labelling to guide product choice, particularly if the food has been processed, removing the ability to distinguish one ingredient from another. As food choice often reflects important personal beliefs, as well as health concerns, it is essential that food labelling is honest and accurate. Authentication of foodstuffs may be carried out by manufacturers as part of their quality assurance processes, either to check the provenance of supplied ingredients or to detect cross-contamination during production. It may also be used by food standards agencies as a means of detecting ‘food fraud’ and prosecuting fraudsters. Regional and traditional ingredients that have particular consumer appeal can attract a premium price. For example, traditional Basmati rice, which is known for its superior aroma and grain quality, commands a higher market price than cross-bred Basmati and non-Basmati rice. Consequently certain foodstuffs may be prey to illegal practices, including adulteration or even complete substitution of the premium ingredient with a less costly alternative. Indeed, substitution of tuna with inferior ‘bonito’ fish species has been reported in the canned fish industry and adulteration of hard wheat (Triticum durum) derived flour with cheaper common wheat (Triticum aestivum) flour is known to be a problem in the pasta industry (Woolfe & Primrose, 2004).
Checking the legitimacy of ingredients can be difficult for food manufacturers, particularly when key appearance characteristics, such as the head, skin and fins in the case of fish, are removed. However, genetic analysis appears to offer a promising solution. Not only is DNA a highly robust molecule, capable of withstanding the rigours of food processing (e.g. sterilisation at temperatures of >100°C), it is also unique to each species and present in all parts of the plant or animal. In the UK, the government-funded Central Science Laboratory (www.csl.gov.uk) has recently developed a microsatellite-based technique to distinguish between different varieties of Basmati and long grain rice (Woolfe & Primrose, 2004). By analysing a range of repetitive elements the test is able to identify the presence of just 5 % non-Basmati grain in a sample. Other researchers have also identified genetic markers that can distinguish traditional, cross-bred and non-Basmati rice varieties (Jain et al. 2004). Further examples of this type of application include genetic tests, developed in Japan, that can distinguish premium Koshihikari rice from cheaper varieties and genuine Tochiotome strawberries from inappropriately labelled Korean fruit (Williams, 2005). In addition, a high density DNA chip, FoodExpert-ID® (bioMérieux, Marcy l'Etoile, France), for detecting multiple animal species in either packaged food products or animal feed has been developed in Europe.
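To illustrate how such a test's readout might be interpreted, here is a minimal Python sketch that estimates the adulteration level of a rice sample from per-grain marker calls and flags consignments above a threshold. The allele labels, sample and the use of 5 % as a decision threshold are illustrative assumptions rather than the published CSL protocol.

```python
# A minimal sketch of microsatellite-based adulteration screening, in the
# spirit of the Basmati test described above. Allele labels and the 5 %
# decision threshold are illustrative assumptions, not the real protocol.

# Observed microsatellite alleles for grains sampled from a consignment
# ("B" = allele diagnostic of traditional Basmati, "X" = non-Basmati allele)
sampled_grains = ["B"] * 92 + ["X"] * 8  # hypothetical sample of 100 grains

def adulteration_fraction(alleles, non_authentic="X"):
    """Estimate the proportion of non-authentic grains in the sample."""
    return alleles.count(non_authentic) / len(alleles)

fraction = adulteration_fraction(sampled_grains)
print(f"Estimated non-Basmati content: {fraction:.1%}")
if fraction >= 0.05:  # assumed reporting threshold
    print("Sample exceeds threshold: flag consignment for investigation")
```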
Nutritional genomics and human health
For thousands of years it has been recognised that nutrition plays a crucial role in the development of disease. Conversely, diet can play an equally important role in disease management. A classic example is scurvy, a devastating health condition that is caused by vitamin C deficiency but which is readily addressable, even in individuals with advanced disease, with vitamin C supplementation. Many disorders caused by vitamin and mineral deficiency have largely been eradicated in the developed world through fortification of common foodstuffs. For example, salt has been fortified with iodine to prevent goitre and vitamins A and D added to milk to prevent rickets. In response to this success, attention has now turned to the role played by diet and lifestyle in the development and management of age-related chronic diseases, particularly diabetes, CVD and cancer.
Globally the prevalence of diabetes is increasing, with conservative estimates predicting that the number of people with diabetes worldwide will rise above 350 million by 2030 (Wild et al. 2004). This rising disease burden is due primarily to population growth, ageing, urbanisation and the increasing prevalence of obesity. Recent studies have shown how powerful diet and lifestyle modification can be for preventing the development of type 2 diabetes, particularly in high-risk individuals with impaired glucose tolerance (Tuomilehto et al. 2001; Knowler et al. 2002; Li et al. 2002). Indeed, in the NIH-sponsored Diabetes Prevention Programme not only was diet and lifestyle modification shown to be a successful strategy, it was also shown to be more effective than pharmacological treatment and estimated to be economically more attractive (Herman et al. 2005). Findings from the various diabetes prevention trials have helped to heighten awareness of the connection between nutrition and disease. Furthermore, they have stimulated considerable interest in functional foods and nutrition-based health management, two areas in which genomic technologies are anticipated to have substantial impact.
Functional foods
The daily consumption of margarine fortified with plant sterols has been shown to significantly reduce serum cholesterol levels compared with consumption of a comparable unfortified spread (Hendriks et al. 1999). Cholesterol-lowering spreads fall into a relatively new product category known as functional foods. These are food products that have, or claim to have, a specific health-promoting or enhancing effect over and above their nutritional content. Products may be focused on disease prevention (e.g. phytosterol-containing products for cholesterol lowering) or on enhancing daily health and wellbeing (e.g. probiotic products for digestive health).
Understanding how foods and food components modulate health is a core technology requirement for the development of functional foods. Epidemiological studies have repeatedly shown associations between food intake and the incidence and severity of disease. However, identifying the bioactive components of foodstuffs and defining their mode of action is challenging. Not only are diets highly complex, consisting of many separate food items, but each food item itself is a complex mix of bioactive components. To help understand this complexity nutritional scientists are starting to turn to the genomic tool box. For example, high density DNA and protein arrays can be used to delineate changes in gene and protein expression induced by whole diets, dietary constituents or individual phytochemicals. Comparison of the molecular changes post-intervention may directly help to identify bioactive components. More importantly, however, it can help nutritional researchers to elucidate diet-responsive genes and pathways, providing molecular targets for the development of in vitro screens, which in turn can be used to define the bioactive components of diets and individual foodstuffs. To date, transcriptomics has been more readily embraced by the nutrition research community than proteomics. However, as new and improved proteomic techniques emerge this will undoubtedly change. The ability to integrate gene and protein expression data, as well as to map post-translational modifications and even amino acid substitutions to particular cellular functions, offers a powerful approach for understanding the molecular basis of nutrition. Indeed, the application of proteomic analysis to cell and animal models clearly illustrates the potential of this technique (Fuchs et al. 2005).
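As a minimal sketch of the array-analysis step described above, the Python below compares log-expression values between control and intervention samples and flags genes passing simple fold-change and significance thresholds. The gene names, expression values and cut-offs are hypothetical; real studies involve thousands of genes and formal multiple-testing correction.

```python
# A minimal sketch of array-based differential expression: compare gene
# expression between control and intervention samples to flag diet-responsive
# genes, using a Welch t-test on log2 expression values. Hypothetical data.

from statistics import mean
from scipy import stats  # SciPy assumed available

# log2 expression values per gene: (control replicates, intervention replicates)
expression = {
    "GENE_A": ([5.1, 5.3, 4.9, 5.2], [7.8, 8.1, 7.6, 8.0]),   # induced
    "GENE_B": ([6.0, 6.2, 5.9, 6.1], [6.1, 5.9, 6.0, 6.2]),   # unchanged
    "GENE_C": ([8.4, 8.2, 8.5, 8.3], [6.3, 6.1, 6.5, 6.2]),   # repressed
}

FOLD_CHANGE_CUTOFF = 1.0   # |log2 fold change| >= 1, i.e. two-fold
P_CUTOFF = 0.01            # naive threshold; real studies correct for multiplicity

for gene, (control, treated) in expression.items():
    log2_fc = mean(treated) - mean(control)
    _, p = stats.ttest_ind(control, treated, equal_var=False)
    if abs(log2_fc) >= FOLD_CHANGE_CUTOFF and p < P_CUTOFF:
        direction = "induced" if log2_fc > 0 else "repressed"
        print(f"{gene}: {direction} (log2 FC {log2_fc:+.2f}, p = {p:.4f})")
```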
A second important aspect of functional food development is the availability of appropriate biomarkers for evaluating efficacy. This is particularly important if the goal is to prevent rather than treat health conditions. Indeed, in the case of plant stanol/sterol-based functional foods the ultimate goal is to reduce levels of morbidity and mortality by improving cardiovascular health. However, the focus of efficacy studies is the measurement of serum levels of LDL cholesterol, an accepted CVD risk factor. Genomic technologies can be used to characterise the development of disease states in order to identify novel biomarkers. Alternatively, they can be used to map the molecular and physiological response to interventions in order to identify efficacy markers. In particular, metabolomics and metabonomics, emerging disciplines that aim to globally measure low-molecular-weight molecules in biological systems, are well placed to impact on biomarker discovery and to explore the complex relationship between nutrition, metabolism and disease (Whitfield et al. 2004). For example, 1H-NMR-based metabonomics has been successfully applied to the diagnosis of coronary artery disease, with analysis of the spectral data providing information on both the presence and severity of disease (Brindle et al. 2002), and has also been used to explore the relationship between serum metabolite profiles and hypertension (Brindle et al. 2003). More recently, researchers have started to apply metabonomics to human nutritional research; examples to date include the use of 1H-NMR-based metabonomics to characterise the metabolite changes in plasma and urine in response to an isoflavone dietary intervention (Solanky et al. 2003, 2005) and the characterisation of changes in urinary metabolites in response to daily chamomile tea ingestion (Wang et al. 2005).
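A toy example may help to show what the pattern-recognition side of such studies looks like. The Python sketch below simulates binned 1H-NMR spectra for control and intervention groups and uses principal component analysis to expose a diet-related shift in the metabolite profile; the data are simulated, and a real analysis would typically use many more bins and supervised methods such as PLS-DA.

```python
# A minimal sketch of the pattern-recognition step used in 1H-NMR metabonomics:
# project binned spectra into a low-dimensional space (here PCA) and look for
# separation between intervention groups. All spectra are simulated.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
N_BINS = 50  # integrated intensity per chemical-shift bin

# Simulated binned spectra: 10 control and 10 post-intervention subjects,
# with the intervention shifting intensity in a handful of bins
control = rng.normal(1.0, 0.05, size=(10, N_BINS))
treated = rng.normal(1.0, 0.05, size=(10, N_BINS))
treated[:, [3, 17, 28]] += 0.4  # metabolite signals altered by the diet

spectra = np.vstack([control, treated])
scores = PCA(n_components=2).fit_transform(spectra)

# Group separation along the first principal component suggests a
# diet-related shift in the metabolite profile
print("Mean PC1, control: %+.2f" % scores[:10, 0].mean())
print("Mean PC1, treated: %+.2f" % scores[10:, 0].mean())
```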
Nutrigenetic testing and personalised nutrition
Recent advances in genomic and informatic technologies have resulted in a shift away from traditional epidemiology (understanding how diet, lifestyle and societal factors influence the causes, distribution and control of disease) towards molecular epidemiology (understanding the interactive nature of gene, diet and environmental factors and how these interactions may impact on health outcomes). Although 99·9 % of human DNA sequences are identical, the 0·1 % difference between any two individuals has profound biological significance. Not only do people look different, but they also react differently. For example, work conducted by Schaefer and colleagues showed that there is considerable variability in plasma lipid response to a standard cholesterol-lowering diet (Schaefer et al. 1997), observed changes in LDL cholesterol level ranging from +3 % to −55 % in men and from +13 % to −39 % in women. This differential response is, in part, the result of genetic variation.
Understanding of the full complexity of genetic factors that underpin such observations is currently limited. However, a number of examples of single gene–diet interactions have been reported, resulting in the development of nutrigenetic testing as a new paradigm for health management. Nutrigenetic testing focuses on testing for DNA polymorphisms which are known to influence phenotypic responses to diet. The purpose of testing is to use this genotype information to provide more personalised advice about nutrition and health, the primary goal being to use an individual's genetic information to predict their future health susceptibilities and to guide selection of ‘the best’ preventative action. A well studied example of a diet–gene interaction which illustrates this concept is the interaction between the MTHFR gene, dietary folate and plasma homocysteine levels. A C→T substitution at position 677 in the MTHFR gene results in production of a partially defective form (about 35 % of normal enzyme activity) of methylenetetrahydrofolate reductase (MTHFR), an enzyme which is involved in regulating folate metabolism and lowering plasma levels of the amino acid homocysteine. Consistent with this, the TT genotype is associated with elevated plasma levels of homocysteine when folate status is low (Saw et al. 2001), implying that individuals who possess the MTHFR TT genotype may require higher levels of dietary folate to prevent the development of hyperhomocysteinemia. Indeed, dietary intervention with folic acid is known to reduce homocysteine levels and individuals with the MTHFR TT genotype appear to be particularly responsive to the intervention (Miyaki et al. 2005). Findings from observational studies have implicated hyperhomocysteinemia as a modifiable risk factor for CHD, stroke and dementia and indicate that individuals carrying the MTHFR TT genotype have an elevated risk of developing CVD (Strain et al. 2004; Clarke, 2005; Troen & Rosenberg, 2006). As such, nutrigenetic testing for the MTHFR C677T polymorphism would seem to offer a route for identifying individuals with particular health susceptibilities that can be addressed through dietary means. In contradiction to this, recent findings from a number of randomised controlled trials suggest that homocysteine lowering, through dietary supplementation with B vitamins, does not reduce CVD risk or improve cognitive performance (Stott et al. 2005; Bonaa et al. 2006; Lonn et al. 2006; McMahon et al. 2006). This has raised questions over the causal role of homocysteine in disease pathogenesis and led to the suggestion that elevated plasma homocysteine levels may represent a risk marker of disease rather than a risk factor (Seshadri, 2006). A further concern is the usefulness of testing for a single genetic marker in the context of a condition, in this case CVD, which is known to be highly polygenic and to have multiple environmental risk factors.
In a recent meta-analysis the difference in effect size between MTHFR genotype groups for coronary artery disease risk, though significant, was relatively small, reflecting the contribution of multiple genetic factors to the final phenotype (Lewis et al. 2005). Even if consideration is limited to the MTHFR gene, dietary folate and plasma homocysteine levels, gene–gene interactions may still modify the end phenotype. For example, a functional interaction between two MTHFR polymorphisms (C677T and A1298C) has recently been reported (Ulvik et al. 2006). Thus genetic testing of single gene variants may provide additional information about disease aetiology and the role of diet. However, the predictive value of such tests will be highly limited. Indeed, if nutrigenetic testing is to contribute significantly to improvements in health management then research needs to shift away from single gene–diet interactions and focus on the interplay between multiple genetic and environmental factors.
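To make the notion of a gene–diet interaction more tangible, the following Python sketch fits an ordinary least squares model in which plasma homocysteine depends on MTHFR 677 genotype, folate status and their product term. The data are simulated to mimic the reported pattern (the TT genotype raising homocysteine mainly when folate is low); all coefficients are illustrative assumptions, not published estimates.

```python
# A minimal sketch of modelling a gene-diet interaction: plasma homocysteine
# as a function of MTHFR 677 genotype, folate status and their interaction,
# via ordinary least squares. The data are simulated; the effect sizes are
# illustrative, not real estimates.

import numpy as np

rng = np.random.default_rng(1)
n = 200
tt = rng.integers(0, 2, n)            # 1 = TT genotype, 0 = CC/CT
low_folate = rng.integers(0, 2, n)    # 1 = low folate status

# Simulated homocysteine (umol/l): the interaction term drives the TT effect
hcy = 9 + 1.0 * tt + 2.0 * low_folate + 4.0 * tt * low_folate \
      + rng.normal(0, 1.5, n)

# Design matrix: intercept, genotype, diet, genotype x diet interaction
X = np.column_stack([np.ones(n), tt, low_folate, tt * low_folate])
beta, *_ = np.linalg.lstsq(X, hcy, rcond=None)

for name, b in zip(["intercept", "TT", "low folate", "TT x low folate"], beta):
    print(f"{name:16s} {b:+.2f}")
# A large interaction coefficient indicates the genotype effect depends on diet
```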
An alternative and perhaps more attractive early application of nutrigenetic testing is the diagnosis of food intolerances, particularly monogenic conditions such as lactose intolerance. Intestinal lactase is essential for the digestion of lactose, a carbohydrate found in milk and other dairy products. In most mammals, including man, lactase activity declines after weaning. This maturational decline renders most of the world's adult population intolerant to lactose-containing food products. However, a minority of adults, primarily Northern Europeans or those with Northern European ancestry, display adult lactase persistence. Recently, genetic variation in the lactase gene, a single nucleotide polymorphism (C/T) at position −13 910, was shown to be strongly associated with lactase persistence/non-persistence (Enattah et al. 2002). The CC genotype completely associates with biochemically verified lactase non-persistence. Furthermore, molecular epidemiology studies have shown that the frequency of this genotype is in good agreement with the prevalence data for lactose intolerance in >70 ethnically diverse populations (Jarvela, 2005). Lactose intolerance can have a major impact on health and vitality. Although primarily associated with gastric upset, lactose consumption by a lactose-intolerant individual may also result in a range of systemic symptoms, including headache, severe tiredness, muscle and joint pain and allergy (Matthews et al. 2005). It has also been associated with a reduction in peak bone mass and increased susceptibility to osteoporosis (Sibley, 2004). Genetic-based diagnosis of lactose intolerance has significant advantages over the conventional lactose tolerance test. Firstly, it is less costly and labour intensive. Secondly, and more importantly, it avoids the symptoms induced by lactose challenge, which can be debilitating for up to 3 d for some individuals (Sibley, 2004).
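As a hedged sketch of how such a genotype test might be applied and its clinical validity summarised, the Python below classifies individuals as lactase non-persistent when they carry the CC genotype at position −13 910 and computes the test's sensitivity and specificity against biochemically verified status. The cohort data are invented for illustration; only the CC → non-persistence rule follows the association described above.

```python
# A minimal sketch of genotype-based diagnosis of lactase non-persistence and
# how its clinical validity might be summarised. Genotype calls and reference
# diagnoses are hypothetical; the CC -> non-persistent rule follows the
# association described above (Enattah et al. 2002).

# (genotype at position -13910, biochemically verified lactase non-persistence)
cohort = [
    ("CC", True), ("CC", True), ("CT", False), ("TT", False),
    ("CC", True), ("CT", False), ("CC", False), ("TT", False),
]

def predict_non_persistent(genotype):
    """CC genotype is taken to indicate lactase non-persistence."""
    return genotype == "CC"

tp = sum(1 for g, d in cohort if predict_non_persistent(g) and d)
fp = sum(1 for g, d in cohort if predict_non_persistent(g) and not d)
fn = sum(1 for g, d in cohort if not predict_non_persistent(g) and d)
tn = sum(1 for g, d in cohort if not predict_non_persistent(g) and not d)

print(f"Sensitivity: {tp / (tp + fn):.2f}")  # true cases correctly flagged
print(f"Specificity: {tn / (tn + fp):.2f}")  # unaffected correctly cleared
```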
Although the science underpinning nutrigenetic testing is still immature the business outlook is viewed as encouraging, with several key driving forces, including science, technology and consumer appeal, converging in ways that should encourage growth of this area. Indeed, consumer research studies indicate that American consumers are already receptive to having their diet tailored to their genetic make-up. In response to this emerging ‘consumer pull’ a number of early-stage companies (e.g. Sciona, Interleukin Genetics), focused on developing nutrigenetic tests for the consumer market, have started to appear. In addition, several food ingredient companies have begun investing in this area and a number of packaged food companies are actively engaged with European Union-funded research initiatives, including LIPGENE (Nugent, 2005) and DIOGenes (Saris & Harper, 2005). However, this burgeoning commercial excitement needs to be tempered by a certain degree of caution.
Even though evidence is rapidly accumulating to support the concept of personalised nutrition, reports on the clinical utility and validity of specific nutrigenetic markers are still rare. Studies investigating the sensitivity and specificity of markers linked to monogenic conditions, such as lactose intolerance, have started to emerge and do show promising clinical value (Rasinpera et al. 2004; Hogenauer et al. 2005; Matthews et al. 2005). However, for complex polygenic traits, such as CVD or diabetes, generating such evidence is a much more challenging proposition. In the case of CVD not only is disease trajectory influenced by multiple genetic and lifestyle factors but it is also characterised by multiple biological risk factors (e.g. blood pressure, weight, serum cholesterol, LDL cholesterol, HDL cholesterol, TAG, C-reactive protein, etc.) and multiple disease outcome measures (e.g. all-cause mortality, fatal CHD, non-fatal myocardial infarction, stroke, angina, coronary revascularisation, congestive heart failure, etc.). Disentangling this degree of complexity will not only take time but will also require significant research investment and the development of large interdisciplinary research consortia (Kaput et al. 2005). In addition to these technical issues, consensus also needs to be reached on a large number of ethical and regulatory issues (Chadwick, 2004). Should nutrigenetic testing be delivered directly to consumers or through healthcare professionals? Who should be tested? Who should have access to test information? How should individual privacy be protected? How should genetic discrimination be prevented? The way in which these and other issues are resolved will have a significant impact on the nutrigenetic business environment. Finally, consideration needs to be given to the way in which the consumer benefits of nutrigenetic science are communicated. The technology backlash experienced by plant biotechnologists over the introduction of GM foods and crops serves as an important lesson for the food industry, emphasising the need for early and proactive public engagement, and the development of communication strategies that clearly articulate the benefits, both for the individual consumer and the population at large, as well as the risks of nutrigenetic technology.
Conclusions
Nutrition plays a crucial role in health as well as disease. Despite its youth, nutritional genomics is already influencing multiple aspects of the food chain (agriculture, food production, food safety and quality assurance) and is starting to be leveraged more widely to deliver economic benefits and to enhance aspects of human nutrition and health. Over time nutritional genomics should accelerate the development of functional food products as well as extend knowledge and understanding of gene–diet interactions, a key building block for the future development of personalised nutrition.