martes, 26 de septiembre de 2017

NCI-funded TMIST study compares 2-D and 3-D mammography for finding breast cancers | National Institutes of Health (NIH)


National Institutes of Health (NIH) - Turning Discovery into Health



NCI-funded TMIST study compares 2-D and 3-D mammography for finding breast cancers

Study will help women make informed decisions about screening tests in the future.
The Tomosynthesis Mammographic Imaging Screening Trial (TMIST), the first randomized trial to compare two types of digital mammography for breast cancer screening, is now open for enrollment. The study was developed by the ECOG-ACRIN Cancer Research Group (ECOG-ACRIN) and the National Cancer Institute (NCI), part of the National Institutes of Health. ECOG-ACRIN is leading the trial.
TMIST researchers are enrolling healthy women ages 45 to 74 who are already planning to get routine mammograms. By taking part in TMIST, the 165,000 planned participants will provide critical information that will help researchers learn how to most effectively screen women for breast cancer and help women make informed decisions about the screening tests in the future. 
“Nearly 50 million screening mammograms occur each year in the United States, yet it has been decades since a large-scale randomized trial of mammography has been done,” said Worta McCaskill-Stevens, M.D., director of the NCI Community Oncology Research Program (NCORP), the NCI program supporting the trial. “The evolution of mammography technology provides us with an opportunity to fill in the gaps in our knowledge about two available breast cancer screening tests.”
TMIST is comparing two types of digital mammography approved by the Food and Drug Administration: tomosynthesis (known as three-dimensional, or 3-D) and conventional (two-dimensional, or 2-D). Although 3-D mammography, being the newer technology, is likely to detect more findings that require follow-up, it is also likely to lead to more procedures and treatments. It is not known whether this newer mammography technology reduces a woman’s risk of developing a life-threatening (advanced) cancer compared with 2-D mammography. The TMIST trial aims to find out.
“We need to determine if 3-D mammography is better than 2-D at finding the sort of breast cancers that are most likely to spread and kill women,” said ECOG-ACRIN study chair Etta D. Pisano, M.D., vice chair of research in the Department of Radiology at Beth Israel Deaconess Medical Center and professor in residence of radiology at Harvard Medical School, Boston. “If a newer screening technology does not reduce the numbers of advanced, life-threatening cancers, then are we really improving screening for breast cancer?”
TMIST researchers are collecting data on the results of every mammogram, whether the imaging shows no signs of cancer, findings suspicious of cancer, or a breast cancer. Any medical follow-ups, such as more imaging or biopsies, are also being reported. TMIST researchers intend to follow all participants for breast cancer status, treatment, and outcomes from the time of randomization until the end of the study (at least 2025).
Based on the findings of earlier studies, researchers know that the vast majority of women in the study will not develop breast cancer. If a woman does receive a diagnosis of any kind of breast cancer while in the trial, she will receive treatment just as she would if she was not part of TMIST, while continuing to be part of the trial.
In addition to data from mammograms, the trial is building a biorepository for future research on genetic markers for breast cancer by asking all participants to voluntarily submit blood samples and swabs of cells from inside the mouth (buccal cells). This data could, in the future, help women and their doctors decide the best ways to screen for breast cancer by evaluating their individual risk factors for developing the disease. TMIST researchers are also analyzing tissue collected from women who have biopsies during the trial because of mammogram findings that require follow-up. This is to learn more about the biology of breast cancers detected through screening.
About 100 mammography clinics in the United States are planning to participate in the trial and are opening on a rolling basis over the next several months. Women are being told about the opportunity to enroll in the trial when they schedule a routine mammogram. Once enrolled, they will be assigned to either 2-D or 3-D mammography screening. Most women enrolled in the trial will be screened annually. Postmenopausal women with no high-risk factors will be screened every two years.
To ensure a diverse group of participants, sites are well represented both geographically and by the race/ethnicity of the women the sites serve. Several Canadian clinics are joining the trial, having already enrolled more than 3,000 women in a smaller lead-in study that is helping to inform TMIST.
To find TMIST information on social media: #TMIST4BC
About the ECOG-ACRIN Cancer Research Group (ECOG-ACRIN): ECOG-ACRIN is a membership-based scientific organization that designs and conducts cancer research involving adults who have or are at risk of developing cancer. Research personnel in nearly 1,200 member institutions are involved in Group science, which is organized into three programs: Cancer Control and Outcomes, Therapeutic Studies, and Biomarker Sciences. The Group’s primary funding is from the National Cancer Institute (NCI), including but not limited to a grant as a Research Base within NCORP, the NCI program supporting the TMIST trial. Follow ECOG-ACRIN on Twitter @eaonc or call 215-789-3631.
About the National Cancer Institute (NCI): NCI leads the National Cancer Program and NIH’s efforts to dramatically reduce the prevalence of cancer and improve the lives of cancer patients and their families, through research into prevention and cancer biology, the development of new interventions, and the training and mentoring of new researchers. For more information about cancer, please visit the NCI website or call NCI’s Contact Center (formerly known as the Cancer Information Service) at 1-800-4-CANCER (1-800-422-6237).
About the National Institutes of Health (NIH): NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit the NIH website.
NIH…Turning Discovery Into Health®

Reminder—Recovery, Prevention, & Hope: Live Stream Panel of National Experts on Opioids Equips Faith and Community Leaders

HHS Live Stream—National Recovery Month Event

Wednesday, September 27 | 1–2 p.m. Eastern Time
HHS will convene national leaders and experts to discuss the opioid epidemic and other addictions, and to help raise awareness, encourage compassion, reinforce the role of community and families in long-term recovery and prevention, and make a call to action. The webcast is being held as part of National Recovery Month.
The event, hosted by the HHS Partnership Center, will be live streamed on the Recovery Month Facebook page. Join us and other #PartnersinHope by posting on social media a selfie or group photo of you gathering to watch this event!
View the Live Stream

Wednesday, September 27 | 2–3 p.m. Eastern Time
You can also host a post-broadcast conversation in your community, working with local experts to discuss approaches that will foster healing for individuals and families, align regional efforts to renew wholeness in your community, and issue a call to action to encourage local communities to get involved to address this national emergency.
For more information or to request publications to assist with planning, please email

Myriad Genetics presents positive results of new hereditary cancer test at 36th annual conference of NSGC

Myriad Genetics, Inc., a leader in molecular diagnostics and personalized medicine, today announced that two important studies will be featured in podium presentations at the 36th annual conference of the National Society of Genetic Counselors (NSGC) in Columbus, OH.
"We look forward to presenting our pioneering research at NSGC this year and bringing forward new innovations that are changing how a woman's risk of breast cancer is being managed and improving the lives of even more people," said Johnathan Lancaster, M.D., Ph.D., chief medical officer, Myriad Genetics. "We are particularly excited to present the results of a large study of our new hereditary cancer test, called riskScore, that we believe will help genetic counselors and doctors improve care for their unaffected patients who test negative for a hereditary mutation. We're also looking forward to presenting data from a study that evaluated variant reclassification in 1.4 million patients tested for hereditary cancer risk assessment over a 10-year period."
More information about the company's presentations is available on the Myriad Genetics website. Follow Myriad on Twitter via @MyriadGenetics and stay informed about conference news and updates by using the hashtag #NSGC17.
myRisk® Hereditary Cancer with riskScore™ Podium Presentation
Title: Development and Validation of a Residual Risk Score to Predict Breast Cancer Risk in Unaffected Women Negative for Mutations on a Multi-Gene Hereditary Cancer Panel.
Presenter: Elisha Hughes, Ph.D.
Date: Saturday, Sept. 16, 2017, 11:30—11:45 a.m.
Location: Platform Presentation, #1131
This study evaluated 86 single nucleotide polymorphisms (SNPs) as breast cancer risk factors through the validation of a polygenic residual risk score in large, consecutive cohorts of more than 17,205 women of European ancestry who tested negative for mutations in known breast cancer susceptibility genes. The results show that the 86-SNP residual risk score was highly predictive of breast cancer risk in unaffected women of European ancestry with a family cancer history who tested negative for germline mutations in known breast cancer risk genes (p < 10^-50). The clinical implementation of a residual risk score for women at risk for hereditary breast cancer may offer significant potential for the management of high-risk, unaffected women who test negative for monogenic mutations in breast cancer-risk genes.
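The mechanics of a polygenic score of this kind can be sketched as a weighted sum of risk-allele counts. The SNP identifiers, weights, and genotype below are invented for illustration; the actual 86-SNP riskScore model and its weights are not described in this announcement.

```python
import math

# Per-SNP weights, typically log odds ratios from association studies.
# These values are illustrative, not the real model.
weights = {"rs0001": 0.10, "rs0002": -0.05, "rs0003": 0.08}

def polygenic_score(genotype: dict) -> float:
    """Sum of risk-allele dosage (0, 1, or 2 copies) times per-SNP weight."""
    return sum(weights[snp] * dose for snp, dose in genotype.items())

patient = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_score(patient)
relative_risk = math.exp(score)  # log-odds scale back to an odds ratio
print(round(score, 3), round(relative_risk, 3))  # 0.15 1.162
```

In practice such a score is combined with family history and personal factors (as the quote below describes) before being reported as a five-year or lifetime risk estimate.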
"As healthcare providers our goal is to help patients understand their risk of breast cancer so that they can personalize their medical care and live healthier lives. Women who have not yet developed cancer but have a family history of breast cancer should consider hereditary cancer testing with multi-gene panels," said Ora Gordon, M.D., medical director at the Center for Clinical Genetics & Genomics, Providence Health & Services Southern California. "Despite being at high familial risk for breast cancer, in reality, most patients will not carry a hereditary mutation in a breast cancer-risk gene, which doesn't mean they're risk free. For patients who test negative, there are other factors including SNPs, family history of cancer and personal factors that may increase the risk of breast cancer. The new riskScore test combines this data to provide patients with additional information about their individual risk for developing breast cancer over five years and over their lifetime, to confirm whether high-risk interventions are still needed despite negative single-gene testing results. It can also provide reassurance that routine surveillance is appropriate despite having a family history. This is a much needed and long awaited advance in the personalization of breast cancer risk."
myRisk® Hereditary Cancer Podium Presentation
Title: Variant Reclassification in a Clinical Cohort: A Decade of Experience.
Presenter: Nichole Brown, MS, CGC.
Date: Friday, Sept. 15, 2017, 3:15—3:30 p.m.
Location: Platform Presentation, #1394
This study assessed variant reclassification in 1.4 million patients tested for hereditary cancer risk assessment between 2006 and 2016. Of these, 96 percent were female, 52 percent were of European descent, and the median age was 49 years. Approximately 56 percent had a personal history of cancer at the time of testing. A total of 36,264 unique variants (mutations) were identified during testing and 293,496 total variants were reported. The results show that when a comprehensive classification approach is employed, variant reclassification is relatively common (~19 percent) in genetic testing for hereditary cancer risk.
"Accurate classification of genetic variants can significantly impact the clinical care of patients, which highlights the need for timely reclassification and notification. Our ultimate goal is to definitively classify all genetic mutations, which is why Myriad invests so heavily in variant classification research and in scientific collaborations and publications," said Lancaster.
"Importantly, Myriad's commitment to patients doesn't stop once we've given them a test result. We understand that science evolves and that's why we have a commitment to notify doctors when we learn new information that could affect patient care. We offer support to patients and their families that lasts a lifetime."

Shotgun Sequencing


Shotgun sequencing is the method Craig Venter's team at Celera Genomics used to sequence the human genome. The first method of DNA sequencing, the chain termination method or Sanger sequencing, is limited to a maximum read length of about 1,000 base pairs. Shotgun sequencing, by contrast, increases the total amount of DNA that can be sequenced; it is more of a strategy than a distinct method.
A shotgun approach was first used for early sequencing of small genomes like cauliflower mosaic virus. Later, shotgun methods were adapted (with the development of powerful computer algorithms) for sequencing and reassembling large genomes, most notably the human genome.
DNA sequencing result sheet. Image Credit: SINITAR / Shutterstock

The Sanger Method

Base pairs are the building blocks of DNA. The four bases are adenine, cytosine, guanine, and thymine (abbreviated as A, C, G, and T, respectively). Adenine pairs with thymine, and cytosine pairs with guanine, so a single-stranded chain of AGTTAC would pair with a complementary chain of TCAATG. Strands of these paired chains form the familiar double-helix pattern of DNA.
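The pairing rule can be expressed as a simple lookup, reproducing the AGTTAC → TCAATG example above:

```python
# Base-pairing rules: A pairs with T, and C pairs with G.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the base-paired complement of a DNA strand."""
    return "".join(PAIRS[base] for base in strand)

print(complement("AGTTAC"))  # TCAATG
```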
The Sanger method of sequencing is sufficient for reading the sequences of short chains, but is inadequate for longer sequences. The human genome has about three billion nucleotides, and many other genomes and sequences that are of interest to science are also too large for Sanger sequencing.

Shotgun Fragmentation of Sequences

In the late 1990s, Craig Venter adapted the shotgun approach to large genomes. In that method, the DNA is randomly broken into many small pieces, cloned into a bacterial host, and sequenced using chain termination. Multiple rounds of fragmentation and sequencing are carried out, creating overlapping sequences. Powerful computer algorithms are then used to reassemble the sequence.
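The reassembly step can be illustrated with a toy greedy overlap merger. This is a deliberate simplification under made-up example reads, not Celera's actual algorithm; production assemblers use far more sophisticated overlap-layout-consensus methods and handle sequencing errors.

```python
def overlap(a: str, b: str, min_len: int) -> int:
    """Length of the longest suffix of a that matches a prefix of b."""
    best = 0
    for k in range(min_len, min(len(a), len(b)) + 1):
        if a[-k:] == b[:k]:
            best = k
    return best

def greedy_assemble(reads: list, min_len: int = 3) -> str:
    """Repeatedly merge the two reads with the largest overlap."""
    reads = reads[:]
    while len(reads) > 1:
        best = (0, 0, 1)  # (overlap length, index i, index j)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    k = overlap(a, b, min_len)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        if k == 0:
            break  # no overlaps left; reads cannot be merged further
        merged = reads[i] + reads[j][k:]
        reads = [r for x, r in enumerate(reads) if x not in (i, j)] + [merged]
    return reads[0]

# Overlapping fragments of the sequence GGTAACTTCGAGC:
reads = ["GGTAACT", "AACTTCG", "TTCGAGC"]
print(greedy_assemble(reads))  # GGTAACTTCGAGC
```

The quadratic all-pairs overlap search here is why, at genome scale, shotgun assembly demanded the powerful computers and algorithms the text mentions.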
Venter first developed his shotgun sequencing method while working on the bacterial species Haemophilus influenzae at the National Institutes of Health (NIH) in the US. The project took four months, compared with the thirteen years researchers spent sequencing E. coli using Sanger sequencing and the ten years spent on yeast.
The alternative at the time was to create a low-resolution map of the genome first, and then perform a calculation of the minimum number of fragments needed to sequence the entire genome. The genome was then broken up randomly into fragments and the fragments cloned into bacterial hosts. Based on the map, the cloned fragments were assembled into a scaffold, or tiling path, that theoretically covers the entire sequence, and those fragments were sequenced.
Shotgun sequencing was a more direct alternative, but required a great deal more computing power, pushing the limits of processors available at the time.

Paired Ends

Sanger sequencing originally worked in one direction only, starting with the 5-prime end and working toward the 3-prime end. However, sequencing from both ends of the strand increases the read length and can correct for certain kinds of errors in the sequencing process. Shotgun sequencing usually makes use of a paired ends strategy.
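One way paired ends catch assembly errors can be sketched as a distance check: two reads from the ends of the same fragment should land roughly an insert size apart in the assembled sequence. The sequences, insert size, and tolerance below are illustrative values, not a standard protocol.

```python
def pair_is_consistent(assembly: str, left: str, right: str,
                       insert_size: int, tol: int) -> bool:
    """Check that a read pair lands at roughly the expected distance apart."""
    i = assembly.find(left)
    j = assembly.find(right)
    if i < 0 or j < 0:
        return False  # one of the reads did not map to the assembly
    span = (j + len(right)) - i  # outer distance spanned by the pair
    return abs(span - insert_size) <= tol

genome = "ACGTACGGTTCAGGCTAACGTTAGC"  # 25 bp toy "assembly"
# A pair read from the two ends of a 25 bp fragment:
print(pair_is_consistent(genome, "ACGTAC", "GTTAGC", 25, 2))  # True
```

A pair that maps much closer or farther apart than the known insert size flags a likely misassembly, which is the error-correcting effect described above.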

Advantages and Disadvantages of Shotgun Sequencing

Shotgun sequencing had a number of important advantages over previous methods:
  • Faster because the mapping process was eliminated
  • Uses less DNA than other methods
  • Less expensive than approaches requiring a map
Some disadvantages of shotgun sequencing include:
  • Requires computer processing power beyond what an ordinary laboratory would possess
  • Can introduce errors in the assembly process
  • Requires a reference genome
  • May not be able to assemble repetitive sequences
Reviewed by: Dr Tomislav Meštrović, MD, PhD



Last Updated: Sep 26, 2017

Analytical Science in Precision Medicine


Thought Leaders: Professor Jeremy K. Nicholson, Head of the Department of Surgery and Cancer and Director of the MRC-NIHR National Phenome Centre, Faculty of Medicine
An interview with Professor Jeremy K. Nicholson, conducted by April Cashin-Garbutt, MA (Cantab)

It was recently announced that you will be presenting the upcoming Wallace H. Coulter Lecture at Pittcon 2018. What will be the main focus of your lecture?

Pittcon is an analytical conference, so naturally my talk will be about analytical chemistry and how analytical chemistry will become increasingly important in delivering healthcare solutions, not only for rich people, but also, hopefully, for poor people across the world.
The challenges that we face in 21st century medicine will be illustrated, particularly the diversity in terms of the emergent diseases that are occurring and the genetic and environmental drivers that change disease patterns and prevalence in whole populations. For instance, the obesity epidemic is a driver for a whole range of things including cancer, diabetes and Alzheimer's disease.
The important factor here is that we generally think about solving the problems we have currently and what technology or chemistry we need now. The world is changing very quickly, so we also need to address problems that will be coming over the horizon, or are already on the horizon, that we are going to face seriously in the next 20, 30, 40 years.
Those are problems such as anti-microbial resistance; the convolution of global warming with healthcare, which changes the way that parasites and infectious diseases operate on a global scale and the fact that our populations in western countries are living for much longer, which gives us a whole range of diseases that were not so common in the population before.
The idea is that we have to study chemistry and develop technology that is future-proofed, because, in the future, we won't have time to generate the technology. We have got a lot of work to do very quickly. It is a big challenge and everybody needs to pull together.

What are the main challenges of 21st century healthcare?

There are what we call emergent threats, which are partly due to there being a lot more people living on the planet than ever before and therefore more things that can go wrong with more people.
Emergent threats include things such as diseases that may have been previously isolated in the tropics, but now are not because of modern transportation. There are things like Ebola and many others. We are seeing diseases in the West that we haven't faced before, but are going to see more and more.
There are also diseases that have emerged due to changes in the biology of the organisms that live on or within us. Over the last 30 or 40 years, about 30 completely new emergent diseases have popped up and we should expect to see more of those.
There is also the incredible interaction of the human body with the microbes that live inside us. We are supraorganisms with symbiotic microbes and we have changed our own chemistry, physiology and also our disease risk. We have realized that we are not alone. We are not just looking at our own genetics, but at our own biology.
We have to look at ourselves as a complex ecosystem that can get a disease as a systemic disease, a disease that affects the whole ecology of the body, which seems to be increasingly the case for things like gut cancers, liver cancers and possibly a whole range of what we would normally call immunologically-related diseases.
There's a lot more than we thought that has to be sorted out analytically. We have to be able to measure it all to understand it, which is where analytical chemistry comes in, in all its various forms.
We are trying to apply analytical technologies to define human populations, human variability and how that maps onto ethnicity, diet, the environment and how all those things combine to create an individual’s disease. We also want to understand it at the population level.
The other thing I will emphasize is how personalized healthcare and public healthcare are just flip sides of the same coin. Populations are made of individuals. We want to improve the therapies and the diagnostics for individuals, but we also want to prevent disease in those individuals, which means also understanding the chemistry and biochemistry of populations.

How can analytical science in precision medicine help overcome some of these challenges?

When you think about all aspects of human biology that we measure, they are all based on analytical chemistry. Even genetics and genomics are carried out on a DNA analyzer, which is an analytical device that has sensitivity, reproducibility, reliability and all the other things that we normally think of as analytical chemists.
Analytical chemistry underpins every part of our understanding of biological function. Proteomics, for example, is based on a variety of different technologies for measuring proteins. The different technologies put different bricks in the wall of our understanding of total biology.
What we understand much less, is how those different blocks and units work together. We understand it quite well from the point-of-view of individual cells; how a cell works in terms of its DNA, RNA, protein production, metabolism, transport and so on. A huge amount is known about how the basic engine works.
When you start to put lots of different sorts of cells together, however, we understand much less. We know much less about how cells communicate locally and at long-range and how chemical signaling enables cells to talk to each other. When we start thinking about humans as supra-organisms, we also need to understand how our cells and bacterial cells talk to each other.
One of the great challenges is not only getting the analytical chemistry right for measuring the individual parts, but having the appropriate informatics to link the analytical technologies in ways that give information that is what we would call “clinically actionable”.
It is very easy to measure something in a patient, by taking blood and measuring a chemical in it, for example. However, to understand what that measurement really means, it has to be placed in a framework of knowledge that allows a doctor to decide what to do next based on that piece of information. In almost all analytical technologies and despite all the fancy new things, whichever “omics” you are interested in, the ability to inform a doctor to the point that they can do something is lacking.
One of the greatest challenges we face is not just using the new technologies, which have got to be reliable, reproducible, rugged and all the other things you need when you're making decisions about humans. It is also about visualizing data in ways that are good for doctors, biologists and epidemiologists to understand, so that they can help provide advice about healthcare policy in the future.
Aside from the challenges faced, one of the points that I would like to make in my lecture, is the deep thinking that needs to surround any technological development for it to be useful in the real world. That's ultimately where all our fates lie.

Will you be outlining any specific examples or case studies in your talk at Pittcon?

I will describe the challenges and big issues in the first five or ten minutes and then we'll start to look at what creates complexity.  I will give the supra-organism examples and since I am mainly a metabolic scientist, I will show how, particularly from a metabolic point of view, microorganisms influence biochemical phenotypes in humans and how those relate to things like obesity risk, cancer risk and so on.
I will deliver it in a way that provides a more general understanding of the complexity and how that affects us as human beings. Then I will talk about more specific examples about how to ensure a technology makes a difference to the patient. The first part covers how to study human biology, why biology is complex, and what you need to study about it analytically; then I will give some examples that work on a clinical timescale.
If you are thinking about understanding population health and population biology, which is what epidemiologists do, it does not really matter how long it takes to get the answers, so long as they get the right answer, because not one individual patient is dependent on their thinking. What you are trying to do, is understand where diseases come from so that in the future, you can make a better world by actioning the knowledge.
When somebody is ill, there is a timescale. There are different timescales, with each presenting their own different analytical challenges. If somebody has gut cancer, for example, they are going to need hospitalizing and to have some diagnostics performed. Physicians will decide exactly what sort of cancer it is, if the physicians understand it, and then there will either be surgery or some sort of chemotherapy. There will be a physical or chemical intervention of some sort.
Therefore, any analytical chemistry performed has got to be done within the timescale of the decision-making that the doctors make. In this example, it would have to be done within a day or two and the answer has to be interpretable by a doctor in that timescale.
For instance, there is no point doing a huge genomic screen on somebody; it would take three months to analyze the data, the doctors would have moved on and the patient may well be dead. All analytical technology has got to fit the constraints and the timescales in which the physicians operate.
The most extreme example is surgery, which involves real-time decision-making. The surgeon will cut a bit, have a look, cut another bit and then, based on a whole range of different background information, knowledge and potentially spectroscopic information, they will make decisions about whether to cut or not to cut.
The iKnife technology developed by Professor Zoltán Takáts, which I had a part in at Imperial, connects a surgical diathermy knife to a mass spectrometer, which means the chemistry of the tissue can be read 3 times per second and based on an annotated knowledge data set, we can know exactly what the surgeon is cutting through. That is an example of an analytical technology that gives a surgeon unprecedented molecular knowledge in real time and it is a real game-changer.
Then there are other things, one of which is on an intermediate timescale. If somebody is in critical care, for example, then by definition, they are seriously ill. They change very quickly over minutes to hours, so any analytical technology that is informing on them has to be very, very fast and the output has to be very quickly readable and interpretable by a doctor.
I think this is also very interesting from an analytical chemistry point-of-view. It's not just about what you are measuring; the whole protocol you construct has to be fit for purpose within the timescale that is relevant to the medical decision-making, which is a huge challenge. A lot of the people, when they develop these technologies, do not necessarily think about it from the point-of-view of the doctor who is on the ward.
A few years ago, I used to give talks where I put forward an idea that I called the "Minute-Man challenge." The Minute Men were the Americans who wanted things ready within a minute, to fight off the British. There are Minute-Man missiles that the Americans made, which are ready to launch at any enemy within one minute of the go-code being given. The Minute-Man challenge for analytical chemistry is to get from a sample being provided to complete diagnosis within a minute or less.
Nothing exists that does that yet, but we are working on solutions. I thought NMR spectroscopy would be the most likely to provide that because it is a technology where you don't have to do anything to the sample; it is all based on radiophysics. Obviously, making samples and getting them ready for a machine takes time, so whatever the winner of the Minute-Man challenge is, it is going to be something that basically operates on the raw fluid or something incredibly close to that. It will almost certainly have to be directly injected into some sort of measuring device, so of course, in mass spectrometry, we start to think about direct injection mass spectrometry. There is a whole range of other atmospheric pressure methods such as desorption electrospray ionization and so on.
There is the technology invented by Professor Zoltán Takáts, which can also give you two- and three-dimensional tissue information. I am going to discuss the challenges in time, the challenges in actionability, as well as the challenge in the idea of providing complete diagnosis very rapidly, as something that is effectively a one-shot diagnosis.
Other things will come along; whether there will be one methodology that answers all possible biological or medical questions is, to me, extremely doubtful, but I think technologies are available now, probably mainly revolving around mass spectrometry, that will allow that sort of diagnostic fire power.
Also, if you are using a zero reagent and direct injection methodology, the cost comes down. The time and the cost go together, so, ideally, if you want to study large populations of people, you need to be able to test millions of samples at a relatively low cost. You want something that can measure 10,000 things at once in less than a minute and cost you a dollar. That would be the dream.
I think we might not be that far away from being able to do that, so what I will try to do in my talk is to juxtapose the big challenge ideas against the big analytical challenges and hopefully paint a picture that isn't entirely black.

Which analytical techniques have been most important to your work to date?

I am a spectroscopist by training and I am well-known for NMR spectroscopy, but over the last 15 or 20 years, I've been doing mass spectrometry just as much as NMR. We have 13 high-field NMR machines in my department and about 60 mass spectrometers, all analyzing metabolism, which is quite a collection. I never thought there would be a day when I had more mass spectrometers than NMR spectrometers, but there you go!
Historically, I also worked on X-ray spectroscopy, analytical electron microscopy and atomic spectroscopy, but it was NMR that really made my life come alive from the point of view of biology. When I started doing this as a post-doc, I realized that NMR could make the sort of measurements we are talking about - the Minute-Man type measurement - on a load of different things and extremely quickly, so I have been toying with that for over 30 years.
Certainly, when I first started working with NMR on body fluids in the early 1980s, people thought I was completely mad. High-field NMR machines were for putting pure chemicals in to get their structures out, and the idea of putting horse urine in an NMR machine would drive some of my colleagues completely crazy.
Aside from being an abnormal use of a highly advanced spectroscopic technology, people thought it would be too complex to work out what all the thousands of components are. In fact, it was complex, and we still haven't worked out what all of them are after 30 years and a thousand man-years of work in my research group. However, we have sorted out thousands of signals and what they mean biologically.
NMR is the most important technique for me personally because it made me the scientist that I am, but the other thing I love about NMR is the fact that it does not destroy anything. It is non-invasive and non-destructive. You can study living cells and look at molecular interactions, as well as just concentrations. The binding of small molecules to large molecules can be studied, and those interactions have diagnostic value as well, which, I think, is underappreciated in the metabolic community.
Most people think mass spec must be better than NMR because it is more sensitive, which it is for most things, but NMR provides a whole set of information that mass spec could never obtain. The two together are the ideal combination if you want to study the structure, dynamics and diversity of biomolecules.

How do you think advances in technology will impact the field?

Technology advances in all ways, all the time, and the advance in analytical technology is accelerating. There are things that we can do now in mass spec and NMR, for instance, that we would have thought impossible five or ten years ago. The drivers in analytical chemistry are always the improvements in sensitivity, specificity, reliability, accuracy, precision and reproducibility; but for clinical applications, and also for the very large-scale applications needed to study populations, it is robustness, reliability and reproducibility that are the most important things. The ability to harmonize data sets is also very important; irrespective of where you are working in the world, others should be able to access and interpret your data.
It's not just advances in individual pieces of instrumentation that are important; it's how you use them to create harmonizable pictures of biology, which also has an informatic component. It is not just the analytical chemistry or technology itself, but how you use the data.
Let's just compare NMR and mass spec. One of the things about NMR is that it uses radiophysics and is based on the ability to measure the exact frequencies of spin quantum transitions in atomic nuclei, frequencies that are characteristic of the particular molecular moiety being looked at.
One of the beauties of NMR is that if you take, for example, a 600 or 900 megahertz NMR spectrum now, the same analysis of it would still be possible in a thousand or a million years, because it's a physical statement about the chemical properties of that fluid that will never change. Even if NMR spectrometers were to get more sensitive, the basic structures of the data would be identical in a thousand or a million years.
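The permanence described above follows from the physics: a spectrometer's quoted frequency is just the Larmor precession frequency of ¹H at the magnet's field strength, set by an unchanging nuclear constant. A minimal sketch, assuming the published gyromagnetic ratio for ¹H and the field strengths conventionally behind "600 MHz" and "900 MHz" instruments:

```python
import math

# Gyromagnetic ratio of the 1H nucleus (rad s^-1 T^-1) -- a physical
# constant, which is why a frequency assignment made today stays valid.
GAMMA_1H = 2.675221874e8

def larmor_frequency_hz(b0_tesla: float) -> float:
    """Resonance (Larmor) frequency of 1H at field strength b0_tesla."""
    return GAMMA_1H * b0_tesla / (2 * math.pi)

# "600 MHz" and "900 MHz" spectrometers correspond to magnetic fields
# of roughly 14.1 T and 21.1 T respectively.
for b0 in (14.1, 21.1):
    print(f"{b0} T  ->  {larmor_frequency_hz(b0) / 1e6:.0f} MHz")
```

Chemical-shift positions scale with this base frequency, so spectra recorded on different instruments remain directly comparable once expressed in parts per million.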
In mass spectrometry, we're changing the technology all the time. The ion optics change, the ionization modes change and all of these things affect how the molecules or the fragments fly through the mass spectrometer and how they are detected.
One of the greatest challenges in mass spectrometry, for instance, is therefore time-proofing the data. You do not want to have to analyze a million samples now and then find that in five years the technology is out of date and you have to analyze them all again.
Some technologies, such as NMR, are in effect much more intrinsically time-proofed than mass spectrometry and a whole range of other technologies that are changing all the time. One way forward is informatic extraction of the principal features of spectroscopic data, features that will be preserved irrespective of how those data were originally acquired. Therefore, again, there's a different sort of informatic challenge that has to do with time-proofing, which I think is very interesting.
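To make the "principal features" idea concrete, here is a minimal sketch using principal component analysis, the classic chemometric approach, on simulated one-dimensional spectra. All the data here are synthetic, and the cohort size, channel count and noise level are illustrative assumptions, not values from the interview:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small cohort of 1-D "spectra": 50 samples x 200 channels,
# where only two underlying biological factors drive the variation.
n_samples, n_channels = 50, 200
scores_true = rng.normal(size=(n_samples, 2))
loadings_true = rng.normal(size=(2, n_channels))
spectra = scores_true @ loadings_true \
    + 0.05 * rng.normal(size=(n_samples, n_channels))

# PCA via singular value decomposition of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The first two components capture nearly all the variance; these scores
# and loadings are a compact, instrument-agnostic summary of the data
# that can outlive the platform that produced the raw spectra.
scores = U[:, :2] * s[:2]
print(f"variance explained by 2 components: {explained[:2].sum():.3f}")
```

The point of the sketch is the time-proofing argument: once the dominant components are extracted and stored, downstream comparisons depend on those features rather than on the raw, instrument-specific signal.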

What other challenges still need to be overcome?

It is mainly about making analysis faster, cheaper and more reliable. I think the biggest challenge has to do with data sharing and the way that humans work, or don't work, together. If you pick up a copy of “Analytical Chemistry” in any week, you will find that half the papers describe a new, better method for measuring X, Y or Z. It is always going to be a better, superior method, since otherwise it would not get published in the journal. Analytical chemists who use current methods always think they can do better. That is the way they think, but when you are working clinically, you've got to draw the line somewhere, decide to harmonize, and understand that you may have to wait some time until something significantly better comes along.
An interesting problem going forward is going to be settling on analytical protocols that everybody can accept, despite the fact that you know they are not perfect. It brings us back to the three Rs: ruggedness, reliability and reproducibility. When studying humans, populations or clinical situations using analytical chemistry, the three Rs are always going to be more important than absolute sensitivity and specificity (although these are always important), which is often what motivates most analytical chemists. The three Rs require very strict adherence to harmonized protocols that are sharable. I think that is going to be a challenge to people's egos.

What do you think the future holds for analytical science in precision medicine?

Precision medicine is about getting an optimized interventional strategy for a patient, based on a detailed knowledge of that patient's biology. That biology is reflected in the chemistry of the body at all the different organizational levels, whether it is genes, proteins, metabolites, or pollutants for that matter.
All the analytical chemistry technologies that inform on a patient's or individual's complexity will be the ones that are important in the future. With respect to future healthcare challenges, analytical chemistry is absolutely key; it is core to solving all of the emergent problems, as well as the ones we already face.

How do you hope the National Phenome Centre will contribute?

We set up the National Phenome Centre to look at big populations and personalized healthcare challenges. We are in our fifth year of operation now and we are running lots of projects that are generating some tremendous findings.
We have now created an International Phenome Centre Network, where there is a series of laboratories built with core instrumentation that is either identical or extremely similar to ours. This means we can harmonize and exchange data, methodologies and therefore biology.
Imperial College was the first in the world, with the National Phenome Centre, and then the Medical Research Council funded the Birmingham Phenome Centre about two years ago. We have transferred all our technologies and methods over, and they have an NMR and a mass spectrometry core that is effectively the same as ours, so we have completely interoperable data. In fact, we've just finished a huge trial, which involved Birmingham University as well.
There is also one in Singapore now, the Singapore Phenome Centre. The Australian Phenome Centre has just been funded by the second-largest grant ever given in Australia, and then there is a whole series lined up. However, we've already formed the International Phenome Centre Network, which was formally announced by Sally Davies, the Chief Medical Officer for England, in November last year.
This year, we will start our first joint project, which is going to be stratification of diabetes between the phenome centres that are up and running and using the same technology. We can carry out international harmonization of diabetes biology for the first time ever, so we are putting our money where our mouth is. To me, this is the most exciting thing that has come out of our work: the fact that there are now groups around the world that agree that harmonization is the way forward to get the best international biology, and also to create massive data sets that are unprecedented in size and complexity to describe human biology.
This is another informatic challenge, but that's the future. There are a lot of dark emergent problems in human disease and we are going to go through some tough times over the next 30 or 40 years. However, we are starting to get our act together with things like the Phenome Centre Network and if it is not the network itself, it will be groupings like it that will rise to these great challenges facing humanity in the 21st century.

Where can readers find more information?

    About Professor Jeremy K. Nicholson

    • Professor of Biological Chemistry
    • Head of the Department of Surgery, Cancer and Interventional Medicine
    • Director of the MRC-NIHR National Phenome Centre
    • Director of the Centre for Gut and Digestive Health (Institute of Global Health Innovation)
    • Faculty of Medicine, Imperial College London
    Professor Nicholson obtained his BSc from Liverpool University (1977) and his PhD from London University (1980) in Biochemistry, working on the application of analytical electron microscopy and of energy-dispersive X-ray microanalysis in molecular toxicology and inorganic biochemistry. After several academic appointments at London University (School of Pharmacy and Birkbeck College, London, 1981-1991), he was appointed Professor of Biological Chemistry (1992).
    In 1998 he moved to Imperial College London as Professor and Head of Biological Chemistry and subsequently Head of the Department of Biomolecular Medicine (2006) and Head of the Department of Surgery, Cancer and Interventional Medicine in 2009 where he runs a series of research programs in stratified medicine, molecular phenotyping and molecular systems biology.
    In 2012 Nicholson became the Director of the world’s first National Phenome Centre, specializing in large-scale molecular phenotyping, and he also directs the Imperial Biomedical Research Centre Stratified Medicine program and Clinical Phenome Centre. Nicholson is the author of over 700 peer-reviewed scientific papers and many other articles/patents on the development and application of novel spectroscopic and chemometric approaches to the investigation of metabolic systems failure, metabolome-wide association studies and pharmacometabonomics.
    Nicholson is a Fellow of the Royal Society of Chemistry, The Royal College of Pathologists, The British Toxicological Society, The Royal Society of Biology and is a consultant to several pharmaceutical/healthcare companies.
    He is a founder director of Metabometrix (incorporated 2001), an Imperial College spin-off company specializing in molecular phenotyping, clinical diagnostics and toxicological screening. Nicholson’s research has been recognised by several awards including: the Royal Society of Chemistry (RSC) Silver (1992) and Gold (1997) Medals for Analytical Chemistry; the Chromatographic Society Jubilee Silver Medal (1994); the Pfizer Prize for Chemical and Medicinal Technology (2002); the RSC Medal for Chemical Biology (2003); the RSC Interdisciplinary Prize (2008); the RSC Theophilus Redwood Lectureship (2008); the Pfizer Global Research Prize for Chemistry (2006); the NIH Stars in Cancer and Nutrition Distinguished Lecturer (2010); the Semmelweis-Budapest Prize for Biomedicine (2010); and the Warren Lecturer, Vanderbilt University (2015).
    He is a Thomson-Reuters ISI Highly Cited Researcher (2014 and 2015, Pharmacology and Toxicology, WoS H-index = 108). Professor Nicholson was elected a Fellow of the UK Academy of Medical Sciences in 2010, a Lifetime Honorary Member of the US Society of Toxicology in 2013, and an Honorary Lifetime Member of the International Metabolomics Society in 2013.
    He holds honorary professorships at 12 universities (including the Mayo Clinic, USA; the University of New South Wales; the Chinese Academy of Sciences, Wuhan and Dalian; Tsinghua University, Beijing; Shanghai Jiao Tong University; and Nanyang Technological University, Singapore). In 2014 he was elected an Albert Einstein Professor of the Chinese Academy of Sciences.