New study applies quantitative modelling to genomics
Saturday, 05 January 2013
Genomic research is widely expected to transform medicine, but progress has been slower than anticipated. While critics argue that the genomics "promise" has been broken – and that the money might be better spent elsewhere – proponents say the deliberate pace underscores the complexity of the relationship between genes and disease and, indeed, argues for more funding.
But thus far, these competing narratives have been based mostly on anecdotes. Ramy Arnaout, MD, DPhil, a founding member of the Genomic Medicine Initiative at Beth Israel Deaconess Medical Center (BIDMC), decided it was time to look at genomics from a new perspective. So he turned to quantitative modelling, a numerical forecasting approach used to predict everything from weather events to the outcomes of political elections, and an extremely useful way to both set expectations and assist in decision-making.
Arnaout and colleagues knew that drug-related adverse outcomes cost the health-care system upwards of $80 billion a year, and that many of these outcomes should be avoidable by choosing and dosing prescription drugs according to a person's genome. So they developed a quantitative model to estimate how much time and money would be required to use genomics, specifically pharmacogenomics, to cut these adverse outcomes in half. Their findings, published online in the journal Clinical Chemistry, provide one of the first examples of data-driven estimates being applied to genomic medicine and offer a template for the use of quantitative modelling in this field.
How do the numbers add up? After running their model across a range of scenarios, the research team found that the cost can be expected to be less than $10 billion, spread over approximately 20 years.
"If you look across medicine, you can see specific places here and there where genomics is really starting to change things, but it's been hard to know how it all adds up in the big picture," explains Arnaout, who is also an Assistant Professor of Pathology at Harvard Medical School (HMS) and Associate Director of Clinical Microbiology at BIDMC.
"Quantitative modelling is a standard approach for forecasting and setting expectations in many fields as we all remember from the recent presidential election and from the hurricane season. Genomics is so important and is so often on the minds of our patients, students and staff, that it seemed like a good idea to use modelling to get some hard numbers on where we're headed."
The idea for the study originated nearly two years ago, while Arnaout (whose laboratory studies genomics) and Vikas Sukhatme, MD, PhD, BIDMC's Chief Academic Officer, were attending a lecture shortly after the 10-year anniversary of the sequencing of the human genome.
"Vikas asked me, 'So when is genomics really going to change medicine?'" remembers Arnaout.
"I realized I didn't know. And that got me thinking."
Arnaout and Sukhatme, together with co-authors Thomas Buck, MD, and Paulvalery Roulette, MD, of BIDMC and HMS, decided to try to answer this question by applying forecasting methods to a big clinical problem – drug-related adverse outcomes.
"We know that preventable causes of these adverse outcomes -- patients' non-adherence, interactions between multiple drugs, and medical error, for example -- account for only a fraction of the millions of adverse outcomes that patients experience each year," explains Arnaout.
"This leaves a significant number that are currently considered non-preventable and are thought to be caused by genomic variation."
By way of example, Arnaout points to the blood-thinning drug warfarin, which millions of Americans currently take. Because, in some cases, patients' genomes contain variants that make the standard dose of warfarin too high for them, these individuals are likely to experience bleeding, an extremely dangerous side effect. In fact, researchers now estimate that three-quarters of the variability in warfarin dosing requirements is due to these genomic variants, and they have already identified a set of variants in six specific genes that explain two-thirds of that variability.
"This kind of progress suggested an interesting thought experiment," says Arnaout.
"What if we took existing examples in which there appears to be a carefully vetted, clinically useful connection between a specific adverse outcome and a specific genetic variant, found out how much it cost and how long it took to discover, and applied that model to all drugs? How much would it cost and how long would it take to cut adverse outcomes by 25 percent? How about by half?"
As data for the model, the authors selected eight associations involving six prescription drugs (clopidogrel, warfarin, escitalopram, carbamazepine, the nicotine-replacement patch and abacavir) and one drug class, the statin class of anti-cholesterol drugs.
Using an approach called Monte Carlo modelling, the team ran simulations to forecast the research investment required to learn how to cut adverse outcomes by meaningful amounts, and how long that research work would be expected to take. For statistical confidence, they ran their simulations thousands of times and explored a wide range of assumptions.
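To make the approach concrete, here is a minimal Python sketch of a Monte Carlo forecast of this general kind. All of the parameter values (the number of drugs, the per-drug cost distribution, the assumed annual research budget) are illustrative assumptions chosen for demonstration, not inputs taken from the Clinical Chemistry paper; the point is only to show how repeated random draws over uncertain inputs yield a cost and timeline estimate with a confidence interval.

# Minimal Monte Carlo forecasting sketch (Python). All parameters below are
# illustrative assumptions, not figures from the Clinical Chemistry paper.
import random

N_SIMULATIONS = 10_000   # number of simulated research programmes
N_DRUGS = 50             # assumed number of commonly prescribed drugs to cover
ANNUAL_BUDGET = 3e8      # assumed research spend per year, in dollars

def simulate_total_cost():
    """Sum a randomly drawn discovery cost over all drugs for one simulated run."""
    # Per-drug cost drawn from a wide log-normal distribution (median ~$66M),
    # standing in for the spread observed across the example associations.
    return sum(random.lognormvariate(18.0, 1.0) for _ in range(N_DRUGS))

costs = sorted(simulate_total_cost() for _ in range(N_SIMULATIONS))
n = len(costs)
median, low, high = costs[n // 2], costs[int(0.05 * n)], costs[int(0.95 * n)]

print(f"Cost: ${median / 1e9:.1f}B (90% interval ${low / 1e9:.1f}B to ${high / 1e9:.1f}B)")
print(f"Time: about {median / ANNUAL_BUDGET:.0f} years at ${ANNUAL_BUDGET / 1e6:.0f}M per year")

Widening the input distributions or changing the assumed budget shifts the resulting interval, which is the kind of sensitivity exploration the authors describe when they say they explored a wide range of assumptions.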
"The results were surprising," says Arnaout.
"Before we did this work, I couldn't have told you whether it would take a million dollars or a trillion dollars or whether it would take five years or a hundred years. But now, we've got a basis for thinking that we're looking at single-digit billions of dollars and a couple of decades. That may sound like a lot or a little, depending on your point of view. But with these numbers, we can now have a more informed conversation about planning for the future of genomic medicine."
The most important determinant of the numbers is the extent to which the examples used in the model will turn out to be representative of drugs as a whole.
"It's a broad set of drugs that were used, but we know how the genome can surprise us," says senior author Sukhatme.
"For example, you won't be able to use genomics to cut adverse outcomes in half if genomics turns out to explain less than half of the adverse outcomes. But even in that case, we found that pharmacogenomics will be able to make a significant dent in adverse outcomes – cutting them by a quarter – for multi-billion-dollar investments."
Also surprising, say the authors, was the timing.
"As a rule, the fruits of research come only after research dollars have already been spent," points out Arnaout. This means that, in this case, hundreds of millions of dollars will be spent for "pump-priming" long before the public can expect to see any meaningful clinical impact.
"It's one thing to say, 'Be patient,' based on just faith," he adds.
"It's another to be able to say so based on data and a model. We now have that. This enables the conversation to shift to which indicators of progress to look for, over the five or so years of pump-priming, to make sure we're on track."
Can we go faster?
"If we could enrol an ethnically diverse set of patients who are taking each of the 40 or 50 most commonly prescribed drugs, get their blood samples, and keep track of the adverse outcomes that some of them are bound to experience, we should be able to move faster, for less money," adds Arnaout, who describes this idea as a "50,000 Pharmacogenomes Project," a pursuit along the lines of the 1,000 Genomes Project, the UK10K or the Veteran's Association Million Veteran Program.
"This model provides the start of a provocative conversation and illustrates the value of quantitative modelling in this very practical and publically relevant aspect of genomics," adds BIDMC Chief of Pathology Jeffrey Saffitz, MD, PhD.
"Such models should help both decision makers and the public set expectations and priorities for translating genomic research into better patient care."
Contact: Bonnie Prescott