Integrating genetic risk factors with minimal residual disease improves the accuracy of risk stratification for children with acute lymphoblastic leukemia.
Integrating genetic risk factors with minimal residual disease (MRD) improves the accuracy of risk stratification for children with acute lymphoblastic leukemia (ALL), according to a population-based cohort study published in the Journal of Clinical Oncology.
Current protocols use one MRD threshold to risk-stratify patients, but the new findings suggest that MRD interpretation should integrate genetic subtypes to more accurately differentiate patients at highest and lowest risk of relapse.
“A single threshold for assigning patients to an MRD risk group does not reflect the response kinetics of the different genetic subtypes,” wrote Dr. David O’Connor, of Great Ormond Street Hospital in London, and colleagues. “Although the risk of relapse was directly proportional to the MRD level within each genetic risk group, the absolute relapse rate that was associated with a specific MRD value or category varied significantly by genetic subtype.”
The authors analyzed data from 3,113 consecutive patients treated in the UKALL2003 trial between 2003 and 2011. MRD was assessed using polymerase chain reaction (PCR) analysis of immunoglobulin/T-cell receptor (Ig/TCR) gene rearrangements. Cytogenetic and fluorescence in situ hybridization (FISH) data were used to classify children into four genetic risk categories: good-risk cytogenetics, intermediate-risk cytogenetics, high-risk cytogenetics, and T-cell ALL.
These genetic risk categories were mutually exclusive and were associated with distinct outcomes (P < .001). MRD response also differed significantly between the categories, with the good-risk cytogenetics group having the best responses. The same MRD value carried different prognostic implications across the genetic risk categories; for example, the relapse risk associated with MRD values of 0.01% to < 0.1% was 40% for high-risk cytogenetics compared with 12% for good-risk cytogenetics. The 5-year overall survival rate among children with MRD values of 0.01% to < 0.1% was 96% in the good-risk cytogenetics group, compared with 78% in the high-risk cytogenetics group.
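As a rough illustration of that point, the minimal sketch below pairs a genetic risk group with an end-of-induction MRD band and looks up the relapse risk reported above. Only the two reported figures (40% and 12% for MRD of 0.01% to < 0.1%) come from the study; the lookup structure and function name are hypothetical and are not part of the UKALL2003 protocol.

```python
# Illustrative only: the same MRD band maps to different relapse-risk estimates
# depending on the genetic risk group. The two values come from the article;
# everything else is a hypothetical sketch, not the trial's stratification logic.

RELAPSE_RISK = {
    # (genetic risk group, end-of-induction MRD band) -> reported relapse risk
    ("high-risk cytogenetics", "0.01% to <0.1%"): 0.40,
    ("good-risk cytogenetics", "0.01% to <0.1%"): 0.12,
}

def estimated_relapse_risk(genetic_group: str, mrd_band: str) -> float | None:
    """Return the reported relapse risk for a subtype/MRD combination, if known."""
    return RELAPSE_RISK.get((genetic_group, mrd_band))

if __name__ == "__main__":
    for group in ("high-risk cytogenetics", "good-risk cytogenetics"):
        risk = estimated_relapse_risk(group, "0.01% to <0.1%")
        print(f"{group}, MRD 0.01% to <0.1%: relapse risk ~ {risk:.0%}")
```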
Current risk stratification approaches discretely categorize or dichotomize variables that are actually continuous, including MRD, white blood cell count, and patient age.
“This loses power and creates sometimes stark differences that have limited biologic plausibility; for example, by classifying a child who is 9 years and 364 days old as standard risk and one who is 10 years and 1 day old as high risk, or one with a white blood cell count of 49,999/µL as standard risk and another with a white blood cell count of 50,100/µL as high risk,” wrote Stephen P. Hunger, MD, of the Children’s Hospital of Philadelphia, in an editorial that accompanied the study. “Most groups consider patients with end induction MRD < 0.01% to be excellent responders, and those with induction MRD ≥ 0.01% as poor responders, but it is also clear that patients with MRD of ≥ 0.01% or ≥ 0.05% have much worse outcomes than those with lower levels of MRD positivity.”
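To make that threshold “cliff” concrete, here is a minimal sketch assuming the cutoffs implied by the quote (age of 10 years or older, or a white blood cell count of 50,000/µL or higher, classified as high risk). The function is hypothetical and deliberately simplified; real protocols weigh additional factors.

```python
# Minimal sketch of the dichotomization "cliff" described in the editorial.
# Assumed cutoffs, implied by the quoted examples: age >= 10 years or
# WBC >= 50,000/µL is classified as high risk. Illustration only.

def dichotomized_risk(age_years: float, wbc_per_ul: int) -> str:
    """Classify using hard cutoffs on age and white blood cell count."""
    return "high risk" if age_years >= 10 or wbc_per_ul >= 50_000 else "standard risk"

print(dichotomized_risk(9 + 364 / 365, 49_999))  # standard risk
print(dichotomized_risk(10 + 1 / 365, 49_999))   # high risk
print(dichotomized_risk(9.0, 50_100))            # high risk
```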
The new study analyzed early treatment response using log-transformed absolute MRD to yield a continuous variable rather than dichotomizing MRD into discrete categories.
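The contrast can be sketched as follows. The 0.01% cutoff comes from the text above; the sample MRD values, the flooring of undetectable results, and the helper names are illustrative assumptions, not the study’s actual analysis code.

```python
import math

# Contrast between dichotomizing MRD at a single 0.01% threshold and treating
# log10(MRD) as a continuous variable. The 0.01% cutoff is quoted in the text;
# the sample values and the flooring of undetectable MRD are assumptions.

MRD_CUTOFF = 0.0001  # 0.01% expressed as a fraction of cells

def dichotomize(mrd_fraction: float) -> str:
    """Single-threshold categorization used by current protocols."""
    return "MRD < 0.01%" if mrd_fraction < MRD_CUTOFF else "MRD >= 0.01%"

def log_mrd(mrd_fraction: float, floor: float = 1e-6) -> float:
    """Continuous measure: log10-transformed MRD, flooring undetectable values."""
    return math.log10(max(mrd_fraction, floor))

for mrd in (0.00009, 0.00011, 0.001, 0.01):
    print(f"MRD {mrd:.4%}: category = {dichotomize(mrd)}, log10(MRD) = {log_mrd(mrd):.2f}")
```

Two patients on either side of the 0.01% cutoff fall into different categories even though their log-transformed MRD values are nearly identical, which is the loss of information the continuous analysis avoids.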