I recently attended the IAPAC (International Association of Physicians in AIDS Care) Adherence Conference in Miami, and one of the most interesting studies presented involved the use of adherence patterns to predict virologic failure. I've copied the abstract below and would be interested to hear feedback from the community. One potential point of controversy is whether clinicians would be willing to use an algorithm to decide when to order a viral load. It could save dramatically on costs, but clinicians may worry about missing virologic rebound. Data on malaria diagnostics, for instance, have shown that clinicians still often treat for malaria even when rapid tests (with high sensitivity and specificity) are negative.
Data-Adaptive Super Learning to Predict Viral Rebound based on Electronic Adherence Monitoring: An Analysis of the MACH-14 Cohort Consortium
Maya Petersen, Varada Sarovar, Anna Decker, Erin LeDell, Joshua Schwab, Robert Gross, Ira Wilson, Carol Golin, Nancy Reynolds, Robert Remien, Kathleen Goggin, Jane Simoni, Marc Rosen, Mark van der Laan, Honghu Liu, David Bangsberg
Background. Electronic adherence monitoring has the potential to improve outcomes by triaging viral load tests and adherence interventions. Machine learning (automated algorithms for signal detection from complex data) may improve the accuracy with which viral failure can be identified.
Methods. We applied an ensemble machine-learning algorithm ("Super Learner") to predict viral failure (rebound >400 copies/ml after suppression to <=400 copies/ml) using pooled data from the MACH14 consortium and compared the cross-validated accuracy of the resulting predictor to that achieved by traditional approaches. Medication event monitoring (MEMS) data were analyzed to predict viral rebound using: 1) average adherence; 2) logistic regression including average adherence and interruption >=3 days; 3) Super Learner applied to 142 a priori selected candidate predictor variables, including basic clinical data and 134 adherence summaries (averages, nadirs of moving averages, variances, and frequencies and durations of interruptions). Super Learner employed internal cross-validation to data-adaptively select from among a user-specified library of algorithms including random forests, generalized additive models, Bayesian and Lasso regularized generalized linear models, and neural networks. Cross-validated areas under the receiver operating characteristic curve (AUC) were calculated based on data not used in model fitting.
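For readers unfamiliar with the method, the core idea of Super Learner is stacking: fit a library of candidate algorithms on internal cross-validation folds, then let a meta-learner combine their out-of-fold predictions, with performance reported on held-out folds. Below is a minimal sketch of that workflow using scikit-learn's `StackingClassifier` and synthetic data standing in for the 142 adherence/clinical predictors; this is an illustration of the general technique, not the study's actual code, data, or algorithm library.

```python
# Hedged sketch of the Super Learner (stacking) idea with synthetic data.
# The feature count (142) mirrors the abstract; everything else is made up.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for adherence summaries + clinical covariates,
# with ~18% positive labels to echo the failure rate reported below.
X, y = make_classification(n_samples=500, n_features=142, n_informative=10,
                           weights=[0.82, 0.18], random_state=0)

# Candidate library (a small stand-in for the study's larger library):
# a random forest and an L1-regularized ("Lasso-style") logistic regression.
library = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("lasso", LogisticRegression(penalty="l1", solver="liblinear", C=0.1)),
]

# The meta-learner combines out-of-fold predicted probabilities from
# each library member (internal 5-fold cross-validation).
super_learner = StackingClassifier(estimators=library,
                                   final_estimator=LogisticRegression(),
                                   cv=5, stack_method="predict_proba")

# Honest performance estimate: AUC computed on folds not used for fitting.
auc = cross_val_score(super_learner, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")
```

The key design point is that both the ensemble weights and the reported AUC come from data not used to fit the models, which is what makes the abstract's comparison against simple average-adherence predictors fair.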
Results. 1137 patients with complete data for each predictor variable contributed 3149 HIV-RNA tests. 138 of 771 patients (18%) had at least one failure observed subsequent to 1810 suppressed HIV-RNA tests. The AUCs for simple average adherence and average adherence + 3-day interruption were 0.64 (95% CI 0.59-0.68) and 0.64 (95% CI 0.59-0.70), respectively. The cross-validated AUC for the Super Learner predictor was 0.72 (95% CI 0.68-0.76).
Conclusions. Super Learner analysis of electronic adherence data predicted viral failure with reasonable accuracy in a highly heterogeneous population of HIV-infected individuals and could potentially be combined with real-time monitoring to triage viral load testing and/or target patients for adherence interventions.