PLoS Medicine

PLOS Medicine: New Articles
A Peer-Reviewed Open-Access Journal

Computer-aided X-ray screening for tuberculosis and HIV testing among adults with cough in Malawi (the PROSPECT study): A randomised trial and cost-effectiveness analysis

Thu, 09/09/2021 - 15:00

by Peter MacPherson, Emily L. Webb, Wala Kamchedzera, Elizabeth Joekes, Gugu Mjoli, David G. Lalloo, Titus H. Divala, Augustine T. Choko, Rachael M. Burke, Hendramoorthy Maheswaran, Madhukar Pai, S. Bertel Squire, Marriott Nliwasa, Elizabeth L. Corbett

Background

Suboptimal tuberculosis (TB) diagnostics and HIV contribute to the high global burden of TB. We investigated costs and yield from systematic HIV-TB screening, including computer-aided digital chest X-ray (DCXR-CAD).

Methods and findings

In this open, three-arm randomised trial, adults (≥18 years) with cough attending acute primary services in Malawi were randomised (1:1:1) to standard of care (SOC); oral HIV testing (HIV screening) and linkage to care; or HIV testing and linkage to care plus DCXR-CAD with sputum Xpert for high CAD4TBv5 scores (HIV-TB screening). Participants and study staff were not blinded to intervention allocation, but investigator blinding was maintained until final analysis. The primary outcome was time to TB treatment. Secondary outcomes included the proportion with same-day TB treatment; the prevalence of undiagnosed/untreated bacteriologically confirmed TB on day 56; and undiagnosed/untreated HIV. Analysis was done on an intention-to-treat basis. Cost-effectiveness analysis used a health-provider perspective. Between 15 November 2018 and 27 November 2019, 8,236 adults were screened for eligibility, with 473, 492, and 497 randomly allocated to the SOC, HIV screening, and HIV-TB screening arms; 53 (11%), 52 (9%), and 47 (9%) were lost to follow-up, respectively. At 56 days, TB treatment had been started in 5 (1.1%) SOC, 8 (1.6%) HIV screening, and 15 (3.0%) HIV-TB screening participants. Median (IQR) time to TB treatment was 11 (6.5 to 38), 6 (1 to 22), and 1 (0 to 3) days, respectively (hazard ratio for HIV-TB screening versus SOC: 2.86, 1.04 to 7.87), with same-day treatment of 0/5 (0%) SOC, 1/8 (12.5%) HIV screening, and 6/15 (40.0%) HIV-TB screening arm TB patients (p = 0.03). At day 56, 2 (0.5%) SOC, 4 (1.0%) HIV screening, and 2 (0.5%) HIV-TB screening participants had undiagnosed microbiologically confirmed TB. HIV screening reduced the proportion with undiagnosed or untreated HIV from 10 (2.7%) in the SOC arm to 2 (0.5%) in the HIV screening arm (risk ratio [RR]: 0.18, 0.04 to 0.83) and 1 (0.2%) in the HIV-TB screening arm (RR: 0.09, 0.01 to 0.71). Incremental costs were US$3.58 and US$19.92 per participant screened for HIV screening and HIV-TB screening, respectively; the probability of cost-effectiveness at a US$1,200/quality-adjusted life year (QALY) threshold was 83.9% and 0%, respectively. Main limitations were the lower than anticipated prevalence of TB and the short participant follow-up period; the cost and quality of life benefits of this screening approach may accrue over a longer time horizon.
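
The probability of cost-effectiveness at a willingness-to-pay threshold is conventionally computed as the share of bootstrap replicates with a positive net monetary benefit. A minimal sketch of that calculation follows; the replicate values are simulated placeholders, not the trial data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated bootstrap replicates of incremental cost (US$) and incremental
# QALYs per participant screened -- illustrative placeholders, NOT trial data.
inc_cost = rng.normal(19.92, 4.0, size=10_000)
inc_qaly = rng.normal(0.004, 0.010, size=10_000)

threshold = 1_200  # willingness to pay, US$ per QALY

# Net monetary benefit: NMB = threshold * dQALY - dCost.
nmb = threshold * inc_qaly - inc_cost

# Probability of cost-effectiveness = share of replicates with NMB > 0.
print(f"P(cost-effective at US${threshold}/QALY) = {(nmb > 0).mean():.1%}")
```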

Conclusions

DCXR-CAD with universal HIV screening significantly increased the timeliness and completeness of HIV and TB diagnosis. If implemented at scale, this approach has the potential to rapidly and efficiently improve TB and HIV diagnosis and treatment.

Trial registration

ClinicalTrials.gov NCT03519425.

Body muscle gain and markers of cardiovascular disease susceptibility in young adulthood: A cohort study

Thu, 09/09/2021 - 15:00

by Joshua A. Bell, Kaitlin H. Wade, Linda M. O’Keeffe, David Carslake, Emma E. Vincent, Michael V. Holmes, Nicholas J. Timpson, George Davey Smith

Background

The potential benefits of gaining body muscle for cardiovascular disease (CVD) susceptibility, and how these compare with the potential harms of gaining body fat, are unknown. We compared associations of early life changes in body lean mass and handgrip strength versus body fat mass with atherogenic traits measured in young adulthood.

Methods and findings

Data were from 3,227 offspring of the Avon Longitudinal Study of Parents and Children (39% male; recruited in 1991–1992). Limb lean and total fat mass indices (kg/m2) were measured using dual-energy X-ray absorptiometry scans performed at ages 10, 13, 18, and 25 y (across clinics occurring from 2001–2003 to 2015–2017). Handgrip strength was measured at 12 and 25 y, expressed as maximum grip (kg or lb/in2) and relative grip (maximum grip/weight in kilograms). Linear regression models were used to examine associations of change in standardised measures of these exposures across different stages of body development with 228 cardiometabolic traits measured at age 25 y including blood pressure, fasting insulin, and metabolomics-derived apolipoprotein B lipids. An SD-unit gain in limb lean mass index from 10 to 25 y was positively associated with atherogenic traits including very-low-density lipoprotein (VLDL) triglycerides. This pattern was limited to lean gain in legs, whereas lean gain in arms was inversely associated with traits including VLDL triglycerides, insulin, and glycoprotein acetyls, and was also positively associated with creatinine (a muscle product and positive control). Furthermore, this pattern for arm lean mass index was specific to SD-unit gains occurring between 13 and 18 y, e.g., −0.13 SD (95% CI −0.22, −0.04) for VLDL triglycerides. Changes in maximum and relative grip from 12 to 25 y were both positively associated with creatinine, but only change in relative grip was also inversely associated with atherogenic traits, e.g., −0.12 SD (95% CI −0.18, −0.06) for VLDL triglycerides per SD-unit gain. Change in fat mass index from 10 to 25 y was more strongly associated with atherogenic traits including VLDL triglycerides, at 0.45 SD (95% CI 0.39, 0.52); these estimates were directionally consistent across sub-periods, with larger effect sizes with more recent gains. Associations of lean, grip, and fat measures with traits were more pronounced among males. Study limitations include potential residual confounding of observational estimates, including by ectopic fat within muscle, and the absence of grip measures in adolescence for estimates of grip change over sub-periods.
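
Since each exposure is analysed as a standardised change regressed on a standardised outcome, a short sketch may make the "per SD-unit gain" estimates concrete; the data below are synthetic placeholders, not ALSPAC measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Synthetic data: lean mass index at 10 y and 25 y, and an outcome trait
# (e.g., VLDL triglycerides) at 25 y -- illustrative, not ALSPAC data.
lean_10 = rng.normal(5.5, 0.8, n)
lean_25 = lean_10 + rng.normal(2.0, 0.9, n)
vldl_tg = rng.normal(1.0, 0.3, n) - 0.04 * (lean_25 - lean_10)

# Standardise the change in exposure and the outcome to SD units.
z = lambda x: (x - x.mean()) / x.std(ddof=1)
d_lean_sd = z(lean_25 - lean_10)
vldl_sd = z(vldl_tg)

# Slope of a simple linear regression = association per SD-unit gain.
slope = np.polyfit(d_lean_sd, vldl_sd, 1)[0]
print(f"{slope:+.2f} SD outcome change per SD-unit lean mass gain")
```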

Conclusions

In this study, we found that muscle strengthening, as indicated by grip strength gain, was weakly associated with lower atherogenic trait levels in young adulthood, at a smaller magnitude than the unfavourable associations of fat mass gain. Associations of muscle mass gain with such traits appear to be smaller and limited to gains occurring in adolescence. These results suggest that body muscle is less robustly associated with markers of CVD susceptibility than body fat and may therefore be a lower-priority intervention target.

Infections in temporal proximity to HPV vaccination and adverse effects following vaccination in Denmark: A nationwide register-based cohort study and case-crossover analysis

Wed, 08/09/2021 - 15:00

by Lene Wulff Krogsgaard, Irene Petersen, Oleguer Plana-Ripoll, Bodil Hammer Bech, Tina Hovgaard Lützen, Reimar Wernich Thomsen, Dorte Rytter

Background

Public trust in the human papillomavirus (HPV) vaccination programme has been challenged by reports of potential severe adverse effects. The reported symptoms were heterogeneous and overlapped with those characteristic of chronic fatigue syndrome (CFS); they have been described as CFS-like symptoms. Evidence suggests that CFS is often precipitated by an infection. The aim of the study was to examine whether an infection in temporal proximity to HPV vaccination is a risk factor for suspected adverse effects following HPV vaccination.

Methods and findings

The study was a nationwide register-based cohort study and case-crossover analysis. The study population consisted of all HPV-vaccinated females living in Denmark, born between 1974 and 2006, and vaccinated between January 1, 2006 and December 31, 2017. The exposure was any infection in the period ± 1 month around the time of first HPV vaccination and was defined as (1) a hospital-treated infection; (2) redemption of anti-infective medication; or (3) having a rapid streptococcal test done at the general practitioner. The outcome was referral to a specialised hospital setting (5 national HPV centres opened June 1, 2015) due to suspected adverse effects following HPV vaccination. Multivariable logistic regression was used to estimate the association between infection and later HPV centre referral. The participants were 600,400 HPV-vaccinated females aged 11 to 44 years. Of these, 48,361 (9.7%) females had a hospital-treated infection, redeemed anti-infective medication, or had a rapid streptococcal test ± 1 month around the time of first HPV vaccination. A total of 1,755 (0.3%) females were referred to an HPV centre. Having a hospital-treated infection in temporal proximity to vaccination was associated with a significantly elevated risk of later referral to an HPV centre (odds ratio (OR) 2.75, 95% confidence interval (CI) 1.72 to 4.40; P < 0.001). An increased risk was also observed among females who redeemed anti-infective medication (OR 1.56, 95% CI 1.33 to 1.83; P < 0.001) or had a rapid streptococcal test (OR 1.45, 95% CI 1.10 to 1.93; P = 0.010). Results from a case-crossover analysis, which was performed to adjust for potential unmeasured confounding, supported the findings. A key limitation of the study is that the HPV centres did not open until June 1, 2015, which may have led to an underestimation of the risk of suspected adverse effects, but stratified analyses by year of vaccination yielded similar results.
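
The exposure window lends itself to a simple date comparison. A sketch of the ±1-month rule follows, taking the month as 30 days; that value is an assumption for illustration, since the registry definition may use calendar months.

```python
from datetime import date

# Exposure definition: any qualifying infection event within +/- 1 month
# (taken here as 30 days, an assumption) of the first HPV vaccination.
def exposed(vaccination: date, infection_dates: list[date],
            window_days: int = 30) -> bool:
    return any(abs((d - vaccination).days) <= window_days
               for d in infection_dates)

print(exposed(date(2015, 6, 1), [date(2015, 5, 20)]))  # -> True
```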

Conclusions

Treated infection in temporal proximity to HPV vaccination is associated with an increased risk of later referral with suspected adverse vaccine effects. Thus, infection could potentially be a trigger of the CFS-like symptoms in a subset of the referred females. To our knowledge, this study is the first to investigate the role of infection in the development of suspected adverse effects after HPV vaccination, and replication of these findings is needed in other studies.

Effectiveness of seasonal malaria chemoprevention (SMC) treatments when SMC is implemented at scale: Case–control studies in 5 countries

Wed, 08/09/2021 - 15:00

by Matthew Cairns, Serign Jawo Ceesay, Issaka Sagara, Issaka Zongo, Hamit Kessely, Kadidja Gamougam, Abdoulaye Diallo, Johnbull Sonny Ogboi, Diego Moroso, Suzanne Van Hulle, Tony Eloike, Paul Snell, Susana Scott, Corinne Merle, Kalifa Bojang, Jean Bosco Ouedraogo, Alassane Dicko, Jean-Louis Ndiaye, Paul Milligan

Background

Seasonal malaria chemoprevention (SMC) has shown high protective efficacy against clinical malaria and severe malaria in a series of clinical trials. We evaluated the effectiveness of SMC treatments against clinical malaria when delivered at scale through national malaria control programmes in 2015 and 2016.

Methods and findings

Case–control studies were carried out in Mali and The Gambia in 2015, and in Burkina Faso, Chad, Mali, Nigeria, and The Gambia in 2016. Children aged 3–59 months presenting at selected health facilities with microscopically confirmed clinical malaria were recruited as cases. Two controls per case were recruited concurrently (on or shortly after the day the case was detected) from the neighbourhood in which the case lived. The primary exposure was the time since the most recent course of SMC treatment, determined from SMC recipient cards, caregiver recall, and administrative records. Conditional logistic regression was used to estimate the odds ratio (OR) associated with receipt of SMC within the previous 28 days, and SMC 29 to 42 days ago, compared with no SMC in the past 42 days. These ORs, which are equivalent to incidence rate ratios, were used to calculate the percentage reduction in clinical malaria incidence in the corresponding time periods. Results from individual countries were pooled in a random-effects meta-analysis. In total, 2,126 cases and 4,252 controls were included in the analysis. Across the 7 studies, the mean age ranged from 1.7 to 2.4 years and from 2.1 to 2.8 years among controls and cases, respectively; 42.2%–50.9% and 38.9%–46.9% of controls and cases, respectively, were male. In all 7 individual case–control studies, a high degree of personal protection from SMC against clinical malaria was observed, ranging from 73% in Mali in 2016 to 98% in Mali in 2015. The overall OR for SMC within 28 days was 0.12 (95% CI: 0.06, 0.21; p < 0.001), indicating a protective effectiveness of 88% (95% CI: 79%, 94%). Effectiveness against clinical malaria for SMC 29–42 days ago was 61% (95% CI: 47%, 72%). Similar results were obtained when the analysis was restricted to cases with parasite density in excess of 5,000 parasites per microlitre: protective effectiveness was 90% (95% CI: 79%, 96%; p < 0.001) and 59% (95% CI: 34%, 74%; p < 0.001) for SMC 0–28 days and 29–42 days ago, respectively. Potential limitations include the possibility of residual confounding due to an association between exposure to malaria and access to SMC, or differences in access to SMC between patients attending a clinic and community controls; however, neighbourhood matching of cases and controls, and covariate adjustment, attempted to control for these aspects, and the observed decline in protection over time, consistent with expected trends, argues against a major bias from these sources.
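
The step from a matched odds ratio to protective effectiveness is one line of arithmetic, since effectiveness = (1 − OR) × 100%. A sketch using the pooled estimate reported above:

```python
# Protective effectiveness (%) from an odds ratio: (1 - OR) * 100.
def protective_effectiveness(odds_ratio: float) -> float:
    return (1.0 - odds_ratio) * 100.0

# Pooled OR for SMC within the previous 28 days, from the abstract.
print(f"{protective_effectiveness(0.12):.0f}%")  # -> 88%
```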

Conclusions

SMC administered as part of routine national malaria control activities provided a very high level of personal protection against clinical malaria over 28 days post-treatment, similar to the efficacy observed in clinical trials. The case–control design used in this study can be used at intervals to ensure SMC treatments remain effective.

The cardiovascular effects of amodiaquine and structurally related antimalarials: An individual patient data meta-analysis

Tue, 07/09/2021 - 15:00

by Xin Hui S. Chan, Ilsa L. Haeusler, Yan Naung Win, James Pike, Borimas Hanboonkunupakarn, Maryam Hanafiah, Sue J. Lee, Abdoulaye Djimdé, Caterina I. Fanello, Jean-René Kiechel, Marcus VG Lacerda, Bernhards Ogutu, Marie A. Onyamboko, André M. Siqueira, Elizabeth A. Ashley, Walter RJ Taylor, Nicholas J. White

Background

Amodiaquine is a 4-aminoquinoline antimalarial similar to chloroquine that is used extensively for the treatment and prevention of malaria. Data on the cardiovascular effects of amodiaquine are scarce, although transient effects on cardiac electrophysiology (electrocardiographic QT interval prolongation and sinus bradycardia) have been observed. We conducted an individual patient data meta-analysis to characterise the cardiovascular effects of amodiaquine and thereby support development of risk minimisation measures to improve the safety of this important antimalarial.

Methods and findings

Studies of amodiaquine for the treatment or prevention of malaria were identified from a systematic review. Heart rates and QT intervals with study-specific heart rate correction (QTcS) were compared within studies, and individual patient data were pooled for multivariable linear mixed effects regression. The meta-analysis included 2,681 patients from 4 randomised controlled trials evaluating artemisinin-based combination therapies (ACTs) containing amodiaquine (n = 725), lumefantrine (n = 499), piperaquine (n = 716), and pyronaridine (n = 566), as well as monotherapy with chloroquine (n = 175) for uncomplicated malaria. Amodiaquine prolonged QTcS (mean = 16.9 ms, 95% CI: 15.0 to 18.8) less than chloroquine (21.9 ms, 18.3 to 25.6, p = 0.0069) and piperaquine (19.2 ms, 15.8 to 20.5, p = 0.0495), but more than lumefantrine (5.6 ms, 2.9 to 8.2, p < 0.001) and pyronaridine (−1.2 ms, −3.6 to +1.3, p < 0.001). In individuals aged ≥12 years, amodiaquine reduced heart rate (mean reduction = 15.2 beats per minute [bpm], 95% CI: 13.4 to 17.0) more than piperaquine (10.5 bpm, 7.7 to 13.3, p = 0.0013), lumefantrine (9.3 bpm, 6.4 to 12.2, p < 0.001), pyronaridine (6.6 bpm, 4.0 to 9.3, p < 0.001), and chloroquine (5.9 bpm, 3.2 to 8.5, p < 0.001), and was associated with a higher risk of potentially symptomatic sinus bradycardia (≤50 bpm) than lumefantrine (risk difference: 14.8%, 95% CI: 5.4 to 24.3, p = 0.0021) and chloroquine (risk difference: 8.0%, 95% CI: 4.0 to 12.0, p < 0.001). The effect of amodiaquine on the heart rate of children aged <12 years compared with other antimalarials was not clinically significant. Study limitations include the unavailability of individual patient-level adverse event data for most included participants, but no serious complications were documented.
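
Study-specific heart rate correction is typically derived by fitting an exponent α in QTc = QT/RRᵅ to each study's baseline data. A sketch under that assumption follows; the ECG values are synthetic, not the pooled patient data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic baseline ECG readings: RR interval (s) and uncorrected QT (ms).
rr = rng.uniform(0.6, 1.2, 500)
qt = 400.0 * rr ** 0.35 * rng.lognormal(0.0, 0.02, 500)

# Fit the study-specific exponent alpha from log(QT) = c + alpha * log(RR).
alpha = np.polyfit(np.log(rr), np.log(qt), 1)[0]

# Heart-rate-corrected interval: QTcS = QT / RR**alpha.
qtcs = qt / rr ** alpha
print(f"alpha = {alpha:.2f}; mean QTcS = {qtcs.mean():.0f} ms")
```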

Conclusions

While caution is advised in the use of amodiaquine in patients aged ≥12 years with concomitant use of heart rate–reducing medications, serious cardiac conduction disorders, or risk factors for torsade de pointes, there have been no serious cardiovascular events reported after amodiaquine in widespread use over 7 decades. Amodiaquine and structurally related antimalarials in the World Health Organization (WHO)-recommended dose regimens alone or in ACTs are safe for the treatment and prevention of malaria.

Call for emergency action to limit global temperature increases, restore biodiversity, and protect health

Tue, 07/09/2021 - 15:00

by Lukoye Atwoli, Abdullah H. Baqui, Thomas Benfield, Raffaella Bosurgi, Fiona Godlee, Stephen Hancocks, Richard Horton, Laurie Laybourn-Langton, Carlos Augusto Monteiro, Ian Norman, Kirsten Patrick, Nigel Praities, Marcel GM Olde Rikkert, Eric J. Rubin, Peush Sahni, Richard Smith, Nick Talley, Sue Turale, Damián Vázquez

Derivation and external validation of a risk score for predicting HIV-associated tuberculosis to support case finding and preventive therapy scale-up: A cohort study

Tue, 07/09/2021 - 15:00

by Andrew F. Auld, Andrew D. Kerkhoff, Yasmeen Hanifa, Robin Wood, Salome Charalambous, Yuliang Liu, Tefera Agizew, Anikie Mathoma, Rosanna Boyd, Anand Date, Ray W. Shiraishi, George Bicego, Unami Mathebula-Modongo, Heather Alexander, Christopher Serumola, Goabaone Rankgoane-Pono, Pontsho Pono, Alyssa Finlay, James C. Shepherd, Tedd V. Ellerbrock, Alison D. Grant, Katherine Fielding

Background

Among people living with HIV (PLHIV), more flexible and sensitive tuberculosis (TB) screening tools capable of detecting both symptomatic and subclinical active TB are needed to (1) reduce morbidity and mortality from undiagnosed TB; (2) facilitate scale-up of tuberculosis preventive therapy (TPT) while reducing inappropriate prescription of TPT to PLHIV with subclinical active TB; and (3) allow for differentiated HIV–TB care.

Methods and findings

We used Botswana XPRES trial data for adult HIV clinic enrollees collected during 2012 to 2015 to develop a parsimonious multivariable prognostic model for active prevalent TB using both logistic regression and random forest machine learning approaches. A clinical score was derived by rescaling final model coefficients. The clinical score was developed using southern Botswana XPRES data and its accuracy validated internally, using northern Botswana data, and externally using 3 diverse cohorts of antiretroviral therapy (ART)-naive and ART-experienced PLHIV enrolled in XPHACTOR, TB Fast Track (TBFT), and Gugulethu studies from South Africa (SA). Predictive accuracy of the clinical score was compared with the World Health Organization (WHO) 4-symptom TB screen. Among 5,418 XPRES enrollees, 2,771 were included in the derivation dataset; 67% were female, median age was 34 years, median CD4 was 240 cells/μL, 189 (7%) had undiagnosed prevalent TB, and characteristics were similar between internal derivation and validation datasets. Among XPHACTOR, TBFT, and Gugulethu cohorts, median CD4 was 400, 73, and 167 cells/μL, and prevalence of TB was 5%, 10%, and 18%, respectively. Factors predictive of TB in the derivation dataset and selected for the clinical score included male sex (1 point), ≥1 WHO TB symptom (7 points), smoking history (1 point), temperature >37.5°C (6 points), body mass index (BMI) <18.5 kg/m2 (2 points), and severe anemia (hemoglobin <8 g/dL) (3 points). Sensitivity using the WHO 4-symptom TB screen was 73%, 80%, 94%, and 94% in XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively, but increased to 88%, 87%, 97%, and 97% when a clinical score of ≥2 was used. Negative predictive value (NPV) also increased by 1%, 0.3%, 1.6%, and 1.7% in XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively, when the clinical score of ≥2 replaced the WHO 4-symptom TB screen. Categorizing risk scores into low- (<2), moderate- (2 to 10), and high-risk (>10) categories yielded a TB prevalence of 1%, 1%, 2%, and 6% in the lowest risk group and 33%, 22%, 26%, and 32% in the highest risk group for XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively. At a clinical score of ≥2, the number needed to screen (NNS) ranged from 5.0 in Gugulethu to 11.0 in XPHACTOR. Limitations include that the risk score has not been validated in resource-rich settings and needs further evaluation and validation in contemporary cohorts in Africa and other resource-constrained settings.
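
The published point weights map directly onto a small scoring function. A sketch follows; the weights and cut-offs are those stated above, while the function itself and the example values are illustrative.

```python
def tb_risk_score(male: bool, who_symptom: bool, smoker: bool,
                  temp_c: float, bmi: float, hemoglobin: float) -> int:
    """Clinical score using the point weights reported in the abstract."""
    score = 0
    score += 1 if male else 0
    score += 7 if who_symptom else 0      # >=1 WHO TB symptom
    score += 1 if smoker else 0
    score += 6 if temp_c > 37.5 else 0
    score += 2 if bmi < 18.5 else 0       # BMI in kg/m2
    score += 3 if hemoglobin < 8 else 0   # severe anemia, g/dL
    return score

def risk_category(score: int) -> str:
    return "low" if score < 2 else "moderate" if score <= 10 else "high"

s = tb_risk_score(male=True, who_symptom=True, smoker=False,
                  temp_c=36.8, bmi=17.9, hemoglobin=9.4)
print(s, risk_category(s))  # 10 moderate; screen further if score >= 2
```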

Conclusions

The simple and feasible clinical score allowed for prioritization of sensitivity and NPV, which could facilitate reductions in mortality from undiagnosed TB and safer administration of TPT during proposed global scale-up efforts. Differentiation of risk by clinical score cutoff allows flexibility in designing differentiated HIV–TB care to maximize impact of available resources.

Altering product placement to create a healthier layout in supermarkets: Outcomes on store sales, customer purchasing, and diet in a prospective matched controlled cluster study

Tue, 07/09/2021 - 15:00

by Christina Vogel, Sarah Crozier, Daniel Penn-Newman, Kylie Ball, Graham Moon, Joanne Lord, Cyrus Cooper, Janis Baird

Background

Previous product placement trials in supermarkets have been limited in scope and in the outcome data collected. This study assessed the effects of a healthier supermarket layout on store-level sales, household-level purchasing, and dietary behaviours.

Methods and findings

This is a prospective matched controlled cluster trial with 2 intervention components: (i) new fresh fruit and vegetable sections near store entrances (replacing smaller displays at the back) and frozen vegetables repositioned to the entrance aisle, plus (ii) the removal of confectionery from checkouts and aisle ends opposite. In this pilot study, the intervention was implemented for 6 months in 3 discount supermarkets in England. Three control stores were matched on store sales, customer profiles, and neighbourhood deprivation. Women customers aged 18 to 45 years, with loyalty cards, were assigned to the intervention (n = 62) or control group (n = 88) of their primary store. The trial registration number is NCT03518151. Interrupted time series analysis showed that increases in store-level sales of fruits and vegetables were greater in intervention stores than predicted at 3 (1.71 standard deviations (SDs) (95% CI 0.45, 2.96), P = 0.01) and 6 months of follow-up (2.42 SDs (0.22, 4.62), P = 0.03), equivalent to approximately 6,170 and approximately 9,820 extra portions per store, per week, respectively. The proportion of weekly purchasing of fruits and vegetables rose among intervention participants at 3 and 6 months relative to control participants (0.2% versus −3.0%, P = 0.22; 1.7% versus −3.5%, P = 0.05, respectively). Store sales of confectionery were lower in intervention stores than predicted at 3 (−1.05 SDs (−1.98, −0.12), P = 0.03) and 6 months (−1.37 SDs (−2.95, 0.22), P = 0.09), equivalent to approximately 1,359 and approximately 1,575 fewer portions per store, per week, respectively; no differences were observed for confectionery purchasing. Changes in dietary variables were predominantly in the expected direction for health benefit. Intervention implementation was not within the control of the research team, and stores could not be randomised. As a pilot study, it was not powered to detect an effect.
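
Interrupted time series analysis of this kind is commonly a segmented regression with a level shift and slope change at the intervention date. A sketch with simulated weekly sales follows; the series and the week-52 intervention point are placeholders, not the study's sales data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated weekly fruit-and-vegetable sales; intervention at week 52.
weeks = np.arange(104)
pre = weeks < 52
sales = 1000 + 2 * weeks + np.where(pre, 0, 150) + rng.normal(0, 30, 104)

# Segmented regression: underlying trend + level shift + slope change.
X = np.column_stack([
    np.ones_like(weeks),           # intercept
    weeks,                         # pre-existing trend
    (~pre).astype(float),          # step change at the intervention
    np.clip(weeks - 52, 0, None),  # post-intervention slope change
])
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"estimated level shift: {beta[2]:.0f} units/week")
```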

Conclusions

Healthier supermarket layouts can improve the nutrition profile of store sales and likely improve household purchasing and dietary quality. Placing fruits and vegetables near store entrances should be considered alongside policies to limit prominent placement of unhealthy foods.

Trial registration

ClinicalTrials.gov NCT03518151 (pre-results)

Assisted reproduction technology and long-term cardiometabolic health in the offspring

Tue, 07/09/2021 - 15:00

by Ronald C. W. Ma, Noel Y. H. Ng, Lai Ping Cheung

Ronald Ma and co-authors discuss Emma Norrman and colleagues’ accompanying research study on the health of children born with assisted reproductive technology.

Cardiovascular disease, obesity, and type 2 diabetes in children born after assisted reproductive technology: A population-based cohort study

Tue, 07/09/2021 - 15:00

by Emma Norrman, Max Petzold, Mika Gissler, Anne Lærke Spangmose, Signe Opdahl, Anna-Karina Henningsen, Anja Pinborg, Aila Tiitinen, Annika Rosengren, Liv Bente Romundstad, Ulla-Britt Wennerholm, Christina Bergh

Background

Some earlier studies have found indications of significant changes in cardiometabolic risk factors in children born after assisted reproductive technology (ART). Most of these studies are based on small cohorts with high risk of selection bias. In this study, we compared the risk of cardiovascular disease, obesity, and type 2 diabetes between singleton children born after ART and singleton children born after spontaneous conception (SC).

Methods and findings

This was a large population-based cohort study of individuals born in Norway, Sweden, Finland, and Denmark between 1984 and 2015. Data were obtained from national ART and medical birth registers and cross-linked with data from national patient registers and other population-based registers in the respective countries. In total, 122,429 children born after ART and 7,574,685 children born after SC were included. Mean (SD) maternal age was 33.9 (4.3) years for ART and 29.7 (5.2) years for SC, 67.7% versus 41.8% were primiparous, and 45.2% versus 32.1% had more than 12 years of education. Preterm birth (<37 weeks 0 days) occurred in 7.9% of children born after ART and 4.8% of children born after SC, and 5.7% versus 3.3% had a low birth weight (<2,500 g). Mean (SD) follow-up time was 8.6 (6.2) years for children born after ART and 14.0 (8.6) years for children born after SC. In total, 135 (0.11%), 645 (0.65%), and 18 (0.01%) children born after ART were diagnosed with cardiovascular disease (ischemic heart disease, cardiomyopathy, heart failure, or cerebrovascular disease), obesity, or type 2 diabetes, respectively. The corresponding values were 10,702 (0.14%), 30,308 (0.74%), and 2,919 (0.04%) for children born after SC. In the unadjusted analysis, children born after ART had a significantly higher risk of any cardiovascular disease (hazard ratio [HR] 1.24; 95% CI 1.04–1.48; p = 0.02), obesity (HR 1.13; 95% CI 1.05–1.23; p = 0.002), and type 2 diabetes (HR 1.71; 95% CI 1.08–2.73; p = 0.02). After adjustment, there was no significant difference between children born after ART and children born after SC for any cardiovascular disease (adjusted HR [aHR] 1.02; 95% CI 0.86–1.22; p = 0.80) or type 2 diabetes (aHR 1.31; 95% CI 0.82–2.09; p = 0.25). For any cardiovascular disease, the 95% CI was reasonably narrow, excluding effects of a substantial magnitude, while the 95% CI for type 2 diabetes was wide, not excluding clinically meaningful effects. For obesity, there was a small but significant increased risk among children born after ART (aHR 1.14; 95% CI 1.06–1.23; p = 0.001). Important limitations of the study were the relatively short follow-up time, the limited number of events for some outcomes, and the fact that obesity is often not considered a disease and therefore not captured by registers, likely leading to an underestimation of obesity among children born after both ART and SC.

Conclusions

In this study, we observed no difference in the risk of cardiovascular disease or type 2 diabetes between children born after ART and children born after SC. For obesity, there was a small but significant increased risk for children born after ART.

Trial registration number

ISRCTN11780826.

The latent tuberculosis cascade-of-care among people living with HIV: A systematic review and meta-analysis

Tue, 07/09/2021 - 15:00

by Mayara Lisboa Bastos, Luca Melnychuk, Jonathon R. Campbell, Olivia Oxlade, Dick Menzies

Background

Tuberculosis preventive therapy (TPT) reduces TB-related morbidity and mortality in people living with HIV (PLHIV). Cascade-of-care analyses help identify gaps and barriers in care and develop targeted solutions. A previous latent tuberculosis infection (LTBI) cascade-of-care analysis showed only 18% of persons in at-risk populations complete TPT, but a similar analysis for TPT among PLHIV has not been completed. We conducted a meta-analysis to provide this evidence.

Methods and findings

We first screened potential articles from a LTBI cascade-of-care systematic review published in 2016. From this study, we included cohorts that reported a minimum of 25 PLHIV. To identify new cohorts, we used a similar search strategy restricted to PLHIV. The search was conducted in Medline, Embase, Health Star, and LILACS, from January 2014 to February 2021. Two authors independently screened titles and full text and assessed risk of bias using the Newcastle–Ottawa Scale for cohorts and Cochrane Risk of Bias for cluster randomized trials. We meta-analyzed the proportion of PLHIV completing each step of the LTBI cascade-of-care and estimated the cumulative proportion retained. These results were stratified based on cascades-of-care that used or did not use LTBI testing to determine eligibility for TPT. We also performed a narrative synthesis of enablers and barriers of the cascade-of-care identified at different steps of the cascade. A total of 71 cohorts were included, and 70 were meta-analyzed, comprising 94,011 PLHIV. Among the PLHIV included, 35.3% (33,139/94,011) were from the Americas and 29.2% (27,460/94,011) from Africa. Overall, 49.9% (46,903/94,011) were from low- and middle-income countries; median age was 38.0 years (interquartile range [IQR] 34.0 to 43.6); 65.9% (46,328/70,297) were men; 43.6% (29,629/67,947) were treated with antiretroviral therapy (ART); and the median CD4 count was 390 cells/mm3 (IQR 312 to 458). Among the cohorts that did not use LTBI tests, the cumulative proportions of PLHIV starting and completing TPT were 40.9% (95% CI: 39.3% to 42.7%) and 33.2% (95% CI: 31.6% to 34.9%), respectively. Among cohorts that used LTBI tests, the cumulative proportions of PLHIV starting and completing TPT were 60.4% (95% CI: 58.1% to 62.6%) and 41.9% (95% CI: 39.6% to 44.2%), respectively. Completion of TPT was not significantly different in high- compared to low- and middle-income countries. Regardless of LTBI test use, substantial losses in the cascade-of-care occurred before treatment initiation. The integration of HIV and TB care was considered an enabler of the cascade-of-care in multiple cohorts. Key limitations of this systematic review are the observational nature of the included studies, potential selection bias in the population selection, that only 14 cohorts reported all steps of the cascade-of-care, and that barriers and facilitators were not systematically reported in all cohorts.
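
The cumulative proportion retained is simply the running product of the step-completion proportions. A sketch follows; the step names and values are illustrative placeholders, not the pooled estimates.

```python
from itertools import accumulate
from operator import mul

# Illustrative step-completion proportions for an LTBI cascade-of-care
# (tested, test read, referred, started TPT, completed TPT) -- placeholders.
steps = [0.90, 0.85, 0.88, 0.80, 0.78]

cumulative = list(accumulate(steps, mul))
for name, c in zip(["tested", "read", "referred", "started", "completed"],
                   cumulative):
    print(f"{name:>9}: {c:.1%} of the initial cohort retained")
```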

Conclusions

Although substantial losses were seen in multiple stages of the cascade-of-care, the cumulative proportion of PLHIV completing TPT was higher than previously reported among other at-risk populations. The use of LTBI testing in PLHIV in low- and middle-income countries was associated with a higher proportion of the cohorts initiating TPT and with similar rates of completion of TPT.

Changes in maternal risk factors and their association with changes in cesarean sections in Norway between 1999 and 2016: A descriptive population-based registry study

Fri, 03/09/2021 - 15:00

by Ingvild Hersoug Nedberg, Marzia Lazzerini, Ilaria Mariani, Kajsa Møllersen, Emanuelle Pessa Valente, Erik Eik Anda, Finn Egil Skjeldestad

Background

Increases in the proportion of the population with an increased likelihood of cesarean section (CS) have been postulated as a driving force behind the rise in CS rates worldwide. The aim of this study was to assess whether changes in selected maternal risk factors for CS are associated with changes in CS births from 1999 to 2016 in Norway.

Methods and findings

This national population-based registry study utilized data from 1,055,006 births registered in the Norwegian Medical Birth Registry from 1999 to 2016. The following maternal risk factors for CS were included: nulliparous/≥35 years, multiparous/≥35 years, pregestational diabetes, gestational diabetes, hypertensive disorders, previous CS, assisted reproductive technology, and multiple births. The proportion of CS births in 1999 was used to predict the number of CS births in 2016. The observed and predicted numbers of CS births were compared to determine the number of excess CS births, before and after considering the selected risk factors, for all births, and for births stratified by 0, 1, or >1 of the selected risk factors. The proportion of CS births increased from 12.9% to 16.1% (+24.8%) during the study period. The proportion of births with 1 selected risk factor increased from 21.3% to 26.3% (+23.5%), while the proportion with >1 risk factor increased from 4.5% to 8.8% (+95.6%). Stratification by the presence of selected risk factors reduced the number of excess CS births observed in 2016 compared to 1999 by 67.9%. Study limitations include lack of access to other important maternal risk factors and the comparison of only the first and last years of the study period.
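
The excess-births comparison applies the baseline-year proportion to the later year's denominator and subtracts. A sketch of that arithmetic follows; the 12.9% baseline is taken from the abstract, while the 2016 counts here are placeholders, not the registry data.

```python
# Excess cesarean births = observed 2016 CS births minus the number expected
# if the 1999 proportion (12.9%, per the abstract) had still applied.
def excess_cs(observed_cs: int, total_births: int,
              baseline_prop: float = 0.129) -> float:
    expected = baseline_prop * total_births
    return observed_cs - expected

# Placeholder 2016 totals, for illustration only.
print(f"excess CS births: {excess_cs(observed_cs=9_500, total_births=59_000):,.0f}")
```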

Conclusions

In this study, we observed that after an initial increase, the proportion of CS births remained stable from 2005 to 2016. Instead, both the size of the risk population and the mean number of risk factors per birth continued to increase. We observed a possible association between the increase in the size of the risk population and the additional CS births observed in 2016 compared to 1999. The increase in the size of the risk population and the stable CS rate from 2005 onward may indicate consistent adherence to obstetric evidence-based practice in Norway.

Corporate political activity in the context of unhealthy food advertising restrictions across Transport for London: A qualitative case study

Thu, 02/09/2021 - 15:00

by Kathrin Lauber, Daniel Hunt, Anna B. Gilmore, Harry Rutter

Background

Diets with high proportions of foods high in fat, sugar, and/or salt (HFSS) contribute to malnutrition and rising rates of childhood obesity, with effects throughout the life course. Given compelling evidence on the detrimental impact HFSS advertising has on children’s diets, the World Health Organization unequivocally supports the adoption of restrictions on HFSS marketing and advertising. In February 2019, the Greater London Authority introduced novel restrictions on HFSS advertising across Transport for London (TfL), one of the most valuable out-of-home advertising estates. In this study, we examined whether and how commercial actors attempted to influence the development of these advertising restrictions.

Methods and findings

Using requests under the Freedom of Information Act, we obtained industry responses to the London Food Strategy consultation, correspondence between officials and key industry actors, and information on meetings. We used an existing model of corporate political activity, the Policy Dystopia Model, to systematically analyse arguments and activities used to counter the policy. The majority of food and advertising industry consultation respondents opposed the proposed advertising restrictions, many promoting voluntary approaches instead. Industry actors who supported the policy were predominantly smaller businesses. To oppose the policy, industry respondents deployed a range of strategies. They exaggerated potential costs and underplayed potential benefits of the policy, for instance, warning of negative economic consequences and questioning the evidence underlying the proposal. Despite challenging the evidence for the policy, they offered little evidence in support of their own claims. Commercial actors had significant access to the policy process and officials through the consultation and numerous meetings, yet attempted to increase access further, for example, by applying to join the London Child Obesity Taskforce and by inviting its members to events. They also employed coalition management, engaging directly and through business associations to amplify their arguments. Some advertising industry actors also raised the potential of legal challenges. The key limitation of this study is that our data focused on industry–policymaker interactions; thus, our findings are unable to present a comprehensive picture of political activity.

Conclusions

In this study, we identified substantial opposition from food and advertising industry actors to the TfL advertising restrictions. We mapped arguments and activities used to oppose the policy, which might help other public authorities anticipate industry efforts to prevent similar restrictions in HFSS advertising. Given the potential consequences of commercial influence in these kinds of policy spaces, public bodies should consider how they engage with industry actors.

Risk of a permanent work-related disability pension after incident venous thromboembolism in Denmark: A population-based cohort study

Tue, 31/08/2021 - 15:00

by Helle Jørgensen, Erzsébet Horváth-Puhó, Kristina Laugesen, Sigrid Brækkan, John-Bjarne Hansen, Henrik Toft Sørensen

Background

Long-term complications of venous thromboembolism (VTE) hamper physical function and impair quality of life; still, it remains unclear whether VTE is associated with risk of permanent work-related disability. We aimed to assess the association between VTE and the risk of receiving a permanent work-related disability pension and to assess whether this association was explained by comorbidities such as cancer and arterial cardiovascular disease.

Methods and findings

A Danish nationwide population-based cohort study consisting of 43,769 individuals aged 25 to 66 years with incident VTE during 1995 to 2016 and 218,845 birth year-, sex-, and calendar year-matched individuals from the general population, among whom 45.9% (N = 120,540) were women, was established using Danish national registries. The cohorts were followed throughout 2016, with permanent work-related disability pension as the outcome. Hazard ratios (HRs) with 95% confidence intervals (CIs) for disability pension were computed and stratified by sex and age groups (25 to 34, 35 to 44, 45 to 54, and 55 to 66 years of age) and adjusted for comorbidities and socioeconomic variables. Permanent work-related disability pensions were granted to 4,415 individuals with VTE and 9,237 comparison cohort members (incidence rates = 17.8 and 6.2 per 1,000 person-years, respectively). VTE was associated with a 3-fold (HR 3.0, 95% CI: 2.8 to 3.1) higher risk of receiving a disability pension. Adjustments for socioeconomic status and comorbidities such as cancer and cardiovascular diseases reduced the estimate (HR 2.3, 95% CI: 2.2 to 2.4). The risk of disability pension receipt was slightly higher in men than in women (HR 2.5, 95% CI: 2.3 to 2.6 versus HR 2.1, 95% CI: 2.0 to 2.3). As this study is based on medical and administrative registers, information on post-VTE care, individual health behavior, and workplace factors linked to disability pension in the general population is lacking. Furthermore, as disability pension schemes vary, our results might not be directly generalizable to other countries or time periods.
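
An incidence rate per 1,000 person-years is events divided by person-years, times 1,000. As a back-of-envelope check, inverting the formula with the abstract's own figures gives the implied follow-up time; the code below is illustrative arithmetic, not the study's analysis.

```python
# Incidence rate per 1,000 person-years = events / person-years * 1,000.
def rate_per_1000_py(events: int, person_years: float) -> float:
    return events / person_years * 1_000

# Inverting the formula with the abstract's figures (4,415 pensions among
# VTE patients at 17.8 per 1,000 PY) implies ~248,000 person-years.
implied_py = 4_415 / 17.8 * 1_000
print(f"implied follow-up: {implied_py:,.0f} person-years")
print(f"check: {rate_per_1000_py(4_415, implied_py):.1f} per 1,000 PY")
```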

Conclusions

In this study, incident VTE was associated with increased risk of subsequent permanent work-related disability, and this association was still observed after accounting for comorbidities such as cancer and cardiovascular diseases. Our results emphasize the social consequences of VTE and may help occupational and healthcare professionals to identify vulnerable individuals at risk of permanent exclusion from the labor market after a VTE event.

Building global health research capacity to address research imperatives following the COVID-19 pandemic

Tue, 31/08/2021 - 15:00

by Peter H. Kilmarx, Roger I. Glass

Peter Kilmarx and Roger Glass discuss strengthening health research capabilities as a response to the COVID-19 pandemic.

Utility of ctDNA in predicting response to neoadjuvant chemoradiotherapy and prognosis assessment in locally advanced rectal cancer: A prospective cohort study

Tue, 31/08/2021 - 15:00

by Yaqi Wang, Lifeng Yang, Hua Bao, Xiaojun Fan, Fan Xia, Juefeng Wan, Lijun Shen, Yun Guan, Hairong Bao, Xue Wu, Yang Xu, Yang Shao, Yiqun Sun, Tong Tong, Xinxiang Li, Ye Xu, Sanjun Cai, Ji Zhu, Zhen Zhang

Background

For locally advanced rectal cancer (LARC) patients who receive neoadjuvant chemoradiotherapy (nCRT), there are no reliable indicators to accurately predict pathological complete response (pCR) before surgery. For patients with clinical complete response (cCR), a “Watch and Wait” (W&W) approach can be adopted to improve quality of life. However, the W&W approach may increase the recurrence risk in patients who are judged to be cCR but have minimal residual disease (MRD). Magnetic resonance imaging (MRI) is a major tool for evaluating response to nCRT; however, its ability to predict pCR needs improvement. In this prospective cohort study, we explored the value of circulating tumor DNA (ctDNA) in combination with MRI in the prediction of pCR before surgery and investigated the utility of ctDNA in risk stratification and prognostic prediction for patients undergoing nCRT and total mesorectal excision (TME).

Methods and findings

We recruited 119 Chinese LARC patients (cT3-4/N0-2/M0; median age of 57 years; 85 males) who were treated with nCRT plus TME at Fudan University Shanghai Cancer Center (China) from February 7, 2016 to October 31, 2017. Plasma samples at baseline, during nCRT, and after surgery were collected. A total of 531 plasma samples were collected and subjected to deep targeted panel sequencing of 422 cancer-related genes. The association among ctDNA status, treatment response, and prognosis was analyzed. The performance of ctDNA alone, MRI alone, and ctDNA combined with MRI was evaluated for the ability to predict pCR/non-pCR. Ranging from complete tumor regression (pathological tumor regression grade 0; pTRG0) to poor regression (pTRG3), the ctDNA clearance rate during nCRT showed a significant decreasing trend (95.7%, 77.8%, 71.1%, and 66.7% in pTRG 0, 1, 2, and 3 groups, respectively, P = 0.008), while the detection rate of acquired mutations in ctDNA showed an increasing trend (3.8%, 8.3%, 19.2%, and 23.1% in pTRG 0, 1, 2, and 3 groups, respectively, P = 0.02). Univariable logistic regression showed that ctDNA clearance was associated with a low probability of non-pCR (odds ratio = 0.11, 95% confidence interval [95% CI] = 0.01 to 0.6, P = 0.04). A risk score predictive model, which incorporated both ctDNA (i.e., features of baseline ctDNA, ctDNA clearance, and acquired mutation status) and MRI tumor regression grade (mrTRG), was developed and demonstrated improved performance in predicting pCR/non-pCR (area under the curve [AUC] = 0.886, 95% CI = 0.810 to 0.962) compared with models derived from only ctDNA (AUC = 0.818, 95% CI = 0.725 to 0.912) or only mrTRG (AUC = 0.729, 95% CI = 0.641 to 0.816). The detection of potential colorectal cancer (CRC) driver genes in ctDNA after nCRT indicated a significantly worse recurrence-free survival (RFS) (hazard ratio [HR] = 9.29, 95% CI = 3.74 to 23.10, P < 0.001). Patients with detectable driver mutations and a positive high-risk feature (HR_feature) after surgery had the highest recurrence risk (HR = 90.29, 95% CI = 17.01 to 479.26, P < 0.001). Limitations include the relatively small sample size, lack of independent external validation, no serial ctDNA testing after surgery, and a relatively short follow-up period.
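
A risk score of this kind combines ctDNA features with the mrTRG grade in a logistic regression whose predicted probabilities (or rescaled coefficients) form the score. A sketch of that construction on synthetic data follows; the feature encodings and effect sizes are assumptions, not the study's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 119

# Synthetic stand-ins for the model inputs: ctDNA clearance (0/1),
# acquired mutations (0/1), baseline ctDNA detected (0/1), mrTRG (0-3).
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 4, n),
])
# Illustrative outcome rule: 1 = non-pCR (NOT the study's data).
y = (0.8 * (1 - X[:, 0]) + 0.6 * X[:, 1] + 0.3 * X[:, 3]
     + rng.normal(0, 1, n)) > 1.0

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]  # combined ctDNA + mrTRG risk score
print(f"AUC on synthetic data: {roc_auc_score(y, risk):.2f}")
```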

Conclusions

The model combining ctDNA and MRI improved the predictive performance compared with the models derived from individual information, and combining ctDNA with HR_feature can stratify patients with a high risk of recurrence. Therefore, ctDNA can supplement MRI to better predict nCRT response, and it could potentially help patient selection for nonoperative management and guide the treatment strategy for those with different recurrence risks.

Cell-free DNA ultra-low-pass whole genome sequencing to distinguish malignant peripheral nerve sheath tumor (MPNST) from its benign precursor lesion: A cross-sectional study

Tue, 31/08/2021 - 15:00

by Jeffrey J. Szymanski, R. Taylor Sundby, Paul A. Jones, Divya Srihari, Noah Earland, Peter K. Harris, Wenjia Feng, Faridi Qaium, Haiyan Lei, David Roberts, Michele Landeau, Jamie Bell, Yi Huang, Leah Hoffman, Melissa Spencer, Matthew B. Spraker, Li Ding, Brigitte C. Widemann, Jack F. Shern, Angela C. Hirbe, Aadel A. Chaudhuri

Background

The leading cause of mortality for patients with the neurofibromatosis type 1 (NF1) cancer predisposition syndrome is the development of malignant peripheral nerve sheath tumor (MPNST), an aggressive soft tissue sarcoma. In the setting of NF1, this cancer type frequently arises from within its common and benign precursor, plexiform neurofibroma (PN). Transformation from PN to MPNST is challenging to diagnose because the two are difficult to distinguish on cross-sectional imaging and intralesional heterogeneity leads to biopsy sampling errors.

Methods and findings

This multi-institutional study from the National Cancer Institute and Washington University in St. Louis used fragment size analysis and ultra-low-pass whole genome sequencing (ULP-WGS) of plasma cell-free DNA (cfDNA) to distinguish between MPNST and PN in patients with NF1. Following in silico enrichment for short cfDNA fragments and copy number analysis to estimate the fraction of plasma cfDNA originating from tumor (tumor fraction), we developed a noninvasive classifier that differentiates MPNST from PN with 86% pretreatment accuracy (91% specificity, 75% sensitivity) and 89% accuracy on serial analysis (91% specificity, 83% sensitivity). Healthy controls without NF1 (participants = 16, plasma samples = 16), PN (participants = 23, plasma samples = 23), and MPNST (participants = 14, plasma samples = 46) cohorts showed significant differences in plasma tumor fraction (P = 0.001) as well as cfDNA fragment length (P < 0.001), with MPNST samples harboring shorter fragments and being enriched for tumor-derived cfDNA relative to PN and healthy controls. No other covariates were significant on multivariate logistic regression. Mutational analysis demonstrated focal NF1 copy number loss in PN and MPNST patient plasma but not in healthy controls. Greater genomic instability, including alterations associated with malignant transformation (focal copy number gains in chromosome arms 1q, 7p, 8q, 9q, and 17q; focal copy number losses in SUZ12, SMARCA2, CDKN2A/B, and chromosome arms 6p and 9p), was more prominently observed in MPNST plasma. Furthermore, the sum of longest tumor diameters (SLD) visualized by cross-sectional imaging correlated significantly with paired tumor fractions in plasma from MPNST patients (r = 0.39, P = 0.024). On serial analysis, tumor fraction levels in plasma dynamically correlated with treatment response and with minimal residual disease (MRD) detection before relapse. Study limitations include a modest MPNST sample size, despite accrual from 2 major referral centers for this rare malignancy, and a lack of uniform treatment and imaging protocols, reflecting a real-world cohort.
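
In silico enrichment for short fragments amounts to filtering reads by fragment length before copy number estimation. A sketch follows; the 90–150 bp window is an illustrative assumption, not the study's parameter.

```python
# In silico size selection: keep cfDNA fragments within a short-length
# window before estimating tumor fraction from copy number profiles.
# The 90-150 bp bounds are illustrative assumptions.
def size_select(fragment_lengths: list[int],
                lo: int = 90, hi: int = 150) -> list[int]:
    return [length for length in fragment_lengths if lo <= length <= hi]

fragments = [105, 134, 167, 180, 142, 98, 166, 149, 201]
short = size_select(fragments)
print(f"kept {len(short)}/{len(fragments)} fragments for copy number analysis")
```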

Conclusions

Tumor fraction levels derived from cfDNA fragment size and copy number alteration analysis of plasma cfDNA using ULP-WGS significantly correlated with MPNST tumor burden, accurately distinguished MPNST from its benign PN precursor, and dynamically correlated with treatment response. In the future, our findings could form the basis for improved early cancer detection and monitoring in high-risk cancer-predisposed populations.

Urine tumor DNA detection of minimal residual disease in muscle-invasive bladder cancer treated with curative-intent radical cystectomy: A cohort study

Tue, 31/08/2021 - 15:00

by Pradeep S. Chauhan, Kevin Chen, Ramandeep K. Babbra, Wenjia Feng, Nadja Pejovic, Armaan Nallicheri, Peter K. Harris, Katherine Dienstbach, Andrew Atkocius, Lenon Maguire, Faridi Qaium, Jeffrey J. Szymanski, Brian C. Baumann, Li Ding, Dengfeng Cao, Melissa A. Reimers, Eric H. Kim, Zachary L. Smith, Vivek K. Arora, Aadel A. Chaudhuri

Background

The standard of care treatment for muscle-invasive bladder cancer (MIBC) is radical cystectomy, which is typically preceded by neoadjuvant chemotherapy. However, the inability to assess minimal residual disease (MRD) noninvasively limits our ability to offer bladder-sparing treatment. Here, we sought to develop a liquid biopsy solution via urine tumor DNA (utDNA) analysis.

Methods and findings

We applied urine Cancer Personalized Profiling by Deep Sequencing (uCAPP-Seq), a targeted next-generation sequencing (NGS) method for detecting utDNA, to urine cell-free DNA (cfDNA) samples acquired between April 2019 and November 2020 on the day of curative-intent radical cystectomy from 42 patients with localized bladder cancer. The average age of patients was 69 years (range: 50 to 86), of whom 76% (32/42) were male, 64% (27/42) were smokers, and 76% (32/42) had a confirmed diagnosis of MIBC. Among MIBC patients, 59% (19/32) received neoadjuvant chemotherapy. utDNA variant calling was performed noninvasively without prior sequencing of tumor tissue. The overall utDNA level for each patient was represented by the non-silent mutation with the highest variant allele fraction after removing germline variants. Urine was similarly analyzed from 15 healthy adults. utDNA analysis revealed a median utDNA level of 0% in healthy adults and 2.4% in bladder cancer patients. When patients were classified as those who had residual disease detected in their surgical sample (n = 16) compared to those who achieved a pathologic complete response (pCR; n = 26), median utDNA levels were 4.3% vs. 0%, respectively (p = 0.002). Using an optimal utDNA threshold to define MRD detection, positive utDNA MRD detection was highly correlated with the absence of pCR (p < 0.001), with a sensitivity of 81% and specificity of 81%. Leave-one-out cross-validation applied to the prediction of pathologic response based on utDNA MRD detection in our cohort yielded a highly significant accuracy of 81% (p = 0.007). Moreover, utDNA MRD–positive patients exhibited significantly worse progression-free survival (PFS; HR = 7.4; 95% CI: 1.4–38.9; p = 0.02) compared to utDNA MRD–negative patients. Concordance between urine- and tumor-derived mutations, determined in 5 MIBC patients, was 85%. Tumor mutational burden (TMB) in utDNA MRD–positive patients was inferred from the number of non-silent mutations detected in urine cfDNA by applying a linear relationship derived from The Cancer Genome Atlas (TCGA) whole exome sequencing of 409 MIBC tumors. We suggest that about 58% of these patients with high inferred TMB might have been candidates for treatment with early immune checkpoint blockade. Study limitations included an analysis restricted to single-nucleotide variants (SNVs), survival differences diminished by surgery, and a low number of DNA damage response (DDR) mutations detected after neoadjuvant chemotherapy at the MRD time point.
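
The per-patient utDNA definition above reduces to a small filter-and-max rule. A sketch follows; the variant record structure is illustrative, and the abstract does not state the optimal MRD threshold, so none is hard-coded here.

```python
# Per-patient utDNA level as defined in the abstract: the highest variant
# allele fraction among non-silent mutations after removing germline calls.
def utdna_level(variants: list[dict]) -> float:
    vafs = [v["vaf"] for v in variants
            if not v["germline"] and not v["silent"]]
    return max(vafs, default=0.0)

calls = [
    {"vaf": 0.024, "germline": False, "silent": False},
    {"vaf": 0.480, "germline": True,  "silent": False},  # germline: drop
    {"vaf": 0.011, "germline": False, "silent": True},   # silent: drop
]
level = utdna_level(calls)
print(f"utDNA level = {level:.1%}; MRD-positive if above a chosen threshold")
```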

Conclusions

utDNA MRD detection prior to curative-intent radical cystectomy for bladder cancer correlated significantly with pathologic response, which may help select patients for bladder-sparing treatment. utDNA MRD detection also correlated significantly with PFS. Furthermore, utDNA can be used to noninvasively infer TMB, which could facilitate personalized immunotherapy for bladder cancer in the future.