PLoS Medicine

PLOS Medicine: New Articles
A Peer-Reviewed Open-Access Journal

Selective serotonin reuptake inhibitors, and serotonin and norepinephrine reuptake inhibitors for anxiety, obsessive-compulsive, and stress disorders: A 3-level network meta-analysis

Thu, 10/06/2021 - 16:00

by Natan Pereira Gosmann, Marianna de Abreu Costa, Marianna de Barros Jaeger, Luis Souza Motta, Júlia Frozi, Lucas Spanemberg, Gisele Gus Manfro, Pim Cuijpers, Daniel Samuel Pine, Giovanni Abrahão Salum

Background

Anxiety, obsessive-compulsive, and stress-related disorders frequently co-occur, and patients often present with symptoms in several domains. Treatment involves the use of selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs), but data on comparative efficacy and acceptability are lacking. We aimed to compare the efficacy of SSRIs, SNRIs, and placebo in multiple symptom domains in patients with these diagnoses over the lifespan through a 3-level network meta-analysis.

Methods and findings

We searched for published and unpublished randomized controlled trials that aimed to assess the efficacy of SSRIs or SNRIs in participants (adults and children) with a diagnosis of any anxiety, obsessive-compulsive, or stress-related disorder in MEDLINE, PsycINFO, Embase, and Cochrane Library from inception to 23 April 2015, with an update on 11 November 2020. We supplemented electronic database searches with manual searches for published and unpublished randomized controlled trials registered in publicly accessible clinical trial registries and pharmaceutical companies’ databases. No restriction was made regarding comorbidities with any other mental disorder, participants’ age and sex, blinding of participants and researchers, date of publication, or study language. The primary outcome was the aggregate measure of internalizing symptoms of these disorders. Secondary outcomes included specific symptom domains and treatment discontinuation rate. We estimated standardized mean differences (SMDs) using a 3-level network meta-analysis with random slopes by study for medication and assessment instrument. Risk of bias appraisal was performed using the Cochrane Collaboration’s risk of bias tool. This study was registered in PROSPERO (CRD42017069090). We analyzed 469 outcome measures from 135 studies (n = 30,245). Medication (SSRI or SNRI) was more effective than placebo for the aggregate measure of internalizing symptoms (SMD −0.56, 95% CI −0.62 to −0.51, p < 0.001), for all symptom domains, and in patients from all diagnostic categories. We also found significant results when restricting to the most used assessment instrument for each diagnosis; nevertheless, this restriction led to the exclusion of 72.71% of outcome measures. Pairwise comparisons revealed only small differences between medications in efficacy and acceptability. Limitations include the moderate heterogeneity found in most outcomes and the moderate risk of bias identified in most of the trials.
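As a worked illustration of the effect-size scale used here, a minimal sketch of a standardized mean difference (Cohen's d) follows. The arm summaries are hypothetical, chosen to land near the pooled estimate of −0.56; the actual analysis additionally nests outcome measures within studies in a 3-level model.

```python
from math import sqrt

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d): difference in means over the
    pooled standard deviation. Negative values favor medication when lower
    scores indicate fewer internalizing symptoms."""
    pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical anxiety-scale summaries for one medication-placebo comparison
print(round(smd(14.0, 6.0, 100, 17.5, 6.5, 100), 2))  # → -0.56
```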

Conclusions

In this study, we observed that SSRIs and SNRIs were effective for multiple symptom domains, and in patients from all included diagnostic categories. We found minimal differences between medications concerning efficacy and acceptability. This 3-level network meta-analysis contributes robust evidence to the ongoing discussion about the true benefit of antidepressants, with a significantly larger quantity of data and higher statistical power than previous studies. The 3-level approach allowed us to properly assess the efficacy of these medications on internalizing psychopathology, avoiding potential biases related to the exclusion of information due to distinct assessment instruments, and to explore the multilevel structure of transdiagnostic efficacy.

Association between exercise habits and stroke, heart failure, and mortality in Korean patients with incident atrial fibrillation: A nationwide population-based cohort study

Tue, 08/06/2021 - 16:00

by Hyo-Jeong Ahn, So-Ryoung Lee, Eue-Keun Choi, Kyung-Do Han, Jin-Hyung Jung, Jae-Hyun Lim, Jun-Pil Yun, Soonil Kwon, Seil Oh, Gregory Y. H. Lip

Background

There is a paucity of information about cardiovascular outcomes related to exercise habit change after a new diagnosis of atrial fibrillation (AF). We investigated the association between exercise habits after a new AF diagnosis and ischemic stroke, heart failure (HF), and all-cause death.

Methods and findings

This is a nationwide population-based cohort study using data from the Korea National Health Insurance Service. A retrospective analysis was performed for 66,692 patients with newly diagnosed AF between 2010 and 2016 who underwent 2 serial health examinations within 2 years before and after their AF diagnosis. Individuals were divided into 4 categories according to performance of regular exercise, which was investigated by a self-reported questionnaire in each health examination, before and after their AF diagnosis: persistent non-exercisers (30.5%), new exercisers (17.8%), exercise dropouts (17.4%), and exercise maintainers (34.2%). The primary outcomes were incidence of ischemic stroke, HF, and all-cause death. Differences in baseline characteristics among groups were balanced considering demographics, comorbidities, medications, lifestyle behaviors, and income status. The risks of the outcomes were computed by weighted Cox proportional hazards models with inverse probability of treatment weighting (IPTW) during a mean follow-up of 3.4 ± 2.0 years. The new exerciser and exercise maintainer groups were associated with a lower risk of HF compared to the persistent non-exerciser group: the hazard ratios (HRs) (95% CIs) were 0.95 (0.90–0.99) and 0.92 (0.88–0.96), respectively (p < 0.001). Also, performing exercise any time before or after AF diagnosis was associated with a lower risk of mortality compared to persistent non-exercising: the HR (95% CI) was 0.82 (0.73–0.91) for new exercisers, 0.83 (0.74–0.93) for exercise dropouts, and 0.61 (0.55–0.67) for exercise maintainers (p < 0.001). For ischemic stroke, the estimated HRs were 10%–14% lower in the exercise groups, but the differences were not statistically significant (p = 0.057). Energy expenditure of 1,000–1,499 MET-min/wk (regular moderate exercise 170–240 min/wk) was consistently associated with a lower risk of each outcome based on a subgroup analysis of the new exerciser group.
Study limitations include recall bias due to the self-reported questionnaire and limited external generalizability to other ethnic groups.
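The weighting step can be sketched in a few lines: each patient is weighted by the inverse of the probability of the exposure they actually had, so the exercise groups become comparable on measured covariates. The propensity scores below are hypothetical; the study estimated them from demographics, comorbidities, medications, lifestyle behaviors, and income.

```python
def iptw_weights(exposed, propensity):
    """Inverse probability of treatment weights: 1/p for exposed patients,
    1/(1 - p) for unexposed, where p is the estimated propensity score."""
    return [1 / p if e else 1 / (1 - p) for e, p in zip(exposed, propensity)]

# Hypothetical propensity scores for being an exercise maintainer
weights = iptw_weights([1, 1, 0, 0], [0.8, 0.4, 0.5, 0.2])
print([round(w, 2) for w in weights])  # → [1.25, 2.5, 2.0, 1.25]
```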

Conclusions

Initiating or continuing regular exercise after AF diagnosis was associated with lower risks of HF and mortality. The promotion of exercise might reduce the future risk of adverse outcomes in patients with AF.

Ending AIDS as a public health threat by 2030: Time to reset targets for 2025

Tue, 08/06/2021 - 16:00

by Paul R. De Lay, Adèle Benzaken, Quarraisha Abdool Karim, Sani Aliyu, Carolyn Amole, George Ayala, Kalipso Chalkidou, Judy Chang, Michaela Clayton, Aleny Couto, Carl Dieffenbach, Mark Dybul, Wafaa El Sadr, Marelize Gorgens, Daniel Low-Beer, Smail Mesbah, Jorge Saveedra, Petchsri Sirinirund, John Stover, Omar Syarif, Aditia Taslim, Safiatou Thiam, Lucy Wanjiku Njenga, Peter D. Ghys, Jose Antonio Izazola-Licea, Luisa Frescura, Erik Lamontagne, Peter Godfrey-Faussett, Christopher Fontaine, Iris Semini, Shannon Hader

Paul De Lay and co-authors introduce a Collection on the design of targets for ending the AIDS epidemic.

Optimal protamine dosing after cardiopulmonary bypass: The PRODOSE adaptive randomised controlled trial

Mon, 07/06/2021 - 16:00

by Lachlan F. Miles, Christiana Burt, Joseph Arrowsmith, Mikel A. McKie, Sofia S. Villar, Pooveshnie Govender, Ruth Shaylor, Zihui Tan, Ravi De Silva, Florian Falter

Background

The dose of protamine required following cardiopulmonary bypass (CPB) is often determined by the dose of heparin required pre-CPB, expressed as a fixed ratio. Dosing based on mathematical models of heparin clearance is postulated to improve protamine dosing precision and coagulation. We hypothesised that protamine dosing based on a 2-compartment model would improve thromboelastography (TEG) parameters and reduce the dose of protamine administered, relative to a fixed ratio.
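The fixed-ratio arm is simple arithmetic, sketched below with a hypothetical heparin requirement; the model arm instead infers residual circulating heparin from a 2-compartment clearance model, which is why its doses can be lower.

```python
def fixed_ratio_protamine_mg(heparin_iu):
    """Control-arm dosing: 1 mg of protamine per 100 IU of heparin given
    to establish anticoagulation before cardiopulmonary bypass (CPB)."""
    return heparin_iu / 100

# A hypothetical patient who required 28,000 IU of heparin pre-CPB
print(fixed_ratio_protamine_mg(28000))  # → 280.0
```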

Methods and findings

We undertook a 2-stage, adaptive randomised controlled trial, allocating 228 participants to receive protamine dosed according to a mathematical model of heparin clearance or a fixed ratio of 1 mg of protamine for every 100 IU of heparin required to establish anticoagulation pre-CPB. A planned, blinded interim analysis was undertaken after the recruitment of 50% of the study cohort. Following this, the randomisation ratio was adapted from 1:1 to 1:1.33 to increase recruitment to the superior arm while maintaining study power. At the conclusion of trial recruitment, we had randomised 121 patients to the intervention arm and 107 patients to the control arm. The primary endpoint was kaolin TEG r-time measured 3 minutes after protamine administration at the end of CPB. Secondary endpoints included the ratio of kaolin TEG r-time pre-CPB to the same metric following protamine administration, requirement for allogeneic red cell transfusion, intercostal catheter drainage at 4 hours postoperatively, and the requirement for reoperation due to bleeding. The trial was listed on a clinical trial registry (ClinicalTrials.gov Identifier: NCT03532594). Participants were recruited between April 2018 and August 2019. Those in the intervention/model group had a shorter mean kaolin r-time (6.58 [SD 2.50] vs. 8.08 [SD 3.98] minutes; p = 0.0016) post-CPB. The post-protamine thromboelastogram of the model group was closer to pre-CPB parameters (median pre-CPB to post-protamine kaolin r-time ratio 0.96 [IQR 0.78–1.14] vs. 0.75 [IQR 0.57–0.99]; p < 0.001). We found no evidence of a difference in median mediastinal/pleural drainage at 4 hours postoperatively (140 [IQR 75–245] vs. 135 [IQR 94–222] mL; p = 0.85) or requirement (as a binary outcome) for packed red blood cell transfusion at 24 hours postoperatively (19 [15.8%] vs. 14 [13.1%]; p = 0.69). Those in the model group had a lower median protamine dose (180 [IQR 160–210] vs. 280 [IQR 250–300] mg; p < 0.001). Important limitations of this study include an unblinded design and lack of generalisability to certain populations deliberately excluded from the study (specifically children, patients with a total body weight >120 kg, and patients requiring therapeutic hypothermia to <28°C).

Conclusions

Using a mathematical model to guide protamine dosing in patients following CPB improved TEG r-time and reduced the dose administered relative to a fixed ratio. No differences were detected in postoperative mediastinal/pleural drainage or red blood cell transfusion requirement in our cohort of low-risk patients.

Trial registration

ClinicalTrials.gov Unique identifier NCT03532594.

Human papillomavirus seroprevalence in pregnant women following gender-neutral and girls-only vaccination programs in Finland: A cross-sectional cohort analysis following a cluster randomized trial

Mon, 07/06/2021 - 16:00

by Penelope Gray, Hanna Kann, Ville N. Pimenoff, Tiina Eriksson, Tapio Luostarinen, Simopekka Vänskä, Heljä-Marja Surcel, Helena Faust, Joakim Dillner, Matti Lehtinen

Background

Cervical cancer elimination through human papillomavirus (HPV) vaccination programs requires the attainment of herd effect. Due to its uniquely high basic reproduction number, the vaccination coverage required to achieve herd effect against HPV type 16 exceeds what is attainable in most populations. We have compared how gender-neutral and girls-only vaccination strategies create herd effect against HPV16 under moderate vaccination coverage achieved in a population-based, community-randomized trial.

Methods and findings

In 2007–2010, the 1992–1995 birth cohorts of 33 Finnish communities were randomized to receive gender-neutral HPV vaccination (Arm A), girls-only HPV vaccination (Arm B), or no HPV vaccination (Arm C) (11 communities per trial arm). HPV16/18/31/33/35/45 seroprevalence differences between the pre-vaccination era (2005–2010) and post-vaccination era (2011–2016) were compared between all 8,022 unvaccinated women <23 years old and resident in the 33 communities during 2005–2016 (2,657, 2,691, and 2,674 in Arms A, B, and C, respectively). Post- versus pre-vaccination-era HPV seroprevalence ratios (PRs) were compared by arm. Possible outcome misclassification was quantified via probabilistic bias analysis. An HPV16 and HPV18 seroprevalence reduction was observed post-vaccination in the gender-neutral vaccination arm in the entire study population (PR16 = 0.64, 95% CI 0.10–0.85; PR18 = 0.72, 95% CI 0.22–0.96) and for HPV16 also in the herpes simplex virus type 2 seropositive core group (PR16 = 0.64, 95% CI 0.50–0.81). Observed reductions in HPV31/33/35/45 seroprevalence (PR31/33/35/45 = 0.88, 95% CI 0.81–0.97) were replicated in Arm C (PR31/33/35/45 = 0.79, 95% CI 0.69–0.90).

Conclusions

In this study, we observed a herd effect against HPV16/18 only after gender-neutral vaccination. With moderate vaccination coverage, a gender-neutral vaccination strategy can therefore facilitate control of even HPV16. Our findings may have limited transportability to other levels of vaccination coverage.

Trial registration

ClinicalTrials.gov number NCT00534638, https://clinicaltrials.gov/ct2/show/NCT00534638.

Cervical intraepithelial neoplasia and the risk of spontaneous preterm birth: A Dutch population-based cohort study with 45,259 pregnancy outcomes

Fri, 04/06/2021 - 16:00

by Diede L. Loopik, Joris van Drongelen, Ruud L. M. Bekkers, Quirinus J. M. Voorham, Willem J. G. Melchers, Leon F. A. G. Massuger, Folkert J. van Kemenade, Albert G. Siebers

Background

Excisional procedures of cervical intraepithelial neoplasia (CIN) may increase the risk of preterm birth. It is unknown whether this increased risk is due to the excision procedure itself, to the underlying CIN, or to secondary risk factors that are associated with both preterm birth and CIN. The aim of this study is to assess the risk of spontaneous preterm birth in women with treated and untreated CIN and examine possible associations by making a distinction between the excised volume of cervical tissue and having cervical disease.

Methods and findings

This Dutch population-based observational cohort study identified women aged 29 to 41 years with CIN between 2005 and 2015 from the Dutch pathology registry (PALGA) and frequency matched them with a control group without any cervical abnormality based on age at and year of pathology outcome (i.e., CIN or normal cytology) and urbanization (<100,000 inhabitants or ≥100,000 inhabitants). All their 45,259 subsequent singleton pregnancies with a gestational age ≥16 weeks between 2010 and 2017 were identified from the Dutch perinatal database (Perined). Nineteen potential confounders for preterm birth were identified. Adjusted odds ratios (ORs) were calculated for preterm birth comparing the 3 different groups of women: (1) women without CIN diagnosis; (2) women with untreated CIN; and (3) women with treated CIN prior to each childbirth. In total, 29,907, 5,940, and 9,412 pregnancies were included in the control, untreated CIN, and treated CIN group, respectively. The control group showed a 4.8% (1,002/20,969) proportion of spontaneous preterm birth, which increased to 6.9% (271/3,940) in the untreated CIN group, 9.5% (600/6,315) in the treated CIN group, and 15.6% (50/321) in the group with multiple treatments. Women with untreated CIN had 1.38 times greater odds of preterm birth compared to women without CIN (95% confidence interval (CI) 1.19 to 1.60; P < 0.001). For women with treated CIN, the odds were 2.07 times greater than in the control group (95% CI 1.85 to 2.33; P < 0.001). Treated women had 1.51 times greater odds of preterm birth compared to women with untreated CIN (95% CI 1.29 to 1.76; P < 0.001). Independent of cervical disease, a volume excised from the cervix of 0.5 to 0.9 cc increased the odds of preterm birth 2.20 times (37/379 versus 1,002/20,969; 95% CI 1.52 to 3.20; P < 0.001). These odds increased further, to 3.13 times and 5.93 times, for women with an excised volume of 4 to 8.9 cc (90/724 versus 1,002/20,969; 95% CI 2.44 to 4.01; P < 0.001) and ≥9 cc (30/139 versus 1,002/20,969; 95% CI 3.86 to 9.13; P < 0.001), respectively. Limitations of the study include the retrospective nature, lack of sufficient information to calculate odds of preterm birth <24 weeks, and that the excised volume could only be calculated for a select group of women.
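The unadjusted odds ratio implied by the raw counts can be checked directly; it differs from the reported 1.38 because the published estimate adjusts for the 19 confounders.

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio of an event in group A versus group B."""
    return (events_a / (total_a - events_a)) / (events_b / (total_b - events_b))

# Spontaneous preterm birth: untreated CIN (271/3,940) vs. controls (1,002/20,969)
print(round(odds_ratio(271, 3940, 1002, 20969), 2))  # → 1.47
```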

Conclusions

In this study, we observed a strong association between preterm birth and an excised cervical tissue volume of ≥0.5 cc, regardless of the severity of CIN. Caution should be taken when performing excisional treatment in women of reproductive age, and prudence is warranted when multiple biopsies are taken. Women of reproductive age with a history of multiple biopsies or excisional treatment for CIN may benefit from close surveillance during pregnancy.

Associations of obesity and malnutrition with cardiac remodeling and cardiovascular outcomes in Asian adults: A cohort study

Tue, 01/06/2021 - 16:00

by Shih-Chieh Chien, Chanchal Chandramouli, Chi-In Lo, Chao-Feng Lin, Kuo-Tzu Sung, Wen-Hung Huang, Yau-Huei Lai, Chun-Ho Yun, Cheng-Huang Su, Hung-I Yeh, Ta-Chuan Hung, Chung-Lieh Hung, Carolyn S. P. Lam

Background

Obesity, a known risk factor for cardiovascular disease and heart failure (HF), is associated with adverse cardiac remodeling in the general population. Little is known about how nutritional status modifies the relationship between obesity and outcomes. We aimed to investigate the association of obesity and nutritional status with clinical characteristics, echocardiographic changes, and clinical outcomes in the general community.

Methods and findings

We examined 5,300 consecutive asymptomatic Asian participants who were prospectively recruited in a cardiovascular health screening program (mean age 49.6 ± 11.4 years, 64.8% male) between June 2009 and December 2012. Clinical and echocardiographic characteristics were described in participants, stratified by combined subgroups of obesity and nutritional status. Obesity was indexed by body mass index (BMI) (low, ≤25 kg/m2 [lean]; high, >25 kg/m2 [obese]) (WHO-recommended Asian cutoffs). Nutritional status was defined primarily by serum albumin (SA) concentration (low, <45 g/L [malnourished]; high, ≥45 g/L [well-nourished]), and secondarily by the prognostic nutritional index (PNI) and Global Leadership Initiative on Malnutrition (GLIM) criteria. Cox proportional hazards models were used to examine a 1-year composite outcome of hospitalization for HF or all-cause mortality while adjusting for age, sex, and other clinical confounders. Our community-based cohort consisted of 2,096 (39.0%) lean–well-nourished (low BMI, high SA), 1,369 (25.8%) obese–well-nourished (high BMI, high SA), 1,154 (21.8%) lean–malnourished (low BMI, low SA), and 681 (12.8%) obese–malnourished (high BMI, low SA) individuals. Obese–malnourished participants were on average older (54.5 ± 11.4 years) and more often women (41%), with a higher mean waist circumference (91.7 ± 8.8 cm), the highest percentage of body fat (32%), and the highest prevalence of hypertension (32%), diabetes (12%), and history of cardiovascular disease (11%), compared to all other subgroups (all p < 0.001). N-terminal pro B-type natriuretic peptide (NT-proBNP) levels were substantially increased in the malnourished (versus well-nourished) groups, to a similar extent in lean (70.7 ± 177.3 versus 36.8 ± 40.4 pg/mL) and obese (73.1 ± 216.8 versus 33.2 ± 40.8 pg/mL) (p < 0.001 in both) participants.
The obese–malnourished (high BMI, low SA) group also had greater left ventricular remodeling (left ventricular mass index, 44.2 ± 1.52 versus 33.8 ± 8.28 g/m2; relative wall thickness 0.39 ± 0.05 versus 0.38 ± 0.06) and worse diastolic function (TDI-e′ 7.97 ± 2.16 versus 9.87 ± 2.47 cm/s; E/e′ 9.19 ± 3.01 versus 7.36 ± 2.31; left atrial volume index 19.5 ± 7.66 versus 14.9 ± 5.49 mL/m2) compared to the lean–well-nourished (low BMI, high SA) group, as well as all other subgroups (p < 0.001 for all). Over a median 3.6 years (interquartile range 2.5 to 4.8 years) of follow-up, the obese–malnourished group had the highest multivariable-adjusted risk of the composite outcome (hazard ratio [HR] 2.49, 95% CI 1.43 to 4.34, p = 0.001), followed by the lean–malnourished (HR 1.78, 95% CI 1.04 to 3.04, p = 0.034) and obese–well-nourished (HR 1.41, 95% CI 0.77 to 2.58, p = 0.27) groups (with the lean–well-nourished group as reference). Results were similar when indexed by other anthropometric indices (waist circumference and body fat) and other measures of nutritional status (PNI and GLIM criteria). Potential selection bias and residual confounding were the main limitations of the study.

Conclusions

In our cohort study among asymptomatic community-based adults in Taiwan, we found that obese individuals with poor nutritional status have the highest comorbidity burden, the most adverse cardiac remodeling, and the least favorable composite outcome.

Integrated treatment of hepatitis C virus infection among people who inject drugs: A multicenter randomized controlled trial (INTRO-HCV)

Tue, 01/06/2021 - 16:00

by Lars T. Fadnes, Christer Frode Aas, Jørn Henrik Vold, Rafael Alexander Leiva, Christian Ohldieck, Fatemeh Chalabianloo, Svetlana Skurtveit, Ole Jørgen Lygren, Olav Dalgård, Peter Vickerman, Håvard Midgard, Else-Marie Løberg, Kjell Arne Johansson, for the INTRO-HCV Study Group

Background

The standard pathways of testing and treatment for hepatitis C virus (HCV) infection in tertiary healthcare are not easily accessed by people who inject drugs (PWID). The aim of this study was to evaluate the efficacy of integrated treatment of chronic HCV infection among PWID.

Methods and findings

INTRO-HCV is a multicenter, randomized controlled clinical trial. Participants recruited from opioid agonist therapy (OAT) and community care clinics in Norway over 2017 to 2019 were randomly assigned 1:1 to the 2 treatment approaches. Integrated treatment was delivered by multidisciplinary teams at opioid agonist treatment clinics or community care centers (CCCs) for people with substance use disorders. This included on-site testing for HCV, liver fibrosis assessment, counseling, treatment, and posttreatment follow-up. Standard treatment was delivered in hospital outpatient clinics. Oral direct-acting antiviral (DAA) medications were administered in both arms. The study was not completely blinded. The primary outcomes were time-to-treatment initiation and sustained virologic response (SVR), defined as undetectable HCV RNA 12 weeks after treatment completion, analyzed with intention to treat, and presented as hazard ratio (HR) and odds ratio (OR) with 95% confidence intervals. Among 298 included participants, 150 were randomized to standard treatment, of whom 116/150 (77%) initiated treatment, with 108/150 (72%) initiating within 1 year of referral. Among the 148 randomized to integrated care, 145/148 (98%) initiated treatment, with 141/148 (95%) initiating within 1 year of referral. The HR for the time to initiating treatment in the integrated arm was 2.2 (1.7 to 2.9) compared to standard treatment. SVR was confirmed in 123 (85% of initiated/83% of all) for integrated treatment compared to 96 (83% of initiated/64% of all) for the standard treatment (OR among treated: 1.5 [0.8 to 2.9]; among all: 2.8 [1.6 to 4.8]). No severe adverse events were linked to the treatment.
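As a quick check, the intention-to-treat odds ratio of 2.8 can be reproduced from the counts given in the abstract:

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds ratio of achieving SVR in group A versus group B."""
    return (events_a / (total_a - events_a)) / (events_b / (total_b - events_b))

# SVR among all randomized: 123/148 integrated vs. 96/150 standard treatment
print(round(odds_ratio(123, 148, 96, 150), 1))  # → 2.8
```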

Conclusions

Integrated treatment for HCV in PWID was superior to standard treatment in terms of time-to-treatment initiation, and subsequently, more people achieved SVR. Among those who initiated treatment, the SVR rates were comparable. Scaling up of integrated treatment models could be an important tool for elimination of HCV.

Trial registration

ClinicalTrials.gov NCT03155906

Association between industry payments and prescriptions of long-acting insulin: An observational study with propensity score matching

Tue, 01/06/2021 - 16:00

by Kosuke Inoue, Yusuke Tsugawa, Carol M. Mangione, O. Kenrik Duru

Background

The rapidly increased spending on insulin is a major public health issue in the United States. Industry marketing might be one of the upstream determinants of physicians’ prescription of long-acting insulin—the most commonly used and costly type of insulin, but the evidence is lacking. We therefore aimed to investigate the association between industry payments to physicians and subsequent prescriptions of long-acting insulin.

Methods and findings

Using the databases of Open Payments and Medicare Part D, we examined the association between the receipt of industry payments for long-acting insulin in 2016 and (1) the number of claims; (2) the costs paid for all claims; and (3) the costs per claim of long-acting insulin in 2017. We also examined the association between the receipt of payments and the change in these outcomes from 2016 to 2017. We employed propensity score matching to adjust for the physician-level characteristics (sex, years in practice, specialty, and medical school attended). Among 145,587 eligible physicians treating Medicare beneficiaries, 51,851 physicians received industry payments for long-acting insulin worth $22.3 million. In the propensity score–matched analysis including 102,590 physicians, we found that physicians who received the payments prescribed a higher number of claims (adjusted difference, 57.8; 95% CI, 55.8 to 59.7), higher costs for total claims (adjusted difference, +$22,111; 95% CI, $21,387 to $22,836), and higher costs per claim (adjusted difference, +$71.1; 95% CI, $69.0 to $73.2) of long-acting insulin, compared with physicians who did not receive the payments. The association was also found for changes in these outcomes from 2016 to 2017. Limitations to our study include limited generalizability, confounding, and possible reverse causation.
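A minimal sketch of the matching idea, assuming greedy 1:1 nearest-neighbor matching on the propensity score with a caliper; the abstract does not state the exact matching algorithm, so treat this as illustrative only.

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on the propensity score.
    Returns (treated_index, control_index) pairs whose scores differ by
    no more than the caliper; each control is used at most once."""
    available = dict(enumerate(control_ps))
    pairs = []
    for i, p in enumerate(treated_ps):
        if not available:
            break
        j = min(available, key=lambda k: abs(available[k] - p))
        if abs(available[j] - p) <= caliper:
            pairs.append((i, j))
            del available[j]
    return pairs

# Hypothetical scores: payment recipients vs. candidate non-recipients
print(greedy_match([0.30, 0.70], [0.28, 0.90, 0.72]))  # → [(0, 0), (1, 2)]
```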

Conclusions

Industry marketing payments to physicians for long-acting insulin were associated with the physicians’ prescriptions and costs of long-acting insulin in the subsequent year. Future research is needed to assess whether policy interventions on physician–industry financial relationships will help to ensure appropriate prescriptions and limit overall costs of this essential drug for diabetes care.

Inequities in access to primary care among opioid recipients in Ontario, Canada: A population-based cohort study

Tue, 01/06/2021 - 16:00

by Tara Gomes, Tonya J. Campbell, Diana Martins, J. Michael Paterson, Laura Robertson, David N. Juurlink, Muhammad Mamdani, Richard H. Glazier

Background

Stigma and high-care needs can present barriers to the provision of high-quality primary care for people with opioid use disorder (OUD) and those prescribed opioids for chronic pain. We explored the likelihood of securing a new primary care provider (PCP) among people with varying histories of opioid use who had recently lost access to their PCP.

Methods and findings

We conducted a retrospective cohort study using linked administrative data among residents of Ontario, Canada whose enrolment with a physician practicing in a primary care enrolment model (PEM) was terminated between January 2016 and December 2017. We assigned individuals to 3 groups based upon their opioid use on the date enrolment ended: long-term opioid pain therapy (OPT), opioid agonist therapy (OAT), or no opioid. We fit multivariable models assessing the primary outcome of primary care reattachment within 1 year, adjusting for demographic characteristics, clinical comorbidities, and health services utilization. Secondary outcomes included rates of emergency department (ED) visits and opioid toxicity events. Among 154,970 Ontarians who lost their PCP, 1,727 (1.1%) were OAT recipients, 3,644 (2.4%) were receiving long-term OPT, and 149,599 (96.5%) had no recent prescription opioid exposure. In general, OAT recipients were younger (median age 36 years) than those receiving long-term OPT (59 years) and those with no recent prescription opioid exposure (44 years). In all exposure groups, the majority of individuals had their enrolment terminated by their physician (range 78.1% to 88.8%). In the primary analysis, as compared to those not receiving opioids, OAT recipients were significantly less likely to find a PCP within 1 year (adjusted hazard ratio [aHR] 0.55, 95% confidence interval [CI] 0.50 to 0.61, p < 0.0001). We observed no significant difference between long-term OPT and opioid-unexposed individuals (aHR 0.96; 95% CI 0.92 to 1.01, p = 0.12). In our secondary analysis comparing the period of PCP loss to the year prior, we found that rates of ED visits were elevated among people not receiving opioids (adjusted rate ratio (aRR) 1.20, 95% CI 1.18 to 1.22, p < 0.0001) and people receiving long-term OPT (aRR 1.37, 95% CI 1.28 to 1.48, p < 0.0001).
We found no such increase among OAT recipients, and no significant increase in opioid toxicity events in the period following provider loss for any exposure group. The main limitation of our findings relates to their generalizability outside of PEMs and in jurisdictions with different financial incentives incorporated into primary care provision.

Conclusions

In this study, we observed gaps in access to primary care among people who receive prescription opioids, particularly among OAT recipients. Ongoing efforts are needed to address the stigma, discrimination, and financial disincentives that may introduce barriers to the healthcare system, and to facilitate access to high-quality, consistent primary care services for chronic pain patients and those with OUD.

Risk of prostate cancer in relatives of prostate cancer patients in Sweden: A nationwide cohort study

Tue, 01/06/2021 - 16:00

by Xing Xu, Elham Kharazmi, Yu Tian, Trasias Mukama, Kristina Sundquist, Jan Sundquist, Hermann Brenner, Mahdi Fallah

Background

Evidence-based guidance for starting ages of screening for first-degree relatives (FDRs) of patients with prostate cancer (PCa) to prevent stage III/IV or fatal PCa is lacking in current PCa screening guidelines. We aimed to provide evidence for risk-adapted starting age of screening for relatives of patients with PCa.

Methods and findings

In this register-based nationwide cohort study, all men (aged 0 to 96 years at baseline) residing in Sweden who were born after 1931, along with their fathers, were included. During the follow-up (1958 to 2015) of 6,343,727 men, 88,999 were diagnosed with stage III/IV PCa or died of PCa. The outcomes were defined as the diagnosis of stage III/IV PCa or death due to PCa, stratified by age at diagnosis. Using 10-year cumulative risk curves, we calculated risk-adapted starting ages of screening for men with different constellations of family history of PCa. The 10-year cumulative risk of stage III/IV or fatal PCa in men at age 50 in the general population (a common recommended starting age of screening) was 0.2%. Men with ≥2 FDRs diagnosed with PCa reached this screening level at age 41 (95% confidence interval (CI): 39 to 44), i.e., 9 years earlier, when the youngest one was diagnosed before age 60; at age 43 (41 to 47), i.e., 7 years earlier, when ≥2 FDRs were diagnosed after age 59, similar to men with 1 FDR diagnosed before age 60 (41 to 45); at age 45 (44 to 46) when 1 FDR was diagnosed at age 60 to 69; and at age 47 (46 to 47) when 1 FDR was diagnosed after age 69. We also calculated risk-adapted starting ages for other benchmark screening ages, such as 45, 55, and 60 years, and compared our findings with those in the guidelines. Study limitations include the lack of genetic data, information on lifestyle, and external validation.
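The risk-adapted starting age is simply the earliest age at which a group's cumulative-risk curve crosses the general-population benchmark. The curve below is hypothetical, chosen so the 0.2% threshold is crossed at age 41 as reported for men with ≥2 affected FDRs; the study derived its curves from the register data.

```python
def risk_adapted_start(ages, cum_risk_pct, benchmark_pct):
    """Earliest age at which a risk group's 10-year cumulative risk of
    stage III/IV or fatal PCa reaches the general-population benchmark
    (0.2% at the common screening start age of 50)."""
    for age, risk in zip(ages, cum_risk_pct):
        if risk >= benchmark_pct:
            return age
    return None

# Hypothetical cumulative-risk curve (percent) for men with >=2 affected FDRs
ages = list(range(38, 51))
risks = [0.08, 0.10, 0.13, 0.21, 0.27, 0.34, 0.42, 0.51, 0.61, 0.72, 0.84, 0.97, 1.10]
print(risk_adapted_start(ages, risks, 0.2))  # → 41
```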

Conclusions

Our study provides practical information for risk-tailored starting ages of PCa screening based on nationwide cancer data with valid genealogical information. Our clinically relevant findings could be used for evidence-based personalized PCa screening guidance and supplement current PCa screening guidelines for relatives of patients with PCa.

Global economic costs due to vivax malaria and the potential impact of its radical cure: A modelling study

Tue, 01/06/2021 - 16:00

by Angela Devine, Katherine E. Battle, Niamh Meagher, Rosalind E. Howes, Saber Dini, Peter W. Gething, Julie A. Simpson, Ric N. Price, Yoel Lubell

Background

In 2017, an estimated 14 million cases of Plasmodium vivax malaria were reported from Asia, Central and South America, and the Horn of Africa. The clinical burden of vivax malaria is largely driven by its ability to form dormant liver stages (hypnozoites) that can reactivate to cause recurrent episodes of malaria. Elimination of both the blood and liver stages of the parasites (“radical cure”) is required to achieve a sustained clinical response and prevent ongoing transmission of the parasite. Novel treatment options and point-of-care diagnostics are now available to ensure that radical cure can be administered safely and effectively. We quantified the global economic cost of vivax malaria and estimated the potential cost benefit of a policy of radical cure after testing patients for glucose-6-phosphate dehydrogenase (G6PD) deficiency.

Methods and findings

Estimates of the healthcare provider and household costs due to vivax malaria were collated and combined with national case estimates for 44 endemic countries in 2017. These provider and household costs were compared with those that would be incurred under 2 scenarios for radical cure following G6PD screening: (1) complete adherence following daily supervised primaquine therapy and (2) unsupervised treatment with an assumed 40% effectiveness. A probabilistic sensitivity analysis generated credible intervals (CrIs) for the estimates. Globally, the annual cost of vivax malaria was US$359 million (95% CrI: US$222 to 563 million), attributable to 14.2 million cases of vivax malaria in 2017. From a societal perspective, adopting a policy of G6PD deficiency screening and supervision of primaquine to all eligible patients would prevent 6.1 million cases and reduce the global cost of vivax malaria to US$266 million (95% CrI: US$161 to 415 million), although healthcare provider costs would increase by US$39 million. If perfect adherence could be achieved with a single visit, then the global cost would fall further to US$225 million, equivalent to $135 million in cost savings from the baseline global costs. A policy of unsupervised primaquine reduced the cost to US$342 million (95% CrI: US$209 to 532 million) while preventing 2.1 million cases. Limitations of the study include partial availability of country-level cost data and parameter uncertainty for the proportion of patients prescribed primaquine, patient adherence to a full course of primaquine, and effectiveness of primaquine when unsupervised.
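The headline comparison is simple scenario arithmetic on the point estimates; a quick sketch (figures in millions of US$, taken from the abstract; credible intervals ignored):

```python
# Illustrative recomputation of the scenario arithmetic reported in the
# abstract (point estimates only, in millions of US$).
baseline_cost = 359           # annual global cost of vivax malaria, 2017
supervised_cost = 266         # G6PD screening + supervised primaquine
perfect_adherence_cost = 225  # single-visit radical cure, perfect adherence
unsupervised_cost = 342       # unsupervised primaquine, 40% effectiveness

savings_supervised = baseline_cost - supervised_cost
savings_perfect = baseline_cost - perfect_adherence_cost
# The abstract quotes ~US$135 million saved in the perfect-adherence scenario;
# the one-unit gap versus this integer arithmetic reflects rounding of the
# underlying non-integer point estimates.
print(savings_supervised, savings_perfect)  # 93 134
```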

Conclusions

Our modelling study highlights a substantial global economic burden of vivax malaria that could be reduced through investment in safe and effective radical cure achieved by routine screening for G6PD deficiency and supervision of treatment. Novel, low-cost interventions for improving adherence to primaquine to ensure effective radical cure and widespread access to screening for G6PD deficiency will be critical to achieving the timely global elimination of P. vivax.

Vitamin D and COVID-19 susceptibility and severity in the COVID-19 Host Genetics Initiative: A Mendelian randomization study

Tue, 01/06/2021 - 16:00

by Guillaume Butler-Laporte, Tomoko Nakanishi, Vincent Mooser, David R. Morrison, Tala Abdullah, Olumide Adeleye, Noor Mamlouk, Nofar Kimchi, Zaman Afrasiabi, Nardin Rezk, Annarita Giliberti, Alessandra Renieri, Yiheng Chen, Sirui Zhou, Vincenzo Forgetta, J. Brent Richards

Background

Increased vitamin D levels, as reflected by 25-hydroxy vitamin D (25OHD) measurements, have been proposed to protect against COVID-19 based on in vitro, observational, and ecological studies. However, vitamin D levels are associated with many confounding variables, and thus associations described to date may not be causal. Vitamin D Mendelian randomization (MR) studies have provided results that are concordant with large-scale vitamin D randomized trials. Here, we used 2-sample MR to assess evidence supporting a causal effect of circulating 25OHD levels on COVID-19 susceptibility and severity.

Methods and findings

Genetic variants strongly associated with 25OHD levels in a genome-wide association study (GWAS) of 443,734 participants of European ancestry (including 401,460 from the UK Biobank) were used as instrumental variables. GWASs of COVID-19 susceptibility, hospitalization, and severe disease from the COVID-19 Host Genetics Initiative were used as outcome GWASs. These included up to 14,134 individuals with COVID-19, and up to 1,284,876 without COVID-19, from up to 11 countries. SARS-CoV-2 positivity was determined by laboratory testing or medical chart review. Population controls without COVID-19 were also included in the control groups for all outcomes, including hospitalization and severe disease. Analyses were restricted to individuals of European descent when possible. Using inverse-variance-weighted MR, a 1-standard-deviation increase in genetically predicted 25OHD levels on the logarithmic scale showed no significant association with COVID-19 susceptibility (odds ratio [OR] = 0.95; 95% CI: 0.84, 1.08; p = 0.44), hospitalization (OR = 1.09; 95% CI: 0.89, 1.33; p = 0.41), or severe disease (OR = 0.97; 95% CI: 0.77, 1.22; p = 0.77). We applied 6 additional meta-analytic methods and conducted sensitivity analyses after removing variants at risk of horizontal pleiotropy, obtaining similar results. These results may be limited by weak instrument bias in some analyses. Further, our results do not apply to individuals with vitamin D deficiency.
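A minimal sketch of the inverse-variance-weighted (IVW) 2-sample MR estimator used as the primary analysis, with invented summary statistics for a handful of variants (these numbers are illustrative, not from the study's GWASs):

```python
import numpy as np

# Per-variant summary statistics (made up for illustration).
beta_exposure = np.array([0.10, 0.08, 0.12, 0.05])      # SNP effects on 25OHD (SD units)
beta_outcome = np.array([0.002, -0.001, 0.004, 0.000])  # SNP effects on log-odds of COVID-19
se_outcome = np.array([0.01, 0.012, 0.009, 0.015])      # standard errors of outcome effects

# Per-variant Wald ratios and their first-order standard errors.
ratio = beta_outcome / beta_exposure
ratio_se = se_outcome / np.abs(beta_exposure)

# IVW estimate: weighted mean of the ratios, weights = 1 / se^2.
w = 1.0 / ratio_se**2
beta_ivw = np.sum(w * ratio) / np.sum(w)
se_ivw = np.sqrt(1.0 / np.sum(w))

# Exponentiate to get an OR per 1-SD increase in genetically predicted 25OHD.
odds_ratio = np.exp(beta_ivw)
print(round(float(odds_ratio), 3))
```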

Conclusions

In this 2-sample MR study, we did not observe evidence to support an association between 25OHD levels and COVID-19 susceptibility, severity, or hospitalization. Hence, vitamin D supplementation as a means of protecting against worsened COVID-19 outcomes is not supported by genetic evidence. Other therapeutic or preventative avenues should be given higher priority for COVID-19 randomized controlled trials.

Blood pressure-lowering treatment for the prevention of cardiovascular events in patients with atrial fibrillation: An individual participant data meta-analysis

Tue, 01/06/2021 - 16:00

by Ana-Catarina Pinho-Gomes, Luis Azevedo, Emma Copland, Dexter Canoy, Milad Nazarzadeh, Rema Ramakrishnan, Eivind Berge, Johan Sundström, Dipak Kotecha, Mark Woodward, Koon Teo, Barry R. Davis, John Chalmers, Carl J. Pepine, Kazem Rahimi, on behalf of the Blood Pressure Lowering Treatment Trialists’ Collaboration

Background

Randomised evidence on the efficacy of blood pressure (BP)-lowering treatment to reduce cardiovascular risk in patients with atrial fibrillation (AF) is limited. Therefore, this study aimed to compare the effects of BP-lowering drugs in patients with and without AF at baseline.

Methods and findings

The study was based on the resource provided by the Blood Pressure Lowering Treatment Trialists’ Collaboration (BPLTTC), in which individual participant data (IPD) were extracted from trials with over 1,000 patient-years of follow-up in each arm that had randomly assigned patients to different classes of BP-lowering drugs, to BP-lowering drugs versus placebo, or to more versus less intensive BP-lowering regimens. For this study, only trials that had collected information on AF status at baseline were included. The effects of BP-lowering treatment on a composite endpoint of major cardiovascular events (stroke, ischaemic heart disease, or heart failure) according to AF status at baseline were estimated using fixed-effect one-stage IPD meta-analyses based on Cox proportional hazards models stratified by trial. Furthermore, to assess whether the associations between the intensity of BP reduction and cardiovascular outcomes were similar in those with and without AF at baseline, we used meta-regression. From the full BPLTTC database, 28 trials (145,653 participants) were excluded because AF status at baseline was uncertain or unavailable. A total of 22 trials were included with 188,570 patients, of whom 13,266 (7%) had AF at baseline. Risk of bias assessment showed that 20 trials were at low risk of bias and 2 trials at moderate risk. Meta-regression showed that relative risk reductions were proportional to trial-level intensity of BP lowering in patients with and without AF at baseline. Over 4.5 years of median follow-up, a 5-mm Hg reduction in systolic BP (SBP) lowered the risk of major cardiovascular events both in patients with AF (hazard ratio [HR] 0.91, 95% confidence interval [CI] 0.83 to 1.00) and in patients without AF at baseline (HR 0.91, 95% CI 0.88 to 0.93), with no difference between subgroups. There was no evidence for heterogeneity of treatment effects by baseline SBP or drug class in patients with AF at baseline. The findings of this study need to be interpreted in light of its potential limitations, including the limited number of trials, limitations in ascertaining AF cases due to the nature of the arrhythmia, and difficulties in measuring BP in patients with AF.
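Under the log-linear dose-response assumption implicit in the meta-regression, a hazard ratio expressed per 5-mm Hg SBP reduction can be rescaled to other magnitudes of reduction. A small sketch using the abstract's point estimate (the rescaling itself is illustrative, not a result of the study):

```python
import math

def rescale_hr(hr_per_5mmhg, delta_mmhg):
    """Rescale a per-5-mm Hg hazard ratio to a delta_mmhg reduction,
    assuming effects are proportional on the log-HR scale."""
    return math.exp(math.log(hr_per_5mmhg) * delta_mmhg / 5.0)

# Implied HR for a 10-mm Hg SBP reduction, from the reported HR 0.91 per 5 mm Hg.
hr10 = rescale_hr(0.91, 10)
print(round(hr10, 2))
```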

Conclusions

In this meta-analysis, we found that BP-lowering treatment reduces the risk of major cardiovascular events similarly in individuals with and without AF. Pharmacological BP lowering for prevention of cardiovascular events should be recommended in patients with AF.

Hypertension and atrial fibrillation: Closing a virtuous circle

Tue, 01/06/2021 - 16:00

by Ying X. Gue, Gregory Y. H. Lip

Ying Gue and Gregory Lip discuss the accompanying study by Ana-Catarina Pinho-Gomes and co-workers on blood pressure lowering treatment in patients with atrial fibrillation.

Association of <i>APOE</i> ε4 genotype and lifestyle with cognitive function among Chinese adults aged 80 years and older: A cross-sectional study

Tue, 01/06/2021 - 16:00

by Xurui Jin, Wanying He, Yan Zhang, Enying Gong, Zhangming Niu, John Ji, Yaxi Li, Yi Zeng, Lijing L. Yan

Background

Apolipoprotein E (APOE) ε4 is the single most important genetic risk factor for cognitive impairment and Alzheimer disease (AD), while lifestyle factors such as smoking, drinking, diet, and physical activity also influence cognition. The goal of this study was to investigate whether the association between lifestyle and cognition varies by APOE genotype among the oldest old.

Methods and findings

We used cross-sectional data on 6,160 of the oldest old (aged 80 years or older) from the genetic substudy of the Chinese Longitudinal Healthy Longevity Survey (CLHLS), a nationwide cohort study that began in 1998 with follow-up surveys every 2–3 years. Cognitive impairment was defined as a Mini-Mental State Examination (MMSE) score less than 18. Healthy lifestyle profile was classified into 3 groups by a composite measure including smoking, alcohol consumption, dietary pattern, physical activity, and body weight. APOE genotype was categorized as APOE ε4 carriers versus noncarriers. We examined the associations of cognitive impairment with lifestyle profile and APOE genotype using multivariable logistic regressions, controlling for age, sex, education, marital status, residence, disability, and number of chronic conditions. The mean age of our study sample was 90.1 (standard deviation [SD] 7.2) years (range 80–113); 57.6% were women, and 17.5% were APOE ε4 carriers. The mean MMSE score was 21.4 (SD 9.2), and 25.0% had cognitive impairment. Compared with those with an unhealthy lifestyle, participants with intermediate and healthy lifestyle profiles had 28% (95% confidence interval [CI]: 16%–38%, P < 0.001) and 55% (95% CI: 44%–64%, P < 0.001) lower adjusted odds of cognitive impairment, respectively. Carrying the APOE ε4 allele was associated with 17% higher odds (95% CI: 1%–31%, P = 0.042) of being cognitively impaired in the adjusted model. The association between lifestyle profiles and cognitive function did not vary significantly by APOE ε4 genotype (noncarriers: 0.47 [0.37–0.60] healthy versus unhealthy; carriers: 0.33 [0.18–0.58]; P for interaction = 0.30). The main limitation was that the lifestyle measurements were self-reported and nonspecific. Generalizability is also limited because the study sample comprised the oldest old in China, who have unique characteristics, such as low body weight, compared to populations in high-income countries.
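The abstract reports effects as percent lower odds; a one-line sketch of the conversion from an adjusted odds ratio (the healthy-versus-unhealthy OR of 0.45 used below is implied by the reported 55% figure, not quoted directly in the abstract):

```python
def pct_lower_odds(odds_ratio):
    """Percent reduction in odds implied by an odds ratio below 1."""
    return (1.0 - odds_ratio) * 100

# An OR of 0.45 for cognitive impairment corresponds to 55% lower odds.
print(round(pct_lower_odds(0.45)))
```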

Conclusions

In this study, we observed that a healthier lifestyle was associated with better cognitive function among the oldest old regardless of APOE genotype. Our findings may inform the cognitive outlook for the oldest old at high genetic risk of cognitive impairment.

Effects of community-based antiretroviral therapy initiation models on HIV treatment outcomes: A systematic review and meta-analysis

Fri, 28/05/2021 - 16:00

by Ingrid Eshun-Wilson, Ajibola A. Awotiwon, Ashley Germann, Sophia A. Amankwaa, Nathan Ford, Sheree Schwartz, Stefan Baral, Elvin H. Geng

Background

Antiretroviral therapy (ART) initiation in the community and outside of a traditional health facility has the potential to improve linkage to ART, decongest health facilities, and minimize structural barriers to attending HIV services among people living with HIV (PLWH). We conducted a systematic review and meta-analysis to determine the effect of offering ART initiation in the community on HIV treatment outcomes.

Methods and findings

We searched databases between 1 January 2013 and 22 February 2021 to identify randomized controlled trials (RCTs) and observational studies that compared offering ART initiation in a community setting with offering it in a traditional health facility or an alternative community setting. We assessed risk of bias, reporting of implementation outcomes, and real-world relevance, and used Mantel–Haenszel methods to generate pooled risk ratios (RRs) and risk differences (RDs) with 95% confidence intervals. We evaluated heterogeneity qualitatively and quantitatively and used GRADE to evaluate overall evidence certainty. Searches yielded 4,035 records, resulting in 8 included studies (4 RCTs and 4 observational studies) conducted in Lesotho, South Africa, Nigeria, Uganda, Malawi, Tanzania, and Haiti, with a total of 11,196 PLWH. Five studies were conducted in general HIV populations, 2 in key populations, and 1 in adolescents. Community ART initiation strategies included community-based HIV testing coupled with ART initiation at home or at community venues; 5 studies maintained ART refills in the community, and 4 provided refills at the health facility. All studies were pragmatic, but in most cases provided additional resources. Few studies reported on implementation outcomes. All studies showed higher ART uptake in community initiation arms compared to facility initiation and refill arms (standard of care) (RR 1.73, 95% CI 1.22 to 2.45; RD 30%, 95% CI 10% to 50%; 5 studies). Retention (RR 1.43, 95% CI 1.32 to 1.54; RD 19%, 95% CI 11% to 28%; 4 studies) and viral suppression (RR 1.31, 95% CI 1.15 to 1.49; RD 15%, 95% CI 10% to 21%; 3 studies) at 12 months were also higher in the community-based ART initiation arms. Improved uptake, retention, and viral suppression with community ART initiation were seen across population subgroups, including men, adolescents, and key populations. One study reported no difference in retention and viral suppression at 2 years. There were limited data on adherence and mortality. Social harms and adverse events appeared to be minimal and similar between community care and standard of care. One study compared ART refill strategies following community ART initiation (community versus facility refills) and found no difference in viral suppression (RD −7%, 95% CI −19% to 6%) or retention at 12 months (RD −12%, 95% CI −23% to 0.3%). This systematic review was limited by the overall small number of studies, poor-quality observational data, and short-term outcomes.
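A minimal sketch of the Mantel–Haenszel pooled risk ratio named in the methods, using invented 2-arm event counts (these are not the review's data):

```python
import numpy as np

# Each row: (events_community, n_community, events_facility, n_facility),
# e.g. counts of participants achieving ART uptake in each arm of a study.
studies = np.array([
    [80, 100, 50, 100],
    [45,  60, 30,  60],
    [70,  90, 40,  80],
])

a, n1 = studies[:, 0], studies[:, 1]  # community arm: events, total
c, n2 = studies[:, 2], studies[:, 3]  # facility arm: events, total
N = n1 + n2

# Mantel-Haenszel pooled RR = sum(a * n2 / N) / sum(c * n1 / N)
rr_mh = np.sum(a * n2 / N) / np.sum(c * n1 / N)
print(round(float(rr_mh), 2))
```

The MH weights favor larger studies without requiring within-study variance estimates, which is why the method is a common fixed-effect choice for pooling binary outcomes.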

Conclusions

Based on data from a limited set of studies, community ART initiation appears to result in higher ART uptake, retention, and viral suppression at 1 year compared to facility-based ART initiation. Implementation on a wider scale necessitates broader exploration of costs, logistics, and acceptability by providers and PLWH to ensure that these effects are reproducible when delivered at scale, in different contexts, and over time.

Bile acid synthesis, modulation, and dementia: A metabolomic, transcriptomic, and pharmacoepidemiologic study

Thu, 27/05/2021 - 16:00

by Vijay R. Varma, Youjin Wang, Yang An, Sudhir Varma, Murat Bilgel, Jimit Doshi, Cristina Legido-Quigley, João C. Delgado, Anup M. Oommen, Jackson A. Roberts, Dean F. Wong, Christos Davatzikos, Susan M. Resnick, Juan C. Troncoso, Olga Pletnikova, Richard O’Brien, Eelko Hak, Brenda N. Baak, Ruth Pfeiffer, Priyanka Baloni, Siamak Mohmoudiandehkordi, Kwangsik Nho, Rima Kaddurah-Daouk, David A. Bennett, Shahinaz M. Gadalla, Madhav Thambisetty

Background

While Alzheimer disease (AD) and vascular dementia (VaD) may be accelerated by hypercholesterolemia, the mechanisms underlying this association are unclear. We tested whether dysregulation of cholesterol catabolism, through its conversion to primary bile acids (BAs), was associated with dementia pathogenesis.

Methods and findings

We used a 3-step study design to examine the role of the primary BAs, cholic acid (CA), and chenodeoxycholic acid (CDCA) as well as their principal biosynthetic precursor, 7α-hydroxycholesterol (7α-OHC), in dementia. In Step 1, we tested whether serum markers of cholesterol catabolism were associated with brain amyloid accumulation, white matter lesions (WMLs), and brain atrophy. In Step 2, we tested whether exposure to bile acid sequestrants (BAS) was associated with risk of dementia. In Step 3, we examined plausible mechanisms underlying these findings by testing whether brain levels of primary BAs and gene expression of their principal receptors are altered in AD.

Conclusions

We combined targeted metabolomics in serum with amyloid positron emission tomography (PET) and MRI of the brain and with pharmacoepidemiologic analysis to implicate dysregulation of cholesterol catabolism in dementia pathogenesis. We observed that lower serum BA concentrations, mainly in males, were associated with neuroimaging markers of dementia, and that pharmacological lowering of BA levels may be associated with a higher risk of VaD in males. We hypothesize that dysregulation of BA signaling pathways in the brain may represent a plausible biologic mechanism underlying these results. Together, our observations suggest a novel mechanism relating abnormalities in cholesterol catabolism to risk of dementia.

Evaluation of splenic accumulation and colocalization of immature reticulocytes and <i>Plasmodium vivax</i> in asymptomatic malaria: A prospective human splenectomy study

Wed, 26/05/2021 - 16:00

by Steven Kho, Labibah Qotrunnada, Leo Leonardo, Benediktus Andries, Putu A. I. Wardani, Aurelie Fricot, Benoit Henry, David Hardy, Nur I. Margyaningsih, Dwi Apriyanti, Agatha M. Puspitasari, Pak Prayoga, Leily Trianty, Enny Kenangalem, Fabrice Chretien, Valentine Brousse, Innocent Safeukui, Hernando A. del Portillo, Carmen Fernandez-Becerra, Elamaran Meibalan, Matthias Marti, Ric N. Price, Tonia Woodberry, Papa A. Ndour, Bruce M. Russell, Tsin W. Yeo, Gabriela Minigo, Rintis Noviyanti, Jeanne R. Poespoprodjo, Nurjati C. Siregar, Pierre A. Buffet, Nicholas M. Anstey

Background

A very large biomass of intact asexual-stage malaria parasites accumulates in the spleen of asymptomatic human individuals infected with Plasmodium vivax. The mechanisms underlying this intense tropism are not clear. We hypothesised that immature reticulocytes, in which P. vivax develops, may display high densities in the spleen, thereby providing a niche for parasite survival.

Methods and findings

We examined spleen tissue from 22 mostly untreated individuals naturally exposed to P. vivax and Plasmodium falciparum who were undergoing splenectomy for any clinical indication in malaria-endemic Papua, Indonesia (2015 to 2017). Infection, parasite and immature reticulocyte density, and splenic distribution were analysed by optical microscopy, flow cytometry, and molecular assays. Nine non-endemic control spleens from individuals undergoing spleno-pancreatectomy in France (2017 to 2020) were also examined for reticulocyte densities. There were no exclusion criteria or sample size considerations in either patient cohort for this demanding approach. In Indonesia, 95.5% (21/22) of splenectomy patients had asymptomatic splenic Plasmodium infection (7 P. vivax, 13 P. falciparum, and 1 mixed infection). Significant splenic accumulation of immature CD71 intermediate- and high-expressing reticulocytes was seen, with concentrations 11 times greater than in peripheral blood. Accordingly, in France, reticulocyte concentrations in the splenic effluent were higher than in peripheral blood. The greater rigidity of reticulocytes in splenic than in peripheral blood, and their higher densities in splenic cords, both suggest a mechanical retention process. Asexual-stage P. vivax-infected erythrocytes of all developmental stages accumulated in the spleen, with non-phagocytosed parasite densities 3,590 times (IQR: 2,600 to 4,130) higher than in circulating blood, and median total splenic parasite loads 81 (IQR: 14 to 205) times greater, accounting for 98.7% (IQR: 95.1% to 98.9%) of the estimated total-body P. vivax biomass. More reticulocytes were in contact with sinus lumen endothelial cells in P. vivax- than in P. falciparum-infected spleens. Histological analyses revealed that 96% of P. vivax rings/trophozoites and 46% of schizonts colocalised with 92% of immature reticulocytes in the cords and sinus lumens of the red pulp. Larger splenic cohort studies and similar investigations in untreated symptomatic malaria are warranted.

Conclusions

Immature CD71+ reticulocytes and splenic P. vivax-infected erythrocytes of all asexual stages accumulate in the same splenic compartments, suggesting the existence of a cryptic endosplenic lifecycle in chronic P. vivax infection. These findings provide insight into P. vivax-specific adaptations that have evolved to maximise survival and replication in the spleen.