Project description: Objective: To investigate the clinical features, risk factors, and underlying pathogenesis of cancer-related subarachnoid hemorrhage (SAH). Methods: Clinical data of SAH in patients with active cancer from January 2010 to December 2020 at four centers were retrospectively reviewed. Patients with active cancer but without SAH were matched to patients with both active cancer and SAH. After 1:1 propensity score matching (PSM), logistic regression was applied to identify independent risk factors for SAH in patients with active cancer. A receiver operating characteristic (ROC) curve was constructed to determine the optimal cut-off value of the joint predictive factor for cancer-related SAH. Results: A total of 82 SAH patients with active cancer and 309 patients with active cancer alone were included. Most SAH patients with cancer had poor outcomes, with 30-day mortality of 41.5% and 90-day mortality of 52.0%. PSM yielded 75 pairs of study participants. Logistic regression revealed that decreased platelet count and prolonged prothrombin time were independent risk factors for cancer-related SAH. In addition, the ROC curve of the joint predictive factor showed the largest AUC, 0.8131, with a cut-off value of 11.719, a sensitivity of 65.3%, and a specificity of 89.3%. Conclusion: Patients with cancer-related SAH often have poor outcomes. Decreased platelet count and prolonged prothrombin time are independent risk factors for cancer-related SAH, and the joint predictive factor, with a cut-off value of 11.719, may therefore serve as a novel biomarker of cancer-related SAH.
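The abstract does not state which criterion was used to pick the 11.719 cut-off; a common choice for an "optimal" ROC cut-off is Youden's J (sensitivity + specificity - 1). The sketch below is a minimal illustration of that approach on synthetic data, not the study's actual analysis; all variable names and distributions are assumed.

```python
import numpy as np

def youden_optimal_cutoff(scores, labels):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    scores: continuous predictor (e.g. a joint predictive factor);
    labels: 1 for cases (SAH), 0 for controls.
    """
    best_j, best_cut = -1.0, None
    for t in np.unique(scores):
        pred = scores >= t                       # classify as "case" at or above t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        j = tp / (tp + fn) + tn / (tn + fp) - 1  # sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, t
    return best_cut, best_j

# Synthetic data (assumed for illustration): cases score higher than controls
rng = np.random.default_rng(0)
controls = rng.normal(8, 2, 200)
cases = rng.normal(14, 2, 200)
scores = np.concatenate([controls, cases])
labels = np.concatenate([np.zeros(200, dtype=int), np.ones(200, dtype=int)])
cutoff, j = youden_optimal_cutoff(scores, labels)
```

With well-separated groups, the selected cutoff falls between the two group means, trading sensitivity against specificity much as the reported 65.3%/89.3% pair does.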
Project description: Background: Propensity score matching (PSM) is a widely embraced method in comparative effectiveness research. PSM crafts matched datasets from observational data, mimicking some attributes of randomized designs. In a valid PSM design where all baseline confounders are measured and matched, the confounders are balanced, allowing the treatment status to be regarded as if it were randomly assigned. Nevertheless, recent research has unveiled a different facet of PSM, termed "the PSM paradox": as PSM approaches exact matching by progressively pruning matched sets in order of decreasing propensity score distance, it can paradoxically lead to greater covariate imbalance, heightened model dependence, and increased bias, contrary to its intended purpose. Methods: We used analytic formulas, simulation, and the literature to demonstrate that this paradox stems from the misuse of metrics for assessing chance imbalance and bias. Results: First, matched pairs typically exhibit different covariate values despite having identical propensity scores. However, this disparity is a "chance" difference and averages to zero over a large number of matched pairs. Common distance metrics cannot capture this "chance" nature of covariate imbalance; instead, they reflect the increasing variability of chance imbalance as units are pruned and the sample size diminishes. Second, the largest estimate among numerous fitted models, reflecting researchers' uncertainty over the correct model, was used to determine statistical bias. This cherry-picking procedure ignores the most significant benefit of a matching design: reduced model dependence owing to its robustness against model misspecification bias. Conclusions: We conclude that the PSM paradox is not a legitimate concern and should not stop researchers from using PSM designs.
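The pruning mechanism behind the paradox can be made concrete with a toy 1:1 nearest-neighbor matcher: tightening the caliper (the maximum allowed propensity-score distance) prunes pairs exactly as described above. This is a minimal sketch on synthetic propensity scores, assuming greedy matching without replacement; it is not the matching algorithm used in any particular study.

```python
import numpy as np

def greedy_match(ps_treated, ps_control, caliper):
    """Greedy 1:1 nearest-neighbor matching on the propensity score.

    Pairs each treated unit with the nearest unused control whose
    propensity-score distance is within `caliper`; tighter calipers
    prune more pairs, shrinking the matched sample.
    """
    used = np.zeros(len(ps_control), dtype=bool)
    pairs = []
    for i in np.argsort(ps_treated)[::-1]:       # match highest-score units first
        dist = np.abs(ps_control - ps_treated[i])
        dist[used] = np.inf                      # each control used at most once
        j = int(np.argmin(dist))
        if dist[j] <= caliper:
            used[j] = True
            pairs.append((i, j))
    return pairs

# Synthetic scores (assumed): treated units skew high, controls skew low
rng = np.random.default_rng(1)
ps_t = rng.beta(3, 2, 100)
ps_c = rng.beta(2, 3, 300)
loose = greedy_match(ps_t, ps_c, caliper=0.10)
tight = greedy_match(ps_t, ps_c, caliper=0.01)
```

As the caliper shrinks toward exact matching, the retained sample gets smaller, which is precisely where the variability of chance imbalance grows.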
Project description: Background: Programmed cell death protein 1 (PD-1) inhibitors are commonly used worldwide for the management of non-small cell lung cancer (NSCLC). However, it remains unclear whether pembrolizumab and sintilimab, two of the most widely used PD-1 inhibitors in China, have significantly different effects in patients with NSCLC. A multicenter retrospective cohort study was designed and implemented using propensity score matching (PSM) analysis to compare the effectiveness and safety profiles of pembrolizumab and sintilimab in patients with advanced NSCLC undergoing comprehensive therapy. Methods: A total of 225 patients who received comprehensive therapy including pembrolizumab (n = 127) or sintilimab (n = 98) from 1 January to 31 December 2020 and met the eligibility criteria were included. PSM analysis (1:1) was performed to balance potential baseline confounding factors. For both treatments, Kaplan-Meier analysis and Cox regression were used to compare 1-year progression-free survival (PFS), disease control rate (DCR), objective response rate (ORR), and rates of all adverse events (AEs). Results: PSM analysis resulted in 63 matched pairs of patients. After PSM, the median PFS was 8.68 months in the sintilimab group and 9.46 months in the pembrolizumab group. The 1-year PFS showed no significant difference between the pembrolizumab and sintilimab groups before or after PSM (P = 0.873 and P = 0.574, respectively). Moreover, within the matched cohort, the pembrolizumab group had an ORR of 30.2% and a DCR of 84.1%, whereas the sintilimab group had an ORR of 41.3% and a DCR of 88.9%; neither the ORR nor the DCR differed significantly between the two groups (P = 0.248 and P = 0.629, respectively). The incidence of grade 3 or 4 treatment-related AEs was significantly higher in the pembrolizumab group than in the sintilimab group (42.9% vs. 33.3%, P = 0.043).
Multivariable Cox proportional hazards regression analysis indicated that the line of treatment and the regimen significantly influenced the PFS of patients (P < 0.05). Conclusions: This study demonstrated similar effectiveness of sintilimab and pembrolizumab in the treatment of patients with advanced NSCLC, with sintilimab potentially displaying a superior clinical safety profile. Clinical trial registration: https://www.medicalresearch.org.cn/, identifier MR4423000113.
Project description: Background/Aims: Bisphosphonates are increasingly recognized for their anti-neoplastic properties, which result from their action on the mevalonate pathway. Our primary aim was to investigate the association between bisphosphonate use and survival in patients with pancreatic cancer. Since statins also act on the mevalonate pathway, we additionally investigated the effect of combined use of bisphosphonates and statins on survival. Methods: The Surveillance, Epidemiology, and End Results registry (SEER)-Medicare linked database was used to identify patients diagnosed with pancreatic ductal adenocarcinoma (PDAC) between 2007 and 2015. Kaplan-Meier models were used to examine the association of survival with bisphosphonate use, alone and in combination with statins, within 1 year prior to the diagnosis of PDAC. Propensity score matching analysis and Cox proportional hazards models were used to determine the association of overall survival with bisphosphonate use, alone and combined with statins, after adjusting for relevant confounders such as the Charlson comorbidity index score, stage, treatment, sociodemographic characteristics, and propensity score. Results: In total, 13,639 patients with PDAC were identified, of whom 1,203 (8.82%) used bisphosphonates. There was no difference in mean survival duration between bisphosphonate users (7.27 months) and nonusers (7.25 months, p = 0.61). After adjustment for confounders, bisphosphonate use was still not associated with improved survival (hazard ratio, 1.00; 95% confidence interval, 0.93 to 1.08; p = 0.96). Combined bisphosphonate and statin use was likewise not associated with improved survival (hazard ratio, 0.97; 95% confidence interval, 0.87 to 1.07; p = 0.48) after adjustment for confounders. Conclusions: Our findings suggest that the use of bisphosphonates, whether alone or in combination with statins, does not confer a survival advantage in patients with PDAC.
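The Kaplan-Meier models referred to in several of these abstracts estimate the survival function from right-censored follow-up data. Below is a minimal, self-contained sketch of the estimator on a hypothetical five-patient cohort; the data are invented for illustration and do not come from any of the studies described.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function.

    times:  follow-up time for each patient (e.g. months);
    events: 1 if the event (death) was observed, 0 if censored.
    Returns the distinct event times and the survival probability
    just after each of them, via the product-limit formula
    S(t) = prod over event times t_i <= t of (1 - d_i / n_i).
    """
    times, events = np.asarray(times), np.asarray(events)
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        d = np.sum((times == t) & (events == 1))  # events at time t
        n = np.sum(times >= t)                    # patients still at risk at t
        if d > 0:
            surv *= 1 - d / n
            out_t.append(float(t))
            out_s.append(surv)
    return np.array(out_t), np.array(out_s)

# Hypothetical cohort: deaths at t = 1, 3, 4; censored at t = 2, 5
t, s = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

Censored patients drop out of the risk set without triggering a step in the curve, which is how the estimator avoids treating loss to follow-up as death.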
Project description: Objectives: At present, there are no established clinical guidelines for radiofrequency ablation (RFA) of peribiliary hepatocellular carcinoma (HCC). Therefore, the aim of this study was to compare the long-term outcomes of RFA for peribiliary vs. non-peribiliary HCC. Methods: This retrospective study included 282 patients with peribiliary HCC (n = 109) or non-peribiliary HCC (n = 173) who received RFA between February 2013 and May 2021. Local tumor progression (LTP), overall survival (OS), disease-free survival (DFS), and complications were compared before and after propensity score matching (PSM). Results: Before PSM, there were no significant differences in 5-year LTP rates (26.3% vs. 23.6%, p = 0.602), OS rates (56.6% vs. 68.0%, p = 0.586), or DFS rates (22.9% vs. 25.7%, p = 0.239) between the peribiliary and non-peribiliary groups. After PSM, there were likewise no significant differences in the 1-, 3-, and 5-year LTP rates (13.0%, 23.1%, and 26.3% vs. 12.1%, 25.1%, and 28.2%, respectively, p = 0.857), OS rates (97.2%, 73.5%, and 56.6% vs. 95.3%, 79.5%, and 70.6%, p = 0.727), or DFS rates (59.4%, 29.4%, and 22.9% vs. 64.2%, 33.1%, and 23.8%, p = 0.568) between the peribiliary and non-peribiliary groups. Peribiliary location was not a significant prognostic factor for LTP (p = 0.622) or OS (p = 0.587). In addition, mild intrahepatic bile duct dilatation was more frequent in the peribiliary group (9.2% vs. 2.8%, p = 0.045). Conclusion: Long-term outcomes of RFA were similar for peribiliary and non-peribiliary HCC. RFA is a viable alternative for the treatment of peribiliary HCC. Critical relevance statement: The local tumor progression (LTP), overall survival (OS), and disease-free survival (DFS) rates after radiofrequency ablation (RFA) were similar for peribiliary and non-peribiliary hepatocellular carcinoma (HCC). Key points: There are currently no clinical guidelines for radiofrequency ablation (RFA) of peribiliary hepatocellular carcinoma (HCC).
Local tumor progression, overall survival, and disease-free survival after RFA were similar for peribiliary and non-peribiliary HCC. RFA is a viable alternative for the treatment of peribiliary HCC.
Project description: Background: The effect of loop diuretic use in critically ill patients on vasopressor support or in shock is unclear. This study aimed to explore the relationship between loop diuretic use and hospital mortality in critically ill patients receiving vasopressor support. Methods: Data were extracted from the Medical Information Mart for Intensive Care III database. Adult patients with records of vasopressor use within 48 h after intensive care unit admission were screened. Multivariable logistic regression and propensity score matching were used to investigate the association. Results: Data on 7828 patients were included. Crude hospital mortality was significantly lower in patients with diuretic use (166/1469 vs. 1171/6359, p < 0.001). In the extended multivariable logistic models, the odds ratio (OR) of diuretic use was consistently significant in all six models (OR range 0.56-0.75, p < 0.05 for all). In the subgroup analysis, an interaction effect was detected between diuretic use and fluid balance (FB). In the positive FB subgroup, diuretic use was significantly associated with decreased mortality (OR 0.64, 95% confidence interval (CI) 0.51-0.78), but the association was not significant in the negative FB subgroup. In the other subgroups, defined by mean arterial pressure, maximum sequential organ failure assessment score, and lactate level, the association between diuretic use and mortality remained significant and no interaction was detected. After propensity score matching, 1463 cases from each group were well matched, and mortality remained significantly lower in the diuretic use group (165/1463 vs. 231/1463, p < 0.001). Conclusions: Although residual confounding cannot be excluded, loop diuretic use is associated with lower hospital mortality.
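The adjusted odds ratios reported in studies like this come from multivariable logistic regression: the exposure coefficient, exponentiated, gives the OR after holding the other covariates fixed. The sketch below fits such a model by Newton-Raphson on synthetic data with a built-in true OR of 0.6; the variable names (`diuretic`, `severity`) and the data-generating process are assumptions for illustration, not the study's actual covariates.

```python
import numpy as np

def logistic_fit(X, y, iters=50):
    """Fit logistic regression by Newton-Raphson; returns coefficients
    (intercept first)."""
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))         # fitted probabilities
        H = X.T @ (X * (p * (1 - p))[:, None])  # observed information matrix
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

# Synthetic cohort (assumed): binary exposure plus one continuous covariate,
# with a true adjusted OR of 0.6 for the exposure
rng = np.random.default_rng(2)
n = 5000
diuretic = rng.integers(0, 2, n).astype(float)
severity = rng.normal(0, 1, n)
logit = -1.0 + np.log(0.6) * diuretic + 0.8 * severity
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

beta = logistic_fit(np.column_stack([diuretic, severity]), y)
or_diuretic = np.exp(beta[1])                   # adjusted OR for the exposure
```

With 5000 observations the fitted OR lands near the true 0.6, in the same direction as the protective ORs (0.56-0.75) reported above.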
Project description: Cluster randomization trials with relatively few clusters have been widely used in recent years for the evaluation of health-care strategies. On average, randomized treatment assignment achieves balance in both known and unknown confounding factors between treatment groups; in practice, however, investigators can introduce only a small amount of stratification and cannot balance all the important variables simultaneously. This limitation arises especially when there are many confounding variables in small studies, as in the INSTINCT trial, which was designed to investigate the effectiveness of an education program in enhancing tPA use in stroke patients. In this article, we introduce a new randomization design, the balance match weighted (BMW) design, which applies the optimal matching with constraints technique to a prospective randomized design and aims to minimize the mean squared error (MSE) of the treatment effect estimator. A simulation study shows that, under various confounding scenarios, the BMW design can yield substantial reductions in the MSE of the treatment effect estimator compared with a completely randomized or matched-pair design. The BMW design is also compared with a model-based approach that adjusts for the estimated propensity score and with the Robins-Mark-Newey E-estimation procedure, in terms of the efficiency and robustness of the treatment effect estimator. These investigations suggest that the BMW design is more robust and usually, although not always, more efficient than either approach. The design is also robust against heterogeneous error. We illustrate these methods by proposing a design for the INSTINCT trial.
Project description: Objective: This study aimed to evaluate the clinical efficacy of belimumab in patients with early systemic lupus erythematosus (SLE), defined as a disease duration of less than 6 months. Methods: We retrospectively identified patients with early-stage SLE who received belimumab plus standard of care (belimumab group) or standard of care alone (control group) since September 2020. Propensity score matching (PSM) was used to reduce potential bias. The primary endpoint was lupus low disease activity state (LLDAS) at weeks 12 and 24. The secondary endpoints were remission and the proportion of patients whose glucocorticoid dose was tapered to 7.5 mg/day. The efficacy of belimumab in patients with lupus nephritis was also assessed. Results: Of 111 eligible patients, 16 in the belimumab group and 31 in the control group were identified by 1:2 PSM. At week 24, a significantly higher proportion of individuals achieved LLDAS in the belimumab group than in the control group (56.3% vs. 19.4%, OR = 5.357, 95% CI = 1.417 to 20.260, p = 0.013). Furthermore, more patients in the belimumab group were reduced to a low glucocorticoid dose (≤ 7.5 mg/day) at week 24 (75.0% vs. 35.5%, OR = 5.182, 95% CI = 1.339 to 20.058, p = 0.017). Significant improvements in Patient Global Assessment scores were observed at weeks 12 and 24 in those treated with belimumab compared with controls. In a subgroup analysis of patients with lupus nephritis, 42.9% of the seven individuals treated with belimumab achieved a complete renal response (CRR) by week 24, and no disease relapses were observed. Conclusions: In SLE patients with a disease duration of less than 6 months, belimumab treatment can promote LLDAS achievement and reduce the glucocorticoid dose, leading to a better prognosis. Introducing belimumab in the early stage of SLE may be a beneficial decision.
Project description: It remains inconclusive whether hyperuricemia is a true risk factor for kidney graft failure. In the current study, we investigated the association between hyperuricemia and graft outcome in a multicenter cohort study of 2620 kidney transplant recipients. Patients were classified as either normouricemic or hyperuricemic at 3 months after transplantation, with hyperuricemia defined as a serum uric acid level ≥ 7.0 mg/dL in males or ≥ 6.0 mg/dL in females, or the use of urate-lowering medications. The two groups were compared before and after propensity score matching. A total of 657 (25.1%) patients were classified as hyperuricemic, and the proportion of hyperuricemic patients increased over time, reaching 44.2% of the total cohort at 5 years after transplantation. Estimated glomerular filtration rate (eGFR) and donor type were independently associated with hyperuricemia. Hyperuricemia was associated with graft loss on multiple Cox regression analysis both before propensity score matching (hazard ratio [HR] = 1.56, 95% confidence interval [CI] = 1.14-2.13, p = 0.005) and after matching (HR = 1.65, 95% CI = 1.13-2.42, p = 0.010). Cox regression models using time-varying hyperuricemia and marginal structural models adjusted for time-varying eGFR also demonstrated significant hazards of hyperuricemia for graft loss. Cardiovascular events and recipient survival were not associated with hyperuricemia. Overall, hyperuricemia, especially with early onset after transplantation, was associated with an increased risk of graft failure. Further studies are warranted to determine whether lowering serum uric acid levels would benefit graft survival.