Project description:BACKGROUND:Standard influenza vaccines may be of limited benefit to patients with end-stage renal disease (ESRD). These patients may benefit from high-dose influenza vaccine, currently indicated for patients aged ≥65 years. Studies in other populations have demonstrated that high-dose vaccine elicits a stronger immunological response. We compared uptake in the United States and predictors of receipt for high-dose and standard influenza vaccines. METHODS:Using data from the United States Renal Data System (2010-2013), we conducted a cohort study of 421,482 adult patients on hemodialysis. We examined temporal trends in uptake of high-dose or standard trivalent influenza vaccine each influenza season, and used multivariate logistic regression to assess the association of individual-level variables (e.g., demographics, comorbidities) and facility-level variables (e.g., facility size and type) with vaccine receipt. RESULTS:The proportion of patients with ESRD who were vaccinated with any influenza vaccine increased from 68.3% in 2010 to 72.4% in 2013. High-dose vaccines were administered to 0.9% of patients during the study period, and 16.7% of high-dose vaccines were administered to patients <65 years of age. Among patients aged ≥65 years, older patients (>79 vs. 65-69 years: OR, 1.29; 95% CI, 1.19-1.41) and patients at hospital-based versus free-standing dialysis facilities (OR, 2.31; 95% CI, 2.13-2.45) were more likely to receive high-dose vaccine, while blacks (vs. whites [OR, 0.66; 95% CI, 0.61-0.71]) and patients with longer duration of ESRD (>9 vs. 0 years: OR, 0.66; 95% CI, 0.55-0.78) were less likely to receive it. CONCLUSIONS:While the overall influenza vaccination rate increased, use of high-dose vaccine among patients with ESRD was very low. Older age, living in the Midwest, and receiving care at a hospital-based facility were the strongest predictors of receiving high-dose vaccine.
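The adjusted odds ratios and confidence intervals above come from multivariate logistic regression on USRDS data, which is not reproduced here. As a minimal sketch of the measure itself, a crude (unadjusted) odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table of vaccine receipt by group; the helper function and counts below are hypothetical, not the study's code or data:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def odds_ratio_ci(a, b, c, d, alpha=0.05):
    """Crude odds ratio and Wald CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40/100 vaccinated in one group, 25/100 in the other.
or_, lo, hi = odds_ratio_ci(40, 60, 25, 75)  # OR = 2.0
```

A confidence interval that excludes 1 (as in the facility-type comparison above) indicates an association unlikely to be due to chance alone at the chosen level.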
Project description:Background and objectivesWith multiple medications indicated for mineral metabolism, dialysis providers can apply various strategies to achieve target phosphate and parathyroid hormone (PTH) levels. We describe common prescribing patterns and practice variation in mineral metabolism treatment strategies over the last decade.Design, setting, participants, & measurementsIn a cohort of adults initiating hemodialysis at Dialysis Clinic, Inc. facilities, we assessed prescriptions of vitamin D sterols, phosphate binders, and cinacalcet longitudinally. To identify the influence of secular trends in clinical practice, we stratified the cohort by dialysis initiation year (2006-2008, 2009-2011, and 2012-2015). To measure practice variation, we estimated the median odds ratio for prescribing different mineral metabolism treatment strategies at 12 months post-dialysis initiation across facilities using mixed effects multinomial logistic regression. Sensitivity analyses evaluated strategies used after detection of first elevated PTH.ResultsAmong 23,549 incident patients on hemodialysis, there was a decline in vitamin D sterol-based strategies and a corresponding increase in strategies without PTH-modifying agents (i.e., phosphate binders alone or no mineral metabolism medications) and cinacalcet-containing treatment strategies between 2006 and 2015. The proportion with active vitamin D sterol-based strategies at dialysis initiation decreased across cohorts: 15% (2006-2008) to 5% (2012-2015). The proportion with active vitamin D sterol-based strategies after 18 months of dialysis decreased across cohorts: 52% (2006-2008) to 34% (2012-2015). The odds of using individual strategies compared with reference (active vitamin D sterol with phosphate binder) varied from 1.5- to 2-fold across facilities in the 2006-2008 and 2009-2011 cohorts, and increased to 2- to 3-fold in the 2012-2015 cohort.
Findings were similar in sensitivity analyses starting from first elevated PTH measurement.ConclusionsOver time, mineral metabolism management involved less use of vitamin D sterol-based strategies, greater use of both more conservative and cinacalcet-containing strategies, and increased practice variation, suggesting growing equipoise.
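The practice-variation estimates above are median odds ratios (MORs) from mixed effects multinomial logistic regression. The MOR itself is a simple transformation of the estimated between-facility random-intercept variance; a sketch of that transformation, with assumed variance values for illustration only:

```python
from math import exp, sqrt
from statistics import NormalDist

def median_odds_ratio(between_facility_variance):
    """MOR = exp(sqrt(2 * sigma^2) * z_0.75), where sigma^2 is the
    variance of the facility-level random intercept and z_0.75 is the
    75th percentile of the standard normal distribution (~0.6745)."""
    z75 = NormalDist().inv_cdf(0.75)
    return exp(sqrt(2 * between_facility_variance) * z75)

# With no between-facility variation the MOR is exactly 1;
# larger variances yield larger MORs (variance values are hypothetical).
print(median_odds_ratio(0.0))  # → 1.0
```

An MOR of 2 to 3, as in the 2012-2015 cohort, means that for two otherwise identical patients at two randomly chosen facilities, the median odds of receiving a given strategy differ by a factor of 2 to 3.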
Project description:IntroductionPatients with end-stage kidney disease have a high risk of 30-day readmission to hospital. These readmissions are financially costly to health care systems and are associated with poor health-related quality of life. The objective of this study was to describe and analyze the frequency, causes, and predictors of 30-day potentially avoidable readmission to hospital in patients on hemodialysis.MethodsWe conducted a retrospective cohort study using US Renal Data System data from January 1, 2008, to December 31, 2008. A total of 107,940 prevalent United States hemodialysis patients with 248,680 index hospital discharges were assessed for the main outcome of 30-day potentially avoidable readmission, as identified by a computerized algorithm.ResultsOf 83,209 30-day readmissions, 59,045 (70.1%) were classified as potentially avoidable. The geographic distribution of 30-day potentially avoidable readmission in the United States varied by state. Characteristics associated with 30-day potentially avoidable readmission included younger age, shorter time on hemodialysis, 3 or more hospitalizations in the preceding 12 months, black race, unemployed status, treatment at a for-profit facility, longer length of index hospital stay, and index hospitalizations that involved a surgical procedure. The 5-, 15-, and 30-day potentially avoidable readmission cumulative incidences were 6.0%, 15.1%, and 25.8%, respectively.ConclusionPatients with end-stage kidney disease on maintenance hemodialysis are at high risk for 30-day readmission to hospital, with nearly three-quarters (70.1%) of all 30-day readmissions being potentially avoidable. Research is warranted to develop cost-effective and transferable interventions that improve care transitions from hospital to outpatient hemodialysis facility and reduce readmission risk for this vulnerable population.
Project description:BackgroundThe impact of diuretic usage and dosage on the mortality of critically ill patients with acute kidney injury is still unclear.Methods and resultsIn this prospective, multicenter, observational study, 572 patients with postsurgical acute kidney injury receiving hemodialysis were recruited and followed daily. Thirty-day postdialysis mortality was analyzed using Cox's proportional hazards model with time-dependent covariates. The mean age of the 572 patients was 60.8±16.6 years. Patients with lower serum creatinine (p = 0.031) and blood lactate (p = 0.033) at ICU admission, lower predialysis urine output (p = 0.001) and PaO2/FiO2 (p = 0.039), as well as diabetes (p = 0.037) and heart failure (p = 0.049), were more likely to receive diuretics. A total of 280 (49.0%) patients died within 30 days after acute dialysis initiation. Analysis of 30-day postdialysis mortality using propensity score-adjusted Cox's proportional hazards models with time-dependent covariates showed that a higher 3-day cumulative diuretic dose after dialysis initiation (HR = 1.449, p = 0.021) increased the hazard of death. Moreover, higher time-varying 3-day cumulative diuretic doses were associated with hypotension (p < 0.001) and less intense hemodialysis (p < 0.001) during the acute dialysis period.Conclusions and significanceA higher time-varying 3-day cumulative diuretic dose predicts mortality in postsurgical critically ill patients requiring acute dialysis. Higher diuretic doses are associated with hypotension and a lower intensity of dialysis. Caution should be employed before loop diuretics are administered to postsurgical patients during the acute dialysis period.
Project description:Most patients with end-stage kidney disease value their health-related quality of life (HRQoL) and want to know how it will be affected by their dialysis modality. We extended the findings of two prior clinical trial reports to estimate the effects of frequent compared to conventional hemodialysis on additional measures of HRQoL. The Daily Trial randomly assigned 245 patients to receive frequent (six times per week) or conventional (three times per week) in-center hemodialysis. The Nocturnal Trial randomly assigned 87 patients to receive frequent nocturnal (six times per week) or conventional (three times per week) home hemodialysis. All patients were on conventional hemodialysis prior to randomization, with an average feeling thermometer score of 70 to 75 (a visual analog scale from 0 to 100, where 100 is perfect health), an average general health scale score of 40 to 47 (a score from 0 to 100, where 100 is perfect health), and an average dialysis session recovery time of 2 to 3 hours. Outcomes are reported as the between-treatment group differences in one-year change in HRQoL measures and analyzed using linear mixed effects models. After one year in the Daily Trial, patients assigned to frequent in-center hemodialysis reported a higher feeling thermometer score, better general health, and a shorter recovery time after a dialysis session compared to standard thrice-weekly dialysis. After one year in the Nocturnal Trial, patients assigned to frequent home hemodialysis also reported a shorter recovery time after a dialysis session, but no statistical difference in their feeling thermometer or general health scores compared to standard home dialysis schedules. Thus, patients receiving daily or nocturnal hemodialysis on average recovered approximately one hour earlier from a frequent compared to conventional hemodialysis session. Patients treated in an in-center dialysis facility reported better HRQoL with frequent compared to conventional hemodialysis.
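The reported outcomes above are between-treatment-group differences in one-year change in each HRQoL measure, estimated with linear mixed effects models that are not reproduced here. A crude complete-case version of the same estimand is simply the difference in mean one-year change between arms; the function and scores below are a hypothetical sketch, not trial data:

```python
from statistics import mean

def between_group_change(treat_baseline, treat_year1, ctrl_baseline, ctrl_year1):
    """Difference in mean one-year change between arms:
    (mean change, frequent-dialysis arm) - (mean change, conventional arm)."""
    d_treat = mean(t1 - t0 for t0, t1 in zip(treat_baseline, treat_year1))
    d_ctrl = mean(c1 - c0 for c0, c1 in zip(ctrl_baseline, ctrl_year1))
    return d_treat - d_ctrl

# Hypothetical feeling thermometer scores (0-100) at baseline and one year:
diff = between_group_change([70, 72], [75, 79], [70, 74], [71, 73])
```

A positive difference favors the frequent-dialysis arm on measures where higher is better; the mixed effects models used in the trials additionally handle repeated measures and missing data, which this sketch does not.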
Project description:Reports on the prevalence of torus mandibularis among dialysis patients have been limited and inconclusive. A wide variety of oral manifestations has been found in patients with hyperparathyroidism. Furthermore, uremia-related changes in facial bone structures have been described in the literature. This prospective observational study examined 322 hemodialysis patients treated at the Chang Gung Memorial Hospital from 1 August to 31 December 2016. Two subgroups were identified: patients with torus mandibularis (n = 25) and those without (n = 297). Clinical oral examinations including inspection and palpation were employed. Our study found that most mandibular tori were symmetric (84.0%), nodular (96.0%), less than 2 cm in size (96.0%), and located in the premolar area (92.0%). Poor oral hygiene was observed among these patients, with 49.7% and 24.5% scoring 3 and 4, respectively, on the Quigley-Hein plaque index. More than half (55.0%) of the patients had lost their first molars. Multivariate logistic regression analysis revealed that blood phosphate level (odds ratio = 1.494, p = 0.029) and younger age (odds ratio = 0.954, p = 0.009) correlated significantly with torus mandibularis. The prevalence of torus mandibularis in patients receiving hemodialysis in this study was 7.8%. Younger age and a higher blood phosphate level were predictors for torus mandibularis in these patients.
Project description:BACKGROUND:Arteriovenous fistulas (AVFs) are the preferred form of hemodialysis vascular access, but maturation failures occur frequently, often resulting in prolonged catheter use. We sought to characterize AVF maturation in a national sample of prevalent hemodialysis patients in the United States. STUDY DESIGN:Nonconcurrent observational cohort study. SETTING & PARTICIPANTS:Prevalent hemodialysis patients having had at least 1 new AVF placed during 2013, as identified using Medicare claims data in the US Renal Data System. PREDICTORS:Demographics, geographic location, dialysis vintage, comorbid conditions. OUTCOMES:Successful maturation following placement defined by subsequent use identified using monthly CROWNWeb data. MEASUREMENTS:AVF maturation rates were compared across strata of predictors. Patients were followed up until the earliest evidence of death, AVF maturation, or the end of 2014. RESULTS:In the study period, 45,087 new AVFs were placed in 39,820 prevalent hemodialysis patients. No evidence of use was identified for 36.2% of AVFs. Only 54.7% of AVFs were used within 4 months of placement, with maturation rates varying considerably across end-stage renal disease (ESRD) networks. Older age was associated with lower AVF maturation rates. Female sex, black race, some comorbid conditions (cardiovascular disease, peripheral artery disease, diabetes, needing assistance, or institutionalized status), dialysis vintage longer than 1 year, and catheter or arteriovenous graft use at ESRD incidence were also associated with lower rates of successful AVF maturation. In contrast, hypertension and prior AVF placement at ESRD incidence were associated with higher rates of successful AVF maturation. LIMITATIONS:This study relies on administrative data, with monthly recording of access use. CONCLUSIONS:We identified numerous associations between AVF maturation and patient-level factors in a recent national sample of US hemodialysis patients. 
After accounting for these patient factors, we observed substantial differences in AVF maturation across some ESRD networks, indicating a need for additional study of the provider, practice, and regional factors that explain AVF maturation.
Project description:Rationale & objectiveAs the proportion of arteriovenous fistulas (AVFs) compared with arteriovenous grafts (AVGs) in the United States has increased, there has been a concurrent increase in interventions. We explored AVF and AVG maturation and maintenance procedural burden in the first year of hemodialysis.Study designObservational cohort study.Setting & participantsPatients initiating hemodialysis from July 1, 2012, to December 31, 2014, and having a first-time AVF or AVG placement between dialysis initiation and 1 year (N = 73,027), identified using the US Renal Data System (USRDS).PredictorsPatient characteristics.OutcomeSuccessful AVF/AVG use and intervention procedure burden.Analytical approachFor each group, we analyzed interventional procedure rates during the maturation and maintenance phases using Poisson regression. We used proportional rate modeling for covariate-adjusted analysis of interventional procedure rates during the maintenance phase.ResultsDuring the maturation phase, 13,989 of 57,275 patients (24.4%) in the AVF group required intervention, with a therapeutic interventional requirement of 0.36 per person; in the AVG group, 2,904 of 15,572 patients (18.4%) required intervention, with a therapeutic interventional requirement of 0.28 per person. During the maintenance phase, 12,732 of 32,115 patients (39.6%) in the AVF group required intervention, with a therapeutic intervention rate of 0.93 per person-year; in the AVG group, 5,928 of 10,271 patients (57.7%) required intervention, with a therapeutic intervention rate of 1.87 per person-year. For both phases, intervention rates for AVFs tended to be higher on the East Coast, while those for AVGs were more uniform geographically.LimitationsThis study relies on administrative data, with monthly recording of access use.ConclusionsDuring maturation, interventions for both AVFs and AVGs were relatively common.
Once successfully matured, AVFs had lower maintenance interventional requirements. During the maturation and maintenance phases, there were geographic variations in AVF intervention rates that warrant additional study.
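The maintenance-phase burden above is expressed as therapeutic interventions per person-year, a crude Poisson-type rate. A sketch of how such a rate and a log-scale Wald confidence interval can be computed, assuming the event count is Poisson-distributed (the counts below are hypothetical, not the USRDS figures):

```python
from math import exp, sqrt
from statistics import NormalDist

def rate_per_person_year(events, person_years, alpha=0.05):
    """Crude event rate per person-year with a Wald CI computed
    on the log scale, assuming a Poisson event count."""
    rate = events / person_years
    z = NormalDist().inv_cdf(1 - alpha / 2)
    se_log = 1 / sqrt(events)  # SE of log(rate) for a Poisson count
    return rate, rate * exp(-z * se_log), rate * exp(z * se_log)

# Hypothetical: 93 interventions observed over 100 person-years of follow-up.
rate, lo, hi = rate_per_person_year(93, 100)  # rate = 0.93 per person-year
```

Person-years, rather than patient counts, keep rates comparable between the AVF and AVG groups despite differing lengths of follow-up.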
Project description:Children with acute myeloid leukemia are at risk for sepsis and organ failure. Outcomes associated with intensive care support have not been studied in a large pediatric acute myeloid leukemia population. Our objective was to determine the hospital mortality of pediatric acute myeloid leukemia patients requiring intensive care. We conducted a retrospective cohort study of children hospitalized between 1999 and 2010 at 43 children's hospitals contributing data to the Pediatric Health Information System database. Use of intensive care was defined by utilization of specific procedures and resources, and the primary endpoint was hospital mortality. The study cohort comprised patients newly diagnosed with acute myeloid leukemia, aged 28 days through 18 years (n = 1,673), hospitalized any time from initial diagnosis through 9 months following diagnosis or until stem cell transplant; no study interventions were administered. A reference cohort of all nononcology pediatric admissions using the same intensive care resources in the same time period (n = 242,192 admissions) was also studied. One-third of pediatric patients with acute myeloid leukemia (553 of 1,673) required intensive care during a hospitalization within 9 months of diagnosis. Among intensive care admissions, mortality was higher in the acute myeloid leukemia cohort than in the nononcology cohort (18.6% vs 6.5%; odds ratio, 3.23; 95% CI, 2.64-3.94). However, when sepsis was present, mortality did not differ significantly between cohorts (21.9% vs 19.5%; odds ratio, 1.17; 95% CI, 0.89-1.53). Mortality was consistently higher for each type of organ failure in the acute myeloid leukemia cohort than in the nononcology cohort; however, mortality did not exceed 40% unless four or more organ failures occurred during the admission.
Mortality for admissions requiring intensive care decreased over time for both cohorts (23.7% in 1999-2003 vs 16.4% in 2004-2010 in the acute myeloid leukemia cohort, p = 0.0367; and 7.5% in 1999-2003 vs 6.5% in 2004-2010 in the nononcology cohort, p < 0.0001). Pediatric patients with acute myeloid leukemia frequently required intensive care resources, with mortality rates substantially lower than previously reported. Mortality also decreased over the period studied. Pediatric acute myeloid leukemia patients with sepsis who required intensive care had mortality comparable to that of children without oncologic diagnoses; however, overall mortality, and mortality for each category of organ failure studied, was higher in the acute myeloid leukemia cohort than in the nononcology cohort.