Project description: In the United States, influenza vaccines are an important part of public health efforts to blunt the effects of seasonal influenza epidemics. This in turn emphasizes the importance of understanding the spatial distribution of influenza vaccination coverage. Despite this, high-quality data at a fine spatial scale spanning multiple recent flu seasons are not readily available. To address this gap, we develop county-level counts of vaccination across five recent, consecutive flu seasons and fit a series of regression models to these data that account for bias. We find that the spatial distribution of our bias-corrected vaccination coverage estimates is generally consistent from season to season, with the highest coverage in the Northeast and Midwest, but coverage is spatially heterogeneous within states. We also observe a negative relationship between a county's vaccination coverage and social vulnerability. Our findings stress the importance of quantifying flu vaccination coverage at a fine spatial scale, as relying on state- or region-level estimates misses key heterogeneities.
Project description: Background: Polymyxins (colistin, polymyxin B) have been first-line antibiotics against carbapenem-resistant Enterobacteriaceae (CRE) infections. New anti-CRE antibiotics (ceftazidime-avibactam, meropenem-vaborbactam, plazomicin) improve outcomes in CRE-infected patients and reduce toxicity compared with polymyxins. It is unclear how widely polymyxins and newer agents are used to treat CRE infections. Methods: We conducted an online survey of US hospital-based pharmacists to determine antibiotic positioning against CRE infections. Numbers of all infections and CRE infections treated with different antibiotics in the United States were determined using IQVIA prescription data and Driving Re-investment in Research and Development and Responsible Antibiotic Use (DRIVE-AB) estimates of CRE infections. Results: Ceftazidime-avibactam, meropenem-vaborbactam, or plazomicin were positioned as first-line agents against CRE pneumonia, bacteremia, intra-abdominal infections, and urinary tract infections at 87%, 90%, 83%, and 56% of surveyed US hospitals, respectively. From February 2018 to January 2019, an estimated 9437 and 7941 CRE infections were treated with an intravenous polymyxin or new agent, respectively; these figures represented ~28% (range, 19%-50%) and ~23% (range, 16%-42%) of CRE infections in the United States. Use of ceftazidime-avibactam, meropenem-vaborbactam, or plazomicin exceeded that of intravenous polymyxins against CRE infections as of December 2018. Currently, the new drugs are estimated to treat 35% (23% to 62%) of CRE infections for which they were expected to be first-line agents. Conclusions: New anti-CRE agents recently surpassed intravenous polymyxins as treatment for CRE infections, but use is less than expected from their positioning at US hospitals. Research on behavioral and economic factors that impact use of new antibiotics is needed, as are financial "pull" incentives that promote an economically viable marketplace.
Project description: By March 2020, COVID-19 led to thousands of deaths and disrupted economic activity worldwide. As a result of narrow case definitions and limited capacity for testing, the number of unobserved severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections during its initial invasion of the United States remains unknown. We developed an approach for estimating the number of unobserved infections based on data that are commonly available shortly after the emergence of a new infectious disease. The logic of our approach is, in essence, that there are bounds on the amount of exponential growth of new infections that can occur during the first few weeks after imported cases start appearing. Applying that logic to data on imported cases and local deaths in the United States through 12 March, we estimated that 108,689 (95% posterior predictive interval [95% PPI]: 1,023 to 14,182,310) infections occurred in the United States by this date. By comparing the model's predictions of symptomatic infections with local cases reported over time, we obtained daily estimates of the proportion of symptomatic infections detected by surveillance. This revealed that detection of symptomatic infections decreased throughout February as exponential growth of infections outpaced increases in testing. Between 24 February and 12 March, we estimated an increase in detection of symptomatic infections, which was strongly correlated (median: 0.98; 95% PPI: 0.66 to 0.98) with increases in testing. These results suggest that testing was a major limiting factor in assessing the extent of SARS-CoV-2 transmission during its initial invasion of the United States.
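The bounding logic described above can be illustrated with a minimal back-calculation sketch. This is not the study's Bayesian model, and all parameter values below are illustrative assumptions: deaths observed by a reference date imply a minimum number of infections that occurred roughly one infection-to-death delay earlier, and a plausible doubling time then bounds how much further infections grew during that delay.

```python
def infections_from_deaths(deaths, ifr, doubling_time, death_delay):
    """Back-calculate cumulative infections implied by observed deaths.

    deaths:        cumulative deaths by the reference date
    ifr:           infection fatality ratio (assumed)
    doubling_time: days for infections to double (assumed)
    death_delay:   mean days from infection to death (assumed)
    """
    # Deaths reflect infections that occurred ~death_delay days earlier;
    # infections kept doubling during that delay.
    infections_at_delay = deaths / ifr
    growth = 2 ** (death_delay / doubling_time)
    return infections_at_delay * growth

# Illustrative values only (not the study's fitted parameters):
for dt in (3.0, 5.0, 7.0):
    est = infections_from_deaths(deaths=40, ifr=0.01, doubling_time=dt, death_delay=20)
    print(f"doubling time {dt} d -> ~{est:,.0f} cumulative infections")
```

Sweeping the doubling time shows why the interval around such estimates is wide: small changes in assumed growth rates translate into order-of-magnitude changes in implied infections.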
Project description: The preparedness of health systems to detect, treat, and prevent onward transmission of Ebola virus disease (EVD) is central to mitigating future outbreaks. Early detection of outbreaks is critical to timely response, but estimating detection rates is difficult because unreported spillover events and outbreaks do not generate data. Using three independent datasets available on the distributions of secondary infections during EVD outbreaks across West Africa, in a single district (Western Area) of Sierra Leone, and in the city of Conakry, Guinea, we simulated realistic outbreak size distributions and compared them to reported outbreak sizes. These three empirical distributions lead to estimates for the proportion of detected spillover events and small outbreaks of 26% (range 8-40%, based on the full outbreak data), 48% (range 39-62%, based on the Sierra Leone data), and 17% (range 11-24%, based on the Guinea data). We conclude that at least half of all spillover events have failed to be reported since EVD was first recognized. We also estimate the probability of detecting outbreaks of different sizes, which is likely less than 10% for single-case spillover events. Comparing models of the observation process also suggests the probability of detecting an outbreak is not simply the cumulative probability of independently detecting any one individual. Rather, we find that any individual's probability of detection is highly dependent upon the size of the cluster of cases. These findings highlight the importance of primary health care and local case management to detect and contain undetected early stage outbreaks at source.
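The contrast between the two observation models mentioned above can be sketched as follows. This is a toy illustration, not the study's fitted model: the first function treats each case as independently detectable, so outbreak detection is the cumulative probability of detecting any one case; the second (with hypothetical parameters `p1` and `k`) lets the per-case detection probability itself grow with cluster size.

```python
def p_detect_independent(n, p_case):
    """Outbreak of size n is detected if any one case is independently detected."""
    return 1.0 - (1.0 - p_case) ** n

def p_detect_size_dependent(n, p1, k):
    """Toy alternative: per-case detection probability grows with cluster size.

    p1: per-case detection probability for a single-case spillover (assumed)
    k:  how strongly detection scales with outbreak size (assumed)
    """
    p_case = min(1.0, p1 * n ** k)
    return 1.0 - (1.0 - p_case) ** n

# Compare the two models across outbreak sizes (illustrative parameters):
for n in (1, 5, 20):
    print(n,
          round(p_detect_independent(n, 0.05), 3),
          round(p_detect_size_dependent(n, 0.05, 0.5), 3))
```

Under the size-dependent model, small clusters stay nearly invisible while large outbreaks are detected almost surely, which is qualitatively the pattern the abstract describes.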
Project description: News media have been blamed for sensationalizing Ebola in the United States, causing unnecessary alarm. To investigate this issue, we analyzed US-focused news stories about Ebola virus disease during July 1-November 30, 2014. We found frequent use of risk-elevating messages, which may have contributed to increased public concern.
Project description: Background: Available data on the characteristics of patients with Ebola virus disease (EVD) and clinical management of EVD in settings outside West Africa, as well as the complications observed in those patients, are limited. Methods: We reviewed available clinical, laboratory, and virologic data from all patients with laboratory-confirmed Ebola virus infection who received care in U.S. and European hospitals from August 2014 through December 2015. Results: A total of 27 patients (median age, 36 years [range, 25 to 75]) with EVD received care; 19 patients (70%) were male, 9 of 26 patients (35%) had coexisting conditions, and 22 (81%) were health care personnel. Of the 27 patients, 24 (89%) were medically evacuated from West Africa or were exposed to and infected with Ebola virus in West Africa and had onset of illness and laboratory confirmation of Ebola virus infection in Europe or the United States, and 3 (11%) acquired EVD in the United States or Europe. At the onset of illness, the most common signs and symptoms were fatigue (20 patients [80%]) and fever or feverishness (17 patients [68%]). During the clinical course, the predominant findings included diarrhea, hypoalbuminemia, hyponatremia, hypokalemia, hypocalcemia, and hypomagnesemia; 14 patients (52%) had hypoxemia, and 9 (33%) had oliguria, of whom 5 had anuria. Aminotransferase levels peaked at a median of 9 days after the onset of illness. Nearly all the patients received intravenous fluids and electrolyte supplementation; 9 (33%) received noninvasive or invasive mechanical ventilation; 5 (19%) received continuous renal-replacement therapy; 22 (81%) received empirical antibiotics; and 23 (85%) received investigational therapies (19 [70%] received at least two experimental interventions). Ebola viral RNA levels in blood peaked at a median of 7 days after the onset of illness, and the median time from the onset of symptoms to clearance of viremia was 17.5 days. A total of 5 patients died, including 3 who had respiratory and renal failure, for a mortality of 18.5%. Conclusions: Among the patients with EVD who were cared for in the United States or Europe, close monitoring and aggressive supportive care that included intravenous fluid hydration, correction of electrolyte abnormalities, nutritional support, and critical care management for respiratory and renal failure were needed; 81.5% of the patients who received this care survived.
Project description: BACKGROUND: There is growing interest in using C-reactive protein (CRP) levels to help select patients for lipid lowering therapy--although this practice is not yet supported by evidence of benefit in a randomized trial. OBJECTIVE: To estimate the number of Americans potentially affected if a CRP criterion were adopted as an additional indication for lipid lowering therapy. To provide context, we also determined how well current lipid lowering guidelines are being implemented. METHODS: We analyzed nationally representative data to determine how many Americans age 35 and older meet current National Cholesterol Education Program (NCEP) treatment criteria (a combination of risk factors and their Framingham risk score). We then determined how many of the remaining individuals would meet criteria for treatment using 2 different CRP-based strategies: (1) narrow: treat individuals at intermediate risk (i.e., 2 or more risk factors and an estimated 10-20% risk of coronary artery disease over the next 10 years) with CRP > 3 mg/L and (2) broad: treat all individuals with CRP > 3 mg/L. DATA SOURCE: Analyses are based on the 2,778 individuals participating in the 1999-2002 National Health and Nutrition Examination Survey with complete data on cardiac risk factors, fasting lipid levels, CRP, and use of lipid lowering agents. MAIN MEASURES: The estimated number and proportion of American adults meeting NCEP criteria who take lipid-lowering drugs, and the additional number who would be eligible based on CRP testing. RESULTS: About 53 million of the 153 million Americans aged 35 and older meet current NCEP criteria (that do not involve CRP) for lipid-lowering treatment. Sixty-five percent, however, are not currently being treated; even among those at highest risk (i.e., patients with established heart disease or its risk equivalent), 62% are untreated.
Adopting the narrow and broad CRP strategies would make an additional 2.1 and 25.3 million Americans eligible for treatment, respectively. The latter strategy would make over half the adults age 35 and older eligible for lipid-lowering therapy, with most of the additionally eligible (57%) coming from the lowest NCEP heart risk category (i.e., 0-1 risk factors). CONCLUSION: There is substantial underuse of lipid lowering therapy for American adults at high risk for coronary disease. Rather than adopting CRP-based strategies, which would make millions more lower risk patients eligible for treatment (and for whom treatment benefit has not yet been demonstrated in a randomized trial), we should ensure the treatment of currently defined high-risk patients for whom the benefit of therapy is established.
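The two CRP-based expansion strategies described above reduce to simple eligibility rules, sketched below with a hypothetical helper function (this is an illustration of the stated thresholds, not the authors' analysis code).

```python
def crp_eligible(meets_ncep, risk_factors, ten_year_risk_pct, crp_mg_l, strategy):
    """Return True if a patient NOT already meeting NCEP criteria would become
    treatment-eligible under a CRP-based strategy.

    strategy: "narrow" (intermediate risk + CRP > 3 mg/L) or
              "broad"  (CRP > 3 mg/L alone)
    """
    if meets_ncep:
        return False  # already eligible under current NCEP criteria
    if crp_mg_l <= 3.0:
        return False
    if strategy == "broad":
        return True
    # Narrow strategy: intermediate risk means >=2 risk factors
    # and an estimated 10-20% 10-year coronary risk.
    return risk_factors >= 2 and 10.0 <= ten_year_risk_pct <= 20.0

# Illustrative patients:
print(crp_eligible(False, 2, 15.0, 4.2, "narrow"))  # True
print(crp_eligible(False, 0, 3.0, 4.2, "narrow"))   # False
print(crp_eligible(False, 0, 3.0, 4.2, "broad"))    # True
```

The third example shows how the broad strategy sweeps in low-risk patients (0-1 risk factors) on CRP alone, which is the source of the 25.3 million additionally eligible adults.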
Project description: CONTEXT: Although core scientific skills remain a priority to public health, preventing and responding to today's leading causes of death require the workforce to build additional strategic skills to impact the social, community-based, and economic determinants of health. The 2017 Public Health Workforce Interests and Needs Survey allows novel regional analysis of training needs, both individually and across 8 strategic skill domains. OBJECTIVE: The purpose of this article is to describe the training needs of public health staff nationally, across the 10 Department of Health and Human Services Regions. DESIGN: The Public Health Workforce Interests and Needs Survey was a Web-based survey fielded to 100 000 staff nationwide across 2 major frames: state health agency-central office and local health department. State-based respondents were fielded on a census approach, with locals participating in a more complex sampling design. Balanced repeated replication weights were used to address nonresponse and sampling. SETTING: State and local health departments. PARTICIPANTS: Respondents from state and local health departments. MAIN OUTCOME MEASURES: This article draws from the training needs portion of the Public Health Workforce Interests and Needs Survey. Descriptive statistics are generated, showing training needs gaps. Inferential analyses pertain to gaps across Region and supervisory status, using the Pearson χ² test and the Rao-Scott design-adjusted χ² test. RESULTS: Training needs varied across regions and work setting. Certain strategic skills tended to see larger, consistent gaps regardless of Region or setting, including Budgeting & Finance, Change Management, Systems Thinking, and Developing a Vision for a Healthy Community. CONCLUSIONS: Overall, the data suggest substantial interregional variation in training needs. 
Until now, this picture has been incomplete; disparate assessments across health departments, Regions, and disciplines could not be combined into a national picture. Regionally focused training centers are well situated to address Region-specific needs while supporting the broader building of capacity in strategic skills nationwide.
Project description: Background: In the last decade, the number of total knee replacements performed annually in the United States has doubled, with disproportionate increases among younger adults. While total knee replacement is a highly effective treatment for end-stage knee osteoarthritis, total knee replacement recipients can experience persistent pain and severe complications. We are aware of no current estimates of the prevalence of total knee replacement among adults in the U.S. Methods: We used the Osteoarthritis Policy Model, a validated computer simulation model of knee osteoarthritis, and data on annual total knee replacement utilization to estimate the prevalence of primary and revision total knee replacement among adults fifty years of age or older in the U.S. We combined these prevalence estimates with U.S. Census data to estimate the number of adults in the U.S. currently living with total knee replacement. The annual incidence of total knee replacement was derived from two longitudinal knee osteoarthritis cohorts and ranged from 1.6% to 11.9% in males and from 2.0% to 10.9% in females. Results: We estimated that 4.0 million (95% confidence interval [CI]: 3.6 million to 4.4 million) adults in the U.S. currently live with a total knee replacement, representing 4.2% (95% CI: 3.7% to 4.6%) of the population fifty years of age or older. The prevalence was higher among females (4.8%) than among males (3.4%) and increased with age. The lifetime risk of primary total knee replacement from the age of twenty-five years was 7.0% (95% CI: 6.1% to 7.8%) for males and 9.5% (95% CI: 8.5% to 10.5%) for females. Over half of adults in the U.S. diagnosed with knee osteoarthritis will undergo a total knee replacement. Conclusions: Among older adults in the U.S., total knee replacement is considerably more prevalent than rheumatoid arthritis and nearly as prevalent as congestive heart failure. 
Nearly 1.5 million of those with a primary total knee replacement are fifty to sixty-nine years old, indicating that a large population is at risk for costly revision surgery as well as possible long-term complications of total knee replacement.