Project description: Background: Early start of enteral feeding is an established treatment strategy in intubated patients in intensive care, since it reduces invasive bacterial infections and length of hospital stay. There is equipoise as to whether early enteral feeding is also beneficial in non-intubated patients with cerebral malaria in resource-poor settings. We hypothesized that the risk of aspiration pneumonia might outweigh the potential benefits of earlier recovery and prevention of hypoglycaemia. Methods and findings: A randomized trial of early (day of admission) versus late (after 60 hours in adults or 36 hours in children) start of enteral feeding was undertaken in patients with cerebral malaria in Chittagong, Bangladesh, from May 2008 to August 2009. The primary outcome measures were incidence of aspiration pneumonia, hypoglycaemia, and coma recovery time. The trial was terminated after inclusion of 56 patients because of a high incidence of aspiration pneumonia in the early feeding group (9/27, 33%) compared with the late feeding group (0/29, 0%; p = 0.001). One patient in the late feeding group, and none in the early group, had hypoglycaemia during admission. There was no significant difference in overall mortality (9/27, 33% vs 6/29, 21%; p = 0.370), but mortality was 5/9 (56%) in patients with aspiration pneumonia. Conclusions: Early start of enteral feeding is detrimental in non-intubated patients with cerebral malaria in resource-poor settings. Evidence gathered in resource-rich settings is not necessarily transferable to resource-poor settings. Trial registration: Controlled-Trials.com ISRCTN57488577.
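The description above does not name the statistical test, but the aspiration pneumonia comparison (9/27 vs 0/29) can be reproduced with a two-by-two exact test. A minimal sketch, assuming a two-sided Fisher's exact test on those counts:

```python
# Re-computation of the aspiration pneumonia comparison; the use of a
# two-sided Fisher's exact test is an assumption, not stated in the text.
from scipy.stats import fisher_exact

# Rows: early vs late feeding; columns: aspiration pneumonia yes / no.
table = [[9, 27 - 9],    # early feeding: 9 of 27 affected
         [0, 29 - 0]]    # late feeding: 0 of 29 affected

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"two-sided Fisher's exact p = {p_value:.3f}")  # rounds to 0.001
```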
Project description: Reducing childhood mortality in resource-poor regions depends on effective interventions to decrease neonatal mortality from severe infection, which accounts for up to half of all neonatal deaths. There are key differences between resource-poor and resource-rich countries in terms of diagnosis, supportive care, and treatment. In resource-poor settings, diagnosis is based on identifying clinical syndromes from international guidelines; microbiological investigations are restricted to a few research facilities. Low levels of staffing and equipment limit the provision of basic supportive care, and most facilities cannot provide respiratory support. Empiric antibiotic treatment guidelines are based on few aetiological and antimicrobial susceptibility data. Research on improving health care systems to provide effective supportive care, and implementation of simple pragmatic interventions such as low-cost respiratory support, are essential, together with improved surveillance to monitor emerging drug resistance and treatment failures. Reductions in mortality will also be achieved through prevention of infection, including emerging vaccination and anti-sepsis strategies.
Project description: Objective: To investigate the safety, feasibility, and efficacy of delayed cord clamping (DCC) compared with immediate cord clamping (ICC) at delivery among infants born at 22 to 27 weeks' gestation. Study design: This was a pilot randomized controlled trial in which women in labor with singleton pregnancies at 22 to 27 weeks' gestation were randomly assigned to ICC (cord clamped at 5 to 10 s) or DCC (30 to 45 s). Results: Forty mother-infant pairs were randomized. Infants in the ICC and DCC groups had mean gestational ages (GA) of 24.6 and 24.4 weeks, respectively. No differences were observed between the groups across all available safety measures, although infants in the DCC group had higher admission temperatures than infants in the ICC group (97.4 vs 96.2 °F, P=0.04). During the first 24 h of life, blood pressures were lower in the ICC group than in the DCC group (P<0.05), despite a threefold greater incidence of treatment for hypotension (45% vs 12%, P<0.01). Infants in the ICC group received more red blood cell transfusions in the first 28 days of life than infants in the DCC group (4.1±3.9 vs 2.8±2.2, P=0.04). Conclusion: Among infants born at a mean GA of 24 weeks, DCC appears safe and logistically feasible, and offers hematological and circulatory advantages compared with ICC. A more comprehensive appraisal of this practice is needed.
Project description: Authoritative international guidelines stipulate that for minors to participate in research, consent must be obtained from their parents or guardians. Significant numbers of mature minors, particularly in low-income settings, are currently ruled out of research participation because their parents are unavailable or refuse to provide consent, despite the possibility that the minors might wish to participate and that such research has the potential to be of real benefit. These populations are under-represented in all types of clinical research. We propose that, for research with a prospect of direct benefit that has been approved by relevant ethics committees, the default position should be that minors able to provide valid consent may consent for themselves, regardless of age or whether they have reached majority, provided they meet the following criteria: the minor must be competent and mature relative to the decision; their consent must be voluntary; and they must be relatively independent and used to decision-making of comparable complexity. In addition, the context must be appropriate, the information related to the research must be provided in a manner accessible to the minor, and the consent must be obtained by a trained consent taker in surroundings conducive to decision-making by the minor. In this paper, we have argued that consent by mature minors to research participation is acceptable in some situations and should be allowed.
Project description: Background: Policies for timing of cord clamping vary, with early cord clamping generally carried out in the first 60 seconds after birth, whereas later cord clamping usually involves clamping the umbilical cord more than one minute after the birth or when cord pulsation has ceased. The benefits and potential harms of each policy are debated. Objectives: To determine the effects of early cord clamping compared with late cord clamping after birth on maternal and neonatal outcomes. Search methods: We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (13 February 2013). Selection criteria: Randomised controlled trials comparing early and late cord clamping. Data collection and analysis: Two review authors independently assessed trial eligibility and quality and extracted data. Main results: We included 15 trials involving a total of 3911 women and infant pairs. We judged the trials to have an overall moderate risk of bias. Maternal outcomes: No studies in this review reported on maternal death or on severe maternal morbidity. There were no significant differences between the early and late cord clamping groups for the primary outcome of severe postpartum haemorrhage (risk ratio (RR) 1.04, 95% confidence interval (CI) 0.65 to 1.65; five trials with data for 2066 women with a late clamping event rate (LCER) of ~3.5%, I² = 0%) or for postpartum haemorrhage of 500 mL or more (RR 1.17, 95% CI 0.94 to 1.44; five trials, 2260 women with a LCER of ~12%, I² = 0%). There were no significant differences between subgroups depending on the use of uterotonic drugs. Mean blood loss was reported in only two trials with data for 1345 women, with no significant differences seen between groups; nor were there significant differences in maternal haemoglobin values (mean difference (MD) -0.12 g/dL, 95% CI -0.30 to 0.06; three trials, I² = 0%) at 24 to 72 hours after the birth. Neonatal outcomes: There were no significant differences between early and late clamping for the primary outcome of neonatal mortality (RR 0.37, 95% CI 0.04 to 3.41; two trials, 381 infants with a LCER of ~1%), or for most other neonatal morbidity outcomes, such as Apgar score less than seven at five minutes or admission to the special care nursery or neonatal intensive care unit. Mean birthweight was significantly higher in the late, compared with early, cord clamping group (MD 101 g, 95% CI 45 to 157; random-effects model, 12 trials, 3139 infants, I² = 62%). Fewer infants in the early cord clamping group required phototherapy for jaundice than in the late cord clamping group (RR 0.62, 95% CI 0.41 to 0.96; seven trials, 2324 infants with a LCER of 4.36%, I² = 0%). Haemoglobin concentration in infants at 24 to 48 hours was significantly lower in the early cord clamping group (MD -1.49 g/dL, 95% CI -1.78 to -1.21; 884 infants, I² = 59%). This difference in haemoglobin concentration was not seen at subsequent assessments. However, improvement in iron stores appeared to persist: infants in the early cord clamping group were over twice as likely to be iron deficient at three to six months as infants whose cord clamping was delayed (RR 2.65, 95% CI 1.04 to 6.73; five trials, 1152 infants, I² = 82%).
In the only trial so far to report longer-term neurodevelopmental outcomes, no overall differences between early and late clamping were seen in Ages and Stages Questionnaire scores. Authors' conclusions: A more liberal approach to delaying clamping of the umbilical cord in healthy term infants appears to be warranted, particularly in light of growing evidence that delayed cord clamping increases early haemoglobin concentrations and iron stores in infants. Delayed cord clamping is likely to be beneficial as long as access to treatment for jaundice requiring phototherapy is available.
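The risk ratios and confidence intervals quoted above follow the standard construction on the log scale. A short illustration, with hypothetical counts rather than the review's data, of how an RR and its 95% CI are computed:

```python
# Illustrative risk ratio with a 95% CI via the log-RR (Katz) method.
# Event counts are hypothetical, not taken from the review.
import math

a, n1 = 30, 1000   # events / total in the early clamping arm (made up)
b, n2 = 29, 1060   # events / total in the late clamping arm (made up)

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # standard error of ln(RR)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```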
Project description: Background: The anticipated scale-up of antiretroviral therapy (ART) in high-prevalence, resource-constrained settings requires operational research to guide policy on the design of treatment programmes. Mathematical models can explore the potential impacts of various treatment strategies, including timing of treatment initiation and provision of laboratory monitoring facilities, to complement evidence from pilot programmes. Methods and findings: A deterministic model of HIV transmission incorporating ART and stratifying infection progression into stages was constructed. The impact of ART was evaluated for various scenarios and treatment strategies, with different levels of coverage, patient eligibility, and other parameter values. These strategies included the provision of laboratory facilities that perform CD4 counts and viral load testing, and the timing of the stage of infection at which treatment is initiated. In our analysis, unlimited ART provision initiated at late-stage infection (AIDS) increased the prevalence of HIV infection. The effect of additionally treating pre-AIDS patients depended on the behaviour change of treated patients. Different coverage levels for ART do not affect benefits such as life-years gained per person-year of treatment, and have minimal effect on infections averted when treating AIDS patients only. Scaling up treatment of pre-AIDS patients resulted in more infections being averted per person-year of treatment, but the absolute number of infections averted remained small. As coverage increased in the models, the emergence and risk of spread of drug resistance increased. Withdrawal of failing treatment, whether at clinical failure (resurgence of symptoms), immunologic failure (CD4 count decline), or virologic failure (viral rebound), increased the number of infected individuals who could benefit from ART, but effectiveness per person was compromised. Only withdrawal at a very early stage of treatment failure, soon after viral rebound, would have a substantial impact on the emergence of drug resistance. Conclusions: Our analysis found that ART cannot be seen as a direct transmission prevention measure, regardless of the degree of coverage. Counselling of patients to promote safe sexual practices is essential and must aim to effect long-term change. The chief aims of an ART programme, such as maximising the number of patients treated or optimising treatment per patient, will determine which treatment strategy is most effective.
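The model described is a deterministic, staged-progression transmission model with an ART compartment. A heavily simplified sketch of that structure is below; the number of stages, every parameter value, and the assumption that ART cuts infectiousness tenfold are illustrative choices, not the study's:

```python
# Toy staged-progression HIV model with ART, integrated with a plain
# Euler loop. All parameter values are illustrative assumptions.
beta = [0.10, 0.05, 0.02, 0.30]   # transmission rate by stage:
                                  # acute, chronic, pre-AIDS, AIDS
prog = [0.5, 0.1, 0.5]            # stage progression rates (per year)
mu, nu = 0.02, 1.0                # background and AIDS mortality (per year)
tau = 0.8                         # ART uptake at the AIDS stage (per year)
eps = 0.1                         # relative infectiousness on ART
births = 0.02                     # recruitment into the susceptible pool

S, I, T = 0.95, [0.02, 0.01, 0.01, 0.01], 0.0
dt, years = 0.01, 30.0

for _ in range(int(years / dt)):
    N = S + sum(I) + T
    foi = (sum(b * i for b, i in zip(beta, I)) + eps * beta[1] * T) / N
    dS = births - (foi + mu) * S
    dI = [foi * S - (prog[0] + mu) * I[0],
          prog[0] * I[0] - (prog[1] + mu) * I[1],
          prog[1] * I[1] - (prog[2] + mu) * I[2],
          prog[2] * I[2] - (tau + nu + mu) * I[3]]
    dT = tau * I[3] - mu * T      # treated patients leave the AIDS stage
    S += dS * dt
    I = [i + d * dt for i, d in zip(I, dI)]
    T += dT * dt

N = S + sum(I) + T
print(f"HIV prevalence after {years:.0f} years: {(sum(I) + T) / N:.1%}")
```

Varying tau, eps, or the stage at which treatment starts in a sketch like this reproduces the kind of strategy comparison the study performed, for example treating pre-AIDS stages versus AIDS only.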
Project description: Anthrax threatens human and animal health, and people's livelihoods, in many rural communities in Africa and Asia. In these areas, anthrax surveillance is challenged by a lack of tools for on-site detection. Furthermore, cultural practices and infrastructure may affect sample availability and quality. Practical yet accurate diagnostic solutions are greatly needed to quantify anthrax impacts. We validated microscopic and molecular methods for the detection of Bacillus anthracis in field-collected blood smears and identified alternative samples suitable for anthrax confirmation in the absence of blood smears. We investigated livestock mortalities suspected to be caused by anthrax in northern Tanzania. Field-prepared blood smears (n = 152) were tested by microscopy using four staining techniques as well as by polymerase chain reaction (PCR), followed by Bayesian latent class analysis. Median sensitivity (91%, 95% CI [84-96%]) and specificity (99%, 95% CI [96-100%]) of microscopy using azure B were comparable to those of the recommended standard, polychrome methylene blue (PMB; 92%, 95% CI [84-97%] and 98%, 95% CI [95-100%], respectively), but azure B is more available and convenient. Other commonly used stains performed poorly. Blood smears could be obtained for <50% of suspected anthrax cases due to local customs and conditions. However, PCR on DNA extracts from skin, which was almost always available, had high sensitivity and specificity (95%, 95% CI [90-98%] and 95%, 95% CI [87-99%], respectively), even after extended storage at ambient temperature. Azure B microscopy represents an accurate diagnostic test for animal anthrax that can be performed with basic laboratory infrastructure and in the field. When blood smears are unavailable, PCR using skin tissues provides a valuable alternative for confirmation. Our findings support a practical diagnostic approach for anthrax in low-resource settings that can underpin surveillance and control efforts in anthrax-endemic countries globally.
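The estimates above come from Bayesian latent class analysis, used because no gold standard exists for field anthrax diagnosis. As a simpler illustration of the quantities being estimated — assuming, unlike the study, a known gold standard, and using hypothetical counts — Beta-posterior credible intervals for sensitivity and specificity can be computed as follows:

```python
# Illustrative Beta(1, 1)-prior credible intervals for sensitivity and
# specificity, assuming a known gold standard. The study itself used
# latent class analysis precisely because no gold standard exists;
# all counts below are hypothetical.
from scipy.stats import beta

tp, fn = 68, 7    # azure B results among true positives (made up)
tn, fp = 75, 2    # azure B results among true negatives (made up)

def posterior_summary(successes, failures, level=0.95):
    """Posterior median and equal-tailed credible interval."""
    post = beta(successes + 1, failures + 1)   # Beta(1, 1) prior
    lo = post.ppf((1 - level) / 2)
    hi = post.ppf(1 - (1 - level) / 2)
    return post.median(), lo, hi

for name, s, f in [("sensitivity", tp, fn), ("specificity", tn, fp)]:
    med, lo, hi = posterior_summary(s, f)
    print(f"{name}: {med:.0%} (95% CrI {lo:.0%}-{hi:.0%})")
```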
Project description: Background: Physiologic-based cord clamping (PBCC) involves deferring umbilical cord clamping until after lung aeration. It is unclear whether infants are at risk of becoming hypothermic during PBCC. Objectives: To test whether PBCC maintains core temperature more effectively than immediate cord clamping (ICC). Design: At 0.93 gestation, fetal lambs were surgically exteriorized from pregnant ewes and instrumented under general anesthesia. Prior to the start of the experiment, lambs were thoroughly dried and placed on hot water bottles, and core temperature was continuously monitored using a rectal thermometer. PBCC lambs (n = 21) received intermittent positive pressure ventilation (iPPV) for ≥5 min prior to umbilical cord clamping. In ICC lambs (n = 23), iPPV commenced within 60 s after umbilical cord clamping. iPPV was provided with heated, humidified gas. Lambs were moved under a radiant warmer after umbilical cord clamping. Additional warmth was provided using a plastic overlay, a hairdryer, and extra water bottles, as needed. Two-way mixed and repeated-measures one-way ANOVAs were used to compare temperature changes over time between groups and within a single group, respectively. Results: Basal fetal parameters, including core temperature, were similar between groups. ICC lambs had a significant reduction in temperature compared with PBCC lambs (p < 0.001), evident by 1 min (p = 0.002). Temperature in ICC lambs fell by 0.51°C (±0.42) at 5 min and by 0.79°C (±0.55) at 10 min (p < 0.001). In PBCC lambs, temperature did not change significantly before or after umbilical cord clamping (p = 0.4 and p = 0.3, respectively). Conclusions: PBCC stabilized core temperature at delivery better than ICC in term lambs. Hypothermia may not be a significant risk during PBCC.
Project description: Background: On a global scale, cases of placenta accreta spectrum are often identified only during cesarean delivery because they are missed during antenatal care screening. Routine operating teams not trained in the management of placenta accreta spectrum are faced with difficult surgical situations and have to make decisions that may define the clinical outcome. Although there are general recommendations for the intraoperative management of placenta accreta spectrum, no studies have described the clinical reality of unexpected placenta accreta spectrum cases in resource-poor settings. Objective: This study aimed to describe the maternal outcomes of previously undiagnosed placenta accreta spectrum managed in resource-poor settings in Colombia and Indonesia. Study design: This was a retrospective case series of women with histologically confirmed placenta accreta spectrum treated in 2 placenta accreta spectrum centers after referral from remote resource-poor hospitals. Clinical outcomes were analyzed according to the initial type of management: (1) no cesarean delivery; (2) placenta left in situ after cesarean delivery; (3) partial removal of the placenta after cesarean delivery; and (4) post-cesarean hysterectomy. In addition, we evaluated the use of telemedicine by comparing the outcomes of women in hospitals that used the support of a placenta accreta spectrum center during the initial surgery with those that did not. Results: A total of 29 women who were initially managed in Colombia (n=2) and Indonesia (n=27) were included. The lowest volume of blood loss and the lowest frequency of complications occurred in women who underwent deferred cesarean delivery (n=5; 17.2%) and in those who had delayed placental delivery (n=5; 20.7%). Five maternal deaths (14%) occurred in the group that did not receive telehelp, and 4 women died of irreversible shock because of uncontrolled bleeding. Conclusion: Previously undiagnosed placenta accreta spectrum in resource-poor hospitals was associated with a high risk of maternal mortality. Open-close abdominal surgery or leaving the placenta in situ appear to be the best choices for unexpected placenta accreta spectrum management in resource-poor settings. Telemedicine with a placenta accreta spectrum center may improve prognosis.
Project description: Host biomarker testing can be used as an adjunct to the clinical assessment of patients with infections and might be particularly impactful in resource-constrained settings. Research on the merits of this approach at peripheral levels of low- and middle-income country health systems is limited, in part because of resource-intensive requirements for sample collection, processing, and storage. We evaluated the stability of 16 endothelial and immune activation biomarkers implicated in the host response to infection, stored in venous plasma and dried blood spot specimens at different temperatures for 6 months. We found that -80°C storage offered no clear advantage over -20°C for plasma aliquots, and that most biomarkers studied could safely be stored as dried blood spots at refrigeration temperature (4°C) for up to 3 months. These results identify more practical methods for host biomarker testing in resource-limited environments, which could help facilitate research in rural and remote settings.