Project description: Recent research has explored the possibility of building attitudinal resistance against online misinformation through psychological inoculation. The inoculation metaphor relies on a medical analogy: by pre-emptively exposing people to weakened doses of misinformation, cognitive immunity can be conferred. A recent example is the Bad News game, an online fake news game in which players learn about six common misinformation techniques. We present a replication and extension of research into the effectiveness of Bad News as an anti-misinformation intervention. We address three shortcomings identified in the original study: the lack of a control group, the relatively low number of test items, and the absence of attitudinal certainty measurements. Using a 2 (treatment vs. control) × 2 (pre vs. post) mixed design (N = 196), we measure participants' ability to spot misinformation techniques in 18 fake headlines before and after playing Bad News. We find that playing Bad News significantly improves people's ability to spot misinformation techniques compared to a gamified control group and, crucially, also increases people's level of confidence in their own judgments. Importantly, this confidence boost only occurred for those who updated their reliability assessments in the correct direction. This study offers further evidence for the effectiveness of psychological inoculation against not only specific instances of fake news, but the very strategies used in its production. Implications are discussed for inoculation theory and cognitive science research on fake news.
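As an illustrative sketch only (not the authors' analysis code), the group × time interaction in a 2 (treatment vs. control) × 2 (pre vs. post) mixed design like this can be tested by computing each participant's pre-to-post change score and comparing the two groups; the simulated data and effect sizes below are assumptions for demonstration.

```python
# Illustrative sketch of a 2 (group) x 2 (pre/post) mixed-design analysis.
# The simulated ratings and assumed effect sizes are hypothetical, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_group = 98  # two groups of 98 would give N = 196

# Simulated reliability ratings of fake headlines (lower = judged less reliable)
pre_treat = rng.normal(4.0, 1.0, n_per_group)
post_treat = pre_treat - rng.normal(0.8, 0.5, n_per_group)   # assumed inoculation effect
pre_ctrl = rng.normal(4.0, 1.0, n_per_group)
post_ctrl = pre_ctrl - rng.normal(0.1, 0.5, n_per_group)     # assumed negligible change

# The group x time interaction can be tested by comparing pre-to-post
# change (gain) scores between groups with Welch's t-test.
gain_treat = post_treat - pre_treat
gain_ctrl = post_ctrl - pre_ctrl
t, p = stats.ttest_ind(gain_treat, gain_ctrl, equal_var=False)
print(f"interaction (gain-score) test: t = {t:.2f}, p = {p:.4f}")
```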
Project description: Background: The main goal of whole-transcriptome analysis is to correctly identify all transcripts expressed within a specific cell or tissue, at a particular stage and condition, to determine their structures, and to measure their abundances. RNA-seq data promise to allow identification and quantification of the transcriptome at an unprecedented level of resolution and accuracy, and at low cost. Several computational methods have been proposed to achieve these purposes. However, it is still not clear which promises are already met and which challenges remain open and require further methodological development. Results: We carried out a simulation study to assess the performance of five widely used tools: CEM, Cufflinks, iReckon, RSEM, and SLIDE. All were run with default parameters. In particular, we considered the effect of three different scenarios: the availability of complete annotation, incomplete annotation, and no annotation at all. Moreover, comparisons were carried out using the methods in three different modes of action. In the first mode, the methods were forced to deal only with those isoforms present in the annotation; in the second mode, they were allowed to detect novel isoforms using the annotation as a guide; in the third mode, they operated in a fully data-driven way (although with the support of the alignment to the reference genome). In the latter mode, precision and recall are quite poor. On the contrary, results are better with the support of the annotation, even when it is incomplete. Finally, the abundance estimation error often shows a very skewed distribution. Performance strongly depends on the true abundance of the isoforms. Lowly (and sometimes also moderately) expressed isoforms are poorly detected and estimated; in particular, lowly expressed isoforms are identified mainly when they are provided in the original annotation as potential isoforms. Conclusions: Both detection and quantification of all isoforms from RNA-seq data are still hard problems, and they are affected by many factors. Overall, performance changes substantially depending on the mode of action and on the type of available annotation. Methods run with complete or partial annotation are able to detect most of the expressed isoforms, even though the number of false positives is often high. Fully data-driven approaches require more attention, at least for complex eukaryotic genomes. Improvements are desirable, especially for isoform quantification and for the detection of low-abundance isoforms.
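For illustration only, the kind of benchmarking described above (detection precision/recall against a simulated ground truth, plus relative abundance-estimation error) can be computed as in the sketch below; the isoform identifiers and abundance values are invented and this does not reproduce the study's actual pipeline.

```python
# Hypothetical sketch of isoform-detection benchmarking against a simulated ground truth.
# Isoform IDs and abundances are invented for illustration.
true_abundance = {"iso1": 120.0, "iso2": 3.5, "iso3": 40.0, "iso4": 0.8}   # simulated truth (e.g. FPKM)
estimated = {"iso1": 110.0, "iso3": 55.0, "iso5": 7.0}                      # output of some tool

truth = set(true_abundance)
detected = set(estimated)

tp = len(truth & detected)           # correctly recovered isoforms
fp = len(detected - truth)           # spurious isoforms
fn = len(truth - detected)           # missed isoforms (often the lowly expressed ones)

precision = tp / (tp + fp) if (tp + fp) else 0.0
recall = tp / (tp + fn) if (tp + fn) else 0.0
print(f"precision = {precision:.2f}, recall = {recall:.2f}")

# Relative abundance-estimation error for the correctly detected isoforms;
# its distribution across isoforms is typically skewed.
for iso in sorted(truth & detected):
    rel_err = (estimated[iso] - true_abundance[iso]) / true_abundance[iso]
    print(f"{iso}: relative error = {rel_err:+.2f}")
```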
Project description: Facial expressions carry key information about an individual's emotional state. Research into the perception of facial emotions typically employs static images of a small number of artificially posed expressions taken under tightly controlled experimental conditions. However, such approaches risk missing potentially important facial signals and within-person variability in expressions. The extent to which patterns of emotional variance in such images resemble more natural ambient facial expressions remains unclear. Here we advance a novel protocol for eliciting natural expressions from dynamic faces, using a dimension of emotional valence as a test case. Subjects were video-recorded while delivering either positive or negative news to camera, but were not instructed to deliberately or artificially pose any specific expressions or actions. A PCA-based active appearance model was used to capture the key dimensions of facial variance across frames. Linear discriminant analysis distinguished facial change determined by the emotional valence of the message, and this also generalised across subjects. By sampling along the discriminant dimension, and back-projecting into the image space, we extracted a behaviourally interpretable dimension of emotional valence. This dimension highlighted changes commonly represented in traditional face stimuli, such as variation in the internal features of the face, but also key postural changes that would typically be controlled away, such as a dipping versus raising of the head posture from negative to positive valences. These results highlight the importance of natural patterns of facial behaviour in emotional expressions, and demonstrate the efficacy of using data-driven approaches to study the representation of these cues by the perceptual system. The protocol and model described here could be readily extended to other emotional and non-emotional dimensions of facial variance.
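As a rough sketch of the data-driven approach described (not the authors' actual model), one can reduce frame-wise appearance features with PCA, find a valence discriminant with LDA, and back-project points sampled along that discriminant into the original feature space; the feature matrix, labels, and dimensionalities below are assumptions.

```python
# Illustrative sketch: PCA dimensionality reduction, LDA valence discriminant,
# and back-projection of samples taken along the discriminant axis.
# The per-frame feature matrix, labels, and dimensions are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_frames, n_features = 400, 200
X = rng.normal(size=(n_frames, n_features))          # stand-in for per-frame appearance features
y = rng.integers(0, 2, n_frames)                     # 0 = negative, 1 = positive valence
X[y == 1, :5] += 1.0                                 # inject an artificial valence signal

pca = PCA(n_components=20).fit(X)
X_pca = pca.transform(X)

lda = LinearDiscriminantAnalysis().fit(X_pca, y)
direction = lda.coef_[0] / np.linalg.norm(lda.coef_[0])   # discriminant axis in PCA space

# Sample along the discriminant (negative -> positive valence) and
# back-project into the original feature space for visualisation.
centre = X_pca.mean(axis=0)
for t in (-3.0, 0.0, 3.0):
    point = centre + t * direction
    reconstructed = pca.inverse_transform(point.reshape(1, -1))[0]
    print(t, reconstructed[:5].round(2))
```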
Project description: Background/objectives: We sought to determine whether statin use for primary prevention is associated with a lower risk of cardiovascular events or mortality in older men. Design: Prospective cohort study. Setting: Physicians' Health Study participants. Participants: 7,213 male physicians aged ≥70 years without a history of cardiovascular disease (CVD). Measurements: Multivariable propensity score for statin use with greedy matching (1:1) to minimize confounding by indication. Results: Median baseline age was 77 (range 70-102), and median follow-up was 7 years. Non-users were matched to 1,130 statin users. Statin use was associated with an 18% lower risk of all-cause mortality, HR 0.82 (95% CI 0.69-0.98), and with nonsignificantly lower risks of CVD events, HR 0.86 (95% CI 0.70-1.06), and stroke, HR 0.70 (95% CI 0.45-1.09). In subgroup analyses, results did not change according to age group at baseline (70-76 or >76 years) or functional status. There was a suggestion that those >76 at baseline did not benefit from statins for mortality, HR 1.14 (95% CI 0.89-1.47), compared to those 70-76 at baseline, HR 0.83 (95% CI 0.61-1.11); however, the CIs of the two groups overlap, suggesting no difference. Statin users with elevated total cholesterol had fewer major CVD events than non-users, HR 0.68 (95% CI 0.50-0.94) and HR 1.43 (95% CI 0.99-2.07), respectively. Conclusions: Statin use was associated with a significantly lower risk of mortality in older male physicians aged ≥70 and a nonsignificantly lower risk of CVD events. Results did not change in those who were >76 years at baseline or according to functional status. There was a suggestion that those with elevated total cholesterol may benefit. Further work is needed to determine which older individuals will benefit from statins as primary prevention.
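A minimal sketch of the analysis strategy described (a propensity score for statin use, greedy 1:1 matching, then a survival model on the matched sample) is shown below; the variable names, simulated data, and the use of scikit-learn/lifelines are assumptions, not the study's actual code.

```python
# Hypothetical sketch: propensity-score estimation, greedy 1:1 matching,
# and a Cox model on the matched cohort. Data and variable names are simulated.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.uniform(70, 95, n),
    "chol": rng.normal(210, 30, n),
    "statin": rng.integers(0, 2, n),
})
df["time"] = rng.exponential(7, n)      # simulated follow-up (years)
df["death"] = rng.integers(0, 2, n)     # simulated event indicator

# 1. Propensity score: probability of statin use given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "chol"]], df["statin"])
df["ps"] = ps_model.predict_proba(df[["age", "chol"]])[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = df[df["statin"] == 1]
controls = df[df["statin"] == 0].copy()
matched_idx = []
for i, ps in treated["ps"].items():
    if controls.empty:
        break
    j = (controls["ps"] - ps).abs().idxmin()   # closest remaining non-user
    matched_idx.extend([i, j])
    controls = controls.drop(j)
matched = df.loc[matched_idx]

# 3. Cox proportional-hazards model on the matched sample.
cph = CoxPHFitter().fit(matched[["time", "death", "statin"]],
                        duration_col="time", event_col="death")
print(cph.summary)
```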
Project description: In our experiment, we tested how exposure to a mock televised news segment, with systematically manipulated emotional valence of the voiceover, images, and TV tickers (in the updating format), affects viewers' perceptions. Subjects (N = 603) watched specially prepared professional video material that portrayed the story of a candidate for local mayor. Following exposure to the video, subjects assessed the politician in terms of competence, sociability, and morality. Results showed that positive images improved the assessment of the politician, whereas negative images lowered it. In addition, and unexpectedly, positive tickers led to more negative assessments, while negative tickers led to more favourable assessments. However, when the voiceover was inconsistent with the information provided in the visual add-ons, the additional elements were apparently ignored, especially when they were negative and the narrative was positive. We discuss the implications of these findings.
Project description: OBJECTIVES: To determine the efficacy and safety of statins for primary prevention of atherosclerotic cardiovascular disease (ASCVD) events in older adults, especially those aged 80 and older and with multimorbidity. METHODS: The National Institute on Aging and the National Heart, Lung, and Blood Institute convened a multidisciplinary expert panel from July 31 to August 1, 2017, to review existing evidence, identify knowledge gaps, and consider whether statin safety and efficacy data in persons aged 75 and older without ASCVD are sufficient; whether existing data can inform the feasibility, design, and implementation of future statin trials in older adults; and which clinical trial options and designs could address knowledge gaps. This article summarizes the presentations and discussions at that workshop. RESULTS: There is insufficient evidence regarding the benefits and harms of statins in older adults, especially those with concomitant frailty, polypharmacy, comorbidities, and cognitive impairment; a lack of tools to assess ASCVD risk in those aged 80 and older; and a paucity of evidence on the effect of statins on outcomes of importance to older adults, such as statin-associated muscle symptoms, cognitive function, and incident diabetes mellitus. Prospective, traditional, placebo-controlled, randomized clinical trials (RCTs) and pragmatic RCTs appear to be suitable options to address these critical knowledge gaps. Future trials will need to consider greater representation of very old adults, women, underrepresented minorities, and individuals of differing health, cognitive, socioeconomic, and educational backgrounds. Feasibility analyses from existing large healthcare networks confirm appropriate power for death and cardiovascular outcomes for future RCTs in this area. CONCLUSION: Existing data cannot address uncertainties about the benefits and harms of statins for primary ASCVD prevention in adults aged 75 and older, especially those with comorbidities, frailty, and cognitive impairment. Evidence from one or more RCTs could address these important knowledge gaps and inform person-centered decision-making. J Am Geriatr Soc 66:2188-2196, 2018.
Project description: BACKGROUND: The health care industry has more insider breaches than any other industry. Soon-to-be graduates are the trusted insiders of tomorrow, and their knowledge can be used to compromise organizational security systems. OBJECTIVE: The objective of this paper was to identify the role that monetary incentives play in the willingness of the next generation of employees to violate the Health Insurance Portability and Accountability Act's (HIPAA) regulations and privacy laws. The research model was developed using the economics of crime literature and rational choice theory. The primary research question was whether higher perceptions of being apprehended for violating HIPAA regulations were related to higher requirements for monetary incentives. METHODS: Five scenarios were developed to determine whether monetary incentives could be used to influence subjects to illegally obtain health care information and to release that information to individuals and media outlets. The subjects were also asked about the probability of getting caught for violating HIPAA laws. Correlation analysis was used to determine whether higher perceptions of being apprehended for violating HIPAA regulations were related to higher requirements for monetary incentives. RESULTS: Many of the subjects believed there was a high probability of being caught. Nevertheless, many of them could be incentivized to violate HIPAA laws. In the nursing scenario, 45.9% (240/523) of the participants indicated that there is a price, ranging from US $1000 to over US $10 million, that is acceptable for violating HIPAA laws. In the doctors' scenario, 35.4% (185/523) of the participants indicated that there is a price, ranging from US $1000 to over US $10 million, for violating HIPAA laws. In the insurance agent scenario, 45.1% (236/523) of the participants indicated that there is a price, ranging from US $1000 to over US $10 million, for violating HIPAA laws. When a personal context is involved, the percentages increase substantially. In the scenario in which an experimental treatment not covered by insurance is needed for the subject's mother, 78.4% (410/523) of the participants would accept US $100,000 from a media outlet for the medical records of a politician. In the scenario in which US $50,000 is needed to obtain medical records about a famous reality star to help a friend in need of emergency medical transportation, 64.6% (338/523) of the participants would accept the money. CONCLUSIONS: A key finding of this study is that individuals perceiving a high probability of being caught are less likely to release private information. However, when the personal context involves a friend or family member, such as a mother, they will probably succumb to the incentive, regardless of the probability of being caught. The key to reducing noncompliance will be to implement organizational procedures and to constantly monitor and develop educational and training programs that encourage HIPAA compliance.
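Purely as an illustration of the correlation analysis described (not the authors' code), one might relate each respondent's perceived probability of being caught to the minimum incentive they reported requiring, for example with a rank correlation; the variable names and data below are invented.

```python
# Hypothetical sketch of the correlation analysis: perceived probability of
# apprehension vs. minimum monetary incentive required. Data are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 523
perceived_catch_prob = rng.uniform(0, 1, n)   # self-rated probability of being caught
# Assume (for illustration) that a higher perceived risk demands a larger payoff.
min_incentive_usd = 1_000 * np.exp(4 * perceived_catch_prob) * rng.lognormal(0, 0.5, n)

rho, p = stats.spearmanr(perceived_catch_prob, min_incentive_usd)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```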
Project description: Purpose: To present a novel method for meta-analysis of the fractionation sensitivity of tumors, as applied to prostate cancer, in the presence of an overall time factor. Methods and materials: A systematic search for radiation dose-fractionation trials in prostate cancer was performed using PubMed and by manual search. Published trials comparing standard fractionated external beam radiation therapy with alternative fractionation were eligible. For each trial the α/β ratio and its 95% confidence interval (CI) were extracted, and the data were synthesized with each study weighted by the inverse variance. An overall time factor was included in the analysis, and its influence on α/β was investigated. Results: Five studies involving 1965 patients were included in the meta-analysis of α/β. The synthesized α/β, assuming no effect of overall treatment time, was -0.07 Gy (95% CI -0.73 to 0.59), which increased to 0.47 Gy (95% CI -0.55 to 1.50) if a single highly weighted study was excluded. In a separate analysis, 2 studies based on 10,808 patients in total allowed extraction of a synthesized estimate of a time factor of 0.31 Gy/d (95% CI 0.20-0.42). Including the time factor increased the α/β estimate to 0.58 Gy (95% CI -0.53 to 1.69) with the heavily weighted study included, and to 1.93 Gy (95% CI -0.27 to 4.14) with it excluded. An analysis of the uncertainty of the α/β estimate showed a loss of information when the hypofractionated arm was underdosed compared with the normo-fractionated arm. Conclusions: The current external beam fractionation studies are consistent with a very low α/β ratio for prostate cancer, although the CIs include α/β ratios up to 4.14 Gy in the presence of a time factor. Details of the dose fractionation in the 2 trial arms have a critical influence on the information that can be extracted from a study. Studies with unfortunate designs will supply little or no information about α/β regardless of the number of subjects enrolled.
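As a minimal sketch of the inverse-variance synthesis step (assuming, as is conventional, that each study's standard error can be recovered from its 95% CI as (upper - lower)/(2 × 1.96)), the pooling could look like the code below; the per-study α/β values and CIs are placeholders, not the trial data.

```python
# Minimal sketch of fixed-effect inverse-variance pooling of per-study alpha/beta
# estimates. The study estimates and CIs are placeholders, not the real data.
import math

# (estimate in Gy, CI lower, CI upper) -- hypothetical values
studies = [(-0.5, -2.0, 1.0), (0.3, -1.2, 1.8), (1.1, -0.4, 2.6),
           (0.0, -1.5, 1.5), (-0.2, -1.0, 0.6)]

weights, weighted_sum = 0.0, 0.0
for est, lo, hi in studies:
    se = (hi - lo) / (2 * 1.96)          # SE recovered from the 95% CI
    w = 1.0 / se**2                      # inverse-variance weight
    weights += w
    weighted_sum += w * est

pooled = weighted_sum / weights
pooled_se = math.sqrt(1.0 / weights)
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled alpha/beta = {pooled:.2f} Gy, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```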