Data processing workflow for large-scale immune monitoring studies by mass cytometry.
ABSTRACT: Mass cytometry is a powerful tool for deep immune monitoring studies. To ensure maximal data quality, careful experimental and analytical design is required. However, even in well-controlled experiments, variability caused by either the operator or the instrument can introduce artifacts that need to be corrected or removed from the data. Here we present a data processing pipeline that minimizes experimental artifacts and batch effects while improving data quality. Data preprocessing and quality controls are carried out using an R pipeline and packages like CATALYST for bead normalization and debarcoding, flowAI and flowCut for signal anomaly cleaning, AOF for file quality control, flowClean and flowDensity for gating, CytoNorm for batch normalization, and FlowSOM and UMAP for data exploration. As proper experimental design is key to obtaining good-quality events, we also include the sample processing protocol used to generate the data. Both the analytical and experimental pipelines are easy to scale up; the workflow presented here is therefore particularly suitable for large-scale, multicenter, multibatch, and retrospective studies.
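To illustrate how these steps could be chained together, below is a minimal R sketch assuming the publicly documented APIs of the packages named in the abstract (CATALYST's prepData/normCytof/assignPrelim, flowAI's flow_auto_qc, CytoNorm's CytoNorm.train/CytoNorm.normalize, FlowSOM, and uwot's umap). All file names, channel selections, and parameter values are hypothetical placeholders, not the authors' actual settings.

    library(CATALYST)   # bead normalization and debarcoding
    library(flowCore)   # FCS file handling
    library(flowAI)     # automated signal anomaly cleaning
    library(CytoNorm)   # batch normalization
    library(FlowSOM)    # self-organizing-map clustering
    library(uwot)       # UMAP embedding

    channels <- c("Er168Di", "Yb174Di")         # hypothetical marker channels

    ## 1) Load raw data and bead-normalize (EQ Four Element beads)
    sce <- prepData("raw_batch1.fcs")           # hypothetical input file
    res <- normCytof(sce, beads = "dvs", k = 50)
    sce <- res$data                             # bead events removed

    ## 2) Debarcode multiplexed samples; sample_key is an example
    ##    barcoding scheme bundled with CATALYST (replace with your own)
    data(sample_key, package = "CATALYST")
    sce <- assignPrelim(sce, sample_key)
    sce <- applyCutoffs(estCutoffs(sce))

    ## 3) Clean signal anomalies in a debarcoded per-sample file
    ff <- flow_auto_qc("sample1_debarcoded.fcs")

    ## 4) Train a batch-normalization model on a reference sample
    ##    acquired in every batch, then normalize the study samples
    tl <- flowCore::transformList(channels, cytofTransform)
    model <- CytoNorm.train(
      files  = c("ref_batch1.fcs", "ref_batch2.fcs"),
      labels = c("batch1", "batch2"),
      channels = channels,
      transformList = tl,
      FlowSOM.params = list(nCells = 6000, xdim = 5, ydim = 5,
                            nClus = 10, scale = FALSE),
      normMethod.train = QuantileNorm.train)
    CytoNorm.normalize(model,
      files  = c("sample_batch1.fcs", "sample_batch2.fcs"),
      labels = c("batch1", "batch2"),
      transformList = tl,
      transformList.reverse =
        flowCore::transformList(channels, cytofTransform.reverse),
      outputDir = "normalized")

    ## 5) Explore: FlowSOM clustering and a UMAP embedding
    fsom <- FlowSOM(ff, colsToUse = channels, nClus = 20)
    emb  <- umap(flowCore::exprs(ff)[, channels])

Note that CytoNorm is designed to be trained on a reference control acquired in every batch, so the learned quantile adjustments capture instrument and batch drift rather than biological differences between study samples.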
SUBMITTER: Rybakowska P
PROVIDER: S-EPMC8188119 | biostudies-literature |
REPOSITORIES: biostudies-literature