Segmentation priors from local image properties: without using bias field correction, location-based templates, or registration.
ABSTRACT: We present a novel approach for generating information about a voxel's tissue class membership based on its signature: a collection of local image textures estimated over a range of neighborhood sizes. The approach produces a form of tissue class priors that can be used to initialize and regularize image segmentation. The signature-based approach is a departure from current location-based methods, which derive tissue class likelihoods based on a voxel's location in standard template space. To use location-based priors, one needs to register the volume in question to the template space and estimate the image intensity bias field. Two optimizations, over more than a thousand parameters, are needed when high-order nonlinear registration is employed. In contrast, the signature-based approach is independent of volume orientation and voxel position, and is largely insensitive to bias fields. For these reasons, the approach does not require the use of population-derived templates. The prior information is generated from variations in image texture statistics as a function of spatial scale, and an SVM approach is used to associate signatures with tissue types. With the signature-based approach, optimization is needed only during the training phase, for estimating the parameters of the SVM hyperplanes and the associated PDFs; this training process is separate from the segmentation step. We found that signature-based priors were superior to location-based priors aligned under favorable conditions, and that signature-based priors result in improved segmentation when replacing location-based ones in FAST (Zhang et al., 2001), a widely used segmentation program. The software implementation of this work is freely available as part of AFNI (http://afni.nimh.nih.gov).
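The pipeline the abstract describes, in outline, is: compute local texture statistics at several neighborhood sizes, stack them into a per-voxel signature, and let a probability-calibrated SVM map signatures to tissue-class priors. A minimal sketch of that idea follows; it is not the AFNI implementation, and the choice of statistics (local mean and standard deviation), the scales, and the synthetic two-tissue volume are all illustrative assumptions.

```python
# Sketch only: multi-scale signatures + SVM tissue priors.
# Local mean/std as the "texture" statistics is an assumption,
# not the paper's exact feature set.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.svm import SVC

def voxel_signatures(vol, scales=(3, 5, 9)):
    """Stack local mean and std over a range of cubic neighborhood sizes."""
    feats = []
    for s in scales:
        mu = uniform_filter(vol, size=s)
        sq = uniform_filter(vol * vol, size=s)
        sd = np.sqrt(np.maximum(sq - mu * mu, 0.0))
        feats += [mu, sd]
    # one row per voxel, 2 * len(scales) features
    return np.stack([f.ravel() for f in feats], axis=1)

# Synthetic volume: two "tissues" with the same mean intensity but
# different local texture (noise level), so position carries no signal.
rng = np.random.default_rng(0)
vol = np.empty((16, 16, 16))
vol[:, :8, :] = 1.0 + 0.05 * rng.standard_normal((16, 8, 16))
vol[:, 8:, :] = 1.0 + 0.50 * rng.standard_normal((16, 8, 16))
labels = np.zeros((16, 16, 16), dtype=int)
labels[:, 8:, :] = 1

X = voxel_signatures(vol)
y = labels.ravel()
# probability=True yields per-class probabilities, playing the role
# of the PDFs associated with the SVM hyperplanes in the abstract
clf = SVC(probability=True).fit(X, y)
priors = clf.predict_proba(X)  # per-voxel tissue "priors"
```

Because the features are functions of a voxel's neighborhood only, the resulting priors are independent of volume orientation and position, which is the property the abstract contrasts with registration-dependent, template-based priors.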
SUBMITTER: Vovk A
PROVIDER: S-EPMC3031751 | biostudies-literature | 2011 Mar
REPOSITORIES: biostudies-literature