The focus of these pages is large lesions, such as stroke lesions. These large lesions present problems not encountered with smaller punctate lesions (e.g., MS lesions):
Creating the Lesion Mask: Large stroke-related lesions are difficult to characterize. Manual delineation is laborious, and inter-rater reliability hovers around 80%. Fully automated routines perform poorly. We explore semi-automated pipelines that can improve both the speed and the consistency of drawing lesion masks.
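One common semi-automated idea is to flag voxels whose intensity is abnormal relative to a set of healthy control images, then hand-edit the result. The sketch below is illustrative only: the function name, the z-score threshold, and the toy data are assumptions, and a real pipeline would add registration to a common space plus spatial cleanup (e.g., smoothing and largest-cluster extraction with scipy.ndimage) before any manual editing.

```python
import numpy as np

def draft_lesion_mask(patient, control_mean, control_std, z_thresh=3.0):
    """Flag voxels whose intensity deviates from healthy controls.

    patient, control_mean, control_std: 3-D arrays in the same space.
    z_thresh: illustrative cutoff; a real pipeline would tune this and
    then clean the mask (smoothing, cluster filtering, manual edits).
    """
    z = (patient - control_mean) / np.maximum(control_std, 1e-6)
    return np.abs(z) > z_thresh

# Toy volume: a 10x10x10 image with a bright simulated "lesion" block.
rng = np.random.default_rng(0)
control_mean = np.full((10, 10, 10), 100.0)
control_std = np.full((10, 10, 10), 5.0)
patient = control_mean + rng.normal(0, 5, size=(10, 10, 10))
patient[3:6, 3:6, 3:6] += 50.0   # lesion: 10 standard deviations brighter

mask = draft_lesion_mask(patient, control_mean, control_std)
print(mask[4, 4, 4], int(mask.sum()))
```

The draft mask is a starting point for a human rater, which is where the gain in consistency comes from: raters edit a common initial mask rather than drawing from scratch.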
Working with Clinical Data: Although we use a large homogeneous dataset to test normalization techniques, real stroke studies may have more heterogeneous data. For example, stroke patients who cannot have MRI scans may sometimes have CT scans. Working with the available imaging data presents unique challenges.
Lesion Normalization: A large area of abnormal tissue can adversely affect how the image is warped into standard space. This can result in lesion shrinkage, as discussed in Andersen, Rapcsak, & Beeson (2010), "Cost function masking during normalization of brains with focal lesions: Still a necessity?", NeuroImage. Here we explore normalization in a homogeneous dataset, comparing the results of different tools and parameters.
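The intuition behind cost function masking is that lesioned voxels are excluded from the similarity measure that drives the warp, so the abnormal tissue cannot pull the registration toward shrinking the lesion. A minimal sketch using a sum-of-squared-differences cost (the function name and toy data are assumptions; SPM and other tools use their own cost functions and optimizers):

```python
import numpy as np

def ssd_cost(image, template, mask=None):
    """Mean sum-of-squared-differences between image and template.

    With cost function masking, voxels inside the lesion mask are
    excluded, so abnormal tissue does not contribute to the cost
    that the normalization routine minimizes.
    """
    diff = (image - template) ** 2
    if mask is not None:
        diff = diff[~mask]          # drop lesioned voxels
    return diff.mean()

template = np.full((8, 8, 8), 100.0)
image = template.copy()
image[2:4, 2:4, 2:4] = 20.0         # dark lesion; otherwise aligned
lesion = np.zeros_like(image, dtype=bool)
lesion[2:4, 2:4, 2:4] = True

print(ssd_cost(image, template))          # lesion dominates the cost
print(ssd_cost(image, template, lesion))  # masked: images match perfectly
```

Here the two images are already perfectly aligned outside the lesion, yet the unmasked cost is large; an optimizer minimizing it would distort the warp to "fix" the lesion, which is exactly the shrinkage artifact the masked cost avoids.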
Statistical Analysis of Lesions: Both the volume and the location of the lesion should be considered in statistical analyses; otherwise there is a danger of simply rediscovering that larger lesions impair more behaviors.
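One common way to handle this is to enter total lesion volume as a covariate in a per-voxel regression, so that a voxel's apparent effect on behavior is not just a proxy for overall lesion size. A minimal sketch with synthetic data (the function name, sample size, and all numbers are illustrative assumptions; real analyses would also address multiple comparisons and use proper inference):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
volume = rng.uniform(5, 50, n)                 # total lesion volume per patient
# Large lesions are more likely to hit any given voxel:
lesioned = (volume + rng.normal(0, 10, n) > 27).astype(float)
# Synthetic behavior driven ONLY by total volume, not by this voxel.
behavior = 90 - 0.8 * volume + rng.normal(0, 2, n)

def voxel_effect(lesioned, behavior, volume=None):
    """OLS slope for voxel status, optionally controlling for volume."""
    cols = [np.ones_like(behavior), lesioned]
    if volume is not None:
        cols.append(volume)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, behavior, rcond=None)
    return beta[1]                             # coefficient on voxel status

print(voxel_effect(lesioned, behavior))           # confounded by lesion size
print(voxel_effect(lesioned, behavior, volume))   # shrinks once volume is in
```

Because the voxel is hit more often in patients with large lesions, the uncontrolled model attributes the volume-driven deficit to the voxel; adding volume as a covariate removes most of that spurious effect.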