Multi-Patch Blending improves lung cancer growth pattern segmentation in whole-slide images

Z. Swiderska-Chadaj, E. Stoelinga, A. Gertych and F. Ciompi

IEEE International Conference on Computational Problems of Electrical Engineering 2020.


In this study, we introduce a technique to generate synthetic histologic image data by blending parts of different images into a new image patch. The proposed approach, which we call multi-patch blending (MPB), crops parts of two histologic images of tumor growth patterns, each annotated with a single but different label, and pastes them into a newly created image patch comprising areas with two different annotations. A Cycle-GAN model is employed in MPB to smooth out transitions between the pasted image crops and make the output patch look realistic. The goal of MPB is to support semantic segmentation of lung adenocarcinoma growth patterns in whole-slide images (WSIs). We used MPB to increase the number of training patches extracted from a set of 18 WSIs with sparse annotations; in effect, MPB serves as a novel data augmentation strategy. A U-Net trained with MPB-generated patches achieved a 13% higher F1-score than a U-Net trained with the original (sparsely annotated) patches in a 4-class semantic segmentation task.
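For illustration only, the sketch below shows one plausible reading of the compositing step described in the abstract: two single-label patches are cropped and pasted side by side into a new patch with a matching per-pixel label mask. The function name, the `split_frac` parameter, and the simple vertical seam are assumptions, not details from the paper, and the Cycle-GAN step that smooths the transition between the pasted crops is not shown.

```python
import numpy as np

def multi_patch_blend(patch_a, patch_b, label_a, label_b, split_frac=0.5):
    """Compose a two-label training patch from two single-label patches.

    patch_a, patch_b : (H, W, 3) uint8 RGB crops, each annotated with a
                       single (but different) growth-pattern class.
    label_a, label_b : integer class ids of the two source patches.
    split_frac       : fraction of the output width taken from patch_a
                       (hypothetical parameter for this sketch).

    Returns the composed image and its per-pixel label mask. The seam is
    left as a hard edge here; in the paper a Cycle-GAN is applied
    afterwards to make the blended patch look realistic.
    """
    assert patch_a.shape == patch_b.shape
    h, w, _ = patch_a.shape
    split = int(w * split_frac)

    composed = np.empty_like(patch_a)
    composed[:, :split] = patch_a[:, :split]   # left part from class A
    composed[:, split:] = patch_b[:, split:]   # right part from class B

    mask = np.empty((h, w), dtype=np.int64)
    mask[:, :split] = label_a
    mask[:, split:] = label_b
    return composed, mask


if __name__ == "__main__":
    # Toy example with random "tissue" patches standing in for two classes.
    rng = np.random.default_rng(0)
    class_a = rng.integers(0, 255, (256, 256, 3), dtype=np.uint8)
    class_b = rng.integers(0, 255, (256, 256, 3), dtype=np.uint8)
    img, msk = multi_patch_blend(class_a, class_b, label_a=1, label_b=2)
    print(img.shape, np.unique(msk))   # (256, 256, 3) [1 2]
```

The composed image-mask pairs would then be added to the training set alongside the original sparsely annotated patches, which is how MPB acts as a data augmentation strategy.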