A pathologist's optical microscopic examination of thinly cut, stained tissue on glass slides prepared from formalin-fixed, paraffin-embedded tissue blocks is the gold standard for tissue diagnostics. In addition, the diagnostic abilities and expertise of pathologists depend on their direct experience with common and rarer variant morphologies. Recently, deep learning approaches have demonstrated a high level of accuracy on such diagnostic tasks. However, obtaining expert-annotated images is expensive and time-consuming, and artificially synthesized histologic images can prove greatly beneficial. In this study, we present an approach that not only generates histologic images reproducing the diagnostic morphologic features of common diseases but also gives users the ability to generate new and rare morphologies. Our approach involves developing a generative adversarial network model that synthesizes pathology images constrained by class labels. We investigated the ability of this framework to synthesize realistic prostate and colon tissue images and assessed the utility of these images in augmenting the diagnostic ability of machine learning methods and their usability by a panel of experienced anatomic pathologists. Synthetic data generated by our framework performed similarly to real data when training a deep learning model for diagnosis. Pathologists were not able to distinguish between real and synthetic images, and their analyses showed a similar level of interobserver agreement for prostate cancer grading. We extended the approach to significantly more complex images from colon biopsies and showed that the morphology of the complex microenvironment in such tissues can be reproduced. Finally, we present the ability for a user to generate deepfake histologic images using a simple markup of semantic labels.
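The abstract describes a generative adversarial network conditioned on class labels but does not specify an architecture. The following is a minimal sketch of a class-conditional GAN generator in PyTorch; the layer sizes, latent dimension, class count, and all names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a class-conditional GAN generator (PyTorch).
# All architectural choices below are illustrative assumptions.
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Maps a noise vector plus a class label (e.g., a tissue or
    grade category) to a synthetic image patch."""
    def __init__(self, latent_dim=128, n_classes=5, img_channels=3):
        super().__init__()
        # learn an embedding for each class label so the generator
        # can be steered toward a requested morphology
        self.label_embed = nn.Embedding(n_classes, latent_dim)
        self.net = nn.Sequential(
            # project noise+label vector to a 4x4 feature map
            nn.ConvTranspose2d(latent_dim * 2, 512, 4, 1, 0, bias=False),
            nn.BatchNorm2d(512), nn.ReLU(True),
            nn.ConvTranspose2d(512, 256, 4, 2, 1, bias=False),  # 8x8
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),  # 16x16
            nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),   # 32x32
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, img_channels, 4, 2, 1),      # 64x64
            nn.Tanh(),
        )

    def forward(self, z, labels):
        # concatenate the noise vector with the class embedding,
        # then reshape to a 1x1 spatial map for the conv stack
        c = self.label_embed(labels)
        x = torch.cat([z, c], dim=1).unsqueeze(-1).unsqueeze(-1)
        return self.net(x)

# Usage: synthesize a batch of patches for one requested class.
gen = ConditionalGenerator()
z = torch.randn(4, 128)
labels = torch.full((4,), 2, dtype=torch.long)  # hypothetical class index
fake_images = gen(z, labels)                    # shape: (4, 3, 64, 64)
```

In the same spirit, the "simple markup of semantic labels" described in the abstract would replace the single class index with a per-pixel semantic label map as the conditioning input (as in semantic image synthesis methods); the sketch above shows only the simpler class-conditional case.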
Keywords: computer vision; deep learning; deepfake pathology; digital pathology; synthetic data.