BRACTIVATE: DENDRITIC BRANCHING IN MEDICAL IMAGE SEGMENTATION NEURAL ARCHITECTURE SEARCH

Abstract

Researchers manually compose most neural networks through painstaking experimentation. This process is taxing and explores only a limited subset of possible architectures. Researchers design architectures to address objectives ranging from low space complexity to high accuracy, often through hours of experimentation. Neural architecture search (NAS) is a thriving field for automatically discovering architectures that achieve these same objectives. Addressing these ever-increasing challenges in computing, we take inspiration from the brain, which has the most efficient neuronal wiring of any complex structure; its physiology motivates Bractivate, a NAS algorithm inspired by neural dendritic branching: an evolutionary algorithm that adds new skip connection combinations to the most active blocks in the network, propagating salient information through it. We apply our method to lung x-ray, cell nuclei microscopy, and electron microscopy segmentation tasks to highlight Bractivate's robustness. Moreover, our ablation studies emphasize dendritic branching's necessity: ablating these connections leads to significantly lower model performance. Finally, we compare our discovered architectures with other state-of-the-art UNet models, showing how efficient skip connections allow Bractivate to achieve comparable results with substantially lower space and time complexity, demonstrating how Bractivate balances efficiency with performance. We invite you to work with our code here: https://tinyurl.com/bractivate.

1. INTRODUCTION

Researchers manually composing neural networks must juggle multiple goals for their architectures. Architectures must make good decisions; they must be fast, and they should work even with limited computational resources. These goals are challenging to achieve manually, and researchers often spend months attempting to discover the perfect architecture. To overcome these challenges, we turn to the human brain's efficient neural wiring for automated architecture discovery. Neuroscience already underlies core neural network concepts: the perceptron (Rosenblatt, 1958) is directly analogous to a human neuron. One of the brain's fundamental learning mechanisms is dendritic branching (Greenough & Volkmar, 1973), whereby active neurons send out signals for other neurons to form connections, strengthening signals through that neural pathway. This neuroscience concept inspires us to devise Bractivate, a Neural Architecture Search (NAS) algorithm for learning new, efficient UNet architectures that can be trained twice as fast as the traditional UNet and are often one to two orders of magnitude lighter in trainable parameters. We apply Bractivate to three medical imaging segmentation problems: cell nuclei, electron microscopy, and chest X-ray lung segmentation.

Medical image segmentation is a growing field in Deep Learning Computer Assisted Detection (CAD): it is a powerful component in clinical decision support tools and has applications in retinal fundus image, lung scan, and mammography analysis. Most papers now approach medical image segmentation with the UNet (Ronneberger et al., 2015); the model architecture is straightforward: symmetric, hierarchical convolutional blocks form an initial contracting path and a final expanding path, joined at an apex bottleneck layer.
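The UNet's skip connections merge a contracting-path feature map into the parallel expanding-path block, either by channel concatenation or by element-wise addition. A minimal NumPy sketch (the function name and shapes are illustrative, not from the paper's code):

```python
import numpy as np

def skip_connect(encoder_map, decoder_map, op="concat"):
    """Merge a contracting-path feature map into its parallel
    expanding-path block, as in a UNet skip connection.
    encoder_map, decoder_map: arrays of shape (H, W, C)."""
    if op == "concat":
        # Concatenation stacks channels: result is (H, W, C1 + C2)
        return np.concatenate([encoder_map, decoder_map], axis=-1)
    if op == "add":
        # Addition requires matching channel counts: result is (H, W, C)
        return encoder_map + decoder_map
    raise ValueError(f"unknown op: {op}")

enc = np.ones((64, 64, 32))   # feature map from the contracting path
dec = np.zeros((64, 64, 32))  # feature map in the expanding path
print(skip_connect(enc, dec, "concat").shape)  # (64, 64, 64)
print(skip_connect(enc, dec, "add").shape)     # (64, 64, 32)
```

Concatenation preserves both signals at the cost of doubling the channel count that subsequent convolutions must process, while addition keeps the tensor size fixed.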
Between parallel contracting and expanding blocks, the traditional UNet contains skip connections that pass information through concatenation (Ronneberger et al., 2015). Traditional UNet skip connections involve feature map aggregation with same-scale convolutional blocks, but recent advances have yielded more complex connections, ranging from the UNet++ (Zhou et al., 2018) to the NasUNet (Weng et al., 2019). While the UNet is a powerful tool, it has several limitations:

1. The depth necessary for many segmentation tasks is initially unknown, and traditional neural architecture search (NAS) struggles to identify the optimal UNet depth.

2. Researchers often manually choose skip connection locations, leading to potentially missed optimal connections.

3. Scientists need a NAS algorithm addressing many implementation objectives, including computational time, number of model parameters, and robust segmentation performance.

On a broader level, discovering efficient UNet architectures is crucial because it can generate simpler models for applications on mobile devices, which need low latency for online learning. In the Telemedicine age, many medical applications rely on mobile Deep Learning to segment medical images and process raw patient data (Xu et al., 2017; Vaze et al., 2020). We address the Medical and Engineering fields' need for efficiency with Bractivate, a NAS algorithm that discovers lightweight UNet architectures for medical image segmentation tasks. We present the following three primary contributions:

1. An evolutionary algorithm that non-randomly samples from a distribution of UNet model depths and skip connection configurations, with both tensor concatenation and addition operators.

2. "Dendritic branching"-inspired mutations that, just as in the brain, cause salient UNet blocks to branch to other blocks through dendritic skip connections, creating efficient networks that preserve information signals throughout.

3. Bractivate generates high-performing models with lower space complexity than the current state-of-the-art.

The remainder of the paper is structured as follows: in Section 2, we discuss prior works and the gaps in the literature that inspire us to propose Bractivate. In Section 3, we describe the search algorithm and the dendritic branching mutation. In Section 4, we implement our algorithm in experiments ranging from varying the search space depth to an ablation study. We report our quantitative and qualitative results, along with baseline comparisons, in Section 5 before concluding in Section 6.
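The dendritic-branching idea behind contribution 2 can be sketched minimally: the most active block "branches" by gaining a new skip connection, with the merge operator drawn from the concatenation/addition pair. This is a hypothetical simplification for illustration; the function, its inputs, and the block identifiers are assumptions, not the paper's actual implementation:

```python
import random

def dendritic_branching_mutation(activations, skip_connections, rng=random):
    """One illustrative mutation step: the block with the highest mean
    activation magnitude branches, gaining a new skip connection to a
    randomly chosen target block.

    activations: dict mapping block_id -> mean activation magnitude
    skip_connections: set of (source_block, target_block, op) tuples
    """
    # Salient blocks branch first, mimicking active neurons in the brain.
    source = max(activations, key=activations.get)
    candidates = [b for b in activations if b != source]
    target = rng.choice(candidates)
    op = rng.choice(["concat", "add"])  # tensor merge operator
    skip_connections.add((source, target, op))
    return skip_connections

acts = {"enc1": 0.2, "enc2": 0.9, "bottleneck": 0.5, "dec1": 0.1}
skips = dendritic_branching_mutation(acts, set())
# the new connection originates at the most active block, "enc2"
```

In an evolutionary loop, candidate architectures would accumulate such mutations and be kept or discarded based on their segmentation performance and parameter count.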

2. RELATED WORKS

Deep learning algorithms are often restricted to manual model design (Simonyan & Zisserman, 2014; He et al., 2016; Oktay et al., 2018; Ronneberger et al., 2015). To automate model schemes, NAS is the process of selecting candidate architectures through various search strategies to achieve optimal performance (Elsken et al., 2019). Advances in NAS have branched into different areas, including evolutionary algorithms (Miller et al., 1989; de Garis, 1990; Yao, 1993; Fogel et al., 1990; Angeline et al., 1994; Real et al., 2018; Yao, 1999) and automatic pattern recognition (Cai et al., 2018; Radosavovic et al., 2020). While both approaches are merited, these works address image classification problems, and although some focus on skip connections, they lack deeper investigation of these connections.



Figure 1: Through Bractivate, we discover UNet architectures with high spatio-temporal efficiency by mimicking the brain's dendritic branching.

