Local Attention Distillation for Efficient Semantic Segmentation
27 Pages · Posted: 29 Feb 2024
Abstract
Efficient semantic segmentation plays a crucial role in computer vision, and knowledge distillation has gained significant attention as a promising methodology for enhancing model efficiency. Nevertheless, current knowledge distillation approaches for efficient semantic segmentation predominantly prioritize distilling global correlations and often overlook significant regions within positive samples, which restricts the learning of locally distinctive features. To address these drawbacks, we propose Local Attention Distillation (LAD), a block-based knowledge distillation approach. By partitioning feature maps into non-overlapping blocks, LAD emphasizes local positive-sample features and facilitates more effective learning of locally discriminative features within each block. To assess the validity of LAD, we conducted comprehensive experiments on Cityscapes, CamVid, and Pascal VOC 2012, and compared LAD against several state-of-the-art distillation techniques; the comparative analysis demonstrates the efficacy of the proposed method.
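The core mechanism described above — partitioning feature maps into non-overlapping blocks and distilling attention within each block — can be sketched as follows. This is only an illustrative sketch under assumptions of my own (block size, channel-averaged spatial attention, and a KL-divergence mismatch term are placeholders); the paper's actual loss formulation and block configuration are not specified in this abstract.

```python
import numpy as np

def partition_blocks(feat, block):
    """Split a (C, H, W) feature map into non-overlapping (C, block, block) tiles.
    Assumes H and W are divisible by the block size (an assumption of this sketch)."""
    C, H, W = feat.shape
    assert H % block == 0 and W % block == 0
    return (feat.reshape(C, H // block, block, W // block, block)
                .transpose(1, 3, 0, 2, 4)        # (nH, nW, C, block, block)
                .reshape(-1, C, block, block))   # (num_blocks, C, block, block)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def local_attention_kl(teacher, student, block=4, tau=1.0):
    """Per-block spatial-attention mismatch between teacher and student features.
    Channel-averaged activations are turned into a softmax distribution over the
    positions inside each block, then compared with KL divergence (a common
    distillation loss; the paper's exact choice may differ)."""
    t_tiles = partition_blocks(teacher, block)
    s_tiles = partition_blocks(student, block)
    # Collapse channels, flatten each block's spatial positions, softmax per block.
    t_att = softmax(t_tiles.mean(axis=1).reshape(len(t_tiles), -1) / tau)
    s_att = softmax(s_tiles.mean(axis=1).reshape(len(s_tiles), -1) / tau)
    kl = (t_att * (np.log(t_att + 1e-8) - np.log(s_att + 1e-8))).sum(axis=1)
    return kl.mean()  # average over all blocks
```

Because the softmax is normalized within each block rather than over the whole feature map, a block dominated by one class still produces a sharp local distribution, which is the intuition behind emphasizing local discriminative features rather than global correlations.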
Keywords: Knowledge distillation, Semantic Segmentation, local attention, global attention, channel-wise distillation