Toward Simplicity in Dynamic Inference: A Critical Study and Redesign of Early-Exit Networks

Abstract

In recent years, dynamic early-exiting neural networks have emerged as an efficient way to balance classification performance against inference cost, typically measured in FLOPs, in image classification tasks. Given the significance of this domain, numerous extensions to early-exiting networks have been proposed over the past five years, each aiming to improve this tradeoff. This paper critically analyzes these advancements and, through extensive experimentation, demonstrates that a simple yet carefully designed architecture, free of unnecessary complexity, can surpass current state-of-the-art models when paired with appropriate algorithmic strategies. We introduce SEEDNet (Simple Early-Exiting Dynamic Image Network), a streamlined architecture that combines a small set of existing mechanisms to improve both efficiency and accuracy. By integrating early-exiting strategies within a dynamic inference framework, SEEDNet achieves a top-1 accuracy of 81.22% on ImageNet while requiring only 2 × 10⁹ FLOPs at inference, demonstrating that carefully selecting and integrating the right existing mechanisms is sufficient to obtain a more efficient architecture. To promote transparency and encourage further advancements in the field, we will release our code to the community.
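
Since the paper's code is not yet released, the following is a minimal sketch of the generic confidence-thresholded early-exit inference loop that networks like SEEDNet build on. The toy three-stage backbone, the per-stage linear exit heads, and the 0.9 max-softmax threshold are all illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class TinyEarlyExitNet(nn.Module):
    """Toy backbone split into stages, each followed by a lightweight exit head."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU()),
        ])
        # One small classifier ("exit head") per stage.
        self.exits = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                          nn.Linear(width, num_classes))
            for width in (16, 32, 64)
        ])

    @torch.no_grad()
    def infer(self, x: torch.Tensor, threshold: float = 0.9):
        """Run stages in order; stop at the first exit whose confidence clears the threshold."""
        for i, (stage, head) in enumerate(zip(self.stages, self.exits)):
            x = stage(x)
            probs = head(x).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            # Exit early if the max softmax probability is high enough,
            # skipping the remaining (costlier) stages; the final exit
            # always answers.
            if conf.item() >= threshold or i == len(self.stages) - 1:
                return pred.item(), i

if __name__ == "__main__":
    net = TinyEarlyExitNet().eval()
    image = torch.randn(1, 3, 32, 32)  # one example; batching requires per-sample routing
    label, exit_index = net.infer(image)
    print(f"predicted class {label} via exit {exit_index}")

The FLOPs saving comes from the early return: easy inputs pay only for the first stage or two, while hard inputs fall through to the full network, which is the accuracy-versus-cost tradeoff the abstract describes.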
