Paper ID | COM-2.1
Paper Title | ANALYSIS OF NEURAL IMAGE COMPRESSION NETWORKS FOR MACHINE-TO-MACHINE COMMUNICATION
Authors | Kristian Fischer, Christian Forsch, Christian Herglotz, André Kaup, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Germany
Session | COM-2: Learning-based Image and Video Coding
Location | Area H
Session Time | Wednesday, 22 September, 14:30 - 16:00
Presentation Time | Wednesday, 22 September, 14:30 - 16:00
Presentation | Poster
Topic | Image and Video Communications: Lossy coding of images & video
Abstract | Video and image coding for machines (VCM) is an emerging field that aims to develop compression methods yielding optimal bitstreams when the decoded frames are analyzed by a neural network. Several approaches already exist that adapt classic hybrid codecs for this task. However, neural compression networks (NCNs) have made enormous progress in image coding over the last years. It is therefore reasonable to consider such NCNs when the information sink at the decoder side is a neural network as well. To that end, we build up an evaluation framework that analyzes the performance of four state-of-the-art NCNs when a Mask R-CNN segments objects from the decoded image. The compression performance is measured by the weighted average precision on the Cityscapes dataset. Based on this analysis, we find that networks with leaky ReLU as non-linearity and training with SSIM as the distortion criterion result in the highest coding gains for the VCM task. Furthermore, we show that the GAN-based NCN architecture achieves the best coding performance and even outperforms the recently standardized Versatile Video Coding (VVC) for the given scenario.
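An evaluation pipeline of the kind described in the abstract can be approximated with off-the-shelf components. The following is a minimal sketch, not the authors' implementation: it assumes the CompressAI model zoo as a stand-in for the NCN under test and torchvision's Mask R-CNN as the analysis network; the Cityscapes-specific weighted average precision is left as a placeholder.

```python
# Minimal VCM-style evaluation sketch (assumptions: CompressAI NCN, torchvision
# Mask R-CNN as the analysis network; weighted AP on Cityscapes is omitted).
import math
import torch
from PIL import Image
from compressai.zoo import bmshj2018_hyperprior          # example NCN, assumption
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

device = "cuda" if torch.cuda.is_available() else "cpu"

ncn = bmshj2018_hyperprior(quality=4, pretrained=True).eval().to(device)
analysis_net = maskrcnn_resnet50_fpn(pretrained=True).eval().to(device)

def evaluate_image(path):
    """Compress one frame with the NCN, then segment the decoded frame."""
    x = to_tensor(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        out = ncn(x)                          # returns {"x_hat": ..., "likelihoods": ...}
        x_hat = out["x_hat"].clamp(0, 1)
        # Bitrate estimate in bits per pixel from the entropy model's likelihoods.
        num_pixels = x.shape[0] * x.shape[2] * x.shape[3]
        bpp = sum(
            (torch.log(l).sum() / (-math.log(2) * num_pixels)).item()
            for l in out["likelihoods"].values()
        )
        # Run the analysis network on the decoded frame.
        detections = analysis_net([x_hat.squeeze(0)])[0]  # boxes, labels, scores, masks
    return bpp, detections

# The weighted average precision would then be computed from `detections`
# against the Cityscapes ground-truth annotations (placeholder here).
```

Sweeping the NCN quality parameter (and repeating with different architectures and distortion losses such as MSE or SSIM) would yield rate–accuracy points comparable to the study's setup, though the exact models and metric weighting are defined in the paper itself.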