In this paper, the authors propose a hybrid network compression technique that exploits the prior knowledge of network parameters through Gaussian scale mixture (GSM) models. Specifically, the collection of network parameters is characterized by GSM models, and network pruning is formulated as a maximum a posteriori (MAP) estimation problem with a sparsity prior. The key novel insight of this work is that groups of parameters associated with the same channel are similar, which is analogous to the grouping of similar patches in natural images. This observation inspires the authors to leverage powerful structured sparsity priors from image restoration for network compression, i.e., to develop a flexible filter-grouping strategy that not only promotes structured sparsity but can also be seamlessly integrated with existing network pruning frameworks. Extensive experimental results on several popular DCNN models, including VGGNet, ResNet, and DenseNet, show that the proposed GSM-based joint grouping and pruning method convincingly outperforms competing approaches (both pruning and non-pruning based methods). (Published Abstract Provided)
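The grouping-then-MAP-pruning idea can be illustrated with a short sketch. The snippet below is a minimal, hypothetical Python/PyTorch illustration, not the authors' implementation: the function names (`gsm_channel_scores`, `map_prune_mask`), the per-channel scale estimate, and the soft-threshold surrogate for the MAP step are all simplifying assumptions made for clarity.

```python
# Minimal, illustrative sketch (assumed, not the authors' code) of
# channel-wise grouping and GSM-style MAP thresholding for pruning.
import torch
import torch.nn as nn


def gsm_channel_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Group filter weights by output channel and estimate a per-channel
    scale (standing in for the hidden multiplier of a Gaussian scale mixture)."""
    w = conv.weight.detach()                     # shape: (out_ch, in_ch, k, k)
    groups = w.flatten(start_dim=1)              # one group per output channel
    # Empirical root-mean-square of each group serves as a crude scale estimate.
    return groups.pow(2).mean(dim=1).sqrt()


def map_prune_mask(scores: torch.Tensor, lam: float) -> torch.Tensor:
    """MAP-style shrinkage with a sparsity-promoting prior: channels whose
    shrunken scale collapses to zero are marked for pruning."""
    shrunk = torch.clamp(scores - lam, min=0.0)  # soft-threshold surrogate
    return shrunk > 0                            # True = keep this channel


if __name__ == "__main__":
    conv = nn.Conv2d(16, 32, kernel_size=3)
    scores = gsm_channel_scores(conv)
    keep = map_prune_mask(scores, lam=scores.median().item())
    print(f"keeping {int(keep.sum())} of {keep.numel()} channels")
```

In this toy version, the threshold `lam` plays the role of the sparsity prior's strength; in the paper's formulation the grouping and the MAP estimate are derived from the GSM model rather than from this simple per-channel heuristic.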