Analysis of Quantization Effects on Higher Order Function and Multilayer Feedforward Neural Networks

Minghu Jiang (Tsinghua University, China), Georges Gielen (Katholieke Universiteit Leuven, Belgium) and Lin Wang (Beijing University of Posts and Telecom, China)
DOI: 10.4018/978-1-61520-711-4.ch008

Abstract

In this chapter we investigate the combined effects of quantization and clipping on higher order function neural networks (HOFNN) and multilayer feedforward neural networks (MLFNN). Statistical models are used to analyze the effects of quantization in a digital implementation. We analyze the performance degradation as a function of the number of fixed-point and floating-point quantization bits under the assumption of different probability distributions for the quantized variables, compare the training performance with and without weight clipping, and derive in detail the effect of the quantization error on forward and backward propagation. Regardless of the distribution of the initial weights, the weight distribution approximates a normal distribution during training with floating-point or high-precision fixed-point quantization; only when the number of quantization bits is very low may the weight distribution cluster at ±1 during training with fixed-point quantization. Based on statistical models, and for both on-chip and off-chip training, we establish and analyze the relationships for a true nonlinear neuron between input and output bit resolution, training and quantization methods, the number of network layers, network order, and performance degradation. Our experimental simulation results verify the presented theoretical analysis.
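As a minimal sketch of the fixed-point quantization and weight clipping discussed above (the uniform rounding quantizer, bit counts, and initial weight distribution here are illustrative assumptions, not the chapter's implementation), the following Python snippet clips weights to [-1, 1] and rounds them to a signed fixed-point grid, which is where saturation at ±1 arises at very low bit counts:

```python
import numpy as np

def quantize_fixed_point(w, n_bits, clip=True):
    """Round weights to a fixed-point grid with n_bits fractional bits.

    With clipping, weights are first limited to [-1, 1], so at very low
    bit counts many weights saturate at the +/-1 boundaries.
    """
    step = 2.0 ** (-n_bits)              # quantization step size
    if clip:
        w = np.clip(w, -1.0, 1.0)        # weight clipping to [-1, 1]
    return np.round(w / step) * step     # uniform (rounding) quantizer

# Example: normally distributed weights quantized at 3 and 12 bits
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.6, size=10000)
for bits in (3, 12):
    wq = quantize_fixed_point(w, bits)
    err = wq - np.clip(w, -1.0, 1.0)
    print(bits, "bits: RMS quantization error =", np.sqrt(np.mean(err**2)))
```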
Chapter Preview

2. Quantization Analysis of HOFNN and MLFNN

To increase the reliability of the analysis results and to predict the quantization error and the network properties more accurately, the true function of a nonlinear neuron (i.e., a realistic hyperbolic tangent activation function) is used with two different probability distributions.
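The sketch below illustrates how a quantized higher-order neuron with a true hyperbolic tangent activation can be simulated; the second-order form, parameter names, and fixed-point scheme are assumptions for illustration only, not the chapter's derivation:

```python
import numpy as np

def second_order_neuron(x, w_lin, w_quad, bias, n_bits=None):
    """Forward pass of a second-order (higher-order) neuron with tanh activation.

    If n_bits is given, weights are quantized to a fixed-point grid in [-1, 1]
    before use, so the output reflects quantized forward propagation.
    """
    def q(v):
        if n_bits is None:
            return v                                  # floating-point reference
        step = 2.0 ** (-n_bits)
        return np.round(np.clip(v, -1.0, 1.0) / step) * step

    quad_terms = np.outer(x, x)[np.triu_indices(len(x))]   # products x_i * x_j, i <= j
    net = q(bias) + q(w_lin) @ x + q(w_quad) @ quad_terms  # weighted sum of all terms
    return np.tanh(net)                                     # true nonlinear activation

# Example: compare floating-point and 4-bit fixed-point forward outputs
x = np.array([0.3, -0.7, 0.5])
w_lin = np.array([0.4, -0.2, 0.8])
w_quad = np.linspace(-0.5, 0.5, 6)      # one weight per x_i*x_j term, i <= j
print(second_order_neuron(x, w_lin, w_quad, bias=0.1))
print(second_order_neuron(x, w_lin, w_quad, bias=0.1, n_bits=4))
```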
