A Perturbation Size-Independent Analysis of Robustness in Neural Networks by Randomized Algorithms


C. Alippi
Copyright: © 2003 |Pages: 19
DOI: 10.4018/978-1-59140-037-0.ch002

Abstract

This chapter presents a general methodology for evaluating the loss in performance of a generic neural network once its weights are affected by perturbations. Since the weights represent the "knowledge space" of the neural model, the robustness analysis can be used to study the weights/performance relationship. The perturbation analysis, which is closely related to sensitivity issues, relaxes all assumptions made in the related literature, such as the small-perturbation hypothesis, specific requirements on the distribution of perturbations and neural variables, the number of hidden units, and a given neural structure. The methodology, based on Randomized Algorithms, allows the computationally intractable problem of robustness/sensitivity analysis to be reformulated in a probabilistic framework characterised by a polynomial-time solution in the accuracy and confidence degrees.
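To illustrate the flavour of the randomized approach described in the abstract, the sketch below shows a generic Monte Carlo robustness estimate whose sample size follows a Chernoff/Hoeffding-style bound in the accuracy (eps) and confidence (delta) parameters. It is not the chapter's exact procedure; the names chernoff_samples, robustness_estimate, loss_fn, perturb_sampler and gamma, as well as the toy linear model, are illustrative assumptions.

```python
# Minimal sketch: estimate the probability that a random weight perturbation
# degrades the loss by more than a tolerance gamma, using a sample size that
# is polynomial in the accuracy (eps) and confidence (delta) degrees.
import numpy as np


def chernoff_samples(eps: float, delta: float) -> int:
    """Samples needed so the empirical probability is within eps of the
    true probability with confidence at least 1 - delta (Hoeffding bound)."""
    return int(np.ceil(np.log(2.0 / delta) / (2.0 * eps ** 2)))


def robustness_estimate(loss_fn, weights, perturb_sampler, gamma,
                        eps=0.02, delta=0.01, rng=None):
    """Estimate P[loss(weights + dw) - loss(weights) > gamma] by sampling dw."""
    rng = np.random.default_rng(rng)
    n = chernoff_samples(eps, delta)
    base = loss_fn(weights)
    exceed = 0
    for _ in range(n):
        dw = perturb_sampler(rng, weights)  # hypothetical perturbation model
        if loss_fn(weights + dw) - base > gamma:
            exceed += 1
    return exceed / n, n


if __name__ == "__main__":
    # Toy usage: a linear "network" with quadratic loss, perturbations uniform in [-0.1, 0.1].
    rng = np.random.default_rng(0)
    w_true = np.array([0.5, -1.2, 0.8])
    X = rng.normal(size=(200, 3))
    y = X @ w_true
    loss = lambda w: float(np.mean((X @ w - y) ** 2))
    sampler = lambda r, w: r.uniform(-0.1, 0.1, size=w.shape)
    p_hat, n = robustness_estimate(loss, w_true, sampler, gamma=0.05)
    print(f"estimated exceedance probability {p_hat:.3f} from {n} samples")
```

Note that the number of samples depends only on eps and delta, not on the magnitude of the perturbations, which mirrors the perturbation size-independent character emphasised in the title.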
