Below is a list of definitions for the selected term, drawn from multiple scholarly research resources.

What is First-Order Method?

Encyclopedia of Artificial Intelligence
A training algorithm using the objective function and its gradient vector.
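For illustration only (this sketch is not part of the chapter), plain gradient descent is a first-order method in exactly this sense: each update uses only the objective value and its gradient vector. The learning rate, step count, and the toy quadratic objective below are assumptions chosen for the example.

import numpy as np

def gradient_descent(objective, gradient, w0, lr=0.1, steps=100):
    # First-order update: move against the gradient; no curvature information is used.
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - lr * gradient(w)
    return w, objective(w)

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w_min, f_min = gradient_descent(lambda w: np.dot(w, w),
                                lambda w: 2 * w,
                                w0=[3.0, -2.0])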
Published in Chapter:
Learning in Feed-Forward Artificial Neural Networks I
Lluís A. Belanche Muñoz (Universitat Politècnica de Catalunya, Spain)
Copyright: © 2009 | Pages: 8
DOI: 10.4018/978-1-59904-849-9.ch148
Abstract
The view of artificial neural networks as adaptive systems has led to the development of ad-hoc generic procedures known as learning rules. The first of these is the Perceptron Rule (Rosenblatt, 1962), useful for single-layer feed-forward networks and linearly separable problems. Its simplicity and beauty, together with the existence of a convergence theorem, made it a basic departure point for neural learning algorithms. This algorithm is a particular case of the Widrow-Hoff or delta rule (Widrow & Hoff, 1960), applicable to continuous networks with no hidden layers and an error function that is quadratic in the parameters.
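As an illustrative sketch only (not taken from the chapter), the Widrow-Hoff or delta rule for a single linear unit with a quadratic error might look as follows; the data X, targets t, learning rate, and epoch count are hypothetical choices for the example.

import numpy as np

def delta_rule(X, t, lr=0.01, epochs=100):
    # Single linear unit, no hidden layers: output is a weighted sum of the inputs.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, t_i in zip(X, t):
            y_i = np.dot(w, x_i)
            # Delta rule update: gradient step on the quadratic error (t - y)^2 / 2.
            w += lr * (t_i - y_i) * x_i
    return w

# Toy usage: learn weights so that X @ w approximates the targets t.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
t = np.array([1.0, 2.0, 3.0])
w = delta_rule(X, t)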