Below is the definition of the term you selected, drawn from scholarly research resources.

What is Root Mean Absolute Error

Handbook of Research on Technological Advances of Library and Information Science in Industry 5.0
One of the methods most frequently used to assess the accuracy of forecasts is the root mean square error (RMSE), also known as the root mean square deviation. It represents the Euclidean distance between the measured actual values and the forecasts. To compute it, the residual (the difference between prediction and truth) is calculated for each data point; the residuals are then squared, averaged, and the square root of that average is taken. Because RMSE requires an actual measurement at every predicted data point, it is frequently used in supervised learning applications.
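In standard notation, for actual values y_i, predictions ŷ_i, and n data points, the definition above corresponds to

RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2}

A minimal sketch of this calculation, assuming NumPy is available and using illustrative array names y_true and y_pred:

import numpy as np

def rmse(y_true, y_pred):
    # Residuals: difference between predictions and actual values
    residuals = np.asarray(y_pred) - np.asarray(y_true)
    # Square the residuals, take their mean, then take the square root
    return np.sqrt(np.mean(residuals ** 2))

# Example usage with illustrative values
y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])
print(rmse(y_true, y_pred))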
Published in Chapter:
Prognostication of Crime Using Bagging Regression Model: A Case Study of London
Ashansa Kithmini Wijeratne (Sabaragamuwa University of Sri Lanka, Sri Lanka), Nirubikaa Ravikumar (Sabaragamuwa University of Sri Lanka, Sri Lanka), Pulasthi Mithila Bandara (Sabaragamuwa University of Sri Lanka, Sri Lanka), and Banujan Kuhaneswaran (Sabaragamuwa University of Sri Lanka, Sri Lanka)
DOI: 10.4018/978-1-6684-4755-0.ch023
Abstract
Crime is a social and economic problem that affects a country's quality of day-to-day life and economic growth. However, analyzing and forecasting crime is not a straightforward job, since a law enforcement investigator cannot easily unravel the underlying nuances of crime data manually. To make this process easier and more automated, the authors present a machine-learning model for crime analysis and prediction. The authors used a London crime dataset and enhanced it by incorporating population density, the percentage of the economically inactive working-age population, and average monthly temperature. A pre-processing step prepares the raw data and makes it suitable for the machine-learning model. Bagging and boosting ensemble techniques were used to find a better machine-learning model, and GridSearchCV was used to tune hyperparameters to identify the best-performing model. Parameters were tuned in an iterative process. Eventually, the researchers compared all the algorithms and selected the Random Forest bagging regression model as the best-performing algorithm.
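As a rough illustration of the workflow described in the abstract (not the authors' exact configuration), a Random Forest regressor, which is itself a bagging ensemble of decision trees, can be tuned with GridSearchCV in scikit-learn using RMSE as the selection criterion. The synthetic data and the parameter grid below are purely hypothetical placeholders for the enhanced London crime dataset:

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

# Hypothetical feature matrix X (standing in for features such as population
# density, share of economically inactive working-age residents, and average
# monthly temperature) and target y (crime counts).
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = rng.poisson(lam=10, size=200).astype(float)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Illustrative hyperparameter grid; in practice tuning is repeated iteratively.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 10, 20],
}

search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid,
    scoring="neg_root_mean_squared_error",
    cv=5,
)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
# The scorer returns negative RMSE, so negate it to report RMSE.
print("Test RMSE:", -search.score(X_test, y_test))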