Search the World's Largest Database of Information Science & Technology Terms & Definitions
InfoSciPedia
A Free Service of IGI Global Publishing House
Below is a list of definitions for the term you selected, drawn from multiple scholarly research resources.

What is Overfitting?

Handbook of Research on Cyber Crime and Information Privacy
A condition in which a model learns the detail and noise in the training data to the extent that it negatively impacts the model's performance on new data.
Published in Chapter:
Detection and Prediction of Spam Emails Using Machine Learning Models
Salma P. Z (NSS College of Engineering, Kerala, India) and Maya Mohan (NSS College of Engineering, Kerala, India)
Copyright: © 2021 | Pages: 18
DOI: 10.4018/978-1-7998-5728-0.ch011
One of today's most important means of communication is email. Its extensive use has led to many problems, spam email being the most serious among them; it is one of the major issues in today's internet world. Spam emails consist mostly of advertisements and offensive content, are often sent without the recipient's consent, and are generally annoying, time consuming, and wasteful of the communication medium's resources. They cause inconvenience and financial loss to recipients. Hence, there is a constant need to filter spam emails and separate them from legitimate ones. Many content-based machine learning techniques have proven effective in detecting and filtering spam. Owing to the large increase in email spamming, emails are analysed and classified as spam or not spam. In this chapter, three machine learning models, Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Bidirectional LSTM (BLSTM), are used to classify emails as spam or benign.
Full Text Chapter Download: US $37.50 Add to Cart
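The definition above can be made concrete with a minimal NumPy sketch (an illustrative example, not taken from the chapter): a degree-9 polynomial fitted to ten noisy samples of a linear trend drives the training error to nearly zero by interpolating the noise, while its error on fresh samples stays large.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying trend: y = x + noise.
x_train = np.linspace(0, 1, 10)
y_train = x_train + rng.normal(0, 0.2, size=x_train.shape)
x_test = np.linspace(0.05, 0.95, 10)
y_test = x_test + rng.normal(0, 0.2, size=x_test.shape)

def fit_and_errors(degree):
    """Fit a polynomial of the given degree; return (train_mse, test_mse)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

simple = fit_and_errors(1)   # matches the true linear trend
overfit = fit_and_errors(9)  # 10 coefficients for 10 points: interpolates the noise

print("degree 1: train=%.4f test=%.4f" % simple)
print("degree 9: train=%.4f test=%.4f" % overfit)
```

The degree-9 fit achieves a far lower training error than the linear fit, yet its test error is much larger than its training error, which is exactly the train/new-data gap the definition describes.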
More Results
Understanding Convolutional Neural Network With TensorFlow: CNN
An unfavorable machine learning outcome in which a model makes correct predictions for training examples but not for new data. When data analysts make predictions using machine learning algorithms, they first build the model on a collection of available data; the algorithm then attempts to predict results for data it has not seen. An overfit model may produce erroneous forecasts and fails to perform effectively on new data sources.
Obtaining Deep Learning Models for Automatic Classification of Leukocytes
An analysis that corresponds too closely to a specific dataset and therefore tends to fail to predict future observations or to fit additional data.
Applying Machine Learning to Online Data?: Beware! Computational Social Science Requires Care
Corresponds to the event in which a machine learning model memorizes the training data instead of learning the patterns within it that support accurate generalization. When a model overfits the data, it can show high within-dataset performance while failing to generalize to other data.
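The memorization-versus-generalization distinction in the definition above can be demonstrated with a small NumPy sketch (an illustrative example, not from the chapter): a 1-nearest-neighbour classifier trained on pure noise, where the features carry no information about the labels. The model memorizes the training set perfectly, yet its test accuracy stays near chance because there is no generalizable pattern to learn.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pure-noise task: features carry no information about the labels,
# so nothing generalizable can be learned.
X_train = rng.normal(size=(100, 5))
y_train = rng.integers(0, 2, size=100)
X_test = rng.normal(size=(100, 5))
y_test = rng.integers(0, 2, size=100)

def predict_1nn(X_ref, y_ref, X):
    """Label each row of X with the label of its nearest row in X_ref."""
    dists = np.linalg.norm(X[:, None, :] - X_ref[None, :, :], axis=2)
    return y_ref[np.argmin(dists, axis=1)]

# Each training point is its own nearest neighbour: memorization gives
# perfect within-dataset accuracy, but test accuracy hovers around 0.5.
train_acc = np.mean(predict_1nn(X_train, y_train, X_train) == y_train)
test_acc = np.mean(predict_1nn(X_train, y_train, X_test) == y_test)
print(f"train accuracy: {train_acc:.2f}")
print(f"test accuracy:  {test_acc:.2f}")
```

High within-dataset performance with chance-level performance elsewhere is the signature of memorization rather than learning.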
Learning From Imbalanced Data
A modeling error that occurs when a function is fit too closely to a limited set of data points.
Plant Disease Classification Using Deep Learning Techniques
Overfitting refers to a situation where a machine learning model is excessively complex and performs well on training data, but poorly on new, unseen data.
Exploiting the Strategic Potential of Data Mining
A condition that occurs when there are too many parameters in a model. In such cases, the model learns the idiosyncrasies of the training data set. This can happen in models such as regression, time series analysis, and neural networks.
Medical Image Classification
The classifier's accuracy appears extraordinary when the test data and the training data overlap, but when the model is applied to new data it fails to reach acceptable accuracy. This condition is called overfitting.
Incremental Neural Network Training for Medical Diagnosis
Refers to fitting a model (e.g., a neural network) with too many parameters relative to the available samples.
Genetic Programming
Genetic programming evolves individuals over a training set that is, ideally, representative of the function to be approximated. However, the driving force of evolution is usually so strong that individuals develop code specific to matching the training set as well as possible. An "overfit" solution will have very good fitness on the training set, but will perform poorly on real data.
Efficient High Dimensional Data Classification
When a learning algorithm is applied to a small training data set, the resulting model memorizes the data and cannot predict well on new, unseen data.
Application to Bankruptcy Prediction in Banks
Fitting a model to best match the available data while losing the capability to describe the general behaviour.
Decision Trees
Fitting a statistical model that has too many parameters.
Employee Classification in Reward Allocation Using ML Algorithms
In machine learning, a model that overfits corresponds too closely to the training data set and may therefore fail to fit validation and testing data. An overfitted model is a statistical model that contains more parameters than can be justified by the sample data and therefore fails to predict out-of-sample data reliably.
Counting the Hidden Defects in Software Documents
Learning a complicated function that matches the training data closely but fails to capture the underlying process that generates the data. As a result of overfitting, the model performs poorly on new input. Overfitting occurs when the training patterns are sparse in input space and/or the trained network is too complex.