Machine Learning in Healthcare: Introduction and Real-World Application Considerations

Stavros Pitoglou (Computer Solutions SA, Greece & National Technical University of Athens, Greece)
Copyright: © 2020 |Pages: 18
DOI: 10.4018/978-1-7998-2390-2.ch004


Machine learning, closely related to artificial intelligence and standing at the intersection of computer science and mathematical statistics, comes in handy when the truth hides in a place the human brain cannot reach. For any prediction or assessment problem, the harder it is for the human mind to grasp the inherent causalities and patterns and to apply conventional methods toward an acceptable solution, the more fertile the field of application for machine learning. This chapter's purpose is to give a general, non-technical definition of machine learning, review its latest implementations in the healthcare domain, and add to the ongoing discussion on the subject. It suggests the active involvement of entities beyond the already active academic community in the quest for solutions that “exploit” existing datasets and can be applied in daily practice, embedded in the software processes already in use.
Chapter Preview


Machine Learning and Its Origins

One of the most quoted definitions of Machine Learning is:

The subfield of computer science that “gives computers the ability to learn without being explicitly programmed.” (Samuel, 1959)

That is a compact yet complete description of the major paradigm shift machine learning brings to solving problems, answering questions, and making decisions with information technologies. It implies that we can delegate to a computer the task of making sense of a dataset “on its own,” without humans defining the exact course of calculations and actions, and thus without us having understood the true nature of the problem at hand or the path to its solution. The machine uses the data as “learning material” in order to assess and classify new or unseen data in the same context, or to predict future values, eventually developing the ability to make decisions and/or define courses of action “on its own.” That human-like ability is captured in a definition given a few decades later:

A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E. (Mitchell, 1997)

At the risk of oversimplification, the basic concept can be described as follows. One takes a dataset believed (or at least hoped) to contain the necessary information: a truth that cannot easily be discovered but is deemed essential for completing a specific task. At the same time, one acknowledges that as the volume of data, the number of parameters contributing to the outcome, and the complexity of their correlations increase, it becomes increasingly difficult (and at some point impossible) for the human mind to process the data, form an intuitive hypothesis about the hidden patterns, and model the acting causalities well enough to provide accurate assessment and/or prediction. The computer is then left to build its own universe out of this data, a perception of reality in the form of multidimensional “hyperspheres,” with every data point turned into a vector. By applying complex mathematical principles, it calculates its way to an algorithm that “understands” the acting causalities and “captures” the underlying patterns, becoming capable of applying this “knowledge” and “experience” toward solving (or helping to solve) related problems.
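To make the concept concrete, a minimal sketch in plain Python (an illustration added here, not taken from the chapter): a hidden linear relationship plays the role of the “truth” buried in the data, and an ordinary least-squares fit plays the role of the learner. In Mitchell's terms, the training samples are the experience E, predicting unseen values is the task T, and the mean absolute error on held-out data is the performance measure P, which tends to improve as E grows.

```python
import random

def fit_line(xs, ys):
    """Closed-form ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return a, my - a * mx

def mean_abs_error(model, pairs):
    a, b = model
    return sum(abs(a * x + b - y) for x, y in pairs) / len(pairs)

random.seed(0)
truth = lambda x: 2.0 * x + 1.0                      # the hidden relationship

def sample(n):                                        # noisy observations of the truth
    return [(x, truth(x) + random.gauss(0, 1.0))
            for x in (random.uniform(0, 10) for _ in range(n))]

held_out = sample(200)                                # unseen data: task T, measured by P
for n in (5, 50, 500):                                # growing experience E
    train = sample(n)
    model = fit_line([x for x, _ in train], [y for _, y in train])
    print(f"n={n:3d}  held-out mean abs error: {mean_abs_error(model, held_out):.3f}")
```

The learner is never told the form of `truth`; it recovers the slope and intercept from the data alone, and its predictions on unseen points generally sharpen as the training set grows.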

Key Terms in this Chapter

Natural Language Processing (NLP): Natural language processing is an interdisciplinary field of computer science, artificial intelligence, and computational linguistics that deals with the interactions between computers and human (natural) languages. As a consequence, NLP is closely linked to human-computer interaction. Challenges in NLP include natural language understanding, that is, making computers capable of extracting meaning from human linguistic data, as well as natural language generation.

Artificial Neural Network (ANN): An artificial neural network (ANN) is a nonlinear statistical data-processing model inspired by the structure and function of biological neurons, used for pattern recognition and for modeling complex input-output relationships. An ANN “learns” (adjusts its computational parameters) as information “flows” through its node layers, based on the inputs and outputs it observes.
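As a hedged illustration (plain Python, not from the chapter), the smallest possible “network” is a single sigmoid neuron. Its three parameters, two weights and a bias, adjust as the four rows of an AND truth table flow through it repeatedly; the update rule used here is the standard gradient-descent step for the logistic (cross-entropy) loss.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Four input/output examples: the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w1 = w2 = b = 0.0          # the neuron's adjustable parameters
lr = 0.5                   # learning rate
for _ in range(1000):      # repeated passes: information "flows" through the node
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + b)
        grad = out - target    # gradient of the logistic loss w.r.t. the pre-activation
        w1 -= lr * grad * x1
        w2 -= lr * grad * x2
        b  -= lr * grad

predict = lambda x1, x2: round(sigmoid(w1 * x1 + w2 * x2 + b))
print([predict(x1, x2) for (x1, x2), _ in data])   # reproduces the AND output column
```

Real ANNs stack many such neurons into layers and propagate the gradient backward through them, but the principle is the same: parameters shift, example by example, until the network's outputs match the data.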

Big Data: Big data is an “umbrella term” that describes data processing approaches in situations where enormous and/or unstructured datasets cannot be efficiently manipulated via traditional handling techniques (e.g., relational databases).

Moore’s Law: In 1965, Intel co-founder Gordon E. Moore observed that the number of transistors placed on an integrated circuit (IC) doubles approximately every two years. As the observation held true repeatedly over the following decades, it became known as Moore's law.
