Search the World's Largest Database of Information Science & Technology Terms & Definitions
A Free Service of IGI Global Publishing House
Below is a list of definitions for the selected term, drawn from multiple scholarly research resources.

What is Mutual Information?

Handbook of Research on Advanced Techniques in Diagnostic Imaging and Biomedical Applications
In probability theory and information theory, the mutual information of two random variables is a quantity that measures the mutual dependence of the two variables.
Published in Chapter:
Image Registration Algorithms for Applications in Oncology
Katia Marina Passera (Politecnico di Milano, Italy) and Luca Tommaso Mainardi (Politecnico di Milano, Italy)
DOI: 10.4018/978-1-60566-314-2.ch009
Abstract
Image registration is the process of determining the correspondence of features between images collected at different times or using different imaging modalities. A wide range of registration algorithms has been proposed in the literature for solving this task. In this chapter the focus is on oncology applications, where registration is the preliminary step for: i) subtraction imaging (to emphasize hyper- or hypo-enhanced structures), ii) fusion imaging (to integrate anatomical and functional information about lesions), and iii) serial imaging comparison (to monitor the progression/regression of a disease). These applications are of great relevance in tumor diagnosis, staging, and treatment planning. The goal of this chapter is to provide an overview of registration algorithms for these different applications in oncology. We discuss the advantages and disadvantages of each algorithm, the results obtained, and possible future developments to meet new requirements.
Full Text Chapter Download: US $37.50 Add to Cart
More Results
Brain-Machine Interface: Human-Computer Interaction
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the “amount of information” (in units such as bits) obtained about one random variable, through the other random variable.
Counting the Hidden Defects in Software Documents
An entropy-based measure of the degree of stochastic dependence between two random vectors. A high mutual-information value indicates that each vector carries much information about the other.
Methods for Reverse Engineering of Gene Regulatory Networks
It is the information that one random variable contains about another. In other words, it is the reduction in uncertainty about one variable after observing the other. It is symmetric with respect to the two variables.
Using Mutual Information to Analyse Serial Dependence: The Effects of COVID-19
A methodology used to analyze serial dependence in time series; in the case of financial time series, it can be used to assess their efficiency.
Semi-Supervised Dimension Reduction Techniques to Discover Term Relationships
A non-linear correlation measure that allows us to evaluate the degree of association between the input variables and the response.