Quantitative Analysis of Hysteroscopy Imaging in Gynecological Cancer
Marios Neofytou (University of Cyprus, Cyprus), Constantinos Pattichis (University of Cyprus, Cyprus), Vasilios Tanos (Aretaeion Hospital in Nicosia, Cyprus), Marios Pattichis (University of New Mexico, USA) and Eftyvoulos Kyriacou (Frederick University, Cyprus)
Copyright: © 2009
The objective of this chapter is to propose a quantitative hysteroscopy imaging analysis system for gynaecological cancer and to review the current state of endoscopy imaging. Recent work involves endoscopy, gastroendoscopy, and colonoscopy imaging, with encouraging results. All of these methods apply image processing with texture and classification algorithms to support the physician's diagnosis; however, none of the studies addressed the pre-processing module, and all of them attempt to identify tumours in the organs without investigating the tissue texture. The proposed system supports a standardized image acquisition protocol that eliminates significant statistical feature differences due to viewing variations. In particular, the authors provide a standardized protocol that yields texture features that are statistically invariant to sensor differences (color correction) and to the angle and distance to the tissue. A Computer Aided Diagnosis (CAD) module that supports the classification of normal vs abnormal tissue for the early diagnosis of gynaecological cancer of the endometrium is also discussed. The authors investigate texture feature variability for the aforementioned targets encountered in clinical endoscopy before and after color correction. For texture feature analysis, three different feature sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). Two classification algorithms, the Probabilistic Neural Network (PNN) and the Support Vector Machine (SVM), were applied for the early diagnosis of gynaecological cancer of the endometrium based on the above texture features. Results indicate that there is no significant difference in texture features between the panoramic and close-up views, or between different camera angles. Gamma correction produced an acquired image that was a significantly better approximation of the original tissue color.
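To make the Gray Level Difference Statistics feature set concrete, the sketch below computes GLDS measures for a single displacement vector. The displacement (dx, dy), the number of gray levels, and the particular four-feature subset (mean, contrast, energy, entropy) are illustrative assumptions, not necessarily the exact configuration used in the chapter.

```python
import numpy as np

def glds_features(img, dx=1, dy=0, levels=256):
    """Gray Level Difference Statistics for one displacement (dx, dy).

    Builds the histogram of absolute gray-level differences between each
    pixel and its (dx, dy) neighbour, then derives four common GLDS
    measures from it. Assumes non-negative dx, dy and 8-bit gray levels.
    """
    img = np.asarray(img, dtype=np.int64)
    h, w = img.shape
    a = img[: h - dy or None, : w - dx or None]   # reference pixels
    b = img[dy:, dx:]                              # displaced neighbours
    diff = np.abs(a - b).ravel()
    # Normalized histogram of gray-level differences
    p = np.bincount(diff, minlength=levels) / diff.size
    i = np.arange(p.size)
    nz = p > 0
    return {
        "mean": float((i * p).sum()),            # average difference
        "contrast": float((i ** 2 * p).sum()),   # second moment of differences
        "energy": float((p ** 2).sum()),         # angular second moment
        "entropy": float(-(p[nz] * np.log2(p[nz])).sum()),
    }
```

A uniform region gives zero mean, zero contrast, maximal energy, and zero entropy, while rough tissue texture spreads the difference histogram and raises the entropy, which is what makes these statistics useful as discriminating features.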
Based on these texture features, the classification results show that a correct classification score of %CC = 79 was achieved using the SVM algorithm in the YCrCb color system with the combination of the SF and GLDS texture feature sets. This study provides a standardized quantitative image analysis protocol for endoscopy imaging, and the proposed CAD system gave very satisfactory and promising results. In conclusion, the proposed system can assist the physician in the diagnosis of difficult cases of gynaecological cancer before the histopathological examination.
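One of the two classifiers mentioned above, the Probabilistic Neural Network, is essentially a Parzen-window density estimator per class. The minimal sketch below implements that idea in NumPy for a normal-vs-abnormal decision on texture feature vectors; the Gaussian spread `sigma` and the feature values are illustrative assumptions, not parameters from the chapter.

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Minimal Probabilistic Neural Network (Parzen-window classifier).

    For each test vector, sums Gaussian kernels centred on the training
    samples of each class and assigns the class with the largest average
    kernel activation.
    """
    X_train = np.asarray(X_train, dtype=float)
    X_test = np.asarray(X_test, dtype=float)
    y_train = np.asarray(y_train)
    classes = np.unique(y_train)
    # Squared Euclidean distances, shape (n_test, n_train)
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    k = np.exp(-d2 / (2.0 * sigma ** 2))
    # Average activation per class, shape (n_test, n_classes)
    scores = np.stack([k[:, y_train == c].mean(axis=1) for c in classes], axis=1)
    return classes[scores.argmax(axis=1)]
```

With hypothetical two-dimensional texture features, `pnn_classify([[0, 0], [0.2, 0], [5, 5], [5, 5.2]], ["normal", "normal", "abnormal", "abnormal"], [[0.1, 0.1]])` assigns the test point to the nearer "normal" cluster. The PNN needs no training phase beyond storing the samples, which is one reason it is popular in CAD prototypes.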
In laparoscopic/hysteroscopic imaging, the physician guides the telescope inside the uterine or abdominal cavity, investigating the internal anatomy in search of suspicious, cancerous lesions (Bankman et al, 2000). During the exam, the experience of the physician plays a significant role in identifying suspicious regions of interest (ROIs); in some cases, important ROIs might be ignored and crucial information neglected (Sierra et al, 2003). The analysis of endoscopic imaging is usually carried out visually and qualitatively (Fayez et al, 1991), based on the subjective expertise of the endoscopist. The procedure therefore suffers from interpretational variability and a lack of comparative analysis, and it is time consuming.
Key Terms in this Chapter
Computer Aided Diagnosis: Diagnosis supported by computer methods, usually by using automatic classifiers in order to obtain an estimate of the exact diagnosis.
Hysteroscopy Examination: The physician guides a telescope connected to a camera into the uterine cavity in order to investigate the endometrium.
Automatic Classifiers: Mathematical functions that can classify events based on several features and previously known cases.