Over the last two decades, image retrieval has attracted growing interest across several domains. As a result, considerable work has been done to integrate it into standard data processing environments (Rui, Huang, & Chang, 1999; Smeulders, Gevers, & Kersten, 1998; Yoshitaka & Ichikawa, 1999). Different retrieval methods have been proposed in the literature (Chang & Jungert, 1997; Guttman, 1984; Lin, Jagadish, & Faloutsos, 1994). These methods fall into two major approaches: metadata-based and content-based. The metadata-based approach uses alphanumeric attributes and traditional techniques to describe the context and/or content of an image, such as its title, author name, and date. The content-based approach uses image processing algorithms to extract low-level features of images such as colors, textures, and shapes. Retrieval based on these features relies on similarity measures and is therefore a non-exact matching.

In this article, we address the spatial and evolutionary issues of images. We propose a novel method that considers different types of relations, providing a highly expressive and powerful mechanism for indexing images.

The rest of this article is organized as follows: the next section details the related work. In the following section, we define our method for computing the different relations and show how image indexing can be done. The subsequent section demonstrates how our method can adequately index medical images. Finally, we conclude and outline future work.
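To illustrate the non-exact matching typical of content-based retrieval, the following sketch compares two images by their quantized color histograms using histogram intersection. This is a minimal, hypothetical example of similarity-based matching over a low-level feature; the toy pixel data, bin count, and function names are illustrative assumptions, not part of the method proposed in this article.

```python
# Hypothetical illustration of content-based similarity matching.
# Images are represented as lists of (r, g, b) pixels; this is NOT
# the indexing method proposed in the article, only a generic sketch.

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into a normalized bins**3 color histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins  # width of each quantization interval
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1.0
    total = len(pixels)
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; equals 1.0 only for identical histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Two toy "images": visually similar but not identical pixel sets,
# so the similarity score is high yet strictly below 1 (non-exact match).
img_a = [(200, 30, 30), (210, 40, 35), (20, 20, 220)]
img_b = [(205, 35, 30), (190, 25, 45), (25, 15, 210)]

sim = histogram_intersection(color_histogram(img_a), color_histogram(img_b))
```

Because retrieval ranks images by such a score rather than testing equality, two images that differ slightly in content still match with a graded degree of similarity.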