Fish Counting and Measurement: A Modular Framework and Implementation

Fredrik Anders Westling (UNSW, Australia), Changming Sun (CSIRO, Australia), Dadong Wang (CSIRO, Australia) and Fahim Irfan Alam (Griffith University, Australia)
DOI: 10.4018/978-1-4666-9435-4.ch003

An approach is suggested for automating fish identification and measurement using stereo Baited Remote Underwater Video footage. Simple methods for identifying fish are not sufficient for measurement, since the snout and tail points must be found, and the stereo data should be incorporated to find a true measurement. We present a modular framework that ties together various approaches in order to develop a generalized system for automated fish detection and measurement. A method is also suggested for using machine learning to improve identification. Experimental results indicate the suitability of our approach.
Chapter Preview


When studying fish populations, whether for marine research or industrial fishery purposes, it is crucial to obtain accurate information on the size and shape of fish populations (Costa, Loy, Cataudella, Davis, & Scardi, 2006). Traditionally, this has been carried out using methods such as extracting fish from the ocean with nets and human underwater observation (Spampinato, Chen-Burger, Nadarajan, & Fisher, 2008). These methods pose several issues because they are intrusive upon the ecosystem: casting nets kills fish and interferes with unrelated wildlife, while human observation is expensive and can disturb marine life. To this end, various systems using underwater cameras have been suggested and implemented in recent years, including baited remote underwater video systems (BRUVS) (Johansson, Stowar, & Cappo, 2008; Marouchos, Sherlock, Barker, & Williams, 2011). Figure 1 shows the frame and stereo camera setup that has been used (Langlois, Harvey, Fitzpatrick, Meeuwig, Shedrawi, & Watson, 2010). An example pair of stereo BRUVS frames is shown in Figure 2. Current systems require manual analysis by trained experts, which demands considerable time and effort: Spampinato et al. (2008) suggest that it can take as much as 15 minutes for a marine biologist to work through a single minute of footage, classifying and annotating. Automating this process is therefore of critical importance to the success of these systems.

Figure 1.

The frame and stereo camera setup in a BRUVS system (drawing courtesy of Langlois, Harvey, Fitzpatrick, Meeuwig, Shedrawi, & Watson (2010), with permission from Inter-Research Science Center)

Figure 2.

Example pair of corresponding frames from a stereo BRUVS setup

In this chapter, the authors propose a new method for structuring such systems to incorporate improvable ‘modules’ and a supervised learning approach. Stereo BRUVS footage is used to obtain 3D information on fish, and the modules of the system include Identification, Tracking, and Measurement. Current results are promising, and directions for further research are suggested.
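The Measurement module relies on the standard principle behind stereo systems: a point matched in both rectified views can be back-projected to 3D via its horizontal disparity, and the fish length is then the distance between the 3D snout and tail points. The following is a minimal sketch of that idea, not the chapter's implementation; the camera parameters (focal length, baseline, principal point) and the helper names are illustrative assumptions, and a real system would obtain them from stereo calibration.

```python
import math

# Hypothetical calibration values for a rectified stereo rig (assumed, not
# taken from the chapter): focal length and principal point in pixels,
# baseline in metres.
FOCAL_LENGTH_PX = 1200.0
BASELINE_M = 0.7
CX, CY = 960.0, 540.0

def triangulate(left_pt, right_pt):
    """Back-project a matched pixel pair into a 3D point in metres.

    left_pt / right_pt are (x, y) pixels in rectified images, so the rows
    agree and the disparity is purely horizontal.
    """
    disparity = left_pt[0] - right_pt[0]
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad match or calibration")
    z = FOCAL_LENGTH_PX * BASELINE_M / disparity   # depth from disparity
    x = (left_pt[0] - CX) * z / FOCAL_LENGTH_PX    # back-project x
    y = (left_pt[1] - CY) * z / FOCAL_LENGTH_PX    # back-project y
    return (x, y, z)

def fish_length(snout_l, snout_r, tail_l, tail_r):
    """Euclidean distance between the 3D snout and tail points."""
    snout = triangulate(snout_l, snout_r)
    tail = triangulate(tail_l, tail_r)
    return math.dist(snout, tail)
```

Because the length is computed in 3D, the estimate is independent of the fish's distance from the cameras, which a single-view pixel measurement cannot achieve; this is why the framework requires the snout and tail points to be located in both views.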



The identification and measurement of free-swimming fish in their natural environment is a critical and challenging task, and many different approaches have been taken to automating fish identification in stereo video under different circumstances. Conventional still photography and video imagery have been widely utilized for counting and measuring fish underwater (Boland & Lewbel, 1986; Naiberg, Petrell, Savage, & Neufeld, 1993; Petrell, Shi, Ward, Naiberg, & Savage, 1997). For example, stereo-photography has been used in situ to quantify the growth of coral colonies (Done, 1981), salmon (Petrell et al., 1997), and reef fish (Harvey, Fletcher, & Shortis, 2001). Stereo-video technology proved superior to traditional underwater visual surveys, in which scuba divers counted fish and estimated their sizes by eye. The accuracy and precision of measurements made with a stereo-video system were recorded by Harvey and Shortis (1996).
