Navigation Path Detection for Cotton Field Operator Robot Based on Horizontal Spline Segmentation

Dongchen Li, Shengyong Xu, Yuezhi Zheng, Changgui Qi, Pengjiao Yao
DOI: 10.4018/IJITWE.2017070103


Visual navigation is one of the fundamental techniques of an intelligent cotton-picking robot. The composition of a cotton field is complex, and occlusion and variable illumination make it hard to accurately identify furrows and extract the navigation line. In this paper, a new field navigation path extraction method based on horizontal spline segmentation is presented. Firstly, the color image is pre-processed and thresholded with the Otsu algorithm to segment a binary image of the furrow. The components of a cotton field image are divided into four categories: furrow (including soil, wilted leaves, etc.), cotton fiber, other organs of the cotton plant, and the area outside the field or obstructions. Exploiting the significant differences in saturation and value in the HSV model, the authors threshold in two steps: they first segment the cotton wool in the S channel, and then segment the furrow in the V channel within the area outside the cotton wool. In addition, morphological processing is applied to filter out small noise regions. Secondly, horizontal splines are used to segment the binary image. The authors detect the connected domains within each horizontal spline and merge the isolated small areas caused by cotton wool or light spots into the nearby large connected domains, yielding the connected domain of the furrow. Thirdly, taking the center of the bottom of the image as the starting point, they successively select candidate points from the midpoints of the connected domains, following the principle that the distance between adjacent navigation-line candidates should be small. Finally, the authors count the number of connected domains and track the change in the parameters of the connected domains' boundary lines to determine whether the robot has reached the edge of the field or encountered an obstacle. If there is no anomaly, the navigation path is fitted to the navigation points using the least squares method.
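The two-step threshold segmentation described above can be sketched as follows. This is a minimal illustration with a pure-NumPy Otsu implementation applied to synthetic S and V channels; the function names, channel polarities (cotton wool bright in S, furrow dark in V), and test data are our assumptions, not the authors' code.

```python
import numpy as np

def otsu_threshold(channel):
    """Return the Otsu threshold of an 8-bit single-channel image:
    the grey level that maximizes the between-class variance."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    w0 = np.cumsum(hist)                       # pixel count at or below each level
    w1 = hist.sum() - w0                       # pixel count above each level
    mu0 = np.cumsum(hist * np.arange(256))     # cumulative intensity sum
    mean0 = np.divide(mu0, w0, out=np.zeros(256), where=w0 > 0)
    mean1 = np.divide(mu0[-1] - mu0, w1, out=np.zeros(256), where=w1 > 0)
    between = w0 * w1 * (mean0 - mean1) ** 2   # between-class variance
    return int(np.argmax(between))

# Synthetic HSV channels: cotton wool in the top rows (high saturation),
# furrow in the bottom rows (low value). Values are illustrative only.
rng = np.random.default_rng(0)
s = rng.normal(40, 5, (60, 60)); s[:20] = rng.normal(200, 5, (20, 60))
v = rng.normal(180, 5, (60, 60)); v[40:] = rng.normal(60, 5, (20, 60))
s = np.clip(s, 0, 255).astype(np.uint8)
v = np.clip(v, 0, 255).astype(np.uint8)

cotton = s > otsu_threshold(s)               # step 1: cotton wool in the S channel
furrow = (v < otsu_threshold(v)) & ~cotton   # step 2: furrow in V, outside the wool
```

The second threshold is computed on the V channel but applied only outside the cotton-wool mask, mirroring the paper's restriction of step two to the remaining area.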
Experiments show that this method is accurate and effective, and suitable for visual navigation in the complex environment of a cotton field at different growth stages.
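The horizontal-spline segmentation and line-fitting steps can be sketched like this, assuming a binary furrow mask is already available (synthesized here). The merging of isolated small areas is simplified to dropping short runs, and all names are ours, not the authors':

```python
import numpy as np

def furrow_candidates(mask, n_strips=8, min_run=5):
    """Split a binary furrow mask (1 = furrow) into horizontal strips,
    find connected runs of furrow columns in each strip, and return one
    candidate point (strip-centre row, run-midpoint column) per strip,
    preferring the run nearest the previous candidate."""
    h, w = mask.shape
    prev_x = w // 2                    # start from the bottom centre of the image
    bounds = np.linspace(h, 0, n_strips + 1).astype(int)   # bottom -> top
    points = []
    for i in range(n_strips):
        top, bot = bounds[i + 1], bounds[i]
        cols = mask[top:bot].any(axis=0).astype(int)  # columns containing furrow
        d = np.diff(np.concatenate(([0], cols, [0])))
        starts, ends = np.where(d == 1)[0], np.where(d == -1)[0]
        runs = [(s, e) for s, e in zip(starts, ends) if e - s >= min_run]
        if not runs:                   # drop small noise runs, keep real furrow
            continue
        mids = [(s + e) // 2 for s, e in runs]
        x = min(mids, key=lambda m: abs(m - prev_x))  # nearest to previous point
        points.append(((top + bot) // 2, x))
        prev_x = x
    return points

def fit_navigation_line(points):
    """Least-squares fit x = a*y + b through the candidate points."""
    ys, xs = zip(*points)
    return np.polyfit(ys, xs, 1)

# Synthetic example: a furrow band whose centre drifts across the image.
h, w = 80, 100
mask = np.zeros((h, w), dtype=np.uint8)
for y in range(h):
    c = int(30 + 30 * y / (h - 1))     # centre column drifts linearly with row
    mask[y, c - 8:c + 8] = 1
pts = furrow_candidates(mask)
a, b = fit_navigation_line(pts)
```

Real field masks would first go through the HSV thresholding and morphological filtering above; the anomaly checks (counting connected domains, watching boundary-line parameters) are omitted from this sketch.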

1. Introduction

Cotton plays an important role in the world economy. However, the cotton harvest in China's Yangtze River valley still relies mainly on manual labor, and other mechanized or automatic equipment for cotton is also rare. Labor shortages lead to a surge in the cost of cotton picking and limit the planting scale. A cotton-picking robot is therefore important for improving working efficiency, and it has good development prospects (Jingbo, Xiaohui, Fan, & Wang, 2014). Automatic navigation is one of the key technologies of an intelligent cotton-picking robot (Benson, Reid, & Zhang, 2004) and the basis for its development. Auto-navigation is of great significance for increasing the operation quality and production efficiency of agricultural machinery: it improves the working environment, ensures the safety of workers, and reduces labor intensity.

Currently there are three important navigation methods: GPS navigation, sensor-fusion navigation, and visual navigation. GPS navigation technology is already well developed. New GPS techniques such as RTK (real-time kinematic), which uses a dynamic real-time carrier-phase difference method, can locate a vehicle in the field with real-time centimeter-level accuracy (Zhao, Jin, Zhou, Wang, & Dai, 2015). GPS navigation is fast and stable, with a simple structure and a simple setting of the cruise line; its main problem is that the satellite signal is sometimes strongly influenced by the environment. Sensor-fusion navigation employs various types of sensors, such as ultrasonic, infrared, and laser, to detect the robot's surroundings by measuring the distance from obstacles to the robot, from which the robot's movement path can be decided (Ji & Zhou, 2014). This technique is usually used for navigating a robot wandering in a small indoor environment, or as an assisted obstacle-avoidance means. These two navigation methods are generally used for specific applications and are limited by the specific agricultural environment. For navigation in the complex environment of farmland, the most prominent method is image-based visual navigation (Pilarski, Happold, & Pangels, 2002). This technique has grown into a primary navigation method for agricultural machinery because of its large amount of image information and its robustness (Li, Chen, Liu, & Tao, 2013). One or more cameras capture images of the robot's operating environment, and the navigation path is calculated in real time via computer vision (Ma, Zhao, & Yuille, 2016). Vision navigation can detect obstacles and targets simultaneously (Zhang, Chen & Zhang, 2008).
Compared with the other two methods, vision navigation has many technical advantages: for example, it can adapt to the complicated operating environment of the field, and it offers a wide detection range and rich, complete information. Sometimes a wide-angle lens is added to the camera to widen the view for image collection; in this case, the image distortion caused by the wide-angle lens needs to be corrected (Saeed, Lawrence, & Lowed, 2006).
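On the last point, lens distortion is commonly modeled with a radial polynomial and inverted numerically. A minimal sketch with a one-parameter radial model follows; this is a generic illustration of the idea, an assumption on our part, not the correction method used in the cited work:

```python
import numpy as np

def undistort_points(distorted, k1, iters=20):
    """Invert the one-parameter radial model  x_d = x * (1 + k1 * r^2)
    (r = |x| in normalized image coordinates) by fixed-point iteration."""
    x = distorted.copy()
    for _ in range(iters):
        r2 = np.sum(x ** 2, axis=-1, keepdims=True)
        x = distorted / (1.0 + k1 * r2)   # refine using current radius estimate
    return x

# Round trip: distort a normalized point, then recover it.
p = np.array([[0.5, 0.3]])
r2 = np.sum(p ** 2, axis=-1, keepdims=True)
p_d = p * (1.0 + 0.1 * r2)                # apply radial distortion with k1 = 0.1
p_u = undistort_points(p_d, 0.1)
```

For small k1 the fixed-point iteration contracts quickly, so a few iterations recover the undistorted coordinates to machine precision; production systems typically use a full multi-parameter model with calibrated coefficients.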
