Environments Diagnosis by Means of Computer Vision System of Autonomous Flying Robots


Konstantin Dergachov (National Aerospace University – Kharkiv Aviation Institute, Ukraine), Anatolii Kulik (National Aerospace University – Kharkiv Aviation Institute, Ukraine) and Anatolii Zymovin (National Aerospace University – Kharkiv Aviation Institute, Ukraine)
Copyright: © 2019 | Pages: 23
DOI: 10.4018/978-1-5225-7709-6.ch004


In this chapter, the authors present an approach to diagnosing the external environment based on visual information collected by autonomous robots. The possibility of using computer vision to implement rational control under full or partial uncertainty is investigated. In the study, combined hardware and software computer vision tools were verified. Models, algorithms, and code for solving the local tasks of obstacle identification and estimation of mutual-location kinematic parameters have been developed. A series of computational and field experiments illustrating the practical feasibility of navigational environment diagnosis is carried out with the aim of selecting a rational flight path.
Chapter Preview


At present, there is a trend toward reducing the number of remotely piloted unmanned aerial vehicles (UAVs) among aviation appliances. Increasingly relevant are UAVs capable not only of fully autonomous flight but also of performing monitoring, aerial photography, agricultural work, military missions, etc. Such a UAV becomes an autonomous flying robot (AFR), which executes its basic functions without human participation.

The progressive trend in AFR development is to extend the sphere of application by improving the functionality of the automatic control system.

As a rule, autonomous robots work in conditions that are partially or completely unknown. Path adaptation to situational uncertainty is possible (Odarchenko, Gnatyuk, Zhmurko, & Tkalich, 2015) through appropriate processing of the visual information obtained by the on-board video camera. The efficiency of the AFR adaptation process depends greatly on the quality of the diagnosis of external conditions.

The environment in which the robot moves can be characterized by situational uncertainty caused by a number of factors (Isermann, 2006).

AFRs operate under objectively existing causes that destabilize the fulfillment of the navigation task (Gulevich, Veselov, Pryadkin, & Tyrnov, 2012). The destabilizing factors for an AFR include events such as:

  1. The condition of the atmosphere in terms of atmospheric phenomena such as the presence or absence of wind, precipitation, solar activity, and cloudiness;

  2. The state of terrestrial and celestial visual orientators (their shift, changes of the reflecting surface);

  3. The occurrence of obstacles on the flight path (other flying vehicles and interfering objects).

So, destabilizing factors are real, objectively existing causes that need to be identified and evaluated in the course of performing the relevant mission (Kulik, 2014). Destabilizing impacts produce situational uncertainty in the external environment.

Such uncertainty can be reduced by procedures that perform operative diagnosis of the vicinity of the AFR route. Diagnostic procedures would allow, for example, discovering interference, then identifying and localizing it, i.e., providing a complete analysis of ambiguous events. This information is required in real time so that the autonomous robot can rapidly generate on board an obstacle-bypassing procedure that solves the navigation task rationally under naturally limited time and energy resources. Similar diagnostic functions should be applied when other destabilizing factors occur.

For an AFR, the task of navigating from the point of departure to the destination involves the need to identify obstacles in the path and bypass them.

Obstacles that arise on the path of AFR movement can be divided into two large groups: those the AFR can overcome without changing its motion trajectory, and those that can be avoided only by changing the trajectory. Hereafter, we investigate the second option, in which the AFR trajectory must necessarily be changed to avoid the obstacle (Kulik & Radomskyi, 2017). The following components of an intelligent diagnosis can be emphasized. It is necessary to provide:

  1. Obstacle detection;

  2. Determination of the distance to the obstacle;

  3. Determination of the own speed relative to the obstacle;

  4. Estimation of the obstacle size;

  5. Determination of the AFR angular attitude relative to the obstacle.
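As an illustrative sketch (not the authors' implementation), items 2, 3, and 5 above can be estimated with a simple pinhole-camera model, assuming the obstacle's real width and the camera's focal length in pixels are known from calibration; all function and parameter names here are hypothetical:

```python
import math

def distance_to_obstacle(focal_px, real_width_m, pixel_width):
    # Pinhole model: range Z = f * W / w, where f is the focal length
    # in pixels, W the known real obstacle width, w its width in pixels.
    return focal_px * real_width_m / pixel_width

def relative_speed(z_prev, z_curr, dt):
    # Closing speed from two range estimates taken dt seconds apart.
    return (z_prev - z_curr) / dt

def bearing_to_obstacle(focal_px, image_cx, obstacle_cx):
    # Horizontal angle between the optical axis and the obstacle centre,
    # from the pixel offset of the obstacle relative to the image centre.
    return math.atan2(obstacle_cx - image_cx, focal_px)
```

For example, an obstacle 2 m wide that spans 100 pixels under an 800-pixel focal length is estimated at 16 m; two such estimates a known interval apart give the closing speed.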

Key Terms in this Chapter

Environs Diagnosis: A set of activities aimed at reducing situational uncertainty in the operation of autonomous flying vehicles; for example, the measures include procedures for detecting, identifying, and localizing interference to ensure complete knowledge of outdoor circumstances.

Computer Vision Systems: Sensory assemblies that capture images of work scenes and objects; convert, process, and interpret those images using a UAV on-board computer; and then transfer the results to the management device.

Destabilizing Factors: Objectively existing interfering impacts that need to be identified and evaluated in the course of carrying out the respective mission.

Visual Reference, Visual Orientator: An object perceived by the human eye or an optical device as a sought visual form.

Off-Nominal Situations (Contingency Events): Events that are uncertain both with respect to the moment of their occurrence and with respect to the obscure nature of their cause.

Hough Transform: An algorithm used to extract image elements. The technique supports image analysis and digital processing in a variety of computer vision facilities; it is designed to search for primitives belonging to certain shape types by means of a voting procedure.
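A minimal sketch of the voting procedure for straight lines, assuming edge points have already been extracted: each point votes for every (θ, ρ) cell consistent with it, and the cell with the most votes identifies the dominant line. The names and discretization here are illustrative, not from the chapter:

```python
import math

def hough_lines(points, width, height, n_theta=180):
    """Vote each edge point (x, y) into a (theta, rho) accumulator and
    return the best-supported line as (theta, rho, votes)."""
    diag = int(math.hypot(width, height))  # rho offset so indices stay >= 0
    acc = {}
    for (x, y) in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            # Normal form of a line: rho = x*cos(theta) + y*sin(theta)
            rho = int(round(x * math.cos(theta) + y * math.sin(theta))) + diag
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    (t_best, rho_best), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_best - diag, votes
```

A vertical line of edge points at x = 10, for instance, concentrates all its votes in the cell near θ = 0, ρ = 10, which this function returns as the winner.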

Computer Vision: Implements a complex process of extracting, identifying, and converting video information, which includes six basic steps: 1) obtaining (perceiving) information, or sensing; 2) preconditioning, or preprocessing; 3) segmentation; 4) description; 5) recognition; and 6) interpretation.
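The six steps above can be sketched, under strong simplifying assumptions, as a toy pass over a small grayscale frame; sensing (step 1) is taken as given, and all names and thresholds are hypothetical:

```python
def diagnose_frame(frame, thresh=128, min_area=4):
    """Toy end-to-end pass over a grayscale frame (list of pixel rows).
    2) preprocessing: threshold against a brightness level;
    3) segmentation: collect the bright pixels;
    4) description: bounding box and area of the bright region;
    5) recognition: area test against a minimum obstacle size;
    6) interpretation: decide whether to replan the trajectory."""
    bright = [(x, y) for y, row in enumerate(frame)
                     for x, v in enumerate(row) if v >= thresh]
    if not bright:
        return {"obstacle": False}
    xs = [p[0] for p in bright]
    ys = [p[1] for p in bright]
    bbox = (min(xs), min(ys), max(xs), max(ys))
    area = len(bright)
    obstacle = area >= min_area
    return {"obstacle": obstacle, "bbox": bbox, "area": area,
            "action": "replan" if obstacle else "continue"}
```

A real system would of course use calibrated imagery and far richer segmentation and recognition stages; the sketch only makes the order and role of the steps concrete.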
