Graphical User Interface for the Control of a Biped Robot

Claudio Urrea, Carlos Cortés Mac-Evoy
Copyright: © 2019 | Pages: 13
DOI: 10.4018/978-1-5225-8060-7.ch031

Abstract

The design and implementation of a graphical user interface (GUI) for the control and operation of a biped robot is presented. This GUI establishes communication between the user, the robot, and a computer (controller) so that the robot can perform bipedal walking without the user having to resort to commands that are not easy to use. The interface was created with the MATLAB-Simulink software and presents important advantages compared to the manual operation of a robot.
Chapter Preview

Introduction

The science of robotics has grown tremendously over the past 30 years, fueled by rapid advances in computer and sensor technology, as well as theoretical advances in control and computer vision. Humanoid biped robots, typically complex in design, have numerous Degrees Of Freedom (DOF) due to the ambitious goal of mimicking the human gait. In recent years, several efforts of the robotics community have focused on developing bio-inspired robots, particularly humanoid biped robots. Biped robots represent a very interesting research subject, spanning topics such as mechanical design, gait simulation, pattern generation, kinematics, dynamics, equilibrium, stability, control strategies, adaptability, biomechanics, cybernetics, and rehabilitation technologies.

Biped robots have leg-link structures similar to human anatomy. To maintain their stability under dynamic conditions, such robotic systems require sound mechanical designs and force sensors from which the Zero Moment Point (ZMP) can be obtained. Research in biped robotics has recently seen a great surge due to the challenges of the subject and the media impact of famous biped robotic creations such as Honda's robots.
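
To make the role of the force sensors concrete, the sketch below computes the ZMP of the support foot from a single ankle-mounted force/torque sensor. It is a minimal illustration, not the chapter's implementation: the sensor placement, the measured forces F = [Fx Fy Fz] (in N), the moments M = [Mx My Mz] (in N*m), and the sensor height d above the sole (in m) are all assumptions made here.

    % Minimal sketch (not from the chapter): ZMP of the support foot from a
    % single ankle-mounted force/torque sensor located a height d above the
    % sole. F and M are expressed in the sensor frame. Saved as zmpFromFootSensor.m.
    function [px, py] = zmpFromFootSensor(F, M, d)
        Fx = F(1); Fy = F(2); Fz = F(3);
        Mx = M(1); My = M(2);
        px = (-My - Fx * d) / Fz;   % ZMP x-coordinate on the sole plane [m]
        py = ( Mx - Fy * d) / Fz;   % ZMP y-coordinate on the sole plane [m]
    end

For example, zmpFromFootSensor([5 2 300], [3 -4 0], 0.08) returns the point on the sole about which the horizontal components of the ground-reaction moment vanish; keeping that point inside the support polygon is the usual stability criterion.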

The modeling of biped gait has usually resulted in descriptions of the kinematics, dynamics, and stability of two-legged walking robots, and it remains an unresolved challenge: most approaches reduce the dynamic model of the humanoid robot to an inverted pendulum, based either on simple observation or on a mathematical analysis derived from the kinematics.
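
As a brief illustration of that reduction, the script below integrates the sagittal dynamics of the linear inverted pendulum model, in which the center of mass (CoM) is kept at a constant height zc and obeys xddot = (g/zc)(x - px), with px the ZMP. The numerical values are illustrative only and are not taken from the chapter.

    % Minimal sketch of the inverted-pendulum reduction (illustrative values).
    g  = 9.81;     % gravity [m/s^2]
    zc = 0.80;     % assumed constant CoM height [m]
    px = 0.0;      % ZMP held fixed during this single-support phase [m]
    dt = 0.005;    % integration step [s]
    T  = 0.6;      % duration of the support phase [s]

    x  = -0.02;    % initial CoM position relative to the ZMP [m]
    xd =  0.15;    % initial CoM velocity [m/s]
    n  = round(T / dt);
    com = zeros(n, 1);
    for k = 1:n
        xdd = (g / zc) * (x - px);   % LIPM dynamics: xddot = (g/zc)(x - px)
        xd  = xd + xdd * dt;         % forward-Euler integration
        x   = x  + xd  * dt;
        com(k) = x;
    end
    plot((1:n) * dt, com); xlabel('time [s]'); ylabel('CoM position x [m]');

Closing the loop between this simplified model and the ZMP measured by the force sensors mentioned above is what makes the reduction usable in practice.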

Of significant importance here is robot software, i.e., the set of coded commands or instructions that tell a mechanical device and electronic system (known together as a robot) what tasks to perform. Robot software is used to perform autonomous tasks. Many software systems and frameworks have been proposed to make programming robots easier.

Due to the highly proprietary nature of robot software, most manufacturers of robot hardware also provide their own software. While this is not unusual in other automated control systems, the lack of standardization of programming methods for robots does pose certain challenges. For example, there are over 30 different manufacturers of industrial robots, and roughly as many proprietary robot programming languages in use. Fortunately, there are enough similarities between the different robots that it is possible to gain a broad-based understanding of robot programming without having to learn each manufacturer's proprietary language.

One of the most important aspects in the implementation and control of biped robots is the software used for their operation. It is what indicates where the robot must go, at what speed, and for how long. In other words, it is the software that articulates the whole robotic system, and even though the associated electromechanical parts play a key role in the development of a robot, it is not possible to achieve the proposed objectives for a system with these characteristics if one does not have the proper program that directs, coordinates, and commands all its movements.

A robot’s software can be as simple as a set of routines that order the robot to carry out a particular task, or as complex as a graphic program with a large variety of interactive interfaces. The experience and skill a user can gain by commanding a robot through a graphic program, compared with using bare routines and commands, is much greater. This clearly justifies the creation of interfaces with graphic components that make it easier for the user to operate a robot; both cases are sketched below.
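
The first case, a plain routine, might look like the following sketch. sendGaitCommand is a hypothetical placeholder for whatever interface the robot's controller actually exposes; it is stubbed out here so the example runs on its own and is not taken from the chapter.

    % Minimal sketch of the "set of routines" case: command nSteps walking
    % steps, then halt. Saved as walkForward.m.
    function walkForward(nSteps, stepLength, stepPeriod)
        for k = 1:nSteps
            sendGaitCommand('step', stepLength, stepPeriod);
            pause(stepPeriod);              % wait for the step to finish
        end
        sendGaitCommand('halt', 0, 0);
    end

    function sendGaitCommand(cmd, len, period)
        % Hypothetical stand-in for the controller interface: print the
        % command instead of sending it to real hardware.
        fprintf('command: %s, length = %.2f m, period = %.2f s\n', cmd, len, period);
    end

Calling walkForward(4, 0.10, 0.8) issues four step commands followed by a halt; the point of the sketch is that the user must know these names and parameters by heart.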

The Graphical User Interface (GUI) is a type of user interface (UI) that provides point-and-click control of software applications, eliminating the need to learn a language or type commands in order to run the application. Actions in a GUI are usually performed through direct manipulation of the graphical elements. In addition to computers, GUIs can be found in hand-held devices such as MP3 players, portable media players, gaming devices and smartphones, as well as in smaller household, office, and industrial equipment. The term ‘GUI’ tends not to be applied to lower-resolution types of interfaces such as video games, nor to displays other than flat two-dimensional screens, such as volumetric displays, because the term is restricted to the scope of two-dimensional display screens able to describe generic information, in the tradition of the computer science research at the Palo Alto Research Center (PARC) (Lii et al., 2015; Zhiguo, Chong, Jun, Habin, & Hong, 2014).
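
The second case, a graphic interface, can be sketched in a few lines of plain MATLAB. This is only an illustration of the point-and-click idea, not the chapter's Simulink-based GUI: each button simply wraps a command the user would otherwise have to type, and the callback onCommand is a placeholder.

    % Minimal GUI sketch (not the chapter's interface): two buttons that
    % wrap the commands a user would otherwise type at the prompt.
    % Saved as bipedGuiSketch.m.
    function bipedGuiSketch()
        f = figure('Name', 'Biped robot control (sketch)', ...
                   'MenuBar', 'none', 'NumberTitle', 'off', ...
                   'Position', [200 200 260 140]);
        uicontrol(f, 'Style', 'pushbutton', 'String', 'Walk forward', ...
                  'Position', [20 50 100 40], ...
                  'Callback', @(src, evt) onCommand('walk'));
        uicontrol(f, 'Style', 'pushbutton', 'String', 'Stop', ...
                  'Position', [140 50 100 40], ...
                  'Callback', @(src, evt) onCommand('stop'));
    end

    function onCommand(cmd)
        % Placeholder callback: a real interface would forward the request
        % to the robot controller instead of printing it.
        fprintf('user requested: %s\n', cmd);
    end

Running bipedGuiSketch opens a window whose labeled buttons hide the command names and parameters from the user, which is exactly the advantage argued for above.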
