Multi-robot teamwork is necessary for complex tasks that cannot be performed by a single robot. To obtain the required performance and reliability, a proper cooperative task structure must be developed, and the robots need to be intelligent enough to adjust to dynamic workloads and environments. The benefits of a team can be amplified when benevolence is combined with cooperation; benevolent behaviours among the team members are an extra benefit to the society. There is a flexible relation among intelligence, benevolence and emotions. We describe an emotion model to be used by each member of a multi-robot team. Considering some drawbacks of the existing approaches, we present an emotion-based multi-robot cooperation scheme with some benevolent characters.
Multi-robot systems are one of the main research topics across different application fields. Significant benefits such as reliability, performance and economic value can be gained by employing a multi-robot system instead of a single robot. In addition, a good level of robustness, fault tolerance and flexibility can be obtained from a multi-robot system because tasks are shared among the members.
Multi-robot systems are usually used to distribute activities and intelligence among the members, and this distribution process depends on the complexity of the problem. If a task is too complex, it needs to be divided into smaller subtasks, which are then distributed to the members of the team. A robot can play a satisfactory role by performing its assigned subtask with its limited ability and knowledge.
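The divide-and-distribute idea above can be sketched with a simple greedy allocator. This is only an illustration, not the chapter's method: the subtask names, workload values, and the least-loaded-first rule are all assumptions introduced here for clarity.

```python
import heapq

def allocate_subtasks(subtasks, num_robots):
    """Illustrative greedy allocation: assign each (name, workload)
    subtask to the currently least-loaded robot, so that the complex
    task is shared roughly evenly across the team."""
    # Min-heap of (current_load, robot_id) to find the least-loaded robot
    heap = [(0.0, r) for r in range(num_robots)]
    heapq.heapify(heap)
    assignment = {r: [] for r in range(num_robots)}
    # Assign the heavier subtasks first for a better balance
    for name, load in sorted(subtasks, key=lambda s: -s[1]):
        current, robot = heapq.heappop(heap)
        assignment[robot].append(name)
        heapq.heappush(heap, (current + load, robot))
    return assignment

# Hypothetical subtasks of one complex task, split between two robots
tasks = [("explore", 3.0), ("map", 2.0), ("carry", 2.5), ("scan", 1.0)]
plan = allocate_subtasks(tasks, 2)
```

Each robot thus receives only a segment of the overall task, matching the limited-ability, limited-knowledge role described above.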
The advantages of teamwork are widely accepted and applicable, from small groups up to the organization level. To date, several teamwork theories and models (Scerri et al., 2002; Kitano et al., 1999; Tambe, 1997) have been developed, considering coordination methods, communication methods among team members, team formation and re-formation methods, etc. The roles of emotions and their effects on coordination in human teams have already been investigated and supported by many psychologists. However, the roles and application of emotions in pure agent teams have not been studied adequately, although some limited research results strongly support the importance of emotion for pure agent systems (Nair et al., 2005; Scheutz, 2004; Gage, 2004; Murphy et al., 2002). During cooperation within a team, different behaviours need to develop among the agents, of which benevolence is one of the most important for the welfare of the team. In this chapter, we conjecture about how a multiagent team can augment its coordination capabilities with benevolent characters through the introduction of emotions.
While performing a task as a group, some agreements on cooperation and benevolent actions are needed to increase the overall group performance (as shown in Figure 1). The degree of benevolence depends on the cooperation level, the situation, the type of action, etc. So, what is a benevolent agent? To what extent should an agent be benevolent? What is the role of benevolence in a multi-agent system (MAS)? When is benevolence useful or fruitless for the acting agent and its colleagues? Such questions arise continually when the concept of benevolence is considered for AI systems. Is there any relation between benevolence and emotional state? Incorporating benevolence into MAS is a worthwhile research topic. In this chapter, we discuss how emotional state affects benevolent characters and the roles of emotion in teamwork, considering a multi-robot system. In the following section, we discuss benevolent agents and their characters.
Figure 1. Multiagent, cooperation and benevolence
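The stated dependence of benevolence on emotional state, workload and cooperation level can be made concrete with a minimal sketch. The function below is purely hypothetical: the scalar valence, the [0, 1] ranges, and the multiplicative combination are assumptions for illustration, not a model proposed in this chapter.

```python
def benevolence_level(valence, own_workload, cooperation_level):
    """Hypothetical benevolence score in [0, 1].

    valence           : emotional state in [-1, 1] (negative = distressed)
    own_workload      : the agent's current load in [0, 1]
    cooperation_level : how cooperative the team is, in [0, 1]

    A positive emotional state and a cooperative team raise the
    willingness to help; a heavy own workload lowers it.
    """
    raw = 0.5 * (valence + 1.0) * cooperation_level * (1.0 - own_workload)
    return max(0.0, min(1.0, raw))

# A contented, lightly loaded agent helps more than a stressed, busy one
happy = benevolence_level(valence=0.8, own_workload=0.2, cooperation_level=0.9)
stressed = benevolence_level(valence=-0.6, own_workload=0.9, cooperation_level=0.9)
```

Such a score could gate whether an agent volunteers for a teammate's subtask, which is the kind of emotion-benevolence coupling the questions above point towards.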
2. Benevolent Agent
In general, benevolent actions are necessary for task/goal sharing to be achieved with ease. There is no common agreement on the definition of a benevolent agent; definitions from different researchers differ slightly in their concepts. Philosophers and biologists relate benevolence to a pure concept of virtue, compassion and moral sentiment (Mohammed and Huhns, 2001). They describe a 'benevolent action' as the doing of a kind action to another out of mere good will and without any obligation. Jennings and Kalenka (1999) suggested selecting benevolence when describing a good decision-making function. Some researchers consider benevolence an important 'phenomenon' that exists in a team of autonomous agents, arising from the agents' emotions (Mohammed and Huhns, 2001).