Trust and Fairness Management in P2P and Grid Systems

Adam Wierzbicki, Tomasz Kaszuba, Radoslaw Nielek
DOI: 10.4018/978-1-61520-686-5.ch032

Abstract

Peer-to-Peer (P2P) and Grid systems serve whole communities of users, and are thus good examples of information systems that should realize social goals. One of the most important social goals is fairness. For that reason, P2P and Grid systems incorporate many mechanisms, algorithms and methods for providing fairness, referred to as Fairness Management in this chapter. Information systems that serve communities of users can also apply social concepts to support users. Trust is one such concept; it is often applied in P2P and Grid systems, which use Trust Management to support users in making decisions under uncertainty caused by other users' behavior. This chapter describes Trust Management and Fairness Management in P2P and Grid systems, showing that the two subjects are connected: Trust Management can be used to improve fairness without centralized control. The chapter includes a demonstration of fairness emergence due to trust management.

Introduction

Peer-to-Peer (P2P) and grid systems are instances of Open Distributed Systems that are now widely used in the e-society. For that reason, they must take into account the behavior and motivations of their human users, such as egoism or the exploitation of others, as well as the users' need for trust and fairness. Information technology aims to support trust and fairness by creating systems that encompass and use knowledge about these concepts. This chapter is devoted to a description of this trend in P2P and grid systems.

Trust management (TM) uses computational models of human trust in order to help users make decisions under uncertainty that is due to the actions of others. In a P2P or grid system, such a decision might concern sharing one's resources with another user when it is uncertain whether she will reciprocate. Users of P2P applications with purposes other than resource sharing, such as P2P games, must make other types of decisions. One of the difficulties of trust management research is that it is quite generic and can be applied in a variety of areas. In this chapter, we shall introduce a general model of TM developed in the Universal Trust (uTrust) project (http://uTrust.pjwstk.edu.pl) that will be used to organize the discussion of the various TM methods described in the literature on P2P and grid systems.

Trust management works by gathering information that can be used to quantify users’ trust or reputation. Since this information is usually obtained from untrusted third parties, it is not reliable and TM methods must take into account the credibility of received information and provide incentives for honesty (Fernandes et al., 2004). This problem demonstrates that TM methods in P2P and grid systems must be resistant to adversary behavior that exploits the lack of centralized trusted entities. A list of typical adversaries will be included in the chapter.
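
To make the idea of credibility weighting concrete, the following Python sketch (an illustration only, not the specific scheme of Fernandes et al., 2004; all names and parameter values are hypothetical) aggregates third-party reports about a peer while discounting reports from reporters with low credibility.

# Hypothetical sketch: aggregate third-party reports about a peer,
# weighting each report by the credibility assigned to its reporter,
# so that feedback from untrusted reporters has limited influence.
def aggregate_reports(reports, credibility, default_credibility=0.5):
    """reports: list of (reporter_id, rating) pairs, rating in [0, 1].
    credibility: dict mapping reporter_id -> weight in [0, 1]."""
    weighted_sum = 0.0
    weight_total = 0.0
    for reporter, rating in reports:
        weight = credibility.get(reporter, default_credibility)
        weighted_sum += weight * rating
        weight_total += weight
    # Fall back to a neutral score when no reports are available.
    return weighted_sum / weight_total if weight_total > 0 else 0.5

# Example: two credible reporters praise the peer, while one
# low-credibility reporter gives a very negative rating.
reports = [("alice", 0.9), ("bob", 0.8), ("mallory", 0.0)]
credibility = {"alice": 0.9, "bob": 0.8, "mallory": 0.1}
print(aggregate_reports(reports, credibility))  # about 0.81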

Many proposed TM methods in P2P and grid systems use reputation (i.e. information that is based on the history of agents’ behavior). However, reputation is always vulnerable to first-time cheating, as well as coalition or discrimination attacks (Dellarocas, 2000a). Other TM approaches (Wierzbicki and Kucharski, 2004; Wierzbicki and Kaszuba, 2007) avoid the use of reputation through the use of cryptography that allows better observation and verification of behavior, even with limited or no use of central control.
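
A simple way to see why history-based reputation remains vulnerable to first-time cheating is the minimal Python sketch below (our own illustration, not a method from the cited works): the reputation score is an exponential moving average of observed outcomes, so a newcomer necessarily starts from a neutral prior that no history can correct before the first interaction.

# Hypothetical history-based reputation update: each observed outcome
# (1 = cooperated, 0 = defected) is blended into a running score.
def update_reputation(current, outcome, alpha=0.2):
    """Exponential smoothing of the behavior history."""
    return (1 - alpha) * current + alpha * outcome

reputation = 0.5               # neutral prior for an unknown peer:
                               # history cannot prevent first-time cheating
for outcome in [1, 1, 0, 1]:   # observed behavior history
    reputation = update_reputation(reputation, outcome)
print(round(reputation, 3))    # 0.635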

P2P and grid systems require fair treatment of users. The problem of freeriders is prevalent in both types of systems, and providing fairness is an important goal, since the lack of fairness is a disincentive to participation in the system. Based on the theory of equity (Kostreva et al., 2004), this chapter will present methods to evaluate the fairness of various protocols or distributed algorithms in P2P and grid systems.
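
As one possible illustration of equity-based evaluation (a sketch following the general idea of comparing cumulative sums of ordered outcomes in equitable optimization, not necessarily the exact criteria used in the chapter), the Python fragment below checks whether one allocation of outcomes equitably dominates another.

# Sketch: compare two allocations by the cumulative sums of their outcomes
# ordered from worst-off to best-off (a Lorenz-like comparison).
def cumulative_ordered_outcomes(allocation):
    """Sort outcomes ascending and return their running sums."""
    ordered = sorted(allocation)
    sums, total = [], 0.0
    for value in ordered:
        total += value
        sums.append(total)
    return sums

def equitably_dominates(a, b):
    """True if allocation a is at least as good as b for every cumulative
    sum of worst-off outcomes (assumes equal numbers of agents)."""
    return all(x >= y for x, y in zip(cumulative_ordered_outcomes(a),
                                      cumulative_ordered_outcomes(b)))

# The same total bandwidth, shared evenly vs. very unevenly.
print(equitably_dominates([3, 3, 4], [1, 1, 8]))   # True
print(equitably_dominates([1, 1, 8], [3, 3, 4]))   # False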

The two subjects of fairness and trust are inherently connected. For example, it can be shown that fair behavior establishes expectancy trust. On the other hand, it can also be shown that when efficient trust management is used, the overall fairness of resource distribution in a system increases. In other words, fairness is an emergent property of trust management (Nielek, 2008). This is relevant because in distributed systems like P2P systems or the Grid, fairness must often be guaranteed without centralized control; trust management is one of the ways of achieving this goal. The chapter will include a demonstration of fairness emergence due to trust management.
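
The small simulation sketch below (a hypothetical illustration of the effect, with made-up parameters such as the trust threshold and update rate, not the experiment reported in Nielek, 2008) conveys the intuition: when peers grant resources only to sufficiently trusted requesters, freeriders quickly lose trust and receive less, so the distribution of received resources becomes fairer to cooperating peers.

# Hypothetical simulation: cooperators reciprocate after being served,
# freeriders never do. A request is granted only if the requester's
# trust score is above a threshold; trust is updated after each grant.
import random

random.seed(1)
N_COOPERATORS, N_FREERIDERS, ROUNDS = 8, 2, 200
peers = ["c%d" % i for i in range(N_COOPERATORS)] + \
        ["f%d" % i for i in range(N_FREERIDERS)]
trust = {p: 0.5 for p in peers}      # assumed single global trust score per peer
received = {p: 0 for p in peers}     # resources obtained by each peer

for _ in range(ROUNDS):
    requester = random.choice(peers)     # the serving peer is abstracted away
    if trust[requester] >= 0.4:          # trust-based decision to share
        received[requester] += 1
        outcome = 1.0 if requester.startswith("c") else 0.0  # reciprocation?
        trust[requester] = 0.8 * trust[requester] + 0.2 * outcome

coop = [received[p] for p in peers if p.startswith("c")]
free = [received[p] for p in peers if p.startswith("f")]
print("avg received by a cooperator:", sum(coop) / len(coop))
print("avg received by a freerider: ", sum(free) / len(free))

Under these assumptions, a freerider is served only until its trust drops below the threshold, after which almost all resources flow to reciprocating peers.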

Key Terms in this Chapter

Selective Aggregation Systems: Systems where proofs (information) are gathered from only a subset of agents. Such systems perform better than full aggregation systems but have lower accuracy.

Trust: There are many definitions of trust. In our work we adopt the definition of trust as a tolerance of risk; thus, trust and risk values can be expressed on the same scale.

Full Aggregation Systems: Systems that use all gathered proofs (information) as input to compute the requested value. Proofs are gathered from all agents available in the system.

Freerider: A type of malicious agent (adversary) that consumes more than its fair share of a resource. In a peer-to-peer environment, freeriders are detected by checking the share ratio, i.e. the amount of data the agent has uploaded divided by the amount of data it has downloaded.

Fairness: A measure used to determine whether agents are receiving a fair share of resources.

Proof: Feedback passed by an agent to the Trust Management system. A proof can be history-based (a report or observation) or delegation-based (a recommendation).

Adversary: A malicious entity whose aim is to weaken or destroy the system. In general, an adversary can operate independently or form more complex relationships with other adversaries (or with normal agents).
