Socio-Cognitive Model of Trust

Rino Falcone (Institute for Cognitive Sciences and Technology, National Research Council of Italy, Italy) and Cristiano Castelfranchi (Institute for Cognitive Sciences and Technology, National Research Council of Italy, Italy)
DOI: 10.4018/978-1-60566-026-4.ch558
Humans have learned to cooperate in many ways and in many environments, on different tasks, and in pursuit of many different goals. Collaboration and cooperation in their most general sense (and, in particular, negotiation, exchange, help, delegation, adoption, and so on) are important characteristics - or better, the most foundational aspects - of human societies (Tuomela, 1995). In the evolution of cooperative models, a fundamental role has been played by diverse constructs of various kinds (purely interactional, technical-legal, organizational, socio-cognitive, etc.), deliberately introduced (or spontaneously emerged) to support decision making in collaborative situations. The new scenarios we are destined to meet in the third millennium transfigure the old frame of reference: we have to consider new channels and infrastructures (i.e., the Internet), new artificial entities to cooperate with (artificial or software agents), and new modalities of interaction (suggested or imposed by both the new channels and the new entities). What is changing, in fact, is the identification of potential partners, the perception of the other agents, the spatio-temporal context in which interactions happen, the nature of the traces interactions leave, the kind and role of authorities and guarantees, and so on. To cope with these scenarios, it will be necessary to update the traditional constructs that support decision making. This effort will be needed especially to develop the new cyber-societies in such a way as not to lose some of the important cooperative characteristics that are so relevant in human societies.
Trust in the New Technological Scenarios

In these scenarios, several different kinds of trust should be modeled, designed, and implemented.

Key Terms in this Chapter

Trust: The attitude of an agent to delegate a part of its own plan/goal to another agent and to rely upon it in a risky situation (possible failure), on the basis of its own beliefs about the other agent and about the environment in which it operates.

Task: An action and/or a goal that an agent has to realize, as delegated by another agent; thus, from the opposite perspective, the action/goal pair that an agent intentionally delegates to another agent, where the delegating agent knows at least one of the two (the action or the goal).

Trustier: The trusting agent in the trust relationship.

Trustee: The trusted agent in the trust relationship.

Reputation: The estimated trustworthiness of an agent as derived from the communicated opinions of other parties (directly or indirectly received); the resulting, emergent "common opinion" about the agent's trustworthiness.

Ubiquitous Computing: The trend in technological development toward integrating information-processing and communication capabilities into any kind of object.

This work was previously published in the Encyclopedia of Information Science and Technology, edited by M. Khosrow-Pour, pp. 2534-2538, copyright 2005 by Information Science Reference, formerly known as Idea Group Reference (an imprint of IGI Global).
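The Trust and Task definitions above have a natural computational reading: a trustier delegates a task only when its beliefs about the trustee, weighed against the risk of failure, are strong enough. The sketch below illustrates this reading only; the belief names (competence, willingness), the multiplicative combination, and the risk threshold are our own simplifying assumptions, not the authors' formal model.

```python
from dataclasses import dataclass

@dataclass
class TrustBeliefs:
    """A trustier's beliefs about a trustee for a given task (each in [0, 1])."""
    competence: float   # belief that the trustee is able to perform the task
    willingness: float  # belief that the trustee will actually perform it

def degree_of_trust(b: TrustBeliefs) -> float:
    # Illustrative combination: both ability and motivation must be high
    # for the overall degree of trust to be high.
    return b.competence * b.willingness

def decide_delegation(b: TrustBeliefs, risk_threshold: float) -> bool:
    """Delegate only if the degree of trust exceeds the risk of the situation."""
    return degree_of_trust(b) > risk_threshold

# A competent and willing trustee passes the threshold; a competent
# but unmotivated one does not.
reliable = TrustBeliefs(competence=0.9, willingness=0.8)
flaky = TrustBeliefs(competence=0.9, willingness=0.3)

print(decide_delegation(reliable, risk_threshold=0.5))  # True  (0.72 > 0.5)
print(decide_delegation(flaky, risk_threshold=0.5))     # False (0.27 < 0.5)
```

Note how the risky-situation aspect of the definition enters only through the threshold: the same beliefs can justify delegation in a low-stakes context and forbid it in a high-stakes one.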
