Karen S. Cook, Chris Snijders, Vincent Buskens, Coye Cheshire

Research output: Chapter in Book/Report/Conference proceeding › Foreword/editorial › Academic



Trust facilitates social interaction. When it exists, it strengthens cooperation, provides the basis for risk-taking, and grants latitude to the parties involved. When it does not exist, various mechanisms are required to protect against exploitation. In its most basic form, trust can be reduced to a situation in which A knows that if she hands over control of the situation to B, B can choose between an action X and an action Y. Trust is involved when there is a real probability that B will choose the action A does not prefer (Coleman 1990). Many trust-related distinctions, often left implicit, can be clarified using this simple metaphor. Perhaps the most obvious is the distinction between trust, which is connected to A, and trustworthiness, which is connected to B (Hardin 2002), so that a trusting A is not complete without a trustworthy B. One can also distinguish between trust as a behavioral measure ("A trusts: she chooses to let B choose") and trust as a subjectively perceived probability, attached perhaps unconsciously to the event that B will choose the action A prefers ("A trusts: she feels that it is likely that B will choose the action A prefers"). Another distinction can be made according to what kind of entity B actually is: B can represent a brand or a product, an institution such as the government, or an abstract entity such as the general public. Throughout this book, however, the focus is on both A and B being persons. A related issue is whether trustful and trustworthy behavior are largely attitudinal ("some persons A have a stronger disposition to trust") or contextual ("under some conditions, persons A tend to trust more often"). We will encounter examples of both throughout this volume. For trust to actually work, the trusted party must be both willing and able to honor the trust placed in her.
In the absence of clear information about these two dimensions, trust reduces to risky decision making under uncertainty (Snijders 1996). If the risks are too high or the uncertainty too great, potentially valuable social interactions will not occur. In such environments, various institutional and organizational mechanisms are typically established to facilitate interaction. Sometimes third-party institutions, which offer assurance that all parties will live up to their promises, can overcome the need for trust between persons. One can view banks and credit card companies as attempts to realize an impartial third party (though the recent credit crisis has raised doubts about the trustworthiness of this particular third party). A mechanism that can help in the absence of stabilizing third parties is repeated interaction. In repeated interactions, each actor can test the other to determine whether the partner is both capable and willing to perform as desired (Axelrod 1984; Kreps 1990; Gibbons 2001). The repetition of moves helps because actors in a trust relationship can learn by observing the behavior of their partner and can control opportunistic behavior by applying future sanctions, more or less tit for tat. Moreover, repetition gives partners real incentives to invest in their reputations. In this case, the exchange of information occurs between two parties in a dyadic relationship. However, many interactions are embedded in a larger context of interactions in which repeated interaction with the same party is either unlikely or infrequent. Buyers and sellers on eBay are an example. In such settings, reputation can play a vital role in establishing trust in the absence of a long-term dyadic relationship. Even though the direct pathway of information flow from A to B and back is lacking, information can flow through third parties, and mutually cooperative relations can be built in principle in the same general way as in dyadic relationships.
This volume on trust and reputation is complementary to earlier volumes in the Russell Sage Trust series in the sense that we focus attention on reputation as a specific mechanism that can be used in many ways to facilitate interactions not likely to occur in the absence of such information. This mechanism in relation to trust and trustworthiness has been mentioned only occasionally in earlier volumes in the series. Although reputation can also play a role in repeated interactions between the same two actors, in this volume we focus on the effects of third-party information on trust. In a larger context, actors can build reputations for being trustworthy if information about their benevolent, reliable behavior reaches their potential future partners (Granovetter 1985). Information about a partner's capabilities and intentions flows through the network of actors, allowing all to learn about their potential future trust partners. On the other hand, partners can reward or punish trustworthy or untrustworthy actors if they can change existing reputations by distributing information about their experiences with specific actors in the network of possible future partners (for longer informal accounts of learning and control in such settings, see Yamagishi and Yamagishi 1994; Buskens and Raub 2002; for more formal theoretical discussions of such reputation effects in social networks, see Raub and Weesie 1990; Buskens 2002). Most of the chapters in this volume do not focus on disentangling reputation mechanisms in different contexts, but emphasize instead the effectiveness of and problems associated with reputation mechanisms in various situations. In contexts in which direct informal communication in a network is difficult to achieve, it becomes worthwhile to provide such information in a more institutionalized way. There are a number of historical examples of such institutions (for example, Greif 1989, 2006; Klein 1997).
More recently, reputation systems have also become well known in the world of Internet-based interactions. Typically, users can leave some kind of feedback about cooperation with their previous partner (as on eBay or LinkedIn), or about the quality of the goods they supplied (as on Amazon). This feedback is then made available to all other users, and those with a good reputation can benefit from their trustworthy behavior because their attractiveness as a future partner increases. Many chapters in this volume explicitly address the problems that arise in environments often characterized by anonymity and faceless interaction, such as Internet commerce and computer-mediated interaction more generally. And there are good reasons to place specific emphasis on these types of interactions. Information technology has made the problem of trust more salient. Many of the trust problems that can occur in the "real" world now have a virtual counterpart, but the conditions under which one has to solve trust problems online strongly differ from the conditions under which people are accustomed to solving such problems. Several factors make trusting behavior online much more difficult than offline. For instance, on the Internet one cannot see the other party face to face, and single-shot encounters are very common. However, the Internet also offers many new possibilities and innovative solutions to trust problems. Use of the reputation of the interacting partners is one potential solution, and is being used in many online settings (see, for example, Kollock 1999; Resnick and Zeckhauser 2002). Many research questions related to anonymous interactions in which reputation mechanisms are used to solve trust problems are addressed in this volume. How are such interactions made possible organizationally? What safeguards are necessary and what role does reputation play? 
What types of reputation systems work best to provide the needed clues to trustworthiness or ready access to organizational and legal means of redress? How are reputations repaired if damaged? Can they be? What are the determinants of effective monitoring in the world of online interaction and commerce? What role does reputation information play in the effort to provide assurance and some control over malfeasance? In the next section, we discuss briefly the nature of the contributions of each chapter in this volume and conclude with a postscript to identify important questions for future research that the work reported here raises.

Original language: English
Title of host publication: eTRUST
Subtitle of host publication: Forming relationships in the online world
Publisher: Russell Sage
Number of pages: 12
ISBN (Print): 9780871543110
Publication status: Published - 1 Jan 2009


