TRUST AND CONFIDENCE AND THEIR RELATION TO BLOCKCHAIN TECHNOLOGIES
By Carlotta Pareschi
Trust and confidence
Trust and confidence are important concepts in interpersonal relationships and institutions. Although they are related, they have distinct meanings. Trust is multifaceted with different legitimate interpretations, while confidence is more precise and distinct (De Filippi, Mannan, Reijers, 2020)[1].
Trust
Trust is a complex social phenomenon with no single definition, yet scholars generally agree that trust involves a certain degree of risk and vulnerability. More precisely, trust is defined as a relationship in which one party, the trustor, voluntarily puts itself in a vulnerable position by relying on another party, the trustee, to achieve a particular task. Trust involves a choice between trusting another party or opting for a seemingly more secure alternative. Trust is beneficial as it allows the trustor to delegate tasks and reduce direct involvement, but it also carries inherent risks, especially in cases of information and power asymmetries. Because it relies on others' knowledge or intentions, trust puts the trustor in a vulnerable position (De Filippi, Mannan, Reijers, 2020). Trust is not taking a risk per se, but rather making oneself vulnerable in taking that risk. The concept of vulnerability is closely linked with trust, as being vulnerable implies there is something important to be lost. Historically, the nature of trust has been obscured by terms such as cooperation and predictability which, despite having different meanings, have been used as synonyms for trust (Mayer, Davis, Schoorman, 1995).
Cooperation
Even though the two concepts are linked, since trust frequently leads to cooperation, trust is not a necessary condition for cooperative behavior to occur. As a matter of fact, cooperation does not necessarily put a party at risk. Two employees may cooperate because a powerful manager would punish deviations from the focal interests, or because they do not perceive any risk of betrayal from each other. Thus, it is possible to cooperate without trust, especially in the absence of vulnerability, when external control mechanisms are in place or when the alignment of interests between the trustor and the trustee is clear (Mayer, Davis, Schoorman, 1995).
Predictability
Both prediction and trust are means of uncertainty reduction. However, trust goes beyond predictability, as it involves risk. If the two concepts were equated, a party expected to consistently engage in self-serving behavior would have to be trusted simply because it is predictable. However, the trustee's predictability is not sufficient for the trustor to be willing to take the risk and trust (Mayer, Davis, Schoorman, 1995).
Confidence
In a situation of confidence, no choice has to be made, as people do not perceive the risk that their expectations might remain unfulfilled: the fault for wrong outcomes will naturally be attributed to third parties or unforeseeable events. While trust is characterized by an awareness of the risk of a possible betrayal or abuse of power on the part of the trustee, confidence is based on assurance and predictability. Confidence emerges from prior experience, statistical evidence, or reliance on experts, and it does not entail personal vulnerability. Unlike trust, confidence is not a voluntary disposition but a cognitive state of expectation about the future. Since confidence is associated with a sense of predictability and a reduced feeling of risk, it allows for practical conduct and facilitates entering into contracts and obligations without relying on third-party enforcement. Confidence does not require communication or mutual commitment to exist; rather, it can be built through individual cognitive processes. This psychological quality makes confidence harder to assess and identify than trust. While a trust relationship can be broken by a single act of a single party, the state of confidence is less fragile, as it requires a breach of expectations across a broader context. Nowadays, confidence is necessary in the political, economic and social systems of complex societies, as the latter would not be able to operate simply on the basis of interpersonal trust (De Filippi, Mannan, Reijers, 2020).
Sources and Objects of Trust
There exist different perspectives on the sources and objects of trust. Some view trust as a psychological attitude or a leap of faith, emphasizing belief in and commitment to the trustee. Others see trust as a rational choice to achieve goals efficiently, one that requires an evaluation of the trustee's trustworthiness. Trust can also emerge as routine acceptance or as an altruistic gesture that seeks to establish stronger relationships (De Filippi, Mannan, Reijers, 2020).
It is possible to assess the trustworthiness of individuals in two main ways, either through repeated direct interactions or through the trust conferred by others (e.g. diplomas, reference letters). By contrast, trust in institutions involves a sustained belief in the alignment of the institution with the trustor's interests, even when the trustor does not fully comprehend the internal workings of the institution. Trusting an institution also involves trusting the people in charge of the design, production and administration of the system, as they have the power to change its operations and to foster their personal interests. As a matter of fact, mistrust in institutions arises when people develop negative attitudes or skepticism towards the integrity of these internal actors (De Filippi, Mannan, Reijers, 2020).
Technological arrangements differ as they rely on predefined rules and predictable operations. Because of this, technologically-run institutions are often perceived as more trustworthy, leading to greater confidence rather than trust. Indeed, understanding the technology eliminates the need for trust, as individuals have confidence in its operations (De Filippi, Mannan, Reijers, 2020).
The Relationship Between Trust and Confidence
Despite their distinct meanings, confidence and trust are interrelated concepts. First, confidence in a system depends on trust in the actors or institutions involved in higher-order systems. The latter must provide the guarantees necessary to build expectations on matters that cannot be verified by anyone. Second, confidence in a particular system can contribute to the establishment of trust relationships in other systems. For instance, one may be more willing to trust a doctor, rather than a friend, for medical advice because of the degree the doctor possesses, provided that there is enough confidence in the scientific community. Thus, confidence operates as a platform for trust, enabling easier trust relationships in lower-order systems when there is confidence in higher-order systems (De Filippi, Mannan, Reijers, 2020).
The Need for Trust
The decline in trust following events such as the 2008 global financial crisis and the abuse of information by institutions and online platforms has a negative effect on financial development and highlights the importance of rebuilding trust in finance and organizations. Indeed, a consequence of these crises is that investors become less inclined to participate in contracts, leading to a shift towards safer assets and higher costs for equity and long-term financing. This, in turn, impacts fast-growing and innovative companies that depend on these funding sources (Guiso, 2010). At the same time, at an organizational level, trust is crucial for effective collaboration and achieving goals. While control systems and legal remedies aim to prevent self-serving behavior and maintain trust, they are seen as weak substitutes. Finally, increasing workforce diversity and the use of work teams highlight the importance of trust in facilitating productive interactions among people with different backgrounds (Mayer, Davis, Schoorman, 1995).
A model of dyadic trust
The present article provides a review of the model of trust between two individuals, a trustor and a trustee, proposed by Mayer, Davis and Schoorman (1995)[2]. By taking into account the characteristics of both parties in the relationship and clearly defining the link between risk, trust, its antecedents and its outcomes, the authors conclude that a certain level of trust and perceived risk in a situation will lead to risk taking in the relationship.
The Trustor
Some parties are more likely to trust than others. This happens because different trustors have different propensities to trust. Propensity to trust is defined as a stable within-party factor that affects the likelihood that a party will trust. Propensity influences the level of trust one has for a trustee before any data on that particular trustee are available. Nevertheless, in order to fully understand how trust works, the trustor's propensity to trust is insufficient: the characteristics of the trustee must be explored as well to assess his or her level of trustworthiness.
The Trustee
Previous research suggests that the credibility of the trustee depends on his or her expertise in the sector and on trustworthiness, the latter referring to the trustee's motivation to lie. More recent work shows that trust depends on expectations of how a person will behave, based on current and previous claims and on his or her competence and integrity. In the present article, the three factors that are examined, both individually and in their interrelations, to determine the trustworthiness of individuals are ability, benevolence and integrity.
Ability
Ability comprises the skills and competences that enable a party to have influence in specific domains. Thus, trust is domain specific.
Benevolence
Benevolence involves some specific attachment between two parties. It is the extent to which a trustee is believed to want to do good to the trustor, aside from profit motives. Research shows that the more benevolent a person is, the lower his or her motivation to lie will be.
Integrity
Integrity is assessed through the trustor's perception that the trustee adheres to a set of principles that the trustor accepts. The perceived level of integrity of a party is affected by the consistency of its actions, by credible claims about it from other parties and by its sense of justice. In evaluating trustworthiness, the perceived level of integrity is more important than the reasons why that perception is formed.
Interrelationship Between the Three Factors
Trust for a trustee is a function of the trustee's perceived ability, benevolence and integrity and of the trustor's propensity to trust. Even if ability, benevolence and integrity are separable traits, alone they cannot explain the trustworthiness of individuals. Evidence may show that an individual is well integrated in the company, yet lacks the knowledge and capabilities to contribute to a particular task. Thus, integrity by itself does not imply trust. Likewise, a person whose integrity is well known and whose abilities are stellar may still have no particular attachment to the individuals he or she works with or for. The consequence may be that the less benevolent person is more prone to use information for self-serving behavior rather than to contribute to the general goals. Again, benevolence alone cannot cause trust. In short, a perceived lack of any of the three factors may undermine trust. Even if all factors are assessed to be high in an individual, trustworthiness should be thought of as a continuum. There may be cases in which the trustor's propensity to trust is high enough for trust to develop even with lower degrees of trustworthiness of the trustee.
The proposed model uses the concept of propensity to trust to explain how trust develops between two parties before any relationship exists. Then, as the trustor and the trustee enter the relationship, data on the integrity of the trustee become available through third-party sources. The level of integrity is especially important early in the relationship. As the two parties start interacting, the trustor also learns about the trustee's benevolence and abilities, and alters the relative importance of the three factors of trustworthiness.
With this said, it is crucial to distinguish between the trustor's and the trustee's characteristics. To understand the extent to which a trustor is willing to trust, both the trustor's propensity to trust and his or her perceptions of the trustee's ability, benevolence and integrity must be assessed.
Risk-taking in Relationship
Risk is essential in a model of trust. Risk is intrinsic to the behavioral manifestation of the trusting action. In other words, if trust is the willingness to assume risk, behavioral trust is the actual assuming of risk. This important differentiation separates trust from its outcomes. Indeed, trust leads to risk taking in relationship (RTR) and to different risk-taking behaviors, depending on the situation. In any case, the amount of trust affects how much risk a party will take.
To explain trust, it is therefore necessary to separate it from other situational factors that necessitate trust, such as perceived risk. The assessment of risk takes into account both the context, including control systems and social influences, and the stakes in a situation, such as potential gains and losses. Finally, comparing trust with risk, if the level of trust is greater than the threshold of perceived risk, the trustor will engage in the RTR; otherwise, he will not. In sum, RTR is explained as a function of trust and of the perceived risk of the trusting behavior.
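To make the structure of the model concrete, the sketch below restates it in code. Mayer, Davis and Schoorman formulate the model as verbal propositions rather than equations, so the weighted average of the three factors, the multiplicative role of propensity and the threshold comparison are illustrative assumptions, not the authors' specification.

```python
from dataclasses import dataclass

@dataclass
class Trustee:
    ability: float      # perceived ability, 0..1
    benevolence: float  # perceived benevolence, 0..1
    integrity: float    # perceived integrity, 0..1

def trust_level(trustee: Trustee, propensity: float,
                weights: tuple = (1.0, 1.0, 1.0)) -> float:
    """Stylized trust score: a weighted mix of the three trustworthiness
    factors, moderated by the trustor's propensity to trust.
    The functional form is an illustrative assumption, not the authors' model."""
    w_a, w_b, w_i = weights
    factors = (w_a * trustee.ability + w_b * trustee.benevolence
               + w_i * trustee.integrity) / sum(weights)
    return propensity * factors

def engages_in_rtr(trust: float, perceived_risk: float) -> bool:
    """Risk taking in relationship (RTR) occurs only when trust
    exceeds the threshold of perceived risk in the situation."""
    return trust > perceived_risk

# Example: decent trustworthiness and a trusting trustor, under two levels of stakes
colleague = Trustee(ability=0.8, benevolence=0.6, integrity=0.7)
t = trust_level(colleague, propensity=0.9)
print(engages_in_rtr(t, perceived_risk=0.5))  # True: trust outweighs perceived risk
print(engages_in_rtr(t, perceived_risk=0.8))  # False: stakes too high for this level of trust
```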
The Role of Context
The trustor’s perception and interpretation of the context affects both the need for trust and the evaluation of trustworthiness of the trustee. The stakes involved in the relationship, the distribution of power and the perception of risk are all contextual factors affecting the outcome of trust. Also, the antecedents of trust, i.e. ability, benevolence and integrity, are influenced by the context. The perceived level of ability depends on the tasks to be accomplished. At the same time, perceived benevolence between two parties is affected by the perceived level of similarity in a particular context. The context influences the perceived level of integrity as well. A trustee may perform actions that are inconsistent with earlier decisions, in such a way that its integrity could be questioned. However, if the trustee in question is simply responding to orders from higher levels in the organization, its integrity will no longer be questioned. In short, strong control systems in an organization could put at risk the development of trust since the trustee’s actions may be responses to the control system rather than signs of trustworthiness.
Long-term Effects
To fully understand the concept of trust it is necessary to consider its evolution as the two parties interact. This dynamic nature of trust is visualized in the proposed model by a feedback loop from the outcome of trust, i.e. RTR, to the perceived characteristics of the trustee. This suggests that the outcomes of trusting behaviors affect trust indirectly, through the perceptions of ability, benevolence and integrity at the next interaction. For example, a poor performance on a task may lead the trustee to be assessed as less trustworthy. Depending on the situation, the trustor may attribute the poor performance to a lack of ability, integrity or benevolence and keep this information in mind for future interactions.
The model of dyadic trust proposed by Mayer, Davis and Schoorman (1995) is the first to consider the characteristics of both the trustor and the trustee. The model also differentiates between the antecedents of trust, its outcomes and trust itself, so that every element can be analyzed in detail. Likewise, the critical role of risk in trusting relationships is assessed. Perceptions of the trustee's ability, benevolence and integrity are the antecedents of trust that are to be measured. The RTR is then the outcome, and it must be assessed in terms of actual risk-taking behavior, not in terms of the willingness to engage in risky actions. All assessments are situation-specific, to fully take into account the influence of the context on the development of trust. However, the proposed theory has several limitations. First, it focuses on the trust between a specific trustor and trustee. Second, the trust analyzed is unidirectional; it is not mutual trust between two parties. Finally, trust evolves over time, and a critical issue is to further explore the process by which trust develops in long-term interactions.
THE CONSEQUENCES OF A FALL IN TRUST
The Recent Fall in Trust
Guiso (2010)[3] explains in his paper "A Trust-driven Financial Crisis. Implications for the Future of Financial Markets" that the concept of trust recently gained increased interest due to events like the global financial crisis and information abuses by institutions and online platforms. He argues that a crucial factor explaining the deterioration in economic activity that followed the crisis was the collapse in trust. The crisis revealed opportunistic behaviors and fraud, exemplified by the Bernard Madoff case, which eroded trust in the financial industry. The destruction of trust has important implications for the future of financial markets. It affects the demand for financial products, investors' portfolio choices, reliance on financial intermediaries, and the need for regulation. Thus, unless measures are taken to rebuild trust, the consequences will likely be long-lasting, as trust takes time to develop. The present chapter reviews some potential policies to rebuild trust in financial markets and intermediaries proposed by Guiso (2010), including changes in behavior within the financial industry and regulatory interventions. Furthermore, these recent events triggered a new attitude toward sociotechnical systems, causing people to question the necessity of relying on third parties. This has given rise to the development of blockchain technology, which is seen by its users as a "trustless" system. In the final section of the chapter, the analysis conducted by De Filippi et al. in their paper "Blockchain as a confidence machine: The problem of trust & challenges of governance" is examined to delve into the concept of trust within the new framework of blockchain.
Possible Policies to Rebuild Trust in Finance
The Regulatory Approach
The regulatory approach has been a common strategy to rebuild trust in the financial industry. However, it is important to note that regulatory measures alone may have limited impact on restoring trust. Some regulatory proposals aim to address the failures and misconduct exposed during the financial crisis, but they may not directly address the underlying issue of trust. From an individual investor's perspective, the creation of a consumer protection agency and initiatives to combat financial crimes can contribute to rebuilding trust. These targeted interventions, aimed at protecting investors from abuses and fraudulent activities, have the potential to restore confidence. However, regulatory interventions imposed from outside the industry may face challenges. Financial intermediaries may seek to circumvent regulations, especially when the actual enforcement is weak, diminishing their impact on trust. Additionally, regulations designed to protect investors may also impose burdens and inconveniences on them, leading investors to tolerate some misapplications of the rules by intermediaries. In short, while regulatory measures can play a role in rebuilding trust, they may need to be accompanied by other initiatives and considerations.
An Industry-based Strategy
Research suggests that individuals tend to trust those who are similar to them in some way. This similarity can be based on various dimensions, such as facial resemblance, cultural background, or geographic origin. One industry-based strategy to rebuild trust is to improve the match between investors and their financial intermediaries. For example, assigning a manager of the same gender or geographical origin as the investor may enhance trust. This strategy aims to create a sense of affinity and familiarity between the investor and the intermediary, which can contribute to rebuilding trust. However, it should be noted that this approach may primarily affect the average level of trust among investors and may not directly address the trust of those who have already lost it.
A Rating System
To address conflicts of interest and enhance trust in financial intermediaries, one potential solution is to adopt a rating system that evaluates banks based on their trustworthiness and fairness when dealing with customers, managing portfolios, and providing financial advice. This "bank-fairness index" would provide a simple metric, such as a rating on a scale from 0 to 10, that even financially illiterate investors can understand. The purpose of this system is to make information about banks more accessible and comprehensible to the average investor. However, implementing a rating system for banks may face challenges, such as finding independent and uncorrupted rating agencies. Additionally, the initial phase of the process may be difficult, as honest intermediaries may hesitate to subject their banks to the rating system while competitors are still benefiting from exploitative practices. Still, given the increasing value of rebuilding reputation, regulation that creates incentives for banks to prioritize trustworthiness and fairness can help establish industry-wide behavior in which all players engage in honest practices.
A Trust-based Compensation Scheme
To directly incentivize trustworthiness and raise trust levels, a trust-based compensation scheme can be implemented. Under this scheme, the compensation of an investor's manager or asset manager would depend on the level of trust that investors have in them. This provides strong incentives for managers to behave in a trustworthy manner, as their compensation is directly tied to the trust of their customers. The trust-based compensation scheme can utilize the information collected from investors to comply with regulatory directives, such as the EU's Markets in Financial Instruments Directive (MiFID). Specific questions on trust can be included, allowing investors to anonymously report their level of trust in the intermediary, portfolio manager, and others involved in financial decision-making. The manager's pay would then be adjusted based on the level of trust or its change within their customer base.
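As a purely illustrative sketch of how such a scheme could be wired into pay (the 0-10 survey scale, the linear adjustment and the `sensitivity` parameter are assumptions for illustration, not part of Guiso's proposal), a manager's bonus could be indexed to the change in average trust reported anonymously by his or her customers:

```python
def trust_adjusted_bonus(base_bonus: float,
                         trust_scores_now: list,
                         trust_scores_before: list,
                         sensitivity: float = 0.5) -> float:
    """Adjust a manager's bonus by the change in average customer trust.
    Scores are assumed to come from anonymous questionnaires on a 0-10 scale;
    both the scale and the linear adjustment rule are illustrative assumptions."""
    avg_now = sum(trust_scores_now) / len(trust_scores_now)
    avg_before = sum(trust_scores_before) / len(trust_scores_before)
    delta = (avg_now - avg_before) / 10  # normalize the change to [-1, 1]
    return base_bonus * (1 + sensitivity * delta)

# Example: average reported trust rose from 6.0 to 7.5, so the bonus increases
print(trust_adjusted_bonus(10_000, [8, 7, 7.5], [6, 6, 6]))  # 10750.0
```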
Promoting Financial Education
Unsophisticated investors with lower levels of financial education and experience are more vulnerable to deception by intermediaries. They rely heavily on the advice provided by intermediaries for their financial choices and are more likely to interpret negative investment returns as being cheated. Studies have shown that individuals with low levels of education are more likely to be deceived by banks or insurance companies. By taking actions that support financial education, such as advocating for financial education to be taught in schools and making certified educational material available to investors, intermediaries can empower investors to make informed financial decisions.
The decline in trust within the financial industry has significant implications for investors and for the cost of risk capital. While the downward revision of overly optimistic trust levels can help punish dishonest actors and restore market discipline, it also hinders honest intermediaries from attracting capital. To rebuild trust, Guiso (2010) proposes several measures that aim to limit opportunistic behavior and increase intermediaries' trustworthiness. These measures, such as a bank-fairness rating system, a trust-based compensation scheme and the promotion of financial education, are not mandated but left to the discretion of the intermediaries. However, there is no guarantee that all intermediaries will voluntarily adopt them, especially if dishonest behavior prevails in the industry. Regulatory agencies can play a crucial role in coordinating the adoption of these measures and encouraging intermediaries to choose the path of honesty. Regulatory agencies can influence the behavior of both honest and dishonest intermediaries, leading to a more trustworthy and competitive industry outcome.
Blockchain: Trustless Technology or Confidence Machine? - what happens to trust?
Blockchain technology emerged as a potential solution to the decline in trust toward third parties. By offering an immutable and transparent technological system, blockchain eliminates the reliance on trust and helps to mitigate principal-agent issues. This unique characteristic grants blockchain the status of a "trustless" technology.
Most definitions of blockchain are grounded in the negative argument that such systems enable a "shift from trusting people to trusting math", securing transactions through reliance on deterministic computation. This way, by eliminating trust, the inherent risks should also be removed. In contrast, the positive argument analyzed by De Filippi et al. in the paper "Blockchain as a confidence machine: The problem of trust & challenges of governance" is that blockchain-based systems are intended to indirectly reduce, but not completely eliminate, the need for trust by maximizing the degree of confidence.
When it comes to complex systems, expectations about correct operations take longer to develop, as a lack of trust in any of the constitutive parts might bring people to distrust the system as a whole. These systems should, therefore, focus on guaranteeing the transparency and predictability of their operations, so that, for any given input, anyone could verify the trustworthiness of the internal workings. In the special case of technological operations, the higher the predictability of the code, the higher the confidence in the system and the lower the need for trust in the developers and/or operators involved.
Confidence in Blockchain Technology
In blockchain-based systems, multiple elements contribute to the emergence of confidence. First, confidence arises from the high predictability of the mathematical algorithms that constitute the foundation of blockchains. This also includes confidence and trust in the core programmers and in the sanctioning mechanisms that ensure the correct performance of the system. Second, economic incentives, the "consensus" algorithm of blockchain-based systems and the transparency given by their open-source nature reduce the risk of individual opportunism and enable anyone holding a copy of the blockchain to verify the legitimacy of transactions. Thus, perceived automation and impartiality are considered the new sources of confidence in such systems.
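The verifiability point can be illustrated with a minimal sketch: because the hash of each block is a deterministic function of its contents and of the previous block's hash, anyone holding a copy of the chain can recompute every hash and detect tampering. The toy block structure below is an assumption made for illustration and does not reproduce the data format of any particular blockchain.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents: the same input always yields the same digest."""
    payload = json.dumps({k: block[k] for k in ("prev_hash", "transactions")}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def verify_chain(chain: list) -> bool:
    """Anyone with a copy of the chain can recompute every hash and check the links."""
    for i, block in enumerate(chain):
        if block_hash(block) != block["hash"]:
            return False  # block contents were altered after the hash was recorded
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

# Toy chain of two blocks
genesis = {"prev_hash": "0" * 64, "transactions": ["alice->bob:5"]}
genesis["hash"] = block_hash(genesis)
second = {"prev_hash": genesis["hash"], "transactions": ["bob->carol:2"]}
second["hash"] = block_hash(second)

chain = [genesis, second]
print(verify_chain(chain))                    # True: the chain is internally consistent
chain[0]["transactions"] = ["alice->bob:500"]
print(verify_chain(chain))                    # False: tampering changes the recomputed hash
```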
Trust in Blockchain Technology
Confidence in blockchain-based systems may still be undermined by core developers fostering their personal interests or by collusion among miners and mining pools. Hence, the need for trust is not eliminated. Characteristics such as decentralization, censorship resistance and automation not only increase confidence in blockchain-based systems, but also require a certain level of trust in both internal and external operators. Among the actors involved are miners, programmers, validators and regulators. Blockchain-based systems reduce the need for trust in these individual actors, but do not eradicate it. Accordingly, despite the decentralization of the system, some oversight is still necessary to ensure proper functioning, and trust is required from at least four types of actors involved in the operations of the network. These actors are:
- Economic players, such as mining pools and exchanges, given their centralized control and ability to influence the network.
- Core developers and contributors can make decisions that impact the system's evolution and have political and economic implications.
- Cryptocurrency holders and users may have a voice in governance, and conflicting interests can complicate decision-making.
- Regulators also play a role in approving or disapproving blockchain technology, affecting its adoption and trust.
Thus, labeling blockchain as "trustless" is misleading because trust is displaced rather than eliminated.
Governance in Blockchain-based Systems
Blockchain networks were at first intended to operate as polycentric governance systems, involving various actors such as miners and validators who collectively secure and maintain the network. They all agree to follow a set of rules, known as the blockchain protocol or "constitution." While some actors may have more influence than others, no one can unilaterally change the protocol or manipulate the recorded data on the blockchain. Blockchain governance relies on economic incentives, game-theoretic mechanisms, and social norms to ensure compliance with the protocol. Participation in the system is optional and voluntary.
However, while public blockchain systems offer the advantage of relatively low exit costs, which aligns with the principles of polycentric governance, as these systems have evolved the requirements for entry have become high for on-chain governance operations (e.g., mining requires significant hashing power, token-based voting requires extensive token holdings), and even higher for influential positions in off-chain governance (e.g., core developers or major cryptocurrency exchanges). Consequently, governance in most blockchain-based systems has become highly centralized, with a plutocratic character in on-chain governance and a technocratic dominance in off-chain governance.
Challenges of Governance
Confidence in the blockchain system can be lost when actors involved in the maintenance of the blockchain overpower others. This is particularly relevant during “states of exception”, when shared values are questioned and when, deciding on the exception, certain actors are more influential than others. In these situations, when considering both on-chain (within the blockchain) and off-chain (outside the blockchain) governance aspects, the polycentric nature of the network may be diminished. A few influential actors can wield significant power and impact the overall operations of the network to serve their own interests. While centralization could resolve disputes and limit the powers of actors in a system, it contradicts the decentralized nature of public blockchains.
Possible solutions
Since confidence in blockchain-based systems implies trusting a variety of the individual actors involved, good governance practices represent a crucial safeguard against untrustworthy operations. In liberal democratic institutions, good governance is associated with adherence to the Rule of Law, which requires the clear and impartial application of laws and regulations. In the context of blockchain-based systems, a distinction must be made between the Rule of Law enforced by governments and the Rule of Code enforced by technology. National laws are challenging to enforce on public blockchains due to the decentralized and distributed nature of these systems, which operate under a particular "protocol". To enhance trust in these systems and mitigate opportunistic behavior, it is crucial to carefully design both the general rules (to counteract centralization) and the domain-specific rules (to address specific needs) of public blockchain systems, not only for on-chain governance but also for off-chain governance. Existing discussions on polycentric governance systems propose rules that resemble those aimed at strengthening the Rule of Law. Thus, it is worth exploring how the constraints related to the Rule of Law could be adapted to blockchain-based systems.
Moreover, the governance of blockchain-based systems cannot solely rely on codified functions and technological guarantees at the on-chain level. It also necessitates institutional mechanisms and constitutional guarantees at the off-chain level. However, most procedural safeguards (such as transparency, representativeness, direct accountability, separation of powers, and avoidance of conflicts of interest) and substantive safeguards (such as protecting vulnerable actors and defining rights, duties, and attributes for politically influential roles) have been developed for traditional centralized institutions. In a blockchain-based system, where there is no centralized governing authority or coercive force to impose a constitution on network participants, it becomes necessary to adapt these constitutional safeguards to be applicable and enforceable in a decentralized and polycentric governance system.
Conclusion
Blockchain technology is often described as a "trustless" technology because it replaces the need for a trusted authority with a system of publicly verifiable proofs. This definition has several gaps, particularly concerning what blockchain technology actually brings to the table. Blockchain technology is instead defined here as a "confidence machine", which increases confidence in the operations of a system while indirectly reducing the need for trust. If trust involves some degree of risk and uncertainty, confidence is mostly associated with a sense of predictability. Mathematical knowledge and cryptographic rules are the elements that allow the creation of strong expectations about the operations of the system, thereby eliminating the need for a trusted central authority. However, the absence of a trusted central authority does not make the system a "trustless" technology: although less prominent, trust in the actors in charge of securing and maintaining the network is still necessary. Since confidence in a procedural system depends on the proper governance of that system, increased confidence in blockchain technologies is correlated with the degree of trust conferred on the actors involved, such as miners, mining pools, core developers, social media influencers, regulators and policy makers. Finally, the governance of most blockchain-based systems distributes trust over several actors with different interests and influence. In states of exception, problems may emerge as the need for decision-making beyond ordinary procedures arises. To ensure a proper level of confidence in decentralized blockchain-based systems, it may be necessary to introduce a series of rules addressing both normal situations and exceptions in off-chain governance.
- ↑ De Filippi, P., Mannan, M., & Reijers, W. (2020). Blockchain as a confidence machine: The problem of trust & challenges of governance. Technology in Society, 62, 101284.
- ↑ Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20(3), 709-734.
- ↑ Guiso, L. (2010). A Trust-driven Financial Crisis. Implications for the Future of Financial Markets.