Our New Regimes of Trust

  • Bruce Schneier
  • The SciTech Lawyer
  • Winter/Spring 2013

Society runs on trust. Over the millennia, we’ve developed a variety of mechanisms to induce trustworthy behavior in society. These range from a sense of guilt when we cheat, to societal disapproval when we lie, to laws that arrest fraudsters, to door locks and burglar alarms that keep thieves out of our homes. They’re complicated and interrelated, but they tend to keep society humming along.

The information age is transforming our society. We’re shifting from evolved social systems to deliberately created socio-technical systems. Instead of having conversations in offices, we use Facebook. Instead of meeting friends, we IM. We shop online. We let various companies and governments collect comprehensive dossiers on our movements, our friendships, and our interests. We let others censor what we see and read. I could go on for pages.

None of this is news to anyone. But what’s important, and much harder to predict, are the social changes resulting from these technological changes. With the rapid proliferation of computing devices, both fixed and mobile, and of in-the-cloud processing, new ways of socialization have emerged. Facebook friends are fundamentally different from in-person friends. IM conversations are fundamentally different from voice conversations. Twitter has no pre-Internet analog. More social changes are coming. These social changes affect trust, and trust affects everything.

This isn’t just academic. There has always been a balance in society between the honest and the dishonest, and technology continually upsets that balance. Online banking results in new types of cyberfraud. Facebook posts become evidence in employment and legal disputes. Cell phone location tracking can be used to round up political dissidents. Random blogs and websites become trusted sources, abetting propaganda. Crime has changed: easier impersonation, action at a greater distance, automation, and so on. The more our nation’s infrastructure relies on cyberspace, the more vulnerable we are to cyberattack.

Think of this as a security gap: the time lag between when the bad guys figure out how to exploit a new technology and when the good guys figure out how to restore society’s balance.

Critically, the security gap is larger when there’s more technology, and larger still in times of rapid technological change. More importantly, it’s larger in times of rapid social change due to the increased use of technology. This is our world today. We don’t know how the proliferation of networked, mobile devices will affect the systems we have in place to enable trust, but we do know it will affect them.

Trust is as old as our species. It’s something we do naturally and informally. We don’t trust doctors because we’ve vetted their credentials, but because they sound learned. We don’t trust politicians because we’ve analyzed their positions, but because we generally agree with their political philosophy, or the buzzwords they use. We trust many things because our friends trust them. It’s the same with corporations, government organizations, strangers on the street: this thing that’s critical to society’s smooth functioning occurs largely through intuition and relationship. Unfortunately, these traditional and low-tech mechanisms are increasingly failing us. The only way society will adapt is if we understand how trust is being affected and will be affected, probably not by predicting but by recognizing effects as quickly as possible, and then deliberately create mechanisms to induce trustworthiness and enable trust.

If there’s anything I’ve learned in all my years working at the intersection of security and technology, it’s that technology is rarely more than a small piece of the solution. People are always the issue, and we need to think as broadly as possible about solutions. So while laws are important, they don’t work in isolation. Much of our security comes from the informal mechanisms we’ve evolved over the millennia: systems of morals and reputation.

New regimes of trust will emerge in the information age; they must, or society will suffer unpredictably. We have already begun fleshing out such regimes, albeit in an ad hoc manner. It’s time to deliberately think about how trust works in the information age, and to use legal, social, and technological tools to enable this trust. We might get it right by accident, but if we do, getting there will be a long and ugly iterative process.
