It's 7:15am and you're in a conference room, the same room you sit in every working morning. At the head of the table sits the head of your country's domestic security service. Pick a country, any country. Into that room walks a group of analysts and field investigators, and they lay a new problem on the table: yesterday evening, they say, a friendly security service told us it is watching an extremist who just returned from Syria and is now recruiting people to fight there. That extremist has a small network of global contacts. He emailed one of those contacts yesterday. That contact lives here. In the US. The UK. Germany. Or Australia.
I've sat at that table, so let me fast-forward and tell you what happens next. The questions fly, fast and furious. Who is this person? Whom does he know? What is his network? Where is he getting money? Where has he traveled? Does he have weapons or explosives?
Now take just a moment, and look at this list of critical questions: do you think they are appropriate? Sure, you say. So what's the fastest way to find this kind of information, when you think you might face a threat? Easy: digital records. And then, the hardest question: where could I look to find these kinds of people before we have a lead? How intrusive can I be in an effort to prevent this kind of thing?
Today, in the midst of the Snowden revelations, the questions we face focus too narrowly on the limits of security agencies and the data they collect and analyse. The pendulum has swung: who's running the asylum? How can our security agencies have done this? But then, sometime this year, or next year, or the year after, the pendulum will swing again, with some horrific event perpetrated by an individual who might have been identified through aggressive analysis of mountains of data. What do we say then? Let me guess: let's figure out a way to inch the pendulum back from the post-Snowden uproar. Maybe we can keep whittling away to reach the point at which security agencies represent a cultural norm in cyberspace that we find, if not comforting, at least less disturbing than what we think security agencies are doing today.
There are a few characteristics of this swinging pendulum we should be conscious of along the way. First, I can tell you, as a former senior official at both the CIA and the FBI, that your security agencies won't lead the debate.
They'll take the laws passed by their legislative bodies and direction from their elected leaders, and they'll press the limits of those laws. If they don't, and if that security agency then fails to stop an attack, they will face blistering questions from elected officials, and their only answer will be: 'Yes, we could have done more. But we chose not to use all the authority you gave us, because we unilaterally decided that those authorities don't reflect our culture and values.' That ain't gonna happen. Security agencies respond to culture; they don't create it.
Second, the pendulum will never reach a perfect place, because the debate will never stop. Here's why: when you go to the airport, you expect that the government will search you, maybe aggressively, and you accept that intrusion as the price of security. But when you go to the supermarket, you would object to a government official searching you. Still, you know the supermarket chain is watching you via security camera and collecting lots of data about your purchasing patterns. But you wouldn't want the government to have the same access as the supermarket.
Culturally, we have set boundaries around our physical space: you can intrude sometimes but not others. Those boundaries are pretty well understood, yet they're constantly shifting. In New York City, the stop-and-frisk policy of randomly searching citizens raised an uproar in some parts of the community but not others.
In the rapid revolution that is our personal cyber space, we have only begun the cultural conversation about boundaries. If the phone company has your data, can the government acquire it? For what purpose? If you put your life on Facebook, you've purposefully made your life available to everybody, right? How about the government?
Are you nervous yet? You should be. Part of that nervousness stems from the fact that you, like everybody else, are feeling your way toward a cultural understanding of cyber limits, the same set of norms we have come to agree on, roughly, in the physical world.
So think of a few rules as you watch this debate. First, you can attack your security agencies, but they won't give you the solutions you want, because the questions are bigger. Second, you can believe that we will quickly develop a set of norms, but we won't. Cyberspace will change, and public expectations will evolve; this debate will never end. And finally, you'll never be comfortable. Because when that group at that table feels the responsibility to find the next attacker before he finds you, they have a new tool that will only grow more powerful, and they will use that tool, enabled by whatever laws politicians pass. That tool is the digital trail that you leave, in your life, every day.