How Snowden made us safer
In the wake of Snowden’s disclosures on the NSA (et al.), many have claimed that he handed terrorists a playbook to evade detection, making the job of intelligence agencies harder and endangering the public. While I would argue that’s not true, it doesn’t really matter; the adversary believing they have the playbook is a good thing.
This has three primary effects:
Enhancing paranoia within an adversary can be extremely advantageous, especially when the paranoia-inducing information (true or not) plays on a group’s existing assumptions of its own vulnerabilities (Mobley, 245). One primary fear among jihadi groups is that someone is listening, and Snowden’s leaks confirmed that someone most likely is.
Many terrorist groups assume that their state adversary occasionally lives up to the quasi-omniscient reputation that the popular media and culture tend to promote. Governments may benefit from augmenting this popular belief because terrorist groups will expend valuable resources to protect against illusory intelligence capabilities. (Mobley, 246)
Mobley references illusory capabilities, and while many of the NSA’s capabilities are far from illusory, they aren’t all realistic at scale. However, the adversary is now living inside an expanded digital panopticon, where survival necessitates assuming the absolute worst. This means, as Mobley argues, they will lose time and expend resources to evade surveillance that might not exist.
The loss of confidence in the security of established channels forces a group to alter their communications security (COMSEC) posture. If rattled enough, they will significantly overhaul their routines, abandoning a regimented and familiar system for something likely experimental and new. Learning and migrating to new methods will render a group vulnerable during the adaptation period. An organization-wide shift in communication methods is extremely difficult for any group, and especially so for a group as fissured as modern-day al-Qaeda (AQ). Switching an entire network to a new system is loud, and the odds of intelligence agencies picking up on how the system is changing are extremely high; it’s a delicate operation.
Developing a new system will expose a group to multiple problematic scenarios. They may adopt an already subverted system, a weaker unproven system being sold as more secure, or any number of unforeseen problematic details within a new system. Put more simply, the new system might actually be worse than the current one. Most importantly, this system will be new. It’s easy to make mistakes within new systems, and any COMSEC mistake risks compromising everything.
It is important to remember that it is unlikely these groups have a more secure system readily available to immediately shift to. If that were the case, they would have already been using that system. Terrorist groups are at the furthest end of the incentive spectrum when it comes to hiding their intentions from Western powers. They don’t have the luxury of stopping their entire operation to research, refine, and properly adapt to a new system. This means they will have to adapt on-the-fly, an exceedingly dangerous way to learn when one’s adversary is Western intelligence.
The transition requires an initial period of high frequency contact using the old channels, as orders spread throughout the network to abandon them. This is an opportune time to monitor which nodes are most active, and subsequently which nodes truly “disappear.” Assuming the NSA (or whoever) had the network well mapped, and I’m assuming they do, those disappearances will be hugely beneficial in informing HUMINT and specific SIGINT targeting.
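The “watch who goes dark” idea above is simple enough to sketch. The toy code below (entirely hypothetical data and function names, not any agency’s actual method) compares contact activity on a monitored channel before and after a cutover, and flags nodes that were busy during the transition but then vanish:

```python
from collections import Counter

def find_disappeared_nodes(before, after, min_activity=3):
    """Flag nodes that were active on the old channel before the
    cutover but have gone silent afterward.

    `before` and `after` are lists of (sender, receiver) contact
    records observed on the monitored channel.
    """
    active_before = Counter()
    for sender, receiver in before:
        active_before[sender] += 1
        active_before[receiver] += 1

    seen_after = {node for pair in after for node in pair}

    # Nodes with meaningful pre-switch activity that never reappear
    # are the candidates for targeted follow-up.
    return sorted(
        node for node, count in active_before.items()
        if count >= min_activity and node not in seen_after
    )

# Hypothetical traffic: node "c" chatters during the cutover, then vanishes.
before = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("c", "e")]
after = [("a", "b"), ("b", "d")]
print(find_disappeared_nodes(before, after))  # ['c']
```

The point isn’t the code, which is trivial; it’s that a well-mapped network makes this kind of differencing cheap, which is exactly why an organization-wide migration is so loud.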
As a group adapts they will be forced to pause, abandon, or at least slow the pace of their operations. This is compounded by the fact that, as Marc Sageman explains, terrorist groups are already working at a disadvantageously slow pace:
Terrorist organizations advocate strict compartmentalization to maintain security in a hostile environment. This implies a hierarchy with slow communications because of the vulnerability to interception of faster ones. Slow communications prevent the network from responding to new developments in a timely fashion and will further degrade its effectiveness. (165)
Of course, the fear of digital surveillance among organized jihadi groups is nothing new. Al-Qaeda laptops recovered in 2001 had thousands of encrypted documents. Many AQ leaders refused to personally touch a computer; they would use physical human proxies, who accessed computers at remote sites to relay encrypted and/or coded messages.
In 2007 an al-Qaeda affiliated organization, the Global Islamic Media Front (GIMF), released a piece of software called Asrar Al-Mujahideen (“Mujahideen Secrets”). It was “the first Islamic computer program for secure exchange [of information] on the Internet.” In the first eleven issues of Inspire, a magazine put out by al-Qaeda in the Arabian Peninsula, readers were given the necessary information to “securely” contact the editors using Asrar Al-Mujahideen. However, in issue twelve, the first issue after the Snowden leaks, readers were instead given a warning in all caps: “DUE TO TECHNICAL AND SECURITY REASONS, WE HAVE SUSPENDED OUR EMAIL ADDRESSES TEMPORARILY.” This formal halt in communication eliminates a recruitment avenue and reinforces the chilling effect of the Snowden disclosures among the jihadi and/or would-be jihadi community.
In April of 2014, al-Shabab (an AQ affiliate in Somalia) publicized an email address registered with “safe-mail.net” as a secure way to contact the group. Outsourcing COMSEC is rarely a good idea, and it’s never a good idea when the terms of service explicitly state that a company has administrative access to user accounts and content, which the Safe Mail TOS does.
This last example is indicative of a larger problem occurring in the wake of Snowden’s disclosures: snake oil security/encryption services. It is a problem in the West, as companies rush to offer the latest in “military grade encryption” to evade the prying eye of the NSA. It appears jihadis may be suffering from the same privacy gold rush, resulting in the use of potentially insecure and unproven platforms. It has also (theoretically) given the West a chance to slip in their own offering, giving them direct backdoor access to the communications of jihadis unfortunate enough to trust a compromised platform.
What’s more, the fallout from the leaks has resulted in a largely uninformed mentality of “encryption encryption encryption,” as if preventing intelligence agencies from accessing the content of communications (via dragnet SIGINT) really matters at all. General Michael Hayden (former head of the NSA) recently admitted that the US government kills people based on metadata. Who you talk to, when, and from where, are enough to get up close and personal with a hellfire missile. Past that, once deemed a valuable enough target, encryption does not matter; they will get your data. This can be tricky if the target is in a hostile area with limited physical access, but is surprisingly easy (though often resource intensive) for domestic targets.
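To make the “metadata is enough” point concrete: even with every message body encrypted, who contacts whom is enough to map a network and rank its hubs. A minimal sketch (hypothetical records and function names, using simple distinct-contact counting as a stand-in for the far richer analysis agencies actually do):

```python
from collections import defaultdict

def rank_by_contacts(call_records):
    """Rank identifiers by how many distinct parties they contact.

    `call_records` is a list of (caller, callee, timestamp) tuples --
    pure metadata; no message content is needed or used.
    """
    contacts = defaultdict(set)
    for caller, callee, _ts in call_records:
        contacts[caller].add(callee)
        contacts[callee].add(caller)
    # Highest distinct-contact count first: the network's hubs.
    return sorted(contacts, key=lambda n: len(contacts[n]), reverse=True)

# Hypothetical metadata: "hub" talks to three distinct parties.
records = [
    ("hub", "x", 1), ("hub", "y", 2), ("z", "hub", 3), ("x", "y", 4),
]
print(rank_by_contacts(records)[0])  # hub
```

Add timestamps and locations to the same records and you get patterns of life, which is precisely why content encryption alone changes so little.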
That said, reinforcing the idea that encryption apps are a viable solution is arguably a smart counterterrorism tactic. Operations security is hard enough with years of experience and formal training. A domestic lone wolf (arguably more of a threat than a transnational attack in the current jihadist landscape) pumped with a false sense of confidence that programs like “Mujahideen Secrets” will protect them is an extremely vulnerable target.
High ranking terrorist leaders are too smart and have too much experience (not to mention are often receiving assistance from sympathetic governments, i.e., Pakistan and Iran) to make the rookie mistakes intelligence agencies can easily capitalize on. But as the entire global jihadi landscape becomes more fissured, the leadership simply doesn’t have the luxury of controlling the tempo and organization of their environment as they used to. This creates pressure, which promotes desperate moves and lapses in judgment.
Additionally, the encryption = security fallacy is really only the beginning. As Marc Ambinder says in The Week, “If it seems like Edward Snowden and the reporters who have access to his archive have given away the farm, think again.” He goes on to specify just how much of the agencies’ capability remains undisclosed.
So either the adversary is in the camp that believes Snowden “gave away the farm” and consequently believes they can evade detection, in which case they are very stupid and vulnerable. Or they’ve begun to better understand the scope of their adversary’s abilities and are now forced to anticipate every potential surveillance capability, no matter how outlandish, in which case they are likely wasting time and resources.
Either way, it’s a net positive for the intelligence agencies. Especially considering nothing has really changed as far as their capabilities are concerned.
Big thanks to @thegrugq for helping me work through some of these ideas.