Last week at the 20th Usenix Security Symposium, Sandy Clark, Travis Goodspeed, Perry Metzger, Zachary Wasserman, Kevin Xu, and I presented our paper Why (Special Agent) Johnny (Still) Can't Encrypt: A Security Analysis of the APCO Project 25 Two-Way Radio System [pdf]. I'm delighted and honored to report that we won an "Outstanding Paper" award.
APCO Project 25 ("P25") is a suite of wireless communications protocols designed for government two-way (voice) radio systems, used for everything from dispatching police and other first responders by local government to coordinating federal tactical surveillance operations against organized crime and suspected terrorists. P25 is intended to be a "drop-in" digital replacement for the analog FM systems traditionally used in public safety two-way radio, adding some additional features and security options. It uses the same frequency bands and channel allocations as the older analog systems it replaces, but with a digital modulation format and various higher-level application protocols (the most important being real-time voice broadcast). Although many agencies still use analog radio, P25 adoption has accelerated in recent years, especially among federal agencies.
One of the advantages of digital radio, and one of the design goals of P25, is the relative ease with which it can encrypt sensitive, confidential voice traffic with strong cryptographic algorithms and protocols. While most public safety two-way radio users (local police dispatch centers and so on) typically don't use (or need) encryption, for others -- those engaged in surveillance of organized crime, counter espionage and executive protection, to name a few -- it has become an essential requirement. When all radio transmissions were in the clear -- and vulnerable to interception -- these "tactical" users needed to be constantly mindful of the threat of eavesdropping by an adversary, and so were forced to be stiltedly circumspect in what they could say over the air. For these users, strong, reliable encryption not only makes their operations more secure, it frees them to communicate more effectively.
The P25 protocols themselves suffer from some basic weaknesses that make them vulnerable to a range of active and passive attacks. The vulnerabilities we found, which apply even when encryption is properly configured, leak data about the identity of transmitting radios, enable active tracking and direction finding of idle (non-transmitting) users, allow highly efficient (low-energy) malicious jamming and denial of service, and permit injection of unauthenticated traffic into secured channels. These weaknesses violate many of the most basic assurances a secure communication system is expected to provide, and they apply to systems using any of the three standard P25 configurations (called "simplex", "repeater", and "trunked").
For example, P25 signals include a "unit ID" that identifies the originating radio. That's not a security problem by itself, except that when encryption is turned on, the unit ID turns out to always be sent in the clear. Unit IDs for otherwise encrypted transmissions can therefore be decoded (or recorded for analysis) easily by anyone within radio range with a modified receiver. Because these identifiers are generally unique and permanently assigned to each particular radio, this allows listeners to identify and track any currently active P25 operation, even those using encryption. And the traffic analysis situation is even worse in the face of an active attacker. We found that under most P25 configurations, even non-transmitting radios will (silently) respond to specially malformed messages directed at their data layer. This permits an adversary to selectively "ping" any or all P25 radios that are tuned to any given frequency. The response to pings allows, among other things, on-demand direction finding of members of active tactical operations, a sort of "Marauder's Map" come to life for surveillance targets.
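To make the traffic-analysis risk concrete, here is a hypothetical sketch of a passive unit-ID logger. The field offsets and frame bytes below are invented for illustration (the real P25 frame layout differs), but the essential point holds: the unit ID lives in cleartext fields, so the encryption of the voice payload does nothing to prevent extracting and cataloging it.

```python
# Hypothetical sketch of passive unit-ID logging. The 3-byte
# source ID at a fixed offset is an invented layout for
# illustration; in real P25 the source unit ID is carried in
# cleartext even on encrypted channels, which is what matters.

from collections import defaultdict

def extract_unit_id(frame: bytes) -> int:
    # The (hypothetical) header is never encrypted, so this
    # works whether or not the voice payload is.
    return int.from_bytes(frame[4:7], "big")

sightings: dict[int, int] = defaultdict(int)

def log_frame(frame: bytes) -> None:
    sightings[extract_unit_id(frame)] += 1

# Two made-up captured frames from the same radio, with
# different (possibly encrypted) payload bytes at the end:
log_frame(bytes([0, 0, 0, 0, 0x00, 0x12, 0x34, 0xAA]))
log_frame(bytes([0, 0, 0, 0, 0x00, 0x12, 0x34, 0xBB]))
print(dict(sightings))   # {4660: 2} -- unit 0x001234 seen twice
```

A listener accumulating such a table over time can map which radios are active on which frequencies, which is exactly the tracking capability described above.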
P25 systems are also strikingly vulnerable to denial of service. Most radio systems have the property that an adversary must deliver to the receiver at least as much energy as the targeted transmitter to effectively "jam" a signal. Old-fashioned analog FM modulation, for example, requires the jammer to have a slightly more powerful signal than the legitimate transmitter, and also forces the adversary to broadcast more or less continuously to cause lasting disruption. (This also makes jammers relatively easy to locate.) Digital spread-spectrum systems can disadvantage a jammer even more, requiring far more energy, spread over a wide frequency range, to disrupt a targeted signal. Jamming in most radio systems is thus somewhat costly as well as somewhat risky, an arms race in which the legitimate users enjoy the upper hand.
But a peculiarity in P25's error correction scheme reverses the defender's natural advantage, especially for voice traffic. P25 voice transmissions are digitized as a sequence of 1728-bit "frames", each encoding 180 milliseconds of audio. Because digital data sent over radio is subject to bit errors from fading and interference, frames include redundant data that allows a certain number of errors to be corrected automatically by the receiver, which makes P25 perform better under less-than-ideal conditions. Toward the beginning of each frame is a 64-bit field, called the "NID", that identifies the type of frame. But the NID is error corrected separately from the rest of the frame. This makes it possible for an attacker to effectively prevent an entire voice frame from being correctly received by synchronizing a jamming transmitter to interfere only with the 64-bit NID field; it can remain silent for the rest of the frame. That means that a synchronized P25 jammer needs to transmit for only about four percent of the duration of the signal it wants to jam. In other words, it requires 25 times less energy to jam a P25 signal than to send it, giving the attacker an enviable advantage right from the start.
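The arithmetic behind the attacker's advantage is simple enough to sketch; the frame and NID sizes are those described above (the raw bit ratio comes out slightly above the roughly 25x figure, which is the more conservative number):

```python
# Back-of-the-envelope duty cycle for a NID-synchronized P25
# jammer: corrupting only the 64-bit NID makes the whole
# 1728-bit voice frame undecodable.

FRAME_BITS = 1728   # one P25 voice frame (180 ms of audio)
NID_BITS = 64       # frame-type field, error corrected separately

duty_cycle = NID_BITS / FRAME_BITS
print(f"jammer duty cycle: {duty_cycle:.1%}")          # about 3.7%
print(f"raw energy ratio: {FRAME_BITS // NID_BITS}:1")  # 27:1
```

At a duty cycle this low, the jammer is also transmitting in short bursts rather than continuously, which compounds the direction-finding difficulty mentioned above.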
Although it might sound as if building a tightly synchronized low duty cycle P25 jammer would be very difficult, modern programmable radio hardware turns out to make it not that hard. An inexpensive commercial chip, used in a variety of products including a toy "instant messenger" device marketed to pre-teen girls, turns out to be well suited to recognizing P25 frames and transmitting short pulses synchronized to received frames. We built a complete low-power P25 jammer by loading our own firmware into the instant messenger toy, and found it to be completely effective at preventing every P25 receiver we tested from decoding received audio. (This aspect of our paper got a perhaps disproportionate amount of press; let me emphasize that, contrary to what was implied in some reports, the device in question does not jam P25 signals "out of the box".)
"Trunked" P25 systems are also vulnerable to efficient jamming attacks against their "control" channel. We didn't explore attacks that are specific to trunking systems in detail, partly because control channel attacks are already a known threat. The vulnerabilities we looked at apply to both trunked and conventional P25 configurations.
Whether these active attacks currently represent practical threats against any given system depends on the resources and motivation of the adversary. For many systems, such as local government public safety radio, they may not be very important, at least today. But for others, especially systems that support operations against sophisticated targets, these threats seem quite realistic. And as the boundary between radio hardware and software continues to blur as technology advances, the resources required to construct and deploy the equipment required for these attacks become more and more modest.
Which brings us to the most serious and immediate -- if rather lower tech -- threat to encrypted P25 systems we found: serious usability deficiencies in their security user interface and key management practices. These deficiencies make it easy for anyone to intercept sensitive transmissions that users believe are encrypted, today.
In his invited talk at Crypto '95, the late Bob Morris, then working at the NSA, revealed what he said was the first rule of government-grade cryptanalysis: "First, look for cleartext". And indeed, the secure P25 systems currently fielded by the federal government appear to fall badly to this technique.
For almost two years, we systematically measured and analyzed the incidence of "unintended" cleartext leakage in real P25 systems. We set up a P25 interception network in several US metropolitan areas that cataloged the clear P25 traffic sent on the VHF and UHF two-way frequency bands assigned to the federal government. We collected data specifically on systems carrying a high volume of sensitive traffic from trained and motivated users: the encrypted tactical two-way radio networks used by federal agencies conducting criminal and national security investigations.
What we found surprised (and alarmed) us. Most sensitive federal tactical traffic is, in fact, successfully encrypted. But a significant minority of the traffic isn't -- it is sent in the clear, despite the users' apparent belief that it is encrypted. We captured an average of 20-30 minutes per day per city of highly sensitive "unintended" cleartext mixed in with the encrypted traffic that comprises most of the activity on these systems. While we can't go into details about specific operations, it is not a stretch to imagine that at least some of it could put lives at risk. The cleartext included all manner of highly sensitive operational details, such as identifying features of undercover operatives and informants, identities and locations of surveillance targets, plans and locations for forthcoming takedowns, and details of executive protection operations. And most were accompanied by specific indications that the users believed that their transmissions were securely encrypted.
What's going on here? There appear to be two problems. First, there is only poor feedback to the user about whether encryption is actually enabled, and radios set to clear mode will happily interoperate with radios set to encrypted mode, making it unlikely that errors will be detected during an operation. Second, many P25 systems, including those used by the federal government, are "rekeyed" at frequent intervals, in the apparent (and basically erroneous) belief that changing encryption keys regularly improves security. The effect is that many users are often left without current key material, and must revert to clear mode in order to communicate.
It is perhaps tempting to dismiss the problem of unintended cleartext as "user error", to place the blame on agents for "misusing" their radios. And if these were rare, isolated incidents, that might be reasonable. But the overwhelmingly consistent, pervasive volume of occurrences makes it hard to do that. The problem of unintended sensitive cleartext rests squarely with the radios, not their users, and it is important to fix the problem rather than blame the victim.
Fortunately, the usability problems can be partly mitigated in most P25 products. We found ways to configure systems and handsets a bit differently from their default setup, in ways that increase feedback and make extended unintended cleartext operation less likely. Our "P25 Security Mitigation Guide" [link] is aimed at system managers of sensitive networks, and we've been working with federal users to help them reduce the incidence of unintended cleartext.
What's the lesson here? The unintended cleartext problems with P25 should first and foremost remind us that cryptographic usability matters. All the security gained from using well-analyzed ciphers and protocols, or from careful code reviews and conservative implementation practices, is lost if users can't reliably figure out how to turn the security features on and still get their work done. When Whitten and Tygar published their classic paper "Why Johnny Can't Encrypt" at Usenix Security '99, they showed how an apparently "secure" crypto mechanism -- the PGP email encryption system -- can be effectively neutralized by an opaque, overly technical user interface. Almost everything they observed about PGP more than a decade ago applies to P25 radios today.
And this brings us to a final observation. P25 (and encrypted radio generally), like encrypted email before it, occupies a rather unusual place in the spectrum of cryptographic applications. Although it is used for "two-way" communication, it is, actually, a one-way protocol. All the security decisions in P25 are made by the sender's radio; unlike most cryptographic protocols, there is no "negotiation" between sender and receiver here. One-way protocols, in which there is no negotiation or exchange between the transmitter and the receiver are actually rather unusual, and relatively little is known (or written in the literature) about robust design principles for them. And indeed, the "classic" one-way encryption problem, electronic mail, does not exactly stand as a shining beacon of successful design.
While this may be bad news in the short term for secure radio users, it also suggests a research direction our community has largely ignored in designing and analyzing protocols and systems. I think we have our work cut out for us.
Our Usenix Security 2011 paper can be found at http://www.mattblaze.org/p25sec.pdf [pdf format] .
Our short paper exploring the problems of "One-Way Cryptography", from the 2011 Security Protocols Workshop, can be found at http://www.mattblaze.org/spw2011-mab.pdf [pdf format] .
Our P25 Security Mitigation Guide can be found at http://www.mattblaze.org/p25/ .