<?xml version="1.0" encoding="iso-8859-1" ?>
<rss version="2.0">

<channel>

<title>Matt Blaze's Exhaustive Search</title>
<link>https://www.mattblaze.org/blog</link>
<description>Science, Security, Curiosity</description>
<language>en-us</language>
<docs>http://blogs.law.harvard.edu/tech/rss</docs>
<generator>First Person</generator>
<managingEditor>mab@mattblaze.org</managingEditor>
<webMaster>mab@mattblaze.org</webMaster>
<item>
   <title>The Cryptography of Orphan Annie and Captain Midnight</title>
   <link>https://www.mattblaze.org/blog/badges/</link>
   <guid>https://www.mattblaze.org/blog/badges/</guid>
   <pubDate>Wed, 05 Jan 2022 04:38:44 +0000</pubDate>
   <description>Badges? We don't need no badges!




	
&lt;br/&gt;

&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/blog/badge-mug-smaller-20220101-P0002913.jpg&quot; width=500 height=auto alt=&quot;A 1935 Radio Orphan Annie's Secret Society decoder badge resting on a souvenir mug from the CRYPTO '93 conference.&quot; align=&quot;right&quot;&gt;

&lt;p&gt;
Between 1935 and 1949, many North American children (and adults) got their introduction to cryptography through encrypted messages broadcast at the ends of episodes of two popular radio adventure serial programs: &lt;em&gt;Little Orphan Annie&lt;/em&gt; and &lt;em&gt;Captain Midnight.&lt;/em&gt; Dedicated listeners could join &lt;em&gt;Radio Orphan Annie's Secret Society&lt;/em&gt; or (later) &lt;em&gt;Captain Midnight's Secret Squadron&lt;/em&gt;, whereupon they would be sent a decoder that 
would allow them to decrypt each week's messages (generally a clue about what would happen in the next episode).
&lt;p&gt;
Orphan Annie (and her Secret Society members) fought crime, battled pirates, solved mysteries, and had other typical American pre-adolescent adventures. Captain Midnight (with his Secret Squadron) used his aviation prowess to perform daring rescues and emergency transports, and, with the outbreak of WWII, was commissioned by the government to lead secret missions behind enemy lines.
&lt;p&gt;
The main qualification for membership in (and issuance of a decoder for) Radio Orphan Annie's Secret Society and Captain Midnight's Secret Squadron involved drinking Ovaltine, a malted milk flavoring containing the vitamins and nutrients then understood to be needed by growing secret operatives, or at least to be profitable for its manufacturer (which sponsored the broadcasts). Proof of sufficient Ovaltine consumption was established by mailing in labels from Ovaltine packages. New pins and badges were issued annually, requiring additional labels to be sent in each year. (The devices are sometimes remembered as decoder &lt;em&gt;rings&lt;/em&gt;, but in fact they took the form of pins, badges, and the occasional whistle or signal mirror.)
&lt;p&gt;
Orphan Annie's Secret Society produced decoders (variously called &quot;Super Decoder pins&quot;, &quot;Telematic Decoder Pins&quot; and other names from year to year) from 1935 through 1940. From 1941 through 1949, the decoders were rebranded as &quot;Code-O-Graphs&quot; and distributed by Captain Midnight's Secret Squadron. These years corresponded to Ovaltine's sponsorship of the respective programs. Although the decorative elements and mechanical designs varied, the underlying cryptographic principles were the same for all the decoders.
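That shared principle is, at its core, a settable monoalphabetic substitution: two concentric rings pair each number with a letter, and a daily key (announced on the air as something like "set B to 12") fixes the alignment. A minimal sketch in Python, with the caveat that the key convention and lettering shown here are illustrative and varied from badge to badge:

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def make_wheel(key_letter: str, key_number: int) -> dict[int, str]:
    # Align key_letter over key_number on a 26-position wheel; every
    # other number-to-letter pairing then follows from that offset.
    offset = ALPHABET.index(key_letter) - (key_number - 1)
    return {n: ALPHABET[(offset + n - 1) % 26] for n in range(1, 27)}

def decode(numbers: list[int], key_letter: str, key_number: int) -> str:
    # Look up each broadcast number on the wheel set to the daily key.
    wheel = make_wheel(key_letter, key_number)
    return "".join(wheel[n] for n in numbers)

# With the wheel set to "A = 1", numbers map straight onto the alphabet.
print(decode([2, 5], "A", 1))  # → BE
```

Because the wheel only shifts the alphabet, there are just 26 possible settings, which is what makes these messages so amenable to the exhaustive cryptanalysis described below.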
&lt;p&gt;
Encrypted messages were included in the broadcasts roughly once per week, usually at the end of Thursday's show (which typically ended with a cliffhanger). Unfortunately, there does not appear to be an easily available full online archive of the broadcasts. However, you can listen to (and, with the information below, decode) airchecks of several original messages here (note the year to ensure you use the correct decoder badge parameters):
&lt;blockquote&gt;&lt;table cellpadding=10&gt;
&lt;tr&gt;&lt;td valign=top&gt;1936&lt;/td&gt;&lt;td&gt;
	&lt;a href=&quot;https://www.mattblaze.org/private/OA2-1936.mp3&quot;&gt;Orphan Annie 1936 Pin (1)&lt;/a&gt;&lt;br/&gt;
	&lt;a href=&quot;https://www.mattblaze.org/private/OA5-1936.mp3&quot;&gt;Orphan Annie 1936 Pin (2)&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td valign=top&gt;1938&lt;/td&gt;&lt;td&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/OA1-1938.mp3&quot;&gt;Orphan Annie 1938 Pin (1)&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/OA3-1938.mp3&quot;&gt;Orphan Annie 1938 Pin (2)&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/OA4-1938.mp3&quot;&gt;Orphan Annie 1938 Pin (3)&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td valign=top&gt;1942&lt;/td&gt;&lt;td&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/CM1-1942.mp3&quot;&gt;Captain Midnight 1942 Badge (1)&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/CM2-1942.mp3&quot;&gt;Captain Midnight 1942 Badge (2)&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://www.mattblaze.org/private/CM3-1942.mp3&quot;&gt;Captain Midnight 1942 Badge (3)&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;

&lt;tr&gt;&lt;td valign=top&gt;1947&lt;/td&gt;&lt;td&gt;&lt;a href=&quot;https://www.mattblaze.org/private/CM4-1947.mp3&quot;&gt;Captain Midnight 1947 Badge (1)&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;
&lt;/blockquote&gt;
&lt;p&gt;
These decoders have endured as iconic examples of simple, &quot;toy&quot; cryptography, even among those (like me) born well after the golden age of radio. And while they are indeed vulnerable to weaknesses that make them unsuitable for most &quot;serious&quot; use, that doesn't mean we shouldn't take them seriously. In fact, the underlying cryptographic and security principles they embody are important and subtle, part of the foundations for much of &quot;modern&quot; cryptography, and the badges combine multiple techniques in interesting ways that repay a bit of careful study. Indeed, they were almost certainly the most cryptologically sophisticated breakfast premiums ever produced. And, by understanding them sufficiently well, we can cryptanalyze and decode messages without needing to buy Ovaltine or scour eBay. The rest of this post explains how.
&lt;p&gt;







&lt;br/&gt;

&lt;a href=&quot;https://www.mattblaze.org/blog/badges/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Testing Phone-Sized Faraday Bags</title>
   <link>https://www.mattblaze.org/blog/faraday/</link>
   <guid>https://www.mattblaze.org/blog/faraday/</guid>
   <pubDate>Thu, 02 Dec 2021 05:46:53 +0000</pubDate>
   <description>Reliable tools for the modern paranoid.





	
&lt;br/&gt;

&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/blog/faraday-cookieclosed-33-20211116-DSC00901.jpg&quot; width=600 height=auto title=&quot;Photo: The foam-padded dark grey inside of a serious-looking equipment case. There is a power strip on top with a cable leading to a metal bracket on the side. Plugged into the power strip is a small charger. To the right, a kind of piston is holding the top open.  At the bottom two grey gloves, integrated into the wall, are hanging deflated. Taking up half of the case's right interior is a cuboid, bright orange tin labeled D. LAZZARONI &amp; C. AMARETTI ORIGINALI.&quot; align=&quot;right&quot;&gt;

Back in the not-so-distant past, if you were patient and knowledgeable enough, you could reverse engineer the behavior of almost any electronic device simply by inspecting it carefully and understanding the circuitry. But those days are rapidly ending. Today, virtually every aspect of complex electronic hardware is controlled by microprocessors and software, and while that's generally good news for functionality, it's also bad news for security (and for having any chance of being sure what, exactly, your gadgets are doing). For devices like smartphones, software runs almost every aspect of the user interface, including how and when it's powered on and off, and, for that matter, what being &quot;off&quot; actually means.
&lt;p&gt;
Complex software is, to put it mildly, hard to get right (for details, see almost any other posting on this or any other security blog). Especially for gadgets that are rich with microphones, cameras, location and environmental sensors, and communication links (such as, you know, smartphones), errors and security vulnerabilities in the software that controls them can have serious privacy implications.
&lt;p&gt;
The difficulty of reliably turning software-based devices completely off is no longer merely a hypothetical issue. Some vendors have even recognized it as a marketable feature. For example, &lt;a href=&quot;https://support.apple.com/guide/iphone/add-your-iphone-iph9a847efc7/ios&quot;&gt;certain Apple iPhones will continue to transmit &quot;Find My Device&quot; tracking beacons even after they've ostensibly been powered off.&lt;/a&gt; Misbehaving or malicious software could enable similar behavior even on devices that don't &quot;officially&quot; support it, creating the potential for malware that turns your phone into a permanently on surreptitious tracking device, no matter whether you think you've turned it off. Compounding these risks are the non-removable batteries used in many of the latest smartphones.
&lt;p&gt;
Sometimes, you might really want to make sure something is genuinely isolated from the world around it, even if the software running on it has other ideas. For the radios in phones (which can transmit and receive cellular, wifi, bluetooth, and near field communication signals and receive GPS location signals), we can accomplish this by encasing the device inside a small &lt;em&gt;Faraday cage.&lt;/em&gt;
&lt;p&gt;
A Faraday cage severely attenuates radio signals going in or out of it. It can be used to assure that an untrustworthy device (like a cellphone) isn't transmitting or receiving signals when it shouldn't be. A Faraday cage is simple in principle: it's just a solid conductive container that completely encloses the signal source, such that the RF voltage differential between any two points on the cage is always zero. But actually constructing one that works well in practice can be challenging. Any opening can create a junction that acts as an RF feed and dramatically reduces the effective attenuation.
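Shielding performance is conventionally reported as attenuation in decibels: ten times the base-10 log of the ratio between the signal power measured without the shield and the power that leaks through it. A small sketch of that arithmetic (the specific power values are made up for illustration):

```python
import math

def attenuation_db(p_unshielded_mw: float, p_shielded_mw: float) -> float:
    """Shielding effectiveness in dB: the log ratio of received power
    measured without the shield to power measured through it."""
    return 10 * math.log10(p_unshielded_mw / p_shielded_mw)

# A 1 mW received signal that drops to 1e-8 mW inside the pouch
# corresponds to roughly 80 dB of attenuation.
print(attenuation_db(1.0, 1e-8))  # ≈ 80.0
```

Note the logarithmic scale: every additional 10 dB is another factor of ten in power, so the difference between a 40 dB pouch and an 80 dB pouch is a factor of ten thousand in leaked signal.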
&lt;p&gt;
There are somewhat pricey (USD40-USD80) commercial Faraday pouches made specifically for cell phones, and there are a variety of improvised shielding methods that make the rounds as Internet folklore. The question is, then, how well do they actually work? It can be hard to reliably tell without access to a fairly specialized RF test lab. But fortunately, I sort of have one of those. While I can't compete with a full-scale commercial EMC test lab, my modest setup can make moderately accurate measurements of the signal attenuation provided by various commercial shielding pouches and home-brewed designs at most of the frequencies we care about.
&lt;p&gt;
I tested three commercial pouches as well as three commonly-recommended makeshift shielding methods. Read on for the results. (Note that I have no connection with any vendor mentioned here, and I do not endorse any of the products discussed for any particular purpose. Caveat emptor.)
&lt;br/&gt;

&lt;a href=&quot;https://www.mattblaze.org/blog/faraday/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Scientists say no credible evidence of computer fraud in the 2020 election outcome, but policymakers must work with experts to improve confidence</title>
   <link>https://www.mattblaze.org/blog/election-letter/</link>
   <guid>https://www.mattblaze.org/blog/election-letter/</guid>
   <pubDate>Mon, 16 Nov 2020 22:00:29 +0000</pubDate>
   <description>A brief statement from my colleagues and me




	
&lt;br/&gt;
&lt;em&gt;A PDF of this letter can be found &lt;a href=&quot;https://www.mattblaze.org/papers/election2020.pdf&quot;&gt;here&lt;/a&gt;.&lt;/em&gt;
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/election-letter/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>A Cryptologic Mystery</title>
   <link>https://www.mattblaze.org/blog/neinnines/</link>
   <guid>https://www.mattblaze.org/blog/neinnines/</guid>
   <pubDate>Fri, 18 Sep 2020 00:10:29 +0000</pubDate>
   <description>Did a broken random number generator in Cuba help expose a Russian espionage network?




	
&lt;p&gt;
I picked up the new book &lt;em&gt;Compromised&lt;/em&gt; last week and was intrigued to discover that it may have shed some light on a small (and rather esoteric) cryptologic and espionage mystery that I've been puzzling over for about 15 years. &lt;em&gt;Compromised&lt;/em&gt; is primarily a memoir of former FBI counterintelligence agent Peter Strzok's investigation into Russian operations in the lead up to the 2016 presidential election, but this post is not a review of the book or concerned with that aspect of it.
&lt;p&gt;
Early in the book, as an almost throwaway bit of background color, Strzok discusses his work in Boston investigating the famous Russian &quot;illegals&quot; espionage network from 2000 until their arrest (and subsequent exchange with Russia) in 2010. &quot;Illegals&quot; are foreign agents operating abroad under false identities and without official or diplomatic cover. In this case, ten Russian illegals were living and working in the US under false Canadian and American identities. (The case inspired the recent TV series &lt;em&gt;The Americans&lt;/em&gt;.)
&lt;p&gt;
Strzok was the case agent responsible for two of the suspects, Andrey Bezrukov and Elena Vavilova (posing as a Canadian couple under the aliases Donald Heathfield and Tracey Lee Ann Foley). The author recounts watching from the street on Thursday evenings as Vavilova received encrypted shortwave &quot;numbers&quot; transmissions in their Cambridge, MA apartment.
&lt;p&gt;
Given that Bezrukov and Vavilova were indeed, as the FBI suspected, Russian spies, it's not surprising that they were sent messages from headquarters using this method; numbers stations are part of time-honored espionage tradecraft for communicating with covert agents. But their capture may have illustrated how subtle errors can cause these systems to fail badly in practice, even when the cryptography itself is sound.
&lt;br/&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/neinnines/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Exhaustive Search has Moved</title>
   <link>https://www.mattblaze.org/blog/newaddress/</link>
   <guid>https://www.mattblaze.org/blog/newaddress/</guid>
   <pubDate>Sat, 07 Jul 2018 18:18:36 +0000</pubDate>
   <description>But &quot;crypto&quot; still means cryptography.




	
&lt;p&gt;
You may have noticed that this blog, and my domain, are now at &lt;a href=&quot;https://www.mattblaze.org&quot;&gt;&lt;tt&gt;www.mattblaze.org&lt;/tt&gt;&lt;/a&gt;. Twenty-five years ago, back in 1993, I registered the name &lt;tt&gt;crypto.com&lt;/tt&gt;, which I've used as my personal domain as well as to host a variety of cryptography technology and policy resources.
&lt;p&gt;
During that quarter century the &quot;dotcom&quot; era came and went, but for whatever reason, I held on to the domain as basically a personal home, a kind of Internet version of the little house increasingly enveloped by skyscrapers in Pixar's &lt;em&gt;Up&lt;/em&gt;. (You kids can get off my lawn now, please.)
&lt;p&gt;
Cryptography has long been intertwined with difficult public policy issues, especially the balance between security of data on the one hand and law enforcement access for surveillance on the other. I've spent a good part of my career grappling with these issues, and remember &quot;crypto&quot; being misguidedly derided as some kind of criminal tool during the very time when we needed to be integrating strong security into the Internet's infrastructure. (That &quot;debate&quot;, in the '90s, set back Internet security by at least a decade, and we're still paying the price in the form of regular data breaches, many of which could have been prevented had better security been built in across the stack in the first place.)
&lt;p&gt;
Somehow, the word &quot;crypto&quot; has recently acquired an alternative new meaning, as a somewhat unfortunate shorthand for digital currencies such as Bitcoin. I've been involved around the edges of digital currency since early on -- old timers in this space will remember that I once chaired the &lt;em&gt;Financial Cryptography&lt;/em&gt; conference, where much of the foundational work toward practical digital money began.
&lt;p&gt;
I don't think conflating cryptography and digital currency will serve either field well in the long run, particularly as to how they're perceived by the public and policymakers. Surprisingly few of the important aspects of digital currency are directly related to its cryptographic components. Cryptography itself already attracts disproportionate attention for its potential as a tool for criminals and evildoers. Digital currency adds a completely different (but equally fraught) regulatory and policy morass into the equation. Still, there's no doubt that, at this moment in time, the two have become hopelessly intermixed, at least in the minds of the digital money people. That doesn't mean this won't end badly, but it's unarguably where we are right now.
&lt;p&gt;
Over the last few years, I've gotten a growing barrage of offers for the &lt;tt&gt;crypto.com&lt;/tt&gt; domain, many of them obviously non-serious, but a few of them, frankly, attention-getting. I shrugged most of them off, but it became increasingly clear that holding on to the domain was making less and less sense for me. I quietly entered discussions with a few serious potential buyers earlier this year.
&lt;p&gt;
Last month, I reached an agreement to sell the domain. I have no idea what the new owner plans to use it for beyond what I read in the trade press, and I have no financial stake in their business. The details will have to stay confidential, but I will say that I'm satisfied with the outcome and that it involved neither tulips nor international postal reply coupons.
&lt;p&gt;
It's been, I think, a pretty good run, as these things go. See you on the Internets.</description>
</item>
<item>
   <title>How to Hack an Election Without Really Trying</title>
   <link>https://www.mattblaze.org/blog/vote_hacking_by_email/</link>
   <guid>https://www.mattblaze.org/blog/vote_hacking_by_email/</guid>
   <pubDate>Wed, 07 Jun 2017 07:59:30 +0000</pubDate>
   <description>Unraveling the NSA &quot;Russian Election Hacking&quot; story.




	
&lt;p&gt;
&lt;a href=&quot;http://www.flickr.com/photos/mattblaze/2999140247/&quot;&gt;&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/misc/ivot-vote-350.jpg&quot; align=&quot;right&quot;&gt;&lt;/a&gt;This Monday, &lt;em&gt;The Intercept&lt;/em&gt; &lt;a href=&quot;https://theintercept.com/2017/06/05/top-secret-nsa-report-details-russian-hacking-effort-days-before-2016-election/&quot;&gt;broke the story&lt;/a&gt; of &lt;a href=&quot;https://assets.documentcloud.org/documents/3766950/NSA-Report-on-Russia-Spearphishing.pdf&quot;&gt;a leaked classified NSA report [pdf link]&lt;/a&gt; on an email-based attack on various US election systems just before the 2016 US general election.
&lt;p&gt;
The NSA report, dated May 5, 2017, details what I would assume is only a small part of a more comprehensive investigation into Russian intelligence services' &quot;cyber operations&quot; to influence the US presidential race. The report analyzes several relatively small-scale targeted email operations that occurred in August and October of last year. One campaign used &quot;spearphishing&quot; techniques against employees of third-party election support vendors (which manage voter registration databases for county election offices). Another -- our focus here -- targeted 112 unidentified county election officials with &quot;trojan horse&quot; malware disguised inside plausibly innocuous-looking Microsoft Word attachments.  The NSA report does not say whether these attacks were successful in compromising any county voting offices or even what the malware actually tried to do.
&lt;p&gt;
Targeted phishing attacks and malware hidden in email attachments might not seem like the kind of high-tech spy tools we associate with sophisticated intelligence agencies like Russia's GRU. They're familiar annoyances to almost anyone with an email account. And yet they can serve as devastatingly effective entry points into even very sensitive systems and networks.
&lt;p&gt;
So what might an attacker -- particularly a state actor looking to disrupt an election -- accomplish with such low-tech attacks, should they have succeeded? Unfortunately, the possibilities are not comforting. 
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/vote_hacking_by_email/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>When Should the Government Disclose "Stockpiled" Vulnerabilities?</title>
   <link>https://www.mattblaze.org/blog/between_immediately_and_never/</link>
   <guid>https://www.mattblaze.org/blog/between_immediately_and_never/</guid>
   <pubDate>Sat, 11 Mar 2017 04:21:39 +0000</pubDate>
   <description>Somewhere between immediately and never.




	
&lt;p&gt;
Encryption, it seems, at long last is winning. End-to-end encrypted communication systems are protecting more of our private communication than ever, making interception of sensitive content as it travels over (insecure) networks like the Internet less of a threat than it once was. All this is good news, unless you're in the business of intercepting sensitive content over networks. Denied access to network traffic, criminals and spies (whether on our side or theirs) will resort to other approaches to get access to data they seek. In practice, that often means exploiting security vulnerabilities in their targets' phones and computers to install surreptitious &quot;spyware&quot; that records conversations and text messages before they can be encrypted. In other words, wiretapping today increasingly involves &lt;em&gt;hacking&lt;/em&gt;.
&lt;p&gt;
This, as you might imagine, is not without controversy.
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/between_immediately_and_never/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>How Law Enforcement Tracks Cellular Phones</title>
   <link>https://www.mattblaze.org/blog/celltapping/</link>
   <guid>https://www.mattblaze.org/blog/celltapping/</guid>
   <pubDate>Fri, 13 Dec 2013 05:39:31 +0000</pubDate>
   <description>A brief taxonomy of wiretapping esoterica.




	
&lt;p&gt;
&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/blog/ngnr2000-600.jpg&quot; alt=&quot;Recall NGNR2000 DNR&quot; align=&quot;right&quot;&gt;
Recent news stories, notably &lt;a href=&quot;http://www.usatoday.com/story/news/nation/2013/12/08/cellphone-data-spying-nsa-police/3902809/&quot;&gt;this story in USA Today&lt;/a&gt; and &lt;a href=&quot;http://www.washingtonpost.com/world/national-security/agencies-collected-data-on-americans-cellphone-use-in-thousands-of-tower-dumps/2013/12/08/20549190-5e80-11e3-be07-006c776266ed_story.html&quot;&gt;this story in the Washington Post&lt;/a&gt;, have brought to light extensive use of &quot;Stingray&quot; devices and &quot;tower dumps&quot; by federal -- and local -- law enforcement agencies to track cellular telephones.
&lt;p&gt;
Just how does all this tracking and interception technology work?  There are actually a surprising number of different ways law enforcement agencies can track and get information about phones, each of which exposes different information in different ways. And it's all steeped in arcane surveillance jargon that's evolved over decades of changes in the law and the technology.  So now seems like a good time to summarize what the various phone tapping methods actually are, how they work, and how they differ from one another.
&lt;p&gt;
Note that this post is concerned specifically with phone tracking as done by US &lt;em&gt;domestic law enforcement&lt;/em&gt; agencies. &lt;em&gt;Intelligence&lt;/em&gt; agencies engaged in bulk surveillance, such as the NSA, have different requirements, constraints, and resources, and generally use different techniques. For example, it was recently revealed that NSA has access to &lt;a href=&quot;http://www.washingtonpost.com/world/national-security/nsa-tracking-cellphone-locations-worldwide-snowden-documents-show/2013/12/04/5492873a-5cf2-11e3-bc56-c6ca94801fac_story.html&quot;&gt;international phone &quot;roaming&quot; databases used by phone companies to route calls&lt;/a&gt;.  The NSA apparently collects vast amounts of telephone &quot;metadata&quot; to discover hidden communications patterns, relationships, and behaviors across the world.  There's also evidence of some data sharing to law enforcement from the intelligence side (see, for example, the DEA's &quot;Hemisphere&quot; program). But, as interesting and important as that is, it has little to do with the &quot;retail&quot; phone tracking techniques used by local law enforcement, and it's not our focus here.
&lt;p&gt;
Phone tracking by law enforcement agencies, in contrast to intelligence agencies, is intended to support investigations of specific crimes and to gather evidence for use in prosecutions.  And so their interception technology -- and the underlying law -- is supposed to be focused on obtaining information about the communications of &lt;em&gt;particular&lt;/em&gt; targets rather than of the population at large.
&lt;p&gt;
In all, there are six major distinct phone tracking and tapping methods used by investigators in the US: &quot;call detail records requests&quot;, &quot;pen register/trap and trace&quot;, &quot;content wiretaps&quot;, &quot;E911 pings&quot;, &quot;tower dumps&quot;, and &quot;Stingray/IMSI Catchers&quot;.  Each reveals somewhat different information at different times, and each has its own legal implications.  An agency might use any or all of them over the course of a given investigation.  Let's take them one by one.
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/celltapping/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Voting by Email in New Jersey</title>
   <link>https://www.mattblaze.org/blog/njvoting/</link>
   <guid>https://www.mattblaze.org/blog/njvoting/</guid>
   <pubDate>Sun, 04 Nov 2012 07:37:37 +0000</pubDate>
   <description>Some very preliminary thoughts.




	
&lt;p&gt;
&lt;a href=&quot;http://www.flickr.com/photos/mattblaze/2999140247/&quot;&gt;&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/misc/ivot-vote-350.jpg&quot; align=&quot;right&quot;&gt;&lt;/a&gt;
New Jersey was hit hard by Hurricane Sandy, and many parts of the state still lack electricity and basic infrastructure.  Countless residents have been displaced, at least temporarily.  And election day is on Tuesday.
&lt;p&gt;
There can be little doubt that many New Jerseyans, whether newly displaced or rendered homebound, who had originally intended to cast their votes at their normal neighborhood polling stations will be unable to do so next week.  Unless some new flexible voting options are made available, many people will be disenfranchised, perhaps altering the outcome of races. There are compelling reasons for New Jersey officials to act quickly to create viable, flexible, secure and reliable voting options for their citizens in this emergency.
&lt;p&gt;
A few hours ago, Gov. Christie &lt;a href=&quot;http://www.state.nj.us/governor/news/news/552012/approved/20121103d.html&quot;&gt;announced&lt;/a&gt; that voters unable to reach their normal polling places would be permitted to vote by electronic mail.  The directive, outlined &lt;a href=&quot;http://nj.gov/state/elections/2012-results/directive-email-voting.pdf&quot;&gt;here [pdf],&lt;/a&gt; allows displaced registered voters to request a &quot;mail in&quot; ballot from their local county clerk by email. The voter can then return the ballot, along with a signed &quot;waiver of secrecy&quot; form, by email, to be counted as a regular ballot.  (The process is based on one used for  overseas and military voters, but on a larger scale and with a greatly accelerated timeframe.)
&lt;p&gt;
Does email voting make sense for New Jersey during this emergency? It's hard to say one way or the other without a lot more information than has been released so far about how the system will work and how it will be secured.
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/njvoting/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Having Something to Get Spun Up About</title>
   <link>https://www.mattblaze.org/blog/spinup911/</link>
   <guid>https://www.mattblaze.org/blog/spinup911/</guid>
   <pubDate>Sat, 10 Sep 2011 18:48:43 +0000</pubDate>
   <description>Ten years ago tomorrow.




	
&lt;p&gt;
A &lt;a href=&quot;http://www.nytimes.com/2011/09/10/nyregion/biden-describes-bomb-threat-as-security-is-increased.html?hp=&amp;pagewanted=all&quot;&gt;
recent NY Times piece&lt;/a&gt;, on the response to a &quot;credible, specific and unconfirmed&quot; threat of a terrorist plot against New York on the tenth anniversary of the September 11 attacks, includes this strikingly telling quote from an anonymous senior law enforcement official: 
&lt;blockquote&gt;
&quot;It's 9/11, baby,&quot; one official said. &quot;We have to have something to get spun up about.&quot;
&lt;/blockquote&gt;
&lt;p&gt;
Indeed. But while it's easy to understand this remark as a bitingly candid assessment of the cynical and now reflexive fear mongering that we have allowed to become the most lasting and damaging legacy of Al Qaeda's mad war, I must also admit that there's another, equally true but much sadder, interpretation, at least for me.
&lt;p&gt;
We have to get spun up about something because the alternative is simply too painful. I can find essentially two viable emotional choices for tomorrow. One is to get ourselves &quot;spun up&quot; about a new threat, worry, take action, defend the homeland and otherwise occupy ourselves with the here and now. The other is quieter and simpler but far less palatable: to privately revisit the unspeakable horrors of that awful, awful, day, dislodging shallowly buried memories that emerge all too easily ten years later.
&lt;p&gt;
The relentless retrospective news coverage that (inevitably) is accompanying the upcoming anniversary has more than anything else reactivated the fading sense of overwhelming, escalating sadness I felt ten years ago. Sadness was ultimately the only available response, even for New Yorkers like me who lived only a few miles from the towers. It was in many ways the city's proudest moment, everyone wanting and trying to help, very little panic. But really, there wasn't nearly enough for all of us to do. Countless first responders and construction workers rushed without a thought to ground zero for a rescue that quickly became a recovery operation. Medical personnel reported to emergency rooms to treat wounded survivors who largely didn't exist. You couldn't even donate blood, the supply of volunteers overwhelming the small demand. (Working for AT&amp;amp;T at the time, I went down to a midtown Manhattan switching office, hoping somehow to be able to help keep our phones working with most of the staff unable to get to work, but it was quickly clear I was only getting in the way of the people there who actually knew how to do useful work.)
&lt;p&gt;
All most of us could really do that day and in the days that followed was bear witness to the horror of senseless death and try to comprehend the enormity of what was lost. Last words to loved ones, captured in voicemails from those who understood enough about what was happening to know that they would never see their families again. The impossible choice made by so many to jump rather than burn to death. The ubiquitous memorials to the dead, plastered in photocopied posters on walls everywhere around the city, created initially as desperate pleas for information on the missing.
&lt;p&gt;
Rudy Giuliani, a New York mayor for whom I normally have little patience, found a deep truth that afternoon when he was asked how many were lost. He didn't know, he said, but he cautioned that it would be &quot;more than any of us can bear&quot;. 
&lt;p&gt;
I remember trying to get angry at the bastards who inflicted this on us, but it didn't really work. Whoever they were, I knew they must be, in the end, simply crazy, beyond the reach of any meaningful kind of retribution. Anger couldn't displace the helplessness and sadness.
&lt;p&gt;
Remember all this or get &quot;spun up&quot;? Easy, easy choice.</description>
</item>
<item>
   <title>Wikileaking a Cryptography Lesson</title>
   <link>https://www.mattblaze.org/blog/wikileaking/</link>
   <guid>https://www.mattblaze.org/blog/wikileaking/</guid>
   <pubDate>Thu, 01 Sep 2011 20:56:34 +0000</pubDate>
   <description>Authentication and decryption are different.  And sometimes this is important.




	
&lt;p&gt;
Everything else aside, the recent Wikileaks/Guardian &lt;a href=&quot;http://www.wired.com/threatlevel/2011/09/wikileaks-unredacted-cables/&quot;&gt;fiasco&lt;/a&gt; (in which the passphrase for a widely-distributed encrypted file containing an un-redacted database of &lt;em&gt;Wikileaks&lt;/em&gt; cables ended up published in a book by a &lt;em&gt;Guardian&lt;/em&gt; editor) nicely demonstrates an important cryptologic principle: the security properties of keys used for &lt;b&gt;authentication&lt;/b&gt; and those used for &lt;b&gt;decryption&lt;/b&gt; are quite different.
&lt;p&gt;
Authentication keys, such as login passwords, become effectively useless once they are changed (unless they are re-used in other contexts). An attacker who learns an old authentication key would have to travel back in time to make any use of it. But old decryption keys, even after they have been changed, can remain as valuable as the secrets they once protected, forever. Old ciphertext can still be decrypted with the old keys, even if newer ciphertext can't.
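The asymmetry above can be made concrete with a small, deliberately toy Python sketch (the XOR "cipher" here is not remotely secure, and none of this reflects how the actual Wikileaks file was encrypted): changing a password replaces the verifier the server checks against, so the old password is dead; "changing" a decryption key does nothing for ciphertext that has already been distributed.

```python
import hashlib
import os

# Toy models of the two key types. Illustrative only -- not real crypto.

def make_auth_record(password: bytes):
    """Server-side record for password authentication: (salt, hash)."""
    salt = os.urandom(16)
    return salt, hashlib.sha256(salt + password).digest()

def check_password(record, password: bytes) -> bool:
    salt, digest = record
    return hashlib.sha256(salt + password).digest() == digest

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy XOR 'cipher' keyed by a hash of the key -- NOT secure."""
    stream = hashlib.sha256(key).digest()
    # Repeat the keystream out to the plaintext length (toy only).
    ks = (stream * (len(plaintext) // len(stream) + 1))[:len(plaintext)]
    return bytes(a ^ b for a, b in zip(plaintext, ks))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# Authentication: rotating the password replaces the server's record,
# so an attacker who learns the *old* password gets nowhere.
old_record = make_auth_record(b"old-password")
new_record = make_auth_record(b"new-password")
assert not check_password(new_record, b"old-password")

# Decryption: the ciphertext is already out in the world. Rotating the
# key protects only *future* ciphertext; the old file still opens with
# the old key, forever.
ciphertext = toy_encrypt(b"old-key", b"sensitive archive")
assert toy_decrypt(b"old-key", ciphertext) == b"sensitive archive"
```

The last two lines are the whole point: no amount of key rotation after the fact can revoke a decryption key from ciphertext that has already been copied.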
&lt;p&gt;
And it appears that confusion between these two concepts is at the root of the leak here. Assuming the &lt;em&gt;Guardian&lt;/em&gt; editor's narrative accurately describes his understanding of what was going on, he believed that the passphrase he had been given was a temporary password that would have already been rendered useless by the time his book would be published. But that's not what it was at all; it was a decryption key -- for a file whose ciphertext was widely available.
&lt;p&gt;
It might be tempting for us, as cryptographers and security engineers, to snicker at both &lt;em&gt;Wikileaks&lt;/em&gt; and the &lt;em&gt;Guardian&lt;/em&gt; for the sloppy practices that allowed this high-stakes mishap to happen in the first place.  But we should also observe that confusion between the semantics of authentication and of confidentiality happens because these are, in fact, subtle concepts that are as poorly understood as they are intertwined, even among those who might now be laughing the hardest. The crypto literature is full of examples of protocol failures that have exactly this confusion at their root.
&lt;p&gt;
And it should also remind us that, again, cryptographic usability matters. Sometimes quite a bit.</description>
</item>
<item>
   <title>Why (special agent) Johnny (still) Can't Encrypt</title>
   <link>https://www.mattblaze.org/blog/p25/</link>
   <guid>https://www.mattblaze.org/blog/p25/</guid>
   <pubDate>Wed, 17 Aug 2011 18:09:55 +0000</pubDate>
   <description>One-Way Cryptography and the First Rule of Cryptanalysis.




	
&lt;p&gt;
&lt;a href=&quot;https://www.mattblaze.org/blog/p25&quot;&gt;&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/misc/xts-keyloader_2000-small2.jpg&quot; align=&quot;right&quot;&gt;&lt;/a&gt;
Last week at &lt;a href=&quot;http://www.usenix.org/events/sec11/&quot;&gt;the 20th
Usenix Security Symposium&lt;/a&gt;, Sandy Clark, Travis Goodspeed, Perry Metzger,
Zachary Wasserman, Kevin Xu, and I presented our paper
&lt;a href=&quot;https://www.mattblaze.org/papers/p25sec.pdf&quot;&gt;&lt;em&gt;Why (Special Agent) Johnny
(Still) Can't Encrypt: A Security Analysis of the APCO Project 25 Two-Way Radio System &lt;/em&gt;[pdf]&lt;/a&gt;.  I'm delighted and honored to report that we won an &quot;Outstanding Paper&quot; award.
&lt;p&gt;
APCO Project 25 (&quot;P25&quot;) is a suite of wireless communications protocols designed for government two-way (voice) radio systems, used for everything from dispatching police and other first responders by local government to coordinating federal tactical surveillance operations against organized crime
and suspected terrorists.  P25 is intended to be
a &quot;drop-in&quot; digital replacement for the analog FM systems traditionally used in public safety two-way radio, adding some additional features and security options.   It uses the same frequency bands and channel allocations as the older analog systems it replaces, but with a digital modulation format and various higher-level application protocols (the most important being real-time voice broadcast).
Although many agencies still use analog radio, P25 adoption has accelerated in
recent years, especially among federal agencies.
&lt;p&gt;
One of the advantages of digital radio, and one of the design goals of P25, is the relative ease with which it can
encrypt sensitive, confidential voice traffic with strong cryptographic algorithms
and protocols.
While most public safety
two-way radio users (local police dispatch centers and so on)
typically don't use (or need) encryption, for others -- those engaged in
surveillance of organized crime,
counter espionage and executive protection, to name a few -- it has become an essential requirement.  When all radio transmissions were in the clear -- and vulnerable to interception -- these &quot;tactical&quot; users needed to be constantly mindful of the threat of eavesdropping by an adversary, and so
were forced to be stiltedly circumspect in what they could say over the air.
For these users,
strong, reliable encryption not only makes their operations more secure, it frees them
to communicate more effectively.
&lt;p&gt;
So how secure is P25? Unfortunately, the news isn't very reassuring.
&lt;a href=&quot;https://www.mattblaze.org/blog/p25/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Wiretapping and Cryptography Today</title>
   <link>https://www.mattblaze.org/blog/wiretap2010/</link>
   <guid>https://www.mattblaze.org/blog/wiretap2010/</guid>
   <pubDate>Tue, 12 Jul 2011 22:36:30 +0000</pubDate>
   <description>Report from the sky didn't fall department.




	
&lt;p&gt;
&lt;a href=&quot;http://www.flickr.com/photos/mattblaze/2695044170/&quot;&gt;&lt;img style=&quot;margin: 10px 0px 10px 13px&quot; src=&quot;https://www.mattblaze.org/photos/misc/snst-2-360.jpg&quot; align=&quot;right&quot;&gt;&lt;/a&gt;
The &lt;a href=&quot;http://www.uscourts.gov/Statistics/WiretapReports/WiretapReport2010.aspx&quot;&gt;2010 U.S. Wiretap Report&lt;/a&gt; was released a couple
of weeks ago, the latest in a series of puzzles published annually, on
and off, by congressional mandate since the Nixon administration.
The report, as
its name implies, summarizes legal wiretapping by federal and state law enforcement agencies.   The reports are puzzles because they are notoriously
incomplete; the data relies on spotty reporting, and information
on &quot;national security&quot; (FISA) taps is excluded altogether.  Still, it's
the most complete public picture of wiretapping as practiced in the US that we
have, and as such, is of likely interest to many readers here.
&lt;p&gt;
We now know that there were at least 3194 criminal wiretaps
last year (1207 of these were by federal law enforcement and 1987 were
done by state and local agencies).  The previous year there were only
2376 reported, but it isn't clear how much of this increase was due to
improved data collection in 2010.  Again, this is only &quot;Title III&quot; content
wiretaps for criminal investigations (mostly drug cases); it doesn't include
&quot;pen registers&quot; that record call details without audio or taps for
counterintelligence and counterterrorism investigations, which presumably
have accounted for an increasing proportion of intercepts since 2001.
And there's apparently still a fair
bit of underreporting in the statistics.  So we don't really know how much wiretapping the government actually does in total or what the trends
really look like.  There's a lot of noise among the signals here.
&lt;p&gt;
But for all the noise, one interesting fact stands out rather clearly.
Despite dire predictions to the contrary,
the open availability of cryptography has done little
to hinder law enforcement's ability to conduct investigations.
&lt;a href=&quot;https://www.mattblaze.org/blog/wiretap2010/&quot;&gt;See the rest of this (rather long) entry...&lt;/a&gt;</description>
</item>
<item>
   <title>Google Plus</title>
   <link>https://www.mattblaze.org/blog/GPlus/</link>
   <guid>https://www.mattblaze.org/blog/GPlus/</guid>
   <pubDate>Mon, 11 Jul 2011 00:13:31 +0000</pubDate>
   <description>I, for one, welcome our Googly overlords.




	
&lt;p&gt;
A while back when I tried to sign up for a Facebook account it was almost indistinguishable from a phishing attack -- it kept urging me to give them my email and other passwords to &quot;help&quot; me keep in better contact with my friends.  (I ended up giving up, but apparently not completely enough to prevent an endless stream of &quot;friend&quot; requests from showing up in my mailbox.)
&lt;p&gt;
Signing up for &lt;a href=&quot;https://plus.google.com&quot;&gt;Google+&lt;/a&gt; this week was different. It already knew who all my contacts were, no passwords required.
&lt;p&gt;
I'm not sure, in retrospect, which was more disconcerting. If FB signup raised my phishing defenses, joining G+ felt more like a cyber-Mafia shakedown. All that was missing from the exhaustive list of friends and loved ones was &quot;... it would be a shame if something happened to these people.&quot;
&lt;p&gt;
I'd say to look for me there, but it seems you won't have to.</description>
</item>
<item>
   <title>I'll be on WHYY's Radio Times today</title>
   <link>https://www.mattblaze.org/blog/radiotimes/</link>
   <guid>https://www.mattblaze.org/blog/radiotimes/</guid>
   <pubDate>Tue, 07 Jun 2011 11:33:43 +0000</pubDate>
   <description>Radio is what our grandparents listened to before there were podcasts.




	
&lt;p&gt;
I'll be talking about computer security and cyberwar this morning live
at 10am on WHYY-FM's otherwise excellent
&lt;a href=&quot;http://whyy.org/cms/radiotimes/2011/06/07/cyberwarfare-and-cybersecurity/&quot;&gt;&lt;em&gt;Radio Times&lt;/em&gt;&lt;/a&gt; show.
For those who aren't up before the crack of noon, I'm told the show will also be repeated at 10pm as well as podcast online.
(WHYY is the Philadelphia NPR affiliate).</description>
</item>
</channel>
</rss>
