Crypto-Gram

September 15, 2009

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-0909.html>. These same essays appear in the “Schneier on Security” blog: <http://www.schneier.com/>. An RSS feed is available.


In this issue:
      Eighth Anniversary of 9/11
      Skein News
      Real-World Access Control
      News
      File Deletion
      On London’s Surveillance Cameras
      Robert Sawyer’s Alibis
      Schneier News
      Stealing 130 Million Credit Card Numbers
      “The Cult of Schneier”
      Comments from Readers

Eighth Anniversary of 9/11

On September 30, 2001, I published a special issue of Crypto-Gram discussing the terrorist attacks. I wrote about the novelty of the attacks, airplane security, diagnosing intelligence failures, the potential of regulating cryptography—because it could be used by the terrorists—and protecting privacy and liberty. Much of what I wrote is still relevant today.

http://www.schneier.com/crypto-gram-0109a.html

Me from 2006: “Refuse to be Terrorized.”
http://www.schneier.com/essay-124.html


Skein News

Skein is one of the 14 SHA-3 candidates chosen by NIST to advance to the second round. As part of the process, NIST allowed the algorithm designers to implement small “tweaks” to their algorithms. We’ve tweaked the rotation constants of Skein.

The revised Skein paper contains the new rotation constants, as well as information about how we chose them and why we changed them, the results of some new cryptanalysis, plus new IVs and test vectors.
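
At Skein’s core is the Threefish block cipher, whose MIX operation combines 64-bit addition, rotation, and XOR; the tweaked constants are the rotation amounts MIX uses. Here is a minimal Python sketch of MIX (my illustration, not the reference code); the real cipher indexes a fixed table of rotation constants by round and word position, and it is exactly that table we changed.

```python
# A minimal sketch of the Threefish MIX operation, the place where
# Skein's rotation constants are used. Not the official implementation.
MASK64 = (1 << 64) - 1

def rotl64(x: int, r: int) -> int:
    """Rotate a 64-bit word left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK64

def mix(x0: int, x1: int, r: int) -> tuple[int, int]:
    """MIX: add the words, rotate one, XOR it with the sum."""
    y0 = (x0 + x1) & MASK64        # 64-bit addition
    y1 = rotl64(x1, r) ^ y0        # rotate by constant r, then XOR
    return y0, y1
```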

Tweaks were due today, September 15. Now the SHA-3 process moves into the second round. According to NIST’s timeline, they’ll choose a set of final round candidate algorithms in 2010, and then a single hash algorithm in 2012. Between now and then, it’s up to all of us to evaluate the algorithms and let NIST know what we want. Cryptanalysis is important, of course, but so is performance.

The second-round algorithms are: BLAKE, Blue Midnight Wish, CubeHash, ECHO, Fugue, Grøstl, Hamsi, JH, Keccak, Luffa, Shabal, SHAvite-3, SIMD, and Skein. You can find details on all of them, as well as the current state of their cryptanalysis, at the SHA-3 Zoo.

In other news, we’re making Skein shirts available to the public. Those of you who attended the First Hash Function Candidate Conference in Leuven, Belgium, earlier this year might have noticed the stylish black Skein polo shirts worn by the Skein team. Anyone who wants one is welcome to buy one, at cost. All orders must be received before 1 October, and then we’ll have all the shirts made in one batch.

Skein website:
http://www.skein-hash.info/

Revised Skein paper:
http://www.schneier.com/skein.pdf

Revised Skein source code:
http://www.schneier.com/code/skein.zip

My 2008 essay on SHA-3:
http://www.schneier.com/essay-249.html

Details on Skein shirts:
http://www.schneier.com/skein-shirts.html

NIST SHA-3 Website:
http://csrc.nist.gov/groups/ST/hash/sha-3/index.html
http://csrc.nist.gov/groups/ST/hash/sha-3/Round2/…

SHA-3 Zoo:
http://ehash.iaik.tugraz.at/wiki/The_SHA-3_Zoo


Real-World Access Control

Access control is difficult in an organizational setting. On one hand, every employee needs enough access to do his job. On the other hand, every time you give an employee more access, there’s more risk: he could abuse that access, or lose information he has access to, or be socially engineered into giving that access to a malfeasant. So a smart, risk-conscious organization will give each employee the exact level of access he needs to do his job, and no more.

Over the years, there’s been a lot of work put into role-based access control. But despite the large number of academic papers and high-profile security products, most organizations don’t implement it—at all—with the predictable security problems as a result.
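
For concreteness, here is the core RBAC idea in a few lines of Python; the roles and permissions are invented for illustration, and real products layer hierarchies, constraints, and administration tooling on top of this:

```python
# Illustrative RBAC core: users get roles, roles carry permissions.
ROLE_PERMS = {
    "nurse":   {"read_chart"},
    "doctor":  {"read_chart", "write_chart", "prescribe"},
    "billing": {"read_invoice", "write_invoice"},
}
USER_ROLES = {
    "alice": {"doctor"},
    "bob":   {"nurse", "billing"},   # holding several roles is common
}

def allowed(user: str, perm: str) -> bool:
    """Grant a permission only if one of the user's roles carries it."""
    return any(perm in ROLE_PERMS[r] for r in USER_ROLES.get(user, ()))

assert allowed("alice", "prescribe")
assert not allowed("bob", "prescribe")
```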

We regularly read stories of employees abusing their database access-control privileges for personal reasons: medical records, tax records, passport records, police records. NSA eavesdroppers spy on their wives and girlfriends. Departing employees take corporate secrets with them.

A spectacular access control failure occurred in the UK in 2007. An employee of Her Majesty’s Revenue & Customs had to send a couple of thousand sample records from a database on all the children in the country to the National Audit Office. But it was easier for him to copy the entire database of 25 million people onto a couple of discs and put them in the mail than it was to select out just the records needed. Unfortunately, the discs got lost in the mail, and the story was a huge embarrassment for the government.

Eric Johnson at Dartmouth’s Tuck School of Business has been studying the problem, and his results won’t startle anyone who has thought about it at all. RBAC is very hard to implement correctly. Organizations generally don’t even know who has what role. The employee doesn’t know, the boss doesn’t know—and these days the employee might have more than one boss—and senior management certainly doesn’t know. There’s a reason RBAC came out of the military; in that world, command structures are simple and well-defined.

Even worse, employees’ roles change all the time—Johnson chronicled one business group of 3,000 people that made 1,000 role changes in just three months—and it’s often not obvious what information an employee needs until he actually needs it. And information simply isn’t that granular. Just as it’s much easier to give someone access to an entire file cabinet than to only the particular files he needs, it’s much easier to give someone access to an entire database than only the particular records he needs.

This means that organizations either over-entitle or under-entitle employees. But since getting the job done is more important than anything else, organizations tend to over-entitle. Johnson estimates that 50 percent to 90 percent of employees are over-entitled in large organizations. In the uncommon instance where an employee needs access to something he normally doesn’t have, there’s generally some process for him to get it. And access is almost never revoked once it’s been granted. In large formal organizations, Johnson was able to predict how long an employee had worked there based on how much access he had.

Clearly, organizations can do better. Johnson’s current work involves building access-control systems with easy self-escalation, audit to make sure that power isn’t abused, violation penalties (Intel, for example, issues “speeding tickets” to violators), and compliance rewards. His goal is to implement incentives and controls that manage access without making people too risk-averse.
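
A toy sketch of that self-escalation idea (my reading of the concept, not Johnson’s actual system): grant the out-of-role request immediately so work is never blocked, but record it so auditors can issue the “speeding tickets” afterwards. It reuses allowed() from the RBAC sketch above.

```python
# Self-escalation with audit: never block the work, always log the
# exception. allowed() is the RBAC check from the earlier sketch.
import time

AUDIT_LOG: list[tuple[float, str, str]] = []

def access(user: str, perm: str) -> bool:
    if not allowed(user, perm):
        AUDIT_LOG.append((time.time(), user, perm))  # flag for review
    return True                                      # grant regardless

access("bob", "prescribe")   # granted, but now on the audit log
```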

In the end, a perfect access control system just isn’t possible; organizations are simply too chaotic for it to work. And any good system will allow a certain number of access control violations, if they’re made in good faith by people just trying to do their jobs. The “speeding ticket” analogy is better than it looks: we post limits of 55 miles per hour, but generally don’t start ticketing people unless they’re going over 70.

This essay previously appeared in Information Security, as part of a point/counterpoint with Marcus Ranum. You can read Marcus’s response here—after you answer some nosy questions to get a free account.
http://searchsecurity.techtarget.com/…
Role-based access control:
http://searchsecurity.techtarget.com/…
http://technet.microsoft.com/en-us/library/…
http://csrc.nist.gov/groups/SNS/rbac/documents/…
http://csrc.nist.gov/groups/SNS/rbac/documents/…

Abuse of database access:
http://articles.latimes.com/2009/may/09/local/…
http://www.marketwatch.com/story/…
http://www.computerworld.com/action/article.do?…
http://rawstory.com/08/news/2009/06/18/…
http://www.thetechherald.com/article.php/200924/…
http://searchsecurity.techtarget.co.uk/news/article/…
UK access control failure:
http://www.hmrc.gov.uk/news/hartnett-poynter-icc.htm
http://news.bbc.co.uk/2/hi/uk_news/politics/7103566.stm
http://www.computerweekly.com/Articles/2008/06/27/…
Johnson’s work:
http://mba.tuck.dartmouth.edu/digital/Research/…
http://weis2008.econinfosec.org/papers/Zhao.pdf
http://mba.tuck.dartmouth.edu/digital/Research/…


News

Flash has the equivalent of cookies, and they’re hard to delete.
http://www.wired.com/epicenter/2009/08/…
Movie-plot threat alert: robot suicide bombers. Let’s all be afraid.
http://www.telegraph.co.uk/news/newstopics/politics/…
I’m sure I’ve seen this stuff in movies.

It’s possible to fabricate DNA evidence:
http://arstechnica.com/science/news/2009/08/…
http://www.nytimes.com/2009/08/18/science/18dna.html
http://www.fsigenetics.com/article/…

The legal term “terroristic threats” is older than 9/11, but these days it evokes too much of an emotional response:
https://www.schneier.com/blog/archives/2009/08/…

Interesting developments in lie detection:
http://www.scientificamerican.com/article.cfm?…
Marc Weber Tobias on hacking the Assa Solo lock:
http://www.thesidebar.org/insecurity/?p=447
Me on locks and lockpicking:
https://www.schneier.com/blog/archives/2009/08/…

From the humor website Cracked: “The 5 Most Embarrassing Failures in the History of Terrorism.” Yes, it’s funny. But remember that these are the terrorist masterminds that politicians invoke to keep us scared.
http://www.cracked.com/article/…
My 2007 essay, “Portrait of the Modern Terrorist as an Idiot,” is also relevant. But less funny.
http://www.schneier.com/essay-174.html

Modeling zombie outbreaks: the math doesn’t look good. “When Zombies Attack!: Mathematical Modelling of an Outbreak of Zombie Infection.”
http://www.mathstat.uottawa.ca/~rsmith/Zombies.pdf

It turns out that flipping a coin has all sorts of non-randomness:
http://www.codingthewheel.com/archives/…
http://www-stat.stanford.edu/~susan/papers/…

As part of their training, federal agents engage in mock exercises in public places. Sometimes, innocent civilians get involved. It’s actual security theater.
http://www.washingtonpost.com/wp-dyn/content/…
http://www.startribune.com/59017377.html?…

The sorts of crimes we’ve been seeing perpetrated against individuals are starting to be perpetrated against small businesses. The problem will get much worse, and the security externalities mean that the banks care much less.
https://www.schneier.com/blog/archives/2009/08/…

Interesting video demonstrating how a policeman can manipulate the results of a Breathalyzer.
http://www.bing.com/videos/search?…
There is a movement in the U.K. to replace the pint glasses in pubs with plastic because too many of them are being used as weapons. I don’t think this will go anywhere, but the sheer idiocy is impressive. Reminds me of the call to ban pointy knives. That recommendation also came out of the UK. What’s going on over there?
https://www.schneier.com/blog/archives/2009/08/…

More security stories from the natural world: marine worms with glowing bombs:
https://www.schneier.com/blog/archives/2009/08/…

Weird: “The U.S. Federal Bureau of Investigation is trying to figure out who is sending laptop computers to state governors across the U.S., including West Virginia Governor Joe Manchin and Wyoming Governor Dave Freudenthal. Some state officials are worried that they may contain malicious software.”
http://www.itworld.com/government/75885/…
Fascinating story of a 16-year-old blind phone phreaker.
http://www.rollingstone.com/news/story/29787673/…
Hacking swine flu: “So it takes about 25 kilobits—3.2 Kbytes—of data to code for a virus that has a non-trivial chance of killing a human. This is more efficient than a computer virus, such as MyDoom, which rings in at around 22 Kbytes. It’s humbling that I could be killed by 3.2 Kbytes of genetic data. Then again, with 850 Mbytes of data in my genome, there’s bound to be an exploit or two.”
http://www.bunniestudios.com/blog/?p=353

Good article on the exaggerated fears of cyberwar:
http://bostonreview.net/BR34.4/morozov.php
The real risk isn’t cyberterrorism, it’s cybercrime.

SIGABA and the history of one-time pads:
http://www1.cs.columbia.edu/~smb/blog/control/
I wrote about one-time pads, and their practical insecurity, in 2002:
http://www.schneier.com/crypto-gram-0210.html#7

Interesting discussion of subpoenas as a security threat:
http://www.freedom-to-tinker.com/felten/…
A very interesting hour-long interview with David Kilcullen on security and insurgency.
http://www.abc.net.au/unleashed/stories/s2668177.htm

Nils Gilman’s lecture on the global illicit economy. Malware is one of his examples, at about the nine-minute mark.
http://video.google.com/videoplay?…
The seven rules of the illicit global economy (he seems to use “illicit” and “deviant” interchangeably in the talk):
1. Perfectly legitimate forms of demand can produce perfectly deviant forms of supply.
2. Uneven global regulatory structures create arbitrage opportunities for deviant entrepreneurs.
3. Pathways for legitimate globalization are always also pathways for deviant globalization.
4. Once a deviant industry professionalizes, crackdowns merely promote innovation.
5. States themselves undermine the distinction between legitimate and deviant economics.
6. Unchecked, deviant entrepreneurs will overtake the legitimate economy.
7. Deviant globalization presents an existential challenge to state legitimacy.

Perfectly legal NSA intercepts (obtained with a FISA warrant) were used to convict the liquid bombers.
https://www.schneier.com/blog/archives/2009/09/…
The BBC has a video demonstration of a 16-ounce bottle of liquid blowing a hole in the side of a plane. I know no more details than what’s in the video.
http://news.bbc.co.uk/2/hi/uk_news/7536167.stm


File Deletion

File deletion is all about control. This used not to be an issue. Your data was on your computer, and you decided when and how to delete a file. You could use the delete function if you didn’t care whether the file could be recovered, and a file-erase program—I use BCWipe for Windows—if you wanted to ensure no one could ever recover it.
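
The difference matters because ordinary deletion just unlinks the file; the bytes stay on disk until something happens to overwrite them. Here is a minimal sketch of the overwrite approach (BCWipe and similar tools are far more careful); note that on journaling filesystems and on SSDs with wear leveling, overwriting in place may not reach every copy of the data.

```python
# A minimal sketch of an overwrite-style file eraser. Real tools do
# multiple patterned passes and handle metadata; this only shows the idea.
import os

def wipe_file(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())        # force the pass out to disk
    os.remove(path)                     # only then unlink the file
```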

As we move more of our data onto cloud computing platforms such as Gmail and Facebook, and closed proprietary platforms such as the Kindle and the iPhone, deleting data is much harder.

You have to trust that these companies will delete your data when you ask them to, but they’re generally not interested in doing so. Sites like these are more likely to make your data inaccessible than they are to physically delete it. Facebook is a known culprit: actually deleting your data from its servers requires a complicated procedure that may or may not work. And even if you do manage to delete your data, copies are certain to remain in the companies’ backup systems. Gmail explicitly says this in its privacy notice.

Online backups, SMS messages, photos on photo sharing sites, smartphone applications that store your data in the network: you have no idea what really happens when you delete pieces of data or your entire account, because you’re not in control of the computers that are storing the data.

This notion of control also explains how Amazon was able to delete a book that people had previously purchased on their Kindle e-book readers. The legalities are debatable, but Amazon had the technical ability to delete the file because it controls all Kindles. It has designed the Kindle so that it determines when to update the software, whether people are allowed to buy Kindle books, and when to turn off people’s Kindles entirely.

Vanish is a research project by Roxana Geambasu and colleagues at the University of Washington. They designed a prototype system that automatically deletes data after a set time interval. So you can send an e-mail, create a Google Doc, post an update to Facebook, or upload a photo to Flickr, all designed to disappear after a set period of time. And after it disappears, no one—not anyone who downloaded the data, not the site that hosted the data, not anyone who intercepted the data in transit, not even you—will be able to read it. If the police arrive at Facebook or Google or Flickr with a warrant, they won’t be able to read it.

The details are complicated, but Vanish breaks the data’s decryption key into a bunch of pieces and scatters them around the web using a peer-to-peer network. Then it uses the natural turnover in these networks—machines constantly join and leave—to make the data disappear. Unlike previous programs that supported file deletion, this one doesn’t require you to trust any company, organization, or website. It just happens.
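
A toy version of the key-splitting step makes the idea concrete. This is my simplification, not the Vanish code: Vanish uses threshold secret sharing, where only a quorum of shares is needed, and stores the shares in a distributed hash table that ages them out.

```python
# Toy n-of-n XOR secret split: lose any one share and the key is gone.
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:                 # last share = key XOR all others
        last = xor_bytes(last, s)
    return shares + [last]

def join_key(shares: list[bytes]) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = xor_bytes(key, s)
    return key

key = os.urandom(32)                 # the data-encryption key
shares = split_key(key, 10)          # these get scattered over the network
assert join_key(shares) == key       # readable only while all survive
```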

Of course, Vanish doesn’t prevent the recipient of an e-mail or the reader of a Facebook page from copying the data and pasting it into another file, just as Kindle’s deletion feature doesn’t prevent people from copying a book’s files and saving them on their computers. Vanish is just a prototype at this point, and it only works if all the people who read your Facebook entries or view your Flickr pictures have it installed on their computers as well; but it’s a good demonstration of how control affects file deletion. And while it’s a step in the right direction, it’s also new and therefore deserves further security analysis before being adopted on a wide scale.

We’ve lost control of the data on some of the computers we own, and we’ve lost control of our data in the cloud. We’re not going to stop using Facebook and Twitter just because they’re not going to delete our data when we ask them to, and we’re not going to stop using Kindles and iPhones because they may delete our data when we don’t want them to. But we need to take back control of data in the cloud, and projects like Vanish show us how we can.

Now we need something that will protect our data when a large corporation decides to delete it.

This essay originally appeared in The Guardian.
http://www.guardian.co.uk/technology/2009/sep/09/…
BCWipe:
http://www.jetico.com/…

Me on cloud computing:
http://www.schneier.com/essay-274.html

Control and the iPhone:
http://www.schneier.com/essay-204.html

Social networking companies don’t want to delete data:
http://www.schneier.com/essay-278.html

How to delete your Facebook account:
http://www.wikihow.com/…

Kindle:
http://bit.ly/KindleDelete
http://www.techdirt.com/articles/20090416/…

Vanish:
http://www.economist.com/sciencetechnology/tm/…
http://vanish.cs.washington.edu/index.html
http://vanish.cs.washington.edu/pubs/…


On London’s Surveillance Cameras

A recent report concluded that London’s surveillance cameras solve one crime per thousand cameras per year.

I haven’t seen the report, but I know it’s hard to figure out when a crime has been “solved” by a surveillance camera. To me, the crime has to have been unsolvable without the cameras. Repeatedly I see pro-camera lobbyists pointing to the surveillance-camera images that identified the 7/7 London Transport bombers, but it is obvious that they would have been identified even without the cameras.

And it would really help my understanding of the report’s £20,000-per-crime figure if I knew what sorts of crimes the cameras “solved.” (I assume the figure comes from dividing the £200 million spent on the cameras by the number of crimes solved over ten years, at one crime per thousand cameras per year.) If the £200 million solved 10,000 murders, it might very well be a good security trade-off. But my guess is that most of the crimes were of a much lower level.
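
For what it’s worth, here is the only arithmetic I can find that produces the £20,000 figure; everything except the £200 million and the one-in-a-thousand rate is my assumption:

```python
# Hypothetical reconstruction of the report's cost-per-crime figure.
cameras = 1_000_000                  # assumed camera count for London
total_cost_gbp = 200_000_000         # reported spend on the system
crimes_per_year = cameras / 1_000    # one crime solved per 1,000 cameras
years = 10                           # assumed amortization period
cost_per_crime = total_cost_gbp / (crimes_per_year * years)
print(f"£{cost_per_crime:,.0f} per crime solved")   # £20,000
```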

http://news.bbc.co.uk/2/hi/uk_news/england/london/…
http://www.telegraph.co.uk/news/uknews/crime/…


Robert Sawyer’s Alibis

Back in 2002, science fiction author Robert J. Sawyer wrote an essay about the trade-off between privacy and security. I’ve never forgotten the first sentence: “Whenever I visit a tourist attraction that has a guest register, I always sign it. After all, you never know when you’ll need an alibi.”

Since I read that, whenever I see a tourist attraction with a guest register, I do the same thing. I sign “Robert J. Sawyer, Toronto, ON”—because you never know when he’ll need an alibi.

Sawyer’s essay:
http://sfwriter.com/privacy.htm

Essays I’ve written about privacy:
http://www.schneier.com/essay-109.html
https://www.schneier.com/blog/archives/2006/05/…
http://www.schneier.com/essay-261.html


Schneier News

I’m speaking at the University of Kentucky on September 17.
http://www.cs.engr.uky.edu/events/distingLectures.php

I’m also speaking at the Second International Symposium on Network and Data Communications, in Lima, Peru, on September 25.
http://www.tecsup.edu.pe/simposio09/redes/…

And I’m speaking at GOVCERT.NL in Rotterdam on October 6.
https://www.govcert.nl/symposium/index.html

Here’s a video of “The Future of the Security Industry,” a talk I gave at an OWASP meeting in Minneapolis in August.
http://vimeo.com/6495257


Stealing 130 Million Credit Card Numbers

Someone has been charged with stealing 130 million credit card numbers.

Yes, it’s a lot, but that’s the sort of quantity credit card numbers come in. They come by the millions, in large database files. Even if you only want ten, you have to steal millions. I’m sure every one of us has a credit card in our wallet whose number has been stolen. It’ll probably never be used for fraudulent purposes, but it’s in some stolen database somewhere.

Years ago, when giving advice on how to avoid identity theft, I would tell people to shred their trash. Today, that advice is completely obsolete. No one steals credit card numbers one by one out of the trash when they can be stolen by the millions from merchant databases.

http://news.yahoo.com/s/ap/20090817/ap_on_re_us/…


“The Cult of Schneier”

If there’s actually a cult out there, I want to hear about it. In an essay by that name, John Viega writes about the dangers of relying on Applied Cryptography to design cryptosystems:

But, after many years of evaluating the security of software
systems, I’m incredibly down on using the book that made Bruce
famous when designing the cryptographic aspects of a system. In
fact, I can safely say I have never seen a secure system come out
the other end, when that is the primary source for the crypto
design. And I don’t mean that people forget about the buffer
overflows. I mean, the crypto is crappy.

My rule for software development teams is simple: Don’t use
Applied Cryptography in your system design. It’s fine and
fun to read it, just don’t build from it.

[…]

The book talks about the fundamental building blocks of
cryptography, but there is no guidance on things like, putting
together all the pieces to create a secure, authenticated
connection between two parties.

Plus, in the nearly 13 years since the book was last revised, our
understanding of cryptography has changed greatly. There are
things in it that were thought to be true at the time that turned
out to be very false….

I agree. And, to his credit, Viega points out that I agree:

But in the introduction to Bruce Schneier’s book, Practical
Cryptography, he himself says that the world is filled with
broken systems built from his earlier book. In fact, he wrote
Practical Cryptography in hopes of rectifying the problem.

This is all true.

Designing a cryptosystem is hard. Just as you wouldn’t give a person—even a doctor—a brain-surgery instruction manual and then expect him to operate on live patients, you shouldn’t give an engineer a cryptography book and then expect him to design and implement a cryptosystem. The patient is unlikely to survive, and the cryptosystem is unlikely to be secure.

Even worse, security doesn’t provide immediate feedback. A dead patient on the operating table tells the doctor that maybe he doesn’t understand brain surgery just because he read a book, but an insecure cryptosystem works just fine. It’s not until someone takes the time to break it that the engineer might realize that he didn’t do as good a job as he thought. Remember: Anyone can design a security system that he himself cannot break. Even the experts regularly get it wrong. The odds that an amateur will get it right are extremely low.
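
Here is one concrete flavor of “works just fine but insecure,” chosen by me as an illustration rather than taken from Viega’s essay: encryption without authentication. The sketch below, using the pycryptodome package, encrypts and decrypts perfectly, yet lets an attacker who never learns the key rewrite the message undetected. An authenticated mode, or encrypt-then-MAC, catches the tampering; knowing to add that is exactly the put-the-pieces-together guidance a building-blocks book doesn’t give you.

```python
# Unauthenticated AES-CTR "works" -- and is silently malleable.
# Requires the pycryptodome package.
import os
from Crypto.Cipher import AES

key, nonce = os.urandom(16), os.urandom(8)
ct = AES.new(key, AES.MODE_CTR, nonce=nonce).encrypt(b"PAY $0100 TO BOB")

# The attacker flips ciphertext bits without knowing the key:
tampered = bytearray(ct)
tampered[5] ^= ord("0") ^ ord("9")   # turns plaintext "$0100" into "$9100"

pt = AES.new(key, AES.MODE_CTR, nonce=nonce).decrypt(bytes(tampered))
print(pt)   # b'PAY $9100 TO BOB' -- decryption "succeeds" with no error
```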

For those who are interested, a second edition of Practical Cryptography will be published in early 2010, renamed Cryptography Engineering and featuring a third author: Tadayoshi Kohno.

http://broadcast.oreilly.com/2009/01/…

Applied Cryptography:
http://www.schneier.com/book-applied.html

Practical Cryptography:
http://www.schneier.com/book-practical.html


Comments from Readers

There are thousands of comments—many of them interesting—on these topics on my blog. Search for the story you want to comment on, and join in.

http://www.schneier.com/


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2009 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.