Shostack + Friends Blog Archive

Security Signaling

Signaling is a term from the study of lemons markets. A lemons market is one, such as the market for used cars, where one party (the seller) knows more than the other (the buyer). There are good cars (peaches) and bad ones (lemons). The buyer is willing to pay a fair price, but can’t distinguish between the cars. Since the buyer won’t risk paying a peach price for a lemon, the price for all used cars falls toward the lemon price. The concept was introduced by George Akerlof in his 1970 paper “The Market for ‘Lemons’.”
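Akerlof’s unraveling can be sketched in a few lines of code. This is a minimal illustration, not anything from the paper itself: all the dollar figures and the 50/50 mix are hypothetical numbers chosen only to show the mechanism.

```python
# Minimal sketch of Akerlof's lemons-market unraveling.
# All values are hypothetical, chosen only to illustrate the mechanism.

PEACH_VALUE = 10_000   # what a buyer would pay for a known-good car
LEMON_VALUE = 4_000    # what a buyer would pay for a known-bad car
PEACH_RESERVE = 8_000  # the minimum a peach seller will accept

def market_price(peach_fraction: float) -> float:
    """The price a buyer offers when cars are indistinguishable:
    the expected value over the mix currently for sale."""
    return peach_fraction * PEACH_VALUE + (1 - peach_fraction) * LEMON_VALUE

# Start with half peaches, half lemons.
frac = 0.5
price = market_price(frac)   # blended price: 7000

# The blended price is below what peach sellers will accept,
# so they withdraw -- only lemons remain, and the price collapses.
if price < PEACH_RESERVE:
    frac = 0.0
price = market_price(frac)

print(price)  # 4000.0 -- the lemon price
```

The point is that the information asymmetry, not the quality of any individual car, is what drives the good sellers out of the market.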

A few years after the Lemons paper came out, Michael Spence wrote a set of papers in which he asked, why do students pay for education? The answer he came to was that it was a useful signal to employers that the student is the sort of person they’d like to hire.

How does all of this apply to security? This is long enough that it’s in an extended entry…

Signaling, then, is an action someone takes that’s designed to send a message. A good signal is one that’s either hard for everyone to send, or harder for a lemon to send. A signal that’s hard for everyone to send is good because it means that only those with confidence in what they’re selling will invest in the signal. A signal that’s hard for a lemon to send is even better, because not everyone will invest in the same sort of signaling.

(This blog acts as a form of signaling. I invest energy in shooting my mouth off, and if you like what I have to say, you’re likely to keep reading, and maybe offer me work or beer or somesuch. Others choose to signal by getting a PhD, or a CISSP, depending on who they expect to be reading the signals.)

Thinking about Ponemon’s column yesterday: he’s trying to use investment in security as a signal. I believe this is misguided. A signal needs to be easy to read, and hard to fake. A warranty is a pretty good signal in the car market: if the seller is willing to offer a warranty, they have to believe that the warranty will not be too expensive to fulfill. (Or they could be offloading the risk to an insurer.) How am I to evaluate the security investment of a company?

One way is to listen to the signals sent, such as privacy policies, security seals, etc. (Tony Vila, Rachel Greenstadt, and David Molnar presented a paper at the 2nd Econ/Security workshop, “Why We Can’t Be Bothered To Read Privacy Policies: Privacy as a Lemons Market,” that I’m using as a model for my thinking.)

But what are these signals, and how hard is it to cheat? In the case of privacy policies, they found that they’re hard to read, and easy to cheat.

There are some signals that are harder to get, like a Capability Maturity Model (CMM) rating from the SEI, or a Common Criteria certification. The CMM is more focused on code process than on security, but has turned out to be a useful signal in the outsourcing world. Common Criteria has turned out to be both too easy and too expensive (although that’s more rumor than anything anyone will say on the record). And then the market goes and questions it.

4 comments on "Security Signaling"

  • Security Signalling – sucking on the lemon

    Over on Adam’s blog, he asks the question, how do we signal security? Have a read of that if you need to catch up on what is meant by signalling, and what the market for lemons is. It’s a probing…

  • Tegam uses courts to signal bad security

In the ongoing thread of Adam’s question – how do we signal good security – it’s important to also list signals of bad security. CoCo writes that Tegam, a French anti-virus maker, has secured a conviction against a security…

  • Following up “Liability for Bugs”

    (Posted by Adam) Chris just wrote a long article on “Liability for bugs is part of the solution.” It starts “Recently, Howard Schmidt suggested that coders be held personally liable for damage caused by bugs in code they write.”…

  • Following up “Liability for Bugs”

    Chris just wrote a long article on “Liability for bugs is part of the solution.” It starts “Recently, Howard Schmidt suggested that coders be held personally liable for damage caused by bugs in code they write.” Chris talks about…

Comments are closed.