It’s easy to feel sympathy for the many folks impacted by the hacking of South Carolina’s Department of Revenue. With 3.6 million taxpayer social security numbers stolen, those people are the biggest victims, and I’ll come back to them. It’s also easy to feel sympathy for the folks in IT and IT management, all the way up to the Governor. The folks in IT made a call to use Trustwave for PCI monitoring, because Trustwave offered PCI compliance [link to http://www.southcarolinasc.com/2012/11/two-new-defendants-added-to-sc-hacking-class-action-suit/ no longer works]. They also made the call to not use a second monitoring system. That decision may look easy to criticize, but I think it’s understandable. Having two monitoring systems means more than doubling staff workloads in responding. (You have to investigate everything, and then you have to correlate and understand discrepancies.)
At the same time, I think it’s possible to take important lessons from what we do know. Each of these is designed to be a testable claim.
Compliance doesn’t prevent hacking.
In his September letter to Haley, [State Inspector General] Maley concluded that while the systems of cabinet agencies he had finished examining could be tweaked and there was a need for a statewide uniform security policy, the agencies were basically sound and the Revenue Department’s system was the “best” among them. (“Foreign hacker steals 3.6 million Social Security numbers from state Department of Revenue“, Tim Smith, Greenville Online)
I believe the reason compliance doesn’t prevent hacking is that compliance systems are developed without knowledge of what really goes wrong. That is, they lack feedback loops. They lack testability. They lack any mechanism for ensuring that effort has payoff. (My favorite example is password expiration times. Precisely how much more secure are you with a 60-day expiration policy versus a 120-day policy? Is such a policy worth doubling staff effort?)
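To make that expiration question concrete, here’s a back-of-the-envelope sketch. This is my own simplified model, not anything from the SC incident: assume a stolen password stays usable until the next forced change, and that the theft happens at a uniformly random point in the rotation cycle.

```python
# Toy model of password-expiration tradeoffs (simplifying assumptions:
# a stolen password is useful until the next forced rotation, and theft
# occurs at a uniformly random point in the rotation cycle).

def expected_exposure_days(expiry_days):
    """Average days a stolen password remains valid under this model."""
    return expiry_days / 2

def rotations_per_year(expiry_days):
    """Forced password changes per user per year (a rough staff-cost driver)."""
    return 365 / expiry_days

for expiry in (60, 120):
    print(f"{expiry}-day policy: ~{expected_exposure_days(expiry):.0f} days "
          f"average exposure, {rotations_per_year(expiry):.1f} rotations/year")
```

Under those assumptions, halving the expiration window halves the average exposure of a stolen password but doubles the rotation workload, and it does nothing at all for an attacker who uses the credential within days of stealing it. That’s exactly the kind of cost-benefit question a compliance checkbox never forces anyone to answer.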
You don’t know how your compliance program differs from the SC DoR.
I’m willing to bet that 90% of my readers do not know what exactly the SC DoR did to protect their systems. You might know that it was PCI (and Trustwave as a vendor). But do you know all the details? If you don’t know the details, how can you assess if your program is equivalent, better, or worse? If you can’t do that, can you sleep soundly?
But actually, that’s a red herring. Since compliance programs often contain a bunch of wasted effort, knowing how yours lines up to theirs is less relevant than you’d think. Maybe you’re slacking on something they put a lot of time into. Good for you! Or not; maybe that was the thing that would have stopped the attacker, if only they’d done a little more of it. Comparing one to one is a lot less interesting than comparing to a larger data set.
We don’t know what happened in South Carolina.
Michael Hicks, the director of the Maryland Cybersecurity Center at the University of Maryland, said states needed a clearer understanding of the attack in South Carolina.
“The only way states can raise the level of vigilance,” Mr. Hicks said, “is if they really get to the bottom of what really happened in this attack.” (“Hacking of Tax Records Has Put States on Guard“, Robbie Brown, New York Times)
Mr. Hicks gets a New School hammer for nailing that one.
Lastly, I’d like to talk about the first victims: the 3.6 million taxpayers. That’s 77% of the 4.6 million people in the state, which is plausibly the entire taxpaying population. We don’t know how much data was actually leaked. (What we know is a floor. Was it entire tax returns? Was it all the information that banks report? How much of it was shared data from the IRS?) We know that these victims are at long-term risk and have only short-term protection. We know that their SSNs are out there, and I haven’t heard that the Social Security Administration is offering them new ones. There’s a real and under-discussed difference between SSN breaches and credit card breaches. Let’s not even talk about biometric breaches here.
At the end of the day, there are a lot of victims of this breach. And while it’s easy to point fingers at the IT folks responsible, I’m starting to wonder if perhaps we’re all responsible. To the extent that few of us can answer Mr. Hicks’s question, to the extent that we don’t learn from one another’s mistakes, don’t we all make defending our systems harder? We should learn what went wrong, and we should learn that not talking about the root causes helps things go wrong in the future.
Without detracting from the crime that happened in South Carolina, there’s a bigger crime if we don’t learn from it.
Update: We now know a fair amount.
The above was written and accidentally not posted a few weeks ago. I’d like to offer up my thanks to the decision makers in South Carolina for approving Mandiant’s release of a public and technically detailed version of their report, which is short and fascinating. I’d also like to thank the folks at Mandiant for writing in clear, understandable language about what happened. Nicely done, folks!