Shostack + Friends Blog Archive


Eric Drexler blogging

Eric Drexler is blogging at [link no longer works]. Way cool. I look forward to what he has to say.

Unfortunately, one of his early posts falls into the trap of believing that “Computation and Mathematical Proof” [link no longer works] will dramatically improve computer security:

Because proof methods can be applied to digital systems, and in particular, will be able to verify the correctness (with respect to a formal specification) of compilers [pdf], microprocessor designs [pdf] (at the digital-abstraction level), and operating system microkernels (the link points to a very important work in progress). Software tools for computer-assisted proof are becoming more usable and powerful, and they already have important industrial-strength applications [pdf]. In a world which increasingly relies on computers for everything from medical devices to national governance, it will be important to get these foundations right, and to do so in a way that we can trust. If this doesn’t seem important, it may be because we’re so accustomed to living with systems that have been built on foundations made of mud, and thinking about a future likewise based on mud. All of us have difficulty imagining what could be developed in a world where computers didn’t crash, were guaranteed to be immune from virus attack, and could safely download code written by the devil himself, and where crucial pieces of software could be guaranteed to not leak data.

The trouble with this approach is that you demonstrably can’t make a useful computer which is immune from virus attack. The proof: a useful computer is one on which I can install software. The user of the computer will have to make a decision about a piece of software. Con men and fraudsters will continue to convince people to do things which are obviously not in their best interests.

Therefore, however well proven the operating system is, you can’t usefully guarantee computers to be free of viruses, because computers are useful when they are generative and social.

That’s implied by his parenthetical “with respect to a formal specification.”

Similarly, the data may be guaranteed not to leak, but can also be guaranteed to be shown to people. (Otherwise, it’s not useful.) Those people can and will leak it. (Ross Anderson’s work on medical systems demonstrates this with a higher level of formality.)

This is not to say that formal methods won’t provide useful results on which we can build. They have, and will continue to in those areas where the problems don’t involve humans, our decisions, or our societies. But human beings are not rational result maximizers who adhere to computer security policies, and all the math in the world won’t change that.

6 comments on "Eric Drexler blogging"

  • Dean Loomis says:

    Actually, Drexler is even more wrong than that. Virus immunity doesn’t depend on peoples’ gullibility; it’s mathematically undecidable. As Fred Cohen first showed many years ago, any system that implements a universal Turing machine (e.g., VBA in Microsoft Office) can support malicious code whose existence is impossible to predict and prohibit in advance.
    Sorry, the arms race between white hats and black hats is an eternal, immutable fact of nature.
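Dean’s point follows the same diagonalization pattern as the halting problem. A toy Python sketch (the function names and the “detector” are invented purely for illustration, not Cohen’s actual construction) shows why any claimed perfect virus detector contradicts itself:

```python
# Illustrative diagonal argument: given ANY claimed virus detector,
# construct a program that does the opposite of what the detector
# predicts about it. The detector is therefore wrong on that program.

def make_diagonal(detects_virus):
    def program():
        if detects_virus(program):
            return "benign"   # labeled a virus -> behaves harmlessly
        return "spread"       # labeled clean -> spreads
    return program

# A toy detector that declares every program clean:
def optimistic_detector(p):
    return False

p = make_diagonal(optimistic_detector)
assert optimistic_detector(p) is False   # detector says "clean"...
assert p() == "spread"                   # ...but the program spreads
```

The same construction defeats a detector that flags everything: its diagonal program then behaves benignly, so the detector is wrong in the other direction.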

  • I do not share your white/black point of view. I can agree that formal proof of correctness is not a panacea and that it will not provide a secure and usable system at the same time. The Orange Book is a practical example, isn’t it? The decision about the whole system will still be on the user, including security decisions. However, there is a big difference between a user who has no cue for the decision and one who has clear visual cues. For example, a system that clearly warns the user that he is probably doing something wrong will substantially add to the overall security. And a formal proof of this warning system may still be helpful.
    And yes, a practical system needs to be able to install software. However, all the software need not come from the same source or run at the same security level. The way current operating systems are implemented is far from ideal. I would not take that as an example of good (security) design. Yes, malicious code will still exist, but the goal of security is to make an attack more costly than the potential gain. And in the current situation a virus attack costs almost nothing and the potential gain is enormous. We are on the highway to hell right now.
    Any tool in security is just improving the situation. There is no panacea, no 100% security. I would not entirely dismiss formal methods. They have their place.

  • Adam says:

    Horses may have wings. But having never seen a horse with wings, I don’t spend a lot of time worrying about them. Similarly, I’ve never seen a useful formal proof of usability (actually, I don’t think I’ve ever seen one at all). I have seen experimentation show interesting things about systems for which usability was claimed.
    Also, I did not entirely dismiss formal methods. I wrote “This is not to say that formal methods won’t provide useful results on which we can build.” But they won’t provide dramatic improvements.

  • nick says:

    “You demonstrably can’t make a useful computer which is immune from virus attack. The proof: a useful computer is one on which I can install software.”
    Well, no. On the vast majority of useful computers users can’t install new software. Of course, marketing departments and consumers mostly do not call these “computers”, but instead cell phones, cameras, MP3 players, auto dashboards, etc. Not coincidentally, these work more reliably than computers on which users commonly install new software and are largely free of malware.
    The vast majority of viruses spread via a very small number of vectors — email attachments, web/ActiveX, autorun, and boot sectors being prominent examples. Eliminating these vectors in no way resembles the halting problem and is computationally a perfectly simple thing to do. These vectors were purposefully implemented by certain companies (ahem) because they made a value judgement that the added functionality was more important than the reduced reliability and security — or because historically they were far more focused on the former than the latter. But as I already indicated, there are many more computers, with far greater sales, running on devices where a different judgment call was made.
    There are a number of steps that could practically eliminate malware. Chief among these is to separate data and executables, make executables write-once (and then only if properly signed by reputable code auditors) and then read-only, and run executables only in sandboxes with a very simple set of capabilities (as in capability-based security). The problem is not Turing machines generally but the Von Neumann architecture in particular and with giving in to the demands of particular customers for convenience without sufficiently taking into account the problems of reliability and security that such specific functionality can create.
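A minimal Python sketch of the capability idea nick describes (the `Sandbox` class and capability names are hypothetical, invented here for illustration): sandboxed code can invoke only the operations it was explicitly granted, with no ambient authority over the filesystem or network.

```python
# Capability-style sandboxing sketch: a sandbox holds an explicit
# mapping of granted capability names to callables. Code running in
# the sandbox can reach nothing that was not granted.

class Sandbox:
    def __init__(self, granted):
        # granted: mapping of capability name -> callable
        self._granted = dict(granted)

    def invoke(self, name, *args):
        if name not in self._granted:
            raise PermissionError(f"capability not granted: {name}")
        return self._granted[name](*args)

# Grant an untrusted plugin exactly one capability: appending to a log.
log = []
plugin_env = Sandbox({"append_log": log.append})

plugin_env.invoke("append_log", "hello")
assert log == ["hello"]

# Anything not granted is simply unreachable.
try:
    plugin_env.invoke("open_file", "/etc/passwd")
    raise AssertionError("should have been denied")
except PermissionError:
    pass
```

The design choice is that authority is passed in, never looked up: the plugin cannot name a file or socket it was not handed, which is the property nick argues Windows and Unix lack.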

  • Adam says:

    At least my cell phone and camera have had additional software added despite the manufacturer’s advice. I know lots of people who adjust their car’s software.
    That you can list a small number of vectors by which viruses spread today does not mean that you can eliminate them all. There have been viruses on every substantial — and many insubstantial — platform. Or are you proposing to eliminate the floppy disk and the USB key?

  • nick says:

    The number of people who install s/w on their cell phones and cameras is minuscule compared to those who install s/w on their PCs. If/when it ever becomes common practice, and is done in the unprotected Unix/Windows style, we will see viruses galore on cell phones and cameras.
    I’m not proposing to eliminate USB keys. Quite the contrary, I’m proposing to make flash drives secure, something Windows and Unix can’t do. I’m proposing to eliminate autorun.inf files, boot sectors, and other common executables that can be written to by anyone other than the authorized distributor of the original code (or a very limited set of agents authorized by same), and to run all other executables in extremely restrictive sandboxes. Nothing complicated or “halting” about this at all. It just happens to be radically different from how the Unix world and Microsoft currently do business.

Comments are closed.