
Quantum Progress

[Image: quantum-computers.jpg]

What is it about the word “quantum” that sucks the brains out of otherwise reasonable people? There has to be some sort of Heisenberg-Schrödinger Credulity Principle that makes all the ideons in their brains go spin-up at the same time, and I’m quite sure that the Many Worlds Interpretation of it has the most merit. (In case you’re a QM n00b, the ideon is the quantum unit of belief.) Fortunately, there seems to be some sanity coming to reporting about quantum computing.

Just about every quantum computing article has a part in it that notes that there are quantum algorithms to break public crypto. The articles breathlessly explain that this means that SSL will be broken and the entire financial world will be in ruins, followed by the collapse of civilization as we know it. Otherwise sensible people focus on this because there’s very little to sink your teeth into in quantum computing otherwise. Even certified experts know that they don’t know what they don’t know.

Scott Aaronson has a good article in Scientific American called “The Limits of Quantum Computers” (only the preview is free, sorry) that gives a good description of what quantum computers can’t do. I’m pleased to see this. SciAm has been a HSCP-induced quantum cheerleader over the last few years.

I have been doing some research on the claims of quantum computing. I decided to pick the specific factoring ability of quantum computers, and produce some actual numbers about how we might expect quantum computing to develop. In other words, I’m going to be a party pooper.

The crypto-obviating algorithms in question are Shor’s algorithm for factoring and an algorithm he developed for discrete logs. I was surprised to learn that Shor’s algorithm requires 72k³ quantum gates to be able to factor a number k bits long. Cubed is a somewhat high power. So I decided to look at a 4096-bit RSA key, which is the largest that most current software supports; the crypto experts all say that if you want something stronger, you should shift to elliptic curve, and the US government is pushing this, too, with their “Suite B” algorithms.

To factor a 4096-bit number, you need 72 × 4096³, or 4,947,802,324,992 quantum gates. Let’s just round that up to an even 5 trillion. Five trillion is a big number. We’re only now getting to the point that we can put about that many normal bits on a disk drive. The first thing this tells me is that we aren’t going to wake up one day and find out that someone’s put that many q-gates on something you can buy at Fry’s or from a white-box Taiwanese special.
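
If you want to check that arithmetic yourself, here it is as a couple of lines of Python, using the 72k³ estimate above (treat the constant as a rough figure, nothing more):

```python
# Rough gate-count estimate for running Shor's algorithm on a k-bit modulus,
# using the 72*k^3 figure quoted above.
def shor_gate_estimate(k_bits):
    return 72 * k_bits ** 3

print(f"{shor_gate_estimate(4096):,}")   # -> 4,947,802,324,992 (call it 5 trillion)
```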

A complication in my calculations is the relationship between quantum gates and quantum bits. For small numbers of qubits, you get about 200 qugates per qubit. But qubits are rum beasts. There are several major technologies that people are trying to tease qubits out of. There are the adiabatic technologies that D-Wave is trying. There are photon dots, and who knows how many semiconductor-based methods.

It isn’t clear that any of these have any legs. Read Scott Aaronson’s harrumphing at D-Wave, his more pointed yet sympathetic faint praise, and these educated doubts on photonics. Interestingly, Aaronson says that adiabatic quantum computers like D-Wave need k¹¹ gates rather than k³ gates, which pretty much knocks them out of viability altogether, if that’s so.

But let’s just assume that they all work as advertised, today. My next observation is that we’re probably looking at billions of qubits to be able to get trillions of q-gates. My questions to people who know about the relationship between quantum gates and quantum bits yielded that the real experts don’t have a good answer, but that the 200:1 ratio is more likely to go down than up. Intel’s two-billion-transistor “Tukwila” chip comes out this year. Five trillion is a big number. We are as likely to need 25 billion qubits to factor that number as any other good guess. Wow.
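
Here’s that napkin math in Python, taking the soft 200-gates-per-qubit ratio above at face value:

```python
# Qubits implied by roughly 5 trillion gates at a (very soft) 200 gates per qubit.
GATES_PER_QUBIT = 200                    # rough ratio discussed above
total_gates = 72 * 4096 ** 3             # the ~5 trillion figure from earlier

print(f"{total_gates // GATES_PER_QUBIT:,} qubits")   # -> 24,739,011,624 (~25 billion)
```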

The factoring that has been done on today’s quantum computers is of a four-bit number, 15. If you pay attention to quantum computing articles, you’ll note they always factor 15. There’s a reason for this. It’s of the form (2ⁿ-1) × (2ⁿ+1). In binary, 2ⁿ-1 is a string of all 1 bits. A number that is 2ⁿ+1 is a 1 bit, followed by a string of 0s, and then a 1 again. These numbers are of a special form that is easy to factor, and in the real world they’re not going to occur in a public key.
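
You can see the special structure in a couple of lines of Python:

```python
# 15 = (2**2 - 1) * (2**2 + 1): one factor is all 1s in binary, the other is
# a 1, a run of 0s, and a final 1.
n = 2
p, q = 2 ** n - 1, 2 ** n + 1
print(p * q, bin(p), bin(q))   # -> 15 0b11 0b101
```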

This is not a criticism, it’s an observation. You have to walk before you can run, and you have to factor special forms before you can factor the general case. Having observed that, we’ll just ignore it and assume we can factor any four-bit number today.

Let’s presume that quantum computers advance in some exponential curve that resembles Moore’s Law. That is to say that there is going to be a doubling of quantum gates periodically, and we’ll call that period a “generation.” Moore’s specific observation about transistors had a generation every eighteen months.

The difference between factoring four bits and factoring 4096 bits is 30 generations. In other words, 72 × 4³ × 2³⁰ = 72 × 4096³. If we look at a generation of eighteen months, then quantum computers will be able to factor a 4096-bit number in 45 years, or on the Ides of March, 2053.
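
The generation counting, as a quick Python check under the same assumptions (the 72k³ scaling and an eighteen-month generation):

```python
import math

# Doublings needed to go from factoring a 4-bit number to a 4096-bit one,
# assuming gate count scales as 72*k^3 as above.
generations = math.log2((72 * 4096 ** 3) / (72 * 4 ** 3))
print(generations)           # -> 30.0
print(generations * 1.5)     # -> 45.0 years at eighteen months per generation
```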

This means to me that my copy of PGP is still going to be safe to use for a while yet. Maybe I oughta get rid of the key I’ve been using for the last few years, but I knew that. I’m not stupid, merely lazy.

I went over to a site that will tell you how long a key you need to use, http://www.keylength.com/. Keylength.com uses estimates made by serious cryptographers for the life of keys. They make some reasonable assumptions and perhaps one slightly-unreasonable assumption: that Moore’s Law will continue indefinitely. If we check there for how long a 4096-bit key will be good for, the conservative estimate is (drum roll, please) — the year 2060.

I’m still struck by how close those dates are. It suggests to me that if quantum computers continue at a rate that semiconductors do, they’ll do little more than continue the pace of technological advancement we’ve seen for the past handful of decades. That’s no mean feat — in 2053, I doubt we’re going to see Intel trumpeting its 45 picometer process (which is what we should see after 30 generations).

I spoke to one of my cryptographer friends and outlined this argument to him. He said that he thinks that the pace of advancement will pick up and be faster than a generation every eighteen months. Sure. I understand that, myself. The pace of advancement in storage has been a generation every year, and in flash memory it’s closer to every nine months. It’s perfectly conceivable that quantum computing will see horrible progress for the next decade and then whoosh off with a generation every six months. That would compress my 45 years into 25, which is a huge improvement but still no reason to go begging ECRYPT for more conferences.
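
For the record, the compressed timeline is just this:

```python
# The "whoosh" scenario: ten slow years, then a generation every six months
# for the remaining thirty doublings.
print(10 + 30 * 0.5)   # -> 25.0 years instead of 45
```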

On the other hand, it’s just as conceivable that quantum computing will end up on the Island of Misfit Technologies [link to http://www.animeusaexpress.com/rudisofmisto.html no longer works], along with flying cars, personal jetpacks, Moon colonies, artificial intelligence [link to http://ap.google.com/article/ALeqM5jf_4NpPzdxsxVlB34jl7ka4CXbYwD8VCP1TO1 no longer works], and identity management.

But I also talked to a bigwig in Quantum Information Theory (that’s quantum computing and more) and gave him a sketch of my argument. I heard him speak about Quantum Information and he gave the usual Oooooo Scary Quantum Computers Are Going to Factor Numbers Which Will Cause The Collapse of All Financial Markets And Then We Will All DIEEEEE — So That’s Why We Need More Research Money boosterism.

He wouldn’t let me attribute anything to him, which I understand completely. We live in a world in which partisanship is necessary, and if he were seen putting down the pompoms, he’d be fired. Telling middle-aged technocrats that the math says their grandkids are going to see quantum computers shortly before they retire will cause the research money to dry up, and if that happens then, well, the world won’t end. And then where would we be?

Nonetheless, he said to me sotto voce, “There’s nothing wrong with your math.”

Photo is a detail from “Shop Full” by garryw16.

Comments on "Quantum Progress"

  • Not a Quantum Bigwig says:

    Mordaxus,
    If memory serves, the number of qubits needed to run Shor’s algorithm on a k-bit number is only O(k), owing to the fact that there are efficient reversible circuits for modular exponentiation. Each “gate” in the quantum circuit should be thought of as roughly equivalent to a bit operation in a deterministic algorithm. So the statement “Shor’s algorithm requires 72k³ quantum gates” is equivalent to saying it takes TIME 72k³ to run. This, by the way, is almost exactly the number of steps it takes to do an RSA decryption with a k-bit key; no mistake, since the main step in the algorithm is to perform a k-bit modular exponentiation.
    So using “your math,” we are only, say, 14 generations from being able to factor 4096-bit keys.
    Now, it is still very unclear that we will ever solve the engineering problems of building a scalable quantum computer, and it is absolutely clear that the current leader, NMR, will not scale to factor even 64-bit numbers; so you’re absolutely right that for the near future it is unwise to worry extensively about quantum computers.
    On the other hand, *research conferences* are about thinking longer term, and I think it is absolutely appropriate for researchers to be thinking about ways to design cryptosystems that resist the known quantum attacks. Even if the quantum engineering problem is never solved, there is no guarantee that someone will not find a classical algorithm for factoring; and many of the newer designs could potentially be faster than RSA anyways.
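
(A rough sketch of the scaling this comment describes: if the qubit count, which grows only linearly with the key size k, is what has to double each generation, the number of doublings is far smaller than the gate-count estimate suggests. The constants below, two qubits per key bit and a seven-qubit starting machine, are illustrative assumptions, not figures from the comment.)

```python
import math

# If qubits scale as O(k) rather than gates as O(k^3), the doubling count shrinks.
# Illustrative assumptions: ~2 qubits per key bit, starting from a 7-qubit machine.
target_qubits = 2 * 4096
start_qubits = 7
print(math.ceil(math.log2(target_qubits / start_qubits)))   # -> 11 doublings, in the ballpark of "say, 14"
```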

  • François Greu says:

    A nitpick: should be the Heisenberg-Schrödinger Credulity Principle.
    For the rest I agree with Bruce. Of course my opinion is of little value: I know the spelling of the name of a Quantum Physics pioneer, but could not tell a qubit from a boson if faced with these guys.
    Francois Grieu

  • Mark says:

    So the fundamental rationale is that because it’s not going to happen anytime in the immediate future, it’s not something we should worry about?
    Isn’t that the same mindset that they had in the 1960’s that ended up being such a hassle in the 90’s for Y2K?

  • qubit says:

    A draft of Aaronson’s paper can be found at http://www.scottaaronson.com/writings/

  • Scott L. Burson says:

    Hey! AI is not a “misfit technology”. It’s all over the industry. You just don’t see it because it’s under the surface. Granted, most of the big dreams of 30-40 years ago are still dreams, but many technologies developed by AI researchers over the years are now mainstream, like the clustering algorithms that give us recommendations on shopping sites.
    I think AI won’t really flower until we have affordable quantum computers. But even in its present state, which I would argue is still embryonic, it’s not in the same category as personal jetpacks.

  • Chris says:

    >”I’m still struck by how close those dates are.”

    Well, that isn’t so close. It’s four (almost five) 18-month generations, which is a 16× (almost 32×) difference.
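
(Chris’s arithmetic, spelled out:)

```python
# Seven calendar years at eighteen months per generation is between four and
# five doublings, i.e. a 16x to 32x difference in capacity.
print(round((2060 - 2053) / 1.5, 2), 2 ** 4, 2 ** 5)   # -> 4.67 16 32
```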

  • Alsee says:

    I see one potential problem in the reasoning. It assumes Moore’s Law style growth. OK, reasonable. However, we have not yet *begun* on an actual curve; we are still seeking a technology path to apply. It is possible that once we do figure out a “how” for the practical application of quantum computing, the initial starting point may be extremely high. If we figure out a basic “how” of quantum computing that can operate on a silicon-microchip-type platform, it is possible that mechanism could immediately be moved onto a billion-gate microchip. So we’re not really on a QM Moore’s Law track yet. We might START on that track at the billion-gate level. And if that happens, a very big if, then you have just jumped something like 25 generations ahead of your above calculations, and 4096-bit public key crypto could fall in essentially the time span it takes industry to physically switch manufacturing over to the new design.

  • Enginerd says:

    The pace of progress for semiconductor-based computers has slowed considerably in the past few years. Yeah, flash is doing great, but the fundamental lithographic techniques have been running up against walls. Part of the reason to try to invent a quantum computer is just to keep it going. Feature sizes are 45 nm now. I’m not sure what the fundamental limit would be size-wise, assuming you could even etch it, but it likely wouldn’t be smaller than a few angstroms. We’ll say 5. That means we only have 7 generations max, or 15 years of semiconductor-based computers. And the speed of progress is going to slow considerably, seeing as the only way to get angstrom resolution now is with electron beams.
    So it’s not really about beating standard computers, just keeping up the trend.
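
(A quick check of the feature-size arithmetic in that comment, assuming one halving of feature size per generation:)

```python
import math

# Halvings of feature size from today's 45 nm down to an assumed 5 angstrom
# (0.5 nm) floor, i.e. roughly how many generations of shrink are left.
print(round(math.log2(45.0 / 0.5), 1))   # -> 6.5, call it seven generations
```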

  • Dan says:

    Beale Ciphers approach the strength of One Time Pad without the hassle of picking “truly random” numbers. Just take your plaintext, cipher it against 3 or more different websites that you’re sure won’t change and you’re good to go.
    The key is to not just use “add” or “xor”, but to use a known set of reversible transformations which don’t necessarily result in a datum of identical size; and to do this 3 times or more, not just twice.
    Doing so makes any statistical analysis attack virtually impossible. Send changes to the transformations set as part of a message to keep a session.
    Regards,
    Dan

  • Michael says:

    Interesting article. k-cubed or k-11 aside, and taking note of comments above that the timescale for a practical quantum computer might be anywhere between a decade and a century, or longer, I have a comment.
    From my conversations with friends who do this for a living, I am deeply skeptical about the possibility that practical quantum computers will ever exist. My objection is just that they are essentially analogue computers. They aim to map a problem to the evolution of a physical system. It may be that research in this area makes the most sensitive detectors of anything ever made; that’s realistic. And it may be that we will learn a thing or two about quantum mechanics; that’s likely too. But the idea that they will make working devices seems just… unlikely. I would also say the odds of SOMETHING interesting (who knows what) arising from this research are pretty high.
    As an aside, I find it curious how the interests of the idiosyncratically diverse personalities of the scientists who work on this problem align with the rather mainstream interests of the bankers and generals who tell the government it is worth funding.
    Thanks for the article and the discussion.
    Michael

  • Fred P says:

    Main article-
    I’m unclear why one couldn’t use fewer quantum gates in exchange for more time, although this would only give a few generations in your calculations.
    Dan,
    Are you trying to sell a product, or have you just bought into someone else’s sales spiel?
    Problem 1) “websites that you’re sure won’t change” – I’m curious as to how one would do this, unless one was in possession of the websites. In any case, this would do little, since you’d still need to transmit and manage information about the web sites; at best, this is just a means of data compression (for the key) with both a lot of leakage of information and poor key selection. At worst, you’re just setting yourself up to have your communications cracked at virtually no effort for the attacker.
    Problem 2) “The key is to not just use “add” or “xor”, but to use a known set of reversible transformations which don’t necessarily result in a datum of identical size; and to do this 3 times or more, not just twice.” – 3 times likely helps some. Using “reversible transformations which don’t necessarily result in a datum of identical size” doesn’t. Suggesting that it does violates Shannon’s Maxim: “the enemy knows the system”.
    If one uses the approach that you claim Beale Ciphers uses, you’d better hope that none of your attackers are competent.

  • Dave Bacon says:

    Gauntlet picked up and returned 🙂

  • Mordaxus says:

    (Echoed from Dave Bacon’s link.)
    I freely admit that I’m a mathematician, not a physicist, and that there is much about quantum computation that I do not understand.
    Nonetheless, here’s the gauntlet: when do you think it is reasonable that there will be a quantum computer that can factor a 4096-bit key in quasi-reasonable time? (Oh, let’s say a year of running time.)
    Under the assumption that Moore’s Law continues with semiconductors indefinitely, 2060 is a good guess.
    Under my lick-my-finger-and-stick-it-in-the-wind calculations for quantum computers, assuming similar ramps and dead reckoning, it’s 2053.
    I further observe that those two guesses are interestingly close.
    What’s your guess? What’s your reasoning?
    If it makes you feel any better, I predict we get a usable quantum computer before we get a HAL-9000-level AI, or flying cars, or jet packs, or moon bases.
    M

  • Michael Bacon says:

    Mordaxus,
    It’s always good to be skeptical, so that much about your post is appreciated. However, your response was inadequate to Dave Bacon’s (no relation) post that went through a number of the facts and some of the reasoning upon which you base your conclusions.
    I think if your post is to be taken seriously as more than mere skepticism or even as justified criticism of press hype (that is, as a meaningful critique of quantum computing and the research surrounding it), then you need to do a better job of responding in a clear and reasoned way to what seemed like pretty clear points raised by Dave.
    If the facts and reasoning upon which you base your conclusions are really off the mark, perhaps the conclusions themselves deserve to be questioned.

  • Mark Aldington says:

    I’m not a mathematician nor a physicist, nor a proponent of the “usual Oooooo Scary Quantum Computers Are Going to Factor Numbers Which Will Cause The Collapse of All Financial Markets And Then We Will All DIEEEEE” brigade either, though I concede as a consultant I will be assumed to be nearer the latter than the former. Just a couple of points:
    1. It may be a bit imprudent to assume that Moore’s Law will have any relevance to the development of quantum computing, and arguably, IF (and it is a very big IF) the very real problems get overcome and scaling is resolved, then it might not take many generations at all to get to 72k cubed.
    2. As we have all recently noticed, the banking system is totally reliant on confidence in the banking system. The merest sniff of quantum computers getting into the hands of the usual suspects, and the media jumping on the story of the end of secure e-commerce as we know it, could make today’s market woes seem like a picnic.
    So while your skepticism is reasonable, the cost of your being right (continued research and conferences that demonstrate the unviability of quantum computing) is somewhat overshadowed by the cost if your assumptions turn out to be wrong.

  • Michael Bacon says:

    Sorry about the multiple postings — it didn’t seem like it was posting so I pushed it a few times — once again, very sorry about that 🙁

  • Richard Cant says:

    First an observation about Moore’s law.
    Moore’s Law is a law of economics. It determines the way in which a technology will progress given that there are no insuperable technical factors to halt progress and there is an unlimited market for the products. In that case what Moore’s law models is essentially the investment cycle. For semiconductors the investment cycle for a plant is roughly 3 years and the desirable progress for a new plant is a factor 2 linear scaling (4x area).
    Given the unlimited potential market for the product each generation easily funds the next and so the investment cycle governs everything. Without funding progress stops dead. Hit a technical barrier and the same happens.
    A similar law operated in the aircraft industry: for about 50 years, aircraft speed doubled roughly every 10 years. If you read books written in the late ’50s, you will find that everyone was totally convinced that this would continue forever!
    Then they hit the heat barrier and it all stopped.
    There are good reasons to believe that conventional computing will also hit a heat barrier within the next 10 years. (Have you put your laptop on your lap recently?)
    There are also good reasons to suppose that quantum computing will never have the unlimited market of conventional computing, and hence the funding will never be there to make it obey Moore’s Law.

  • sup says:

    “Interestingly, Aaronson says that adiabatic quantum computers like D-Wave need k¹¹ gates rather than k³ gates, which pretty much knocks them out of viability altogether, if that’s so.”
    Where does Scott Aaronson say this?
    “The factoring that has been done on today’s quantum computers is of a four-bit number, 15. If you pay attention to quantum computing articles, you’ll note they always factor 15. There’s a reason for this. It’s of the form (2ⁿ-1) × (2ⁿ+1). In binary, 2ⁿ-1 is a string of all 1 bits. A number that is 2ⁿ+1 is a 1 bit, followed by a string of 0s, and then a 1 again. These numbers are of a special form that is easy to factor, and in the real world they’re not going to occur in a public key.”
    Factoring 15 takes 4^n time on an NMR “quantum computer”, because the current NMR QCs aren’t quantum computers at all and don’t deliver an exponential speedup; they’re playing a kind of quantum game: “what happens if I dump stuff onto the quantum world”…
    A quantum computer is an analog computer with at most 0.99999 precision, which is a limitation for any analog computer. So a quantum computer will break down after about 100,000 qubits or steps, or it will need error correction that somehow deals with this number (one error per hundred thousand, per qubit). I can’t imagine how it is possible to correct analog errors, but maybe there is something I don’t understand yet. So according to my analog understanding of quantum computers, they can’t factor even a 100-bit number, no matter how many qubits they have, even if they work like probabilistic computers like the current NMR QCs. It would take the age of the universe…
