[https://scottlocklin.wordpress.com/2019/01/15/quantum-computing-as-a-field-is-obvious-bullshit/]
Quantum computing as a field is obvious bullshit
Posted in non-standard computer architectures, physics by Scott Locklin on
January 15, 2019
I remember spotting the quantum computing trend when I was a larval physics
nerdling. I figured maybe I could get in on the chuckwagon if my dissertation
project didn’t work out in a big way (it didn’t). I managed to get myself
invited to a Gordon conference, and have giant leather bound notebooks filled
with theoretical scribblings containing material for 2-3 papers. I
wasn’t real confident in my results, and I couldn’t figure out a way to turn
them into something practical involving matter, so I happily matriculated to
better things in the world of business.
When I say Quantum Computing is a bullshit field, I don’t mean everything in
the field is bullshit, though to first order, this appears to be approximately
true. I don’t have a mathematical proof that Quantum Computing isn’t at least
theoretically possible. I also do not have a mathematical proof that we can
make the artificial bacteria of K. Eric Drexler’s nanotech fantasies. Yet, I
know both fields are bullshit. Both fields involve forming new kinds of matter
that we haven’t the slightest idea how to construct. Neither field has a sane
‘first step’ to make their large claims true.
Drexler and the “nanotechnologists” who followed him assume that because we
know about the Schroedinger equation, we can make artificial forms of life out
of arbitrary forms of matter. This is nonsense; nobody understands enough about
matter in detail or life in particular to do this. There are also reasonable
thermodynamic, chemical and physical arguments against this sort of thing. I
have opined on this at length, and at this point, I am so obviously correct on
the nanotech front, there is nobody left to argue with me. A generation of
people who probably would have made first rate chemists or materials scientists
wasted their early, creative careers following this over hyped and completely
worthless woo. Billions of dollars squandered down a rat hole of rubbish and
wishful thinking. Legal wankers wrote legal reviews of regulatory regimes to
protect us from this nonexistent technology. We even had congressional hearings
on this nonsense topic back in 2003 and again in 2005 (and probably some other
times I forgot about). Russians built a nanotech park to cash in on the
nanopocalyptic trillion dollar nanotech economy which was supposed to happen by
now.
Similarly, “quantum computing” enthusiasts expect you to overlook the fact that
they haven’t a clue as to how to build and manipulate quantum coherent forms of
matter necessary to achieve quantum computation. A quantum computer capable of
truly factoring the number 21 is missing in action. In fact, the factoring of
the number 15 into 3 and 5 is a bit of a parlour trick, as they design the
experiment while knowing the answer, thus leaving out the gates required if we
didn’t know how to factor 15. The actual number of gates needed to factor an
n-bit number is 72 * n^3; so for 15, a 4-bit number, that’s 4608 gates; not
happening any time soon.
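Taking the 72 * n^3 figure at face value, the growth in gate count is easy to tabulate; a quick sketch (the function name is mine, the formula is the one quoted above):

```python
# Gate-count estimate quoted above for factoring an n-bit number: 72 * n^3.
def shor_gate_estimate(n_bits: int) -> int:
    return 72 * n_bits ** 3

# 15 is a 4-bit number; RSA moduli run 1024-4096 bits.
for n in (4, 1024, 2048, 4096):
    print(n, shor_gate_estimate(n))
# 4 bits -> 4608 gates; 4096 bits -> 4,947,802,324,992 gates
```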
It’s been almost 25 years since Peter Shor had his big idea, and we are no
closer to factoring large numbers than we were … 15 years ago when we were also
able to kinda sorta vaguely factor the number 15 using NMR ‘quantum computers.’
I had this conversation talking with a pal at … a nice restaurant near one of
America’s great centers of learning. Our waiter was amazed and shared with us
the fact that he had done a Ph.D. thesis on the subject of quantum computing.
My pal was convinced by this that my skepticism is justified; in fact he
accused me of arranging this. I didn’t, but am motivated to write to prevent
future Ivy League Ph.D. level talent having to make a living by bringing a
couple of finance nerds their steaks.
In 2010, I laid out an argument against quantum computing as a field based on
the fact that no observable progress has taken place. That argument still
stands. No observable progress has taken place. However, 8 years is a very long
time. Ph.D. dissertations have been achieved, and many of these people have
gone on to careers … some of which involve bringing people like me delicious
steaks. Hundreds of quantum computing charlatans achieved tenure in that period
of time. According to Google Scholar, a half million papers have been written on
the subject since then.
There are now three major .com firms funding quantum computing efforts: IBM,
Google and Microsoft. There is at least one YC/Andreessen backed startup I know
of. Of course there is also D-Wave, which has somehow managed to exist since
1999, almost 20 years, without actually delivering something usefully quantum or
computing. How many millions have been flushed down the toilet by these turds?
How many millions which could have been used building, say, ordinary analog or
stochastic computers which do useful things? None of these have delivered a
useful quantum computer which has even one usefully error corrected qubit. I
suppose I shed not too many tears for the money spent on these efforts; in my
ideal world, several companies on that list would be broken up or forced to
fund Bell Labs moonshot efforts anyway, and most venture capitalists are frauds
who deserve to be parted with their money. I do feel sad for the number of
young people taken in by this quackery. You’re better off reading ancient Greek
than studying a ‘technical’ subject that eventually involves bringing a public
school kid like me a steak. Hell, you are better off training to become an
exorcist or a feng shui practitioner than getting a Ph.D. in ‘quantum
computing.’
I am an empiricist and a phenomenologist. I consider the lack of one error
corrected qubit in the history of the human race to be adequate evidence that
this is not a serious enough field to justify using the word ‘field.’ Most of
it is, frankly, a scam. Plenty of time to collect tenure and accolades before
people realize this isn’t normative science or much of anything reasonable.
As I said last year:
All you need do is look at history: people had working (digital) computers
before Von Neumann and other theorists ever noticed them. We literally have
thousands of “engineers” and “scientists” writing software and doing
“research” on a machine that nobody knows how to build. People dedicate
their careers to a subject which doesn’t exist in the corporeal world.
There isn’t a word for this type of intellectual flatulence other than the
overloaded term “fraud,” but there should be.
“Computer scientists” have gotten involved in this chuckwagon. They have added
approximately nothing to our knowledge of the subject, and as far as I can
tell, their educational backgrounds preclude them ever doing so. “Computer
scientists” haven’t had proper didactics in learning quantum mechanics, and
virtually none of them have ever done anything as practical as fiddled with an
op-amp, built an AM radio or noticed how noise works in the corporeal world.
Such towering sperg-lords actually think that the only problems with quantum
computing are engineering problems. When I read things like this, I can hear
them muttering mere engineering problems. Let’s say, for the sake of argument
this were true. The SR-71 was technically a mere engineering problem after the
Bernoulli effect was explicated in 1738. Would it be reasonable to have a
hundred or a thousand people writing flight plans for the SR-71 as a
profession in 1760? No.
A reasonable thing for a 1760s scientist to do is invent materials making a
heavier than air craft possible. Maybe fool around with kites and steam
engines. And even then … there needed to be several important breakthroughs in
metallurgy (titanium wasn’t discovered until 1791), mining, a functioning
petrochemical industry, formalized and practical thermodynamics, a unified
field theory of electromagnetism, chemistry, optics, manufacturing and arguably
quantum mechanics, information theory, operations research and a whole bunch of
other stuff which was unimaginable in the 1760s. In fact, of course the SR-71
itself was completely unimaginable back then. That’s the point.
it’s just engineering!
Physicists used to be serious and bloody minded people who understood reality
by doing experiments. Somehow this sort of bloody minded seriousness has faded
out into a tower of wanking theorists who only occasionally have anything to do
with actual matter. I trace the disease to the rise of the “meritocracy” out of
cow colleges in the 1960s. The post WW-2 neoliberal idea was that geniuses like
Einstein could be mass produced out of peasants using agricultural schools. The
reality is, the peasants are still peasants, and the total number of Einsteins
in the world, or even merely serious thinkers about physics is probably
something like a fixed number. It’s really easy, though, to create a bunch of
crackpot narcissists who have the egos of Einstein without the exceptional work
output. All you need to do there is teach them how to do some impressive
looking mathematical Cargo Cult science, and keep their “results” away from any
practical men doing experiments.
The manufacture of a large caste of such boobs has made any real progress in
physics impossible without killing off a few generations of them. The vast,
looming, important questions of physics; the kinds that a once in a lifetime
physicist might answer -those haven’t budged since the early 60s. John Horgan
wrote a book observing that science (physics in particular) has pretty much
ended any observable forward progress since the time of cow collitches. He also
noticed that instead of making progress down fruitful lanes or improving
detailed knowledge of important areas, most develop enthusiasms for the latest
non-experimental wank fest; complexity theory, network theory, noodle
theory. He thinks it’s because it’s too difficult to make further progress. I
think it’s because the craft is now overrun with corrupt welfare queens who are
play-acting cargo cultists.
Physicists worthy of the name are freebooters; Vikings of the Mind,
intellectual adventurers who torture nature into giving up its secrets and risk
their reputation in the real world. Modern physicists are … careerist ding
dongs who grub out a meagre living sucking on the government teat, working
their social networks, giving their friends reach arounds and doing PR to make
themselves look like they’re working on something important. It is terrible and
sad what happened to the king of sciences. While there are honest and
productive physicists, the mainstream of it is lost, possibly forever to a
caste of grifters and apple polishing dingbats.
But when a subject claims to be a technology, yet lacks even the rudiments of
experiment which may one day make it into a technology, you can know with
absolute certainty that this ‘technology’ is total nonsense. Quantum
computing is less physical than the engineering of interstellar spacecraft; we
at least have plausible physical mechanisms to achieve interstellar space
flight.
We’re reaching peak quantum computing hyperbole. According to a dimwit at the
Atlantic, quantum computing will end free will. According to another one at
Forbes, “the quantum computing apocalypse is immanent.” Rachel Gutman and
Schlomo Dolev know about as much about quantum computing as I do about 12th
century Talmudic studies, which is to say, absolutely nothing. They, however,
think they know smart people who tell them that this is important: they’ve
achieved the perfect human informational centipede. This is unquestionably the
right time to go short.
Even the National Academy of Sciences has taken note that there might be a
problem here. They put together 13 actual quantum computing experts who poured
cold water on all the hype. They wrote a 200 page review article on the topic,
pointing out that even with the most optimistic projections, RSA is safe for
another couple of decades, and that there are huge gaps in our knowledge of how
to build anything usefully quantum computing. And of course, they also pointed
out if QC doesn’t start solving some problems which are interesting to …
somebody, the funding is very likely to dry up. Ha, ha; yes, I’ll have some
pepper on that steak.
There are several reasonable arguments against any quantum computing of the
interesting kind (aka can demonstrate supremacy on a useful problem) ever
having a physical embodiment.
One of the better arguments is akin to that against P=NP. No, not the argument
that “if there was such a proof someone would have come up with it by now” -but
that one is also in full effect. In principle, classical analog computers can
solve NP-hard problems in P time. You can google around on the “downhill
principle” or look at the work on Analog super-Turing architectures by people
like Hava Siegelmann. It’s old stuff, and most sane people realize this isn’t
really physical, because matter isn’t infinitely continuous. If you can encode
a real/continuous number into the physical world somehow, P=NP using a
protractor or soap-bubble. For whatever reasons, most complexity theorists
understand this, and know that protractor P=NP isn’t physical. Somehow quantum
computing gets a pass, I guess because they’ve never attempted to measure
anything in the physical world beyond the complexity of using a protractor.
In order to build a quantum computer, you need to control each qubit, which is
a continuous value, not a binary value, in its initial state and subsequent
states precisely enough to run the calculation backwards. When people do their
calculations ‘proving’ the efficiency of quantum computers, this is treated as
an engineering detail. There are strong assertions by numerous people that
quantum error correction (which, I will remind everyone, hasn’t been usefully
implemented in actual matter by anyone -that’s the only kind of proof that
matters here) basically pushes the analog requirement for perfection to the
initialization step, or subsumes it in some other place where it can’t exist.
Let’s assume for the moment that this isn’t the case.
Putting this a different way, for an N-qubit computer, you need to control,
transform, and read out 2^N complex (as in complex numbers) amplitudes of
the N-qubit state to a very high degree of precision. Even considering
an analog computer with N oscillators which must be precisely initialized,
precisely controlled, transformed and individually read out, to the point where
you could reverse the computation by running the oscillators through the
computation backwards; this is an extremely challenging task. The quantum
version is exponentially more difficult.
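To get a feel for what 2^N amplitudes means, here is a back-of-the-envelope sketch (my own numbers and function name, not the author's) of the memory needed just to store an N-qubit state on a classical machine, at 16 bytes per complex amplitude:

```python
# Memory to store the full state vector of an N-qubit register,
# assuming one complex double (16 bytes) per amplitude.
def state_vector_bytes(n_qubits: int) -> int:
    return 16 * 2 ** n_qubits

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))
# 10 qubits -> 16 KiB; 30 qubits -> 16 GiB; 50 qubits -> 16 PiB
```

And that is merely storing the state, never mind controlling, transforming and reading it out to high precision.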
Making it even more concrete; if we encode the polarization state of a photon
as a qubit, how do we perfectly align the polarizers between two qubits? How do
we align them for N qubits? How do we align the polarization direction with the
gates? This isn’t some theoretical gobbledeygook; when it comes time to build
something in physical reality, physical alignments matter, a lot. Ask me how I
know. You can go amuse yourself and try to build a simple quantum computer with
a couple of hard coded gates using beamsplitters and polarization states of
photons. It’s known to be perfectly possible and even has a rather sad wikipedia
page. I can make quantum polarization-state entangled photons all day; any fool
with a laser and a KDP crystal can do this, yet somehow nobody bothers sticking
some beamsplitters on a breadboard and making a quantum computer. How come?
Well, one guy recently did it: got two whole qubits. You can go read about this
*cough* promising new idea here, or if you are someone who doesn’t understand
matter here.
FWIW, in the early days of this idea, it was noticed that the growth in the number
of components needed was exponential in the number of qubits. Well, this
shouldn’t be a surprise: the growth in the number of states in a quantum
computer is also exponential in the number of qubits. That’s both the
‘interesting thing’ and ‘the problem.’ The ‘interesting thing’ because an
exponential number of states, if possible to trivially manipulate, allows for a
large speedup in calculations. ‘The problem’ because manipulating an
exponential number of states is not something anyone really knows how to do.
The problem doesn’t go away if you use spins of electrons or nuclei; which
direction is spin up? Will all the physical spins be perfectly aligned in the
“up” direction? Will the measurement devices agree on spin-up? Do all the gates
agree on spin-up? In the world of matter, of course they won’t; you will have a
projection. That projection is in effect, correlated noise, and correlated
noise destroys quantum computation in an irrecoverable way. Even the quantum
error correction people understand this, though for some reason people don’t
worry about it too much. If they are honest in their lack of worry, this is
because they’ve never fooled around with things like beamsplitters. Hey, making
it have uncorrelated noise; that’s just an engineering problem right? Sort of
like making artificial life out of silicon, controlled nuclear fusion power or
Bussard ramjets is “just an engineering problem.”
Of course at some point someone will mention quantum error correction which
allows us to not have to precisely measure and transform everything. The most
optimistic estimate of the required precision is something like 10^-4 per
qubit/gate operation for quantum error corrected computers. This is a fairly
high degree of precision. Going back to my polarization angle example; this
implies all the polarizers, optical elements and gates in a complex system are
aligned to 0.036 degrees. I mean, I know how to align a couple of beamsplitters
and polarizers to 628 microradians, but I’m not sure I can align a few hundred
thousand of them AND pockels cells and mirrors to 628 microradians of each
other. Now imagine something with a realistic number of qubits for factoring
large numbers; maybe 10,000 qubits, and a CPU’s worth of gates, say 10^10 or so
(an underestimate of the number needed for cracking RSA, which, mind
you, is the only reason we’re having this conversation). I suppose it is
possible, but I encourage any budding quantum wank^H^H^H algorithmist out
there to have a go at aligning 3-4 optical elements to within this precision.
There is no time limit, unless you die first, in which case “time’s up!”
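As a sanity check on the angular arithmetic above (variable names are mine; one part in 10^4 of a full 360-degree turn):

```python
import math

# One part in ten thousand of a full turn, as in the polarizer-alignment example.
precision = 1e-4
angle_deg = precision * 360.0               # 0.036 degrees
angle_urad = math.radians(angle_deg) * 1e6  # ~628 microradians

print(angle_deg, round(angle_urad, 1))
# 0.036 degrees is ~628.3 microradians
```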
This is just the most obvious engineering limitation for making sure we don’t
have obviously correlated noise propagating through our quantum computer. We
must also be able to prepare the initial states to within this sort of
precision. Then we need to be able to measure the final states to within this
sort of precision. And we have to be able to do arbitrary unitary
transformations on all the qubits.
Just to interrupt you with some basic facts: the number of states we’re talking
about here for a 4000 qubit computer is ~ 2^4000 states! That’s 10^1200 or so
continuous variables we have to manipulate to at least one part in ten
thousand. The number of protons in the universe is about 10^80. This is why a
quantum computer is so powerful; you’re theoretically encoding an exponential
number of states into the thing. Can anyone actually do this using a physical
object? Citations needed; as far as I can tell, nothing like this has ever been
done in the history of the human race. Again, interstellar space flight seems
like a more achievable goal. Even Drexler’s nanotech fantasies have some
precedent in the form of actually existing life forms. Yet none of these are
coming any time soon either.
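The 2^4000 figure can be checked in a couple of lines (the round numbers are mine):

```python
import math

# Number of basis states for a 4000-qubit register, expressed as a power of ten.
digits = 4000 * math.log10(2)   # ~1204, i.e. 2^4000 ~ 10^1204
protons = 80                    # ~10^80 protons in the observable universe

print(round(digits))            # ~1204
print(round(digits) - protons)  # orders of magnitude beyond the proton count
```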
There are reasons to believe that quantum error correction, too, isn’t even
theoretically possible (examples here and here and here -this one is
particularly damning). In addition to the argument above that the theorists are
subsuming some actual continuous number into what is inherently a noisy and
non-continuous machine made out of matter, the existence of a quantum error
corrected system would mean you can make arbitrarily precise quantum
measurements; effectively giving you back your exponentially precise continuous
number. If you can do exponentially precise continuous numbers in a non
exponential number of calculations or measurements, you can probably solve very
interesting problems on a relatively simple analog computer. Let’s say, a
classical one like a Toffoli gate billiard ball computer. Get to work; we know
how to make a billiard ball computer work with crabs. This isn’t an example
chosen at random. This is the kind of argument allegedly serious people submit
for quantum computation involving matter. Hey man, not using crabs is just an
engineering problem muh Church Turing warble murble.
Smurfs will come back to me with the press releases of Google and IBM touting
their latest 20 bit stacks of whatever. I am not impressed, and I don’t even
consider most of these to be quantum computing in the sense that people worry
about quantum supremacy and new quantum-proof public key or Zero Knowledge
Proof algorithms (which more or less already exist). These cod quantum
computing machines are not expanding our knowledge of anything, nor are they
building towards anything for a bold new quantum supreme future; they’re not
scalable, and many of them are not obviously doing anything quantum or
computing.
This entire subject does nothing but eat up lives and waste careers. If I were
in charge of science funding, the entire world budget for this nonsense would
be below what we allocate for the development of Bussard ramjets, which are
also not known to be impossible, and are a lot more cool looking.
As Dyakonov put it in his 2012 paper:
“A somewhat similar story can be traced back to the 13th century when
Nasreddin Hodja made a proposal to teach his donkey to read and obtained a
10-year grant from the local Sultan. For his first report he put
breadcrumbs between the pages of a big book, and demonstrated the donkey
turning the pages with his hoofs. This was a promising first step in the
right direction. Nasreddin was a wise but simple man, so when asked by
friends how he hopes to accomplish his goal, he answered: “My dear fellows,
before ten years are up, either I will die or the Sultan will die. Or else,
the donkey will die.”
Had he the modern degree of sophistication, he could say, first, that there
is no theorem forbidding donkeys to read. And, since this does not
contradict any known fundamental principles, the failure to achieve this
goal would reveal new laws of Nature. So, it is a win-win strategy: either
the donkey learns to read, or new laws will be discovered.”
Further reading on the topic:
Dyakonov’s recent IEEE popsci article on the subject (his papers are the best
review articles of why all this is silly):
https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing
IEEE precis on the NAS report:
https://spectrum.ieee.org/tech-talk/computing/hardware/the-us-national-academies-reports-on-the-prospects-for-quantum-computing
(summary: not good)
Amusing blog from 11 years ago noting the utter lack of progress in this
subject:
http://emergentchaos.com/archives/2008/03/quantum-progress.html
“To factor a 4096-bit number, you need 72*4096^3 or 4,947,802,324,992 quantum
gates. Lets just round that up to an even 5 trillion. Five trillion is a big
number. ”
Aaronson’s articles of faith (I personally found them literal laffin’ out loud
funny, though I am sure he is in perfect earnest):
https://www.scottaaronson.com/blog/?p=124