Article: "Massive Disruption Is Coming With Quantum Computing"

Always great when Peter Diamandis gives his view on a subject.

http://singularityhub.com/2016/10/10/massive-disruption-quantum-computing/?utm_content=bufferfdbad&utm_medium=social&utm_source=twitter-hub&utm_campaign=buffer

4 Likes

According to the article/vid, a quantum computer is:

  • NOT a replacement for a classical computer. No good for surfing the web, watching vids, etc.
  • Only good for specific computations, where they can arrange the atoms to form algorithms and then it (supposedly) solves them quickly. NOTE: not yet proven; the vid says it is theory.
  • The way it was explained is the way one would explain an analogue computer, where each step is slower but each step is equivalent to a multitude of classical computing steps.

It really sounds like they are designing the quantum components as an analogue computer look-alike, except with far greater (digital-level) accuracy.

If one listened carefully to the vid, the researcher dropped an “if” in there. He was saying that to get multiple bits, the atoms (electrons) have to be entangled, and for a 300-bit quantum computer to work it would require all 300 to be entangled. I would expect the significance of this quietly stated “if” to have been missed by most, but having studied quantum mechanics, I can say it is a HUGE “if”. Getting just 2 particles entangled has taken almost 4 decades of work, with only occasional success. With 300, the physical forces working against it make the task feel like trying to split the atom, and we are like scientists in the 1700s attempting it: we may know there is a lot of energy in there, but we are at a loss as to how to get at it. So it is with trying to entangle even a few atoms (electrons) at the same time, for long enough to do something useful for us.
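To make that “all 300 entangled” requirement concrete, here is a minimal NumPy sketch (my own illustration, not from the vid) that builds an n-qubit GHZ state, the “all qubits entangled together” state, as an explicit vector of 2^n amplitudes. The exponential size is the point: at n = 300 the vector would need roughly 2×10^90 entries, more than the number of atoms in the observable universe.

```python
import numpy as np

def ghz_state(n: int) -> np.ndarray:
    """Return the n-qubit GHZ state (|00...0> + |11...1>) / sqrt(2)
    as a dense state vector of length 2**n."""
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1 / np.sqrt(2)    # amplitude of |00...0>
    state[-1] = 1 / np.sqrt(2)   # amplitude of |11...1>
    return state

# Feasible on a laptop only for small n; the vector doubles with each qubit.
for n in (2, 10, 20):
    print(f"{n} qubits -> state vector of {len(ghz_state(n)):,} amplitudes")

# A 300-qubit state vector is hopeless to store classically:
print(f"300 qubits -> {float(2**300):.3e} amplitudes")
```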

I am unsure how the author of that article can claim that next year will bring anything more than a 2-qubit quantum computer that computes slower than a 1960s microprocessor (hint: in the 1960s, microprocessors didn’t exist).

Peter Diamandis is on the board of a lot of big tech companies, SpaceX among them. He knows the Google folks personally. A few weeks ago we had this news:

Several scientists familiar with Google’s progress, including Devitt, suggest that a functioning 50-qubit quantum chip, enough to overpower conventional supercomputers at a certain kind of calculation, could be ready by as soon as the end of 2017.

This article was written by MIT Technology Review. So what I get out of it is that Google and several others are close to a working quantum computer. That’s the news. Here’s more:

Silicon quantum computers take shape in Australia

These computers are expected to work extremely well for certain types of computation, like the traveling salesman problem. But machine learning is part of the game as well.
So in a few years we might see cloud services with quantum computation where people attack ML problems millions of times faster than on conventional computers. And the good news is that the power of these systems doubles every time you add an extra qubit. So with 32 qubits you might be able to do the same as 4.3 billion transistors on a CPU (for those specific types of calculation only), but with 33 qubits you might do the same as 8.6 billion transistors on a common CPU. Now imagine taking it up to 50 or 64 qubits. I think that’s where the magic is.
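That doubling claim is easy to sanity-check with a quick sketch (my own back-of-the-envelope; it only counts the 2^n basis states of an n-qubit register and says nothing about which computations actually speed up):

```python
# State space of an n-qubit register: 2**n basis states.
# Each extra qubit doubles the count, which is where the
# "one more qubit = twice the power" intuition comes from.
for n in (32, 33, 50, 64):
    print(f"{n} qubits -> 2**{n} = {2**n:,} basis states")

# 32 qubits -> 4,294,967,296          (~4.3 billion, the figure above)
# 33 qubits -> 8,589,934,592          (~8.6 billion)
# 50 qubits -> 1,125,899,906,842,624  (~1.1e15)
# 64 qubits -> 18,446,744,073,709,551,616  (~1.8e19)
```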

1 Like

D-Wave just announced a 2048-qubit chip, I thought, and Google, on their Quantum AI blog, said they expect shortly to disprove the strong variant of the Church-Turing thesis. Now, the D-Wave chip is not general, but it is getting better all the time.
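For context on “not general”: an annealer in the D-Wave style doesn’t run arbitrary programs; it searches for low-energy spin configurations of an Ising/QUBO problem. Here is a tiny brute-force sketch of that problem class (my own illustration, nothing to do with D-Wave’s actual API; the couplings are made up):

```python
import itertools

# An Ising problem: spins s_i in {-1, +1}, couplings J, local fields h.
# The machine's only job is to find spins minimizing the energy
#   E(s) = sum over (i,j) of J[i,j]*s_i*s_j  +  sum over i of h[i]*s_i
# We brute-force a 4-spin toy instance; an annealer does this
# (approximately) in hardware for thousands of spins.
J = {(0, 1): 1.0, (1, 2): -1.0, (2, 3): 1.0}   # made-up couplings
h = [0.5, 0.0, 0.0, -0.5]                      # made-up fields

def energy(spins):
    e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e + sum(hi * si for hi, si in zip(h, spins))

best = min(itertools.product((-1, 1), repeat=4), key=energy)
print("ground state:", best, "energy:", energy(best))
```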

1 Like

Australian researchers have created qubits that stay in stable superposition much longer than previous records. The new qubits had a dephasing time of T2* = 2.4 milliseconds, 10 times better than standard qubits.
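A rough way to see why a longer T2* matters: coherence time divided by gate time bounds how many operations fit before the qubit dephases. A quick sketch (my own; the 1-microsecond gate time is an assumed placeholder, not a figure from the article):

```python
t2_new = 2.4e-3        # dephasing time from the article: T2* = 2.4 ms
t2_old = t2_new / 10   # "10 times better than standard qubits"
gate_time = 1e-6       # ASSUMED gate duration (1 us), illustration only

# Crude operations budget: how many gates fit inside the coherence window?
print(f"standard qubits: ~{t2_old / gate_time:,.0f} gates within T2*")
print(f"new qubits:      ~{t2_new / gate_time:,.0f} gates within T2*")
# standard qubits: ~240 gates
# new qubits:      ~2,400 gates
```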

Construction of practical quantum computers radically simplified

Scientists at the University of Sussex have invented a ground-breaking new method that puts the construction of large-scale quantum computers within reach of current technology.
Quantum computers could solve certain problems in just a few milliseconds that would take the fastest supercomputer millions of years to calculate.
They have the potential to create new materials and medicines, as well as solve long-standing scientific and financial problems.
Universal quantum computers can be built in principle, but the technology challenges are tremendous. The engineering required to build one has been considered more difficult than manned space travel to Mars – until now.
Quantum computing on a small scale using trapped ions (charged atoms) is carried out by aligning individual laser beams onto individual ions, with each ion forming a quantum bit.
However, a large-scale quantum computer would need billions of quantum bits, therefore requiring billions of precisely aligned lasers, one for each ion.
Instead, scientists at Sussex have invented a simple method where voltages are applied to a quantum computer microchip (without having to align laser beams) – to the same effect.
Professor Winfried Hensinger and his team also succeeded in demonstrating the core building block of this new method with an impressively low error rate at their quantum computing facility at Sussex.
Professor Hensinger said: "This development is a game changer for quantum computing making it accessible for industrial and government use. We will construct a large-scale quantum computer at Sussex making full use of this exciting new technology."
Quantum computers may revolutionise society in a similar way to the emergence of classical computers. Dr Seb Weidt, part of the Ion Quantum Technology Group, said: “Developing this step-changing new technology has been a great adventure, and it is absolutely amazing observing it actually work in the laboratory.”

This could be quite big. Here’s another article about the same subject:

1 Like

The reporting in those articles is frustrating; there is propagandistic stuff in it, like “50 years away”, or the notion that large scale must mean “billions of qubits”, or the seeming denial of the annealing approach (even if it is not general). But great stuff anyway. It will be interesting to see what the first quantum computers can do to help refine the tech, with optimization as a strength. Check out this article with the same group reporting: https://www.google.com/amp/s/amp.ibtimes.co.uk/quantum-computing-breakthrough-israeli-scientists-invent-cannon-entangled-photon-clusters-1583798?client=ms-android-metropcs-us

Now it’s an optical quantum computer, but quantum computing may be where optical computers finally shine. That paper describes a system that seems bus-less and bridge-less, and therefore hop-less, or latency free.

With some mechanism to set it up, say using the existing global fiber network, would it be possible to get a set of entangled, coherent, stable photons distributed across the globe, with redundancy (cloning/quantum teleportation), where possibly the same set of photons served as both the storage (we store and retrieve in an electron, so why not a photon, if photon qubits are viable?) and the qubit compute units? Then run something like a quantum-modified SAFE protocol in the software, so that even if someone did get legitimate physical access to this network, it would be SAFE, minus the hops issues underneath, and possibly with no limit on additional nodes.

It seemed like there was a provision for adding additional photons. Could that be a point of centralized weakness, such that the network must be closed to additional cluster nodes to be secured? Maybe bridges then for additions, bringing latency and hops back again? It seems like stable spooky action at a distance, cutting out hops, would give SAFE a no-compromise basis.
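Since the redundancy idea above leans on quantum teleportation, here is a minimal NumPy sketch of the standard teleportation protocol on a 3-qubit state vector (my own illustration, not anything from the photon-cluster paper): Alice moves an arbitrary qubit state to Bob using one shared entangled pair plus two classical bits.

```python
import numpy as np

# Single-qubit gates.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply(gate, qubit, state, n=3):
    """Apply a single-qubit gate to `qubit` (0 = most significant)."""
    full = np.array([[1.0 + 0j]])
    for q in range(n):
        full = np.kron(full, gate if q == qubit else I)
    return full @ state

def cnot(control, target, state, n=3):
    """CNOT: flip the target bit of each basis index whose control bit is 1."""
    out = np.zeros_like(state)
    for idx, amp in enumerate(state):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        out[sum(b << (n - 1 - q) for q, b in enumerate(bits))] += amp
    return out

# Qubit 0: the state to teleport, a|0> + b|1>.
a, b = 0.6, 0.8j
psi = np.zeros(8, dtype=complex)
psi[0b000], psi[0b100] = a, b

# Qubits 1 and 2: Bell pair shared by Alice (q1) and Bob (q2).
psi = cnot(1, 2, apply(H, 1, psi))

# Alice entangles her data qubit with her half of the pair.
psi = apply(H, 0, cnot(0, 1, psi))

# Alice measures qubits 0 and 1 (simulated by sampling a basis outcome,
# then projecting onto the measured values of q0 and q1).
rng = np.random.default_rng(0)
idx = rng.choice(8, p=np.abs(psi) ** 2)
m0, m1 = (idx >> 2) & 1, (idx >> 1) & 1
mask = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1
                 for i in range(8)])
psi = np.where(mask, psi, 0)
psi /= np.linalg.norm(psi)

# Bob applies the corrections dictated by Alice's two classical bits.
if m1:
    psi = apply(X, 2, psi)
if m0:
    psi = apply(Z, 2, psi)

# Bob's qubit (q2) now carries (a, b).
base = m0 * 4 + m1 * 2
print("teleported amplitudes:", psi[base:base + 2])  # ~ [0.6, 0.8j]
```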