Nasa buys into 'quantum' computer
A $15m computer that uses "quantum physics" effects to boost its speed is to be installed at a Nasa facility.
It will be shared by Google, Nasa, and other scientists, providing access to a machine said to be up to 3,600 times faster than conventional computers.
Unlike standard machines, the D-Wave Two processor appears to make use of an effect called quantum tunnelling.
This allows it to reach solutions to certain types of mathematical problems in fractions of a second.
http://mobile.bbc.co.uk/news/science-environment-22554494
napoleon_in_rags
(3,991 posts)
A classic example of one of these "combinatorial optimisation" problems is that of the travelling sales rep, who needs to visit several cities in one day, and wants to know the shortest path that connects them all together in order to minimise their mileage.
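For the curious, the brute-force version of that problem is easy to sketch - try every ordering of the cities and keep the cheapest. A minimal sketch; the distance matrix below is made up for illustration:

```python
from itertools import permutations

def shortest_tour(dist):
    """Return (best_length, best_order) over all orderings.

    dist is a symmetric matrix of pairwise distances; the tour
    starts at city 0, visits every other city once, and returns home.
    """
    n = len(dist)
    best = (float("inf"), None)
    for order in permutations(range(1, n)):  # (n-1)! possible orderings
        path = (0,) + order + (0,)
        length = sum(dist[a][b] for a, b in zip(path, path[1:]))
        best = min(best, (length, path))
    return best

# hypothetical mileage between four cities
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(shortest_tour(dist))  # -> (18, (0, 1, 3, 2, 0))
```

The point of the article is that this exhaustive loop is exactly what a conventional machine has to do, one itinerary at a time.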
Aren't all those NP hard problems mappable to each other? I mean I thought if you could do one, you could pretty much do them all.
If so, wow! They really got it going on. The code breaking implications are profound also.
gurthang
(13 posts)
NP-complete problems are reducible to one another in polynomial time. NP-hard is a strict superset of P and NP.
http://en.wikipedia.org/wiki/NP-hard
Interestingly, if P != NP, then it is believed that quantum computers still can't solve NP-complete problems in polynomial time.
https://en.wikipedia.org/wiki/Quantum_computer
and
https://complexityzoo.uwaterloo.ca/Complexity_Zoo
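One way to see what "verifiable in polynomial time" buys you: checking a claimed answer is cheap even when finding one is expensive. A minimal sketch for the decision version of the travelling-sales-rep problem; the tours and distances are hypothetical:

```python
def verify_tour(dist, tour, bound):
    """Check in polynomial time that `tour` visits every city exactly
    once, returns to its start, and has total length <= bound."""
    n = len(dist)
    # must be a round trip covering all n cities
    if len(tour) != n + 1 or tour[0] != tour[-1]:
        return False
    if sorted(tour[:-1]) != list(range(n)):
        return False
    return sum(dist[a][b] for a, b in zip(tour, tour[1:])) <= bound

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(verify_tour(dist, (0, 1, 3, 2, 0), 18))  # True: a valid short tour
print(verify_tour(dist, (0, 2, 1, 3, 0), 18))  # False: too long
```

Finding a tour under the bound may take exponential time, but the check above is a single linear pass - that asymmetry is the essence of NP.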
napoleon_in_rags
(3,991 posts)
Okay, so regarding NP-completeness... (Be patient, I didn't pay as much attention in school as I wish I had.)
NP-Complete Problem
A problem which is both NP (verifiable in nondeterministic polynomial time) and NP-hard (any NP-problem can be translated into this problem). Examples of NP-hard problems include the Hamiltonian cycle and traveling salesman problems.
And then from the BBC article in the OP:
A classic example of one of these "combinatorial optimisation" problems is that of the travelling sales rep, who needs to visit several cities in one day, and wants to know the shortest path that connects them all together in order to minimise their mileage.
The D-Wave Two chip can compare all the possible itineraries at once, rather than having to work through each in turn.
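For scale, "working through each in turn" means checking (n-1)! itineraries for n cities (counting each direction of travel separately), and that count explodes quickly:

```python
import math

# number of distinct round trips from a fixed home city
for n in (5, 10, 15, 20):
    print(n, math.factorial(n - 1))
```

By 20 cities you are already past 10^17 itineraries, which is why "comparing them all at once" would be such a striking claim.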
It didn't say solved in polynomial time, but it sounds implied. And then you say:
Interestingly, if P != NP, then it is believed that quantum computers still can't solve NP-complete problems in polynomial time.
Are you implying that these results could mean P == NP?
gurthang
(13 posts)
The language of complexity theory is often confusing. The Wikipedia article states that any problem in NP can be reduced to an NP-hard problem in polynomial time, but not that all NP-hard problems can be mapped to each other.
The Wikipedia article has a diagram that shows the relationships nicely. It also shows that I was wrong to claim that NP-hard was a strict superset of P. NP-hard was actually not discussed in my class and I had not understood that it was not one of the nested sets of complexity.
http://en.wikipedia.org/wiki/NP-hard
To your last point, no, a new kind of computer doesn't have any bearing on whether P = NP; that is a purely mathematical question. But when they speak of trying all possibilities at once, they are effectively claiming that quantum computers are equivalent to non-deterministic Turing machines. At least according to the Ph.D. dissertation of the keeper of the Complexity Zoo, this is not the case.
WARNING: the remainder of this post is conjecture at best
My -- very feeble -- understanding of this claim is that quantum computers can only give a certain probability of giving the correct answer within a certain period of time. I think that this has something to do with the tendency for entangled particles to spontaneously become disentangled, but I really don't know enough about it to say much either way.
If it really is a matter of trying all possibilities at once, then the potential advantage of a quantum computer would be that the machine size would grow linearly with the size of the input. However, the probability of disentanglement may increase with the size of the machine, yielding interesting limits to their computing power.
napoleon_in_rags
(3,991 posts)
The thumbnail, back-of-napkin understanding I have - which has served me well in application - is the big difference between a thing solvable in polynomial time or less, and a thing solvable only in exponential time. The visualization I use is that a problem solvable in 2^N steps (exponential time) has to basically split into two separate processes each time N is incremented, so it maps pretty well onto a non-deterministic Turing machine (which can split itself into as many copies of itself as it needs). That seems to jibe with the definition of NP-complete, but I still don't understand the group of things outside of NP but within NP-hard. Most of the important problems I've looked at have solutions in exponential or factorial time; I don't know what's in that group outside of NP at all.
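That splitting picture can be made concrete with a brute-force search that forks on one yes/no choice per element - here a toy subset-sum search, with made-up numbers:

```python
from itertools import product

def subset_sum(nums, target):
    """Brute force: try all 2^N include/exclude choices, the way a
    non-deterministic machine would fork into two copies per element."""
    for bits in product((0, 1), repeat=len(nums)):  # 2^N branches
        if sum(x for x, b in zip(nums, bits) if b) == target:
            return bits  # 1 = include nums[i], 0 = exclude
    return None

print(subset_sum([3, 5, 7], 12))  # -> (0, 1, 1), since 5 + 7 = 12
```

Each new element doubles the number of branches the loop has to walk, which is exactly the 2^N growth described above.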
Anyway though, from a simple skimming, the facts imply the Complexity Zoo guy's dissertation has some merit. Check this out:
http://en.wikipedia.org/wiki/EQP_%28complexity%29
EQP would not need to exist if it were the same as P. Quantum polynomial time gets its own class, so it must be different in some fundamental way from P. But how, I wonder?