thenewgreen remember our very first call with rob05c? This is like an easier-to-understand version!
Eh, in general, but he gets a few things wrong.

"Change the salesman to a page request and the houses to servers and you've got Internet packet routing."

Nope. Conceptually, what makes Travelling Salesman hard is that you have to return to the starting city. With internet routing, you have to "visit" servers to find the shortest path, but you don't have to "return": once you've found the destination, you're done. Dijkstra's algorithm is a perfect solution, and runs in a very small polynomial time (specifically O(|E| + |V| log |V|), where |E| is the number of connections between servers and |V| is the number of servers).

Even if he were right, which he's not, you still couldn't "make stock trades faster." The internet uses Dijkstra's algorithm at a low level which users have no control over. Even if you figured out a better algorithm, you couldn't make the internet run it; you'd have to build your own internet.

Actually, there are a number of special-purpose polynomial-time factoring algorithms. If you have a quantum computer, there's even a good general-purpose one. There's no general-purpose one for regular computers yet; but nobody really cared about factoring until 1970, so we've only been working on the problem for about forty years. There's no good reason to believe prime factorisation isn't in P. Give it time.

No, it wouldn't. There are numerous asymmetric algorithms besides factoring. When factoring is broken, we'll just switch to a different one.
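To make the routing point concrete, here's a minimal sketch of Dijkstra's algorithm in Python over a made-up five-server network. The graph, node names, and latencies are all invented for illustration; a binary heap gives O(|E| log |V|), while the O(|E| + |V| log |V|) bound quoted above technically needs a Fibonacci heap.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source to every reachable node.

    graph: dict mapping node -> list of (neighbor, link_cost) pairs.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, already found a shorter path
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

# Hypothetical network: edges are (neighbor, latency).
network = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}
print(dijkstra(network, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

Note there's no "return to start" step anywhere, which is exactly why this stays cheap while Travelling Salesman doesn't.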
If you could figure out a better way to route data on the Internet, you could make stock trades faster than everyone else and make billions of dollars.
There’s no good way to write a computer program to find the factors of large numbers quickly.
If finding those factors were easy, then the Internet would fall over.
By 'perfect' I meant it solves the problem exactly, as opposed to a heuristic that gives an approximate answer. You're right, it isn't optimal; I mentioned it because it's more accessible.

"Whether or not it is is the million-dollar (literally) question."

Sort of. If you prove P=NP, you'll have proven factorisation is in P. But factoring isn't known to be NP-Complete, so proving factoring is in P doesn't prove P=NP, nor would proving P≠NP prove factoring is outside P.
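A quick sketch of why factoring sits in NP at all: finding a factor by brute force takes time exponential in the *bit length* of the number, but verifying a claimed factorisation is a single multiplication. The numbers and helper names below are invented for illustration.

```python
import math

def factor_trial_division(n):
    """Find a nontrivial factor by brute force.

    Worst case ~sqrt(n) divisions, i.e. exponential in the bit
    length of n. This is the "hard to find" direction.
    """
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return None  # n is prime

def verify(n, p, q):
    """Checking a claimed factorisation is one multiplication.

    This cheap verification is what puts factoring in NP.
    """
    return 1 < p < n and 1 < q < n and p * q == n

n = 2021  # = 43 * 47
p = factor_trial_division(n)
print(p, n // p, verify(n, p, n // p))  # 43 47 True
```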
There's a lot of confusion and misinformation about the P=NP problem. I think most of it stems from the fact that people misunderstand what P and NP even are. They also seem to think that all these diverse NP and NP-Complete problems would suddenly become trivial if we could prove P=NP. They are somehow sucked into the fantasy that constant factors don't count in the real world just because they can be ignored mathematically, or that just because something is in NP, it would automatically have a solution fast enough that we could write a program to get an answer in reasonable time.
"There's a lot of confusion and misinformation about the P=NP problem."

And the article didn't really help. But when dealing with the cryptography question, it isn't really about whether, the day after someone proves P = NP, all asymmetric crypto suddenly breaks. Most of it will probably still be usable. The problem is that polynomial running times grow much more slowly than, say, exponential ones, even with gigantic constant factors everywhere. So the difference in work factor between a 768-bit key and a 1024-bit key, which today is astronomical (something in the 10^30 range), would be cut down to something far smaller. At some point, no matter what, there will always be a key size where the polynomial-time algorithm starts being faster than the exponential-time algorithm. And if there's one thing cryptographers don't like, it's a cryptosystem whose security stops holding up as the key size gets very large, because who says that in a hundred years we won't be using 10-billion-bit keys?
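That crossover point can be sketched numerically. Assuming two invented cost models, an exponential-time attack costing 2^(bits/2) operations versus a polynomial-time attack saddled with a deliberately absurd 10^30 constant factor, the polynomial attack still wins past a few hundred bits:

```python
def exp_cost(bits):
    # Hypothetical exponential-time attack: 2**(bits/2) operations.
    return 2 ** (bits // 2)

def poly_cost(bits):
    # Hypothetical polynomial-time attack with a gigantic constant.
    return 10**30 * bits**6

for bits in (128, 256, 512, 1024):
    cheaper = "poly" if poly_cost(bits) < exp_cost(bits) else "exp"
    print(bits, cheaper)
# 128 exp
# 256 exp
# 512 poly
# 1024 poly
```

Even a 10^30 constant only delays the inevitable: once the key is big enough, the polynomial attack is the cheaper one, and it only gets relatively cheaper from there.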
"the day after someone proves P = NP, all asymmetric crypto suddenly breaks"

Maybe. But what if someone discovers a factoring algorithm whose polynomial exponent is Graham's Number? There are polynomial exponents large enough that computing even relatively small n would take half the electrons in the universe until the heat death of the universe. We just have to find one.
Obviously an algorithm like that is just as ineffective as an exponential algorithm, but it's still important to know whether you can turn an exponential-time algorithm into a polynomial-time one, because it means that in ten years someone might find a polynomial-time algorithm whose exponent is within the realm of breakability. In the meantime, if someone finds a faster exponential-time algorithm, you just double the key size.
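Both halves of that argument check out with toy numbers. Assuming a hypothetical attack that improves from 2^(bits/2) to 2^(bits/4), doubling the key size restores the original work factor exactly, while a polynomial algorithm with an absurd exponent stays useless at any plausible key size:

```python
def old_attack(bits):
    # Hypothetical original exponential-time attack.
    return 2 ** (bits // 2)

def new_attack(bits):
    # Hypothetical "faster" exponential-time attack.
    return 2 ** (bits // 4)

# Doubling the key size exactly cancels the attacker's improvement:
print(new_attack(2048) == old_attack(1024))  # True

# But a polynomial attack with an enormous exponent is no threat:
# even at a mere 4096 bits, n**1000 dwarfs the ~10**80 atoms in the
# observable universe. (Graham's Number is unimaginably larger still.)
print(4096**1000 > 10**80)  # True
```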
Ha. I figured there would be some issues. High level, easy to understand stuff typically leads to things not being quite right. Thanks for chiming in.
I also commented on NP=P and the Number Field Sieve algorithm recently involving RSA here: But I want to actually clarify a couple of things I said. I was in fact slightly incorrect about the complexity of the NFS algorithm itself. https://en.wikipedia.org/wiki/General_number_field_sieve The complexity in big-O notation is the top equation on that page, exp(((64/9)^(1/3) + o(1)) (ln n)^(1/3) (ln ln n)^(2/3)); that equation describes the running time, not the algorithm itself. I'll point out this quote, because it's not obvious where that expression sits on the complexity spectrum:

"The running time of the number field sieve is super-polynomial but sub-exponential in the size of the input."

So it's not polynomial, but it's also not quite exponential. Solving P=NP actually COULD speed up the NFS algorithm. Now the question is, how much?

Even if factoring turns out to be solvable in polynomial time, I don't think it would matter much. Polynomial-time algorithms can still be expensive if the exponent is high, for example if increasing the bit size drove the exponent of the resulting polynomial into the millions (probably, I think? We are talking about purely speculative solutions here, so who knows what the solution would end up being). In that instance, we would just end up using RSA 8192-bit or 16384-bit keys to slow the factoring process down again.

Then again, RSA with larger keys slows down the encryption process itself, but we already assume RSA is slow; that's why we use hybrid encryption for things like TLS. So while P=NP would likely affect how we configure RSA on servers, it would likely just mean a patch to services that use TLS or RSA (many have maximum key sizes of 4096 bits, for instance) and a requirement that we regenerate all of our private keys. It's happened before (servers had to regenerate keys after Heartbleed); we can do it again.

Some sleep will be lost for a few weeks by certain people, and then life would move on.
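As a rough illustration of "super-polynomial but sub-exponential", here's the GNFS L-notation from that Wikipedia page evaluated numerically, dropping the o(1) term. This is a heuristic asymptotic estimate for illustration, not a prediction of real-world attack cost:

```python
import math

def gnfs_cost(bits):
    """Heuristic GNFS running-time estimate, ignoring the o(1) term:
    exp(((64/9)**(1/3)) * (ln n)**(1/3) * (ln ln n)**(2/3)),
    where n ~ 2**bits. Returns log10 of the operation count."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(10)

for bits in (768, 1024, 2048, 4096):
    print(bits, round(gnfs_cost(bits), 1))
```

The log-cost grows when you add bits (so it's worse than any fixed polynomial), but doubling the key size far less than doubles the log-cost (so it's better than 2^(c·bits)), which is exactly what "super-polynomial but sub-exponential" means.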