>>14190711
>but none of these are actually practical
I don't know if "they're not practical" is exactly valid since this is active research, and to give the math/CS crowd credit, they've found ways to make pure problems practical. The biggest advantage even on noisy circuits is the supremacy of no-communication entangled strategies over optimal classical strategies with communication:
https://en.wikipedia.org/wiki/Quantum_pseudo-telepathy
Bravyi's paper and post on the subject:
https://www.ibm.com/blogs/research/2020/07/quantum-advantage-shallow-circuits/
>or can be put in use where a classical computer cannot
De-quantizing algorithms in machine learning is actually a big topic now, but the effects of this are twofold:
1) we have "quantum methods" to prove hard theorems about classical computers. The quantum formalism isn't just linear algebra, and it is super important for tackling a lot of harder problems. Great survey paper here:
https://arxiv.org/abs/0910.3376
2) we have a sharp understanding of which quantum speedups are actually meaningful. A subquadratic speedup means nothing once hardware overheads are factored in, but anything asymptotically above that has massive gains.
All in all, both classical and quantum computing have a LOT to gain from continuing the current study of quantum computing and algorithms.
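The entangled-vs-classical gap from that first link is easy to check numerically. Below is a minimal sketch of the CHSH game, a simpler cousin of the true pseudo-telepathy games (here the quantum win probability is below 1, but it still beats every classical strategy); the measurement angles used are the standard optimal choice, not anything from the linked papers:

```python
import itertools
import numpy as np

# CHSH game: referee sends bits x, y; Alice and Bob answer bits a, b
# with NO communication, and win iff a XOR b == x AND y.

def classical_value():
    # Enumerate all deterministic answer tables a(x), b(y); shared
    # randomness can't beat the best deterministic strategy.
    best = 0.0
    for a in itertools.product([0, 1], repeat=2):      # Alice's table
        for b in itertools.product([0, 1], repeat=2):  # Bob's table
            wins = sum((a[x] ^ b[y]) == (x & y)
                       for x in (0, 1) for y in (0, 1))
            best = max(best, wins / 4)
    return best

def quantum_value():
    # Share |Phi+> = (|00> + |11>)/sqrt(2) and measure in rotated bases;
    # these angles are the textbook-optimal choice for CHSH.
    alice = [0.0, np.pi / 4]
    bob = [np.pi / 8, -np.pi / 8]
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

    def basis(theta):
        # Outcome-0 and outcome-1 measurement vectors at angle theta.
        return [np.array([np.cos(theta), np.sin(theta)]),
                np.array([-np.sin(theta), np.cos(theta)])]

    total = 0.0
    for x in (0, 1):
        for y in (0, 1):
            for a, ua in enumerate(basis(alice[x])):
                for b, ub in enumerate(basis(bob[y])):
                    if (a ^ b) == (x & y):
                        amp = np.kron(ua, ub) @ phi  # <ua ub|Phi+>
                        total += abs(amp) ** 2
    return total / 4

print(classical_value())  # 0.75
print(quantum_value())    # ~0.8536, i.e. cos^2(pi/8)
```

Classical strategies (even with shared randomness) cap at 3/4, while the entangled strategy hits cos^2(pi/8) ≈ 0.85. The magic-square game on the wiki page pushes this to a quantum win probability of exactly 1, which is where the "pseudo-telepathy" name comes from.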