Since the publication of Feynman's "simulation" paper in 1982, researchers in physics, EE, CS, and maths have been trying to build practical #quantum #computers (QC). They have come a long way over the past four decades. In recent years, random circuit sampling, boson sampling, and several other experiments have demonstrated QC’s supremacy over classical digital computers (DC).
But neither the QC theory nor the technology is practicable at present. Sure, the reversible, unitary Toffoli (CCNOT) gate can emulate DC's universal NAND gate (fix its target bit to 1), so QC can theoretically simulate any DC, including probabilistic ones. But "can" is not "do". The most celebrated quantum algorithm, Shor's 1994 prime factoring, efficiently solves a problem in NP that no known DC algorithm solves efficiently. In 2026, a QC factored 551. Incidentally, ChatGPT factored it in milliseconds. (OK, cheap shot.) And, to date, there is no known efficient quantum algorithm for any NP-complete problem. The general consensus is that QC is just as much subject to that "gut-feel" speed limit of DC, \(P \neq NP\): NP-complete problems are conjectured to be intractable for QC as well.
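A minimal sketch of the classical-logic claim above, in plain Python (no quantum library assumed; the function names are illustrative): on computational-basis states the Toffoli gate maps \((a, b, c) \mapsto (a, b, c \oplus ab)\), so fixing the target bit to 1 makes the third output NAND of the two controls. A naive trial division of 551 is included only to show how trivial that instance is for a DC.

```python
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Classical (computational-basis) action of the Toffoli / CCNOT gate."""
    return a, b, c ^ (a & b)

# With the target bit initialized to 1, the third output is NAND(a, b).
for a in (0, 1):
    for b in (0, 1):
        _, _, out = toffoli(a, b, 1)
        assert out == 1 - (a & b)      # matches the NAND truth table
        print(f"a={a} b={b} -> NAND={out}")

def trial_division(n: int) -> list[int]:
    """Smallest-first prime factorization by naive trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(551))             # [19, 29]
```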
Forty years is an eternity in computing: compare a cheap 2025 gaming laptop to the top-end 1985 SGI IRIS 2000 3D graphics workstation; compare the 1985 CRAY-2 supercomputer to von Neumann’s 1945 paper on a bus-based computer architecture. Goodness me, is DC actually "faster" than QC? Still, QC has demonstrated supremacy over DC in some narrow but significant areas of computing; QC should be studied.
At present, though, #profiteering quantum startups are trading a few buzzwords for a mound of cash, using #WallStreet's proprietary, alchemical process for creating the element \(Au\): heat an admixture of #QC and #AI buzzwords over the hot coals of social media, and voilà!