Quantum Information, Game Theory, and the Future of Rationality
Build the Primitive First
In a recent paper, Jens Eisert and John Preskill argue that scalable, fault-tolerant quantum computing must emerge through difficult, staged progress, and they are right about the engineering burden. But the next stage need not remain anchored to universal computation. Transformative technologies historically mature by consolidating a stable primitive before scaling complexity. Quantum technology already possesses such a primitive: distributed quantum correlations. The bottom line is simple: progress will come not from chasing universality first, but from building on what is already controllable and infrastructure-ready.
Faisal Shah Khan, PhD
2/17/2026 · 4 min read


Eisert and Preskill argue that the road to large-scale, fault-tolerant quantum computing is fraught. They are not wrong. If the objective is universal, application-scale quantum computation, the engineering burden is immense and tightly coupled across hardware, control, error correction, and algorithms. Progress along one dimension often amplifies difficulty along another. Importantly, they do not imagine a single leap. They describe progress as unfolding in stages: first extending noisy devices toward protected quantum memory, then moving from protected memory to scalable fault tolerance, and only ultimately toward broadly useful quantum advantage. The question is not whether development must be staged. It already is. The real question is how we define the next stage.
In their framing, the next stage remains internal to the universal quantum computing program: better protection, better scaling, better fault tolerance. The horizon is still the general-purpose machine. But transformative technologies rarely begin at maximal generality. They mature by stabilizing a narrow, controllable primitive before scaling complexity. Reliable transistor switching preceded integrated circuits. Controlled lift preceded long-range aviation. The early internet emerged within research networks long before personal computing became ubiquitous; infrastructure matured before higher-layer applications reshaped the world. In each case, a foundational capability was consolidated first, and complexity was layered gradually.
Quantum technology may be at a similar moment. Instead of consolidating its foundational primitive, the field has largely oriented itself toward the end state of universal quantum computation. Yet that primitive already exists, and it is not universal programmability: it is the stable generation and distribution of quantum correlations, the underlying resource that enables quantum advantage across communication, computation, and game-theoretic protocols. Quantum key distribution was the first killer application of quantum information science, and it is not speculative. China’s Beijing–Shanghai quantum communication backbone spans thousands of kilometers of fiber. The Micius satellite has demonstrated entanglement distribution over continental scales. Europe’s EuroQCI initiative is integrating terrestrial and satellite quantum links across member states. In the United States, metropolitan quantum network testbeds are already operating. Quantum correlations are no longer confined to laboratories; they are deployed in operational communication networks.
QKD is not the destination; it is proof that nonclassical correlations can be engineered, transmitted, and maintained at infrastructure scale. That is the transistor moment of quantum technology. This perspective is not foreign to the field. Eisert’s early work on quantum games demonstrated that structured quantum correlations, combined with local single-qubit operations, can yield performance advantages without requiring large-scale quantum computation. In that setting, advantage emerged from correlation structure rather than computational depth. A correlation-first staging is therefore not a departure from quantum information theory, but a continuation of one of its foundational insights.
The divergence from the universal computing roadmap lies in what follows. This is not about abandoning quantum computing; it is a question of priority. Rather than treating distributed entanglement primarily as a stepping stone toward full-stack, fault-tolerant processors, it can be developed as a standalone architectural objective. The center of gravity shifts from scaling monolithic quantum processors to engineering robust, networked systems for generating, distributing, and managing quantum correlations as an infrastructure layer in their own right. Instead of attempting to close every gap required for scalable fault tolerance, we consolidate and extend what is already controllable: distributed quantum correlations.
Entanglement-based networks already generate measurement statistics whose joint structure reflects a nonclassical resource. Today those statistics are typically reduced to final secret keys, and the underlying correlation structure remains internal to the device. But that is a design choice, not a physical necessity. If correlation-level information becomes accessible as part of network operation—if the strength and structure of quantum correlations can be measured and tracked—then they become analyzable signals.
This is where a Quantum Correlation Index (QCI) becomes central. QCI is not an entanglement witness or a security certificate. It is a statistical metric quantifying the degree to which observed joint distributions depart from classical baselines—in other words, how much usable nonclassical correlation is present in the system. Once such a metric exists, quantum correlation becomes comparable, monitorable, and optimizable, much like volatility or cross-asset correlation in financial markets. At that point, quantum infrastructure ceases to be merely secure communication; it becomes a new data layer.
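To make this concrete, here is a minimal sketch of how such an index could be computed from network measurement statistics. The post does not fix a formula, so the construction below is an illustrative assumption of my own: estimate two-setting correlators from joint outcome counts, form the familiar CHSH combination, and rescale so that 0 marks the classical bound and 1 the Tsirelson bound.

```python
# Illustrative sketch of a CHSH-based correlation index (hypothetical; the
# post does not define a formula for the QCI). Input: empirical joint counts
# for two measurement settings per side, outcomes coded as +1 / -1.
import numpy as np

def correlator(counts):
    """E = <a*b> from a dict {(a, b): count} with a, b in {+1, -1}."""
    total = sum(counts.values())
    return sum(a * b * n for (a, b), n in counts.items()) / total

def quantum_correlation_index(counts_by_setting):
    """counts_by_setting maps setting pairs (x, y) in {0,1}^2 to outcome counts.

    Returns 0 when the CHSH value stays within the classical bound (<= 2)
    and 1 at the Tsirelson bound (2*sqrt(2)). Purely illustrative scaling.
    """
    E = {s: correlator(c) for s, c in counts_by_setting.items()}
    S = abs(E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)])
    return max(0.0, S - 2.0) / (2.0 * np.sqrt(2.0) - 2.0)

# Example: statistics close to a maximally entangled pair measured at the
# standard CHSH angles give correlators of about +/- 1/sqrt(2), index ~ 1.
def fake_counts(E, n=10_000):
    p_same = (1 + E) / 2
    return {(+1, +1): n * p_same / 2, (-1, -1): n * p_same / 2,
            (+1, -1): n * (1 - p_same) / 2, (-1, +1): n * (1 - p_same) / 2}

e = 1.0 / np.sqrt(2.0)
stats = {(0, 0): fake_counts(e), (0, 1): fake_counts(e),
         (1, 0): fake_counts(e), (1, 1): fake_counts(-e)}
print(quantum_correlation_index(stats))  # ~1.0
```

Any monotone that separates classical from nonclassical joint statistics would play the same architectural role; the CHSH combination is simply the most familiar choice.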
This connects naturally to the structure of the original Eisert–Wilkens–Lewenstein (EWL) protocol—introduced by Eisert and collaborators in 1999—where an entangling operation is followed by local single-qubit actions and then an inverse map restoring classical comparability before measurement. That template—correlation resource, local agency, classical reset—need not remain confined to abstract game theory. It can serve as a general architectural pattern for quantum networks. Generalizing this structure beyond a specific entangling gate while preserving the classical reset condition ensures that quantum performance can always be benchmarked against well-defined classical limits.
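For readers who want to see the template end to end, here is a minimal numerical sketch of the EWL construction, assuming the standard Prisoner's Dilemma payoffs and the maximally entangling choice of the gate; it is an illustration of the published protocol, not a new result.

```python
# Minimal sketch of the EWL template: entangle, apply local single-qubit
# strategies, undo the entangling map, measure. Standard Prisoner's
# Dilemma payoffs (3,3), (0,5), (5,0), (1,1) are assumed for illustration.
import numpy as np

C = np.eye(2)                            # "cooperate" = identity
D = np.array([[0, 1], [-1, 0]])          # "defect" = the EWL flip operation
Q = np.array([[1j, 0], [0, -1j]])        # EWL quantum strategy

gamma = np.pi / 2                        # maximal entanglement
DD = np.kron(D, D)
J = np.cos(gamma / 2) * np.eye(4) + 1j * np.sin(gamma / 2) * DD   # exp(i*gamma/2 * D(x)D)
J_dag = J.conj().T

PAYOFF_A = {0b00: 3, 0b01: 0, 0b10: 5, 0b11: 1}   # row player's payoffs

def payoff_A(U_A, U_B):
    """Row player's expected payoff for local strategies U_A, U_B."""
    psi = J_dag @ np.kron(U_A, U_B) @ J @ np.array([1, 0, 0, 0], dtype=complex)
    probs = np.abs(psi) ** 2
    return sum(PAYOFF_A[k] * probs[k] for k in range(4))

print(round(payoff_A(D, D), 3))   # mutual defection: 1.0
print(round(payoff_A(C, D), 3))   # classical strategies reproduce the classical game: 0.0
print(round(payoff_A(Q, Q), 3))   # quantum strategy pair: 3.0
```

Restricting both players to the classical moves reproduces the classical payoff matrix exactly (that is the classical reset at work), while the quantum strategy pair lifts the mutual payoff from (1, 1) to (3, 3) without any large-scale computation.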
Markets provide one plausible domain for this evolution. Work on quantum-correlated coordination and game-theoretic protocols already shows that performance advantages can persist even under noise, provided the quantum correlations are not overwhelmed. The threshold is not perfection but detectable deviation from classical limits. Universal quantum computing demands extremely high fidelity and large-scale error correction; markets operate in regimes where even marginal, systematic statistical edge can matter. Perfection is not the goal. The question is whether the signal is strong and stable enough to translate into measurable alpha under realistic market conditions. That bar is structurally different — and generally less stringent — than the threshold required for fault-tolerant universality.
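A standard example makes "detectable deviation" quantitative. Mix a maximally entangled pair with white noise at visibility p and measure at the usual optimal CHSH settings: the resulting value is 2√2 p, so the nonclassical signal survives for any p above 1/√2, roughly 0.71. The sketch below, assuming exactly that noise model, checks the threshold numerically.

```python
# CHSH value of a Werner state (Bell-state fraction p mixed with white
# noise) at the standard optimal settings. The classical bound is 2, the
# Tsirelson bound 2*sqrt(2); violation survives for any p > 1/sqrt(2).
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

def chsh(p):
    rho = p * np.outer(phi_plus, phi_plus.conj()) + (1 - p) * np.eye(4) / 4
    A0, A1 = Z, X
    B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)
    E = lambda A, B: np.real(np.trace(rho @ np.kron(A, B)))
    return E(A0, B0) + E(A0, B1) + E(A1, B0) - E(A1, B1)

for p in (1.0, 0.8, 0.71, 0.6):
    print(p, round(chsh(p), 3), chsh(p) > 2)   # value is 2*sqrt(2)*p; True above ~0.707
```

A visibility of about 71 percent is a very different engineering target from the error rates demanded by fault-tolerant universality, which is the asymmetry the argument above turns on.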
The next step therefore becomes architectural rather than universal: distribute quantum correlations across networks, allow local single-qubit actions by agents, and enforce a classical reset layer that guarantees reproducible classical limits when quantum resources are restricted. The reset condition ensures clean comparison and prevents ambiguity about where advantage arises. Under this reframing, the road to quantum advantage need not be fraught, because the ambition has been disciplined.
The way forward is not to retreat from quantum ambition, but to discipline it—build around distributed quantum correlations, measure what is usable, exploit where signal exceeds noise, and let universality emerge from infrastructure, not precede it. In the near term, that means prioritizing distributed quantum correlation networks over scaling full-stack, fault-tolerant universal processors.