The AGI–Quantum Divide

How Two Technologies Could Split the Future of Civilization

Published by:
EGK Microelectronic Solutions Group Sdn. Bhd.
8, Lintang Beringin 8, Diamond Valley Industrial Park,
11960 Batu Maung, Penang, Malaysia
Tel: +604-505 9700 • www.egkhor.com.my

Author: Isaac Khor Eng Gian
Founder & Chief Executive Officer
EGK Microelectronic Solutions Group Sdn. Bhd.

eISBN: 978-629-94949-X-X
First Published: 2026

© 2026 EGK Microelectronic Solutions Group Sdn. Bhd. All rights reserved. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the publisher, except in the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted under applicable copyright law.

The views and opinions expressed in this book are those of the author and do not necessarily reflect the official policy or position of any agency or organisation.
Contents

A Note Before We Begin

PART ONE — THE WORLD AS IT ACTUALLY IS
Chapter 1 — The Wall at the End of Moore's Law
Chapter 2 — Why Today's AI Cannot Think
Chapter 3 — The Noise Problem: Quantum's Unsolved Childhood
Chapter 4 — What Hybrid AI Acceleration Changes

PART TWO — THE WORLD WHERE AGI NEVER COMES
Chapter 5 — Quantum Without a Guide
Chapter 6 — The Governments Take the Atom
Chapter 7 — Medicine at the Speed of Committees
Chapter 8 — The Widening Chasm

PART THREE — THE WORLD WHERE AGI ARRIVES FIRST
Chapter 9 — Autonomous Labs
Chapter 10 — The Day Human Scientists Become Optional
Chapter 11 — AGI Discovers New Physics
Chapter 12 — Infrastructure Without Operators
Chapter 13 — The Employment Earthquake
Chapter 14 — The Compressed Revolution

PART FOUR — THE DARK SIDE OF THE DIVIDE
Chapter 15 — When the Encryption Breaks
Chapter 16 — The Darker Convergence
Chapter 17 — Who Controls the Future?
Chapter 18 — The Great Stagnation

EPILOGUE & CLOSING
Epilogue — The Necessity Question

BACK MATTER
About the Author
EGK Publishing House — Other Titles
A Note Before We Begin

In April 2026, Jensen Huang stood before an audience and described a future that, eighteen months earlier, would have required considerable qualification to discuss in technical circles. He spoke of quantum-classical hybrid acceleration as something imminent — not theoretical, not a decade away, but close enough to plan for. The announcement landed in a period already saturated with AI news, and many in the press treated it as another piece of silicon-industry optimism.

I had been writing about this convergence before the announcement arrived. My earlier work, Hybrid AI Acceleration, explored what happens when the boundaries between classical processing, AI inference, and quantum computation begin to dissolve. I did not predict Huang's announcement. No one could have predicted its precise form. What I did argue — and what the announcement seemed to validate — was that the integration of quantum and classical AI systems was not a question of whether, but of sequence, timing, and who would be positioned to benefit.

This book is not a prediction. Predictions age badly. What this book offers instead is a set of carefully constructed scenarios — two divergent futures, built from the same present moment, and examined with the rigour that serious engineering questions deserve.

The title, The AGI–Quantum Divide, refers to something more than the gap between two technologies. It refers to the civilisational fork that opens when two transformative forces arrive in the wrong order — or together — or not at all. The divide is not only between countries or companies. It is between futures. It is between the world where these technologies develop in sequence, allowing human institutions time to adapt, and the world where they arrive simultaneously, compressing decades of adaptation into years that no governance framework was designed to handle.

I write from a specific vantage point. I am an engineer who works in semiconductor manufacturing and ESD protection systems. I understand, at a practical level, what it means when a node shrinks, when a transistor leaks, when a qubit decoheres. I am not a futurist in the traditional sense. I do not traffic in breathless extrapolation. What I try to do — in this book and in my work — is to reason carefully from what is actually true about the physical world toward what might follow.

The book is structured in four parts. Part One examines the world as it actually is in 2026: the limits of classical computing, the current state of AI, and why quantum computing remains technically formidable. Part Two constructs Scenario A — a future in which AGI does not arrive on the timelines its most optimistic advocates project, and quantum computing matures slowly under government control. Part Three constructs Scenario B — a future in which AGI arrives first, transforms quantum research, and compresses the timeline in ways that raise questions governance frameworks are not designed to answer. Part Four examines the darker convergences that both scenarios share, and asks what it might mean to navigate them responsibly.

Neither scenario is utopian. Neither is apocalyptic. Both are intended to be technically grounded and intellectually honest.

My hope is that this book is useful — not merely as a reading experience, but as a thinking tool. The decisions being made now, in laboratories and boardrooms and parliamentary committees, will determine which of these futures we actually inhabit. I believe those decisions deserve more rigorous frameworks than most public discourse currently provides.

This book is my attempt to contribute one.

— Isaac Khor Eng Gian, Penang, 2026
PART ONE
The World as It Actually Is

"Before we can understand where two rivers might meet, we must first understand why both of them are still fighting their own currents."
CHAPTER 1
The Wall at the End of Moore's Law

On the 19th of April, 1965, Gordon Moore published a paper in Electronics magazine that contained, almost as a footnote, an observation that would quietly govern the shape of civilisation for the next six decades.

Moore noticed that the number of transistors on a chip had roughly doubled each year since the first integrated circuit was produced. He predicted this trend would continue. He was right — for fifty years. And the entire architecture of the modern world was built on that reliability. Every smartphone in your pocket. Every cloud server processing your search. Every autonomous vehicle parsing its environment in real time. All of it is downstream of Moore's Law behaving as promised.

What nobody fully prepared for was the day it stopped.

That day did not arrive with a press release. It arrived quietly, in the semiconductor fabs of TSMC, Samsung, and Intel, as engineers discovered that moving to the next node — the next generation of miniaturisation — was consuming an exponentially growing share of research budget for an exponentially shrinking return. The gains were still real. But they were no longer the gains of a covenant. They were the gains of a civilisation reaching the bottom of a barrel it had not previously known was finite.

The Physics of an Ending

Here is the engineering reality that most technology commentary never quite captures: the transistors being built at the frontier of semiconductor fabrication are no longer reliably classical objects. At 2 nanometres — the current bleeding edge, the node at which TSMC is now shipping production chips — engineers are working at the scale of ten silicon atoms placed side by side. A human hair is roughly 80,000 nanometres wide. The features being carved into silicon today are objects that exist at the boundary between classical and quantum mechanics.

At this scale, electrons do not behave the way they do in introductory physics textbooks. Quantum tunnelling — the phenomenon in which a particle passes through a barrier it classically should not be able to penetrate — becomes not an exotic edge case but a dominant engineering constraint. Electrons begin leaking through gate oxides, through insulators, through barriers designed to control them. The transistor, which is fundamentally a switch — a controlled gate that allows or blocks current — begins to blur. The off state is not fully off. The on state is not reliably on.

At sub-2nm nodes, leakage current — electrons tunnelling through gate oxides — becomes a dominant energy loss mechanism. The very quantum effects that make quantum computing possible actively undermine classical computing at its smallest scales. This is not a software problem. No optimisation of code architecture can overcome a hardware limit rooted in physics.
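The exponential sensitivity of tunnelling to barrier thickness can be sketched with the textbook rectangular-barrier estimate. The numbers below are illustrative assumptions (a 3.1 eV barrier height, typical of a silicon/silicon-dioxide interface), not figures for any specific process node:

```python
import math

# A sketch of why gate-oxide leakage explodes as insulators thin, using the
# textbook rectangular-barrier (WKB) estimate T ~ exp(-2*kappa*d), where
# kappa = sqrt(2*m*phi)/hbar. The 3.1 eV barrier is an illustrative
# Si/SiO2-like value, assumed for this example only.

HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31    # electron rest mass, kg
EV = 1.6021766e-19     # joules per electron-volt

def tunnel_probability(thickness_nm: float, barrier_ev: float = 3.1) -> float:
    """Transmission probability through a rectangular barrier of given width."""
    kappa = math.sqrt(2.0 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2.0 * kappa * thickness_nm * 1e-9)

# Halving the barrier from 2 nm to 1 nm multiplies tunnelling enormously.
ratio = tunnel_probability(1.0) / tunnel_probability(2.0)
print(f"{ratio:.1e}")
```

Under these assumptions, thinning the oxide from 2 nm to 1 nm raises the transmission probability by roughly seven orders of magnitude, which is the quantitative face of "the off state is not fully off."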
The industry has responded with increasingly baroque engineering solutions. Gate-all-around transistors wrap the gate material around the channel on all sides, improving electrostatic control. Three-dimensional stacking layers chips vertically to improve memory bandwidth without requiring further planar shrinkage. Chiplets — disaggregated chip designs that package multiple dies in a single package — allow continued performance scaling without requiring every functional block to shrink simultaneously.

These are not failures of imagination. They are the products of extraordinary engineering talent working at the absolute limit of what physics permits. But they are also, unmistakably, workarounds. The elegant covenant of Moore's Law — print smaller, get faster, get cheaper — has been replaced by a complex negotiation between materials science, thermal management, packaging innovation, and economic viability. The era of free improvements is over.

The Energy Wall

There is a second constraint that the public conversation around AI performance rarely addresses adequately: power consumption. Training a large language model of the GPT-4 class requires on the order of tens of gigawatt-hours of electricity — roughly the annual consumption of a small town. The frontier models being trained in 2025 and 2026 are substantially larger. Data centre power consumption globally is growing at rates that grid infrastructure was not designed to accommodate.

DATA — Data Centre Power: The Scale of the Problem
Global data centre power demand (2023): ~240–340 TWh per year — approximately 1% of global electricity consumption.
Projected demand (2026): analysts project 3–4% of global electricity consumption as AI workloads scale.
Training a large frontier model: estimated 50–100+ GWh per training run for the largest models, comparable to powering a small city for a month.
Nuclear energy resurgence: Microsoft, Google, and Amazon have all signed agreements to power data centres from nuclear sources — a direct response to the computational energy crisis.
Sources: International Energy Agency (IEA), Goldman Sachs Power & Utilities Research 2024. Figures represent order-of-magnitude estimates; precise consumption figures are not publicly disclosed by model developers.

This energy constraint is not merely an environmental concern, though it is that. It is also a competitive and strategic constraint. The countries and corporations that can secure low-cost, reliable, high-density power for computational infrastructure will have structural advantages in the AI race. This is already reshaping where data centres are built, which energy companies are valued most highly, and which nations are positioning themselves for computational relevance in the coming decade.

The Parallelism Detour

When individual transistors could no longer be made faster, the industry made a decision so pragmatic and so consequential that its full implications are still unfolding: instead of making chips faster, make them wider. Put more processing units side by side. Let them work simultaneously. This is parallelism, and for the better part of the last fifteen years, it has been the engine powering everything you associate with modern computing performance.
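The "wider, not faster" idea can be sketched in a few lines: the same arithmetic applied to millions of independent values in one step, which is exactly the structure of work that massively parallel hardware exploits. The array size and the gamma exponent here are illustrative assumptions, not figures from any real workload:

```python
import numpy as np

# A sketch of "wider, not faster": one operation applied uniformly across
# millions of independent data elements. The array size and gamma exponent
# are illustrative only.

rng = np.random.default_rng(seed=0)
pixels = rng.random(4_000_000)   # 4 million independent "pixel" intensities in [0, 1)

# A scalar processor would loop: for each pixel, apply the brightness curve.
# Vectorised hardware applies the identical operation to every element at once.
brightened = pixels ** 0.45      # one gamma-correction operation over all elements

print(brightened.shape)          # every element processed by the same operation
```

On a GPU the same pattern is spread across thousands of hardware cores; the point is the shape of the work, not the library used.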
The graphics processing unit tells this story most clearly. The GPU was designed for a narrow problem: rendering the pixels of a video game frame in real time, a task that requires performing the same mathematical operation — transform, light, shade — on millions of independent pixels simultaneously. For this problem, massive parallelism is ideal. A GPU is, architecturally, a device that does many things at once rather than one thing very fast.

The insight that transformed the AI industry — the insight that was, in retrospect, the founding insight of the NVIDIA that now dominates global technology markets — was that training neural networks looks, mathematically, remarkably like rendering graphics. Both are dominated by matrix multiplications: operations that multiply enormous arrays of numbers together. The GPU, built to render Halo, turned out to be nearly ideal for training the systems that would power ChatGPT.

NVIDIA's transformation from a gaming graphics-chip company to the central infrastructure provider for the most strategically important technology in human history is perhaps the defining corporate story of the 2020s. In 2019, NVIDIA's market capitalisation was roughly $100 billion. By mid-2024, it had crossed $3 trillion, briefly making it the most valuable company in the world. This was not a coincidence of timing. It was the direct consequence of the parallelism strategy reaching full industrial deployment.

But parallelism, too, has limits. Energy consumption scales with core count. A data centre running ten thousand GPUs consumes ten thousand times the power of one. Heat dissipation becomes a physical engineering problem of the first order. And certain classes of problems — the problems that matter most for science, for medicine, for optimisation of truly complex systems — do not parallelise cleanly. They require something fundamentally different.

What Quantum Promises

Classical computers work with bits — discrete units of information that are either zero or one, off or on, the fundamental binary logic of all classical computation. Quantum computers work with qubits: quantum mechanical systems that can exist in superposition — effectively in multiple states simultaneously — until they are measured. The distinction is not merely technical. It is a different relationship between information and reality itself.

Consider a maze. A classical computer solves a maze the way a human does: it tries one path, hits a dead end, backtracks, tries another. Even a very fast classical computer is doing this sequentially, or in parallel — running multiple paths simultaneously on separate cores. A quantum computer, exploiting superposition and entanglement, can, for certain problem structures, explore all paths simultaneously and arrive at the solution through quantum interference — constructive interference amplifying the correct answer, destructive interference cancelling the wrong ones.

The theoretical implications are staggering. A quantum computer with 300 qubits in superposition can represent more simultaneous states than there are atoms in the observable universe. For certain classes of problems — factoring large numbers, simulating molecular interactions, optimising complex logistics networks — this offers computational advantages that no amount of classical hardware can ever replicate, regardless of how many chips you stack or how much electricity you supply.

DATA — Quantum Advantage: Key Problem Classes

Shor's Algorithm: Factoring large integers exponentially faster than classical methods. Directly threatens RSA encryption — the
mathematical foundation of modern internet security. A sufficiently large quantum computer running Shor's Algorithm could break 2048-bit RSA encryption in hours; classical computers would require longer than the age of the universe.

Grover's Algorithm: Searching unsorted databases with quadratic speedup. Halves the effective key length of symmetric encryption — meaning AES-128 offers only 64-bit effective security against a quantum adversary.

Quantum Simulation: Simulating molecular and quantum mechanical systems with exponential efficiency. This is potentially the most important near-term application: drug discovery, materials science, catalyst design, and battery chemistry all involve quantum mechanical interactions that classical computers can only approximate.

Quantum Annealing / Ising Machines: Computational approaches that map complex optimisation problems onto spin systems, where solutions emerge as minimum-energy (ground-state) configurations of an Ising Hamiltonian. These methods are actively studied as potential tools for combinatorial optimisation problems. In April 2026, NVIDIA's Ising announcement reframed this space — not as standalone optimisation hardware, but as part of a broader AI-driven control layer for quantum systems. Rather than delivering quantum advantage directly, the focus is on accelerating two bottlenecks that determine whether such advantage becomes achievable at scale: quantum calibration and error correction.

The promise is extraordinary. The practical reality, as of 2026, is that this promise remains largely theoretical — not because the physics is wrong, but because building a quantum computer that is large enough, stable enough, and accurate enough to demonstrate meaningful advantage over classical systems is an engineering challenge of breathtaking difficulty that has consumed decades of effort from some of the most talented physicists and engineers on Earth.
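Two of the quantitative claims above are easy to check with integer arithmetic: the state space of 300 qubits against the commonly cited ~10^80 atoms in the observable universe, and Grover's quadratic speedup turning a 2^128 key search into roughly a 2^64 one. A minimal sketch:

```python
import math

# Back-of-envelope checks on two claims from the text: 300-qubit state
# space versus the ~10^80 atoms in the observable universe (a standard
# order-of-magnitude estimate), and Grover's quadratic speedup against
# a 128-bit symmetric key.

qubit_states = 2 ** 300            # basis states a 300-qubit register spans
atoms_in_universe = 10 ** 80       # standard order-of-magnitude estimate

grover_queries = math.isqrt(2 ** 128)   # ~sqrt(N) quantum searches of the key space

print(qubit_states > atoms_in_universe)   # True
print(grover_queries == 2 ** 64)          # True: 128-bit key, 64-bit effort
```

Neither check says anything about whether such machines can be built; it only confirms the scale of the claims.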
That challenge is the subject of Chapter 3. But first, we need to understand the intelligence — or rather, the intelligent-seeming systems — that many believe will ultimately solve it.