


More Details
Limits of Classical Computing
The past five decades have witnessed a dramatic acceleration of computer evolution to multi-billion-transistor chips, collectively able to execute multiple exaOPS (multiples of 10^{18} operations per second) when configured into massively parallel supercomputer architectures. However, even today’s fastest supercomputers remain classically constrained, energy-inefficient sequential processors, at both the component and system levels.
‘Moore’s law’—the doubling of transistors on a chip every 18 months—has propelled classical (non-quantum) computing evolution for the past half century. During the past three decades, Dennard scaling enabled more, faster, and increasingly energy-efficient transistors and applications with each succeeding processor generation.
The growing failure of Dennard scaling and Moore’s law, to which the industry’s shift to multicore designs since 2005 was a partial response, now shows signs of limiting multicore scaling as well. Contributing factors to the collapse of classical exponential scaling include power and parallelism constraints at small feature sizes, ‘dark silicon’, and sequential processing bottlenecks.
Continuing exponential improvements in classical processor speed, memory and integration under Dennard scaling and ‘Moore’s law’ are not sustainable due to quantum effects. Components and circuits today are fabricated and operate at intrinsically quantum mechanical dimensions—the molecular (nano), atomic (ångström), and subatomic (pico) scales.
In order to sufficiently enhance computer and network processing to continually meet worldwide computational and availability demands, the following conditions must be met:
• Computer and network components and systems must be driven at increasingly higher clock frequencies within shrinking chip geometries and diminishing memory latencies—a quest that has led to the collapse of Dennard scaling and ‘Moore’s law’ within classical spacetime;
• Computer and network components and systems must be increasingly integrated due to the speed-of-light limitation while remaining within classical spacetime symmetries;
• Increasingly miniaturized components and systems need to be continually more energy-efficient, while avoiding serial-architecture (von Neumann) bottlenecks and resistance-capacitance delays—issues that are only temporarily deferred within classical parallel processing platforms.
When we extrapolate classical exponential scaling trends, we reach a limit of one atom per bit and the single-electron transistor by the 2020 timeframe. Before these limits are reached, it becomes necessary to harness quantum effects to properly address worldwide compute-intensive evolution.
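The exponential trend referenced above can be sketched numerically. As an illustration only: the 1971 baseline below (the Intel 4004's roughly 2,300 transistors) is an assumed starting point, and the 18-month doubling interval comes from the text's statement of Moore's law.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_months=18):
    """Naive Moore's-law extrapolation: the transistor count doubles
    every `doubling_months`, starting from an assumed 1971 baseline."""
    doublings = (year - base_year) * 12 / doubling_months
    return base_count * 2 ** doublings

# The uncorrected trend predicts on the order of 10**13 transistors by 2020,
# far beyond what was actually built -- an illustration of why the
# exponential scaling described above could not continue unchecked.
print(f"{transistors(2020):.2e}")
```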
Emergence of Quantum Computing
Of all the candidate technologies that continue to scale well beyond the current classical era, quantum logic has one unique feature—it is not contained by classical spacetime physics. ‘Moore’s Law’ is exponential; any classical approach demands exponential increases in space or time. Even an Avogadro’s number of elements in a molecular computer is quickly overwhelmed by the size of an exponential problem. Quantum computing and networking access Hilbert space, the one exponential resource that has remained untapped for computation.
Quantum computers, in various stages of R&D, operate according to the rules of quantum mechanics, which govern the world of the very small—the waves and particles intrinsic to the nanoscale (molecular scale, 10^{-9} meter), ångström scale (atomic scale, 10^{-10} meter), and picoscale (10^{-12} meter, the domain of electrons and photons). Perhaps the most striking characteristic of quantum computers is that elementary particles can persist in two or more (2^{n}) states at once, making possible processing units (quantum bits, or qubits) that are exponentially more efficient than any conventional, “classical” computer could ever be.
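The 2^{n} state count can be made concrete with a toy state-vector model. This is a pure-Python sketch for illustration, not code from the book or any real quantum SDK:

```python
def uniform_superposition(n):
    """Return the 2**n amplitudes of the equal superposition of n qubits."""
    dim = 2 ** n
    amp = 1 / dim ** 0.5          # each basis state gets amplitude 1/sqrt(2**n)
    return [amp] * dim

state = uniform_superposition(3)   # 3 qubits -> 8 amplitudes
total = sum(a * a for a in state)  # squared amplitudes (probabilities) sum to 1
```

The amplitude list doubles with every added qubit; this exponentially growing description is the Hilbert-space resource the passage refers to.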
Quantum computers operate in truly parallel fashion, with sequential and simultaneous computing co-structured into their very nature. Quantum computing simultaneity ensures that all computational pathways are pursued at once, exponentially eclipsing serial processing bottlenecks encountered in conventional (classical) computers. In other words, each quantum processing operation acts on all system states simultaneously. Therefore, one machine cycle, one ‘tick of the quantum computer clock,’ computes not just on one machine state (as is true of nonquantum, serial computers), but on all possible instruction states at once.
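The sense in which one operation acts on every system state can be illustrated with the same kind of toy state-vector model: a single Hadamard gate application rewrites all 2^{n} amplitudes in one pass. Again, this is a hypothetical pure-Python sketch, not the book's own material:

```python
import math

def apply_hadamard(state, target):
    """Apply a Hadamard gate to qubit `target` of a state vector.
    Every one of the 2**n amplitudes is updated in this single pass."""
    h = 1 / math.sqrt(2)
    out = state[:]
    step = 1 << target                     # bit position of the target qubit
    for i in range(len(state)):
        if i & step == 0:                  # pair index i with index i | step
            a, b = state[i], state[i | step]
            out[i] = h * (a + b)
            out[i | step] = h * (a - b)
    return out

# Start a 3-qubit register in |000> and put qubit 0 into superposition.
state = [1.0] + [0.0] * 7
state = apply_hadamard(state, 0)           # amplitudes 0 and 1 become 1/sqrt(2)
```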
Quantum Computing Implications
Quantum computers may one day solve problems that conventional computers could never manage, and in a fraction of the time, including:
• Quantum Search—Searching the Internet, Surface Web, Deep Web, private content, and database repositories with far greater contextual precision than is imaginable today, even using massively parallel (non-quantum) systems, via quantum search engine^{™} applications that examine and update all possible locations and contexts within several seconds to a few minutes.
• Quantum Simulation—Simulating the intricacies of complex classical and quantum systems at scales unattainable by conventional computing technologies, leading directly to dramatic breakthroughs in cost-efficient and environment-friendly ageless materials based on optimum strength-to-weight ratios, atomic- and subatomic-scale architecture, electronic- and photonic-scale chip design (picoelectronics), one-step design-to-build quantum memory, and optimized design-to-construction of entire communities.
• Quantum Database and Modeling—Modeling national and global economies based on continually refreshed contextual worldwide search results from billions of networked and standalone information and data repositories; rapid and accurate weather and climate pattern forecasting.
• Quantum Factoring—Factoring multi-digit numbers exponentially more rapidly than is currently possible with the best non-quantum methods, enabling timely access by authorized individuals to secure private information.
• Quantum Cryptography—Ensuring protection of sensitive data and information through quantum cryptographic methods that reliably prevent unauthorized access through classical or quantum means, regardless of the breadth and depth of brute-force or contextual compromise attempts.
• Quantum Factoring, Order-Finding, Period-Finding, Fast Fourier Transform—Performing global-scale calculations and worldwide database updates with contextual precision across a wide range of systems within a matter of minutes, regardless of the scale or order of distributed permutations involved.
• Quantum Counting—Calculating all required solutions to any scale of presented problem within a tractable period of time, regardless of the dimensions involved.
• Quantum Teleportation—Transcending speed-of-light and temporal computing boundaries at great distances using quantum entanglement properties associated with nonlocal connections.
• Quantum Error Correction—Automatically fixing system errors while revealing nothing about ongoing quantum computations, thereby preserving the system in a state of quantum superposition—that is, a dynamic state of all computational possibilities.
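The search speedup at the head of the list above is usually quantified via Grover's algorithm, which needs roughly (π/4)·√N oracle queries against about N/2 expected lookups for classical unstructured search of N items. A quick comparison (the concrete N is an arbitrary illustration, not a figure from the text):

```python
import math

def grover_queries(n_items):
    """Approximate Grover iteration count for unstructured search of n_items."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

def expected_classical_queries(n_items):
    """Expected number of lookups for classical unstructured search."""
    return n_items // 2

N = 1_000_000
print(expected_classical_queries(N), grover_queries(N))  # 500000 vs 786
```

The gap is quadratic, not exponential; the exponential claims in the list apply to other algorithm families, such as Shor's factoring.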
The Cosmic Computer^{®} and Cosmic Switchboard^{®}
When we consider a quantum computational system of n quantum bits (qubits), we find the computational basis states of this system to be of the form |χ_{1}χ_{2} … χ_{n}⟩. Therefore, the system quantum state is specified by 2^{n} probability amplitudes. For n greater than 500, this number is larger than the estimated number of atoms in the known physical universe.
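The n > 500 claim can be checked directly with Python's arbitrary-precision integers; the 10^{80} atom figure below is the commonly quoted order-of-magnitude estimate, not a number taken from the text:

```python
amplitudes = 2 ** 500               # amplitudes describing a 500-qubit system
atoms_estimate = 10 ** 80           # rough estimate: atoms in the observable universe

print(len(str(amplitudes)))         # 2**500 has 151 decimal digits
print(amplitudes > atoms_estimate)  # True: more amplitudes than atoms
```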
The Cosmic Computer^{®} and Cosmic Switchboard^{®}, stationed at the level of the Unified Field, are found to be perpetually processing 2^{n} amplitudes, even for systems that contain only a few hundred atoms, to say nothing of the massively parallel infinity-point calculations eternally proceeding behind the scenes to evolve and maintain all the Laws of Nature on every level of creation. We extrapolate that Nature maintains greater than 2^{300} calculations for every few hundred atoms throughout the entire universe.
The scale of Natural Law calculations of the Cosmic Computer^{®} and Cosmic Switchboard^{®} is estimated to extend exponentially beyond the atomic level when we shift our attention to the scales of the fundamental force particles (photons for the Electromagnetic Force; weak gauge bosons for the Weak Force; gluons for the Strong Force; gravitons for the Gravitational Force). The fundamental force computation density of the Cosmic Computer^{®} and Cosmic Switchboard^{®} is again extended hyper-exponentially at the superstring dimensions that pervade Planck scales of 10^{-33} centimeter and 10^{-43} second.
This book reveals the entire structure of the Cosmic Computer^{®} and Cosmic Switchboard^{®} to be pure cosmic intelligence, identified by Maharishi Mahesh Yogi as integral to Maharishi Vedic Science in terms of the infinity-within-all-points and all-points-within-infinity cosmic computational foundation for perfection of evolution. We locate the Cosmic Computer^{®} and Cosmic Switchboard^{®} within the self-luminous junction point of the Hardware-Software Gap^{™}, human physiology, and throughout every point of manifest creation. It is here that we discover that intelligence which is at the same time numeric and also with boundaries, where physical digits are connected to numeric digits, where the physical is expressed in terms of numbers.
Read About the Author.
COSMIC COMPUTER, COSMIC SWITCHBOARD, DIGITAL UNIVERSE, RAAM GATE, VEDACOM, and VEDIC COMPUTING are registered trademarks of Thomas J. Routt. GLOBAL INTERNETWORK, HARDWARE-SOFTWARE GAP, NETWORK-ON-A-CHIP, QUANTUM GAP, QUANTUM NETWORK ARCHITECTURE, QUANTUM SEARCH ENGINE, and VEDIC GAP are trademarks of Thomas J. Routt. Other brand and/or product names may be trademarks or registered trademarks of their respective owners.

