Quantum Computing Research: Ontological Study of the Quantum Computing Research Ecosystem

Copyright: © 2024 | Pages: 28
DOI: 10.4018/979-8-3693-1168-4.ch012

Abstract

Quantum computing is a remarkable achievement that draws on the combined power of quantum mechanics and information theory to bring an innovative paradigm shift to the computational regime. It has the potential to solve computationally complex problems in real time that were otherwise considered intractable in traditional computer science. The nascent field, which originated in the early 1980s, has now transformed the global technological landscape, attracting the attention of governments and corporations across the world. Despite huge fund infusions, research advancements, and policy support, its ideal potential has not yet been realised. This chapter studies the ontology of quantum computing to understand its astonishing journey, based on its underlying mathematical and scientific principles, technical issues, challenges, and variants, in order to pave the way for future directions of research. The study is expected to identify the essential research elements required to formulate hypotheses and address research issues in this field.
Chapter Preview

1. Introduction

The end of the 20th century marked a remarkable technological leap that drastically overhauled the societal landscape across the globe, as a century of technological advances culminated in the promising discipline of ‘quantum computing’, which leverages the principles of quantum mechanics to enhance the capabilities of information science. Existing practices in traditional information science were restricted by the narrow computational space available with the underlying CMOS technology.

Quantum computing offers innovative solutions that harness the principles of quantum mechanics, viz. interference, parallelism, entanglement, and superposition, to overcome these limitations and provide an enormous computational space with unprecedented speed. Traditional computing machines use the bit as the basic computational unit, with only two available states, either ‘0’ or ‘1’, so that n bits can represent only one of 2^n possible states at a time. Quantum computers use the quantum bit (qubit) as the fundamental computing unit, which can be in |0⟩, |1⟩, or a superposition of both, so that n qubits span a 2^n-dimensional computational space. This enlarged computational space exponentially accelerates the capability to solve complex problems in real time which were otherwise considered intractable (e.g., NP-hard problems such as the knapsack problem) in earlier regimes. In this way, quantum computing is poised to bring a paradigm shift in information sciences by relying on an entirely new concept of computation rather than a merely modified version of existing traditional computing (Ying et al., 2010).
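
To make the comparison concrete, the following minimal sketch (an illustrative Python/NumPy example, not drawn from the chapter; the register size n is an arbitrary choice) builds an n-qubit register as a state vector and applies a Hadamard gate to each qubit, placing the register in an equal superposition over all 2^n basis states.

    import numpy as np

    # A single qubit is a unit vector in C^2: alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
    ket0 = np.array([1.0, 0.0])

    # The Hadamard gate turns a basis state into an equal superposition of |0> and |1>.
    H = np.array([[1.0, 1.0],
                  [1.0, -1.0]]) / np.sqrt(2)

    n = 3  # number of qubits (hypothetical choice for illustration)

    # n qubits live in the 2^n-dimensional tensor-product space.
    state = ket0
    for _ in range(n - 1):
        state = np.kron(state, ket0)            # |00...0>
    print("dimension of state space:", state.size)   # 2**n = 8

    # Apply H to every qubit: the register becomes an equal superposition of all 2^n basis states.
    H_all = H
    for _ in range(n - 1):
        H_all = np.kron(H_all, H)
    superposed = H_all @ state
    print("amplitudes:", superposed)                          # each amplitude is 1/sqrt(2**n)
    print("probabilities sum to:", np.sum(np.abs(superposed) ** 2))  # 1.0

The point of the sketch is only the dimensionality: a classical n-bit register occupies one of the 2^n configurations at any instant, whereas the quantum state vector above carries amplitudes over all of them simultaneously.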

The roots of computing based on the principles of quantum mechanics can be traced back to Moore’s law, a talisman for forecasting the future of technology, which observes that the number of transistors on a processor roughly doubles at regular intervals as feature sizes shrink (Khang and Quantum et al., 2023). Present-day classical processors have reached the nanoscale, and further reduction would enter the regime of atomic-scale computing, i.e., quantum computing. The foundations of quantum computing were laid in 1980 with the proposition of quantum automata by the Russian mathematician Yuri Manin and of a quantum Hamiltonian simulating a Turing machine by Paul Benioff in the same year. This eureka moment was reinforced in 1982 by the idea of the great physicist Richard Feynman, who conceived that certain quantum phenomena could not be simulated by classical computers without an exponential slowdown.

Feynman’s quantum curiosity was further explored by Deutsch, who propounded the Quantum Turing Machine as a universal quantum computer based on the technique of quantum parallelism. In 1994, Peter Shor was the first to develop a quantum algorithm for the prime factorization problem, which was otherwise infeasible to solve in real time (Ying et al., 2010). In 1996, Lov Grover devised an algorithm that achieves a quadratic speed-up over classical methods for searching large unstructured databases. Shor’s and Grover’s algorithms provided innovative ideas for the practical implementation of quantum computing to solve real-life puzzles. This breakthrough in harnessing the strange potential of quantum computing has prompted scientific and industrial giants across the globe to join the quantum race, making it a hot issue of strategic diplomacy for governments across the world (Shah & Khang et al., 2023).
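
As a rough illustration of Grover’s idea, the sketch below (a toy NumPy state-vector simulation; the register size and the index of the marked item are hypothetical choices, not values from the chapter) amplifies the amplitude of one marked item out of N = 2^n using roughly (π/4)·√N iterations, compared with the O(N) checks a classical exhaustive search would need.

    import numpy as np

    n = 4                      # qubits (hypothetical choice)
    N = 2 ** n                 # size of the unstructured search space
    marked = 11                # hypothetical item we are searching for

    # Start in the uniform superposition |s> over all N basis states.
    state = np.full(N, 1.0 / np.sqrt(N))

    # Oracle: flips the sign of the marked amplitude.
    oracle = np.eye(N)
    oracle[marked, marked] = -1.0

    # Diffusion operator: inversion about the mean, 2|s><s| - I.
    s = np.full(N, 1.0 / np.sqrt(N))
    diffusion = 2.0 * np.outer(s, s) - np.eye(N)

    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~O(sqrt(N)) vs O(N) classically
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)

    probabilities = np.abs(state) ** 2
    print("most likely outcome:", np.argmax(probabilities))      # 11
    print("success probability: %.3f" % probabilities[marked])   # close to 1

A real quantum device would, of course, apply the oracle and diffusion steps as quantum gates rather than explicit matrices; the simulation only shows why a few iterations suffice where a classical search would examine items one by one.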

D-Wave Systems, a Canadian company, claimed the creation of the first quantum computer in 1999. IBM released the 433-qubit quantum processor ‘Osprey’ in 2022 and has laid out a roadmap towards a 1,121-qubit processor by 2026. Google demonstrated its quantum supremacy in 2019 with the 53-qubit Sycamore processor, completing a random-circuit sampling task in about 200 seconds that was estimated to take present-day classical supercomputers on the order of 10,000 years, while two Chinese teams claimed similar achievements in 2021 (Khang and Quantum et al., 2023; Abd El-Latif et al., 2018).

However, despite huge fund infusions and swift technological developments, quantum computing is still in its infancy, which hinders its application as an autonomous computing machine. A quantum computer prepares qubit states to store and process information, while quantum gates, described by the linear algebra of unitary operations, are used to implement programs. The hardware used in quantum computers is sensitive to its environment, error-prone, fragile, and ephemeral, to the extent that even a small change in physical conditions may destroy the quantum state completely (Khang & Quantum, 2023).
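
To illustrate this fragility in the same linear-algebra picture, the short sketch below (a hypothetical NumPy example; the phase-error angles are arbitrary values chosen for illustration, not measured figures) applies a small uncontrolled phase drift to an ideally prepared superposition and shows how the state fidelity degrades as the perturbation grows.

    import numpy as np

    # Ideal single-qubit preparation: Hadamard on |0> gives (|0> + |1>)/sqrt(2).
    H = np.array([[1.0, 1.0],
                  [1.0, -1.0]]) / np.sqrt(2)
    ideal = H @ np.array([1.0, 0.0])

    # Model a small, uncontrolled phase drift of angle epsilon (radians) acting on the qubit.
    def noisy_state(epsilon):
        phase_error = np.array([[1.0, 0.0],
                                [0.0, np.exp(1j * epsilon)]])
        return phase_error @ ideal

    # Fidelity |<ideal|noisy>|^2 measures how much of the intended state survives.
    for epsilon in (0.01, 0.1, 0.5, 1.0):
        f = abs(np.vdot(ideal, noisy_state(epsilon))) ** 2
        print("phase error %.2f rad -> fidelity %.4f" % (epsilon, f))

Even this idealised single-qubit model loses a noticeable fraction of its fidelity under a modest phase drift; on real hardware such drifts act continuously and on many qubits at once, which is why error correction and careful environmental isolation dominate current engineering effort.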
