Exploring the Potential of Quantum Computing in AI, Medical Advancements, and Cyber Security

Copyright © 2024 | Pages: 20
DOI: 10.4018/979-8-3693-1479-1.ch004

Abstract

Quantum computing is a modern method of computing that depends on the remarkable phenomena of quantum mechanics, elegantly combining several scientific domains. By controlling and manipulating microscopic elements such as atoms, electrons, and photons, it outperforms conventional computers in terms of computational power, efficiency, and speed. The fundamental ideas and concepts of quantum computing are discussed in this chapter. The chapter traces the history of basic computing techniques, along with the changes and improvements made to overcome their disadvantages up to this point. It also discusses the structure, hardware, software, categories, types, and protocols that quantum methods generally require, so that readers can understand the overall potential and limitations of realistic quantum devices that can be released to the public. Before moving to the advantages of quantum computing and analyzing its potential, it is recommended to review some background information.
Chapter Preview

2. History of Quantum Computing

Quantum computing began in 1980, when the idea of a quantum mechanical model of the Turing machine was first proposed. After that, it was suggested (Calamuneri, A., Costa, A., D’Angelo, R., & Sidoti, A., 2017) that quantum devices could realistically handle tasks that conventional devices cannot perform. Further developments in quantum techniques were introduced by Feynman in 1986. In 1994, Peter Shor created a quantum algorithm that could break RSA-encrypted communications by efficiently factoring large composite numbers. In 1998, Isaac Chuang, Neil Gershenfeld, and Mark Kubinec developed the first two-qubit quantum computer that could carry out calculations. After the 1990s, the majority of researchers came to believe that identifying and correcting faults was the hardest task in the quantum environment, and much research shifted in that direction. In 2015, research from Duke University estimated that a large fault-tolerant quantum computer with approximately 3.5 million qubits could factor a 2,048-bit number in about 150 days. Both the public and private sectors have increased their investments in quantum computing research in recent years. On October 23, 2019, Google AI and NASA announced that they had performed a quantum computation that would be practically impossible for traditional computers. Whether this claim of quantum supremacy truly holds is still the subject of ongoing research. A timeline visualization is shown in Figure 1.

Figure 1. History of quantum computing
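To make Shor's result concrete, the following sketch shows the classical reduction at the heart of his algorithm: factoring n reduces to finding the period r of f(x) = a^x mod n. Here the period is brute-forced classically; that step is precisely what Shor's quantum order-finding subroutine speeds up exponentially. This is an illustrative sketch only, not material from the chapter: the names find_period and shor_factor are ours, and n is assumed to be an odd composite that is not a prime power.

import math
import random

def find_period(a, n):
    # Classically brute-force the period r of f(x) = a^x mod n,
    # i.e. the smallest r > 0 with a^r = 1 (mod n). Requires
    # gcd(a, n) == 1. This exponential-time search is the step
    # that Shor's quantum order-finding subroutine replaces.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n):
    # Find a nontrivial factor of n (assumed: odd composite,
    # not a prime power) via the classical post-processing
    # used in Shor's algorithm.
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g          # lucky guess: a already shares a factor with n
        r = find_period(a, n)
        if r % 2 == 1:
            continue          # need an even period; retry with a new base
        half = pow(a, r // 2, n)
        if half == n - 1:
            continue          # a^(r/2) = -1 (mod n) yields only trivial factors
        return math.gcd(half - 1, n)

print(shor_factor(15))  # prints 3 or 5, depending on the random base

On a real quantum device, find_period would be replaced by the quantum-Fourier-transform-based order-finding circuit, which is where the exponential speedup over classical factoring comes from.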
