Breakthroughs in Information Technology and Their Implications for Education and Health

Xinwei Shi (Tsinghua University, China) and Jian Lei (Beihang University, China)
DOI: 10.4018/978-1-7998-6772-2.ch005

Abstract

This chapter reviews the major developments in information technology (IT) since 2000 and the contributions these developments have made to industrial growth. Recently, big data, the internet of things, artificial intelligence, the fifth-generation mobile communications (5G), and other new network-based IT are accelerating the promotion and application of new business models, new formats, and new sectors. IT has become the main driving force of the digital economy. The chapter also discusses likely IT breakthroughs during 2021-40 and the potential opportunities these developments might bring in the education, health, and other sectors. It is expected that the new generation of IT will be driven by the 5G/6G, the internet of everything, and artificial superintelligence.

Introduction

From the perspective of social development history, mankind has experienced the agricultural revolution and the industrial revolution, and is now experiencing the information technology (IT) revolution. The IT revolution has brought a qualitative leap in productivity and has had a profound impact on international politics, the economy, and society. The term IT was coined by Harold J. Leavitt and Thomas L. Whisler in their 1958 Harvard Business Review article (Leavitt and Whisler, 1958). IT generally refers to the processing and distribution of data using computer systems, software, and networks. Currently, the global IT evolution is reshaping established industrial and value chains and entering a new stage in which a group of new technologies is emerging, including 5G, artificial intelligence, the Internet of Things, and so forth.

Currently, the COVID-19 pandemic is changing, or has already changed, our collective understanding of how to address uncertainty, as there is no reference case for this kind of crisis in living memory (Durodié, 2020). Importantly, the pandemic is more global in scope, more profound and far-reaching in impact, and more complex than any other crisis that today's decision-makers in every sector have experienced. Much evidence, however, has shown that IT has contributed significantly to helping people in different sectors confront the COVID-19 pandemic (Chamola et al., 2020; Gozes et al., 2020; Vaishya et al., 2020). The need for IT is therefore likely to become more important than that for any other industry. Nevertheless, a systematic review of IT developments over the past two decades, together with a specific forecast of IT breakthroughs between 2020 and 2040, is yet to be developed. This chapter aims to bridge that gap.

From 2000 to 2020 we have seen three major IT development trends: digitalization, networking, and intelligence (Brennen et al., 2016; Haenlein et al., 2019; Winzer et al., 2018). We need to grasp the focus of this generation of IT in order to predict its likely developments over the next two decades (see Figure 1). First, digitization is the foundation of social informatization, and its development trend is the comprehensive digitization of society. Second, networking provides the physical carrier for information dissemination, and its development trend is the Internet of Everything. Third, intelligence reflects the level of information application, and its development trend is a new generation of intelligence, so-called super-intelligence (i.e., machines that mimic human emotion). Over the next two decades, we believe that IT breakthroughs will be further driven by big data, 5G/6G, and strong AI. The potential path of technological breakthroughs and the continued development trajectory of business models are illustrated through practical examples in the chapter.

Figure 1.

Development Trends of IT from 2000 to 2040


IT Developments During 2001-2020

IT is a broad field that includes all functions and processes relating to computers or technology in an organization, chiefly computers and networks (Leavitt and Whisler, 1958). IT typically concerns physical hardware, operating systems, applications, data storage, databases, servers, and so on. Specifically, the information life cycle in IT involves four links: the acquisition, transmission, processing, and application of information (Matsuyama et al., 2005). Four key technologies that have driven the development of IT systems are integrated circuit technology, computer technology, communication technology, and software technology.

First, in 1959 Robert Noyce, co-founder of Fairchild Semiconductor, invented the world's first monolithic integrated circuit (Berlin et al., 2001). In 1968, Noyce and several other Fairchild employees established Intel, which produced the world's first commercial CPU, the Intel 4004, in 1971 (Danowitz et al., 2012). On the whole, integrated circuit technology is seen as the foundation of IT development.
