Redefining the Information Technology in the 21st Century
Ruben Xing (Montclair State University, USA), Zhongxian Wang (Montclair State University, USA) and Richard L. Peterson (Montclair State University, USA)
Copyright © 2013 | Pages: 10
DOI: 10.4018/978-1-4666-2782-6.ch001


As one of the most influential and beneficial developments of our time, Information Technology (IT) is rapidly transforming business infrastructures and reshaping how people work and live. To keep pace with the fast-changing trends of the 21st century, Information Technology should be redefined in five major areas: Power of Computing, Internet-working and Telecommunications, New Features with Emerging Trends, Security and Disaster Recovery, and Green IT.
Chapter Preview


As computing and Internet technologies continue to innovate rapidly in the 21st century, the ways people conduct business, manage their social lives, and carry out daily activities are being further transformed. IT innovations have created new products and services, destroyed old business models and replaced them with new ones, disrupted entire industries, built new business processes, and transformed the day-to-day conduct of business (Landon, 2007). When forming business strategy, it is crucial for industry leaders, business executives, and technology developers to refresh their traditional concepts of information technology. According to current research, the following five topics best reflect the contemporary IT domain:

  1. Challenge of computing power and Moore's Law;
  2. Challenge of Internet development, the key field of the IT movement today;
  3. Trend of IT strategic changes;
  4. Top priority: information security and disaster recovery planning; and
  5. Mission critical: global warming and green IT.

This chapter critically examines these five areas of development, points out their impact on the future of IT, and emphasizes how industries should reshape their business strategies accordingly.


Challenge Of Computing Powers

There are two major misconceptions in the IT industry regarding the future development of computing technologies. The first holds that processing speed has reached its limit and that computing power will stay at its current level, with no room for further increases. The second holds that Moore's Law will no longer apply.

To clarify these misconceptions, it is necessary to understand Moore's Law correctly. In 1965, Intel co-founder Gordon Moore made a provocative prediction that "the number of transistors on a chip will double about every two years" (Intel, 2009). This became known as Moore's Law.

On the technical side, the more transistors integrated into a processor, the faster its processing speed; likewise, the larger the storage capacity, the more powerful the performance a computer can deliver. The research chart in Figure 1 shows that the first generation of personal computer processors contained roughly 1,000 transistors in the 1970s. Since then, computing power has doubled roughly every 18 months. By the time the Intel Itanium 2 processor was introduced in 2005, it integrated nearly 1 billion transistors. This history of development shows that Moore's prediction was remarkably accurate.

Figure 1.

Computing power doubled along Moore’s Law
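As a quick illustration of the doubling arithmetic described above (not taken from the chapter), the exponential growth can be sketched in Python. The function name, the 1971 starting year, and the 1,000-transistor starting count are illustrative assumptions chosen to match the figures cited in the text:

```python
def projected_transistors(start_count, start_year, year, doubling_period_years=2.0):
    """Project a transistor count under Moore's Law-style exponential doubling.

    The doubling period is a parameter: Moore's 1965 prediction used roughly
    two years, while the chapter notes observed doubling closer to 18 months.
    """
    periods = (year - start_year) / doubling_period_years
    return start_count * 2 ** periods

# Starting from ~1,000 transistors around 1971, project forward to 2005.
# With a 2-year doubling period this yields on the order of 10^8 transistors;
# a shorter observed period (e.g., ~18 months) pushes the projection past 10^9.
print(f"{projected_transistors(1_000, 1971, 2005):.2e}")
print(f"{projected_transistors(1_000, 1971, 2005, doubling_period_years=1.5):.2e}")
```

The exact projected count is sensitive to the assumed doubling period, which is why estimates of "every two years" versus "every 18 months" differ by an order of magnitude over three decades.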


However, observers noted that processor clock speed stalled when it reached 3.6 GHz by the end of 2005 (Figure 2). This led many business managers and industry leaders to believe that computing speed could climb no higher and that Moore's Law had reached its limit.

Figure 2.

CPU speed dropped after 2005
