Artificial Intelligence, the Risk to Do Something Against the Man: Innovation Must Be Governed to Avoid Disgregation

Anna Verrini
DOI: 10.4018/978-1-7998-7126-2.ch008

Abstract

We began speaking about artificial intelligence a long time ago. For many years it was little more than a dream; now, with tremendous computing power and data availability, we live in a kind of reality where everything seems possible, because we can train machines to do more or less everything a human can do, both with the hands and with the brain. But we must keep in mind that while we can train machines to do what we do, in the way we do it, it is far more difficult to train them to take decisions the way we do, and with the speed that is usually necessary. And what about ethics? What kind of future can we design? This chapter explores the risks of artificial intelligence.

Introduction

When speaking about Artificial Intelligence, it is almost mandatory to start with Marvin Minsky, one of the pioneers of the field.

Minsky, an American mathematician and cognitive scientist, was born in New York in 1927 and died in Boston in 2016. He devoted most of his research to Artificial Intelligence and was a co-founder of the Massachusetts Institute of Technology's AI laboratory. There is no need to recount his full biography here, since he is well known to most people with an interest in AI, computer science, and innovation. He received the ACM Turing Award, the MIT Killian Award, the Japan Prize, the IJCAI Research Excellence Award, the Rank Prize, the Robert Wood Prize for Optoelectronics, and the Benjamin Franklin Medal.

Figure 1. Marvin Minsky

Minsky was an adviser on Stanley Kubrick's film 2001: A Space Odyssey and is also mentioned explicitly in Arthur C. Clarke's novel of the same title (Clarke, 1968). He argued that “somewhere down the line, some computers will become more intelligent than most people,” but that it was very hard to predict how fast progress would be. He cautioned that an artificial superintelligence designed to solve an innocuous mathematical problem might decide to assume control of Earth's resources to build supercomputers to help achieve its goal, but he believed that such negative scenarios are “hard to take seriously” because he felt confident that AI would go through a lot of testing before being deployed (Jerusalem Post, May 13, 2014).

From the very beginning, science and science fiction have moved forward together, with science fiction seeming to anticipate what will happen at some point in the future without knowing exactly when. Many times the stories of science fiction become real life, with some modification of course, yet remaining very close to what was created in the fiction. Everyone should be convinced that the better course is for humans to keep in their own hands the ability to design and write their future (Minsky, 1967; Minsky, 1986; Minsky, 2006).

And this is what everybody would like to believe: that the worst scenarios cannot become reality, because everybody is convinced, and hopes, that the human mind will keep governance over innovation and evolution (Stephen Hawking, The Guardian, December 2, 2014; Elon Musk, Vox, November 2, 2018).


The Computer From The Beginning

From the very beginning, the computer has had the capacity to strike people's imagination and open futuristic views. The computer is often the “star” of a science fiction novel, and it has proved to be a character that can draw a lot of readers. In everyday life as well, computers and the people who work with them fascinate many.

Since the beginning of human history, starting around 2,100 years before Christ in China and later among the ancient Greeks and Romans, the abacus helped people make calculations. A first attempt to design and build a general-purpose computer was Charles Babbage's Analytical Engine.

Figure 2. Charles Babbage machine
