Background
Artificial intelligence (AI) and data analytics have profoundly shaped recent business development. This article aims to identify practical business ethics for data protection by evaluating data-driven and human-centered approaches in two startup businesses. In early 2020, a picture of the “AI factory” was depicted in Harvard Business Review:
At the core of the new firm is a decision factory—what we call the “AI factory.” Its software runs the millions of daily ad auctions at Google and Baidu. Its algorithms decide which cars offer rides on Didi, Grab, Lyft, and Uber. It sets the prices of headphones and polo shirts on Amazon and runs the robots that clean floors in some Walmart locations. It enables customer service bots at Fidelity and interprets X-rays at Zebra Medical. In each case the AI factory treats decision-making as a science. Analytics systematically convert internal and external data into predictions, insights, and choices, which in turn guide and automate operational workflows. (Iansiti & Lakhani, 2020, p. 62)
This is daily life under technological hegemony. A world controlled by AI seems distant, but the world is clearly marching along that path. The article further indicated:
Four components are essential to every factory. The first is the data pipeline, the semiautomated process that gathers, cleans, integrates, and safeguards data in a systematic, sustainable, and scalable way. The second is algorithms, which generate predictions about future states or actions of the business. The third is an experimentation platform, on which hypotheses regarding new algorithms are tested to ensure that their suggestions are having the intended effect. The fourth is infrastructure, the systems that embed this process in software and connect it to internal and external users. (Iansiti & Lakhani, 2020, p. 63)
People’s reluctance to be controlled by AI can be addressed through these four aspects: data pipelines, algorithms, experimentation platforms, and infrastructure. As radical as the idea may seem, approaches must be identified to help overcome this hurdle.
AI is generally considered a new technology, yet technology, namely the algorithm, accounts for only one of AI’s four components; engineering makes up the remaining three. The first AI algorithms were developed over 70 years ago (Lewis, 2014), and new algorithms have been developed continuously since then. It is the synergy of data, platforms, and infrastructure that has driven the recent advances in AI.
Discourses on deep learning and machine learning essentially concern algorithms, and legally regulating the development of AI algorithms is difficult. Thus, the direction of AI development can be influenced only through data, platforms, and infrastructure. Data refers to the input (i.e., the content from which the AI learns). Platform refers to the organization formed by a company or institution operating the AI business; the organization is composed mainly of real persons and engages in experimentation, namely conducting tests or trials, to further improve the business. Infrastructure refers to the engineering framework that enables smooth AI operation; it generally means hardware.
AI cannot exist without data, irrespective of how efficient the algorithms, organizations, and hardware are. AI generates A if data are oriented toward A; AI generates B if data are oriented toward B. Thus, data are the key component of AI. However, further understanding of the origins of data is required.
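The claim that output follows the orientation of the data can be illustrated with a deliberately minimal sketch (not a real AI system, and not from the article): a toy “model” that simply predicts the most frequent label in whatever training data it receives. The function and variable names here are hypothetical.

```python
from collections import Counter

def train(data):
    # The "algorithm" is identical in every run; only the data differ.
    # It returns the most frequent label in the training data.
    return Counter(data).most_common(1)[0][0]

data_a = ["A", "A", "A", "B"]  # data oriented toward A
data_b = ["B", "B", "B", "A"]  # data oriented toward B

print(train(data_a))  # prints "A"
print(train(data_b))  # prints "B"
```

The same code yields opposite outputs depending solely on its input, which is the sense in which data, rather than the algorithm alone, determines what an AI system produces.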
Data input is mostly a semiautomated process, also known as data collection. On online social networks, data are generated through users’ activity (Cranshaw et al., 2012, p. 58). From an engineering perspective, data are generated through user interactions; from a humanistic perspective, data are produced through user behavior. Strictly speaking, however, the user does not generate the data: an automated process, such as a computer that records the user’s usage behavior automatically, does (Kosinski et al., 2013, p. 5802).
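The distinction between a user acting and a system recording can be made concrete with a small hypothetical sketch (the class and function names are illustrative, not from any cited system): the user merely clicks, while the platform’s code appends a timestamped log entry as a side effect.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class InteractionLog:
    """Hypothetical platform-side log of user interactions."""
    events: List[dict] = field(default_factory=list)

    def record(self, user_id: str, action: str, target: str) -> None:
        # The timestamp and the log entry itself are produced by the
        # system, not deliberately authored by the user.
        self.events.append({
            "user": user_id,
            "action": action,
            "target": target,
            "ts": datetime.now(timezone.utc).isoformat(),
        })

def simulate_click(log: InteractionLog, user_id: str, target: str) -> None:
    # From the user's perspective this is just a click; the data point
    # is a byproduct recorded automatically by the platform.
    log.record(user_id, "click", target)

log = InteractionLog()
simulate_click(log, "u123", "post/42")
simulate_click(log, "u123", "ad/7")
print(len(log.events))  # prints 2
```

Two data points exist that the user never consciously “created,” which is the engineering sense in which data collection is semiautomated.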