Introduction
Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user (Foster, Zhao, Raicu, & Lu, 2008). It enables increased productivity and efficiency (Armbrust, Fox, Griffith, & Joseph, 2010). The cloud paradigm changes various processes, patterns, practices, and philosophies, and its adoption must be carefully analyzed. The authors in (Abdelwahhab & Mostefai, 2020) provide a practical framework for decision-making on cloud paradigm adoption, based on cloud standards and best practices. Despite the appeal of cost-effectiveness, on-demand service, and scalability, cloud computing faces many challenges, such as security, performance, orchestration, and fault tolerance. Several investigations propose technologies and tools to overcome these challenges, such as the task-scheduling models presented in (Alakbarov, 2022) and (Tuli & Malhotra, 2022), the fault-mitigation model described in (Adeyinka Osuolale, 2022), and the capability-based access control proposed in (Kaushik & Gandhi, 2020), which ensures that only authorized users are able to access the data.
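To illustrate the general idea behind capability-based access control, the sketch below shows a minimal token-based store in TypeScript. It is an assumption-laden illustration of the concept, not the scheme from (Kaushik & Gandhi, 2020): access is granted to whoever presents a valid capability token that covers the requested operation, rather than by looking up an identity in an access-control list. All names (`CapabilityStore`, `issue`, `check`) are hypothetical.

```typescript
// Minimal sketch of capability-based access control (illustrative only).
// A capability binds an unforgeable token to a resource and a set of rights.

type Operation = "read" | "write";

interface Capability {
  token: string;          // secret handed to an authorized user
  resource: string;       // data object the capability grants access to
  operations: Operation[];
}

class CapabilityStore {
  private caps = new Map<string, Capability>();

  // Issue a capability for a resource and return its token.
  // (Random string for demo purposes; a real system would use crypto.)
  issue(resource: string, operations: Operation[]): string {
    const token = Math.random().toString(36).slice(2);
    this.caps.set(token, { token, resource, operations });
    return token;
  }

  // Access is granted only if the presented token exists, matches the
  // resource, and covers the requested operation.
  check(token: string, resource: string, op: Operation): boolean {
    const cap = this.caps.get(token);
    return cap !== undefined && cap.resource === resource && cap.operations.includes(op);
  }

  // Revoking a capability immediately cuts off the holder's access.
  revoke(token: string): void {
    this.caps.delete(token);
  }
}
```

A holder of a read-only token can read but not write the protected data, and revocation takes effect on the next check, which is the core property such schemes rely on.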
Edge computing (Shi, Cao, Zhang, Li, & Xu, 2016) as well as Fog computing (Ahuja & Wheeler, 2020) emerged as natural responses to these challenges. These distributed computing paradigms bring computation and data storage closer to where they are needed by extending the Cloud to Internet of Things (IoT) devices. In (Shi, Cao, Zhang, Li, & Xu, 2016) the authors not only elaborate on the potential of this distribution paradigm, but also expose the main challenges that must be addressed to guarantee its viability and popularize the technology.
Modern edge computing (Chelliah & Surianarayanan, 2021) extends this approach through virtualization technologies that make it easier to deploy and run a wider range of applications on edge servers and to take advantage of largely unused computational resources, such as those present in IoT gadgets. Some examples are the workflow-scheduling algorithms that consider cost, energy, and load balancing in heterogeneous environments described in (Bisht & Vampugani, 2022); the concurrency control protocol for IoT transactions presented in (Al-Qerem, Alauthman, Almomani, & Gupta, 2020); the heuristic algorithms for virtual machine placement and workload assignment defined in (Wang, Tornatore, & Zhao, 2021); and the heuristics on the cost-effectiveness of user allocation solutions that aim to maximize the number of users allocated to edge servers while minimizing the number of required edge servers (Lai, He, Grundy, & Chen, 2020).
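The user-allocation objective mentioned last can be made concrete with a small greedy sketch: each user is reachable by some subset of edge servers, each server has limited capacity, and we repeatedly open the server that can absorb the most still-unallocated users. This is an illustrative heuristic under assumed data structures, not the algorithm from (Lai, He, Grundy, & Chen, 2020).

```typescript
// Greedy sketch of cost-effective user allocation: maximize the number of
// allocated users while opening as few edge servers as possible.

interface EdgeServer {
  id: string;
  capacity: number;      // how many users this server can host
  coverage: Set<string>; // user ids within the server's signal range
}

function allocateUsers(servers: EdgeServer[], users: string[]): Map<string, string> {
  const allocation = new Map<string, string>(); // user id -> server id
  const unallocated = new Set(users);

  while (unallocated.size > 0) {
    // Pick the server whose opening serves the most unallocated users.
    let best: EdgeServer | undefined;
    let bestGain = 0;
    for (const s of servers) {
      const reachable = [...unallocated].filter((u) => s.coverage.has(u));
      const gain = Math.min(reachable.length, s.capacity);
      if (gain > bestGain) {
        best = s;
        bestGain = gain;
      }
    }
    if (!best || bestGain === 0) break; // remaining users cannot be served

    // Assign up to `capacity` reachable users to the chosen server.
    let placed = 0;
    for (const u of [...unallocated]) {
      if (placed >= best.capacity) break;
      if (best.coverage.has(u)) {
        allocation.set(u, best.id);
        unallocated.delete(u);
        placed++;
      }
    }
    best.capacity = 0; // mark the server as consumed in this sketch
  }
  return allocation;
}
```

The greedy choice captures the tension in the objective: every opened server should pay for itself by serving as many users as possible, so servers with small marginal gain are opened last or not at all.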
In the context of the edge computing paradigm, Hive (Teragni, Moran, & Zabala, 2020) - an abstraction layer compatible with standard JavaScript and Node.js - was designed and implemented. It provides a distributed shared memory on top of existing web browsers, such as those present in smartphones or tablets. In this way, Hive enables developers to take advantage of the largest pool of unused processing power available today, without incurring the extra cost of deploying a network of devices. To this end, Hive provides a virtual cloud server that enables collaboration and sharing among applications deployed on different distributed devices.
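To give a flavor of what a browser-hosted distributed shared memory involves, the sketch below models a key-value space whose writes notify every subscribed client. This is a hypothetical interface under stated assumptions, not Hive's actual API: all names (`SharedMemory`, `set`, `subscribe`) are illustrative, and a real deployment would propagate updates over the network (e.g., WebSockets) rather than through in-process callbacks.

```typescript
// Hypothetical sketch of a distributed shared memory for browser clients:
// a key-value space where every write is broadcast to subscribers, so
// devices collaborating on a task can observe each other's updates.

type Listener = (key: string, value: unknown) => void;

class SharedMemory {
  private store = new Map<string, unknown>();
  private listeners: Listener[] = [];

  // Write a value and notify every subscribed client of the update.
  set(key: string, value: unknown): void {
    this.store.set(key, value);
    for (const l of this.listeners) l(key, value);
  }

  // Read the current value for a key (undefined if never written).
  get(key: string): unknown {
    return this.store.get(key);
  }

  // Register interest in updates; in a real system this callback would
  // fire on messages arriving from the coordinating server.
  subscribe(listener: Listener): void {
    this.listeners.push(listener);
  }
}
```

With such a primitive, a browser on one device can publish a partial result and every other participating browser sees it, which is the kind of collaboration and sharing the virtual cloud server described above enables.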