Heterogeneous Large-Scale Distributed Systems on Machine Learning

Karthika Paramasivam, Prathap M., Hussain Sharif
DOI: 10.4018/978-1-7998-3591-2.ch004

Abstract

TensorFlow is an interface for expressing machine learning algorithms and an implementation for executing such algorithms. A computation expressed using TensorFlow can be executed with little or no change on a wide variety of heterogeneous systems, ranging from mobile devices such as phones and tablets up to large-scale distributed systems of many machines and thousands of computational devices such as GPU cards. The system is flexible and can be used to express a wide variety of algorithms, including training and inference algorithms for deep neural network models, and it has been used to conduct research and to deploy machine learning systems in more than a dozen areas of computer science and other fields, including speech recognition, computer vision, robotics, information retrieval, natural language processing, geographic information extraction, and computational drug discovery. This chapter describes the TensorFlow interface and an implementation of that interface built at Google.

Programming Model and Basic Concepts

A TensorFlow computation is described by a directed graph composed of a set of nodes. The graph represents a dataflow computation, with extensions that allow some kinds of nodes to maintain and update persistent state and that support branching and looping control structures within the graph, in a manner similar to Naiad. Clients typically construct a computational graph using one of the supported frontend languages. An example fragment that builds and executes a TensorFlow graph using the Python front end is shown in Figure 1, together with the resulting computation graph. Each node in a TensorFlow graph has zero or more inputs and zero or more outputs, and represents the instantiation of an operation. Values that flow along the normal edges of the graph (from outputs to inputs) are tensors: arbitrary-dimensionality arrays whose underlying element type is specified or inferred at graph-construction time. Special edges, called control dependencies, can also exist in the graph: no data flows along such edges, but they specify that the source node of the control dependence must finish executing before the target node of the control dependence starts to execute. Since our model includes mutable state, control dependencies can be used directly by clients to enforce happens-before relationships. The implementation also sometimes inserts control dependencies to order otherwise independent operations, for example as a way of controlling peak memory usage.
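As a concrete illustration of this programming model, the sketch below builds and runs a small graph through the TensorFlow 1.x-style graph API exposed as tensorflow.compat.v1. It is a minimal, hypothetical example rather than the chapter's Figure 1: the names W, b, x and the cost expression are placeholders chosen only to show stateful Variable nodes, a placeholder fed at run time, and an explicit control dependency.

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # use the graph-construction model described above

# Each call below adds a node (an operation) to the dataflow graph.
b = tf.Variable(tf.zeros([100]))                              # stateful node holding a bias vector
W = tf.Variable(tf.random.uniform([784, 100], -1.0, 1.0))     # stateful node holding a weight matrix
x = tf.placeholder(tf.float32, shape=[None, 784], name="x")   # input supplied at run time
relu = tf.nn.relu(tf.matmul(x, W) + b)                        # ReLU(xW + b); tensors flow along the edges
cost = tf.reduce_mean(relu)                                   # illustrative scalar "cost"

# A control dependency: no data flows along this edge, but the print node
# must finish executing before gated_cost is allowed to execute.
msg = tf.print("evaluating cost")
with tf.control_dependencies([msg]):
    gated_cost = tf.identity(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())               # initialize the Variable nodes
    batch = np.random.rand(8, 784).astype("float32")
    print(sess.run(gated_cost, feed_dict={x: batch}))         # execute the graph, feeding x

With eager execution disabled, none of these lines performs numerical work by itself; computation happens only when Session.run is called on the fetched node, which is what lets the runtime place and schedule the graph across heterogeneous devices.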

Operations and Kernels

In recent years we have observed large changes in the economy, and in marketing in particular, as a result of Internet expansion, globalization, and the ubiquitous availability of information. One of the scientific fields that benefited from this is data analysis, known under various names: statistics, data mining, machine learning, intelligent data analysis, knowledge discovery. Many new data analysis techniques have emerged that exploit the availability of more, and more varied, data from several sources, as well as the increased computational power of today's computers. Examples of these techniques include support vector machines, text analytics, association rules, ensemble methods, and subgroup discovery. They have been accepted into the standard analytics toolbox of many disciplines: genetics, engineering, medicine, vision, statistics, marketing, and others.
