Big Data on E-Government

Mohd. Shahid Husain (Integral University, India) and Neha Khan (Integral University, India)
Copyright: © 2017 |Pages: 10
DOI: 10.4018/978-1-5225-1703-0.ch002

Abstract

All aspects of big data need to be thoroughly investigated, with emphasis on e-governance: its needs, its challenges, and its framework. This chapter recognizes that e-governance needs big data to be reliable, fast, and efficient, and that the trust of citizens is a central concern. Extracting meaningful information from a large variety of data is a critical issue in big data, so new approaches must be developed. This chapter discusses the key concepts of veracity in big data for e-governance, with the aim of providing a comprehensive overview of big data in e-governance. E-government is still struggling to reach an advanced level of development: current e-government applications handle only structured data, and sharing data between applications is difficult.
Chapter Preview

Challenges Of Big Data

The problems of big data are complex to analyze and solve; a practical approach is to classify each problem according to its data format. Handling big data is difficult in every field, and these challenges require the design of new advanced architectures, algorithms, visualization techniques, etc. The main challenges in handling big data are:

  • High computation and storage requirements

  • New, more advanced algorithms

  • New architectures

  • Reduction in data dimensionality

  • Scalability (scaling up and scaling down)

  • Performance improvement

  • Data security

  • Workload diversity

  • Continuous availability of services at reasonable cost


Characteristics Of Big Data

Big data can be characterized in terms of volume, velocity, variety, veracity, and complexity.

  • Volume: The amount of data generated is very important; this volume determines the potential value of the data and has strongly influenced data processing techniques. Telecom companies, for example, typically process from 100 million to half a billion call detail records (CDRs) per day, and providers also need to deliver real-time information to consumers. With traditional techniques it is practically impossible to provide such services.

  • Variety: This defines the different forms of data, i.e., whether the data is structured, unstructured, or semi-structured. IBM estimates that over 90% of real-time data is unstructured. New types of data also give rise to new risks.

  • Velocity: Velocity refers to the speed at which data is generated and processed. Data movement is now real time, and this high velocity is a defining feature of big data.

  • Veracity: This is one of the main characteristics of big data, as the accuracy of any analysis depends on the veracity, or trustworthiness, of the source data.

  • Complexity: Managing big data is a very complex process, especially when large amounts of data come from multiple sources. These data need to be properly connected and correlated in order to be understood.
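The variety and complexity characteristics above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example (the record formats and field names are assumptions, not taken from the chapter): it classifies incoming records as structured, semi-structured, or unstructured and normalizes them into a common form, the kind of preprocessing an e-governance application would need before correlating data from multiple sources.

```python
import csv
import io
import json

def classify_and_normalize(raw: str) -> dict:
    """Classify a raw record as structured (CSV), semi-structured (JSON),
    or unstructured (free text), and normalize it to a common dict form."""
    # Semi-structured: try JSON first.
    try:
        obj = json.loads(raw)
        if isinstance(obj, dict):
            return {"kind": "semi-structured", "fields": obj}
    except ValueError:
        pass
    # Structured: a simple comma-separated row (assumed schema: id,name,value).
    if raw.count(",") == 2:
        row = next(csv.reader(io.StringIO(raw)))
        return {"kind": "structured",
                "fields": dict(zip(["id", "name", "value"], row))}
    # Unstructured: keep the free text as-is.
    return {"kind": "unstructured", "fields": {"text": raw}}

records = [
    '101,water-supply,42',                       # structured CSV row
    '{"id": "102", "complaint": "road damage"}', # semi-structured JSON
    'Citizen reports a broken streetlight.',     # unstructured text
]

for r in records:
    print(classify_and_normalize(r)["kind"])
```

In a real system each normalized record would then be routed to format-specific storage and analysis, but the sketch shows why a single rigid schema, as used by many current e-government applications, cannot accommodate this variety.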
