Big Data: A Term
The extensive use of emerging internet and mobile technologies marks the genesis of the big data age, characterized by huge volumes of complex and rapidly growing data in a variety of forms. In 2020, around 2.5 quintillion bytes of data were generated by internet users every day, amounting to roughly 1.7 megabytes per person per second. Google search statistics for 2021 projected over 1.2 trillion searches per year, or about 40,000 search queries per second.
Although big data is now omnipresent, the exact origin of the term remains uncertain. It is reported that in the mid-1990s, John Mashey coined the term during a lunch-table conversation at Silicon Graphics Inc. (SGI) (Diebold, 2012). A frequently cited definition is given by Laney (2001): "Data generated very fast that contains an extensive volume of content". The term had become well known by 2011 (Gandomi & Haider, 2015).
Beyond sheer mass of data, big data has been defined by several other characteristics. Doug Laney framed the challenges and opportunities of growing data as the 3 Vs model: increased Volume, Velocity, and Variety. Later, in 2011, IDC (International Data Corporation), one of the most prominent names in big data, observed that the evolution of tools and technologies is driven by the need to extract value from big data. This yields the fourth V: Value. The 4 Vs were widely accepted because they drew attention to the usefulness of big data, highlighting its most significant problem: if data is not utilized properly, it remains merely a bunch of data. A fifth V, Veracity, has since been added; it concerns the provenance or reliability of the data source and verifies how meaningful the data is, i.e., its truthfulness. The expansion of Vs has not stopped there: big data continues to be characterized by further Vs such as Validity, Variability, Vocabulary, Venue, and Vague, each defining new parameters (Sun, 2018).