by Helen Johnson | September 14, 2017 10:25 am
According to Grazitti Interactive, a marketing technology company, 90% of the data in the world has been created in the last two years. Moreover, by 2020, 1.7 megabytes of new data are expected to be created every second for every person. These numbers are enormous. Even now data is everywhere, and it is obvious that the future belongs to Big Data.
Why is this happening? The answer is the digitalization of most industries. Big Data has already influenced industry verticals such as media and entertainment, healthcare, education, transportation, banking, manufacturing, retail, and insurance. For example, various wearables, fitness trackers, and smartwatches[1] already help to monitor health and reduce healthcare costs. The digitalization process has also changed the business environment.
In order to ensure the scalability of business activities, companies of different sizes develop their strategies by analyzing large amounts of diverse information, e.g., digital business records and personal data. The gathered data should be processed and stored using special technologies and tools that ensure fast and economical analysis of large volumes of information.
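As a toy illustration of what such tools look like in practice, here is a minimal sketch in PySpark (an assumption; the article does not name any specific technology, and the file name is hypothetical) that aggregates a large set of business records in a way that scales from a small sample to very large volumes:

```python
# A minimal sketch (assumption: PySpark, hypothetical input file) of aggregating
# a large volume of digital business records with a distributed engine.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("business-records-demo").getOrCreate()

# Hypothetical CSV of business records: customer_id, region, amount
records = spark.read.csv("business_records.csv", header=True, inferSchema=True)

# Aggregate revenue per region; Spark distributes the work, so the same code
# runs on a laptop sample or on terabytes of data in a cluster.
revenue_by_region = (
    records.groupBy("region")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("record_count"))
)

revenue_by_region.show()
```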
Gartner Inc. describes Big Data as a combination of high-velocity, high-volume, high-variety, and high-value information assets. But Big Data analysis faces a number of challenges.
Difficulties are created not only by data volume but also by the appearance of fast-evolving sources of information of various types. The variety of available devices with different features, capacities, and resolutions produces an ever-larger amount of personal content, including images, video files, etc.
A data center is not the only place where information is generated. Some data is created outside the data center, which makes it more difficult to gather. Large volumes of data require high storage capacities to deliver information quickly and support successful transactions.
In order to collect, validate, and process large amounts of data, companies require a computing infrastructure able to perform the necessary functions. Data specifics also play a role, as information from various sources can be unstructured, mixed, or have no apparent structure. But not all pieces of information are actually data.
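To make the point about mixed and unstructured sources more concrete, the short sketch below (plain Python, with made-up example records) separates events that parse into a known structure from raw free-text content that does not:

```python
# A minimal sketch (hypothetical records) of handling mixed input: some events
# arrive as semi-structured JSON, others as free text with no structure at all.
import json

raw_events = [
    '{"device": "smartwatch", "heart_rate": 72, "ts": "2017-09-14T10:25:00"}',
    '{"device": "fitness_tracker", "steps": 4312}',
    'user note: felt dizzy after the run',  # free text, no apparent structure
]

structured, unstructured = [], []
for line in raw_events:
    try:
        structured.append(json.loads(line))   # parses as JSON -> semi-structured
    except json.JSONDecodeError:
        unstructured.append(line)             # keep raw text for later analysis

print(f"{len(structured)} structured, {len(unstructured)} unstructured records")
```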
Russell Ackoff, a professor and systems theorist, divided content into five separate categories according to how the human mind interprets it:
- data: raw symbols that carry no meaning on their own;
- information: data processed to answer "who", "what", "where", and "when" questions;
- knowledge: the application of data and information, answering "how" questions;
- understanding: an appreciation of "why";
- wisdom: evaluated understanding.
But the traditional way of accumulating data is not suitable for Big Data, which is closely connected with statistics and artificial intelligence[2]. Requiring the development of new technologies and software, the phenomenon of Big Data also influences the sphere of software product development. As such systems have quite different architectures and operational logic, standardized approaches to their verification will not ensure that they work properly. So QA and testing[3] services should also take into account the specifics of Big Data applications, which will continue to develop.
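As one hedged example of what testing "the specifics of Big Data applications" can mean in practice, the sketch below (assumptions: Python with pytest and a stand-in load_daily_batch() function, neither of which is mentioned in the article) checks the volume and schema of a whole ingested batch rather than individual screens or single values:

```python
# A minimal sketch of a data-oriented check (assumptions: pytest, and a
# hypothetical load_daily_batch() standing in for a real ingestion step).
# Big Data tests typically verify volume, schema, and basic quality of batches.
import pytest

EXPECTED_FIELDS = {"user_id", "event_type", "timestamp"}

def load_daily_batch():
    # Stand-in for a real ingestion step; in practice this would read from
    # HDFS, a message queue, or a data warehouse.
    return [
        {"user_id": 1, "event_type": "click", "timestamp": "2017-09-14T10:25:00"},
        {"user_id": 2, "event_type": "view",  "timestamp": "2017-09-14T10:26:00"},
    ]

def test_batch_is_not_empty_and_has_expected_schema():
    batch = load_daily_batch()
    assert len(batch) > 0, "ingestion produced an empty batch"
    for record in batch:
        assert EXPECTED_FIELDS <= record.keys(), f"missing fields in {record}"
```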
The number of Big Data applications will only grow. Grazitti Interactive predicts the rapid development and popularity of voice search and audio-centric technologies: by 2020, 30% of web browsing is expected to be performed by voice, without a screen. Advanced algorithms will be implemented to positively change the way people work. All of this is possible because of Big Data. But only time will show whether the predictions come true.