In its latest report, Gartner says that most current spending goes toward adapting traditional solutions to big data demands (machine data, social data, widely varied data, unpredictable velocity), and that only $4.3 billion in software sales will be driven directly by demand for new big data functionality this year.
According to Mark Beyer, research vice president at Gartner, big data currently has its most significant impact in social network analysis and content analytics. Among traditional IT supplier markets, he says, application infrastructure and middleware is the most affected, with 10 percent of new spending each year influenced by big data in some way, compared with storage software, database management systems, data integration/quality, business intelligence and supply chain management (SCM).
"Despite the hype, big data is not a distinct, stand-alone market, but it represents an industry-wide market force that must be addressed in products, practices and solution delivery."
Beyer says that big data opportunities emerged when several advances in different IT categories aligned within a short period at the end of the last decade, creating a dramatic increase in computing technology capacity.
"This new capacity, coupled with latent demand for analysis of 'dark data', social network data and operational technology (or machine) data, created an environment highly conducive to rapid innovation."
Starting near the end of 2015, Gartner expects leading organisations to begin using their big data experience in an almost embedded form in their architectures and practices. From 2018, the market analyst firm says, big data solutions will offer an increasingly less distinct advantage over traditional solutions that have incorporated new features and functions to support greater agility in addressing volume, variety and velocity.
However, Beyer says the skills, practices and tools currently viewed as big data solutions will persist, because leading organisations will have incorporated the design principles and acquired the skills needed to address big data concerns as a matter of routine.
"Because big data's effects are pervasive, big data will evolve to become a standardised requirement in leading information architectural practices, forcing older practices and technology into early obsolescence.
"As a result, big data will once again become 'just data' by 2020, and architectural approaches, infrastructure and hardware/software that do not adapt to this 'new normal' will be retired. Organisations resisting this change will suffer severe economic impacts."