To predict the future, one must look at the past, says the old adage. To determine what to expect in 2017, we thought it best to draw lessons from 2016, despite our industry’s yearning for dramatic change. Laks Srinivasan, Co-COO at Opera Solutions, shares his insights into the biggest Big Data trends of 2016 and reflects on where the market is going and how companies will react.
1. No big bang AI. It’s bottom-up AI
Sure, the tech giants are working on flashy AI projects in robotics, drones, and computers that surpass humans at certain tasks. However, what’s more interesting — and what we’ve been witnessing through our current work — is that artificial intelligence is nibbling away at IT, particularly through deep and supervised learning, as enterprises find more ways to apply these techniques for business gain. Expect to see insights everywhere and “smart” everything (meaning things with machine learning components built in): smart data, smart objects, smart modules, smart processes, and smart applications. With AI, analytics are becoming an enterprise’s central nervous system… gradually.
2. Personalization at scale — back to the future
The industry has been talking about this for quite some time, but scaled personalization will continue to improve. Enterprises will seek out and identify specific consumer behaviors that help predict future behavior and then programmatically deliver precise messaging across the customer journey. With millions of customers, automation will be key. If enterprises do this right, consumers will feel as though they have a more personal relationship with the companies they do business with, and that those companies better understand their needs and wants.
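The programmatic loop described above — score a customer’s past behavior, then automatically pick the next message — can be sketched in a few lines. This is a minimal illustration, not anything from Opera Solutions: the categories, weights, and message texts are all hypothetical, and a real system would use learned models rather than simple counts.

```python
from collections import Counter

# Hypothetical message catalog, keyed by interest category.
MESSAGES = {
    "electronics": "New arrivals in gadgets you may like",
    "books": "Picks based on your recent reads",
    "grocery": "Your weekly staples, restocked",
}

def next_message(events):
    """Return the message for the customer's highest-scoring category.

    `events` is a list of (category, weight) pairs; the weight could
    encode recency, so that recent behavior counts more.
    """
    scores = Counter()
    for category, weight in events:
        scores[category] += weight
    if not scores:
        return None  # no history yet: nothing to personalize on
    top_category, _ = scores.most_common(1)[0]
    return MESSAGES.get(top_category)

# One customer's (illustrative) event history, recency-weighted.
history = [("books", 1.0), ("electronics", 0.5), ("books", 2.0)]
print(next_message(history))  # → Picks based on your recent reads
```

The point of the sketch is the automation: once scoring and message selection are code, the same loop runs unchanged over millions of customers.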
3. Predictive analytics: growing expectations and disenchantment
Consumers will continue to demand more predictive capabilities from every company they interact with, à la streaming radio services such as Pandora and Slacker, as well as the recommender engines that power Amazon and Google. CxOs will work toward the “predictive” enterprise as they respond to market demands, but they will face disenchantment and challenges in how quickly and effectively they can achieve these goals.
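For readers less familiar with how the recommender engines mentioned above work, the core item-to-item collaborative filtering idea is simple: items bought by overlapping sets of users are treated as similar, and a customer is shown items similar to what they already own. The sketch below is a toy illustration under that assumption — the data, user names, and item names are invented, and production engines operate at vastly larger scale with precomputed similarities.

```python
import math

# Illustrative purchase data: user -> set of items bought.
purchases = {
    "alice": {"camera", "tripod"},
    "bob": {"camera", "tripod", "lens"},
    "carol": {"camera", "lens"},
}

def item_similarity(a, b):
    """Cosine similarity between the user sets of two items."""
    users_a = {u for u, items in purchases.items() if a in items}
    users_b = {u for u, items in purchases.items() if b in items}
    if not users_a or not users_b:
        return 0.0
    overlap = len(users_a & users_b)
    return overlap / math.sqrt(len(users_a) * len(users_b))

def recommend(user):
    """Rank items the user doesn't own by similarity to items they do."""
    owned = purchases[user]
    candidates = {i for items in purchases.values() for i in items} - owned
    return sorted(
        candidates,
        key=lambda c: sum(item_similarity(c, o) for o in owned),
        reverse=True,
    )

print(recommend("alice"))  # → ['lens']
```

Even this toy version shows why consumers come to expect prediction everywhere: the recommendation falls out of nothing more than other customers’ overlapping behavior.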
4. Good news and bad news: open source’s surge not without consequences
Open-source software has been a boon to Big Data analytics, enabling businesses large and small to embrace the technology and try various applications. It also gives companies the flexibility to adapt at their own pace. However, using open-source software does not guarantee that projects will start or finish on time, and it often incurs significant costs. The complexity of orchestrating the different analytics components within a legacy enterprise environment, along with the need to stay current with the latest versions, will open cracks in companies’ architectures, processes, and resources. More companies will see their Big Data analytics projects slow down and will need a do-over of their analytics architecture.
5. Too many tools and too many services
The proliferation of tools around Big Data, analytics, and even AI gives users and buyers many new options and lowers the barrier to testing and experimentation. But these tools won’t succeed on their own without service capabilities, because they require coordinating an assembly line of Big Data analytics throughout the enterprise. The corollary is also true: professional services alone cannot sustain the need to scale analytics, given the expense and inefficiency involved. It takes business, technology, and data science working together, along with organizational change management, to create business impact. The successful companies of 2017 will be those that take hybrid approaches, tapping data science expertise and applying the right tools. They will reap the best business results.
Laks Srinivasan is Co-COO at Opera Solutions.
This article was originally published at Martech Advisor.