AI is helping people create exciting applications and new ways to serve customers, cure diseases, prevent security threats, and much more. Rapid progress continues to unlock new opportunities in enterprises and scientific research where AI can make a big impact. Many believe that AI's real-world potential is only beginning to be realized. Speaking at a 2016 AI conference in London, Microsoft's chief envisioning officer, Dave Coplin, observed, "This technology will change how we relate to technology. It will change how we relate to each other. I would argue that it will even change how we perceive what it means to be human." By all indications, the best is still to come.
This past year has marked a revival of artificial intelligence (AI). While the term, and the science behind it, have been around for decades, the technology has only recently come into its own for the mainstream enterprise. AI-based tools are pouring into the B2B marketplace, and these offerings represent years of collective development and billions of dollars of investment. For example, distributed in-memory databases and new graph analytic approaches, both accelerated by GPUs, allow enterprises to access and interact with data much faster than before. Big pushes for AI deployments are occurring in manufacturing, transportation, consumer finance, precision agriculture, healthcare and medicine, and many other industries, including the public sector.
AI is becoming important as an enabling technology. Recognizing this, the US federal government's Subcommittee on Machine Learning and Artificial Intelligence recently issued a policy report, "Preparing for the Future of Artificial Intelligence," to provide technical and policy advice on topics related to AI.
So why, after all these years, is AI coming to the forefront? We break it down into three themes that have converged to drive this resurgence. The first is the scale of computation: only recently have technologists figured out how to scale computation so that deep-learning algorithms can take effective advantage of vast amounts of data. The second is the scale of data, as we cross a major milestone in the volume of data collected and used by enterprises. And the third is a shift in mindset, from being overwhelmed by the data deluge to being genuinely data hungry. AI is the answer to this insatiable appetite.
1. Scale of Computation
Over the past 20 years, computers' ability to process information has become simultaneously more powerful and more affordable, allowing us to do more with more information, in less time and for less money. The intersection of these factors makes exploring artificial intelligence for business impact economically advantageous. That said, it's not as though companies haven't explored data science techniques before. Neural networks, which are computationally expensive algorithms, were used in the 1980s and early '90s but fell out of favor in the late '90s, likely because companies realized that the modest gains were not worth the high cost at the time. They are now having a major resurgence: in the past decade, computers became fast enough to run large-scale neural networks at a more palatable cost, and since 2006, advanced neural networks have been used to realize the methods now referred to as deep learning. With the adoption of GPUs (graphics processing units, originally designed nearly 20 years ago for gaming), neural network developers can now harness the computing power required to bring AI to life quickly.
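To make the "computationally expensive" point concrete, here is a back-of-the-envelope sketch in Python. The layer sizes are hypothetical (an MNIST-style classifier, chosen purely for illustration); the arithmetic simply counts the multiply-accumulate operations in one forward pass of a small fully connected network.

```python
def forward_macs(layer_sizes):
    """Multiply-accumulate operations for one forward pass of a
    fully connected network with the given layer widths."""
    return sum(n_in * n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical example: 784 inputs, two hidden layers of 512, 10 outputs.
layers = [784, 512, 512, 10]
per_sample = forward_macs(layers)
print(per_sample)  # 668672 multiply-accumulates for a single input

# Training revisits millions of samples over many epochs (plus a backward
# pass of comparable cost), so even this toy network implies trillions of
# operations -- a workload that only became affordable with modern hardware.
```

Even before counting the backward pass, scaling this to deeper networks and larger inputs shows why the neural networks of the '90s strained the hardware of their day, and why cheap parallel compute changed the calculus.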
Adding GPUs to the mix was pivotal for machine learning, offering orders-of-magnitude speedups on both dense and sparse data. GPUs define the current performance limits for machine learning, but they have limited model capacity; academic researchers are working on methods to mitigate that constraint and achieve near-linear speedups across multiple GPUs on commodity networks. Most enterprise AI projects are still in the experimentation stage, examining problems and deciding whether AI can be applied to solve them. And with GPU power now available through cloud services such as Amazon Web Services, Azure, and Google, the barrier to AI experimentation has been lowered.
2. Scale of Data
Enterprises also appear to be embracing the idea of accessing, processing, and capitalizing on data through data science. This year we will cross into the zettabyte regime in terms of data volume. With that amount of data at their disposal, coupled with the applications it enables, enterprises see more opportunity to gather game-changing insights than ever before. Essentially, the combination of data volume and access to technology has upped the ante: enterprises must use both in new, innovative ways to stay competitive.
3. Shift in Mindset — from Data Deluge to Data Appetite
Companies are no longer drowning in the data deluge; they are looking for new ways to take better advantage of the data they have. Many enterprises have gotten ahead of the deluge by investing in highly scalable software such as Hadoop and by building data lake architectures. Having made those investments, companies now have a voracious appetite for insights, and they have shifted from using data mining to answer traditional hypothesis-based questions to asking what the data can reveal about business blind spots and challenges. They know that to do more with their valuable data assets, they must start using AI, machine learning, and deep learning. Important problems are ripe for solutions built on these technologies, including AI-powered healthcare at scale, AI-powered weather forecasting, AI-accelerated cyber defense, and AI-powered customer service, to name a few. So while data volume is creating a need for AI, AI is simultaneously creating an insatiable desire for even more data.
Humans Still Needed
In general, AI technologies are making headway now thanks to the steady proliferation of data, the advancement and affordability of computing technology, and the applicability of data science to business problems. Together, these forces are inspiring enterprises to take their data science investments and capabilities to the next level.
One hurdle organizations still need to overcome is a shortage of human talent. With an influx of data, technology, and capability, enterprises must find the skilled data scientists required to build and run their AI applications. Many will turn to third-party partners or buy solutions that automate much of the data science process. Such solutions are evolving to maximize the productivity of the few data scientists an enterprise has, enabling them to do the work of tens or even hundreds more, and to offer simplified self-service to a growing number of hybrid analysts skilled in both business analysis and data science.
To learn more about how to incorporate an AI-capable analytics platform into your enterprise — without hiring more data scientists — download our latest whitepaper, “Transforming Data to Intelligence to Value at Scale.”
Georges Smine is VP of Product Marketing at Opera Solutions.