Amid the excitement and chatter we've seen on the topic over the last 12-18 months, it's important to understand where progress is being made.
I'll look at how AI is being used in business today, what the implications are for data centre managers, and how IT leaders can put AI to use in the data centre now.

From AI theory to practice

What is AI, and how is it affecting our professional and personal lives?
AI, like cloud computing, has a number of definitions, so depending on who you talk to, you can expect a different answer.
In my view, it is the combination and application of two computing disciplines: machine learning and its subset, deep learning. Both have been in research and development for decades (since the 1950s and 1980s, respectively) and belong to a field of computer science that gives computers the ability to "learn" without being explicitly programmed. Until very recently, however, this potential hadn't translated into real applications, because of the enormous amounts of computing power needed to show tangible proof of AI in action.
This is what has changed: excitingly, technology has now advanced enough for us to realise the promise of those AI predictions. With the advances in high-performance computing (HPC) and the application of powerful graphics processing units (GPUs), it's now possible to train deep neural networks efficiently enough for usable outcomes. Add to this the enormous amounts of data available today, and we can now take massive stores of unstructured information and use machine learning to extract useful, actionable insight from otherwise unusable data.
And of course, it's data centres which form the infrastructure backbone housing these huge sets of information.
I'll give you a simple example from outside the tech industry, in the medical field: scientists from Australia's University of Adelaide are using deep learning in the analysis of CT scans and X-rays to help predict patients' life expectancy and develop appropriate treatments.
The implications are manifold: any patient, regardless of location, can have access to an expert service like this, which increases the likelihood of identifying organ or tissue problems before symptoms appear. It's an example of what I call the "synthetisation of experience": we "download" the experience and expertise of our best specialists, teach a system that knowledge, and then duplicate and deploy it in other locations and situations, bypassing the limited availability of a human specialist.
AI in the data centre
The medical industry isn't alone in using AI today. Manufacturers are using it to model scenarios for long-term planning, and the financial sector uses AI-enhanced game theory to predict how markets will react to certain announcements. Some of these same deep learning strategies can also be used by IT leaders in the data centre.
One area is energy efficiency: power and cooling. Data centre managers can apply an algorithm to self-optimise energy consumption, so that a data centre autonomously adjusts its power and cooling systems. Traditionally this is a manual process, and a daunting one when a data centre can have hundreds of air-conditioning units, each with its own thermostat. Usually a human has to monitor how temperature changes affect energy usage, and the utility bill, across varying IT workload levels.
But if an AI handles this vital task, it can learn the ideal temperatures at various times of day and for varying workloads. It can adjust power and cooling accordingly, and analyse the data over time, continually refining its model so the system becomes more and more efficient.
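The learning loop described above can be sketched in miniature. The following is a hypothetical illustration only, not a real data-centre API: it assumes a simple cost model (energy used plus a penalty for overheating events) and learns, per hour-of-day and workload band, which cooling setpoint has the lowest average cost. All names, setpoints, and weightings here are assumptions for the sake of the example.

```python
import random

# Candidate supply-air temperature setpoints in degrees Celsius (illustrative values).
SETPOINTS = [20.0, 22.0, 24.0, 26.0]

class CoolingOptimiser:
    """Toy epsilon-greedy learner for cooling setpoints (hypothetical sketch)."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon  # how often to explore an alternative setpoint
        self.stats = {}         # (hour, workload_band, setpoint) -> (count, mean cost)

    def choose(self, hour, workload_band):
        # Occasionally explore; otherwise pick the setpoint with the lowest mean cost so far.
        if random.random() < self.epsilon:
            return random.choice(SETPOINTS)
        best, best_cost = None, float("inf")
        for sp in SETPOINTS:
            _, mean = self.stats.get((hour, workload_band, sp), (0, float("inf")))
            if mean < best_cost:
                best, best_cost = sp, mean
        return best if best is not None else random.choice(SETPOINTS)

    def record(self, hour, workload_band, setpoint, energy_kwh, overheat_events):
        # Cost blends energy consumption with a heavy penalty for thermal excursions
        # (the 50.0 weighting is an arbitrary assumption).
        cost = energy_kwh + 50.0 * overheat_events
        count, mean = self.stats.get((hour, workload_band, setpoint), (0, 0.0))
        count += 1
        mean += (cost - mean) / count  # running average of observed cost
        self.stats[(hour, workload_band, setpoint)] = (count, mean)
```

In practice a real system would use far richer state (per-rack telemetry, weather, utility pricing) and a more capable model, but the shape is the same: act, observe the energy and thermal outcome, and refine the policy continuously.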
The cost-saving implications are significant. For example, Google is currently using AI to reduce power consumption across its entire data centre infrastructure, lowering energy usage by 15 per cent.
The company says it will save hundreds of millions of dollars over the next few years alone.
A parallel trend now being deployed in data centres is AI-enabled robotics. This automates the management of physical connections across the network infrastructure, bringing improvements such as faster reaction to security issues, lower operational costs, and greater productivity and more free time for IT staff. It's all about reducing human intervention, making operational gains and allowing IT managers to run the data centre more proactively, to better meet the needs of the business.
AI will continue to affect enterprise IT and data centre operations in the near term. When IDC forecasts worldwide spending on cognitive systems and AI climbing from US$8 billion in 2016 to more than US$47 billion in 2020, we know the impact on data centre infrastructure will be profound, and not just because of the exponentially increasing need for hardware to crunch huge data volumes. Organisations will clearly be looking beyond the data centre alone to leverage AI.
While many companies understand AI's potential, most are still trying to understand how they might leverage it to meet their primary business goals.
To help our customers, Lenovo has developed an AI strategy for organisations based on three steps: Discover, Develop, Deploy. It's all about helping our customers through each of these stages in their AI journey, bringing together our experts along with industry partners as needed, on a case-by-case basis. Our strategy is underpinned by three new AI innovation centres we recently announced: in Morrisville in the US, Stuttgart in Germany, and Beijing, China.
In the meantime, data centre operators should look for the low-hanging fruit right now, and plan for an array of AI-enabled operational efficiencies over the next few years.
- The writer is president of Lenovo's data centre group for the Asia-Pacific.