VMware Award Winners Pioneer Tomorrow’s Use of Big Data
VMware and Big Data
VMware actively invests in cutting-edge research, collaborates with world-class researchers, and recognises groundbreaking work, most visibly through the VMware Systems Research Award.
The award, established in 2016, first honoured Professor Matei Zaharia of Stanford University and this past year went to MIT Professor Tim Kraska. Both were cited for their influential work on big data.
Professor Zaharia has been involved in a number of projects at the Stanford DAWN centre that aim to democratise machine learning by enabling cutting-edge machine learning applications to be built more quickly. Professor Kraska has been widely recognised for his early work on hybrid human-machine data management. Taken together, their work promises to influence how organisations analyse, process, and understand big data.
As enterprises become more data driven, the odds are high that insights from these big data pioneers will be influencing data centre designs. VMware’s interest in big data, however, isn’t limited to the data centre.
Making Big Data Understandable
Another reason VMware is focused on big data is to better understand its role in the internet of things (IoT).
Dr. Greg Bollella, VMware’s CTO for IoT, investigates infrastructure management for IoT. His goal is to fully understand both the hardware and software components that comprise IoT systems. Big data is inherent to IoT systems, and understanding it can help enterprises analyse and maintain operations, perform predictive maintenance, and identify the root causes of failures.
Understanding big data starts with data modelling: characterising how monitored data changes over time.
“For example, if you looked at a business’s operations, usage will likely be much higher during the week than the weekend,” says Bollella. “The shape of the data stream changes with respect to time. Data modelling maps these patterns.”
Understanding the variable patterns within the data helps improve the accuracy of monitoring systems.
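One simple way to capture the weekly shape Bollella describes is to bucket historical metric samples by day-of-week and hour, then record each bucket's typical level and spread. The sketch below is illustrative only: the function name, bucket granularity, and synthetic data are assumptions, not VMware's implementation.

```python
from collections import defaultdict
from datetime import datetime, timedelta
from statistics import mean, stdev

def build_baseline(samples):
    """Group historical (timestamp, value) samples by (weekday, hour)
    and record each bucket's mean and standard deviation."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[(ts.weekday(), ts.hour)].append(value)
    return {key: (mean(vals), stdev(vals) if len(vals) > 1 else 0.0)
            for key, vals in buckets.items()}

# Synthetic four-week history: weekday usage high, weekend usage low.
start = datetime(2024, 1, 1)  # a Monday
history = []
for day in range(28):
    for hour in range(24):
        ts = start + timedelta(days=day, hours=hour)
        history.append((ts, 100.0 if ts.weekday() < 5 else 20.0))

baseline = build_baseline(history)
print(baseline[(0, 9)])  # Monday 09:00 bucket: (100.0, 0.0)
print(baseline[(5, 9)])  # Saturday 09:00 bucket: (20.0, 0.0)
```

Each (weekday, hour) bucket now holds the expected level for that slot, which is the "shape" a monitoring system can compare live readings against.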
“It’s useful for calibrating alerts,” says Bollella. “If an operator gets an alert that something is wrong with the infrastructure, you want that alert to be as accurate as possible. Right now, there are so many false alerts that people ignore them.”
Improving the accuracy of alerts might sound like a minor issue, but when false alerts are frequent, operators can no longer tell whether any given alert is real. It is analogous to an engine warning light that comes on even when nothing is wrong, until one day it comes on because the car has no oil. By then the warning is ignored, and the engine is ruined.
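Calibrating alerts against the expected value for each time slot, rather than a single static threshold, is one way to cut down false positives. A minimal sketch, assuming a per-slot baseline of (mean, standard deviation) pairs; the baseline numbers and the k-sigma rule are illustrative assumptions, not a description of VMware's tooling.

```python
from datetime import datetime

# Hypothetical per-slot baseline keyed by (weekday, hour): (mean, stdev).
# In practice this would be learned from historical monitoring data.
baseline = {
    (0, 9): (100.0, 8.0),  # Monday 09:00: busy
    (5, 9): (20.0, 4.0),   # Saturday 09:00: quiet
}

def is_anomalous(ts, value, k=3.0):
    """Alert only when a reading deviates from the expected value
    for its time slot by more than k standard deviations."""
    expected, spread = baseline.get((ts.weekday(), ts.hour), (value, 0.0))
    return abs(value - expected) > k * spread

# A Monday-morning reading of 110 is normal for that slot...
print(is_anomalous(datetime(2024, 1, 1, 9), 110.0))  # False
# ...while the same reading on a Saturday morning is anomalous.
print(is_anomalous(datetime(2024, 1, 6, 9), 110.0))  # True
```

A static threshold low enough to catch weekend anomalies would fire constantly on busy weekdays; a time-aware threshold fires only when a reading is unusual for its slot, which is what makes the alert worth trusting.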
Data-driven enterprises need advanced analytics, and they care about data modelling because more accurate alerts ultimately reduce risk.
The potential of big data for improving operations, finding efficiencies, and reducing risk far outweighs its challenges. Companies, including VMware, are moving ahead aggressively with big data initiatives. The company believes democratised machine learning will ultimately help its customers build machine learning systems more quickly and easily.
Bollella says he typically sees two types of VMware customers experimenting with big data today. The first are companies that have been engaging in IoT-like activities—though not as fully automated—for a decade, and who are now accelerating that work with new technologies. The second are companies experimenting with IoT for the first time, as a proof of concept. Both are still in the early stages of understanding how to best leverage big data, but early returns are encouraging.
“In the next two to three years, these projects will move from proof of concept to deployment,” Bollella says. “History tells us that most will fail, but the best ideas will lead and the successes will be widely deployed.”