Presented by SambaNova Systems
To stay on top of AI innovation, it’s time to upgrade to next-gen architecture. Join this VB Live event to learn how cutting-edge computer architecture can unlock new AI capabilities, from common use cases to real-world case studies and more.
“Everybody’s heard a lot about big data over the last decade,” says Alan Lee, corporate vice president and head of research and advanced development at AMD. “Put differently, what are we going to do with all this data? How can we use it for the betterment of mankind, business, individuals, and so on?”
All that data needs to be handled or manipulated in some way. AI is enabling new modelling and simulation methods, and dramatically improving visualization. It's helping unlock new ways to meet the critical needs of enterprise, and to connect people and businesses through data sharing and video collaboration when working (or teaching) in person is not possible.
However, all the data an organization collects is worthless unless it has the CPUs and GPUs needed to make sense of it, to grind through the necessary computations. And then it needs the technology to go a step beyond computation and create visualizations of that data. Having data is clearly important, and acting on that data is more important, but being able to see, interact with, and understand it at a glance is when differences are made, Lee explains.
“You show somebody a random stream of data, they’ll ignore you,” he says. “You show somebody an Excel table, they might not clearly see the trends. But if you show someone a 3D animation of what’s actually going on, whether it’s a business or a climate model or what have you, it’s an a-ha moment.”
“In today’s world, the rise of data science, statistical modelling, machine learning, and AI have changed all aspects of business and science as we know them, and for the better,” Lee says. “But they don’t exist in a vacuum.”
Leaders need to treat data as a resource, similar to traditional internal input and output flows. While data has features and capabilities all its own, understanding where data resembles a standard input or output, and where it differs from a standard supply chain flow, is a critical skill for leaders.
More concretely, most businesses are best served by flexible solutions that encompass a broad range of compute capabilities.
“Leaders need to evaluate their companies’ and organizations’ compute needs for today and the future, and think about their ability to scale,” he says. “If they do that, they can make sound decisions which will last them through this very exciting time for a while: the explosion, not just of data, but of compute capability and machine learning, AI, and data science that goes along with it.”
That includes hiring and empowering workers with data science skills. Skills that weren’t considered so important a decade ago, such as statistics and applied math, are now at the forefront of most innovation in the industry.
“What we’re going to see over the course of the next decade, isn’t a third paradigm,” Lee says. “It’ll be discovering what knowledge and gains we can extract from the confluence of traditional models with these new statistical models.”
From the outside, that could look like another explosion of value in the enterprise landscape, but it will really be a long-term journey: coming to understand when traditional models of compute are most viable and cost-effective, and when the new statistical models are. Overall, Lee asserts, we're going to see a lot more hybrids, with organizations merging traditional models and new statistical models to discover new avenues of research and development.
In the end, he says, “Leaders who understand the AI and data science revolution — its possibilities, limits, and capacity for change — are poised to have a successful organization for many years to come.”
Don’t miss out.