Here are the next steps for companies to accelerate their AI deployments
The use of AI in enterprises often starts slowly, moving from concept to pilot, through testing and eventual deployment. But as these groundbreaking projects mature and more uses of AI are discovered and encouraged, companies must constantly re-evaluate their strategies and look to accelerate their AI deployments to keep pace and aim for more ambitious goals.
This is the advice of Daniel Wu, Head of Commercial Banking AI and Machine Learning at JPMorgan Chase, who presented at the virtual AI Hardware Summit conference on September 15 on “Mapping the AI Acceleration Landscape.”
The rapid rise and expansion of AI across all industries in recent years has been fueled by vast improvements in computing power, open and flexible software, powerful algorithms, and related advancements, but companies should not rest on early AI successes, Wu said.
Instead, as more experience is gained, now is the right time to accelerate these efforts and democratize new AI innovations to help businesses use these still-developing tools to drive their goals and business strategies, he said.
To further boost AI capabilities, companies need to start with the basics they already know, Wu said, including data, hardware, IT staff, governance and operations.
“I intend to try and answer this very daunting question of how we should develop AI capabilities, regardless of the size of an organization and the types of resources you have available,” Wu said. “Data, of course, gets our attention first,” with the biggest problem for data scientists and machine learning practitioners being dirty data.
“About 60% of developers think dirty data is a major problem for them, with around 30% of data scientists saying that the availability of usable data is a major hurdle for them,” Wu said. “So what can we learn from this?”
Such challenges with data are not new and have been around for a long time, he said. “You see data silos everywhere, across functional areas, with each team developing its own solution and creating its own data assets without thinking about how those assets could be used across the organization.”
Many computer systems built years ago were not designed with appropriate data models, he said. At the time, only functional and performance requirements mattered; little thought was given to how the data might be used for other purposes in the future.
But AI has changed that old approach, Wu said.
“Even today, many businesses are undergoing digital transformations and they are moving their on-premises data centers to the cloud,” he said. “During that transition there’s this awkward hybrid state where you have some of the data in the cloud and some of the data on-premises in your own private data center. Most of the time, this creates unnecessary duplicates.”
To solve this problem and better prepare that data for AI use today, one strategy is to invest in data cleansing, a one-time upfront cost to clean the data and consolidate it. “You are trying to get to this single source of truth, so when the data reaches the data scientists, they don’t have to fight over which data to believe,” Wu said.
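The consolidation step Wu describes can be sketched in a few lines. Below is a minimal, hypothetical illustration in plain Python: two silos (labeled here as cloud and on-premises, with invented record fields) are normalized and deduplicated into a single source of truth. This is a sketch of the idea, not JPMorgan Chase’s actual tooling.

```python
# Two hypothetical silos holding overlapping customer records.
cloud = [
    {"customer_id": 1, "email": "A@X.COM"},
    {"customer_id": 2, "email": "b@x.com"},
    {"customer_id": 3, "email": "c@x.com"},
]
on_prem = [
    {"customer_id": 2, "email": "b@x.com "},
    {"customer_id": 3, "email": "c@x.com"},
    {"customer_id": 4, "email": "d@x.com"},
]

def consolidate(*sources):
    """One-time cleanup: normalize fields and deduplicate by key,
    producing a single source of truth for downstream teams."""
    truth = {}
    for source in sources:
        for record in source:
            key = record["customer_id"]
            cleaned = {
                "customer_id": key,
                "email": record["email"].strip().lower(),
            }
            truth.setdefault(key, cleaned)  # first occurrence wins
    return list(truth.values())

clean = consolidate(cloud, on_prem)
print(len(clean))  # 4 unique customers from 6 raw rows
```

In practice the same pattern runs at scale in ETL frameworks, but the payoff is identical: data scientists downstream see one consistent record per entity instead of conflicting duplicates.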
To make things work better in the future, companies need to practice data-centric design, where data is a priority from the start in every process and technology, he said. “Data shouldn’t be a second-class citizen. We should automate data processes. Many organizations still rely on manual steps, or scripts, to do their ETL (extract, transform, load). And part of that automation is making sure that you integrate governance and data cataloging into your process so that it becomes an integral part of it.”
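Wu’s point about baking governance and cataloging into the pipeline, rather than bolting them on afterward, can be illustrated with a small sketch. Everything here (the in-memory catalog, dataset and owner names, the `amount` field) is invented for illustration; it simply shows cataloging as a built-in step of the load, not a separate manual chore.

```python
import datetime

catalog = []  # hypothetical in-memory data catalog

def register(name, rows, owner):
    """Governance step baked into the pipeline: every load is cataloged
    with its size, owner, and timestamp."""
    catalog.append({
        "dataset": name,
        "rows": len(rows),
        "owner": owner,
        "loaded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

def etl(raw, name, owner):
    # Transform: drop incomplete rows (extract is assumed already done).
    transformed = [r for r in raw if r.get("amount") is not None]
    # Load + catalog as one integral, automated step.
    register(name, transformed, owner)
    return transformed

rows = etl(
    [{"amount": 10}, {"amount": None}, {"amount": 7}],
    name="payments_daily", owner="commercial-bank-ml",
)
print(len(rows), catalog[0]["dataset"])
```

The design choice is that `etl` cannot complete without calling `register`, so the catalog can never drift out of sync with what was actually loaded.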
Data also needs to be made more accessible to drive the use of AI, Wu said.
“Part of this can be enabled by creating self-service tools for organizations, for data workers, so that they can access data more easily,” he said. “Reusability should also be emphasized here. And we have to break down the silos.”
These steps will also help companies save a lot of time in their model development processes, he added. “Think in terms of the feature store, which is a very popular trend right now: reusable features that you can use to build multiple solutions.”
Changes to accelerate AI are also needed when it comes to computing, Wu said.
“The challenges here are always about the availability, cost, and efficiency of compute, but we also have to pay attention to the carbon footprint,” he said. “People don’t realize how large a carbon footprint we create when we train these huge models. But there is hope in reusing them.”
Another observed trend is that some users are moving from very general-purpose computing architectures to more domain-specific architectures, in both cloud and edge deployments, he said.
Wu also offered a fresh perspective on large language models.
“It is not necessary for every organization to develop yet another large language model,” Wu said. “We should be geared towards exploiting what has already been developed, requiring only a few improvements and tweaks to the model so that it serves a different business use case.”
Instead, companies can look to leverage a more generic, layered AI model architecture for most uses, while still allowing more specific models to be built for specialized business cases, he said.
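The reuse-and-tweak approach Wu advocates can be sketched conceptually: a large pretrained model stays frozen and shared, while each business use case trains only a small task-specific head on top. The classes and numbers below are invented stand-ins (a real setup would use a framework such as Hugging Face Transformers); the point is the division of labor, not the model itself.

```python
class PretrainedLM:
    """Stand-in for a large, shared, frozen pretrained language model."""

    def encode(self, text):
        # Toy "representation": token lengths. In practice this would
        # be embeddings from billions of frozen parameters.
        return [float(len(tok)) for tok in text.split()]

class TaskHead:
    """Small trainable layer adapted to one business use case.
    Only this part is tuned, which is cheap compared to pretraining."""

    def __init__(self, weight=1.0):
        self.weight = weight  # the only "tuned" parameter here

    def predict(self, features):
        return self.weight * sum(features) / len(features)

base = PretrainedLM()        # developed once, reused across teams
head = TaskHead(weight=0.5)  # a few tweaks for this use case

score = head.predict(base.encode("approve this loan application"))
print(score)  # 3.25
```

Swapping in a different `TaskHead` repurposes the same base model for another business case, which is precisely the "few improvements and tweaks" Wu describes, at a fraction of the cost of pretraining.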
Overall, however, it will take more than data, compute, and modeling to accelerate AI, Wu said.
It also requires skilled, trained and imaginative IT people who can bring their innovations to AI to help drive their business missions, he said.
“We all know about the shortage of AI talent around the world,” he said, but there is also an imbalanced distribution of talent that compounds the problem. About 50% of the country’s AI talent is found in Silicon Valley, with around 20% of those workers employed by the biggest tech companies. That doesn’t leave enough experts available for other companies to drive their AI initiatives, he said.
“This is a real challenge that we need to tackle across the community,” Wu said. To address it, companies need to find ways to reduce the burdens on their AI teams by ensuring that they focus on developing models rather than shouldering other IT overhead in their organization, he added.
“38% of organizations spend more than 50% of their data scientists’ time on operations, especially deploying their models,” he said. “And only 11% of companies can put a model into production within a week. Some 64% of them take a month or more to complete this production integration with a model fully trained, validated, and tested. For most organizations, reaching the finish line takes over a month.”
These inconvenient delays occur because support for AI operations simply isn’t there, Wu said. He also pointed to data-centric ideas: think about the improvement you can achieve simply by getting better data to train your models, rather than focusing on inventing yet another, more powerful model architecture.
An important step in getting these issues under control is recognizing the importance of managing change while maintaining a clear lineage from your data to your model, so that you can replicate the model when needed, he added.
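One simple way to anchor the data-to-model lineage Wu describes is to fingerprint the exact dataset and training configuration behind each run, so any model can later be traced back to, and replicated from, its inputs. The sketch below uses only the standard library; the field names and configuration are hypothetical, and production systems would record far richer metadata.

```python
import hashlib
import json

def lineage_id(dataset_rows, train_config):
    """Fingerprint the exact data and configuration behind a model run,
    so training can be replicated or audited later."""
    payload = json.dumps(
        {"data": dataset_rows, "config": train_config},
        sort_keys=True,  # canonical ordering -> stable hash
    ).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

data = [{"x": 1, "y": 0}, {"x": 2, "y": 1}]
config = {"lr": 0.01, "epochs": 5}

run_a = lineage_id(data, config)
run_b = lineage_id(data, config)                      # same inputs -> same id
run_c = lineage_id(data, {"lr": 0.02, "epochs": 5})   # any change is visible

print(run_a == run_b, run_a == run_c)  # True False
```

Storing such an identifier next to each deployed model gives the replication guarantee Wu is after: identical lineage IDs mean the run can be reproduced bit-for-bit from the recorded data and configuration.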
Ultimately, even after an organization’s models are developed and deployed, there are still concerns that keep IT managers up at night, Wu said. What is the risk of deploying the solution and making it available to customers? That question often becomes an afterthought, and it usually ends up being the biggest blocker, keeping the solution from reaching general availability.
The challenge for companies is not to think only about time to market, Wu said. “You also have to think about doing it right, so that you don’t go back to the drawing board and have to redevelop your entire solution. That will be much more expensive. And meet regulatory requirements: there’s a lot of ethics around AI development that organizations need to address early on to mitigate those risks. Implement a process to guide your model development lifecycle, and incorporate and streamline your compliance practice into that cycle.”
In another presentation at the AI Hardware Summit, Aart de Geus, President and Co-CEO of electronic design automation (EDA) and semiconductor IP design company Synopsys, explained how Moore’s Law has been pushed to its limits in recent years and might be better replaced by the concept of “SysMoore,” a blend of the long-standing insights of Moore’s Law with new technological innovations that take advantage of systemic complexity.