An artificial intelligence ecosystem isn’t necessarily all cloud

Cloud services provide resources on a massive scale to support artificial intelligence and machine learning efforts, but a hybrid approach may be the best course in many cases. That requires an enterprise architecture approach to get everything right.

Photo: Joe McKendrick

“We’re seeing a lot of companies kind of doing a pause with cloud,” according to Bill Wong, AI and data analytics leader with Dell Technologies, who keynoted the recent Business Transformation & Operational Excellence Summit & Industry Awards (BTOES) event hosted by Proqis. “A lot of firms have mandated that everything has got to go in the cloud. What people are finding is, while there are some benefits to putting everything in a central spot, the benefit of saving money seems to have fallen by the wayside. In some cases, especially with AI, the costs vary quite dramatically depending on where you place the data. So if you’re training an AI model to do image recognition… it can be at least a tenfold difference.”

Wong indicates many companies need to step back and ask what makes sense in terms of handling sophisticated applications and large amounts of sensitive data. When it comes to AI initiatives, then, “most firms are picking a hybrid approach. Many like to develop on the cloud, but if they have a lot of data, they put the development on-prem, and when they finish with their model they execute production in the cloud.” 
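
To make that workflow concrete, here is a minimal sketch of the on-prem half of the pattern, assuming scikit-learn and joblib as stand-ins for whatever training stack a team actually uses; the dataset and file name are purely illustrative. The point is that only the finished model artifact, not the sensitive training data, has to leave the building.

```python
# Hypothetical on-prem training step in the hybrid pattern Wong describes:
# fit the model where the data lives, then export only the trained artifact.
from joblib import dump
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier

# Stand-in for in-house, potentially sensitive training data.
X, y = load_digits(return_X_y=True)

# Train locally, on-prem, close to the data.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Persist the trained model; this small artifact is what gets promoted to
# the cloud for production, rather than moving the raw data itself.
dump(model, "model.joblib")
```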

Such a hybrid environment requires an architecture-driven approach to building a data platform that enables an organization to share data and maximize the return on its investments in advanced analytics. The goal is a data-driven culture supported by platforms that deliver agile, open ecosystems where data scientists and developers can work together.

“Where do we start?” Wong continues. “It’s like cloud was when it was first introduced. C-level execs need to know their strategy. From an enterprise architecture approach, you have to build a strategy.” That strategy starts small, with “lining up with an executive stakeholder. Pick a use case that’s going to get high visibility but that’s low risk.”

Wong acknowledges that within many organizations, lining up executive support and identifying the low-hanging fruit for AI use cases “is asking a lot. This is not an easy task. But look for those use cases where you can get a quick success story, and that’ll help get people on board behind it. So for banking, something with customer experience or customer insight. With healthcare, something that a frontline person, a clinician, or a patient would benefit from. And it doesn’t have to be difficult or really complex. This is one of the most challenging technical things you can do out there, but the rewards are worth it.”

The ultimate approach, employing architectural thinking, is to move toward “a model-driven environment,” Wong continues. “What we’re going to see is more and more tools that don’t require coding, to get more citizen types of data scientists able to create these applications.” Offerings such as Machine Learning as a Service help reduce development time from “months to weeks,” he says. Add tools such as data cataloging to manage the data that fuels AI-based transformation. “You want to try to remove IT from the equation and have a self-service portal,” he explains. Such a self-service approach needs to enable data scientists, analysts and other users to “look at a menu and pick data, with a description, dictionary or glossary of what the data means. And then Platform as a Service: how to deploy the infrastructure, which includes hardware and software, using and leveraging Kubernetes and having everything containerized.”
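
As one illustration of what that containerized deployment might look like, here is a minimal serving sketch that loads the model artifact trained on-prem and exposes it over HTTP, assuming FastAPI and joblib; the endpoint, file names, and request shape are hypothetical. A service like this would typically be built into a container image and scheduled by the Kubernetes-based platform Wong describes.

```python
# Hypothetical cloud-side serving step: wrap the on-prem-trained model in a
# small HTTP service so it can be containerized and run under Kubernetes.
from joblib import load
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = load("model.joblib")  # artifact produced by the on-prem training run


class PredictRequest(BaseModel):
    features: list[float]  # one flattened feature vector per request


@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # Score a single record with the pre-trained model and return the label.
    prediction = model.predict([req.features])
    return {"prediction": int(prediction[0])}

# Run locally with: uvicorn serve:app --host 0.0.0.0 --port 8000
# In production, this file would be baked into a container image and deployed
# through the organization's Kubernetes-based Platform as a Service.
```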
