

AI Meets AI: The Key to Actually Implementing AI

With AI use case scenarios becoming more complex, the actual implementation of AI has also grown more challenging. However, a new way of delivering AI may be on the horizon.

An unprecedented amount of progress was made in AI and machine learning in 2017, as numerous companies deployed these technologies in real-world applications. This trend is projected to continue in the near term, with analyst firms such as Gartner predicting that AI technologies will be in almost every new software product by 2020. From healthcare diagnosis to predictive maintenance for machines to conversational chatbots, there is no question that AI is quickly becoming a fundamental requirement for modern businesses.

However, despite the market buzz, many companies are still stumped by the prospect of deriving actual business value from the use of AI. In fact, the introduction of AI to actual products and solutions remains one of the leading sticking points for businesses, with many left asking, “How do I actually implement an AI solution?”

The Growing Complexity of AI Application

As AI goes from a “nice to have” to a “need to have,” it’s also evolving in terms of complexity. Companies need more than just simple, standardized AI services that do image or text recognition—they need complex predictive scenarios that are highly specific to their operations and customized for their business needs.

For example, take a scenario that uses time series data to generate business insights, such as predictive maintenance for the Industrial Internet of Things (IIoT) or customer churn analysis for a customer experience organization. These scenarios can’t be supported by simply calling a generic service with a few specific parameters and getting a result. Getting accurate and actionable results in these predictive scenarios requires a lot of data science work, with data being used over time to iteratively train the models and improve the accuracy and quality of the output. Additionally, businesses are being challenged to engineer new features, run and test many different models and determine the right mix of models to provide the most accurate result—and that’s just to determine what needs to be implemented in a production environment.
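The iterative, comparative nature of this work can be sketched with a toy example. Everything below is hypothetical — the sensor readings, the two baseline forecasters, and the walk-forward split are illustrations, not a real predictive-maintenance pipeline — but the run-many-models-and-compare loop is the same idea at any scale.

```python
# Hypothetical sketch: walk-forward evaluation of two candidate forecasters
# on a time series, picking the one with the lower mean absolute error.
from statistics import mean

def naive_forecast(history):
    """Predict the next value as the last observed value."""
    return history[-1]

def moving_average_forecast(history, window=3):
    """Predict the next value as the mean of the last `window` values."""
    return mean(history[-window:])

def evaluate(series, model, warmup=3):
    """Forecast each point from the points before it; return mean absolute error."""
    errors = []
    for t in range(warmup, len(series)):
        prediction = model(series[:t])
        errors.append(abs(series[t] - prediction))
    return mean(errors)

series = [10, 12, 11, 13, 15, 14, 16, 18, 17, 19]  # hypothetical sensor readings
candidates = {
    "naive": naive_forecast,
    "moving_average": moving_average_forecast,
}
scores = {name: evaluate(series, model) for name, model in candidates.items()}
best = min(scores, key=scores.get)  # the model to promote to production
```

In practice the candidate set would include engineered features and far richer models, but the discipline — hold out data, score every candidate the same way, promote the winner — carries over directly.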

Moreover, businesses need to realize that AI is no longer the exclusive domain of data scientists and the engineers who help prepare the data. The situation is not unlike how digital transformation has branched out from being an IT-driven initiative to a company-wide effort. Organizations must move beyond a siloed AI approach that divides the analytics team and the app development team. App developers need to become more knowledgeable about the data science lifecycle, and app designers need to think about how predictive insights can drive the application experience.

To be successful, organizations must identify an approach that enables them to easily put models into production in a language that is appropriate for runtime—without rewriting the analytical model. Organizations need to not only optimize their initial models but also feed data and events back to the production model so that it can be continuously improved upon.
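As a rough illustration of "production without rewriting the model," the sketch below serializes a trained model's parameters to a language-neutral format so the runtime scores with loaded parameters rather than reimplemented code, and folds new observations back into the production model. The threshold model and JSON format are assumptions chosen for brevity; real deployments lean on interchange formats such as PMML or ONNX for the same purpose.

```python
# Hypothetical sketch: export model parameters for the runtime, score without
# rewriting the model, and continuously update it with feedback.
import json

def train(values):
    """'Train' a toy threshold model: flag anything far from the observed mean."""
    mu = sum(values) / len(values)
    spread = max(abs(v - mu) for v in values)
    return {"mean": mu, "threshold": 1.5 * spread, "n": len(values)}

def export_model(model):
    """Serialize the model to a language-neutral artifact for the runtime."""
    return json.dumps(model)

def score(serialized_model, value):
    """Runtime scoring: only parameters are loaded; no model code is rewritten."""
    model = json.loads(serialized_model)
    return abs(value - model["mean"]) > model["threshold"]

def feed_back(model, new_value):
    """Incrementally fold a new observation back into the production model."""
    n = model["n"] + 1
    model["mean"] += (new_value - model["mean"]) / n
    model["n"] = n
    return model

model = train([10.0, 11.0, 9.0, 10.5])
artifact = export_model(model)
is_anomaly = score(artifact, 25.0)  # far outside the learned band
model = feed_back(model, 10.2)      # feedback loop: the model keeps learning
```

The design point is the separation: training produces an artifact, the runtime consumes it, and feedback updates it — three stages that can live in different systems and languages.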

This may seem like a big, complicated process, but it's key to the actual implementation of AI—the AI of AI, if you will. Organizations that cannot manage this process will find AI out of reach.

The New World of AI

So how can organizations effectively implement AI in a way that enables them to address complex predictive scenarios with limited data science resources? And how do organizations achieve success without retraining their entire development team?

The truth of the matter is that it can't be done with a narrowly defined, one-size-fits-all approach that returns results from just a few parameters. Delivering insight that is actionable and valuable to the business requires a more involved implementation.

Take, for example, an IIoT predictive maintenance application that analyzes three months of time series data from sensors on hundreds or thousands of machines and returns the results automatically. This isn't a simple predictive result set that is returned, but a complete set of detected anomalies that occurred over that time, with prioritized results to eliminate the alert storms that previously made it impossible to operationalize the results. These prioritized results are served up via a work order on a mobile app to the appropriate regional field service personnel, who are then able to perform the necessary maintenance to maximize machine performance. It's a complex process where the machine learning is automated and feature engineering is done in an unsupervised fashion. The provided results analyze individual sensor data, machine-level data and machine population data and are packaged up in a format that enables the business to take action.
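The prioritization idea can be sketched as follows. This is an assumption-laden illustration, not any vendor's actual algorithm: anomalies are flagged with a rolling z-score, then ranked by severity so that only the most severe few are surfaced — which is exactly what keeps an alert storm from forming.

```python
# Hypothetical sketch: rolling z-score anomaly detection on one sensor,
# followed by severity ranking to surface only the top alerts.
from statistics import mean, pstdev

def rolling_anomalies(series, window=5, z_threshold=3.0):
    """Flag points whose z-score vs. the trailing window exceeds the threshold."""
    anomalies = []
    for t in range(window, len(series)):
        past = series[t - window:t]
        mu, sigma = mean(past), pstdev(past)
        if sigma == 0:
            continue  # a flat window gives no baseline to compare against
        z = abs(series[t] - mu) / sigma
        if z > z_threshold:
            anomalies.append({"index": t, "value": series[t], "severity": z})
    return anomalies

def prioritize(anomalies, top_n=3):
    """Keep only the most severe anomalies to avoid an alert storm."""
    return sorted(anomalies, key=lambda a: a["severity"], reverse=True)[:top_n]

sensor = [10, 10, 11, 10, 10, 10, 50, 10, 11, 10, 10, 80, 10]  # hypothetical readings
alerts = prioritize(rolling_anomalies(sensor))
```

A production system would run this kind of detection across individual sensors, whole machines and the machine population, then merge and rank the results before anything reaches a field technician's work order.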

Welcome to the new world of AI implementation. While it's a very new concept, the best market definition of this process is currently “anomaly detection.” But not all solutions take the same approach, and not all solutions deliver predictions that lead to better business outcomes.

What you are about to see is a fundamental shift in how machine learning capabilities are delivered—and we aren't just talking deployment in the cloud versus on-premise. We are talking about a shift from delivering data science tools that make data scientists more effective to delivering data science results that eliminate the need for those tools in the first place. In this brave new world, data scientists would be able to spend their time analyzing and improving the results, instead of wasting it on non-mission-critical tasks.

The only requirement is that the data be provided in a time series format. From there, you simply upload the data to the cloud (though on-premise options will exist too) and the automated AI does the rest, with accurate results returned within days.

Soon you will be able to move from dreaming about AI to actually implementing it!


More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
