AI Meets AI: The Key to Actually Implementing AI

With AI use case scenarios becoming more complex, the actual implementation of AI has also grown more challenging. However, a new way of delivering AI may be on the horizon.

An unprecedented amount of progress was made with AI and machine learning in 2017, as numerous companies deployed these technologies in real-world applications. This trend is projected to continue through the near future, with analyst firms like Gartner predicting that AI technologies will be in almost every new software product by 2020. From healthcare diagnosis to predictive maintenance for machines to conversational chatbots, there is no question that AI is quickly becoming a fundamental requirement for modern businesses.

However, despite the market buzz, many companies are still stumped by the prospect of deriving actual business value from the use of AI. In fact, the introduction of AI to actual products and solutions remains one of the leading sticking points for businesses, with many left asking, “How do I actually implement an AI solution?”

The Growing Complexity of AI Application

As AI goes from a “nice to have” to a “need to have,” it’s also evolving in terms of complexity. Companies need more than just simple, standardized AI services that do image or text recognition—they need complex predictive scenarios that are highly specific to their operations and customized for their business needs.

For example, take a scenario that uses time series data to generate business insights, such as predictive maintenance for the Industrial Internet of Things (IIoT) or customer churn analysis for a customer experience organization. These scenarios can’t be supported by simply calling a generic service with a few specific parameters and getting a result. Getting accurate and actionable results in these predictive scenarios requires a lot of data science work, with data being used over time to iteratively train the models and improve the accuracy and quality of the output. Additionally, businesses are being challenged to engineer new features, run and test many different models and determine the right mix of models to provide the most accurate result—and that’s just to determine what needs to be implemented in a production environment.
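
To make that concrete, here is a minimal sketch (hypothetical, not from the original post) of the kind of model comparison a data science team might run on time-ordered churn data before deciding what goes into production. The feature matrix, labels and candidate models are all placeholders:

```python
# Hypothetical sketch: comparing candidate churn models on time-ordered
# features before choosing what to put into production.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

# Placeholder data standing in for engineered features (e.g., rolling usage
# averages) and churn labels, ordered by time.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))
y = rng.integers(0, 2, size=1000)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(),
}

# Time-ordered splits avoid training on the future and testing on the past.
cv = TimeSeriesSplit(n_splits=5)
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: mean AUC {scores.mean():.3f} (+/- {scores.std():.3f})")
```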

Moreover, businesses need to realize that AI is no longer the exclusive domain of data scientists and the engineers that help prepare the data. The situation is not unlike how digital transformation has branched out from being an IT-driven initiative to a company-wide effort. Organizations must move beyond a siloed AI approach that divides the analytics team and the app development team. App developers need to become more knowledgeable about the data science lifecycle and app designers need to think about how predictive insights can drive the application experience.

To be successful, organizations must identify an approach that enables them to easily put models into production in a language that is appropriate for runtime—without rewriting the analytical model. Organizations need to not only optimize their initial models but also feed data and events back to the production model so that it can be continuously improved upon.
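
As one possible illustration (our assumption; the post does not name a specific tool or format), a team could export a trained model to a portable format such as ONNX so the production runtime can score it without rewriting the analytical model, while logging predictions and observed outcomes for later retraining:

```python
# Hypothetical sketch: export a trained model to a portable format (ONNX)
# so the production runtime can score it without rewriting the analytics.
import numpy as np
import onnxruntime as ort
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType
from sklearn.ensemble import RandomForestClassifier

# Placeholder training data standing in for the engineered features above.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 12)).astype(np.float32)
y_train = rng.integers(0, 2, size=500)

model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

# Convert once; the runtime only needs the ONNX file, not Python or sklearn.
onnx_model = convert_sklearn(
    model, initial_types=[("input", FloatTensorType([None, 12]))]
)
with open("churn_model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())

# In production, score with a lightweight runtime; in a real system the
# predictions and observed outcomes would also be logged for retraining.
session = ort.InferenceSession("churn_model.onnx")
new_events = rng.normal(size=(3, 12)).astype(np.float32)
labels, probabilities = session.run(None, {"input": new_events})
print(labels, probabilities)
```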

This may seem like a big, complicated process, but it is the key to actually implementing AI: the AI of AI, if you will. If your organization cannot do this, AI will remain out of reach.

The New World of AI

So how can organizations effectively implement AI in a way that enables them to address complex predictive scenarios with limited data science resources? And how do organizations achieve success without retraining their entire development team?

The truth of the matter is that this can't be done with a narrowly defined, one-size-fits-all approach that returns results from a handful of parameters. Delivering something insightful, actionable and valuable to the business requires a more involved implementation.

Take, for example, an IIoT predictive maintenance application that analyzes three months of time series data from sensors on hundreds or thousands of machines and returns the results automatically. What comes back isn't a simple predictive result set but a complete set of anomalies detected over that period, prioritized to eliminate the alert storms that previously made it impossible to operationalize the results. These prioritized results are served up via a work order on a mobile app to the appropriate regional field service personnel, who can then perform the necessary maintenance to maximize machine performance. It's a complex process in which the machine learning is automated and the feature engineering is done in an unsupervised fashion. The results draw on individual sensor data, machine-level data and machine population data, and are packaged up in a format that enables the business to take action.
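
A minimal sketch of that kind of pipeline, assuming an isolation forest for unsupervised anomaly detection and a simple fleet-wide ranking step (all column names, machine IDs and thresholds here are hypothetical), might look like this:

```python
# Hypothetical sketch: detect anomalies in sensor time series and prioritize
# them so field service sees a short, ranked work list instead of an alert storm.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

# Placeholder data: three months of hourly readings for a handful of machines.
rng = np.random.default_rng(1)
timestamps = pd.date_range("2018-01-01", periods=24 * 90, freq="H")
frames = []
for machine_id in ["M-001", "M-002", "M-003"]:
    frames.append(pd.DataFrame({
        "timestamp": timestamps,
        "machine_id": machine_id,
        "vibration": rng.normal(1.0, 0.1, len(timestamps)),
        "temperature": rng.normal(70.0, 2.0, len(timestamps)),
    }))
readings = pd.concat(frames, ignore_index=True)

# Score each machine's readings; lower scores are more anomalous.
alerts = []
for machine_id, group in readings.groupby("machine_id"):
    features = group[["vibration", "temperature"]].to_numpy()
    forest = IsolationForest(contamination=0.01, random_state=0).fit(features)
    flagged = group.assign(score=forest.score_samples(features)).nsmallest(5, "score")
    alerts.append(flagged)

# Rank across the whole fleet so only the worst anomalies become work orders.
work_orders = pd.concat(alerts).nsmallest(10, "score")
print(work_orders[["timestamp", "machine_id", "score"]])
```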

Welcome to the new world of AI implementation. While it's a very new concept, the best market definition of this process is currently “anomaly detection.” But not all solutions take the same approach, and not all solutions deliver predictions that lead to better business outcomes.

What you are about to see is a fundamental shift in how machine learning capabilities are delivered, and we aren't just talking about deployment in the cloud versus on-premise. We are talking about a shift from delivering data science tools that make data scientists more effective to delivering data science results that eliminate the need for those tools in the first place. In this brave new world, data scientists would be able to spend their time analyzing and improving the results instead of on non-mission-critical tasks.

The only thing required is that the data be provided in a time series format. Beyond that, you simply upload the data to the cloud (though on-premise options will exist too) and the automated AI does the rest, with accurate results returned within days.
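
The post does not specify a schema, but a time series layout of the sort such a service might accept could look like the following, with one row per timestamped reading (column names are hypothetical):

```python
# Hypothetical example of a time series layout: one row per timestamped
# reading, identified by machine and sensor.
import io
import pandas as pd

csv_data = """timestamp,machine_id,sensor,value
2018-01-01T00:00:00Z,M-001,vibration,1.02
2018-01-01T00:00:00Z,M-001,temperature,70.4
2018-01-01T01:00:00Z,M-001,vibration,1.05
2018-01-01T01:00:00Z,M-001,temperature,71.1
"""

readings = pd.read_csv(io.StringIO(csv_data), parse_dates=["timestamp"])
print(readings.dtypes)
print(readings.head())
```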

Soon you can move from dreams of AI to actual implementation!
