How State Governments Can Protect and Win with Big Data, AI and Privacy

I was recently asked to conduct a two-hour workshop for the State of California Senior Legislators on the topic of “Big Data, Artificial Intelligence and Privacy.” Honored by the privilege of offering my perspective on these critical topics, I shared with my home-state legislators the once-in-a-generation opportunities awaiting the great State of California (“the State”), where decision makers could vastly improve their constituents’ quality of life while creating new sources of value and economic growth.

Industrial Revolution Learnings

We have historical experiences and references to revisit in discerning what government can do to nurture our “Analytics Revolution.” Notably, the Industrial Revolution holds many lessons regarding the consequences of late and/or confusing government involvement and guidance (see Figure 1).

Figure 1: Lessons from the Industrial Revolution


Government’s role in the “Analytics Revolution” is clear: to carefully nurture and support industry, university, and government collaboration to encourage sustainable growth and prepare for massive changes and opportunities. The government can’t afford to stand by and let the markets decide. By the time the markets have decided, it may be too late to redirect and guide resources, especially given the interests of Russia and China in this all-important science.

Be Prepared to Act on the Nefarious

Access to sensitive information, data protection, privacy – these are all hot-button issues with the citizenry. The State must be aware of the societal and cultural risks associated with the idea of a “Big Brother” shadowing its people. The State must champion legislation in cooperation with industry to protect the masses while not stifling creativity and innovation. That’s a tough job, but the natural conflict between “nurturing while protecting” is why government needs to be involved early. Through early engagement, the State can reduce the tension between industrial growth and personal privacy.

The “Analytics Revolution” holds tremendous promise for the future of industry and personal achievement, but it will require well-defined rules of conduct and engagement. Unsupervised growth or use may lead to information being exploited in nefarious ways, with potentially damaging results.

The State must protect its constituents’ sensitive information while nurturing the industrial opportunity. That’s a tall order, but nothing less should be expected from our government, industry and society leaders.

Can’t Operate in a World of Fear

We can’t be afraid of what we don’t know. The State must increase constituents’ awareness and education about Big Data and Artificial Intelligence: what they are, what they are used for, and the opportunities locked within, including “The Good, the Bad and the Ugly.”

We can’t operate in a world of fear, jumping to conclusions based upon little or no information or, worse yet, misinformation or purposeful lies. Government leaders must collaborate with industry and universities to actively gain an understanding of the true ramifications and capabilities of Big Data and Artificial Intelligence before they create legislation (see Figure 2).

Figure 2: Government Leaders Must Seek Information before Jumping to Conclusions


It’s because I’m an educator in this field that I was so honored to be part of this discussion. In addition to discussing the economic opportunities that lie within Big Data and Artificial Intelligence, I wanted to help our legislators understand that they should prioritize their own education in these sciences before enacting rules and regulations.

Predict to Prevent

The opportunities for good are almost overwhelming at the government level! Whether in education, public services, traffic, fraud, crime, wildfires, public safety or population health, Big Data and Artificial Intelligence can dramatically improve outcomes while reducing costs and risks (see Figure 3).

Figure 3: Big Data and AI Reducing Crop Loss to Diseases


However, to take advantage of the potential of Big Data and Artificial Intelligence, the State, its agencies, and its legislators need to undergo a mindset shift. They need to evolve beyond “using data and analytics to monitor agency outcomes” to understanding how to “leverage data and analytics to Predict, to Prescribe and to Prevent!” That is, these organizations need to evolve from a mindset of reporting what happened to a mindset of predicting what’s likely to happen and prescribing corrective or preventative actions or behaviors (see Figure 4).

Figure 4: The “Predict to Prescribe to Prevent” Value Chain


There are numerous use cases of this “predict to prevent” value chain that will not only benefit state agencies’ operations but also have positive quality-of-life ramifications for the residents of California, including the opportunity to prevent:

  • Hospital-acquired infections
  • Crime
  • Traffic jams and vehicle accidents
  • Major road maintenance
  • Cyberattacks
  • Wildfires
  • Unplanned equipment maintenance and failures
  • Electricity and utility outages
  • And more…
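To make the “Predict to Prescribe to Prevent” value chain concrete, consider the equipment-failure case above. The following is a minimal Python sketch of the idea; the sensor names, weights, and risk thresholds are illustrative assumptions for this post, not a model any agency actually uses:

```python
# A toy "Predict -> Prescribe -> Prevent" pipeline for equipment failures.
# All weights and thresholds below are hypothetical, for illustration only.

def failure_risk(vibration, temperature, hours_since_service):
    """Predict: combine normalized sensor readings into a risk score in [0, 1]."""
    score = (0.4 * min(vibration / 10.0, 1.0)          # vibration in mm/s
             + 0.3 * min(temperature / 120.0, 1.0)     # temperature in deg F
             + 0.3 * min(hours_since_service / 5000.0, 1.0))
    return round(score, 2)

def prescribe(risk, threshold=0.7):
    """Prescribe: turn a predicted risk into a preventative action."""
    if risk >= threshold:
        return "schedule immediate maintenance"
    if risk >= 0.4:
        return "inspect at next service window"
    return "no action needed"

# Prevent: act on the prescriptions before a failure occurs.
assets = {
    "pump-01": failure_risk(vibration=9.5, temperature=110, hours_since_service=4800),
    "pump-02": failure_risk(vibration=2.0, temperature=70, hours_since_service=300),
}

for name, risk in assets.items():
    print(f"{name}: risk={risk} -> {prescribe(risk)}")
```

The same shape applies to the other use cases: a predictive score built from whatever signals the agency already collects, a prescriptive rule that maps the score to an action, and a preventative workflow that executes the action before the bad outcome happens.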

Role of Government

The role of government is to nurture, not necessarily to create, especially in California. California is blessed with a bounty of human capital resources including an outstanding higher education system and an active culture of corporate investing such as the Google $1B AI Fund (see “Google Commits $1 Billion In Grants To Train U.S. Workers For High-Tech Jobs”).

There is a bounty of free and low-cost Big Data and Artificial Intelligence training available. For example, Andrew Ng, one of the world’s best-known artificial-intelligence experts, is launching an online effort to create millions more AI experts across a range of industries. Ng, an early pioneer in online learning, hopes his new deep-learning course on Coursera will train people to use the most powerful idea to have emerged in AI in recent years.

California sits in rarified air when it comes to the volume of natural talent in the Big Data and Artificial Intelligence spaces. The State should seize on these assets, coordinate all of these valuable resources and ensure that this quality and depth of training is available to all.

State of California Summary

In summarizing what I told my audience, Big Data and Artificial Intelligence provide new challenges, but the opportunities for both private and public sectors are many. To harness the power of Big Data and AI, the State should focus on:

  • Minimizing the impact of nefarious, illegal and dangerous activities
  • Balancing consumer value against consumer exploitation
  • Addressing inequities in data monetization opportunities
  • Re-tooling and re-skilling the California workforce
  • Fueling innovation via university-government-business collaboration
  • Adopting regulations that ensure citizen/customer fairness (share of the wealth)
  • Providing incentives to accelerate statewide transformation and adoption

Figure 5: Threats to the California “Way of Life”


It is up to everyone — the universities, companies, and individuals — to step up and provide guidance to our government and education leaders to keep California at the forefront of our “Analytics Revolution.” This is one race where there is no silver medal for finishing second.

The post How State Governments Can Protect and Win with Big Data, AI and Privacy appeared first on InFocus Blog | Dell EMC Services.


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Hitachi Vantara as CTO, IoT and Analytics.

Previously, as a CTO within Dell EMC’s 2,000+ person consulting organization, he worked with organizations to identify where and how to start their big data journeys. He has written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow, where he teaches the “Big Data MBA” course. Bill also recently completed a research paper on “Determining the Economic Value of Data.” Onalytica recently ranked Bill as the #4 Big Data influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Master of Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
