Blog Feed Post

Cloud Migration Best Practice: Classifying Your Data



In my first post of this series, “Cloud Migration Part One: An Overview,” I provided a high-level summary of how enterprises should migrate applications to the cloud. In this installment, the focus is on enterprise data and why your organization may need to review and reclassify its data before moving anything to the cloud.

Cloud computing has done more than change the way enterprises consume information technology.  It is also changing how organizations need to protect their data.  Some may see this as an “unintended consequence,” but the headlong rush to save money by migrating applications to the cloud has simultaneously uncovered long-hidden application security issues.  This revelation is mostly due to the wide adoption of “Lift & Shift” as a cloud migration strategy.  This option typically precludes any modification of the migrating application.  It can also eliminate essential data security controls and lead to grave data breaches.

While the good intentions of all involved are not in doubt, enterprise applications were traditionally developed for deployment into the organization’s own IT infrastructure.  This implicit assumption also included the use of infrastructure-based security controls to protect organizational data.  These generally accepted industry practices were coupled with a cultural propensity to err on the side of caution by protecting most data at generally high levels.  During an implementation, organizations typically used a two-level (sensitive and non-sensitive) or at most a four-level data classification model.

Today, the cloud has quickly become the preferred deployment environment for enterprise applications.  This shift to using “other people’s infrastructure” has brought with it tremendous variability in the nature and quality of infrastructure-based data security controls.  It is also forcing companies to shift away from infrastructure-centric security toward data-centric information security models.  Expanding international electronic commerce, ever-tightening national data sovereignty laws, and regional data protection and privacy regulations (e.g., the GDPR) have also combined to make many data classification schemas untenable.  The Cloud Security Alliance and the International Information Systems Security Certification Consortium ((ISC)²), in fact, both suggest that corporate data may need to be classified across at least eight categories, namely:
  • Data type (format, structure)
  • Jurisdiction and other legal constraints
  • Context
  • Ownership
  • Contractual or business constraints
  • Trust levels and source of origin
  • Value, sensitivity, and criticality
  • The obligation for retention and preservation
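To make the eight categories above concrete, here is a minimal sketch of what a per-data-type classification record might look like. This is an illustration only: the class, field names, and sample values are hypothetical, not a schema prescribed by the Cloud Security Alliance or (ISC)².

```python
from dataclasses import dataclass
from enum import Enum


class Sensitivity(Enum):
    """Illustrative sensitivity scale; real models may differ."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4


@dataclass
class DataClassification:
    """One record per data type, covering the eight suggested categories."""
    data_type: str                      # format / structure, e.g. "structured PII"
    jurisdictions: list[str]            # jurisdiction and other legal constraints
    context: str                        # business context in which the data is used
    owner: str                          # ownership
    contractual_constraints: list[str]  # contractual or business constraints
    trust_level: str                    # trust level and source of origin
    sensitivity: Sensitivity            # value, sensitivity, and criticality
    retention_years: int                # retention and preservation obligation


# Hypothetical example record for one data type
record = DataClassification(
    data_type="structured customer PII",
    jurisdictions=["EU"],
    context="e-commerce checkout",
    owner="order-management team",
    contractual_constraints=["PCI DSS scope"],
    trust_level="first-party, verified",
    sensitivity=Sensitivity.RESTRICTED,
    retention_years=7,
)
```

A record like this makes each of the eight dimensions an explicit, reviewable field rather than an implicit assumption buried in infrastructure controls.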

Moving to classify data at this level means that one of the most important initial steps of any cloud computing migration must be a review and possible reclassification of all organizational data.  If this step is bypassed, newly migrated applications simply become data breaches in waiting.  At a minimum, an enterprise should:
  • Document all key business processes destined for cloud migration;
  • Identify all data types associated with each migrating business process;
  • Explicitly assign the role of “Process Data Owner” to appropriate individuals; and
  • Assign each “Process Data Owner” the task of setting and documenting the minimum required security controls for each data type.
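The inventory produced by the steps above can be sketched as a simple nested structure: each migrating business process lists its data types, and each data type carries a Process Data Owner and minimum required controls. All process, owner, and control names here are hypothetical illustrations, assuming Python as the modeling language.

```python
# Hypothetical inventory: process -> data type -> owner and minimum controls
inventory = {
    "order-fulfilment": {
        "customer PII": {
            "owner": "A. Rivera",
            "min_controls": ["encryption at rest", "role-based access"],
        },
        "shipment telemetry": {
            "owner": "A. Rivera",
            "min_controls": ["transport encryption"],
        },
    },
}


def unowned_data_types(inv):
    """Flag data types that still lack an assigned Process Data Owner."""
    return [
        (process, dtype)
        for process, dtypes in inv.items()
        for dtype, meta in dtypes.items()
        if not meta.get("owner")
    ]
```

A check like `unowned_data_types` gives governance reviewers a quick way to confirm that no data type heads to the cloud without an accountable owner.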

After completing these steps, companies should review and update their IT governance process to reflect any required expansion of their corporate data classification model.  These steps are also aligned with the ISO/IEC 27034-1 framework for implementing cloud application security.  This standard explicitly takes a process approach to specifying, designing, developing, testing, implementing, and maintaining security functions and controls in application systems.  It defines application security not as the state of security of an application system (the result of the process) but as “a process an organization can perform for applying controls and measurements to its applications in order to manage the risk of using them.”

In Part 3 of this series, I will discuss application screening and related industry best practices and include:
  • Determining the most appropriate target application deployment environment;
  • Determining each application's business value, key performance indicators and target return on investment;
  • Determining each application's migration readiness; and
  • Deciding the appropriate application migration strategy.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
(© Copyright Kevin L. Jackson 2016-2018)




More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission critical solutions. He has served in various senior management positions including VP & GM Cloud Services NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes MSEE (Computer Engineering), MA National Security & Strategic Studies and a BS Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy earning specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
