By Elad Yoran
May 19, 2013 05:00 PM EDT
Cloud service providers store data all over the globe, and are constantly moving that data from one datacenter to the next for reasons as wide-ranging as cost considerations and redundancy requirements. Does this mean that the requirements outlined in varying data residency laws and privacy regulations are directly at odds with how cloud computing works?
The question is an especially delicate one when the cloud service provider stores and processes data in a jurisdiction that is perceived to have far less stringent privacy and data protection requirements - or may allow government agencies far broader data subpoena powers. Since the cloud computing model relies on distributed infrastructure to generate cost and flexibility benefits for customers, building a datacenter in each data residency jurisdiction quickly becomes cost-prohibitive. And, applying a set of constraints to the movement of data introduces an additional layer of complexity that further erodes the value proposition of cloud computing for customers.
Just as cloud computing represents a novel way of delivering IT computing and functionality, a new model for maintaining ownership and direct control of data in the cloud is increasingly required. This new model requires that the encryption mechanism be maintained externally and independently of the cloud service provider's environment, and that data be encrypted before it is sent to the cloud.
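As a rough sketch of that model, the following toy Python example shows data being encrypted with a locally held key before anything is handed to a provider. The SHA-256 counter-mode keystream is an illustrative stand-in, not a production cipher; a real deployment would use a vetted algorithm such as AES-GCM from an established library.

```python
# Illustrative only: the key is generated and kept inside the organization,
# and only ciphertext ever leaves for the cloud. The keystream below is a
# toy construction for self-containment, not real-world cryptography.
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key and a per-message nonce."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

key = secrets.token_bytes(32)               # never leaves the organization
record = b"patient-id=1234; diagnosis=..."
ciphertext = encrypt(key, record)           # this is what the cloud stores
assert decrypt(key, ciphertext) == record
```

The essential property is the direction of the flow: encryption happens before transmission, and decryption is only possible where the key resides.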
The Issues Surrounding Information Security and Data Protection Laws
Over the past 18 months, concerns about the feasibility of enforcing data residency laws and regulations in the cloud have increasingly come to the forefront. Multiple countries, including India, Switzerland, Germany, Australia, South Africa and Canada, have enacted laws restricting corporations from storing data outside their physical country borders. Additionally, the EU Safe Harbor Principles forbid companies operating within the European Union from sending personally identifiable information (PII) outside the European Economic Area unless it is guaranteed that the data will receive equivalent levels of protection.
This is partly a result of a broader understanding of cloud computing architecture and processes, but also of the ambiguity of safeguards for the privacy of cloud data. For example, national security concerns have driven US legislation such as the Foreign Intelligence Surveillance Act (FISA) Amendments Act and the USA PATRIOT Act, which extend the ability of the federal government and law enforcement agencies to subpoena communications and emails stored in the cloud. The concern is now as much about whether data is leaving the jurisdiction as about what the privacy laws hold where the data lands. Inconsistent approaches to privacy further complicate the picture.
The current response to this challenge is either not to move to the cloud or to require cloud service providers to store data within each jurisdiction. For cloud service providers, this presents a business challenge: delivering flexibility, cost savings and effective service while altering their delivery and management models to satisfy data residency and privacy requirements. To address the mandates set forth by these laws, a cloud provider would ostensibly have to build datacenters in each jurisdiction, resulting in significant cost and overhead that would erode the overall gain of cloud storage.
Cloud Encryption and Cloud Data Residency Regulations
The interaction between the evolution of information security and the definition of data protection mandates by legislative bodies or industry groups is a dynamic one. At the heart of the concern is how organizations can continue to maintain ownership and control of data to protect personal information, even when the information resides with a third-party service that relies on a distributed infrastructure in order to deliver resiliency, availability and flexibility to customers.
By way of illustration, compliance requirements and data breach laws have been regularly updated as new information security alternatives have been developed. In the US, more than 40 states currently have breach notification laws mandating that if a company is aware of lost or stolen personally identifiable information, it is required to notify the affected consumers directly. When these laws were initially enacted (starting with the State of California in 2002), they generally stated that regardless of the circumstances, the company was required to notify the consumer. However, the laws have been gradually amended, and more than 25 states have now enacted an exemption for encrypted personal data. In other words, in instances where lost or stolen data is encrypted, the company is no longer required under law to notify the consumer.
The underlying argument for differentiating between unencrypted data and encrypted data in the context of breach notification is that in the instance where data is encrypted, the attacker has gained access to useless "gibberish" if they do not hold the encryption keys.
However, cloud computing is an evolving paradigm where both the obligations of the data owner and acceptable forms of data protection are still in the process of initial definition. As the technology gains popularity and becomes a well-established method of data storage and processing, the laws pertaining to cloud computing will also continue to evolve in the same way that data breach laws have.
For example, regulations are also now moving towards excluding encrypted data from data residency legislation. Encryption is recognized in the State of Nevada as a means of securing data outside of geographic boundaries: "A data collector doing business in this State shall not: (a) Transfer any personal information through an electronic, non-voice transmission other than a facsimile to a person outside of the secure system of the data collector unless the data collector uses encryption to ensure the security of electronic transmission; or (b) Move any data storage device containing personal information beyond the logical or physical controls of the data collector or its data storage contractor unless the data collector uses encryption to ensure the security of the information."
While data residency regulations can be narrowly defined, in many jurisdictions laws can be interpreted as not applying to data that has been encrypted before being sent to the cloud. Dr. Thilo Weichert, head of the Independent Center for Privacy Protection for the German state of Schleswig-Holstein, argues in his Cloud Computing & Data Privacy paper that if data is anonymized or sufficiently aliased to the extent that the identity of individuals is indecipherable, then data residency law does not apply. Encryption takes anonymizing and aliasing a step further, where the data is completely indecipherable. Similarly, under the European Union's Data Protection Directive (EU DPD), as long as the data is encrypted, where it resides should not present a legal obstacle.
Likewise, under Canadian privacy law, both federal bodies and commercial organizations domiciled within Canadian borders are responsible for the privacy and protection of personal information in their custody. This requirement applies regardless of where the data resides. While significant concerns have been articulated with regard to the likelihood of disclosure to law enforcement agencies for data that resides within US datacenters, the requirements pertain directly to the safeguards in place to maintain control.
Ann Cavoukian, Information and Privacy Commissioner for the Province of Ontario, addressed a question about compliance with the Freedom of Information and Protection of Privacy Act, concerning the privacy and security of personal information collected by the Ministry of Natural Resources and stored in the US. In her formal response, she noted that "to the extent that the data owner retains the encryption keys, the location of the encrypted data is a secondary issue."
In other words, if the encrypted data leaves the jurisdiction, but the keys remain under the data owner's direct control, the level of protection can be sufficient in terms of data residency requirements.
However, this model also implies that the data encryption scheme is maintained externally and independently of the cloud service provider's environment, and that data is encrypted before it is sent to the cloud.
Persistent Encryption and Data Residency
The most effective method to address the jurisdictional and residency requirements of data processed by third-party services is control of encryption keys combined with the application of persistent encryption. By applying persistent encryption, data that is encrypted at the boundary of the network remains encrypted even when processed and stored within a cloud service provider's environment. As a result, persistent encryption ensures that data is never decrypted in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key.
Therefore, businesses can comply with jurisdictional and residency requirements by virtue of keeping the encryption keys within the jurisdiction regardless of the actual physical location of the data. Laws relating to data residency are now undergoing a historic transition from the old paradigm where it mattered where the data was physically located to the new paradigm where it only matters where the encryption keys are located.
With the application of persistent encryption, control of the keys, in combination with encryption across the data lifecycle - in transit, at rest and in use - provides the foundation to satisfy requirements for control and adequate safeguards for the privacy of personal information. Although the encrypted data may leave the physical borders of a specific country, the data is always fully encrypted while outside of the defined jurisdiction. As the keys are retained within the business's legal jurisdiction, the data cannot be accessed or read until it returns to the physical borders in which the organization resides.
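The trust boundary this describes can be modeled in a few lines. In this hedged sketch, `CloudProvider` is a hypothetical stand-in for a provider that stores only opaque blobs, and the XOR keystream is a toy cipher for illustration only:

```python
# Toy model of the trust boundary: the provider stores and returns only
# opaque blobs; decryption is possible only where the key lives.
import hashlib
import secrets

class CloudProvider:
    """Hypothetical provider that never sees keys or plaintext."""
    def __init__(self):
        self._blobs = {}

    def put(self, name: str, blob: bytes) -> None:
        self._blobs[name] = blob

    def subpoena(self) -> dict:
        # Everything the provider could ever hand over: ciphertext only.
        return dict(self._blobs)

def toy_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Symmetric toy XOR keystream (illustrative, not a real cipher)."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + bytes([i])).digest()
        i += 1
    return bytes(d ^ s for d, s in zip(data, stream))

key = secrets.token_bytes(32)      # stays inside the jurisdiction
nonce = secrets.token_bytes(16)
provider = CloudProvider()         # may be anywhere in the world
provider.put("record-1", nonce + toy_encrypt(key, nonce, b"trial results"))

disclosed = provider.subpoena()["record-1"]
assert b"trial results" not in disclosed   # disclosure yields only ciphertext
```

Because the key never crosses the boundary, anything the provider can disclose, whether voluntarily or under subpoena, is unreadable without cooperation from the key holder.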
Global Pharmaceutical Company Case Study: Cloud Data Ownership and Control Concerns
The following example describes a privately held multinational pharmaceutical company that engages in research, development, production, and marketing of prescription and over-the-counter medicines and healthcare products. The company has thousands of employees across the globe, as well as multiple subsidiaries and entities.
The company's IT procurement and deployment approach follows a decentralized model in which each subsidiary hosts its own servers and datacenters. There are three functional organizational pillars maintained within its technology and IT services division: Technology Planning; Enterprise Architecture and Data Services; and Production Services. The divisions are staffed by IT engineers, with managed services providing support for thousands of clients across a multitude of sites. Existing infrastructure includes hardware, software, services, and virtualization from multiple top vendors including Microsoft, VCE, Dell, Oracle, EMC and VMware.
The pharmaceutical company had adopted several cloud-based services for applications that do not process or store critical or regulated business information, such as Web conferencing, spam filtering, compliance training and tracking, and travel and expense management. It was, however, seeking to expand its cloud computing usage to business-critical applications by moving low-value servers to cloud providers, as well as moving commodity applications such as email to the cloud.
Concerns about the loss of control and ownership of corporate data, however, stood in the way of realizing the increased efficiencies and operational benefits possible through broader adoption of cloud-based services. These concerns were related to:
- Compliance with international data residency requirements that preclude data leaving a jurisdiction in the clear
- Compliance with regulations governing the security, privacy and confidentiality of healthcare data
- Safeguards to limit exposure of its intellectual property when it is stored and processed in the cloud
- Lack of visibility into service provider responses to information subpoenas that can result in a breach of confidentiality or loss of data
Addressing Residency and Unauthorized Disclosure
While the cloud service provider could attest to the security of the environment based on a framework like the Cloud Security Alliance's Cloud Controls Matrix, the global pharmaceutical company required an independent mechanism to protect its intellectual property while resident in the cloud. Confidentiality, and sensitivity to a service provider's compliance with government subpoenas, are common obstacles to cloud migration in the pharmaceutical and healthcare industry, because these companies maintain sensitive information related to research, clinical study results, and personal medical history. It is therefore critical that sensitive information remain under the company's control, without any forfeiture of attorney-client privilege.
In a typical scenario, if a company stores sensitive data in the cloud and the cloud service provider is faced with a subpoena or other request from the government, the provider must comply and disclose the company's data to the requesting government body. The provider may notify the company after the fact or, in cases of blind subpoenas, not at all.
The pharmaceutical company decided to use persistent encryption technology to address the migration of its email infrastructure to the cloud. Deployed as an on-premises gateway, this enabled the company to satisfy the jurisdictional and residency requirements of email data hosted in the cloud: the company maintains control of the encryption keys, and business data is encrypted when it passes through the gateway's proxy at the boundary of the network and remains encrypted even when processed by and stored within the cloud service provider's environment.
The persistent encryption technology ensures that data is never decrypted in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key. The company is therefore able to comply with jurisdictional and residency requirements by virtue of keeping the encryption keys within the jurisdiction regardless of the actual physical location of the data, while also retaining complete ownership and control of that data if faced with a subpoena.
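A boundary gateway of the kind described might be sketched as follows. This is an illustrative sketch only: the cipher is a toy XOR keystream, and names such as `gateway_outbound` and the `X-Encrypted` header are hypothetical, not the actual product's interface.

```python
# Hedged sketch of an on-premises email encryption gateway: the message
# body is replaced with ciphertext before the mail is relayed to the
# cloud email service; routing headers pass through unchanged.
import base64
import hashlib
import secrets
from email.message import EmailMessage

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Symmetric toy XOR keystream (illustrative, not a real cipher)."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + bytes([i])).digest()
        i += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def gateway_outbound(msg: EmailMessage, key: bytes) -> EmailMessage:
    """Encrypt the body at the network boundary; keep headers for routing."""
    nonce = secrets.token_bytes(16)
    body = msg.get_content().encode()
    wrapped = EmailMessage()
    for header in ("From", "To", "Subject"):
        if msg[header]:
            wrapped[header] = msg[header]
    wrapped["X-Encrypted"] = "persistent"   # hypothetical marker header
    wrapped.set_content(base64.b64encode(nonce + xor_stream(key, nonce, body)).decode())
    return wrapped

key = secrets.token_bytes(32)               # held at the on-premises gateway only
msg = EmailMessage()
msg["From"] = "researcher@corp.example"
msg["To"] = "partner@corp.example"
msg["Subject"] = "Trial update"
msg.set_content("confidential clinical data")
cloud_copy = gateway_outbound(msg, key)     # this is what the provider stores
```

The cloud email service only ever stores and indexes the base64-wrapped ciphertext; reversing it requires the key, which never leaves the gateway.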