By Adam Vincent
December 10, 2010 09:24 AM EST
The 9/11 Commission Report cited "pervasive problems of managing and sharing information across a large and unwieldy government that had been built in a different era to confront different dangers". Since 9/11, governments around the world have considerably adjusted their stance on information sharing to allow more adequate and timely exchange of information. Unfortunately, in many situations the need to share information quickly has taken priority over the need to protect it, leaving security policies, certification and accreditation practices, and existing security controls behind.
WikiLeaks may jeopardize everything we've worked towards to enhance information sharing, and impede efforts to make it more effective. Or it may serve as a wake-up call that our current policies, processes and solutions are not adequate in today's world, where information must be collected, fused, discovered, shared and protected at network speed.
Here at Layer 7, we've been working with government agencies worldwide to support their needs for sharing information more quickly, while introducing a more robust set of access and security controls to allow only those with need-to-know clearance access to privileged information. In the following paragraphs, I'm going to discuss how Layer 7 Technologies aids in breaking down information-sharing silos while maintaining a high degree of information protection, control and tracking.
There are multiple efforts underway across government agencies to use digital policy to control who gets access to what information, and when, as opposed to relying on written policy. Layer 7's policy-oriented controls allow digital policy to be defined and enforced across distributed information silos. Whether inside an enterprise or in the cloud, government agencies and commercial entities can use Layer 7 to define and enforce rules for information discovery, retrieval and dissemination across a variety of security realms and boundaries. With the right kind of policy controls, companies can avoid a WikiLeak of their own.
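To make the idea of "digital policy" concrete, here is a minimal, hypothetical sketch of access rules expressed as machine-enforceable data rather than written policy. The rule format and attribute names are my own illustrations, not Layer 7's actual policy language.

```python
# A "digital policy": each rule states which action on which resource is
# permitted, and what the requester must present to qualify. Because the
# policy is data, it can be evaluated automatically at request time.
POLICY = [
    {"action": "retrieve", "resource": "cable",
     "require": {"clearance": "secret"}},
    {"action": "disseminate", "resource": "cable",
     "require": {"clearance": "top-secret"}},
]

def is_allowed(user, action, resource):
    """Return True if any rule permits this action on this resource
    for a user with these attributes."""
    for rule in POLICY:
        if rule["action"] == action and rule["resource"] == resource:
            if all(user.get(k) == v for k, v in rule["require"].items()):
                return True
    return False

analyst = {"name": "alice", "clearance": "secret"}
print(is_allowed(analyst, "retrieve", "cable"))     # True
print(is_allowed(analyst, "disseminate", "cable"))  # False
```

The point of the sketch is the shift it represents: changing who may disseminate what becomes an edit to a rule set enforced at network speed, not a memo that users may or may not follow.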
Layer 7 provides information plumbing for the new IT reality. Using Layer 7 products, organizations can address:
Data Exfiltration – The WikiLeaks scandal broke because of a single user's ability to discover, collect and exfiltrate massive quantities of information, much of which was not needed for his day-to-day activities. With Layer 7, digital policies can be defined and enforced that put limits on the number of times a single user can retrieve a single type of data, or multiple types of data that, when aggregated together, could be interpreted as indicating malicious intent. If the user goes beyond his administratively imposed limit, Layer 7 can either allow the operation while notifying administrative or security personnel of the potential issue, or disallow access altogether while awaiting remediation.
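The notify-versus-deny behavior described above can be sketched as a per-user sliding-window limit. This is an illustrative model of the technique, with made-up class and parameter names, not Layer 7's implementation.

```python
import time
from collections import defaultdict, deque

class ExfiltrationGuard:
    """Track each user's retrievals of each data type in a sliding time
    window, and flag or block requests once a threshold is exceeded.
    'notify' mode allows the request but records an alert for security
    personnel; 'deny' mode refuses it outright."""

    def __init__(self, limit, window_seconds, mode="notify"):
        self.limit = limit
        self.window = window_seconds
        self.mode = mode
        self.history = defaultdict(deque)   # (user, data_type) -> timestamps
        self.alerts = []

    def request(self, user, data_type, now=None):
        now = time.time() if now is None else now
        q = self.history[(user, data_type)]
        while q and now - q[0] > self.window:   # drop events outside window
            q.popleft()
        q.append(now)
        if len(q) > self.limit:
            self.alerts.append((user, data_type, len(q)))
            return self.mode == "notify"        # deny mode blocks the request
        return True

guard = ExfiltrationGuard(limit=3, window_seconds=3600, mode="deny")
results = [guard.request("analyst1", "cable", now=t) for t in range(4)]
print(results)  # [True, True, True, False]
```

A real enforcement point would also correlate limits across data types, since the scenario above involved bulk collection of many kinds of records rather than repeated hits on one.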
Access Control - The heart of any information system is its ability to grant access to people who meet the "need to know" requirement for the information contained within. The reality in government organizations is that many information systems rely on the user's level of clearance, the network he is using, or coarse-grained information like the branch of service he belongs to, in order to grant or deny access to an information-sharing system in its entirety. For those going beyond the norm with Role Based Access Control (RBAC), the burden of administering hundreds or thousands of users based on groups is formidable and limits the effectiveness of the system; it increases the likelihood that the system has authorized users who no longer have "need to know" of the information.
Layer 7's policy enforcement and decision capabilities allow for user authorization through either Attribute Based Access Control (ABAC) or Policy Based Access Control (PBAC). These types of authorization correlate, through policy, attributes about the user, the resource and the environment in order to allow or deny access. Attributes can be collected from local identity repositories or from enterprise attribute services.
In addition, enterprise attribute services can be federated so that attributes can be shared across organizations, minimizing the need to manage attributes for users from other organizations. An often-overlooked factor of authorization is the need to tie typical authorization policy languages like XACML (is user X allowed to access resource Y) to policies around data exfiltration, data sanitization and transformation, and audit. This is the area where Layer 7 stands out: not only do we have the ability to authorize the user, but we can also enforce a wide variety of policy controls that are integrated with access control.
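An attribute-based decision of this kind can be sketched as a function over subject, resource and environment attributes that returns not just Permit/Deny but also obligations (audit, sanitize) to apply to the response, which is the integration of access control with data handling described above. All attribute names and rules here are hypothetical; a production system would express them in a policy language such as XACML.

```python
# Illustrative ABAC decision: compare attributes of the subject, resource
# and environment, and return a decision plus obligations the enforcement
# point must carry out (every decision is audited; cross-organization
# releases are sanitized first).
def decide(subject, resource, environment):
    obligations = ["audit"]
    if subject["clearance_level"] < resource["classification_level"]:
        return "Deny", obligations
    if environment["network"] != resource["required_network"]:
        return "Deny", obligations
    if subject["organization"] != resource["owning_organization"]:
        obligations.append("sanitize")   # redact before release
    return "Permit", obligations

subject = {"clearance_level": 3, "organization": "agency-a"}
resource = {"classification_level": 2, "required_network": "siprnet",
            "owning_organization": "agency-b"}
environment = {"network": "siprnet"}
print(decide(subject, resource, environment))  # ('Permit', ['audit', 'sanitize'])
```

Because the decision is driven by attributes rather than group membership, revoking a user's "need to know" is a matter of updating an attribute at its source, and federated attribute services let that update propagate across organizations.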
The following blog posts by Anil John, a colleague who specializes in the identity space, provide good information about the benefits to the community of moving from roles to policy and attributes: Policy Based Access Control (PBAC) and Federated Attribute Services.
Monitoring, Visibility & Tracking - Even when controls are in place that help mitigate the issue of "need to know," there will always be a risk of authorized users collecting information within the norms of their current job and role. Visibility into usage, both by the individual IT system owner and across enterprise systems, is therefore key to limiting this type of event. Layer 7 allows monitoring data to be federated, so information about data accesses can be shared with the organizations monitoring the network or enterprise. This allows authentication attempts and valid authorizations to be tracked, and distributed data-retrieval trends to be analyzed on a per-user basis across the extended enterprise.
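The value of federating monitoring data can be sketched as a simple aggregation: each system reports per-user access counts, and a cross-enterprise monitor merges them so that a user who stays under any one system's radar still stands out in the whole. The feed format and threshold below are illustrative assumptions, not a real reporting schema.

```python
from collections import Counter

def aggregate(feeds):
    """Merge per-user access counts reported by multiple systems
    into enterprise-wide totals."""
    totals = Counter()
    for system, records in feeds.items():
        for user, count in records.items():
            totals[user] += count
    return totals

def outliers(totals, threshold):
    """Users whose aggregate retrievals exceed the enterprise threshold."""
    return {user: n for user, n in totals.items() if n > threshold}

# Each system's activity looks unremarkable in isolation...
feeds = {
    "system-a": {"alice": 40, "bob": 5},
    "system-b": {"alice": 45, "carol": 10},
}
totals = aggregate(feeds)
# ...but the federated view surfaces the aggregate pattern.
print(outliers(totals, threshold=50))  # {'alice': 85}
```

In practice the analysis would be over time-stamped events rather than raw counts, but the principle is the same: trend analysis only works at the scope where the trend exists.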
Preventing the leakage of privileged information to unauthorized users can never be 100% guaranteed. However, with the straightforward implementation of a policy-based information control like Layer 7, access to confidential information can be restricted and tracked.