
Define your own API Management Deployment Model

API management platforms come in different shapes and sizes: cloud-based infrastructure, on-premise infrastructure, multi-tenant SaaS, single-provider portals, API ecosystems, etc. In this third part on API management deployment models, let's look at some of the considerations in choosing the right approach for your API management project.

Let’s start with the data.

Assuming the data behind the target APIs already exists, where does that data live? If the data does not yet exist, are there constraints on where it can reside (certification requirements, legal obligations, etc.)? Bridging this data to the external world requires some level of security at the perimeter of the existing data zone, regardless of where or how the rest of the API management infrastructure is deployed. In that case, an infrastructure model is at least part of the solution.

Conversely, if the data does not exist yet and/or can freely reside in a public zone, the hosted API management model is a great alternative. Ideally, the data or backend is located in the same public zone. This may seem obvious, but if the same zone does not host both the API management layer and the backend, you do not realize the full benefit. Backend-as-a-service can be considered part of the platform, especially for public deployments.
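To make the perimeter idea concrete, here is a minimal sketch of the kind of decision a gateway at the edge of the data zone makes. The path names and zone labels are hypothetical, invented for illustration only:

```python
# Hypothetical perimeter rule: decide whether a request from a given zone
# may be forwarded across the perimeter into the private data zone.
ALLOWED_PATHS = {"/orders", "/catalog"}        # assumed public-facing APIs
PRIVATE_ONLY_PATHS = {"/customers/pii"}        # data that must stay private

def may_cross_perimeter(path: str, caller_zone: str) -> bool:
    """Return True if the request can be forwarded to the private backend."""
    if path in PRIVATE_ONLY_PATHS:
        return caller_zone == "private"
    return path in ALLOWED_PATHS

# A public caller can reach the public-facing API, but not the private data.
print(may_cross_perimeter("/orders", "public"))         # True
print(may_cross_perimeter("/customers/pii", "public"))  # False
```

A real gateway policy would of course also cover authentication, rate limits, and message inspection; the point here is only that this enforcement sits at the data zone's edge, wherever the rest of the platform runs.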
As Leif concludes in his post Do you need MBaaS to be a Mobile Bad Ass Developer, enterprise-focused APIs benefit less from MBaaS because the backend is too often tied to the enterprise zone.

Despite the advantages of "near" API management, many API providers require high degrees of elasticity, for example to handle seasonal peaks. Public providers are an effective way to accommodate such traffic patterns. Want to have your cake and eat it too? When data can be governed privately and pushed to a public-side cache, API backend management is coordinated at the perimeter of each zone, allowing you to scale across multiple regions.
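The "governed privately, served publicly" pattern can be sketched as a TTL-based cache sitting in the public zone. This is a toy illustration, not any particular product's API; the class and method names are my own:

```python
import time

class EdgeCache:
    """Toy public-zone cache: data is governed in the private zone and
    pushed out with a time-to-live; stale entries force a refresh."""

    def __init__(self):
        self._store = {}

    def push(self, key, value, ttl_seconds):
        # Called from the private zone when governed data is published.
        self._store[key] = (value, time.time() + ttl_seconds)

    def get(self, key):
        # Called in the public zone to serve API traffic during peaks.
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.time() > expires:
            del self._store[key]  # expired: must be re-pushed from private zone
            return None
        return value

cache = EdgeCache()
cache.push("catalog", ["item-1", "item-2"], ttl_seconds=60)
print(cache.get("catalog"))  # ['item-1', 'item-2']
```

The TTL is the knob that trades freshness against how much seasonal burst traffic the public zone can absorb without touching the private backend.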


What about identities?

Identity-related information is particularly sensitive, which often makes it better suited to a private zone. Even in situations where the data returned by APIs is hosted publicly, the authentication of subscribers can continue to involve an on-premise component. Done right, this means your API management infrastructure will need to enable access control that accommodates federation across these zones.


OAuth accommodates this in several ways. One approach is to decouple the two server roles: deploy the OAuth authorization server closer to the source of identity and the OAuth resource server closer to the API data. Another approach is to implement OAuth fully in each zone and delegate authentication across zones using a federated authentication API.
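The decoupled pattern can be illustrated with a deliberately simplified token scheme: the authorization server (near the identities) mints a signed token, and the resource server (near the API data) validates it locally without a cross-zone call at request time. Real deployments would use proper OAuth access tokens; the HMAC format and shared secret below are assumptions made purely to keep the sketch self-contained:

```python
import hashlib
import hmac

# Assumed out-of-band trust established between the two zones.
SHARED_SECRET = b"demo-secret"

def issue_token(secret: bytes, subject: str) -> str:
    """Authorization server role: mint a token bound to a subject."""
    sig = hmac.new(secret, subject.encode(), hashlib.sha256).hexdigest()
    return f"{subject}.{sig}"

def validate_token(secret: bytes, token: str) -> bool:
    """Resource server role: verify the signature locally, in its own zone."""
    subject, _, sig = token.rpartition(".")
    expected = hmac.new(secret, subject.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = issue_token(SHARED_SECRET, "alice")
print(validate_token(SHARED_SECRET, token))        # True
print(validate_token(SHARED_SECRET, "alice.bad"))  # False
```

The design point is where the trust lives: only the signing material crosses the zone boundary at setup time, so per-request validation stays near the API data.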


The identities on whose behalf applications consume your API may also be provided by a third party. Trends like social login and standards like OpenID Connect enable this federated authentication not only to span zones but also to integrate with social identity providers, enabling a more social user experience. When building out your API management infrastructure, be an OAuth hero, not a security zero.
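With OpenID Connect, the identity provider hands back a JWT-format ID token whose claims (issuer, subject, etc.) your infrastructure consumes. A minimal sketch of reading those claims, using only the standard library; note that it deliberately skips signature verification, which a real deployment must perform against the provider's published keys:

```python
import base64
import json

def decode_id_token_claims(id_token: str) -> dict:
    """Extract the claims from a JWT-format ID token.
    WARNING: signature verification is omitted in this sketch."""
    payload_b64 = id_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a sample token for illustration (hypothetical issuer and subject).
claims_in = {"iss": "https://idp.example", "sub": "alice"}
payload = base64.urlsafe_b64encode(
    json.dumps(claims_in).encode()).rstrip(b"=").decode()
sample_token = "eyJhbGciOiJub25lIn0." + payload + ".sig"

print(decode_id_token_claims(sample_token)["sub"])  # alice
```

The `iss` claim is what lets a gateway decide which federated identity provider vouched for the user, whether that provider is a social network or an on-premise directory.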

Which ecosystem?

Creating visibility for an API by joining an API ecosystem can also be a motivating factor in selecting an API management platform. I would argue that the internet is the ecosystem, and that maintaining ownership of your APIs and their infrastructure does not preclude you from reaching your target developer audience. An API marketplace may help provide the visibility you are looking for, but the complete API management infrastructure will still have touch points in multiple zones, whether public or private.

In the end, there is no one-size-fits-all API management deployment model, and many considerations are relevant to its design. This post does not claim to be an exhaustive list of such considerations; I touched on other obvious ones, such as security and cost, in the first and second parts of this blog post. I will also describe this hybrid model in more detail in my upcoming presentation at the Cloud Security Alliance Congress, titled Seasonal Burst Handling Using Hybrid Cloud Infrastructure.


More Stories By Francois Lascelles

As Layer 7’s Chief Architect, Francois Lascelles guides the solutions architecture team and aligns product evolution with field trends. Francois joined Layer 7 in the company’s infancy – contributing as the first developer and designing the foundation of Layer 7’s Gateway technology. Now in a field-facing role, Francois helps enterprise architects apply the latest standards and patterns. Francois is a regular blogger and speaker and is also co-author of Service-Oriented Infrastructure: On-Premise and in the Cloud, published by Prentice Hall. Francois holds a Bachelor of Engineering degree from Ecole Polytechnique de Montreal and a black belt in OAuth. Follow Francois on Twitter: @flascelles
