
External Authentication and FIPS Compliance with Hybrid Data Pipeline

New security enhancements to Hybrid Data Pipeline include external authentication over OAuth, LDAP, Okta and more, plus FIPS support for federal compliance.

Hybrid Data Pipeline, the data access service from Progress DataDirect, recently released several new features to meet market demand and stay current in data services. Security requirements are stricter than ever, and Hybrid Data Pipeline continues to keep pace with them.

What is Hybrid Data Pipeline?

Hybrid Data Pipeline is a lightweight, embeddable data access service that simplifies integration by connecting directly to the data. Applications can then use SQL or OData for real-time access to on-premises and cloud data, sparing developers from building ETL jobs. Connecting directly to the data in real time is more agile than standing up a middle tier, and it suits many use cases better than ETL.
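To make the SQL/OData duality concrete, here is a minimal sketch of building an OData v4 query that corresponds to a simple SQL statement. The service root URL and entity name are hypothetical placeholders, not a real Hybrid Data Pipeline endpoint; a real deployment exposes its own service root.

```python
from urllib.parse import urlencode

def odata_query(base_url, entity, filter_expr=None, select=None, top=None):
    """Build an OData v4 query URL for a REST-exposed data source.

    base_url and entity are placeholders for illustration only.
    """
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr
    if select:
        params["$select"] = ",".join(select)
    if top is not None:
        params["$top"] = str(top)
    # Keep OData's $, quote, and comma characters readable in the URL.
    query = urlencode(params, safe="$',")
    return f"{base_url}/{entity}" + (f"?{query}" if query else "")

# Roughly equivalent to:
#   SELECT name, city FROM Customers WHERE country = 'US' LIMIT 10
url = odata_query(
    "https://hdp.example.com/api/odata4/mydatasource",  # hypothetical service root
    "Customers",
    filter_expr="country eq 'US'",
    select=["name", "city"],
    top=10,
)
print(url)
```

The point of the sketch is that the same predicate logic an application would express in SQL maps one-to-one onto OData system query options, so either interface can front the same data source.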

Hybrid Data Pipeline

What's New in Hybrid Data Pipeline?

  • External Authentication Support: In addition to its internal authentication, Hybrid Data Pipeline now supports external authentication methods, such as LDAP, OAuth and Okta, via a Java plugin. External authentication lets administrators call their existing authentication systems through APIs for an added layer of security, and users can write Java code to handle authentication in whatever way best fits their environment. This also adds flexibility: administrators can map multiple externally authenticated users to a single Hybrid Data Pipeline user to control data source access more easily.
  • FIPS Compliance: Hybrid Data Pipeline Server now provides a configuration in which it runs in FIPS 140-2 compliant mode. FIPS 140-2, a Federal Information Processing Standard, defines cryptographic security requirements for both hardware and software. Why is FIPS important? Compliance means the software has met the security standards required for deployment by U.S. federal agencies and federal contractors. FIPS is also an established industry-wide security standard, accredited by both the U.S. and Canadian governments.
  • FedRAMP Account Lockout Policy: Hybrid Data Pipeline supports an account lockout policy that limits the number of consecutive failed authentication attempts permitted before a user account is locked. The user cannot authenticate until a configurable period of time has passed or an administrator unlocks the account. The policy is enabled by default in accordance with Federal Risk and Authorization Management Program (FedRAMP) low- and moderate-impact guidelines. Together, the account lockout policy and FIPS compliance make Hybrid Data Pipeline a strong fit for federal customers.
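The lockout behavior described above can be sketched as a small policy object. The thresholds used here (three attempts, a 30-minute window) are illustrative examples only, not Hybrid Data Pipeline's actual defaults.

```python
import time

class AccountLockoutPolicy:
    """Illustrative lockout policy: lock an account after N consecutive
    failed authentications, for a configurable duration or until an
    administrator unlocks it. Thresholds here are examples, not real defaults."""

    def __init__(self, max_attempts=3, lockout_seconds=30 * 60):
        self.max_attempts = max_attempts
        self.lockout_seconds = lockout_seconds
        self._failures = {}   # user -> consecutive failure count
        self._locked_at = {}  # user -> time the lock was applied

    def is_locked(self, user, now=None):
        now = time.time() if now is None else now
        locked = self._locked_at.get(user)
        if locked is None:
            return False
        if now - locked >= self.lockout_seconds:
            self.unlock(user)  # configurable lockout window has elapsed
            return False
        return True

    def record_failure(self, user, now=None):
        now = time.time() if now is None else now
        self._failures[user] = self._failures.get(user, 0) + 1
        if self._failures[user] >= self.max_attempts:
            self._locked_at[user] = now

    def record_success(self, user):
        # Only *consecutive* failures count, so a success resets the tally.
        self._failures.pop(user, None)

    def unlock(self, user):
        """Administrator override, or automatic expiry."""
        self._failures.pop(user, None)
        self._locked_at.pop(user, None)

policy = AccountLockoutPolicy(max_attempts=3, lockout_seconds=1800)
for _ in range(3):
    policy.record_failure("alice", now=1000.0)
print(policy.is_locked("alice", now=1001.0))   # True: three consecutive failures
print(policy.is_locked("alice", now=2801.0))   # False: lockout window elapsed
```

The two unlock paths (time-based expiry and administrator override) mirror the behavior the bullet describes: the account stays locked until a configurable period passes or an administrator intervenes.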

Security Policy

Progress DataDirect is committed to providing secure data access to its customers. Upon identification of any security vulnerability that would impact one or more Progress product(s), Progress will exercise commercially reasonable efforts to address the vulnerability in accordance with the following guidelines:

Security Vulnerability Response Policy

Each entry below gives the priority*, the time guideline for addressing the vulnerability, and the version(s) covered:

  • High Risk (CVSS 8+ or industry equivalent): 30 days; active (i.e. latest shipping) version and all supported versions
  • Medium Risk (CVSS 5-to-8 or industry equivalent): 180 days; active (i.e. latest shipping) version
  • Low Risk (CVSS 0-to-5 or industry equivalent): next major release or best effort; active (i.e. latest shipping) version

* Priority is established based on the current version of the Common Vulnerability Scoring System (CVSS), an open industry standard for assessing the severity of computer system security vulnerabilities. For additional information, see the official CVSS documentation.
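Read as code, the policy above is a severity-to-deadline mapping. Since the published ranges share their endpoints at 5 and 8, this sketch assigns boundary scores to the higher-priority tier; that tie-breaking choice is an assumption, not something the policy states.

```python
def response_guideline(cvss_score):
    """Map a CVSS base score to the response-time guideline in the
    policy table. Boundary scores (5.0, 8.0) go to the higher-priority
    tier here; the published ranges leave that ambiguous."""
    if not 0.0 <= cvss_score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if cvss_score >= 8.0:
        return "High Risk: 30 days (active and all supported versions)"
    if cvss_score >= 5.0:
        return "Medium Risk: 180 days (active version)"
    return "Low Risk: next major release or best effort (active version)"

print(response_guideline(9.8))  # High Risk: 30 days (active and all supported versions)
```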

How are Companies Using Hybrid Data Pipeline?

Progress partners are using Hybrid Data Pipeline to access data in the cloud or on-premises behind a firewall. In one example, a partner exposes standard SQL and REST interfaces over multiple data sources; the new release lets them leverage their existing LDAP security while continuing to access data in many sources. Another partner, a financial company, has strict data governance requirements managed via OAuth.

Support for external authentication in the latest Hybrid Data Pipeline release enables both of those companies to access their data with minimal security effort while remaining compliant with federal standards.

Learn More

To learn more about the latest innovations in enterprise security, join our webinar on Enterprise Security in Data Access, or get started with Hybrid Data Pipeline today.


More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
