
Throttling Database Using Rate Limits for SQL or REST

Need to utilize rate limiting to improve the quality of service of your database? Our new database-agnostic Rate Limits API makes it easy.

When you plan to expose your database to new users or tenants, one of the most important areas to consider is resource governance. In production, there is a high probability that complex live queries for data visualization or map reduce jobs will hit your analytical database and degrade performance for other users. As with any web application, you start to scale by running a load balancer in front of your servers to distribute requests efficiently. But in a production environment you will often come across a bad user who affects your quality of service (QoS). To give you an idea of how a bad user can affect your service, here are a couple of abusive scenarios:

  • A naïve developer who hogs all the resources with an inefficiently written client request
  • A low priority user who hogs resources, causing service outages for high priority users
  • A malicious user who attacks your API endpoints, causing a denial of service for all other users

It is not pragmatic to scale your system to accommodate genuine requests every time QoS drops because of such abusive behavior. Rate limiting is one technique for dealing with this. Essentially, rate limiting caps the number of requests, or the amount of data, that a caller can request within an interval of time. It is an effective way to mitigate the abusive scenarios discussed above, and you will find rate limits on almost all the SQL and REST APIs that you would want to interact with.
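To make the idea concrete, here is a minimal fixed-window rate limiter sketched in Python. It is purely illustrative and not part of Hybrid Data Pipeline; production systems typically use token buckets or sliding windows backed by a shared store.

import time
from collections import defaultdict

class FixedWindowRateLimiter:
    """Allow at most max_requests per window_seconds for each caller."""

    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        # caller id -> [request count, window start time]
        self.counters = defaultdict(lambda: [0, 0.0])

    def allow(self, caller_id):
        count, window_start = self.counters[caller_id]
        now = time.time()
        if now - window_start >= self.window_seconds:
            # The previous window has expired; start a new one.
            self.counters[caller_id] = [1, now]
            return True
        if count < self.max_requests:
            self.counters[caller_id][0] += 1
            return True
        return False  # over the limit; throttle this caller

limiter = FixedWindowRateLimiter(max_requests=100, window_seconds=60)
print(limiter.allow("tenant-42"))  # True until tenant-42 exceeds 100 requests per minute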

Database Agnostic Throttling

At Progress, we are really interested in open analytics strategies that leverage Data API Management to expose hosted databases to other users, customers and tenants for self-service analytics. We engineered DataDirect Cloud (a hosted service) and Hybrid Data Pipeline (a self-hosted service) to provision Data APIs on top of 30+ different data sources, including Oracle, SQL Server, Hadoop Hive, Postgres, MySQL, IBM DB2, Sybase, Greenplum and more—without having to configure the database itself. In this blog, I will be going through the rate limiting features for SQL or REST APIs generated through Hybrid Data Pipeline to protect your hosted database resources.

Introducing Rate Limits API

With the recent release of Hybrid Data Pipeline, admins can now throttle Data APIs (ODBC, JDBC or OData) with fine granularity to improve the overall QoS. With the Rate Limits API, you can configure the following parameters:

  1. MaxFetchRows: Maximum number of rows that can be fetched per query
  2. PasswordLockoutInterval: The duration, in seconds, for counting the number of consecutive failed authentication attempts
  3. PasswordLockoutLimit: The number of consecutive failed authentication attempts that are allowed before locking the user account
  4. PasswordLockoutPeriod: The duration, in seconds, for which a user account will not be allowed to authenticate to the system when the PasswordLockoutLimit is reached
  5. CORSBehavior: Controls cross-origin resource sharing (CORS) behavior. Setting the value to 0 disables the CORS filter; setting it to 1 enables the CORS filter.

You can configure these parameters, wherever they apply, at any of the three levels of granularity that Hybrid Data Pipeline offers for rate limiting:

  1. System Level
  2. User Level
  3. Data Source Level

You can set all the parameters defined above at the System level, but only the MaxFetchRows parameter at the User and Data Source levels.

System Level Rate Limit

System level limits apply to the Hybrid Data Pipeline service as a whole. At this level, you can configure the following parameters:

  1. MaxFetchRows (id = 1)
  2. PasswordLockoutInterval (id = 2)
  3. PasswordLockoutLimit (id = 3)
  4. PasswordLockoutPeriod (id = 4)
  5. CORSBehavior (id = 5)

Let’s look at the Rate Limits API. To fetch all the system level limits, send a GET request as shown below:

GET

http://<server>:<port>/api/admin/limits/system

RESPONSE

{
    "limits": [
        {
            "value": 1000,
            "id": 1
        },
        {
            "value": 100,
            "id": 2
        },
        {
            "value": 10,
            "id": 3
        },
        {
            "value": 100,
            "id": 4
        },
        {
            "value": 0,
            "id": 5
        }
    ]
}
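If you prefer to script this, here is a minimal sketch using Python and the requests library. It is an illustration rather than part of the product: the host and port are placeholders, and the HTTP basic authentication with a hypothetical admin account is an assumption, so adjust both to match your deployment.

import requests

BASE_URL = "http://<server>:<port>/api/admin"   # replace with your Hybrid Data Pipeline host and port
AUTH = ("d2cadmin", "password")                 # hypothetical admin credentials (assumption)

# Map the numeric limit ids in the response to the parameter names listed above.
LIMIT_NAMES = {
    1: "MaxFetchRows",
    2: "PasswordLockoutInterval",
    3: "PasswordLockoutLimit",
    4: "PasswordLockoutPeriod",
    5: "CORSBehavior",
}

resp = requests.get(f"{BASE_URL}/limits/system", auth=AUTH)
resp.raise_for_status()
for limit in resp.json()["limits"]:
    print(f'{LIMIT_NAMES.get(limit["id"], limit["id"])}: {limit["value"]}')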

To set a limit, you can send a POST request as shown below:

POST

http://<server>:<port>/api/admin/limits/system/<limit-id>

BODY

{
    "value": 100
}

 

RESPONSE

{
    "value": 100
}

 

As simple as that, an admin can set limits for all the above parameters at the System level. You can also update and delete the system limits. Visit the documentation here to learn how.
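Setting a limit is just as easy to script. The sketch below, under the same assumptions as before (placeholder host and port, hypothetical admin credentials over basic authentication), sets the system-wide MaxFetchRows limit, which has limit id 1.

import requests

BASE_URL = "http://<server>:<port>/api/admin"   # replace with your Hybrid Data Pipeline host and port
AUTH = ("d2cadmin", "password")                 # hypothetical admin credentials (assumption)

# Limit id 1 is MaxFetchRows; cap every query at 1000 rows across the whole service.
resp = requests.post(f"{BASE_URL}/limits/system/1", json={"value": 1000}, auth=AUTH)
resp.raise_for_status()
print(resp.json())   # expected to echo the new value, e.g. {"value": 1000}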

User Level Limits

At the user level, you can set a limit only on the number of rows that can be fetched by a single query (MaxFetchRows). To fetch all the users and the max row limit for each of them, execute a GET request as shown below:

GET

http://<server>:<port>/api/admin/limits/users

RESPONSE

{
    "userLimits": [
        {
            "limits": [
                {
                    "value": 10000000,
                    "id": 1
                }
            ],
            "userId": 1,
            "userName": "d2cadmin"
        },
        {
            "limits": [
                {
                    "value": 100,
                    "id": 1
                }
            ],
            "userId": 2,
            "userName": "d2cuser"
        }
    ]
}

To set a limit for a user, you can send a POST request as shown below:              

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/<limit-id>

BODY

{
    "value": 100
}

RESPONSE

{
    "value": 100
}

 

You can also update and delete the limits later. To learn how to do that, visit this documentation page.
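To tie the user level endpoints together, here is another hedged sketch in the same style (placeholder host and port, hypothetical admin credentials): it lists every user with their current MaxFetchRows limit and then caps one user, using the userId and limit id values from the sample responses above.

import requests

BASE_URL = "http://<server>:<port>/api/admin"   # replace with your Hybrid Data Pipeline host and port
AUTH = ("d2cadmin", "password")                 # hypothetical admin credentials (assumption)

# List every user and the MaxFetchRows limit (limit id 1) currently applied to them.
users = requests.get(f"{BASE_URL}/limits/users", auth=AUTH).json()["userLimits"]
for user in users:
    rows = next((l["value"] for l in user["limits"] if l["id"] == 1), "not set")
    print(f'{user["userName"]} (userId={user["userId"]}): MaxFetchRows = {rows}')

# Cap d2cuser (userId 2 in the sample response) at 100 rows per query.
resp = requests.post(f"{BASE_URL}/limits/users/2/1", json={"value": 100}, auth=AUTH)
resp.raise_for_status()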

Data Source Level Limits

Each user can have multiple data sources defined in their account, and you can set the limit for each data source individually. To get all the data sources and their limits for a user, execute a GET request as shown below.

GET

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources

RESPONSE

{
    "datasourceLimits": [
        {
            "limits": [],
            "dataSourceId": 1,
            "dataSourceName": "SQLServer",
            "isGroup": false
        }
    ]
}

To set a limit for a data source under a user, you can send a POST request as shown below.

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources/<datasource-id>/<limit-id>

BODY

{
    "value": 100
}

 

RESPONSE

{
    "value": 100
}

As with user level limits, you can only set the MaxFetchRows limit at the data source level. You can also update and delete the limits later. To learn how to do that, visit this documentation page.
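The data source level follows the same pattern. The sketch below, again assuming a placeholder host and port and hypothetical admin credentials, lists one user's data sources and then sets MaxFetchRows (limit id 1) for the first one.

import requests

BASE_URL = "http://<server>:<port>/api/admin"   # replace with your Hybrid Data Pipeline host and port
AUTH = ("d2cadmin", "password")                 # hypothetical admin credentials (assumption)
USER_ID = 2                                     # hypothetical user id, taken from the sample response above

# List the user's data sources and any limits already set on them.
sources = requests.get(f"{BASE_URL}/limits/users/{USER_ID}/datasources", auth=AUTH).json()
for ds in sources["datasourceLimits"]:
    print(ds["dataSourceId"], ds["dataSourceName"], ds["limits"])

# Cap the first data source at 100 rows per query (limit id 1 = MaxFetchRows).
ds_id = sources["datasourceLimits"][0]["dataSourceId"]
resp = requests.post(
    f"{BASE_URL}/limits/users/{USER_ID}/datasources/{ds_id}/1",
    json={"value": 100},
    auth=AUTH,
)
resp.raise_for_status()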

Data API Management

We hope this article gave you a glimpse into how Hybrid Data Pipeline can throttle the Data APIs that you produce. In addition, you can now easily configure a load balancer for a cluster of Hybrid Data Pipeline servers, helping you handle and distribute requests properly and improving the QoS for the ODBC, JDBC and OData APIs that you generate with Hybrid Data Pipeline.

Want to learn more about Hybrid Data Pipeline features for Data APIs? Just click the link below.

Learn More about Hybrid Data Pipeline
