
Understanding Straight-Through Processing - A technical overview

Straight-Through Processing (STP) is a term associated with workflow and business process management technologies. STP is the automation of a process flow, from invocation to execution. STP allows the seamless and accurate transactional exchange of information electronically. It can involve many different information stores and parties within the process flow, and it reduces the need for human intervention.

STP is most often associated with the financial services and securities industry, where its use became prevalent after the Securities and Exchange Commission adopted Rule 15c6-1, which shortened the settlement time for trades from five days (T+5) to three days (T+3). Previously it was possible to settle all trades manually, but the shortened cycle, coupled with increasing volumes, meant that automation was the only realistic route to T+3 trade settlement. This ultimately involves integrating automated processes such as order management, trade capture, confirmation, confirmation matching, settlement, general ledger accounting, cash and asset reconciliation, and transaction and account reporting.

Business processes and STP are inextricably linked; you won't often find a reference to one without the other. This is because the first stage in automating a "business transaction" (not to be confused with the ACID transactions of which a business transaction is composed) is building a consistent platform for disparate processes that reside in different systems. BPM technology provides this consistency, sitting above an organization's existing internal systems and applications.

STP is increasingly being used in other industries and applications where there is a need to automate flows and orchestrate services. This has resulted in a number of different approaches, patterns, terms, and technologies that I will explore in this article.

STP Implementations
Service-based architectures and Web services technologies have had a big impact on how and why organizations want to construct automated process flows, and also on the technology used to implement them.

Many organizations have utilized such architectures to build granular services that interact with their legacy back-office systems. The next step is to orchestrate these granular services to provide a higher-level business service. This can and has been done using code, but it can prove to be inflexible and difficult to change and maintain. Organizations are increasingly looking for more productive ways to orchestrate and automate flows, which provides them with greater flexibility and productivity, and often involves a visual design tool and/or code generation.
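To illustrate why hand-coded orchestration proves brittle, here is a minimal Java sketch; the service names are hypothetical stand-ins for calls to granular back-office services. The sequencing logic is welded into the code, so any change to the flow means a code change, rebuild, and redeploy.

```java
// Illustrative only: hypothetical service names, not a real trade-settlement API.
public class TradeSettlementOrchestrator {

    // Each method stands in for a call to a granular back-office service.
    static String captureTrade(String orderId)  { return orderId + ":captured"; }
    static String confirmTrade(String captured) { return captured + ":confirmed"; }
    static String settleTrade(String confirmed) { return confirmed + ":settled"; }

    // The orchestration itself: a fixed sequence with no visual model or
    // metadata, so reordering or inserting a step means editing this method.
    public static String process(String orderId) {
        String captured  = captureTrade(orderId);
        String confirmed = confirmTrade(captured);
        return settleTrade(confirmed);
    }

    public static void main(String[] args) {
        System.out.println(process("ORD-1"));
    }
}
```

Visual process tooling replaces exactly this kind of hard-wired sequencing with a model that can be changed without touching the service code.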

Traditionally, business process technology has been associated with long-lived, asynchronous flows that often involve human interaction and multiple ACID transactions. Process templates are defined at design time in a visual user interface, and the process meta-information is persisted using a markup language. An example of this is the Flow Definition Markup Language (FDML). Other notations center around process vendors' support for proposed standards, such as WS-BPEL and BPML. An overview of these standards is outside the scope of this article but is offered in my earlier article, "Process-Driven Architectures for WebSphere Application Server" [WJ, Vol. 2, issue 1].

When deployed, the process template is persisted to a database that the process engine uses to store state and process information. The state of the running process is persisted at each step by the process engine to ensure reliability and failover, and also as a mechanism to deal with long-running processes, which may take days, months, or in extreme cases, years.
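This per-step checkpointing can be sketched as follows; this is a minimal illustration, with a simple in-memory map standing in for the engine's state database.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of how an interruptible process engine might checkpoint state: after
// every step the process state is written to a store (a database in a real
// engine; a Map here) so the flow can resume after a failure or a long wait.
public class CheckpointingEngine {

    static final Map<String, String> stateStore = new HashMap<>(); // stands in for the state DB

    public static String run(String processId, List<String> steps) {
        String state = "started";
        stateStore.put(processId, state);      // persist initial state
        for (String step : steps) {
            state = state + ">" + step;        // "execute" the step
            stateStore.put(processId, state);  // persist after every step - the I/O cost
        }
        return state;
    }

    public static void main(String[] args) {
        System.out.println(run("loan-42", List.of("apply", "assess", "underwrite")));
    }
}
```

The write after every step is what makes the flow recoverable, and it is also the overhead that synchronous STP flows set out to avoid.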

Figure 1 shows this type of long-lived process, using as an example a bank loan application in which three different parties - the customer, the bank agent, and the insurance underwriter - interact over an indefinite period of time. The completion of each step moves the process flow on to the next, and there may be rules and timers at each step.


This model does not work for STP flows, however, which are often synchronous and may need to be executed in real time. Process vendors such as Versata have tackled this by providing STP activities in which the whole flow is synchronous and can execute within the same transaction, while state is only persisted at the beginning and end of the process.

Figure 2 shows this type of STP process using an STP process aimed at preventing money-laundering as an example. In this example the synchronous STP process is invoked from a session bean, which submits a payload that is used to check client details, check against sanction lists (hosted as a Web service), run profile rules on the data submitted, log noncompliance, and add details to a user's worklist for further investigation, before returning a response as to success or failure.
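The flow just described might be sketched as follows. This is a hedged illustration only: all names and rules are hypothetical, and the sanctions Web service is stubbed out with an in-memory set. Every step runs synchronously in a single thread, and the caller (the session bean in this scenario) gets an immediate pass/fail response.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical synchronous anti-money-laundering STP flow.
public class AmlStpFlow {

    static final Set<String> SANCTIONS = Set.of("ACME-SHELL-CO"); // stands in for the sanctions Web service
    static final List<String> worklist = new ArrayList<>();       // stands in for the user's worklist

    static boolean clientDetailsValid(String client) { return client != null && !client.isEmpty(); }
    static boolean onSanctionsList(String client)    { return SANCTIONS.contains(client); }
    static boolean profileRulesPass(long amount)     { return amount < 10_000; } // illustrative rule

    // All steps execute in one thread; in the article's scenario they would
    // also share a single transaction, with no state persisted between steps.
    public static boolean process(String client, long amount) {
        if (!clientDetailsValid(client)) return false;
        if (onSanctionsList(client) || !profileRulesPass(amount)) {
            worklist.add(client + ":investigate"); // log noncompliance, queue for investigation
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(process("GoodTrader Ltd", 5_000)); // compliant payload
        System.out.println(process("ACME-SHELL-CO", 5_000));  // sanctioned party
    }
}
```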


This STP process has been constructed visually, interacts with prebuilt services, is synchronous, executes in a single thread and a single transaction, and is far quicker than a traditional BPM process due to the lack of persistent I/O at each step. This process can also be used as a subprocess within a higher-level asynchronous process, enabling the design of business flows that accurately reflect synchronous or asynchronous requirements.

Consider a second scenario: a company that validates credit card transactions on behalf of its merchant customers. Under the SLA with those customers, the response to each validation request must arrive within three seconds. Validation is currently handled by a legacy framework the company built itself, which takes advantage of neither industry-standard middleware nor modern programming languages. Consequently, it is becoming difficult to maintain and difficult to scale as more clients are signed up.

This company is currently re-architecting its solution around J2EE middleware and also wants to extract the process steps from the code to gain the flexibility, productivity, and maintenance benefits discussed earlier. The company has no requirement for persistence other than logging to validate service response: if a card is not validated within three seconds, it is simply swiped again. All interactions with the services in the process flow are read-only interrogations of data, based on the input data, with routing rules applied to the data returned. The system must initially be able to handle 30 of these end-to-end business transaction flows per second and must be able to scale effectively.

This scenario is difficult to fulfill for a traditional process vendor, as process tooling was never designed to deal with these types of flow scenarios, which are increasingly being referred to by a variety of terms - STP flows, microflows, and uninterruptible processes.

I will first show how you could go about achieving this with traditional process products, and then look at other tools that you can use to construct generated flows that do not reside in a process state engine.

As discussed earlier, most process products persist their state to an underlying database. Even when using STP, the recording of initial and end state persistence means an I/O against the underlying database, which can add substantial overhead to an end-to-end flow in which the state engine database becomes the bottleneck. In our scenario we are not concerned with any recording of state, as in the event of a failure the process transaction is just restarted when the credit card is swiped again. Neither are we concerned with transaction management, as there are no transactions within the flow since all database activities are read-only.

Embedded databases provide a good solution when used with traditional process products to enhance STP performance and enable rapid execution of STP flows:
1.  They can often be run in embedded mode as well as mixed mode.
2.  There is no remote connection overhead in embedded mode.
3.  They have an array of parameters to allow the tweaking of elements such as persistent file size and frequency of any disk I/O.
4.  When run purely in memory, they are extremely fast, as execution avoids disk I/O.

I recommend HSQLDB (http://hsqldb.sourceforge.net) as an open source embedded database. It is a good option for developers wishing to explore further the use of an embedded database for STP processes.
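For reference, HSQLDB selects its execution mode through the JDBC connection URL; the forms below are illustrative (the database name stpstate is hypothetical):

```
jdbc:hsqldb:mem:stpstate               # in-process, memory-only: no disk I/O at all
jdbc:hsqldb:file:/data/stpstate        # in-process, file-backed: embedded, no remote connection overhead
jdbc:hsqldb:hsql://localhost/stpstate  # client/server mode: a normal remote JDBC connection
```

Pointing a process engine's state datasource at an in-process URL is one way to cut the per-step persistence cost described above.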

Construction of code-generated microflows is another way to approach STP flow requirements. WebSphere Studio Application Developer Integration Edition (v5) contains a plug-in that supports development with WebSphere's Process Choreographer. A business process container is configured for every WebSphere Application Server that has the Process Choreographer installed. The Process Choreographer contains support for what are referred to as interruptible and uninterruptible flows.

Interruptible flows are asynchronous. They execute in separate transactions and their state is stored in a database. Uninterruptible flows are visually described, resulting in generated code, with the meta-information of the process residing as FDML. These uninterruptible flows execute quickly either outside of a transaction or as a single transaction, and because each flow's steps execute in a single thread, they share the same context.

Although Process Choreographer uninterruptible processes do not require a state database, they do require the business process container that the Process Choreographer provides, as they make use of the choreographer engine.

Uninterruptible flow activities, or steps, can make use of snippets of Java code, as well as invoke Web services and make calls to EJBs. The complete process can be exposed through a session bean facade or, alternatively, as a message-driven bean, and can then be packaged as a SOAP service bound to an HTTP transport if required. Using the Web Services Invocation Framework, you can also bind the session bean to an RMI or JMS transport at runtime.
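A rough sketch of the uninterruptible-flow idea follows; this is not the Process Choreographer API, and all names are hypothetical. Each activity is a small Java snippet operating on a context shared by every step, all running in one thread with no persistence between steps.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical microflow runner: activities share one context in one thread.
public class Microflow {

    interface Activity { void execute(Map<String, Object> context); }

    public static Map<String, Object> run(Map<String, Object> input, Activity... activities) {
        Map<String, Object> context = new HashMap<>(input); // shared by every step
        for (Activity a : activities) {
            a.execute(context); // same thread, same context, no checkpointing between steps
        }
        return context;
    }

    public static void main(String[] args) {
        Map<String, Object> result = run(
            new HashMap<>(Map.of("card", "4111-xxxx")),
            ctx -> ctx.put("validated", true), // e.g., the result of a Web service call
            ctx -> ctx.put("route", Boolean.TRUE.equals(ctx.get("validated")) ? "approve" : "refer"));
        System.out.println(result.get("route"));
    }
}
```

Because nothing is written to a state database mid-flow, a failed run simply disappears and is retried from the start, which is exactly the behavior the card-validation scenario calls for.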

This can all be done using wizards inside WSAD IE, leading to a very productive and performant approach to creating straight-through process flows. This tool and approach would be very applicable to the scenario discussed earlier.

Figure 3 shows an example of an uninterruptible process that is provided with WSAD IE, and the associated Java snippet for the activity I will focus on.


Process Definition for Java - JSR-207
Other initiatives exist that would support the construction of straight-through process flows, one being JSR-207, which is currently progressing through the Java Community Process. It aims to define metadata, interfaces, and a runtime model for creating business processes in the Java/J2EE environment. The proposal uses Java Language Metadata (JSR-175) as a simple notation for describing business processes. The aim is for the metadata to be applied directly to Java source code in order to dynamically generate and bind process behaviors, including synchronous and asynchronous execution and support for straight-through flows. Anyone who has had cause to use XDoclet (http://xdoclet.sourceforge.net) - and to appreciate its simplicity and power - will appreciate this approach.
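To convey the flavor of metadata applied directly to source code, here is a sketch using a purely hypothetical @ProcessStep annotation; JSR-207's actual metadata format had not been finalized at the time of writing, so nothing below should be read as the real notation.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

// Hypothetical example: process metadata attached to Java methods via a
// JSR-175-style annotation, instead of a separate FDML/BPEL document.
public class MetadataDrivenProcess {

    @Retention(RetentionPolicy.RUNTIME)
    @interface ProcessStep { int order(); } // invented for illustration, not part of any JSR

    @ProcessStep(order = 1) public static String validate(String in) { return in + ":validated"; }
    @ProcessStep(order = 2) public static String route(String in)    { return in + ":routed"; }

    // A tool or runtime could discover the flow from the metadata by reflection:
    public static int stepCount() {
        int count = 0;
        for (Method m : MetadataDrivenProcess.class.getDeclaredMethods()) {
            if (m.isAnnotationPresent(ProcessStep.class)) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(stepCount());
    }
}
```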

This is likely to be an important JSR, particularly because the functionality it provides is likely to find its way into the J2EE specification at some point. Further details can be found at www.jcp.org/en/jsr/detail?id=207.

Conclusion
To recap, a number of benefits can be derived from the use of STP within industries other than the financial industry with which it has traditionally been associated. This is especially pertinent to organizations that are embracing the concept of a loosely coupled and/or service-based architecture. These benefits include:

  • Increasing business efficiency through streamlined business processes
  • Reducing cycle times and bottlenecks while improving operational efficiency
  • Reducing manual intervention, and therefore costs
  • Increasing productivity, as process steps are "extracted" from code, leading to easier design, maintenance, and change
  • Increasing business visibility through process dashboards

Straight-through flows can be achieved today through the use of tooling and technology. This provides an effective way to increase business efficiency, visibility, and return on investment in the enterprise.

More Stories By Jim Liddle

Jim is CEO of Storage Made Easy. He has been a regular blogger at SYS-CON.com since 2004, covering mobile, Grid, and cloud computing topics.
