
Adopt complex event processing architecture across hybrid clouds

Find out how to implement CEP and stream processing across hybrid cloud infrastructure. Expert George Lawton discusses.

The rise of big data is pushing many important data sources into a variety of cloud and legacy enterprise applications. Enterprise architects can leverage complex event processing architecture to help bridge these silos in useful ways. At the same time, it is important to consider security, latency and the cost structures around different architecture choices.

"The current state of architecture and tools for complex event processing is undergoing a significant shift," said Bill Platt, senior vice president and chief architect at BMC Software Inc., based in Houston. "Due to the emergence of more useful cloud-based primitives and the cost savings most companies can achieve in the cloud, new and exciting ways of event processing are beginning to show. For the last five years, cloud-native programmers and users have been creating workloads with a pattern similar to CEP."

In essence, modern CEP apps are architected such that requests are sent to REST APIs and then queued up for worker fleets to process. The results move along a process path to more queues and more workers until the entire sophisticated process or workflow is completed. These cloud workloads are not completely different from classic Java-based database transactions for enterprise apps, but the tools and environment have shifted.
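A minimal sketch of that queue-and-worker pattern is shown below, using plain java.util.concurrent queues as stand-ins for the managed queues a cloud deployment would use (such as SQS or Kinesis). The class, stage and event names are hypothetical, and the code is illustrative rather than a production design.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch: each worker pulls from an inbound queue, performs one step of the
// workflow and pushes the result to the next queue. In the cloud, the queues
// would be managed services and the workers an autoscaled fleet.
public class QueueWorkerSketch {

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> requests = new LinkedBlockingQueue<>();
        BlockingQueue<String> enriched = new LinkedBlockingQueue<>();

        // Stage 1: validate and enrich the raw request.
        Thread stageOne = new Thread(() -> {
            try {
                while (true) {
                    String request = requests.take();
                    enriched.put(request.trim().toUpperCase());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Stage 2: final step of the workflow; only this stage would commit
        // the result to durable storage.
        Thread stageTwo = new Thread(() -> {
            try {
                while (true) {
                    System.out.println("Completed event: " + enriched.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        stageOne.start();
        stageTwo.start();
        requests.put("order-created 42"); // stand-in for a request arriving at a REST API
    }
}
```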

Ephemeralize CEP resources

"Within the last three years, mobile and [Internet of Things] workloads have grown to be the largest percentage of all transactions, yet their nature is of short-lived, individual or isolated transactions," Platt said. This has given rise to primitives, such as Amazon Web Services Lambda, and event streaming tools, like Kafka and AWS Kinesis. It's also led to a greater use of in-memory databases to monitor and process these short bursts, fast and less expensively. The results of these processing streams are newly defined workloads that are flexible and redefine the business process weekly, daily or even hourly.

These ephemeral resources are helpful for complex event processing architecture and business process management (BPM) in general. Instead of the business process depending on expensive and difficult-to-maintain RDBMS transactions, transactions are broken into microtransactions that can be unwound quickly or combined differently, if needed. Only the end of the workload automation stream needs to be committed to complex data storage.

Use process and virtual private clouds to protect data

Shawne Robinson, director of product marketing for Pega Cloud at Pegasystems Inc., based in Cambridge, Mass., said organizations need to establish processes to ensure the data being integrated is clean, rather than full of bad or malformed data. Dirty data can wreak havoc on computing resources and render any conclusions meaningless.
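One simple way to enforce that kind of hygiene is to screen events before they enter the pipeline. The sketch below, with a hypothetical Event record and field names, drops records that are missing an identifier or carry an unusable value; it is an illustration of the idea, not any vendor's implementation.

```java
import java.util.List;
import java.util.stream.Collectors;

// Pre-ingestion check: discard events that are malformed or missing required
// fields before they reach the event processing pipeline.
public class EventValidator {

    public record Event(String id, String source, Double value) {}

    // Keep only events with an id, a known source and a usable numeric value.
    public static List<Event> clean(List<Event> incoming) {
        return incoming.stream()
                .filter(e -> e.id() != null && !e.id().isBlank())
                .filter(e -> e.source() != null)
                .filter(e -> e.value() != null && !e.value().isNaN())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Event> raw = List.of(
                new Event("evt-1", "sensor-a", 12.5),
                new Event("", "sensor-b", 3.0),              // malformed: empty id
                new Event("evt-3", "sensor-c", Double.NaN)); // malformed: bad value
        System.out.println(clean(raw)); // only evt-1 survives
    }
}
```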

The use of a virtual private cloud can also help an enterprise achieve greater control over its cloud environment. This can minimize the network hurdles applications must clear to successfully share data across systems and across an organization's infrastructure -- whether cloud-based or on premises.

Simplify the CEP development process

One of the big challenges of integrating lots of CEP components into BPM systems is that it can lead to a plethora of event processing components. "These microtransaction environments look more complex at first, because they are broken down into subpieces and sometimes across disparate infrastructure [on purpose]," Platt said.

However, if integration is handled continuously, each iteration requires less change, which makes automation more tractable and turns the app development, integration and deployment process into something like a CEP or BPM. This makes it easier for the development team to apply a consistent tool set and deployment process for new features and configurations.

Platt said the key to keeping actual performance latency low and the CEP successful is to instrument each element with both performance monitoring and commit logs. This way, troubleshooting and improvement can occur at a micro level, while still maintaining transaction-level metrics. Continuous discovery of the environment via a discovery tool is another aid to continuous improvement.
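The sketch below illustrates that instrumentation pattern under stated assumptions: each processing step is wrapped so it reports its own latency and appends a commit-log entry tied to a transaction ID. The logger names, step names and transaction IDs are hypothetical.

```java
import java.util.function.Function;
import java.util.logging.Logger;

// Wrap every processing step so it emits a latency metric and a commit-log
// entry, keeping troubleshooting possible at the micro level while preserving
// transaction-level visibility.
public class InstrumentedStep {

    private static final Logger COMMIT_LOG = Logger.getLogger("commit-log");
    private static final Logger METRICS = Logger.getLogger("metrics");

    public static <I, O> O run(String stepName, String transactionId, I input, Function<I, O> step) {
        long start = System.nanoTime();
        O result = step.apply(input);
        long elapsedMicros = (System.nanoTime() - start) / 1_000;
        METRICS.info(stepName + " latency_us=" + elapsedMicros + " txn=" + transactionId);
        COMMIT_LOG.info("txn=" + transactionId + " step=" + stepName + " committed");
        return result;
    }

    public static void main(String[] args) {
        String enriched = run("enrich", "txn-42", "raw-event", s -> s + ":enriched");
        run("publish", "txn-42", enriched, s -> s + ":published");
    }
}
```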

Execute CEP functionality in small batches

As organizations move BPM or CEP applications into the cloud, it is important to practice DevOps with small teams that own each event process and business process, including its development, operations and end-user support.

"This allows enterprises to rapidly eliminate root-cause problems and stay in close touch with the experience of end users, even while we paradoxically create more and more automation to handle the process," Platt said. "Continuous integration, monitoring and discovery are critical, as the conditions in which the process exists will be dynamic in a very organic way."

Leverage events from the public cloud

In many cases, an enterprise might want to bring in events from public cloud services, such as transit data or weather feeds, to improve its ability to generate more complex events. But bringing these into the enterprise can be challenging.

The public cloud is slowly developing an ecosystem of such events, and crossing the boundary into the enterprise requires some forethought. Enterprises can address latency by aggregating public data in the public cloud, close to the events, and then forwarding aggregated events into the private cloud.
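A minimal sketch of that aggregate-then-forward approach: raw public readings are summarized close to where they originate, and only the compact summary crosses into the private cloud. The record types are hypothetical, and the forwarding call is a placeholder for a secured queue or API.

```java
import java.util.List;

// Summarize raw public events (e.g., weather readings) in the public cloud and
// forward only the summary across the boundary into the private cloud.
public class PublicEventAggregator {

    public record Reading(String station, double temperatureC) {}
    public record Summary(int count, double averageTemperatureC) {}

    public static Summary aggregate(List<Reading> readings) {
        double avg = readings.stream().mapToDouble(Reading::temperatureC).average().orElse(0.0);
        return new Summary(readings.size(), avg);
    }

    // Placeholder for the cross-boundary hop into the private cloud.
    public static void forwardToPrivateCloud(Summary summary) {
        System.out.println("Forwarding summary across boundary: " + summary);
    }

    public static void main(String[] args) {
        List<Reading> raw = List.of(
                new Reading("station-1", 14.2),
                new Reading("station-2", 15.8),
                new Reading("station-3", 13.9));
        forwardToPrivateCloud(aggregate(raw));
    }
}
```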

"The key challenge is obviously security. Enterprises need policies in place that define which data can go across the boundary between public and private," said Mark Palmer, senior vice president and general manager of engineering at TIBCO Software Inc., based in Palo Alto, Calif. "Enterprise architects need to be able to model event flow through a lens that differentiates between private events and public events, with appropriate security policy for each."

Focus on flexibility

TXODDS implemented a hybrid cloud CEP service that leverages TIBCO StreamBase for complex event processing architecture.

The platform uses a variety of data gathered from live sporting events and historical data sources to generate odds for sports bookmakers around the world. Alex Kozlenkov, software architect at TXODDS, said some of the analytics are done on historical data at rest. But other data about related bets and events that occur during sporting matches can affect the notices sent out to bookmakers.

Kozlenkov said TXODDS decided to leverage a commercial CEP platform so it could focus on its core business. Providing real-time event data to clients is important for staying competitive. "When a match has started, there are all sorts of price drifting on the bets. We don't just provide raw data, we provide context around how things are, with respect to each other. Bookmakers want to be immediately alerted when things drift by a given percentage."
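The sketch below shows the kind of drift rule Kozlenkov describes: compare a live price against its opening price and raise an alert once the change exceeds a configured percentage. It is an illustration of the rule only, not TXODDS' or StreamBase's actual implementation, and the class and threshold values are hypothetical.

```java
// Alert when a price has drifted from its opening value by more than a
// configured percentage.
public class PriceDriftMonitor {

    private final double thresholdPercent;

    public PriceDriftMonitor(double thresholdPercent) {
        this.thresholdPercent = thresholdPercent;
    }

    // Returns true when the price has drifted by more than the threshold.
    public boolean shouldAlert(double openingPrice, double currentPrice) {
        double driftPercent = Math.abs(currentPrice - openingPrice) / openingPrice * 100.0;
        return driftPercent > thresholdPercent;
    }

    public static void main(String[] args) {
        PriceDriftMonitor monitor = new PriceDriftMonitor(5.0);
        System.out.println(monitor.shouldAlert(2.00, 2.04)); // 2% drift -> false
        System.out.println(monitor.shouldAlert(2.00, 2.30)); // 15% drift -> true
    }
}
```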

TXODDS wanted to implement a hybrid service that spanned data from its internal servers and data stores housed in the cloud. Kozlenkov said it is important to create an architecture that makes it possible to implement new combinations of CEP data securely. By adopting a commercial complex event processing architecture, TXODDS was also able to deliver new data correlations to clients with minimal time and effort.

Next Steps

Understanding the true value of event-driven architecture

Implementing CEP and stream processing with BPM tools

Delays are unacceptable in streaming analytics
