Understanding data in motion

Event stream processing discovers interesting patterns in huge streams of data in motion

by Frédéric Combaneyre, Sr. Business Solutions Manager, SAS Analytics Centre of Excellence

Data pours into the organization from every conceivable direction: from operational and transactional systems; from scanners, sensors and smart meters; from inbound and outbound customer contact points; from mobile media and the Web.

Those streams of data contain a wealth of potentially valuable insight – if you can capture and analyze it.

But how do you manage such a torrent of data? Where would you store it? How long would it take to make sense of it? Traditional approaches, which apply analytics after data is stored, may provide insights too late for many purposes – and most real-time-enabled applications can’t deal with this much constantly flowing data.

Here’s an idea: Analyze it on the fly. Find what’s meaningful, grab only what you need, and get instant insights to react immediately and make the best decisions as data is flowing in.

That’s the promise of event stream processing. Event stream processing continuously analyzes data as it flows into the organization, and then triggers an action based on the information flow. It is a form of complex event processing that empowers you (or an automated system) to spot patterns and make decisions faster than ever.

Three steps for streaming data

Managing data in motion (streaming data) is different from managing data at rest. Event stream processing relies on three principal capabilities – aggregation, correlation and temporal analytics – to deal with data in motion.

  1. Aggregation. Let’s say you wanted to detect gift card fraud: “Tell me when the value of gift card redemptions at any point-of-sale (POS) machine is more than $2,000 in an hour.” Event stream processing can continuously calculate metrics across sliding time windows of moving data to understand real-time trends. This kind of continuous aggregation would be difficult with traditional tools. With the SAS® Event Stream Processing Engine, it’s built in. (A minimal sketch of sliding-window aggregation appears first after this list.)
  2. Correlation. Connect to multiple streams of data in motion and, over a period of time that could be seconds or days, identify that condition A was followed by B, then C. For example, if we connect to streams of gift card redemptions from 1,000 POS terminals, event stream processing could continuously identify conditions that compare POS terminals to each other, such as: “Generate an alert if gift card redemptions in one store are more than 150 percent of the average of other stores.” (This cross-stream comparison is the second sketch after this list.)
  3. Temporal analysis. Event stream processing treats time as a primary computing element, which is critical for scenarios where the rate and momentum of change matter. For example, sudden surges of activity can be clues to potential fraud. Event stream processing could detect such surges as they occur, such as: “If the number of gift card sales and card activations within four hours is greater than the average number of daily activations of that store in the previous week, stop approving activations.” Unlike computing models designed to summarize and roll up historical data, event stream processing asks and answers these questions on data as it changes. (The third sketch after this list shows this kind of surge detection.)
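
To make the first capability concrete, here is a minimal Python sketch of continuous sliding-window aggregation for the gift card scenario above. The event fields, feed, and alerting are illustrative assumptions; this is not the SAS Event Stream Processing Engine API.

    from collections import defaultdict, deque

    WINDOW_SECONDS = 3600         # one-hour sliding window
    THRESHOLD = 2000.00           # alert when redemptions exceed $2,000

    windows = defaultdict(deque)  # per-terminal (timestamp, amount) events
    totals = defaultdict(float)   # running redemption sum per terminal

    def on_redemption(terminal_id, timestamp, amount):
        """Handle one gift card redemption event as it arrives."""
        window = windows[terminal_id]
        window.append((timestamp, amount))
        totals[terminal_id] += amount
        # Evict events that have slid out of the one-hour window.
        while window and window[0][0] <= timestamp - WINDOW_SECONDS:
            _, old_amount = window.popleft()
            totals[terminal_id] -= old_amount
        if totals[terminal_id] > THRESHOLD:
            print(f"ALERT: terminal {terminal_id} redeemed "
                  f"${totals[terminal_id]:,.2f} in the last hour")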

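For the second capability, the sketch below correlates many POS streams by comparing one store’s running redemption total against the average of its peers, as in the 150 percent rule above. The store IDs and update cadence are assumptions for illustration.

    from collections import defaultdict

    redemptions = defaultdict(float)  # running redemption total per store

    def on_store_redemption(store_id, amount):
        """Update one store's total and compare it against its peers."""
        redemptions[store_id] += amount
        others = [v for k, v in redemptions.items() if k != store_id]
        if not others:
            return  # nothing to compare against yet
        peer_average = sum(others) / len(others)
        if peer_average > 0 and redemptions[store_id] > 1.5 * peer_average:
            print(f"ALERT: store {store_id} is at "
                  f"{redemptions[store_id] / peer_average:.0%} of peer average")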

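For the third capability, this sketch applies the surge rule above: it compares the last four hours of activations against the previous week’s average daily activations and halts approvals when a surge appears. The class and field names are hypothetical.

    from collections import deque

    FOUR_HOURS = 4 * 3600  # window length in seconds

    class ActivationMonitor:
        def __init__(self, weekly_activations):
            # weekly_activations: activation counts for the previous 7 days
            self.daily_average = sum(weekly_activations) / 7
            self.recent = deque()  # activation timestamps, last 4 hours
            self.approving = True

        def on_activation(self, timestamp):
            """Count an activation; halt approvals if a surge appears."""
            self.recent.append(timestamp)
            while self.recent and self.recent[0] <= timestamp - FOUR_HOURS:
                self.recent.popleft()
            if len(self.recent) > self.daily_average:
                self.approving = False  # surge: stop approving activations
                print("ALERT: activation surge detected; approvals stopped")
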
These three capabilities set event stream processing apart from other approaches by revealing what’s happening now, not just what happened in the past, so you can take action immediately.

Enrich and empower your analytic applications

Event stream processing really proves its value when it is embedded into analytical applications, such as risk management, fraud detection and prevention, anti-money laundering and customer intelligence. Event stream processing can be used to detect patterns and filter relevant events to send to analytic solutions before data is stored. Or it can detect when the data for a specific independent calculation is available so that the calculation can run immediately, rather than having the analytic solution wait until all data has arrived before starting its most time-intensive work.
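
As one hedged illustration of that second pattern, the Python sketch below fires a calculation the moment all of its inputs have arrived, rather than waiting for a complete batch. The input names and the risk calculation are hypothetical stand-ins.

    REQUIRED_INPUTS = {"positions", "market_prices"}  # hypothetical feeds

    arrived = {}

    def run_risk_calculation(inputs):
        # Stand-in for the time-intensive analytic calculation.
        print("Running calculation with:", sorted(inputs))

    def on_data(name, payload):
        """Collect inputs; fire the calculation as soon as all are present."""
        arrived[name] = payload
        if REQUIRED_INPUTS <= arrived.keys():
            run_risk_calculation(arrived)  # runs early, not after the full batch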

With the ability to process millions of records per second at latencies around a microsecond, the possibilities are limited only by imagination. Here are some idea-starters:

  • When transactions against the same credit card number come from four or more companies within one minute, deny the next request, flag the account and send a message to the fraud detection dashboard.
  • When stock level for the book The Da Vinci Code drops to 10 percent of minimum, given the last 10 hours of buying behavior, trigger the distribution center to begin the restocking process.
  • How many website visitors are going from the home page to About Company and clicking My Profile during a rolling, 10-minute window?
  • If the time between in-store credit card transactions in different cities is less than the travel time between those cities, put the account on hold and flag it for investigation. (This check is sketched after this list.)
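
As a sketch of that last idea, the fragment below flags “impossible travel”: two in-store transactions on one card closer together in time than any plausible trip between their locations. The per-transaction coordinates, the 800 km/h speed bound, and the hold action are assumptions.

    import math

    MAX_SPEED_KMH = 800.0  # generous upper bound on travel speed (air travel)
    last_seen = {}         # card -> (timestamp_hours, latitude, longitude)

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometers."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = p2 - p1
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
        return 6371.0 * 2 * math.asin(math.sqrt(a))

    def on_transaction(card, timestamp_hours, lat, lon):
        """Hold the account if the trip since the last transaction is impossible."""
        if card in last_seen:
            t0, lat0, lon0 = last_seen[card]
            elapsed = timestamp_hours - t0
            needed = haversine_km(lat0, lon0, lat, lon) / MAX_SPEED_KMH
            if elapsed < needed:
                print(f"ALERT: card {card} held for impossible travel")
        last_seen[card] = (timestamp_hours, lat, lon)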

Event stream processing answers such questions while reducing storage requirements, computing demands and time to decision. When you consider the terabytes, petabytes and exabytes flowing in and around the organization, there’s enormous value in being able to quickly find the meaningful nuggets and use them to make better decisions, faster.



Passionate about technology and innovation, Frédéric Combaneyre is an expert on event stream processing solutions. During his 19 years in the software industry, he has covered many domains, from business intelligence to information management. Combaneyre supports customers in multiple industries and speaks on a wide variety of topics at SAS and external conferences.