3 things you need to know about event stream processing
What it is, how it works and where it’s happening
By Alison Bolen, SAS Insights Editor
As more data is generated from our devices, sensors and “things,” more of it will need to be analyzed while it is still streaming. One way you can analyze IoT data while it's in motion is with event stream processing. This is a term that most analysts and data scientists understand, but many business leaders do not. Let’s fix that. Start by reading the answers to these three questions: what is it, how does it work, and where is it happening?
1. What is event stream processing?
First, let’s break down the three parts of the term:
- Event – An event is any occurrence that happens at a clearly defined time and is recorded in a collection of fields.
- Stream – A data stream is a constant flow of data events: a steady rush of data flowing into and around your business from thousands of connected devices and other sensor-equipped “things.”
- Processing – The act of analyzing data.
Putting that all together, event stream processing is the practice of quickly analyzing time-based data as it is created and before it’s stored – even at the instant it streams from one device to another.
2. How does event stream processing work?
“Traditional analytics applies processing after the data is stored, but for an increasing number of situations, these insights are too late,” explains Fiona McNeill, Global Product Marketing Manager for SAS. “Directly working with event data, when they happen, allows for faster reaction time – even influencing a situation before it’s over.”
Traditional analytics follows this general procedure (sketched in code after the list):
- Receive and store data.
- Prepare data.
- Process/analyze the data.
- Get results and share as needed.
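To make that order concrete, here is a minimal batch-style sketch in plain Python. The file name, sensor ID and readings are invented for illustration; nothing here is SAS-specific.

```python
# A hypothetical sketch of the traditional batch order: data is received
# and stored first, and analyzed only after the fact.
import json
import statistics

# 1. Receive and store data.
readings = [
    {"sensor": "t1", "temp_c": 21.4},
    {"sensor": "t1", "temp_c": 22.0},
    {"sensor": "t1", "temp_c": 23.1},
]
with open("readings.json", "w") as f:
    json.dump(readings, f)

# 2. Prepare data (load it back and pull out the field of interest).
with open("readings.json") as f:
    temps = [r["temp_c"] for r in json.load(f)]

# 3. Process/analyze the data.
mean_temp = statistics.mean(temps)

# 4. Get results and share as needed.
print(f"Mean temperature: {mean_temp:.1f} C")
```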
Event stream processing changes the order of the whole analytics procedure (see the sketch after this list):
- Store queries/analysis.
- Receive data.
- Process the data.
- Push results immediately (often to trigger a reaction).
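And here is a hedged sketch of that reversed, streaming order, using the same invented names: the query (a simple threshold rule) exists before any data arrives, and each event can trigger a reaction the moment it is received.

```python
# A minimal sketch of the stream-processing order. The rule is "stored"
# up front; data is then received one event at a time, processed against
# the rule, and results are pushed immediately. Names are illustrative.
from typing import Callable, Iterable

def over_threshold(event: dict) -> bool:
    """The pre-registered query: flag any reading above 30 C."""
    return event["temp_c"] > 30.0

def process_stream(events: Iterable[dict],
                   query: Callable[[dict], bool]) -> None:
    for event in events:                         # receive data as it arrives
        if query(event):                         # process against the stored query
            print(f"ALERT, react now: {event}")  # push results immediately

incoming = [
    {"sensor": "t1", "temp_c": 21.4},
    {"sensor": "t1", "temp_c": 31.2},  # this event triggers the reaction
]
process_stream(incoming, over_threshold)
```

Note the design difference: in the batch sketch the analysis waits for storage, while here the analysis is resident and the data moves past it.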
3. Where does event stream processing take place?
According to McNeill, event stream processing can occur in three distinct places: at the edge of the network, in the stream, or on data that’s at rest, out of the stream. Let’s look at each:
- At-the-edge analytics processes data on the same device that generates it. This could be your thermostat, your iPhone or any single sensor with processing capabilities. This type of analytics works with minimal context, often confined to rudimentary rules and simple statistics like an average or a standard deviation. Analytics at the edge can automate simple commands, such as instructions to turn something on or off or to stop or go. For example, a thermostat adjusts based on temperature fluctuations.
- In-stream analytics occurs as data streams from one device to another, or from multiple sensors to an aggregation point. This type of analysis combines events of different types and formats that arrive at varied rates. Analytics on multiple stream inputs has richer context and can be used to identify more complex patterns of interest, or even to connect a desired chain of actions. You can use in-stream analytics to automate or trigger more sophisticated prescriptive actions – for example, offers triggered by a mobile subscriber’s location and activity, based on analyzing phone use against the subscribed plan. (A code sketch follows this list.)
- At-rest analytics occurs when there is a historical repository of data, which can include both data saved from event streams and other stored information – so it’s processed after the fact. With the big data volumes that streaming generates, high-performance analytics is needed for effective processing. And time can be saved by cleansing and normalizing data while it’s in motion – before it’s stored, even in large data lakes built on technologies like Hadoop. At-rest analytics draws on rich, historical context – the perspective required to create predictive analytical models and forecasts and to discover new patterns of interest.
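To illustrate the difference between the first two tiers, here is a rough sketch (invented sensor names and thresholds, not a SAS API) that applies an edge-style rule to single events and an in-stream sliding-window rule across an aggregated, multi-sensor stream.

```python
# A hypothetical sketch: an edge-style rule fires on one extreme reading,
# while an in-stream sliding-window average per sensor spots a sustained
# rise that no individual event reveals. All values are invented.
from collections import defaultdict, deque

WINDOW = 3          # readings kept per sensor for in-stream analysis
EDGE_LIMIT = 35.0   # edge rule: react to a single extreme reading
TREND_LIMIT = 28.0  # in-stream rule: react to a sustained high average

windows: dict[str, deque] = defaultdict(lambda: deque(maxlen=WINDOW))

def on_event(event: dict) -> None:
    # Edge-style rule: minimal context, one event at a time.
    if event["temp_c"] > EDGE_LIMIT:
        print(f"edge rule: shut off {event['sensor']}")

    # In-stream rule: richer context from a window over the merged stream.
    w = windows[event["sensor"]]
    w.append(event["temp_c"])
    if len(w) == WINDOW and sum(w) / WINDOW > TREND_LIMIT:
        print(f"in-stream pattern: {event['sensor']} trending hot {list(w)}")

stream = [
    {"sensor": "t1", "temp_c": 27.0},
    {"sensor": "t2", "temp_c": 22.1},
    {"sensor": "t1", "temp_c": 29.5},
    {"sensor": "t1", "temp_c": 30.2},  # t1 window average now exceeds 28.0
]
for event in stream:
    on_event(event)
```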
More advanced organizations deploy all three tactics in a multiphase analytics system, says McNeill, optimizing the decisions made at each step of the process. “Multiphase analytics can analyze data throughout the event spectrum to inform what sensors are needed where and when, what new patterns of interest are emerging – and to provide continuous interactive monitoring for situational awareness in centralized command centers.”
As we continue to outfit more of our world with sensors and data streaming capabilities, the ability to analyze the data from devices and other things will become more and more important. Event stream processing will be crucial for smart grid stabilization, predictive asset maintenance, digital marketing – and more.
“Everything is being connected,” says McNeill. “People, places and things. Whether you are in the connected home or on a rainy street in China, when everything is connected, we have questions: What is going to happen? What should be done? When will this change? Deriving answers requires knowledge based on event streams. And the best way to get answers from streaming data in the Internet of Things is to analyze it with event stream processing.”
Read More
- Still wondering what the Internet of Things is exactly? Read our introduction to the Internet of Things.
- How can you analyze event streams? Start with SAS Event Stream Processing.