Mention to a security engineer that their problem of event log plumbing has finally been solved, and you will hear a big sigh of relief. Big data technologies and streaming platforms have come to the rescue by automating the arduous process of gathering, storing and distributing logs, thereby presenting new, welcome and liberating opportunities for security analysts.
However, it is essential for us to reflect on the evolution of event log collection to fully appreciate what this means to a security engineer. The reality is that by leveraging these available technologies, security analysts can now refocus their time from monitoring massive amounts of mind-numbing data to doing what they do best -- hunting for real security incidents and bad guys.
Here’s a summary of how this event log plumbing problem has matured into today’s big data approaches.
The Evolution of Event Log Collection
Before big data technologies and streaming platforms, event logs were dispersed throughout the environment or stored centrally on a syslog server. This system allowed the use of simple utilities to parse and find interesting events, but the methodology wasn't scalable and resisted automation. Detection capability was built with one-off custom scripts or accomplished through manual analysis. Who remembers sed and awk? :-) Every task was extremely time consuming and was neither reproducible nor reusable.
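To make that era concrete, here is a minimal sketch of the kind of one-off script analysts once wrote by hand. The log format and function names are illustrative assumptions, not any real product's output: it scans auth-style syslog lines for failed logins and tallies them per source IP, the sort of job sed and awk once handled.

```python
import re

# Hypothetical one-off script of the syslog era: scan auth logs for
# repeated failed logins, the kind of task once done with sed/awk.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def count_failures(lines):
    """Tally failed-login attempts per source IP from raw syslog lines."""
    counts = {}
    for line in lines:
        match = FAILED_LOGIN.search(line)
        if match:
            _user, ip = match.groups()
            counts[ip] = counts.get(ip, 0) + 1
    return counts

sample = [
    "sshd[1001]: Failed password for invalid user admin from 203.0.113.5",
    "sshd[1002]: Failed password for root from 203.0.113.5",
    "sshd[1003]: Accepted password for alice from 198.51.100.7",
]
print(count_failures(sample))  # {'203.0.113.5': 2}
```

The script works for one log format on one box; change the format or the source and you rewrite it, which is exactly why this approach never scaled.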
When the first SIEMs started gaining traction, the logs made their way into centralized databases or to various data warehouses. This procedure worked for a while, and then integrations were built to pull the data from various sources quickly. By implementing this process, data collection accelerated and enabled repeatable detection capabilities. At that point, the ability to share with one another became possible, but conversely, we had to deal with data quality issues and very high latency in after-the-fact event retrieval.
Finally, streaming platforms became mainstream. This technology enables high-throughput, low-latency distribution of information to a broad range of applications. We can publish information into a stream and allow many types of applications to consume the data as needed for their own purposes. The clear advantages are:
- Big data technologies can pull the information and store accordingly
- Real-time monitoring utilities can process a feed and create beautiful visualizations and alerting mechanisms
- Web applications can use the stream to support millions of customers at once
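The fan-out pattern above can be sketched with a toy in-memory stream. This is a deliberately simplified illustration, not the API of Kafka or any real streaming platform: the `EventStream` class and the event fields are assumptions chosen to show how one published event reaches every subscribed consumer.

```python
class EventStream:
    """Toy in-memory stand-in for a streaming-platform topic: producers
    publish once, and every subscribed consumer sees each event."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def publish(self, event):
        # Fan out each event to all consumers, independent of one another.
        for handler in self.subscribers:
            handler(event)

stream = EventStream()

archive = []   # big data store: keeps everything
alerts = []    # real-time monitor: keeps only high-severity events

stream.subscribe(archive.append)
stream.subscribe(lambda e: alerts.append(e) if e["severity"] == "high" else None)

stream.publish({"src": "fw01", "severity": "low", "msg": "allowed outbound"})
stream.publish({"src": "ids02", "severity": "high", "msg": "possible exfiltration"})

print(len(archive), len(alerts))  # 2 1
```

Each consumer decides what to do with the feed, so adding a new application (storage, visualization, alerting) never disturbs the producers or the other consumers.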
The explosion of information available through a common interface opens many possibilities for everyone in the security industry. Easy abstracted access to event information helps security professionals continue to innovate and automate security analysis tasks.
Consider this: if we can conveniently mine desired data sets to study how humans think about an event, categorize the contextual information that links relevant data points, and use it to make expert decisions, we have fundamentally changed operational security.
Plumbing Problem Solved
Today it is evident that data plumbing and application development are distinct disciplines, and each should work with the other without requiring massive rework.
Here at Respond Software, with Respond Analyst™ we have solved the event log plumbing problem for enterprise organizations by leveraging existing big data technologies and streaming platforms. Whether you use a SIEM, Splunk, ELK or the Apache Hadoop stack, we simply plug our application into the data stream to process this firehose of information and make sense of it. As a scalable, robust expert system, Respond Analyst™ consistently detects those crucial information nuggets that need human attention, while smoothly integrating into the complex environments we understand our customers must manage.
For more information about Respond Software, please contact us at email@example.com
John Petropoulos, Product Strategy Consultant for Respond Software
You might like this article: The Hidden Costs of Security Engineering