A new tool for defenders – Real-time analysis of Web Proxy data

When I got back into the office after a short break to recharge my batteries, I was excited to speak with my colleagues at Respond Software about the upcoming release of our web filtering model for the Respond Analyst. You see, over the last few months we’ve been working tirelessly to build a way to analyze web filtering event data in real time. Now that I’m sitting down to write this blog, the fruit of all the hard work our team has put into making this a reality is really sinking in. We’ve done it! It’s now available as part of the Respond Analyst!

This was no small feat, as most of you in the security operations world will know.

You may ask why we chose to take on this challenge. The answer is quite simple: there is a ton of valuable information in web filtering data, and the sheer volume of events enterprises generate makes it extremely difficult for security teams to analyze them in real time. What a perfect opportunity for us to show off the Respond Analyst’s intelligence and capability.

Until now, security operations and IR teams have pivoted to web filtering data for investigations only after they’ve been alerted to an attack through threat hunting or some other form of detection. Processing all of an organization’s web filtering data in a SIEM or similar platform has simply been too expensive. In fact, most organizations can’t even afford to store this data for a “reasonable” amount of time for investigators to dig through.

Think about it for a second: each web page visited can generate a number of new web requests to pull back content from different sources. Then picture each employee using the internet for most of the day, navigating the web through their day-to-day tasks and a few personal items between meetings; all of this amounts to hundreds of web page visits per employee each day. With a few hundred employees, the volume of data generated by the web filtering solution quickly becomes unmanageable. Now we’re able to process all of these events in real time.
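
To make that volume concrete, here’s a quick back-of-envelope calculation. The employee count, page visits per day, and requests per page below are purely illustrative assumptions, not measurements from any particular environment.

```python
# Back-of-envelope estimate of daily web proxy event volume.
# All numbers below are illustrative assumptions, not measurements.
employees = 300            # a modest mid-sized organization
page_visits_per_day = 300  # "hundreds of web page visits each day"
requests_per_page = 20     # each page pulls scripts, images, ads, trackers, etc.

events_per_day = employees * page_visits_per_day * requests_per_page
print(f"~{events_per_day:,} proxy events per day")  # ~1,800,000
```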

Consider the questions you are able to ask of the data without even taking the assigned web filtering category into account (a few of these checks are sketched in code after the list)…

  • Analyze each component of the HTTP header
  • Perform user agent analysis
  • Take a look at how suspicious the requested domain is
  • Compare the URL string against all other requests over an extended period of time
  • Compare each attribute to information you’ve gathered in your threat intel database
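
As a rough illustration of the kinds of per-event checks above, here is a minimal sketch in Python. The field names, the entropy-based domain heuristic, and the in-memory threat intel set are all assumptions made for this example, not the Respond Analyst’s actual implementation.

```python
import math
from collections import Counter

# Illustrative threat intel set; in practice this would come from your threat intel database.
KNOWN_BAD_DOMAINS = {"evil.example", "bad-cdn.example"}

def shannon_entropy(s: str) -> float:
    """Rough measure of how 'random' a string looks; high entropy can indicate DGA-style domains."""
    counts = Counter(s)
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

def score_event(event: dict) -> list[str]:
    """Return the reasons a single proxy event looks suspicious (hypothetical field names)."""
    reasons = []
    ua = event.get("user_agent", "")
    domain = event.get("domain", "")

    if not ua:
        reasons.append("missing user agent")      # many malware families omit or fake the UA
    if domain in KNOWN_BAD_DOMAINS:
        reasons.append("domain in threat intel")  # direct match against gathered intel
    if domain and shannon_entropy(domain.split(".")[0]) > 3.5:
        reasons.append("high-entropy domain")     # crude heuristic for algorithmically generated names
    return reasons

# Example usage with a fabricated event
print(score_event({"user_agent": "", "domain": "xk3j2qpl9vm4t7wz.example"}))
```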

But why stop there…

  • What about looking at whether the pattern of behavior across a set of requests is indicative of exploit kit delivery?
  • Maybe you suspect that a series of requests is related to command-and-control activity (one rough beaconing check is sketched after this list)
  • What about the upload of documents to a file-sharing service: is that data exfiltration or simply everyday user activity?
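
To give a feel for what a “pattern of behavior across a set of requests” can mean in practice, here is a simple beaconing check: requests to the same domain at suspiciously regular intervals are one classic command-and-control indicator. The timestamps, thresholds, and function shape are assumptions for illustration only.

```python
from statistics import pstdev

def looks_like_beaconing(timestamps: list[float],
                         max_jitter_seconds: float = 5.0,
                         min_requests: int = 6) -> bool:
    """Flag a series of request times to one domain whose intervals are suspiciously regular."""
    if len(timestamps) < min_requests:
        return False
    ordered = sorted(timestamps)
    intervals = [b - a for a, b in zip(ordered, ordered[1:])]
    # Low spread in the intervals suggests an automated, clock-driven beacon rather than a human.
    return pstdev(intervals) <= max_jitter_seconds

# Example: a request to the same domain roughly every 60 seconds for six minutes
beacon_times = [0, 61, 120, 181, 240, 301, 360]
print(looks_like_beaconing(beacon_times))  # True
```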

Web filtering data also benefits from the power of integrated reasoning. When web filtering data is combined with IDS/IPS sensors, anti-malware technology, and contextual sources like vulnerability data and critical asset lists, you are able to form an objective view of your enterprise’s threat landscape. Beyond analyzing each of these data sources, the Respond Analyst accurately scopes all events related to the same security incident together for a comprehensive incident overview. It then assigns an appropriate priority to that incident, documents all the details of the situation, and presents this information to you. This is, by far, the most efficient way to reduce attacker dwell time.
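
To give a feel for what scoping related events into one incident can look like, here is a minimal sketch that groups alerts from different sensors by affected asset within a time window and derives a priority from asset criticality. The data shapes and the priority rule are assumptions for illustration, not how the Respond Analyst actually works.

```python
from collections import defaultdict

CRITICAL_ASSETS = {"10.0.0.5"}  # illustrative critical asset list

def scope_incidents(events: list[dict], window_seconds: int = 3600) -> list[dict]:
    """Group events from any sensor (web proxy, IDS/IPS, anti-malware) by asset and time window."""
    buckets = defaultdict(list)
    for e in events:
        buckets[(e["asset"], e["time"] // window_seconds)].append(e)

    incidents = []
    for (asset, _), grouped in buckets.items():
        # Toy priority rule: critical assets or multi-source activity get a higher priority.
        priority = "high" if asset in CRITICAL_ASSETS or len(grouped) >= 3 else "medium"
        incidents.append({
            "asset": asset,
            "sources": sorted({e["source"] for e in grouped}),
            "event_count": len(grouped),
            "priority": priority,
        })
    return incidents

# Example: proxy, IDS, and anti-malware events for the same host within the hour
sample = [
    {"asset": "10.0.0.5", "time": 100, "source": "web_proxy"},
    {"asset": "10.0.0.5", "time": 400, "source": "ids"},
    {"asset": "10.0.0.5", "time": 900, "source": "anti_malware"},
]
print(scope_incidents(sample))
```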

We have a long way to go and many more exciting Respond Analyst skills & capabilities on the way. I couldn’t be prouder of what we’ve achieved and of the release of our web filtering model.

Way to go, Respond team!