2019 Security Predictions: Real or Bogus? Respond Experts Weigh In

Where is cybersecurity heading and what should security teams focus on in 2019?

We thought it would be helpful to examine the most common cybersecurity predictions for 2019. The business press, trade publications, and of course vendors had a lot to say about emerging technology trends, but which predictions or trends will matter most? More importantly, how can security teams and CISOs turn these predictions into advantages for their business?

I sat down with Chris Calvert, an industry expert who often says, “If you have a problem with how today’s SOCs work, it’s partially my fault and I’m working to solve that issue!” With over 30 years of experience in information security, Chris has worked for the NSA and the DoD Joint Staff and has held leadership positions in companies large and small, including IBM and Hewlett Packard Enterprise. He has designed, built, and managed security operations centers and incident response teams for eight of the Global Fortune 50.

During our conversation, we discussed questions like:

  • Will we see an increase in crime, espionage and sabotage by rogue nation-states?
  • How will malware sophistication change how we protect our networks?
  • Will utilities become a primary target for ransomware attacks?
  • What new type of fileless malware will emerge? (Think worms.)
  • And finally, will cybersecurity vendors deliver on the true promise of AI?

You can listen to his expert thoughts and opinions on the podcast here!

Want to be better prepared for 2019?

The Respond Analyst is trained as an expert cybersecurity analyst: it combines human reasoning with machine power to make complex decisions with 100% consistency. As an automated cybersecurity analyst, the Respond Analyst processes millions of alerts as they stream, freeing your team to focus on higher-priority tasks like threat hunting and incident response.

Here’s some other useful information:

Mid-sized Enterprises: Want Robust, Sustainable SecOps? Remember 3 C’s

Cybersecurity is tricky business for the mid-sized enterprise.

Attacks targeting mid-sized companies are on the rise, but their security teams are generally resource constrained and have a tough time covering all the potential threats.

There are solutions that provide sustainable security infrastructure, but the vendor landscape is confusing and difficult to navigate. With smaller teams and more than 1,200 cybersecurity vendors in the market, it’s no wonder mid-sized enterprise IT departments often stick with “status quo” solutions that provide bare-minimum coverage. The IT leaders I talk to secretly tell me they know bare-bones security is a calculated risk, but executive support for more resources often just isn’t there. These are tradeoffs that smaller security teams should not have to make.

Here’s the good news: building a solid enterprise-scale security program without tradeoffs is possible. To get started, IT leaders should consider the 3 C’s of a sustainable security infrastructure: Coverage, Context, and Cost.

In part 1 of this 3-part blog series, we will deep-dive into the first “C”: Coverage.

When thinking about coverage, there are two challenges to overcome. The first is achieving broad visibility through your sensors. There is a wide array of security sensors, and it’s easy to get overwhelmed by the avalanche of data they generate. Customers often ask me: Do we have to monitor everything? Where do I begin? Are certain sensor alerts better indications of compromise than others?

Take the first step: Achieve visibility with appropriate sensor coverage

To minimize blind spots, start by achieving basic 24x7 coverage with continuous monitoring of Network Intrusion Detection and Prevention (NIDS/NIPS) and Endpoint Protection Platform (EPP) activity. NIDS/NIPS solutions leverage signatures to detect a wide variety of threats within your network, alerting on unauthorized inbound, lateral, and outbound network communications. Vendors like Palo Alto Networks, Trend Micro, and Cisco have solid solutions; Suricata and Snort are two popular open-source alternatives. EPP solutions (Symantec, McAfee, Microsoft) also leverage signatures to detect a variety of threats (e.g., trojans, ransomware, spyware), and their alerts are strong indicators of known malware infections.
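To make the signature model concrete, here is a minimal Python sketch of the core idea behind both technologies: compare each observed event against a library of known-bad patterns and alert on any match. The rules and event below are hypothetical illustrations, not actual Suricata or Snort syntax.

```python
# Toy signature engine: real NIDS/NIPS and EPP products use far richer
# rule languages, but the matching concept is the same.

SIGNATURES = [
    {"sid": 1000001, "msg": "Possible PsExec lateral movement",
     "dst_port": 445, "payload_contains": b"PSEXESVC"},
    {"sid": 1000002, "msg": "Known trojan beacon",
     "dst_port": 80, "payload_contains": b"/gate.php"},
]

def match_signatures(event: dict) -> list:
    """Return an alert string for every signature the event triggers."""
    hits = []
    for sig in SIGNATURES:
        if (event["dst_port"] == sig["dst_port"]
                and sig["payload_contains"] in event["payload"]):
            hits.append(f'[sid:{sig["sid"]}] {sig["msg"]}')
    return hits

event = {"dst_port": 445, "payload": b"...PSEXESVC..."}
for alert in match_signatures(event):
    print(alert)  # -> [sid:1000001] Possible PsExec lateral movement
```

The strength of this model is precision against known threats; its weakness, as noted below, is that anything without a signature passes through silently.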

Both NIDS/NIPS and EPP technologies use signatures to detect threats and provide broad coverage of a variety of attacks; however, they do not cover everything. To learn more on this topic, read our eBook: 5 Ingredients to Help Your Security Team Perform at Enterprise-Scale.

To gain deeper visibility, IT departments can eventually pursue advanced coverage.

With advanced coverage, IT teams augment basic 24x7 sensor coverage by monitoring web proxy, URL filtering, and/or endpoint detection and response (EDR) activity. These augmented data sources offer opportunities to gain deeper visibility into previously unknown attacks because they report on raw activity rather than relying on attack signatures the way NIDS/NIPS and EPP do. Web proxy and URL filtering solutions log all internal web browsing activity and, as a result, provide in-depth visibility into one of the channels attackers most commonly exploit to compromise internal systems. EDR solutions, in turn, act as a DVR on the system, recording every operation performed by the operating system, including all operations initiated by adversaries or malware. Of course, the hurdle to overcome with these advanced coverage solutions is managing the vast amounts of data they produce.
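As a hedged illustration of why raw-activity sources catch what signatures miss, consider this small Python sketch: because a web proxy logs every request, a defender can flag behavior that is merely unusual, such as a host contacting a domain no one else in the enterprise has ever visited. The log format and domains are invented for the example.

```python
from collections import Counter

# Hypothetical (source_ip, domain) pairs pulled from a web proxy log.
proxy_log = [
    ("10.1.1.5", "www.example.com"),
    ("10.1.1.7", "www.example.com"),
    ("10.1.1.9", "cdn.vendor.net"),
    ("10.1.1.7", "cdn.vendor.net"),
    ("10.1.1.5", "xkq3v.badhost.ru"),   # only one host ever went here
]

# Count how many log entries reference each domain enterprise-wide.
domain_popularity = Counter(domain for _, domain in proxy_log)

# No signature needed: rarity alone makes the last entry worth a look.
for src, domain in proxy_log:
    if domain_popularity[domain] == 1:
        print(f"review: {src} is the only host contacting {domain}")
```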

This leads to the second coverage challenge to overcome—obtaining the required expertise and capacity necessary to analyze the mountains of data generated.

As sensor coverage grows, each sensor type generates more data, and each type of data brings its own challenges. Some sensors are extremely noisy and generate massive amounts of data. Others generate less data but are highly specialized and require a great deal more skill to analyze. To deal with the volume, a common approach is to ‘tune down’ sensors, which filters out potentially valuable data. This kind of filtering is tempting because it reduces a security team’s workload to a more manageable level. In doing so, however, clues to potential threats stay hidden in the data.
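The following minimal sketch, with invented alerts and severities, shows how a routine severity threshold can discard exactly the low-severity clues that precede an intrusion.

```python
# The hidden cost of "tuning down": a severity threshold cuts alert
# volume, but can also drop the quiet clues that tie an attack together.

alerts = [
    {"severity": 2, "msg": "outbound DNS to newly registered domain"},
    {"severity": 5, "msg": "brute-force attempt on VPN gateway"},
    {"severity": 1, "msg": "PowerShell launched by an Office process"},
]

THRESHOLD = 4  # a typical tune-down: only surface severity >= 4

kept    = [a for a in alerts if a["severity"] >= THRESHOLD]
dropped = [a for a in alerts if a["severity"] <  THRESHOLD]

print(f"analyst sees {len(kept)} alert(s); {len(dropped)} filtered out")
# The two filtered alerts are precisely the early-stage signs of a
# possible compromise that an analyst would want to correlate.
```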

Take the second step: Consider security automation to improve coverage with resource-constrained teams.

Automation effectively offers smaller security teams the same capability that a full-scale Security Operations Center (SOC) team provides a larger organization, at a fraction of the investment and hassle.

Automation improves the status quo and ends the tradeoffs that IT organizations make every day. Smaller teams benefit from advanced security operations. Manual monitoring stops. Teams can keep up with the volume of data and can ensure that the analysis of every event is thorough and consistent. Security automation also provides continuous, effective network security monitoring and reduces time to respond. Alert collection, analysis, prioritization, and event escalation decisions can be fully or partially automated.
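To picture that pipeline end to end, here is a hedged Python sketch of the collect-analyze-prioritize-escalate loop the paragraph describes. The scoring weights, threshold, and escalate() stub are illustrative assumptions, not any vendor's actual logic.

```python
def analyze(alert: dict) -> float:
    """Score an alert from 0 to 1 using simple, consistent evidence weights."""
    score = 0.1
    if alert.get("asset_critical"):
        score += 0.4
    if alert.get("signature_confidence") == "high":
        score += 0.3
    if alert.get("repeated_within_hour"):
        score += 0.2
    return min(score, 1.0)

def escalate(alert: dict, score: float) -> None:
    """Stand-in for paging a human or opening an incident ticket."""
    print(f"escalating ({score:.2f}): {alert['msg']}")

def pipeline(stream) -> None:
    for alert in stream:            # collect: consume alerts as they arrive
        score = analyze(alert)      # analyze: same logic, every time
        if score >= 0.6:            # prioritize: suppress weak evidence
            escalate(alert, score)  # escalate: only strong cases reach humans

pipeline([
    {"msg": "EPP: trojan on finance server", "asset_critical": True,
     "signature_confidence": "high"},
    {"msg": "NIDS: port scan from guest wifi"},
])
```

Because the scoring never tires or improvises, every alert gets the same scrutiny, which is the consistency argument made above.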

So to close, more Coverage for smaller security teams is, in fact, possible: First, find the right tools to gain more visibility across the network and endpoints. Second, start to think about solutions that automate the expert analysis of the data that increased visibility produces.

But, remember, ‘Coverage’ is just 1 part of this 3-part puzzle. Be sure to check back next month for part 2 of my 3 C’s (Coverage, Context, Cost) blog series. My blog on “Context” will provide a deeper dive into automation and will demonstrate how mid-sized enterprise organizations can gain more insights from their security data—ultimately finding more credible threats.

In the meantime, please reach out if you’d like to talk to one of our Security Architects about coverage in your environment.

Algorithmic Stealth and the Security Arms Race

I recently visited Norwich University, the oldest private military college in America and an NSA & DHS Center of Academic Excellence for Cyber Defense. I had the opportunity to speak with some of the students in their Computer Security and Information Assurance program, and I was asked a question by Keely Eubank about the ability of attackers to leverage stealth techniques to hide from algorithms. It was a very insightful question and it got me thinking about the topic of “Algorithmic Stealth.”

Information Security has always been an arms race. Getting around more advanced cyber-defenses has always required stealth techniques. For example, I remember the days when almost every attack happened on Friday afternoon around 3:00pm before a 3-day weekend because the system administration team was having Friday beers and wouldn’t be back to work before Tuesday. Plenty of time to break in, steal what you were after and then clean up before they got back to work. My pager (yeah that’s old school) went off every 3-day weekend for years.

Eventually the bad guys took a statistics course and realized that defenders had a real challenge with volume, so they switched their loudest attacks to Wednesday at 10:00am, the weekly peak in network traffic, realizing it was easier to hide in all that noise than in the quiet of the weekend. Once we were able to suppress some of that noise, they moved to a “living off the land” model: they looked like regular administrators and used tools that were natural to the environment. Tools like TeamViewer and PowerShell were particular favorites. They were both hiding in the noise and not introducing new binaries into the environment, so they were that much harder to detect.

Now that advanced math and algorithms are becoming the detection methods of choice, there is an inevitable progression for attackers to research techniques for “algorithmic stealth.” Luckily for us, there are many approaches to math-based detection being developed at the moment. Each of these would need to be specifically defeated, so a combination of them would be incredibly difficult to circumvent completely. When machines do the security monitoring, human behavior, bias, and decision-making can no longer be the main targets of stealth-technique development. Attackers have to hide from the algorithms instead.

This means a couple of things for us defenders. Basic anomaly detection is the weakest of the modern approaches because it suffers from the same signal-to-noise problem that traditional signature detection methods do: the modern enterprise is full of anomalies, thanks to the complex and poorly coordinated way our enterprise technology is cobbled together. However, signatures and anomalies can each reduce uncertainty by some amount, and that is still valuable. The optimal detection operation will “see” simultaneously in multiple telemetries (by analogy: visible light, infrared, thermal, radio waves) and leverage diverse mathematical methods that provide a Venn diagram of opportunities to recognize malicious activity. Each phase of an attack can be targeted for detection differently. At Respond Software we call this Integrated Reasoning.
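To show why combining weak detectors is powerful, here is a hedged sketch of evidence fusion in Python using Bayes' rule in odds form. The prior and likelihood ratios are invented numbers for illustration; this is not Respond's actual model, just the general principle that independent signals multiply.

```python
from math import prod

def posterior_odds(prior_odds: float, likelihood_ratios: list) -> float:
    """Bayes in odds form: posterior odds = prior odds * product of LRs."""
    return prior_odds * prod(likelihood_ratios)

prior = 1 / 10_000   # assumed base-rate odds that a given host is compromised

# Each detector alone is weak, ambiguous evidence...
lrs = [
    4.0,   # NIDS signature fired (noisy, frequent false positives)
    6.0,   # anomaly: host contacted a never-before-seen domain
    8.0,   # EDR: script interpreter spawned by a document reader
]

odds = posterior_odds(prior, lrs)
print(f"posterior odds: about 1 in {round(1 / odds):,}")   # ~1 in 52
# ...but together they multiply (4 * 6 * 8 = 192x), turning three shaky
# signals into an escalation-worthy case. To stay hidden, an attacker
# would now have to evade every telemetry and every method at once.
```

This is also why algorithmic stealth is so much harder than hiding from a single signature engine: each additional independent method multiplies the evidence against the intruder.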

I’d like to thank Norwich University’s Applied Research Institute (NUARI) and the Norwich faculty for the chance to learn something cool from their amazing students. As we develop new tools and techniques to defend our organizations, I am reminded that our very depth of experience (in the status quo) can blind us to new approaches, and that listening closely to the questions of the next generation of defenders can open new avenues of defensive research. Understanding and defeating algorithmic stealth is a new frontier for security research and active defense. Welcome to AI in security; I think you knew it wouldn’t be all fun.

Join our growing community! Subscribe to our newsletter, the "First Responder Notebook," delivered straight to your inbox.