by Lee Vorthman, Cyber Lead, NetApp U.S. Public Sector
Within the last ten years it has become readily apparent that traditional perimeter defenses are no longer sufficient to secure our organizations. Like the Maginot Line, our firewalls and IDSs can easily be end-run by attackers using a variety of techniques, such as phishing, SQL injection and XSS. Instead of relying primarily on perimeter defenses, organizations are now beginning to apply big data analytics techniques to understand the security of their environments in real time. However, achieving near real-time situational awareness has proved challenging for several reasons.
First, as network speeds increase from 1 Gb/s to 10 Gb/s it becomes increasingly difficult to keep pace with the velocity of data coming across the wire. To truly understand the threats they face, security analysts need access to the raw packets, whether for analysis or to reconstruct sessions. However, most current packet capture appliances lack the processing power to collect and analyze packets at the same time. This processing constraint can result in dropped packets, which security analysts cannot afford if they are going to adequately perform their jobs. Ultimately, these increased network speeds mandate that, for the time being, the bulk of analytics shift from pre-processing in our collection appliances to post-processing by analytics platforms, and this is one reason why near real-time situational awareness resembles a big data problem.
Second, the volume of data generated by a 10 Gb/s half-duplex link is approximately 100 TB per day. As organizations begin to build environments that can process this data, they need to consider two things: how much data do they want to analyze at one time, and how long do they want to keep it after it has been collected? Both of these requirements drive up the total amount of storage needed and significantly increase the complexity of data management. Over the past year I've spoken to numerous security professionals whose retention times have ranged from 24 hours to one year, which represents anywhere from 100 TB to 36.5 PB of data.
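The arithmetic behind these figures can be sketched in a few lines, assuming a fully saturated 10 Gb/s half-duplex link and decimal (SI) units; the exact result is closer to 108 TB/day, which the article rounds to 100 TB:

```python
# Back-of-the-envelope storage estimate for full packet capture,
# assuming a saturated 10 Gb/s half-duplex link and decimal (SI) units.

LINK_GBPS = 10                   # link speed in gigabits per second
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 seconds

# Convert gigabits to bytes, then scale to terabytes per day.
bytes_per_day = LINK_GBPS * 1e9 / 8 * SECONDS_PER_DAY
tb_per_day = bytes_per_day / 1e12          # ~108 TB/day (~100 TB in round numbers)

# Retention windows drive total capacity: 24 hours vs. one year.
tb_24_hours = tb_per_day                   # ~100 TB
pb_one_year = tb_per_day * 365 / 1000      # ~39 PB at full rate; the article's
                                           # 36.5 PB uses the rounded 100 TB/day

print(f"{tb_per_day:.0f} TB/day; one year of retention: {pb_one_year:.1f} PB")
```

Even the rounding choice matters at this scale: the 8 TB/day difference between the exact and rounded daily figures compounds to nearly 3 PB over a one-year retention window.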
Third, while storing this data is certainly challenging, the bigger problem is analyzing it. There are numerous tools for analytics: SIEM, intrusion detection and network forensics. The challenge is correlating the data from all of them into information that is actionable. In my experience, no single analytics tool adequately correlates all of this data into actionable information. Analysts are forced to switch between a variety of tools manually, which is a time-consuming process. To move toward near real-time situational awareness, security professionals will need to make their analytics process as efficient as possible and, rather than sifting through terabytes of data manually, automate the majority of it.
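The kind of cross-tool correlation described above can be illustrated with a toy sketch: joining IDS alerts with outbound firewall logs on source IP within a time window, so an alert and the suspicious traffic that follows it surface together rather than in two separate consoles. All field names and records here are made up for illustration and do not come from any particular product:

```python
# Toy sketch of cross-tool correlation: pair each IDS alert with outbound
# flows from the same host within a short window after the alert.
# Field names and sample records are hypothetical.
from datetime import datetime, timedelta

ids_alerts = [
    {"src_ip": "10.0.0.5", "time": datetime(2013, 6, 1, 12, 0), "sig": "SQL injection attempt"},
]
fw_outbound = [
    {"src_ip": "10.0.0.5", "time": datetime(2013, 6, 1, 12, 3), "dest": "203.0.113.9", "bytes": 48_000_000},
    {"src_ip": "10.0.0.7", "time": datetime(2013, 6, 1, 12, 4), "dest": "198.51.100.2", "bytes": 1_200},
]

WINDOW = timedelta(minutes=10)

def correlate(alerts, flows, window=WINDOW):
    """Return (signature, destination, bytes) for flows that leave an
    alerted host within `window` after the alert fired."""
    hits = []
    for alert in alerts:
        for flow in flows:
            same_host = flow["src_ip"] == alert["src_ip"]
            delta = (flow["time"] - alert["time"]).total_seconds()
            if same_host and 0 <= delta <= window.total_seconds():
                hits.append((alert["sig"], flow["dest"], flow["bytes"]))
    return hits

print(correlate(ids_alerts, fw_outbound))
```

In production this join would run continuously over streaming data rather than in-memory lists, but the design point is the same: automating even this simple time-windowed join removes one manual tool-switch from the analyst's workflow.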
Lastly, trends like cloud, BYOD and the mobile workforce have made our network environments more complex. This presents a challenge not only for collection, but also for analysis and, ultimately, for mitigating security threats. Security professionals can no longer afford to watch only incoming traffic; they must also collect and analyze traffic going out of and across their networks. Incoming traffic provides information about the types of attacks and techniques being used to penetrate the network, while outgoing traffic provides indicators of existing compromises and data breaches. Traffic moving laterally across the network can provide indications of insider threats. Monitoring all of these network vectors is essential for near real-time situational awareness.
To solve these challenges, security professionals will need to work closely with traditional IT to help design and manage the resources required for near real-time situational awareness. As this concept matures, I look forward to meeting and speaking with security professionals at all levels to understand how they are approaching this problem, what works and, most importantly, what's next.
Lee Vorthman runs cyber defense programs for NetApp's U.S. Public Sector division, located in Tysons, Virginia.