Defending Ever Expanding Networks and IT Systems

Architecture at Scale is Needed

By Trevor Pott, Product Marketing Director, Juniper Networks

How many systems must an information security professional defend?  For most people, the numbers involved are abstract concepts.  We think we understand them, but when confronted with them in a tangible form, we are constantly surprised by how much our perception differs from reality. Today even the smallest enterprises operate at scales that are simply beyond our ability as humans to truly comprehend.

There’s a considerable gap in capability between small business IT and enterprise IT.  For example, it is entirely feasible – and even reasonable – to meet all of a small organization’s file storage needs using a bare-bones secure cloud storage provider like Sync.

It would be rank madness to do this for an organization with 10,000 employees.  When you get to the scale of the military, there are strong arguments to be made that, if used as the organization’s only storage solution, such an approach would constitute criminal negligence.

Scale matters.  As scale increases, inevitably, so does the complexity.  There is no getting around this.

So how many systems must an information security professional defend?  All of them.  Given the scale of our increasingly interconnected world, that’s quite the problem.

The evolution of network management and automation

In the beginning, we managed everything by hand.  Each system on our networks was a pet: loved, cared for, and unique amongst all other systems.  Eventually, the number of systems under management became too large for this approach, and so administrators turned to scripting.  Common tasks were automated, and each administrator could manage a larger number of systems.

Eventually, people who had a large number of scripts packaged them into the first IT management applications, and manual IT gave way to management centralization.  Scripting and #CommandLineLife was replaced by policies, profiles, and templates.  The number of systems a single administrator could manage exploded, and this is where most organizations are today.

Unfortunately, that scale thing keeps coming back ’round again.  Despite the management magnification capabilities afforded administrators by today’s policy-driven management applications, larger organizations are hitting very real scaling problems.  100% of administrator time is being tied up with policies, profiles, and templates.  Worse, in many cases the relevant IT teams are already at their maximum size: adding staff does little to increase the number of systems that can be managed.

Holistic architect wanted

If there is one thing I would like every single network defender to keep in mind for the next decade, it is that there is no network edge anymore.  The days of hunkering down behind our perimeters are long past.

“Hybrid IT” and “multicloud” – including all flavors of modifying buzzwords – are no longer novel.  They are simply how IT is done today.  A single organization’s IT can span multiple infrastructures.  On-premises IT blends neatly into infrastructure, software, and services provided by multiple public cloud providers, while edge computing has quietly become an ordinary fact of life that we don’t even pay much attention to anymore.

That dispersed, complex vision of a modern network exists without even beginning the conversation about mobile and remote workers, IoT, or the intricacies of interdependence that exist both upstream to our supply chain, and in the provisioning of IT to downstream customers.  Unfortunately, in many ways, we are our own worst enemy, and we – both as IT practitioners and as vendors – create many of the security problems that will haunt us in the coming years.

Our innate need to categorize, to segment and to simplify may well be looked upon as the security threat of the 2020s.  Our need to keep bringing complexity down to something we can fit in our brains stands in the way of making holistic architectural – and thus security – decisions about the implementation of IT across these many and varied infrastructures.

Think outside the network

The persistence of a siloed mentality, complete with an insistence on treating network segments as though they had perimeters (and as though those perimeters mattered), consistently limits our thinking. This puts us at risk.  The compromise of the most minor system can lead to the compromise of significantly more important systems, and an inability to think holistically makes that outcome all but inevitable.

Consider, for example, the caching of credentials.  In many cases, merely logging into a system once with administrative credentials (and then forgetting to wipe the cache) is enough to leave a copy of those credentials on that system.  Attackers can exploit that cache to compromise other systems on the network that share those credentials.
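The mechanics of that chain can be sketched as a toy graph model, in which a compromise spreads to any system sharing a cached credential with an already-compromised one.  All system and credential names here are hypothetical, purely for illustration:

```python
# Toy model: lateral movement via shared cached admin credentials.
# Each system maps to the set of credentials cached on it.
cached_creds = {
    "edge-node-tokyo":  {"local-svc", "domain-admin"},  # admin logged in once
    "app-server-1":     {"domain-admin"},
    "central-database": {"domain-admin", "db-svc"},
    "kiosk-lobby":      {"kiosk-user"},
}

def blast_radius(initial_system):
    """Return all systems reachable from one compromise via shared caches."""
    compromised = {initial_system}
    stolen = set(cached_creds[initial_system])
    changed = True
    while changed:
        changed = False
        for system, creds in cached_creds.items():
            if system not in compromised and creds & stolen:
                compromised.add(system)   # shared credential: system falls
                stolen |= creds           # attacker harvests this cache too
                changed = True
    return compromised

# Compromising one minor edge node reaches the central database:
print(sorted(blast_radius("edge-node-tokyo")))
# → ['app-server-1', 'central-database', 'edge-node-tokyo']
```

The kiosk, which shares no credentials with the rest of the fleet, stays out of the blast radius – which is exactly the argument for not reusing administrative credentials across segments.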

In this manner, the compromise of a small edge node located on the other side of the world could result in a devastating compromise of central databases.  What’s worse, these sorts of compromises happen not because anyone along the chain of responsibility between those two systems did anything wrong, but because their areas of responsibility were so disconnected that the security implications of how doing something to A would affect B were never even considered.

Machines managing machines managing machines…

This is the challenge of the 2020s.  In order to cope with the perpetually increasing scale, we must begin to turn the definition and daily management of policies, profiles, and templates over to machines.  Machine Learning (ML), Artificial Intelligence (AI), and other Bulk Data Computational Analysis (BDCA) tools are a must.

Initially, these tools will make suggestions, and automate very simple tasks – the sort of things we’re seeing from AIOps vendors today.  But this is only the beginning; in order for the networks of tomorrow to even be possible, virtually everything that IT administrators do today must be done by BDCA tools without any form of human input.

This is not about replacing IT personnel.  It isn’t about an attempt to save money.  The problems we’re running up against are the limits of human capability.

Humans can only hold so many things in working memory at a time.  Call it a RAM limit, if you will.  We can only conceive of so many nodes on a network.  We can only wrap our minds around so many permissions interactions.  Enterprise networks are already bigger than we can fit in our brains, and that means we are running up against human limits in terms of even being able to architect these networks, let alone defend them.

For security to be effective, it needs to be holistically integrated into network architecture decisions.  Network and security are inseparable, and the challenge of the next 10 years is going to be redesigning how we represent these networks for human consumption, and how we translate human-scale architectural and security decisions into the practical application of configuration for a literally incomprehensible number of systems that, even for small businesses, can span the entire globe.
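One way to picture that translation layer is a single human-scale decision fanning out into per-device configuration that no human would author by hand.  The sketch below is illustrative only: the intent format, device names, and rendered config syntax are invented for this example and do not represent any vendor’s actual API:

```python
# Illustrative intent-to-config expansion: one human decision
# ("the IoT segment may only reach the telemetry collector")
# becomes a generated rule set for every device in the fleet.
intent = {"segment": "iot", "allow_to": ["telemetry.example.internal:443"]}

fleet = [f"switch-{n:03}" for n in range(1, 4)]  # stand-in for thousands

def render_config(device, intent):
    """Render one device's config fragment from the human-scale intent."""
    lines = [f"! {device}: generated from intent '{intent['segment']}'"]
    for dest in intent["allow_to"]:
        host, port = dest.rsplit(":", 1)
        lines.append(
            f"permit tcp segment {intent['segment']} host {host} eq {port}"
        )
    lines.append(f"deny ip segment {intent['segment']} any")  # default deny
    return "\n".join(lines)

configs = {device: render_config(device, intent) for device in fleet}
print(configs["switch-001"])
```

The human makes one decision; the machine applies it consistently to every device, however many there are.  That consistency – not headcount – is what closes the gap between human-scale intent and global-scale configuration.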

Vendors will build – are building – AIs to take on the day-to-day.  This, while fantastically difficult, is still the easy part.  The hard part is convincing organizations – and certainly individual administrators within those organizations – to give AIs that kind of power.  The jump from basic BDCA tools and suggest-o-tron ML agents all the way to AIs which make judgment calls about which policies to craft and apply is, psychologically at least, a pretty big deal.  To say nothing of the legal and regulatory implications.

In the end, the latest strains of malware or who is hacking whom is not the problem.  The problem – the real problem – is architecture at scale.  What is needed are the tools to take the intelligent, experienced, and capable IT staff that organizations already have and empower them to operate at that level.  The robots can handle the rest.

About the Author

Trevor Pott is the Product Marketing Director for Juniper Networks.  He shares all the ways that Juniper’s technologies can help organizations of all sizes defend their data, meet regulatory requirements, and advance their own goals while doing so.