PART 1: INTRODUCTION
Over the years, I feel fortunate to have had the opportunity to work in multiple “wheelhouses” of the technology stack. I started my hobby/career (they’ve always been the same) programming, and quickly transitioned into telecom, systems and network administration upon discovering the world of bulletin board systems, and then Internet Service Providers (ISPs). My initial exposure to security technology began, as a youth, by first figuring out how to break things-- whether it was cracking a piece of Apple ][ software or figuring out how to clone an old analog Motorola brick phone. As my world became more adult-like, security shifted from breaking things to keeping things from being broken (steering clear of spamming mail server hijackers, etc.).
Over this time, technology advancement felt like it was moving a million miles an hour, and I’ve always loved the thrill of that ride. The same year Y2K passed uneventfully, I left my 20s in the rear-view mirror, moved to Silicon Valley and was hired to run data centers for AboveNet (the second-biggest data center colocation provider during the dot-com boom). In addition to the networks, systems and applications I had background experience with, I was put in charge of a different technology to learn: the industrial systems that provided the foundational power and cooling capacities to the data centers. This was my first exposure to machines that were bigger, louder and more intimidating than the new Intel Pentium-based 2U Linux boxes that were killing the Solaris-powered Sun Enterprise servers of the day.
It took months to digest the scale of these beasts (thanks to AD Robison, who was part of the AboveNet team I inherited; we have worked together ever since), but over time I came to recognize them more as stubborn old men-- far more set in their ways than the rapidly evolving technology world I was used to.
It’s been 15 years since my first exposure to 4,000-amp electrical panels, 3,000-horsepower backup generators and 100-ton air handlers. Since that time, I’ve had the pleasure of being part of several billion dollars’ worth of new data center builds, filling them with hundreds of thousands of pieces of network, systems and storage equipment for several of the biggest players in the world (from AboveNet to Yahoo and then Apple). During that time, it’s been amazing to help realize the power of massively scaled Hadoop clusters at Yahoo, and to see the early power of Mesos at Apple.
The pace at which the majority of our technology stack has shifted-- from the expensive, proprietary, hardware-dependent days of Sun, EMC and Cisco to the new, continuously advancing, software-defined world of Mesos, SwiftStack and Cumulus-- has just been AWESOME.
That said, being ¾ software-defined does not suffice when it’s all dependent on the remaining ¼ that is not. Take a look at the above graphic. Since it’s almost summer in sunny California, and I’ve got swimming on my mind, let me explain it in a weather-appropriate context. Notice the four layers. Think of the software-defined data center as a relay swim team, with each layer symbolizing one of the four swimmers on the team:
- Leg 1: IT Security (and other services)
- Leg 2: Applications
- Leg 3: Systems
- Leg 4 (aka “the anchor leg”): Deep Infrastructure / Industrial Control Systems
The first three layers / swimmers on the team (IT security, applications and systems) have worked together to shift from the legacy hardware-dependent days of old to the continuously advancing software-defined days of today. They’ve done so by:
1) Abstracting away the limiting, expensive dependencies on stand-alone, proprietary hardware.
2) Turning that hardware into a more standardized, virtualized, elastic commodity.
3) Enabling and relying on the pace of software innovations to continuously advance the stack.
Thus, we've largely shifted from a world that was held back by slow evolving proprietary hardware, to one that is now driven by continuously improving software that is run on low cost, commodity hardware.
As a result, the first three swimmers in this relay are each leading their legs of the race at a record-breaking pace. It’s amazing to see the speed, efficiency and smoothness of these layers at work at the biggest cloud players in the world, where:
1) IT security technology and approaches evolve and advance DAILY in order to keep pace with the continuous attacks and challenges they face (different levels of invisible “cyberwar” are constantly occurring, which I’ll save for another post).
2) Applications are being pushed, patched and updated monthly, or even more frequently, thanks to the pace set by the nimble mobile application space.
3) Hardware systems evolve annually, keeping up with Moore’s law and then some, and they do so more smoothly than ever-- thanks to virtualization approaches that have evolved from things like VMware to OpenStack to Mesos.
So you get that I’m excited about the first ¾ of this race…
Now let’s begin to look at the anchor leg of this race, driven by the Deep Infrastructure / Industrial Control Systems that power all the layers above it. Most swim fans will know that the anchor leg is usually reserved for the strongest, fastest swimmer. This is because, just like in the triangle above, the anchor provides the foundation for the entire team and is in the most critical position to make or break the entire race.
Think about it: What happens if the foundational Deep Infrastructure / Industrial systems of this stack fail?
The answer is simple: Everything else in the stack becomes meaningless. Fade to black.
Unfortunately, in this anchor leg, the industrial control technologies that provide the foundation for modern society’s entire technology stack have failed to keep pace with what’s evolved in the layers above. The results are unacceptable:
- Security gaps that present the biggest risk to our technology-driven modern society today.
- Widespread lack of interoperability with the other ¾ of the technology stack, which leaves all of the layers less efficient, less productive and less safe.
- Limited upgrade paths that continue to leave our “anchor leg” trapped 15-20 years in the past.
Litbit was built to help get our anchor swimmer in shape! In Part 2 of this article, we’ll get into why the current gaps exist, as well as ways to address them. Be sure to “Get Started” with our free demo (click the green button in the upper left corner), and we’ll also let you know when new articles come out.
To be continued in Part 2…..