
The Litbit Blog

AI-EMPOWERED WORKPLACE TRENDS

Drivers, Results and Guidelines of the Emerging Data Center IoT

DC IoT Drivers


Note: This is an article that appears in 7x24 Exchange magazine's spring edition and is a warm-up for a keynote presentation that I'm going to be giving at the Navigating the Future conference on June 7th, 2016 in Boca Raton, Florida. Come hang out and I'll buy you a drink.  http://conferences.7x24exchange.org/spring2016/

Every year the world’s top business leaders, intellectual thinkers, and heads of state converge at a small ski resort community in Davos, Switzerland to engage in discussions related to forecasting the future, and projecting and preparing for the world’s “next big thing.” Over the years, the World Economic Forum has pre-emptively explored many of humanity’s biggest challenges, including the oil crisis of the 1970s, relationship problems between the Arab and Western worlds, and the cause and effect of global climate change. On the technology advancement front, the Forum’s discussions have delved into the emergence of the personal computer, e-learning in schools, as well as the Internet / World Wide Web. As a result, the Forum has gained a reputation for its insight.

The central theme of this year’s forum revolved around the impending realization of the “Fourth Industrial Revolution,” driven by the Internet of Things and the fusion of operational technologies (machines and people) with information technologies (applications that better empower their productivity). Event leader, German economist Klaus Schwab, projects this shift will fundamentally change the way we live and work in the coming decades. With so many devices, machines and things in the data center, these projections are of particular interest with regard to what they mean for us.

While policy and technical-related conversations are being had regarding this new big thing, talk of an Internet of Things isn’t really that new. In 1932, writer Jay B. Nash described the idea first when he wrote “Within our grasp is the leisure of the Greek citizen, made possible by our mechanical slaves, which far outnumber his twelve to fifteen per free man. These mechanical slaves jump to our aid. As we step into a room, at the touch of a button a dozen light our way. Another slave sits twenty-four hours a day at our thermostat, regulating the heat of our home. Another sits night and day at our automatic refrigerator. They start our car; run our motors; shine our shoes, and cut our hair. They practically eliminate time and space by their very fleetness.” What’s old and never came, many times becomes new and coming, I suppose. Timing is everything.

In data centers it becomes easy to ask: if the IoT simply involves controlled machines that are either directly or indirectly connected together and to the Internet, then haven’t our SCADA, PLC and controls systems been living la vida IoT over the past 25+ years? While we may have plenty of experience dealing with networked machines and things that work together, the real answer is: not quite. So what specifically in the data center defines the difference between the legacy IoT we’ve worked with over the years, and the coming IoT that will bring about a fourth industrial revolution? A multitude of compounding hardware and software disruptions that are currently underway:

Advancements in Edge Controller Computing Capabilities: Currently, a $5 Raspberry Pi Zero offers 100x the compute performance of many of the industrial controllers in our data centers today, at roughly 1/100th the cost. Companies like Olimex are now starting to build these modern control boards to industrial standards. In addition, there are several high performance, low cost, industrial grade open source controllers coming to fruition. Open source software and hardware are quickly advancing the disaggregation of long-standing proprietary hardware and the embedded software handcuffed to it. In doing so, the capability for continuous software defined technology advancements emerges amongst traditionally static hardware with a very long lifecycle. This shift allows our plant to continuously get smarter.

Advancements in Sensing Capabilities: Not many years ago, measuring vibration acceleration in the industrial space involved sensor devices that communicated serially, via 9600 baud Modbus, and cost upwards of $20,000. Today the sensors that measure the same in our iPhones are more accurate, much faster and cost less than $2. Same goes for advancements in sound analytical capabilities, as well as environmental and power sensing. In a nutshell, low cost, high performance commodity sensing is a major driver in making it feasible to measure almost everything in the enterprise. This shift is what will enable hundreds of billions of new points to be measured over the coming years, with incredible new levels of robustness.

Advancements in Local & Cloud Control Layer Capabilities: Right now a commodity computer exceeds the processing capability of a mouse’s brain, and is about 1/1000th the intellect of a human. That may not sound like much but, in perspective, my first computer, an Apple II, was about a trillionth. By the time the first commodity Linux servers were running in data centers, they were about a millionth as capable. The point is that Moore’s law is rapidly advancing the low cost capabilities of information technology, and by 2025 the commodity computer will surpass single human calculation capabilities. By 2050 it has been predicted that same single computer will exceed the calculation capabilities of *all* humans. As a human, it can be tough to comprehend-- but, to many at the time, so was the idea of traveling to the moon before it occurred.

Far Smarter Apps that are Far Easier to Use: When it comes to artificial intelligence (AI) in a computer, we are currently still in an era that requires people trained to think like computers and communicate via “coding” in order to program them to behave and operate most effectively in the ways we desire. The result is that only a small sector of the population can deeply engage with programming machines-- producing, at best, “Weak AI” (also called ANI, or “Artificial Narrow Intelligence”): the limited ability to narrowly focus on only one area of AI-- such as beating a person at chess.

With advancements in compute capabilities and humanistic UI/UX on the horizon, the next era of Artificial General Intelligence (AGI) is upon us. This “Strong AI” will enable our machines to reason, plan, solve problems, comprehend complex ideas and learn quickly from experiences. The process of AGI in the data center will begin by enabling all people who work in data centers to apply their expertise and experiences to knowledge-base repositories, where machine learning can then be used to compound AI gains. Along with AGI gains will come continuously improved capabilities that will enable all people (non-programmers) to teach their computers to do what they want-- using a humanistic interface. In essence, you’ll be able to program your computer without any need to “code” or think like a computer-- because your computer will have the resources and capability to think like you.

The above technology advancements are what will enable the realization of the true capabilities of a modern Data Center Internet of Things (DCIoT). These gains will be particularly valuable in the areas of human+machine aggregation and sharing, increased security, and the ability for all your teammates, customers and partners to self-define and program means of data visualization, orchestration and human-to-machine knowledge correlation-- on their own terms, based on permissions granted. Full mobile capabilities will free us from our desks, while cloud capabilities will offer us more, without requiring cloud dependence. Integrated human + machine learning will enable systems to tune themselves on the fly, route themselves around problems, preemptively find issues before they occur, and diagnose root causes as they occur. Temperatures will no longer adjust simply to the measurements of rooms, but more precisely to the real-time needs of the servers. Power capacity will automatically orchestrate in sync with the utilization of the devices it serves. All data will be continuously analyzed via real-time streaming, with both human-defined and machine learning capabilities. In a nutshell, we should expect that every metric we currently use to define productivity in the data center will be drastically improved upon.
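As a small illustration of the “continuously analyzed via real-time streaming” idea, the sketch below applies a simple human-defined rule to a stream of vibration readings: it tracks an exponentially weighted mean and variance and flags samples that land far outside the recent norm. The smoothing factor, threshold and sample values are hypothetical, and a real DCIoT platform would pair rules like this with learned models.

```python
# Streaming anomaly sketch: flag readings that drift far from the recent norm.
# Smoothing factor, threshold and sample values are hypothetical illustrations.
class StreamingAnomalyDetector:
    def __init__(self, alpha: float = 0.05, threshold: float = 4.0, warmup: int = 5):
        self.alpha = alpha          # weight given to each new sample
        self.threshold = threshold  # deviations-from-norm needed to flag
        self.warmup = warmup        # samples to observe before flagging
        self.mean, self.var, self.count = 0.0, 0.0, 0

    def update(self, value: float) -> bool:
        self.count += 1
        if self.count == 1:
            self.mean = value       # first sample seeds the statistics
            return False
        deviation = value - self.mean
        std = self.var ** 0.5
        is_anomaly = (self.count > self.warmup and std > 0
                      and abs(deviation) > self.threshold * std)
        # Exponentially weighted updates of the running mean and variance
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return is_anomaly

detector = StreamingAnomalyDetector()
for reading in [0.11, 0.12, 0.10, 0.13, 0.11, 0.95]:   # e.g. fan vibration in g
    if detector.update(reading):
        print(f"anomaly: vibration {reading} g is far outside the recent norm")
```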

As we prepare ourselves for the Data Center Internet of Things, the following top six driving principles can help define the foundational basis for a platform to build upon:

AGGREGATION AND SHARING: Recognize that your legacy IoT of existing SCADA, PLCs, controllers, BMS and DCIM needs to work alongside any new IoT devices. Modbus needs to be able to communicate with RESTful APIs, and in turn your solution needs to be able to aggregate and communicate with both older and newer machines (a minimal bridging sketch follows below). The platform should offer disaggregated compute capabilities, so that new things can be done centrally-- so as to not be limited by the resources of low performance, legacy embedded systems. Ensure that everything necessary can be shared between your team of people and machines, via tight permissions and high security.
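As a rough illustration of that kind of aggregation, here is a minimal Modbus-to-REST bridge sketch in Python, assuming the pymodbus and Flask libraries. The controller address, register map and scaling are hypothetical placeholders, and exact pymodbus call signatures vary between library versions.

```python
# Minimal Modbus-to-REST bridge sketch (addresses and register map are hypothetical).
# Assumes the pymodbus 3.x and Flask libraries; adjust for your environment.
from flask import Flask, jsonify
from pymodbus.client import ModbusTcpClient

app = Flask(__name__)
PLC_HOST = "192.168.1.50"   # hypothetical legacy controller address

@app.route("/sensors/crah1")
def read_crah1():
    client = ModbusTcpClient(PLC_HOST, port=502)
    if not client.connect():
        return jsonify({"error": "controller unreachable"}), 502
    # Hypothetical map: holding register 0 = supply temp x10, register 1 = fan speed %
    result = client.read_holding_registers(address=0, count=2, slave=1)
    client.close()
    if result.isError():
        return jsonify({"error": "modbus read failed"}), 502
    return jsonify({
        "supply_temp_c": result.registers[0] / 10.0,
        "fan_speed_pct": result.registers[1],
    })

if __name__ == "__main__":
    app.run(port=8080)   # newer devices and apps consume the JSON endpoint
```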

SECURITY AND PRIVACY: Currently, it is very likely that your existing facility control topology has major gaps in the areas of permissions management, authentication, and accounting. Many of these systems communicate in clear text. Make sure your new aggregation platform offers 100% encryption for all data in transmission and all data at rest (both network and data layer encryption). The right platform can then help enable the ability to network-isolate your old, insecure gear. Make sure that no keys are shared between users. Encryption should be on a per-user basis-- and the system should only know one side of the key pair. Speaking of key pairs-- ensure the level of encryption meets currently acceptable means of security (such as AES-256 with a 4096-bit RSA key), as sketched below.
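To make the “AES-256 with a 4096-bit RSA key” recommendation concrete, here is a minimal hybrid-encryption sketch using Python's cryptography library: each message is encrypted with a fresh AES-256-GCM key, and that key is wrapped with the user's RSA-4096 public key, so the platform only ever needs the public half of the key pair. It is an illustrative sketch, not a hardened implementation.

```python
# Hybrid encryption sketch: per-message AES-256-GCM key wrapped with RSA-4096 OAEP.
# Illustrative only; real deployments also need key management, rotation and auditing.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The user generates this pair; the platform only ever holds the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
public_key = private_key.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_for_user(plaintext: bytes, user_public_key) -> dict:
    data_key = AESGCM.generate_key(bit_length=256)   # fresh AES-256 key per message
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    wrapped_key = user_public_key.encrypt(data_key, OAEP)
    return {"nonce": nonce, "ciphertext": ciphertext, "wrapped_key": wrapped_key}

def decrypt_with_private_key(blob: dict, user_private_key) -> bytes:
    data_key = user_private_key.decrypt(blob["wrapped_key"], OAEP)
    return AESGCM(data_key).decrypt(blob["nonce"], blob["ciphertext"], None)

blob = encrypt_for_user(b"chiller-3 alarm history", public_key)
assert decrypt_with_private_key(blob, private_key) == b"chiller-3 alarm history"
```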

MOBILE FIRST: It’s rare to see a desk in the data center, and the proposition of needing to “run back to my desk” has become unproductive in today’s “mobile first” world. Most people now spend more time on their mobile devices than they do on the desktop. Give them the ability to do their job wherever they may be.

CLOUD ENABLED, BUT NOT CLOUD DEPENDENT: The ability to securely connect data to the cloud enables integration with more capabilities, applications, services and users (via mobile APIs, etc.) However, in the data center space you should never create a cloud dependency for any type of critical functionality. For important things, if you’re looking to be cloud enabled / integrated-- make sure you’re also not cloud dependent. One pattern for doing so is sketched below.
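One simple way to honor that principle is to keep all control decisions local and treat cloud publishing as best-effort. The sketch below (endpoint, sensor reads and setpoint logic are hypothetical placeholders) buffers telemetry locally and flushes it when the cloud is reachable, but the control loop never waits on the cloud.

```python
# Local-first control loop sketch: cloud sync is best-effort, never a dependency.
# Endpoint, sensor reads and setpoint logic are hypothetical placeholders.
import collections
import time
import requests

CLOUD_URL = "https://dciot-cloud.example.com/api/telemetry"   # hypothetical endpoint
buffer = collections.deque(maxlen=10_000)   # bounded local buffer for offline periods

def read_inlet_temp() -> float:
    return 24.5          # placeholder for a real sensor read

def apply_setpoint(value: float) -> None:
    pass                 # placeholder for a real actuator write

def control_step() -> dict:
    temp = read_inlet_temp()
    setpoint = 22.0 if temp > 27.0 else 24.0    # trivial, purely local decision
    apply_setpoint(setpoint)
    return {"ts": time.time(), "inlet_c": temp, "setpoint_c": setpoint}

def try_flush_to_cloud() -> None:
    while buffer:
        try:
            requests.post(CLOUD_URL, json=buffer[0], timeout=2)
        except requests.RequestException:
            return       # cloud unreachable; keep buffering, local control unaffected
        buffer.popleft()

while True:
    buffer.append(control_step())   # critical functionality runs locally, always
    try_flush_to_cloud()            # cloud integration is opportunistic
    time.sleep(10)
```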

EASY TO USE: In data centers today, data visualization and dashboards are typically limited to whatever came “in the box,” or require complex custom programming. Orchestration and automation typically involve bringing on subject matter experts that have the “I’ve been doing this for 25 years” job requirement. Make sure your apps have learned the lessons in user experience and user interface brought to us by the best-in-class consumer apps over the years. Complex training and thick books should be a thing of the past when using a modern app, built with humanistic UI/UX in mind.

OPEN SOURCE: Proprietary technology moves slowly and rarely behaves like a good neighbor with other vendors. Open source ensures higher security, faster evolution, faster bug repairs and easier cross-functionality with anything else. Make sure API formats are published. Make sure app capabilities are not limited to a single vendor. Open source has greatly advanced and improved the IT (server/network) side of the data center. The same needs to occur in other areas of the data center as well.

With the coming era of continuous software-defined advancements on top of commodity hardware approaching, the DCIoT will enable practically endless opportunities-- some of which can even be tough to comprehend using today’s thinking. The biggest challenge we will face as an industry is the inherent fear of change that tends to accompany the mindset of an operator. While advancements in systems/network/storage hardware and software continue to occur in the front of house in the data center, other silos in back have remained stuck in the past. In a nutshell, the Internet of Things is coming full force to the world, and we will have some work to do. That’s exciting.


Written by Scott Noteboom

Scott Noteboom serves as CEO/founder of Litbit, a company that allows you to create, mentor, and manage AI-empowered Co-Workers that augment your intelligence and productivity in the workplace.
