Splunk. Glassbeam. Azure. Amazon Web Services.
If you are a CIO, get used to these names, because they (or their competitors) are likely to become an active part of your IT infrastructure over the next few years as the Internet of Things moves from bleeding-edge concept to mission-critical reality. The Internet of Things (IoT) is the growing network of connected devices: everything from the FuelBand on your wrist and the refrigerator in your kitchen to the wind turbines providing your electricity and the jet engines thrusting you skyward. It is rapidly altering how CIOs need to engineer their data centers.
The high-profile examples to date have focused on industries like aerospace, where small operational improvements can lead to major savings or dramatic improvements in customer service. For example, the airline industry spends approximately $200 billion annually on fuel, so every 1% improvement in efficiency gleaned from more efficient in-flight decision-making means roughly $2 billion in savings. Likewise, real-time feedback from a jet engine experiencing an issue in-flight can mean faster repair turnaround on the ground, since parts and technicians can be ready when the flight arrives.
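A quick sanity check of that arithmetic, using only the figures cited above:

```python
# Back-of-the-envelope check of the fuel-savings claim.
annual_fuel_spend = 200e9   # industry-wide annual fuel spend, USD
efficiency_gain = 0.01      # a 1% efficiency improvement
savings = annual_fuel_spend * efficiency_gain
print(f"${savings / 1e9:.0f} billion in annual savings")  # $2 billion
```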
But machine-to-machine (M2M) interactions introduce a completely new data profile into the mix for CIOs. Current internet applications operate on a transactional basis: a user makes a request and a server responds. In M2M applications, data is supplied as a continuous, real-time stream that can add up to a very large final data set, and that may require equally real-time response streams to be sent back to the source device. Virgin Atlantic IT Director David Bulman noted that a single flight of the company's recently purchased 787 Dreamliners could generate up to half a terabyte of data. And getting fuel optimization programs in place means analyzing some of that data in real time to provide feedback to the flight crew.
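To put that half-terabyte figure in streaming terms, here is a rough estimate of the sustained data rate it implies. The 10-hour flight duration is an illustrative assumption (a typical long-haul sector), not a figure from the source:

```python
# Rough sustained data rate implied by "half a terabyte per flight".
# Assumption: a 10-hour long-haul flight (illustrative, not from the source).
bytes_total = 0.5 * 10**12        # half a terabyte, in bytes
flight_seconds = 10 * 3600        # assumed flight duration
rate_mbps = bytes_total * 8 / flight_seconds / 1e6
print(f"~{rate_mbps:.0f} Mbit/s sustained")  # ~111 Mbit/s
```

Even spread over a full long-haul flight, that is a continuous stream on the order of 100 Mbit/s from a single aircraft, which is the kind of workload profile that distinguishes M2M traffic from transactional web traffic.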
Storing all that data is one obvious implication for the data center, bringing smiles to the faces of executives at companies like EMC. However, there is no reason to collect data if you have no plan to use it, so analyzing large data sets is the next implication. Server capacity must adapt to new processing loads driven by entirely new software platforms like Splunk or Glassbeam, applications optimized for handling and analyzing large machine data sets.
But the implications go beyond the walls of the data center as well. Machine data implies collection from tens, hundreds, thousands, or millions of devices scattered around the globe. Moving this data in an optimal, secure, and real-time fashion implies sophisticated and creative integration of web services like Azure or Amazon Web Services. For CIOs, this opens yet another reason to evaluate hybrid architectures for the data center.
OK. It’s Real…So, Now What?
For CIOs evaluating data center plans, the Internet of Things must be part of the future capacity planning process, since miscalculation can significantly alter a company's competitive posture. Here are three tips for integrating an M2M strategy into your broader data center planning process:
- Be Integrated. First and foremost, the IT team needs to be fully integrated with product development and customer service planning processes since IoT demand will arise not from an IT requirement, but rather from real-world new product/service innovation. This means that demand forecasts in IT that may historically have only needed to account for classic administrative, finance, engineering and manufacturing workloads, will now need to account for real-time data exchange as part of product/service delivery. This makes IT part of design and customer service conversations—not just IT support.
- Be Web Integrated. As implied above, the networking and distributed processing demands of M2M streams mean opening new discussions about Web integration in the data center architecture. For both networking and remote processing, CIOs cannot overlook the importance and potential value of cloud-based services in supporting IoT workloads.
- Be Nimble. The Internet of Things is spawning yet another era of innovation and demand in the data center. From exploding demand for data scientists to a new round of capacity expansion, M2M interactions will most certainly shine a spotlight on IT, and good planning will be key to supporting this rapidly growing requirement.
How BRUNS-PAK Can Help
BRUNS-PAK’s proprietary design/build methodologies integrate an evolving array of strategies and tools for data center planning teams that must account for the potential impact of IoT workloads, including the need to fully integrate cloud services strategies. The BRUNS-PAK Hybrid Efficient Data Center Design program offers an iterative process that acknowledges both rapidly changing IT requirements and their associated facilities infrastructure needs, resulting in a strategic plan to address the evolving capacity and complex networking requirements created by M2M work streams. Through our expanded consulting services group, and well-established design/build services team, we can help you create a strategy that ensures your data center is as resilient and responsive as the devices you are monitoring around the globe!
- ComputerWeekly.com, "GE uses big data to power machine services business," http://www.computerweekly.com/news/2240176248/GE-uses-big-data-to-power-machine-services-business
- ComputerworldUK.com, "Boeing 787s to create half a terabyte of data per flight, says Virgin Atlantic," http://www.computerworlduk.com/news/infrastructure/3433595/boeing-787s-create-half-terabyte-of-data-per-flight-says-virgin-atlantic/