Edge vs. central IT: Where do my apps and services belong?
The Internet of Things (IoT) promises to connect everything that can be connected. But grasping its inevitability only gets you so far. Organizations must meticulously plan and execute their IoT implementations to produce the best outcomes possible.
You need a strategy to handle the inflow of information from sensors and monitors in the field, and to guide decisions about where to locate the compute power, storage, and analytics engines essential to successful IoT implementations.
Why edge computing matters
So where do your apps and services belong? Some will inevitably reside in the cloud, but cloud infrastructures cannot efficiently handle the massive loads of data the IoT is expected to generate. Despite the cloud’s scalability, cost-effectiveness, and support for future architectures, latency issues can get in the way of the real-time processing necessary for IoT implementations.
Real-time processing won’t be happening at the core network, either. The cloud’s raison d’être, after all, is to relieve central IT of ever-increasing demands for data processing, analysis, and storage. Some other solution is needed between the core and the cloud. And that’s where edge computing comes in.
Edge allows you to place compute power closer to the action—the network edge. This is where many of the IoT’s analytics and monitoring applications will reside to enable real-time decision-making. As IoT implementations get under way, a web of micro data centers will sprout at the edge. They will act as way stations between cloud servers, core IT, and the vast networks of sensors and monitors that capture and transmit data. Like a rail system with stops between major hubs, these micro data centers will ideally transform the IoT into a well-organized data delivery system.
Edge computing promises to play an essential role in the network of the future as it evolves to accommodate IoT needs. That network will be a hybrid combining cloud, edge, and central IT components, with applications—or pieces of applications—residing in these distinct but integrated areas.
Location, location, location
As in real estate, edge computing comes down to location. The closer you place processing to where data is generated, the more agile your organization becomes. Now you don’t have to wait for data to travel hundreds or thousands of miles from the source to a cloud data center, only to be processed and redirected to a technician or analyst in front of a dashboard somewhere else.
Funneling data to the cloud potentially wastes precious seconds—or even minutes—that can make a real operational difference. A driverless car at an intersection can’t wait several seconds for information from the cloud before it starts moving again. If the vehicle sits there too long waiting for data, it is bound to cause a traffic snag or even an accident.
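The latency argument above can be made concrete with a little arithmetic. The sketch below computes how far a vehicle travels during one network round trip; the round-trip times and speeds are illustrative assumptions, not measurements of any particular network.

```python
# Illustrative latency budget: how far a vehicle travels while waiting
# for one network round trip to a distant cloud vs. a nearby edge node.
# The RTT values and speeds below are assumptions for illustration only.

def distance_traveled_m(speed_kmh: float, round_trip_ms: float) -> float:
    """Meters a vehicle covers during one network round trip."""
    speed_m_per_s = speed_kmh * 1000 / 3600       # km/h -> m/s
    return speed_m_per_s * (round_trip_ms / 1000)  # ms -> s

CLOUD_RTT_MS = 120.0  # assumed round trip to a remote cloud region
EDGE_RTT_MS = 5.0     # assumed round trip to a roadside edge node

# A car at 50 km/h covers roughly 1.7 m waiting on the cloud,
# versus about 7 cm waiting on the edge node.
print(f"cloud: {distance_traveled_m(50, CLOUD_RTT_MS):.2f} m")
print(f"edge:  {distance_traveled_m(50, EDGE_RTT_MS):.2f} m")
```

At highway or racetrack speeds the gap widens proportionally, which is why the decision loop has to close near the road rather than in a distant data center.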
As connected cars become more sophisticated, they will be able to communicate with each other about road and weather conditions. For instance, location services company HERE has teamed with Nokia Liquid Applications to use an LTE network to warn vehicles as they approach road hazards.
“Edge computing is used to extend existing cloud services into the highly distributed mobile base station environment, so that road hazards can be recognized and warnings can be sent to nearby cars with extremely low latency,” according to a Nokia blog post. Google's Waze mobile navigation application provides a similar service, though it relies on humans to report traffic slowdowns and potential road hazards.
Edge computing has a place not only on regular roadways, but also on the racetrack, where cars running at 140 mph can transmit sensor data to the pit crew. This scenario is already a reality in Formula E, where the DS Virgin Racing team uses the compute power of a trackside data center provided by Hewlett Packard Enterprise to optimize car performance.
“Streaming data is analyzed at the point of collection, providing real-time insight that allows [the team] to make real-time adjustments to maximize the systems that control their car, and hopefully win the race,” says Kelly Pracht, senior manager of HPE Edgeline IoT Systems, in a recent blog post. “After the race, aggregate data is analyzed for deeper insights.”
The power of immediacy
Away from roadways and racetracks, edge computing is starting to make a difference in other industries. For example, healthcare providers increasingly rely on connected devices that deliver vital information to applications monitored by medical personnel.
At-home monitoring devices track patients’ weight, blood pressure, heart rate, insulin levels, and other metrics. The data travels to software monitoring systems that issue alerts to the smartphones, tablets, and stationary monitors of nurses and doctors if intervention is needed. Any latency here is potentially a life-and-death situation. The same is true in tele-ICU, which allows critical care medical personnel to connect remotely with intensive care unit patients through real-time audio, visual, and data links.
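The monitoring flow described above is, at its core, a triage decision made close to the patient: raise an alert immediately at the edge rather than waiting on a cloud round trip. The sketch below is a minimal illustration of that idea; the metric names and threshold values are invented for the example, not clinical guidance.

```python
# Minimal sketch of edge-side vitals triage. The metrics and safe
# ranges below are illustrative assumptions, not medical thresholds.

THRESHOLDS = {
    "heart_rate_bpm": (40, 120),   # assumed safe range
    "systolic_mmHg": (90, 160),    # assumed safe range
}

def triage(reading: dict) -> list:
    """Return alerts to raise immediately at the edge for any metric
    outside its safe range; in-range readings are simply forwarded."""
    alerts = []
    for metric, (low, high) in THRESHOLDS.items():
        value = reading.get(metric)
        if value is not None and not (low <= value <= high):
            alerts.append(f"ALERT: {metric}={value} outside [{low}, {high}]")
    return alerts

print(triage({"heart_rate_bpm": 150, "systolic_mmHg": 130}))
```

The design point is that the alert fires locally, in milliseconds; the cloud still receives every reading later for longer-term trend analysis.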
Slow-loading screens or pixelated video images won’t cut it in these scenarios. However, not all edge computing instances come down to life and death. In retail environments, for example, the combination of Wi-Fi and smartphones can create Internet-like shopping experiences.
A shopper who has previously registered for the store’s Wi-Fi connection will be recognized by the network as she walks in. Wi-Fi analytics software brings up relevant information such as previous purchases and time spent at the store. The system tracks the shopper through the store and sends promotional information to nearby digital displays or texts coupons to her smartphone. The goal is to get the customer to spend more money while feeling the retailer is attuned to her needs and wants.
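The recognize-and-promote flow above can be sketched as a simple lookup on the shopper's registered device. Everything in this snippet is hypothetical: the device ID, the profile and promotion data, and the `on_wifi_join` callback name are all invented to illustrate the flow, not any vendor's actual analytics API.

```python
# Hypothetical sketch of the retail flow described above: recognize a
# returning shopper by registered device ID and pick a promotion from
# purchase history. All IDs, data, and function names are invented.
from typing import Optional

PROFILES = {
    "aa:bb:cc:dd:ee:ff": {"opted_in": True, "last_category": "shoes"},
}
PROMOS = {"shoes": "20% off new arrivals in footwear"}

def on_wifi_join(device_id: str) -> Optional[str]:
    """Called when a device associates with the store Wi-Fi; returns a
    targeted promotion, or None for unknown or non-opted-in devices."""
    profile = PROFILES.get(device_id)
    if profile is None or not profile["opted_in"]:
        return None
    return PROMOS.get(profile["last_category"])

print(on_wifi_join("aa:bb:cc:dd:ee:ff"))
```

Because the lookup runs against an in-store edge system, the display or coupon can react while the shopper is still standing in the relevant aisle.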
Where the cloud excels
Edge data centers will be essential to IoT adoption in hybrid environments where real-time decisions are paramount. However, cloud infrastructures will still provide essential scalability, flexibility, and storage for certain applications.
The cloud can handle massive volumes of data for which no immediate action is required. Analysts can mine that data later to identify patterns and trends that can be used for preventive maintenance and predictive purposes. For instance, cybersecurity solutions are being developed that identify the sources and methods of attacks to forecast future attacks, giving organizations a greater chance at preventing breaches.
Long-term, large-scale data storage will remain an essential cloud function. So will web applications subject to seasonal fluctuations, such as retail websites that need extra capacity during the holidays or accounting applications that must scale up during tax filing season.
The cloud also makes sense for applications for which demand is hard to predict, along with testing environments and—increasingly—mobile app development and management. Cloud-based software development accelerates the development process and keeps down costs, helping organizations achieve the agility they need to compete in fast-paced markets.
What to keep in-house
At least for the foreseeable future, certain applications will need to stay on premises. There are compelling reasons for this. In some cases, it's more expensive to move applications to the cloud or replace them with cloud apps. Some executives still get nervous about giving up direct control of assets by moving them elsewhere. And there are lingering concerns about cloud security, privacy, and regulatory compliance.
From a technical standpoint, a compelling case for keeping applications in house can be made based on these criteria:
- Applications that would require extensive redevelopment and integration to run efficiently in a cloud environment
- Applications requiring extensive customization to meet corporate requirements
- Applications tightly linked to vast, complex databases
- Applications whose cloud-based counterparts lack required functionality
- Mainframe applications that serve as hubs of data integration, such as enterprise service bus software, and can’t be moved without also moving all dependent applications
Hybrid environments combining edge, cloud, and in-house assets will become as commonplace as client-server systems were not long ago. Years from now we won't even use the word "hybrid" to describe these environments. Instead we'll call them "the network."
On the edge: Lessons for leaders
- It is critical to understand how data is generated and consumed across your enterprise infrastructure.
- The correct balance of edge and central IT will enhance IT agility while retaining security and reliability.
- Proper implementation will reduce OpEx for services and applications through lower networking costs and minimized latency, while delivering a better experience for consumers.