Questions You Should Ask When Considering Edge Computing for Your Enterprise

If you are trying to determine the purpose of edge computing and how it ties into your DevOps adoption process, you are certainly not alone; at this moment, many others are also taking their first steps into this new world. While the finer details of edge computing technology are well worth discussing, this post focuses on how edge computing needs to be understood, deployed, and ultimately adopted before definitive market indicators emerge to demonstrate its future applications. What I want to share with you are the questions I am currently asking about edge computing, which I hope you can use in your own assessment.

A 2018 report from McKinsey states, “As connected devices proliferate and their capabilities expand, so does the need for real-time decision making untethered from cloud computing’s latency, and from connectivity in some cases. This movement of computational capacity out of the cloud—to the edge—is opening up a new sector: edge computing.”

This brings us to our first question:

  1. Will real-time decision making and less dependence on connectivity create value for a business?

To answer this, it helps to reframe the question. Today, I don't have any specific ideas about how real-time decisions would assist my business, but:

  2. If my investment in DevOps and the cloud could extend my application's reach to the edge, could a faster response time generate value?

We have to realize that edge computing is fundamentally about local decision-making. If your IT organization is evolving toward containerization atop a Continuous Integration/Continuous Delivery (CI/CD)-driven SDLC, your chances of making these local decisions are higher. Ultimately, if the enterprise is moving toward automation and robotics at distributed locations, effectively creating a "micro-industrial" environment in which performance depends increasingly on local data rather than on generalized or centralized data from remote sources, then the compute to make those local decisions needs to live at the edge.
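To make the idea of local decision-making concrete, here is a minimal sketch of an edge service that acts on local data immediately and reports back to the cloud only when connectivity allows. The endpoint URL, temperature threshold, and sensor payload are hypothetical and purely illustrative:

```python
# Minimal sketch: decide locally at the edge, sync with the cloud opportunistically.
# The endpoint, threshold, and payload below are hypothetical.
import queue
import requests

CLOUD_ENDPOINT = "https://cloud.example.com/telemetry"  # hypothetical endpoint
TEMP_LIMIT_C = 75.0                                     # hypothetical local rule
pending = queue.Queue()                                 # buffer while offline

def handle_reading(sensor_id: str, temperature_c: float) -> str:
    """Decide locally in real time; never block on the cloud."""
    action = "shutdown" if temperature_c > TEMP_LIMIT_C else "continue"
    pending.put({"sensor": sensor_id, "temp": temperature_c, "action": action})
    return action

def sync_to_cloud() -> None:
    """Drain the buffer whenever connectivity happens to be available."""
    while not pending.empty():
        record = pending.get()
        try:
            requests.post(CLOUD_ENDPOINT, json=record, timeout=2)
        except requests.RequestException:
            pending.put(record)   # still offline; try again later
            break

# The decision is made immediately, with or without a WAN link.
print(handle_reading("fryer-3", 81.2))   # -> "shutdown"
sync_to_cloud()
```

The point is that the decision itself never waits on the WAN link; connectivity only affects how quickly central systems learn about it.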

Regarding the operational challenges of this emerging technology, I'm interested in knowing:

  3. Who manages these edge computing devices?

Is there yet another box to secure at the distributed location? This is where a parallel development becomes interesting. SD-WAN boxes are growing more powerful computationally, allowing fallow cores to be put to deliberate use in support of other applications. Some vendors, such as VeloCloud, are creating a Universal CPE (uCPE) that leverages Network Function Virtualization (NFV) architecture on a whitebox to host Virtual Network Functions (VNFs) that provide networking and security services while also offering additional compute resources for other virtualized applications deployed to these edge devices. This horizontal expansion of capability is being studied by forward-looking managed service providers (MSPs) like Hughes.
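As a rough illustration of what putting fallow cores to use could look like in practice, here is a minimal sketch that assumes the uCPE exposes a standard Docker runtime alongside its VNFs; the image name, core assignments, and resource limits are hypothetical:

```python
# Minimal sketch: pinning a virtualized application to spare ("fallow") cores
# on a uCPE that also runs VNFs, assuming a standard Docker runtime is available.
# The image name, core assignments, and limits below are hypothetical.
import docker

client = docker.from_env()

# Suppose the VNFs (routing, firewall, WAN optimization) are pinned to cores 0-1,
# leaving cores 2-3 free for the enterprise's own edge application.
container = client.containers.run(
    image="registry.example.com/store-analytics:latest",  # hypothetical edge app
    name="edge-analytics",
    detach=True,
    cpuset_cpus="2,3",      # keep the app off the cores reserved for network functions
    mem_limit="1g",         # cap memory so networking services are never starved
    restart_policy={"Name": "unless-stopped"},
)
print(f"Edge application started: {container.short_id}")
```

Pinning the application to cores the VNFs do not use is one way to host a customer-facing workload on the same box without putting the networking and security services at risk.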

Wouldn’t you want your MSP to assist in the management of the device that now hosts your mission‑critical, customer-facing application?

We are on the cusp of widespread adoption of edge computing. As use cases begin to emerge within major enterprises such as Chick‑fil-A and Target, it is becoming clear that edge computing makes sense when your IT design can leverage compute at the edge.

As artificial intelligence and machine learning are gradually integrated with your applications and become capable of leveraging local data to make local decisions, this added, managed compute at the edge can contribute a new dimension of reliable, real-time decision-making to your enterprise. Converging the service layers for networking, security, and application management at the edge gives us plenty of food for thought.