The cloud remains very popular with businesses and investors alike, but as with every other technology, business needs drive innovation and change, and cloud computing is no exception. Where exactly is cloud computing headed? Well, it is morphing into edge computing. You can think of edge computing as the anti-cloud. It is here to stay and grow because next-generation applications will focus on machine-to-machine interactions rather than device-to-human interactions. The Internet of Things (IoT), artificial intelligence, and machine learning all require gathering and processing massive amounts of data. That data is not generated in the cloud; it is generated at the edge. The challenge for the cloud comes down to a fundamental law of nature: the speed of light. The speed of light sets a hard limit on how quickly a signal can travel from point A to point B. That limit is fixed; it takes the same amount of time today as it did a hundred years ago.
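
To put rough numbers on that limit, here is a quick back-of-the-envelope sketch in Python. The distances and the two-thirds-of-light-speed figure for fiber are illustrative assumptions, not measurements from any particular network:

```python
# A rough sketch of why distance to the cloud matters: propagation delay
# is bounded by the speed of light. Figures below are illustrative
# assumptions, not measurements.

SPEED_OF_LIGHT_KM_S = 299_792                   # speed of light in a vacuum, km/s
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 0.67   # roughly 2/3 c in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation time over fiber, in milliseconds."""
    return (2 * distance_km / FIBER_SPEED_KM_S) * 1000

# A sensor talking to a cloud region 2,000 km away vs. an edge site 10 km away.
print(f"Cloud  (2,000 km): {round_trip_ms(2000):.1f} ms minimum")
print(f"Edge   (   10 km): {round_trip_ms(10):.2f} ms minimum")
```

And that is only propagation time; routing, queuing, and processing all add to it.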

In computing, we define latency as the time it takes to complete a specific action. Latency is usually a more critical speed factor than bandwidth. In an IoT transaction, most of the time is not spent processing the data but waiting for it to travel to and from the cloud; that wait is latency. There is only one way to reduce transaction time, and that is to place the transaction engine closer to the device itself. Edge computing establishes local mini-datacenters, much the way a wireless operator places individual cell towers. With edge computing, the transaction between an end sensor and the primary system occurs at the local level, improving transactional performance by a factor of ten or more. Data is still forwarded to the cloud for aggregation, but timing is no longer critical because the transaction has already completed.
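
The pattern is simple enough to sketch. The snippet below is a minimal illustration, not a real edge platform: the function and sensor names are made up, and the 100-millisecond sleep merely stands in for a cloud round trip. The latency-critical decision happens locally, while readings are handed off to a background worker for cloud aggregation:

```python
# Minimal sketch of the edge pattern described above: act on sensor data
# locally, then forward it to the cloud for aggregation when time is no
# longer critical. Names (handle_locally, forward_to_cloud) are illustrative.

import queue
import threading
import time

cloud_queue: "queue.Queue[dict]" = queue.Queue()

def handle_locally(reading: dict) -> str:
    """Latency-critical decision made at the edge, close to the sensor."""
    return "SHUT_VALVE" if reading["pressure"] > 100 else "OK"

def forward_to_cloud() -> None:
    """Background worker: ships readings upstream for aggregation and analytics."""
    while True:
        reading = cloud_queue.get()
        time.sleep(0.1)            # stand-in for a ~100 ms cloud round trip
        print(f"aggregated in cloud: {reading}")
        cloud_queue.task_done()

threading.Thread(target=forward_to_cloud, daemon=True).start()

reading = {"sensor_id": 42, "pressure": 117}
action = handle_locally(reading)   # millisecond-scale, local decision
cloud_queue.put(reading)           # non-blocking hand-off to the cloud path
print(f"edge decision: {action}")
cloud_queue.join()
```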

Edge computing is becoming more and more necessary because, as machine automation and sensors multiply, machines will be feeding off each other's data, and that data will need to be shared almost instantly. Take the case of two driverless cars. To avoid a collision, the vehicles need each other's speed and location, and that data has to be shared fast; there isn't enough time to send it up to the cloud and get the required data back. Coordination of this kind of data sharing across classes of devices and machines will be localized. Consequently, the biggest performance improvements will come from reducing transaction latency, or transaction time.
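
A rough calculation shows why. Assuming a highway speed of 110 km/h and edge versus cloud latencies of roughly 5 and 100 milliseconds (assumed values for illustration), here is how far a car travels while waiting for shared data to arrive:

```python
# Back-of-the-envelope illustration of why latency matters for two
# driverless cars sharing speed and position data. Speed and latency
# values are assumptions for illustration.

def distance_traveled_m(speed_kmh: float, latency_ms: float) -> float:
    """How far a car moves while waiting for shared data to arrive."""
    speed_m_per_ms = (speed_kmh * 1000) / 3_600_000
    return speed_m_per_ms * latency_ms

for label, latency_ms in [("edge (~5 ms)", 5), ("cloud (~100 ms)", 100)]:
    gap = distance_traveled_m(speed_kmh=110, latency_ms=latency_ms)
    print(f"{label:>15}: car travels {gap:.2f} m before the data arrives")
```

At highway speed, a cloud round trip costs roughly a car length of travel; an edge round trip costs centimeters.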

Our current cloud-based systems cannot deliver ultra-low latency because the speed of light is a physical limitation. In a localized transaction, the latency, or transaction time, might fall in a range of one to ten milliseconds, while a cloud transaction would take one hundred milliseconds or more. For machine-to-human interactions, the difference would not even be detectable, but for a machine-to-machine transaction, that difference is like waiting an eternity. The only way to achieve ultra-low latency is to add data storage systems and mini-datacenters at the edge. Edge clouds will form the basis for a whole new generation of machine-to-machine interactions that will enable an entirely new set of capabilities.
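
Those figures compound quickly when machines react to one another in sequence. The sketch below chains ten machine-to-machine interactions (an assumed workload) using mid-range values from the ranges above:

```python
# Sketch of how per-transaction latency compounds in a machine-to-machine
# pipeline, using the rough figures above (1-10 ms at the edge vs. ~100 ms
# via the cloud). Ten chained interactions is an assumed workload.

EDGE_LATENCY_MS = 5      # mid-range of the 1-10 ms local figure
CLOUD_LATENCY_MS = 100   # typical cloud round trip cited above
CHAINED_STEPS = 10       # machines reacting to each other in sequence

edge_total = EDGE_LATENCY_MS * CHAINED_STEPS
cloud_total = CLOUD_LATENCY_MS * CHAINED_STEPS

print(f"edge pipeline : {edge_total} ms total")
print(f"cloud pipeline: {cloud_total} ms total ({cloud_total / edge_total:.0f}x slower)")
```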

Like the mainframes and PCs that came before it, cloud computing will keep its place in technology. Edge computing is not evolving to replace cloud computing but to enable the next wave of computing.

Sep 11, 2020
Christina Zumwalt
