The past 15-20 years have seen a massive shift from on-premises software to cloud computing. We can now access everything we need from anywhere, without the limitations of fixed-location servers. Now, however, the pendulum is about to swing back toward decentralized computing. So why do we need edge computing?
Given the massive opportunities generated by cloud networks, that concept might seem counterintuitive. But in order for us to move to the next generation and take advantage of all the Internet of Things (IoT) has to offer, technology has to become local again.
Take a look at agricultural history to draw some parallels. A century or more ago, people consumed foods that were cultivated in their local area. If it didn’t grow or breed within 50-100 miles of where you lived, you probably wouldn’t have the opportunity to eat it.
Then, technology came along and opened new doors. Transportation got a lot faster, refrigeration meant food could travel without spoiling, and new farming techniques allowed for mass production. With these developments, consumers could access foods from all over the world.
We’re still taking advantage of easy access to global foods, but there’s been a shift back to local sources for a number of reasons. Shipping foods over long distances impacts the environment. Consumers want to contribute to their local economy. And many of us want fewer artificial ingredients in the foods we consume.
So what does that mean for cloud computing? Just like global food access, cloud computing isn’t going away completely. But much of the routine processing will move from the cloud to what’s now called “the edge”.