Exploring WAN data acceleration: Is edge computing really necessary?

Graham Jarvis is a freelance business and technology journalist and PR consultant. He covers topics such as blockchain, cryptocurrencies, fintech, ICT networking, edge computing, cloud computing, co-location, BYOD, ICT security, autonomous vehicles, data centres, the Internet of Things, marketing, customer services, remote working, and education.

Most traditional approaches to dealing with network and data latency, as well as packet loss, fall short of their promise. This includes WAN optimisation. SD-WANs perform well, but their performance can also be improved by adding a WAN data acceleration layer. With the growth of the Internet of Things, organisations are therefore looking for new ways to reduce the impact of latency and packet loss.

David Linthicum, chief cloud strategist at Deloitte, explains edge computing’s benefits: “By eliminating the distance and time it takes to send data to centralised sources, we can improve the speed and performance of data transport, as well as devices and applications on the edge.” This sounds great, but why are people talking about edge computing now, when the concept has existed in many respects under other names for some time?

With technologies such as autonomous vehicles, the answer appears to lie in edge computing, because it permits much of the data analysis to be conducted close to the data while minimising latency and packet loss. However, questions should still be raised about whether this pushes data too close to the edge. One thing is true: edge computing won’t necessarily make data centres redundant. There will still be a need for data to be stored away from the edge to ensure business and service continuity.

Data processing trends

However, the new Gartner Trend Insight Report, ‘The Edge Completes the Cloud’, states: "By 2022, more than 50% of enterprise-generated data will be created and processed outside the data centre or cloud." David Trossell, CEO and CTO of Bridgeworks, thinks these findings are bang on. He says the multinational cloud companies have made edge computing deployments easier.

“In the past edge was difficult, as you would have to create your own little data centre or facility, but now it’s all there in the cloud; and it’s the savings in operational expenditure (OPEX) and capital expenditure (CAPEX) that everyone loves at the moment”, he adds.

Edge computing clarity

In response to whether people are confused about what edge computing entails, Trossell says: “I don’t think there is confusion – its function and reason are straightforward. Confusion only arises when people don’t understand why it is needed or try to misuse it.”

Eric Simone, CEO of ClearBlade, a company promoting the benefits of edge computing, nevertheless describes it as one of the latest applications of the age-old philosophy and practice of distributed computing.

He recently explained to IoT World Today that his definition of edge computing is about having a solution that runs on the cloud or elsewhere, syncing it completely with a device in a car, on a train or in a factory, and “run[ning] it from there.” This requires some standardisation, not just of the way organisations transmit data, but also of how they configure the entire system.

Nothing new

Trossell concurs that nothing is new in computing. “Edge is just a new way of solving age-old problems," he says. "Now that the market consists of the major cloud providers and other smaller local cloud providers, edge computing is more economically viable.

“Like every aspect of computing, edge computing is not a panacea for all problems, but it’s great for low-latency applications. However, if the application doesn’t need that level of response or local computing capacity, then it is not the answer.”

Speaking about the question of whether edge will lead to the demise of data centres, he says there are different levels of storage and computation required. At some point, all of the edge data needs to come together for processing. “Data centres meet those high-end requirements – this could be on-premise or in the cloud”, he suggests.  

He observes that edge computing has a specific use case: low-latency and critical applications. He emphasises that latency can only be mitigated, not eliminated, by WAN data acceleration and other technologies, so the answer is often to move the data closer to the end point, as edge computing does. Even then, WAN data acceleration is still used to improve the efficiency of the WAN connection when moving the data back to the data centre.
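The point about moving data closer to the end point can be made concrete with the TCP bandwidth-delay product: a single stream's throughput is capped at its window size divided by the round-trip time, so distance alone throttles a transfer. A minimal sketch, where the window size and RTT figures are illustrative assumptions rather than measurements from any vendor:

```python
# Maximum single-stream TCP throughput is bounded by window / RTT
# (the bandwidth-delay product). All figures below are illustrative.

def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on one TCP stream's throughput, in megabits per second."""
    rtt_s = rtt_ms / 1000.0
    return (window_bytes * 8) / rtt_s / 1_000_000

# 64 KiB window, 5 ms RTT (a nearby edge site)
print(round(max_throughput_mbps(64 * 1024, 5), 1))    # 104.9 Mbit/s

# Same window, 80 ms RTT (a distant central data centre)
print(round(max_throughput_mbps(64 * 1024, 80), 1))   # 6.6 Mbit/s
```

The same link and the same window deliver roughly sixteen times less throughput per stream once the round trip grows from 5 ms to 80 ms, which is why latency, not bandwidth, is often the binding constraint.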

Tips for mitigating latency

With these points in mind, Trossell offers his top five tips for mitigating latency and reducing packet loss, with or without edge computing:

  • Assess the impact of latency and packet loss on your project because they are a fact of life
  • Remember that even though you may have solved the latency issue by moving control to the edge, getting data to and from the edge can still scupper the project
  • Any technology deployed at the edge to mitigate these must be lightweight in terms of computing and storage requirements
  • Ensure that your edge computing works for all data types, especially encrypted data
  • Consider deploying a combination of solutions, including WAN data acceleration – or even SD-WANs with a WAN data acceleration overlay

“As people move to more and more connected devices that have to respond immediately to situations, such as autonomous vehicles in motion, plant equipment or process controls, then there is an absolute need for edge computing," Trossell adds. "There is also an argument for edge-to-edge requirements, such as with autonomous vehicles. The primary edge, within the vehicle itself, makes split-second decisions, while the secondary edge receives and transmits information between the central data centre and the vehicle.”

The edge is but one solution

Considering the views of the industry experts cited in this article, it is clear that in many cases edge computing is required. However, it’s not the only technology that’s necessary. A plethora of technologies may provide the complete solution, and this could include WAN data acceleration. It uses machine learning to accelerate data across WANs, and in contrast to edge computing it doesn’t require each data centre to be situated within the same circles of disruption.
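A common technique behind such acceleration layers is to run many TCP streams in parallel so that the per-stream latency cap matters less. How Bridgeworks' machine learning actually tunes this is not public, so the sketch below is a generic illustration with assumed figures, not any vendor's algorithm:

```python
# Generic sketch of parallel-stream WAN acceleration: each TCP stream is
# capped at window / RTT, but aggregate throughput scales with the number
# of concurrent streams. Figures are illustrative assumptions only.

def aggregate_mbps(window_bytes: int, rtt_ms: float, streams: int) -> float:
    """Combined upper bound, in Mbit/s, for `streams` parallel TCP flows."""
    per_stream = (window_bytes * 8) / (rtt_ms / 1000.0) / 1_000_000
    return per_stream * streams

# 64 KiB windows over an 80 ms long-haul link:
print(round(aggregate_mbps(64 * 1024, 80, 1), 1))   # one stream: 6.6 Mbit/s
print(round(aggregate_mbps(64 * 1024, 80, 32), 1))  # 32 streams: 209.7 Mbit/s
```

This is why an acceleration layer can keep a long-haul WAN link busy even when each individual flow is latency-bound: the adaptive part, in products of this kind, lies in deciding how many streams to open and how to feed them.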

This means that any data that has been gleaned at the edge, can be sent for processing and analysis thousands of miles away. It can also be backed up and then rapidly restored whenever disaster strikes. WAN data acceleration can therefore complement edge computing – giving life to new ways to tackle latency and packet loss.

Update 15 May: A previous version of this article stated the incorrect employer of David Linthicum. This has since been fixed.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.
