21 September 2016
There’s no denying that today’s enterprise networks are radically different from what they were just a handful of years ago. As a result, many experts argue that WAN optimisation is now more important than ever before.
“Quite simply, if you are unable to see what is going on within your network, how can you troubleshoot problem situations or detect abnormalities in usage of your infrastructure or your services by users?” asks Paul Griffiths, technical director for Riverbed’s advanced technology group.
He describes WAN optimisation as a “golden opportunity” to provide telemetry data on all the traffic flows to a centralised visibility and reporting platform. “Security may be top of CIOs’ minds, but with the sheer quantity of encrypted traffic flowing in today’s corporate networks, having transparency of how that data is flowing across the network should never be low on the priority list.”
Silver Peak agrees that optimisation has become more crucial in an era when companies are increasingly embracing virtualisation and the cloud. But when connecting users to applications across geographically distributed organisations, the company says network managers face new challenges that traditional WANs were never engineered to address. These include poor and unpredictable application performance, which hurts user productivity, and cloud applications consuming the WAN.
Nick Applegarth, Silver Peak’s VP of sales for EMEA, says: “While the rest of the infrastructure has become more fine-tuned for a cloud and virtual world, the WAN continues to be subject to the limitations associated with traditional private MPLS networks and branch office infrastructure. Today’s networking solutions need to be able to incorporate broadband and the internet into the WAN, not only for the potential cost savings involved, but to fully address enterprises’ changing connectivity requirements.”
Same old, same old?
So can network managers still use the same hardware/software that their company invested in years ago to optimise today’s networks? Or are these now past their use-by dates?
Chris Wade, commercial director with The Networking People (TNP), believes the integration of old and new should be a key element of any network progression strategy, particularly in the public sector where there is a heavy focus on value for money and the current drive to reduce cost.
“At TNP we often find that organisations, such as local authorities, are being pushed to replace relatively new infrastructure when this existing investment could be leveraged to optimise the network, thus allowing either cost savings or more focused investment in hardware and software to improve the core or security elements of the infrastructure.”
Mandana Javaheri, CTO of Savvius (formerly WildPackets), supports this approach to an extent: “There’s no denying that new hardware and software for network optimisation is expensive. However, with the right vendor support and upgrades, many of these tools can be used for quite a few years. It’s up to the network manager to understand the overall health of the network, and to be the judge of when a substantial upgrade or change is justified.”
Javaheri says the key here is to constantly test the network to ensure that everything is operating within acceptable parameters, and look at whether the existing solutions offer a good mix of performance, visibility and control over the network. “Is there a business need to get the most out of every protocol on the network? Does the GUI offer an intuitive and efficient experience with a good feature set such as SSL decryption? Does the solution provide solid reporting, and the ability to facilitate cloud-based application deployment? If not, then it may be time to start working on a shortlist.”
For Griffiths, the need for such a shortlist may come sooner rather than later. “There is some exaggeration in this comment, but it often seems that companies purchased their networking equipment back in the 1980s, installed it, locked it in a closet, and forgot about it, so it has never kept up with the times.”
More realistically, he says that while optimisation technology has developed in terms of speed, capacity and functionality, the administration approach to configuration and change control is still ‘per device’ via a CLI. “That may seem ridiculous but it shows how little has actually changed in terms of how IT approaches networking technology. Legacy management tools are prone to error, inflexible, and not fit for the business demands of the 21st century.”
So what should you look for when selecting an optimisation solution? The answer may not be as straightforward as you’d expect. According to Mav Turner, director of product management for SolarWinds, network managers need to be investing in the future while taking care of the present. But he reckons many software vendors struggle to provide the management capabilities for both cloud and on-premises applications. This leads to companies purchasing multiple solutions, which ultimately adds to the complexity.
According to Microlease technical manager Geoff Kempster, the IT department should begin the optimisation process by ensuring that the physical installation is fully compliant with the stipulated requirements. “This may involve the physical testing of LANs to certify their conformance to the relevant Cat 6 or Cat 7 standards. Alternatively, it could be verifying the characteristics of a fibre in a WAN, which may include dispersion testing and thorough optical time domain reflectometer measurements.”
Once the physical environment of the network has been verified, Kempster says data performance can then be tested. “This could be a simple RFC 2544 test of the network, but more companies are looking to use the newer, highly sophisticated ITU-T Y.1564 test processes, which are designed to enable testing of the network that is more real-world applicable.”
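At its core, the RFC 2544 throughput test searches for the highest rate at which a device forwards traffic with zero frame loss. The sketch below illustrates that binary-search idea only; the `frames_forwarded` function is a stand-in simulation of a device under test (here assumed to cap out at 870Mbps), whereas a real test would drive dedicated traffic-generator hardware.

```python
# Simplified sketch of an RFC 2544-style throughput search: binary-search
# for the highest offered rate that suffers zero frame loss.

def frames_forwarded(offered_rate_mbps, capacity_mbps=870.0):
    """Simulated device under test: forwards everything up to its capacity.
    (Illustrative stand-in; a real test measures actual forwarded frames.)"""
    return min(offered_rate_mbps, capacity_mbps)

def rfc2544_throughput(line_rate_mbps, tolerance_mbps=1.0):
    """Binary-search the zero-loss throughput between zero and line rate."""
    lo, hi = 0.0, line_rate_mbps
    while hi - lo > tolerance_mbps:
        rate = (lo + hi) / 2
        if frames_forwarded(rate) >= rate:  # no loss at this rate
            lo = rate                       # try faster
        else:
            hi = rate                       # back off
    return lo

print(f"zero-loss throughput: ~{rfc2544_throughput(1000.0):.0f}Mbps")
```

Y.1564 goes further than this single-metric search, validating several services at once against loss, delay and jitter targets, which is why Kempster describes it as more real-world applicable.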
However, Griffiths says there is still a lot of human interaction at the low level which isn’t totally necessary. He believes that in today’s ‘software-defined’ world, organisations no longer require someone to provision all the intricate components that go into their networks.
Earlier this year, Riverbed launched SteelConnect, a new software-defined platform which it claims is unique in its ability to unify network connectivity and orchestration of application delivery across hybrid WANs, remote LANs, and cloud networks (see News, p6, Jun 2016 issue).
Silver Peak is also an exponent of the software-defined WAN (SD-WAN). Applegarth says the benefits are four-fold: increased flexibility; more visibility and control; optimal performance; and reduced connectivity, equipment and admin costs.
He says that while network managers may be reluctant to completely replace their traditional MPLS networks with broadband connectivity, SD-WAN technology allows them to move at their own pace. At the same time, it enables their organisations to keep on top of innovation trends such as cloud and virtualisation. “With SD-WAN, the ultimate goal may be a 100 per cent broadband WAN, yet most companies will take incremental steps by deploying a hybrid WAN. As MPLS upgrades arise, businesses can then evaluate lower-cost broadband internet services as an alternative or complementary path for connecting users to cloud-based applications.”
This, says Griffiths, provides an opportunity for organisations to gradually reduce reliance on MPLS bandwidth or preserve that connectivity for remaining data centre applications. “Enterprises can then begin to migrate additional applications from the data centre and into the cloud as desired, and do so in an optimal way.”
Critically, in these financially constrained times, Griffiths adds that SD-WAN architecture has a strong economic incentive for organisations, with reduced connectivity, equipment and administration costs – up to 90 per cent in some cases, he claims.
What to avoid
It could be argued that if you get optimisation right from the very outset, your network will be in a better position to more easily handle new technologies as they evolve.
“Network managers need to be continually monitoring and upgrading the performance of their networks, as well as ensuring that they are functioning at maximum efficiency,” says Microlease’s Kempster. “This has to be a whole life process, from the initial installation of the network all the way through its operational lifespan.”
But if you are considering a new optimisation solution, Savvius says that some of the biggest pitfalls facing companies today include ignoring warning signs from existing equipment. Javaheri says it’s important to monitor utilisation – not just averages, but peak periods to see how much stress the network is under.
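Javaheri’s point about peaks versus averages can be made concrete: a link that idles overnight but saturates during business hours can look healthy on average. A common way to expose this is to report a high percentile alongside the mean, as in this small sketch (the hourly sample data is purely illustrative).

```python
# Compare mean utilisation against the 95th percentile to reveal
# peak-period stress that an average alone would hide.

def utilisation_summary(samples_pct):
    """Return (mean, 95th percentile) of link utilisation samples."""
    ordered = sorted(samples_pct)
    mean = sum(ordered) / len(ordered)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return mean, p95

# 24 hourly samples: quiet overnight, near-saturated mid-morning.
samples = [5] * 8 + [90, 95, 92, 88] + [40] * 12
mean, p95 = utilisation_summary(samples)
print(f"mean {mean:.0f}%, 95th percentile {p95}%")
```

Here the average suggests a lightly loaded link while the 95th percentile shows it running near saturation in peak periods, which is exactly the kind of warning sign Savvius says gets ignored.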
Turner advises network managers to look for platforms that offer a clear value for the hybrid IT environments which they are responsible for. He believes usability is the key characteristic to look for, and is more important than any specific feature. “The network is more important than ever given the digital nature of all businesses. To make sure your network is running and you are delivering services securely, you need products that are easy to use and just work.”
While that may sound obvious, Turner also cautions that if you are in the purchasing cycle but have made no commitment to deploying some of the new technology, you probably shouldn’t pay the extra money for the features.
“My favourite example of this was the Gigabit to the desktop experience we had a little over a decade ago. All of the network vendors were pushing Gigabit in their access layer switches, but for the vast majority this simply wasn’t needed, nor was it where the bottleneck existed. The same thing can be said today. If you don’t plan to roll out the new feature quickly, be careful about buying more than you need.”
Griffiths may not fully agree here. He points out that mature product sets provide additional functionality which some organisations may not even realise that they need. Network managers therefore need to ensure that what they invest in is aligned with their company’s plans for the future.
Silver Peak warns that while virtualisation and cloud technologies promise to optimise and enhance the enterprise, they can often result in slower performance not to mention “wasted costs” for the business.
“This is particularly true for cloud applications as a result of the transmission being sent back over MPLS connections to the organisation’s data centre and back again,” says Applegarth.
“Network managers can also lose visibility and control over the expanding mix of applications. In fact, most network managers would not be able to say how many SaaS applications are running on their company’s WAN. As such, they need to be able to rein in how applications are being used on the network.”
He continues by saying that SD-WAN will ensure private line performance over broadband and internet connectivity by overcoming quality problems created by packet loss and out-of-order packets.
“This is especially important for cloud users and those that are increasingly using SaaS applications in the branch.
“Once connected, an SD-WAN fabric should also provide visibility into both data centre and cloud traffic, and provide the ability to centrally assign business intent policies to secure and control the WAN traffic. It should dynamically select the best path – whether that’s MPLS or broadband – for each application based on customer-defined policies and real-time network quality measurements, all while keeping the data in-flight encrypted edge-to-edge.”
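The dynamic path selection Applegarth describes can be sketched as a simple policy check: for each application, keep only the paths whose live loss and latency measurements meet that application’s thresholds, then honour the business-intent preference. The application names, thresholds and measurements below are all illustrative assumptions, not any vendor’s actual logic.

```python
# Minimal sketch of policy-driven SD-WAN path selection: filter paths by
# per-application quality thresholds, then apply the preferred path.

PATHS = {  # live measurements per path (illustrative)
    "mpls":      {"loss_pct": 0.1, "latency_ms": 20},
    "broadband": {"loss_pct": 0.8, "latency_ms": 35},
}

POLICY = {  # business-intent rules per application (illustrative)
    "voip": {"max_loss_pct": 0.5, "max_latency_ms": 30, "prefer": "mpls"},
    "saas": {"max_loss_pct": 2.0, "max_latency_ms": 80, "prefer": "broadband"},
}

def select_path(app, paths=PATHS, policy=POLICY):
    rules = policy[app]
    # Keep only paths that currently meet the application's thresholds.
    eligible = [name for name, m in paths.items()
                if m["loss_pct"] <= rules["max_loss_pct"]
                and m["latency_ms"] <= rules["max_latency_ms"]]
    if rules["prefer"] in eligible:
        return rules["prefer"]
    # Fall back to the lowest-loss eligible path, if any remain.
    return min(eligible, key=lambda n: paths[n]["loss_pct"]) if eligible else None

print(select_path("voip"), select_path("saas"))
```

With these sample measurements, voice traffic stays on MPLS because broadband exceeds its loss threshold, while SaaS traffic rides broadband as the policy prefers; as conditions change, re-running the selection moves traffic automatically.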
Griffiths offers further advice to network managers and says they should avoid ‘point products’ – i.e., those that offer only a single function or limited feature set, as these create a deployment where you don’t have a complete solution.
“Administration becomes fragmented and complex when you start to include other vendors’ products to make up the shortfall. They only lead to siloed IT management practices, confusion and conflict.
“A decade ago, when Riverbed was making a pure play for WAN optimisation, we saw that if we did not evolve by adding more functionality and creating an integrated suite of platforms, then we were going to go the same way as other vendors who have fallen by the wayside over the years.
“With pure play vendors, be careful about adopting technology that looks very attractive at first glance but falls short in key areas, because they simply don’t have the experience or fail to innovate.”
Ready for the future
So what of tomorrow’s optimisation platforms? If, in Turner’s words above, network managers “need to be investing in the future while taking care of the present,” what features will such products need to incorporate?
“The constantly changing nature of business and the rate at which IT is progressing can make it challenging for organisations to look forward and say definitively what features will need to be incorporated in the future,” says Griffiths. “Many companies want to take advantage of public infrastructure, but equally they don’t want to risk security. There has to be some level of compromise. You can’t lock everything down to such an extent that nobody can get access to services they need in the time that they need it.”
As a result, Griffiths says products need to incorporate zero-touch deployment so companies can deploy rapidly at the right place and right time. He adds that when they are deploying that technology, it must fit in with their security models and project timelines, and provide them with continued control and visibility. “That way they can see what is going on over the network, and make sure they can change things as and when needed.”
Meanwhile, Applegarth forecasts a future where SD-WAN innovations will enable enterprises and service providers to build WANs that automatically learn and adapt to dynamically changing network conditions and application demands.
TNP says that one of the major development areas for its customer base is in the advent of smart cities and the IoT. To address these trends, Wade points out that networks need the flexibility to accommodate a large variety of connectivity media and interface types. “If we are connecting every device in a town, city or rural environment, then it is no good just having pervasive fibre connectivity because this will be too inflexible and probably too costly (not to mention the logistical issues surrounding plumbing it in).
“Networks need to have a variety of speeds and media: fibre, copper, wireless and, more importantly, the core design to accommodate connecting each one of these in a flexible and cost-effective manner. A traffic signal, public refuse bin or a lighting post (with a requirement for a low level of bandwidth/signalling data flow) needs entirely different connectivity from a council office or school with a 1-10Gbps requirement.”
Wade reckons that while the provider chosen to deliver these types of networks needs to have the flexibility to encompass all the media required, there aren’t many companies currently out there that can offer this capability. “Much of the industry is polarised to its preferred method of connectivity and interface. We believe that networks should have the core and edge technology to accommodate the necessary end-user devices flexibly and cost effectively.
“We often see public procurement documentation that specifies fibre connectivity when a more flexible and lower-speed alternative would be more than capable and, significantly in the current economic climate, deliver fit for purpose connectivity at significantly lower cost.”
Riverbed is also critical of IT practices that have become ingrained over the years but could now be considered outmoded.
“When businesses look at troubleshooting in their IT environment, a certain level of hubris always creeps in,” says Griffiths. “When you spot a symptom within the system, you automatically assume, for example, that ‘it must be the web server that is causing the issue’. IT’s approach to troubleshooting is to focus on one specific area they think is causing the issue. Narrow-mindedness in troubleshooting is worrisome, and fixing one symptom doesn’t ensure the problem is fixed. IT teams should endeavour to step back and examine the whole picture.”