In an earlier post (see 'related blog' - tackling the increasing challenges of a remote workforce) I looked at considerations for how best to support a remote workforce. Another topic that the change in working practices has exposed is the impact of how ISPs differentiate between uplink and downlink speeds. Partly this has been driven by marketing: the ability to advertise super-fast broadband to the home is one way ISPs compete. There is also a more practical reason, in that the traditional shape of traffic has placed greater load on the downlink than the uplink. Home users consumed downstream-heavy applications like Netflix or YouTube; now Zoom or MS Teams place a two-way load on the last-mile link. But there is much more to it than that…
The importance of the uplink/downlink conundrum
A recent Ericsson study showed how uplink speeds have a marked impact on time-to-content for a variety of applications. The study found that, with an unrestricted downlink speed of at least 20 Mbps, most applications needed a minimum uplink speed of 300 kbps to render content within a target time-to-content of 4 seconds. Most importantly, at peak times (and pretty much all times are peak times these days) up to 20% of users were unable to achieve these uplink speeds. Multiplied across a large remote workforce, the loss in productivity can be immense.
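To see why the uplink matters so much, a rough back-of-envelope model helps. The payload sizes below are illustrative assumptions, not figures from the Ericsson study; the point is simply that once the downlink is fast, the serialised upstream traffic dominates time-to-content.

```python
def transfer_time(num_bytes: int, link_bps: float) -> float:
    """Seconds to push num_bytes over a link of link_bps bits per second."""
    return num_bytes * 8 / link_bps

# Hypothetical page load: 50 KB of upstream requests, 2 MB of downstream
# content, over an unrestricted 20 Mbps downlink.
UP_BYTES, DOWN_BYTES, DOWNLINK_BPS = 50_000, 2_000_000, 20e6

for uplink_bps in (100e3, 300e3, 1e6):
    total = transfer_time(UP_BYTES, uplink_bps) + transfer_time(DOWN_BYTES, DOWNLINK_BPS)
    # 300 kbps comfortably meets a 4-second target; 100 kbps does not
    print(f"uplink {uplink_bps/1e3:.0f} kbps -> {total:.1f} s time-to-content")
```

Even in this crude model the downlink contributes under a second, while the constrained uplink decides whether the 4-second target is met.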
How Network Emulation can help
Network emulators, like the Calnex SNE, can model a variety of network conditions, including limited bandwidth in both uplink and downlink directions. Network engineers can provide a framework allowing application teams to architect highly performant applications, even in the most extreme conditions, or help the business choose cloud applications that will perform well under those same conditions.
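Under the hood, capping one direction of a link typically comes down to a token-bucket rate limiter. The sketch below is a minimal illustration of that general mechanism, not a description of the SNE's internals, and the rate and burst figures are arbitrary assumptions.

```python
class TokenBucket:
    """Minimal token-bucket rate limiter: packets are forwarded only while
    tokens (bytes of credit) are available; tokens refill at the link rate."""

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps / 8          # refill rate in bytes per second
        self.burst = burst_bytes          # maximum stored credit
        self.tokens = burst_bytes         # start with a full bucket
        self.last = 0.0

    def admit(self, now: float, packet_bytes: int) -> bool:
        # Refill credit for the elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True                   # forward immediately
        return False                      # queue or drop: the link is saturated

# Emulate a 300 kbps uplink with a 4 KB burst allowance: the first packet
# passes on stored credit, but a quick follow-up packet exceeds the refill.
uplink = TokenBucket(rate_bps=300_000, burst_bytes=4_000)
print(uplink.admit(0.0, 1500), uplink.admit(0.001, 3000))
```

Real emulators layer delay, jitter, and loss on top of this, but the bandwidth cap itself is just this bookkeeping applied per direction.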
WAN emulation and security
Security has long been a priority for organisations. But how can security operations impact the productivity of the business?
Employees will use tunnelling technologies or VPNs to access centralised documents and files. Accessing and sharing large files can be ultra-slow depending on the protocol being used, and the result is a significant reduction in productivity. But ultimately what is killing the response times? Is it the application architecture, or is the choice of file access technology the culprit? For example, SMB provides greater security but places a greater overhead on infrastructure; NFS is less secure but, in turn, places less burden on the network. A WAN emulator can help you determine which is the better choice for you. If you cannot compromise on security you may have to accept the challenge of slower access; otherwise you may settle on a less demanding but less secure approach.
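The trade-off can be framed with a simple cost model: a chatty protocol pays for its extra request/response exchanges in round trips, and over a high-latency VPN those round trips can dwarf the raw wire time. The chattiness figures below are purely illustrative assumptions, not measured SMB or NFS behaviour; an emulator lets you replace them with real measurements at your actual latency.

```python
def transfer_seconds(file_bytes: float, bandwidth_bps: float,
                     rtt_s: float, ops_per_mb: float) -> float:
    """Rough model: serialised protocol round trips plus raw wire time.
    ops_per_mb stands in for protocol chattiness (an illustrative knob)."""
    round_trips = file_bytes / 1e6 * ops_per_mb
    return round_trips * rtt_s + file_bytes * 8 / bandwidth_bps

# 100 MB file over a 20 Mbps link at 40 ms RTT; chattiness values are assumed.
for name, ops_per_mb in [("chatty (SMB-like)", 64), ("leaner (NFS-like)", 16)]:
    secs = transfer_seconds(100e6, 20e6, 0.040, ops_per_mb)
    print(f"{name}: ~{secs:.0f} s")
```

In this model both protocols spend the same 40 seconds on the wire; the entire difference comes from round trips, which is exactly the behaviour a WAN emulator exposes when you dial up the latency.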