Why managing latency is the key to online gaming

By Gino Dion, Director of Innovation Solutions at Nokia Fixed Networks.

Online gaming represents a significant opportunity for service providers and has quickly become a service-selection factor for more than 2.5 billion active gamers around the world. According to PwC, the gaming industry topped US $118bn in 2018, with gamers aged 18 to 35 representing over 40 percent of the overall gaming community.

It is no longer just teens living at home playing online games; this is a demographic with significant disposable income. According to Limelight Networks, gamers spend an average of 6.5 hours a week online, and over 30 percent of them spend more than 12 hours a week playing online games.

Keeping up with these growing trends is not about offering a 100Mb/s versus a 10Gb/s residential broadband connection; it’s about offering the best and most consistent latency. A 500Mb/s connection will not make you a better gamer than your neighbor on a 50Mb/s service, but a consistently low-latency service will. Let’s face it: service providers can only squeeze so much extra revenue from higher-tier broadband services.

So how can service providers capitalize on these new trends?

The dimensions of online gaming

The online gaming segment has a multitude of dimensions to take into consideration, such as:

  • Platforms: PC, consoles, mobile
  • Multiplayer: Battle Royale (Fortnite, Call of Duty, etc.)
  • Massively Multiplayer Online Role-Playing Games (MMORPGs): World of Warcraft, EVE Online
  • Cloud Gaming: NVIDIA GeForce NOW, Google Stadia

Each has a unique bandwidth and latency profile, with requirements ranging from the very relaxed to the very stringent. Take “Cloud Gaming”, for example: in this scenario all of the heavy video and CPU processing is offloaded to a cloud gaming engine, which then “streams” the game back to the end user as a video feed. Here the required latency budget is roughly 100ms round trip, of which the network transport should account for no more than ~30ms.
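As a back-of-the-envelope sketch, that budget can be written out as a simple breakdown. Only the ~100ms total and the ~30ms transport share come from the figures above; the split of the remaining time between cloud-side encoding and client-side decoding is an illustrative assumption.

```python
# Illustrative cloud gaming latency budget, in milliseconds (round trip).
# Only the 100ms total and the ~30ms network share are from the article;
# the encode/decode split is an assumed example, not a measured figure.
budget_ms = {
    "game engine + video encode (cloud)": 40,
    "network transport (round trip)": 30,
    "video decode + display (client)": 30,
}

total = sum(budget_ms.values())
print(f"total latency budget: {total} ms")  # within the ~100ms target
```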

This might seem relatively simple to deliver, but there are complications:

  • To deliver 1080p resolution at 60fps you need a minimum 25Mb/s network connection, and that bandwidth is consumed constantly during the gaming session. A game that might generate 5GB/month of data usage when played online in the traditional way can generate 500GB/month over a Cloud Gaming service – a hundredfold increase!
  • Second is the variability in latency. There are multiple latency choke points between the player and the Cloud Gaming provider, and the delays at these points compound and can lead to a bad gaming experience.
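The data-usage arithmetic in the first point can be sketched as follows. The 12-hour weekly figure (from the Limelight statistic earlier) and the ~4.33 weeks-per-month factor are assumptions used only to illustrate the scale.

```python
# Rough monthly data usage for a constantly streaming cloud gaming session.

def monthly_usage_gb(bitrate_mbps: float, hours_per_week: float) -> float:
    """Approximate GB consumed per month at a constant stream bitrate."""
    gb_per_hour = bitrate_mbps / 8 * 3600 / 1000   # Mb/s -> GB per hour
    return gb_per_hour * hours_per_week * 4.33     # ~4.33 weeks per month

# A heavy gamer streaming 12 hours a week at the 25Mb/s needed for 1080p60:
print(round(monthly_usage_gb(25, 12)))  # ~585 GB/month
```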

Consider the latency measured over a period of time during a Cloud Gaming session. The average latency of the session works out to ~57ms, which sounds very good, as it is far below the 100ms target. Unfortunately, that threshold was crossed at least two dozen times, each time causing significant visual degradation and a bad gaming experience. This highlights the importance of having not just low latency, but a consistently low latency profile.
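The difference between a good average and a consistent profile is easy to see numerically. The sample values below are made up for illustration; the point is that a session can average well under the budget while still breaching it repeatedly.

```python
# Made-up latency samples (ms) for one session: a healthy average can
# hide repeated breaches of the 100ms cloud gaming budget.
samples_ms = [45, 52, 61, 48, 130, 55, 49, 142, 58, 47, 51, 115, 50]

average = sum(samples_ms) / len(samples_ms)
breaches = sum(1 for s in samples_ms if s > 100)

print(f"average: {average:.0f} ms, breaches of the 100 ms budget: {breaches}")
# The average looks fine, yet each spike would cause a visible glitch.
```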

What can innovative service providers do to cater to this valuable market segment?

Latency is the cumulative effect of all the packet processing points on the network, and the worst of them is the infamous in-home network. Nokia has pioneered and developed unique technology to address this particular challenge: the “Low Latency, Low Loss, Scalable Throughput” (L4S) architecture and next-generation “Active Queue Management” (the PI2 algorithm), which together support both classic IP traffic and L4S traffic.
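To make the idea concrete, here is a toy sketch of the coupling at the heart of PI2 as described in the public L4S literature: a single controller probability drives both queues, with classic traffic dropped at roughly the square of the probability at which L4S traffic is ECN-marked. The simplified controller and its constants are illustrative assumptions, not Nokia’s implementation.

```python
# Toy sketch of PI2-style coupled queue management (illustrative only).

def pi2_step(p: float, queue_delay_ms: float, target_ms: float = 15.0,
             gain: float = 0.001) -> float:
    """One simplified controller update of the base probability p:
    nudge p up when queuing delay exceeds the target, down when below."""
    p += gain * (queue_delay_ms - target_ms)
    return min(max(p, 0.0), 0.5)

def mark_or_drop_probability(p: float, is_l4s: bool) -> float:
    """Coupled law: L4S packets are ECN-marked at ~2*p, while classic
    packets are dropped at p**2, keeping the two flows roughly fair."""
    return min(2 * p, 1.0) if is_l4s else p * p

p = pi2_step(0.1, queue_delay_ms=35.0)  # delay above target, so p rises
print(mark_or_drop_probability(p, is_l4s=True),   # L4S marking: ~0.24
      mark_or_drop_probability(p, is_l4s=False))  # classic drop: ~0.0144
```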

IP and Optical networking optimizations for gaming latency

Nokia offers an innovative solution to optimize this: the “Segment Routing Interconnecting Controller” (SRIC), part of the Nokia “Network Services Platform” (NSP). NSP provides operators with best-in-class methods for assuring and optimizing gaming services across various networks, even those built from multiple equipment vendors.

Service providers can now create latency-sensitive templates that monitor latency to the various gaming data centers and services across all available peering points, allowing real-time routing decisions that always offer the best latency, regardless of which game is being played and which data center is being used.
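A minimal sketch of that routing decision, assuming latency has already been measured per game service and peering point. All game names, peering points and numbers below are hypothetical placeholders.

```python
# Pick the lowest-latency peering point for a given game service.
# Names and latency figures are illustrative placeholders.
measured_ms = {
    ("fortnite-us-east", "peer-A"): 38.2,
    ("fortnite-us-east", "peer-B"): 24.7,
    ("wow-eu", "peer-A"): 51.0,
    ("wow-eu", "peer-B"): 63.4,
}

def best_path(game: str) -> tuple[str, float]:
    """Return the (peering point, latency) pair with the lowest latency."""
    candidates = {peer: ms for (g, peer), ms in measured_ms.items() if g == game}
    peer = min(candidates, key=candidates.get)
    return peer, candidates[peer]

print(best_path("fortnite-us-east"))  # -> ('peer-B', 24.7)
```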

Steps to monetization

With video games gaining in popularity and gamers reliant on consistently low latency to enjoy a great gaming experience, it is imperative that service providers work hard to solve this problem. Failing to address it would likely see subscribers fall away and move to another provider who can better meet their needs.

To remain vibrant and competitive, it is essential that they create and adopt solutions that monitor latency to data centers and peering points and manage it better, ensuring gamers get the best service without visual degradation.

Service providers can market “latency” services by understanding the real needs of their customers: why latency matters and where it can be improved. A great example is “MyRepublic” in Singapore; a quick look at its program provides a simple roadmap for packaging this technology into a best-selling broadband service.
