Meta (formerly Facebook) has invested billions in connectivity worldwide over the last decade and recently announced that its subsea cable investments in Europe and APAC could contribute more than half a trillion USD in additional gross domestic product by 2025.

Now the company is tackling the connectivity challenges of the metaverse and reimagining network infrastructure to support it, Meta vice president Dan Rabinovitsj said in a blog post. Delivering virtual experiences in the metaverse will require innovations in fields like hybrid local and remote real-time rendering, video compression, edge computing, cross-layer visibility, spectrum advocacy, network optimization, improved latency between devices and within radio access networks (RANs), and more.

Meta plans to build several prototypes of the metaverse to fully understand how to blend virtual content with people’s physical worlds. In the meantime, it’s looking for collaborators to help fully realize its metaverse vision. Rabinovitsj writes:

“Over the next decade, we hope the metaverse will reach a billion people around the world, host hundreds of billions of dollars of digital commerce, and support millions of jobs for creators and developers. This opportunity calls for vast enhancements in capacity and fundamental shifts in how networks are architected and deployed, as well as industry-wide collaboration—from tech companies to mobile operators, service providers, policymakers, and more—to prepare for the metaverse.”

First up, Rabinovitsj explains that the complex graphical scenes of the metaverse could take hours to download over current networks, while rendering them remotely instead imposes tight latency constraints. To meet those constraints, the company sees remote rendering over the edge cloud, or some form of hybrid between local and remote rendering, playing a greater role in the near future.

“Enabling remote rendering will require both fixed and mobile networks to be rearchitected to create compute resources at a continuum of distances to end-users,” said Rabinovitsj.
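To see why the numbers push compute toward the network edge, consider a rough back-of-envelope sketch; the scene size, link speed, and latency figures below are illustrative assumptions for this article, not figures from Meta.

```python
# Back-of-envelope sketch (illustrative numbers, not Meta's figures):
# compare downloading a complex scene for local rendering with the
# round-trip budget left for remote rendering at the network edge.

def download_time_hours(scene_size_gb: float, link_mbps: float) -> float:
    """Hours needed to download scene_size_gb over a link_mbps connection."""
    bits = scene_size_gb * 8e9
    return bits / (link_mbps * 1e6) / 3600


def remote_render_budget_ms(motion_to_photon_ms: float,
                            render_ms: float,
                            codec_ms: float) -> float:
    """Network round-trip time left after rendering and encode/decode delays."""
    return motion_to_photon_ms - render_ms - codec_ms


# A hypothetical 200 GB photorealistic scene over a 50 Mbps connection:
print(f"download: {download_time_hours(200, 50):.1f} h")  # ~8.9 h

# With a ~20 ms motion-to-photon target, ~7 ms for rendering and ~8 ms for
# encode/decode, only a few milliseconds remain for the network round trip.
print(f"network budget: {remote_render_budget_ms(20, 7, 8):.0f} ms")  # 5 ms
```

Under these assumptions, downloading the scene is impractical, while remote rendering leaves only a few milliseconds of round-trip budget, which is the “continuum of distances” problem Rabinovitsj describes.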

Another challenge that lies ahead is delivering consistently high-quality immersive experiences. A head-mounted display sitting close to the eyes will require retina-grade resolution far beyond that of a standard smartphone screen. Solving this will require innovations across the hardware and software stack, “as well as revolutionary improvements in network throughput,” according to Rabinovitsj.
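For a sense of scale, here is a rough sketch of the throughput arithmetic; the per-eye resolution, refresh rate, and compression ratio are hypothetical values chosen for illustration, not numbers from Rabinovitsj's post.

```python
# Rough throughput arithmetic (hypothetical display and codec figures, not
# numbers from Meta): raw pixel rate of a retina-grade stereo headset stream
# versus a single 1080p phone stream, and what is left after compression.

def raw_gbps(width: int, height: int, fps: int,
             bits_per_pixel: int = 24, views: int = 2) -> float:
    """Uncompressed video bit rate in Gbit/s for the given number of views."""
    return width * height * fps * bits_per_pixel * views / 1e9


def compressed_mbps(raw: float, compression_ratio: float) -> float:
    """Bit rate in Mbit/s after applying a compression ratio to raw Gbit/s."""
    return raw * 1e3 / compression_ratio


headset_raw = raw_gbps(4000, 4000, 90)         # ~4K per eye at 90 Hz
phone_raw = raw_gbps(1920, 1080, 30, views=1)  # single 1080p30 stream

print(f"headset raw: {headset_raw:.0f} Gbit/s, phone raw: {phone_raw:.1f} Gbit/s")
# Even at a generous 200:1 compression ratio, the headset stream still needs
# hundreds of Mbit/s sustained, well above typical mobile links today.
print(f"headset compressed: {compressed_mbps(headset_raw, 200):.0f} Mbit/s")
```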

At Mobile World Congress (MWC) Barcelona, Meta said it plans to collaborate with the Telecom Infra Project (TIP) and partners to define the performance requirements for delivering great end-user experiences in the metaverse.

Meta is working with Telefónica to create a Metaverse Innovation Hub in Madrid where local startups and developers can access a “groundbreaking 5G laboratory.” There, they’ll be able to use an end-to-end metaverse testbed built on Meta’s and Telefónica’s network infrastructure and equipment, and benefit from Telefónica’s open innovation ecosystem and Meta’s engineering resources.

Also at MWC, HTC chairwoman Cher Wang showed what a day in the life of Viveverse, the company’s metaverse world accessed through its Vive virtual reality (VR) headsets, would look like. Viveverse, she said, would be a safe and secure space where kids and adults alike could customize, collaborate, and have fun doing things like working out at the gym and going to virtual concerts.

In addition, HTC Vive debuted Vive Guardian, a new privacy and safety feature that lets parents limit access to apps and purchases made in the headset.