By Timothy Kelly
Dr. Patrick Dudas explaining camera tech to Metaverse Symposium participants. Credit: Penn State/J. Futrick/ICDS
If it’s not here yet, then what’s all the hype about the metaverse? Google searches for the term “metaverse” peaked in early November 2021 after Facebook changed its name to Meta. Yet, the popularity of the search term remains elevated above the pre-announcement trend.
We met with Dr. Patrick Dudas, interim director of the Center for Immersive Experiences (CIE) at Penn State, to get the scoop on the metaverse. We also attended Penn State’s Metaverse Symposium, June 28-29. Here’s what we learned.
What is the metaverse?
Virtual reality has been around since the early 1990s, and the term “metaverse” was coined in Neal Stephenson’s 1992 dystopian science fiction novel Snow Crash.
In an interview with Morning Brew, Louis Rosenberg, a virtual reality pioneer, said the metaverse “…really is this transition from a world where people look at digital content from the outside looking in, to a world where we are experiencing content from the inside…”
The metaverse is also a multiverse. The web as we know it consists of multiple sites. The metaverse will likewise consist of multiple virtual spaces with varying degrees of interactivity and user representation. As Dudas said,
“[The metaverse] is not going to be a single place.”
Ultimately, these virtual spaces are expected to form an interconnected whole that lets users move from one space to another while retaining their digital identity. The metaverse is also expected to include a fully functioning economy.
So how do you get there?
Over and above computing hardware and software, metaverse tech includes the various interface devices that allow for visual, auditory and tactile interactions with the virtual universe. So let’s clear up some of the jargon around metaverse access.
- Extended reality (XR) – the umbrella term for all the ways we interact with the virtual world.
- Virtual reality (VR) – per Dudas, a digital stage: a coded context in which we engage in virtual relationships within an immersive world. We interact with it using a headset and sensor-based controllers. Meta’s Oculus is an example of a VR interface.
- Augmented reality (AR) – the overlay of digital information onto a view of the real world. Google Glass and the Magic Leap headsets are examples of this type of technology.
- Mixed reality (MR) – while this term is often used interchangeably with AR, according to Dr. Pejman Sajjadi of Penn State’s CIE, MR is really just AR plus the ability to interact with virtual objects. The Microsoft HoloLens fits in this category.
You’ve probably already used some of this technology, even if it sounds a little unfamiliar. Have you ever used satellite imagery or Street View on Google Maps? That’s XR. Have you used a virtual background on a Zoom call? Also XR. Watched a football game on TV? The first-down line on your screen is XR. Played Pokémon Go? AR.
The metaverse is already here… but not yet
But while we already have many of the components, we do not yet have all of the technology necessary for the metaverse to exist. The key missing piece, according to Dr. Dmitriy Babichenko of the University of Pittsburgh, is interoperability: the ability of computer systems and software to exchange and use each other’s information. In practice, that means you keep a single identity, with all its attributes, that persists from place to place across the metaverse.
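To make that idea a little more concrete, here is a minimal sketch of what a portable identity record might look like if virtual spaces could exchange one. This is purely illustrative: no such standard exists today, and the field names and the exportIdentity helper below are hypothetical.

```typescript
// Illustrative only: a hypothetical "portable identity" record that could
// travel with a user between independent virtual spaces. Interoperability
// like this is exactly the piece the metaverse is still missing.
interface PortableIdentity {
  id: string;                                       // globally unique user identifier
  displayName: string;                              // name shown to other participants
  avatarUrl: string;                                // link to a 3D avatar asset
  inventory: { item: string; quantity: number }[];  // owned digital goods
}

// Serialize the identity so another platform could, in principle, import it.
function exportIdentity(identity: PortableIdentity): string {
  return JSON.stringify(identity, null, 2);
}

const me: PortableIdentity = {
  id: "user-1234",
  displayName: "Explorer",
  avatarUrl: "https://example.com/avatars/explorer.glb",
  inventory: [{ item: "virtual-ticket", quantity: 1 }],
};

console.log(exportIdentity(me));
```

Until platforms agree on something like a shared format and the rules for honoring it, each virtual space remains its own walled garden.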
How does this technology help my business?
As Alex Kantrowitz wrote, “If you wear a headset in the near to mid-future, it’ll most likely be at work.”
AR headsets are already at work in fields as diverse as healthcare, manufacturing, logistics and field service, in applications ranging from recording machine inspections to preparing for surgery.
Digital simulation training has been around since the 1960s. With the advent of VR technology, virtual training has become cheaper and is expanding into more industries. You can train safely and cost-effectively for dangerous activities like managing hazardous waste, flying and military operations. VR is also used in engineering, architecture, education and, of course, entertainment. The sense of presence enabled by VR makes virtual meetings far more engaging than a Zoom call. Jobs performed entirely in VR are just around the corner.
What about research?
Not surprisingly, most VR research supported by Penn State’s CIE focuses on evaluating ways to use VR effectively for education. For example, the Health, Ingestive Behavior and Technology Laboratory at Penn State is currently running a study on food choices and virtual reality.
Many of the research papers listed on the CIE website examine the use of VR in virtual field trips. Perhaps the greatest benefit of a virtual field trip is that VR gives participants a perspective that would be impossible to achieve in the real world.
Metaverse Symposium keynote speaker Lisa Sibilia, co-founder and co-CEO of Youtopian. Credit: Penn State/J. Futrick/ICDS
For now, U.S. VR developers set their own standards for ethics, security and privacy regarding conduct and data usage in their respective universes. If that thought concerns you, you aren’t alone. It’s one reason Dudas and Penn State hosted the Metaverse Symposium.
“The technologies of the metaverse, by their very nature, have the potential to give the platform providers incredible levels of power,” Rosenberg told Morning Brew. The decentralized nature of Web 3.0, which is itself still evolving, could mitigate concerns around this issue.
Where do we go from here?
Right now, about 90% of web searches start at Google. Similarly, Dudas believes, Meta seeks to be the place where most metaverse traffic starts.
The story of the metaverse is still being written. The business and research opportunities are huge. Where the story goes from here, and how it ends, is up to us.
Lively discussion during a “try on the tech” breakout session at the Metaverse Symposium. Credit: Penn State/J. Futrick/ICDS