Whenever I think about mixed reality scenarios, that is, human-perceived reality augmented by technology and software, I think of movies such as The Matrix, Tron, or Ready Player One. These films take a dystopian view of the integration between humans and machines.
In the real world, however, we are already seeing how technology and software are affecting our day-to-day lives in a positive manner. Be it work, play or travel, many of the interactions technology and software allow us to have are beneficial.
What’s in a Term?
Defining the different levels of virtual reality integration helps readers understand where that integration currently stands. Here are some definitions and real-world examples:
Augmented Reality (“AR”): AR can be defined as the real world augmented with real-time digital interfaces. Pokémon Go and Snapchat (more on this later) are two of many examples where this is already happening.
Mixed Reality (“MR”): MR is the next layered step beyond augmented reality; it is closer to virtual reality but still grounded in the physical world. Microsoft’s HoloLens is a really cool example of this. Check out this video to see a HoloLens user playing the popular game Portal in the real world.
Virtual Reality (“VR”): VR is a computer-driven simulated environment. It currently calls to mind images of users holding controllers or sensors and wearing a headset with a built-in screen. While that is the case for many VR solutions, entire experiences are increasingly rendered in VR. From flight simulators to surgical training for medical students, all the major technology companies have some form of VR research underway. Learn more here.
As mentioned earlier, I’d like to address a popular AR platform: Snapchat. It is an outstanding example of functional augmented reality. With facial recognition technology and filters, Snapchat has made seeing the world and yourself through silly add-ons, such as bigger eyes, face swaps, or even dog ears, a fun way to share and pass the time.
This may seem unassuming, but a system that changes how users share and interact with their own image and with others has already had real-world impacts. For example, more people are seeking plastic surgery to look like the filtered versions of themselves they see on Snapchat.
From Point A to Point Everywhere: AR in Google Maps
Snapchat might be a fun example, but many of us leverage AR in a more practical way every day: Google Maps. Google Maps uses GPS coordinates to map the Earth and help users find directions to locations anywhere in the world. Millions of users rely on it to find the best route to work, check out a street before walking it, and estimate how long it will take to get from point A to point B.
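To make the GPS idea concrete, here is a minimal sketch (not Google's actual routing algorithm) of the haversine formula, a common building block in mapping software for computing the straight-line distance between two GPS coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS points."""
    earth_radius_km = 6371.0  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

# New York City to Los Angeles: roughly 3,900-4,000 km as the crow flies
print(haversine_km(40.7128, -74.0060, 34.0522, -118.2437))
```

Real navigation services go far beyond this, routing over road graphs and live traffic data, but the distance math above is the kind of primitive they build on.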
Because many of us grew up learning to read paper maps (themselves an analog form of augmented reality), a digital, interactive representation of something we all generally know, the Earth, does not seem like a revolutionary step. Nonetheless, it has been revolutionary since the advent of cheap smartphones.
MR, AR, and VR in the Workplace
Most MR, AR, and VR experiences rely on smartphones sharing information, usually from the camera and other sensors, with AI software in the cloud. This connection and exchange of data has helped rapidly expand AI's reach into MR, AR, and VR.
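As a rough sketch of that device-to-cloud exchange (the endpoint and payload format here are hypothetical; real platforms use their own SDKs and protocols), a phone might bundle a camera frame with sensor readings before uploading it for AI processing:

```python
import base64
import json

def build_frame_payload(jpeg_bytes, lat, lon, heading_deg):
    """Bundle one camera frame plus sensor readings into a JSON payload
    suitable for POSTing to a (hypothetical) cloud AI endpoint."""
    return json.dumps({
        "image_b64": base64.b64encode(jpeg_bytes).decode("ascii"),
        "sensors": {
            "lat": lat,
            "lon": lon,
            "compass_heading": heading_deg,
        },
    })

# A fake 3-byte "frame" stands in for real camera output.
payload = build_frame_payload(b"\xff\xd8\xff", 47.61, -122.33, 90.0)
print(json.loads(payload)["sensors"]["compass_heading"])
```

The cloud side would decode the image, run recognition or tracking models, and return overlays for the device to render; the point is simply that the camera and sensor data, not the device alone, drive the AI features.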
Here are some examples of how MR, AR, and VR are either in the workplace or will be soon to help users and companies collaborate better together with AI:
MR: Designing the next generation of cars and buildings and testing them without the need to produce a real-life example. More here.
AR: Screen projections or screen-share sessions such as WebEx, GoToMeeting, or Skype
VR: Architecture design walkthroughs, remote learning courses, military scenario training, and medical school practice surgeries
Leveraging MR, AR, and VR helps us interact better with machines while also lowering the costs of collaboration between employees, especially those not located near each other. This level of interaction with machines will continue as AI expands further in the cloud and on devices such as mobile phones, which are becoming increasingly sophisticated with multiple sensors and input points.
Formerly a Solutions Engineer at AvePoint, Bryan worked with enterprises of all sizes to implement effective business-focused governance, GDPR and regulatory compliance, proactive training, and solution deployments, balancing customizations and available technology to meet business needs.