Extended reality (XR) blurs the boundary between what is real and what is virtual. It brings together the leading human-machine interaction technologies, delivered through computers and wearable devices.
The enterprise metaverse can be thought of as the next generation of the mobile internet: persistent, shared, 3D virtual spaces linked to form a virtual ‘universe’.
Examples of metaverse applications in use today include VR training, where multiple team members interact with physical objects in a virtual environment; AR-guided maintenance, with procedural instructions overlaid digitally on a physical product; and visualization of live IoT data in a digital twin of a facility. The list of examples is growing rapidly.
Umajin is designed for efficiently authoring metaverse applications with low-code tools and drag-and-drop building blocks. With Umajin you can create mobile and XR workflows for enterprise applications such as VR scenario training, AR maintenance and repair, IoT data visualization, and digital twins. Designers can also use Umajin to quickly create content and instantly deploy it to mobile devices or VR headsets such as the Meta Quest 2.
Headset limitations have slowed the development of enterprise applications, but Meta’s Quest 2 for VR, and presumably Apple’s impending AR headset launch, are removing some of the largest barriers. Authoring and publishing XR content is still a specialized task. Umajin 1) simplifies, shortens, and democratizes the creation of metaverse content, 2) facilitates the integration of existing and new data streams, and 3) accelerates the “deploy, iterate, improve” cycle, allowing new ideas to be tested and then improved or discarded at a much faster pace.
Umajin’s enterprise metaverse authoring platform deploys applications to PCs, mobile devices, or XR headsets. Many organizations will deploy initially on mobile devices, which are already ubiquitous, and move to XR devices as performance, comfort, and pricing improve. XR applications on mobile still deliver significant value.
Digital avatars will be a key element of the metaverse experience and are important to making that experience feel real. Umajin can create avatars for a variety of use cases, supporting both computer-driven AI avatars and human avatars based on real-time body tracking and animation.
The latest development from Umajin supports real-time avatars for human participants inside the VR experience: their body movement is tracked and displayed as a fully animated character. Real-time tracking of multiple individuals enables training scenarios in which trainees interact with each other naturally inside the virtual world.
AI avatars are digital characters with pre-scripted behavior that human participants can interact with inside the virtual world. This behavior can be made to feel more natural with voice recognition, which allows natural conversation, and with inverse kinematics, which makes avatar animation look more lifelike.
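To illustrate the math behind inverse kinematics (this is a generic textbook sketch, not Umajin's implementation), a two-bone planar IK solver uses the law of cosines to find joint angles that place a limb's end effector at a target point — for example, bending an avatar's elbow so its hand reaches an object:

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Solve 2D two-bone IK: return (shoulder_angle, elbow_bend) in radians
    so that a chain of bone lengths l1, l2 reaches the target point."""
    d = math.hypot(target_x, target_y)
    d = max(1e-9, min(d, l1 + l2))  # clamp target to the reachable range
    # Interior elbow angle from the law of cosines.
    cos_elbow = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: aim at the target, then rotate back by the triangle angle.
    cos_a = (d * d + l1 * l1 - l2 * l2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_a)))
    bend = math.pi - elbow  # second bone's rotation relative to the first
    return shoulder, bend

def forward(shoulder, bend, l1, l2):
    """Forward kinematics: where does the end effector land for these angles?"""
    ex, ey = l1 * math.cos(shoulder), l1 * math.sin(shoulder)
    return ex + l2 * math.cos(shoulder + bend), ey + l2 * math.sin(shoulder + bend)
```

In practice a production solver also handles 3D joints, joint limits, and smoothing, but the same triangle geometry sits at the core.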
Umajin supports a wide range of content types: 360° photos, 3D models, 3D animation, 2D animation, audio, images, text, and video. Supported 3D formats include DAE, STL, PLY, OBJ, FBX, and glTF.
Umajin events are raised by triggers such as look-at, walk, touch, voice, and more. These events can launch actions such as changing the scene, playing a sound, or playing an animation.
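The trigger-to-action model above can be sketched as a simple event bus. Note this is an illustrative sketch with hypothetical names (`EventBus`, `on`, `raise_event`), not the actual Umajin API:

```python
class EventBus:
    """Maps trigger names (e.g. "touch", "look_at") to lists of actions."""

    def __init__(self):
        self._handlers = {}

    def on(self, trigger, action):
        """Bind an action callable to a trigger; multiple actions may stack."""
        self._handlers.setdefault(trigger, []).append(action)

    def raise_event(self, trigger, **context):
        """Run every action bound to the trigger, passing event context."""
        return [action(**context) for action in self._handlers.get(trigger, [])]

# A touch on an object can launch several actions at once.
bus = EventBus()
bus.on("touch", lambda target: f"play_sound:{target}")
bus.on("touch", lambda target: f"play_animation:{target}")
print(bus.raise_event("touch", target="valve_3"))
```

The key property is that triggers and actions are decoupled: authors wire them together declaratively rather than writing handler code for each object.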
Umajin also provides a full stack of real-world location fusion systems: GPS tracking outdoors and radio interpolation for indoor tracking. This can provide the location of thousands of people and assets in real time, in three dimensions. Once in the command-and-control environment, it is possible to scrub the timeline and see the historical position of any item.
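Timeline scrubbing of this kind amounts to storing timestamped position samples and interpolating between the two nearest samples for any requested time. A minimal sketch, assuming linear interpolation between fixes (the class and method names here are illustrative, not Umajin's):

```python
import bisect

class PositionTimeline:
    """Timestamped 3D positions for one tracked item; scrub() interpolates."""

    def __init__(self):
        self.times = []      # monotonically increasing timestamps
        self.positions = []  # (x, y, z) tuples, parallel to self.times

    def record(self, t, pos):
        self.times.append(t)
        self.positions.append(pos)

    def scrub(self, t):
        """Return the item's (interpolated) position at time t."""
        if t <= self.times[0]:
            return self.positions[0]
        if t >= self.times[-1]:
            return self.positions[-1]
        i = bisect.bisect_right(self.times, t)
        t0, t1 = self.times[i - 1], self.times[i]
        a = (t - t0) / (t1 - t0)  # blend factor between the two samples
        p0, p1 = self.positions[i - 1], self.positions[i]
        return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))
```

A real fusion pipeline would also weight GPS fixes against radio-interpolated estimates, but the scrubbing mechanism is the same.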
Umajin supports industry-standard IoT sensors and data-integration frameworks, so data can be read from any sensor, such as temperature, power level, fluid level, or pressure. This data can be displayed visually inside the digital twin.
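Displaying sensor data in a twin typically means mapping each raw reading to a visual state on the corresponding object. A hedged sketch of that mapping — the sensor names, ranges, and helper functions here are invented for illustration, not part of any particular integration framework:

```python
# Normal operating bands per sensor type: (low, high).
SENSOR_RANGES = {
    "temperature_c": (10.0, 45.0),
    "pressure_kpa": (90.0, 110.0),
    "fluid_level_pct": (20.0, 95.0),
}

# How each status band is rendered in the twin.
STATUS_COLORS = {"low": "blue", "normal": "green", "high": "red"}

def classify_reading(sensor, value):
    """Bucket a raw reading into a status band relative to its normal range."""
    low, high = SENSOR_RANGES[sensor]
    if value < low:
        return "low"
    if value > high:
        return "high"
    return "normal"

def twin_color(sensor, value):
    """Color to paint the corresponding object in the digital twin."""
    return STATUS_COLORS[classify_reading(sensor, value)]
```

This keeps the ingestion layer (reading values from sensors) separate from the presentation layer (how the twin renders them), so new sensor types only need a range entry.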