New upgrades to the enterprise mixed reality (MR) platform of Canadian company Arvizio mean massive three-dimensional (3D) models can now be streamed to multiple participants, who can simultaneously interact with them using Microsoft HoloLens and Magic Leap headsets.
Through a combination of dynamic level-of-detail processing and GPU-accelerated rendering, Arvizio can stream LiDAR scans and photogrammetry models to untethered, standalone MR headsets.
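Dynamic level-of-detail processing generally means that nearby parts of a scan stream at full resolution while distant parts stream coarser versions, keeping the point budget within what a standalone headset can render. The sketch below illustrates the general idea with a simple distance-based level selector; the tile structure, the `budget_m_per_level` parameter and the thresholds are illustrative assumptions, not Arvizio's actual pipeline.

```python
from dataclasses import dataclass
import math

@dataclass
class Tile:
    """An octree cell of the point cloud with pre-decimated levels."""
    center: tuple   # (x, y, z) in metres
    levels: list    # point counts available, coarse to fine, e.g. [1_000, 10_000, 100_000]

def select_lod(tile: Tile, viewer: tuple, budget_m_per_level: float = 20.0) -> int:
    """Pick a detail level: nearby tiles stream at full resolution,
    distant tiles at progressively coarser ones."""
    d = math.dist(tile.center, viewer)
    # each `budget_m_per_level` metres of distance drops one level of detail
    level = len(tile.levels) - 1 - int(d // budget_m_per_level)
    return max(0, min(level, len(tile.levels) - 1))

tile = Tile(center=(0.0, 0.0, 0.0), levels=[1_000, 10_000, 100_000])
print(select_lod(tile, viewer=(5.0, 0.0, 0.0)))    # close viewer -> finest level (2)
print(select_lod(tile, viewer=(55.0, 0.0, 0.0)))   # far viewer -> coarsest level (0)
```

In a real streaming renderer this selection would be re-evaluated every frame as the viewer moves, with tiles fetched and evicted asynchronously.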
With the new updates to MR Studio, aimed at architecture, engineering, construction, surveying, mining, energy and public safety initiatives, multiple participants can collectively view synchronised, difficult-to-render point cloud data, walk through it at life scale and teleport to any position within the scan.
The MR Studio 4.0 release also includes fully automated 3D model optimisation and hybrid rendering for large and complex BIM (building information modelling) and CAD (computer-aided design) 3D models.
New augmented reality Immerse apps for iOS and Android devices with cross-platform sharing have been added, as well as support for new 3D model formats, including glTF 2.0.
MR Studio 4.0 also includes multi-model alignment to allow virtual objects, such as buildings and machinery, to be positioned within the point cloud experience.
3D scanning meets immersive tech
Jonathan Reeves, chief executive officer at Arvizio, said: “The ability to visualise and interact with 3D point cloud and photogrammetry content as if you were actually onsite, creates a seamless workflow environment from reality capture to immersion.”
“With MR Studio, our customers can leverage the convergence of advanced 3D scanning and immersive technologies for multi-user, interactive shared experiences that promote more insightful, effective decisions.”
Arvizio launched a solution for hybrid rendering of immense 3D models and point clouds in May.
The Arvizio IoT Gateway feature connects MR Studio to multiple internet of things frameworks, allowing telemetry and sensor data to be displayed within the immersive experience.
The convergence of streamed, hybrid-rendered, large-scale 3D models and real-time internet of things data allows real-world assets to be visualised at scale and augmented with real-time sensor data, according to Arvizio.
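Overlaying live telemetry on a 3D scene typically means routing incoming sensor messages to named anchor points on the virtual assets they describe. The following is a minimal sketch of that pattern; the `AssetAnchor` and `TelemetryOverlay` names, the message shape and the MQTT-style delivery assumption are all hypothetical, not Arvizio's IoT Gateway API.

```python
from dataclasses import dataclass, field

@dataclass
class AssetAnchor:
    """A named position in the shared 3D scene where sensor values are drawn."""
    name: str
    position: tuple                      # (x, y, z) in scene coordinates
    readings: dict = field(default_factory=dict)

class TelemetryOverlay:
    """Routes incoming sensor messages to the anchors that display them."""
    def __init__(self):
        self.anchors = {}

    def register(self, anchor: AssetAnchor):
        self.anchors[anchor.name] = anchor

    def on_message(self, asset: str, metric: str, value: float):
        # e.g. a payload arriving from an MQTT-style IoT broker
        if asset in self.anchors:
            self.anchors[asset].readings[metric] = value

overlay = TelemetryOverlay()
overlay.register(AssetAnchor(name="pump-7", position=(12.0, 0.0, 3.5)))
overlay.on_message("pump-7", "temperature_c", 71.4)
print(overlay.anchors["pump-7"].readings)   # {'temperature_c': 71.4}
```

Keeping the overlay state separate from the rendering loop, as above, lets sensor updates arrive at their own rate while each frame simply draws whatever readings are current.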
Image credit: Arvizio. Pictured: a 350-million-point photogrammetry model of Melbourne rendered in MR