New York State-based Rensselaer Polytechnic Institute is giving its students the opportunity to learn Mandarin Chinese on the streets of Beijing without having to leave campus, in the private research university’s latest foray into immersive technology.
In partnership with IBM Research, the institute has developed the intelligent, immersive environment for a six-week summer credit-bearing course.
The ‘AI-Assisted Immersive Chinese’ course combines artificial intelligence with other cutting-edge technologies to provide an experiential approach to learning Mandarin akin to studying abroad.
A 360-degree panoramic display system creates the sense that users are in the middle of various computer-generated scenes, such as a restaurant or a street market, that are similar to those they might encounter in China, according to the institute.
The immersive environment’s natural language understanding, narrative generation and gesture recognition capabilities together enable students to interact naturally with characters in these scenes.
The room is able to engage in dialogue with students and offer immediate feedback on their non-native Chinese speech and pronunciation.
The scenarios used in the class were developed by the Mandarin Project, which is an initiative of the Cognitive and Immersive Systems Laboratory (CISL), an ongoing collaboration between Rensselaer and IBM Research.
Course instructor Helen Zhou, a Rensselaer associate professor of communication and media who worked closely with CISL to develop the programme, said: “The immersive environment is revolutionary. The language context we designed is based on real life scenes. These interactions give students a sense of how communication will play out in real world scenarios, and the system gives them immediate feedback on how well they did.”
Immersive developments of 2019
A team of collaborators with ties to Rensselaer announced earlier this year that they are developing the use of virtual reality (VR) technologies to train and objectively evaluate colorectal surgeons.
Suvranu De, the project’s principal investigator and director of the Rensselaer Center for Modeling, Simulation and Imaging in Medicine, and his team are working to create VR training and evaluation tools for the procedures tested in the Colorectal Objective Structured Assessment of Technical Skills (COSATS), which in 2014 became the first technical-skill examination used in national surgical board certification.
Part of the American Board of Colon and Rectal Surgery (ABCRS) examination, the test requires an elaborate setup in which residents circulate between stations to perform five open surgical tasks while board-certified surgeons evaluate their work. Some individuals who passed the standard oral and written portions of the ABCRS exam have failed the COSATS.
VR simulation technology has already been developed for colonoscopy and laparoscopic surgical procedures, Rensselaer explained, but open surgical procedures such as those used in the COSATS pose additional challenges: trainees require a fully immersive environment in which they can see and physically interact with the patient’s anatomy and with untethered surgical tools.
“Our aim is to overcome these challenges and to create entirely new tools for educating surgeons safely and effectively,” De explained in February. “This is a highly multidisciplinary project involving engineering, haptics, computer graphics, artificial intelligence, surgery, and human factors.”
In addition to designing, developing and evaluating the first Virtual Colorectal Surgical Trainer for the five open surgical tasks in the COSATS, De’s team hopes to create the first Virtual Intelligent Preceptor, an intelligent agent that helps surgeons learn important technical skills.
Researchers and students RAVE about new lab
In January, the institute launched the Rensselaer Augmented and Virtual Environment (RAVE), a new laboratory for researchers and students to experiment with different uses of VR and augmented reality.
Crucially, the RAVE can be configured and adjusted according to the needs of whoever is using it, according to the institute.
The institute explained that a materials science and engineering class might gather around large molecular structures visible to them only through smartphones and tablets, while a researcher wearing a VR headset might explore a distant celestial object and be able to move around it as if they were actually there.
Rich Radke, co-director of the RAVE and a professor of electrical, computer and systems engineering, said in January: “Virtual and augmented reality can be used to perform experiments and provide learning experiences that were previously impossible due to scale, cost, or safety. The RAVE presents unprecedented opportunities for how we study and incorporate this technology at Rensselaer.”
Image credit: Rensselaer Polytechnic Institute