Edgar J. Rojas Muñoz

Assistant Professor @ Texas A&M University

Contact Information

Home


Welcome!
I am an Assistant Professor at Texas A&M University and the director of LEMUR, the Laboratory for Extended and Mixed User Realities.

My research focuses on Mixed Reality for the Social Good. I believe these technologies are poised to become the user interfaces of the future, and my work explores what we are missing to reach that goal.

Here you will find information about my projects, publications, and more. Feel free to explore the site to learn more!

Recent News

Our paper "INDYvr: Towards an Ergonomics-based Framework for Inclusive and Dynamic Personalizations of Virtual Reality Environments" was accepted to a workshop at the International Symposium on Mixed and Augmented Reality (ISMAR). INDYvr is an ergonomics-based framework that acquires the user's physical attributes and capabilities, translates them into a parametric model, and uses that model to dynamically adjust the VR environment to improve the user's reachability and walkability, striving for a future where VR is accessible to all users, irrespective of their physical abilities. We will present the paper this upcoming October 2024.

The work of four groups from my Virtual Reality class was accepted into SIGGRAPH's Annual Faculty Submitted Student Work Exhibit. Their work was presented during the conference and now resides in its archive.

About Me


Born and raised in tropical Costa Rica, I now find myself exploring Texas' vast cattle grasslands. I currently reside in College Station, slowly learning how to become an Aggie (I even got a picture with the Queen of Aggieland!).

I am an early-career faculty member working on Mixed Reality for the Social Good. I completed my PhD at Purdue University, where I focused on XR and Human-Computer Interaction under the supervision of Dr. Juan Wachs. I received my Licenciatura (Lic.) degree in Computer Engineering from Instituto Tecnológico de Costa Rica. In my free time, I enjoy playing board games, hiking, and writing a magical realism novel.

Work Experience

2022 - Current
Assistant Professor
Texas A&M University, College Station, TX, United States
Member of the Visual Computing and Computational Media section of the School. Lead researcher on various projects related to Mixed Reality and Human-Computer Interaction. Director of the Laboratory for Extended and Mixed User Realities (LEMUR). Mentor to graduate and undergraduate students from multiple disciplines.

2021 - 2022
Lecturer
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
Lecturer for a variety of courses, including Introduction to Mixed Reality, Introduction to Programming, and Senior Design Project. Contributor to various multidisciplinary Mixed Reality for the Social Good projects.

2016 - 2020
Graduate Research Assistant
Purdue University, West Lafayette, IN, United States
Responsible for a variety of tasks, including my doctoral thesis project, which explored how to evaluate collaboration between agents by analyzing the gestures they use. In addition, I was part of a surgical telementoring project sponsored by the Department of Defense.

2015 - 2016
Undergraduate Research Assistant
Purdue University, West Lafayette, IN, United States
Researched and developed a large-scale interactive display based on touch input. The device was one of the main components of a telementoring system used by expert surgeons to deliver surgical instruction remotely.

2014 - 2015
Lecturer Assistant
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
The assistantship included designing, lecturing, grading, and mentoring students for two undergraduate-level courses averaging 40 students each: "Introduction to Coding" and "Coding Workshop".

2014 - 2015
Research Assistant
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
Lead researcher in the development and use of a Display Wall with autostereoscopic screens. In addition, the work included the maintenance of the laboratory visualization clusters.

Education

2016 - 2020
Doctoral Degree
PhD in Industrial Engineering
Purdue University, West Lafayette, IN, United States
GPA: 3.75/4.00
Dissertation Title: Assessing Collaborative Physical Tasks via Gestural Analysis using the 'MAGIC' Architecture.

2010 - 2016
Licenciatura Degree
Licenciatura in Computer Engineering
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
GPA: 83.5/100, Class Rank 3
Thesis Title: Research, Design, and Construction of a Telementoring Interface for the Recognition and Capture of Real-Time Touch Gestures.

Conference Papers


INDYvr: Towards an Ergonomics-based Framework for Inclusive and Dynamic Personalizations of Virtual Reality Environments
October 2024
Venue: International Symposium on Mixed and Augmented Reality (ISMAR)

This paper introduces INDYvr, an ergonomics-based framework that dynamically adjusts the VR environment, fostering a personalized and inclusive experience. INDYvr will acquire the user’s physical attributes and capabilities and translate them into a parametric model, which is then used to dynamically adjust the VR environment to improve the user’s reachability and walkability. INDYvr represents a shift towards a user-centered VR paradigm, striving for a future where VR is accessible to all users, irrespective of their physical abilities.


Designing Meaningful Tourism Experiences to Promote Ecotourism in Protected Areas Using Augmented Reality
September 2024
Venue: International Congress on Environmental Intelligence, Software Engineering, and Electronic Mobile Health (AmITIC)

This study explores the design of immersive experiences for tourism and environmental conservation through augmented reality in three protected wilderness areas in northern Costa Rica: Arenal Volcano National Park, Caño Negro Mixed Wildlife Refuge, and Juan Castro Blanco National Water Park. The results were positive, though concerns regarding the system's maintenance were highlighted. The implications of using augmented reality in tourism and conservation contexts are discussed, promoting effective use tailored to the needs of both users and the environment.


Developing a VR-based Training Platform for Emergency Fire Handling Services Using Unity 3D
December 2023
Venue: International Conference on Frontiers of Information Technology (FIT)

This paper presents a novel VR-based training platform tailored for firefighters, which leverages Unity 3D and state-of-the-art fire simulation techniques to deliver high-fidelity experiences that closely mimic real-world dynamics. Trainees engage in an immersive VR setting where they experience full autonomy and multisensory feedback, heightening the educational impact through a procedural fire spreading mechanism that emulates actual fire behavior. The paper concludes by emphasizing the platform’s alignment with the set design standards for VR-based firefighter training and outlines prospective user testing with professional firefighters to further refine the VR experience.


Don’t Walk Away! Virtual Safety Boundaries for Collaborative Virtual Reality Learning Environments
October 2023
Venue: IEEE Frontiers in Education (IEEE FIE)

This paper explores how to design and implement Collaborative Virtual Reality Learning Environments (CVRLE) taking Teacher-Student Dynamics (TSD) into account. CVRLEs engage students in a virtual representation of a learning space. However, they do not incorporate TSDs, which are key elements of in-person teaching. The current work explores how to recreate one specific TSD: the watchfulness teachers have over the students’ location during a field trip. We propose and compare seamless approaches to recreate this TSD inside CVRLEs via virtual safety boundaries around the teacher. Overall, the vision of this work is to solidify CVRLEs into a plausible method to perform teaching and learning when co-presence between teacher and students cannot be guaranteed. The design guidelines from this project will inform the creation of future interactive CVRLEs.


Towards an Intelligent Tutoring System for Virtual Reality Learning Environments
October 2023
Venue: IEEE Frontiers in Education (IEEE FIE)

This paper explores the development of an Intelligent Tutoring System (ITS) inside a Virtual Reality Learning Environment (VRLE). Understanding how to use and interact with a VRLE usually represents a steep learning curve, particularly for the instructor. Our work addresses this problem by integrating an ITS into the VRLE. In doing so, timely guidance can be provided to the students, and the instructor can adapt the ITS support to best guide student learning across multiple skill levels. Our approach immerses students in an engineering-based VRLE that resembles a space launch mission control. The results of the interviews validated the ITS as a successful approach in providing students with prompt feedback while they are immersed in the VRLE. Overall, the work strengthens the validity of VRLEs as platforms that support more engaging and immersive learning experiences, and informs how to effectively integrate ITS within them.


Cell Tour: Learning About the Cellular Membrane Using Virtual Reality
October 2023
Venue: IEEE Frontiers in Education (IEEE FIE)

This paper explores the application of VR in biology education, specifically focusing on teaching the functionality of the cell membrane. The traditional methods of textbook reading and microscopic observation lack interactivity and engagement, motivating the development of a VR-based learning approach designed as a game-like experience. The study compares the knowledge gain of participants who played the VR game with a control group who watched instructional videos.


Crystal Viewpoints: Virtual reality viewpoint design for analytical measurement of crystal structures in Materials Science and Engineering
October 2023
Venue: IEEE Frontiers in Education (IEEE FIE)

This paper explores viewpoints that are feasible candidates for visualizing crystal structures in a Virtual Reality Learning Environment (VRLE). The optimal selection of viewpoints and perspectives that promote effective exploration and enhanced interaction with the crystal structures inside VRLEs remains unclear. Our work addresses this gap by comparing three distinct viewing configurations in a VRLE: (A) from the bottom base of the structure, (B) hovering on top of the stacked crystal structure, and (C) from the center of the structure. To compare the viewpoints, we analyze user preferences between the visualization approaches. Overall, this paper provides guidelines on the design of experiences related to crystalline structures in VRLEs for better student engagement and enjoyment with Materials Science content.


The AI-Medic: A Multimodal Artificial Intelligent Mentor for Trauma Surgery
September 2020
Venue: Demo for IEEE International Conference on Multimodal Interaction (IEEE ICMI)

We present the AI-Medic, the initial steps towards the development of a multimodal intelligent artificial system for autonomous medical mentoring. The system uses a tablet device to acquire the view of an operating field. This imagery is provided to an encoder-decoder neural network trained to predict medical instructions from the current view of a surgery. The network was trained using DAISI, a dataset of images and instructions providing step-by-step demonstrations of surgical procedures. The predicted medical instructions are conveyed to the user via visual and auditory modalities.


The MAGIC of E-Health: A Gesture-Based Approach to Estimate Understanding and Performance in Remote Ultrasound Tasks
May 2020
Venue: IEEE International Conference on Automatic Face and Gesture Recognition (IEEE FG)

This work presents an approach to estimate task understanding and performance during a remote ultrasound training task via gestures. These task understanding insights are obtained through the PIA metric, a score that represents how well gestures are being used to complete a shared task. To evaluate our hypothesis, participants completed a remote ultrasound training task consisting of three subtasks: vessel detection, blood extraction, and foreign body detection. Afterwards, their task understanding and performance were estimated using our PIA metric. These results indicate that a gesture-based metric can be used to estimate task understanding, which can have a positive impact on the way remote ultrasound tasks are performed and assessed.


Beyond MAGIC: Matching Collaborative Gestures using an Optimization-based Approach
May 2020
Venue: IEEE International Conference on Automatic Face and Gesture Recognition (IEEE FG)

This paper introduces three novel approaches to compare gestures performed by individuals as they collaborate to complete a physical task. Our approach relies on solving three variations of an integer optimization assignment problem, i.e. based on gesture similarity, based on temporal synchrony, and based on a combination of both. The obtained results support the proposed technique for gesture comparison. This in turn can lead to the development of better methods to evaluate collaborative physical tasks.


How About the Mentor? Effective Workspace Visualization in AR Telementoring
March 2020
Venue: IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR)

This paper presents a method for robust high-level stabilization of a mentee first-person video to provide effective workspace visualization to a remote mentor. The visualization is stable, complete, up to date, continuous, distortion free, and rendered from the mentee’s typical viewpoint, as needed to best inform the mentor of the current state of the workspace. In one study, the stabilized visualization had significant advantages over unstabilized visualization, in the context of three number matching tasks. In a second study, stabilization showed good results, in the context of surgical telementoring, specifically for cricothyroidotomy training in austere settings.


MAGIC: A Fundamental Framework for Gesture Representation, Comparison and Assessment
May 2019
Venue: IEEE International Conference on Automatic Face and Gesture Recognition (IEEE FG)

This work introduces the Multi-Agent Gestural Instructions Comparer (MAGIC), an architecture that represents and compares gestures at the morphological, semantical and pragmatical levels. MAGIC abstracts gestures via a three-stage pipeline based on a taxonomy classification, a dynamic semantics framework and a constituency parsing; and utilizes a comparison scheme based on subtrees intersections to describe gesture similarity. This work shows the feasibility of the framework by assessing MAGIC’s gesture matching accuracy against other gesture comparison frameworks during a mentor-mentee remote collaborative physical task scenario.


Robust High-Level Video Stabilization for Effective AR Telementoring
March 2019
Venue: IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR)

This work presents the design, implementation, and evaluation of a method for robust high-level stabilization of a mentee's first-person video in augmented reality (AR) telementoring. This video is captured by the front-facing built-in camera of an AR headset and stabilized by rendering from a stationary view a planar proxy of the workspace projectively texture mapped with the video feed. The result is stable, complete, up to date, continuous, distortion free, and rendered from the mentee's default viewpoint.


3rd Virtual and Augmented Reality for Good (VAR4Good) Workshop
October 2018
Venue: International Symposium on Mixed and Augmented Reality (ISMAR)

Virtual Reality (VR) and Augmented Reality (AR) are becoming mainstream. With the research and technological advances, it is now possible to use these technologies in almost all domains and places. This provides a greater opportunity to create applications intended to impact society in ways beyond entertainment. Today the world faces challenges in healthcare, the environment, and education. Now is the time to explore how VR/AR might be used to solve widespread societal challenges. The third Virtual and Augmented Reality for Good (VAR4Good) workshop will bring together researchers, developers, and industry partners to present and promote research that intends to solve real-world problems using VR/AR. The workshop will provide a platform to grow a research community that discusses challenges and opportunities to create Virtual and Augmented Reality for Good.


A First-Person Mentee Second-Person Mentor AR Interface for Surgical Telementoring
July 2018
Venue: International Symposium on Mixed and Augmented Reality (ISMAR)

This application paper presents the work of a multidisciplinary group in designing, implementing, and testing an Augmented Reality (AR) surgical telementoring system. The system acquires the surgical field with an overhead camera and transmits the video feed to the remote mentor, where it is displayed on a touch-based interaction table. The mentor annotates the video feed, and the annotations are sent back to the mentee, where they are displayed in the mentee's field of view using an optical see-through AR head-mounted display (HMD).


Augmented Visual Instruction for Surgical Practice and Training
February 2018
Venue: IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR)

This paper presents two positions about the use of augmented reality (AR) in healthcare scenarios, informed by the authors' experience as an interdisciplinary team of academics and medical practitioners who have been researching, implementing, and validating an AR surgical telementoring system. First, AR has the potential to greatly improve the areas of surgical telementoring and of medical training on patient simulators. Second, AR annotations for telementoring and for simulator-based training can be delivered either by video see-through tablet displays or by AR head-mounted displays.

Journal Articles


Improving Motivation and Learning Experience with a Virtual Tour in an Assembly Line to Learn About Productivity
July 2023
Venue: Sustainability

We propose the use of a Virtual Tour to substitute in-person visits to a manufacturing plant for a lecture on Enterprise Productivity at the School of Business Administration at our University. We present a prototype of a virtual tour of an assembly line in a simulated environment, where students can explore and learn about the manufacturing process of car seats. We performed a mixed method user study, with quantitative and qualitative data, to determine whether the application can help learn the intended concepts and improve the learning experience and motivation of students. Results show that the use of the virtual tour application increased motivation in learning.


Assessing Task Understanding in Remote Ultrasound Tasks via Gestural Analysis
August 2021
Venue: Pattern Analysis and Applications

This work presents a gesture-based approach to estimate task understanding and performance during remote ultrasound tasks. Our approach is comprised of two main components. The first component uses the Multi-Agent Gestural Instruction Comparer (MAGIC) framework to represent and compare the gestures performed by collaborators. Through MAGIC, gestures can be compared based on their morphology, semantics, and pragmatics. The second component computes the Physical Instructions Assimilation (PIA) metric, a score representing how well gestures are being used to communicate and execute physical instructions. These results demonstrate that gestures can be used to estimate task understanding in remote ultrasound tasks, which can improve how these tasks are performed and assessed.


Assessing Collaborative Physical Tasks via Gestural Analysis
November 2020
Venue: Transactions in Human-Machine Systems

This work introduces the Physical Instruction Assimilation (PIA) metric, a novel approach to estimate task understanding by analyzing the way in which collaborators use gestures to convey, assimilate, and execute physical instructions. PIA estimates task understanding by inspecting the number of necessary gestures required to complete a shared task. The results from this paper hint that gestures, in the form of the assimilation of physical instructions, can reveal insights into task understanding and complement other commonly used metrics.


The AI-Medic: An Artificial Intelligent Mentor for Trauma Surgery
October 2020
Venue: Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization

Telementoring generalist surgeons as they treat patients can be essential when in situ expertise is not available. However, unreliable network conditions, poor infrastructure, and lack of remote mentor availability can significantly hinder remote intervention. To guide medical practitioners when mentors are unavailable, we present the AI-Medic, the initial steps towards an intelligent artificial system for autonomous medical mentoring. This work provides a baseline for AI algorithms assisting in autonomous medical mentoring.


Evaluation of an augmented reality platform for austere surgical telementoring: a randomized controlled crossover study in cricothyroidotomies
May 2020
Venue: Nature Digital Medicine

Telementoring platforms can help transfer surgical expertise remotely. However, most telementoring platforms are not designed to assist in austere, pre-hospital settings. This paper evaluates the System for Telementoring with Augmented Reality (STAR), a portable and self-contained telementoring platform based on an augmented reality head-mounted display (ARHMD). The system is designed to assist in austere scenarios: a stabilized first-person view of the operating field is sent to a remote expert, who creates surgical instructions that a local first responder wearing the ARHMD can visualize as three-dimensional models projected onto the patient's body. We evaluated whether remote guidance with STAR leads to better performance of a surgical procedure, as opposed to remote audio-only guidance.


The System for Telementoring with Augmented Reality (STAR): A Head-Mounted Display to Improve Surgical Coaching and Confidence in Remote Areas
November 2019
Venue: Surgery

Although conventional telementoring systems have proven beneficial in addressing this gap, the benefits of augmented reality-based telementoring platforms for the coaching and confidence of medical personnel are yet to be evaluated. This study presents an evaluation of the effectiveness of such coaching using the System for Telementoring with Augmented Reality (STAR). STAR is a novel platform that leverages an AR head-mounted display (ARHMD) worn by the mentee surgeon to display mentor-authored operative instructions. Mentees wearing the ARHMD can visualize these expert instructions as three-dimensional (3D) overlays directly onto their field of view of the patient's body.


Telementoring in Leg Fasciotomies via Mixed-Reality: Clinical Evaluation of the STAR Platform
October 2019
Venue: Military Medicine

Point-of-injury (POI) care requires immediate specialized assistance but delays and expertise lapses can lead to complications. In such scenarios, telementoring can benefit health practitioners by transmitting guidance from remote specialists. However, current telementoring systems are not appropriate for POI care. This article clinically evaluates our System for Telementoring with Augmented Reality (STAR), a novel telementoring system based on an augmented reality head-mounted display. The system is portable, self-contained, and displays virtual surgical guidance onto the operating field. These capabilities can facilitate telementoring in POI scenarios while mitigating limitations of conventional telementoring systems.


Augmented Reality as a Medium for Improved Telementoring
October 2018
Venue: Military Medicine

Current telementoring systems have limited annotation capabilities and lack direct visualization of the future result of the specialist's surgical actions. The System for Telementoring with Augmented Reality (STAR) is a surgical telementoring platform that improves the transfer of medical expertise by integrating a full-size interaction table for mentors to create graphical annotations, with augmented reality (AR) devices to display surgical annotations directly onto the generalist's field of view. Along with the explanation of the system's features, this paper provides results of user studies that validate STAR as a comprehensive AR surgical telementoring platform. In addition, potential future applications of STAR are discussed, which are desired features that state-of-the-art AR medical telementoring platforms should have when combat trauma scenarios are in the spotlight of such technologies.


Augmented Reality Future Step Visualization for Robust Surgical Telementoring​
May 2018
Venue: Simulation in Healthcare

We present a novel method for visualization in AR telementoring that allows the trainee to visualize future steps of a surgical procedure independently of the quality of the connection to the mentor. This is in contrast to conventional AR interfaces, which only provide support for the current step of a procedure. This “future step visualization” illustrates to the trainee what the operating field will appear like after a future step of an operation has been completed, by superimposing prerecorded videos of future steps of the procedure directly onto the trainee's view of the operating field.


Surgical Telementoring without Encumbrance: A comparative study of see-through augmented reality based approaches
March 2018
Venue: Annals of Surgery

This study investigates the benefits of a surgical telementoring system based on an augmented reality head-mounted display (ARHMD) that overlays surgical instructions directly onto the surgeon's view of the operating field, without workspace obstruction. Twenty medical students performed anatomical marking and abdominal incision on a patient simulator, in one of two telementoring conditions: ARHMD and telestrator. The ARHMD system promises to improve accuracy and to eliminate focus shifts in surgical telementoring. Because ARHMD participants were able to refine their execution of instructions, task completion time increased. Unlike a tablet system, the ARHMD does not require modifying natural motions to avoid collisions.

Creative Works


13th Annual Faculty Submitted Student Work Exhibit
August 2024
Venue: Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH)

The work of four groups from my Virtual Reality class was accepted into SIGGRAPH's Annual Faculty Submitted Student Work Exhibit. Their work was presented during the conference and now resides in its archive.


The Cursed Bear
August 2024
Venue: PVFA Performance Series

Three students from my Virtual Reality class got their work showcased as part of the College of Performance, Visualization & Fine Arts Performance Series. Their multiplayer Vritual Reality game was open to public, and a caster from the Communications Department narrated as members of the audience joined to play the game in real time.