Selected Projects

I am an Assistant Professor at Texas A&M University in the Department of Visualization, where I lead the Interactive Data and Immersive Environments (INDIE) research group. My research interests include immersive virtual reality, visual analytics, educational software, training systems, and human-computer interaction. Descriptions of selected projects are below. My CV is also available.

Visual History Tools for Analytic Process Memory

Visual provenance tools can help capture and represent the history of analytic processes. Fully detailed analysis records could enable thorough process reviews and a strong understanding of the steps taken during analysis, but such reviews would demand a great deal of time and could require a large amount of data storage. Lightweight representations of visual history are more practical for portability and quick reference, but how much do lightweight representations aid recall, and what level of detail is necessary? In this research, we investigate these questions with a controlled evaluation of how the level of detail of visual history aids affects the ability to remember and explain the analysis process.
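
As a loose illustration of what "level of detail" can mean in a visual history, the sketch below records analysis interactions and then reduces the full log to a lighter summary; the event fields, names, and reduction rule are hypothetical and are not the instruments used in our evaluation.

from dataclasses import dataclass, field
from typing import List
import time

# Hypothetical provenance record: the field names are illustrative only.
@dataclass
class AnalysisEvent:
    action: str    # e.g., "filter", "zoom", "annotate"
    target: str    # the view or data element acted on
    timestamp: float = field(default_factory=time.time)

class ProvenanceLog:
    """Capture a full interaction history and derive a lightweight summary."""

    def __init__(self):
        self.events: List[AnalysisEvent] = []

    def record(self, action: str, target: str) -> None:
        self.events.append(AnalysisEvent(action, target))

    def lightweight_history(self, max_items: int = 10) -> List[str]:
        # Keep only the most recent actions, dropping detail such as timestamps.
        recent = self.events[-max_items:]
        return [f"{e.action} -> {e.target}" for e in recent]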

Learning with Locations in 3D Virtual Environments


I am investigating how people use space and locations to help themselves learn in virtual environments. I am interested in whether learners can take advantage of 3D layouts to help understand information. Although complex arrangements of information within 3D space can potentially allow large amounts of information to be presented, accessing that information can become more difficult due to increased navigational challenges. This work is also concerned with further understanding the effects of a virtual environment's level of spatial fidelity, including the distinction between physical and virtual 3D space. The project was done in collaboration with the School of Education at Virginia Tech and uses an educational environment that teaches a history lesson about the First World War. This research explores design factors for 3D educational environments, considering information layout, environmental detail, travel techniques, and immersive display fidelity.
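
As a toy example of one layout factor, the sketch below spaces information panels evenly in a ring around the learner; the function name, radius, and arrangement are hypothetical choices rather than the layouts tested in this work.

import math

def circular_layout(num_panels, radius=3.0, eye_height=1.6):
    # Place information panels in a ring around the viewer.
    # The radius and eye height are illustrative values only.
    positions = []
    for i in range(num_panels):
        angle = 2 * math.pi * i / num_panels
        x = radius * math.cos(angle)
        z = radius * math.sin(angle)
        positions.append((x, eye_height, z))  # y axis points up
    return positions

# Example: lay out eight panels for a lesson environment.
print(circular_layout(8))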

Design and Evaluation of Visualizations for Analysis of Streaming Data


Analysis of high-volume, high-velocity data is becoming increasingly important, enabling analysts to make decisions more quickly than traditional methods allow. Human analysts need visualization tools that can keep up with the data stream to have any chance of maintaining a real-time investigation. This project includes the identification of fundamental data attributes and visualization design factors relevant to streaming data, followed by a systematic evaluation of the effectiveness of those design factors for human analysis tasks.
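
As a rough illustration of one such design factor (how much recent data a streaming view retains), the following sketch keeps a simple time-based sliding window over an incoming stream; the class name, window length, and update policy are hypothetical and are not parameters from our evaluations.

from collections import deque
import time

class StreamingWindow:
    """Keep only the most recent items of a data stream for a visualization view."""

    def __init__(self, max_age_seconds=60.0):
        self.max_age = max_age_seconds
        self._items = deque()  # (timestamp, value) pairs, oldest first

    def add(self, value, timestamp=None):
        timestamp = time.time() if timestamp is None else timestamp
        self._items.append((timestamp, value))
        self._evict(timestamp)

    def _evict(self, now):
        # Drop items that have aged out of the time window.
        while self._items and now - self._items[0][0] > self.max_age:
            self._items.popleft()

    def snapshot(self):
        # Values currently in view, ready to hand to a chart update.
        return [value for _, value in self._items]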

Supporting Cognitive Processing with Spatially Distributed Information Presentations


Through a series of controlled experiments, this work studied how people process information when items are presented at different locations in physical or virtual space. We have found evidence that spatial information layouts can support improved memory through spatial indexing, but this is not always the case. Through studies using both large 2D displays and immersive virtual reality systems, we have worked to better understand how people use locations when trying to learn spatially distributed information.

Evaluating Training Effectiveness for Visual Scanning with VR Systems


Virtual simulations allow military personnel to train for real-world scenarios. In this project (supported by the Office of Naval Research), we are studying how various hardware and software factors affect training transfer for visual scanning tasks. For example, with a task involving the identification of weapon carriers in an urban environment, our research showed significant improvements in training transfer when participants trained with increased levels of visual realism. By taking advantage of both head-mounted displays and surround-screen projection systems, we are also studying how display properties affect training and the ability to maintain orientation within complex 3D spaces.

Immersive Data Analysis for Virtual Exploration of Space


Working with NASA Langley Research Center and the National Institute of Aerospace, we are exploring the use of immersive virtual environments based on real places that are too far away for physical human exploration. The virtual medium allows interesting combinations and representations of the information NASA collects. In this research, we are developing methods for integrating multiple data visualizations into a single environment and studying the effects of varying levels of display fidelity and navigational control on the understanding of scientific data.

Collaborative Navigation in Virtual Search and Rescue

This research explores the use of a collaborative guidance system for search and rescue in a complex 3D environment. Based on an augmented reality approach, we implemented and evaluated a proof-of-concept system that allowed a scene commander and a responder to efficiently search a building using visual, nonverbal communication. We named our implementation CARNAGE (Collaborative Augmented Rescue Navigation and Guidance Escort). The project began as an entry in the 2012 IEEE 3DUI Contest, where it won both the First Place Award and the Popular Choice Award.
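
As a loose sketch of the kind of nonverbal guidance data such a system might exchange, the snippet below defines a waypoint message sent from the commander's station to the responder's augmented reality view; the message fields and names are hypothetical and are not taken from the CARNAGE implementation.

from dataclasses import dataclass, asdict
import json

# Hypothetical guidance message: the field names are illustrative only.
@dataclass
class GuidanceWaypoint:
    room_id: str   # where the responder should go next
    x: float       # position within the building's floor plan
    y: float
    floor: int
    marker: str    # visual cue rendered in the responder's view

    def to_json(self) -> str:
        # Serialize for sending from the commander's station to the responder.
        return json.dumps(asdict(self))

# Example: a scene commander flags the next room to search.
next_stop = GuidanceWaypoint(room_id="B-214", x=12.5, y=3.0, floor=2, marker="arrow")
print(next_stop.to_json())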