As a Human-Computer Interaction researcher specializing in eye tracking, I develop gaze-assisted, multi-modal, accessible interaction methods for people with impairments and disabilities. As a Machine Learning researcher, I perform predictive analytics using eye movement data in the domains of security, education, and economics.

Texas A&M University

Vijay Rajanna

   Who Am I?

I am a Human-Computer Interaction and Machine Learning researcher at the Sketch Recognition Lab, Department of Computer Science and Engineering. I recently defended my Ph.D. dissertation, advised by Dr. Tracy Hammond. At the Sketch Recognition Lab, our work spans multiple dimensions of application-oriented research in Computer Science. We have developed intelligent algorithms for sketch, gesture, and activity recognition. Our major contributions include a gaze-assisted multi-modal interaction framework, a haptic-assisted navigation system, wearable devices for indoor navigation by users with vision impairments, a personal health assistant for physical and mental wellness, and a social networking platform for kids, among others.

Research

   Summary

Research Area:

Human-Computer Interaction, Human-Centered Computing, HCI Experimental and Observational Methods, Machine Learning, Statistical Analysis (parametric and non-parametric), Eye Tracking Analytics, Interaction Modeling (Fitts' Law), Gaze-assisted Interaction, Virtual Reality (VR), Gaze-Contingent VR, Accessibility, Cognitive Systems, Haptics, Multi-modal Interaction, Gesture Recognition, Activity Recognition, Sketch Recognition, Wearables, Data Visualization, and Cognitive Psychology.

Research Statement:

My work is interdisciplinary, spanning Computer Science, Cognitive Psychology, and Personal Health. In my doctoral research, I focus on developing a gaze-assisted, multi-modal paradigm for interacting with computers. Eye movement-based interactions are crucial in scenarios where a user cannot use conventional input devices like a mouse and keyboard. Largely, there are two such scenarios: 1) situational impairments, i.e., a user's hands are engaged in a task, for example, driving, performing surgery, playing music, or holding things; 2) impairments and disabilities present from birth or caused by an injury. In both scenarios, gaze-assisted interaction serves as a rich, contextual, and non-invasive technology that helps people interact with computers.

Gaze-assisted, multi-modal interaction:

I have developed a gaze- and foot-based interaction framework to achieve accurate "point and click" interactions and to perform dwell-free text entry on computers. I have also developed a gaze gesture-based framework for user authentication, to counter shoulder-surfing attacks, and for interacting with a wide range of computer applications through a common repository of eye gestures. In addition, I have been exploring gaze-assisted interactions in virtual and augmented reality; specifically, I investigate the feasibility of using eye movements for text entry in VR.

Eye movement-based predictive analytics:

In addition to developing gaze-assisted interactions, I perform predictive analytics using eye movement data. Relevant works include identifying which aspects of a candidate's resume influence a recruiter's decision to hire or not to hire, and predicting the perceived difficulty of problems as students solve them on intelligent tutoring systems. I envision that when eye tracking becomes pervasive, many consumer devices will support hands-free interactions. Users with accessibility needs could then use the same devices and have the same experience as able-bodied people.
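Predictive analytics of this kind typically starts by segmenting raw gaze samples into fixations, from which features such as fixation count and mean fixation duration are computed. As a generic illustration (not the implementation used in my own work), here is a minimal sketch of the classic dispersion-threshold identification (I-DT) algorithm; the sample format and thresholds are illustrative assumptions:

```python
def idt_fixations(samples, dispersion_threshold=1.0, min_duration=3):
    """Segment (t, x, y) gaze samples into fixations using I-DT.

    A window of at least `min_duration` consecutive samples counts as a
    fixation when (max(x) - min(x)) + (max(y) - min(y)) stays at or
    below `dispersion_threshold`. Returns a list of tuples:
    (start_index, end_index, centroid_x, centroid_y).
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + min_duration
        if j > n:
            break
        if _dispersion(samples[i:j]) <= dispersion_threshold:
            # Grow the window while dispersion stays under the threshold.
            while j < n and _dispersion(samples[i:j + 1]) <= dispersion_threshold:
                j += 1
            xs = [x for _, x, _ in samples[i:j]]
            ys = [y for _, _, y in samples[i:j]]
            fixations.append((i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            # No fixation starts here; slide the window by one sample.
            i += 1
    return fixations


def _dispersion(window):
    """Spread of a window of samples: x-range plus y-range."""
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))
```

Given a stream containing two stable gaze clusters separated by a saccade, the sketch returns two fixations with their centroids; downstream models then consume aggregate statistics over such fixations rather than the raw samples.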

Motivation:

When personal computers were first introduced in the 80s, they were meant to be used for specific tasks, in specific ways, and with specific input and output units. However, with advances in ubiquitous computing, i.e., computing available anywhere, anytime, and on any device, the notion that computers are used only in structured spaces is becoming obsolete.

Today, we live in a world where we are constantly interacting with computing devices while sitting (desktop/laptop), moving (mobile phone/fitness trackers), and even sleeping (sleep trackers). These scenarios raise many questions: How should these interactions be designed? What is the minimal human effort required? How can we achieve efficiency? Can we multitask? While we try to answer these questions by developing novel interactions and supplementary devices, we must take a "human-centered" approach. Why? Because the best technology fits seamlessly into people's lives, to the point where it is not even considered a technology, and is forgiving of human errors.

I strongly believe future research in HCI will rely heavily on natural human inputs like patterns of eye movements, physical activity, emotions, stimulus-response compatibility, etc. My research is largely motivated by leveraging these implicit human inputs to create intuitive, multi-modal interaction systems that enable rich, hands-free interactions and also serve as accessible technology for users with physical impairments.

   Education

Doctor of Philosophy, Computer Science
Dissertation defended on September 7, 2018

Master of Science, Software Systems
Graduation - December 2011

Bachelor of Engineering, Computer Science and Engineering
Graduation - June 2008

   Work Experience

Over six years of industry experience working for research and development firms. Most of my projects span the domains of HCI, intelligent systems, machine learning, and data analytics.


Organization

Ocular Data Systems, Inc., Pasadena, California. [Advised by: Dr. Rich Diephuis and Ron Waldorf]

Designation

Research Internship. June 1st, 2018 – August 24th, 2018, and January 10th, 2018 – May 3rd, 2018

Nature of work

Developed the initial version of a virtual reality eye tracking system and related algorithms to analyze pupil responses and smooth pursuit eye movements. The system has applications in workplace safety.


Organization

GazeIT Lab, Technical University of Denmark, Copenhagen, Denmark. [Advised by: Dr. John Paulin Hansen]

Designation

Research Internship. May 16th, 2017 - August 31st, 2017.

Nature of work

Developed gaze typing interfaces for virtual reality, and investigated how the keyboard design, selection method, and motion in the field of view impact typing performance and user experience.


Organization

IBM - Cognitive Environments Lab, Watson Research Center, New York. [Advised by: Dr. Rachel Bellamy and Dr. Maryam Ashoori]

Designation

Research Internship. May 16th, 2016 - August 26th, 2016.

Nature of work

Developed a visual feedback system and implemented meeting-management capabilities in a voice-enabled cognitive agent powered by IBM Watson cognitive services.


Organization

IBM - CIO Lab, New York.

Designation

Research Internship. May 26th, 2015 - August 14th, 2015.

Nature of work

Developed mobile interfaces, with a human-centered design, that assist in navigating robots in a data center, generating floor maps, monitoring temperature, and managing assets.


Organization

National Instruments Corporation, Austin, Texas.

Designation

Intern/Co-op - Software Engineering, IMAQ Software. May 19, 2014 - August 8, 2014

Nature of work

1) Redesigned the user interface and interaction aspects of the IMAQ software. 2) Conducted a feasibility study on leveraging vision-based object-tracking algorithms to track user activities.


Organization

National Instruments, Bangalore, India

Designation

Staff Software Engineer. July 2012 to July 2013

Nature of work

1) Conducted research and development in computer vision, implementing image-processing algorithms deployed in industrial inspection systems. 2) Parallelized vision algorithms for multicore execution and ported them to various real-time architectures.


Organization

NOKIA India Private Limited, Bangalore, India.

Designation

Engineer, System Software. June 2010 to July 2012

Nature of work

Conducted research and development toward implementing a FlashLite adaptation layer on the Symbian operating system to enable Flash content in WebKit and Symbian WRT-based browsers, as well as in standalone applications.


Organization

Robert Bosch Engineering and Business Solutions, Bangalore, India.

Designation

Software Engineer. July 2008 to May 2010

Nature of work

Developed Windows-based applications that aid in the calibration and analysis of Electronic Control Units in automobiles; calibration tunes engine parameters to achieve desired performance metrics.


Organization

NDS Services Pay TV Technologies, Bangalore, India.

Designation

Project Trainee - Internship. February 2008 to July 2008

Nature of work

Developed applications for Set-Top Boxes that are broadcast on the MediaHighway Core platform.

   Teaching, Recognition, Organization, and Professional Service


TEACHING

Spring 2017: Instructor of Record

  Course: CSCE 206 - Structured Programming in C++.

  Number of students: 71.

  Responsibilities: Composing the syllabus, teaching, creating exams and assignments, and final grading

  Students' Feedback: URL

Fall 2016: Instructor of Record

  Course: CSCE 206 - Structured Programming in C++.

  Number of students: 46.

  Responsibilities: Composing the syllabus, teaching, creating exams and assignments, and final grading

  Students' Feedback: URL


RECOGNITION

First place winner - Student Research Week, graduate engineering poster category - Texas A&M University, Spring 2018.

Finalist - US south regional 3 Minute Thesis Competition, Conference of Southern Graduate Schools. Arkansas, Spring 2018.

Invited speaker at the Professional Development Award Ceremony by the Office of Graduate and Professional Studies. TAMU, Spring 2018.

Winner - challenge by the College of Engineering - Hackathon ’18 - Diversifying Space and Place (24 Hrs) - Texas A&M University, Spring 2018.

First place winner - 3 Minute Thesis Competition, doctoral category - Texas A&M University, Fall 2017.

People's choice award - 3 Minute Thesis Competition, doctoral category - Texas A&M University, Fall 2017.

Texas A&M representative at the - US south regional 3 Minute Thesis Competition, doctoral category - Fayetteville, AR in Feb 2018.

Certificate of appreciation for supporting Teaching for Transformational Learning, Texas A&M, Spring 2018.

Awarded the Basic Level Professional Development Certificate by the Office of Graduate and Professional Studies. Texas A&M, Spring 2018.

The Graduate and Professional Student Council (GPSC) Travel Award sponsored by the Vice President of Research, Texas A&M, Fall 2017.

The Office of Graduate and Professional Studies (OGAPS) Research and Presentation Grant, Texas A&M, Fall 2017.

Received Graduate Teaching Fellow award by the Dwight Look College of Engineering - Texas A&M University, Spring 2017.

Received X - Factor award at the TAMU Diversity Accessibility Hackathon (24 Hrs) - College of Architecture, Texas A&M, Feb 2017.

Awarded second place in the IAP Research Poster Competition at Texas A&M, March 2017.

Finalist - ACM Student Research Competition, TAPIA 2017, Atlanta, GA.

Received Graduate Teaching Fellow award by the Dwight Look College of Engineering - Texas A&M University, Fall 2016.

Publication “Let Me Relax” received the Best Student Paper award at MobiHealth 2015, London, UK.

Awarded first place in the Graduate Poster Competition at the TAPIA 2016 conference, Austin, TX.

Awarded third place in the Student Research Competition at the ASSETS 2016 conference, Reno, NV.

Recognized as a CIRTL Associate - Fellow of the Academy for Future Faculty, Texas A&M University, Fall 2016.

Awarded second place in the university-wide “Student Research Week 2015” at Texas A&M University.

Awarded second place in the Accessibility Mapping Challenge at GIS Day 2016, Texas A&M University.

MS thesis research "Accelerometer-Based Gesture Recognition Framework" was recognized as an excellent project.

Department of Computer Science and Engineering one time scholarship - Spring 2016.

ACM Travel Grant - Student Research Competition - ASSETS 2016, Reno, Nevada - Fall 2016

ACM Travel Grant - Doctoral consortium and Poster presentation - IUI 2016, Sonoma, California - Spring 2016

TAPIA Scholarship - Doctoral consortium and Poster presentation - TAPIA 2017, Atlanta, Georgia - Fall 2017

TAPIA Scholarship - Doctoral consortium and Poster presentation - TAPIA 2016, Austin, Texas - Fall 2016

Department of Computer Science and Engineering Travel Grant - CHI 2017, Denver, Colorado - Spring 2017

Department of Computer Science and Engineering Travel Grant - ETRA 2016, Charleston, South Carolina - Spring 2016

Department of Computer Science and Engineering Travel Grant - SAP 2016, Anaheim, California - Summer 2016

Department of Computer Science and Engineering Travel Grant - TAPIA 2015, Boston, Massachusetts - Spring 2015

Department of Computer Science and Engineering Travel Grant - TAPIA 2014, Seattle, Washington - Spring 2014

Honored with the Best Performance Award for the year 2009 at Bosch.

Received Appreciation Award at NOKIA for FlashLite design and other contributions.


ORGANIZATIONS

LEAD - Google Developer Group TAMU, Fall 2017 - Spring 2018

LEAD - Google Developer Group TAMU, Fall 2016 - Spring 2017

Organizer - Google Developer Group TAMU, Spring 2016 - Fall 2015

Executive Committee - TAMU Academy for Future Faculty, Fall 2016 - Spring 2017

Vice President, Social Events - Computer Science and Engineering Graduate Student Association, Spring 2015 - Fall 2015

Social Officer - Computer Science and Engineering Graduate Student Association, Spring 2014 - Fall 2014


PROFESSIONAL SERVICE

Reviewer - Conferences and Journals:
  • 2018: IUI, CHI, DIS, ETRA.
  • 2017: CHI, UIST, IDC, DIS, IUI, CSCW, CIPTTE.
  • 2016: UIST, IUI, ETRA, CIPTTE.
  • 2015: WIPTTE, IJHCS, CAD

Mentoring - Research

  • Aadil Hamid (MS, 2016 - 2018)
  • Rahul Bhagat (MS, 2016 - 2018)
  • Amy Li (BS, 2017 - 2018)
  • Jeff Zaho (BS, 2017 - 2018)
  • Purnendu Kaul (MS, 2014 - 2016)
  • Siddhartha Karthik (MS, 2014 - 2016)

Mentoring - Senior Capstone

  • Spring 2018: GoCooking, Team:
  • Fall 2017: Evaluation of Artistic Works Through Web-based Eye Tracking, Team: Bailey Bauman, Regan Gunhouse, Antonia Jones, Willer Da Silva, Shaeeta Sharar.
  • Fall 2015: PhysiotherAPPy, Team: Alex Davis, Rin Davis, Jacob Hile, Kavila Krishnan, Patrick Sheehan.
  • Fall 2015: Gaze Interaction on Larger Displays, Team: Nick Hanneman.
  • Fall 2014: KinoHaptics, Team: Patrick Vo, Jerry Barth, Matthew Mjelde, Trevor Grey
  • Spring 2014: PebFit, Team: Chris Lowetz, Hunter McElroy, Aurea Del Moral

Mentoring - Aggie Challenge: Fall 2017, Spring 2017, Fall 2015, Spring 2015, Fall 2014, Spring 2014.

Judge: Student Research Week - 2017, IDEA challenge 2015.




Built with Bootstrap, Startbootstrap, Dreamcodes, CodePen, Pixabay, Unsplash, and some love and passion for web programming...