The complexity of the human eye is both fascinating and full of opportunities for further exploration.
As an eye-tracking researcher who specializes in HCI, I develop gaze-assisted, multi-modal interaction methods.

Texas A&M University

Vijay Rajanna

   Who Am I?

I am a Ph.D. candidate in the Department of Computer Science and Engineering at Texas A&M University. I pursue research in Human-Computer Interaction at the Sketch Recognition Lab, advised by Dr. Tracy Hammond. At the Sketch Recognition Lab, our work spans multiple dimensions of application-oriented research in Computer Science. We have developed intelligent algorithms for sketch, gesture, and activity recognition. Our major contributions include a gaze-augmented, multi-modal interaction framework; a haptic-assisted navigation system; wearable devices for indoor navigation by users with vision impairments; a personal health assistant for physical and mental wellness; and a social networking platform for kids.




When personal computers were first introduced back in the 80s, they were meant to be used for specific tasks, in specific ways, and with specific input and output units. However, with the advancements in ubiquitous computing, i.e., computing available anywhere, anytime, and on any device, the notion that computers are used in structured spaces is becoming obsolete.

Today, we live in a world where we constantly interact with computing devices while sitting (desktop/laptop), moving (mobile phone/fitness tracker), and even sleeping (sleep tracker). These scenarios raise many questions: How should these interactions be designed? What is the minimal effort required? How can we achieve efficiency? Can we multitask? While we try to answer these questions by developing novel interactions and supplementary devices, we must take a "human-centered" approach. Why? Because the best technology fits seamlessly into people’s lives, to the point where it is no longer even perceived as technology, and is forgiving of human errors.

I strongly believe that future research in HCI will rely heavily on natural human inputs like the pattern of eye movements, physical activity, emotions, and stimulus-response compatibility. My research is largely motivated by leveraging these implicit human inputs to create intuitive, multi-modal interaction systems that enable rich, hands-free interactions and also serve as accessible technology for users with physical impairments.

Research Statement:

My work is interdisciplinary in nature, spanning Computer Science, Cognitive Psychology, and Personal Health. In my doctoral research, I focus on developing a gaze-assisted, multi-modal paradigm for interacting with computers. Eye movement-based interactions are crucial in scenarios where a user cannot use conventional input devices like a mouse and keyboard. Broadly, there are two such scenarios: 1) situational impairments, i.e., the user's hands are engaged in a task, for example, driving, operating on a patient, playing music, or holding things; and 2) physical impairments or disabilities, either congenital or resulting from an injury. In both scenarios, gaze-assisted interaction serves as a rich, contextual, and non-invasive technology that helps people interact with computers.

In this regard, I have developed a gaze- and foot-based interaction framework to achieve accurate "point and click" interactions and to perform dwell-free text entry on computers. I have also developed a gaze gesture-based framework for user authentication that counters shoulder-surfing attacks and for interacting with a wide range of computer applications through a common repository of eye gestures. Lastly, I have been exploring gaze interactions in virtual and augmented reality; specifically, I investigate the feasibility of using eye movements for text entry in VR. I envision that when eye tracking becomes pervasive, many consumer devices will support hands-free interactions, and users with accessibility needs will be able to use the same devices and have the same experience as able-bodied users.

Research Area:
Human-Computer Interaction, Eye Tracking, Gaze-assisted Interaction, Cognition in Human Computer Interaction, Eye Tracking Analytics, Multi-modal Interaction, Foot-operated Input Modality, Virtual Reality, Augmented Reality, Haptics, Gesture Recognition, Activity Recognition, Sketch Recognition, Wearable Devices, Pervasive Computing, Cognitive Agents.


Doctor of Philosophy, Computer Science
Expected Graduation - Summer 2017

Master of Science, Software Systems
Graduation - December 2011

Bachelor of Engineering, Computer Science and Engineering
Graduation - June 2008

   Work Experience

Over five years of industry experience working on research and development projects at leading product development firms.


Ocular Data Systems, Inc., Pasadena, California. [advised by: Dr. Rich Diephuis and Ron Waldorf]


Research Internship. Jan 10th, 2018 - May 3rd, 2018.

Nature of work

Developed a prototype virtual-reality eye-tracking device and related algorithms, with an application to workplace safety.


GazeIT Lab, Technical University of Denmark, Copenhagen, Denmark. [advised by: Dr. John Paulin Hansen]


Research Internship. May 16th, 2017 - August 31st, 2017.

Nature of work

Developed gaze typing interfaces for virtual reality, and investigated how the keyboard design, selection method, and motion in the field of view impact typing performance and user experience.


IBM - Cognitive Environments Lab, Watson Research Center, New York. [advised by: Dr. Rachel Bellamy and Dr. Maryam Ashoori]


Research Internship. May 16th, 2016 - August 26th, 2016.

Nature of work

Developed a voice-enabled cognitive agent that can transform a normal space into a cognitive environment to support user tasks. The system was powered by IBM Watson cognitive services.


IBM - CIO Lab, New York.


Research Internship. May 26th, 2015 - August 14th, 2015.

Nature of work

Developed mobile interfaces with human-centered design that assist in navigating robots in a data center, generating floor maps, monitoring temperature, and managing assets.


National Instruments Corporation, Austin, Texas.


Intern/Co-op-Software Eng/IMAQ Software. May 19, 2014 - August 8, 2014

Nature of work

1) Redesigned the user interface and interaction aspects of the IMAQ software. 2) Conducted a feasibility study on leveraging vision-based object-tracking algorithms to track user activities.


National Instruments, Bangalore, India


Staff Software Engineer. July 2012 to July 2013

Nature of work

1) Conducted research and development in computer vision, implementing image-processing algorithms deployed in industrial inspection systems. 2) Parallelized vision algorithms for multicore execution and ported them to various real-time architectures.


NOKIA India Private Limited. Bangalore, India.


Engineer, System Software. June 2010 to July 2012

Nature of work

Conducted research and development toward implementing the FlashLite adaptation layer on the Symbian operating system to enable Flash content in WebKit- and Symbian WRT-based browsers and in standalone applications.


Robert Bosch Engineering and Business Solutions, Bangalore. India


Software Engineer. July 2008 to May 2010

Nature of work

Developed Windows-based applications that aid in the calibration and analysis of Electronic Control Units (ECUs) in automobiles. Calibration tunes engine parameters to achieve desired performance metrics.


NDS Services Pay TV Technologies, Bangalore. India.


Project Trainee - Internship. February 2008 to July 2008

Nature of work

Developed Set-Top Box applications that are broadcast on the Media Highway Core platform.

   Teaching, Recognition, Organization, and Professional Service


Spring 2017: Instructor of Record

  Course: CSCE 206 - Structured Programming in C++.

  Number of students: 71.

  Responsibilities: Composing the syllabus, teaching, creating exams and assignments, and final grading.

  Students' Feedback: URL

Fall 2016: Instructor of Record

  Course: CSCE 206 - Structured Programming in C++.

  Number of students: 46.

  Responsibilities: Composing the syllabus, teaching, creating exams and assignments, and final grading.

  Students' Feedback: URL


First place winner - Student Research Week, graduate engineering poster category - Texas A&M University, Spring 2018.

Finalist - US South Regional 3 Minute Thesis Competition, Conference of Southern Graduate Schools, Arkansas, Spring 2018.

Winner - College of Engineering challenge, Hackathon ’18: Diversifying Space and Place (24 Hrs) - Texas A&M University, Spring 2018.

First place winner - 3 Minute Thesis Competition, doctoral category - Texas A&M University, Fall 2017.

People's choice award - 3 Minute Thesis Competition, doctoral category - Texas A&M University, Fall 2017.

Texas A&M representative at the US South Regional 3 Minute Thesis Competition, doctoral category - Fayetteville, AR, Feb 2018.

The Graduate and Professional Student Council (GPSC) Travel Award sponsored by the Vice President of Research, Texas A&M, Fall 2017.

The Office of Graduate and Professional Studies (OGAPS) Research and Presentation Grant, Texas A&M, Fall 2017.

Received Graduate Teaching Fellow award by the Dwight Look College of Engineering - Texas A&M University, Spring 2017.

Received X - Factor award at the TAMU Diversity Accessibility Hackathon (24 Hrs) - College of Architecture, Texas A&M, Feb 2017.

Awarded second place in the IAP Research Poster Competition at Texas A&M, March 2017.

Finalist - ACM Student Research Competition, TAPIA 2017, Atlanta, GA.

Received Graduate Teaching Fellow award by the Dwight Look College of Engineering - Texas A&M University, Fall 2016.

Publication “Let Me Relax” received the Best Student Paper award at MobiHealth 2015, London, UK.

Awarded first place in the Graduate Poster Competition at the TAPIA 2016 conference, Austin, TX.

Awarded third place in the Student Research Competition at the ASSETS 2016 conference, Reno, NV.

Recognized as a CIRTL Associate - Fellow of the Academy for Future Faculty, Texas A&M University, Fall 2016.

Awarded second place in the university-wide “Student Research Week 2015” at Texas A&M University.

Awarded second place in the Accessibility Mapping Challenge at GIS Day 2016, Texas A&M University.

MS thesis research “Accelerometer-Based Gesture Recognition Framework” was awarded as an excellent project.

Department of Computer Science and Engineering one time scholarship - Spring 2016.


ACM Travel Grant - Student Research Competition - ASSETS 2016, Reno, Nevada - Fall 2016

ACM Travel Grant - Doctoral consortium and Poster presentation - IUI 2016, Sonoma, California - Spring 2016

TAPIA Scholarship - Doctoral consortium and Poster presentation - TAPIA 2017, Atlanta, Georgia - Fall 2017

TAPIA Scholarship - Doctoral consortium and Poster presentation - TAPIA 2016, Austin, Texas - Fall 2016

Honored with the Best Performance Award for the year 2009 at Bosch.

Received Appreciation Award at NOKIA for FlashLite design and other contributions.


LEAD - Google Developer Group TAMU, Fall 2017 - Spring 2018

LEAD - Google Developer Group TAMU, Fall 2016 - Spring 2017

Organizer - Google Developer Group TAMU, Fall 2015 - Spring 2016

Executive Committee - TAMU Academy for Future Faculty, Fall 2016 - Spring 2017

Vice President, Social Events - Computer Science and Engineering Graduate Student Association, Spring 2015 - Fall 2015

Social Officer - Computer Science and Engineering Graduate Student Association, Spring 2014 - Fall 2014


  • 2018: IUI, CHI, DIS, ETRA.
  • 2016: UIST, IUI, ETRA, CIPTTE.
  • 2015: WIPTTE, IJHCS, CAD

Judge: Student Research Week - 2017, IDEA challenge 2015.

Mentoring - Research: Aadil Hamid (MS, Spring 2018), Rahul Bhagat (MS, Spring 2018), Amy Li (BS, Spring 2018), Jeff Zaho (Spring 2018), Siddhartha Karthik (MS, Summer 2016).

Mentoring - Senior Capstone

  • Fall 2017: Evaluation of Artistic Works Through Web-based Eye Tracking, Team:
  • Fall 2015: PhysiotherAPPy, Team: Alex Davis, Rin Davis, Jacob Hile, Kavila Krishnan, Patrick Sheehan.
  • Fall 2015: Gaze Interaction on Larger Displays, Team: Nick Hanneman.
  • Fall 2014: KinoHaptics, Team: Patrick Vo, Jerry Barth, Matthew Mjelde, Trevor Grey
  • Spring 2014: PebFit, Team: Chris Lowetz, Hunter McElroy, Aurea Del Moral

Mentoring - Aggie Challenge: Fall 2017, Spring 2017, Fall 2015, Spring 2015, Fall 2014, Spring 2014.

Copyright © Vijay Rajanna, 2017

Built with Bootstrap, Start Bootstrap, Dreamcodes, CodePen, Pixabay, Unsplash, and some Love and Passion for web programming...