My research focuses on enabling interaction with computers in challenging situations, specifically while users' hands are encumbered holding everyday objects. This always-available input has a wide range of applications, from controlling mobile devices on the move to controlling systems in healthcare contexts. To build an entire stack of such novel input experiences, my work employs:
interaction design grounded in an understanding of user preferences,
creation and empirical analysis of large-scale datasets, and
machine learning models for real-time recognition.
I was recently a visiting researcher with Prof. Daniel Ashbrook at the University of Copenhagen. I did my Master's at the Media Interaction Lab, advised by Prof. Michael Haller. In the past, I have worked with Prof. Ellen Yi-Luen Do at the National University of Singapore. I have also been fortunate to work with the Parkinson's Research Group at the UBC Hospital in Vancouver and to contribute to open-source communities (Mozilla and Python).
Note about publication venues: In my area of research, ACM CHI and UIST are the top and most impactful venues (Google Scholar and Microsoft Academic both rank CHI as the #1 venue). The review process at these venues is multi-stage and highly selective, involving 4-6 international subject-matter experts per publication.
Computational Design of Sparse IMU Layouts for Sensing Fine-Grained Finger Microgestures
Under Journal Review
Adwait Sharma, Christina Salchow-Hömmen, Vimal S. Mollyn, Aditya S. Nittala, Michael A. Hedderich, Marion Koelle, Thomas Seel, Jürgen Steimle
A computational design tool that solves the multi-factorial problem of recognizing always-available input with a minimal set of sensors.
Email me for a preprint
Robust Microgestures while Grasping Everyday Objects
In Proc. of ACM CHI
Adwait Sharma, Michael A. Hedderich, Divyanshu Bhardwaj, Bruno Fruchard, Jess McIntosh, Aditya S. Nittala, Dietrich Klakow, Daniel Ashbrook, Jürgen Steimle
Easy- and rapid-to-perform gestures that are resilient to false activations.
Real-time Sensing of Surface and Deformation Gestures on Flexible, Interactive Textiles, using a Hybrid Gesture Detection Pipeline
In Proc. of ACM UIST
Patrick Parzer, Adwait Sharma, Anita Vogl, Jürgen Steimle, Alex Olwal, Michael Haller
A wearable textile that detects 22 surface and deformation gestures in real time.
Drawing on my experience with open-source communities, I strive to release my datasets for future research. These datasets can be very large; please refer to the respective publication or email me for more details.
The first dataset of single-finger movements, captured for multiple fingers across 36 everyday actions that cover a wide variety of real-world objects and grasps.
Apparatus: 11 infrared OptiTrack cameras
Data: Manually labeled 5,530 gesture trials (880K frames)
Apparatus: Noitom® Hi5 VR Glove
Data: Manually labeled 5,840 trials
[Download] Compressed ZIP file containing the OptiTrack and VR Glove datasets
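To give a sense of how the archive might be explored, here is a minimal loading sketch in Python. The archive name, per-trial CSV layout, and file-naming scheme are hypothetical assumptions for illustration only; please consult the publication for the actual schema.

```python
# Minimal sketch for inspecting the gesture dataset after unzipping.
# Assumes per-trial CSV files named like "<participant>_<gesture>_<trial>.csv"
# with one row per motion-capture frame -- a hypothetical layout; see the
# publication for the actual schema.
import zipfile
from pathlib import Path

import pandas as pd

ARCHIVE = Path("optitrack_vr_glove_dataset.zip")  # hypothetical file name
DATA_DIR = Path("dataset")

# Unpack the archive once.
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall(DATA_DIR)

# Load every trial and count frames per gesture label.
frame_counts = {}
for csv_path in sorted(DATA_DIR.rglob("*.csv")):
    trial = pd.read_csv(csv_path)
    gesture = csv_path.stem.split("_")[1]  # label encoded in the file name
    frame_counts[gesture] = frame_counts.get(gesture, 0) + len(trial)

for gesture, frames in sorted(frame_counts.items()):
    print(f"{gesture}: {frames} frames")
```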
Using IMUs placed on the hand, I captured finger microgestures and hand manipulations, both freehand and while holding 12 objects.
Apparatus: 17 Synchronized IMUs
Data: Manually labeled 13,860 trials (3.4M frames)
[Code and Data] (Coming Soon)
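As a rough illustration of how such frame data feeds the real-time recognition models mentioned above, below is a minimal sliding-window sketch in Python. The channel count, window length, stride, and feature choice are assumptions for illustration, not the pipeline from the publication.

```python
# Minimal sliding-window sketch for feeding synchronized IMU frames to a
# classifier -- illustrative only; window size, stride, and features are
# assumptions, not the published pipeline.
import numpy as np

NUM_IMUS = 17   # sensors on the hand, as in the dataset
CHANNELS = 6    # e.g., 3-axis accelerometer + 3-axis gyroscope (assumed)
WINDOW = 50     # frames per window (assumed)
STRIDE = 10     # hop between consecutive windows (assumed)

def sliding_windows(frames: np.ndarray):
    """Yield overlapping windows from a (num_frames, NUM_IMUS * CHANNELS) stream."""
    for start in range(0, len(frames) - WINDOW + 1, STRIDE):
        yield frames[start:start + WINDOW]

# Stand-in for one recorded trial: random frames with the expected shape.
trial = np.random.randn(300, NUM_IMUS * CHANNELS)

for window in sliding_windows(trial):
    features = window.mean(axis=0)  # toy feature vector; a real model would use richer features
    # a trained classifier's predict(features) call would go here
    print(features.shape)  # (102,)
```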
As a Teaching Assistant, I have designed and co-organized the following courses with Prof. Jürgen Steimle:
Human-Computer Interaction Core Lecture 2021: Bachelor's or Master's
Physical Computing 2021: Bachelor's or Master's
Interactive Computing with Augmented and Virtual Reality 2019: Bachelor's or Master's
Programming for Engineers 2018: Bachelor's or Master's
Committee Member: CHI '22 (Student Design Competition)
Reviewer: AH ('18, '19, '20), CHI ('19, '20, '21, '22), IJHCS ('19), ISS ('19), MobileHCI ('19), SIGGRAPH Asia ('19, '20), TEI ('18), UIST ('19, '20)
Student Volunteer: UIST '17, CHI '21
Session Chair: PyCon US '14