Doyoung Lee is in the final year of his PhD in the Interactions Lab @ UNIST (Ulsan National Institute of Science and Technology). He majored in Design & Human Engineering and Computer Science Engineering at UNIST, Korea. His interests are in controlling things, prototyping, smart toys, AR/VR, and novel interfaces. His current work focuses mainly on AR/VR input methods, especially body-based interfaces. His contributions are usually in design (e.g., workshop processes, elicitation studies), development (e.g., PCB, physical computing, JAVA, and UNITY), and evaluation (e.g., statistics, study designs, user evaluations). He is also a maker! He likes to make fun and interactive toys or art (See projects!)
You can find my CV here!
How can we enable better interaction with wearables?
One way is to utilize the current system more efficiently. For example, we can make better use of the touch screen on smartwatches, as [Edgetouch, Beats, FlatTouch, TriTap, and HexaBraille] did.
While these techniques were useful for widening the design space of these wearables, fundamental problems, such as the tiny screen and the difficulty of use in mobile conditions, limit their real-world usage. From this perspective, my current interests are in body-based interfaces. Using the body as an interaction medium can provide diverse benefits: 1) a much wider design space (touches on the back of the hand, or diverse forms of interaction such as motion and pose), 2) always-available and easy access (the face, arms, and fingers are typically unobstructed by garments), 3) easy interface mapping, as they enable the use of proximate body regions (the hand for a watch and the face for glasses), 4) proprioception and passive haptics, which support more accurate and quicker inputs, and 5) socially acceptable interfaces that leverage natural body motions.
To validate these benefits, I am conducting diverse studies on body-based interfaces: "Bodily inputs for smartwatches" [Bodily inputs], "Hand-to-face input for HMDs" [FaceTouch], and "Touch inputs on multiple nails for wearables" [Nailz].