Doyoung Lee is in the last year of his PhD in the Interactions Lab @ UNIST (Ulsan National Institute of Science and Technology). He majored in Design & Human Engineering and Computer Science Engineering at UNIST, Korea. His interests are in controlling things, prototyping, smart toys, AR/VR, and novel interfaces. His current work mainly focuses on AR/VR input methods, especially body-based interfaces. His contributions are usually in design (e.g., workshop processes, elicitation studies), development (e.g., PCB, physical computing, Java, and Unity), and evaluation (e.g., statistics, study designs, user evaluations). Also, he is a maker! He likes to make fun and interactive toys or art (see Projects!)

You can find my CV here!

Research theme

How can we make better interactions with wearables?

One way is to utilize current systems more efficiently. For example, we can make use of the touch screen on smartwatches, as [Edgetouch, Beats, FlatTouch, TriTap, and HexaBraille] did.

While these techniques were useful for widening the design space of these wearables, fundamental problems, such as the tiny screen and the difficulty of use in mobile conditions, limit their real-world usage. From this perspective, my current interests are in body-based interfaces. Using the body as an interaction medium can provide diverse benefits: 1) a much wider design space (touches on the back of the hand, or diverse forms of interaction such as motion and pose), 2) always-available, easy access (the face, arms, and fingers are typically unobstructed by garments), 3) easy interface mapping, as body-based interfaces enable the use of proximate body regions (the hand for a watch, the face for glasses), 4) proprioception and passive haptics that support more accurate and quicker input, and 5) social acceptability, by leveraging natural body motions.

To validate these benefits, I'm conducting diverse studies on body-based interfaces: "Bodily inputs for smartwatches" [Bodily inputs], "Hand-to-face input for HMDs" [FaceTouch], and "Touch inputs on multiple nails for wearables" [Nailz].

Conference Papers

  1. DoYoung Lee, SooHwan Lee and Ian Oakley (2020). "Nailz: Sensing Hand Input with Touch Sensitive Nails". In Proceedings of ACM CHI'20, Honolulu, Hawaii, USA. [link]

  2. DoYoung Lee, Youryang Lee, Yonghwan Shin and Ian Oakley (2018). "Designing Socially Acceptable Hand-to-Face Input". In Proceedings of ACM UIST'18, Berlin, Germany. [link]

  3. Rasel Islam, DoYoung Lee, Liza Suraiya Jahan and Ian Oakley (2018). "GlassPass: Tapping Gestures to Unlock Smart Glasses". In Proceedings of Augmented Human 2018, Seoul, Korea. [link]

  4. Gil, H.J., Lee, D.Y., Im, S.G. and Oakley, I. (2017). "TriTap: Identifying Finger Touches on Smartwatches". In Proceedings of ACM CHI'17, Denver, CO, USA. [link]

  5. Oakley, I., Lindahl, C., Le, K., Lee, D.Y. and Islam, R.M.D. (2016). "The Flat Finger: Exploring Area Touches on Smartwatches". In Proceedings of ACM CHI'16, San Jose, CA, USA. [link]

  6. Oakley, I., Lee, D.Y., Islam, R.M.D. and Esteves, A. (2015). "Beats: Tapping Gestures for Smart Watches". In Proceedings of ACM CHI'15, Seoul, Republic of Korea. [link]

  7. Yang, T., Lee, D.Y., Kwak, Y., Choi, J., Kim, C. and Kim, S.P. (2015). "Evaluation of TV Commercials Using Neurophysiological Responses". Journal of Physiological Anthropology, 34(1), 19. [link]

  8. Oakley, I. and Lee, D.Y. (2014). "Interaction on the Edge: Offset Sensing for Small Devices". In Proceedings of ACM CHI'14, Toronto, Canada. [link]

  9. Yang, T.Y., Lee, D.Y. and Kim, S.P. (2014). "Development of Neural Indices from the Human Encephalography to Evaluate TV Commercials". Ergonomics Society of Korea, 380-384. [link]