Research + Design

LOTUS ZHANG

About Me

I'm a Mixed-Methods HCI & UX Researcher with 9 years of experience across academia and industry (Google, Meta, and Microsoft). My research spans accessibility, human-centered AI, creativity support, and privacy & trust. I combine quantitative rigor (large-scale surveys, statistical analysis), qualitative depth (participatory design, interviews, usability testing), and system prototyping to make sense of complex, ambiguous problems and deliver clear, actionable insights.

I'm currently a final-year PhD candidate advised by Professor Leah Findlater at the University of Washington, and I'm on the academic and industry job market.
[Photo: Lotus Zhang, outdoors near a beach]

Selected Projects

Voice & Audio Interaction

Reimagining the web for ears. Most online content assumes visual interaction. With the rise of voice technology, I co-led interview studies to explore how different content types can be translated into audio for navigation and consumption. Findings informed design considerations for multimodal interfaces where hearing and speech complement visual input.

Published: CSCW 2021, CHI 2022

KEYWORDS:

Voice Interaction, Interface Design, Multimodal UX

Input Accessibility

Age, motor ability, and input performance. Analyzed data from 700+ participants to examine how age and motor ability affect input performance with mouse and touchscreen. Results showed continuous variation rather than clear-cut categories, challenging the assumption that age or disability alone can predict performance.

Published: ASSETS 2020 (Best Paper)

KEYWORDS:

Accessibility, Input Techniques, Aging, Data Analysis

CuddleBit

Robotic touch for emotion regulation. Explored how touch and movement in furry robots can convey emotion. Through user studies and interviews, we found that people empathize with robot “emotions” under certain narratives, highlighting both opportunities and risks in affective computing.

Published: CHI 2018

KEYWORDS:

Haptics, Social Robots, Affective Computing

Mixed Reality Usability

Making AR/VR testing more accessible. Supported the development and evaluation of MRAT, a tool to support low-cost usability testing for Mixed Reality applications. My contributions included tool development, user study setup, and competitive analysis.

Published: CHI 2020 (Best Paper)

KEYWORDS:

AR/VR, Usability Studies, Prototyping

lotus.hanzi@gmail.com

Designed with ♥ by Lotus Zhang © 2025