Research + Design

LOTUS ZHANG

About Me

I'm a Mixed-Methods User Researcher with ~10 years of academic and industry experience across Google, Meta, Microsoft, and the University of Washington. I combine quantitative rigor (large-scale surveys, statistical analysis) with qualitative depth (participatory design, interviews, usability testing) to make sense of complex, ambiguous problems and deliver clear, actionable insights.

I thrive in cross-functional teams and bring adaptability and inclusivity to projects spanning diverse domains, from advancing accessibility innovation and shaping human-centered AI to tackling challenges in trust, safety, and emerging technologies.

Selected Projects


Voice & Audio Interaction

Reimagining the web for ears. Most online content assumes visual interaction. With the rise of voice technology, I co-led interview studies to explore how different content types can be translated into audio for navigation and consumption. Findings informed design considerations for multimodal interfaces where hearing and speech complement visual input.

Published: CSCW 2021, CHI 2022

KEYWORDS:

Voice Interaction, Interface Design, Multimodal UX

Input Accessibility

Age, motor ability, and input performance. Analyzed data from 700+ participants to examine how age and motor ability affect input performance with mouse and touchscreen. Results showed continuous variation rather than clear-cut categories—challenging assumptions that age or disability can directly predict performance.

Published: ASSETS 2020 (Best Paper)

KEYWORDS:

Accessibility, Input Techniques, Aging, Data Analysis

CuddleBit

Robotic touch for emotion regulation. Explored how touch and movement in furry robots can convey emotion. Through user studies and interviews, we found people empathize with robot “emotions” under certain narratives—highlighting opportunities and risks in affective computing.

Published: CHI 2018

KEYWORDS:

Haptics, Social Robots, Affective Computing

Mixed Reality Usability

Making AR/VR testing more accessible. Contributed to the development and evaluation of MRAT, a tool for low-cost usability testing of Mixed Reality applications. My contributions included tool development, user study setup, and competitive analysis.

Published: CHI 2020 (Best Paper)

KEYWORDS:

AR/VR, Usability Studies, Prototyping

lotus.hanzi@gmail.com

Designed with ♥ by Lotus Zhang © 2025