Research + Design

LOTUS ZHANG

Welcome! Thanks for stopping by.
You can learn more about me and my research here.

ABOUT ME

I'm Lotus Zhang (张菡子), a Ph.D. candidate in the Department of Human Centered Design & Engineering (HCDE) at the University of Washington, advised by Professor Leah Findlater.
I conduct interdisciplinary research at the intersection of human-computer interaction and accessibility. My research focuses on supporting the creative and leisure pursuits of people with diverse abilities through design. My work has been published at top HCI venues, including CHI, CSCW, and ASSETS. I have a background in psychology and computer science, and I use a mix of quantitative and qualitative methods in my research.
My recent research explores AI-assisted creative support for blind and low vision individuals.
KEYWORDS:

Accessibility, Creative Support, Generative AI, Multimodal Interaction Design, User-centered Design

[Avatar: a cartoon girl with pink hair tied in a ponytail, wearing headphones]

News [2023.10] Our paper "Understanding Digital Content Creation Needs of Blind and Low Vision People" received a best paper nomination at ASSETS 2023!

News [2023.10] I'm excited to attend the Doctoral Consortium at ASSETS 2023 this month!

News [2023.02] I passed my general exam in February 2023!

PROJECTS

ACCESSIBLE CREATIVE SPACE

Support blind and low vision (BLV) creators in the digital space. BLV individuals want to engage in digital creative activities for reasons ranging from employment to self-expression. I conduct human-centered research to ground creative tool design in the experiences of the BLV community. Guided by a formative study with 165 BLV creators, I explore AI-assisted approaches to supporting BLV creators in visual creative tasks, using a combination of design probe studies, participatory design workshops, and usability tests. ASSETS 2023 Best Paper Nominee

Advisor: Leah Findlater (UW, HCDE)

KEYWORDS:

Accessibility, BLV, Creative Support, Generative AI

ACCESSIBLE VISUAL ARTS

Understand blind visual art patrons' experiences. Visual arts play an important role in cultural life, offering access to social heritage and self-enrichment, yet most visual arts remain inaccessible to blind people. Through a large-scale survey and follow-up interviews, we examined how blind people adopt existing alternative means of accessing visual arts, as well as their perspectives on improving that experience. Based on these qualitative insights, we provide a roadmap for technology-based support for blind visual art patrons. CHI 2023

Co-first author: Franklin Li (CMU). Advisors: Patrick Carrington (CMU), Leah Findlater (UW, HCDE)

KEYWORDS:

Accessibility, BLV, Visual Arts, User Research

VOICE & AUDIO INTERACTION

Tour the online world by ear. Technology use has long been dominated by the visual modality. Advances in voice technology now allow us to use our hearing and speech to navigate online content, yet most websites are not designed for voice interaction. We conducted a series of interview studies to explore this design challenge: how should different forms of online content be "translated" into an audio format? CSCW 2021, CHI 2022

Advisor: Leah Findlater (UW, HCDE)

KEYWORDS:

Multimodal Interaction, Voice Design, Interface Design, Social Computing

INPUT ACCESSIBILITY

Re-evaluate input performance across age and motor ability. In this study, we critically examined the nuanced relationships between age, motor ability, and input performance by analyzing a dataset from a large-scale study that captured mouse and/or touchscreen input performance from over 700 participants. Our analysis illustrates the continuous relationships among these factors, but also suggests that knowing a user's age and self-reported motor ability should not lead to assumptions about their input performance. ASSETS 2020 Best Paper

Advisor: Leah Findlater (UW, HCDE)

KEYWORDS:

Accessibility, Input Techniques, Aging, Data Mining

CUDDLEBIT

Calm your emotions with furry robots. How does the sense of touch communicate emotion? We explored how people perceive and empathize with different robot movements through a set of user studies and short interviews. We learned that, under certain narratives, people can empathize with automatically generated robot emotions; however, controlling these narratives is a major challenge for affective computing. CHI 2018

Advisor: Karon MacLean (UBC, CS). Mentor: Paul Bucci (UBC, CS)

KEYWORDS:

Affective Computing, Social Robots, Haptics, Physical Prototyping

MIXED REALITY USABILITY

Facilitate Mixed Reality usability analysis. Performing low-cost usability testing of virtual/augmented reality applications is challenging. In this project, we developed and evaluated MRAT, an analysis tool that assists with usability and user experience testing of Mixed Reality applications. My contributions included tool development, user study setup, and competitive analysis. CHI 2020 Best Paper

Advisor: Michael Nebeling (UMich, iSchool)

KEYWORDS:

AR/VR, Unity, Usability Study

lotus.hanzi@gmail.com

Designed with ♥ by Lotus Zhang © 2023