Welcome! Thanks for stopping by.
You can learn more about me and my research here.
Accessibility, Creative Support, Generative AI, Multimodal Interaction Design, User-centered Design
News [2023.10] Our paper "Understanding Digital Content Creation Needs of Blind and Low Vision People" received a best paper nomination at ASSETS 2023!
News [2023.10] I'm excited to attend the Doctoral Consortium at ASSETS 2023 this month!
News [2023.02] I passed my general exam in February 2023!
Support blind and low vision (BLV) creators in the digital space. BLV individuals want to engage in digital creative activities for reasons ranging from employment to self-expression. I conduct human-centered research to ground creative tool design in the experiences of the BLV community. Guided by a formative study with 165 BLV creators, I explore AI-assisted approaches to supporting BLV creators in visual creative tasks, using a combination of design probe studies, participatory design workshops, and usability tests. ASSETS 2023 Best Paper Nominee
Advisor: Leah Findlater (UW, HCDE)
Accessibility, BLV, Creative Support, Generative AI
Understand blind visual art patrons' experiences. The visual arts play an important role in cultural life, offering access to social heritage and self-enrichment, yet most visual art is inaccessible to blind people. Through a large-scale survey and follow-up interviews, we examined how blind people use existing alternative means of accessing visual art, as well as their perspectives on improving the experience. Based on these qualitative insights, we provide a roadmap for technology-based support for blind visual art patrons. CHI 2023
Co-first author: Franklin Li (CMU), Advisor: Patrick Carrington (CMU), Leah Findlater (UW, HCDE)
Accessibility, BLV, Visual Arts, User Research
Tour the online world with your ears. Technology use has long been dominated by the visual modality. Advances in voice technology now allow us to use our hearing and speech to navigate online content, yet most websites are not designed for voice interaction. We conducted a series of interview studies to explore this design challenge: how should different forms of online content be “translated” into an audio format? CSCW 2021, CHI 2022
Advisor: Leah Findlater (UW, HCDE)
Multimodal Interaction, Voice Design, Interface Design, Social Computing
Re-evaluate input performance at varying age and motor ability. In this study, we critically examined the nuanced relationships between age, motor ability, and input performance by analyzing data from a large-scale study that captured mouse and/or touchscreen input performance from over 700 participants. Our analysis illustrates the continuous relationships among these factors, but also suggests that knowing a user’s age and self-reported motor ability should not lead to assumptions about their input performance. ASSETS 2020 Best Paper
Advisor: Leah Findlater (UW, HCDE)
Accessibility, Input Techniques, Aging, Data Mining
Calm your emotions with furry robots. How does the sense of touch communicate emotion? We explored how people perceive and empathize with different robot movements through a set of user studies and short interviews. We learned that, under certain narratives, people can empathize with automatically generated robot emotions; however, controlling these narratives is a major challenge for affective computing. CHI 2018
Advisor: Karon MacLean, Mentor: Paul Bucci (UBC, CS)
Affective Computing, Social Robots, Haptics, Physical Prototyping
Facilitate Mixed Reality usability analysis. Performing low-cost usability testing of Virtual/Augmented Reality applications is challenging. In this project, we developed and evaluated MRAT, an analysis tool that supports usability and user experience testing of Mixed Reality applications. My contributions included tool development, user study setup, and competitive analysis. CHI 2020 Best Paper
Advisor: Michael Nebeling (UMich, iSchool)
AR/VR, Unity, Usability Study