Hi!

I am a first-year PhD student in Computer Science and Engineering at the University of Michigan, working on HCI and accessibility. I am fortunate to be advised by Prof. Anhong Guo.

[Photo: Yuxuan Liu wearing round glasses, a black shirt, and a backpack outdoors, with trees and a distant city skyline behind him.]

Publications

  1. HandProxy: Expanding the Affordances of Speech Interfaces in Immersive Environments with a Virtual Proxy Hand
    Chen Liang, Yuxuan Liu, Martez Mott, and Anhong Guo
    IMWUT/UbiComp ’25
    HandProxy works in three stages: the user issues a speech command, the system interprets it to control a virtual proxy hand, and the proxy hand then performs the hand interactions in the virtual environment on the user’s behalf.
  2. WorldScribe: Towards Context-Aware Live Visual Descriptions
    Ruei-Che Chang, Yuxuan Liu, and Anhong Guo
    UIST ’24
    WorldScribe generates live visual descriptions of the real world that are customizable and adaptive to users’ contexts.
  3. EditScribe: Non-Visual Image Editing with Natural Language Verification Loops
    Ruei-Che Chang, Yuxuan Liu, Lotus Zhang, and Anhong Guo
    ASSETS ’24
    EditScribe enables non-visual image editing through natural language verification loops powered by large multimodal models.