
Yang Zhang

My name is Yang Zhang. I'm a third-year PhD student at the Human-Computer Interaction Institute (HCII), Carnegie Mellon University, advised by Prof. Chris Harrison. I am also a Qualcomm Innovation Fellow. Broadly, I build interfaces that bridge the gap between computing resources and people's daily lives in a natural and efficient way.

Specifically, I'm interested in sensing technologies for wearables, fabrication, and the Internet of Things. I have worked with capacitive and RF sensing on the human body and on a wide range of materials to extend interactions beyond touchscreens. My research has mostly been published at HCI conferences -- CHI and UIST. For more details, please take a look at the following projects.

Research

Jun Gong, Yang Zhang, Xia Zhou, Xing-Dong Yang (UIST 2017) 

We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning.
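To make that pipeline concrete, below is a minimal Python sketch of the kind of signal processing and machine learning such a system implies: windows of raw pyroelectric samples are reduced to simple time- and frequency-domain features and fed to an off-the-shelf classifier. The features and classifier here are illustrative assumptions, not Pyro's actual implementation.

import numpy as np
from sklearn.svm import SVC

def featurize(window):
    """Simple time- and frequency-domain features over one window of raw samples."""
    window = np.asarray(window, dtype=float)
    spectrum = np.abs(np.fft.rfft(window))
    return np.concatenate([
        [window.mean(), window.std(), np.ptp(window)],  # basic time-domain statistics
        spectrum[:16],                                   # low-frequency magnitudes
    ])

def train(windows, labels):
    """windows: list of 1-D arrays of pyroelectric samples; labels: gesture names."""
    X = np.array([featurize(w) for w in windows])
    return SVC(kernel="rbf").fit(X, labels)

def predict(clf, window):
    return clf.predict([featurize(window)])[0]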

Yang Zhang, Gierad Laput, Chris Harrison (CHI 2017) 

Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces. We show that our technique is compatible with commonplace manufacturing methods, such as spray/brush coating, vacuum forming, and casting/molding, enabling a wide range of possible uses and outputs. Our technique can also bring touch interactivity to rapidly fabricated objects, including those that are laser cut or 3D printed.
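As a simplified illustration of how one sweep of cross-pair measurements from electrodes around a conductive surface can be turned into a touch location, here is a short Python sketch. Electrick proper performs electric field tomography to reconstruct where current is shunted by a finger; the baseline subtraction and nearest-template matching below are a deliberately simplified stand-in, and all names and thresholds are illustrative.

import numpy as np

def locate_touch(frame, baseline, templates, positions, threshold=1e-3):
    """frame, baseline: 1-D vectors of cross-pair measurements; templates: one
    calibration vector per known touch position. Returns (x, y) or None."""
    delta = np.asarray(frame, dtype=float) - baseline   # isolate the touch's effect on the field
    if np.linalg.norm(delta) < threshold:               # placeholder no-touch threshold
        return None
    scores = [np.linalg.norm(delta - t) for t in templates]
    return positions[int(np.argmin(scores))]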

Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017) 

The promise of smart environments and the Internet of Things (IoT) relies on robust sensing of diverse environmental facets. Traditional approaches rely on direct or distributed sensing, most often by measuring one particular aspect of an environment with special-purpose sensors. In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.
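The sketch below illustrates the general idea of virtualizing raw sensor channels into a higher-level feed: windows of data from several channels are reduced to features and passed through a trained classifier that reports a human-readable state (e.g., "faucet running") rather than raw values. Channel names, features, and the classifier are assumptions for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def featurize(window_by_channel):
    """window_by_channel: dict mapping a channel name (e.g. 'vibration', 'sound',
    'emi') to a 1-D array covering one time window."""
    feats = []
    for name in sorted(window_by_channel):               # fixed channel order
        x = np.asarray(window_by_channel[name], dtype=float)
        feats += [x.mean(), x.std(), np.abs(np.fft.rfft(x))[:8].sum()]
    return np.array(feats)

class VirtualSensor:
    """Exposes a trained classifier as a named, human-readable feed instead of raw data."""
    def __init__(self, name, classifier):
        self.name, self.classifier = name, classifier

    def update(self, window_by_channel):
        state = self.classifier.predict([featurize(window_by_channel)])[0]
        return {"sensor": self.name, "state": state}

def train_virtual_sensor(name, feature_rows, states):
    """feature_rows: list of featurize() outputs; states: e.g. ['running', 'idle', ...]."""
    clf = RandomForestClassifier(n_estimators=50).fit(feature_rows, states)
    return VirtualSensor(name, clf)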

Robert Xiao, Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017)

At present, most IoT devices rely on mechanical inputs, webpages, or smartphone apps for control. However, as IoT devices proliferate, these existing interaction methods will become increasingly cumbersome. Will future smart-home owners have to scroll through pages of apps to select and dim their lights? We propose an approach where users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances and summons appliance-specific interfaces.

Yang Zhang, Robert Xiao, Chris Harrison (UIST 2016) 

Electrical Impedance Tomography (EIT) was recently employed in the HCI domain to detect hand gestures using an instrumented smartwatch. This prior work demonstrated great promise for non-invasive, high-accuracy gesture recognition for interactive control. We introduce a new system that offers improved sampling speed and resolution, which in turn enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation -- we compare two EIT sensing methods and three different electrode resolutions.
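For concreteness, the sketch below contrasts the two classical EIT measurement schemes such a system can use over a ring of electrodes: two-pole (drive and measure on the same pair) and four-pole (drive on one adjacent pair, measure on pairs that do not touch it). The drive() and read_voltage() functions are hypothetical hardware stubs; this illustrates the schemes, not the actual firmware.

import numpy as np

def two_pole_frame(n, drive, read_voltage):
    """Drive current and measure voltage on the same electrode pair."""
    frame = []
    for a in range(n):
        for b in range(a + 1, n):
            drive(a, b)
            frame.append(read_voltage(a, b))
    return np.array(frame)

def four_pole_frame(n, drive, read_voltage):
    """Drive one adjacent pair; measure every adjacent pair that does not share
    an electrode with it (less sensitive to contact impedance at the drive pair)."""
    frame = []
    for d in range(n):
        d0, d1 = d, (d + 1) % n
        drive(d0, d1)
        for m in range(n):
            m0, m1 = m, (m + 1) % n
            if {m0, m1} & {d0, d1}:
                continue
            frame.append(read_voltage(m0, m1))
    return np.array(frame)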

Junhan Zhou, Yang Zhang, Gierad Laput, Chris Harrison (UIST 2016)  

In AuraSense, we use electric field sensing to support multiple interaction modalities for smartwatches. To explore how this sensing approach could enhance smartwatch interactions, we considered different antenna configurations and the interaction modalities they could enable. We identified four configurations that support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.

Yang Zhang, Junhan Zhou, Gierad Laput, Chris Harrison (CHI 2016)  

SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate. We envision the technology being integrated into future smartwatches, supporting rich touch interactions beyond the confines of the small touchscreen.
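A minimal sketch of that phase-difference idea is shown below: estimate each electrode's phase at the ring's carrier frequency from a window of samples, take differences relative to a reference electrode, and regress those differences onto 2D touch coordinates using calibration data. The sample rate, carrier frequency, and regressor are placeholder assumptions, not the actual system parameters.

import numpy as np
from sklearn.linear_model import Ridge

FS = 4_000_000       # ADC sample rate in Hz (placeholder)
F_CARRIER = 80_000   # frequency of the ring's AC signal in Hz (placeholder)

def phase_at_carrier(samples):
    """Phase of the carrier within one electrode's sample window, via a single DFT bin."""
    samples = np.asarray(samples, dtype=float)
    k = int(round(F_CARRIER * len(samples) / FS))
    return np.angle(np.fft.rfft(samples)[k])

def phase_difference_features(windows_per_electrode):
    """Phase of each electrode relative to the first, from simultaneously captured windows."""
    phases = np.array([phase_at_carrier(w) for w in windows_per_electrode])
    return np.unwrap(phases - phases[0])[1:]

def fit_tracker(feature_rows, xy_targets):
    """Calibrate phase-difference features against known finger positions (x, y)."""
    return Ridge().fit(feature_rows, xy_targets)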

Yang Zhang, Chris Harrison (UIST 2015)  

Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real-time. Our approach is sufficiently compact and low-powered that we envision this technique being integrated into future smartwatches to allow hand gestures to work together with touchscreens.
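Below is a minimal sketch of the real-time classification step this implies: each incoming vector of cross-sectional impedance measurements is baseline-subtracted and labeled by a pre-trained classifier. get_impedance_frame() is a hypothetical stub standing in for the wristband hardware, and the linear SVM is an illustrative choice rather than the system's actual classifier.

import numpy as np
from sklearn.svm import SVC

def train_gestures(frames, labels):
    """frames: one impedance-measurement vector per training example."""
    frames = np.asarray(frames, dtype=float)
    baseline = frames.mean(axis=0)                       # per-channel resting reference
    clf = SVC(kernel="linear").fit(frames - baseline, labels)
    return clf, baseline

def classify_stream(clf, baseline, get_impedance_frame):
    """Yield one gesture label per incoming frame."""
    while True:
        frame = np.asarray(get_impedance_frame(), dtype=float)
        yield clf.predict([frame - baseline])[0]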

Yang Zhang, Chris Harrison (ITS 2015)  

We quantitatively evaluate how electrostatic force feedback can enhance touchscreen interactions, in particular targeting, by allowing virtual objects rendered on touchscreens to offer tactile feedback. We conducted a Fitts' Law style user study exploring three haptic modalities: no feedback, physical, and electrostatic. The results show that electrostatic haptic feedback can improve targeting speed by 7.5% compared to conventional flat touchscreens.
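For readers unfamiliar with the analysis, the sketch below shows the standard Fitts' Law quantities such targeting studies report (Shannon formulation); the example numbers are purely illustrative and are not data from the study.

import math

def index_of_difficulty(distance, width):
    """ID in bits for a target of the given width at the given distance (same units)."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Bits per second for a single selection trial."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example: a 96 px wide target 768 px away, selected in 0.62 s
# index_of_difficulty(768, 96) ≈ 3.17 bits; throughput(768, 96, 0.62) ≈ 5.1 bits/s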

Fun Projects

Footprint

  • Oct 22 Attended UIST 2017 in Quebec City.
  • Sep 23 One week off. Visiting Zhuoshu in New York.
  • Sep 10 Invited to give a talk at CCTV2.
  • July 11 Disney projects in full swing.
  • July 5 Kayak and swim at North Shore.
  • May 8 Present and demo Electrick at CHI 2017.
  • Jan 2 Back in Pittsburgh.
  • Dec 22 Visit ASU and Ling.
  • Dec 17 Visit Hong Kong to see my wife.
  • Oct 20 Back in Pittsburgh.
  • Oct 16 Attend UIST 2016 @ Tokyo, give AuraSense presentation.
  • Oct 09 Talk about research and share experience living abroad with ICMLL lab.
  • Oct 06 Wonderful wedding ceremony with two families and friends in Beijing.
  • Sep 22 post-CHI party at Union Grill. Preparing for my wedding.
  • Aug 28 CHI 2017 projects final push.
  • Jun 26 Three lab papers got accepted at UIST 2016. Go FIGlab!
  • Jun 1 Summer projects for CHI 2017 are in full swing.
  • May 16 I got married!
  • April 13 UIST 2016 paper submitted. Fly to St. Louis for the weekend.
  • April 1 UIST 2016 in full swing.
  • Mar 23 Qualcomm Innovation Fellowship finalist presentation and demo.
  • Jan 26 Pittsburgh Penguins vs. New Jersey Devils. We won!
  • Jan 9 Got back to Pittsburgh. New semester started!