Yang Zhang

My name is Yang Zhang. I'm a 4th-year PhD student at the Human-Computer Interaction Institute (HCII), Carnegie Mellon University, advised by Prof. Chris Harrison. I am also a Qualcomm Fellow. Broadly, I build interfaces that bridge the gap between computing resources and people's daily lives in a natural and efficient way.

Specifically, I'm interested in sensing technologies for wearables, fabrication, and the Internet of Things. I work with capacitive, RF, and optical sensing on the human body and in the environment to extend interactions beyond touchscreens and to detect human activities. My research has mostly been published at HCI conferences -- CHI and UIST. For more details on my research, please take a look at the following projects.


Yang Zhang, Chouchang (Jack) Yang, Scott E. Hudson, Chris Harrison and Alanson Sample (CHI 2018)  Best Paper Award 

Human environments are typified by walls -- homes, offices, schools, museums, hospitals and pretty much every indoor context one can imagine has walls. Wall++ is a low-cost sensing approach that allows walls to become a smart infrastructure. Instead of merely separating spaces, walls can now enhance rooms with sensing and interactivity. Our wall treatment and sensing hardware can track users' touch and gestures, as well as estimate body pose if they are close. By capturing airborne electromagnetic noise, we can also detect what appliances are active and where they are located.

Yang Zhang and Chris Harrison (CHI 2018) 

We present a new technical approach for bringing the digital and paper worlds closer together, by enabling paper to track finger input as well as drawn input from writing implements. Importantly, for paper to still be considered paper, our method had to be very low cost. This necessitated research into materials, fabrication methods and sensing techniques. We describe the outcome of our investigations and show that our method can be sufficiently low-cost and accurate to enable new interactive opportunities with this pervasive and venerable material.

Robert Xiao, Teng Cao, Ning Guo, Jun Zhuo, Yang Zhang and Chris Harrison (CHI 2018) 

LumiWatch is the first fully functional and self-contained projection smartwatch implementation, containing the requisite compute, power, projection, and touch-sensing capabilities. Our watch offers roughly 40 cm² of interactive surface area -- more than five times that of a typical smartwatch display. We demonstrate continuous 2D finger tracking with interactive, rectified graphics, transforming the arm into a touchscreen. We discuss our hardware and software implementation, as well as evaluation results regarding touch accuracy and projection visibility.

Jun Gong, Yang Zhang, Xia Zhou, Xing-Dong Yang (UIST 2017) 

We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning.

Yang Zhang, Gierad Laput, Chris Harrison (CHI 2017) 

Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces. We show that our technique is compatible with commonplace manufacturing methods, such as spray/brush coating, vacuum forming, and casting/molding, enabling a wide range of possible uses and outputs. Our technique can also bring touch interactivity to rapidly fabricated objects, including those that are laser cut or 3D printed.

Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017) 

The promise of smart environments and the Internet of Things (IoT) relies on robust sensing of diverse environmental facets. Traditional approaches rely on direct or distributed sensing, most often by measuring one particular aspect of an environment with special-purpose sensors. In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.

Robert Xiao, Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017)

At present, most IoT devices rely on mechanical inputs, webpages, or smartphone apps for control. However, as these devices proliferate, existing interaction methods will become increasingly cumbersome. Will future smart-home owners have to scroll through pages of apps to select and dim their lights? We propose an approach where users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances, and summons appliance-specific interfaces.

Yang Zhang, Robert Xiao, Chris Harrison (UIST 2016) 

Electrical Impedance Tomography (EIT) was recently employed in the HCI domain to detect hand gestures using an instrumented smartwatch. This prior work demonstrated great promise for non-invasive, high accuracy recognition of gestures for interactive control. We introduce a new system that offers improved sampling speed and resolution. In turn, this enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation -- we compare two EIT sensing methods and three different electrode resolutions.
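For readers curious how an EIT scan is structured, a common scheme is the adjacent-drive pattern: current is injected across each neighboring electrode pair while voltage is read from every other neighboring pair. The sketch below is purely illustrative (the function name and electrode count are made up; the paper itself compares multiple sensing methods and electrode resolutions):

```python
def adjacent_eit_frame(n_electrodes):
    """Enumerate one frame of an adjacent-drive EIT scan.

    Current is injected across each neighboring electrode pair, and
    voltage is measured on every other neighboring pair, yielding
    n * (n - 3) readings per frame."""
    frame = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            meas = (m, (m + 1) % n_electrodes)
            if set(drive) & set(meas):
                continue  # skip pairs that share a driven electrode
            frame.append((drive, meas))
    return frame

# An 8-electrode band yields 8 * (8 - 3) = 40 measurements per frame.
print(len(adjacent_eit_frame(8)))  # 40
```

This also shows why higher electrode counts help interior reconstruction: the number of independent measurements grows roughly quadratically with the number of electrodes.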

Junhan Zhou, Yang Zhang, Gierad Laput, Chris Harrison (UIST 2016)  

In AuraSense, we use Electric Field Sensing to support multiple interaction modalities for smartwatches. To explore how this sensing approach could enhance smartwatch interactions, we considered different antenna configurations and how they could enable useful interaction modalities. We identified four configurations that can support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.

Yang Zhang, Junhan Zhou, Gierad Laput, Chris Harrison (CHI 2016)  Honorable Mention Award 

SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate. We envision the technology being integrated into future smartwatches, supporting rich touch interactions beyond the confines of the small touchscreen.
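The core idea -- recovering a phase difference between electrode pairs -- can be sketched in a few lines of Python. This is an illustrative toy, not the actual SkinTrack pipeline: the function name, sampling rate, carrier frequency, and signals below are all made up, and the real system uses dedicated hardware.

```python
import numpy as np

def phase_difference(sig_a, sig_b, fs, f0):
    """Estimate the phase difference (radians) between two electrode
    signals at carrier frequency f0, using the FFT bin nearest f0."""
    n = len(sig_a)
    bin_idx = int(round(f0 * n / fs))
    spec_a = np.fft.rfft(sig_a)[bin_idx]
    spec_b = np.fft.rfft(sig_b)[bin_idx]
    return np.angle(spec_b) - np.angle(spec_a)

# Synthetic example: a tone arriving at electrode B 0.3 rad later
# than at electrode A (all values are illustrative only).
fs, f0, n = 1_000_000, 80_000, 1000
t = np.arange(n) / fs
a = np.sin(2 * np.pi * f0 * t)
b = np.sin(2 * np.pi * f0 * t - 0.3)
print(phase_difference(a, b, fs, f0))  # ≈ -0.3
```

With phase differences from multiple electrode pairs, one can then triangulate a 2D touch coordinate on the skin.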

Yang Zhang, Chris Harrison (UIST 2015)  

Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real-time. Our approach is sufficiently compact and low-powered that we envision this technique being integrated into future smartwatches to allow hand gestures to work together with touchscreens.

Yang Zhang, Chris Harrison (ITS 2015)  Best Note Award 

We quantitatively evaluate how electrostatic force feedback can enhance touchscreen interactions, in particular targeting, by letting virtual objects rendered on touchscreens offer tactile feedback. We conducted a Fitts' Law style user study exploring three haptic modalities: No Feedback, Physical, and Electrostatic. Results show that electrostatic haptic feedback can improve targeting speed by 7.5% compared to conventional flat touchscreens.
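As a rough illustration of the Fitts' Law analysis underlying such targeting studies, here is a minimal sketch using the Shannon formulation of the index of difficulty. The function names and trial numbers are hypothetical; this is not the study's actual analysis code.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits per second for one targeting trial."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative trial: 240 px target distance, 30 px target width,
# 0.9 s movement time (numbers are made up for the example).
print(index_of_difficulty(240, 30))  # log2(9) ≈ 3.17 bits
print(throughput(240, 30, 0.9))
```

Comparing throughput across conditions is one standard way to quantify the kind of targeting-speed difference reported above.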

Fun Projects


  • Jan 19 New semester started. Had a very fruitful first week.
  • Oct 22 Attended UIST 2017 in Quebec City.
  • Sep 23 One week off. Visiting Zhuoshu in New York.
  • Sep 10 Invited to give a talk at CCTV2 .
  • July 11 Disney projects in full swing.
  • July 5 Kayak and swim at North Shore.
  • May 8 Present and demo Electrick at CHI 2017.
  • Jan 2 Back in Pittsburgh.
  • Dec 22 Visit ASU and Ling.
  • Dec 17 Visit Hong Kong to see my wife.
  • Oct 20 Back in Pittsburgh.
  • Oct 16 Attend UIST 2016 @ Tokyo, give AuraSense presentation.
  • Oct 09 Talk about research and share experience living abroad with ICMLL lab.
  • Oct 06 Wonderful wedding ceremony with two families and friends in Beijing.
  • Sep 22 post-CHI party at Union Grill. Preparing for my wedding.
  • Aug 28 CHI 2017 projects final push.
  • Jun 26 Three lab papers got accepted at UIST 2016. Go FIGlab!
  • Jun 1 Summer projects for CHI 2017 are in full swing.
  • May 16 I got married!
  • April 13 UIST 2016 Paper submitted. Fly to St. Louis for weekends.
  • April 1 UIST 2016 in full swing.
  • Mar 23 Qualcomm Innovation Fellowship finalist presentation and demo.
  • Jan 26 Pittsburgh Penguins vs. New Jersey Devils. We won!
  • Jan 9 Got back to Pittsburgh. New semester started!