
Yang Zhang

Yang Zhang is a 4th-year Ph.D. student at the Human-Computer Interaction Institute (HCII), Carnegie Mellon University, advised by Prof. Chris Harrison. He is also a Qualcomm Innovation Fellow. His research focuses on sensing techniques for future HCI interfaces that feature both interactivity and activity recognition capabilities, through 1) enhancing mobile devices for interactions beyond touchscreens, 2) enabling low-cost sensing on everyday objects, and 3) developing high-accuracy and privacy-preserving deployed sensors for IoT applications.

A taxonomy of his recent research can be found below. More exciting projects are on the way.

[Research focus diagram inspired by Prof. Bjoern Hartmann]

Research

Y Zhang, G Laput and C Harrison (UIST 2018) [DOI] [PDF] [Code]
Honorable Mention Award

Vibrosight senses activities across entire rooms using long-range laser vibrometry. Unlike a microphone, our approach can sense physical vibrations at one specific point, making it robust to interference from other activities and noisy environments. This property enables detection of simultaneous activities, which has proven challenging in prior work.

Y Zhang, C Yang, S E Hudson, C Harrison and A Sample (CHI 2018) [DOI] [PDF]
Best Paper Award & Innovation by Design Award

Wall++ is a low-cost sensing approach that turns ordinary walls into smart infrastructure. Our wall treatment and sensing hardware can track users' touch and gestures, as well as estimate body pose when users are near the wall. By capturing airborne electromagnetic noise, we can also detect which appliances are active and where they are located.
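The airborne-EM sensing step can be pictured as a spectral fingerprint match: each appliance radiates a characteristic noise spectrum, which is compared against known profiles. The appliance names, fingerprint values, and bin count below are hypothetical, and the real system's signal chain is far more sophisticated; this only sketches the matching idea.

```python
import numpy as np

# Hypothetical EM-noise fingerprints: a coarse, normalized magnitude
# profile over four frequency bins for each known appliance.
fingerprints = {
    "LCD monitor": np.array([0.1, 0.9, 0.2, 0.1]),
    "LED bulb":    np.array([0.7, 0.1, 0.1, 0.6]),
}

def identify_appliance(samples):
    """Match the magnitude spectrum of `samples` against known fingerprints."""
    spectrum = np.abs(np.fft.rfft(samples))
    # Pool the spectrum into as many coarse bins as the fingerprints use
    bins = np.array_split(spectrum, 4)
    profile = np.array([b.mean() for b in bins])
    profile /= profile.max()  # normalize to [0, 1]
    # Highest dot product = most similar spectral shape
    return max(fingerprints, key=lambda k: float(np.dot(fingerprints[k], profile)))
```

For example, a synthetic signal whose energy falls in the second frequency bin would be matched to the "LCD monitor" profile above.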

Y Zhang and C Harrison (CHI 2018) [DOI] [PDF]

We developed a sensing technique for paper to track finger input and also drawn input with writing implements. Importantly, for paper to still be considered paper, our method had to be very low cost. This necessitated research into materials, fabrication methods and sensing techniques. We describe the outcome of our investigations and show that our method can be sufficiently low-cost and accurate to enable new interactive opportunities with this pervasive and venerable material.

R Xiao, T Cao, N Guo, J Zhuo, Y Zhang and C Harrison (CHI 2018) [DOI] [PDF]
Innovation by Design Award

LumiWatch is the first fully-functional and self-contained projection smartwatch implementation, containing the requisite compute, power, projection and touch-sensing capabilities. Our watch offers an interactive surface area more than five times that of a typical smartwatch display. We demonstrate continuous 2D finger tracking with interactive, rectified graphics, transforming the arm into a touchscreen.

J Gong, Y Zhang, X Zhou and XD Yang (UIST 2017) [DOI] [PDF]

Pyro is a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning.

Y Zhang, G Laput and C Harrison (CHI 2017) [DOI] [PDF]

Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces.
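At a high level, the tomographic measurements for a touch can be treated as a feature vector and matched against calibration profiles recorded at known locations. The electrode readings, location names, and nearest-neighbor matching below are simplified assumptions for illustration, not the paper's actual reconstruction pipeline.

```python
import numpy as np

# Hypothetical calibration data: each touch location was profiled by
# recording a vector of voltage measurements across electrode pairs.
calibration = {
    "button_A": np.array([0.82, 0.41, 0.30, 0.55]),
    "button_B": np.array([0.35, 0.77, 0.62, 0.28]),
    "no_touch": np.array([0.50, 0.50, 0.50, 0.50]),
}

def classify_touch(measurement):
    """Return the calibrated location whose profile is nearest (Euclidean)."""
    return min(calibration,
               key=lambda loc: float(np.linalg.norm(calibration[loc] - measurement)))
```

A new measurement vector close to one of the stored profiles is assigned that location, which is enough to turn a conductively coated surface into a set of discrete touch targets.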

G Laput, Y Zhang and C Harrison (CHI 2017) [DOI] [PDF]
Innovation by Design Award

In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.

R Xiao, G Laput, Y Zhang and C Harrison (CHI 2017) [DOI] [PDF]

We propose an approach where users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances through EMI sensing, and summons appliance-specific interfaces.

Y Zhang, R Xiao and C Harrison (UIST 2016) [DOI] [PDF]

We improved our prior work on wearable Electrical Impedance Tomography with higher sampling speed and resolution. In turn, this enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation: we compare two EIT sensing methods and three different electrode resolutions.

J Zhou, Y Zhang, G Laput and C Harrison (UIST 2016) [DOI] [PDF]

AuraSense enhances smartwatches with Electric Field Sensing to support multiple interaction modalities. We identified four electrode configurations that can support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.

Y Zhang, J Zhou, G Laput and C Harrison (CHI 2016) [DOI] [PDF]
Honorable Mention Award

SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate.
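The phase-difference measurement can be sketched numerically: correlate each electrode's signal against a complex reference tone at the carrier frequency and subtract the resulting phases. The carrier frequency, sample rate, and synthetic signals below are made-up stand-ins for the actual hardware and are for illustration only.

```python
import numpy as np

FREQ = 80e6  # hypothetical carrier frequency (Hz)
FS = 1e9     # hypothetical sampling rate (Hz)

def phase_difference(sig_a, sig_b, freq=FREQ, fs=FS):
    """Estimate each signal's phase at `freq` and return the difference (radians)."""
    t = np.arange(len(sig_a)) / fs
    ref = np.exp(-2j * np.pi * freq * t)      # complex reference tone
    phi_a = np.angle(np.sum(sig_a * ref))
    phi_b = np.angle(np.sum(sig_b * ref))
    return phi_a - phi_b

# Toy check: signal B lags A by 0.3 rad, so the estimate should be ~0.3
t = np.arange(1000) / FS
a = np.sin(2 * np.pi * FREQ * t)
b = np.sin(2 * np.pi * FREQ * t - 0.3)
delta = phase_difference(a, b)
```

With several electrode pairs, such phase differences can be mapped (e.g., via regression against calibration touches) to a 2D coordinate on the skin.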

Y Zhang and C Harrison (UIST 2015) [DOI] [PDF]

Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real-time. Our approach is sufficiently compact and low-powered that we envision this technique being integrated into future smartwatches to allow hand gestures to work together with touchscreens.
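A common EIT measurement scheme is the adjacent ("neighboring") drive pattern: current is injected across each adjacent electrode pair in turn, and voltage is read from every other adjacent pair. Whether Tomo uses exactly this pattern is an assumption here; the sketch below only enumerates the measurement schedule.

```python
# Hypothetical adjacent-drive EIT scan for n ring electrodes: each drive
# pair contributes n - 3 voltage readings, so a frame has n * (n - 3)
# measurements in total.
N = 8  # electrode count for a small wrist-worn ring (hypothetical)

def scan_pattern(n=N):
    """Enumerate (src, sink, v+, v-) tuples for one measurement frame."""
    frame = []
    for drive in range(n):
        src, sink = drive, (drive + 1) % n
        for meas in range(n):
            a, b = meas, (meas + 1) % n
            # Skip any measurement pair touching a driven electrode
            if not ({a, b} & {src, sink}):
                frame.append((src, sink, a, b))
    return frame
```

Each frame of measurements is then fed to a reconstruction or classification stage; higher electrode counts trade frame rate for spatial detail.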

Y Zhang and C Harrison (ISS 2015) [DOI] [PDF]
Best Short Paper Award

We quantitatively evaluate how electrostatic force feedback can enhance touchscreen interaction, in particular targeting, by letting virtual objects rendered on touchscreens offer tactile feedback. We conducted a Fitts' Law style user study and explored three haptic modalities. The results show that electrostatic haptic feedback can improve targeting speed by 7.5% compared to conventional flat touchscreens.
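For context, Fitts' Law relates target distance and width to an index of difficulty, and targeting speed is typically reported against that difficulty. The target sizes and the 800 ms baseline below are hypothetical numbers, used only to illustrate what a 7.5% speed improvement means for movement time.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

# A target 100 px away and 20 px wide has ID = log2(6) ~ 2.58 bits
ID = index_of_difficulty(100, 20)

# Hypothetical baseline: an 800 ms target acquisition. A 7.5% higher
# targeting speed corresponds to a movement time of 800 / 1.075 ~ 744 ms.
baseline_ms = 800.0
improved_ms = baseline_ms / 1.075
```

Reporting speed (or throughput, ID divided by movement time) rather than raw time makes results comparable across different target difficulties.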

Fun Projects

Footprint

  • Jun 6 Started internship at MSR, Redmond.
  • Apr 20 Attended CHI.
  • Jan 19 New semester started.
  • Oct 22 Attended UIST 2017 at Quebec City.
  • Sep 23 One week off. Visiting Zhuoshu in New York.
  • Sep 10 Invited to give a talk at CCTV2.
  • Jul 11 Disney projects in full swing.
  • Jul 5 Kayak and swim at North Shore.
  • May 8 Present and demo Electrick at CHI 2017.
  • Jan 2 Back in Pittsburgh.