Yang Zhang

About Me

My name is Yang Zhang. I'm a second-year PhD student at the Human-Computer Interaction Institute (HCII), Carnegie Mellon University, advised by Prof. Chris Harrison. Broadly, I build interfaces that bridge the gap between computing resources and people's daily lives in natural and efficient ways.

My research interests include sensing technologies for wearables, fabrication, and the Internet of Things. I use electric field sensing to enable gesture control and touch input for a wide variety of applications. For more details on my research, please take a look at the following projects.


Electrick
Yang Zhang, Gierad Laput, Chris Harrison (CHI 2017)

Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces. We show that our technique is compatible with commonplace manufacturing methods, such as spray/brush coating, vacuum forming, and casting/molding, enabling a wide range of possible uses and outputs. Our technique can also bring touch interactivity to rapidly fabricated objects, including those that are laser cut or 3D printed.

Synthetic Sensors
Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017)

The promise of smart environments and the Internet of Things (IoT) relies on robust sensing of diverse environmental facets. Traditional approaches rely on direct or distributed sensing, most often by measuring one particular aspect of an environment with special-purpose sensors. In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.

Robert Xiao, Gierad Laput, Yang Zhang, Chris Harrison (CHI 2017)

At present, most IoT appliances rely on mechanical inputs, webpages, or smartphone apps for control. However, as these devices proliferate, existing interaction methods will become increasingly cumbersome. Will future smart-home owners have to scroll through pages of apps to select and dim their lights? We propose an approach in which users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances and summons appliance-specific interfaces.

Yang Zhang, Robert Xiao, Chris Harrison (UIST 2016) 

Electrical Impedance Tomography (EIT) was recently employed in the HCI domain to detect hand gestures using an instrumented smartwatch. This prior work demonstrated great promise for non-invasive, high-accuracy gesture recognition for interactive control. We introduce a new system that offers improved sampling speed and resolution, which in turn enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation: we compare two EIT sensing methods and three different electrode resolutions.

AuraSense
Junhan Zhou, Yang Zhang, Gierad Laput, Chris Harrison (UIST 2016)

In AuraSense, we use electric field sensing to support multiple interaction modalities on smartwatches. To explore how this sensing approach could enhance smartwatch interactions, we considered different antenna configurations and the interaction modalities they could enable. We identified four configurations that support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.

SkinTrack
Yang Zhang, Junhan Zhou, Gierad Laput, Chris Harrison (CHI 2016)

SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate. We envision the technology being integrated into future smartwatches, supporting rich touch interactions beyond the confines of the small touchscreen.
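To illustrate the core idea, here is a hypothetical sketch (not the actual SkinTrack pipeline; the signal frequency and sample rate below are arbitrary illustrative values): each electrode's phase can be estimated by correlating its signal against quadrature references, and the phase lag between an electrode pair then serves as the localization feature.

```python
import math

def signal_phase(samples, freq_hz, sample_rate):
    """Estimate the phase of a sinusoid by correlating it against
    in-phase and quadrature reference waveforms."""
    i = q = 0.0
    for k, s in enumerate(samples):
        theta = 2 * math.pi * freq_hz * k / sample_rate
        i += s * math.cos(theta)
        q += s * math.sin(theta)
    return math.atan2(q, i)

def phase_lag(sig_a, sig_b, freq_hz, sample_rate):
    """Phase of sig_b relative to sig_a (radians; positive = lag).
    In SkinTrack-style sensing, the lag grows with the distance the
    signal travels through the body to reach each electrode."""
    return (signal_phase(sig_b, freq_hz, sample_rate)
            - signal_phase(sig_a, freq_hz, sample_rate))
```

Given a calibrated mapping from phase lag to distance along the arm, lags from different electrode pairs could then be combined into a 2D touch coordinate.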

Tomo
Yang Zhang, Chris Harrison (UIST 2015)

Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real-time. Our approach is sufficiently compact and low-powered that we envision this technique being integrated into future smartwatches to allow hand gestures to work together with touchscreens.

Yang Zhang, Chris Harrison (ITS 2015)  

We quantitatively evaluate how electrostatic force feedback can enhance touchscreen interaction, in particular targeting, by letting virtual objects rendered on the screen offer tactile feedback. We conducted a Fitts' Law style user study covering three haptic modalities: no feedback, physical, and electrostatic. The results show that electrostatic haptic feedback can improve targeting speed by 7.5% compared to conventional flat touchscreens.

Fun Projects

Birthday gift for Zhuoshu

I made a birthday gift for Zhuoshu. We both love Pokémon, so I 3D printed this design from Thingiverse and made a Bulbasaur planter for a succulent. The plant found its new home very comfortable: its roots grabbed the plastic body tightly within a week, and it has been growing ever since. Amazing adaptability!

Wedding Chess Set

I made a chess set for the guests at my wedding. Each chess piece is 3D printed with our names and the wedding date. The box is fabricated with a laser cutter, and its pattern matches the decorations of the ceremony. One set takes roughly 60 hours to make, but it was totally worth it. While Zhuoshu was changing her dress, I hosted a lottery in which five winners each got a chess set.

Wedding bands

We made our wedding bands with our two heartbeat waveforms interleaved into a single pattern. We first recorded our heartbeats with a mobile app, then tuned the smoothness of the waveforms and adjusted the phase difference between the two. The bands are made of platinum.

Triangle Mesh Wall Panel

As a practice project for using the ShopBot downstairs, I cut a piece of wood that ended up as a triangle mesh. I then painted it with acrylic paint diluted with plenty of water so as not to entirely cover the texture of the wood. It now hangs on the wall of the lab.

Metal Birds

While shopping at Construction Junction, I ran into a bunch of door pulls on sale. I brought them back and made a sculpture. It took me 8 hours of grinding, sawing, and welding at TechShop. Real fun!

Plantie: 1st Most Creative Award of UIST 2014 Student Innovation Contest

Plantie is a cute, interactive pot that walks your household plant around. Powered by a Kinoma Create and sensors distributed through the house, it has a couple of features that help protect your home's privacy. The idea behind this project is to make plants hosts of the family rather than mere decorations. We presented Plantie at the UIST 2014 Student Innovation Contest and were fortunate to be granted the 1st Most Creative Award.

Twitter Light: Internship Project at Kinoma, Marvell Semiconductor

This project happened during my internship with the Kinoma group at Marvell and is based on Kinoma Create, a JavaScript-powered Internet of Things construction kit. It is an LED matrix in the form of a world map that visualizes global tweet data. The Kinoma Create polls the tweet stream over a socket and shows tweets on its screen. Geographical information and the current traffic load are visualized by the brightness of the LEDs. The map presents data in a low-res but geographically accurate way. More info on this project can be found here.
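The world-map layout boils down to projecting each tweet's coordinates onto the LED grid. A minimal sketch of that mapping (the 16×32 matrix size is an assumption for illustration, not the actual hardware spec), using a plain equirectangular projection:

```python
def latlon_to_led(lat, lon, rows=16, cols=32):
    """Map a latitude/longitude pair onto an equirectangular LED grid.
    Returns (row, col); row 0 is the northernmost row, col 0 is at
    longitude -180. Grid size is a hypothetical 16x32 matrix."""
    row = int((90.0 - lat) / 180.0 * rows)
    col = int((lon + 180.0) / 360.0 * cols)
    # Clamp the boundary cases (lat == -90, lon == 180) into the grid.
    return min(row, rows - 1), min(col, cols - 1)
```

Each incoming tweet's coordinates would bump the brightness of its cell, giving the low-res but geographically accurate picture described above.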

WikiTop10: a RESTful Wikipedia Data Analytics System

In this project, I developed a servlet that downloads Wikipedia data as a GZIPInputStream and parses it in real time. I also developed an iOS app that refreshes the top 10 most popular Wikipedia articles in real time. By browsing this app, users learn the top 10 things people most want to know. Not only does this app add to our knowledge, it can also prepare users with potential conversation topics.

ZipperSense: Gadget Attached on Zippers for Gesture Detection

Zippers are ideal for input and can be used in an eyes-free manner. ZipperSense uses a microphone to collect the sounds of zipping for gesture detection. Gestures such as "V", "A", "W", and "M", made by zipping up and down, can be detected and used to control computational devices such as smartphones and computers.

Guitar Game

Music games are really popular these days, yet few of them let users upload their own music, and no guitar game detects the direction of strums, which is a vital part of engagement. I therefore built a tangible music game in which users can upload any song, and the program automatically generates the game by processing the audio signal. A laser pointer and photoresistors serve as the strings, so the direction of a strum can be detected from the order in which the laser beams are blocked. Copper foil functions as the buttons. Everything else is done in Python.
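The direction-detection logic is simple enough to sketch (a hypothetical reconstruction, not the project's actual code; the timestamped event format is an assumption):

```python
def strum_direction(events):
    """Infer strum direction from (timestamp, string_index) beam-block
    events, where string 0 is the topmost laser 'string'.
    Returns 'down' if the beams are blocked top-to-bottom in time
    order, and 'up' otherwise."""
    events = sorted(events)               # order events by timestamp
    indices = [i for _, i in events]      # sequence of blocked strings
    return 'down' if indices == sorted(indices) else 'up'
```

Because a strum sweeps across the beams in a fraction of a second, the ordering of just a few interrupt timestamps is enough to recover its direction.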

Interactive Fish

In this project I'm making an interactive drawing of Chinese watercolor fish that can be projected on the ceiling. The fish are attracted when people show up and follow the user's position; users can disturb the fish by raising their arms.

CNC Router Rocking Chair

In my first semester, I engaged in many fabrication processes. One of the fabrication-centric classes I attended taught us how to use a CNC router to build things. I was so excited that I built myself a rocking chair. I modeled and fabricated it myself, though a friend helped me assemble it. What I hadn't imagined was that when we finished, all my friends lined up to try the chair. They were excited to try it and made jokes about it. I really enjoyed seeing my chair gather people around and make them laugh.

Whisker and Face

This was a mid-term project for the Making Things Interactive class. In our exploration of materials, the device evolved from a paper-based mask to a solenoid-controlled silicone face. When someone brings their face close to the silicone face, it starts trembling. As they get closer, it trembles harder, ending in a freeze with an expression of horror.

Heart Beat

The heart, in many cultures, is a metaphor for feelings and emotions. In the "Heart Beat" project, we try to visualize heartbeats to create a new medium for people to communicate with others. We built two devices: one is a handheld light that senses heartbeats and visualizes them with red LEDs; the other is a wall full of patterns generated by a projector. When we put our hands on the wall, it releases ripples. We hope that by showing our heartbeats we can augment the feeling of being together, or create a feeling of connection between people at a distance.

LED Jump Game

It's a small project for the class Gadgets, Sensors and Activity Recognition in HCI. Users press a single button to make an LED dot jump higher.

Remote Control Device of Indoor Lights

This device is designed for the remote control of indoor lights. To use it, we attach the device to the surface of a light switch; the light can then be operated with a remote controller. My main design goal was to make lights remote-controllable without any modification to them, which is convenient for people who know little about circuits. We realized the device with remote-control chips and electromagnets.

CMU Classes

  • 16-720 Computer Vision (spring 2017)   notes
  • 10-807 Deep Learning (fall 2016)   notes
  • 10-601 Machine Learning (spring 2016)   notes
  • M.S. Graduation Project (advised by Prof. Chris Harrison) (spring 2015)
  • 15-619 Cloud Computing (fall 2014)
  • 95-771 Data Structure and Algorithm
  • 05-689 Independent Study in HCI
  • 98-222 Introduction to iOS Development
  • 15-213 Introduction to Computer System (spring 2014)
  • 67-103 Fundamentals of Web Design
  • 05-833 Gadgets, Sensors and Activity Recognition in HCI
  • 05-834 Applied Machine Learning
  • 15-112 Fundamentals of Programming and Computer Science (Python) (fall 2013)
  • 24-780 Engineering Computation (C++)
  • 48-478 Digital Tooling
  • 48-739 Making Things Interactive


News

  • July 11 Disney projects in full swing.
  • July 5 Kayak and swim at North Shore.
  • May 8 Present and demo Electrick at CHI 2017.
  • Jan 2 Back in Pittsburgh.
  • Dec 22 Visit ASU and Ling.
  • Dec 17 Visit Hong Kong to see my wife.
  • Oct 20 Back in Pittsburgh.
  • Oct 16 Attend UIST 2016 @ Tokyo, give AuraSense presentation.
  • Oct 09 Talk about research and share experience living abroad with ICMLL lab.
  • Oct 06 Wonderful wedding ceremony with our two families and friends in Beijing.
  • Sep 22 post-CHI party at Union Grill. Preparing for my wedding.
  • Aug 28 CHI 2017 projects final push.
  • Jun 26 Three lab papers got accepted at UIST 2016. Go FIGlab!
  • Jun 1 Summer projects for CHI 2017 are in full swing.
  • May 16 I got married!
  • April 13 UIST 2016 Paper submitted. Fly to St. Louis for weekends.
  • April 1 UIST 2016 in full swing.
  • Mar 23 Qualcomm Innovation Fellowship finalist presentation and demo.
  • Jan 26 Pittsburgh Penguins vs. New Jersey Devils. We won!
  • Jan 9 Got back to Pittsburgh. New semester started!
More >