
Controlling Physical Appliances Through Head-Worn Infrared Targeting

Yu-Hsiang Chen (UC Berkeley), Ben Zhang (UC Berkeley), Claire Tuna (UC Berkeley)
Tools
Location: C260

Increasingly, devices and services in our built environment are networked and can be controlled remotely. The proliferation of smart, controllable devices such as intelligent lighting, AV equipment, HVAC systems, or kitchen appliances raises the question of how to best interact with them.

Today, commercial solutions (such as Belkin WeMo) use handheld mobile devices as universal remote controls for such appliances. In these solutions, users first browse a list of all available devices and then call up a device-specific user interface. This method faces two challenges: naming and scoping. Assigning clear, memorable names to many devices is non-trivial. Without a method of scoping selection to automatically filter out non-relevant devices, paging through long lists of names or navigating hierarchies becomes cumbersome.

In this talk, we will discuss a new method for selecting and controlling smart appliances in physical spaces through the use of a head-worn computing device with a near-eye display and wireless communication. We augment Google Glass with custom hardware for this purpose. Users first look in the direction of the appliance they wish to control to initiate interaction (e.g., at a lamp to control lighting, or at a speaker to change music playback volume). Once an appliance is acquired, an appliance-specific control UI shown on the head-mounted display enables adjustment of discrete and continuous parameters through a touchpad interface.
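
As a rough illustration of this last step (not the talk's actual implementation), the sketch below shows how an appliance-specific UI might map the Glass touchpad's swipes and taps onto continuous and discrete parameters; all class names, parameters, and values here are hypothetical.

```python
# Hypothetical sketch: touchpad gestures adjusting appliance parameters
# once an appliance has been acquired. Names and values are illustrative,
# not taken from the actual Glass implementation.
from dataclasses import dataclass


@dataclass
class ContinuousParam:
    """A parameter adjusted by swiping, e.g. lamp brightness or volume."""
    name: str
    value: float = 0.5
    step: float = 0.05

    def swipe(self, direction: int) -> float:
        # direction: +1 for swipe forward, -1 for swipe backward
        self.value = min(1.0, max(0.0, self.value + direction * self.step))
        return self.value


@dataclass
class DiscreteParam:
    """A parameter cycled by tapping, e.g. a lamp mode or TV input."""
    name: str
    options: list
    index: int = 0

    def tap(self) -> str:
        self.index = (self.index + 1) % len(self.options)
        return self.options[self.index]


# Example appliance UI: a smart lamp with brightness (swipe) and mode (tap).
lamp_ui = {
    "brightness": ContinuousParam("brightness"),
    "mode": DiscreteParam("mode", ["off", "reading", "ambient"]),
}

if __name__ == "__main__":
    print(lamp_ui["brightness"].swipe(+1))   # swipe forward -> brighter
    print(lamp_ui["mode"].tap())             # tap -> next mode
```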

Our implementation and demo use an augmented Google Glass with a narrow-beam IR emitter and a wireless 802.15.4 radio; all target appliances are equipped with IR receivers and wireless radios. We will give a live demo, show use cases in videos, and discuss the benefits and limitations of this technique.
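
To make the selection flow concrete, here is a minimal, hypothetical sketch of a handshake one could build on this hardware: the Glass-side emitter sends a short IR burst carrying a one-time token, and the appliance whose IR receiver sees the burst replies over the 802.15.4 radio with its identity and a control UI descriptor. The message format and function names are assumptions, not the authors' protocol.

```python
# Hypothetical selection handshake over narrow-beam IR plus 802.15.4 radio.
# Message fields, device IDs, and function names are illustrative assumptions.
import json
import random
from typing import Optional


def glass_emit_ir(session_token: int) -> dict:
    """Glass side: encode a short IR burst carrying a one-time token."""
    return {"type": "ir_beacon", "token": session_token}


def appliance_on_ir(beacon: dict, device_id: str, ui_descriptor: dict) -> dict:
    """Appliance side: a device whose IR receiver saw the beacon replies over radio."""
    return {
        "type": "radio_reply",
        "token": beacon["token"],   # echo the token so Glass can match the reply
        "device": device_id,
        "ui": ui_descriptor,
    }


def glass_on_radio(reply: dict, expected_token: int) -> Optional[dict]:
    """Glass side: accept the reply only if it echoes the emitted token."""
    if reply.get("token") == expected_token:
        return reply["ui"]
    return None


if __name__ == "__main__":
    token = random.getrandbits(16)
    beacon = glass_emit_ir(token)
    # Only the appliance the user is looking at receives the narrow IR beam.
    reply = appliance_on_ir(beacon, "lamp-01",
                            {"brightness": "continuous", "mode": "discrete"})
    print(json.dumps(glass_on_radio(reply, token), indent=2))
```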

DEMO:
During the talk, we will also give a live demo of this target selection technique using a mock TV player and a smart lamp. A second person will walk onto the stage and manipulate these appliances while the presenter narrates what is happening. To accompany this limited on-stage demo, we will also play a video that demonstrates use cases in two scenarios.

SPECIAL NEEDS:
We will need a TV set on wheels and a lamp.


Yu-Hsiang Chen

UC Berkeley

Sean (Yu-Hsiang) Chen is a UI/UX designer and creative technologist. He recently earned a Master's degree from the School of Information at UC Berkeley, where he focused on UI/UX design, human-computer interaction, and product design. After graduation, he worked as a part-time HCI research assistant with Professor Bjoern Hartmann. Prior to joining Berkeley, he worked at IBM Taiwan as a software engineer for three years, focusing on mobile technology.


Ben Zhang

UC Berkeley

Ben Zhang is a second-year Ph.D. student at UC Berkeley working with Professor Edward A. Lee on the TerraSwarm project. His research focuses on networked embedded devices that augment human-environment interaction, and on the back-end infrastructure that provides a platform for sensors and actuators. Prior to joining Berkeley, he interned at Microsoft Research Asia, working on magnetic-based proximity detection (LiveSynergy) and smart earphones for in-situ health monitoring (SEPTIMU).

Claire Tuna

UC Berkeley

Claire Tuna is a fourth-year undergraduate at UC Berkeley, studying Computer Science and working as an HCI research assistant in the Berkeley Institute of Design.