Hello Tux!

A reminder that our final Sanders Series Invited Lecture of this academic year will take place tomorrow at 12:30pm in the MaRS auditorium. We are very excited to be hosting Dr. Yang Li from Google.

See you tomorrow!
---
Dr. Yang Li: Enabling New Input Dimensions for Mobile Interaction

The limited interaction bandwidth of existing mobile user interfaces is incompatible with the rapidly growing computing power of mobile and wearable devices. To address this problem, it is important to explore new interaction dimensions that can utilize the rich sensing capabilities of these devices as well as their seamless integration into our everyday activities. In this talk, I will first describe how we can significantly reduce user effort in mobile interaction, at scale, by leveraging gestural input. I will then describe how new tools and frameworks can empower developers to leverage new input dimensions such as gestural, cross-device and contextual input. Drawing on these systems, I will discuss how these input dimensions, though natural to the user, deeply challenge traditional interactive computing, and how we can address this challenge by providing high-level tool support.
Yang Li is a Senior Research Scientist in Human-Computer Interaction and Mobile Computing at Google, where he leads the Predictive User Interfaces group. He is also an affiliate faculty member in Computer Science & Engineering at the University of Washington. He earned a Ph.D. in Computer Science from the Chinese Academy of Sciences and conducted postdoctoral research in EECS at the University of California, Berkeley. He has published over 50 papers in the field of Human-Computer Interaction, including 29 publications at CHI, UIST and TOCHI, and has regularly served on the program committees of top-tier HCI and mobile computing conferences.

Yang's research focuses on novel tools and methods for creating mobile interaction behaviors, particularly regarding emerging input modalities (such as gestures and cameras), cross-device interaction and predictive user interfaces. Yang wrote Gesture Search, a popular Android app for random access of mobile content using gestures. He develops software tool support and recognition methods by drawing insights from user behaviors and by leveraging techniques such as machine learning, computer vision and crowdsourcing to make complex tasks simple and intuitive.
OUR SPONSORS:

Tux is made possible by the support of our sponsors: Steven Sanders, Autodesk, the University of Toronto Department of Computer Science, and MaRS.
About MaRS: MaRS is one of the world's largest urban innovation hubs: a place for collaboration, creativity and entrepreneurship. Located in the heart of Toronto's research district, MaRS provides the space, training, talent and networks required to commercialize important discoveries and launch and grow Canadian startups.