Good evening, and happy Thanksgiving!

A reminder of Dr. Hrvoje Benko’s talk today. Details are below; we look forward to seeing you at the first Tux event!

Today! Tuesday, October 13 at 12:30pm, Tux Proudly Presents: Hrvoje Benko, PhD

The MaRS Discovery District Auditorium @ 101 College St. (Lower Level)

Lunch reception begins at 12:30pm. The program, a brief introduction to Tux followed by our exciting guest speaker, Dr. Hrvoje Benko, a Senior Researcher at Microsoft Research, begins at 1:00 sharp. Dr. Benko is the first of the invited Sanders Series lecturers, whose talks are interspersed with other Tux meetings throughout our inaugural year.

Please feel free to share this invitation with anyone who conducts HCI research, corporate or academic, in the Toronto area. Kindly confirm your attendance by email to Grace Chen (gchen@dgp.toronto.edu). Grace can also add anyone who requests it to this mailing list for future notifications.

http://research.microsoft.com/en-us/um/people/benko/Hrvoje_Benko_HeadShot_lowres.jpg

Hrvoje Benko is a Senior Researcher at Microsoft Research. He explores novel interactive computing technologies and their impact on human-computer interaction. In particular, his research interests include augmented reality, touch- and gesture-based interfaces, depth sensing, and display technologies. He helped found and lead the Microsoft Touch Mouse project, and he has collaborated extensively with the Surface Computing group at Microsoft. He has been active in the human-computer interaction field, authoring more than 50 scientific papers and journal articles, and serving as the General Chair (2014) and the Program Chair (2012) of the ACM Symposium on User Interface Software and Technology (UIST). His publications have received several best paper awards at both ACM UIST and ACM SIGCHI. Before joining Microsoft, he obtained his PhD at Columbia University. More detail can be found on his website: http://research.microsoft.com/~benko/.


Interacting with Photons: Creating Interactive Projected Augmented Reality Experiences

We have been investigating how depth-sensing cameras and projectors can be used to enable highly immersive and interactive augmented reality experiences. In contrast to head-worn displays, such projector + depth camera systems offer the ability to create wide-field-of-view immersive augmented reality experiences without the user having to wear any additional gear. While large-scale projection mapping installations have become familiar forms of artistic expression, our work showcases how one can leverage the unique capabilities of today's depth cameras and fast GPUs to enable real-time projection mapping on any surface (including deformable moving surfaces), thus enabling truly interactive experiences. In this talk, I present a progression of research prototypes, each exploring a different use scenario, and discuss the challenges in authoring such experiences. I draw examples from several highly publicized projects such as OmniTouch, IllumiRoom, and RoomAlive, as well as highlight the recent release of our open source RoomAlive Toolkit (https://github.com/Kinect/RoomAliveToolkit/).
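For readers curious about the geometry underlying such projector + depth camera systems, the core step is mapping a point seen by the depth camera into the projector's image so that projected light lands exactly on the physical surface. The sketch below illustrates that pinhole-camera transformation; all calibration numbers here are hypothetical placeholders, not values from the talk or the RoomAlive Toolkit, which obtains them via its own projector-camera calibration procedure.

```python
import numpy as np

# Hypothetical calibration values, for illustration only.
K_cam = np.array([[365.0,   0.0, 256.0],   # depth camera intrinsics
                  [  0.0, 365.0, 212.0],
                  [  0.0,   0.0,   1.0]])
K_proj = np.array([[1400.0,    0.0, 640.0],  # projector intrinsics
                   [   0.0, 1400.0, 400.0],
                   [   0.0,    0.0,   1.0]])
# Rigid transform from the camera frame to the projector frame.
R = np.eye(3)                    # assume parallel optical axes
t = np.array([0.1, 0.0, 0.0])    # projector 10 cm to the camera's right

def depth_pixel_to_projector(u, v, depth_m):
    """Map a depth-camera pixel (u, v) at a given depth (meters)
    to projector pixel coordinates."""
    # Back-project the depth pixel to a 3D point in the camera frame.
    p_cam = depth_m * (np.linalg.inv(K_cam) @ np.array([u, v, 1.0]))
    # Move the point into the projector's coordinate frame.
    p_proj = R @ p_cam + t
    # Project onto the projector's image plane (perspective divide).
    q = K_proj @ p_proj
    return q[0] / q[2], q[1] / q[2]

x, y = depth_pixel_to_projector(256, 212, 2.0)
```

Running this per depth pixel every frame (on the GPU, in practice) is what lets the projected imagery track arbitrary, even moving, surfaces in real time.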