Hi Mingming,
Do you want to have a chat with Dr. Yang Li?
Thanks,
Haijun
From: Haijun Xia <haijunxia@dgp.toronto.edu> Sent: Monday, March 28, 2016 7:59 PM To: hci@dgp.toronto.edu; graphics@dgp.toronto.edu Subject: RE: Demo Schedule for Dr. Yang Li's DGP Visit
A reminder to schedule a demo for Dr. Yang Li. Many of you are working on submissions, but please do consider the benefits of exposing your exciting work to one of the best researchers in our field and making connections to Google Research.
Thanks,
Haijun
From: Haijun Xia <haijunxia@dgp.toronto.edu> Sent: Saturday, March 26, 2016 3:56 PM To: hci@dgp.toronto.edu; graphics@dgp.toronto.edu Subject: Demo Schedule for Dr. Yang Li's DGP Visit
Good afternoon everyone,
Dr. Yang Li is visiting our lab on Wednesday, April 6th, after his TUX talk on Tuesday. I am putting together the demo schedule for him. He will be seeing demos from 10:30 am to 12 pm and from 2:15 pm to 5 pm. If you want to give a demo, please let me know by Monday night, along with the time you prefer.
Thanks,
Haijun
You can find his bio here:
Yang Li is a Senior Research Scientist in Human Computer Interaction and Mobile Computing at Google, where he leads the Predictive User Interfaces group. He is also an affiliate faculty member in Computer Science & Engineering at the University of Washington. He earned a Ph.D. in Computer Science from the Chinese Academy of Sciences (http://english.cas.ac.cn/) and conducted postdoctoral research in EECS (http://www.eecs.berkeley.edu/) at the University of California, Berkeley (http://www.berkeley.edu/). He has published over 50 papers in the field of Human Computer Interaction, including 29 publications at CHI, UIST and TOCHI, and has regularly served on the program committees of top-tier HCI and mobile computing conferences.
Yang's research focuses on novel tools and methods for creating mobile interaction behaviors, particularly regarding emerging input modalities such as gestures (http://yangl.org/pdf/gesturesearch-uist2010.pdf) and cameras (http://youtu.be/JJSZGdMYV9s), cross-device interaction (https://www.youtube.com/watch?v=xGqn1FQRQPQ, http://googleresearch.blogspot.com/2013/09/projecting-without-projector-sharing.html), and predictive user interfaces (http://dl.acm.org/citation.cfm?id=2647355). Yang wrote Gesture Search (https://play.google.com/store/apps/details?id=com.google.android.apps.gesturesearch&hl=en), a popular Android app for random access of mobile content using gestures. Yang develops software tool support (http://youtu.be/8OXExn29OTE) and recognition methods (http://yangl.org/pdf/protractor-chi2010.pdf) by drawing insights from user behaviors (http://yangl.org/pdf/motiongestures-chi2011.pdf) and leveraging techniques such as machine learning, computer vision and crowdsourcing (http://yangl.org/pdf/crowdlearner.pdf) to make complex tasks simple and intuitive.
Oops.
I guess you can tell the difficulty I am facing now. Sorry for the spamming, but please take this as the final reminder to schedule a slot with Dr. Yang Li.
I am very sorry about this mistake, Mingming.
Sincerely,
Haijun
From: nrg <nrg-bounces@dgp.toronto.edu> On Behalf Of Haijun Xia Sent: Tuesday, March 29, 2016 6:47 PM To: hci@dgp.toronto.edu; graphics@dgp.toronto.edu Subject: Re: [nrg] Demo Schedule for Dr. Yang Li's DGP Visit