Hi all,
We need a volunteer for the HCI meeting on Tuesday, November 13, 2018. If you
would like to lead an activity or present your research, please let me know.
Thanks,
Rifay
--
Mohammad Rashidujjaman Rifat
Ph.D. Student, Department of Computer Science, University of Toronto.
http://www.dgp.toronto.edu/~rifat/
Does anybody have ideas about how you could use AI in your project to
demonstrate social impact?
Regards,
Ishtiaque
---------- Forwarded message ---------
From: Emily Wilson <emily.wilson(a)utoronto.ca>
Date: Fri, Nov 9, 2018, 4:05 PM
Subject: Google Impact Challenge
To: Ishtiaque Ahmed <ishtiaque(a)cs.toronto.edu>
Hello Ishtiaque,
I hope you are doing very well and looking forward to the weekend.
Google has announced a new competition – Google Impact Challenge – for
organizations using AI for social good. Since AI is a broad category, and
your lab is actively working for greater social good, I wanted to share the
details and learn if you would be interested in pursuing this opportunity.
I am attaching the overview and application guide. You can also see more
details here: https://ai.google/social-good/impact-challenge/#faqs
Please let me know your thoughts on this opportunity!
All the best,
Emily
*Emily Wilson*
Senior Development Officer, Corporate & Foundation Development
Faculty of Arts and Science, Office of Advancement
University of Toronto
Sidney Smith Hall, Suite 2036
100 St. George Street | Toronto, Ontario | M5S 3G3
*e:* emily.wilson(a)utoronto.ca
*t:* 416.978.4177
*f:* 416.971.2374
I will be speaking tomorrow evening at this event. Details below; you are all
welcome:
http://www.torchi.org/event-3101235
Best Regards,
Ishtiaque
Syed Ishtiaque Ahmed
Assistant Professor
Department of Computer Science
University of Toronto, ON, CA
web: https://www.ishtiaque.net/
Dear Tux,
A reminder that our next Member Presentation will take place tomorrow (Tuesday) at the DGP lab (40 St. George St., 5th Floor). We have two exciting speakers; looking forward to seeing you there!
Haijun Xia
Abstract
Supporting Direct Human-Computer Communication
From the dawn of digital computing, we have striven to communicate with computers to fully leverage their computing power. The development of sensing technologies enables such communication through verbal, gestural, and graphical language.
Despite the many different input techniques, conversations with computers are all structured around a fixed set of UI elements that do not support much flexibility. As such, the rich and dynamic thoughts we could have articulated naturally with flexible words, gestures, and visuals must be formalized as structured, restrictive, rigid, and repetitive tasks around such elements. I seek to design a new interaction language that enables us to directly and flexibly articulate our creative thoughts. I approach this from two directions. First, I design new graphical representations of digital content to match our dynamic and flexible needs. Second, I invent novel interaction techniques to enable the direct articulation of user intention.
Bio
I am a PhD student advised by Prof. Daniel Wigdor in the DGP lab at the University of Toronto. I am also a Microsoft PhD Fellow and an Adobe PhD Fellow.
My research area is Human-Computer Interaction, in which I focus on creating flexible digital media to augment our creativity. I approach this from two directions: 1) I invent novel representations of abstract content to match our dynamic needs; and 2) I develop novel interaction techniques that allow us to express our thoughts and ideas via the graphical, gestural, and vocal communication we are all naturally capable of. For more information, please visit www.haijunxia.com
Seongkook Heo
Abstract
Expanding Touch Interaction Bandwidth by Making Computers Feel Our Touch and Be Felt
Our natural touch is rich, nuanced, and full of physical properties, such as force and posture, that convey our intentions. When we manipulate physical objects, we also understand the state of the object and adjust our posture or force based on what we feel through our fingers. This rich physical interaction enables eyes-free and skillful object manipulation. However, most touch interfaces ignore this rich source of information: they register only the contact location of a finger and give no physical reaction to our touch. This limited input and output bandwidth often necessitates multiple input modes and many buttons, and it makes eyes-free interaction challenging. In this talk, I will introduce projects that my colleagues and I have worked on to enrich our touch interaction with computers by utilizing previously unused physical properties of touch. I will discuss how we can make computers sense more from our touch and use that information to enable richer interaction, and I will also present our haptic feedback methods that could enable virtual content to be felt.
Bio
Seongkook is a postdoctoral fellow in the DGP Lab at the University of Toronto, working with Prof. Daniel Wigdor. He received his Ph.D. in Computer Science from KAIST in 2017, under the supervision of Prof. Geehyuk Lee. He is interested in making communication between humans and computers richer and more natural through better use of new input and output modalities. His work has been published in premier conference proceedings and journals, such as CHI, UIST, CSCW, and IEEE Transactions on Haptics. (Learn more at: http://seongkookheo.com)
OUR SPONSORS:
TUX is made possible by the support of our sponsors, Steven Sanders, Autodesk,
University of Toronto Department of Computer Science, and MaRS.
About MaRS: MaRS is one of the world's largest urban innovation hubs: a place for collaboration, creativity, and entrepreneurship. Located in the heart of Toronto's research district, MaRS provides the space, training, talent, and networks required to commercialize important discoveries and launch and grow Canadian startups.
_______________________________________________
tux-announce mailing list
tux-announce(a)dgp.toronto.edu
https://www.dgp.toronto.edu/cgi-bin/mailman/listinfo/tux-announce
Best Regards,
Ishtiaque
Syed Ishtiaque Ahmed
Assistant Professor
Department of Computer Science
University of Toronto, ON, CA
web: https://www.ishtiaque.net/
---------- Forwarded message ---------
From: Syed Ishtiaque Ahmed <ishtiaque(a)csebuet.org>
Date: Sat, Nov 3, 2018 at 9:54 AM
Subject: Fwd: [TIER] CFP LIMITS 2019 - Fifth Workshop on Computing within
LIMITS
To: Ishtiaque Ahmed <ishtiaque(a)cs.toronto.edu>
---------- Forwarded message ---------
From: Jay Chen <jchen(a)cs.nyu.edu>
Date: Sat, Nov 3, 2018, 4:13 AM
Subject: [TIER] CFP LIMITS 2019 - Fifth Workshop on Computing within LIMITS
To: <TIER(a)tier.cs.berkeley.edu>
Call for Papers
--------------------
LIMITS 2019
Fifth Workshop on Computing within LIMITS
June 10-11, 2019
Lappeenranta, Finland
http://computingwithinlimits.org/2019/
The ACM LIMITS workshop aims to foster discussion on the impact of present
and future ecological, material, energetic, and societal limits on
computing. These topics are seldom discussed in contemporary computing
research. A key aim of the workshop is to promote innovative, concrete
research, potentially of an interdisciplinary nature, that focuses on
technologies, critiques, techniques, and contexts for computing within
fundamental economic and ecological limits. A longer-term goal is to build
a community around relevant topics and research. We hope to impact society
through the design and development of computing systems in the abundant
present for use in a future of limits. A recent article
<https://cacm.acm.org/magazines/2018/10/231374-computing-within-limits/abstr…>
in the Communications of the ACM provides a good primer on Computing within
Limits. This year we are co-locating with ICT4S in Europe.
Abstract submission deadline: Feb 1, 2019
Paper submission deadline: Feb 8, 2019
Paper reviews available: March 14, 2019
Camera-ready paper deadline: March 28, 2019
Jay Chen, NYU Abu Dhabi, jchen(a)cs.nyu.edu, Workshop Co-Chair
Oliver Bates, Lancaster University, o.bates(a)lancaster.ac.uk, Workshop
Co-Chair
For more information, please visit: http://computingwithinlimits.org/2019/
_______________________________________________
TIER mailing list
Website: http://tier.cs.berkeley.edu
TIER(a)tier.cs.berkeley.edu
https://www.millennium.berkeley.edu/cgi-bin/mailman/listinfo/tier