Hi All,
We have our HCI group meeting tomorrow at 12:30 PM in the DGP Seminar room. In
the first session, Zhen Li will conduct a brainstorming session for his
research (abstract attached). In the second session, Dina Sabie will
present her research. Please drop by and take part in the activities. Lunch
will be provided during the meeting.
*Zhen's brainstorming:*
People can perform complicated computing tasks in their offices, using
their desktops with keyboards, mice, and monitors. But when they are
outside, they usually perform these tasks on their mobile phones. How about
combining an Augmented/Mixed Reality (AR/MR) device with the smartphone to
improve the user experience and provide more functionalities? I will lead a
brainstorming session and collect some feedback from you.
Thanks,
Rifat
--
Mohammad Rashidujjaman Rifat
Ph.D. Student, Department of Computer Science, University of Toronto.
http://www.dgp.toronto.edu/~rifat/
Hi HCIers,
Hope the CHI reviews are not getting you down too much.
Whatever the reviews say, you still have a chance to address reviewers'
concerns and win their support by writing a good rebuttal.
I wanted to let you know that there's a collection of useful materials on
writing CHI rebuttals on dgp wiki that you should check out if you haven't
already: http://www.dgp.toronto.edu/dgpwiki/ResearchResources
Best,
Seongkook
Hello Everyone,
Regan Mandryk (website <http://hci.usask.ca/people/view.php?id=1>) will be
visiting the DGP on Wednesday November 21st, following her TUX talk on
Tuesday (20th). I'll be arranging demos for her throughout the day. Please
let me know if you'd like to arrange a meeting. This is a great opportunity
to show off your work!
-Peter Hamilton
*Bio*
Regan Mandryk is a professor in Computer Science at the
University of Saskatchewan; she pioneered the area of physiological
evaluation for computer games in her award-winning Ph.D. research at
Simon Fraser University with support from Electronic Arts. With over
200 publications that have been cited thousands of times (including
one of Google Scholar’s 10 classic papers in HCI from 2006), she
continues to investigate novel ways of understanding players and their
experiences, but also develops and evaluates games for health and
wellbeing, and games that foster interpersonal relationships. Regan
has been the invited keynote speaker at several international game
conferences, led Games research in the Canadian GRAND Network,
organizes international conferences including the inaugural CHI PLAY,
the inaugural CHI Games Subcommittee, and CHI 2018, and leads the
first ever Canadian graduate training program on games user research
(SWaGUR.ca) with $2.5 million of support from NSERC. She was inducted
into the Royal Society of Canada’s College of New Scholars, Artists
and Scientists in 2014, received the University of Saskatchewan New
Researcher Award in 2015, the Canadian Association for Computer
Science’s Outstanding Young Canadian Computer Science Researcher Prize
in 2016, and the prestigious E.W.R. Steacie Fellowship in 2018.
Hi all
Since my recent project shares many features with Lizhen's, Lizhen and I will co-lead tomorrow's brainstorming session. We will first present our ideas and progress, and then open a discussion.
I also have a runnable demo prepared; if anyone is interested, feel free to try it out.
Best
FY
Begin forwarded message:
From: Mohammad Rashidujjaman Rifat <rifat@cs.toronto.edu>
Subject: [Sig] HCI Meeting This Week
Date: November 12, 2018 at 2:27:54 PM EST
To: hci@dgp.toronto.edu, Joseph Jay Williams <josephjaywilliams@gmail.com>, Winter Wei <winterwei@gmail.com>
Hi All,
We have our HCI group meeting tomorrow at 12:30 PM at DGP Seminar room. In the first session, Zhen Li will conduct a brainstorming session for his research (abstract attached). In the second session, Dina Sabie will present her research. Please drop by and take part in the activities. Lunch will be provided during the meeting.
Zhen's brainstorming:
People can perform complicated computing tasks in their offices, using their desktops with keyboards, mice, and monitors. But when they are outside, they usually perform these tasks on their mobile phones. How about combining an Augmented/Mixed Reality (AR/MR) device with the smartphone to improve the user experience and provide more functionalities? I will lead a brainstorming session and collect some feedback from you.
Thanks,
Rifat
--
Mohammad Rashidujjaman Rifat
Ph.D. Student, Department of Computer Science, University of Toronto.
http://www.dgp.toronto.edu/~rifat/
_______________________________________________
Sig mailing list
Sig@dgp.toronto.edu
https://www.dgp.toronto.edu/cgi-bin/mailman/listinfo/sig
Hi all,
We need a volunteer for the HCI meeting on Tuesday, November 13, 2018. If you
want to lead an activity or present your research, please let me know.
Thanks,
Rifat
--
Mohammad Rashidujjaman Rifat
Ph.D. Student, Department of Computer Science, University of Toronto.
http://www.dgp.toronto.edu/~rifat/
Dear Tux,
A reminder that our next Member Presentation will take place tomorrow (Tuesday) at the DGP lab (40 St. George St., 5th Floor). We have two exciting speakers; looking forward to seeing you there!
Haijun Xia
Abstract
Supporting Direct Human-Computer Communication
From the dawn of digital computing, we have striven to communicate with computers to fully leverage their computing power. The development of sensing technologies enables such communication through verbal, gestural, and graphical language.
Despite these many input techniques, conversations with computers are all structured around a fixed set of UI elements that allow little flexibility. As such, the rich and dynamic thoughts we could articulate naturally with flexible words, gestures, and visuals must be formalized as structured, restrictive, rigid, and repetitive tasks around those elements. I seek to design a new interaction language that enables us to directly and flexibly articulate our creative thoughts. I approach this from two directions. First, I design new graphical representations of digital content to match our dynamic and flexible needs. Second, I invent novel interaction techniques to enable the direct articulation of user intention.
Bio
I am a PhD student advised by Prof. Daniel Wigdor at DGP lab, University of Toronto. I am also a Microsoft PhD Fellow and Adobe PhD Fellow.
My research area is Human-Computer Interaction, in which I focus on creating flexible digital media to augment our creativity. I approach this from two directions: 1) I invent novel representations of abstract content to match our dynamic needs; and 2) I develop novel interaction techniques that allow us to express our thoughts and ideas through the graphical, gestural, and vocal communication that we are all naturally capable of. For more information, please visit http://www.haijunxia.com/
Seongkook Heo
Abstract
Expanding Touch Interaction Bandwidth by Making Computers Feel Our Touch and Be Felt
Our natural touch is rich, nuanced, and full of physical properties, such as force and posture, that convey our intentions. When we manipulate physical objects, we also sense the state of the object and adjust our posture or force based on what we feel through our fingers. This rich physical interaction enables eyes-free and skillful object manipulation. However, most touch interfaces ignore this rich source of information: they register only the contact location of a finger and give no physical reaction to our touch. This limited input and output bandwidth often necessitates multiple input modes and many buttons, and makes eyes-free interaction challenging. In this talk, I will introduce projects that my colleagues and I have worked on to enrich our touch interaction with computers by utilizing previously unused physical properties of touch. I will discuss how we can make computers sense more from our touch and use that information to enable richer interaction, as well as our haptic feedback methods that could let virtual content be felt.
Bio
Seongkook is a postdoctoral fellow in the DGP Lab at the University of Toronto, working with Prof. Daniel Wigdor. He received his Ph.D. in Computer Science from KAIST in 2017, under the supervision of Prof. Geehyuk Lee. He is interested in making communication between humans and computers richer and more natural through better use of new input and output modalities. His work has been published in premier conference proceedings and journals, such as CHI, UIST, CSCW, and IEEE Trans. Haptics. (Learn more at: http://seongkookheo.com/)
OUR SPONSORS:
TUX is made possible by the support of our sponsors, Steven Sanders, Autodesk,
University of Toronto Department of Computer Science, and MaRS.
About MaRS: MaRS is one of the world's largest urban innovation hubs: a place for collaboration, creativity, and entrepreneurship. Located in the heart of Toronto's research district, MaRS provides the space, training, talent, and networks required to commercialize important discoveries and to launch and grow Canadian startups.
_______________________________________________
tux-announce mailing list
tux-announce@dgp.toronto.edu
https://www.dgp.toronto.edu/cgi-bin/mailman/listinfo/tux-announce