Communicating About Communicating:

Cross-Disciplinary Design of a Media Space Interface

Beverly Harrison

Dept. of Industrial Engineering

University of Toronto

4 Taddlecreek Road

Toronto, Ontario

CANADA M5S 1A4

(416) 978-7581

beverly@dgp.toronto.edu

Marilyn Mantei

Dept. of Computer Science

and Faculty of Library and Information Science

University of Toronto

Toronto, Ontario

CANADA M5S 1A4

mantei@dgp.toronto.edu

Garry Beirne and Tracy Narine

Ontario Telepresence Project

University of Toronto

6 Kings College Road

Toronto, Ontario

CANADA M5S 1A1

garry@dgp.toronto.edu

tracyn@dgp.toronto.edu

ABSTRACT

This paper describes both the benefits and the challenges that result from differing perspectives and methodologies in an interdisciplinary team. Our team of user interface designers, engineers, psychologists, and sociologists designed and implemented a desktop videoconferencing system for a local company. We shared a common goal of smoothly installing technology that would support and enhance current work practices within the company. Because the project involved supporting human-human communication and work cooperation, the sociologists had much more impact on the user interface design than had been anticipated. Furthermore, since any interface design affected subsequent work behavior in the study population, the sociologists needed to understand aspects of the interface design and to regulate the HCI group's influence on and access to the user population.

The engineers and interface designers were frustrated by the limited end user access and fixed schedules that were considered necessary by the sociologists. In turn, the sociologists saw the designers and engineers as non-observant researchers who had to be steered away from invasive practices and admonished to keep measures constant. In the end, both disciplines helped each other accomplish their goals. The sociologists came to better understand the evaluation of technology and how usability affects future design and product acceptance. The designers and engineers learned how current work practices and roles, when studied in detail, can provide design clues. The team learned how to run a smoother technology design and deployment process.

KEYWORDS: user interface design, interdisciplinary design, desktop videoconferencing, videoconferencing, media space

INTRODUCTION

Members of the human-computer interaction community tend to view themselves as interlopers, invading the sacred grounds of software developers and subjecting them to interdisciplinary teams. The HCI literature contains a variety of articles suggesting how to manage the interdisciplinary aspect of the field [7, 13, 14], how to set up design teams that have a human-factors component [10, 11], and how to communicate with the programmer or software designer [5]. Being accepted and respected has been a constant concern of the human factors specialist in the field, so much so that it comes as a surprise when the shoe is on the other foot. We describe just such a situation.

The Ontario Telepresence Project [3] is exploring user tools and techniques for long distance desktop videoconferencing. As such, project members are developing video mail, electronic whiteboards and virtual meeting spaces as applications that will use tomorrow's network capabilities [2, 3, 4]. One Telepresence mandate is to deploy new desktop videoconferencing environments into organizations and measure the use and impact of the technology. Six members of the HCI group were assigned to build the user interface for this technology. However, group communication is an area that is typically researched by sociologists, not cognitive psychologists. For this reason seven sociologists joined our project. Their mandate was to measure the impact of the technology we were building in a realistic field study setting. As such, the interface designers were to work with the sociologists on the development and introduction of the interface so that the various iterations of the technology did not adversely affect the data being gathered. This paper describes this interaction and details how it provided new forms of user data for the interface design, how it changed the process of design, and also how it changed the resulting interface. Our experience reflects valuable lessons to be learned from the sociology paradigm that apply broadly to user interface work. We pass these lessons and insights on to our readers.

THE INDIGO PROJECT

The multi-disciplinary team is part of the Ontario Telepresence Project at the University of Toronto and Carleton University. Since January 1993, the interface design and engineering teams have been working with sociologists to develop and install the Telepresence software and hardware in the "Indigo Corporation." (Indigo is a code name assigned to protect the confidentiality of the organization and its employees.)

The overall goal on this project was to take the research lab's "usable" system and make a simple but robust installation of software and hardware at the Indigo site. The engineering team converted the switching software to connect Indigo offices across cities - our laboratory software had never connected more than 25 offices across 3 floors [3, 11]. The interface design team converted our awkward university lab interface (Figure 1) into one that was usable by secretaries and managers alike (Figure 2). Concurrent with the design and implementation tasks, the sociologists were running pre-tests on organizational and communication characteristics of Indigo prior to the installation of the new technology. Data would be continuously gathered as the user interface was upgraded and the software provided progressively more functionality to the Indigo users.

Much of the sociologists' efforts focused on preparing questionnaires to elicit current communication and work practices (through network analysis and detailed observational studies). These included questions about who each user communicated with and what media they chose for the communication (email, phone, fax, face-to-face) [8]. Detailed field notes were taken over several months recording communication patterns, work habits, and office layout. To minimize concerns about privacy, video tapes were made only of pre-approved meetings. Organization charts, lists of technological artifacts in use in the workplace, transcriptions of interviews and several day long time-geography studies (information on where each employee moves throughout the day along with a description of the task they are carrying out) formed the remainder of the data collection.

The Indigo Project shared all the deadlines of an industry-based operation. We had 120 days to come in, design and install our system, and capture our data. With the exception of Indigo staff vacations, which altered training times, we met these deadlines.

Indigo has two office buildings, separated by about 100 miles. It is basically a public relations firm and its bi-location is a result of serving two of its major customers. The two offices are not autonomous, resulting in large amounts of traveling, electronic mail, faxes, and telephone traffic between the two sites. One member of the subsidiary office travels to the main office to meet with the president of Indigo at least twice a week. The Telepresence system was installed to help them with their long distance communication.

Employees all have Telepresence units on their desks. Each unit consists of a Macintosh workstation, a video monitor, a small color camera, a microphone, and a loudspeaker. These allow users to connect to any one of their Indigo colleagues by video. Connections are made through the Telepresence software interface on their workstations, which runs continuously as a background process.

Figure 1: User interface used to operate the Telepresence desktop videoconferencing system prior to the Indigo redesign.

Figure 2: User interface developed for the Indigo field trial. (Research lab members' names are shown to protect the anonymity of field trial users.)

The interface designed for Indigo (shown in Figure 2) has several components. The names are co-workers whom Indigo people can contact by video. One must "sign in" to the system to be available (names shown in standard dark text). Otherwise the name is grayed out and the person cannot be contacted. To contact a colleague, the mouse is moved over the name and clicked to highlight the name. Once a name is selected, the mouse is clicked on the "contact" button. This initiates a video call. The blank white box at the top of the Telepresence window gives feedback on the status of the call. Next to each person's name is a small icon representing a door. Doors determine degrees of privacy. If the door is closed or ajar, the recipient of the call has to give permission (in response to a "knock") to enable the video contact. If the door is open, the video contact is made without requesting permission. To end a video call, either party selects "hang-up." Users can also "glance" for a few seconds into someone else's office (if door states allow it), look at themselves in the "mirror," or connect to the outside "window." (Many Indigo offices do not have windows. This feature provides an attractive outdoor view from a camera pointing out one of the few windowed offices.)
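The call flow just described amounts to a small permission check keyed on sign-in status and door state. The sketch below is purely illustrative: the type and function names are ours, not taken from the actual Telepresence code, and the real system handled the knock asynchronously over the video network.

```python
from enum import Enum

class DoorState(Enum):
    OPEN = "open"      # connect immediately, no permission asked
    AJAR = "ajar"      # recipient must answer a "knock" first
    CLOSED = "closed"  # recipient must answer a "knock" first

def attempt_contact(recipient_signed_in, door_state, answers_knock):
    """Decide whether a video call goes through.

    `answers_knock` is a callable standing in for the recipient
    responding to the on-screen knock. All names here are
    hypothetical, for illustration only.
    """
    if not recipient_signed_in:
        return "unavailable"      # name is grayed out in the interface
    if door_state is DoorState.OPEN:
        return "connected"        # open door: no permission needed
    # closed or ajar: connect only if the recipient grants permission
    return "connected" if answers_knock() else "refused"
```

A "glance" would follow the same door-state check but time out after a few seconds instead of waiting for a "hang-up."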

WHO'S WHO

Three faculty, one graduate student, and two members of the technical staff took part in the user interface design for Indigo. The faculty included two psychologists with usability expertise and a computer scientist/sociologist with a background in communications theory. The graduate student was a human factors engineer with expertise in designing interfaces for telecommunications systems, and the technical staff included one multi-talented media engineer and one interface designer/computer scientist. The psychologists, located at Carleton University, participated in weekly video conferences or teleconferences with the Toronto team members in order to discuss the design decisions considered during the week. Faxes and email supplemented the weekly meetings.

Three faculty and four graduate students conducted the sociology research on the field study. The faculty included an ethnographer specializing in field study methods, a social network analysis specialist, and an organizational sociologist specializing in the sociology of work. At least one of the user interface specialists attended sociology meetings and one of the sociologists attended interface design meetings, although neither participated in the day-to-day work of the other team. An important (unanticipated) factor was the co-location of the head of the user interface group and the head of the sociology group (offices directly next to each other). Both were senior researchers, and both exerted some management influence on each other's project. The co-location facilitated frequent communication. Joint meetings (sociologists and designers) were held weekly. Meetings within groups were held as needed, typically two or three times each week.

INTEGRATING THE DIFFERENT PERSPECTIVES

From the onset of the project, the interface design team encountered what they initially perceived to be obstacles to their work. The sociologists felt that the interface designers were plunging ahead with the design without noticing serious user issues that they were concerned with. We chronicle some of these differences and their resolution in the sections which follow. Although the negotiations were difficult at the time, in retrospect, paying attention to these differences suggests improvements in how the HCI community manages its interaction with users. For us, this resulted in better designs and better methodologies.

Designing the User Interface

Dramatically different interfaces affect the behavior of the users. Knowing that sociologists measure these user behaviors, the interface design team expected them to have a strong interest in what went into the initial interface. Thus, we believed that the actual interface design would be a major area of cooperation between the sociologists and the interface designers. The sociologists' very definite lack of interest was both surprising and disappointing. What they wanted was an interface (any interface) and an installation to study. The interface did not even have to perform well because it was just as interesting to measure the effects of a bad interface as a good one!

However, once the initial interface was in place, any changes made to the interface were very important to the sociologists because changes impacted their longitudinal study. For example, flexible privacy control (the door state settings) could not be installed without first checking the timing of the sociology studies. The sociologists did not object to the new interface, but they wanted warnings about such a change since it could have a strong impact on the communication behaviors they were measuring. While they had minimal interest in defining the content, they did need to know what the interface enabled users to do so that they could add appropriate questions into their surveys. Often they acted as usability testers, allowing them to gain early insight into the next design iteration. This also provided the interface team with useful early usability feedback. In contrast, they had a lot to say about how the interface team handled the user population, the questions that the users were asked, the content of the training sessions, and how user feedback was to be obtained from the field trial participants.

Timing Our Access to Users

The basic credo of the user interface designer is "know thy user." Our project required the interface designers to design the interface, install the equipment, write the user manuals, and develop and carry out the training. The interface design team needed access to the user population immediately in order to understand the communication aspects of the users' work for the first iteration. Additionally, interface designers also wanted continuous access to the Indigo users in order to discuss issues that arose as the design progressed.

In sharp contrast, the sociologists wanted controlled and limited access to the user population so that the measures they were taking would not be affected. They needed to do much of their work before they accessed the user population, i.e., they had to develop their instruments for capturing data (e.g., questionnaires, surveys). Until these measurement instruments were ready, they did not want the user population "tainted." Consequently, there were clear differences in opinion about when the "technologists" should have the "first" contact.

The interface design team was not always aware of the subtle reasons that affected the timing of access to users. The organizational sociologist pointed out the importance of establishing the right context for our studies. Indigo management had to announce the studies to its staff and achieve consensus on participation. The timing of the announcement was to be close to the installation date since users would become frustrated with long delays. Thus the introduction of the study to Indigo and the first measures from actual users were both delayed. To help choose features and test a bare-bones first interface, users were selected who were similar to the Indigo users. The sociologists helped in defining "users" who had similar tasks, roles, and status to the users in the Indigo Company. Because of these constraints, we developed an initial interface with limited functionality that would be incrementally improved as more feedback was obtained directly from the Indigo sites. In our case, we discovered that interviewing similar users may be somewhat risky but still provides useful guidance in these circumstances. The interface designs were successful partly because we were persistent in finding alternative sources of input instead of waiting to interview the actual users (when it would have been too late).

The sociologists also wanted to minimize the number of times groups of researchers descended upon our user population. They were concerned for two reasons: (1) since this was a working business and our studies were necessarily time consuming, management might resent frequent interruptions; and (2) frequent interactions with the users would alter sociological data. In particular, we learned that users might resent too much attention. This was a change from the usual HCI belief that the more feedback and interaction with users the better. This infrequent access forced all of us to plan in detail what we were going to say, to do, and to test for since we had fewer opportunities to collect user information.

Coordinating Data Collection Process

The sociologists were interested in running longitudinal studies on how technology influences work process. They carried out detailed observations of the two Indigo offices prior to the installation of any technology. They then followed up with observations of the same people after the new system was installed and running. Questionnaires were distributed pre- and post-technology to determine communication patterns and the medium used (social network analysis), work patterns (nature of work survey), and tools and objects in the work culture (use of technology survey). The sociologists were concerned about flooding the users with giant questionnaires that would take hours to fill out and thus frustrate the users and affect their responses. They also wanted to control the administration of the surveys so that there would be minimal disruption to the Indigo work site. This meant that the interface team and the sociologists had to develop questions in conjunction with each other, limit the number and type of questions to be asked, and integrate the questions into a single survey form. Surprisingly, while difficult to negotiate, we found that this integration process was perhaps the most instructive and beneficial part of our collaboration.

The limit on the number of questions the interface design team could ask was problematic because very detailed design-related information was needed. It was not obvious to the sociologists why some of these questions were necessary (and vice-versa) - until the rationale was explained. This took some lengthy and occasionally tense meetings to reach a common understanding. In negotiating between the two teams, we discovered that some of the data the designers needed could be obtained from other studies the sociologists were planning to run, including a detailed survey of the technical objects in people's workspace and a time-geography study of people's work patterns throughout the day. To account for the remaining loss of data, changes planned for Version 2 of the interface were deferred to Version 3 with the hope of obtaining this information in the next round of questionnaire administration. Questions were asked only if the responses were needed at that point in time. Every question had to have a strong reason for remaining on the questionnaire. We also found that many of the interface design team's questions captured data that the sociologists were interested in, and that a careful rewording of the questions allowed them to serve both needs. Some of the sociologists' questions were even moved under the "interface category" since they seemed to fit better. The integration efforts also taught us a lot about piloting and validating our questions.

For the next design iteration, the interface design team needed an entirely new set of questions that reflected the new design issues now under consideration. In contrast, the sociologists wanted to find out whether prior information had changed and therefore repeated the questions asked in the first survey. The design team lamented the loss of valuable question space devoted to the repetition, and the sociologists were concerned that the new questions posed by the user interface personnel would affect the answers to their repeated questions.

Sharing Expertise

Prior to the installation of Version 1, a sociologist gave a talk outlining her detailed observations of the Indigo users and the site. This presentation gave all of us an understanding of who might benefit most from the technology, who the "early adopters" were, and what the climate in the field trial site was like. In particular, we came to realize that our future users had concerns about the "big brother" possibilities of the technology and about the potential for outsiders to invade their network [6]. (This also showed up to a lesser extent in the interviews of "similar" users.) In response, we adapted our training sessions to address these concerns and also modified the equipment installation to give users more obvious pull-the-plug capabilities.

The interface was individualized to support the different social roles that people were shown to play in the organization [1, 15]. It was also individualized to match different communication concerns at the two sites. In essence, as a result of the information gathered by the sociologists, four interfaces were built and installed rather than one single interface. For example, the "gatekeeper" individuals at each Indigo site were given special meeting management software that allowed them to set up meetings between the conference rooms of both places. In this way the "gatekeeper" role uncovered by the sociologists was preserved.

In turn, the interface design team was able to help the sociologists. The sociologists' technological objects assessment study labeled all the technology-based objects that appeared in each person's office, from the personal workstation to the electric pencil sharpener. The interface designers asked the sociologists to include a set of invisible objects: software packages. The sociologists were also asked to consider technology, in particular video equipment, in the users' homes. Home electronics equipment was added to the questionnaire. The software data was assessed and captured by a sociologist working with a computer science student.

Interacting With the User Population

The sociologists' concern about managing access to the users extended to other aspects of the project that interface designers might not worry about. User interface designers are not usually concerned about telling others "tales" about the organization that an interface is being designed for and the problems encountered in designing this interface. From the start of the project, it was stressed that no one inside the project, under any circumstances, was to mention the name of the company or identify users, thereby breaching their confidentiality. This was important in preserving the users' trust, particularly given the confidential and sensitive nature of some of the sociologists' data.

Additionally, the interface design team's plan was to conduct a simple training session when the system was installed, give out user manuals, and then wait until the next installation for another training update. The sociologists' ideas of interactions with users were more elaborate. They organized a more "formal" event around the introduction of Versions 1, 2, and 3, complete with coffee and doughnuts. This focus on creating a pleasant social atmosphere was done to encourage questions, feedback, and adoption. Also, the users were rewarded (albeit marginally) by a free and pleasant coffee break. Version 1's introduction was carefully timed to follow the discussion of the technology at a staff meeting. Version 2's introduction was postponed slightly because of recent staff layoffs, and Version 3's delivery was readjusted to accommodate vacation schedules of the Indigo staff. During training sessions, the sociologists observed user behavior closely. The interface designers were more concerned with ensuring that the system was understood and working smoothly.

The sociologists made the interface designers more aware of subtle issues which affected the training session. There was a difference in status between the two Indigo sites. Therefore, "higher status" people on the interface design team were assigned to do the training at the site with fewer people and where the president did not work. The sociologists felt this would enhance the users' satisfaction with the Telepresence system and would illustrate that their feedback was important. Finally, the sociologists also commented on the manuals that were distributed to the users and recommended changes that improved their consistency and clarity.

The sociology group and interface designers jointly established an Indigo user group. These meetings are intended to obtain user feedback, comments, complaints, and bug reports on the workings of the system. However, there was concern about overwhelming the Indigo users because the ratio of researchers to users was so high on this project. As a consequence, only one or two people from the interface group and one or two from the sociology group were permitted to attend the user meetings. Feedback from each meeting is summarized by the researchers in attendance and is then distributed to the rest of the interface group. Unfortunately, obtaining data through an intervening party makes it harder to recreate what the problems are and how they might relate to the interface design. Also, since many researchers with varying interests are trying to elicit information from the users, there is not enough time or opportunity to explore the many user issues at the meetings. The sociologists continue to run the user group meetings, but attendance by the interface designers has fallen off now that the last version of the interface has been installed.

WHAT CAN WE DRAW FROM THIS EXPERIENCE THAT APPLIES TO HCI

Others have discussed cross-discipline teams in the design of user artifacts and the problems they have encountered [7, 9, 12]. We wish to add to this knowledge by sharing our experiences not of combining computer science with HCI but rather by combining HCI with sociology. A number of discussions between the interface designers and the sociologists (including several focused on this paper) have provided us with some interesting insights about working in such cross-disciplinary teams and have given us ideas about how to make the exchange and cooperation work more smoothly. In this section we describe our findings as they relate to this cross-disciplinary collaboration. In discussing these issues we have tried to dissociate ourselves from the particular personalities and circumstances involved in our project although some effect from these factors is hard to avoid.

Raise the concern of "hard" science meeting "soft" science. We worried that this would be a critical issue in managing differing perspectives. It turned out that within-discipline methodological differences were more of a problem than cross-discipline perspectives. Because the HCI and sociology approaches were so different, merging methods was viewed as interesting, and there was less personal investment in defending a particular methodology. Within disciplines, there was more of a tendency to defend one approach. The sociologists indicated that they did not feel intimidated or less legitimate because of the hard versus soft science issue.

Share understanding of the terminology - preferably early on. Earlier "sensitization" seminars would have shown us the value of each discipline and helped us to understand more about each discipline's approach. As in most cross-disciplinary teams, we needed to understand each other's terminology. We often called a rose by many names only to later discover we all meant a rose. We also often thought we were all talking about a rose when we meant a petunia, a lily, and a dandelion. Beyond terminology differences, we also needed to understand the "definition of the situation." Each discipline's approach influences the way we view and describe users in a work situation. We should have had each person describe his/her view of the same situation to better understand the perspectives and contributions.

Have an alternate test user population. Relying on one population for data collection and negotiating for access adds additional challenges. Asking "real" users questions about the technology to be designed may change both their understanding of the technology and their expectations. This can ruin preliminary sociological data. Test populations can be substituted provided that (1) they resemble the target population; (2) the target population is eventually used for input and feedback; and (3) input from the target population carries the greatest weight. Using a combination of test and real populations provides timely information with minimal disruption to the longitudinal studies.

Apply data collected for other purposes. The information gained in pre-technology detailed observations proved invaluable in designing the interface and the training program. These data (e.g., status, who the early adopters were) were very different from the kind of data interface designers typically collect (e.g., how do you think this feature works). On-going observations provided clues that predicted which aspects of the system were most problematic (e.g., problems with parties in a private office forgetting they were calling an open office and engaging in too personal a dialogue). This allowed designers to prioritize features that specifically addressed these concerns. Again this was not usability data but detailed observational data documenting subtle social cues, side comments, mutterings, or communication patterns.

Be more careful with the user population. The sociologists taught other team members that often data collection from the users is intrusive and invasive and can have impact on the future use of the interface being developed. They made the design team members aware of user privacy concerns and confidentiality issues. We believed that protecting the confidentiality of individuals was important in our work and soon learned that protecting the confidentiality of organizations is equally appropriate.

Focus more on the roles people play in social organizations. The sociologists' work revealed which individuals served as gatekeepers and who the early adopters were in the organization. These individuals were used to facilitate the introduction of the system and to determine different communication functionality for specific users' interfaces. We also used them for follow-up information on what was happening with the system. Traditional user interface design typically ignores this type of information and groups users into much broader categories (e.g., novices and experts).

Involve the whole team in usability testing. We discovered that interface designers need usability testing data and sociologists need early knowledge of what features are coming out to design their questionnaires. Using sociologists as testers solves both problems and has all the advantages of participatory design. It also helps the sociologists to understand the design process and the designers to understand how people work (since many tester comments reflected the sociologists' backgrounds).

Coordinate and integrate the questionnaires. This proved to be one of the most difficult and beneficial exercises in our collaboration. It was in merging the questionnaires that the methods and the rationale behind the questions for each discipline became clear. Doing this means checking your entrenched views at the door, but the payoff is high.

Continue to follow up on the subject population. Although the interface designers were interested in the initial user group meetings, once the final version was designed attendance fell off. The sociologists are running the user group meetings until the end of the project. Interface design does not end at the door. We can learn from the longitudinal and follow-up studies of sociologists who continue to collect information from the users.

CONCLUSIONS

We believe that the above lessons reflect the value and contribution of different methodologies when designing and deploying a complex technology. Many of these lessons apply to other cross-disciplinary teams who need to establish common terminology and to understand the benefits of other ways of looking at users. Personalities do play a role, reflected most in the careful and necessary negotiation that must go on. Some of the measures adopted from sociology might not have been as useful had we been designing an interface for a single user; however, they were very applicable to our CSCW work in understanding and designing for groups.

Now that the project is nearing its end, it is clear that integrating HCI with sociology was a very valuable experience. We cannot say that the interface designers fully understand the sociologists, or why the sociologists do everything they do, but their work is valued. We hope that HCI researchers are being considered in the same light by the computer scientists.

ACKNOWLEDGMENTS

We particularly want to thank the three sociologists on our project, Gale Moore, Janet Salaff, and Barry Wellman. We also thank the two Carleton psychologists, Jo Tombaugh and Dick Dillon, for their continued design feedback. This project would not have been possible without the on-going programming efforts of Tom Milligan and the Telepresence Ottawa Engineering Group, or the direction of Bill Buxton. Funding from the following sources is gratefully acknowledged: the Ontario Ministry of Technology, which supported us through TRIO and ITRC, the Ontario Telepresence Project, and the Natural Sciences and Engineering Research Council. Finally, thanks go to the Indigo Company, which underwent the field studies conducted by our interdisciplinary team.

REFERENCES

1. Anderson, R., Button, G., and Sharrock, W. Supporting the design process within an organizational context. In Proceedings of ECSCW'93 (Milan, Italy, September 1993), Kluwer Academic Publishers, 47-59.

2. Bly, S. A., Harrison, S.R. and Irwin, S. Media spaces: bringing people together in a video, audio and computing environment. Commun. ACM 36, 1 (January 1993), 28-47.

3. Buxton, W. A. and Moran, T. P. EuroPARC's integrated interactive intermedia facility (IIIF): Early experiences. In S. Gibbs and A.A. Verrijn-Stuart (Eds.) Multi-User Interfaces and Applications, Elsevier Science Publishers B.V., North-Holland, 1990.

4. Buxton, W. A. Ubiquitous Media and the Active Office. In press. Commun. ACM (1994).

5. Curtis, B., Krasner, H. and Iscoe, N. A field study of the software design process for large systems. Commun. ACM 31, 11 (November 1988), 1268-1287.

6. Dourish, P. Culture and control in a media space. In Proceedings of ECSCW'93 (Milan, Italy, September 1993), Kluwer Academic Publishers, 125-138.

7. Grudin, J. and Poltrock, S.E. User interface design in large corporations: coordination and communication across disciplines. In Proceedings CHI'89 (Austin,TX, April 1989) ACM Press, New York, 197-203.

8. Haythornthwaite, C., Wellman, B., and Mantei, M. Media use and work relationships in a research group. In Proceedings 27th Hawaii International Conference on Systems Sciences (Maui, Hawaii, January 1994).

9. Hughes, J. A., Randall, D. and Shapiro, D. Faltering from ethnography to design. In Proceedings of CSCW'92 (Toronto, Canada, November 1992), ACM Press, New York, 115-122.

10. Mantei, M. and Teorey, T.J. Incorporating behavioral techniques into the system development life cycle. MISQ 13, 3 (1989), 257-276.

11. Mantei, M. M., Baecker, R. M., Sellen, A. J., Buxton, W. A., Milligan, T. and Wellman, B. Experiences in the use of a media space. In Proceedings of CHI'91 (New Orleans, April 1991), ACM Press, New York, 203-208.

12. Monk, A., Nardi, B., Gilbert, N., Mantei, M. and McCarthy, J. Mixing oil and water? Ethnography versus experimental psychology in the study of computer-mediated communication. In Proceedings of InterCHI'93 (Amsterdam, 1993), ACM Press, New York, 3-6.

13. Mrazek, D. and Rafeld, M. Integrating human factors on a large scale: "Product usability champions." In Proceedings CHI'92 (Monterey, CA, May 1992) ACM Press, New York, 565-570.

14. Newell, A., and Card, S. K. The prospects for psychological science in human-computer interaction. Human-Computer Interaction 1, 3, (1985), 209-242.

15. Scott, J. Social Network Analysis: a Handbook. Sage Publications, London, U.K., 1991.