Looking at the Whole Picture: A Case Study of Analyzing a Virtual Workplace
Michelle P. Steves, Emile L. Morse
Evaluating collaborative systems is much more difficult than evaluating single-user systems because of the number of simultaneous (synchronous) or intertwined (both asynchronous and synchronous) interactions. The types of data that can be gathered are diverse and include surveys, questionnaires, interviews, interaction logs from system instrumentation, and test subjects' logs. The completeness, quality, and granularity of each data source must be considered when attempting to evaluate a system. In addition, for systems that support both synchronous and asynchronous collaboration, evaluation has to probe different parts of the data stores to find evidence for each type of activity. In a previous study, we analyzed data from a field study to determine the usability problems in a groupware system. The fact that a large percentage (75%) of the system use was asynchronous surprised us and led us to suspect that the user-centered method employed in the evaluation might have been insufficient to detect problems in the face of such heavy asynchronous use. Re-analysis of the data using an artifact-centered approach revealed additional support for our initial findings regarding the usability of the groupware system. We found that tracking email and persistent, shared objects could be used effectively to add certainty to our prior analysis. In combination, we believe that user-centered and artifact-centered methods can yield superior usability analyses and might lead to a better method for assigning relative priorities to the problems discovered for application developers.
WETICE Workshop on Evaluating Collaborative Enterprises
Steves, M. and Morse, E., Looking at the Whole Picture: A Case Study of Analyzing a Virtual Workplace, WETICE Workshop on Evaluating Collaborative Enterprises, , USA, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=821611
(Accessed February 29, 2024)