
CIFter Overview

Overview of the CIFter project

The CIFter Project was created by The National Institute of Standards and Technology (NIST) to promote the investigation and development of ways to evaluate and benchmark methodologies for assessing the usability of websites. CIFter is a compound acronym. The CIF is the Common Industry Format for reporting the results of usability studies with users. The "ter" refers to the "test, evaluation, and report" process.

There are two goals for the project. The first is to identify web usability practices that are efficient, economical, and effective. This will be achieved by having a number of usability evaluation groups perform evaluations of the test website found on this CIFter CD. The data will be reported using the CIF, then compared, contrasted, and analyzed to produce a set of evaluations for the website. The second goal is to use the results of the evaluations to create a benchmark against which other, future evaluations of this CIFter test website can be compared. The hypothesis is that this benchmark can be used for training usability evaluation specialists as well as for comparing the effectiveness of different methodologies.

Participating in the CIFter study

The CIFter CD collection was created in collaboration with Wayne Gray of George Mason University and The Motley Fool. The CIFter test website is a frozen-in-time version of The Motley Fool website. Participant evaluators have volunteered to evaluate the site according to the instructions on this CD. The original letter of invitation can be found on this CD. Briefly, you are being asked to conduct a summative evaluation of The Motley Fool website by using the specially constructed extract included on this CD. Your testing methodology and the results of your test are to be reported using the Common Industry Format (CIF). Your results will be analyzed along with those of other evaluation teams to:

  • determine whether the website extraction method that has been used for the CIFter evaluation provides a useful method for producing benchmarks,
  • provide a baseline/benchmark against which to compare the results of various testing methods (e.g., remote, automated methods),
  • provide baseline data for measuring the quality of other usability analyses, especially in academic and training settings, and
  • validate the use of the CIF for reporting usability results.

CD contents and organization

The full contents of the CD, with a brief description of each item, can be found in CD_contents.html.

How to choose what you need for your evaluation

The following table shows the legal combinations of components for performing an evaluation of The Motley Fool website.

Component                      Method #1   Method #2   Method #3
TMFsite                            X           X
TMFsite_instrumented                                       X
Tasks                              X           X           X
Common Industry Format (CIF)       X           X           X
JWANDS delay server                            X
NIST tools                                                 X
  • Method #1 uses the native extracted website.
  • Method #2 adds the JWANDS delay server, which mimics the network delays experienced when using the Internet over a 28.8 kbps modem.
  • Method #3 uses the WebVIP-instrumented version of the website. This allows automatic capture of user interactions. The logs can be visualized by using VisVIP.
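To make the effect of Method #2's delay server concrete, here is a minimal sketch (not part of the CIFter tools, and not JWANDS itself) of how traffic can be paced to approximate a 28.8 kbps line. The chunk size and the idea of sleeping per chunk are illustrative assumptions; the real JWANDS server proxies live HTTP traffic rather than a byte string.

```python
import time

def throttled_chunks(data, bps=28_800, chunk=512):
    """Yield data in chunks, sleeping so the overall rate approximates bps.

    Toy stand-in for the pacing a delay server performs; 28_800 matches
    the 28.8 kbps modem rate mentioned above.
    """
    for i in range(0, len(data), chunk):
        piece = data[i:i + chunk]
        time.sleep(len(piece) * 8 / bps)  # time this piece would need on the line
        yield piece

# Pacing 2 KB at 28.8 kbps: 2048 bytes * 8 bits / 28800 bps ~= 0.57 s.
start = time.monotonic()
received = b"".join(throttled_chunks(b"x" * 2048))
elapsed = time.monotonic() - start
print(len(received), round(elapsed, 2))
```

Even this toy version makes the point of the method: a page that appears instantly from a local extract takes noticeable time at modem rates, which changes how users experience the site.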

General Instructions:

After browsing the above documentation, decide which of the Methods you would like to use for your evaluation. Follow the appropriate set of instructions given below:

  • Method #1
    1. Install TMFsite.
    2. Extract the CIF documents and read them thoroughly.
    3. Convert the prototypical Tasks into actual tasks for your study.
    4. Run subjects.
    5. Submit CIF report of results to cifter [at] nist [dot] gov.
  • Method #2
    1. Install TMFsite.
    2. Install JWANDS.
    3. Extract the CIF documents and read them thoroughly.
    4. Convert the prototypical Tasks into actual tasks for your study.
    5. Run JWANDS.
    6. Configure proxy server on user's machine.
    7. Run subjects.
    8. Re-configure browser to use non-proxy connection to the Internet.
    9. Kill JWANDS process.
    10. Submit CIF report of results to cifter [at] nist [dot] gov.
  • Method #3
    1. Install TMFsite_instrumented.
    2. Install WebVIP.
    3. Extract the CIF documents and read them thoroughly.
    4. Convert the prototypical Tasks into actual tasks for your study; create TaskID's.
    5. Customize userbegin.html; assign UserID's.
    6. Run subjects.
    7. Submit CIF report of results to cifter [at] nist [dot] gov.
    8. Submit user interaction data to NIST for post-processing.
    9. Run VisVIP on files returned by NIST. View pretty-print version returned by NIST.
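For Method #3, the WebVIP-instrumented site captures user interactions keyed by the UserIDs and TaskIDs assigned in steps 4 and 5. As a sketch of the kind of post-processing such logs support, here is a small example that computes per-task elapsed time. The log layout shown (ISO timestamp, UserID, TaskID, URL, tab-separated) is a hypothetical format for illustration; the actual WebVIP log format may differ.

```python
from datetime import datetime

# Hypothetical log layout -- the real WebVIP format may differ:
# ISO timestamp <TAB> UserID <TAB> TaskID <TAB> URL
SAMPLE_LOG = """\
2000-06-01T10:00:00\tU01\tT1\t/index.html
2000-06-01T10:00:42\tU01\tT1\t/quotes.html
2000-06-01T10:03:15\tU01\tT2\t/portfolio.html
"""

def task_durations(log_text):
    """Seconds between the first and last logged event for each (UserID, TaskID)."""
    first, last = {}, {}
    for line in log_text.strip().splitlines():
        ts, user, task, _url = line.split("\t")
        t = datetime.fromisoformat(ts)
        key = (user, task)
        first.setdefault(key, t)
        last[key] = t
    return {k: (last[k] - first[k]).total_seconds() for k in first}

print(task_durations(SAMPLE_LOG))
```

Per-task timings like these are exactly the kind of measure the CIF asks evaluators to report (task time, completion), which is why automatic capture is attractive for comparative studies.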

** Important **

  1. Only the Internet Explorer 5 browser is suitable for this evaluation. The site that was extracted employed MS-specific HTML. When viewed with other browsers, the advertising graphics are generally not visible. For this comparative evaluation, it is important that all users see the same material.
  2. Official participants in the CIFter evaluation and analysis study must report their results in the Common Industry Format (CIF). It is good practice to use the CIF in any case.
  3. For up-to-date information, consult the CIFter website; many related research issues in comparing evaluation methodologies are discussed there.
  4. More detail on the various automated usability evaluation tools developed by NIST is available on the NIST Web Metrics site.

CIFter website:

For late-breaking information, consult the CIFter website.


This CD was created by the web usability staff of the Visualization and Usability Group of the Information Access Division of the Information Technology Laboratory at NIST.

  • Dr. Sharon Laskowski -- Group Manager
  • Dr. Emile Morse -- Project Lead
  • John Cugini -- VisVIP
  • Paul Hsiao -- WebVIP
  • Joe Konczal -- VisVIP on PC and cleanup of TMF site
  • Natalie Moy -- JWANDS
  • Afzal Godil -- package testing

The production of the collection has only been possible because of the contributions and dedication of our collaborators. Thanks to The Motley Fool team for undertaking the time-consuming tasks of preparing their site, getting permission from their sponsors, and developing the task set. It was our pleasure to work with so many of the Fools!

  • Elizabeth Brokamp
  • Eric Burnette
  • Chris DiCosmo
  • Kerah Cottrell
  • Drew Dixon
  • Renee Gutshall
  • Chris Hunter
  • Marthe LaRosiliere
  • Carl Leubsdorf
  • David Ostroff
  • Jay Perlman
  • Dan Rosenfeld
  • Karen Salisbury
  • Allie Shaw
  • Tracy Sigler
  • Mia Tidwell
  • Shana Woomer

Professor Wayne Gray was a key contributor to this collection. In addition to his infectious enthusiasm, he brought terrific problem-solving skills to the table. We are grateful to his students, Jeni Paluska and Anthony Harrison, for evaluating an early version of the site and for their helpful suggestions.

Thanks also to Professor Andrew Sears for sharing his Wands tool and supporting data.

This CD and the NIST Web Metrics software tools are part of a research project developed at the National Institute of Standards and Technology by employees of the Federal Government in the course of their official duties. Pursuant to Title 17, Section 105 of the United States Code, this project's materials are not subject to copyright protection and are in the public domain.

The CIFter project is an experimental system. NIST assumes no responsibility whatsoever for its use by other parties, and makes no guarantees, expressed or implied, about its quality, reliability, or any other characteristic.

Call for Future Participation

If your organization has a presence on the web and would be willing to share your website's usability problems with the general public in future benchmark developments, feel free to contact us at cifter [at] nist [dot] gov.



Created September 29, 2016, Updated October 19, 2016