Participants: Alexis Scott-Morrison, Allan Eustis, David Baquis, David Flater, John Cugini, John Wack, Nelson Hastings, Sharon Laskowski, Tricia Mason, Wendy Havens, Whitney Quesenbery
Administrative Updates (Allan Eustis):
Discussion of Current and Planned Activities for the Next Generation VVSG
Whitney stated that the subcommittee should set a goal of completing all HFP-related edits to the VVSG 2007 before the March TGDC plenary meeting, with the exception of security-related usability material. After the edits are complete, the next step will be to work on security and verification; this is a hot topic, including potential upcoming Congressional activity. Third, HFP needs to address usability benchmarks, making sure it has the resources to say in July that it knows where it is going with particular tasks. Fourth is ongoing research. We need to decide which needs are immediate for the Next Generation VVSG and which test methods and suites can be worked out after the July delivery (we have started discussing these in greater detail). Requirements will be in this version, including general testing requirements; details of the test protocols can go in the test methods written after July.
HFP has been having continuous discussions about examining audit methods and usability in terms of some of the security methods that will apply to election officials and poll workers, in relation to software independence. These two topics can be bundled together.
The subcommittee is still struggling with specific method and implementation design, e.g., barcodes for easy recounts. Questions arise when you get to explicit implementation design: how do you write requirements that are high level but still useful?
The subcommittee reviewed its top four items.
John Cugini has almost completed the (usability) updates to the VVSG from the comments received at the December meeting. He will complete them and have them ready for discussion at the next HFP meeting. Any issues will be discussed via email beforehand.
Usability Benchmarks - It has become exceedingly clear that you cannot just pick a number based on a data set; you have to know how to analyze the data and whether you are using the right statistical methods. HFP has received expert advice on this matter and is close to saying "this is how we're going to count" and "this is how we plan to analyze." We may need roughly 100 subjects (ROM) to test. Tests are being conducted on error rates observed when people are instructed how to vote; we are not currently looking at voter satisfaction or timing.
Specific test methods were discussed in detail. There should be guidelines for expert review. Questions arose about what broad guidance means for technical documentation. Are accessibility benchmarks written as test methods? These will not be included in the July deliverable. Can addenda be added to this version of the VVSG? This will be brought to the EAC for an answer. John Wack will find out what the process for making additions and corrections will be; the TGDC may also want to raise this topic with the EAC. Allan pointed out that the document, once reviewed by the EAC, would be sent out for public comment. The final version will probably not be released until 2008. David Baquis wanted to know what the EAC's process for making changes was. (It is the same public review process as for previous versions of the VVSG.)
David B. felt that HFP should seek involvement from disability groups and accessibility consultants in its discussion of possible test methods. (This would have to be done through the EAC.) (There will be a cognitive expert at an upcoming meeting of the Telecommunications and Electronic and Information Technology Advisory Committee (TEITAC), a disability-related committee, who might be a useful resource.)
The next two teleconference meetings will cover end-to-end testing and usability benchmarks, respectively.
David B. suggested re-titling "end-to-end" to read "complementary accessibility". Whitney emphasized two major points: first, how accessibility and accommodation can work hand-in-hand; and second, that it is not enough for each component in a system to be accessible, you have to look at it as a whole system, from the voter's point of view, considering all the things a voter has to do. John Cugini asked whether, if performance (test-based) requirements for accessibility were written, they would be the basis for this kind of test.
How do you make sure the systems overcome the challenges? Part of the documentation a vendor should submit to the test labs is an explanation of their system and how it is intended to be used, so that it can be tested in its intended method of operation. The documentation should lay out the steps people are assumed to go through; if there are different possibilities, those should be laid out as well. This would help people who are constructing usability tests. John Cugini asked whether this would require adding new requirements; yes, it would.
At the next meeting, HFP should decide what specific issues involve STS coordination. Also what does it mean to be an "accessible" secure system?
Next meeting, Friday, January 26, 2007, 11:00 a.m.
[* Pursuant to the Help America Vote Act of 2002, the TGDC is charged with directing NIST in performing voting systems research so that the TGDC can fulfill its role of recommending technical standards for voting equipment to the EAC. These teleconferences serve the purposes of the HFP subcommittee of the TGDC to direct NIST staff and coordinate its voting-related research relevant to the VVSG 2007. Discussions on this teleconference are preliminary and do not necessarily reflect the views of NIST or the TGDC.]