
TGDC Transcripts - Meeting 8/17/2007

TGDC Plenary Meeting, August 17, 2007

TRANSCRIPTS

Technical Guidelines Development Committee (TGDC) Meeting
August 17, 2007, Via Teleconference
NIST Gaithersburg, Md.

 

(START OF AUDIOTAPE 1, SIDE A)

MR. EUSTIS:  Well, good morning everybody. This is Allan Eustis with the NIST Information Technology Laboratory Voting Team. Welcome to I believe the tenth plenary session of the Technical Guidelines Development Committee.

You will notice some different faces from our end because except for Dr. Jeffrey, all of the TGDC members are thankfully participating and they are doing so at their locations remotely. So this I think is an efficient way to do the meeting but it's a little different than we normally do.

I just want to quickly go through a few of the safety issues for our EAC and non NIST attendees at the meeting, and I'll just go real quickly because we've held this meeting in the employee lounge many times and if we have an emergency such as a fire drill or a real fire, you'll exit, take a right and there's the first exit, (unintelligible) on both sides, glass doors.

And we're going to take a break everybody at 1:30 Eastern Daylight Time for a half hour lunch, and come back and hopefully be very efficient and finish this meeting up as quickly as we can in the afternoon. We do have the potential to go to 5:30 p.m. if necessary.

A few preliminary matters, and this is for all the NIST folks too, please turn off your cell phones and pagers.

I mentioned the half hour break at 1:30 p.m. for lunch.

If you are calling in and you're on a cell phone especially, or you're in an office area, if you could mute your phone until you're recognized or unless you want to make a comment, and we're hoping that most of the members will use our electronic hand raising tool.

And this is again for all NIST as well as TGDC members who have gotten very good at this, please identify yourself as you ask a question or whenever you're called on to answer a question. This will help us when we transcribe this public meeting and make it available for the public.

With that I will hand the meeting over to Dr. Jeffrey to open up the plenary session. Thank you.

DR. JEFFREY: Well, good morning everyone. Welcome to the tenth plenary session of the TGDC.

I definitely appreciate everybody's attendance in either real space or virtual space, and to see exactly who's here I'm going to actually turn it over to the parliamentarian Thelma Allen to do the roll call.

MS. ALLEN:   Good morning. Williams.

DR. WILLIAMS: Here.

MS. ALLEN:   Williams is present. Wagner.

MR. WAGNER:  Here.

MS. ALLEN:   Wagner is present. Paul Miller.

MR. MILLER:  Here.

MS. ALLEN:   Paul Miller is present. Gale.

SECRETARY GALE:   Here.

MS. ALLEN:   Gale is present. Mason.

MS. MASON:   Here.

MS. ALLEN:   Mason is present. Gannon. Gannon. Gannon is not attending. Pearce.

MR. PEARCE:  Here.

MS. ALLEN:   Pearce is present. Alice Miller.

MS. MILLER:  Here.

MS. ALLEN:   Alice Miller is present. Purcell.

MS. PURCELL: Here.

MS. ALLEN:   Purcell is present. Quesenbery.

MS. QUESENBERY:   Here.

MS. ALLEN:   Quesenbery is present. Rivest.

DR. RIVEST:  Here.

MS. ALLEN:   Rivest is present. Schutzer. Schutzer. Schutzer is not responding. Jeffrey.

DR. JEFFREY: Here.

MS. ALLEN:   Jeffrey is present. We have 11 in attendance. We have enough for a quorum.

DR. JEFFREY: Thank you very much, Thelma.

So one of the things, I'll start off by saying that I'm actually very excited for a number of reasons. One, I'm hoping that at the end of today's session that we will have an approved package that we can forward to the EAC.

I'm also excited, both happy and sad at the same time, that I've announced that I am resigning from the U.S. government effective the beginning of September.

I've absolutely thoroughly enjoyed being both the NIST Director as well as being the Chair of the TGDC.

I will tell you with all honesty that when I look back on the time that I've spent at NIST and the number of important things going on, there are few issues that bubble up to the level of importance of what the TGDC has been working on. When you talk about affecting the core of our own democracy, there are few better examples than developing a voting system.

So I want to thank all the TGDC members for helping to put this package together, and also from a personal standpoint, for educating me as somebody who knew nothing about voting systems and making this a very enjoyable process.

So with that I will actually get the meeting going and just remind people that if you have access to the Internet that we're using a software package that allows you to just click on Raise My Hand and we'll then be calling on people in the queue.

If you are on a cell phone, or if the package is not working, or if you just care not to use it, just please feel free to just speak up.

And also we are very, very pleased to welcome the Chairwoman of the U.S. Election Assistance Commission, Deneta Davidson, here today, who will be speaking to us in just a few minutes. And it's always a pleasure to have Deneta here. Another aspect of this job I'm going to seriously miss.

I would also like to welcome a potential new member to the committee, Dr. Semcaner, who has been nominated by IEEE, and while the EAC is completing the required vetting process we've invited him to listen in and participate as an observer for this teleconference, so we look forward to him formally joining.

So at this time I would like to entertain a motion to adopt the August 17, 2007 TGDC agenda. Is there a motion to accept the agenda?

SECRETARY GALE:   Dr. Jeffrey, this John Gale of Nebraska. I move to accept the agenda.

DR. JEFFREY: Excellent. Is there a second?

MR. PEARCE:  Philip Pearce, I'll second it.

DR. JEFFREY: Okay, there is a motion to accept and second. Is there any discussion? Okay, hearing none, is there any objection to unanimous consent? 

Hearing no objection, the agenda is adopted by unanimous consent.

At this time I would also like to entertain a motion to accept the summary minutes of the last meeting of the TGDC. Is there a proposal for that?

DR. RIVEST:  Ronald Rivest, so moved.

DR. JEFFREY: Okay, a second?

MS. PURCELL: Helen Purcell, second.

DR. JEFFREY: Great. Any discussion? Okay, hearing no discussion, is there any objection to adoption of the meeting minutes by unanimous consent? Okay, this is going to go great today. Hearing no objection, they are approved by unanimous consent.

Since the May meeting, I must say, when I looked at the number of teleconferences that we held, the three subcommittees have put in an incredible amount of time and effort to dot the i's, cross the t's, and resolve all the issues, and it's amazing what you can extract out of volunteers. So again, thank you on that, and I'm very pleased to see the amount of progress that's been made.

So today we're going to start going through those and at the end of each of the briefings we'll entertain motions to adopt those sections and to approve those sections as part of the final report to the EAC.

Now I'll just also remind any of the public that they are free to make comments and position statements regarding this work on the NIST site, vote.nist.gov, where they will be posted, and all of the previous comments are already posted there.

You can go directly to vote.nist.gov, which has all of the information from the TGDC including the public comments that have been provided, and I recommend everyone look at those. They're actually quite interesting.

So at this time I would very much like to welcome Chairwoman Davidson for some comments?

COMMISSIONER DAVIDSON: Thank you. Good morning everybody.

And I want to begin by expressing my sincere appreciation to the TGDC and to NIST for their tireless efforts in developing this set of voluntary voting system standards. The rewrite of the VVSG was no small task but we're pleased to know that the work that you have done on this document will serve to improve confidence in elections and create a stronger democracy for all.

Also I want to take this occasion to thank Dr. Jeffrey for his leadership of this committee. Under Dr. Jeffrey's chairmanship this committee has written the most comprehensive and thorough set of voting system standards ever.

Dr. Jeffrey, your leadership and guidance will be missed. I thank you for your dedication and service to this community and this committee, and I wish you the very best in the future.

I'd like to take time to go through what happens once we get the VVSG at the EAC, when it's delivered, and I especially think this is important.

Now what I'm really talking about is a draft that we have put together because the NIST iteration is a complete rewrite of the 2005 VVSG.

We do recognize at the EAC that it's our responsibility to conduct a deliberative and thorough review of the document, providing as much time as necessary to receive comments and testimony from all major stakeholders before finalizing the NIST iteration. Keeping that in mind, the EAC has developed this draft and as I said, this draft can change, so definitely this is a draft. It could take longer than what we have planned, as sometimes things do, but I'm going to go through it as well as I can.

The EAC will receive the TGDC recommendations from NIST and the TGDC, we anticipate, in September of this year. Obviously this is next month.

From that time we feel it will take us about 14 days to put it in the Federal Register, and we are going to put that in the Federal Register for a 120-day comment period, not the 90 days the law requires. That time will take us from October, we anticipate, somewhere into probably next year, about February, and there's a lot that we feel has to be done during that time.

First of all, the TGDC has agreed that they will give training to a select group of individuals that's been chosen by the Board of Advisors and the Standards Board to receive education, thorough education, from the NIST individuals in October. October is when that's been set up so that is an ongoing training session for them.

Also during that time we plan on holding public meetings and in these public meetings we feel that we need to do several things.

We need to try to get a handle on, when we really talk about it, when we get into it, what the laboratories' cost will be for testing. NIST has said they would assist in this process.

As we go through everything that we are going to tell you about, not only the laboratories but then we also need to hear from the manufacturers, and talk to them, obviously also in a public meeting, about how long this is going to take once you know what the VVSG is. How long is it going to take you to design your new systems, to build them, to have them tested?

You know, we may not know the exact dollar amount in any of these areas, whether it's testing or building the equipment, but what we will hopefully know is how much more time because of the increased testing will it take the laboratories to actually get the documents through or the equipment through for testing.

Also we have to meet with the advocacy groups. We feel that we need to bring academics in to vet what has been done from the TGDC and we need to educate and we need to hear from our election officials.

So during these public reviews we feel that that information will be very useful and will also go into the comments in the comment period.

At the December meeting we will have a Standards Board meeting and that is a public meeting, Standards Board and Board of Advisors again, where NIST will provide education to the whole group and hopefully they will have comments to add to the review period.

At the close of the review period we hope to have some input from all these different groups that will be very important, and so we're going to turn around and return that back to the TGDC for their comments and discussions on it.

We feel that the TGDC recommendations will be done sometime in probably May to July of next year.

After that then the EAC will receive the document back from the TGDC, we'll say in October or any time up to July to October, and after that period then we are going to put it back in the Federal Register for another 120 days. That 120 days is the second period that people can make comments after the TGDC has reviewed it after the first comment period.

Again we probably will be meeting with some of our laboratories, the manufacturers, advocacy groups, election officials, and also the academic community. This is a very important document. We're trying to give it the time that it really needs to be put in place, to vet the document.

We feel that the close of that last comment period will be sometime in February of '09, so you can see that it's taking a great deal of time to actually put this through and get it vetted properly as we feel it should. It deserves that, and as we said, with this being a complete rewrite we have to do it and do it right.

The Commission then reviews it, we feel, in May through June, and then the guidelines would be published in the Register the final time as the law requires, and that would be somewhere in the neighborhood of June or July of '09.

So hopefully you understand that we heard from our election officials that they really feel that they've got an election upcoming, they've got presidential elections that are coming on in January and February so '08 wasn't the best time to really vet the document as much as they would like.

So we wanted to put it out initially, upfront, as soon as possible for that 120 days. We feel it's important to hear from the stakeholders, bring it back to the TGDC, and obviously meet with all the individuals so that we can get more input on how long it is going to take to design this equipment, and any type of a handle on what the cost will be.

So in moving forward we are trying to associate everything that is in detail with this document to try to come to some type of conclusion, to have the very best document to move forward with our voluntary voting guidelines.

So if there are any questions I would be more than happy to try to answer them but I know you want to get on with the meeting, but I do want you to know that we do see this as a very important start. This is the beginning. We see once we receive the document, we have to follow through the process very carefully. Thank you.

DR. JEFFREY: Thank you very much, Deneta. Are there any questions from any of the TGDC members? Okay, thank you very much.

I think we all recognize that this is just the beginning of the path and obviously I speak for the TGDC, that as comments come in, if there are things that we can do to help assess and understand the implications, I think there's a lot of interest both within the NIST staff and the TGDC members to help make sure that this is the best product possible. And it's easy for me to say since I'm leaving, so I get to volunteer everyone else.

COMMISSIONER DAVIDSON: We're going to look you up wherever you go.

DR. JEFFREY: So at this time with that threat, I would like to call on Mark Skall to provide us an update as to, what have you done since we last talked to you?

MR. SKALL:   Boy, thank God he's leaving. I'd like to start really by expressing my appreciation and giving thanks to the TGDC as a whole, especially to our Chair, Bill.

I think those of us who work here know how much the NIST Director has on his plate. If you ever look at his calendar, we are all just amazed that he actually can make it to work the next day.

Sometimes it can look overwhelming, and for Bill to stay up to date on the voting activities, and when we brief him he's just right there, he asks such good questions and he knows exactly what we've been doing. I don't know, he must do it in his sleep. It must be one of those machines that goes on.

So I certainly want to thank him and I want to thank the whole TGDC. I know that this is a difficult task. I know there are some contentious issues and there were very short time constraints for much of this.

Additionally, I know almost all of you, if not all of you, have full time jobs and you do this really in your spare time so this truly is a labor of love.

And I really do appreciate getting to know all of you and I certainly want to thank you for your technical insights, your dedication, and your patience in working with many disparate groups, including those crazy NIST scientists. So thank you.

Okay, I'd like to now speak about the voting activities and talk about what we've done since the May meeting.

I'd like to preface it by saying this is really a historic day I believe for the election community and the public.

In the next few hours the NIST staff will summarize various sections of the voluntary voting system guidelines and answer any questions you have for a document that we believe is essentially complete.

This document is the NIST iteration of the voluntary voting system guidelines. All that is left are some editorial corrections we need to make, especially to non-normative sections like the introduction.

So you will be voting as Bill said on this document before you, and the document I believe will have a profound effect on the next generation of voting systems in reliability, in security, in accessibility, and usability. So I think this is really an historic day.

Okay, so I will speak about the activities since the last meeting in May, which of course included continued research and the drafting of the VVSG in coordination with the TGDC, and I'll talk a little bit about the focus of the meeting and the strategy and the agenda.

So in general we responded to all of the TGDC issues and comments. First, Human Factors and Privacy, better known as HFP.

We completed the research for the usability performance benchmarks. Although it's completed with respect to the delivery of the document to EAC, we will continue this research to validate many of the performance benchmarks we put in. This is a completely new area, we're breaking new ground, and we want to make sure that we continue to do research in this area.

We've also made final updates to usability and accessibility requirements in Core Requirements and Testing, or CRT. We've completed the reliability and accuracy benchmarks research and made final updates to the accuracy and quality requirements.

In Security and Transparency, or STS, we probably made more updates than in the rest of the sections and it's really, really in good shape now. I think it reads really well but it has probably some more changes than the other two sections.

We re-drafted the e-poll book and externally networked activity requirements, and we've made updates to open-ended vulnerability testing and independent voter verifiable record related requirements.

As far as coordination with the TGDC, we've had 46 teleconferences since the last meeting in May. We've made numerous revisions and updates to the draft material and of course we've had many individual discussions with people on the TGDC, either in groups or individually.

The current draft, Bill, as of August 7th, I think is an incredibly impressive document. There are right now 1170 requirements, 570 pages, and a new format for improved readability.

I have been speaking at some conferences over the last few weeks with voting officials who had some concerns about plain language and readability and I have urged them to actually read this document. Many have then looked at it and have been really surprised at how well it reads.

It is a technical document. At the end of the day these need to be precise, testable, unambiguous requirements and those are very detailed, but certainly all the introductory material and all the material up until that point is really very readable and I want to commend the staff and the TGDC for what I consider to be really an extraordinary document.

The final technical editing we believe after this meeting will take approximately two to three weeks to get it to the EAC.

So again the aims of the meeting, this is the final TGDC meeting before the VVSG delivery to EAC in September, so we would like to get approval for the VVSG and have final editing instructions for NIST staff, and again it will take a few weeks to do these final edits.

So we're going to talk about the changes we've made since the last meeting. These will be high-level summaries. Then of course we will discuss whatever remaining issues that you would like us to discuss, and then ask for final approval and a resolution at the end of each presentation.

And the order will be STS first, Security and Transparency, Core Requirements and Testing, and Human Factors and Privacy. I think your agenda probably lists Nelson as speaking on STS. Barbara Guttman will speak on that in his stead. That's all I have. Thank you.

DR. JEFFREY: Thank you, Mark. Are there any questions for Mark? Okay, great.

At this point I would like to ask John Wack to then present an overview of the VVSG document structure. So, John.

MR. WACK:    Thank you. I just have a few slides to present and what I'll do essentially is just go over some of the major changes and describe at a high level what we have been doing over the past couple of months and some of the final document production plans.

And we also received a couple of comments, a couple of questions from Secretary Gale that we thought would be good to discuss as well.

And before I start I would like to add my thanks also to Commissioner Davidson. Working with her and the EAC has been very much a pleasure.

And I think almost exactly 23 months ago we actually started working on what we call VVSG 2007, and that was Dr. Jeffrey's first meeting out in Boulder. That was a memorable meeting in many ways. So again I think his support has been vital and this project is a success because of him.

Okay, with that, in a nutshell everybody here has been working very hard and very diligently. We received a number of comments from you at the last TGDC meeting, directly at the meeting. We also received a number of comments as well in the several weeks thereafter.

Everybody took those comments very seriously, responded to all of them, and then we had a number of telecons afterwards so we feel that we have addressed those in the document before you. We've made the changes you have requested.

We did a number of other edits simply just to make the document more readable, less daunting. Many people who looked at this we think may have been put off by some of the previous versions because perhaps it was a little too complex looking, looked a little bit too complicated. We hope the new version looks better. We put a few other things out there on the website as well, spreadsheets of requirements cross-referenced.

So after today, presuming that we have just an absolutely fabulous meeting and walk out of here with smiles on our faces, we expect that we've got a few more weeks of changes and reviews.

Commissioner Davidson talked to you a little bit about the public review process and in a way I think it's great that the TGDC version of the document is the version that's going out for the public review.

At the same time it puts more pressure on us at NIST because everybody is going to be looking at something that we directly produced, and it takes a long time and it is very difficult to go through the whole thing and make sure everything reads correctly, but it behooves us to do that so we'll be very busy after that.

And we plan to put out the two versions. One is the final PDF. We'll have that tagged for accessibility. The PDF version before you has a lot of hypertext links in it but I think everybody probably agrees that PDF is best for printing.

And so for onscreen viewing we're going to do an HTML version and we'll write other introductory material and so on and so forth.

We would also like to put out a small database of requirements with requirement language and fields in there that essentially would make it easier to find a number of different requirements that apply to different devices and so on and so forth.

With that, that is all I have there. I wanted to bring up a couple of things here. You know, you at home or wherever don't necessarily need to look at this, but I am bringing up a requirements matrix. We had three of these out there on the website underneath the PDF for the VVSG itself.

And Secretary Gale sent in a couple of questions. What external resources such as ISO, IEEE standards, so on and so forth were generally used as a basis for the requirements developed for the subcommittee report?

And, you know, apart from the telecons, several public hearings at NIST, we went to a cost of testing meeting with the EAC, a number of conversations with vendors, comments to the TGDC that we posted on the website. We've had a lot of consultation with various different groups.

A lot has happened over these past two years. Voting has been pretty controversial and we have tried as much as possible to stay in the middle and talk with as many different groups as we can, and get as many different sides of the story, and learn as much as we can. So we feel that we have done our work.

And up here you'll see that each of the requirements we have in these matrices are cross-referenced to a number of different sources. I'm scrolling down here rather quickly. You'll see VVSG 2005 appears prominently, but there are a number of other documents, a number of other standards that we've gone into as well.

Has there been adequate opportunity for the usual peer review of the underlying assumptions used to develop the requirements? I guess I kind of answered that in the first answer. We have consulted quite a bit with a number of different groups and we feel that we have listened to many different sides of the story so I guess the answer to that is yes.

With that I think unless there are any questions, I would like to go to another document which is the VVSG itself, and so what I'm displaying up on the screen is the PDF of the VVSG, and the people talking to you from now on will probably be talking about certain pages or certain requirements so I want to make sure we are all looking at the same documents.

So the TGDC members, you've got some documents mailed out to you on CDs. There is the PDF on the website. You can also download, or if you're looking at the printed copy you should all make sure that the first page says draft, August 7th.

DR. JEFFREY: John, there's a question. Secretary Gale, you have a question?

SECRETARY GALE:   Yes, I do, Dr. Jeffrey. With regard to the new document John, I want to compliment you and your team. The latest revisions have vastly improved the readability of the document and we're all very grateful for the hard work to accomplish that.

Of course I grew up in a paper era so in going through 580 pages, the Table of Contents is great but of course I regretted the lack of an index. And my staff, who are much more digitally based than I am, said well, that's an issue of searchability and on a digital document it's not an issue of a paper index. And so I obviously understand that now.

And I would like to encourage you at NIST to continue to develop additional tools such as the searchable database, which will certainly increase the accessibility to that document if you're looking for specific subjects or topics. That would of course serve the purpose of an index but would aid the election community and the public in finding specific things that they're looking for. Thank you.

MR. WACK:    Okay, thank you, and we'll take your comments very seriously.

To make sure we're starting off all on the same foot, the same page, again I just want to make sure we're all talking about the draft, August 7, 2007.

And with that I will turn it back to Dr. Jeffrey and talk to you all later. Thank you.

DR. JEFFREY: Thank you very much, John. With that, it's time to roll up our sleeves and start getting into the meat of the issues. So first Barbara Guttman will be talking about the Security and Transparency sections. Barbara, over to you.

MS. GUTTMAN: Thank you very much. I'll roll up my sleeves here because my jacket actually is lightweight and rolls up.

So the first thing you'll notice of course, I hope you notice, is that I'm not Nelson Hastings, I'm Barbara Guttman, and truthfully I'm not even an electronics engineer.

So that said, let me present what STS has been doing on its portion of the VVSG, the portions it's responsible for.

I'm going to try to give out as I'm talking either slide numbers, although I'm not sure if the version you all have has page numbers. I'll also try to give out some slide names to help you follow along, but if you get lost please speak up.

So the first two slides that are called agenda just list all the sections for which there were changes. Just one thing I did want to point out is that there was some reorganization of the document and that each security topic used to be its own chapter and now there are two chapters devoted to security.

The first is called Security and Audit Architecture and it contains three security topics, Security and Audit, Electronic Records, and IVVR, which if you're not quite familiar with the term IVVR, wait, I'll get to that. And the second one, which is General Security topics, that includes all the other security topics. So I just wanted you to make sure you were aware of that before I went over them.

So that's what I'm going to start with, a very easy topic first, which was some general modifications we did throughout all the security sections, which was primarily harmonizing language.

There was a lot of harmonizing language to make sure it was consistent with the definitions, and we also moved all the documentation material to the part of the standard related to the documentation, so that was really pretty straightforward.

Now I'm going to move on to Security and Audit Architecture, and once again this one also had general changes that were primarily scoping it to fit into these new chapters, which I'll talk about as I go through them.

The first thing we did was to eliminate a lot of duplicate material. There was a lot of duplicate material as we were developing things. When you're writing it, it's a lot easier to kind of put it in more than once everywhere you need it and then when you're done scope it back out.

There was a lot of explanatory information about election administration issues, which we had used to explain the requirements, but we now deleted that material as well.

Also in the audit section, I'm now in Audit, Steps Removed and Retained, as you recall in the Audit Architecture section we describe what kind of audits people might do so we could tell you what kind of requirements the system needed.

But some of these didn't actually result in actual requirements for either the system or the documentation so we went ahead and removed those also.

So the next section is Electronic Records. This one actually had probably the most extensive set of changes, having to do with harmonization, clarification, and removing duplication, although the actual content didn't really change that much. It looks really quite different from how it looked beforehand, but that was primarily to better coordinate with CRT and the work they do in requirements for reports versus audit records.

Now I'm on to a section where there really is sort of a major difference from how the May draft looked, which is independent voter verifiable records.

In the May draft we used the term voter verified paper records and what we did is we abstracted this concept up one level to recognize that it's really about an independent voting record.

And so we looked through the requirements we had for IVVR and we rewrote them to address any kind of independent record, and in this way you could have non-paper based systems that could conform to the VVSG. And this is consistent with TGDC Resolution 66. The (unintelligible) resolution did call for independent voting records. So that's kind of an important change.

This change itself did not actually result in changes to the VVPAT or the PCOS section. But if you go to the next slide, the second slide called independent voter verifiable records, I do want to point out, because we were working on this over the summer, that STS has one change that is beyond what is in the draft you have in front of you. If you look at the slide, it's slide 10, the second IVVR slide, we want to add one additional requirement to clarify that all these requirements actually have to refer to the same record.

So it doesn't particularly change the intent but I wanted to call that out specifically because it's different from what's in the document you're voting on.

We also did make some clarifications to the VVPAT section, where we clarified that paper records have to use OCR fonts, that you have to use a code book to interpret the paper record, that we tried to make some of the parameters more tunable by election officials, and, hopping over to the next slide, to clarify how you do verification for cut-sheet VVPATs.

We also made one change to the precinct optical scan section, which was, before, we had a "should" requirement to support batching of paper records, but STS decided to remove that because it was actually a little bit too complicated.

So that's what I have to say on IVVR.

And now moving on to the slide called Crypto, which is cryptography, which is slide 13. In Crypto we actually only made editorial changes but there was an important question that was asked and we wanted to make sure everyone knew the answer to this, which was: if you have a machine like a DRE that's actually supporting multiple precincts, are you going to end up needing more than one election key, and the answer is no, you do not.

So we wanted to make sure that was shared with you, and if any of you have other questions about the cryptography chapter we are pleased to answer them here at NIST, and feel free to just send us an e-mail and ask for a little tutorial because it is a little bit complicated.
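To make that point concrete, here is a minimal sketch, assuming a simple HMAC-style signature and hypothetical record fields (the VVSG's actual cryptographic requirements are more detailed than this): the precinct identifier lives in the signed record, not in the key, so one election key on a device covers records from every precinct it serves.

    # Illustrative sketch only, not language from the VVSG.
    import hmac, hashlib, json

    ELECTION_KEY = b"example-election-signing-key"  # hypothetical key material

    def sign_record(record: dict) -> str:
        """Return an HMAC-SHA256 signature over a serialized cast vote record."""
        payload = json.dumps(record, sort_keys=True).encode("utf-8")
        return hmac.new(ELECTION_KEY, payload, hashlib.sha256).hexdigest()

    # Records from two precincts handled by the same DRE, signed with one key.
    for rec in ({"precinct": "12-A", "contest": "Mayor", "choice": "Candidate 1"},
                {"precinct": "12-B", "contest": "Mayor", "choice": "Candidate 2"}):
        print(rec["precinct"], sign_record(rec))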

Another important change was in the set up and inspection chapter which is, as the architecture for the VVSG evolved, and as we developed the requirements to support software independence, and as we developed the system integrity management chapter, we realized that we do not need the requirement for a trusted interface anymore, that the security goals that that requirement was designed to meet are already being met by other methods.

So STS decided to remove it and that's a sort of important change. And when we made that change we were then able to refocus this chapter more on software inspection, and then also actually address software installation, which used to include software distribution, and the distribution requirements were really primarily for the test labs. So we moved those to the section that addresses test labs and that helped us to just refocus that chapter.

Now let me get to access control. There was a technical issue with access controls, which is, most people are familiar with operating systems that are really fairly robust, but often in election systems you also have systems that are built with what are called constrained operating systems, and we wanted to make sure that we addressed those within the access control chapter.

So we were working on this and we came up with some ideas for how to best address this, to make sure that -- basically election management systems need to have robust access control, but capture devices can actually do what we call role-based control.

But here's another place where we continued working on it after this draft was done so we actually have a change that we want to make to this draft.

It is on what is called slide 17, which is the second access control slide, which lists the language we want to add. This further clarifies what I just explained but this is the actual language we wanted to use and I wanted to call that specifically to your attention.

    And there is actually a third access control slide, which actually says we actually did a lot of work in making sure the scoping and language was correct.
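To make the distinction concrete, here is a minimal sketch of the kind of role-based control a constrained vote-capture device might implement; the role names and operations are hypothetical and are not taken from the VVSG requirements.

    # Illustrative sketch only: role-based access control on a constrained device.
    ROLE_PERMISSIONS = {
        "voter":         {"cast_ballot"},
        "poll_worker":   {"open_polls", "close_polls", "cast_ballot"},
        "administrator": {"open_polls", "close_polls", "export_records", "configure"},
    }

    def is_permitted(role: str, operation: str) -> bool:
        """Return True if the given role may perform the operation."""
        return operation in ROLE_PERMISSIONS.get(role, set())

    assert is_permitted("poll_worker", "close_polls")
    assert not is_permitted("voter", "export_records")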

Moving on to the system integrity management chapter, I already extolled the virtues of it. Mostly in this chapter we actually just did some re-scoping and harmonization.

    In communications we actually harmonized this with the access control chapter so that neither chapter allowed for remote administration of systems. We had a little disconnect there so we fixed that.

    System event logging, we also addressed this issue of what happens when you have a constrained operating system and we added some material to address that.

    In physical security we implemented the TGDC decision about locks and you can see the language. It's actually off of slide 22 called physical security. You can see the language we used for that.

And then in the pre and post test slides we clarified when labs do what we call the Test Lab Build, which was previously known as the Witness Build, and we also harmonized with CRT about how the test labs treat unmodified software, which is they have to acquire an independent copy of it to be used in their testing. And that's pre and post testing.

So I'm now on to open ended vulnerability testing, which was an area that was not terribly fleshed out in the May draft, so we've added a lot of material about open ended vulnerability testing to address the scope, focus, and priorities of the team, the team composition, the rules of engagement, flip over to the next slide, and the level of effort and the reporting requirements.

    And that's the end of my presentation so I'll pause for questions.

    DR. JEFFREY: Are there any comments or questions for Barbara on the STS section?

    SECRETARY GALE:        Dr. Jeffrey, this is John Gale.

    DR. JEFFREY: Yes. Go ahead Secretary Gale.

SECRETARY GALE:   I stepped out for a minute so I didn't know how to queue in so I hope this is acceptable.

    DR. JEFFREY: Absolutely.

SECRETARY GALE:   With regard to the cryptography, the embedding of a chip, I understand that that is current technology. The military uses it, the banking industry uses it, but it's a whole new concept for any election equipment hardware in terms of embedding an encrypted chip, and what I'm concerned about I guess are what are some of the consequences if those embedded chips fail.

So I'm interested in knowing the failure rate of such chips in terms of embedding that in equipment, either the small equipment like DREs and optical scans, precinct equipment. Do you have any sense of what the possible failure rate is?

    MS. GUTTMAN: Let me address that first by addressing one of your assumptions which is you described how embedded chips are becoming pervasive in various fields like (unintelligible). They have actually just become pervasive in all IT. This is how all PCs are going to come in the future. This is part of just where the entire industry is moving.

    I don't have specific failure rates but I suspect it's quite low. I am looking over at the Crypto team.

    DR. RIVEST:  This is Ron Rivest. Could I --

    MS. GUTTMAN: Well, why don't I look over at the Crypto team on the phone.

    DR. RIVEST:  This is Ron Rivest. Secretary Gale, this is a great question, but in terms of technology this is absolutely routine technology. This is a digital integrated circuit just like all the other integrated digital circuits on the motherboard there and there is no reason whatsoever to expect that there should be any impact on reliability.

    Nonetheless, that said, of course this will go through the usual testing and if vendors have any particular insights into those issues that I don't or others do, it would be good to hear that, but I don't expect any issues on the reliability side.

    I think issues that arise here are more just the management side, and making sure that all these chips are (unintelligible) where they are and what keys they're using.

    SECRETARY GALE:   Thank you, Ron. I appreciate that. This is Secretary Gale. And you understand my concern because with the embedded chips it may not be cost effective if there is a failure rate such that there can be reasonable expectation for a precinct to have a piece of equipment fail, and then the need to replace that equipment, and the flexibility needed of election officials to be able to move equipment once it is precinct specific with an embedded chip. That failure rate has a big impact on election administration.

Secondly, it also has a big impact on cost and that's who's responsible for repairing that or replacing it if it even could be replaced in either precinct optical scan equipment or in the central scan equipment which is much more expensive to buy.

    DR. RIVEST:  This is Ron Rivest again. Yes, those are good points and we'll have to see how this sorts out, but my belief is there is absolutely no reason to believe that this should have any impact on the reliability of this equipment. As I said, these are standard integrated circuit parts.

    SECRETARY GALE:   Thank you.

    MS. GUTTMAN: Any other questions?

MS. DAVIDSON: This is Deneta Davidson with the EAC. When you were going over the independent voter verifiable record, it kind of went a little fast, and this is one of the areas our election community has really been very interested in.

    Right now what we have is paper, and I know that we've discussed in the IVVR, does that still require paper -- but I know it doesn't quite require paper, but can you give us some examples of what -- I mean I know there are other things that could be developed in the future but is there something that you can kind of go over and just --

(Tape Interrupted While Changing Sides)

(END OF AUDIOTAPE 1, SIDE A)

(START OF AUDIOTAPE 1, SIDE B)

MS. GUTTMAN: There really isn't something right now that's an obvious solution to can you have a non-paper IVVR system, but just because there isn't one today doesn't mean the clever folks out in industry and academia aren't thinking of clever things and that's what the opening is for. It's for things that people haven't really figured out yet.

And there's such opportunity. There are just so many great ideas out there. And I know some of the people from your office went to like Vote.com and places. People are thinking creatively out there about better ways to do things and I personally find that very exciting.

    DR. JEFFREY: Any other comments, questions? Okay, hearing no other comments or questions, I would actually --

    MS. GUTTMAN: It goes back to John Wack.

    DR. JEFFREY: Okay, the Chairman is being corrected. It goes back to John Wack. John.

MR. WACK:    Thank you, Barbara. It's back to Wack.

If I can grab that from you, just to break things up a little bit, because I have a copy of my slides and other people don't.

I'm just going to go through parts of the document here and quickly I'm just going to add to Barbara's answer by going to a particular page in the introduction. I'm proud of this picture because I drew it myself and I'll try to blow it up here. I'm on page nine of the introduction.

    So basically the way things work, the conformance clause section, chapter two of part one, the conformance clause chapter describes in a sense what's needed for voting systems to conform to the VVSG, and to make it clear in that section we say that voting systems --

    DR. JEFFREY: I'm sorry. Whitney, I believe you've got a comment or question.

    MS. QUESENBERY:   I do have a question. Chapter nine, page nine -- I'm sorry what page is that?

    DR. JEFFREY: Whitney, could you hold on just a second while we turn up the volume? We're not hearing you. Okay, could you try again, Whitney? Sorry about that.

    MS. QUESENBERY:   This is Whitney Quesenbery. I asked that when we (unintelligible) documents that we give not only the internal page reference but the PDF file page so that those of us who are behind the (unintelligible) without any visual reference at all can keep up with you. You turn to it, then you tell us where you are, and you immediately begin speaking.

    DR. JEFFREY: So you're looking for just enough time to catch up, is that what you said?

    MS. QUESENBERY:   Yes, that's (unintelligible).

    DR. JEFFREY: Okay, could you repeat what page you're on and then after that let's add a 15, 20 second pause to let people actually flip through to get to the right page.

MR. WACK:    Okay. On the PDF file I'm looking at page number 59, and in the document itself, if you're just looking at the page numbers at the bottom, I'm looking at the introduction on page nine.

    And the reason I'm looking at this particular slide is just in a graphical way I wanted to describe the changes and the clarifications we made to the conformance clause, which is simply to make clear that systems that meet the definition of software independence can conform to the VVSG, the draft VVSG, and that we have two methods for that.

And to make it clear, one for systems that use records, independent voter verifiable records, and the other the innovation class.

And to reiterate what Barbara was saying, new types of systems using new forms of independent voter verifiable records have requirements already in the VVSG, therefore they could conform to the VVSG. They would not have to use the innovation class.

So the innovation class is really for new innovative voting systems. It could be for end-to-end cryptographic systems, things of that sort. Systems that use independent voter verifiable records do conform to the requirements in the VVSG. So that is the clarification we made there.

I think I will go back to the slides at this point because I think probably it's easier for people to follow along there, and I'll talk about two other changes, and these are really in the core requirements area, not in the STS chapters.

So the first one has to do with data export, and in chapter six we used to have a section called integratability. We expanded this to talk about data export as well. Integratability in a sense kind of means no huge barriers to making two components integrate with each other, and sort of on the road to being interoperable.

So we have general requirements in there basically saying that voting systems shall be integratable, voting devices shall be integratable, let me correct myself.

But also we needed to handle the question of what format electronic records should be in when they're exported, and we made it clear that they should be in a publicly documented, open, non-proprietary format.

Vendors shall include a source code program that shows how these records can be read, and we do have a requirement in there that essentially is a strong recommendation that a common consensus-based format should be used. A proposal by an IEEE subcommittee that we have followed, and also the OASIS Election Markup Language, are two common consensus-based formats that could be used. So those are the requirements there, based on comments from the last meeting.
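As a rough illustration of what reading a publicly documented, non-proprietary export could look like, here is a minimal sketch; the XML element names are invented for the example and are not the IEEE proposal or the OASIS EML schema.

    # Illustrative sketch only: parsing one exported record in an open XML format.
    import xml.etree.ElementTree as ET

    EXPORTED_RECORD = """
    <CastVoteRecord precinct="12-A">
      <Contest name="Mayor">
        <Selection>Candidate 1</Selection>
      </Contest>
    </CastVoteRecord>
    """

    def read_record(xml_text: str) -> dict:
        """Parse an exported cast vote record into a plain dictionary."""
        root = ET.fromstring(xml_text)
        return {
            "precinct": root.get("precinct"),
            "contests": {c.get("name"): [s.text for s in c.findall("Selection")]
                         for c in root.findall("Contest")},
        }

    print(read_record(EXPORTED_RECORD))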

    And then in the other area, which is chapter seven, which is requirements by voting activity, there we for the first time introduced requirements for electronic poll books.

We had a subsection there that was on ballot activation and, in an effort to more clearly address privacy related requirements and requirements related to networking for electronic poll books, we broke these out into two separate areas and added requirements essentially to strengthen privacy, because an electronic poll book is basically a voter registration database and we want to make sure that voter information does not somehow leak over to a DRE, or a VD-Pad system, or whatever system is being used, and we also wanted to ensure that records can't be aggregated to violate privacy.

In other words we don't want records from a DRE combined with records from an electronic poll book put together to show how people voted.

    The other thing is that we added requirements to permit electronic poll books to make external connections to remote voter registration databases. We added some requirements to improve security.

In other words there must be some sort of a firewall, things of that sort. So those are the two areas that we added outside of the STS chapters. And that is it. With that, are there any questions?
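As a rough sketch of the kind of allow-list behavior such a firewall might enforce, here is a minimal example; the host name and port are hypothetical, and the VVSG states the requirement at a higher level than this.

    # Illustrative sketch only: permit outbound connections from an e-poll book
    # solely to the remote voter registration database.
    ALLOWED_DESTINATIONS = {("vr-db.county.example", 443)}  # hypothetical endpoint

    def connection_allowed(host: str, port: int) -> bool:
        """Allow only connections that appear on the allow list."""
        return (host, port) in ALLOWED_DESTINATIONS

    assert connection_allowed("vr-db.county.example", 443)
    assert not connection_allowed("update.vendor.example", 80)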

DR. JEFFREY: Any questions or comments? This would wrap up the discussion of Security and Transparency. Yes, Secretary Gale, go ahead.

SECRETARY GALE:   I'm sorry. The lighting I guess for getting ourselves up on the queue --

    DR. JEFFREY: You just popped up.

    SECRETARY GALE:   Okay. May I proceed?

    DR. JEFFREY: Yes, sir, please do.

    SECRETARY GALE:   Thank you. I had a couple questions for John.

    Obviously your team and subcommittee has worked very, very hard to develop a voting system that is essentially impervious to attack and it's called the Gold Standard by people in the election community, that this is kind of the ultimate set of security standards.

Do you consider it to be impervious to attack by just current standards or do you consider it to be impervious to attack from methods of attack developed in the future? In other words, say maybe now under the standards that are being proposed is it safe for (unintelligible) of the future under computer industry standards?

    MR. WACK:    Well, it's sort of a short and a long answer. The short answer is no, security you can never guarantee that it will be impervious to attack and security in a sense always plays a catch up game and you tend to know about known threats, threats out in the future, you can take a guess at.

    To a certain extent in security, and I think what we've tried to do is where it seems appropriate to do so, we've tried to over build.

So for example, the designers of the Brooklyn Bridge didn't, let's say, know enough about engineering or have the materials to build a modern day bridge, so they over designed to a certain extent, and today it carries lots of traffic and it's a strong bridge.

    And in the same way for example with electronic poll books, our original proposal was that it be best they not connect up to external networks. We know that external networks are difficult to secure. There will be new sorts of problems, new sorts of vulnerabilities, things of that sort.

So given that there was a need to actually hook up to external networks, we then throw other things in there such as a firewall that we hope would block out most traffic, most threats, and only allow in the information that needs to come in.

But the answer is no, security is always a catch up game. At the same time we do think we've done a good job. We're very familiar with many of the threats out there affecting IT technology, especially in the network area, so --

    DR. JEFFREY: And John, I think Ron Rivest would like to add something. Go ahead, Ron.

    DR. RIVEST:  Yes, thank you. Ron Rivest. Secretary Gale, that's an excellent question and John's response is a good one.

I'm hearing echoes, can you hear okay?

    DR. JEFFREY: Yes, you're coming across fine.

DR. RIVEST:  Okay. I'd like to amplify that a bit. I'd like to stress that the steps that we've taken here in the Security and Transparency subcommittee in no way can be viewed as the last word or as building an impervious system.

These are reasonable security steps to mitigate many of the known risks and to move us forward in the security catch up game as John calls it, but there are many threats that are not answered by this.

There are things that need to be answered by procedures, and there are on the horizon, new classes of voting systems that we hope the innovation class will encourage our vendors to come forward with.

Some of those, known as the end-to-end category of voting systems, have properties that are not exhibited by any current systems and allow voters to achieve even higher levels of confidence.

So this is a major step forward in the security of voting systems but it's in no way a gold plated solution or one that should be considered impervious. It's a set of reasonable measures to achieve a significant improvement in security.

SECRETARY GALE:   Thank you, Dr. Rivest. If I could ask a follow-up question. In light of the fact that these standards will become official standards for this industry to design and build to, again until maybe 2009, 2010 or later, is there enough, I guess as John Wack was saying, enough design built into these standards to carry it to that date and beyond that date? In other words, is there kind of a predictable future of this covered technically for the next five or ten years in terms of security?

    MR. WACK:    I'm pausing. I wasn't sure whether Ron wanted to answer that.

    DR. RIVEST:  Why don't you go ahead, John.

    MR. WACK:    Well, we think yes. One of the main reasons is because we have moved towards software independence in this particular version, primarily because of the difficulty of testing systems.

As time moves on and we get a better handle on how to test very large complex systems such as voting systems, we think that we will be able to keep pace with new threats, new problems.

We expect to have a very good set of test suites out, roughly 2009, 2010, that will work with the VVSG, and in good testing that's where we think we will catch most of the problems. So yeah, we do think that we are moving along, keeping pace.

    DR. JEFFREY: Secretary Gale, I believe you've got some additional comments.

    SECRETARY GALE:   Yes, Dr. Jeffrey. I don't know if that's my system or somebody else's.

    DR. JEFFREY: It may be your system but we're fine with it.

    SECRETARY GALE:   Okay. Well, thank you. I appreciate you addressing that issue.

I also had some questions on the innovation class. It's been a very exciting addition to this iteration and I compliment Dr. Rivest and the committee for incorporating it, but I was really concerned in looking over not only the short resolution but the broader one as well, because the standards that the review committee is supposed to use seem very broad, very subjective, and then there's really nothing that defines that review committee or any suggestion on that.

Now maybe that's totally left to the EAC but with such a subjective and broad group of criteria it becomes highly critical that the review committee be a committee of almost a blue ribbon panel that are very carefully selected and that are going to be able to take those broad standards and convert them into objective criteria. So that was one of my concerns.

I like the idea, but however that review committee is defined is going to be critically important, or whether the process is stifled before it begins, or whether it's going to be open enough to allow vendors to move toward the prototype stage. Does that make sense?

    DR. JEFFREY: Whitney, do you have a comment on the question or do you want John to answer the question first?

    MS. QUESENBERY:   No, let John answer the question.

    DR. JEFFREY: Okay, thanks.

    MR. WACK:    If I may, I was actually looking over at Commissioner Davidson who wanted to add a few things.

COMMISSIONER DAVIDSON: I just felt like I needed to say that the EAC really understands the concerns, and as we move forward in this we feel that definitely we didn't want criteria put into place that might keep a manufacturer from designing something in the future that might be the answer, not knowing what technology is going to bring.

Obviously we know there has to be procedures and everything put into place, but if it's not in the VVSG we felt that it would be better being in a procedural document that has to be vetted obviously.

We do very open processing in our office but we can change that as we move forward because this is a brand new idea. We know that it's going to take vetting. We know it's going to have to have that blue ribbon committee, but I wouldn't want to pick that today because there might be somebody to step forward in the future that we think would be great.

So we just felt that it shouldn't be in the VVSG, that it gives us more ability and flexibility in the future to work and hope they come up with a great product that everybody will like.

I didn't really give you a lot of answer, but that's kind of how we feel at the EAC, is I think we really want that flexibility right now and being able to change the document without having to come back to the TGDC and go through a vetting of a document that would take possibly years to change something to keep up with technology.

    SECRETARY GALE:   Thank you, Commissioner Davidson. This is John Gale, Nebraska. I think that answers my question.

I think what you're saying is the EAC will address this as a procedural issue, maybe in a separate set of criteria other than the technical guidelines.

    COMMISSIONER DAVIDSON: Donetta Davidson again. And yes, you're exactly right. I think if you look at what we have done with our laboratories in setting up procedures that were vetted in public meetings, you'll understand that we'll do exactly the same thing with this type of process moving forward, definitely with the innovation class.

    SECRETARY GALE:   This is John Gale, Nebraska. Thank you, Commissioner.

    DR. JEFFREY: Whitney, I believe you have a question or a comment.

    MS. QUESENBERY:   Yes, this is Whitney Quesenbery. I have much the same questions about the OEVT, which we went over I think in about 30 seconds. I know there's some questions about how we --

    MR. WACK:    Well, fire away.

    MS. QUESENBERY:   Well, how the standards to which the OEVT will be conducted will be created, and I suppose the other thing is (unintelligible) weeks is a long time and does anyone have any idea what that costs?

    MR. WACK:    If I may, I'll ask Alyse Clay-Jones to address that, who is really the primary author there.

    MS. CLAY-JONES:   Hi, I think that you had two questions for us, one about standardizing the open ended vulnerability testing, and the other about the length of time, is that correct, Whitney?

    MS. QUESENBERY:   (Off microphone). Yes, this is Whitney. I mean the open endedness of one (unintelligible) brings up some interesting curricular questions about the (unintelligible), and the question that we talked about the last one is how do we decide that something is sufficient (unintelligible).

    MS. CLAY-JONES:   I don't know if you had a chance to look at the specific requirements that we've enumerated but we have made attempts to set up some boundary conditions, some rules of engagement for the OEVT team so that we get as specific as possible about what it would take in order to actually fail based on the open ended vulnerability test.

    So if you take a look at the specific requirements you'll see that there are some boundaries on the team.

    MS. QUESENBERY:   So this is new material that's been added?

    MS. CLAY-JONES:   Yes, ma'am.

    MR. WACK:    I'll just add my two cents to that too. Sometimes I think of open ended vulnerability testing as a little bit like network penetration testing, and network penetration testing is something that evolves as various vulnerabilities are discovered, and new threats and new techniques.

And I think the VVSG has good requirements in it for the basis of open ended vulnerability testing that deal with staffing and a number of basic items that are kind of fair game for open ended vulnerability testing, but it will probably be something that will evolve as labs become better at it.

And we would expect that labs would cooperate, share information to a certain extent on vulnerabilities with voting systems to ensure that as much as possible one lab will do basically the same job as another lab will do.

    DR. JEFFREY: Are there any other questions or comments on Security and Transparency? Okay, hearing none, I believe all of you have copies of the draft resolution that outlines the specific chapters and sections of the areas that we're talking about for Security and Transparency. It's labeled as Resolution 06-07, STS VVSG Sections, Final Approval.

And for no other reason than perhaps symbolically, I'd ask whether the co-chairs, Ron Rivest and Helen Purcell of the STS, would be interested in proposing this resolution.

    DR. RIVEST:  Ron Rivest speaking. Yes, so moved. I'd like to cosponsor this resolution.

    DR. JEFFREY: Helen, would you like to second it?

    MS. PURCELL: Yes, sir I would.

    DR. JEFFREY: Excellent. So let me read it for the record for those who are listening in but not having access to the written material.

    There's been a proposal and it has been seconded. It says the TGDC grants final approval for the Security and Transparency sections, part one, chapter two, section 2.7, chapters four and five all, chapter six, section 6.6, chapter seven, section 7.5.1, part two, chapter three, section 3.5, chapter four, section 4.3, part three, chapter three, section 3.4, chapter five, section 5.4 as part of its second set of VVSG recommendations to the Executive Director of the EAC, subject to editing as instructed by the TGDC at this meeting, and final review by the Chair of the TGDC.

    I now recognize Secretary Gale. Have you a comment or question?

    SECRETARY GALE:   Dr. Jeffrey, I do have one and I guess it concerns you. This resolution talks about final review by the Chair of the TGDC. Obviously that currently would be you and I'm hoping that would be accomplished before you were to retire from government service.

I certainly have fantastic confidence in your ability to bring it all together and complete that final review but you start talking about interim chairs or temporary chairs, that concerns me a little bit.

    DR. JEFFREY: I know that my staff, and we've chatted about this as recently as yesterday, will be working as hard as possible to have everything to me before I officially punch out, and we're certainly committed to do that. Certainly what I would plan to do, and I would encourage this, is that if by some chance they don't get it to me by the time I punch out, the process would entail that any changes, even minor ones, even happy to glad, commas replaced with semicolons, will be in change mode and would be posted on the website prior to the signature of the Chair of the TGDC, so that everyone would get to see that.

Obviously if there is anything more than a simple non-substantive change, that would obviously go back to make sure that everybody is on board with that.

So all I can do is say that the staff is committed as much as possible to get to the end of the race before I do, and I certainly encourage that.

    SECRETARY GALE:   Thank you.

    DR. JEFFREY: Any other comments or questions? Okay, if not, is there any objection to adopting this by unanimous consent? Okay, hearing no objections then this is adopted. Resolution 06-07 is adopted by unanimous consent and my heartiest congratulations to the Security and Transparency Subcommittee for a job well done.

    With that, we are a little bit ahead of schedule. I would like to actually if the Core Requirements team is prepared, I think CRT is next, right? Yes.

So I'll ask David Flater after he puts his jacket on, if he could give us a review of the CRT section. So as soon as we get the correct presentation up on the screen David will start. So for those who are listening in, sorry for the time delay.

    MR. FLATER:  Sorry for the delay. This is David Flater of NIST going to present about changes for the Core Requirements and Testing Subcommittee.

    In a previous presentation John Wack discussed two sections that were previously tagged as CRT sections. This was the integratability section, part one, section 6.6, and the ballot activation section, which is part one, section 7.5.1. Those sections do show changes that were made by the STS subcommittee.

    Apart from those sections, all of the other material tagged as CRT reflects neither changes in technology direction nor the addition of any significant amounts of brand new material relative to what was discussed at the May meeting and previous.

    So what I'm looking at instead is a large number of minor changes. In fact, because some members of the TGDC made a line by line review of the spec, there were in fact 282 issues within the CRT material.

However as I said, these were all relatively minor and I can say that the vast majority of these were in fact addressed head on or forwarded to those who can address them, and the ones that remain are sort of loose change if you will.

They are sufficiently minor that if in fact we never get back to them it won't be a tremendous loss, but we do hope to clean up those loose ends in the time remaining to us.

    Next slide. Now I'm going to provide some highlights, the most significant of the relatively insignificant issues if you will.

I've been advised both to cite part and chapter numbers and also to wait for people to be able to find those, so I am going to address these bullets in the order that they appear in the document and give people time to find those sections.

    The first one in document order would be software engineering practices. This is part one, section 6.4.1. This section benefited from a very careful review in which folks went requirement by requirement looking for issues and it was found that some of these requirements were over specified, some were under specified.

There were other minor adjustments and polishing that needed to be done to these requirements so this section reflects a fair number of those kinds of changes, but again, no major changes in technical direction or huge surprises.

    The next highlighted issue would be back in part three, section 5.1. As of May the section having to do with hardware testing, otherwise known as shake and bake tests, had been unintentionally omitted from the draft. That omission has now been corrected.

    The significant portions of the material from VVSG '05 that were to be carried over, plus modifications to that material that had been previously discussed in CRT and with the full committee now appear in the draft. Removed were some details that have been considered more in the scope of test methods work to proceed afterwards.

    The next highlighted change would be part three, section 5.2.3. This is a functional testing section in which the requirements on volume testing appear. The change here was to provide more parameters for the volume testing of optical scanners. There was some question left by the requirements as they were previously drafted regarding the practice of re-circulating paper ballots through an optical scanner as part of the volume test.

Neither extreme seemed appropriate here, neither requiring that all of the ballots be unique for a huge volume test nor allowing unrestricted re-circulation of ballots. So a compromise was struck based on the number of paper ballots that it was thought feasible to produce as part of a volume test of the scope that was envisioned, and corresponding parameters now appear in those requirements.

    Fourth highlight appears in part three, section 5.3.1. This is the explanation of the test method used for benchmarking of reliability and accuracy of voting equipment.

    The test method itself has not changed since it was discussed in May. What has changed is the explanation of that method. The previous draft was found to be confusing so it's been rewritten in different language to attempt to clarify.

    Finally there's a highlight that I suppose is most applicable to Appendix A, the terminology section, but also applies throughout the draft, which is that frequently misconstrued terms have been replaced with more explanatory wording.

    We had numerous discussions during the TGDC telecons to clarify these problematic terms and substitute terms that were found to be more transparent.

    Finally throughout the specification, as I said there were 282 issues, throughout the specification there were many minor changes for those many issues to change this word to that, things of that form to polish the requirements that had previously been drafted.

    And with that I will conclude and ask if there are any questions.

    DR. JEFFREY: Any comments or questions for the CRT or David? Whitney, I'm sorry you just popped up in my machine. Whitney.

    MS. QUESENBERY:   (Off microphone). This is Whitney. David the last slide, which is open issues and the notion about (unintelligible), I just wondered (unintelligible).

    MR. FLATER:  We were going to skip that.

    MS. QUESENBERY:   Okay. Are these all final edit things?

    MR. FLATER:  Yes, those are the minor inconsequential issues that I mentioned.

    MS. QUESENBERY:   Thank you very much.

    DR. JEFFREY: Any other comments or questions? Okay, hearing none I am going to again for symbolic reasons ask if the co-chairs of the CRT, Dan Schutzer and Paul Miller would like to propose a motion. There's a draft resolution 07-07 that's in the handout, that if you want I'd be happy to read, but either Dan or Paul, would you like to propose that?

    MR. MILLER:  This is Paul. Yes, I would like to propose that resolution.

    DR. JEFFREY: Thank you. So motion has been proposed. Dan or anyone else would you like to second? Dan may have dropped off. Would anyone else like to second?

    MR. PEARCE:  Philip Pearce, I'll second it.

    DR. JEFFREY: Okay, great. So there's a proposal, Resolution 07-07 that has been proposed and seconded. Let me read it for the record.

    The TGDC grants final approval for the Core Requirements and Testing sections part one, chapters one and eight all, chapter two except section 2.7, chapter six, all, except section 6.6, chapter seven all, except section 7.5.1, part two, all except chapter three, section 3.5, and chapter four, section 4.3, part three, all except sections 3.4 and 5.4, as part of its second set of VVSG recommendations to the Executive Director of EAC subject to editing as instructed by the TGDC at this meeting and final review by the Chair of the TGDC.

    So there is a motion and it has been seconded. Are there any questions or comments? Secretary Gale.

    SECRETARY GALE:   Dr. Jeffrey, I do have a point of order. On the first subcommittee report, I would simply request whether or not anyone had any objections to the unanimous consent in looking then at who was present.

It's a fairly slim quorum that we have and people could easily step aside for any number of reasons and be away from the screen when that question is asked. I would ask that we do a roll call vote so we can ensure that we do indeed have a record of a quorum voting on the issue with a majority.

    DR. JEFFREY: Okay, noted and so we will do the roll call vote. So there is a motion on the table and a second. Are there any comments or questions on the resolution 07-07? Whitney.

    MS. QUESENBERY:   I have a question that just is a coverage question, which is I don't see where in all of this unless it just gets covered in the final overall document, we cover things like the vocabulary or does that just get covered in the final --

    DR. JEFFREY: It's in the appendix that I think we'll cover at the end.

    MS. QUESENBERY:   Thank you.

    DR. JEFFREY: Okay, unless there's any additional comments or questions I'll ask the parliamentarian to do a roll call vote. This is proposal 07-07.

    MS. ALLEN:   Roll call for Resolution 07-07. Williams.

    DR. WILLIAMS: Abstain.

MS. ALLEN:   Wagner.

    MR. WAGNER:  Abstain.

    MS. ALLEN:   Paul Miller.

    MR. MILLER:  Yes.

    MS. ALLEN:   Frank Gale. I'm sorry, Gale.

    SECRETARY GALE:   Yes.

    MS. ALLEN:   My apologies. Mason.

    MS. MASON:   Yes.

    MS. ALLEN:   Gannon.

    MR. GANNON:  Yes.

    MS. ALLEN:   Pearce.

    MR. PEARCE:  Yes.

    MS. ALLEN:   Alice Miller.

    MS. MILLER:  Yes.

    MS. ALLEN:   Purcell.

    MS. PURCELL: Yes.

    MS. ALLEN:   Quesenbery.

    MS. QUESENBERY:   Yes.

    MS. ALLEN:   Rivest.

    DR. RIVEST:  Yes.

    MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

    DR. JEFFREY: Abstain.

    MS. ALLEN:   We have eight yes'es. We have enough for a quorum.

    DR. JEFFREY: Wait, we have nine. I count nine as well. The parliamentarian is still counting fingers. We've got nine. So that was nine yes'es and three abstentions, and zero no's. And with that, Resolution 07-07 passes and again my heartiest congratulations to the Core Requirements and Testing team. Job well done. Thank you.

    With that, we are still ahead of schedule. Let me ask a question to Secretary Gale. Do you feel that we need to go back on the STS for a roll call vote? Are you satisfied?

    SECRETARY GALE:   Well, thank you for raising that question, Dr. Jeffrey. I regret not having raised this when we voted on the first subcommittee, but this is so extraordinarily important that it's probably not a bad thought.

Procedurally I guess someone who was on the prevailing side would make a motion for reconsideration and then we would take a new vote on that resolution if we were to do it. It may be that when we get to the final vote it incorporates all the subcommittee votes anyway, but I think it makes a clearer record if we do have a roll call vote.

    DR. JEFFREY: Okay, is there a motion to reconsider the vote and do a roll call vote for the Security and Transparency section?

    MS. PURCELL: Mr. Chairman, this is Helen Purcell. I'll make that motion.

    DR. JEFFREY: There is a motion. Is there a second?

    DR. RIVEST:  Ron Rivest, I'll second it.

    DR. JEFFREY: Okay, there's a motion and a second. Are there any comments or questions? Okay, hearing none, there is actually a proposal on the table to do a roll call vote and so the actual question, just for the record I will state, and if there is any objection, that the proposal is that we do a roll call vote for the Security and Transparency section.

    SECRETARY GALE:   Dr. Jeffrey, Secretary Gale, point of order. We'll be voting first on the motion to reconsider before we vote on the resolution.

    DR. JEFFREY: Yes, sir. The only vote on the table right now is to reconsider and to do a roll call vote. So it's not on the STS section, it's just whether we have another vote on the STS.

    If there are no comments or questions, I'll ask the parliamentarian -- and again the motion is whether to reconsider the vote.

    MS. ALLEN:   Williams.

    DR. WILLIAMS: Yes.

    MS. ALLEN:   Wagner.

    MR. WAGNER:  Abstain.

    MS. ALLEN:   Paul Miller.

    MR. MILLER:  Yes.

    MS. ALLEN:   Gale.

    SECRETARY GALE:   Yes.

    MS. ALLEN:   Mason.

    MS. MASON:   Yes.

    MS. ALLEN:   Gannon.

    MR. GANNON:  Yes.

    MS. ALLEN:   Pearce.

    MR. PEARCE:  Yes.

    MS. ALLEN:   Alice Miller.

    MS. MILLER:  Yes.

    MS. ALLEN:   Purcell.

    MS. PURCELL: Yes.

    MS. ALLEN:   Quesenbery.

    MS. QUESENBERY:   Yes.

    MS. ALLEN:   Rivest.

    DR. RIVEST:  Yes.

    MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

    DR. JEFFREY: Abstain.

    MS. ALLEN:   That would be ten yes'es, two abstains, so we have enough for a quorum to continue.

    DR. JEFFREY: Yes, and thank you. I'm glad we didn't exceed the fingers. So, on the motion to reconsider the STS vote, I'll ask if there are any additional comments or questions on the STS section. If not we'll go directly to a roll call vote.

Okay, if you will allow me not to reread the entire proposal, this is Resolution 06-07, which is on the screen, and if it's in front of you it's titled STS VVSG Sections, final approval. And with that I'll ask the parliamentarian for a roll call vote, and you've got all the fingers ready.

    MS. ALLEN:   I need a motion and a second please.

    DR. JEFFREY: I'm sorry, I guess we actually need the motion now. I thought the motion to reconsider was --

    MS. ALLEN:   The motion to reconsider, but I need a motion to pass it.

    DR. JEFFREY: Okay, I'm sorry. So much for my Robert's Rules. My apologies to the members. We need a motion to actually consider Resolution 06-07. Again, if the co-chairs would like to put the motion on the table and second it.

    DR. RIVEST:  Yes, so moved. This is Ron Rivest.

    MS. PURCELL: And second.

    DR. JEFFREY: Okay, so Resolution 06-07 is back on the table. It has been seconded, and with that if there are no comments or questions I'll ask the parliamentarian to call the vote by roll call.

    MS. ALLEN:   Williams.

    DR. WILLIAMS: Abstain.

    MS. ALLEN:   Wagner.

    MR. WAGNER:  Abstain.

    MS. ALLEN:   Paul Miller.

    MR. MILLER:  Yes.

    MS. ALLEN:   Gale.

    SECRETARY GALE:   Yes.

    MS. ALLEN:   Mason.

    MS. MASON:   Yes.

    MS. ALLEN:   Gannon.

    MR. GANNON:  Yes.

    MS. ALLEN:   Pearce.

    MR. PEARCE:  Yes.

    MS. ALLEN:   Alice Miller.

    MS. MILLER:  Yes.

    MS. ALLEN:   Purcell.

    MS. PURCELL: Yes.

    MS. ALLEN:   Quesenbery.

    MS. QUESENBERY:   Yes.

    MS. ALLEN:   Rivest.

    DR. RIVEST:  Yes.

    MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

    DR. JEFFREY: Abstain.

    MS. ALLEN:   We have nine yes'es and we have three abstains. We have enough for a quorum.

    DR. JEFFREY: And déjà vu, congratulations again to the STS --

(Tape Interrupted While Changing Sides)

(END OF AUDIOTAPE 1, SIDE B)(START OF AUDIOTAPE 2, SIDE A)

    DR. JEFFREY: -- Taking a break in the middle of the session and then come back to that, so I'd like to ask Sharon to come up and talk about the Human Factors and Privacy.

And while she is setting up I'd like to say that actually this is probably one of the sections that's the most exciting because of the tremendous change that's really occurred in this versus VVSG 2005, and the original research has actually been folded into this to really make it much more accessible and usable.

And so I'd like to thank the entire subcommittee for really going beyond what was state of the art. So with that, Sharon.

    DR. LASKOWSKI:    Thank you Dr. Jeffrey, now that the microphone is turned on.

    This is Sharon Laskowski speaking, reporting on the Human Factors and Privacy Subcommittee work on the final draft of the VVSG.

    I do want to thank the entire TGDC for providing so much thoughtful input as the HFP subcommittee worked to pull this next draft together, and I really enjoyed working with every single one of you.

    I do have a natural stopping point so I think that given the time, I think that will work fairly well.

    So everyone should have the Human Factors and Privacy report on final draft of the VVSG pulled up. Title slide is number one.

I'm moving on to slide number two which is the overview. The outline of my talk is that I will summarize the significant changes from the VVSG '05, HFP work, then I will review the HFP changes from the previous draft that we discussed in May, and then I will go over the new material that is the usability performance pass/fail benchmarks.

    On to slide number three, the significant changes from the VVSG '05. I'm not going to read through them. I will just note the ones that were the most dramatic changes.

    Of course the performance benchmarks are quite new and different from anything we'd seen in a usability related standard and I will be talking about that extensively. We have added poll worker usability requirements and we've also added plain language guidance, which supports voters with different cognitive abilities. The easier the language, the easier it is for everyone to understand. So on this slide I think those are the highlights of the usability.

The highlight of the accessibility side would be a requirement that accessibility throughout the voting session be tested.

    On to slide number four, continuing significant HFP changes from the VVSG '05. Here I think that the major point is that we've addressed low vision more fully and we've moved it to the general usability section, things like font size, and contrast and different choices, not just for the accessible voting station but for all the voting stations.

That's because especially with an aging population, we have people with varying degrees of vision that will not want to use the accessible station and will be using the regular voting station, including people that sometimes forget their glasses.

    On to slide number five. There are 14 significant changes since the May plenary. Some of these, and again I will just summarize fairly quickly, we've updated some terminology and refined some definitions. I think a big significant change especially is that we've clarified the interaction of section 3.2, which was the usability section, and section 3.3, which is the accessibility section.

There was a lot of confusion and in particular we wanted to make it very clear to everyone that all the vote editable ballot device requirements for usability also apply to the accessible voting stations, all of which are in the VEBD class, and we put in clarifying language in both of those sections so that if people are just looking at, say, the accessibility section they will know to go back and look at these other usability requirements.

    Of course the new metrics and proposed performance benchmarks are significant as well.

    On to slide number six, continuing with the changes since the May plenary. Here we have an additional requirement about making sure that a failure to actually cast the ballot produces some notification that the voter can see, and we've also clarified the discussion about privacy as well.

    Continuing on to slide number seven, changes since the May plenary. I think the major issue on this slide is that we clarified what we meant by poor reading vision. As discussed a little bit earlier, the movement of the poor vision and low vision requirements from the accessibility section to the usability section was also quite important and related to this.

    On to slide number eight, continuing the changes since the May plenary. We upgraded the legibility of paper, such as verification of the VVPAT, to a shall, and we specified two sufficient techniques, variable font size and magnification, for those with low vision, and we refined our various response and activity time and alert time requirements.

    And finally slide nine, changes since the May plenary. If you're looking for the section on maintenance, we've moved that to section 6.4.5. It seemed better placed there than in the usability section, and some other minor changes.

    And that concludes the changes since the May plenary. So this is kind of the natural stopping point. I can do a little quick intro, or first let me ask if there are questions on the first half so we can kind of go through those questions before we look at performance requirements. We might want to consider whether we want to continue here or wait until after lunch.

    DR. JEFFREY: Any questions on the first section, which are the changes? Sharon, roughly how long do you think the rest of your session will take?

    DR. LASKOWSKI:    Hard to say. Probably more than 15 minutes. I'm thinking 20 to 35 minutes.

    DR. JEFFREY: Okay, well what I might suggest at this point is that we take a slightly early break. Would there be any objections from any of the members to try to reassemble at 1:45 p.m. instead of two o'clock, still keep to a 30 minute break?

    Okay, hearing no objections, thank you all for your patience in using the high tech system that we've set up. It actually seems to be working pretty well so my compliments to anyone who set this up. So let's take a break right now and reconvene at 1:45 Eastern time, and we'll see you soon.

(BREAK)

    DR. JEFFREY: -- Meeting, and I'd like to ask the parliamentarian to first ensure that we still have a quorum, so Thelma.

    MS. ALLEN:   Roll call. Williams, Williams. Williams is not responding. Wagner.

    MR. WAGNER:  Here.

    MS. ALLEN:   Wagner is present. Paul Miller.

    MR. MILLER:  Present.

    MS. ALLEN:   Paul Miller is present. Gale.

    SECRETARY GALE:   Here.

    MS. ALLEN:   Gale is present. Mason, Mason. Mason is not responding. Gannon, Gannon. Gannon is not responding. Pearce.

    MR. PEARCE:  Here.

    MS. ALLEN:   Pearce is here. Alice Miller, Alice Miller. Alice Miller is not responding. Purcell.

    MS. PURCELL: Here.

MS. ALLEN:   Purcell is present. Quesenbery.

    MS. QUESENBERY:   Here.

    MS. ALLEN:   Quesenbery is present. Rivest.

    DR. RIVEST:  Here.

    MS. ALLEN:   Rivest is present. Schutzer, Schutzer. Schutzer is not responding. Jeffrey.

    DR. JEFFREY: Here.

    MS. ALLEN:   Jeffrey is present. Williams, Williams is not responding. Mason, Mason is not responding. We have nine. We have enough for a quorum.

    DR. JEFFREY: Okay, thank you. Okay with that, Sharon, if you could continue. I think you were on page ten.

    DR. LASKOWSKI:    Thank you, Dr. Jeffrey. So we're on slide number ten of the HFP report.

    So in the second half of my talk, because this is new material, I am going to be talking about the usability performance requirements. I am going to give a little bit of review material from last time just to refresh everyone's memory and then go directly into the pass/fail benchmarks.

    So with usability performance requirements, our goal here is to develop a test method to distinguish systems with poor usability from those with good usability. It's based on the performance, not evaluation of the design, so as you see, for most of the requirements in the VVSG, there are things about font size or scrolling, very much design oriented.

But we're interested in a test that detects all the types of errors one might see when voters interact with the voting system, because guidelines on the design by themselves are not sufficient to detect those that occur in the interaction, and we would like this test to be reliable so that we can repeat it in a test laboratory. And part of the reliability is that it is reproducible by the test laboratory so you trust the results. One of the benefits of course is that it is technology independent. You can apply this kind of test method to any type of voting system. So given such a test method you can calculate benchmarks based on measurements of the interaction during your test.

    So a system meeting the benchmarks has good usability and passes the test. So the values chosen for the benchmarks, these pass/fail benchmarks, become the performance requirements.

    Slide 11. So what we're really talking about is a test method, a usability test method for certification of a voting system in an accredited test laboratory. We're measuring the performance of the system in the test lab so we've got to control as many of the other variables as we can, including the test participants.

    So we've got a test ballot that we designed to detect different types of usability errors and be typical of many types of ballots. Remember this is a national test so we want it to capture many sorts of errors, not just the kinds of errors you'd see in one particular type of ballot and one particular state for example.

    The test is done in a lab and the environment is tightly controlled for lighting, set up, the instructions. We do not allow assistance to the test participants and the test participants are chosen so that they reliably produce the same performance on the same system. And they are told exactly how to vote so we can measure the errors. Now the test results measure the relative degree of usability between systems. They are not intended to predict nor can they predict performance in a specific election because the ballot is different, the environment is very different.

People go in knowing who they want to vote for. They can readily obtain assistance from poll workers. They may have received brochures in the mail about what the ballot looks like.

And for each election the voter demographics have got to be different, and a general sample of the U.S. voting population across the board can never be truly representative in such a lab test because all elections are local.

    The key is we're creating a measurement device to detect usability errors in the interaction of the voting system.

    Slide number 12, components of the test method. So obviously the test protocol you're using has to control lots of things and has to be very precise. It's got to be well defined. It describes the number and characteristics of the voters participating in the test, how to conduct the test. We've designed the test ballot to be relatively complex to insure that we evaluate the entire voting system and detect significant errors.

    The instructions to the voters are exactly how to vote so we can count those errors. This test method protocol, which we call the voting performance protocol, has a precise description of the test environment and also a method for analyzing and reporting the results so that each test lab could report those consistently, and performance benchmarks and their associated pass/fail values.

    So I'm on slide number 13 now. And let me recap the research that you heard about in May. We did some initial testing just to test the validity of our test protocol. Did we detect differences between systems and did it produce the errors we expected, and we did that on two different systems and it did. So we were very encouraged by this.

    Was it repeatable, that is do we show reliability? We did four tests on the same system, different test participants each time, for a total of 195 participants, and we've got similar results on that same system for those four tests.

    And the demographics we used were sufficient to detect all the possible errors that we had imagined one could see or could make with such a system.

    So on to the next slide, slide 14. I talked about the benchmark tests, which are the new tests that I reported on the planning of in May.

So we selected four different systems. It was a selection of DREs, EBMs and PCOS systems. We ran through 187 test participants. We took five measurements; three of these measurements I'll be talking about. These are our pass/fail benchmarks, and two measurements that the HFP subcommittee suggests that we report on only and do not use as pass/fail criteria.

    FEMALE SPEAKER:   Sharon, if I could interrupt. We actually raised this question at the last TGDC and discussed with the entire TGDC, who I believe collectively came to the decision that two values should be only reported.

    DR. LASKOWSKI:    That is correct. So I'm on slide 15 now. So what about the performance measures? The names of three of the performance measures have been altered slightly and we've also included a base accuracy score that feeds into our measures.

    So let me first talk about this base accuracy score. That is, how do we count the number of errors. Well, we've got a test ballot that has 28 voting opportunities. We give instructions to the test participants, vote this way, and then we count how many were correct for each participant so we've got an error rate.

    And to calculate the base accuracy score, we basically look at the scores for each of the test participants and take the average, the mean percentage of ballot choices that were correctly cast and this gives us a base accuracy score, which I'll talk about how we use it in a couple slides from now.
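    [To make the arithmetic just described concrete, here is a minimal illustrative sketch in Python, assuming hypothetical per-participant counts of correctly cast choices; the actual scoring rules and data belong to the voter performance protocol and are not reproduced in this transcript.]

    from statistics import mean

    VOTING_OPPORTUNITIES = 28  # per the test ballot described above

    # Hypothetical counts of correctly cast choices for each test participant.
    correct_choices = [28, 27, 25, 28, 26, 28, 24, 28, 27, 28]

    # Each participant's accuracy; the mean across participants is the
    # base accuracy score.
    per_participant_accuracy = [c / VOTING_OPPORTUNITIES for c in correct_choices]
    base_accuracy_score = mean(per_participant_accuracy)

    print(f"Base accuracy score: {base_accuracy_score:.1%}")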

    So we're going to calculate three effectiveness measures that we then put benchmarks on for pass/fail. Effectiveness in this situation is in fact kind of a usability term; effectiveness here really means the accuracy with which the voters were able to make their choices.

    So first we have a total completion score. This basically is the percentage of test participants who are able to complete the process of voting and having their ballot choices recorded by the system.

    So examples of people that wouldn't complete might be someone who gives up in the middle or someone who forgets to hit the cast ballot button.

    Our second measure of voting accuracy we call now the voter inclusion index. It uses the base accuracy score and the standard deviation, in other words the variability that you see across the test participants.

Okay, so why do we like this measure? Well if two systems have the same base accuracy score but you see a system that has a large variability -- there are some voters that do really poorly, some voters that do really well to offset that, that would be a large variability. That's not as good as a system that's producing consistently good results with that same base accuracy score.

    I put the formula up for people that like to look at this and run some numbers through, but basically the idea is you divide by the standard deviation, the variability, so higher variability gives you a lower voter inclusion index.

    Our third measure we call the perfect ballot index. So you look at the number of cast ballots that contain no errors and you look at the number of ballots that contain one error or more and you take the ratio. So the perfect ballot index is simply the ratio of the number of cast ballots containing no errors over those that contain at least one error.

    Now this does deliberately magnify the effect of even a single error so why are we interested in this, why is it useful?

    Well, given a system that has a high base accuracy score, it might still have a common error that everyone or quite a number of people are making. So this suggests some design flaw that everyone is having trouble with. So this measure basically captures that characteristic.
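    [As an illustration of how the three effectiveness measures just described could be computed from per-participant data, here is a sketch in Python. The total completion score and perfect ballot index follow the definitions given above; the voter inclusion index formula shown is only an assumed, simplified form (mean accuracy above a reference level, divided by the variability across voters), since the exact formula appears on the slide and is not reproduced in this transcript. The participant data and the 0.85 reference accuracy are hypothetical.]

    from statistics import mean, stdev

    VOTING_OPPORTUNITIES = 28

    # Hypothetical participants: (completed voting?, number of correct choices).
    participants = [
        (True, 28), (True, 27), (True, 28), (True, 25), (True, 28),
        (False, 0), (True, 26), (True, 28), (True, 24), (True, 28),
    ]

    # Total completion score: share of participants whose ballot was cast and recorded.
    total_completion_score = sum(1 for done, _ in participants if done) / len(participants)

    # Accuracy is computed over the participants who completed voting.
    accuracies = [c / VOTING_OPPORTUNITIES for done, c in participants if done]
    base_accuracy = mean(accuracies)
    variability = stdev(accuracies)  # higher variability should lower the index

    # Voter inclusion index (assumed form only): base accuracy above a reference
    # level, scaled down by the variability across voters.
    REFERENCE_ACCURACY = 0.85  # hypothetical reference point, not from the transcript
    voter_inclusion_index = (base_accuracy - REFERENCE_ACCURACY) / (3 * variability)

    # Perfect ballot index: cast ballots with no errors over those with at least one error.
    perfect = sum(1 for done, c in participants if done and c == VOTING_OPPORTUNITIES)
    imperfect = sum(1 for done, c in participants if done and c < VOTING_OPPORTUNITIES)
    perfect_ballot_index = perfect / imperfect if imperfect else float("inf")

    print(total_completion_score, round(voter_inclusion_index, 2), perfect_ballot_index)

    [Whatever the exact formula, the division by the standard deviation is what lets the voter inclusion index distinguish two systems with the same base accuracy but different consistency, and the perfect ballot ratio is what magnifies a single common error, which are the two properties described above.]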

    And as I said, we have two other measures that we're not putting pass/fail criteria on. We're just reporting on those, and those are the average voting session time and the average voter confidence.

These are interesting and good information to have. It's not as important as getting the accuracy of the vote and in fact neither of these measures correlates with effectiveness.

You can have people that take a long time or take a short amount of time and they may or may not have done well with respect to the accuracy of their vote for example. So the HFP subcommittee suggests and we discussed this in May that we're just going to report on those.

    So I've got a couple of tables here. I need to show you a little bit of data but I'll just point out specific items from these tables so that you get a general idea of how the HFP decided to make at least an initial recommendation of what the benchmarks ought to be and to lead into the discussion of that.

    We tested four systems. These were systems certified to the VSS '02, a selection of DREs, EBMs (unintelligible). So I've called them system A, B, C, and D.

    In the second column we have total completion scores. Notice that these are reported as an interval. When you take a sampling you always have some uncertainty so you report it with a confidence interval.

These are 95 percent confidence intervals so for example, System B has a completion score interval of 92.8 percent to 100 percent.

    So the way to interpret a confidence interval is basically to say if I repeated this test 100 times, 95 times the true total completion score of that system would be in that interval.

    And the third column is our base accuracy score. Notice for example System D is similar to System C, 92.4 percent, but it's got a larger standard deviation of 19. So when we calculate the voter inclusion index again as an interval, we get quite an interesting range. For example, system B has inclusion index in the range of .49 to .85 where system D is much lower, .03 to .22.

    When we determine the pass/fail benchmarks we look to see whether that benchmark is included in that interval or above it.
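    [As a rough sketch of the interval arithmetic being described, the following Python computes a normal-approximation 95 percent confidence interval for a completion proportion and applies one plausible reading of the pass rule just stated, that a system passes unless its entire interval falls below the benchmark. The sample result of 96 completions out of 100 participants is purely hypothetical, not a figure from the NIST tests.]

    import math

    def completion_confidence_interval(completed, n, z=1.96):
        """Approximate 95% confidence interval for a completion proportion."""
        p = completed / n
        half_width = z * math.sqrt(p * (1 - p) / n)
        return max(0.0, p - half_width), min(1.0, p + half_width)

    def passes(interval, benchmark):
        """Pass if the benchmark lies inside the interval or the interval sits above it."""
        low, high = interval
        return high >= benchmark

    # Hypothetical lab result: 96 of 100 test participants completed voting.
    ci = completion_confidence_interval(completed=96, n=100)
    print(ci, passes(ci, benchmark=0.98))

    [Note that the half-width shrinks as the number of participants grows, which is the sense in which moving from 50 to 100 participants tightens the intervals and makes the test tougher, as discussed below.]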

    Okay, that next table on slide 21, if you look at the second column you'll see the perfect ballot index confidence intervals. Again a large variety of values here. System D from about 1 to 3.52. System C totally below, an interval totally below System D. They don't even overlap.

    So going on to slide 22. So the HFP has proposed some benchmarks but I have a couple of slides to discuss this more fully because there's a couple ways to view these.

    So if we look at the data that I just showed you and we picked a total completion score of 98 percent, voter inclusion index of .35 and perfect ballot index of 2.33, two of the systems fail. These are VSS 2002 systems.

    What we've done in the current draft that you'll see for each of these three benchmarks, we've put in placeholders, that is for example, total completion performance says the system shall achieve a total completion score of at least XXX percent as measured by the voter performance protocol.

    So let me go on to my next slide, 23. But the questions on the table, which is really a policy issue, is how tough should the benchmark thresholds be. So we'll point out first the benchmark data here used -- some interference.

    DR. JEFFREY: If anybody does not have their mute on, could you just check? Thank you.

    DR. LASKOWSKI:    All right, so in this initial benchmark data testing that we did, we used 50 test participants, with which we showed we could get repeatability, so this was sufficient. However, for the actual test protocol for the labs we will use 100 test participants.

This is for statistical reasons. In order to compute the confidence intervals easily we need to assume a normal distribution, and best practice for using a voter inclusion index, or an index of the nature of the voter inclusion index, and for other kinds of process benchmarks, suggests that if you have 100 participants or more you can make those assumptions.

    Okay, when you have more participants you are more sure that you've captured the true value of that benchmark and you can narrow the confidence intervals, so the test will be tougher because you'll have smaller confidence intervals than you saw in the two tables I showed you, with 100 participants. But there are two points of view. Our proposed benchmarks do weed out poorly performing systems, so this by itself is a big step in terms of these kinds of requirements, and it is relatively easy to raise the thresholds.

But these were VSS 2002 systems. And in the standards world, conformance testing to a standard tends to be in general on the conservative side. However, on the other side, this should be a forward looking standard.

New systems are coming down the pike very soon. For example, for VVSG '05 certification. So how much higher can we put these, and also recall because people are in the loop here, there is always some upper bound of how high you can go because humans always make mistakes.

    So in the next slide. I think to make some policy here we do need some additional data, one to say how much flexibility do we have with the test laboratories to get good reproducible results.

And we do propose to do some future, very soon in fact, testing with voters with different experiences and from different geographic regions, looking at older populations and less educated populations.

I should note, benchmark thresholds are always tied to the demographics of the test participants. We will also be collecting, for the existing demographics, some additional data, which will tell us more in terms of exploring the statistics for looking at 100 users, and we also will be looking at accessible systems to see how they perform with this test. So I think with that, I believe Whitney probably has some additional discussion and some policy direction for us in terms of the benchmarks.

    DR. JEFFREY: Okay, Whitney, over to you.

    MS. QUESENBERY:   This is Whitney. Sharon, how did you guess?

    I'd like to speak in favor of the notion that this should be a forward looking standard with somewhat higher levels of benchmarks but I'm not going to propose any specific benchmarks here.

I know that we are going to do probably if I have the numbers right, at least as much data collection in the next round of testing that Sharon just mentioned, as we have done to date and that will give us some geographical distribution, a look at some broader demographics, and will give us a much richer dataset.

    What I'd like us to do at the TGDC today is to provide some direction to NIST as they do the data analysis of those tests with where they should be looking.

    So for instance if we look back at the three benchmarks that will be the pass/fail benchmarks, right now the levels that we've been talking about are kind of pegged at a median.

They would disqualify half the systems and half the systems that were tested would pass, but these are VSS 2002 systems and we know that there have been improvements in systems since then, and we can only hope that changes in new systems have added to the usability of those systems as well as improving other aspects like security and reliability.

So one way to look at it is to simply say we want to raise the threshold and just increase it, but in the long conversations on HFP we have been given a good education in statistics by the NIST scientists and statisticians that you can't just say I wish it could be 100 and therefore I'll make the level 100.

So these have to be numbers that are reasonable and I think we've also been concerned that we not create a standard that is so high that no system can meet it. However, one way to look at this is to say, as we look at the scores of the systems we have tested so far and that we will test in the future in setting these benchmarks, we could ask what is the highest level they have actually achieved.

So for instance if we look back on slide 20 at the benchmark test results that include total completion score and voter inclusion index, one of the things you see is that one system has a voter inclusion index interval that is higher; that is, the bottom level of System B at .49 is actually completely above any of the other three systems. So I have to ask the question, especially on a standard that won't even be approved until 2009, why aren't we aiming at the highest level of each of those benchmarks?

    Now I'm going to say some numbers and I don't mean them as a formal technical proposal. They're just an example. So maybe 98 percent, maybe we'd like to see 98.5 percent. I'm not sure if 99 is technically possible on the total completion score.

On the perfect ballot index we're currently proposing 2.33, but one system achieves a 1.07 to 3.52 range, again completely above -- well not completely above the others but above several of the others.

Why not go for something like 3.0, pulling a number out of my hat, and on the voter inclusion index, which is essentially accuracy plus variability, why not look for something like .5 that gets us into the range of the top systems.

Now I know that we can't just sit here as people who neither worked on the data nor are statistical experts and say this is the value you should pick.

We really do look to the NIST scientists to give us recommendations and I believe that based on our initial request of them that the values that we have in front of us are good values when our direction was to create benchmarks that weeded out systems with poor usability.

The question I'd like to raise for everyone is should we be raising our sights and be looking to try to create benchmarks that exemplify the best usability that we can reasonably achieve. Thank you.

DR. JEFFREY: Thank you, Whitney. This is Bill Jeffrey. This is addressed to you Whitney as a question.

What you're describing, actually the way I interpret it, is a goal or an aspiration for this as opposed to a hard requirement. Is that fair?

Because certainly in the text of the requirement it should clearly state which direction -- your higher numbers in certain cases is better and would indicate perhaps a more robust system.

Putting an actual stretch goal at the moment as a requirement I think would be difficult to actually embed as a requirement.

Are you happy with specifying specific goals and emphasizing in the text that literally higher is better, in this case lower is better, and that may result in a more robust system?

MS. QUESENBERY:   Well, I think certainly if the intro doesn't now, and I think it does say that higher is better, I think we've constructed all of these numbers so that higher is better.

I guess what I'm saying is, and I don't want us to set numbers because I do think that this should happen after the next round of testing.

One of the things that Dr. Laskowski alerted us to is that when we do larger numbers of participants the confidence intervals will be tightened and therefore the numbers will be tightened.

So I don't want to just say let's change 2.33 to 3. What I would like and I think we can do this as (unintelligible) of the meeting is to say perhaps we adopt benchmark levels that have been proposed now but that we know that those will be revisited after the next round of data collection, and that we set the direction for NIST to come back after the next round of data collection with a set of benchmarks that are not unachievable by any system but they reflect the best scores by any system.

DR. JEFFREY: Thank you. Are there any other comments or questions on this?

MR. PEARCE:  Yes, this is Philip Pearce. I'm sorry, I tried to raise my hand and it didn't give me the opportunity to do that for some reason. Anyway, is it okay if I do it at this time?

DR. JEFFREY: Absolutely, please do, sir.

MR. PEARCE:  Okay. Again the question or the comment that I've got is that I would like to in addition to looking at the next set of tests that are going to be conducted, that have been identified in the slides and in the discussion so far, is to also look at future tests because again things are changing and I would be pretty disappointed if it looked like we even tried to set standards based on the next set of tests and didn't look three years, or five years, or ten years down the road and to be able to conduct these tests again and to see where those values maybe would sit at that future time.

So some way or another I would like to see us factor that into the discussion and also into the recommendations that we make.

MS. QUESENBERY:   This is Whitney. I completely agree with Philip. I know that this puts a burden on to the EAC for how these things are updated but I think very much like the innovation process, hopefully the new requirement in whatever version this becomes on this whole thing that we're voting on now will improve systems and will improve the usability of them.

I think one area where we're all particularly concerned is the usability of the accessible voting system, where we've had some anecdotal reports in the field that are not very encouraging, and hopefully these requirements will also encourage improvement, but we don't want to say well, we set a level back in 2008, and in 2020 we're still accepting that as the proper level, even if new systems have come along that might have improved it.

DR. JEFFREY: Chairman Davidson.

COMMISSIONER DAVIDSON: Under the law, HAVA requires us, the EAC, to go back every four years to review, not only in this area but any other area, whether there is a need to bring the TGDC back together to revisit writing a new iteration, or especially one area or arena.

It doesn't say everything has to be done but that is really up to the EAC, and as we move forward obviously we have no idea what four years will be down the road. There may be a lot of changes or maybe really not that many changes. It's the unknown. But the law does require that the EAC review it before making a determination whether they bring back the NIST and TGDC to move forward.

DR. JEFFREY: Thank you. This is Bill Jeffrey. I would also like to just add that I think that this also shows, and I just want to reemphasize something mentioned earlier, what an exciting area this is, because this is the first time that we've actually got even an inkling of what the numbers would look like.

And I really do want to applaud the subcommittee for doing this research and also to emphasize that this is essentially original research that actually did go out for formal, technical peer review, which also I think adds a lot of credibility to it. I'd like to recognize Secretary Gale for a comment or question.

SECRETARY GALE:   Thank you, Dr. Jeffrey. Well first of all I want to congratulate Sharon. She has done a very impressive and a very outstanding job developing a relatively simple formula with very real data to back it up.

I understand Whitney and Phil's position about possibly raising the threshold to encourage manufacturers to increase simplicity and usability, but in looking at the test data it does appear the subjects were probably a bit more educated than average and perhaps different than what other test labs might recruit, so I think that the standard that Sharon has set may be a higher standard than you might realize, and we won't know until there is further testing.

So it seems to me that it would be appropriate to approve this and wait for later testing in other locations to see if there is a different educational level and maybe a little different skill set.

DR. JEFFREY: Whitney.

MS. QUESENBERY:   This is Whitney. I was actually reminded just looking at my notes from Commissioner Davidson's original talk, that the schedule actually has a slot in there in between the two public comment periods for the TGDC to review and look at the comments.

And maybe that's an appropriate time to re-look at this, and what we should do is pass the benchmarks as they've been proposed and then by then we'll know whether there's been new data that suggests that we should change it in any way.

DR. JEFFREY: Thank you. Any other comments or questions? Okay, if there are no other comments or questions, what I'd like to do is again for symbolic reasons ask if Whitney as Chair of the HFP subcommittee would like to propose Resolution 08-07.

MS. QUESENBERY:   Absolutely.

DR. JEFFREY: Is there a second?

MS. MILLER:  This is Alice. I'll second it.

DR. JEFFREY: Hi, Alice. I didn't know you were on. I'm sorry. Okay, so thank you. So good, we've got the symbolism complete.

Okay, let me read Resolution 08-07, which has been proposed and seconded. This is the HFP VVSG Sections, Final Approval, and fortunately it's a little easier to read than some of the other ones.

The TGDC grants final approval for the Human Factors and Privacy sections part one, chapter three all, part three, chapter four, section 4.5 as part of its second set of VVSG recommendations to the Executive Director of the EAC subject to editing as instructed by the TGDC at this meeting and final review by the Chair of the TGDC.

Are there any comments or questions on this proposal? Okay, hearing none I will ask Thelma to call a roll call vote.

MS. ALLEN:   Roll call for 08-07. Williams.

DR. WILLIAMS: Abstain.

MS. ALLEN:   Wagner.

MR. WAGNER:  Abstain.

MS. ALLEN:   Paul Miller.

MR. MILLER:  Yes.

MS. ALLEN:   Gale.

SECRETARY GALE:   Yes.

MS. ALLEN:   Mason.

MS. MASON:   Yes.

MS. ALLEN:   Gannon.

MR. GANNON:  Yes.

MS. ALLEN:   Pearce.

MR. PEARCE:  Yes.

MS. ALLEN:   Alice Miller.

MS. MILLER:  Yes.

MS. ALLEN:   Purcell.

MS. PURCELL: Yes.

MS. ALLEN:   Quesenbery.

MS. QUESENBERY:   Yes.

MS. ALLEN:   Rivest.

DR. RIVEST:  Yes.

MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

DR. JEFFREY: Abstain.

MS. ALLEN:   We have nine yes'es. We have enough for a quorum.

DR. JEFFREY: Excellent. It's my fingers too. So congratulations to the Human Factors. I think you just made history actually. First time I think we ever got requirements like this, so congratulations.

Looking at my agenda there is also a resolution, which would be the final approval for the entire document, which includes the appendices. It has not been formally introduced. Would somebody like to introduce it? I'm talking about Resolution 09-07 and I'll read it if somebody wants to introduce it.

MS. QUESENBERY:   I'll be happy to introduce it. This is Whitney.

DR. JEFFREY: Hey Whitney, thanks. Is there a second? This is Bill Jeffrey, I'll second it.

Let me read it. This has been proposed and seconded. VVSG document final approval reads the TGDC grants final approval for the document Draft Voluntary Voting System Guidelines Next Iteration, August 7, 2007, in its entirety as the second set of VVSG recommendations to the Executive Director of the EAC subject to editing as instructed by the TGDC at this meeting and final review by the Chair of the TGDC.

Resolution 09-07 has been offered and seconded. Are there any comments or questions on it? Hearing none, I will ask Thelma to do a roll call.

MS. ALLEN:   Roll call for 09-07. Williams.

DR. WILLIAMS: Yes.

MS. ALLEN:   Wagner.

MR. WAGNER:  Abstain.

MS. ALLEN:   Paul Miller.

MR. MILLER:  Yes.

MS. ALLEN:   Gale.

SECRETARY GALE:   Yes.

MS. ALLEN:   Mason.

MS. MASON:   Yes.

MS. ALLEN:   Gannon.

MR. GANNON:  Yes.

MS. ALLEN:   Pearce.

MR. PEARCE:  Yes.

MS. ALLEN:   Alice Miller.

MS. MILLER:  Yes.

MS. ALLEN:   Purcell.

MS. PURCELL: Yes.

MS. ALLEN:   Quesenbery.

MS. QUESENBERY:   Yes.

MS. ALLEN:   Rivest.

DR. RIVEST:  Yes.

MS. ALLEN:   Schutzer. Jeffrey.

DR. JEFFREY: Abstain.

MS. ALLEN:   We have ten yeses. We have enough for a quorum.

DR. JEFFREY: Congratulations team. I don't know why there is so much noise in the background. It's probably the clapping. My heartiest congratulations to the entire TGDC and the NIST support team. I think you just produced a document that will be going to the EAC shortly.

So with that, I will open it up if there are any additional resolutions, or comments, or questions that TGDC members have, for example Ron Rivest.

DR. RIVEST:  Yes, thank you. This is Ron Rivest. We had some discussion earlier about the innovation class and I did want to introduce a resolution recognizing that we've really just gotten started with the innovation class.

We're handing this process over to the EAC. We've had a number of discussions with the EAC and within the TGDC and, as was noted earlier by Secretary Gale, this process has really just begun. There's a lot of uncertainty here and the resolution really just sort of recognizes the handoff and the importance we put upon the innovation class.

Let me now read the proposed resolution. This is Resolution number 07 --

(Tape Interrupted While Changing Sides)

(END OF AUDIOTAPE 2, SIDE A)

(START OF AUDIOTAPE 2, SIDE B)

DR. RIVEST:  -- Title, Encouragement of Innovation. The TGDC recognizes that innovation in voting systems must take place for voting systems to be usable, accessible, secure, reliable, and accurate for all voters and voting populations.

For innovation to occur, the TGDC has directed NIST to create initial requirements and a general set of procedures for an innovation class as outlined in TGDC Resolution 03-06. The TGDC urges the EAC to develop and publish detailed plans and specific procedures for an innovation class program so as to encourage innovation in voting systems and to make clear to vendors how they may use the specific procedures and steps of the innovation class to achieve conformance to the VVSG for their innovative products.

    SECRETARY GALE:   This is John Gale, Secretary of State, Nebraska. I would second that but I think the Resolution number might be wrong, Dr. Rivest.

    DR. JEFFREY: Yes.

    DR. RIVEST:  I have the wrong number?

    DR. JEFFREY: If Dr. Rivest would accept a friendly amendment. He may be reading an old version. We have Resolution 10-07, which is a slightly more concise version of what you've said.

    DR. RIVEST:  Well I've got perhaps the older version, yes.

    DR. JEFFREY: Would you like me to read --

DR. RIVEST:  Please.

    DR. JEFFREY: And please then confirm that this is the one that you are looking for.

This is Resolution number 10-07. The TGDC recognizes that innovation in voting systems must take place for voting systems to become more usable, accessible, secure, reliable, and accurate for all voters and voting populations.

The TGDC urges the EAC with technical assistance from NIST, to continue to develop and publish detailed plans and specific procedures for an innovation class program so as to encourage innovation in voting systems and to make clear to manufacturers how they may use the innovation class to achieve conformance to the VVSG for their innovative products.

    DR. RIVEST:  I'm happy with that as a friendly amendment, yes.

    SECRETARY GALE:   Secretary John Gale. I also accept that as a second.

    DR. JEFFREY: Excellent, thank you. So there is a resolution and it has been seconded. This is Resolution 10-07. Are there any comments or questions?

This is Bill Jeffrey. I'll just add a comment. I heartily endorse this. Any other comments or questions? Okay, hearing none I will ask the parliamentarian, Thelma Allen, to please call the roll call vote. That's hard to say.

    MS. ALLEN:   Roll call for 10-07. Williams.

    DR. WILLIAMS: Yes.

    MS. ALLEN:   Wagner.

    MR. WAGNER:  Abstain.

    MS. ALLEN:   Paul Miller, Paul Miller. Gale.

    SECRETARY GALE:   Yes.

MS. ALLEN:   Mason.

    MS. MASON:   Yes.

    MS. ALLEN:   Gannon.

    MR. GANNON:  Yes.

    MS. ALLEN:   Pearce.

    MR. PEARCE:  Yes.

    MS. ALLEN:   Alice Miller.

    MS. MILLER:  Yes.

    MS. ALLEN:   Purcell.

    MS. PURCELL: Yes.

    MS. ALLEN:   Quesenbery.

    MS. QUESENBERY:   Yes.

    MS. ALLEN:   Rivest.

    DR. RIVEST:  Yes.

    MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

    DR. JEFFREY: Abstain.

MS. ALLEN:   We have nine yeses. We have enough for a quorum.

    DR. JEFFREY: Excellent. Congratulations. Number 10-07 passes. With that I will ask if there are any other comments, questions, or resolutions that the TGDC would like to offer.

    MS. PURCELL: Mr. Chairman, this is Helen Purcell.

    DR. JEFFREY: Yes.

    MS. PURCELL: I would like to offer another resolution. We have all talked this morning, or various people have talked about the good work that has been done not only by the Chair but by the staff of NIST so I would like to propose Resolution, I believe it would be 11-07.

    DR. JEFFREY: That is correct.

MS. PURCELL: The TGDC expresses its sincere appreciation for the exemplary leadership of the Chair, Dr. William Jeffrey, and the work of this committee to meet the relevant mandates of the Help America Vote Act.

The TGDC also recognizes the superior technical efforts of NIST scientists and support staff, both in the drafting of the VVSG recommendations and in organizing the activities of this committee and its working subcommittees.

    DR. JEFFREY: Thank you. Is there a second?

    DR. RIVEST:  This is Ron Rivest, I would like to --

    DR. JEFFREY: Okay, there's a second and a third and a fourth. There is Resolution number 11-07 that has been proposed, seconded, third and fourth. Are there any comments or questions on this?

    SECRETARY GALE:   This is Secretary Gale. I just want to I guess elaborate a little bit on the tremendous leadership you have provided Dr. Jeffrey for not only NIST and for TGDC but for the country.

NIST is an incredible academic and research institution that's very significant to America and its future, and your leadership as captain of that ship is extraordinary and you'll be missed, not only by all of us but by any of those who believe in the importance of science and the future development and growth of our economy and provision for our future. So thank you for your service.

    DR. JEFFREY: Thank you very much, sir. And again, if everyone can make sure that their system is on mute.

There's a very almost embarrassing resolution on the table and it's been seconded. I will ask the parliamentarian to call the roll call vote and I'll be noting who says no.

(LAUGHTER)

    MS. ALLEN:   Roll call for 11-07. Williams.

    DR. WILLIAMS: Yes.

MS. ALLEN:   Wagner.

MR. WAGNER:  Abstain.

(LAUGHTER)

    MS. ALLEN:   Paul Miller, Paul Miller. Gale.

    SECRETARY GALE:   Yes.

    MS. ALLEN:   Mason.

    MS. MASON:   Yes.

    MS. ALLEN:   Gannon.

MR. GANNON:  Enthusiastically, yes.

    MS. ALLEN:   Pearce.

    MR. PEARCE:  Yes.

MS. ALLEN:   Alice Miller.

    MS. MILLER:  Yes.

    MS. ALLEN:   Purcell.

    MS. PURCELL: I guess I'm a yes.

    MS. ALLEN:   Quesenbery.

    MS. QUESENBERY:   Yes.

    MS. ALLEN:   Rivest.

    DR. RIVEST:  Another enthusiastic yes.

    MS. ALLEN:   Schutzer, Schutzer. Jeffrey.

    DR. JEFFREY: I'm going to vote yes.

(LAUGHTER)

MS. ALLEN:   We have ten yeses. We have enough for a quorum.

DR. JEFFREY: But that was only because the second half of that compliment was for the NIST scientists and staff. With that, are there any other comments, questions, or resolutions?

    SECRETARY GALE:   Dr. Jeffrey, this is John Gale, Secretary of State, Nebraska.

    DR. JEFFREY: Yes, sir.

    SECRETARY GALE:   This is a final comment. Obviously we're all very, very proud of NIST and your staff and the work they have done, as well as all the members of TGDC as voluntary members of this extraordinary effort.

    I think we need to urge the election administration community, and the election equipment industry, and the public, and all of the non-profit organizations with an interest to really give this a very thorough vetting.

Obviously we've all done the best we can do but we're not saying this is absolutely without some flaw, or some fault, or some possible need for modification.

The issues of cost, of election administration practicality, and of reasonableness are things that really deserve the same amount of attention that we've given to the technical development of these guidelines, so no one should assume by this vote or by the adoption of this iteration that it is ultimately an unalterable record that is irrevocably committed in stone.

It's certainly the best we can offer and we hope that we have a very vigorous and robust debate out there as to how to move forward.

In elections there's nothing more critical to our democracy than voter confidence in our elections, and part of that is equipment, part of that is election administration, and part of it is the usability of the equipment by the average voter.

So we have a lot of work left to be done, but we certainly have made a first bold step forward. Thank you.

    DR. JEFFREY: Thank you. Actually that was incredibly well put. I think that this is an historic moment but it's just the first of what will be many steps along the way to get to the final version and I don't envy the EAC for the amount of work that lies ahead. The public comment period is going to be incredibly important.

I completely concur that this is not cast in stone. As we get comments and as we understand the implications from the vendors and from the costing, I think the TGDC and hopefully NIST will be standing right there to provide whatever support the EAC needs in working through the issues, because again I think everybody's got exactly the same vision and goal in mind.

I'll also say that I came into this two years ago obviously knowing nothing about elections other than casting my own ballots, so with quite a bit of trepidation, but I'd like to thank all the TGDC members for explaining all these things to me, and the NIST staff for being absolutely phenomenal.

I said at the beginning, this is one of the most important projects that we could be working on. It affects the life of every single American. You should be incredibly proud. I'm incredibly humbled to be working with you on this and I am very, very proud of the product that we have delivered. So thank you all very much.

And with that if there are not any other last comments or questions, I officially then adjourn this meeting and I would like to offer a round of applause for everybody who worked on this. So thank you.

(APPLAUSE)

And thank you to all the TGDC members as well. This meeting is adjourned.

(END OF AUDIOTAPE 2, SIDE B)

CERTIFICATE OF AGENCY

I, Carol J. Schwartz, President of Carol J. Thomas Stenotype Reporting Services, Inc., do hereby certify that we were authorized to transcribe the submitted cassette tapes, and that thereafter these proceedings were transcribed under our supervision, and I further certify that the foregoing transcription contains a full, true and correct transcription of the cassettes furnished, to the best of our ability.

_____________________________
CAROL J. SCHWARTZ
PRESIDENT
ON THIS DATE OF:

_____________________________
