Security and Transparency Teleconference Meeting
Meeting commenced at 10:30 a.m.
Participants: Alicia Clay, Allan Eustis, Anoop Singhal, Bill Burr, David Flater, John Kelsey, John Wack, Nelson Hastings, Quynh Dang, Rick Kuhn, Ron Rivest, Sharon Laskowski, Steve Quinn, Thelma Allen, Wendy Havens, Wendy Orbaugh, Rene Peralta
Looking at the overall package of what needs to be done for December meeting.
Need to identify what will have the most qualitative improvements in the voting system.
The following 3 items are things for discussion that might/should be at the top of the list:
Security documentation is a good place to start since we already have a framework in place. The security evaluation process shouldn't be put together from bits from vendors. Requirements, mechanisms, and policy need to be articulated by the vendor, along with why they think these are sufficient to support the security objectives listed. A good place to push forward - it doesn't require any changes in the systems being sold. Makes the evaluation system for security more robust and reliable. Qualitative improvement should be high.
JW: If voting standards have general statements, you get the least amount possible, so we might want to be more specific, with possibly an outline and details about what we expect from the vendor. Our challenge will be to write the specs.
JK: This links to open-ended testing: wherever there's a requirement statement, there should be a statement saying "how" the requirement is being met, pointing to where it can be reviewed for verification. Proposed as a high priority item - should it be? Seems like it should be a part of every other component.
JK: Documentation requirements should be included where helpful to the evaluator to judge where it passes/fails.
JW: Two audiences - testing lab and users. Some effort should be put forward to identify each aspect of the system that requires procedures.
SQ: It doesn't have to be human-readable in order to make it into the standard.
AE: VVSG primary users are the testing labs.
JW: We started out in relation to security systems and how they are presented to labs and how they are used by vendors, but we've discovered as time goes on that usability and security are important for users and election officials. A lot of the problems we are seeing now are because systems are not well documented and people are not appropriately trained. One of the tests should be whether or not procedures are plausible and can be followed. We realize for 2007 that we are not going to be able to do as good a job on security for poll workers as we would like. Procedures for IT systems need to be present in order for the security goals of the IT system to be effective - what the labs need to evaluate is the system and its documentation. Vendors must be specific about what operating systems they are using, and what off-the-shelf software they are using.
IDV Related Requirements. Definitional proposal - software independence is a useful term to help us characterize what IDV is all about and what kind of testing we want. System is software dependent if correctness of the election results depends on the correctness of the software, that is to say if an undetected error in the software could cause an undetected error in the election results. CSD is willing to accept this proposal as the framework. The character of the testing we do should depend on whether the system is software dependent or not.
JK: The first question we have to ask is whether we are, in VVSG 2007, certifying closed box DREs.
JW: Opinion is no; IDV with two sets of records should be the way to go. Not discussed with the TGDC as a whole. Issues may arise because some states are invested in black box DREs and what happens to them, an issue which should be handled by the EAC, especially if we go in a different direction and those are no longer certifiable. The other issue is the debate over VVPAT systems, one of the major types of software independent systems, as to whether or not they actually achieve their goals in providing the detectability and auditability that we want. The current round of VVPAT has some issues. The proposal that IDV systems be the only ones that are certifiable is a good place to start from. If we do not write requirements for closed box DREs and we are overridden, the security holes will continue to exist and we will do nothing to make them better. If black box DREs are allowable, then a vendor proposing a software dependent system needs to go to extraordinary lengths to prove that the software is reliable; this may require formal proofs that other systems do not require. If NIST decides to take this approach, it must be organizationally backed. A fairly simple white paper needs to be put out ahead of the December TGDC meeting discussing this approach and framing the issue. If we are not going to support black box DREs, then we need to give alternatives. Make sure that the TGDC does not think their only course of action is to throw out current DREs. We are not talking only about existing systems, but new systems down the road. VVSG 2007 certified systems will not be available until 2012. Based on a 10 year system lifetime, they do not want people to think they have to throw out the systems that don't meet the 2007 standards. IDV spans both categories - dependent and independent.
ACTION: The paper Ron did on software dependence needs to be reworked into a white paper that STS recommends for review by TGDC members. See John Kelsey's email for possible plausible strategies. Email conversations to follow. Move forward with the proposal that black box DREs are not going to be supported.
One of HFP's goals is to increase universal accessibility over several years. HFP wants comparable functionality. Paper is not accessible by itself, but when used with a voting system with accessible functionality, it is OK.
Open Ended Vulnerability Testing. Ron feels this is something that can provide major qualitative improvement in the security of voting systems. If we can get this up and flying by December, it will be a major milestone. At the monthly meetings between EAC and NIST, the question of what can be done now to improve voting systems keeps coming up, and open ended vulnerability testing keeps coming up as an answer. This should be a high priority because we may need to provide material on this before the July timeframe. There is some skepticism that open ended vulnerability testing will improve the security of the systems. Might be another matter for the TGDC to take on as a whole. If we are trying to test existing systems, penetration testing has a lot less appeal than if we were testing future systems. The goal is certification of new systems primarily; it is not as important for older systems.
ACTION: Meet internally to discuss options and resources.
Next meeting September 19, 2006.
Meeting adjourned at 11:55 a.m.
John Kelsey e-mail:
Some Notes on Alternatives for Audit Architecture Strategy
In the last couple of months, we've had a couple of apparent changes of direction with respect to what I'll call the architectural requirements for allowing auditing in voting systems. We need to make some decisions about the direction we're going to take, given the time and resources we have available and what we think will actually be used.
The two big problems with DREs are:
a. The impossibility of meaningful audit of DRE results, and the resulting requirement to trust the correct operation of the software and hardware.
b. Implementation or design flaws that make the DREs less secure than they have to be.
1.1 IDV
Our original goal was to address (a) with IDV. That is, we could write standards for pure-electronic systems that were designed to be meaningfully auditable. This centered around the idea of some independent piece of software or hardware which could be used to audit the correct recording of the interaction between the voter and the voting machine. (This should sound a bit like Chaum's observer chips.) The good thing about this is that while there are no IDV systems (other than VVPAT) on the market, there's nothing all that hard about building one from available technology. While using an IDV system still leaves us trusting some software and hardware, we only end up trusting that one of two systems hasn't been tampered with.
My sense is that if we have standards for IDV (assuming we can write meaningful ones), we may be able to exclude pure, closed-box DREs. This would be a really good thing, and getting people from unauditable black boxes to pairs of black boxes that audit each other seems like a pretty big win. (We still need to address (b), which is where all the headline-grabbing attacks have happened.) I'm not sure this is possible, but it might be.
The downside of all this is that IDV gives you some audit, but not a really strong audit. It's like the difference between having two different accountants who work in the same office checking each other's work, and having a completely independent auditor come in and go over the books and all the supporting records. It's very easy to imagine that the two different modules in the IDV system will be corrupted at the same time--they'll be stored under the control of the same person, they'll likely have some software in common, they'll be sold by a consolidator who buys the individual components and hooks them together into full systems, etc. What I see as the big problem with IDV is that it pushes us toward a future that is still software-dependent. The clever attacker has to take over two machines, or three, or ten, but then he gets to fix the election.
1.2 Paper (Direct Verification)
John and Ron's software independence paper basically pushed us toward a different direction--not allowing any voting machines which can't be meaningfully audited in a way that doesn't depend on any software or hardware operating correctly. Excluding crypto (which does rely on software correctness, but allows a skeptic to write his own software, and which is not at all ready for standardization), this means we accept only paper systems right now. My concern with this is that I don't think it addresses the DRE question in a way that's actually workable. I'd love to see us say "DREs are no longer going to be part of the standard," and if we can do that, we should. But I think if we write that kind of standard, the TGDC won't accept it, and the EAC will ignore it and keep allowing DREs, and we will be left with no meaningful improvement in the security of the existing crop of DREs, because we won't have written meaningful standards for them.
1.3 End-to-End Crypto
I'm pretty convinced that some kind of crypto voting scheme, like those pushed by VoteHere and David Chaum, is where we'd like to end up. I think there are still some pretty big issues to hash out about them, and we need some kind of operational experience running small elections--I know David Chaum is working on this with the student election competition. I've heard that some states have provisions for the use of experimental voting technology, which is a potential starting point for this stuff. (I keep thinking one of David's all-paper schemes might make a good absentee ballot scheme.) The biggest issue with these is how we standardize the cryptographic protocols. We can describe this class of systems at a high level, and do a Brennan-center type threat analysis of them, but the process of deciding that a proposed voting protocol is secure is going to be harder to deal with. To my mind, protocols are much harder to get right than cryptographic mechanisms like hash functions or block ciphers, but NIST took many years to settle on one acceptable block cipher, and we'll likely do the same with a hash function. It's hard for me to visualize us or anyone else being able to confidently approve a whole proposed voting protocol in much less time.
2 So, What About DREs?
The elephant hiding under the rug (I love mixed metaphors) is what we do about DREs. I see a few plausible strategies, and I'd like to discuss which of them makes sense, both in terms of technology and in terms of what will actually get accepted. (I don't know about you guys, but I'm not really interested in writing standards that get ignored. Life's too short.)
2.1 No DREs in the VVSG2007, Software Independence
We could get rid of any support for DREs in the next version of the standard. Just say no! But then we have to decide what to do about the present and the future. If we go with the software independence notion from John and Ron as a requirement, along with allowing crypto when it's ready, we get a path forward. If we want to do this, I'll be very happy. In that case, we could really try to push the crypto stuff forward, with the hope that existing DREs slowly die off and ultimately are replaced with some kind of end-to-end crypto scheme. But we're a long way from being able to standardize on a crypto scheme, so for the next 10 years or so, probably only paper-based systems would be certified. In security terms, this is probably the best we could do, if it is followed. If we end up certifying DREs anyway, we lose our chance to address some of the huge security problems with them. (DREs could be made very hard to attack for outsiders, though probably not for insiders.)
2.2 No Closed-Box DREs in VVSG2007, IDV
We could also not certify any pure closed-box DREs in the VVSG2007, but write a somewhat theoretical standard (since we have no operational experience, only a few prototypes) for IDV systems. The advantage of this is that some vendor could build a system of this kind from existing components in a couple of years.
This would require writing some pretty extensive requirements for IDV, a lot more than just requiring an audit port on the side of the DRE. One downside here is that we could end up pushing people toward IDV, and thus ultimately away from crypto. Since crypto has the potential to be a lot more secure than IDV, that's a little troubling.
2.3 DREs in VVSG2007, Tighter Requirements
We could write requirements for DREs in VVSG2007 and allow them to continue to be certified, subject to some much tougher testing and security requirements. This is where both the audit port and the OEVT stuff come in. We could then require software independence for VVPAT and Opscan systems (which basically means specifying how they must be audited). We could also write some theoretical discussions of crypto for the future.
This seems like where we're headed right now. We certify DREs for now with tighter requirements, but try to ultimately push software independence where we can. We encourage the eventual move to crypto.
2.4 DREs in VVSG2007, IDV
Finally, we could both allow new DREs with tighter requirements, and also push IDV architecture as a future alternative. I think this doesn't make much sense.
3 My Ideas
I'd like to see us do one of the following:
a. Move to IDV for DREs now, and push toward crypto in the future. (This probably requires more work than we'll be able to get done in the available time.) Don't support closed-box DREs anymore.
b. Don't support DREs anymore. Move to paper now, and push toward crypto in the future.
What I think we'll end up doing, in practice, is:
c. Allow DREs with tightened security requirements and no IDV, and make some vague handwaves about how we would like to see crypto systems in the future. Also make some vague comments about how software independence is worthwhile.
I suspect that (a) is politically possible but pretty hard to actually write, and that (b) is much easier to write but probably will either be changed or ignored by the EAC. That lets us keep relatively clean hands, but it does mean that the current security problems with DREs don't get addressed in any comprehensive way in the standard.
Anything we do with a full IDV spec is going to be a pretty theoretical standard (we'll be standardizing stuff that exists only in prototype form right now), and will take some serious time. However, a lot of that time is going to come from the requirements for OEVT, which is going to take another year anyway. So maybe making IDV take that long, and getting down to specific testable requirements then, is okay. I'd love to hear some discussion of this.