Comments on the FEC's Voting System Standards Proposal

By Douglas W. Jones of the University of Iowa

submitted to the FEC by E-mail July 13, 2001.
Indexed on the web at http://www.cs.uiowa.edu/~jones/voting/


I've read through the entire proposed voting system standards, found at

http://fecweb1.fec.gov/pages/vss/062801vss.html
My initial remarks are appended. By way of introduction, I have served on the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems for almost a decade, and I now chair that board. I have served on the computer science faculty at the University of Iowa for over 20 years, and I have testified about voting systems and practices before the United States Civil Rights Commission and before the House Science Committee. My full vita can be found on my web page at http://www.cs.uiowa.edu/~jones/

1 Introduction
1.1 Objectives and Usage of the Voting System Standards

"Essentially, [the standards] address what a voting system should reliably do, not how the system should meet this requirement." This is laudable, but in fact, again and again, the standards end up mandating solutions when they ought to be setting goals. I have pointed out some of these in my later comments.

1.5 Definitions
1.5.2 Paper Vote Based Voting system

This is a remarkably awkward term. Machine readable paper ballot would be preferable, or to be very precise, perhaps machine readable Australian paper ballot. The phrase "paper vote based voting system" is ungrammatical and ambiguous (is it paper-vote based or paper vote-based?), and it fails to mention the involvement of machinery.

The term machine readable physical ballot might be even better, because I can propose several machine-readable ballot formats that aren't paper-based; use plastic, for a trivial example, or use little clay balls, like they did in ancient Greece.

1.5.3 Electronic Voting System

Personally, I'm wary of the change from direct-recording electronic voting machine. After all, mark-sense readers are electronic vote tabulating machines, and in the context of the complete voting system, including election definition, voting, vote tabulation, and consolidation of the canvass, the use of machine readable paper ballots is only a small component in an otherwise purely electronic system.

Also, we've had a decade to get used to the phrase direct-recording electronic voting machine, and the initialism DRE. Given the disadvantages of the proposed new terminology, I, personally, won't abandon the old, and a standard can't afford to invite that kind of reaction.

1.6 Application of the Standards and Test Specifications
1.6.1 Qualification Tests

"Qualification tests validate ..." is an odd choice of wording. I'm not sure that testing can validate something. It may confirm something, but see my comments on section 9.2 on the limits of testing.

2 Functional Capabilities
2.2 Overall System Capabilities
2.2.1 Security

There is an HTML error that makes the heading 2.2.1 the same size as that for 2.2; it ought to be one size smaller (2.2.2 is correct).

I see no recognition of the fact that security is and has always been a matter of policy, as much administrative as technical. The most a voting system can do is to provide protection mechanisms that are sufficient to implement the desired policy. While some of the necessary policies can be embodied in the machinery, others will necessarily rely on administrative functions that are outside the scope of the standard as given in section 1.1 -- this standard can demand that voting machines have a secure lock, but effective security requires that the administrators of the system keep the key in a secure place, as opposed to, for example, leaving it in the keyhole.

2.2.2 Accuracy and Integrity
2.2.2.1 Vote Accuracy Measures
2.2.2.1.1 Common Standards

These standards are largely generic, but there are some very specific things we can demand of a voting system. Specifically, in regard to error detection and correction, I strongly advocate that, for each race, we require counting the votes for each candidate, plus overvotes, plus undervotes, and that these numbers be carried forward from the voting machine all the way to the final canvass. At each stage in the counting, for simple elections, the sum of these numbers can be compared with the number of ballots that have been counted.

I've written this up in detail in:

http://www.cs.uiowa.edu/~jones/voting/counting.html
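The reconciliation invariant advocated above can be sketched in a few lines. This is a minimal illustration of my own, not code from any voting system; the field names are hypothetical, and it covers only simple one-vote-per-race contests, as noted above.

```python
# Sketch of the per-race reconciliation check: for each race, the
# candidate totals plus overvotes plus undervotes must equal the
# number of ballots counted. This check can be repeated at every
# stage of the canvass as the numbers are carried forward.

def race_totals_consistent(candidate_votes, overvotes, undervotes,
                           ballots_counted):
    """True if the counts for one race account for every ballot."""
    return sum(candidate_votes) + overvotes + undervotes == ballots_counted

# Example: 400 ballots, a two-way race with 5 overvotes and 12 undervotes.
assert race_totals_consistent([250, 133], 5, 12, 400)
# A discrepancy of even one ballot should be flagged, not ignored.
assert not race_totals_consistent([250, 133], 5, 12, 401)
```

Multi-seat "vote for N" races need a generalized form of this invariant, but the principle of carrying all four quantities forward is the same.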

2.2.2.1.2 Electronic System Standards

If the purpose of this independent record of each ballot is to allow recount, and more generally, to assure that the design is testable and auditable, we need a far stricter requirement. See the web site cited above. Specifically, the phrase "independent and distinct" must be very strictly enforced; ideally, the software used to store this independent record of each ballot should have no communication with, and be developed independently of, the software used in the normal ballot processing. In my ideal design, it would run on a different CPU, with its input obtained from wiretaps on the hardware interfaces to the display screen and touch panel. Any software components shared between the primary and the independent and distinct ballot processing paths must be subject to audits that are far more stringent than the audits for other software components.

2.2.3 System Audit
2.2.3.1 System Audit Purpose and Context
2.2.3.2.1 Time, Sequence, and Preservation of Audit Records

The requirement that each system include a real-time-clock poses security threats. Audit records must be timestamped, but there are other system components that, ideally, should not be aware of the time and date!

2.3 Pre-voting Functions
2.3.1 Ballot Preparation
2.3.1.2 Ballot Formatting

Allowing simultaneous display of all choices for a single contest or office on the same page (screen, column ...) is laudable. It is clear that this was a central problem in Florida in the 2000 general election. However, no voting system can meet this demand arbitrarily. Give me ballot pages the size of bedsheets, and I can set ballot access rules that would overflow the page. In sum, either this section or 2.3.1.1.1 must specify the maximum number of candidates for one office that must be allowed. (I cite 2.3.1.1.1 because the 500 voting position requirement is stated there.)

As I understand the history, Florida, in 2000, had 11 presidential candidates on the ballot, more than they'd ever had before, and more than any other state. Unfortunately, many voting systems had only anticipated 10 candidates per office, so county election officials across Florida had to find inventive ways of dealing with this. In case after case, their inventions proved seriously flawed -- butterfly and two-page punched-card ballots, presidential candidates in two columns of a mark-sense ballot, etc.

We must put part of the blame on ballot access rules that got too easy in Florida (I suspect they are too difficult in other states).

Also, presidential and (in many states) gubernatorial races are a special case because they must list a candidate and running mate by each voting position. Thus, a ballot format that accommodates 20 candidates for county sheriff on one page may only have room for 10 presidential or gubernatorial alternatives.

2.3.5 Verification at the Polling Place

The requirements given here are useful, but unless the system audit is much stronger than I have reason to expect from my experience reading Wyle Labs audit reports, and unless there is far stronger equipment security than I have been led to expect from local jurisdictions, all of this is mere pro-forma gobbledygook.

It is just too easy to write software that will print out the expected report with no reference to what is actually running or to the actual totals in any particular registers. Item d, the zeros' report, is particularly annoying in this regard. This had real value with recording lever machines (the kind that imprinted the values in all the registers on bedsheet paper), but it has very limited value with computerized systems where both the clearing of the counters and the printing of the zeros' report are done by the same software.

2.4 Voting Functions

2.4.4 Augmenting the Election Counter (for Paper-based Systems)

The term public counter has been in use for so long that this term really ought to remain in use. Furthermore, this term is used in 3.2.4.2.6.

By the way, why are we augmenting the counter -- the verb "to augment" usually implies the addition of functionality, while "to increment" means to add to in the numerical sense. I suspect the latter was intended. This applies to 2.4.5 as well!

2.4.5 Augmenting the Life Cycle Counter (for Paper-based Systems)

The term protective counter has been in use for so long that this term really ought to remain in use, unless this counter has taken on a new use, for example, to determine when maintenance actions such as pinch roller replacement, read-head cleaning and so on are needed. If this is the case, this purpose should be documented.

See my comments on 3.2.4.2.7 for more on this issue.

2.5 Post-Voting Functions
2.5.2 Closing the Polling Place (Internet Voting Systems)

In fact, the procedures given here make sense for a central-count polling-place Internet voting system, but they don't allow for a precinct-count Internet voting system at all. The use of the internet to report totals from a precinct-count DRE machine to a counting center at the time the polls close is technically feasible, and I see no reason that the standard should arbitrarily prohibit such a mode of operation. Part of the problem may be with poor selection of terminology -- the use of the Internet, as opposed to dial-up phone lines or packet radio, is not really the issue.

For any type of voting machine installed at a precinct polling place, there is significant value to having the machine accumulate totals for that precinct independently of any totals computed at a remote location using ballot images that have been transmitted over any type of communications channel (including hand carried disks). If a local total is computed, that total really ought to be made public at the precinct in order to allow public oversight of the canvassing process.

3 Hardware Standards
3.2 Performance Requirements
3.2.1 Environmental Requirements

Are some or all of the requirements stated in the following subsections lifted from existing standards for commercial or military equipment? It would be appropriate to cite these standards here.

3.2.1.4 Electrical Supply

The voltages and line frequencies listed are all nominal, with no statement of noise or brownout tolerance. These constraints really ought to be lifted verbatim from (or tied by citation to) the relevant commercial or industrial standards.

3.2.2 Control Requirements
3.2.2.7 Error Recovery (Precinct Count System)

Item d is odd. It discusses a mechanism that is a key part of error recovery in general, and hints that this mechanism may not be allowed in some systems (electronic systems equipment, a nonstandard term). In fact, all error recovery in computer systems is done through a combination of checkpointing, redundancy and re-execution from a transaction log. All of these are mechanisms, and the standard must be very careful when it deals with mechanisms if it is to conform to the objective given in 1.1 that it dictate results and not means.

3.2.4 Vote Recording Requirements
3.2.4.1 Paper-based Recording Requirements
3.2.4.1.3 Marking Devices

I thought a pen or pencil was a marking device, and the wording of this section suggests otherwise (by mentioning pens to be used with marking devices). And why not pencils? Optical mark scanners were developed with "number 2 soft lead pencils" in mind, and there are many people who are well trained in their use.

In any case, marking devices for optical mark sensing cannot be shown to meet any performance standards without consideration of human factors, and sadly, this standard explicitly avoids such issues as currently formulated.

The current situation with regard to optical mark sense voting systems must not be allowed to continue. Vendors set the standard, in their hardware, of what constitutes a vote. This standard is not well documented -- what is documented is a set of instructions for how to make a mark that will be guaranteed to be counted as intended, but many voters (the numbers are unknown) mark ballots in other ways, and these votes are counted in unknown ways by the machines we are given.

In the event that hand counting is employed, and it is routinely, a completely different set of criteria is applied by the people who look at the ballots. People are guided by state law and by an intuitive sense of what the marks indicate about voter intent. Because the vendors do not tell us what kinds of marks their systems sense, and they do not tell us how their machines distinguish between marks that are votes and marks that are not, we cannot even begin to determine whether the machines count votes in conformance with current law governing what is a vote.

Fortunately, the vast majority of voters are careful to make marks that are unambiguous, but in a close election, the few who are careless in their marking carry great weight.

3.2.4.2. Electronic Systems Recording Requirements
3.2.4.2.3. Vote Recording

The requirement in item b for redundant storage is discussed in my comments on section 6.6.2, among others. This section seems like something I remember from the 1990 standard, particularly the phrase "with polling to detect discrepancy." Like the older standard, there is no statement of what should be done when discrepancy is detected.

In fact, the entire statement about redundancy with polling to detect discrepancy is a statement of mechanism, and as such, its inclusion in the standard violates the spirit of the goal stated in section 1.1 that the standard dictate results and not means. The result you want is that votes cast shall be retained in a form that can be counted despite the failure, destruction or corruption of any single memory subsystem in the voting system. It is true that redundant storage is required to meet this requirement, but it is a mistake to dictate how that redundancy is exploited by saying things like "with polling".

3.2.4.2.7 Protective Counter

I understand the use of the protective counter in classical lever machines, where the lack of any physical reset mechanism on the protective counter prevented it from being reset by any means without first breaking physical seals and destroying its casing.

Short of using an electromechanical counter without a physical reset mechanism, none of the protective counters I've seen on electronic voting systems (including mark-sense ballot readers and DRE machines) have had this degree of integrity. So long as the protective counter is just a record on the disk drive or in EEPROM on the voting system, its value in the audit process drops to near nil, and as such, I wonder if our requirement for this counter is merely a vestige of the bad old days of lever machines and does not represent any real security.

Furthermore, with modern DRE machines, the time "since the unit was built" is difficult to determine. For many purposes, with many DRE systems, it seems reasonable to say that the machine is built at the polling place just prior to opening the polls. This is particularly true of machines where the voting stations are networked and where each station stores its ballot images redundantly by saving one copy locally and one copy in the memory of an adjacent station or in all other stations in the same polling place. Add to this the fact that the machine is completely unable to perform election functions until an election definition card is inserted. That card really defines it as a voting station, and with a different card, it could have been defined as the control station used by an election judge to enable the other machines in the same precinct. I see systems from Fidlar and from Global that embody many aspects of the situation I've outlined, and I can't for the life of me figure out the role of the protective counter in this setting.

3.2.5 Paper-based Conversion Requirements
3.2.5.1 Ballot Handling
3.2.5.1.1 Outstacking

At the bare minimum, I'd like to see all central count systems and all precinct count systems that don't return overvoted ballots to the voter include provisions to separate overvoted ballots from the stream. This could be done in a separate pass through a counting machine that has 2-way outstacking, or it could be done using a multiway outstacker.

I want this because I believe that every overvote should be examined by a person -- if not the voter, then a team of two election workers (representing opposing parties, of course). Wood chips in the ballot stock, stray pencil marks (so-called hesitation marks) and smudges may all end up converting legitimate votes into overvotes, and people can easily identify these cases. Some optical mark machines distinguish these quite well, but others are poor at dealing with them. We owe it to the voters to get it right!

3.2.5.1.2 Multiple Feed Prevention

1 in 5000? This seems awfully high. Recounts with reasonable machines in current use frequently get within 1 in 10000 on the overall count. If current equipment does this well, including multiple feed and all other problems, it seems quite improper to accept 1 in 5000 for multiple feeds alone!

3.2.5.2 Ballot Reading
3.2.5.2.1 Read Accuracy

I have seen Item a used to define valid marks, instead of the other way around. See my testimony before the House Science Committee, http://www.cs.uiowa.edu/~jones/voting/congress.html, under the heading Accuracy Standards, a Mark Sense Example.

In short, in testing a Chatsworth mark-sense reader, when we observed serious deficiencies, the vendor used this defense. The problems involved were largely human factors problems, and they were very serious, leading to a net accuracy for the whole system of about 1 in 200. The major problem was with the ballot marking instructions (they were very optimistic about the ability of the reader to read marks made with any pen or pencil).

The read accuracy section makes no mention of sensing thresholds or uniformity of the thresholds across the different sensors. It is crucial that these thresholds be set so that the same mark will always be interpreted the same way, if read by a different photosensor, possibly on a different scanner. Some scanners can read ballots upside down, rightside up, front to back or back to front. This means that each vote may be seen by any of 4 different sensors. This must not change the interpretation of a significant number of votes!
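The threshold-uniformity concern can be made concrete with a small sketch. This is an illustration of my own, not any vendor's sensing logic; the darkness scale and the threshold value are assumed for the example.

```python
# Illustration of the threshold-uniformity problem: the same physical
# mark, read by different photosensors (or the same ballot fed upside
# down), must yield the same vote/no-vote decision. If per-sensor
# calibration drifts by a few percent, a marginal mark flips.

VOTE_THRESHOLD = 0.35  # fraction of light absorbed; an assumed figure

def is_vote(darkness, threshold=VOTE_THRESHOLD):
    """Hypothetical hard-threshold interpretation of one mark."""
    return darkness >= threshold

# A marginal mark just below the nominal threshold:
mark_darkness = 0.34
sensor_a = is_vote(mark_darkness * 1.05)  # this sensor reads slightly dark
sensor_b = is_vote(mark_darkness * 0.95)  # this sensor reads slightly light
assert sensor_a != sensor_b  # the same mark was interpreted two ways
```

A real standard would bound this drift, for example by requiring that no mark within some guard band around the threshold be interpreted differently by any two sensors in the system.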

Also, the wording of this section seems to assume that mark sensing is done by hardware. Many modern mark-sense readers used in voting applications use standard fax machine or image scanners, and do all mark sensing using image processing software on the digitized image of the paper ballot. The term "digitized image of the paper ballot" should not be confused with the term "ballot image" as we've been using it for a decade or so! This image processing subsystem can be quite sophisticated, and it's all software!

3.2.6.1 Paper-based System Processing Requirements
3.2.6.1.1 Processing Accuracy

How can this be measured without reference to human factors? It seems to me that measurement of error rates in mark-sense scanning equipment verges on the meaningless unless the marks are made by real people.

3.2.7. Reporting Requirements
3.2.7.1 Removable Storage Media

CD-R media are mentioned elsewhere; they should be added to this list.

3.4 Design, Construction, and Maintenance Characteristics
3.4.1 Materials, Processes and Parts
3.4.1.1 Ballot Cards

There are paper-based systems that don't use ballots in card form. I've got machine-readable paper ballots in my collection that are cards, but I've also got ballots as big as 11 by 17 inches, and printed on plain paper. The assumption that such ballots are necessarily similar to punched cards must be expunged from the standard.

4 Software/Firmware Standards
4.1 Scope
4.1.4 Exclusions

This section is improved, but it still excludes commercial operating systems from source code review. Unless some other part of the standard incorporates extraordinary controls, this leaves some large loopholes.

4.6 Software for Internet Voting Systems
4.6.2 Vote Accuracy and Integrity

The requirements for test ballots may contradict those for vote privacy! Test ballots are indistinguishable from real ballots (item b), yet, for each, we must be able to determine when it was sent and where it was sent from. Allowing this to be determined for non-test ballots would violate vote privacy, because the when and where information would be enough to reveal who voted that ballot.

4.6.3 Vote Privacy

The requirement that all record of a vote be erased from the voting machine once an internet voting session is terminated really ought to apply to the user interface subsystem of all DRE machines -- the presence of the internet between the user interface and the electronic ballot box doesn't change the need for this erasure. Furthermore, this is at the root of the problem I observed in one of the DRE systems a few years ago, when it was the window manager that was retaining information about the voter's choice and revealing this to the next voter.

5. Telecommunications
5.1 Scope
5.1.1 Types of Components

Hand carried diskettes, PCMCIA cards, etc are not covered by this section, yet many of the most significant flaws in the security of existing DRE and precinct count systems become quite obvious if you look at the problems with securing the data transmitted in such media. The comparison of hand-carried media with electronic transmission has been common since the late 1960's, so it is quite reasonable to expect an acknowledgement of this here.

5.2 Performance Requirements
5.2.3 Privacy

My first reading of this section suggests that there is a blanket requirement that everything be encrypted. This violates an objective stated in section 1.1 by mandating a mechanism instead of mandating goals that must be met!

In fact, it is also wrong! Certain information should only be transmitted after it is disclosed to the public (for example, subtotals in the canvassing process being sent from a local counting center or precinct). The encryption of such data is a mistake because it makes the auditor's job difficult by obscuring data that ought to be trivial to compare with the disclosed data. In fact, what is needed is a cryptographically secure checksum (electronic signature) appended to the plaintext, so that any observer may verify that the information transmitted is identical to the information publicly disclosed.
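The signed-plaintext alternative can be sketched with Python's standard hmac and hashlib modules. Note the hedge: a keyed MAC, as shown here, lets only key holders verify; to let any observer verify, as I suggest above, a public-key digital signature would be used instead, but the structure -- plaintext in the clear, integrity tag appended -- is the same. The key and report text are illustrative.

```python
# Sketch of "signed plaintext": publicly disclosed subtotals travel
# in the clear, with an integrity tag appended so that tampering in
# transit is detectable, while auditors can still read and compare
# the data directly against the publicly disclosed figures.

import hmac
import hashlib

TAG_LEN = 32  # bytes in an HMAC-SHA256 tag

def sign_report(key: bytes, plaintext: bytes) -> bytes:
    """Append an HMAC-SHA256 tag to a report; plaintext stays readable."""
    tag = hmac.new(key, plaintext, hashlib.sha256).digest()
    return plaintext + tag

def verify_report(key: bytes, message: bytes) -> bool:
    """Check that the tag at the end matches the plaintext before it."""
    plaintext, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(key, plaintext, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

key = b"shared-canvass-key"  # key management is a separate question
report = b"precinct 12: Smith 250, Jones 133, over 5, under 12"
wire = sign_report(key, report)

assert verify_report(key, wire)  # untampered report verifies
tampered = b"precinct 12: Smith 251" + wire[22:]
assert not verify_report(key, tampered)  # any alteration is caught
```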

This is evidence of a more critical problem with the standard. The requirements for encrypting or otherwise protecting any particular stream of data depend on the threats posed by that data, and there are two sensible ways for the standard to address this:

1) The standard could carry out the threat analysis for all of the commonplace data streams (ballots, ballot boxes, subtotals in the canvass, etc), and based on this analysis, it could dictate how each is to be protected.

2) The standard could mandate such a security analysis by the vendor for each data stream the vendor chooses to transmit.

There is some discussion of this in section 6, but it seems inadequate, and 6.5.2 seems to repeat the basic premise that encryption should be used everywhere.

5.3 Prohibitions

In effect, transmission of data in a hand-carried diskette or PCMCIA card seems to emerge as an alternative for the communication of the prohibited data listed here, yet there are no guidelines here for securing this data. I hope to find them in section 6.

6 Security
6.2 Access Control
6.2.2 Access Control Policy

All the subsections here seem to focus on humans as the data manipulators, and none focus on software components of the voting system as data manipulators. Humans are important, but program components are also actors in this game, and access control mechanisms that isolate parts of the system from each other are essential to the construction of verifiable systems in this arena.

6.5 Telecommunications and Data Transmission

Section 6.5.2 lists acceptable encryption standards, but I find no corresponding list of acceptable cryptographically secure checksum standards, and these are as essential as encryption.

6.5.3 Virus Protection ...
6.5.3.2 Virus Forms

The definition of worm given here is wrong! A trojan horse is not a form of virus! A logic bomb is not a form of virus! Furthermore, the vulnerability of systems to trojan horses and logic bombs does not depend on connection to telecommunications networks. Any system incorporating third-party software is vulnerable to trojan horse attacks, and any system incorporating software not written by the end user is vulnerable to logic bomb attacks. In sum, this section is weak!

6.5.3.3 Use of Antivirus Software

The blanket requirement for the use of antivirus software is incorrect and violates an objective stated in section 1.1. It should be sufficient to demonstrate that the data format used for data transmission is incapable of transmitting a virus -- there are many inherently safe data formats.

Furthermore, if a data transmission format is used that is subject to virus attack, I would demand that the vendor justify this! Several commentators have described antivirus software as an example of "closing the barn door after the horses have left." The horses left the barn when data storage and transmission formats were adopted that allowed the inclusion of viruses in the data.

Technically, a virus cannot infect a system that contains no self-modifying code (interpreting this in its broadest sense, a sense that includes run-time linkage, just in time compilation, and interpretive execution of code created after the initial load of the program)! Therefore, section 4.2, if strictly interpreted, confers automatic protection against virus infections.

6.6 Internet Voting
6.6.1 General Security Requirements ...

Some of these are general requirements that apply to all voting, not just internet voting. They should be listed elsewhere and imported into this list of requirements. This is obvious for requirement a. For requirement b, it is worth noting that many DRE machines store (and allow transmission of) a voter's choices (the ballot image), so this too ought to be general.

My comments on 5.2.3 apply as well to item c. Blanket requirements that everything be encrypted are not sufficient! Each item that is transmitted must be analyzed to determine the threats. For example, in this case, the transmission of a set of ballot labels from the internet voting server to the user's machine may not need encryption, but it must be secured against tampering -- so, what it needs is a cryptographically secure checksum or signature.

Almost everything in this section should apply not only to internet voting, but to voting systems that use any public communications channel. For example, if voting is done by touch-tone phone (not a great idea) or by dial-up modem and personal computer, all of these ideas ought to apply, whether or not the Internet is used!

In Chicago, the precinct-count voting systems they use have the ability to radio the vote totals or ballot images to the county's tabulating center. The same radio technology could connect DRE machines to a server at a tabulating center, with no use of the internet, yet everything above the level of the radio link would be identical to an internet voting system, and these rules ought to apply. The airwaves are, after all, even more publicly accessible than the Internet or the telephone network.

6.6.2 Vote Server Data Center Requirements ...

The word architected makes me cringe, and system architecture has been my central interest for 30 years.

The requirement for redundant storage here and elsewhere in the standard is problematic because redundancy alone does not solve the problem. Furthermore, redundant storage is a mechanism, and what the standard should mandate, in keeping with section 1.1, is results, not specific mechanisms.

Redundancy does not guarantee the result you want, reliable operation in the face of failure or attack. Rather, redundancy is merely an essential mechanism that is required, among other things, to achieve this goal.

In looking at the crop of voting systems designed to the 1990 standards, which mandated redundancy in DRE systems, I have found that few system designers understand how to use redundancy effectively. The fundamental concepts of atomic transactions and stable storage are keys here, and if the standard fails to grapple with these, I'm afraid that we'll end up with gratuitous redundancy that does not lead to fault tolerance or reliability.

Specifically: In the event that redundant copies of a ballot image are found to differ, there must be a rule in place for determining which copy is to be counted and which is to be discarded. The redundancy requirement is, in effect, comparable to requiring that paper ballots be filled out with a carbon copy, and that these copies be deposited in two different ballot boxes. Effective use of this redundancy requires that there be a way to identify, for any ballot, the corresponding carbon copy, and yet we must guarantee that the information allowing this cannot be correlated to the identity of the voter. This isn't an easy requirement, and my initial reading of the standard suggests that it has not been fully understood.
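The missing discrepancy rule can be sketched concretely. This is my own illustration of one deterministic reconciliation policy, not a design drawn from the standard or any vendor; the record layout (payload plus stored checksum) is hypothetical. Note that the linkage between the two copies must itself be designed so that it cannot be correlated with the voter's identity.

```python
# Sketch of a discrepancy rule for redundantly stored ballot images:
# each copy carries an integrity checksum, and on readback a fixed,
# documented rule decides which copy is counted. The key property is
# that the rule is deterministic and that the "both copies suspect"
# case forces an audit rather than a silent guess.

import hashlib

def checksum(payload: bytes) -> bytes:
    return hashlib.sha256(payload).digest()

def reconcile(copy_a, copy_b):
    """Each copy is a (payload, stored_checksum) pair.

    Returns the payload to count, or None if no copy is trustworthy --
    a condition that must trigger a human audit, not be papered over.
    """
    a_ok = checksum(copy_a[0]) == copy_a[1]
    b_ok = checksum(copy_b[0]) == copy_b[1]
    if a_ok and b_ok and copy_a[0] == copy_b[0]:
        return copy_a[0]          # normal case: intact copies agree
    if a_ok and not b_ok:
        return copy_a[0]          # copy B corrupted; count copy A
    if b_ok and not a_ok:
        return copy_b[0]          # copy A corrupted; count copy B
    return None                   # both bad, or both "intact" but unequal

ballot = b"ballot-image-0042"
good = (ballot, checksum(ballot))
rotted = (ballot + b"\x00", checksum(ballot))  # payload damaged in storage
assert reconcile(good, good) == ballot
assert reconcile(good, rotted) == ballot
assert reconcile(rotted, rotted) is None
```

The hard case is the last one, and also the case where both copies pass their checksums yet differ; a checksum alone cannot say which copy reflects the voter's action, which is exactly why the standard needs to grapple with atomic transactions and stable storage.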

This issue is addressed only briefly in 3.2.3.2, and there, only with regard to long-term retention and not the active memory systems of an electronic voting system.

6.6.3 Voting Process Security for Poll Site Internet Voting Systems
6.6.3.1 Documentation of Security Activities at Poll Site

The presence of this section raises the alarming possibility that there is no requirement for documentation of security activities at the poll site for non-Internet voting! As far as I can tell, this entire section should be applied generally to all poll sites! Every DRE and precinct-count machine I am familiar with requires enabling prior to opening the polls, and this enabling operation is a security activity that must be governed by exactly the kinds of rules discussed here!

6.6.3.2 Capabilities to Operate During Denial of Service Attack

First, denial of service attacks are not the only threat. The exact same issues arise if a tree falls and severs the wires between a polling place and the local internet server. We have thunderstorms that do this kind of thing quite regularly in my region of the country. There is an old maxim in the field of computer system security that anything that can be done by a malicious person can also occur as the result of an accident.

Whatever the cause of the problem, this section makes it clear that the line between DRE and Internet voting is very vague. DRE machines equipped with communications equipment and able to transmit the contents of their "ballot box" to a tabulating center must be covered by the rules that are presented here for internet voting, whether the transmission is done only when the polls close, at the end of each voting session, or intermittently. It does not matter whether the DRE machine transmits intermittently because the communications network only allows occasional successful transmission, or because ballots are saved up and transmitted in batches!

6.6.4 Voting Process Security for Remote Site Internet Voting Systems
6.6.4.1 Request for Internet Balloting

This rule governs administrative procedures. Section 1.1 says that these standards are not intended to govern administrative procedures.

In fact, security (the subject of this chapter) always involves both administrative procedures and mechanisms, so the standard cannot fully stay away from the administrative, but the whole section on Internet voting seems far more willing to dictate administrative procedures than any of the other parts of the standard I have read.

6.6.4.2 Authorization for Internet Ballot

The first bullet introduces a requirement that the ballot be associated with the voter registration record. This requirement is in potential conflict with the requirement for a secret ballot. The conflict can be resolved through appropriate use of cryptography and strict access controls within the system, but the standard is lax in its discussion of access control within a system (see my comments on section 6.2.2).

6.6.4.3 Voter Authentication

The reference to a "single out of band transmission" is problematic because with communication systems such as the Internet, it is essentially impossible to count transmissions. When I hit a key on my keyboard and I'm communicating over the internet using the TCP protocol to a remote server, that keypress may result in no transmission (if the communication line is backed up), it may result in one transmission (what I wish was the normal case), or it may result in many transmissions (if there are errors requiring retransmissions).

This section makes it clear that remote site Internet voting is in many ways akin to absentee voting, and in fact, full implementation of the provisions of this section may require changes to absentee voting procedures. It is not clear that the standard recognizes this possibility.

6.6.4.5 Transmitting a Ballot to the Vote Server

Again, we have the requirement that voter ID information be transmitted with the ballot image. My comments on section 6.6.4.2 and 6.2.2 apply again.

6.6.4.6 Receipt of a Ballot by the Vote Server

Paragraphs c and g both require storage of the ballot. The first is for the purpose of fault tolerance, and the second is for the purpose of constructing an audit trail. Can a single storage mechanism satisfy both requirements?

If the ballot image is stored as part of the permanent record with the voter identification information required by 6.6.4.5 and 6.6.4.2, then we face a serious threat to the right to a secret ballot and the system designer must be required to detail, very clearly, how this threat is addressed. I believe that it can be addressed, but it's not an easy problem to solve.

If the voter identification is stripped from the ballot image before it is stored in any non-volatile storage medium (CD-R is suggested repeatedly), then the right to a secret ballot is easier to protect, but it may be harder to meet other requirements.
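The separation just described can be sketched in a few lines. The sketch below is purely illustrative and all names in it are hypothetical: the server checks the voter off the roll, then commits only the ballot image plus a keyed authentication tag to permanent storage, so the stored record proves the ballot passed through the server without recording who cast it. A real system would also need key management shared among election officials, audit logging, and protection of the roll itself.

```python
import hmac
import hashlib
import secrets

# Hypothetical server key, held only by election officials.
SERVER_KEY = secrets.token_bytes(32)

def accept_ballot(voter_id: str, ballot_image: bytes, roll: set) -> bytes:
    """Check the voter off the roll, then build an anonymous stored record."""
    if voter_id in roll:
        raise ValueError("voter has already cast a ballot")
    roll.add(voter_id)  # records only THAT this voter voted, not how
    # The permanent record is the ballot plus a keyed MAC; the voter's
    # identity never reaches non-volatile storage.
    tag = hmac.new(SERVER_KEY, ballot_image, hashlib.sha256).digest()
    return ballot_image + tag

def verify_record(record: bytes) -> bool:
    """Audit check: the MAC proves the ballot was accepted by the server."""
    ballot, tag = record[:-32], record[-32:]
    expected = hmac.new(SERVER_KEY, ballot, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

roll: set = set()
record = accept_ballot("voter-123", b"candidate A", roll)
assert verify_record(record)            # ballot is authenticated
assert "voter-123" in roll              # the roll shows the voter voted
assert b"voter-123" not in record       # the stored record does not say who
```

Note that this only addresses the storage side; linking the ballot to the registration record during authentication, as 6.6.4.2 requires, still demands strict access control while the two are momentarily associated.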

6.6.4.7 Vote Authentication and Separation from Voter Identification

This section is nice, but it does not answer the questions raised in 6.6.4.6, 6.6.4.5 or 6.6.4.2.

7 Quality Assurance

Most of the guidance in this section focuses on classic hardware quality assurance issues. Section 7.5 does require documentation that is essential for software quality assurance, but the requirement is stated in such broad terms that I can easily imagine a very perfunctory document being submitted as a system security specification (7.5.d.6) or a system software specification (7.5.d.4). I have seen too many of these "pro forma" specifications that exactly meet the letter of the law but in fact provide no assurance that the system was developed in a manner that would assure that it meets any quality standard.

7.5 Documentation

I would be far happier if, for each quality assurance document listed in section 7.5, there was a specification of the required contents for that document. I imagine that, in some cases, there are ANSI/ISO standards that govern such documents, and these should be cited!

8 Configuration Management
8.1 Introduction
8.1.2 Configuration Management Benefits

This is an odd section to find here. There is no need to include such salesmanship in the standard itself. When the DOD issued the Ada Standard, they provided a separate document, the Ada Rationale, and they put discussion of alternatives, justification for features, and similar material in the Rationale and not in the standard itself.

As far as I am concerned, however, there is absolutely no need to justify configuration management as a component of a voting system standard. I have been faced with too many voting systems where the vendor couldn't tell me what version of the system we were testing or what version of some third-party subsystem was being offered for approval. This is a disgrace.

9 Overview of Qualification Tests
9.2 Testing scope

This section suggests that qualification testing might be expected to assure the "absolute logical correctness of all ballot processing software, for which no margin for error exists."

In fact, no known methodology for testing can provide anything approaching an absolute assurance of correctness. Source code audits have been empirically shown to miss a significant number of errors. Black box testing can never demonstrate correctness. While white-box testing -- that is, testing based on full knowledge of the design gained by an audit -- has the potential to find all errors, empirical studies have shown that, in practice, many errors survive such rigorous testing and inspection without detection.

Design for auditability and testability can reduce the severity of these problems, but the Software Standards given in Chapter 4 mandate little. The Security Standards and Quality Assurance material in Chapters 6 and 7 could have helped by mandating design methodologies that would lead to testable results, but I find no such mandate. In fact, there are methodologies that focus on designing software to be auditable and testable, and the most elementary part of a test plan would be the requirement that the methodology used be documented!

9.2.1 Test Categories
9.2.1.2 Focus of Software Evaluation

The previous section on hardware evaluation is nice and specific, with reference to Military Standards (MIL-STD) 810D. I would have hoped for similar rigor in the software evaluation -- there are applicable standards! Instead, all I find is a brief paragraph that describes what is, in fact, the most difficult part of this entire process.

9.2.1.3 Focus of Telecommunication Tests

9.2.1.4 Focus of Security Tests

My comments on 9.2.1.2 apply to these sections as well!

9.2.2 Test System

The escrow requirement is mentioned in passing in section 6.2.1, and it shows up again in this section.

In fact, escrow is problematic, and the central issue is the focus of the mention in this section: "The software submitted for qualification shall be identical to the escrowed version."

In fact, if you hand me a typical modern voting system and you hand me a copy of the software that was escrowed, I would have an extremely difficult time determining whether the machine was running the version that was escrowed. For the escrow requirement to be useful, there must be an escrow plan that includes a clear statement of a feasible test that can be used to demonstrate that the software recovered from escrow is indeed the software that was subject to source code audit and the same as the software that actually resides in a machine.

I suspect that, for many current production voting machines that have satisfied the 1990 standard, there is no practical way to demonstrate that the software in the machine corresponds to the escrowed software. Furthermore, I suspect that an effective escrow mechanism must be considered from the start of the system design process for such testing to be practical.
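The feasible test that an escrow plan should specify could be as simple as the following sketch (the function names and the placeholder images are mine, not the standard's): publish a cryptographic digest of the build at escrow time, then recompute the same digest over a software image read back from a fielded machine. The hard part, as noted above, is not the comparison but designing the machine so that its resident software can be read out faithfully in the first place.

```python
import hashlib

def digest(image: bytes) -> str:
    """Cryptographic fingerprint of a software image (SHA-256)."""
    return hashlib.sha256(image).hexdigest()

# Placeholder images; in practice these would be the escrowed build
# and the software read back from a machine in the field.
escrowed_build = b"...voting system firmware image..."
fielded_image  = b"...voting system firmware image..."

# The escrow plan's test: the fingerprints must match exactly.
assert digest(escrowed_build) == digest(fielded_image)
```

A single changed byte anywhere in the image changes the digest, so the comparison is decisive once a trustworthy read-out path exists, which is exactly why escrow must be designed in from the start.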

9.3 Applicability
9.3.1 General Applicability
9.3.1.1 Exclusions

Earlier, in 9.2.1.1, MIL-STD 810D is cited. Now, in this section, we find a blanket exemption for a system composed entirely of off-the-shelf components, with no requirement that the individual components meet the standards that the test would have required. Not all off-the-shelf equipment, for example, is guaranteed to meet the thermal requirements of 3.2.1.5, and furthermore, the manner in which off-the-shelf components are combined can have a big impact on their thermal characteristics. Take an off-the-shelf motherboard, disk drive, CPU and RAM, put them in an off-the-shelf case, and you aren't guaranteed acceptable thermal performance unless you also use the right number of off-the-shelf fans!

9.3.2.3 Utility Software and/or Device Handlers

This exclusion is dangerous. Here is an illustration that I used previously in my testimony before the House Science Committee on May 22, 2001:

See http://www.cs.uiowa.edu/~jones/voting/congress.html for more detail, under the heading Exempt Software, A Direct Recording Example.

In summary, in January 1998, Fidlar and Chambers came before the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems with their EV2000 voting system. This system uses a version of Windows, and just prior to our examination, there had been an upgrade to the window manager component of this operating system (this would qualify, I believe, as exempt I/O software). Unfortunately, the window manager upgrade added an undocumented feature that had the net effect of disclosing to each voter the ballot selections made by the previous voter! The disclosure was in the form of very subtle shading around the "pushbutton" most recently selected from among the rank of "pushbuttons" presented for voting in each race. The vendor of the window manager (Microsoft) thought of this as a minor enhancement -- something that would help users coming back to a menu to remember the option they'd most recently chosen. The effect of this on the voting application was entirely unintentional, and worse yet, nothing in the documentation of this off-the-shelf component would have aided a security auditor in discovering this violation of the right to a secret ballot. The highlighting in question was subtle enough that two of the three members of the examining committee didn't notice it, and the third member (myself) voted a number of test ballots before realizing what was happening.

This example makes it very clear to me that blanket exclusions for industry-standard third-party software components are dangerous! In fact, at this point in time, I believe that a complete source code audit of the window manager, if one is used, is the only way to avoid this particular threat to voter privacy.

The only reason that a similarly complete audit may not be required for network or file-system device drivers is that cryptographic security (either encryption or cryptographically secure checksums) can protect data from the third-party components used here.
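The protection I have in mind can be sketched briefly; the names here are hypothetical and this is an illustration of the principle, not a prescription. The trusted application layer seals data with a keyed checksum before handing it to an unaudited driver, and verifies the seal on the other side, so any tampering by the third-party component in between is detectable even though that component was never audited.

```python
import hmac
import hashlib

# Key shared only between the trusted endpoints, never seen by the
# untrusted driver in the middle.
KEY = b"shared between trusted endpoints only"

def seal(data: bytes) -> bytes:
    """Append a keyed checksum before handing data to an untrusted driver."""
    return data + hmac.new(KEY, data, hashlib.sha256).digest()

def unseal(sealed: bytes) -> bytes:
    """Verify the checksum after the data comes back; reject any alteration."""
    data, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(KEY, data, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("data altered by an untrusted component")
    return data

# A round trip through a (simulated) untrusted network or file-system
# driver: the data survives intact or the alteration is caught.
assert unseal(seal(b"ballot totals")) == b"ballot totals"
```

No comparable seal can be wrapped around a window manager, because its "output" is pixels on the voter's screen; that is why I see no alternative to auditing it.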

9.5 Qualification Test process
9.5.2 Qualification Testing
9.5.2.6 Witness of System Build and Installation

This is a very important requirement, but it needs one additional feature: Specifically, the escrow copy required by section 9.2.2 (see my remarks there) should be made at this time in order to guarantee that the escrowed version is the version subject to qualification testing. I strongly recommend that the escrowed version be constructed first, and then the system build and all copies of the code used in the examination and test process be completed from the escrowed version.

Furthermore, I recommend that the vendor be required, after having installed the software, to demonstrate the methodology by which the installed software can be verified to be identical to that escrowed. At the completion of operational testing, this verification should be repeated in order to demonstrate that the software resident on the voting system after use is still the authorized version.

Appendix B Applicable Documents

In general, it would have been helpful if existing standards were cited where they are incorporated to any significant extent into this standard. This would make it far easier to recognize which parts of this standard are original or unique consequences of the voting system problem, and which are simple good practices that have become accepted standards in other domains. It would also be helpful because, when one of the other standards is upgraded or made obsolete, the corresponding parts of this standard must be inspected to determine the consequences, if any, of that change.