Voting System Transparency and Security:

The need for standard models

Part of the Voting and Elections web pages,
by Douglas W. Jones
THE UNIVERSITY OF IOWA Department of Computer Science

Submitted to the
U. S. Election Assistance Commission
Technical Guidelines Development Committee Hearing on Transparency and Security
National Institute of Standards and Technology
Gaithersburg, Maryland
September 20, 2004


Threats to Voting System Transparency

A central requirement of voting systems, in the larger sense that encompasses both the technology of elections and its human context, is that the entire process be observable. Both this right and its threatened status were clearly articulated by L. D. Lewelling in his 1893 Governor's Biennial Message to the Legislature of Kansas, where he said "As the law stands, the judges of election may utterly disregard the plain provisions of the statute respecting the rights of candidates and of electors to be present at the counting and canvassing of votes without incurring any liability, either civil or criminal."

A century later, this situation has not changed. This was driven home recently when I was involved in drafting a scorecard for rating the quality of election administrations. When shown a draft of this scorecard, one New Jersey election official commented that requiring open disclosure of consultants' reports on election administration on the internet would only be "airing dirty laundry" and was unnecessary. In fact, every time the draft scorecard mentioned the right of the public to observe some part of the election procedure, he became very nervous.

I have come to believe that this situation is normal, and I offer the following anecdotes in support of this:

At the canvassing center

In Lincoln County, Missouri, the only observers allowed at the canvass of the August 6, 2002 primary were the chairs of the county Democratic and Republican parties and a single member of the press, despite the fact that one Republican primary race was hotly contested by an outsider, Gregory Allsberry, running against the party establishment (reported to me by Gregory Allsberry). The most recent report I have received of such problems was from the U.S. Virgin Islands primary on September 11, where the Board of Elections apparently blocked media access to the vote tabulation center (reported by E-mail from Stephan Rockford on September 14).

The situation is made worse by the advent of new technology. With hand counted paper ballots, an observer could generally understand the process being observed. With computerized election systems, the ability of observers to understand what it is that they are seeing is seriously degraded. Watching a technician or programmer sitting at a computer operating the vote tabulating software, an observer must both have a clear view of the screen and understand the software itself before he or she can draw any useful conclusions about the honesty of the tabulation.

After Geneva Switzerland moved to Internet voting, according to Michel Chevallier, Secrétaire adjoint, Chancellerie d'Etat de Genève, one of their biggest early mistakes was a failure to explain, to the observers, what they were seeing. Watching a server tabulate ballots submitted over the Internet is singularly unenlightening. Chevallier concluded that observers should be offered training in the use of the election management system so that they can understand what is being done by the technicians and programmers running that system.

I do not believe that most observers from parties, the press or the public would have the time to take training in the use of the election management system, but at the very least, the instruction manuals for the tabulating machines and election management systems should be available to observers, preferably on the internet so they can study them in advance. In addition, observers must be able to clearly see the display screen of any computer being used in canvassing the election. It would not be unreasonable to ask that a video projector be attached to the video output of each computer.

At the precinct

One of the central complaints about modern direct-recording electronic voting systems is that their operation is not open to observation. Unfortunately, large parts of this complaint apply equally to the mechanical lever voting systems that were introduced over a century ago; in fact, it is perfectly appropriate to refer to these machines as direct-recording mechanical voting systems. Whether votes are cast on modern DRE systems or their older mechanical counterparts, both the voter and election observers must face the fact that they are dealing with black boxes, where the accuracy of the entire election rests on the honesty, thoroughness and competence of the technicians who have access to the innards of these machines.

We know that complete testing of mechanical lever voting machines was quite rare. Roy Saltman commented on this in his report, Accuracy, Integrity, and Security in Computerized Vote-Tallying (National Bureau of Standards Special Publication 500-158, August 1988, Section 3.3.2), and it is clear that what was difficult with mechanical lever machines is even more difficult with electronic voting systems, in general.

In addition to these long-recognized problems, all modern electronic voting systems pose additional problems that follow directly from the miniaturization of the technology. Where the automatic record of precinct vote totals from a lever machine was a sheet of paper described as a "bedsheet" because it was so large, the automatic totals produced by a typical precinct-count voting system are recorded on media such as compact flash cards or PCMCIA cards. The largest electronic media in common use today include devices such as the ES&S PEB, which is about 1×3×6 inches in size, or the somewhat larger memory pack found in the Optech III Eagle precinct-count ballot scanner.

If we confine ourselves to precinct-count systems, note how easy it is for an observer to determine that the ballot box dumped out for hand counting is the same ballot box that was used by voters, and note how easy it is for an observer to determine that the bedsheet removed from the back of an automatic recording voting machine is the same one that the election judges sign and witness for delivery to the county building.

In contrast, when a memory device the size of a large postage stamp or a pack of cigarettes is involved, it is very difficult for an observer to determine, with any confidence, that the device removed from the voting machine is the same device that was, only seconds later, inserted in the evidence envelope for transmission to the county building.

When transporting ballot boxes from the precinct to the county building, we have traditionally relied on rules such as: "The polling station's results can be conveyed to the electoral district (for instance) by the presiding officer of the polling station, accompanied by two other members of the polling station staff representing opposing parties." (Code of good practice in electoral matters Doc. 9624, European Commission for Democracy through Law (Venice Commission), 13 Nov. 2002, Section, Paragraph 50.) Where rules requiring joint custody are quite meaningful in the context of large objects such as ballot boxes, joint custody of a device such as a PCMCIA card or a compact flash card is problematic. Only when this card is sealed inside a much larger container can we begin to speak meaningfully about observability and joint custody.

Of course, the process becomes even more difficult to oversee when results are transmitted by radio or over public networks such as the telephone network or the Internet. In all of these cases, observers have no way of directly observing that the communication link that is actually established connects to the authorized remote party and not to some unknown third party. No amount of technical wizardry can change this, although we may indeed impose sufficient technical constraints to allow ourselves to move forward despite this.


All of our technical solutions to the problems described here rest on use of specific software in the voting system, at the tabulating center, or in portable devices. This raises a question: how can an observer assure himself or herself that the software that is actually in use is indeed the very same software that has been approved for use?

For the computer I am using to write these comments, I can begin to answer this question by clicking on the "About this Mac" option on my screen, which helpfully informs me that I am running Mac OS X Version 10.3.5. This message tells me, with real certainty, that I am not running an authentic version of, say, Mac OS X Version 10.3.4, because we can define authentic versions of the operating systems as those versions that honestly report their identity. Unfortunately, the self-reported identity of a piece of software does nothing to assure an observer that this software is honest!

In the case of my computer system, I trust the self-report of the system only because I personally installed the original version of the operating system on this machine, using media provided by the vendor, and because I trust the vendor's software update product to make secure connections to their web server to install operating system upgrades. Thus, a central element of my own personal trust here is that I personally had physical control of this computer system since it originally came out of the box.

The use of "software fingerprints" computed by some cryptographically secure hash function does nothing to change this fact. So long as the observer is limited to inspecting the self-declaration of identity of the system, there is no way for the observer to know whether that identity is declared honestly or not. The self-declaration that a piece of software has some particular MD5 hash can definitively tell you that the system is not the correct system, if the announced hash value is not the correct one, but it cannot tell you that the system is correct, since dishonest software could easily report a dishonest number.
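This asymmetry is easy to demonstrate in a few lines of code. The sketch below, in Python with hypothetical byte strings standing in for real firmware images, shows why a mismatch in an independently computed fingerprint is conclusive while a matching self-report proves nothing: a corrupted system can echo the expected value at no cost.

```python
import hashlib

def independent_fingerprint(image: bytes) -> str:
    # A hash computed by the observer's own tools over memory the
    # observer actually read; only this carries evidential weight.
    return hashlib.sha256(image).hexdigest()

# Hypothetical approved firmware image published by the certifying authority.
approved_image = b"approved firmware bytes"
reference_hash = independent_fingerprint(approved_image)

def dishonest_self_report() -> str:
    # A corrupted system need not compute anything; it can simply
    # announce the number the observer expects to hear.
    return reference_hash

# The self-report matches, yet it proves nothing:
assert dishonest_self_report() == reference_hash

# An independent hash of what is actually in memory is conclusive:
actual_image = b"tampered firmware bytes"
assert independent_fingerprint(actual_image) != reference_hash
```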

Only if the observer can directly examine the memory of the computer and compare it with a reference memory image can the observer really know that what is in the computer and what is authorized to be there are the same. If we allow this comparison, we compromise the author's right to retain this software as a trade secret. In addition, if we are not very careful, the same memory access that allows inspection can also allow modification, thus elevating the election observer to the status of a security threat.

It may be possible to protect proprietary software from disclosure to observers if we allow the observer to run software on the voting system that has read-only access to the system memory and a very narrow channel through which the software can announce the cryptographically secure hash code it has computed. This requires that the observer trust the processor on the system to accurately run the hash-checking software, it requires that the firewalls protecting the system from the observer's software be secure against attacks by the observer's software, and it requires careful design of the choked-down channel by which the observer's software can report the hash code without disclosing the proprietary software itself.
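A minimal sketch of such a check program follows, in Python; the read-only memory view stands in for the hardware firewall, and the single hex digest is the only output permitted on the choked-down channel. This illustrates the idea only, not any fielded system.

```python
import hashlib

def attest(memory: memoryview) -> str:
    # The observer's check program: given a read-only view of system
    # memory, it may emit exactly one fixed-length message, the digest.
    # Nothing else crosses the channel, so the proprietary code cannot leak.
    assert memory.readonly          # the firewall we are obliged to trust
    return hashlib.sha256(memory).hexdigest()

# A hypothetical 4 KB region of system memory; a view of a bytes object
# is read-only by construction.
system_memory = memoryview(b"\x90" * 4096)
digest = attest(system_memory)
assert len(digest) == 64            # 64 hex digits and nothing more
```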


Our current system of voting system certification illustrates a major failure in voting system transparency. As things stand right now, voting system testing under the FEC/NASED voting system standards (1990 and 2002) is an entirely closed process. The testing authorities are not obligated to disclose any report of their testing to the public other than a simple pass-fail judgement, while hundreds of pages of test results are sent back to the vendors.

There is an overwhelming public interest in the integrity of our election machinery, and this interest extends to all questions about the competence and thoroughness of the testing to which our voting systems are subjected. As things stand, the voting system vendors have been allowed to hide behind a myth of thorough and painstaking testing, telling not only the public but also state and county authorities that these tests prove the security of their systems when they do no such thing.

Threats to Voting System Security

There is ample documentation at this point demonstrating that many voting systems on the market today suffer from serious flaws in their security. I have written about this at length; a good short summary of the situation is contained in my open letter To the election officials of the State of Iowa, a position statement presented to the Iowa State Association of Counties, Des Moines, Iowa, March 17, 2004, at the invitation of Iowa Secretary of State Chet Culver. This is on the web at:

In addition, unfortunately, many of the assessments of voting system security have contained serious defects. These defects reflect badly on voting system vendors, the independent testing authorities, and on security professionals, suggesting that voting system security is a sufficiently specialized domain that many security experts have not correctly identified some of the fundamental security requirements of the voting domain. I have written about this in my submission to RSA Cryptobytes of August 25, 2004, scheduled for publication sometime soon and included in full as the major part of what follows.

Many people believe that elections are really trivial, and when an election is conducted in a small group by a show of hands, security is not an issue. Everyone present can observe the entire process and determine the result for themselves. Security becomes an issue when the number of participants grows to the point that the voters cannot all vote in the same room, and it becomes an issue when secret ballots are introduced in order to protect the rights of voters who oppose the powerful or hold unpopular opinions.

When elections are distributed between many locations, we must secure the conveyance of data between these locations, not so much because of the possibility of eavesdropping, but because we need to assure ourselves that it is authentic. In fact, almost all of the data we are interested in conveying is public. The ballot layout is usually published weeks before the election and the totals from the precinct are usually posted in public when the polls close. The only actual secrets included with this data are authentication keys being distributed for later use.

Unfortunately, these elementary facts appear to be lost on many voting system developers, evaluators and customers. The following examples illustrate this.

Fairy Dust

In the summer of 1996, a subcontractor working for Wyle Laboratories of Huntsville, Alabama evaluated the software of the Electronic Ballot Station, an innovative new voting system made by I-Mark Systems of Omaha, Nebraska. In the review of this software, the subcontractor reported that this was the best voting system software they had ever seen, and they were particularly impressed by its security and its use of DES. (See Qualification Testing of the I-Mark Electronic Ballot Station, Report number 45450-01, Wyle Laboratories, Huntsville AL, 1996, 336 pages. Note, this report is proprietary. Only content discussed in open meetings of the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems is cited here.)

This system was brought before the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems on November 6, 1997 by Global Election Systems of McKinney, Texas, which had renamed the system the AccuTouch EBS 100. At that examination, it quickly became apparent that the use of DES in this system was quite naive. The question that exposed this was simple: Given that DES is a symmetric key cipher, the security of the system depends crucially on how the key management and distribution problems are solved. So, how are they solved?

The answer from Global was disappointing but difficult to draw out: There was no key management or key distribution problem because there was only one key and it was hard coded into every copy of the system. In a prototype system, as a place-holder for future development, such a scheme might be appropriate, but such a primitive scheme should never have come to market. Unfortunately, this primitive security system remained in use until 2003, by which time Diebold had purchased Global Election Systems. (See T. Kohno, A. Stubblefield, A. Rubin and D. Wallach, Analysis of an Electronic Voting System, IEEE Symposium on Security and Privacy, Oakland CA, May 2004.)

Here, it is clear that cryptography was used as fairy dust. It was sufficient to fool the examiner for Wyle Labs into believing that the system was secure, whereas a more able examiner would have admitted an inability to evaluate the system's security instead of being impressed by a thin veneer of cryptography.

Incorrect use of Cryptography

What did the I-Mark system encrypt? As it turns out, encryption was used to guard the contents of the electronic ballot box during transfer from the electronic ballot station to the centralized election management system. This raises a second issue: This information is not secret. The best practice when closing the polls at a polling place is to print and post in public a copy of the election totals for that polling place before transfer of the electronic record to the election management system. This allows observers to verify that the data eventually published for that polling place matches the data disclosed before transmission.

If the data is already public, encryption must serve some other purpose. In this case, the intent is clearly to offer some degree of authentication. Unfortunately, simple encryption offers no authentication at all unless there is some redundant structure to the encrypted data. A compactly encoded binary file of election results would offer little assurance in this regard.
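The distinction between secrecy and authentication is easily made concrete. The sketch below, in Python with a hypothetical key-handling arrangement, sends the totals in the clear, since they are public anyway, and attaches a keyed message authentication code; an edited report fails verification even though nothing was encrypted.

```python
import hmac, hashlib

# Hypothetical symmetric key, loaded into the machine at the warehouse.
key = b"per-election authentication key"

def authenticate(totals: bytes) -> bytes:
    # The totals are public; what matters is proving who produced them.
    return hmac.new(key, totals, hashlib.sha256).digest()

def verify(totals: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(authenticate(totals), tag)

totals = b"precinct 12, contest 1: A=314, B=271"
tag = authenticate(totals)

assert verify(totals, tag)                                       # authentic report
assert not verify(b"precinct 12, contest 1: A=271, B=314", tag)  # edited report
```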

I-Mark, Global and Diebold were not alone in making this error. In the fall of 2003, the state of Ohio contracted with Compuware Corporation to evaluate four of the direct-recording electronic voting systems then on the market. The Compuware report noted that the Election Systems and Software iVotronic system made no use of cryptography in data transfers from the voting machine to the election management system; it recommended that strong encryption be used, but it did not mention the need for authentication. (See Direct Recording Electronic (DRE) Technical Security Assessment Report, Compuware Corporation, Columbus OH, Nov 2003.)

In fact, there is authentication in several of these voting systems, but it is accidental and weak. In the case of the Optech Eagle, sold by both Sequoia and Election Systems and Software, the data returned to the election management system includes the time at which the system was prepped for the election, and this is checked on receipt. While the time at which a system was prepped for the election is no secret, obtaining this information to the full precision of the hardware clock is difficult, so it represents a useful if weak authentication token, defending against forgery but not man-in-the-middle attacks. (See D. W. Jones, Problems with Voting Systems and the Applicable Standards, Improving Voting Technologies -- The Role of Standards, Serial No. 107-20, pages 154-191, U. S. House of Representatives Committee on Science, Washington DC, May 2001; available on the web at

How strong is strong enough?

When the I-Mark/Global/Diebold AccuTouch system came under widespread public criticism in 2003, many considered the use of DES to be a significant weakness. This encryption standard, with only a 56-bit key, was never seen as very secure; designs for a brute-force DES cracker had been published in 1998, and successful attacks were demonstrated shortly after that. (See Electronic Frontier Foundation, Cracking DES, O'Reilly, 1998.)

The use of cryptographically secure authentication to protect transmission of election data from precincts to election management systems is a specialized context, in which the basic assumptions under which DES was cracked may not apply. There are two ways in which an adversary may attack this transmission path in a voting system:

First, the adversary may attempt a man-in-the-middle attack, trying to crack the authentication, edit the vote totals and forge new authentication data for the edited totals. In jurisdictions where polling places transmit totals by public networks, for example, by telephone, there is usually a fairly short window during which the data must be transmitted, on the order of an hour. If data is hand-delivered, for example, in an electronic cartridge, the delivery window will be longer to allow for physical travel, but this does not give the adversary much more time for computation. Attacks that take many hours would be of no use here.

Second, the adversary may forgo cracking the authentication keys and attempt a trial-and-error attack, hoping to deliver an acceptable forgery before the authentic data is transmitted. Alternatively, the trial-and-error strategy could be forced on a man-in-the-middle attacker when a complete crack of the authentication keys is impossible. In either case, if even one bit of authentication information is wrong, the attack can be detected. All modern voting systems offer alternative channels that can be used when an attack is discovered, so trial-and-error is unlikely to pay off. In short, very weak authentication is sufficient if the attacker gets only one shot at a trial-and-error attack.
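The arithmetic behind this conclusion is simple. Assuming the authentication tag is uniformly distributed (a modelling assumption), a blind forgery against an n-bit tag succeeds with probability one in 2 to the n, so even a short tag defeats a one-shot attack:

```python
def forgery_odds(n_bits: int) -> float:
    # Probability that a single blind forgery passes an n-bit
    # authentication check, assuming uniformly distributed tags.
    return 1.0 / (2 ** n_bits)

assert forgery_odds(1) == 0.5            # a single check bit: a coin flip
assert forgery_odds(20) == 1 / 1048576   # about one chance in a million
assert forgery_odds(56) < 1e-16          # a DES-sized tag: negligible
```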

Is pseudo-random random?

One critical requirement for any voting system used in the United States is that it protect the secrecy of the voter's ballot. The order in which voters enter a particular voting booth is no secret; any observer can record this. The ballots themselves are also only weakly guarded. In case of a recount, they may well become public record, as in Florida 2000. What must be broken is the link between voters and their ballots.

One way to break this link is to store the ballots in random order inside the voting machine. Unfortunately, what a naive programmer may believe to be random may be merely pseudorandom and quite predictable to a cryptanalyst. Worse, this fact is lost on many who advertise their services as security professionals.

For the I-Mark/Global/Diebold AccuTouch system, for example, a well-known and very weak linear congruential random number generator was used. Unfortunately, when Compuware Corporation evaluated this same system, they concluded that this generator posed no risks. Curiously, they did note that the pseudorandom number generators used for this purpose by ES&S and Sequoia were seeded from the real-time clock, showing some awareness of the limits of randomness.

Unfortunately, a brute-force exhaustive search through all possible 32-bit seeds is remarkably fast on a modern computer. Furthermore, the sample size, typically around 100 ballots per voting machine, is large enough that an exhaustive search may well be sufficient to reveal the seed that put the ballots into particular slots within the ballot box. As a result, simply seeding a weak pseudorandom number generator from the time of day clock may offer no real privacy.
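The attack is easy to sketch. The Python fragment below uses a textbook linear congruential generator (the constants come from Numerical Recipes; the generators actually fielded are not public, so this is illustrative only) and recovers the seed, and with it the complete ballot ordering, from just a few observed slot assignments. The demonstration searches a small range, but a scan of the full 32-bit seed space is a matter of minutes on commodity hardware.

```python
def lcg(seed):
    # A textbook 32-bit linear congruential generator.
    x = seed & 0xFFFFFFFF
    while True:
        x = (1664525 * x + 1013904223) & 0xFFFFFFFF
        yield x

def recover_seed(observed, search_space):
    # Brute force: find the seed that reproduces the observed outputs.
    for s in search_space:
        g = lcg(s)
        if all(next(g) == o for o in observed):
            return s
    return None

secret_seed = 123456                  # e.g. read from the time-of-day clock
g = lcg(secret_seed)
slots = [next(g) for _ in range(3)]   # slot assignments an observer noted

assert recover_seed(slots, range(200000)) == secret_seed
```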

Clearly, the strength and seeding of the pseudorandom number generators used for ballot storage should have been investigated by Compuware. It is not safe to rely on the random number package that comes with whatever system or language is being used, nor to rely on default seeding of these generators. The only acceptable alternative to a carefully seeded cryptographically secure pseudorandom number generator is the use of additional carefully selected sources of randomness to bolster a weak generator.
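For illustration, drawing on a stronger source of randomness is not difficult. The Python sketch below (the storage layout is hypothetical; the point is the source of randomness) assigns ballots to ballot-box slots using the operating system's entropy pool rather than any clock-seeded generator, so there is no seed to search for.

```python
import secrets

def store_ballots(ballots):
    # Assign ballots to ballot-box slots in an order that cannot be
    # reconstructed by searching seeds: SystemRandom draws from the
    # operating system's entropy pool on every call.
    rng = secrets.SystemRandom()
    slots = list(range(len(ballots)))
    rng.shuffle(slots)
    return dict(zip(slots, ballots))

box = store_ballots(["ballot-A", "ballot-B", "ballot-C", "ballot-D"])
assert sorted(box.values()) == ["ballot-A", "ballot-B", "ballot-C", "ballot-D"]
```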

Do we need public keys?

Is symmetric key cryptography safe for use with voting machines, or do we need to build a public key infrastructure for elections? The central issue here is one of secure key distribution, and the answer rests on an understanding of how voting systems are used.

A typical jurisdiction has an elections office that includes a secure warehouse where all of the voting machines are stored between elections and where the election management system runs. Prior to the election, the election management system is used to prepare ballot information for each precinct and load this into the voting machines.

Some voting systems are loaded by physically connecting them to the election management system, one at a time in the secure warehouse. Others are loaded using PCMCIA cards or compact flash cards that are sealed into the system in the warehouse. Yet others are loaded at the polling place immediately before the polls open, using portable media hand delivered to precinct election officials.

So long as the voting systems are prepped for the election in the secure premises of the election warehouse and then securely delivered to the polling place, or so long as portable media are held in trustworthy hands, cryptographic keys can be delivered to the voting system through this route and there should be no need for more complex cryptographic models.

There are two thorny issues that must be addressed before accepting this argument. First, the custody issue must be addressed seriously. If authentication or cryptographic keys are loaded in a voting machine and then it is left unattended in an insecure location, someone might open the machine up and extract this information. Clearly, physical security is not obsolete.

The second concern is rising pressure from county election managers for faster ways to prepare machines for election day. This leads, naturally, to proposals for remote-control initialization and testing of voting machines using wireless technology. The security problems this could create verge on nightmarish, yet some vendors are proposing that their next-generation voting systems will operate this way.

Anti-virus tools?

It is clear that voting systems must be protected from viruses, and this is required by Section 6.4.2 of the 2002 FEC/NASED voting system standards. What is not so obvious is that protection from viruses need not rest on the use of anti-virus software. Unfortunately, this has not been understood by many well-meaning security evaluators. For example, one assessor asked that Miami-Dade County install anti-virus software on the ES&S iVotronic voting machine. (See C. Jackson, Audit Report -- City of Opa-locka Special Election Held April 29, 2002, Memo to D. Leahy, Miami-Dade County Elections Department Public Records, August 7, 2002.)

The iVotronic does not run on a Microsoft Windows platform, nor does it incorporate any Windows utilities or Windows compatible data formats above the level of directory formats on removable media (compact flash cards). As such, no commercial anti-virus software is applicable and it is quite possible that the system is inherently virus-proof.

Certainly, assessment of the security of voting systems against viruses and similar attacks is appropriate, but simply checking that the latest anti-virus tools are installed is not enough. Instead, the relevant questions are these: Are the communication protocols used by this system inherently free of virus delivery mechanisms, and are they correctly implemented? Among other things, for example, are they free of buffer overflow vulnerabilities?

If it can be shown that a communications channel cannot deliver data that will serve as input to an interpreter or be read as machine code, then that channel does not threaten to inject viruses into the system. This is the question that must be assessed on most embedded systems, and in general, the security offered by systems that meet this standard is higher than can possibly be met by installing and regularly updating anti-virus software. In fact, the regular installation of such software opens a path for Trojan horse attacks and so, it poses security risks of its own.
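To make this concrete, a channel of this kind can be nothing more than a fixed-length, fixed-format record in which every field is range-checked and anything else is rejected outright. The Python sketch below uses a hypothetical record layout; the fielded formats are, of course, proprietary.

```python
import struct

# Precinct, contest, votes for A, votes for B: 12 bytes, nothing variable.
RECORD = struct.Struct("<HHII")

def parse_results(payload: bytes):
    # Reject anything that is not exactly one well-formed record; the
    # channel offers no place for code or interpreter input to hide.
    if len(payload) != RECORD.size:
        raise ValueError("malformed record")
    precinct, contest, a, b = RECORD.unpack(payload)
    if a > 100000 or b > 100000:
        raise ValueError("implausible totals")
    return {"precinct": precinct, "contest": contest, "A": a, "B": b}

good = RECORD.pack(12, 1, 314, 271)
assert parse_results(good) == {"precinct": 12, "contest": 1, "A": 314, "B": 271}
```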

Defense in depth

The classic security strategy is that, in securing any system, it is appropriate to assume that each of the individual defensive measures used in the system will fail. Therefore, for each defense, there should be backups. This is called a strategy of defense in depth. (See the National Security Agency Security Recommendation Guides, guide number 1, Defense in Depth, Available from

Unfortunately, there is little evidence of effective defense in depth in current voting systems. For example, consider the ES&S iVotronic and the problem of assuring that this machine is not turned on between the time it is prepared for an election and the time the polls are formally opened. This machine is operated by inserting a device called a PEB in a socket on the front of the machine. The PEB is itself an obscure, custom device, so we have an element of security through obscurity here.

Unfortunately, all an observer can rely on is the assertion by an expert that the PEB is indeed obscure and so, difficult to reverse-engineer and duplicate. In addition to this security, all of the dangerous operations on the machine require the entry of a password. Again, the observer must rely on the expert's assertion that the password system is strong and used properly. What is missing is a simple outer layer of defense, something trivial and physical.

What I suggested in my Observations and Recommendations on Pre-election testing in Miami-Dade County, prepared on Sept 9, 2004 (Available from, is that ES&S should add provisions to the iVotronic to allow a numbered tamper-evident seal to be applied across the PEB socket. This simple barrier would add a defensive layer that is entirely absent on the system as it now stands.


Unfortunately, these stories show that not only voting system vendors but also a significant number of voting system evaluators have seriously misunderstood the security requirements for voting systems, while we continue to allow elections and voting system evaluation to be carried out with minimal rights of public observation. The presence of inept security in voting systems reflects badly on the vendors and on the level of sophistication of their customers, but after the many publications of the past year, this is not news.

What is more distressing is the extent to which the security evaluations that have been done for voting systems expose flaws in the knowledge of security professionals. It may not be too much of an exaggeration to state that many of today's security professionals have focused so much on conventional data processing applications using Microsoft Windows in a corporate setting that they are very poorly adapted to examining the security of novel applications outside the Windows domain or outside the commercial data processing domain.

What I believe we need is a reference model, well above the level of specific details of file format and system function, documenting the data paths within the voting system, from election definition to post-election audit, and for each path, documenting the threat model and security mechanisms appropriate for addressing those threats. This model must also document the assumptions made about the public's right to observe, so that election laws and election conduct can be evaluated against it.

I do not see compelling reasons to believe that the data paths involved differ greatly over a wide range of voting systems. Direct-recording electronic voting systems and precinct-count mark-sense ballot scanners all require election-specific preparation, all must convey results back to the canvassing center, and all must record event logs. There may be multiple models that are judged sufficient, but evaluating a voting system to verify that it incorporates an approved security model is far simpler than evaluating it against vague security requirements such as those incorporated in the current FEC/NASED voting system standards.