The European 2004 Draft E-Voting Standard:

Some critical comments

Part of the Voting and Elections web pages, http://homepage.cs.uiowa.edu/~dwjones/voting/
by Douglas W. Jones
THE UNIVERSITY OF IOWA Department of Computer Science

Posted to the web, Sept 29, 2004;
Revised, Oct 11, 2004.

On July 13, 2004, the Multidisciplinary Ad Hoc Group of Specialists on Legal, Operational and Technical standards for e-enabled voting (IP1-S-EE) released its Draft Recommendation of the Committee of Ministers to member states on legal, operational and technical standards for e-voting. The Committee of Ministers adopted a revised version of this document on September 30, 2004, releasing it as Recommendation Rec(2004)11 of the Committee of Ministers to member states on legal, operational and technical standards for e-voting. The Council of Europe can only make nonbinding recommendations. In this regard, these recommendations could follow a path similar to that of the "voluntary" federal voting systems standards in the United States, which is to say, nothing prevents states from writing them into law, and if a sufficient number do so, the nonbinding nature of these standards could change without the cooperation of their authors.

This draft and the revision accepted by the council contain many useful ideas, but they also suffer from some serious deficiencies, mostly in the appendices that contain the actual standards. These are discussed here in roughly the order of presentation in the draft recommendations.

In the following, quotations from the draft are set off in boxed paragraphs, while commentary follows outside the boxes. Changes made between the draft and the final document are shown within the quotations, with text dropped in the final version marked [deleted: ...] and text added in the final version marked [added: ...], following common practice for document revision.

Recommendations

(i.) E-voting shall respect all of the principles of democratic elections and referendums. E-voting shall be as reliable and secure as democratic elections and referendums which do not involve the use of electronic means.

This recommendation presupposes metrics for the comparison of the reliability and security of electronic and non-electronic systems. Unfortunately, there are, at this point, no widely accepted metrics for this. The problem is one of risk assessment. Quantitative comparison requires estimates of both the probabilities and costs of the various eventualities. The problem is, we have no reasonable way to estimate the probability of the rare events, nor is it straightforward to place values on such outcomes as the theft of an election.

Even if we assume that we have reasonable metrics, it is not clear that reliance on a-priori estimates of risk likelihood and cost is reasonable. The problem is, we are comparing two such different systems. With conventional election technology, the greatest risk is what has been described as "retail fraud", where each fraudulent vote must be created individually, by stuffing ballot boxes, by a voter successfully voting twice, or similar means. Retail fraud, if it is to have a significant effect, must have large numbers of participants and is therefore unlikely to be a closely held secret.

In contrast, with electronic voting, one of the biggest threats we face is the possibility of "wholesale fraud", where hackers or corrupt election officials manipulate the results. If this ever occurs, one person or a very small number of conspirators could potentially steal an election. We might never know that this had been done! Even if the likelihood of this event is very low, we might consider the potential loss so unacceptable that we refuse to accept this risk.

Appendix I, Legal Standards
A. Principles
III. Free Suffrage

10. The way in which voters are guided through the e-voting process shall be such as to prevent their voting precipitately or without reflection.

This rule is paternalistic, and in e-voting systems designed over the past two decades, this idea has led to some very frustrating user interfaces. The fundamental problem is that most voters enter the polling place with their minds made up, and all they want to do is to vote. User interfaces that attempt to elicit reflection, for example, by forcing the voter to read the entire ballot text, are simply annoying.

Appendix I, Legal Standards
A. Principles
III. Free Suffrage

15. The e-voting system shall prevent changing of a vote once that vote has been cast.

This rule may be mistaken. Particularly with remote electronic voting, it may be appropriate to allow voters to change their votes. One of the most frequently cited objections to remote voting (including both electronic and postal variants), is that a voter could vote in the presence of someone who is attempting to coerce a particular vote. The potential for voter coercion can be reduced by making all ballots provisional, with the rule that casting a replacement ballot voids all ballots previously cast by the same voter.
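
To make the idea concrete, here is a minimal sketch, in Python, of the "replacement voids prior ballots" rule. The record layout and field names are my own illustrative assumptions, not part of the recommendation. Note that the voter's identity must remain attached to each provisional ballot until the replacement step, which is why this scheme requires a conditional rather than an absolute model of ballot secrecy (see the discussion of paragraphs 17, 19 and 35 below).

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Dict, List

    @dataclass
    class ProvisionalBallot:
        voter_id: str            # retained only until replacement resolution
        cast_at: datetime        # time the ballot was cast
        choices: Dict[str, str]  # contest -> selection

    def resolve_replacements(ballots: List[ProvisionalBallot]) -> List[Dict[str, str]]:
        """Keep only the most recently cast ballot for each voter;
        earlier ballots from the same voter are voided."""
        latest: Dict[str, ProvisionalBallot] = {}
        for ballot in ballots:
            prior = latest.get(ballot.voter_id)
            if prior is None or ballot.cast_at > prior.cast_at:
                latest[ballot.voter_id] = ballot
        # Only after replacement resolution can the voter identities be
        # stripped and the surviving ballots passed on to the count.
        return [ballot.choices for ballot in latest.values()]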

Appendix I, Legal Standards
A. Principles
IV. Secret Suffrage

17. The e-voting system shall secure that votes in the electronic ballot box and votes being counted [deleted: have been made and remain] [added: are, and will remain,] anonymous, and that it is not possible to reconstruct a link between the vote and the voter.

19. Measures shall be taken to ensure that the information needed during electronic processing cannot be used to breach the secrecy of the vote.

These two paragraphs take an absolutist attitude toward ballot secrecy. This is quite different from the attitude taken by the British Secret Ballot Act of 1872, where all information needed to connect the ballot to the voter is retained but held as a secret by the election administration. This absolutist attitude is contradicted later:

Appendix I, Legal Standards
B. Procedural Safeguards
III. Reliability and security

35. Votes and voter information shall remain sealed as long as the data is held in a manner where they can be associated ...

This contradicts the absolute prohibitions given earlier and is far closer, in effect, to the British Secret Ballot Act. I believe that these two perspectives on ballot secrecy, one that is absolutist and one that is conditional on the correct functioning of the election apparatus (human and technical), should not be confused. Their consequences for system design are quite different, and they also have very different behavioral consequences. Provisional ballots, for example, require that the voter's identity remain attached until the end of the voting period, an option that is compatible only with a conditional and not an absolute interpretation of ballot secrecy.

The choice between these two models of voter privacy should be viewed as a matter of public policy, not of voting technology. Technological arguments that purport to show that a given technology for conditional privacy is strong enough to be trustworthy should be taken as arguments to change the law to permit a conditional model, not as arguments for the use of technology offering conditional privacy in jurisdictions where the law calls for absolute privacy. I have argued this previously in Auditing Elections in the Oct. 2004 issue of Communications of the ACM, and again in my Comments on the Sept. 20 hearing before the US Election Assistance Commission Technical Guidelines Development Committee.

Appendix I, Legal Standards
B. Procedural Safeguards
II. Verifiability and accountability

24. The components of the e-voting system shall be disclosed, at least to the competent election authorities, as required for verification and certification purposes.

Limiting disclosure to the competent election authorities is dangerous. It implies that we can safely put our trust in those authorities. While this might be true in stable democracies with long histories of honest elections, there are many young democracies with no such history, and there are old democracies with long histories of petty corruption in certain electoral jurisdictions. It seems reasonable, therefore, to require disclosure to the public in all cases where it cannot be shown that such disclosure would endanger the integrity of the election.

Appendix I, Legal Standards
B. Procedural Safeguards
II. Verifiability and accountability

26. There shall be the possibility for a recount. Other features of the e-voting system that may influence the correctness of the results shall be verifiable.

27. The e-voting system shall not prevent the partial or complete re-run of an election or a referendum.

These two paragraphs are confusing for several reasons. The first is the lack of a local definition of the word recount. Unfortunately, this word has been redefined, as needed, by election jurisdictions. For example, in jurisdictions that use mechanical-lever voting machinery in the United States, it has been common to define a recount as a repetition of the canvass of the election, since the machinery itself does not record any recount-able ballots. In other jurisdictions, a recount is defined in terms of the re-examination and re-tabulation of all of the ballots.

What does it mean for a feature of the e-voting system to be verifiable? The use of the term verification by computer scientists concerned about software verification is so restrictive that the only software that has ever been verified with any degree of confidence is trivial software. Taking this definition of verification would preclude the use of software in voting systems. In contrast, the use of the term verification by voting system vendors has been very permissive. If a voting system has been subject to any testing, vendors will claim that its accuracy and integrity have been verified.

The strongest possible interpretation of the requirements that recounts be possible and that voting systems be verifiable would be that there is some type of voter-verified physical record of the voter's intent, but as worded, these requirements can easily be watered down to permit almost anything by redefining the words recount and verifiability until they permit whatever system is currently under consideration.

And, how can any voting technology possibly prevent a partial or complete re-run of an election? However a recount is defined, it generally involves a partial re-run of the election, starting with the saved state of the first run as of the time the polls closed. A complete re-run can always be accomplished by defining the re-run as a complete new election, as was done in New York City, where terrorist attacks halted the election of September 11, 2001 and forced it to be re-run two weeks later.

Appendix I, Legal Standards
B. Procedural Safeguards
III. Reliability and security

31. Before any e-election or e-referendum takes place, the competent electoral authority shall satisfy itself that the e-voting system is genuine and operates correctly.

Again, as in paragraph 24, we are forced to trust the competent electoral authorities. Why is there no guarantee of the rights of the public to observe this pre-election testing? At the very least, we should ask that representatives of the public be admitted to observe the tests, either members of the press or representatives of the candidates (or issue advocacy groups, in the case of a referendum).

Appendix I, Legal Standards
B. Procedural Safeguards
III. Reliability and security

32. Only persons appointed by the electoral authority shall have access to the ... election data. ...

34. ... If stored or communicated outside controlled environments, the votes shall be encrypted.

Note that later, in paragraph 83, there is a discussion of "election observation data" maintained so that "election observation can be carried out." That paragraph could be construed as weakening the requirements of paragraphs 32 and 34, since it may be taken to imply the existence of independent election observers.

Some election data must be revealed to the public. The results of the election, for example. Other election data can easily be inferred by the public. For example, the sample ballots or equivalent information distributed to the electorate before an election are generally enough to determine the details of the election definition data files. Or, for example, when the polls are closed at a polling place, one common practice is to immediately disclose the total votes received at that polling place. Using these totals, from each polling place, members of the public may duplicate all of the activities in canvassing the election.

This argument leads me to a conclusion that is the opposite of that required by this sentence: All election data shall be immediately disclosed to the public as soon as it can be shown that such disclosure does not compromise the integrity or security of the election. Obviously, cryptographic keys used for authenticating election data must be closely held, but in most cases, the data that was authenticated by these keys can be revealed immediately. This subject is covered in my Testimony at the Sept. 20 hearing before the US Election Assistance Commission Technical Guidelines Development Committee.

I conclude that, quite generally, very little election data should be encrypted, but rather, all election data must be authenticated. Cryptographic mechanisms are indeed useful for data authentication, but authentication is not encryption!
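
As a concrete illustration of the distinction, here is a minimal sketch, in Python, of authenticating rather than encrypting published election data. The report format and key handling are illustrative assumptions of mine; a real system would more likely use public-key signatures, so that any member of the public could check the published results without holding any secret at all.

    import hashlib
    import hmac
    import json

    def publish_totals(totals: dict, authentication_key: bytes):
        """Serialize the totals in the clear and compute an authentication tag."""
        report = json.dumps(totals, sort_keys=True).encode("utf-8")
        tag = hmac.new(authentication_key, report, hashlib.sha256).hexdigest()
        return report, tag

    def verify_totals(report: bytes, tag: str, authentication_key: bytes) -> bool:
        """Check that a published report has not been altered since it was tagged."""
        expected = hmac.new(authentication_key, report, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    # Example: the precinct totals themselves are public; only the key is secret.
    key = b"closely-held authentication key"   # illustrative only
    report, tag = publish_totals({"Precinct 7": {"Adams": 312, "Baker": 290}}, key)
    assert verify_totals(report, tag, key)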

Appendix II, Operational Standards
II. Voters

39. There shall be a voters register ...

40. The possibility of creating ... a mechanism allowing online application for voter registration ... shall be considered.

The need for a voters register, distinct from the citizenship rolls of the country, is not obvious. We require this in the United States because we have not historically maintained national, state or local citizenship records, nor have we historically required citizens to carry official identity documents of any kind. In nations where national identity cards are mandated and where only citizens can vote, the citizenship database itself should suffice as a voter register.

The California Internet Voting Task Force recommended against allowing online voter registration. The model of voter registration used in the United States is, of course, distinctive, but their reasoning applies more broadly. Unless the on-line system can verify the potential voter's identity, we should insist on a human element.

Appendix II, Operational Standards
II. Voters

41. In cases where there is an overlap between the period of voter registration and the voting period, provisions for appropriate voter authentication shall be made.

This requirement is puzzling because of its opening qualification. It would seem that "provisions for appropriate voter authentication" should always be made, regardless of whether there is an overlap between the voter registration period and the voting period.

There is a special circumstance that must be considered. If a voter registers very shortly before the election, their name may not appear on the printed voter list provided at their polling place. Is this the special condition that paragraph 41 was designed to resolve?

Appendix II, Operational Standards
IV. Voting

44. It is particularly important, where remote e-voting takes place while polling stations are open, that the system shall be so designed that it prevents any voter from voting more than once.

Here again, we have a restatement of a very general rule: It is indeed particularly important, whatever voting system is used, to prevent any voter from voting more than once. Many jurisdictions in the United States permit early voting in the week or weeks before an election. During the early voting period, voters may go to an early voting site and cast a ballot. Generally, those who have cast early ballots are noted on the voter lists provided to the precinct polling places, so that they cannot vote again on the day of the election, but in some jurisdictions, those who vote during the last day of early voting cannot be noted on the printed voter lists because of the time required for data entry and printing.

This system failure illustrates two things: first, that the failure paragraph 44 addresses can arise not only with e-voting but with other technologies, and second, that the problem can occur even without an actual overlap, whenever the time between the end of early voting and the start of the regular election is short. An overlap is not required if there is latency in the information flow, caused, for example, by the time needed to print voter lists for use at conventional voting precincts, or by the time needed to enter the data identifying the voters who have voted early.

Appendix II, Operational Standards
IV. Voting

[deleted: 48.] [added: 47.] There shall be equality in the manner of presentation of all voting options on the device used for casting an electronic vote.

This requirement is far more easily stated than it is acted on. Many jurisdictions use some form of ballot rotation (varying the order of names from one ballot to the next) in order to equalize the presentation, but if the list of candidates is long, rotation makes it difficult to find any particular candidate. For an extreme example, consider the California gubernatorial recall election of October 7, 2003, where there were over 130 candidates.

At the opposite extreme, some jurisdictions list candidates in the order of popularity of their parties in the most recent jurisdiction-wide race, as is done in Florida. This makes it easy for voters to find their candidates, since the most popular candidates will be near the top of the list, but it also favors the party currently in power. Mixtures of these two models are fairly common, for example, rotation of candidate names within equivalence groups, so major party candidates are always listed ahead of minor party candidates -- the definition of major parties generally being that some candidate from that party earned a significant fraction of the vote, for example, 5%, in the most recent jurisdiction-wide election.
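
As an illustration of the mixed model, here is a minimal sketch, in Python, of rotation within equivalence groups. The grouping rule and the use of a precinct index to drive the rotation are illustrative assumptions; real rotation schedules are normally fixed by statute or administrative rule.

    def ballot_order(groups, precinct_index):
        """groups: list of lists of candidate names, in fixed group order.
        Groups keep their relative order; names rotate within each group."""
        order = []
        for group in groups:
            k = precinct_index % len(group) if group else 0
            order.extend(group[k:] + group[:k])  # rotate within the group only
        return order

    # Example: major-party candidates always precede minor-party candidates,
    # but each group is rotated independently from one precinct to the next.
    majors = ["Adams", "Baker"]
    minors = ["Carter", "Dawson", "Evans"]
    print(ballot_order([majors, minors], precinct_index=1))
    # -> ['Baker', 'Adams', 'Dawson', 'Evans', 'Carter']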

Appendix II, Operational Standards
IV. Voting

51. A remote e-voting system shall not enable the voter to be in possession of a proof of the content of the vote cast.

This rule prohibits cryptographic systems such as that being developed by VoteHere (Andrew Neff and Jim Adler) and SureVote (David Chaum). These systems prove to the voter, in the privacy of the voting booth, that the receipt contains their vote, but they do not provide, to the voter, sufficient information to prove to anyone else how they voted, using that receipt. A set of cryptographic keys held by the trustees of the election authority can, in theory, be used by the voter to disclose their vote, but these keys should be managed in such a way that no voter will ever obtain all of the necessary keys.

It must be recognized that the VoteHere and SureVote schemes offer only conditional ballot secrecy, as permitted by paragraph 35, and not the absolute secrecy required by paragraphs 17 and 19. If a jurisdiction is willing to accept conditional secret ballots, as has long been the case under the British Secret Ballot Act of 1872, then the receipt systems of these proposals might be worthy of serious consideration.

Appendix II, Operational Standards
IV. Results

56. When counting the votes, representatives of the competent electoral authority shall be able to participate in, and any observers to observe, the count.

The rights of observers at the tabulation center have been severely degraded by the use of computerized vote tabulation systems. Where votes are hand counted, observers can see what is being done. When votes are tabulated using computers, all the observer can see is a box with some attached fans and blinking lights, and perhaps the back of the technician or programmer sitting at the keyboard and typing unknown commands into the system.

After Geneva, Switzerland, moved to Internet voting, one of their biggest early mistakes, according to Michel Chevallier, Secrétaire adjoint, Chancellerie d'Etat de Genève, was a failure to explain to the observers what they were seeing. Unless the observers can actually understand the commands typed by the technician or programmer into the election management system, they cannot distinguish between criminal manipulation of electronic vote records and the normal conduct of the election.

Paul Craft of the Florida Division of Elections reported, at the U.S. Election Assistance Commission Technical Guidelines Development Committee Hearing on Transparency and Security held in Gaithersburg, Maryland on September 20, 2004, that he had attached video projectors to some of the computer systems used in state voting system acceptance tests, so that observers at the tests could see what was being done on the computers that were part of those voting systems.

Merely allowing an observer to see the computer screen is not sufficient, however, unless the observer can also understand the content of the screen or the commands being entered. Therefore, observers must be given access to the user manuals and related documentation for the software being used, or, as Michel Chevallier has suggested, the observers could be allowed to attend the same training classes that are attended by the election technicians.

Appendix II, Operational Standards
IV. Results

58. In the event of any irregularity affecting the integrity of votes, the affected votes shall be tabulated as such.


V. Audit

59. The e-voting system shall be auditable.

What does this mean? All current vendors of electronic voting systems in the United States claim that their systems are auditable, but proponents of voter-verified paper ballots generally hold that many of these systems only offer the possibility of a partial audit.

If auditing detects an irregularity -- and that is the point at which irregularities are generally discovered -- there needs to be some decision rule for resolving it. Generally, redundancy in the election records can be used, for example, comparing the number of signatures in the poll-book with the number of ballots counted in order to determine whether there is an irregularity. If there are two records of the votes from some jurisdiction or sub-jurisdiction, for example, the electronic records from a precinct polling place and the paper record of the vote totals printed at that polling place when the polls closed, then the number of signatures found in the poll book can be used to determine which of these two records is more likely to be authentic in the event that they disagree.
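
A minimal sketch of such a decision rule, with the record names chosen purely for illustration: the poll-book signature count serves as the tie-breaker when the electronic record and the printed polling-place totals disagree.

    def reconcile(signatures: int, electronic_total: int, paper_tape_total: int) -> str:
        """Use the poll-book signature count to arbitrate between two vote records."""
        if electronic_total == paper_tape_total:
            return "records agree"
        if electronic_total == signatures:
            return "prefer the electronic record"
        if paper_tape_total == signatures:
            return "prefer the printed polling-place totals"
        return "unresolved irregularity: refer to the canvassing board"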

It also seems reasonable to give primacy to certain records while considering other records to be secondary. A manuscript signature of a voter in a poll-book or a paper ballot filled out by a voter are primary records, direct physical records of the presence of that voter at the polling place and of some voter's intent in that election. In contrast, electronic records should probably be viewed as secondary. The fact that some smart-card was inserted in the smart-card reader at an unattended polling place demonstrates only that that card was present, not that the legal owner of that card was there. The fact that the software in an electronic voting machine recorded a voter for some candidate does not prove that the voter intended that vote to be recorded; we can only infer this intent if we are certain that the software was honest. In effect, these secondary records of the election should have legal weight comparable to copies of original documents -- they are given weight that is tempered by our understanding of the accuracy and reliability of copies.

In addition, it seems reasonable to give more weight to records when the chain of custody of those records is tightly documented, as opposed to records that have been carelessly transmitted. If a voting machine is left in the custody of one person for long enough that that person could, in theory, make modifications to that machine or to the firmware or software resident in that machine, then evidence from that machine should be viewed with a degree of distrust when compared to evidence from some system that was never outside the joint oversight of representatives of opposing parties except when it was secured away from access by any person.

Appendix III, Technical Requirements
C. Systems Operation

69. The competent electoral authorities shall publish an official list of the software used in an e-election or e-referendum. Member states may exclude from this list data protection software for security reasons. At the very least it shall indicate the software used, the versions, its date of installation and a brief description. A procedure shall be established for regularly installing updated versions and patches of the relevant protection software. It shall be possible to check the state of protection of the voting equipment at any time.

This rule contains a number of problems. First, it requires only the disclosure of the identity of the software being used, and offers no incentive to disclose more than this. There have been numerous demonstrations that open-source software products such as Linux are more resilient and more secure than proprietary products such as the many compatible versions of Unix that Linux has displaced over the past decade.

It seems fair to argue that, unless there is a compelling reason not to disclose the source code of software used in vote counting, preference should be given to systems where the code is disclosed. The only code where disclosure of the actual source code is likely to be dangerous is code that contains passwords or other secrets essential to the security of the voting system. As a general rule, such code is evidence of poorly designed security models, since in competently designed secure systems, passwords and other security keys are essentially always separated from the code.

The exclusion given above for the protection software appears to be based on the same argument, that disclosure of this software could endanger system security. As worded, however, this rule allows not only the source code for the protection software to be closely held, but even the identity of that software. Thus, if the voting system uses obsolete protection software or software with widely known flaws, the election authority has the right to hold this as a secret indefinitely. Perhaps the key phrase here is competent election authorities, since by using obsolete or flawed protection software, the election authorities would be demonstrating their incompetence, and this rule allows them to hide this fact from their constituents indefinitely.

This rule asks for regularly installing updated versions and patches; this, in itself, introduces new security vulnerabilities. As a worst-case scenario, consider the possibility that the vendor of a major voting system component, for example, the graphical user interface support library for an e-voting system, was in an adversarial relationship with the government. This is not an unlikely assumption. Microsoft, for example, has been involved in adversarial relationships with both the United States government and several EC governments. If an employee of that company had no scruples, that employee could wait until the names of the candidates in an election were announced, select those candidates who were unfavorable to the company, and then issue a software patch that included provisions to interfere with a small fraction of the votes for those candidates. I first described this attack in my lecture at the Paul D. Scholz Symposium on April 13, 2000.

An effective defense against such attacks requires that all software upgrades to voting systems be put through a rigorous inspection and testing process. The closer an upgrade is to the date of an election, the more we should view it with suspicion. In effect, in the voting system realm, we must never accept the explanation that some software change is merely a routine upgrade.

The final requirement, that it shall be possible to check the state of protection of the voting equipment at any time, asks for quite a bit, while offering very little advice on how this requirement can be satisfied. It is extremely difficult, for example, to ascertain what version of a program is actually running on a computer. No self-report of a version number or cryptographically computed software "fingerprint" can prove that the required version is running. The only acceptable proof comes from allowing an independently written program to inspect the running software and either compare it with the correct version or compute its fingerprint. Allowing independently written fingerprint computation routines to be run on the voting system introduces a new potential security loophole, so these routines and the protocols for injecting them into the voting system must be subject to extreme scrutiny.
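
As a sketch of what an independent fingerprint computation might look like, the following Python fragment hashes an installed program image and compares the result against a reference digest recorded when that version was certified. The file path and reference digest are illustrative assumptions, and, as noted above, even this only establishes what is stored on the machine, not what is actually executing, unless the inspection routine can examine the running software itself.

    import hashlib

    def fingerprint(path: str, chunk_size: int = 65536) -> str:
        """Hash the installed program image with an independently written routine."""
        digest = hashlib.sha256()
        with open(path, "rb") as image:
            for chunk in iter(lambda: image.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def matches_certified_version(path: str, reference_digest: str) -> bool:
        """Compare against the digest recorded when the version was certified."""
        return fingerprint(path) == reference_digest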

Appendix III, Technical Requirements
C. Systems Operation

73. Before each election or referendum, the equipment shall be checked and approved in accordance with a protocol drawn up by the competent electoral authorities. The equipment shall be checked to ensure that it complies with technical specifications. The findings shall be submitted to the competent electoral authorities.

It is important that the test plans be disclosed, that the public (or, at the very least, the press and representatives of the candidates and issue advocacy groups) be invited to observe the testing, and that the findings from this testing be disclosed. If this is not done, how can the public be assured that the competent electoral authorities are indeed competent?

Appendix III, Technical Requirements
C. Systems Operation

75. Key e-election or e-referendum equipment shall be located in a secure area and that area shall, throughout the election or referendum period, be guarded against interference of any sort and from [deleted: anyone] [added: any person]. ...

Without an accompanying requirement that the equipment be observable, this requirement could be used to hide the key equipment out of the view of observers so as to allow insiders within the election administration to interfere with the system. Only if observers can see the equipment can they know that it is indeed not being interfered with. Well-built election offices that I have visited include a secure tabulating center with a glass wall facing a public observing area. The tabulating center in Miami-Dade County has two observing rooms on the other side of the glass wall, one for the press and public, and one for candidates, party officials and issue advocacy group representatives who may want to observe without being pestered by the press. In small rural jurisdictions, I have seen simpler arrangements, where the glass wall separates a lounge area from an office or work area that is normally used for other purposes; a curtain normally closes the glass wall, but during elections, the curtain is drawn back and the lounge area is opened to the public.

Appendix III, Technical Requirements
D. Security
I. General Requirements

84. The e-voting system shall maintain reliable synchronised time sources. The accuracy of the time source shall be sufficient to maintain time marks for audit trails and observations data, as well as for maintaining the time limits for registration, nomination, voting, or counting.

This rule could be read as requiring that the e-voting system itself solve the problem of maintaining a time standard. In fact, it is perfectly reasonable to rely on time standards maintained by national standards bureaus, or easily accessible secondary time standards.

The term audit trail is sufficiently confusing that I recommend its abandonment except when it is used in its broadest sense to incorporate all information an auditor might wish to inspect in order to determine if an election was conducted correctly. Among the relevant information are event logs of various parts of the voting system. I believe that the term event logs should generally be preferred when referring to the files of time-stamped event records maintained by various parts of a voting system.

Access to time information is dangerous! If software has access to this information, the software can be programmed to behave one way during the election and another way during testing. In the United States, general elections have long been held on the first Tuesday after the first Monday of November of even numbered years. Voting software that has access to calendar information can easily be fixed to always behave honestly except on these dates. On election day, the polls are generally open for many hours, for example, 6 AM to 9 PM in the state of Iowa. Voting software that has access to the time of day could easily be fixed to always behave honestly unless the polls had been open for longer than a typical pre-election test.
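
A minimal sketch of the kind of check such software could hide, with the dates and thresholds chosen for illustration only: nothing in an ordinary pre-election test would ever satisfy it, so the dishonest behavior would never be observed during testing.

    from datetime import datetime, timedelta

    def looks_like_a_real_election(now: datetime, polls_opened: datetime) -> bool:
        # U.S. general elections fall on a Tuesday between 2 and 8 November of
        # an even-numbered year, and the polls stay open far longer than a
        # typical pre-election test.
        on_general_election_day = (now.year % 2 == 0 and now.month == 11
                                   and now.weekday() == 1 and 2 <= now.day <= 8)
        open_longer_than_a_test = now - polls_opened > timedelta(hours=8)
        return on_general_election_day and open_longer_than_a_test

    # Malicious code guarded by such a test behaves honestly in every
    # pre-election test and misbehaves only under election-day conditions.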

I therefore recommend that access to time-of-day information be limited to a need-to-know basis. Every access made to the time-of-day should be justified, and the software making these accesses should be clearly identified and subject to careful scrutiny. In addition, internal firewalls within the voting equipment that prevent undocumented access to the time of day should be maintained, if at all possible.

I also recommend that voting system components be examined to see if they have any need for time-of-day information. There may be significant pieces of the voting system that should not contain time-of-day clocks and should not, themselves, record event logs or retain any other information. Modems, printers and certain handicapped assistance devices are almost certainly in this category.

Appendix III, Technical Requirements
D. Security
III. Requirements for the Voting Stage

93. Residual information holding the voter's decision or the display of the voter's choice shall be destroyed after the vote has been cast. In the case of remote e-voting, the voter shall be provided with information on how to delete, where that is possible, traces of the vote from the device used to cast the vote.

Reliance on the voter to carry out an optional deletion step is a weak safeguard. A voter who opts not to follow this advice could then sell their vote, using this residual information as proof of how they voted.

Appendix III, Technical Requirements
D. Security
III. Requirements for the Voting Stage

96. After the end of the e-voting period, no voters shall be allowed to gain access to the e-voting system. However the acceptance of electronic votes into the electronic ballot box shall remain open for a sufficient period of time to allow for any delays in the passing of messages over the e-voting channel.

It is important to allow not only for message passing latency in the e-voting channel, but also for latencies introduced by human factors. When voting at polling places, those voters who are in line to vote at the end of the election period should be allowed to vote. They arrived at the polling place in time to vote, and are only waiting in line because of the conduct of the officials at that polling place and the insufficient number of voting stations provided.

Equivalent phenomena occur with internet access, requiring similar rules for remote e-voting. A voter who attempts to connect with the e-voting system during the e-voting period should be allowed to vote, based on the time of the attempted connection, so long as they continue waiting. Delays in this case are generally not just message passing delays, but delays in access to network servers because the number of clients attempting to access the server exceeds the number of clients the server can simultaneously support.
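
A minimal sketch of such a rule, with the session fields chosen purely for illustration: a ballot arriving after the deadline is accepted only if the voter's first connection attempt was made before the close of voting and the voter has been waiting in the queue ever since.

    from datetime import datetime

    def accept_ballot(first_attempt: datetime, ballot_received: datetime,
                      voting_closes: datetime, stayed_in_queue: bool) -> bool:
        """Decide whether a remotely cast ballot should be accepted."""
        # Ballots arriving before the deadline are accepted as usual.
        if ballot_received <= voting_closes:
            return True
        # Late arrivals are accepted only if the voter attempted to connect
        # before the deadline and has been waiting for the server ever since --
        # the electronic analogue of a voter standing in line when the polls close.
        return first_attempt <= voting_closes and stayed_in_queue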

Appendix III, Technical Requirements
D. Security
IV. Requirements in Post-Voting Stages

99. The e-voting system shall maintain the availability and integrity of the electronic ballot box and the output of the counting process as long as required.

In carrying out this rule, it may be appropriate to distinguish between original records and copies of those records. For example, the ES&S iVotronic touch-screen voting machine records a copy of its electronic ballot box to a compact flash card when the polls close, along with a copy of its event log. Compact flash media are relatively expensive, so the usual practice is for election authorities to reuse the compact flash cards in the next election; in the United States, most jurisdictions have around 4 elections per year, one general election and several minor elections, while federal law requires that election data for federal races be retained for 22 months. Therefore, the standard practice is to archive the data from the compact flash cards to an archival medium such as a recordable compact disk. One such disk can hold copies of the ballot data and event logs from hundreds of voting machines.

Clearly, the process of making archival copies of election data from the original media needs to be closely regulated. This copying operation is a link in the chain of custody of that election data that could be quite important in the event that this data becomes evidence in a court case arising from the conduct of an election. A defense in depth policy toward the making of such copies would require both that the copies be made in the presence of witnesses who can attest that the correct procedures were followed and that electronic or cryptographic measures be used to assure that the copies are indeed correct and authentic.
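
As one possible form of the electronic measure, here is a minimal sketch in Python: the files on the original media are hashed into a manifest before copying, and the archival copy is verified against that manifest in the presence of the witnesses. The directory layout and manifest format are illustrative assumptions of mine, not a description of any vendor's procedure.

    import hashlib
    import pathlib

    def make_manifest(source_dir: str) -> dict:
        """Hash every file on the original media before the archival copy is made."""
        manifest = {}
        source = pathlib.Path(source_dir)
        for path in sorted(source.rglob("*")):
            if path.is_file():
                manifest[str(path.relative_to(source))] = \
                    hashlib.sha256(path.read_bytes()).hexdigest()
        return manifest

    def verify_copy(archive_dir: str, manifest: dict) -> list:
        """Return the files whose archival copies fail to match the manifest."""
        archive = pathlib.Path(archive_dir)
        mismatches = []
        for name, digest in manifest.items():
            copy = archive / name
            if not copy.is_file() or hashlib.sha256(copy.read_bytes()).hexdigest() != digest:
                mismatches.append(name)
        return mismatches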

Appendix III, Technical Requirements
E. Audit
III. Monitoring

105. Disclosure of the audit information to unauthorised persons shall be prevented.

Such general nondisclosure rules are extremely dangerous. It would be preferable to make all audit information public with the exception of that information that can be shown to be a danger to the integrity of the election process. Security analysis of the voting system should clearly identify these specific pieces of information in advance of the use of that system!

Appendix III, Technical Requirements
E. Audit
IV. Verifiability

107. [deleted: An] [added: The] audit system shall provide the ability to cross check and verify the correct operation of the e-voting system and the accuracy of the result, [deleted: detecting voter fraud and proving] [added: to prove] that all counted votes are authentic and that all votes have been counted.

This is an extremely strong requirement, one that most current e-voting systems and e-voting proposals fail. The requirement of absolute ballot secrecy may make it impossible to meet this requirement with any purely electronic voting system, although hybrid systems that maintain voter-verified printed records of each vote cast may be able to achieve this.

Appendix III, Technical Requirements
E. Audit
V. Other

110. Member states shall take adequate steps to ensure that the confidentiality of any information obtained by any person while carrying out auditing functions is guaranteed.

Taken to an extreme, this could be used to forbid an auditor from disclosing that the voting system had completely failed to function. As stated in my comments on paragraph 105, I would prefer rules that required disclosure of all audit reports except where it could be shown that such disclosure would threaten the integrity of the election system. The burden of proof should be placed on the election authority to demonstrate that nondisclosure is required, and the amount of material withheld and the duration of withholding should both be minimized.