Assignment 9, due Apr 19: Solutions
Part of
the homework for 22C:169, Spring 2010

The parts of this question have to do with the "in theory" statement above. One obvious threat to voter privacy occurs when the key custodians (each of whom provides the key to one stage of the mix net) collude to violate voter privacy. If even a single custodian remains honest, voter privacy is assured. This question asks what other threats to voter privacy exist in a mixnet-based system.
a) Each stage in the mix net requires a computer and software. As the exclusive provider of this equipment, what threat do you pose? (0.5 points)
My system could copy the decryption keys and send them to me covertly. Once I have all the keys, I can trace each ballot from the public encrypted copy to the final decrypted value, destroying the right to a secret ballot.
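This attack can be made concrete with a toy sketch. The XOR "cipher" below is an illustration only (real mixnets use public-key cryptography), and all names are hypothetical; the point is that whoever has copied every stage key can replay the full decryption chain and link a public ciphertext to its plaintext ballot:

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from repeated hashing -- illustration only, not a real cipher.
    out = b""
    block = key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Applying the same layer twice removes it (XOR is its own inverse).
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# One key per mix stage, each held by a separate custodian.
stage_keys = [secrets.token_bytes(16) for _ in range(3)]
ballot = b"candidate-42"

# The ballot is encrypted in layers; each honest stage strips only its own.
ciphertext = ballot
for k in reversed(stage_keys):
    ciphertext = xor_layer(ciphertext, k)

# An equipment provider who covertly copied all three keys can trace the
# ballot from its public encrypted form to the final decrypted value.
traced = ciphertext
for k in stage_keys:
    traced = xor_layer(traced, k)

assert traced == ballot  # voter privacy destroyed
```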
b) What attack could the custodian of the key to the final mix make if he knew that his stage of the mix net would not be subject to challenge? (0.5 points)
The final custodian could try random keys until one of them produced "the right" election result, then make that result public.
After the totals are computed, require randomly selected key custodians to reveal the keys to some stages of the mix net so that all computations performed by that stage can be independently checked.
a) How many key custodians should reveal their keys? How does this depend on our degree of confidence in the integrity of the key custodians? Suppose we suspect that k out of N custodians may be dishonest, but we don't know which ones. (0.5 points)
If the key to at least one stage of the mix net remains unrevealed, ballots remain secret. A key may be revealed by the audit, or it may be leaked by a dishonest custodian. Therefore, if all custodians are honest, we may reveal N−1 of the N keys for auditing purposes. If k of the N custodians are suspected of being dishonest (and may leak their own keys), we must reveal no more than N−(k+1) keys.
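The counting argument can be sketched in a few lines. This assumes, as above, that each of the k suspect custodians might leak their own key independently of the audit, so at least one key must stay outside both groups; the function name is mine, not from the course:

```python
def max_auditable_keys(N: int, k: int) -> int:
    """Most stage keys the audit may reveal while guaranteeing that at
    least one of the N keys stays secret, when up to k of the N
    custodians may leak their own keys on top of the audited ones."""
    # Worst case: the k leaked keys are disjoint from the audited ones,
    # so audited + leaked must still leave one key unrevealed.
    return max(N - k - 1, 0)

assert max_auditable_keys(10, 0) == 9  # all honest: N-1 keys may be opened
assert max_auditable_keys(10, 3) == 6  # 3 suspects: only N-(k+1) keys
```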
b) What are the constraints on the disposition of the intermediate versions of the ballots produced between consecutive mixnet stages? Should these be disclosed, and if so, when and to whom? (0.5 points)
All of the intermediate results must be disclosed prior to the random selection of the stage or stages that will be subject to audit. If disclosure came after the stages to audit were selected, a dishonest custodian could release an incorrect key value along with correspondingly incorrect inter-stage ballots.
a) Adding test ballots to the ballot box looks like ballot-box stuffing. How could you verify, or indeed, can you verify, that the ballot box had not been stuffed in a mixnet-based voting system where test ballots were used? (0.5 points)
Nothing in the mixnet idea can test for ballot box stuffing. Any defensive measures must be external, for example, in the procedure used to enter the test ballots into the system.
b) In Internet-based voting systems, voting the test ballots described here can be used to deal with several threats outside the realm of mixnets. Identify such a threat and describe how voting the test ballots can help detect or defend against it. (0.5 points)
Casting test ballots from randomly selected remote sites can be used to detect denial-of-service attacks. Furthermore, if the test ballots appear to have been successfully cast but do not appear in encrypted form in the public ballot box, this is evidence that something like a spoofing attack has been perpetrated.
a) The voter's ballot was correctly deposited in the ballot box. (0.5 points)
This is up to the voter, who may conduct this test in public or in private. To figure out the required sample size, we need to get into formal statistics. Note that our question is whether the vote was recorded correctly, and we expect 100% correctness. If N voters check and 1% of the ballots are in fact corrupt, we will detect the problem if just one of the N voters detects it. So, the probability that at least one of the N voters notices the error is:
1 − (the probability that every voter misses the error)
= 1 − (the probability that one voter misses the error)^N
= 1 − (1 − (the probability of a voter's ballot being corrupted))^N
= 1 − (1 − 0.01)^N
= 1 − (0.99)^N
So, if 100 voters check their ballots, we have a 0.63 chance of detecting a 1% fraud rate. If 200 voters check their ballots, we have a 0.87 chance of detecting the fraud. It takes only about 70 voters checking their ballots to have an even chance of detecting a fraud in which 1% of ballots are incorrectly recorded.

Of course, there are problems. If one voter reports that his ballot was not found in the ballot box, will you believe that report, or will you write it off as an incompetent voter unable to figure out how to check the ballot box? And how do you prevent a voter intent on destroying public confidence from making a false claim of fraud?
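The formula 1 − (0.99)^N is easy to check numerically. A minimal sketch (the function name is my own):

```python
def detection_probability(fraud_rate: float, checkers: int) -> float:
    # P(at least one of `checkers` independent voters finds a corrupted
    # ballot) = 1 - P(every one of them misses it).
    return 1.0 - (1.0 - fraud_rate) ** checkers

# The figures quoted above, for a 1% fraud rate:
assert abs(detection_probability(0.01, 100) - 0.634) < 0.01
assert abs(detection_probability(0.01, 200) - 0.866) < 0.01
assert detection_probability(0.01, 70) > 0.5   # roughly an even chance
```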
b) The mix net operated correctly. (0.5 points)
You require a randomly selected key custodian to reveal the decryption key to one stage of the mix net, after the contents of each intermediate stage of the mixnet decoding have been revealed. Anyone may then repeat the computations done in that mixnet stage to verify that the stage was honest. To achieve 99% confidence that the election is correct under this model, we would have to check 99 out of 100 mixnet stages, but that is an unreasonable number of mixnet stages.
Of course, this is the wrong statistical model for checking a mixnet. Here, the right question is: What chance of detection would be sufficient to deter a key custodian from dishonestly operating one stage of the mix net? If a 25% chance of detection is a sufficient deterrent, then we only need to open 1 out of 4 mixnet stages.
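Under this deterrence model, the chance that a particular custodian's stage is opened, when some number of stages are drawn uniformly at random for audit, can be sketched as follows (function name is mine; for a uniform draw this reduces to opened/total):

```python
from math import comb

def open_probability(total_stages: int, opened: int) -> float:
    # Chance that one particular stage is among the `opened` stages
    # drawn uniformly at random from `total_stages`.
    return 1.0 - comb(total_stages - 1, opened) / comb(total_stages, opened)

# Opening 1 of 4 stages gives each custodian a 25% chance of being audited,
# the deterrent threshold suggested above.
assert abs(open_probability(4, 1) - 0.25) < 1e-9
```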
a) Explain how they can demonstrate this by exchanging nonces. (0.5 points)
A sends B a nonce X (a random value); B encrypts X with the shared secret S and returns S(X) to A. A computes S(X) locally and concludes that all is well if the local and returned values match.
For A and B to both learn that all is well, each must send its own nonce to the other and then verify the result.
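A minimal sketch of this data flow, using a keyed MAC as a stand-in for "encrypting the nonce with S" (the names and the MAC stand-in are mine, not from the problem):

```python
import hashlib
import hmac
import secrets

def prove(secret: bytes, nonce: bytes) -> bytes:
    # Stand-in for "encrypt the nonce with S": a keyed MAC over the nonce.
    return hmac.new(secret, nonce, hashlib.sha256).digest()

S = secrets.token_bytes(32)     # shared secret held by both A and B

# A challenges B: A picks the nonce, B answers with S(X).
X = secrets.token_bytes(16)     # A's nonce
response = prove(S, X)          # computed by B and returned to A

# A computes S(X) locally and compares; a match means B holds S.
assert hmac.compare_digest(response, prove(S, X))

# For mutual assurance, B must issue its own challenge to A the same way.
Y = secrets.token_bytes(16)     # B's nonce
assert hmac.compare_digest(prove(S, Y), prove(S, Y))
```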
b) Explain how they can demonstrate this by exchanging encrypted messages. Make sure you clearly distinguish your answers to parts a and b. The data flow involved is distinctly different. (0.5 points)
A computes a random value R, computes S(R), and sends the pair <R, S(R)> to B. B can then compute S(R) locally to verify that A used the correct value of S.
Note here that A generates the random value that allows B to verify that A holds the correct value of S, while in the first solution, A both generates the random value and learns about the correctness of B's value of S.
As with the first solution, A and B must each play both roles before both processes can agree that the other holds the secret correctly.
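The contrasting data flow of part (b) can be sketched the same way, again with a keyed MAC standing in for "encrypting with S" (names are mine). Here the prover picks the random value itself and ships the pair; the verifier only checks:

```python
import hashlib
import hmac
import secrets

def seal(secret: bytes, value: bytes) -> bytes:
    # Stand-in for "encrypt with S" (a keyed MAC, for illustration only).
    return hmac.new(secret, value, hashlib.sha256).digest()

S = secrets.token_bytes(32)     # shared secret held by both A and B

# A proves knowledge of S: A picks R itself and sends the pair <R, S(R)>.
R = secrets.token_bytes(16)
message = (R, seal(S, R))

# B recomputes S(R) locally; a match shows A holds the correct S.
received_R, received_proof = message
assert hmac.compare_digest(received_proof, seal(S, received_R))
```

Unlike part (a), no challenge crosses the wire first: the random value and the proof travel together in one direction, and it is the receiver, not the sender of the random value, who learns that the other side has the secret.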