
**Cryptographic Methods for Storing Ballots on a Voting Machine**
**Abstract**
*A direct recording electronic (DRE) voting machine must satisfy several requirements to ensure voter privacy and the integrity of the election. A recent proposal for a vote storage system due to Molnar et al. provides tamper-evidence properties while maintaining voter privacy by storing ballots on a programmable, read-only memory (PROM). We achieve the same properties and protect against additional threats of memory replacement through cryptographic techniques, without the use of special hardware. Our approach is based on a new cryptographic primitive called History-Hiding Append-Only Signatures (HHAOS).*

**Introduction**

The deployment of electronic voting terminals introduces the problem of adequately storing votes in digital form. A vote storage system must store votes on the voting terminal during the election and possibly in archive form long after the end of the election. While there is some debate as to the precise requirements of a vote storage system, some deployed systems [12] have been shown to be clearly inadequate.

Molnar et al. [15] recently identified seven requirements that a vote storage system must satisfy. The four primary requirements are: (1) **durable** — the storage system must not lose stored votes if the terminal crashes, (2) **tamper-evident** — an audit will detect if stored votes were modified or deleted, (3) **history-hiding** — the layout of ballots on the storage medium should reveal no information about the order in which ballots were cast, and (4) **subliminal-free** — a malicious implementation or user should not be able to embed covert information in the voting record.

The **history-hiding** property is necessary for voter privacy. The concern is that during the election a coercer might keep track of the order in which voters visited a particular voting terminal. If the history-hiding property does not hold, the coercer could inspect the voting record post-election and learn how each voter voted. Thus, history-hiding is necessary to prevent coercion and vote buying.

The **subliminal-free** property ensures that malicious software loaded on the terminal cannot leak covert information exposing how voters voted. Note that we must assume the user of the voting machine is actively trying to leak information about their vote, since they may be under coercion.

Molnar et al. [15] propose a vote storage system based on PROM storage, a form of write-once memory. The data structures used to store the ballots ensure that any illicit writes to the memory will be detected, thus providing tamper-evidence. Properties (3) and (4) are also maintained by the data structures used. However, the PROM approach does not address the additional threat of one PROM being replaced with another. Since the PROM's must be transported from the voting machines to the canvassing facility at the end of the polling period, this threat must not be underestimated.

**Our contributions.** We present a vote storage system which builds on the Molnar et al. proposal by providing additional tamper-evidence properties. Specifically, it protects against the threat of physical replacement of the memory storing the ballots while maintaining the tamper-evidence, voter privacy, and durability provided by the previous system. The new proposal incurs no additional cost in the hardware requirements of the voting machines and furthermore removes the need for disposable PROM memories (which must be purchased for each election) by employing cryptographic techniques implemented in software.

The proposed vote storage system is based on a new cryptographic primitive we call **History-Hiding Append-Only Signatures** (HHAOS), which is of independent interest. We provide two HHAOS schemes and prove them secure. The first is based on any existing digital signature system in a straightforward manner, but requires the voting terminal to store large state and does not satisfy the subliminal-free property. The second system is space efficient and can be made subliminal-free at election close time. This second system makes use of aggregate signatures in groups with a bilinear map.

The HHAOS primitive builds on an earlier primitive called append-only signatures, introduced by Kiltz et al. [11]. The basic idea is as follows (we give precise definitions later):

• An HHAOS is initialized by generating a public key PK and an initial signature Φ1. An election official performs this initialization prior to election start. PK is printed on paper and stored in a trusted facility. It can be replicated for higher assurance. Φ1 is stored in non-volatile memory in the voting machine.

• Let Φi be the value in non-volatile memory when voter number i walks up to the terminal. After the voter casts his vote vi, the terminal runs an append algorithm Append(Φi, vi). This algorithm outputs a new Φi+1 which contains the ballot vi. This new Φi+1 is written to non-volatile memory and replaces Φi.

• At election close the terminal runs algorithm Finalize to update the last Φ and prevent any further appends. This finalized Φ is published or stored in an archive. At any time post-election, anyone can validate authenticity of the set of ballots X = {v1, . . . , vn} using PK.

**Relation to append-only signatures (AOS).** Kiltz et al. [11] recently introduced the concept of an append-only signature (AOS) for the purpose of securing routing protocols. They give several elegant constructions and prove that AOS is equivalent to hierarchical identity-based signatures. An AOS is closely related to HHAOS — it satisfies the same append-only property, but need not be history-hiding or subliminal-free. Not surprisingly, the constructions in [11] are not history-hiding.

**Relation to forward secure signatures.** Forward secure signatures [1] enable one to periodically update the signing key so that a key compromise at day n does not invalidate signatures issued prior to day n. One might thus be tempted to use forward-secure signatures for vote storage: after signing a vote v the terminal discards the signing key and moves on to the signing key for the next time period. Unfortunately, most forward secure signatures are inherently history-preserving. One exception is a forward-secure system due to Itkis and Reyzin [9] which could be made history-independent. The resulting HHAOS, however, is less efficient than our second construction.

**History-Hiding, Append Only Signatures**

We start by precisely defining a history hiding append only signature system. We then explain how this definition implies the properties discussed in the introduction. Formally, an HHAOS scheme consists of three algorithms:

• KeyGen: Given a security parameter κ, produce a public key PK and an initial signature Φ, which corresponds to the empty set.

• Append: Given a signature Φ for some set X and a new string x ∈ {0, 1}∗, produce a new signature Φ′ for the set X′ = X ∪ {x}.

• Verify: Given the public key PK, a set X, and a signature Φ, determine whether Φ is a correct signature for X.

The system must satisfy the following correctness property:

**Definition 1 (Correctness).** Let X = {x1, . . . , xn} ⊆ {0, 1}∗. Compute (PK, Φ0) ← KeyGen(1κ) and Φi ← Append(Φi−1, xi) for i = 1, . . . , n. Then it must be that Verify(PK, X, Φn) = True. If this holds for all finite X ⊆ {0, 1}∗, then the scheme (KeyGen, Append, Verify) is correct.

Note that when a new Φ is computed and stored within the voting machine, the previous value is deleted. While securely deleting data on a commodity system takes some care, it is certainly possible [7, 5].

To satisfy the tamper-evidence and history-hiding requirements, an HHAOS must satisfy two security properties:

• Append-only: given a signature Φ on a set of messages X, it is difficult to remove messages from X. That is, it is difficult to create a valid Φ′ for a set X′ satisfying X ̸⊆ X′.

• History-hiding: given a signature Φ on a set of messages X, an adversary cannot determine the order in which the elements of X were appended.

We define security for an HHAOS system using two games. The first game captures the append-only property.

**Setup** The challenger computes (PK, Φ0) ← KeyGen(1κ) and gives PK to the adversary.

**Corrupt** The adversary sends a set X = {x1, . . . , xn} to the challenger. The challenger computes Φi ← Append(Φi−1, xi) for each i ∈ {1, . . . , n}, then returns Φn to the adversary.

**Forge** The adversary returns a set Y and a signature ΦY.

If Verify(PK, Y, ΦY) = True and X ̸⊆ Y, then the adversary wins the game.
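The lifecycle described above — KeyGen at initialization, Append per ballot with Φ overwritten each time, Verify against PK after the election — can be illustrated with a deliberately insecure stand-in for the primitive. This toy Python sketch (all names are ours) merely records the set and provides none of the append-only or history-hiding guarantees, which the constructions of the following sections supply:

```python
import secrets

class ToyHHAOS:
    """Interface-only stand-in: no security whatsoever, just the
    KeyGen / Append / Verify lifecycle. Real constructions follow."""

    def keygen(self):
        pk = secrets.token_hex(8)        # placeholder public key PK
        return pk, (pk, frozenset())     # Phi_1: "signature" on the empty set

    def append(self, phi, x):
        pk, X = phi
        return (pk, X | {x})             # Phi_{i+1}, which replaces Phi_i

    def verify(self, pk, X, phi):
        return phi == (pk, frozenset(X))

# Terminal lifecycle: KeyGen at open, Append per ballot (overwriting the
# previous Phi in non-volatile memory), Verify at canvassing time.
scheme = ToyHHAOS()
PK, phi = scheme.keygen()
for ballot in ["race1:alice", "race2:prop-7-yes", "race1:bob"]:
    phi = scheme.append(phi, ballot)
assert scheme.verify(PK, {"race1:alice", "race2:prop-7-yes", "race1:bob"}, phi)
```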

**Definition 2 (Append Only Unforgeability).** An HHAOS scheme (KeyGen, Append, Verify) is (t, ε)-append only unforgeable if every probabilistic algorithm with running time at most t wins Game 1 with probability at most ε. We say the scheme is append only unforgeable if the scheme is (t, ε)-append only unforgeable where t is a polynomial in the security parameter κ and ε is negligible in κ.

Note that in Game 1 we only give the adversary the power to issue a single query X. This captures the voting terminal setting where every append-only chain is used only once, namely for one machine in one election. One can extend Game 1 to allow the adversary to adaptively issue queries for multiple sets X, as in [11]. Since here we focus on the application to voting, the definition using Game 1 is sufficient. The second game captures the history-hiding property.

**Setup** The challenger computes (PK, Φ0) ← KeyGen(1κ) and gives PK to the adversary.

**Challenge** The adversary returns an ordered set X = {x1, x2, . . . , xn} and two permutations λ0 and λ1 on X. The challenger flips a coin b ∈ {0, 1} and then computes Φi ← Append(Φi−1, λb(xi)) for i ∈ {1, . . . , n}. The challenger returns Φn to the adversary.

**Guess** The adversary outputs a guess b′.

We define the advantage of the adversary in Game 2 to be |Pr[b = b′] − 1/2|.

**Definition 3 (History-Hiding).** An HHAOS scheme (KeyGen, Append, Verify) is (t, ε)-history-hiding if every probabilistic algorithm with running time at most t has advantage at most ε in Game 2. We say that the scheme is history-hiding if it is (t, ε)-history-hiding where t is a polynomial in the security parameter κ and ε is negligible in κ.

Much related work refers to data structures which are history-*independent*, i.e., independent of the sequence of operations used to construct them in an information theoretic sense [15, 14, 16, 4, 8]. Our definition of history-hiding is based on a weaker notion of computational indistinguishability. While this weaker notion is all that is needed in practice, both our constructions satisfy history-independence in the information theoretic sense.

**Extensions**

We describe two simple methods of adapting an HHAOS scheme. We first show how one can disallow further appends to a signature. Second, we describe a method for handling multisets, that is, producing signatures that verify that certain messages were added multiple times.

**Set finalization.** For the voting application we need to "finalize" the ballots and prevent any further appends. We implement this Finalize operation as follows. Let Φ be a signature on a set X. Define Finalize(Φ) = Append(Φ, ⟨|X|⟩), where |X| is the number of elements in the set X.¹ We modify the Verify(PK, X, Φ) algorithm to return False if a message ⟨ℓ⟩ is included in X and ℓ ≠ |X| − 1.

¹ ⟨n⟩ is a special message that cannot appear in X for any n ∈ Z. We also assume the number of elements in X is stored with the signature.

Note that without embedding the size of X in the finalize message there is nothing to prevent additional messages from being appended after finalization.

Now if Φ is a signature for some set X and Φ′ = Finalize(Φ), then Φ′ may be given to Verify but may not be used with Append to produce further signatures. This finalization operation may optionally be performed after every append, thus producing two signatures — one signature ΦA which may be used for further appends and one signature ΦV which may only be used to verify the current set.

**Multiset semantics.** Multisets may be supported by simply appending a nonce to each string added, thus maintaining the uniqueness of each element in the set. Alternatively, a serial number ℓ may be appended to each element, where ℓ is the number of instances of that element that are already present. Using such a serial number has the advantage of avoiding the additional subliminal channel that nonces would provide, but requires the append algorithm to be aware of which messages the signature validates.

**A Simple Construction**

We now turn to constructing history-hiding append-only signatures. We start with a simple construction that builds an HHAOS from any digital signature system. This construction stores large state on the voting terminal and also assumes that an upper bound on the total number of ballots is known in advance.

The system works as follows. Let L be an upper bound on the number of messages to be signed. At setup time we prepare L signature key pairs. Then to sign the ith message xi, we pick at random an available key pair, sign xi with it, and delete the private signing key. The signature contains the list of message-signature pairs plus all the unused signing keys.

More precisely, let (G, S, V) be a digital signature system. Here G generates signature key pairs, S signs messages, and V verifies signatures. We also need a collision resistant hash function H (such as SHA-256). The generic HHAOS, denoted HHAOSS, for signing up to L messages is as follows.
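The set-finalization rule can be sketched at the level of set semantics alone, independent of any particular signature scheme. In this hypothetical Python sketch (names are ours), finalize appends the special message ⟨|X|⟩ as a tagged tuple, and the modified verification check rejects any finalize message whose embedded count ℓ differs from |X| − 1:

```python
# Tag for the special message <n>; ordinary messages are plain strings,
# so <n> cannot collide with them.
FIN = "FIN"

def finalize(X):
    """Finalize(Phi) at the set level: append <|X|> to the set X."""
    return X + [(FIN, len(X))]

def check_finalized(X):
    """Modified Verify rule: a finalize message <l> is accepted only
    when l == |X| - 1, preventing post-finalization appends."""
    fins = [m for m in X if isinstance(m, tuple) and m[0] == FIN]
    if len(fins) > 1:
        return False
    return all(l == len(X) - 1 for _, l in fins)
```

Appending anything after finalization makes the embedded count stale, so the modified check fails — which is exactly why the size of X must be embedded in the finalize message.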

• KeyGen(1κ): Run G(κ) L times to generate L signature key pairs (PK1, SK1), . . . , (PKL, SKL). Output the public key PK = H(PK1, . . . , PKL) and the initial signature Φ = {(PK1, SK1, null), . . . , (PKL, SKL, null)}.

• Append(Φ, x): Let Φ = {(PK1, Y1, Z1), . . . , (PKL, YL, ZL)} and let x ∈ {0, 1}∗ be a string. To generate the new signature:

  – Pick a random r ∈ {1, . . . , L} for which Yr ≠ null. This Yr is a signing key for the public key PKr.

  – Generate a signature σ ← S(Yr, x) and output Φ′, which is Φ with the tuple (PKr, Yr, null) replaced by (PKr, null, (x, σ)).

The net effect on Φ is that the secret key SKr is removed from tuple r and replaced by a message-signature pair.

• Verify(PK, X, Φ): Let Φ = {(PK1, Y1, Z1), . . . , (PKL, YL, ZL)} and do the following:

  – If PK ≠ H(PK1, . . . , PKL), output False and stop.

  – (∗) If Yi ≠ null and Yi is not a valid signing key for PKi for some i, output False and stop.

  – If X = { x | ∃i, σ : Zi = (x, σ) } and V(PKi, x, σ) = True for each such pair, output True; otherwise output False.

The test on line (∗) is crucial for security — without it there is a trivial attack on the system. To test that Yi is a valid signing key for PKi one can test that V(PKi, m, S(Yi, m)) outputs True for some arbitrary message m.

The system clearly satisfies the correctness property for HHAOS as long as Append is activated no more than L times. Security follows from the security of the underlying signature system (G, S, V) and the collision resistance of H:

**Theorem 1.** *If (G, S, V) is existentially unforgeable under a chosen message attack and H is collision resistant, then HHAOSS is append only unforgeable and history-hiding.*

The proof is only outlined as the next construction is our focus. First, the history-hiding property is trivial since the final content of Φ is independent of the order in which messages were inserted. The append only property follows from the security of (G, S, V) by a straightforward argument. We note that during the append-only unforgeability game the simulator issues a single chosen-message query for each PKi. Hence, it suffices that (G, S, V) is existentially unforgeable against a single-query chosen message attack. In particular, (G, S, V) can be a one-time signature.

**Performance.** The size of Φ is always O(L). The time to verify a signature is O(L) no matter how many messages have been appended.

**Subliminal-freeness.** A malicious implementation of the Append algorithm could choose the random r pseudo-randomly so as to leak the order in which messages were added. For example, let k be a secret key embedded in the voting terminal. When appending the ith message, the voting terminal can choose the randomness r as r ← F(k, i), where F is a pseudo-random permutation such as AES. The final signature Φ will appear to be properly generated. However, anyone who knows k can recover the exact order in which messages were added.

**Bounding the number of messages.** This construction requires an a-priori upper bound on the number of messages to be signed. For voting machines this is easily provided; a generous estimate suggests that less than 3,000 votes across all individual races may be cast in one day at a particular voting machine, based on the time necessary to cast each [15]. Tripling this for safety, we may assume that well under 9,000 messages will need to be signed at each machine, a modest requirement.

**An Efficient Construction**

Our second construction maintains Φ so that its size at any moment depends only upon the number of messages signed so far. Also, the amount of data per message is far less than in the previous system. More importantly, a further benefit of this construction is that it can evade the subliminal attack on the first system.

Recall that the system of Section 3 stores in Φ a list of public-keys plus a list of signatures. At a high level, our second system improves upon the previous scheme using two ideas. First, we use an aggregate signature system to aggregate all the signatures in Φ into a single short signature. Recall that an aggregate signature system can compress a set of signatures from different public keys and on different messages into a single signature. We will use the BGLS aggregate signature system [2, 3] for this purpose.

Second, and more importantly, we use the fact that a BGLS aggregate signature cannot be de-aggregated. That is, given an aggregate signature it is not possible to remove any signature from the aggregate. This in turn means that we do not need to pre-generate all the public keys as we did in the previous section. Instead, the Append algorithm can generate public / private key pairs on the fly and simply append the resulting public-key to Φ. As a result, Φ now grows by one public-key per message signed.
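As a concrete illustration of the generic construction HHAOSS, here is a toy Python sketch that instantiates (G, S, V) with Lamport one-time signatures over SHA-256 — the proof outline above notes that a one-time signature suffices. All function names are ours, and the sketch is illustrative rather than production-hardened:

```python
import hashlib
import secrets

H = lambda data: hashlib.sha256(data).digest()

# --- (G, S, V): Lamport one-time signatures over SHA-256 digests ---
def ots_keygen():
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(sk[i][b]) for b in range(2)] for i in range(256)]
    return sk, pk

def _bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def ots_sign(sk, msg):
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def ots_verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(_bits(msg)))

def pk_digest(pks):
    # PK = H(PK_1, ..., PK_L): hash of all OTS public keys in order
    h = hashlib.sha256()
    for pk in pks:
        for pair in pk:
            h.update(pair[0]); h.update(pair[1])
    return h.digest()

# --- HHAOS_S ---
def keygen(L):
    pairs = [ots_keygen() for _ in range(L)]
    PK = pk_digest([pk for _, pk in pairs])
    Phi = [[pk, sk, None] for sk, pk in pairs]   # tuples (PK_i, Y_i, Z_i)
    return PK, Phi

def append(Phi, x):
    free = [i for i, (_, y, _) in enumerate(Phi) if y is not None]
    r = secrets.choice(free)                     # random unused key pair
    pk, sk, _ = Phi[r]
    Phi[r] = [pk, None, (x, ots_sign(sk, x))]    # signing key deleted
    return Phi

def verify(PK, X, Phi):
    if PK != pk_digest([t[0] for t in Phi]):
        return False
    signed = set()
    for pk, y, z in Phi:
        # test (*): every unused key must be a valid signing key
        if y is not None and not ots_verify(pk, b"m", ots_sign(y, b"m")):
            return False
        if z is not None:
            x, sig = z
            if not ots_verify(pk, x, sig):
                return False
            signed.add(x)
    return signed == set(X)
```

Note that the final content of Φ depends only on which slots hold which message, not on the order of the appends, which is the source of the (information theoretic) history-hiding property.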

**Background**

The second construction uses bilinear maps, which we now briefly review. For further background see [2]. Let G and GT be multiplicative groups of prime order p. Let g be a generator of G. Then a computable map e : G × G → GT is a bilinear map if ∀x, y ∈ G, ∀a, b ∈ Z, e(x^a, y^b) = e(x, y)^ab (bilinearity) and e(g, g) ≠ 1 (non-degeneracy). Several efficient implementations of bilinear maps (e.g., the Weil pairing) are currently available [13]. We also assume a hash function H : {0, 1}∗ → G.

**Algorithms**

• KeyGen(1κ): Fix groups G and GT of order p, where the size of p is determined by the security parameter κ. Pick a generator g of the group G and a random exponent α ∈ Zp. Output the public key PK = (g, u = e(g, g)^α) and the initial signature Φ = (g^α, ∅). Here Φ is a signature on the empty set; the exponent α is not needed again and may be discarded.

• Append(Φ, x): Given a signature Φ = (S1, S2) and a new string x ∈ {0, 1}∗, pick a random r ∈ Zp and compute

S′1 = S1 · H(x)^r ,  S′2 = S2 ∪ {(x, g^r)} .

Output Φ′ = (S′1, S′2). If the set S2 is represented as an ordered list then Append must randomly permute the ordering of the elements before outputting Φ′. This is crucial for the history-hiding property.

• Verify(PK, X, Φ): Let PK = (g, u = e(g, g)^α) be a public key and (S1, S2) a signature. Then given S1 ∈ G and S2 = {(x1, z1), (x2, z2), . . . , (xℓ, zℓ)}, compute

v = e(S1, g) · ( e(H(x1), z1) · · · e(H(xℓ), zℓ) )^−1 .

If u = v and X = {x1, . . . , xℓ}, output True; otherwise output False.

**Properties**

The following three theorems correspond to the correctness and security properties given in Section 2. Correctness is a matter of simple algebra, append only unforgeability follows from the computational Diffie-Hellman assumption, and history-hiding may be proven with no assumptions. The proofs are given in Appendix A. The history-hiding proof also demonstrates that HHAOSE (like HHAOSS) is actually fully history-independent in addition to being history-hiding.

**Theorem 2.** *HHAOSE is correct.*

**Theorem 3.** *If the computational Diffie-Hellman assumption holds in G, then HHAOSE is append only unforgeable.*

**Theorem 4.** *HHAOSE is history-hiding.*

The proof of Theorem 3 uses the fact that a BGLS aggregate signature cannot be de-aggregated. That is, given an aggregate signature on a set of messages X it is difficult to recover an aggregate for a subset of X. This property was already discussed in [2]. Coron and Naccache [6] later showed that de-aggregation is as hard as the Computational Diffie-Hellman problem.

The append-only requirement (Game 1), however, is more strict than de-aggregation — we require that the adversary not be able to produce an aggregate signature on any set Y where X ̸⊆ Y. Hence, append-only security is not directly implied by the difficulty of de-aggregation in BGLS. Our proof of Theorem 3 shows that the system has the append-only property. The proof is a little simpler than the proof in [6] since our settings are more flexible.

**Performance**

The algorithms KeyGen and Append have very modest computation requirements; Verify is somewhat more expensive. The KeyGen algorithm requires two modular exponentiations (the pairing can be precomputed). The Append algorithm requires two modular exponentiations, one modular multiplication, and one evaluation of the hash function H. The Verify algorithm requires |X| + 1 pairings, |X| modular multiplications, and |X| evaluations of H. The space (in bits) required to store a history-hiding append only signature Φ for a set X is ℓ1 + (|X| + 1) · ℓ2, where ℓ1 is the number of bits required to store the strings in X and ℓ2 is the length of a group element from G.
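The "simple algebra" behind Theorem 2 can be written out directly. Using the notation above, after n appends with randomness r1, . . . , rn the signature has the following form, and the value computed by Verify recovers u:

```latex
\begin{align*}
S_1 &= g^{\alpha}\prod_{i=1}^{n} H(x_i)^{r_i},
\qquad S_2 = \{(x_i,\; y_i = g^{r_i})\}_{i=1}^{n}.\\
\intertext{By bilinearity of $e$,}
e(S_1, g) &= e(g,g)^{\alpha}\prod_{i=1}^{n} e\bigl(H(x_i), g\bigr)^{r_i}
           = u\prod_{i=1}^{n} e\bigl(H(x_i), y_i\bigr),\\
\intertext{so the value computed by Verify is}
v &= e(S_1, g)\prod_{i=1}^{n} e\bigl(H(x_i), y_i\bigr)^{-1} = u .
\end{align*}
```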

**Subliminal Free Rerandomization**

As described, the construction of Section 4.2 contains subliminal channels that could be used by a malicious implementation of the Append algorithm to violate the history-hiding property. As in the previous section, the values ri can be used to leak the order in which votes were cast.

This situation can be remedied by adding the following Rerandomize algorithm. Given a signature Φ = (S1, S2) with

S2 = {(x1, y1), (x2, y2), . . . , (xn, yn)},

pick random s1, . . . , sn ∈ Zp and compute

y′i = yi · g^si for all i ∈ {1, . . . , n},
S′1 = S1 · H(x1)^s1 · H(x2)^s2 · · · H(xn)^sn,
S′2 = {(x1, y′1), (x2, y′2), . . . , (xn, y′n)}.

The signature Φ′ = (S′1, S′2) is then another correct signature for the same set, but with rerandomized values r1 + s1, r2 + s2, etc. As in the Append algorithm, if the set S′2 within Φ′ is produced as a list, the elements should first be randomly permuted.

If a signature Φ is produced by a potentially malicious server, its subliminal channels may be cleaned by having several parties run the Rerandomize algorithm on it. If any one of those parties is honest, then the subliminal channels will not contain any of the original information from the malicious server. This re-randomization can take place when the election is closed and before Φ is made public.

**Secure Vote Storage**

Now that we have introduced the HHAOS cryptographic primitive and given two constructions realizing it, we further consider its practical use in a Direct Recording Electronic (DRE) voting machine. We tailor our description to the use of the more efficient construction, HHAOSE. First we will lay out our general assumptions regarding the hardware architecture of an electronic voting machine. Having established a reference platform, we will then describe each of several isolated modules and their relationships. These may be software modules on the same hardware, or hardware isolation may be employed [17]. Finally we will consider the operational procedures that should be carried out by poll workers and election officials to initialize the voting machines, provide access to voters, and verify results.

**Hardware**

Although the HHAOS scheme may be used with a wide range of potential DRE equipment, we base our discussion on commodity PC machines such as those suggested by the Open Voting Consortium (OVC) as a part of their architecture for an open, verifiable electronic voting system [10]. Specifically, the OVC recommends the use of a commodity PC with a locked case. The machine would most likely not have a hard drive, but instead boot from a publicly reviewed CD distributed before the election which contains the operating system (e.g., a stripped down Linux distribution), the voting machine software, and lists of candidates. Each machine would include a printer and a removable flash memory (i.e., a USB drive or a Secure Digital memory card) on which to record the electronic ballots. Input may be obtained through a touch screen or key pad.

In addition, we require that each machine have a small amount of internal non-volatile memory (e.g., internal flash) on which to store the initial history-hiding append only signature when the machine is initialized. We also assume the availability of a reasonably secure random number generator, such as the /dev/urandom device provided by the Linux kernel. The hardware assumptions of this PC-based architecture are consistent with recent work on high-assurance voting machines [15, 18] in addition to the OVC proposals, although the previously proposed PROM-based vote storage method only requires a random number generator if the "random placement table" technique is used. The HHAOS scheme for vote storage could also be employed within a system far less capable than a PC, such as the gum stick sized motherboards produced by Gumstix, Inc. and used in the prototype system of Sastry, et al. [17].

**User interface module.** The functionality of the machine is divided between several isolated modules, the first of which is the user interface module. The user interface module is the component of the electronic voting machine that interacts with a voter to present them with the election choices and produce their completed ballot. Ideally, its source code should be small and easy to publicly verify [18]. After interacting with the voter, it invokes the InsertBallot procedure of the cryptographic vote storage module (CVSM).
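That Rerandomize outputs another valid signature on the same set follows from the same bilinearity argument used for correctness. Assuming Φ = (S1, S2) is valid, so that e(S1, g) = u · ∏ e(H(xi), yi):

```latex
\begin{align*}
S_1' &= S_1\prod_{i=1}^{n} H(x_i)^{s_i}, \qquad y_i' = y_i\, g^{s_i},\\
e(S_1', g) &= e(S_1, g)\prod_{i=1}^{n} e\bigl(H(x_i), g^{s_i}\bigr)
  = u\prod_{i=1}^{n} e\bigl(H(x_i), y_i\bigr)\, e\bigl(H(x_i), g^{s_i}\bigr)\\
 &= u\prod_{i=1}^{n} e\bigl(H(x_i), y_i'\bigr),
\end{align*}
```

so Φ′ = (S′1, S′2) passes Verify for the same set X, with each exponent ri replaced by ri + si.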

**Cryptographic vote storage module.** The CVSM employs the HHAOS scheme of Section 4 to store the multiset of ballots on the removable flash memory while providing tamper evidence and maintaining history-independence. In describing the CVSM, we consider the ballots received from the user interface module to be simple bitstrings which are accumulated in a multiset. Each string corresponds to a vote in a single electoral race.² Additionally, the user interface module provides poll workers with a means to set up the machine before polling begins and close the polls at the end of the polling period. These features invoke the Open and Close procedures of the CVSM. Here we give a high level description of the values stored in the CVSM and how they are updated; for concreteness, we give a more detailed description in Appendix B.

When the Open procedure is invoked by the user interface module, the CVSM uses the KeyGen algorithm of the HHAOS scheme to create a public key and initial signature. The public key is saved on the removable memory and a fingerprint (i.e., collision resistant hash) is printed using the printer attached to the machine. The handling of this sheet is described in Section 5.3. The initial signature is stored on non-volatile memory within the machine. When the InsertBallot procedure is invoked with a ballot b, the Append algorithm is used to update the internal signature with b (overwriting the previous value). The ballot is saved to the removable memory (taking care to ensure that every ordering of the ballots currently stored is equally likely). When the Close procedure is invoked, the CVSM uses Finalize to prevent any further additions to the signature and copies the finalized signature to the removable memory.

² Grouping all the choices made by a voter into a single ballot string would provide subliminal channels which could be used by a voter under coercion.

**Verification and aggregation module.** To verify the signature on a set of ballots stored on a removable flash memory, we simply check that the public key fingerprint provided matches the public key stored on the memory and use the Verify algorithm to check that the signature matches the set of ballots.

**Figure 1. Relationships between modules in a DRE voting machine architecture.**

**Operational Procedures**

**Initialization and polling.** We assume the electronic voting machines are stored at central municipal facilities before being taken to the individual polling places on election day. Immediately prior to transport, the election officials should invoke the Open procedure on each machine, thus storing the initial history-hiding append only signature on the internal flash and printing out a sheet with the public key fingerprint. These sheets are collected for later verification of the electronic ballots. Ideally, the fingerprints should be immediately sent to the county canvassing facilities where they will be needed; this can be accomplished by simply reading the hex fingerprint over the phone. To minimize the possibility of the persons at the canvassing facility being tricked into using the wrong key fingerprints, they may be transmitted in additional ways such as publicly posting them on the Internet and bringing the sheets to the canvassing facilities in hard copy. Note that from this point until the close of polling the machines should not be left unattended. If someone were to boot a machine with their own software and read the initial history-hiding append only signature stored internally, they may later be capable of replacing all the results reported by that machine. This and other threat scenarios are considered in detail in Section 6. Once the electronic voting machines are at the polling places and the polls have opened, voters may visit the machines one by one and have their votes recorded. After the polling period has ended, poll workers activate the Close procedure on each electronic voting machine and collect the removable flash memories containing the ballots.

**Canvassing.** The removable memories are transported to canvassing facilities where the contents are read. Using the public key fingerprints received from the staging facility, the contents of each memory are checked. The ballots may then be totaled and the results publicly announced. Note that if we assume the public key fingerprints reach the canvassing facility securely, the integrity of the election does not depend on the integrity of the contents of the flash memories. It is therefore reasonable to transmit the signed electronic ballots over the Internet from the polling places to the canvassing facility rather than physically transporting the memories. This may somewhat reduce expenses. The history-hiding append only signatures should be rerandomized as described in Section 4.5; this may be performed once at the polling place before sending the electronic ballots to the canvassing facility and again at the canvassing facility before making the signed ballots publicly available.
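The check performed by the verification and aggregation module can be sketched in a few lines of Python. Names and the fingerprint format are ours (the grouped-hex style echoes the example shown in Figure 1), and the HHAOS Verify algorithm is passed in as a callable:

```python
import hashlib

def fingerprint(pk_bytes, groups=3, group_len=4):
    """Short human-readable fingerprint of a public key,
    e.g. 'CF20 6A5C D8E6' (format is our illustration)."""
    hx = hashlib.sha256(pk_bytes).hexdigest().upper()
    return " ".join(hx[i * group_len:(i + 1) * group_len]
                    for i in range(groups))

def check_memory(pk_bytes, expected_fp, ballots, phi, verify):
    """Canvassing check: the fingerprint received from the staging
    facility must match the key stored on the removable memory,
    and the HHAOS signature must verify for the stored ballots."""
    if fingerprint(pk_bytes) != expected_fp:
        return False
    return verify(pk_bytes, ballots, phi)
```

Because the fingerprint is compared first, a swapped memory carrying a different public key is rejected even if its signature is internally consistent.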

**Comparisons**

The use of history-hiding append only signatures for secure vote storage in a DRE voting machine serves primarily as an alternative to the PROM system. While the PROM system ensures any illicit writes will be detected, it does not address the threat of one PROM being replaced with another. Ensuring the integrity of the election requires physical tracking and monitored transport of the PROM memories. The same considerations apply to the use of other write-once media such as recordable CD's in storing electronic ballots.

Essentially, the use of an HHAOS scheme replaces the physical tracking requirement by requiring secure communication of a public key fingerprint. A more simplistic approach to gain this effect would be to use a normal digital signature scheme to sign ballots stored by the vote storage module. However, it is likely necessary to save the signing key on non-volatile memory within the machine in order to transport it to the polling place and for fault tolerance, leaving it vulnerable to compromise. The append only property of an HHAOS scheme limits this threat by ensuring at least the integrity of ballots cast before the point of compromise.

**Threat Evaluation**

We now detail a threat model in which to evaluate the cryptographic vote storage module of Section 5 and the PROM-based vote storage module. After explaining the model, we will highlight the improvements offered by the new techniques. Finally, we will compare the efficiency of the two systems.

**Threat Model**

DRE voting machines face a wide variety of threats; however, we will restrict our attention to the types of attacks relevant to the new and previously proposed systems for vote storage. We focus on illicit read and write compromises to the memories involved in vote storage along with key management issues. In particular, we do not consider the issue of software verification. The components of interest are the removable flash memory (or, in the previous system, a PROM), the internal flash memory on which the initial history-hiding append only signature is stored, and the public key fingerprint (these last two components only exist in the new system).

An adversary may gain read-only or read / write access to the removable or internal memory within a voting machine either between machine initialization and finalization or after finalization (a compromise prior to initialization will have no effect). Note that we may consider replacements of PROM's and writes to removable flash memories to be equivalent operations, since the contents of a PROM being replaced may first be read and partly copied over to the new PROM, gaining the effect of general purpose memory. Additionally, we consider the effect of the public key fingerprint printed during machine initialization being incorrectly communicated to the canvassing facility (e.g., as a result of social engineering attacks).

**Integrity.** Given this threat model, we now evaluate the integrity properties of the new cryptographic vote storage module and the previous PROM vote storage module. In Table 1 we list all combinations of the previously described compromises and the resulting effects on election integrity.³ The column for the PROM VSM depends only on the compromise to the removable memory, since that system does not include the internal memory or a public key fingerprint. Dashes in the table denote the collapse of several rows where the outcome is the same for all values of that field.

The key security improvements offered by the CVSM over the PROM VSM manifest in scenarios B and E. In these cases the removable memory is swapped or illicitly written either before or after finalization, and the internal memory of the CVSM and the public key fingerprint are secure.⁴ In both cases, any ballot tampering will be detected if the CVSM is used, but if the PROM VSM is used, the ballots currently stored at the point of compromise may be modified without detection.

A lesser improvement is obtained if the internal memory of the CVSM is also compromised. In scenario C, if the adversary is able to write the internal memory when they write the removable memory, they may insert ballots undetected. They may not, however, remove or modify ballots already present without detection. Similarly, in scenario F, if the adversary gains read-only or read / write access to the internal memory after the first k ballots have been cast, then they may tamper only with ballots appended after that point: the append only property still guarantees the integrity of the first k ballots.
proposed in Sections 4 and 5 are simple enough to be veri-
may alter the set of ballots when they compromise the re-
fied by hand, with some effort. Assuming correct software,
the three different components that will be considered in
Reads of the removable memory are not considered here since they
our threat model are the removable storage on which the
4A read of the internal memory at the time of compromise of the re-
electronic ballots are recorded (either a flash memory or
movable memory is also acceptable in scenario B.

No tampering possible without detection.

✔

**/**✘:

Possible to insert ballots undetected, but ballots already present at point of compro-mise may not be removed without detection.

Arbitrary, undetected tampering with ballots present at point of compromise possible.

**Table 1. Results of various threat scenarios on election integrity using the cryptographic and PROM**

vote storage modules.
movable memory after finalization. However, the resulting
guarantees. The data structures in both the internal mem-
set must include the first k ballots cast if it is to verify.

ory of the CVSM and its removable storage are history-
If the public key fingerprint does not correctly reach the
independent. In either system, an illicit read of the remov-
canvassing facility, then the new system offers no improve-
able storage during the polling process will reduce voter pri-
ments over the PROM-based system. It should be easier,
vacy by partitioning the ballots into those cast before the
however, to ensure the safe arrival of a public key finger-
compromise and those cast after (but no further privacy will
be lost). In the case of the CVSM, an illicit read of the value
An additional issue affecting election integrity is that of
S1 stored internally will reduce voter privacy in the same
“vote disqualification” attacks, in which the attacker does
way, assuming the final contents of the removable storage
not insert or delete votes, but instead attempts to prevent
votes from being counted (presumably in a region support-
However, in the case of a malicious random number gen-
ing their opponent). An attacker who is able to replace the
erator or a malicious implementation of the CVSM algo-
public key fingerprint or write the internal memory would
rithms, the new approach suffers from subliminal channels
be able cause the final signature check to fail, even if they
that may reveal a great deal of information about the or-
do not have write access to the removable memory. This
dering of ballots. The PROM VSM suffers the same prob-
suggests the following policy. If the signature check fails, a
lem when the random placement table technique for insert-
recount should be performed based on a set of paper receipts
ing ballots into the PROM is used with a malicious random
or some other redundant source of information (if possible),
number generator. This threat is mitigated when using the
but in no case should the votes be outright discounted.
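The append-only behavior behind these integrity claims can be made concrete with a toy simulation. The sketch below is hypothetical illustration code, not the paper's implementation: it represents each group element of the Section 4 scheme directly by its discrete log modulo a prime, so the pairing e(g^u, g^v) becomes the exponent product u·v. This discards all security (anyone who sees discrete logs can forge), but it makes the verification algebra executable: Append and Rerandomize preserve validity, while deleting a recorded ballot breaks verification.

```python
import hashlib
import random

p = (1 << 61) - 1  # toy group order (2^61 - 1 is a Mersenne prime)

def H(x: str) -> int:
    # Toy hash-to-group: returns the exponent s with H(x) = g^s.
    return int.from_bytes(hashlib.sha256(x.encode()).digest(), "big") % p

def keygen():
    alpha = random.randrange(1, p)
    return alpha, (alpha, [])      # PK ~ e(g,g)^alpha, Phi_0 = (g^alpha, {})

def append(phi, x):
    s1, s2 = phi
    r = random.randrange(1, p)
    return ((s1 + H(x) * r) % p, s2 + [(x, r)])   # S1 <- S1 * H(x)^r

def rerandomize(phi):
    s1, s2 = phi
    out = []
    for x, r in s2:
        t = random.randrange(1, p)
        s1 = (s1 + H(x) * t) % p                  # S1 <- S1 * H(x)^t
        out.append((x, (r + t) % p))              # g^r <- g^(r+t)
    random.shuffle(out)
    return (s1, out)

def verify(pk, xs, phi):
    s1, s2 = phi
    if set(xs) != {x for x, _ in s2}:
        return False
    # e(S1, g) * (e(H(x1), g^r1) ... e(H(xn), g^rn))^-1 == e(g, g)^alpha,
    # which in exponents reads: s1 - sum(H(x) * r) == alpha (mod p).
    return (s1 - sum(H(x) * r for x, r in s2)) % p == pk

pk, phi = keygen()
ballots = ["ballot-001: prop 7 yes", "ballot-002: prop 7 no"]
for b in ballots:
    phi = append(phi, b)

print(verify(pk, ballots, phi))                        # True
print(verify(pk, ballots[:1], (phi[0], phi[1][:1])))   # ballot deleted: False
print(verify(pk, ballots, rerandomize(phi)))           # True
```

The toy cannot, of course, demonstrate unforgeability; that requires the bilinear group realization and the reduction given in the appendix.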

**Privacy.** Having considered the improvements to election integrity offered by the use of the HHAOS scheme in the CVSM, we now compare the privacy properties of the CVSM and PROM VSM. Assuming a secure random number generator and a non-malicious implementation of the CVSM algorithms, the two systems offer the same privacy guarantees. The data structures in both the internal memory of the CVSM and its removable storage are history-independent. In either system, an illicit read of the removable storage during the polling process will reduce voter privacy by partitioning the ballots into those cast before the compromise and those cast after (but no further privacy will be lost). In the case of the CVSM, an illicit read of the value S1 stored internally will reduce voter privacy in the same way, assuming the final contents of the removable storage are also obtained.

However, in the case of a malicious random number generator or a malicious implementation of the CVSM algorithms, the new approach suffers from subliminal channels that may reveal a great deal of information about the ordering of ballots. The PROM VSM suffers the same problem when the random placement table technique for inserting ballots into the PROM is used with a malicious random number generator. This threat is mitigated when using the CVSM by employing the Rerandomize operation described in Section 4.5. If the contents of the removable memory are rerandomized once at the polling place after finalization and once at the canvassing facility before the contents are publicly posted, then the subliminal channels will be publicly visible only if both the machines performing rerandomization are malicious. One point to be made regarding the process of rerandomization when using the CVSM is that the rerandomization operation may be performed by an untrusted entity. In the worst case, the subliminal channels will remain, but the machine performing rerandomization may not change the ballots without invalidating their signature. This is not the case if one were to rerandomize the output of the PROM VSM when using random placement tables. The ballots would need to be copied to a new PROM (or empty space on the original), and the machine performing rerandomization would need to be trusted to protect election integrity. When using the PROM VSM, however, subliminal channels may be avoided entirely by using a different (and less efficient) storage method, such as copyover lists or lexicographic chain tables [15].

**Conclusions**

We presented a new cryptographic tool for storing cast ballots on a voting terminal. The tool, called history-hiding append-only signatures (HHAOS), preserves all the benefits of a hardware-based solution, while preventing hardware replacement attacks. We presented an efficient realization of HHAOS using groups with a bilinear map. We also discussed a less efficient system that uses any standard signature scheme.

**Robustness and Efficiency**

The cryptographic vote storage module described in Section 5 shares fault tolerance properties similar to those of the PROM-based vote storage module. All the information necessary for the CVSM to continue operation after a power failure or system crash is stored on non-volatile memory. When overwriting values on either the internal memory or the removable memory, simple two-phase commits may be used to allow recovery in the case of a crash in the midst of writing. In this case, a crash in the middle of an operation may reveal the last ballot stored, but there will be no further compromise of voter privacy. The unavailability of the public key fingerprint at verification time will prevent verifying the integrity of the electronic ballots, but will not prevent them from being counted.

The computational requirements placed on the voting machine by the CVSM algorithms are very modest. The voting machines need only compute modular exponentiations twice at initialization (the pairing may be precomputed) and twice for each ballot recorded (also evaluating a hash function for each ballot). This is well within the capabilities of low end commodity PC's or even much more limited embedded systems. If a commodity PC has already been chosen as the basic architecture for a DRE voting machine, the computational requirements of the CVSM will not affect hardware selection. The necessary storage is also minimal. If we assume at most 9,000 votes across all races as in Section 3, 50-byte identifiers for each vote, and 160-bit group elements in G (typical of an elliptic curve), then less than 650KB are necessary on the removable storage (and only a single group element on the internal storage). The PROM-based VSM requires the purchase of new PROM's for each use of the machines. In contrast, a USB flash drive may be purchased (at consumer rates) for less than $9.00, a one time cost. If no internal non-volatile storage is otherwise available on the machines, 1Mbit flash memory chips may be purchased for less than $1.00 each.
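The storage estimate above is simple to check; the following sketch uses only the figures stated in the text (9,000 votes, 50-byte identifiers, 160-bit group elements):

```python
# Back-of-the-envelope check of the removable-storage estimate.
MAX_VOTES = 9_000
ID_BYTES = 50                    # per-ballot identifier
GROUP_ELEM_BYTES = 160 // 8      # one 160-bit element of G = 20 bytes

# Each entry of S2 holds an identifier and one group element g^r.
entry_bytes = ID_BYTES + GROUP_ELEM_BYTES      # 70 bytes
removable_bytes = MAX_VOTES * entry_bytes      # 630,000 bytes

print(removable_bytes)                 # 630000
print(removable_bytes < 650 * 1024)    # under the 650KB figure: True
print(GROUP_ELEM_BYTES)                # internal storage: a single element, 20
```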

**References**

[1] M. Bellare and S. Miner. A forward-secure digital signature scheme. In *Proceedings of Crypto*, 1999.
[2] D. Boneh, C. Gentry, B. Lynn, and H. Shacham. Aggregate and verifiably encrypted signatures from bilinear maps. In *Proceedings of Eurocrypt*, 2003.
[3] D. Boneh, C. Gentry, B. Lynn, and H. Shacham. A survey of two signature aggregation techniques. *CryptoBytes*, 6(2), 2003.
[4] N. Buchbinder and E. Petrank. Lower and upper bounds on obtaining history independence. In *Proceedings of Crypto*, 2003.
[5] J. Chow, B. Pfaff, T. Garfinkel, K. Christopher, and M. Rosenblum. Understanding data lifetime via whole system simulation. In *Proceedings of the USENIX Security Symposium*, 2004.
[6] J.-S. Coron and D. Naccache. Boneh et al.'s k-element aggregate extraction assumption is equivalent to the Diffie-Hellman assumption. In *Proceedings of Asiacrypt*, 2003.
[7] P. Gutmann. Secure deletion of data from magnetic and solid-state memory. In *Proceedings of the USENIX Security Symposium*, 1996.
[8] J. D. Hartline, E. S. Hong, A. E. Mohr, W. R. Pentney, and E. Rocke. Characterizing history independent data structures. *Algorithmica*, 42(1):57–74, 2005.
[9] G. Itkis and L. Reyzin. Forward-secure signatures with optimal signing and verifying. In *Proceedings of Crypto*, 2001.
[10] A. M. Keller, D. Mertz, J. L. Hall, and A. Urken. Privacy issues in an electronic voting machine. In *Proceedings of the ACM Workshop on Privacy in the Electronic Society (WPES)*, 2004.
[11] E. Kiltz, A. Mityagin, S. Panjwani, and B. Raghavan. Append-only signatures. In *Proceedings of ICALP*, 2005.
[12] T. Kohno, A. Stubblefield, A. Rubin, and D. Wallach. Analysis of an electronic voting system. In *Proceedings of the IEEE Symposium on Security and Privacy*, pages 27–40, 2004.
[13] B. Lynn. The pairing-based cryptography (PBC) library.
[14] D. Micciancio. Oblivious data structures: Applications to cryptography. In *Proceedings of the ACM Symposium on Theory of Computing (STOC)*, 1997.
[15] D. Molnar, T. Kohno, N. Sastry, and D. Wagner. Tamper-evident, history-independent, subliminal-free data structures on PROM storage -or- how to store ballots on a voting machine. In *Proceedings of the IEEE Symposium on Security and Privacy*, 2006.
[16] M. Naor and V. Teague. Anti-persistence: history independent data structures. In *Proceedings of the ACM Symposium on Theory of Computing (STOC)*, 2001.
[17] N. Sastry, T. Kohno, and D. Wagner. Designing voting machines for verification. In *Proceedings of the USENIX Security Symposium*, 2006.
[18] K.-P. Yee, D. Wagner, M. Hearst, and S. M. Bellovin. Prerendered user interfaces for higher-assurance electronic voting. In *Proceedings of the USENIX/ACCURATE Electronic Voting Technology Workshop (EVT)*, 2006.

**Security and Correctness Proofs**

Here we provide security and correctness proofs for the HHAOS scheme presented in Section 4.2.

**Correctness**

With a little algebra it is easy to verify that this scheme is correct according to Definition 1.

**Theorem 2.** *HHAOSE is correct.*

*Proof.* Let X = {x1, . . . , xn} ⊆ {0, 1}* and assume PK = (g, e(g, g)^α) and Φn = (S1, S2) are generated as in Definition 1. Let r1, r2, . . . , rn ∈ Zp be the random values chosen in the successive invocations of Append. Let si denote the discrete log (base g) of H(xi) for each i ∈ {1, . . . , n}. Then

e(S1, g) · (e(H(x1), g^r1) · · · e(H(xn), g^rn))^−1
  = e(g, g)^(α + r1s1 + r2s2 + · · · + rnsn) · e(g, g)^(−r1s1 − r2s2 − · · · − rnsn)
  = e(g, g)^α.

Since we also have that X = { x | ∃y (x, y) ∈ S2 }, Verify will return True and the scheme is correct.

**Append Only Unforgeability**

We now show that HHAOSE is append only unforgeable in the random oracle model based on the computational Diffie-Hellman assumption.

**Theorem 3.** *If the computational Diffie-Hellman assumption holds in G, then HHAOSE is append only unforgeable.*

*Proof.* Suppose the (t′, ǫ′)-CDH assumption holds in G; that is, any probabilistic algorithm running in time at most t′ solves CDH with probability at most ǫ′. Then we will show that HHAOSE is (t, ǫ)-append only unforgeable with t′ = O(t · poly(κ)) and ǫ′ ≥ ǫ/(e(q + 1)), where q ≤ t.

Assume a t time algorithm A wins Game 1 with probability at least ǫ while making at most q random oracle queries. We construct an algorithm B which solves CDH in time O(t · poly(κ)) with probability at least ǫ/(e(q + 1)).

**Definition of B.** Given a CDH instance g, A = g^a, B = g^b, we use it in Game 1 with A. In order to answer random oracle queries, we maintain sets S and Γ and a map f : {0, 1}* → Zp, all initially empty. The set S will contain all messages for which the random oracle has been called, and we will assign some of these to the set Γ. For convenience, we also define the function H : {0, 1}* → G as

H(s) = g^f(s) if s ∈ Γ, and H(s) = B · g^f(s) otherwise.

We carry out Game 1 with A as follows.

**Setup** Define α = ab and give PK = (g, e(A, B)) = (g, e(g, g)^α) to A. Whenever A makes a random oracle query for s ∈ {0, 1}* (in this phase or later), we answer as follows. First, check if f(s) is defined (that is, if s ∈ S). If so, return H(s). If f(s) is not defined, save a uniformly random value from Zp as f(s). Then we add s to S and add it to Γ with probability q/(q + 1). Then return H(s).

**Corrupt** We receive X = {x1, . . . , xn} from A. Without loss of generality we can assume that X ⊆ S, since if that is not the case we can just call the oracle for all x ∈ X. If X ⊆ Γ, we abort the simulation. Otherwise, we may successfully produce a signature Φn for X. Let xk be an element of X that is not in Γ. We compute a signature Φn to return to A as follows. Select ri uniformly at random from Zp for each i ≠ k, implicitly setting rk = −a, and compute

Φn = ({(x1, g^r1), . . . , (xk, A^−1), . . . , (xn, g^rn)},
      H(x1)^r1 · · · A^−f(xk) · · · H(xn)^rn)
   = ({(x1, g^r1), . . . , (xk, g^rk), . . . , (xn, g^rn)},
      g^α H(x1)^r1 · · · H(xk)^rk · · · H(xn)^rn)

and return Φn to A. By the definition of rk and H(xk), this is a well formed response. Note that all our responses to A are properly distributed. The only values which have not been selected as in the regular scheme are α = ab and rk = −a, which are independent and distributed identically to values selected as in the regular scheme. Also, the values given in response to random oracle queries are independent and distributed uniformly at random over G.

**Forge** We receive a set Y = {y1, . . . , ym} and a signature ΦY = (S1, S2) from A. If Verify(PK, Y, ΦY) = False or X ⊆ Y, A has failed at Game 1 and we abort. Otherwise, we may use the forgery produced by A to solve our CDH instance. Denote the contents of S2 in ΦY as S2 = {(y1, z1), (y2, z2), . . . , (ym, zm)}. Since Verify(PK, Y, ΦY) = True, A must have queried the random oracle for all y ∈ Y at some point,5 so f(y) is defined for all y ∈ Y. Additionally, we have that

e(S1, g) · (e(H(y1), z1) · · · e(H(ym), zm))^−1 = e(g, g)^α.

If Y ⊄ Γ, we abort. Otherwise H(yi) = g^f(yi) for each i, so

e(S1, g) · e(g^f(y1), z1)^−1 · · · e(g^f(ym), zm)^−1 = e(g, g)^ab.

We then compute C = S1 · z1^−f(y1) · · · zm^−f(ym) = g^ab and return C as the answer to the CDH instance.

**Analysis of B.** It remains to compute the probability that B aborts before it can successfully solve its CDH instance. Let E1 be the event of A succeeding at Game 1, and let E2 be the event of X ⊄ Γ and Y ⊆ Γ. The probability that B does not abort and solves its CDH instance is Pr [ E1 ∧ E2 ]. Since B produces well formed responses distributed identically to those of HHAOSE in its interactions with A, we have that Pr [ E1 ] ≥ ǫ. Now we compute Pr [ E2|E1 ]. Assume E1. Let θ = q/(q + 1). Since A succeeds, X ⊄ Y and therefore |X \ Y| ≥ 1. Also, each string queried to the random oracle is placed in Γ independently with probability θ, so

Pr [ E2|E1 ] ≥ θ^q · (1 − θ) ≥ 1/(e(q + 1)).

Thus, Pr [ E2|E1 ] ≥ 1/(e(q + 1)), Pr [ E1 ] ≥ ǫ, and Pr [ E1 ∧ E2 ] ≥ ǫ/(e(q + 1)). So B does not abort and successfully solves the CDH instance with probability at least ǫ/(e(q + 1)). Furthermore, B takes time O(t · poly(κ)).

So if the (t′, ǫ′)-CDH assumption holds in G, then HHAOSE is (t, ǫ)-append only unforgeable, where t′ = O(t · poly(κ)), ǫ′ ≥ ǫ/(e(q + 1)), and q ≤ t. In particular, if every PPT algorithm solves CDH in G with probability negligible in κ, then HHAOSE is append only unforgeable.

**History-Hiding**

It is straightforward to establish that the HHAOS scheme is history-hiding.

**Theorem 4.** *HHAOSE is history-hiding.*

*Proof.* Specifically, we show that any adversary has advantage exactly zero in Game 2. Run KeyGen(1^κ) to compute PK and Φ0 = (g^α, {}). Return PK to an adversary A. After receiving a set X = {x1, . . . , xn} and two orderings λ0 and λ1 from A, compute

Φn = (g^α H(λ0(x1))^r1 · · · H(λ0(xn))^rn, {(λ0(x1), g^r1), . . . , (λ0(xn), g^rn)}),
Φ′n = (g^α H(λ1(x1))^r′1 · · · H(λ1(xn))^r′n, {(λ1(x1), g^r′1), . . . , (λ1(xn), g^r′n)}),

where r1, r′1, . . . , rn, r′n are selected independently and uniformly at random from Zp. According to Game 2, if our coin b is 0 we must return Φn, otherwise we return Φ′n. However, since r1, r′1, r2, r′2, . . . are selected independently and multiplication in G is commutative, Φn and Φ′n are identically distributed random variables. So A's guess b′ is independent of which of the two we return and thus independent of our coin flip b. We then have that |Pr [ b′ = b ] − 1/2| = 0, as required.

Additionally, it is evident from the proof that HHAOSE (given a history independent representation of the set S2) is not only history-hiding, but history-independent in the information theoretic sense.

5We neglect the possibility of A guessing the output of the random oracle, which may be made arbitrarily unlikely by increasing the output length of the random oracle.
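The security loss factor e(q + 1) in the reduction comes from the inequality θ^q · (1 − θ) ≥ 1/(e(q + 1)) at θ = q/(q + 1), which is equivalent to (q/(q + 1))^q ≥ 1/e. A quick numeric check of this bound (illustrative only):

```python
import math

# Check theta^q * (1 - theta) >= 1/(e*(q+1)) for theta = q/(q+1),
# i.e. (q/(q+1))^q >= 1/e, over a range of query counts q.
for q in range(1, 2001):
    theta = q / (q + 1)
    lhs = theta**q * (1 - theta)
    rhs = 1 / (math.e * (q + 1))
    assert lhs >= rhs
print("bound holds for q = 1 .. 2000")
```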

**Implementation Details**

Here we provide concrete details on efficiently and securely implementing the cryptographic vote storage module (CVSM) described in Section 5.2. We first detail the values stored by the CVSM, then the procedures for updating them.

The CVSM achieves multiset semantics by appending to a string the number of copies already present before inserting it into the set of stored strings, as described in Section 2.1. To do so, it maintains a hash table C : {0, 1}* → N which keeps track of the number of copies of each string we have encountered. This may be stored in the main (volatile) memory of the CVSM process; its usage is further explained below. Referring to the HHAOS scheme described in Section 4, the history-hiding append only signature Φ = (S1, S2) is stored as follows. During the polling process, we store the value S1 on an internal flash memory within the machine. The contents of S2 are stored on the removable flash memory along with several other values. To refer to these locations on the removable memory, we denote the content of the removable memory with the structure given in C-like pseudocode in Figure 2. Here n is an upper bound on the number of ballots we will need to store and ℓ is the length of each ballot. These values on the removable storage along with the value S1 on the internal storage are manipulated by the following procedures.

**Figure 2. Values stored on the removable flash memory within the voting machine.**

**Open** Select α ←R Zp, compute the public key PK = (g, e(g, g)^α), and print a fingerprint of the public key PK. Save S1 ← g^α, M ← 0, and P ← PK.

**InsertBallot** Upon receiving a ballot string b ∈ {0, 1}^ℓ, look up b in the hash table C, incrementing the value C(b) if it is found. If b is not found, insert 1 at C(b). If b collides with another string b′ ≠ b in C, use chaining and sort the strings at that location. Sorting collisions is necessary to maintain history independence. Next, select r ←R Zp and i ←R {0, . . . , M}, and save S1 ← S1 · H(b||C(b))^r. Then copy S2[M] ← S2[i], store S2[i] ← ( b||C(b), g^r ), and save M ← M + 1. Note that this method of selecting a location for the new pair in S2 ensures that every ordering of the current values in S2 is equally likely.

**Close** Randomly select r ←R Zp, then save V1 ← S1 · H("finalize"||M)^r and V2 ← g^r on the removable storage. Save S1 ← 0 on the internal storage.

To verify the ballots stored on a removable memory using a public key fingerprint, carry out the following operations. First check that the fingerprint provided matches the public key P stored on the memory. Next, scan through the first M entries in the array S2 and check the verification equation of the HHAOS scheme against V1, V2, and the finalization string "finalize"||M. If the check succeeds, the append only signature on the recorded ballots has verified; proceed to total the ballots. Otherwise, report an error.
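The InsertBallot bookkeeping can be sketched in a few lines of Python. This is a hypothetical illustration, not the paper's implementation: real group elements are replaced by integer placeholders, and Python's built-in dict stands in for the collision-handled hash table C.

```python
import random

C = {}     # volatile copy counter: ballot string -> copies seen so far
S2 = []    # entries ( b||C(b), placeholder for g^r ) on the removable memory

def insert_ballot(b: str):
    # Multiset semantics: tag each copy of b with its running count.
    C[b] = C.get(b, 0) + 1
    entry = (f"{b}||{C[b]}", random.randrange(1 << 64))
    # Random placement: copy S2[M] <- S2[i], then overwrite S2[i].
    # Choosing i uniformly from 0..M keeps every ordering of the
    # stored entries equally likely.
    i = random.randrange(len(S2) + 1)
    if i == len(S2):
        S2.append(entry)
    else:
        S2.append(S2[i])
        S2[i] = entry

for ballot in ["yes", "no", "yes", "yes"]:
    insert_ballot(ballot)

print(sorted(tag for tag, _ in S2))
# ['no||1', 'yes||1', 'yes||2', 'yes||3']
```

Regardless of the random choices, the stored tags form the same multiset; only their ordering varies, which is exactly the history independence property the text requires.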

Source: http://www.isoc.org/isoc/conferences/ndss/07/papers/cryptographic_methods_for_storing_ballots.pdf
