TRANSCRIPT
Division of Elections System Review Report
Prepared by
Bureau of Voting Systems Certification
Dominion Voting Systems, Inc.
Sequoia/WinEDS, Release 3.1.077, Version 1 (Revised 2)
Municipal Election
Village of Wellington
March 13, 2012
April 17, 2012
Florida Department of State Page 2 of 20
Executive Summary
Abstract: This report includes an evaluation of Dominion Voting Systems' (DVS) Sequoia-
WinEDS Election Management System (EMS), along with an inspection of the
Ballot Printing System (BPS) and the BPS/WinEDS Bridge tool application
(Bridge). No evidence of a software flaw was found in any of the elements
assessed; however, the system's task sequences and operator manuals lacked
clarity regarding both the steps required to create an election definition and
the general usability of the software.
Purpose: To evaluate the DVS-Sequoia Voting System, Release 3.1.077, Version 1
(Revised 2), which was certified in Florida under certification #070907-
SEQUOIA-01 (Revised 2) and approved on July 22, 2008, to assess whether the
system's software, when employed as expected, exhibits a flaw in contest order.
This assessment includes an examination of the WinEDS database, Ballot Printing
System (BPS), and Bridge tool application.
Background: On March 13, 2012, an issue was encountered with regard to the certified vote
results for a municipal election held in the Village of Wellington. After certification
of the official results, the county performed the post-election voting system audit
(on March 19, 2012), which includes a manual tally of the votes cast in a
randomly selected race. This audit revealed that the vote totals as stated in the
certified election results differed from those reached with the manual tally.
A review of the WinEDS election definition indicated that this election did not
have the correct contest order. The order should have been: Mayor, Council –
Seat 1, and Council – Seat 4. However, the contest order for this election was:
Council – Seat 1, Council – Seat 4, and Mayor. The vote totals within each given
contest were accurate, but those votes were not reported within the correct
contest.
On March 19, the same day that the anomaly was discovered, an employee of
DVS worked with a Palm Beach County election official to reorder the contests
and to produce an election definition for use with the Optech 400-C central scan
tabulator. With this updated definition, Palm Beach's election staff rescanned the
Wellington ballots on the central count scan tabulator and concurred that the
recount produced election results that differed from those originally certified on
March 13 and, further, that the central count tabulator recount represented the true
totals, as obtained in the manual audit.
Scope: This limited examination is for DVS-Sequoia's EMS that contains the WinEDS
Release 3.1.077 application; Ballot Printing System (BPS), Release 40L, Version
3; and the BPS/WinEDS Bridge application, Version 3.1.19D. It will not include
a review of the Optech Insight Plus or the Optech 400-C tabulators. BPS and the
Bridge are optional applications that provide the user with a less cumbersome way
to create an election, but the WinEDS software is capable of producing an
election without the use of either of these applications.
Note that in the state of Florida there are two counties using this voting system:
Palm Beach and Indian River. Currently, Palm Beach is the only county that uses
all three of the referenced Sequoia applications (BPS, Bridge tool, and WinEDS);
Indian River uses only WinEDS.
Examination: The Bureau of Voting Systems Certification (BVSC) examined the voting system
using two groups, working independently. The first group used a system with the
three applications (BPS, Bridge, and WinEDS) installed on only one laptop and
the second group worked on a system which included the same applications, but
with a client/server configuration.
Vendor-provided documentation was used to validate required procedures for
performing the test activities to follow. Specifically, the systems operations user
manuals were used as a basis for instruction and to perform process steps. Each
group restored an election profile and definition, which included data such as
precincts, city information, offices, etc. Then the BPS application was used to
create elements for a new election, including contest and candidate information,
after which the Bridge tool was employed to import the 'dummy' data into the
WinEDS database. The groups also used Palm Beach County's 2012 Municipal
election definition as a 'base election' for the WinEDS and BPS databases.
Various components in BPS were edited and the Bridge tool was again applied to
import into WinEDS. Each group verified the accuracy of the expected contest
order and also tested various permutations to uncover a contest order mismatch.
In addition, as indicated in Sequoia's WinEDS Election Data System Software
Release 3.1.074C Reference Guide and the online help option within the software,
the groups performed the tedious tasks of reordering contests as is sometimes
required when edits have occurred. The manual explains that list order affects the
ballot and report appearance. Instruction manual and reference guide clarity
issues will be reviewed and addressed during the certification process for
Dominion Voting Systems Inc./Sequoia WinEDS Release 4.0.175, Version 1.
Conclusion: After testing, no evidence of a software flaw was found in any of the elements
assessed and neither testing group was able to replicate the software contest issue
that occurred in the March 13, 2012 Village of Wellington Election. Since no
software flaw occurred, the contest order was changed in the WinEDS database to
verify that the user can reorder a mismatched contest order. Existing event logs
did not provide sufficient data to determine what occurred in this election.
The manuals and the software were extremely difficult to use and understand.
Moreover, steps required to complete task sequences were sometimes omitted,
which not only delayed the creation of an election but also hindered a user's
ability to verify that the election data is correct. In most instances, once the
correct documentation passages were found, staff were able to perform a given
activity. However, the manuals were cumbersome to interpret, time-consuming to
comprehend, and instructions were difficult to locate.
Recommendations: Election officials need to review the contest list order in WinEDS, for both paper
and the AVC Edge II ballots, to verify that contest data is in sync, as explained in
the vendor‟s Product Advisory Notice. Additionally, a more enhanced test deck
should be used during Logic and Accuracy testing as the enhanced patterning
would allow the tester to see contest order errors before the election is finalized.,
The user manual and reference guide clarity issues will be reviewed and
addressed during the current certification process for Dominion Voting Systems
Inc./Sequoia WinEDS Release 4.0.175, Version 1.
Dominion Voting Systems, Inc.
Sequoia/WinEDS, Release 3.1.077, Version 1 (Revised 2)
System Review
Purpose:
Evaluation of the Dominion Voting Systems (DVS) Sequoia-WinEDS, Release 3.1.077, Version
1 (Revised 2), which was certified in Florida under certification #070907-SEQUOIA-01
(Revised 2) and approved on July 22, 2008 [See Appendix A], to assess whether the system's
software, when employed as expected, exhibits a flaw in contest order.
Scope:
This assessment includes an examination of DVS-Sequoia's EMS that contains the WinEDS
Release 3.1.077 application; Ballot Printing System (BPS), Release 40L, Version 3;¹ and the
BPS/WinEDS Bridge application, Version 3.1.19D. It did not include a review of the Optech
Insight Plus or the Optech 400-C tabulators.
Certification Process:
A voting system configuration, certified for use in Florida, consists of a specific set of hardware
and software components that, as a whole, meets the requirements of Florida Statutes and the
Florida Voting System Standards. A voting system configuration has three main elements: the
election management system, the precinct count system, and the central count system.
Section 97.021(44), Florida Statutes, defines a voting system as
“…a method of casting and processing votes that functions wholly or partly by use of
electromechanical or electronic apparatus or by use of marksense ballots and includes,
but is not limited to, the procedures for casting and processing votes and the programs,
operating manuals, supplies, printouts, and other software necessary for the system's
operation.”
Sections 101.5601 through 101.5614, Florida Statutes, are known as the "Electronic Voting
Systems Act." As stated in s. 101.5602, the
“…purpose of this act is to authorize the use of electronic and electromechanical voting
systems in which votes are registered electronically or are tabulated on automatic
tabulating equipment or data processing equipment.”
The Department of State must conduct public examination of all electronic or electromechanical
voting systems submitted to it and must determine whether the systems comply with the
requirements of s. 101.5606. As per s. 101.5605,
¹ The BPS and Bridge tool applications are optional supplemental applications that provide the user with a less
cumbersome way to create an election; the WinEDS software is capable of producing an election without the use of
either of these applications. In the state of Florida, two counties use this voting system: Palm Beach and Indian
River. Currently, Palm Beach is the only county that uses all three of the referenced Sequoia applications (BPS,
Bridge tool, and WinEDS); Indian River uses only WinEDS.
“Testing for certification shall include, but is not limited to, testing of all software
required for the voting system's operation; the ballot reader; the vote processor, especially
in its logic and memory components; the digital printer; the fail-safe operations; the
counting center environmental requirements; and the equipment reliability estimate. For
the purpose of assisting in examining the system, the department shall employ or contract
for services of at least one individual who is expert in one or more fields of data
processing, mechanical engineering, and public administration and shall require from the
individual a written report of his or her examination.”
The first provisional certification of a voting system was issued January 28, 1994. In June 1998,
the Division of Elections, Voting System Section, published the “Florida Voting Systems
Standards” in an attempt to make the requirements for, and process of, certification and
provisional certification easier to understand. The Florida Voting Systems Standards were
republished in January 2005 and have continued to be revised to include updates in legislation.
The Department of State Division of Elections has adopted uniform rule 1S-2.004 regarding the
purchase, use, and sale of voting equipment in the state. Governing bodies (supervisors of
elections) in the State of Florida may only purchase equipment that has been certified for use in
this state by the Department of State as specified by s. 101.294, F.S., and in accordance with this
rule.
Certification Information for DVS Sequoia-WinEDS, Release 3.1.077:
The voting system under review was first certified as Sequoia-WinEDS, Release 3.1.077,
Version 1, on September 7, 2007. The version currently in use in Palm Beach County is
Sequoia-WinEDS, Release 3.1.077, Version 1 (Revised 2), certified on July 22, 2008. The
certification team for this effort included two staff members from the Division of Elections, a
Government Operations Consultant and a Government Analyst I, with assistance from an
employee from the Palm Beach County Supervisor of Elections' office.
Village of Wellington Election:
On March 13, 2012, the Village of Wellington held a Municipal Election, and the Palm Beach
County Supervisor of Elections published the following results:
Mayor
2,411 Darell Bowen
3,341 Bob Margolis
Council Seat 1
2,877 John Greene
2,946 Shauna Hostetler
Council Seat 4
2,956 Al Paglia
2,745 Matt Willhite
After publication of the election results, the county performed the post-election voting system
audit, which includes a publicly noticed manual tally of the votes cast in a randomly selected
race (per section 101.591(1), Florida Statutes). This audit revealed that the vote totals as stated
in the certified election results differed from those reached with the manual tally.
A review of the WinEDS election definition indicated that this election did not have the expected
contest order. It showed that the vote totals within each given contest were accurate, but that the
votes were not reported within the correct contest. On March 19, the same day that the anomaly
was discovered, an employee of DVS worked with a Palm Beach County election official to
reorder the contests and to produce an election definition for use with the Optech 400-C central
scan tabulator. With this updated definition, Palm Beach's election staff rescanned the
Wellington ballots on the central count scan tabulator and concurred that the recount produced
election results that differed from those originally certified on March 13 and, further, that the
central count tabulator recount represented the true totals, as obtained in the manual audit. Those
results are as follows:
Mayor
2,877 Darell Bowen
2,947 Bob Margolis
Council Seat 1
2,956 John Greene
2,745 Shauna Hostetler
Council Seat 4
2,412 Al Paglia
3,341 Matt Willhite
The voting system vendor, Dominion Voting Systems, was then provided with a 'backup' copy
of the election database. DVS Director of Product Development, Eric Coomer, performed an
analysis of the database to determine a root cause for the abnormality. On Tuesday, March 27,
2012, Dr. Coomer visited the Florida Department of State/Division of Elections, where he
discussed the findings, provided a scenario of the type of activity that could cause this variance,
suggested best practices for Logic & Accuracy testing, and illustrated software updates which
would lessen the probability of such an event occurring in the future.
Below is a more detailed discussion of Dr. Coomer‟s presentation.
Dominion Voting Systems Findings (Root Cause)
The analysis indicated that the anomaly was caused by dissimilar data that resided in two
WinEDS tables containing contest order data. For the purposes of this discussion, the
two tables will be called the 'base order' table and the 'ballot contest order' table. When
election data is imported into the WinEDS database, the base order and ballot contest
order tables are populated with the same data, in the same sort order.

When the database was received by DVS, it contained contest orders that differed from
the expected order. To guarantee that the Ballot Contest Order table data matches the
Base Order table data, the user is required to perform several time-consuming steps.
If a step is missed or done out of sequence, it could result in a problem
with the ballot data. The misstep could cause an anomaly if not addressed with proper
contest order sequencing. The reference guide and "online help" informed users that
contest sequencing should be addressed when making additions and/or deletions to the
election database. The issue had been seen before in some Dominion/Sequoia user
jurisdictions, but in those jurisdictions the issue was detected and corrected during L&A
testing or earlier, since this can be part of a normal election preparation process.
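The table interaction described above can be illustrated with a small sketch. This is not WinEDS code; the list-based tables and position-based attribution are simplifying assumptions drawn from the report's description of the 'base order' and 'ballot contest order' tables.

```python
# Toy model of the two contest-order tables discussed above.
# Names and structure are illustrative; the actual WinEDS schema is not shown here.
base_order = ["Mayor", "Council Seat 1", "Council Seat 4"]

# On import, both tables hold the same contests in the same sort order.
ballot_contest_order = list(base_order)
assert ballot_contest_order == base_order

# A later edit re-sequences the ballot contest order, and the manual
# reordering steps are never re-run, so the two tables drift apart.
ballot_contest_order = ["Council Seat 1", "Council Seat 4", "Mayor"]

def report(totals_by_ballot_position):
    # Totals are attributed by list position, so a drifted table silently
    # files accurate per-contest totals under the wrong contest heading.
    return dict(zip(ballot_contest_order, totals_by_ballot_position))

# Position 1 on the ballot is the Mayor's race, but its (accurate) totals
# are now reported under "Council Seat 1".
results = report([(100, 200), (300, 400), (500, 600)])
assert results["Council Seat 1"] == (100, 200)  # Mayor's totals, mis-filed
assert results["Mayor"] == (500, 600)           # actually Council Seat 4's totals
```

The point of the sketch is that every individual total remains correct, matching the finding that "the vote totals within each given contest were accurate, but those votes were not reported within the correct contest."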
Hypothesis of Election Database Activity
Three statements were released by Dominion pertaining to the Village of Wellington
election event (See Appendices B, C, and D). While the exact cause of the variance
between the tables could not be conclusively determined, since the complete chain of
events leading up to the anomaly is not known, Dr. Coomer theorized that the dissimilar
data appeared to have arisen from actions taken to create the AVC EDGE II contest
titles or from a physical change to the EDGE's ballot 'artwork.' He stated that
either of these scenarios could result in the contest order becoming 'out of sync' between
the two tables of data.
Procedures used to create the paper ballots, which are cast using the Optech-Insight
Precinct Scanner and which can later be processed with the Optech-400C Central Count
Scanner, differ from those that are used to create the ballot for the AVC EDGE II. These
differences may have caused the ballot contests to be presented in an order which varied
from the originally imported listing.
It is important to note that the apparent incongruity of the two tables is not an indication
of a system error. Given the state of the database at the time the election results were
uploaded, the behavioral processes functioned correctly and as programmed.
Best Practices – L&A Testing
Ordinarily, Logic and Accuracy (L&A) testing activities conducted in accordance with
section 101.5612, Florida Statutes, would have revealed possible anomalies. In this
instance the L&A test did not do so because the ballot had three sequential contests
with an identical structure and because the 1-2 test deck pattern was not sufficient to
uncover the contest order variance.
It is common to use a 1-2 or 1-2-3 pattern when preparing test decks. This type of
marking means that one test ballot is marked on the 1st oval ("vote target") in each
contest on the ballot, and two test ballots are marked on the 2nd oval in each contest
on the ballot. Using this scenario, the test deck for the example provided in this report
would have had a total of three test ballots. When the test ballots were cast and the
results tapes/reports were printed, the vote totals were displayed in the expected 1-2
pattern. Because of this, the pre-audited test deck count expectation was met:
Contests as they appeared on the printed ballot, with expected vote totals for the simple
test deck (candidates listed in ballot position order):

CONTEST 1: MAYOR
  Pos. 1  CANDIDATE 1: Darell Bowen ......... 1
  Pos. 2  CANDIDATE 2: Bob Margolis ......... 2
CONTEST 2: COUNCIL – SEAT 1
  Pos. 3  CANDIDATE 1: John Greene .......... 1
  Pos. 4  CANDIDATE 2: Shauna Hostetler ..... 2
CONTEST 3: COUNCIL – SEAT 4
  Pos. 5  CANDIDATE 1: Al Paglia ............ 1
  Pos. 6  CANDIDATE 2: Matt Willhite ........ 2
An enhanced test deck, with a pattern such as 1-2-3-4-3-2, would have demonstrated that
the database was not handling vote totals as expected:

Expected contest vote totals, using ballots marked with an enhanced test deck:

CONTEST 1: MAYOR
  CANDIDATE 1: Darell Bowen ......... 1
  CANDIDATE 2: Bob Margolis ......... 2
CONTEST 2: COUNCIL – SEAT 1
  CANDIDATE 1: John Greene .......... 3
  CANDIDATE 2: Shauna Hostetler ..... 4
CONTEST 3: COUNCIL – SEAT 4
  CANDIDATE 1: Al Paglia ............ 3
  CANDIDATE 2: Matt Willhite ........ 2
Actual versus expected contest vote totals after completion of L&A testing, using an
'enhanced' test deck (expected / actual):

CONTEST 1: MAYOR
  CANDIDATE 1: Darell Bowen ......... 1 / 3
  CANDIDATE 2: Bob Margolis ......... 2 / 2
CONTEST 2: COUNCIL – SEAT 1
  CANDIDATE 1: John Greene .......... 3 / 1
  CANDIDATE 2: Shauna Hostetler ..... 4 / 2
CONTEST 3: COUNCIL – SEAT 4
  CANDIDATE 1: Al Paglia ............ 3 / 3
  CANDIDATE 2: Matt Willhite ........ 2 / 4
The enhanced patterning allows the tester to see that something is incorrect in the way the
election data is being processed in the database.
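The effect of the two deck patterns can be sketched in a few lines of code. This is a hedged illustration, not the county's tabulation software; the particular mis-ordering used here is an assumption chosen to mirror the tables above.

```python
# Sketch: why a uniform 1-2 deck cannot reveal a contest-order mismatch
# that a 1-2-3-4-3-2 deck exposes. The mis-ordering below is assumed.
BALLOT_ORDER = ["Mayor", "Council Seat 1", "Council Seat 4"]  # printed ballot
DB_ORDER = ["Council Seat 1", "Council Seat 4", "Mayor"]      # drifted table

def tabulate(marks_by_position):
    # Totals land under the contests in the (wrong) database order.
    return dict(zip(DB_ORDER, marks_by_position))

def expected(marks_by_position):
    # The tester's expectation follows the printed ballot order.
    return dict(zip(BALLOT_ORDER, marks_by_position))

# Simple 1-2 deck: every contest receives the identical totals (1, 2),
# so swapped contests are indistinguishable and the L&A test passes.
simple = [(1, 2), (1, 2), (1, 2)]
assert tabulate(simple) == expected(simple)      # false pass

# Enhanced 1-2-3-4-3-2 deck: contests receive distinct totals,
# so the mismatch surfaces as an expected/actual difference.
enhanced = [(1, 2), (3, 4), (3, 2)]
assert tabulate(enhanced) != expected(enhanced)  # mismatch detected
```

The design point is simply that a test deck can only detect a contest-order error if adjacent contests are given distinguishable vote patterns.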
Software Updates
The Division of Elections is currently reviewing an upgraded version of the WinEDS
software, which includes several enhancements to the currently certified version. One of
the improvements to the system is the addition of a more 'user friendly' way to deal with
the large number of items in the task sequence required to ensure that the Base Order
table and the Ballot Contest Order table are in sync.
After the Bureau of Voting Systems Certification's review of the system, the team agrees with
Dominion's conclusions as presented by Dr. Coomer.
Bureau of Voting Systems Certification Review of Voting System
The review team for this effort included four staff members from the Division of Elections
Bureau of Voting Systems Certification (Bureau Chief, Senior Management Analyst II, Systems
Project Analyst, and Systems Programmer II) [See Appendix E].
The Bureau of Voting Systems Certification (BVSC) examined the voting system using two
groups, working independently. The first group used a system with the three applications (BPS,
Bridge, and WinEDS) installed on only one laptop and the second group worked on a system
which included the same applications, but with a client/server configuration.
Vendor-provided documentation was used to validate required procedures for performing the test
activities to follow. Specifically, the systems operations user manuals were used as a basis for
instruction and to perform process steps. Each group restored an election profile and definition,
which included data such as precincts, city information, offices, etc. Then the BPS application
was used to create elements for a new election, including contest and candidate information, after
which the Bridge tool was employed to import the 'dummy' data into the WinEDS database.
The groups also used Palm Beach County's 2012 Municipal election definition as a 'base
election' for the WinEDS and BPS databases. Various components in BPS were edited and the
Bridge tool was again applied to import into WinEDS. Each group verified the accuracy of the
expected contest order.
Below are specific details of the permutations tested to uncover a contest order variance.
1. Set up the WinEDS 3.1.077, Version 1 system according to the certified voting system
specifications, on both a laptop and a client/server configuration. Activities included:
   • "Scrubbing" and imaging servers, workstations, and laptops
   • Loading commercial-off-the-shelf (COTS) software
   • Loading voting system specific software, namely WinEDS, BPS, the BPS/WinEDS
     Bridge Tool application, and Election Reporting software modules
   • Networking the client/server system
No problems were encountered with the required installations.
2. Inspection of the Ballot Printing System (BPS) application:
   • Using an existing Palm Beach County election database, activities were carried out to
     create ballots and to add and delete contests and candidates.
   • The 2007 certification database, originally used for a 2006 Palm Beach County
     election, was set up as a Municipal Election. Modifications were made to the base
     election cycle year; candidates' names were edited with no change to the ballot artwork.
   • Created two new elections:
     Election one was created using the client/server system and included a fictitious
     jurisdiction, with three contests, having two candidates each.
     Election two was created using the laptop and included creation of the following
     fictitious data:
     o Jurisdiction
     o District type and district
     o Office types, masters, and offices
     o Polling places
     o Precincts
     o Municipality
     o Contests
NOTE: Prerequisite activities were needed before and during this process. Many
of these tasks were not found in the documentation or, in some cases, additional
steps were required to complete the activity. These tasks included items such as
assignment of precincts and district types to an election cycle, assignment of
components (forms, jurisdictions, regions, users) to groups and of groups to
components, partisan classifications, etc. All election settings were set up to mirror
Palm Beach, and when applicable, the Wellington, Municipal Election settings.
All of the activities above were satisfactorily completed and election data, including contest
order, was presented in the expected condition.
3. Inspection of the WinEDS database:
Using the Palm Beach County 2012 Municipal Election:
• Performed tasks in the following areas to 'force' (change) a contest mismatch:
  o Edited election data contest order
  o Edited profile contest order
  o Edited Edge II (ballot management) contest order
  o Edited profile and election data contest order
  o Edited profile, election data, and Edge II (ballot management) contest order
• For each of the edited contest order items, performed the tasks necessary to 're-order'
  contests as is sometimes required when edits have occurred, as indicated in Sequoia's
  WinEDS Election Data System Software Release 3.1.074C Reference Guide and the
  online help option within the software. The steps to re-order contests are not complex;
  however, the process is entirely manual and, therefore, labor-intensive and tedious.
Review of all permutations yielded the projected outcome and also allowed a user to
make contest order corrections/edits as needed.
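The reorder-and-verify task the groups performed can be sketched as follows. This is an analogy under the same illustrative list-based table model used earlier in this report, not the actual WinEDS interface or procedure.

```python
# Sketch of the manual reorder task: move each contest into its base-order
# position, one operation at a time, then verify the tables are in sync.
# Contest names and the list model are illustrative assumptions.
def reorder_to_match(base_order, current):
    # Both tables must hold the same set of contests; only the order drifts.
    assert sorted(base_order) == sorted(current)
    order = list(current)
    for target_pos, contest in enumerate(base_order):
        # One manual move per contest, mirroring the tedious step-by-step task.
        order.insert(target_pos, order.pop(order.index(contest)))
    return order

base = ["Mayor", "Council Seat 1", "Council Seat 4"]
forced = ["Council Seat 1", "Council Seat 4", "Mayor"]  # 'forced' mismatch

repaired = reorder_to_match(base, forced)
assert repaired == base  # projected outcome: contest orders back in sync
```

The per-contest loop reflects why the report characterizes the process as entirely manual and labor-intensive: each contest must be repositioned individually and the result verified afterward.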
4. Inspection of the BPS/WinEDS Bridge Tool application:
Using the WinEDS Palm Beach County 2012 Municipal Election database and
accompanying BPS database files, edited the BPS data files. Then used the Bridge Tool to
import that data into the WinEDS database and examined the Village of Wellington contest
list order.
Contests were ordered as anticipated.
Conclusion:
Within the scope of this review of the WinEDS database, the Ballot Printing System (BPS), and
the WinEDS/Bridge Tool application, no evidence of a software flaw was found in any of the
elements assessed and neither testing group was able to replicate the software contest issue that
occurred in the March 13, 2012 Village of Wellington Election. Since no software flaw
occurred, the contest order was changed in the WinEDS database to verify that the user can
reorder a mismatched contest order. Existing event logs did not provide sufficient data to
determine what occurred in this election.
It is worth noting that the manuals and software were extremely difficult to use and
understand. Moreover, steps required to complete task sequences were sometimes not included
in the manual and/or online instructions, creating not only a delay in the time needed to
create an election, but also a hindrance to a user's ability to verify that election data is correct.
The manuals were cumbersome to interpret, time-consuming to comprehend, and instructions
were difficult to locate.
In addition, the vendor published a Product Advisory Notice in order to re-educate users about
the contest order issue [See Appendix D].
Recommendations:
1. Election officials need to review the contest list order in WinEDS, for both paper and the AVC
Edge II ballots, to verify that contest data is in sync, as explained in the vendor's Product
Advisory Notice.
2. Supervisors of elections should include an enhanced test deck during Logic and Accuracy
testing, as the enhanced patterning would allow the tester to see contest order errors before the
election is finalized. This recommendation will be discussed at the next Florida State
Association of Supervisors of Elections (FSASE) Conference in May 2012 and reminders will be
given periodically to supervisors of elections preceding the Fall 2012 elections.
3. User manual and reference guide clarity issues will be reviewed and addressed during the current
certification process for Dominion Voting Systems Inc./Sequoia WinEDS Release 4.0.175,
Version 1.
APPENDIX A – Sequoia Voting Systems WinEDS 3.1.077 Certificate of Approval.
APPENDIX B – DVS – March 20, 2012 Statement
APPENDIX C – DVS – Dominion Letter to Secretary of State
APPENDIX D – DVS – Product Advisory Notice, WinEDS Database Edits/Ballot Order
APPENDIX E – Florida Department of State Voting Systems Review Team
David R. Drury holds Bachelor's degrees in Mechanical Engineering, History, and Political Science
along with a Master's in Business Administration. Mr. Drury has thirteen years of aerospace
experience with Boeing and GE Aircraft Engines which included computer modeling of jet engine
performance and an extensive test background that included model tests, wind tunnel tests,
production engine tests, and flight tests. Mr. Drury earned several “GE Outstanding Achievement
Awards” and was nominated for “GE Aircraft Engines Product Quality Award” during 1990 at the
Evendale, OH facility. Mr. Drury also acquired experience in the electronics industry while at
General Dynamics Tallahassee Operations where he served as a Sr. Industrial Engineer - ISO 9000
Management Representative, and Lead Auditor. During that time, Mr. Drury also served as an
adjunct professor at FAMU – FSU College of Engineering where he taught statistical quality control
for two semesters. In his last position prior to joining state government, Mr. Drury was Director of
Quality Assurance for Martin Electronics, Inc. In March 2004, Mr. Drury joined the Bureau of
Voting Systems Certification as a Sr. Management Analyst II and was promoted to Bureau Chief in
December 2005. Mr. Drury is experienced with process audits, performance audits, and voting
system audits.
Linda Hastings-Ard graduated from Florida State University with a Bachelor's Degree in
Information Science & Technology. Prior to joining the Bureau of Voting Systems Certification, she
was the Systems Manager for the Florida Medicaid Reform Enrollment Broker Contract at
ACS/Xerox Corporation and previous to that held positions in both the public and private sectors,
including Florida Departments of Natural Resources and Revenue, Florida State University-College
of Business, University of Florida-College of Law, Pensacola Jr. College, Computer Aid, Inc. and
Camber Corporation. Ms. Hastings-Ard has diversified experience in systems/user acceptance
testing, business process and user specifications analysis, development, testing, and documentation.
Her primary responsibility is to manage and provide oversight for the Bureau's functional testing
and source code analysis activities during certification events and to conduct Election Day
observations as needed.
Cathy Cook joined the Bureau of Voting Systems in November 2011. She comes to BVSC with a
background in business analysis in educational, government, and private-sector fields. Her
specialties are requirements gathering, database design and management, integration and black box
testing, and project management. She has conducted business research for budding entrepreneurs and
inventors, helped design a new unified system for a state law enforcement agency, managed a
project to raise awareness for the Florida Organ & Tissue Donor Registry, and implemented system
changes to both the Florida Medicaid Health Care Enrollment and the Florida Prepaid College
Tuition programs, among other projects. As Systems Project Analyst, Cathy functions as the
Bureau's database systems expert, providing input for all technical projects as they relate to
relational databases. Cathy is a certified Project Management Professional (PMP®) and earned a
Master of Science in Library and Information Studies from Florida State University.
APPENDIX E – Florida Department of State Voting Systems Review Team (Cont'd)
Rondal "Jim" Halter earned a Bachelor's degree from Oberlin College with a double major in
Physics and Mathematics. He spent 5 ½ years in the US Navy as a surface line officer, assistant staff
civil engineer, and public affairs officer. Prior to joining the Bureau of Voting Systems Certification
in July 2011, Mr. Halter had nineteen years' experience in Software Engineering, Systems
Engineering, Configuration Management, and Quality Assurance. Mr. Halter is a member of the
certification source code analysis group. His primary responsibility is election system software
source code review, with corollary duties assisting the certification functional test group, and
participation in Election Day observations and attending election recounts as required.