IN ELECTION VOTING, DO PEOPLE TOUCH THE OBJECTIVE OR NOT?

Except where reference is made to the work of others, the work described in this thesis is my own or was done in collaboration with my advisory committee. This thesis does not include proprietary or classified information.

Gregory Rogers

Certificate of Approval:

Cheryl D. Seals, Associate Professor, Computer Science and Software Engineering
Juan E. Gilbert (Chair), Professor, Computer Science and Software Engineering
Christa Slaton, Associate Dean, College of Liberal Arts
George T. Flowers, Dean, Graduate School

IN ELECTION VOTING, DO PEOPLE TOUCH THE OBJECTIVE OR NOT?

Gregory Rogers

A Thesis Submitted to the Graduate Faculty of Auburn University in Partial Fulfillment of the Requirements for the Degree of Master of Science

Auburn, Alabama
August 10, 2009

Permission is granted to Auburn University to make copies of this thesis at its discretion, upon the request of individuals or institutions and at their expense. The author reserves all publication rights.

Signature of Author
Date of Graduation

VITA

Gregory Rogers is a master's student in the Computer Science and Software Engineering Department at Auburn University. Mr. Rogers received a Bachelor of Software Engineering degree from Auburn University in 2007. He is currently a graduate research assistant in the Human Centered Computing Lab at Auburn University. His interests include Spoken Language Systems, User Interfaces (Usability), Multimodal Interfaces, Human Computer Interaction, and Databases. Mr. Rogers is a member of the National Society of Black Engineers and Vice President of the Auburn University Black Graduate & Professional Student Association.

THESIS ABSTRACT

IN ELECTION VOTING, DO PEOPLE TOUCH THE OBJECTIVE OR NOT?

Gregory Rogers
Master of Science, August 10, 2009
(B.S., Auburn University, 2007)
60 Typed Pages

Directed by Juan E.
Gilbert

Election technologies have been constantly evolving since the first election in the United States of America. In 2008, during early voting, one problem that occurred was vote switching in West Virginia and, more recently, in Saline County, Kansas. Vote switching occurs when the voter's selection is given to a candidate other than the intended one. Many voters believed that this problem occurred because of technical issues, but another possibility is that the design of the interface was inherently flawed. The aim of this thesis was to investigate the extent to which interface design affects the outcome of an election. Specifically, people were asked to touch an objective in order to determine the role that interface design plays in the results of an election. A small group of technical individuals who interact with various interfaces on a regular basis was used to test the hypothesis. Findings show that the majority of the individuals touched the objective, but a small percentage was still unable to complete the task. This exploratory look suggests that there is a strong possibility of an error occurring with a voting system's interface that could cause vote switching and ultimately alter the election results.

ACKNOWLEDGMENTS

First and foremost I would like to thank Jesus Christ, my Lord and Savior, through whom all things have been made possible. Next, I would like to express my deepest gratitude to my advisor, Dr. Juan E. Gilbert, for his patient guidance and continued encouragement throughout my graduate studies. He has been extremely helpful to me and has no idea how much gratitude I have for him. I would also like to thank my graduate committee members, Dr. Cheryl Seals and Dr. Christa Slaton, for their reviewing and advising efforts. Additional thanks must also be given to the members of the HCCL lab for proofreading, advice, opinions, and support.
Special thanks go out to my family for believing in me and encouraging me in my decision to pursue my goals. To my mother Deidra and my brothers Darrin and James, thank you for your continued support and unwavering belief in me. To my future wife Alexandra, thanks for sticking by my side and encouraging me when times were hard. Finally, to my friends Williams Harden, Ashley Robinson, Joya Daniels, and Keith Ajayi, thank you all for your unwavering friendship.

Style manual or journal used: Journal of Approximation Theory (together with the style known as "aums"). Bibliography follows van Leunen's A Handbook for Scholars.

Computer software used: The document preparation package TeX (specifically LaTeX) together with the departmental style-file aums.sty.

TABLE OF CONTENTS

LIST OF FIGURES
LIST OF TABLES
1 INTRODUCTION
2 LITERATURE REVIEW
  2.1 Evolution of Voting Technology
    2.1.1 Voice Voting Method
    2.1.2 Paper-based Voting System
    2.1.3 Mechanical Voting System
    2.1.4 Scanner Voting System
    2.1.5 Direct Record Electronic System
  2.2 Problems with E-Voting Systems
  2.3 Requirements and Specifications of E-Voting
    2.3.1 User Interface Design Applied to Ballot Design
    2.3.2 Color, Contrast, Shading, and Fonts
    2.3.3 Legal Compliance
3 PROBLEM STATEMENT
4 EXPERIMENT AND ANALYSIS
  4.1 Introduction
  4.2 Development of Interface
  4.3 Method
    4.3.1 Participants
    4.3.2 Procedures
    4.3.3 Materials
  4.4 Analysis
    4.4.1 Data Collection Method
    4.4.2 Results
5 CONCLUSION AND FUTURE WORK
  5.1 Conclusion
  5.2 Future Work
BIBLIOGRAPHY
APPENDICES
  A INFORMATION SHEET
  B PRE-SURVEY
  C PRE-SURVEY RESULT
  D INTERFACE RESULT
  E SCREEN CAPTURE CODE

LIST OF FIGURES

1.1 Example of Interface Design
2.1 Example of County Election
2.2 Evolution of Lever Voting System
2.3 Sample Calibration Screen
2.4 Sample Calibration Process
2.5 Example of Miscalibration
4.1 Prime III System and Interface
4.2 Example of Multiple Contests Per Screen
4.3 Example of First Verification
4.4 Example of Second Verification
4.5 Example of a Summary Screen
4.6 The two interfaces used in the study
4.7 Pre-Experiment Survey Results
4.8 Example of the control interface
4.9 Example of the experimental interface

LIST OF TABLES

2.1 Voting equipment reported in the 2006 election [9]
2.2 Basic model for rolling DRE ballots

CHAPTER 1
INTRODUCTION

The use of voting technology is an area that is constantly changing. As voting technology evolves, new problems are inadvertently created that people in the voting population are trying to address. One of the major disadvantages in voting technology is in the area of usability. Several studies have been done, and continue to be conducted, to help address some of the problems that might arise with usability. When designing a voting system, many factors have to be understood based on previous difficulties that have occurred because of usability issues. Usability has always been at the forefront of problems in all voting systems: voice voting, paper-based systems, lever voting machine systems, paper scanning systems, and direct record electronic (DRE) systems. Voice voting (the first form of voting used in the United States) gathered people together in a meeting location, where they called out their votes [11]. Their votes were then recorded by an official clerk. Due to privacy issues, voice voting was later replaced by paper-based voting. Paper-based systems allowed voters to mark their selections on paper. Their selections were then placed into a ballot box.
Many corruption issues occurred with the paper system; therefore, newer systems such as the lever voting machine were implemented. In the lever voting machine, each candidate was assigned to a certain lever arm. The lever voting machine permitted a person to make a selection by pulling a lever arm. Researchers were still determined to find a way to address many of the previous usability issues, so systems such as the paper scanning systems and DRE systems were developed. One of the major paper scanning systems created was the optical scanner system [11]. It permitted a voter to mark their selection on a paper ballot that would be placed in a scanner machine to collect and tally the votes. With the advancements in usability between the paper scanning systems and the DRE systems, the DRE system has become the more popular voting system used in the United States. This system allows voters to use a touch screen to cast their vote for the candidate of their choice.

With the development of the different voting systems, new guidelines for standards and requirements had to be implemented in order for improvements to be made to the systems. The Help America Vote Act created the Election Assistance Commission (EAC) to create voting system procedures. These guidelines for voting systems can be used to establish whether the systems provide all of the required basic functionality, accessibility, and security capabilities. The EAC and the Federal Election Commission enforce the standards and requirements for each voting system. Under such standards, several issues needed to be addressed when creating a new system, such as people being able to vote equally, independently, and privately. In addition to these standards, requirements such as the order of the offices on the ballot, the design of the ballot, colors, and even the interface itself have to be addressed.
When creating a voting system, the design of the system has to address usability; therefore the developer needs to know the problems of previous systems' usability designs. In West Virginia and Saline County, many voters had a problem with a DRE system. Voters entered the polling place to cast their vote, and when they tried to make a selection for one candidate their vote went to another candidate. Many voters thought their vote had been switched to a different candidate, but in actuality the DRE system was correctly cataloging the selection made by the voter. The confusion arose from the positioning of the names of the candidates. The names of the candidates were positioned at the top of the button on the touch screen, as shown in Figure 1.1. Due to the location of the candidate's name on the button, the voter tended to touch the button positioned above the one that contained the target candidate. This led voters to believe that their votes were switched or given to another candidate.

The aim of this study was to determine how people interact with voting systems and whether this design problem can change the outcome of an election. A prototype of the voting system used in West Virginia and Saline County was implemented in order to research the design problems of their voting interface. In Chapter 2 of this thesis a literature review can be found, which gives a brief description of the history of voting, problems with DREs, voting system standards, and their requirements. A summary of the literature review and a definition of our research problem appear in Chapter 3. Chapter 4 introduces the voting system that was used and describes how it was designed and implemented; the method, data, and results of the usability study are described and analyzed as well. The conclusion and future work are presented in Chapter 5.

Figure 1.1: Example of Interface Design
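The mis-touch problem described above can be made concrete with a small sketch. The geometry here (stacked 100-pixel-tall candidate regions, names drawn at the top edge of each region) is an illustrative assumption, not the actual dimensions of the West Virginia or Saline County machines: a voter who aims at a candidate's name and lands a few pixels high falls inside the region of the candidate listed above.

```python
# Hypothetical geometry: candidate touch regions stacked vertically,
# each BUTTON_HEIGHT pixels tall, with the name drawn at the TOP edge.
BUTTON_HEIGHT = 100

def hit_test(touch_y, num_candidates):
    """Return the index of the candidate region containing the touch."""
    index = touch_y // BUTTON_HEIGHT
    if 0 <= index < num_candidates:
        return index
    return None

def label_y(candidate_index):
    """Vertical position of the candidate's name: the region's top edge."""
    return candidate_index * BUTTON_HEIGHT

# The voter aims at candidate 2's name but touches 10 pixels above it:
aim = label_y(2)
selected = hit_test(aim - 10, num_candidates=4)
print(selected)  # 1 -- the candidate ABOVE the intended one is selected
```

Centering the name within the touch region (or extending the region above the label) removes this bias, which is essentially what the experimental interface in Chapter 4 investigates.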
CHAPTER 2
LITERATURE REVIEW

2.1 Evolution of Voting Technology

2.1.1 Voice Voting Method

The process of voting has changed from being a very public process to a more private one. In early voting methods, if people wanted to vote they would show up in a public location and call out the name of the candidate that they would like to vote for in the election. This was known as voice voting. Figure 2.1 illustrates how the voting process was done during this time. One of the main reasons that the voting process was done this way was to easily recognize a person trying to vote more than once. The other way of securing the vote, in addition to the public nature of the voting process, was that voters were sworn in by placing their hand on the Bible, stating that they had not voted and had the right to vote. There was no concept of voter registration, so this oath and the possibility that the judge or someone else in the vicinity of the polls might recognize them if they came back was all that prevented a voter from voting again and again [11]. There were several clerks who kept a tally of the votes being cast in separate polling books. The fact that there was no ballot box to stuff was one advantage of the voice voting process. On the other hand, a major disadvantage was that there was no privacy, so anyone could sell their vote or be persuaded to vote for a certain candidate. The process of voice voting was effective for many years until the paper ballot came about and changed the voting process.

2.1.2 Paper-based Voting System

The first paper ballots were pieces of paper provided by the voter with their written votes on them. Paper ballots first appeared in Rome in 139 BC and in the U.S. in 1629.
By the time the 12th Amendment to the United States Constitution was passed, it was clear that the term ballot was routinely taken to refer to slips of paper on which the names of candidates for office were written [11].

Figure 2.1: Example of County Election

The concept of the preprinted ballot was supported, and many times supplied, by the candidates and the political parties. Some of the political parties and/or candidates would print their ballot slips on distinctive paper. This made the privacy of the ballot obsolete, because the vote cast by the voter would then be identifiable to those observing the election. It took a while before many states would allow the use of preprinted ballots. There was fear that, since voters were allowed to insert their own ballots into the ballot box, an individual would insert more than one ballot into the box. There were many objections to the progression towards privacy in the voting process.

2.1.3 Mechanical Voting System

Lever voting machines were designed to bring about a new level of security and privacy in the voting process. The lever voting systems were designed to protect the voter and to make the process of casting the ballot perfectly plain, simple, and secret [11]. The machines were implemented in many of the larger states by the 1930s. Automatic Voting Machines (AVM) and the Shoup companies held the market on the manufacturing of the lever voting machine.

(a) First Lever Machine Booth (b) Today's Lever Machine Booth
Figure 2.2: Evolution of Lever Voting System

Figure 2.2 shows the design of both the AVM and the Shoup machines, which were very similar, as both implemented the tabular ballot layout, in which the lever was located at the intersection of the particular row and column used to record a vote. They were designed to eliminate the following: overvoting, questions about ballot interpretation, and vote count manipulation.
Unfortunately, there was no way to confirm the individual's vote.

2.1.4 Scanner Voting System

The idea for the scanning system came from the IBM Type 805 test scoring machine. This machine was used with standardized tests such as the SAT, reading the pencil markings on test paper by electrical conductivity. It was used until the 1950s to score exams. Later, a professor at the University of Iowa developed a similar product for the college entrance exams. His prototype was the first sample of the optical mark-sense test scoring machine. The rights to this system were later sold to Westinghouse Learning Corporation. At Westinghouse, they experimented with the system and continued to investigate the idea of reading information from a scanned ballot [11]. Several versions of the mark-sense system began to emerge in the California area. The first system developed weighed about 15,000 pounds. A few years later another, called the Votronic ballot tabulator, was developed, which was much smaller and easier to operate than the first system. In the late 1970s, American Information Systems (AIS) emerged and later became known as Election Systems & Software (ES&S) [11]. Later, ES&S joined with the Business Records Corporation. ES&S developed many new types of optical mark-sense systems: the Model 150, known for its speed and widely used in small counties, and the faster Model 550, which is used in larger counties. In today's voting society a newer version of the optical mark-sense system is used, which is often referred to as the optical scan system. The optical scan system uses the same basic ballot design. The ballot is fed into the system and scanned by the optical scanning system. Once scanned, the votes from the ballot are tabulated to produce results for the election.

2.1.5 Direct Record Electronic System

There were many attempts at the design of electrical vote recording systems.
The first such system was designed by Albert Henderson in 1850. He designed an electrochemical vote recorder for legislative roll-call votes. Henderson's system recorded votes of aye or nay. In 1869 Edison added electrochemical counters to tally the votes. It was not until 1974 that the "first" direct-recording electronic voting system, the Video Voter, designed by McKay, Ziebold, Kirby, et al., was used in a public election [11]. Fidlar-Doubleday's Electrovote 2000 voting machine was a mix of technologies: an IBM PC compatible with a touch screen in a security case (the case prevented tampering) and a voting booth that consisted of a table and a privacy screen made of plastic [11]. The Microvote is one of the older versions of a DRE voting system. This system printed the results on a preprinted ballot that was then placed where the voter could not touch it. Most DRE systems used today are very simple and easy to use. A voter enters the voting booth and interacts with a touch screen interface to make their selections. Once the voter has made their selections, they touch a button on the screen to cast their ballot. As the DRE system changes, developers are trying to find more ways to make this voting process even smoother during election time.

2.2 Problems with E-Voting Systems

Developed in the 1970s, touch screens earned praise for their intuitiveness and the users' ability to point and select items on the screen at a fast pace. Touch screens had been widely used in many domains, from banking applications to public information displays, but not until 2006 had they seen wide use in elections. Now over 66 million registered voters use DRE voting systems [9]. As of the 2006 elections, jurisdictions with 63% of the nation's registered voters had changed their voting system, marking the largest shift in voting equipment in history [9].
The switch made by many jurisdictions from paper-based systems to DREs can be credited to the Help America Vote Act, a federal law passed in October of 2002 that required all states to replace some paper-based systems with new election technology that allowed for an accurate and efficient election and also allowed people with disabilities to cast their ballot secretly. Table 2.1 summarizes the Election Data Services findings on voting equipment reported for the 2006 elections across the nation [9].

Voting Equipment Reported for the 2006 Elections

                                Counties                Registered Voters
Type of Voting Equipment   Number   Percentage       Number       Percentage
Punch Card                    124         3.98      5,166,247           3.03
Lever                         119         3.82     17,356,729          10.18
Paper Ballots                 176         5.65        653,704           0.38
Optical Scan                1,502        48.23     69,517,991          40.79
Electronic                  1,050        33.72     66,573,736          39.06
Mixed                         143         4.59     11,154,765           6.55
TOTAL                       3,114        100.0    170,423,172          100.0

Table 2.1: Voting equipment reported in the 2006 election. [9]

With a reasonable percentage of people now voting on DREs, the imperfections of DREs, like those of their predecessors, started to become visible. DREs were questioned in Heber Springs during the mayoral elections of November 2006, where voters claimed to select one candidate and, because of calibration issues with the touch screen, another candidate was selected [16]. The miscalibration of the touch screens in Heber Springs could have occurred accidentally from a change in temperature or humidity in the precinct, vibration, or shock. The system could also have been miscalibrated intentionally by an election administrator to discriminate against candidates. Intentionally miscalibrating the touch screen can easily be achieved by selecting the wrong part of the screen during touch screen calibration, thereby making it difficult to select a candidate. The process of miscalibrating the screen can be seen in Figures 2.3, 2.4, and 2.5.
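The effect of a skewed calibration can be sketched in one dimension. Real digitizers fit a two-dimensional affine transform from several calibration targets; the simplified model below (screen = scale * raw + offset, fit from two points) is an assumption for illustration only. If whoever runs calibration touches a fixed distance away from each crosshair, every later touch is reported shifted by that same distance.

```python
# Minimal 1-D calibration sketch: screen = scale * raw + offset,
# fit from two (raw reading, on-screen target) calibration pairs.

def fit_calibration(raw_points, screen_targets):
    """Fit (scale, offset) from two calibration touches."""
    (r0, r1), (s0, s1) = raw_points, screen_targets
    scale = (s1 - s0) / (r1 - r0)
    offset = s0 - scale * r0
    return scale, offset

def to_screen(raw, calibration):
    scale, offset = calibration
    return scale * raw + offset

targets = (100, 900)                       # where the crosshairs are drawn
honest = fit_calibration((100, 900), targets)
# Miscalibration: each crosshair was touched 30 raw units to the right.
skewed = fit_calibration((130, 930), targets)

touch = 500                                # a voter's later raw touch
print(to_screen(touch, honest))  # 500.0 -- reported where the voter touched
print(to_screen(touch, skewed))  # 470.0 -- reported 30 units off target
```

With the skewed calibration, a touch squarely on one candidate's button can be reported inside a neighboring button, which matches the Heber Springs complaints described above.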
Figure 2.3: Sample Calibration Screen
Figure 2.4: Sample Calibration Process
Figure 2.5: Example of Miscalibration

Candidate selection can also prove difficult when the voter places two fingers, or another hand, on the touch screen. DREs are single-touch systems: when two touches are present, the software takes the average of the two touch positions and sets the result as the current position of the pointing device. Because of this, selecting an option can be difficult for the voter. The same problem applies to debris on the screen that is recognized as a touch. Debris can intentionally be placed on the edge of the screen and, with enough pressure, could cause the touch screen readings to be thrown off [12]. Dirty screens also make DREs susceptible to selection problems. As people vote on a DRE they continuously touch the screen, leaving dirt and smudges, thereby making it difficult for the next person to read the options and make selections.

When arguing the positives of DREs, many people state that DREs have given people with disabilities the chance to vote secretly and independently for the first time. Though this is true, many disabled voters have issues with DREs, including lack of responsiveness and navigation problems that involve poor sound and hard-to-read Braille [4]. Price is another issue that is not overlooked when discussing DREs. With prices ranging from $4,000 to $12,000 per machine, most locales cannot, in hard economic times, commit to purchasing such machines and/or maintaining their upkeep. The problems with DREs and past election technologies have led to the development of a multimodal virtual reality voting system that provides voters with the accuracy, efficiency, and accessibility that they have been searching for.
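The two-touch averaging behavior described above can be sketched directly. The coordinates are made up for illustration, but the averaging rule is the one the text describes: a stray second contact (a second finger, a resting palm, or debris at the screen edge) pulls the reported position away from the intended candidate.

```python
# Sketch of single-touch digitizer behavior: simultaneous contact points
# are averaged into one reported cursor position.

def reported_position(touches):
    """Average all simultaneous (x, y) touch points."""
    xs = [x for x, _ in touches]
    ys = [y for _, y in touches]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

intended = (200, 450)   # finger on the intended candidate's button
debris = (200, 50)      # debris pressing at the top edge of the screen

print(reported_position([intended]))          # (200.0, 450.0) -- correct
print(reported_position([intended, debris]))  # (200.0, 250.0) -- wrong row
```

Note that the shifted position lands halfway between the two contacts, so even light edge pressure can move a selection several candidate rows away from the voter's finger.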
2.3 Requirements and Specifications of E-Voting

2.3.1 User Interface Design Applied to Ballot Design

One of the serious problems in the process of designing user interfaces is determining the users' mental model surrounding the problem. Specifically in the area of ballot design, there is the problem of determining how people think about voting. For example, the term abstain is used in many legal and professional communities to indicate that an individual did not vote; to a programmer, however, an abstention would be considered, and would probably be stored, as a null value. Both are effectively irrelevant, even though abstain is the proper terminology, because the average voter would probably state that they omitted the race. This is important since it is the average voter for whom the interface is designed. However, a larger problem arises when an individual incorrectly assumes a skipped vote indicates an undervote. This emphasizes the importance of fully analyzing the terminology in user interfaces, including the design of ballots. The Election Assistance Commission (EAC) recognizes this fact, advises designers/programmers to use clear, concise language (simple language) for all content [19], and provides many best practice recommendations for ballot design. After reviewing the EAC's recommendations for DRE ballots, the following basic model can be derived from their example interface template [18]:

Interface Template
Object               Action
Ballot               Begin
Contest              Vote
Question/Referenda   Review
Choice               Return
Name/Candidate       Skip
Selection            Back
Vote                 Next
Write-in             Cancel
Review Screen        Accept
Cast                 Read More

Table 2.2: Basic model for rolling DRE ballots

2.3.2 Color, Contrast, Shading, and Fonts

In many paper voting systems and DREs, ballot design has been a major issue.
Poor design of a ballot can lead to confusion and potentially a large number of spoiled or mismarked ballots. When designing a ballot, the designer needs to think about the following: color, contrast, shading, and fonts. These items are part of the ten election design guidelines by the American Institute of Graphic Arts (AIGA), the professional association for design [17]. The color and shading should be consistent throughout the entire ballot; they support navigation and signal instructions, contents, and contests on each ballot. Consistency is key when designing a ballot. The fonts need to be the same on each of the ballot pages, along with the shading and color. Designers want to avoid introducing new fonts that can cause the eye to stop reading and have to adjust. For touch screen systems, sans-serif fonts such as Arial, Univers, and Verdana, which have clean strokes, should be utilized. When designing a ballot, letters should be in upper and lower case because that format tends to be easier to read. Entirely capitalized words have an indistinct silhouette and are harder to read; according to AIGA, title case is easier on the eyes and has a distinct silhouette. Thus, when designing a ballot the designer should think about universal design and not just what is good for a few people. It is also important to consider that there are colorblind people in the world, so the designer should use colors that work for them as well as for the rest of the population.

2.3.3 Legal Compliance

Yet another aspect of ballot design is legal compliance. Ballot designs must consider and adhere to federal as well as state statutes. Responsibility for the preparation of ballots and other supplies needed to conduct all state, county, and federal elections belongs to the judge of probate as chief elections officer for the county [2].
Some of the legal compliance requirements for ballot design are the following:

When considering the arrangement of the offices on a primary ballot, the names of the candidates must be listed on the ballot in alphabetical order by surname, and the offices must be arranged in this specific order [2]:

- President
- Delegate to the National Convention
- Governor
- Lt. Governor
- U.S. Senator
- U.S. Representative
- Attorney General
- State Senator
- State Representative
- Supreme Court Justice
- Court of Civil Appeals Judge
- Court of Criminal Appeals Judge
- Secretary of State
- State Treasurer
- State Auditor
- Commissioner of Agriculture and Industries
- Public Service Commissioner
- State Board of Education Member
- Circuit Court Judge
- District Attorney
- District Court Judge
- Circuit Clerk
- Other public officers (to be listed in the order prescribed by the probate judge)
- Other party officers

The lists of party candidates in general elections need to be placed in parallel columns across the face of the ballot, in alphabetical order beginning at the left, with each party column of candidates headed by the party's designation and emblem. In the event of independent candidates, their names are placed in a column to the right of the last column of the party nominees. A blank column is provided to the right for possible write-in candidates [2].

The Electronic Voting Committee Administrative Code Chapter 307-X-1 lists the procedures for electronic vote counting systems. These laws are specifically designed for the mark-sense ballot counters currently used by the state. Section 17-2-4 of Alabama's Election Code states the voting system requirements, vote standards, uniform polling system, and purchase of equipment [2]. On or before January 1, 2005, each voting system used in an election must meet the following requirements:

1.
The voting system shall:

   (a) Permit the voter to verify, in a private and independent manner, the votes selected by the voter on the ballot before the ballot is cast and counted [1].

   (b) Provide the voter with the opportunity, in a private and independent manner, to change the ballot or correct any error before the ballot is cast and counted, including the opportunity to correct the error through the issuance of a replacement ballot if the voter was otherwise unable to change the ballot or correct any error [1].

   (c) If the voter selects votes for more than one candidate for a single office:

       i. Notify the voter that the voter has selected more than one candidate for a single office on the ballot.
       ii. Notify the voter before the ballot is cast and counted of the effect of casting multiple votes for the office.
       iii. Provide the voter with the opportunity to correct the ballot before the ballot is cast and counted.

2. A voting system may meet the requirements of paragraph (c) of subdivision (1) by:

   (a) Establishing a voter education program specific to that voting system that notifies each voter of the effect of casting multiple votes for an office.

   (b) Providing the voter with instructions on how to correct the ballot before it is cast and counted, including instructions on how to correct the error through the issuance of a replacement ballot if the voter was otherwise unable to change the ballot or correct any error.

3. The voting system shall ensure that any notification required under this section preserves the privacy of the voter and the confidentiality of the ballot.

   (a) The voting system shall produce a record with an audit capacity for such system.

   (b) The voting system shall produce a permanent paper record with a manual audit capacity for such system.

   (c) The voting system shall provide the voter with an opportunity to change the ballot or correct any error before the permanent paper record is produced.

   (d) The paper record produced under paragraph (a)
shall be available as an official record for any recount conducted with respect to any election in which the system is used.

4. The voting system shall:

(a) Be accessible for individuals with disabilities, including nonvisual accessibility for the blind and visually impaired, in a manner that provides the same opportunity for access and participation, including privacy and independence, as for other voters [1].

(b) Satisfy the requirement of subdivision (1) through the use of at least one direct recording electronic voting system or other voting system equipped for individuals with disabilities at each polling place [1].

    i. The voting system shall provide alternative language accessibility pursuant to the requirements of Section 203 of the Voting Rights Act of 1965 (42 U.S.C. 1973aa-1a) [1].
    ii. The error rate of the voting system in counting ballots, determined by taking into account only those errors which are attributable to the voting system and not attributable to an act of the voter, shall comply with the error rate standards established under Section 3.2.1 of the voting systems standards issued by the Federal Election Commission which are in effect on June 19, 2003 [1].

Once the ballot is correctly designed according to the preceding legal requirements, there needs to be a way to ensure accuracy. A voter verification screen is one solution that tries to improve the voter's accuracy. Before the ballot is cast, the voter is presented with the selections made during the voting process. The voter can choose to "accept" the selections and move to the next step, or "reject" the selections and make changes to their ballot choices. Two options were considered for the voter verification screen in order to satisfy the legal requirements.
Option A:

The EAC best practices, based on the Voluntary Voting System Guidelines (VVSG), suggest that the voter be able to review ballot selections at any point in the system [18]. They note that some voters may want to skip forward to contests they deem more important and then skip back [19]. While using the review ballot selections screen in an EAC study, research participants could choose a contest to alter their votes [19]. Upon touching the contest, the participant went to the contest screen and could then alter the selections. However, participants got lost during the navigation [19]. To avoid confusion and getting lost in the ballot, the voter should be able to review the ballot from any contest screen. From the review screen, where the voter sees the entire ballot and can scroll down to view all contests, the voter must return to the previous contest screen. When the voter has finished voting on the final contest screen, they will be at the summary screen. At the summary screen, the voter may scroll up and down to view selections; contests that are undervoted will be noted. The voter may choose the contest they wish to alter by touching it, at which point they will be sent to that contest's screen. In the above EAC study, "Participants had difficulty understanding their next step after moving from the summary screen to a contest screen - many wanted to return to a summary page to pick up where they left off" [19]. Therefore, the voter will then only be able to navigate back to the summary screen, where they may repeat the process to change undervotes or alter selections. On the selected contest, the EAC also recommends to "remove all navigation except 'Help' and 'Return to Summary' when coming from summary page" [19]. Once the voter has finished reviewing each contest from the summary screen, they may choose to "accept choices." After "accepting," they will go to a confirmation page.
From the confirmation page the voter may scroll up and down to view their selections, go "back" to the summary page, or "cast my vote." On the "cast my vote" screen the voter is asked, "Do you want to cast your ballot?" The choices under the question are "Yes" or "No, go back to review my choices" [18]. If the voter chooses "Yes," the ballot will be cast and they will go to a screen that says, "Your ballot has been cast! Thank you for voting!" [18]. If they choose "No," they will go back to the summary screen and repeat the process to change their selections.

Option B:

Contrary to the Election Assistance Commission VVSG, reviewing ballot selections from any point in the ballot can be confusing for voters. To remove the possibility of confusion, the voter should be able to review all contest selections only at the summary screen. As before, they scroll up and down to view selections; undervoted contests will be noted to draw the voter's attention to them. The voter proceeds through the contests within the ballot as in Option A and casts the ballot as in Option A. The difference is the removal of the option to review the ballot throughout the balloting process. This should eliminate chances for voter confusion by simplifying the process so that, regardless of computer familiarity, the voter feels comfortable voting.

Terms, as used in this section:

1. The review screen is used to view the ballot and selections. It is accessed from any contest screen before choosing to go forward on the final contest screen on the ballot.

2. A contest screen is any screen that the voter uses to make electoral decisions.

3. The summary screen is the screen that comes up after choosing to go forward on the final contest screen. From this screen the voter navigates to a contest or "accepts choices" to move on to confirmation. After altering a contest, the voter may only return to the summary screen.

4. The confirmation screen comes after the voter "accepts choices" on the summary screen.
The voter may only choose to go back to the summary screen to keep altering selections, or to cast the ballot.

5. The cast ballot screen asks the voter if they want to cast their ballot. The choices are "Yes" or "No." Choosing "No" takes the voter back to the summary screen; "Yes" takes the voter to the thank-you screen.

6. The thanks for voting screen tells the voter their ballot has been cast and thanks them for voting. It is accessed after a choice of "Yes" on the cast ballot screen.

CHAPTER 3
PROBLEM STATEMENT

During the 2008 presidential election, several West Virginia voters encountered numerous difficulties. The voters went into the polling station and tried to cast their vote for a candidate on the touch screen ballot. While trying to press the candidate's name on the touch screen, the voter's vote was changed and given to a different candidate than the one the voter tried to select. For example, one of the voters, by the name of Matheney, said, "When I touched the screen for Barack Obama, the check mark moved from his box to the box indicating a vote for John McCain" [14]. This led the voters to believe that their votes were flipped/switched to a different candidate. Matheney, the voter who stated the vote was switched to a different candidate, reported the problem to the election poll worker, and the poll worker told her not to touch the screen so hard. Matheney was also told to try using just her fingernail to press the screen. The voter attempted to cast her vote again; it took three tries before the vote was finally cast. After the incident, voters began to question and distrust the voting system. After viewing video of people using the system utilized in West Virginia, it was discovered that a design problem was a reason for the switching of the vote. In Figure 1.1, the voter is trying to touch Barack Obama's name on the touch screen and the vote is going to John McCain, which is similar to Matheney's statement.
When the voter tried to touch the candidate button as seen in Figure 1.1, the voter touched the candidate name that was positioned at the top of the button. Due to the positioning of the candidate's name on the buttons, voters tend to touch the button positioned above their intended choice. Recently, a similar problem occurred in Saline County, Kansas, where voters believed their votes were switched. Therefore, this thesis is about the interface design used in West Virginia and Saline County. The interface had design issues that surfaced during the voting process. The problem appeared to be vote switching (when the voter attempted to touch a candidate name, their vote was given to another candidate), but may actually have been a simple design issue that led people to believe vote switching was occurring. The aim of this research is to address the issue of design to determine to what extent poor design, which results in a voter not being able to touch the objective, affects the outcome of an election.

CHAPTER 4
EXPERIMENT AND ANALYSIS

4.1 Introduction

The primary objective of this study is to analyze how people interact with electronic voting systems during the voting process and to investigate whether the problem seen in West Virginia and Saline County can change the outcome of an election. The goal is to determine whether voters make their selection by touching the text on the buttons or by touching the button itself. In order to explore this assumption, a voting system similar to the ones used during those elections, with the same design and interface, was utilized in this research. It is expected that the interface will prove effective and the results will support the usability of the system. However, it is also expected that the text in the control and experiment groups will be touched more often, which could result in some error during voting.
4.2 Development of Interface

After reviewing the design problem that occurred in West Virginia, and an even more recent election in Saline County, this was a great opportunity to study the design of the voting system. The voting system was a DRE system, where voters interact with a touch screen to make their selections. One of the major observations was the position of the candidate name on the button, which some believe was the cause of the vote switching. The candidate's name was positioned in the top left-hand corner of the button, and in front of the candidate name was a checkbox. When a voter made a selection, a check was placed in the checkbox. After observing the position of the candidate's name, the idea of using a voting system to test how voters interact with a voting system based on the position of the text on the button was formulated. How does touching the candidate name affect the outcome of an election? In order to test this hypothesis, a voting system had to be used to test the design of the system used during those elections. Prime III, a voting system prototype developed at Auburn University, was used. Prime III has both a touch screen and a voice interface that can be used separately or in conjunction, as shown in Figure 4.1(a). At the voting station, the voter used the touch screen to cast her/his ballot during the study. Displayed on the touch screen were large fonts with neutral colors for people who are colorblind. The interface also takes into consideration voters with physical impairments that limit their dexterity. The voice component of Prime III gives a person who is unable to see the opportunity to vote on the same machine. With the voice component of Prime III removed, the ballot design remained the same: each contest screen listed only the selections for that contest. The ballot design removes ambiguity and confusion by showing the candidates for one office at a time and ballot questions one at a time.
Other systems show multiple contests per screen, which increases the potential for voter confusion and possible undervoting, as shown in Figure 4.2. The Prime III interface highlights the number of races remaining. This is done because Prime III requires the voter to go through each race, which ensures that there are no undervotes from accidentally missing a race. The voter does not have to vote in each race and can skip to the next race by choosing a continue button, or click the clear button to clear a wrong vote. When a selection is made there is a short delay to address double clicking or finger dragging, as observed in voters with shaky hands. With many systems, when a double click occurs, the voter either does not see the second screen or believes that the system has made a selection for them [8]. Prime III is designed so that the voter sees each race and does not skip any race. Once the voter selects their choice, the system goes to the next office or to the first verification process. On the first verification, the voter can make changes to the selections made by touching that contest on the screen, thus going back to the office, and then changing their vote, as shown in Figure 4.3. If the selections are correct, then they move on to the second verification in Figure 4.4. When the second verification screen is complete, the voter can cast their ballot and the process is complete. Finally, the voter is presented with a summary of their ballot in Figure 4.5.

[Figure 4.1: Prime III System and Interface. (a) A picture of the Prime III system; (b) a screenshot of Prime III]
[Figure 4.2: Example of Multiple Contests Per Screen]

Prime III is a unique system in the area of voting because people, regardless of their disabilities, can still vote on the same machine. Due to the interface of Prime III, the hypothesis is testable, and the setup simulates how the voter interacted with the DRE system in West Virginia and Saline County.
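The contrast between the two button designs at issue, the West Virginia-style layout with a checkbox and name in the top left-hand corner versus a layout with the name centered on the button, can be sketched in Swing. This is a hypothetical illustration (assuming Java/Swing, consistent with the NetBeans-based modifications described in the Materials section), not Prime III's actual code; the class and method names are invented:

```java
import javax.swing.*;
import java.awt.*;

// Hypothetical Swing sketch of the two candidate-button layouts compared in
// this study. Not Prime III's actual code; names are invented for illustration.
public class CandidateButton {

    // Control layout (West Virginia style): a checkbox and the candidate's
    // name sit in the top left-hand corner of a large touch target.
    public static JButton controlStyle(String candidate) {
        JButton button = new JButton();
        button.setLayout(new FlowLayout(FlowLayout.LEFT, 5, 5));
        JCheckBox check = new JCheckBox(candidate);
        check.setOpaque(false);
        check.setFocusable(false);              // the surrounding button takes the touch
        button.add(check);
        button.addActionListener(e -> check.setSelected(true));
        button.setPreferredSize(new Dimension(420, 90));
        return button;
    }

    // Experimental layout (Prime III style): the candidate's name is centered
    // on the button, where voters tend to aim their touch.
    public static JButton experimentalStyle(String candidate) {
        JButton button = new JButton(candidate); // Swing centers button text by default
        button.setFont(button.getFont().deriveFont(Font.BOLD, 28f));
        button.setPreferredSize(new Dimension(420, 90));
        return button;
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Button layout sketch");
            JPanel panel = new JPanel(new GridLayout(0, 1, 0, 10));
            panel.add(controlStyle("John Adams"));
            panel.add(experimentalStyle("John Adams"));
            frame.add(panel);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```

The sketch makes the design difference concrete: in the control layout the visible label sits at the top edge of the touch target, so a finger aimed at the label lands near the boundary with the button above, which is exactly the failure mode the study investigates.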
Prime III was modified and new items were added for the study in order to track where the voter touched the screen. There were two versions of the voting system used in the study. The first, the control prototype, was just like the voting system used in West Virginia: the candidate's name was positioned in the top left-hand corner, and before the candidate's name was a checkbox. The second version was the experiment prototype, which is the interface design used by Prime III: the candidate's name was positioned in the center of the button. With these two prototypes, the hypothesis was tested by simulating how the voter interacts with the voting system: whether the voter touches the name or the button itself.

[Figure 4.3: Example of First Verification]
[Figure 4.4: Example of Second Verification]
[Figure 4.5: Example of a Summary Screen]

4.3 Method

4.3.1 Participants

The participants were recruited on a volunteer basis. Mass e-mails were sent to several computer science departmental courses at Auburn University. The e-mail requested dates and times that the participant could meet, and a reply e-mail was sent with the date, time, and location of the study. All students, both undergraduate and graduate, were accepted. Participants had to be at least 19 years of age, and there were no potential harms from participating in the study. A total of 34 volunteers were recruited to participate in this study at Auburn University.

4.3.2 Procedures

All of the participants were at least 19 and consented to being participants in a research study entitled "Prime III Voting System." The first task entailed participants reading the information form (see Appendix A) about the experiment, which gave the participant details about the study.
Once the participants read over the information form for the study, they were asked to fill out a pre-survey to obtain information about their background, education level, computer use, and other pertinent information (see Appendix B). To maintain anonymity, each participant was given a unique identifier for the study on the pre-survey. The participant was positioned in front of a touch screen which displayed the interface of Prime III. Before the participant could start voting, an access code was entered in order to start the voting system. At the touch screen, the participant was simply instructed to vote for any candidate on the screen, or they could write in a candidate for the office currently displayed. When the participant was done and had cast their ballot, they were asked to repeat the same process; they were instructed that they did not have to vote for the same candidate again. By voting twice, each participant was able to use both the control prototype and the experiment prototype. The only independent variable changed was the interface design. The participants were randomly assigned the order of the control prototype and the experiment prototype, shown in Figure 4.6.

[Figure 4.6: The two interfaces used in the study. (a) The control interface; (b) the experimental interface]

- Control prototype: text placed in the top left-hand corner of the button
- Experiment prototype: text placed in the center of the button

All participants were told not to discuss the experiment with friends and classmates to ensure that all participants had equal knowledge of the study. During the experiment, a set of resulting data was collected by following specific data collection methods.

4.3.3 Materials

There was a variety of equipment, software, and technology used in this study. In order to perform the study, the voting system first had to be set up.
The voting system consisted of a touch screen monitor, a notebook computer, and the Prime III software. The Prime III software had to be modified to fit the study. The modification consisted of a way to track where on the touch screen the participant touched the button. Once the user touched the touch screen, a screenshot of the interface was captured to record a snapshot of where on the screen the participant touched. A blue dot marked the exact location of the touch on the screen. The code for this feature can be seen in Appendix E. Other materials used in the study were the information sheets offered to the participants (see Appendix A) and the surveys given to the participants during the study (see Appendix B). The technology used to modify the Prime III voting system was NetBeans, a free open-source Integrated Development Environment (IDE) for software developers. It allowed the modifications needed to Prime III in order to capture the screen with the position where the participant touched the touch screen. The experiment's results were also stored and manipulated in Microsoft Access and Microsoft Excel.

4.4 Analysis

4.4.1 Data Collection Method

In order to evaluate the participants, the first step was to collect background information, because some groups of people will be more likely to perform well on a computer system than others; certain demographic information must be collected to determine the validity of the results of the survey. Participants were given a pre-survey to complete before using the sample electronic voting system, consisting mainly of two parts. The first part of the survey asked mainly demographic information about the participant, such as age, education level, and other related information. The second part of the survey dealt with familiarity with computers and electronic systems.
This section included questions relating to how comfortable the user feels using a computer and how many hours the user spends using a computer during a typical week, along with other similar questions. It was important to determine this information because the more comfortable the subjects were with computers in general, the more likely they were to succeed in using the electronic voting system. A major goal of the design is to make it acceptable to all possible voters, so the best possible subject group would include a wide range of computer experience. Because of the venue chosen for the experiment, a less than optimal group of subjects was utilized. Most of the subjects said that they felt comfortable with computers and used them on a regular basis; most were also young and well-educated. The group did include a few subjects from other demographics, which does help identify trends in the data. Regardless of this fact, the study results are still valid because the subjects had no prior experience with this system and little to no prior experience with electronic voting machines. While the results may be different for a general population of voters, the overwhelming success demonstrates that the design created for the study has positive attributes that should be taken into account when allowing the voter to interact with the electronic voting system. The results of the survey are discussed in more detail in the results section.

4.4.2 Results

This section describes the quantitative data gathered from the questionnaire concerning participant background, along with the screenshots captured when participants made selections in the system.

Participant Results

As shown in Figure 4.7(a), there were 26 males, 7 females, and 1 participant who did not indicate a gender; this participant's survey was removed because some of the other information was also left blank.
All the remaining participants were included and filled in the information completely. The age range of participants was 19-29 with a mean of 20.12 (SD = 2.07) and a median of 19, shown in Figure 4.7(b). No one in the study reported any disabilities as asked in the survey. In Figure 4.7(c), the educational attainment of the participants was largely high school completion (94%), with the remaining 6% having obtained a bachelor's or master's degree.

Interface Results

After the participants completed the pre-experiment survey, they were asked to begin using the Prime III voting system. From the Prime III system, several pieces of information were gathered from the participants' interaction. There were two interfaces used to gather the data. First was the control interface, which was similar to the interface used by the voters in West Virginia, shown in Figure 4.8. The second was simply the interface of the Prime III voting system, which was the experimental interface shown in Figure 4.9. The control interface contained checkboxes, which were the targets for the participants to touch, candidates' names, and buttons. The candidate's name was positioned to the top-right of the checkbox, and both the checkbox and candidate's name were positioned on a button, shown in Figure 4.6. The experimental interface displayed only the candidate's name and the button for each candidate on the screen, with the candidate's name placed in the center of the button. In order to try to eliminate any bias from the study, a within-subjects design was used.
By using this approach, the interfaces were switched from participant to participant; if participant A voted first on the control interface, then participant B would vote first on the experimental interface, but each participant still used both the control and experimental interfaces.

[Figure 4.7: Pre-Experiment Survey Results. (a) Gender of the participants; (b) ages of participants; (c) highest educational level of participants]

There were ninety-eight touches recorded from the control interface among the three races, and also ninety-eight from the experimental interface. Originally there should have been 102 touches recorded for the entire study in both the control and the experiment, but a few participants submitted write-ins, which could not be captured. A write-in is a candidate that was not placed on the ballot, which makes it hard to capture the interaction of the participant. Appendix C gives every item each participant touched using the two different interfaces. Figure 4.8 shows examples of the touches that occurred during usage of the control interface, and Figure 4.9 shows examples of touches on the experimental interface. For the control group, a total of 63% of the participants touched the checkbox on the control interface, 31% touched the button, and the remaining 6% touched the candidate's name. As for the experimental interface, 61% touched the candidate's name and 39% touched only the button. Although the majority did touch the target, the remainder is still enough to cause an alteration in the outcome of an election. The control interface posed a risk of vote flipping similar to what happened in West Virginia, because only 6% of the participants touched the candidate's name.
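The reported split can be reproduced with a simple tally. The per-category counts below (62 checkbox, 30 button, 6 name, out of 98 recorded touches) are back-calculated from the rounded percentages in the text, so they are illustrative rather than the raw study data:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative tally of control-interface touches. The counts are
// reconstructed from the rounded percentages reported in the text,
// not taken from the study's raw logs.
public class TouchTally {

    // Percentage of total, rounded to the nearest whole percent.
    public static long percent(long count, long total) {
        return Math.round(100.0 * count / total);
    }

    public static void main(String[] args) {
        Map<String, Long> touches = new LinkedHashMap<>();
        touches.put("checkbox", 62L);
        touches.put("button", 30L);
        touches.put("name", 6L);

        long total = touches.values().stream().mapToLong(Long::longValue).sum();
        for (Map.Entry<String, Long> entry : touches.entrySet()) {
            System.out.printf("%-8s %2d touches (%d%%)%n",
                    entry.getKey(), entry.getValue(), percent(entry.getValue(), total));
        }
        // checkbox: 63%, button: 31%, name: 6%
    }
}
```

Note that the three rounded percentages (63 + 31 + 6) sum to 100 only because of rounding; the raw fractions are 63.3%, 30.6%, and 6.1% of the 98 touches.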
The standards state that, for each processing function indicated above, the system shall achieve a target error rate of no more than one in 10,000,000 ballot positions, with a maximum acceptable error rate in the test process of one in 500,000 ballot positions [3]. The 6% is much larger than the maximum acceptable error rate, which is only 0.0002%. By touching the name there is a strong possibility of an error occurring that will cause vote flipping. Many of the participants were technical people and were familiar with interacting with system interfaces. On the experimental interface, 61% touched the candidate's name, and on the control interface a total of 37% did not touch the target, which was the checkbox; they touched the button or the candidate's name. This is enough evidence to show that vote flipping could occur. It also shows that the location of the candidate's name is extremely important, because even among a population of technical people, participants still tried to touch the candidate's name and not the target.

[Figure 4.8: Example of the control interface. (a) A participant touching the button; (b) a participant touching the name; (c) a participant touching the checkbox]
[Figure 4.9: Example of the experimental interface. (a) A participant touching the button; (b) a participant touching the name]

CHAPTER 5
CONCLUSION AND FUTURE WORK

This chapter covers overall study conclusions as well as ideas for future work.

5.1 Conclusion

The main goal of this experiment was to observe the user experience with the voting system design used in West Virginia. After collecting the data and observing the locations of the touches on the screen, the results showed that 6% of the participants touched the name and not the button itself in the West Virginia-style design. The DRE systems used in West Virginia and Saline County have errors similar to other voting systems.
In addition, the error rate was fairly low, suggesting that voters were touching the correct location on the button deliberately and carefully. Finally, if participants were less motivated to vote carefully in the study than in a real election, and the result was exaggerated even by a factor of two, it would still mean that a small percentage of voters would not touch the button or the name, thereby corrupting an election outcome. However, the DRE system is still preferred over the other systems [9]. Many participants would also like to have DRE systems in voting and may even get upset if their use in elections were restricted [9]. With problems arising like those in West Virginia, the voting community is looking for ways to improve the design of this system. If something so small can change the outcome of an election, what would happen if a major problem with the system occurred? Many voters who used this system felt like their vote was switched. The voters thought it was a technical problem, but if the design had been reviewed, they would have realized there was not a technical error.

5.2 Future Work

This research was focused on the problems with DRE systems and ways to improve the usability of these systems. The problem addressed was whether people touch the candidate's name on the button or touch the button itself; however, that is just one of many items to examine with DRE systems. The 2008 election was evidence of the previous problem, and this thesis looks at how the problem described could change the outcome of an election. The results section showed that this interface design could actually change an election outcome. One item for further research is how to address the design problem. In order to get better results, another test with a larger group of non-technical people would be best.
Performing this research with a non-technical group would give better feedback about the design, because many people in the voting population would not be considered technologically savvy. Another possible research topic to explore would be the voting system itself. The voting system used in this experiment, Prime III, is a prototype of an electronic voting system and not representative of all DRE systems. Other voting systems may be better or worse than the Prime III system and may return different data about the design of the voting system used in West Virginia. Overall, DRE systems represent the newest ideas in election technology. This technology has proven to be usable, trustworthy, and secure; however, the work presented here shows that there is still room for improvement, especially with regard to accuracy and the intent of the voter. As improvements continue, we can expect voting systems to keep evolving and make the voting process simple and accurate.

BIBLIOGRAPHY

[1] 107th U.S. Congress (October 29, 2002). "Help America Vote Act of 2002 (Pub.L. 107-252)". U.S. Government Printing Office. [Online]. Available: http://www.fec.gov/hava/law ext.txt

[2] "Alabama's Election Code. HB-100 (Act 2006-570)". [Online]. Available: http://ali.state.al.us/hb100com.pdf

[3] Federal Election Commission, United States of America (2002). "Voting Systems Standards Volume 1 - Performance Standards". [Online]. Available: http://www.eac.gov/voting systems/voluntary-voting-guidelines/docs/voting-systems-standards-volume-i-performance.pdf/attachment download/file

[4] Ackerman, E. (2004). Blind Voters Rip E-Machines. [Online]. Available: http://verifiedvotingfoundation.org/article.php?id=5693

[5] The Caltech/MIT Voting Technology Project (2001). Residual Votes Attributable to Technology: An Assessment of the Reliability of Existing Voting Equipment. [Online].
Available: http://www.hss.caltech.edu/ voting/CalTech MIT Report Version2.pdf

[6] Celeste, R.F., Thornburgh, D., & Lin, H. (2005). Asking the Right Questions About Electronic Voting. National Academy Press.

[7] Cross, E.V. & Gilbert, J.E. (2005). Let's Vote: Multimodal Electronic Voting System. 11th International Conference on Human-Computer Interaction, Las Vegas, Nevada, CD-ROM.

[8] Cross, E.V., Rogers, G., McClendon, J., Mitchell, W., Rouse, K., Gupta, P., et al. (2007). Prime III: One Machine, One Vote for Everyone. VoComp 2007, Portland, OR. http://www.vocomp.org/papers/primeIII.pdf

[9] Election Data Services Inc. (2006). 2006 Voting Equipment Study. [Online]. Available: http://www.edssurvey.com/images/File/ve2006 nrpt.pdf

[10] Friedman, Brad (2008). VIDEOS: Vote Flipping on Touch-Screens in WV. Brad Blog. [Online]. http://www.bradblog.com/?p=6576

[11] Jones, D. W. (2001). A Brief Illustrated History of Voting. [Online]. Available: http://www.cs.uiowa.edu/ jones/voting/pictures/

[12] Jones, D. (2004). Observations and Recommendations on Pre-election Testing in Miami-Dade County. [Online]. Available: http://www.cs.uiowa.edu/ jones/voting/miamitest.pdf

[13] Mebane, W. R. (2004). The Wrong Man is President! Overvotes in the 2000 Presidential Election in Florida. [Online]. Available: http://macht.arts.cornell.edu/wrm1/overvotes.pdf

[14] Nyden, Paul J. (2008). Some Early W.Va. Voters Angry Over Switched Votes. The Charleston Gazette. http://www.wvgazette.com/News/200810170676

[15] Schrag, Duane (2009). Vote Flipping Was Not Unexpected. Salina Journal. [Online]. Available: http://www.salina.com/news/story/vote-machine-4-9-2009

[16] Short, L. (2006). Vote Machine Problems Reported. [Online]. Available: http://www.votersunite.org/article.asp?id=6910

[17] "Top 10 Election Design Guidelines". AIGA. [Online]. Available: http://www.aiga.org/content.cfm/election design top ten

[18] United States Election Assistance Commission.
Effective Designs for Administration of Federal Elections, Section 5: Rolling DRE Ballots. June 2007, 2008. [Online]. Available: http://www.eac.gov/files/BallotDesign/5-Rolling_DRE_Ballots.pdf

[19] United States Election Assistance Commission. Effective Designs for Administration of Federal Elections, Section 7: Research Report: Nine Research Events. June 2007, 2008. [Online]. Available: http://www.eac.gov/files/BallotDesign/7-Nine_Research_Events.pdf

[20] Zuckerman, Diana M. (2004). Blind Adults in America: Their Lives and Challenges. Washington, D.C.: National Center for Policy Research for Women and Families. [Online]. Available: http://www.center4research.org/blind0204.html

APPENDIX A

INFORMATION SHEET

INFORMATION SHEET for Research Study Entitled "Prime III Voting System for People with Disabilities"

You are invited to participate in a research study that aims to evaluate a new medium for voting in national and local elections. This study is being conducted by Dr. Juan E. Gilbert, Associate Professor in the Computer Science and Software Engineering Department at Auburn University. The study will measure the effectiveness and usability of a voting system that uses speech and touch. You were selected as a possible participant because you are 19 years or older and either an Auburn University student or a resident of Union City, Alabama. Participation is voluntary.

If you decide to participate, you will spend about 20 minutes completing the study. First you will sign a release form and take a pre-survey, which will provide us with some background information about you. Then you will be asked to vote for various city officials in a mock election. The voting system allows you to interactively cast your ballot either by speaking or by touching the screen. Any speaking done during the experiment should be directed only to the voting system. A post-survey will be given at the end to understand your opinions about the system. All of your responses to the system will be electronically recorded.
You will not be captured on camera. Instead, we will use computer software that will run in the background to capture the inputs to the computer. If you are an Auburn University student, you will then be given a raffle ticket for a chance to win a pair of movie tickets. Please check the Prime III website at http://www.primevotingsystem.org on the day after you receive your raffle ticket. The winning raffle ticket number will be posted by 8:00 a.m. that day, and the website will give you instructions on how to pick up your movie tickets. You will need to pick up your movie tickets in person.

Any information obtained in connection with this study will remain anonymous. Information collected through your participation may be published in a professional journal and/or presented at a professional meeting. While there are no guaranteed direct benefits to you from this research, you may find the research and interaction with the new interactive, speech- and touch-enabled voting system interesting. Your participation should make it possible to develop a more usable, interactive voting system. In addition, your participation in the study may result in a system that will make it easier for people with disabilities to vote.

Your decision whether or not to participate in this study will not jeopardize your future relations with Auburn University or the Department of Computer Science and Software Engineering. You are free to withdraw from this study at any time without question. You will also be able to withdraw the data collected from you. If you have any questions, ask them now. If you have questions later, you may contact Dr. Juan E. Gilbert (gilbert@auburn.edu), who will be happy to answer them.
For more information regarding your rights as a research participant, you may contact the Auburn University Office of Human Subjects Research or the Institutional Review Board by phone at (334) 844-5966 or by e-mail at hsubjec@auburn.edu or IRBChair@auburn.edu.

HAVING READ THE INFORMATION PROVIDED, YOU MUST DECIDE WHETHER TO PARTICIPATE IN THIS RESEARCH PROJECT. IF YOU DECIDE TO PARTICIPATE, THE DATA YOU PROVIDE WILL SERVE AS YOUR AGREEMENT TO DO SO. THIS LETTER IS YOURS TO KEEP.

___________________________________
Investigator's signature                    Date

APPENDIX B

PRE-SURVEY

Pre-Experiment Survey

ID: _______
Age: _______
Gender: _______
Race/Ethnicity: _______
Citizenship: _______
Highest Degree obtained (High School, BS, BA, MS, MA, PhD): _______
Do you have any disabilities (Yes or No): _______ If Yes, please explain: _______

Is English your native or second language?
[ ] Native language
[ ] Second language

For approximately how many years have you been using a computer? _______

On average, how many times do you use a computer during the course of a week?
[ ] 0 - 1
[ ] 2 - 3
[ ] 4 - 5
[ ] 6 or more

In the section below, choose the response that most accurately describes you.

1. I am computer literate.
[ ] Strongly Agree  [ ] Agree  [ ] Neutral  [ ] Disagree  [ ] Strongly Disagree

2. I am good with computers.
[ ] Strongly Agree  [ ] Agree  [ ] Neutral  [ ] Disagree  [ ] Strongly Disagree

3. I trust computers to do online shopping.
[ ] Strongly Agree  [ ] Agree  [ ] Neutral  [ ] Disagree  [ ] Strongly Disagree

4. I am comfortable using computers to pay household bills.
[ ] Strongly Agree  [ ] Agree  [ ] Neutral  [ ] Disagree  [ ] Strongly Disagree

5. I trust computers to securely send my personal information over the internet.
[ ] Strongly Agree  [ ] Agree  [ ] Neutral  [ ] Disagree  [ ] Strongly Disagree

APPENDIX C

PRE-SURVEY RESULT

APPENDIX D

INTERFACE RESULT

APPENDIX E

SCREEN CAPTURE CODE