VOICE INTERACTIVE SYSTEM: A USABILITY STUDY OF THE VOICE SURVEY CREATOR

Except where reference is made to the work of others, the work described in this thesis is my own or was done in collaboration with my advisory committee. This thesis does not include proprietary or classified information.

_______________________________________________
Ashley Marie Wachs

Certificate of Approval:

______________________________          ______________________________
Hari Narayanan                          Juan E. Gilbert, Chair
Associate Professor                     Associate Professor
Computer Science and Software           Computer Science and Software
Engineering                             Engineering

______________________________          ______________________________
Cheryl Seals                            George T. Flowers
Assistant Professor                     Interim Dean
Computer Science and Software           Graduate School
Engineering

VOICE INTERACTIVE SYSTEM: A USABILITY STUDY OF THE VOICE SURVEY CREATOR

Ashley Marie Wachs

A Thesis Submitted to the Graduate Faculty of Auburn University in Partial Fulfillment of the Requirements for the Degree of Master of Science

Auburn, Alabama
May 10, 2007

VOICE INTERACTIVE SYSTEM: A USABILITY STUDY OF THE VOICE SURVEY CREATOR

Ashley Marie Wachs

Permission is granted to Auburn University to make copies of this thesis at its discretion, upon the request of individuals or institutions at their expense. The author reserves all publication rights.

__________________________
Signature of Author

__________________________
Date of Graduation

VITA

Ashley Marie Wachs is the daughter of David and Charlene Wachs. She was born on August 30, 1984, in Montgomery, Alabama. She graduated from Jefferson Davis High School in Montgomery in May 2002. Ashley entered Auburn University immediately following her high school graduation and received a Bachelor of Science in Computer Science along with a Minor in Business Administration in May 2006. Ashley immediately began working diligently to meet the requirements of a Master of Science in Computer Science at Auburn University.
THESIS ABSTRACT

VOICE INTERACTIVE SYSTEM: A USABILITY STUDY OF THE VOICE SURVEY CREATOR

Ashley Marie Wachs

Master of Science, May 10, 2007
(B.S., Auburn University, May 2006)

64 Typed Pages

Directed by Juan E. Gilbert

The voice user interface survey is aimed at providing an easier and more efficient way of creating surveys that is accessible to the entire population. The system allows a person to call in and create a survey solely over the telephone. The creator can record an introduction for participants to hear, which allows any specific instructions to be given. Several types of questions are available to the caller in creating the survey: standard question types such as true/false and yes/no, multiple choice and Likert scale questions, and short answer questions. There is no limit on the number of questions in a survey and no time limit on individual questions, which makes the system usable for both large and small surveys. After the creator records each question, the system plays back the question with its answer choices exactly as a participant will hear it when taking the survey. This allows the creator to make any needed changes to the current question before moving on to the next question to be recorded. This survey application's goal is to make the survey creator's job easier by allowing the entire survey to be constructed with only telephone access. No existing application was found that allows users to create surveys solely over the phone. As of now, the only available systems allow the user to create the survey over the web on a computer and then let participants take the survey over the telephone.
This new system is hopefully a start toward a new and more convenient way of creating not only surveys but other applications using voice technology alone. A usability study was conducted to measure the quality of the system and the user's experience with it. The study aimed to find how well experts in the usability field felt the system conveyed the needed information and instructions to the user. The interaction with a new system and the use of voice technology were also taken into account when analyzing the results. At the completion of the study, all of the findings were compiled and analyzed. Careful analysis of the usability study data revealed that 57% of the participants would choose the VUI Survey Tool over a paper-based survey or a computer-based Internet survey. Participants also agreed, with a median score of 4 out of 5, that they would use the survey application tool again in the future.

ACKNOWLEDGMENTS

First and foremost, I would like to thank God, who is my life and strength and with whom I can do all things. I would like to thank Dr. Juan E. Gilbert for his support and guidance throughout my undergraduate and graduate career thus far. I would like to thank my parents, Mr. and Mrs. David Wachs, who have supported me in every endeavor of my life. To Dr. Hari Narayanan and Dr. Cheryl Seals, my graduate committee members, thanks for all of the reviewing and revising advice. To my friends, thanks for supporting me in all that I do. Finally, exceptional thanks to Ted, who always knows how to calm me down, who put up with my whining and complaining, and who supported and motivated me more than he knows.

Style manual of journal used: Journal of SAMPE
Computer software used: Microsoft Word 2003

TABLE OF CONTENTS

LIST OF FIGURES
1. INTRODUCTION
2. LITERATURE REVIEW
   2.1 Speech Technology
   2.2 Surveys
      2.2.1 Paper-based Surveys
      2.2.2 Web-based Surveys
         2.2.2.1 Halogen eSurveyor
         2.2.2.2 Zoomerang
      2.2.3 Phone-based Surveys
   2.3 Summary
3. PROBLEM STATEMENT
4. EXPERIMENT AND ANALYSIS
   4.1 Introduction
   4.2 Development of Voice Interactive Survey Project
   4.3 Method
      4.3.1 Participants
      4.3.2 Procedures
      4.3.3 Materials
   4.4 Analysis
      4.4.1 Measurement
      4.4.2 Results and Discussion
5. CONCLUSION AND FUTURE WORK
   5.1 Conclusion
   5.2 Future Work
REFERENCES
APPENDICES
   APPENDIX A: E-mail that was sent to Departments
   APPENDIX B: Classroom Script that was used in recruiting
   APPENDIX C: Pre-Experimental Survey
   APPENDIX D: Sample Survey
   APPENDIX E: Post-Experimental Survey
   APPENDIX F: Time Log of Study Results
   APPENDIX G: Pre-Experimental Survey Results
   APPENDIX H: Post-Experimental Survey Results
   APPENDIX I: Comparison of the Results

LIST OF FIGURES

Figure 1: A Survey Created Using the SumQuest Application
Figure 2: A Sample Paper Survey
Figure 3: A Survey Created Using Halogen eSurveyor
Figure 4: A Survey Created in Zoomerang
Figure 5: Interface of Hosting for the Project
Figure 6: Example of what Participants Received
Figure 7: Section 1 Median Results
Figure 8: Section 2 Median Results
Figure 9: Time Log from Study
Figure 10: Task Completion Time Histogram
Figure 11: Comparison of Results of participants who completed the survey
Figure 12: Comparison of Results of participants who did not complete the survey

1. INTRODUCTION

There were three main advancements in survey technology in the twentieth century: the telephone, random sampling, and electronic surveys (Dillman, 2000). There are many telephone surveys on the market that allow participants to take a survey. However, there is not a variety of ways in which a survey can actually be created. This study was brought about with the intention of finding a new and useful way to create a survey. The method decided upon was speech technology. Speech technology continues to become a more predominant form of communication among many companies. Speech is a very basic form of communication, but it is also the most natural and effective way for humans to interact. A very important part of speech technology is speech recognition. People in the United States are now living longer than ever. It is very common for a person sixty-five years or older to still be active in many activities. In the year 2000, 12.6% of the United States population was at least sixty-five years old, and it is estimated that by the year 2030, 20% of the population will be at least 65 years of age (Gavrilov & Heuveline, 2003). One reason that speech is important to begin using more often in everyday activities is that most people retain their speaking ability long into their lives. This alone will help the elderly community stay more active and involved in the world, and not only through technology.
The Voice Interactive Survey Project was intended to develop a new way of using the rapidly growing speech technology with a technique that could apply to everyone. No matter what business you are in, people want to know what you think about their product and company, and how satisfied their users are. Combining these two issues of speech technology and surveys is a great way to provide a convenient and useful application that can apply to all of the population. The usability study of the speech survey application was conducted by a graduate student at Auburn University. The participants in the study were students who had taken a college course in usability, regardless of which department of the University they were from, and who were at least 19 years of age. The study was done by having the participants call and create a standard survey given to them to replicate. The findings of the study were compiled by having the participants evaluate their experience through a questionnaire. The hypothesis of the study is that by using speech technology to create a survey, people would be successful in creating the survey that they intended. It is also hypothesized that people will say they would choose the Survey Creator Tool over the Internet when given the choice. The rest of this thesis is organized as follows: Chapter 2 consists of a literature review, which briefly discusses the use of speech technology and the effectiveness of upcoming technologies using speech. Also discussed are the current surveys that are available and their means of reaching users. A definition of the research problem, along with a synopsis of the literature review, is presented in Chapter 3. The experiment details and the analysis of the results are included in Chapter 4, while Chapter 5 concludes the thesis with the results of the project along with the future work to come.

2. LITERATURE REVIEW

2.1 SPEECH TECHNOLOGY

There is an overwhelming interest in the upcoming technology that uses speech.
Many systems already have speech incorporated in them. For example, voice mail tools on a cell phone and automated call centers, along with many others, are now taken for granted. Companies and other areas are starting to catch on to the current trend of using speech. A comparison has been made between the evolution of the web and that of speech (Robb, 2006). The reason for the World Wide Web's success is the application tools that go along with it, and the same goes for speech: the better the tools that we have, the better and more successful it will become (Robb, 2006). Incorporating voice technology into a company is now proving to save money in the long run. It was estimated that a marketing and IT company in Minnesota is saving around two dollars per call that uses speech technology (Solheim, 2004). It allows employees to get more accomplished in a work day because they are not bothered with simple and common questions that can easily be answered using a voice application over the telephone.

2.2 SURVEYS

Creating a good and useful survey depends on retrieving the appropriate information from the participants (Belani, Pripuzie & Kobas, 2005). Everyone wants a good and accurate way to get information from customers. Companies always want to know how well they rank in customer satisfaction ratings and how useful people find their applications. The basic types of surveys that will be discussed are paper-based surveys, web-based surveys, and phone-based surveys. There is a system that allows a variation of all three. SumQuest Software is a software application that offers many tools to help create a usable survey (SumQuest Software). Figure 1 shows a sample survey that was created using the SumQuest Software Application.

Figure 1: A survey created using the SumQuest Application

2.2.1 PAPER-BASED SURVEYS

When people hand-write the surveys that are given to them, there is a greater chance for human error.
A major disadvantage of paper surveys is the difficulty of reading other people's handwriting, which can have an impact on the validity of the final results (Belani, Pripuzie & Kobas, 2005). There is also a great deal of paperwork to keep up with and compile to process the paper survey results (Belani, Pripuzie & Kobas, 2005). Hand counting and sorting of each question and answer is very time consuming and tedious (Belani, Pripuzie & Kobas, 2005). Having to pay someone to hand process the results and count the votes proves to be very expensive, and most companies do not have the money or the manpower to validate their survey results to the extent necessary. In the time it takes to photocopy and mail a paper survey to the participants, all of the surveys and reports could already be completed for a web-based survey (Halogen Software, 2007). Figure 2 is an example of a typical survey that would be handwritten and distributed by mail.

Figure 2: A sample paper survey

2.2.2 WEB-BASED SURVEYS

There are many survey builders that operate over the Internet. Many companies offer a great resource in allowing a survey to be created on the computer. The requirements for using this feature are a computer, Internet access, and in some cases a fee. Performing research online, in the form of a survey, gives the opportunity for a fast and interactive way to connect with your employees and your customers no matter where you are located around the world (Halogen Software, 2007). Many organizations are using web-based surveys because they believe that by using them they reduce research costs and can achieve more accurate responses in a more timely fashion (Halogen Software, 2007). Web-based surveys have proven to be a more convenient and cost-effective way to reach a population than traditional paper surveys (Belani, Pripuzie & Kobas, 2005).
Web-based surveys also have some drawbacks that can make companies hesitant to use them. One of the biggest concerns about web surveys is the coverage of the sampled population (Kay & Johnson, 1999; Crawford, Couper & Lamias, 2001). There are wide disparities among ethnic and socioeconomic groups when it comes to Internet access (Selwyn & Robson, 1998). It is hard to know if you are discriminating against people that do not have access to the Internet. As discussed above, people that do not have Internet access typically have similar demographics, so it continues to be a concern that the participants are not equally distributed. Another proven drawback is that there are fewer participants in web-based surveys than in traditional mail-based surveys. Several studies have shown that Internet survey response rates have been lower than those of equivalent mail-based surveys (Couper, Blair & Triplett, 1999). There are several potential reasons for the discrepancy in the participant response rates. The results could reflect unequal coverage of the participants, a lack of convenience of the Internet for the participant, or little knowledge of the Internet by the participant, making a survey hard to complete on a system that not everyone is familiar with (Solomon, 2001).

2.2.2.1 HALOGEN eSURVEYOR

Halogen eSurveyor is a survey application that companies can incorporate into their business. The eSurveyor application must be bought in a software package and installed on the computer that the surveys will be created upon. Halogen is compatible with Windows-based systems and is tailored for SQL and Oracle database users (Halogen Software, 2007). It advertises that it allows users to build surveys to meet people's individual needs in areas such as Human Resources, Market Research, and Information Systems (Halogen Software, 2007).
Halogen markets its system as easy to use, requiring only a small knowledge base of simple computer tools. No coding language knowledge is required, which makes it easier for the average person to use and understand. One of the main features Halogen believes sets it apart from its competitors is real-time response monitoring, which allows the company to administer the survey while it is being taken. This helps keep management connected to the high-priority issues that the business may be interested in (Halogen Software, 2007). Figure 3 below shows a survey that was created using Halogen eSurveyor.

Figure 3: A survey created using Halogen eSurveyor

2.2.2.2 ZOOMERANG

Zoomerang allows users to create a 30-question survey free of charge, with the results viewable for 10 days. Other more advanced surveys provided by Zoomerang are available for a price. Zoomerang has many of the features that allow for a great survey; however, you must have an Internet connection to access them. If the participant has a slow Internet connection or the survey contains a lot of graphics, then it may take a long time to load the survey page on the web, and participants may get tired of waiting (Gunn, 2002). Figure 4 below shows a sample survey that one can create using Zoomerang.

Figure 4: A Survey Created in Zoomerang

2.2.3 PHONE SURVEYS

This is a very new and upcoming field, and there is not much research available. We have adopted voice into many different applications, and it is now time to adapt it to creating surveys as well. The only phone surveys available are ones that allow a person to take the survey over the phone. However, these applications do not allow one to create the survey over the phone as well.

2.3 SUMMARY

Voice technology is a growing field of study that is ready to be incorporated into many areas of work. Surveys are a great way to collect valuable information.
Paper-based surveys along with web-based surveys are a good way to conduct an evaluation. However, a system that thinks like the customer is also needed. A phone survey that can be created using the convenience of a telephone is a much needed application that would be greatly appreciated in everyday life.

3. PROBLEM STATEMENT

We are in a world that is fast paced and pushed for time. "Everybody is trying to squeeze 36 hours into a 24-hour day" (Gillette, 2006). We need an application that adapts to users and accommodates their specific needs. There are many fingers being pointed at why people are so busy and why everyone seems to have more things to do than ever before. But instead of trying to solve a never-ending problem, society needs to try to appeal to the everyday busy customer. In 2005, it was found that 219.4 million people owned cellular telephones (CIA Fact Book, 2006). It was also found that in 2003 there were 268 million landline telephones in use (CIA Fact Book, 2006). With so many people having access to a telephone, either a cell phone or a landline, an application over the phone was an easy choice. Creating a survey over the telephone is a great way to incorporate the user's everyday tasks into their work. One of the most common ways of conducting quantitative social science research is the use of surveys (Belani, Pripuzie & Kobas, 2005). Surveys are an easy way to find out what people want and expect out of a service or a product. The great thing about surveys is that they can pertain to any business, topic, or product. Surveys are useful in doing customer research. It was found that twenty-five percent of owners conduct customer research before their project begins, and forty-five percent conduct the research study either during the project or after it is complete (Hurst, 2006). This is helpful in finding out how customers respond to your company or your product.
Information found in surveys is useful in maintaining good company-to-client communication. Since no existing survey creator allows users to perform this task over the phone, one is needed. By giving users the ability to create a survey over the telephone, they are not limited by location. They do not have to be at a computer with Internet access, and they do not have to be in the office at their desk, to have access to the survey tool. A telephone signal along with a set of questions for their survey is all that one needs to create a survey.

4. EXPERIMENT AND ANALYSIS

4.1 INTRODUCTION

With the fast-paced society that we live in today, it is imperative for industry and producers to find creative ways to come together and help reduce the pressure that is involved in this overwhelming society. The voice interactive survey project was created with the intention of making the creation of surveys easier and more convenient for potential users. The system will provide a more accessible way to create a survey that could potentially provide users with important information from customers of their product or service. The goal is to make the transition from web-based creation of surveys to telephony-based creation an easy one that users will choose every time.

4.2 DEVELOPMENT OF THE VOICE INTERACTIVE SURVEY PROJECT

The voice interactive survey project was the product of a small class project and discussion that started in a Spoken Language Systems class taught by a professor in the Computer Science and Software Engineering (CSSE) Department at Auburn University. This idea was a topic that the author, a Masters student in the CSSE Department at Auburn University, was immediately interested in and viewed as a great opportunity to develop a system and then conduct a study to further improve the techniques for the usability of the system. Once the idea and opportunity became available in a discussion, the project began to take form.
The project was focused on implementing an application that would increase the use of voice systems and at the same time provide a pleasant and useful experience for the user. The combination of the idea and the goals brought about the development of the Voice Interactive Survey Project. The project allowed usability experts the chance to put their interests and knowledge into action in evaluating an application that hoped to enhance the user's experience using spoken language.

4.3 METHOD

4.3.1 PARTICIPANTS

Participants for this study were chosen based on a certain set of criteria. It was decided that only students who had taken a college course in usability would be allowed to evaluate the system. This criterion was chosen because knowledgeable experts would provide the experiment with a set of useful data and evaluations. This group of experts would know what to look for in evaluating an application in regards to usability and effectiveness. The participants also had to be at least 19 years of age to participate in the study. Participants were recruited on a volunteer basis. Mass e-mails were sent to departments of Auburn University that offered a course in the field of usability. The e-mail that was sent stated that all students, both undergraduate and graduate, would be accepted, but no professors (the full e-mail sent to departments can be found in Appendix A). It also stated that participants had to be at least 19 years of age and that there were no potential harms or benefits to participating in the study. Another form of recruiting participants was visits to classrooms of current usability students. The same information was given to the classroom students as in the e-mail (the full classroom script can be found in Appendix B). No one that had any prior exposure to this system or the study was allowed to participate. After the recruiting process, a total of 21 participants were evaluated on the day of the study.
4.3.2 PROCEDURES

The first step in this project was to create the system that would be used in the testing and evaluation to come. A list of functions that were expected in the application was created, and that began the process of developing the system. A design of the system was next in the process, which consisted of how the tool was to interact with itself and others. It was decided that the system would allow the user to first create a welcome message with instructions for their particular survey participants to hear when they initially start taking the survey. This message would allow the user to give a specified welcome message from their company if desired. This would be helpful in making sure that the participant would know who they were taking a survey for and what the creator was attempting to retrieve from the customer. The system would also accommodate surveys of all sizes. There would be no limitations on the number of questions, the length of a question, or the maximum number of answer choices available.

The user would also be allowed to use several types of questions. The available question types decided upon are as follows:

- 5-point Likert scale
- Yes/No
- True/False
- Multiple Choice
- Short Answer/Open-Ended Questions

Also decided upon was the chance to hear the question and answer choices that were recorded, allowing the creator to make any necessary changes to the question before it is saved and stored. This option is given after each individual question. This was decided as an appropriate design because the user would be able to immediately hear the question that they had just created and would have the option to start over for that question. It was decided that there should be no training required to use this system. It was intended that the first exposure to this system be during the study.
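The per-question record-and-confirm flow described above can be sketched as a simple loop. The following Python sketch is illustrative only: the actual system was a voice application hosted on BeVocal, and the `record_audio`, `ask_yes_no`, and `ask_more` callables here are hypothetical stand-ins for telephony prompt and record steps, not part of any real API.

```python
# Hypothetical sketch of the create/confirm design described above.
# record_audio(), ask_yes_no(), and ask_more() stand in for telephony
# prompt/record interactions; they are not a real telephony API.

QUESTION_TYPES = ["5-point Likert scale", "Yes/No", "True/False",
                  "Multiple Choice", "Short Answer/Open-Ended"]

def create_question(record_audio, ask_yes_no):
    """Record a question, play it back, and re-record until the creator accepts it."""
    while True:
        question = record_audio("Please record your question and answer choices.")
        # Play the recording back exactly as a survey participant would hear it.
        accepted = ask_yes_no(f"You recorded: {question}. Keep this question?")
        if accepted:
            return question  # saved and stored; move on to the next question

def create_survey(record_audio, ask_yes_no, ask_more):
    """Welcome message first, then an unlimited number of questions."""
    survey = {"welcome": record_audio("Record a welcome message for participants."),
              "questions": []}
    while True:
        survey["questions"].append(create_question(record_audio, ask_yes_no))
        if not ask_more("Would you like to record another question?"):
            return survey
```

The unbounded `while` loops mirror the design decision that there is no limit on the number of questions and that each question can be re-recorded until the creator is satisfied.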
With no training on the actual use of the system, ease of use and understandability must be present in the system itself. This was a very important aspect of the project, because it is trying to target all populations and career disciplines. After the completion of the actual application tool of the voice survey, it was time to prepare for the usability study. Before the usability study began, a pilot study was conducted on the survey creator application tool. Five students participated in the pilot study of the system. The pilot study was done to help find errors and potential errors in the application. This was very helpful because parts of the application that gave the pilot participants difficulties were analyzed, and some were re-designed and implemented. The pilot study also allowed a better estimation of how long the study would take each participant to complete, which was helpful in scheduling the time slots for the participants in the actual study. In preparing for the study, several evaluation procedures were considered, and a questionnaire was decided upon as the best solution for this project. After the documents were ready for the study, the only thing left to do was to begin the recruiting process for the study participants. A more detailed discussion of the questionnaires, types of questions, and other documents that were used in the study is provided in the next section on materials.

4.3.3 MATERIALS

BeVocal Café is a free web-based development environment that also offers free hosting for voice applications (BeVocal Café, 2007). BeVocal was used as the hosting site of the voice interactive survey for the study. Figure 5 shows the BeVocal interface used for hosting the project.

Figure 5: Interface for Hosting of the Project

When the participants arrived at the study, they were given a pre-experimental survey. The study was done on a voluntary and anonymous basis.
No personal information was collected from any participant that would help in identifying them. The pre-experimental survey asked for basic demographic information. It also asked about previous knowledge of, and exposure to, general telephone and survey usage (the entire Pre-Experimental Survey can be found in Appendix C). The participants were also given a small sheet of paper with the telephone number to call and an access code for the system. Figure 6 shows what the participants received.

Information Needed to Call the System:
    Telephone Number: 9-1-877-338-6225
    Pin #: XXXX
    User ID: 4301764

Figure 6: Example of what Participants Received

The participants also received a sample survey to use in creating their survey in the study (the sample survey can be found in Appendix D). A sample survey was used to help in collecting comparable data from the study. In this manner, the length of the questions and the number of questions were the same for all participants, which provides a set basis for comparison of completion times and total errors. Giving each participant the same survey to create allows for more accurate data that can be directly compared.

4.4 ANALYSIS

4.4.1 MEASUREMENT

The participants were evaluated at the completion of the survey using a questionnaire (the Post-Experimental Survey can be found in Appendix E). The questions were set up to be answered on a five-point Likert scale, along with a few short answer questions. The questionnaire's goal was to retrieve the participants' overall reaction to the system and the functionality of the application. It did that by measuring the accuracy of the application through questions pertaining to the errors and the speech that the system produced. The functionality was also measured based on the ease of use and the task completion time of the survey.
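Task completion times of the kind collected here are typically summarized as a median plus a histogram (the study reports both a time log and a task completion time histogram). The following Python sketch shows one way such a summary could be computed; the durations and the 60-second bucket width are invented for illustration and are not the study's measured values.

```python
import statistics
from collections import Counter

# Hypothetical sketch: summarize task completion times (in seconds) as a
# median plus fixed-width histogram buckets. The example durations are
# invented for illustration, not the study's measured times.

def summarize_times(durations_sec, bucket_sec=60):
    """Return (median time, counts per bucket keyed by bucket start)."""
    buckets = Counter((d // bucket_sec) * bucket_sec for d in durations_sec)
    return statistics.median(durations_sec), dict(sorted(buckets.items()))

median_time, histogram = summarize_times([540, 610, 585, 700, 660])
```

A fixed bucket width keeps the histogram directly comparable across participants, since every call is measured against the same scale.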
The participants were also asked for any improvements or suggestions they would recommend to make the system a better product. The objective data collected from this study was the task completion time. The task completion time was determined based on the entire call to the system. Inside the BeVocal application there is a log browser that records the steps that the call takes, as well as the total time of the task. At the end of the day, the BeVocal account was accessed and the call times were recorded. The times were recorded in the order of the calls, so the times could be matched with the ID numbers of the surveys. All of the times were moved to a time log that matches the ID numbers of each participant (the time log can be viewed in Appendix F). The participants were also evaluated at the completion of the survey creation using a three-section survey. Section 1 was on a 1-5 number scale, while Section 2 was based on a scale that asked if the participant agreed or disagreed with a statement. They were asked various questions about the effectiveness of the survey tool along with the usability of the tool. Section 3 was a short answer questionnaire to be completed by the participant. The survey results, along with the information from the pre-experimental survey and the time log, were used to summarize the findings of this study.

4.4.2 RESULTS AND DISCUSSION

Of the original twenty-one participants, all twenty-one completed the entire study. The success of the study was based on the participants being able to successfully create the survey and at the same time not become frustrated with the system. In the end, the study is a success if the participants say that they would use the VUI Survey Tool again and that they would choose it over the Internet. The study was conducted on Friday, March 23, 2007, in the Human Centered Computing Lab at Auburn University.
As previously stated, after the participants were given an information letter explaining what they would be doing, those who agreed were given a pre-experimental survey to take. The pre-experimental survey 7 showed that of the twenty-one participants, 52% were female and 48% were male. This nearly even split between males and females would be helpful in analyzing the results. The median age of the participants in the study was 24 years. This was expected because the participants had to be at least nineteen years of age and the college student body is around the twenty age range. No one who participated in the study reported having any disabilities. English was the native language of 62% of the participants, while the remaining 38% stated that English was a second language for them. The pre-experimental survey also recorded that 52% of the participants use the phone 4-6 times daily, which is about 1460-2190 times a year. 29% said that they used the telephone around 7-9 times daily, while 14% said that they used it 10 or more times a day. 5% said that they rarely used the phone, at 0-3 times a day. Also relevant to this study is whether the participant had ever created a survey before; someone who has would have a better idea of what to expect from the system. 76% of the participants said that they had created a survey, while 24% said they had not. Overall, the participant pool was diverse and representative of the population of potential customers of this survey tool.

7 The Pre-Experimental Survey Results can be found in Appendix G.

After the participants had completed the telephone survey, they were asked to complete a post-experimental survey to evaluate the survey application tool. 8 The first section of the survey consisted of four questions that asked, on a scale of 1-5 with 5 being the most desired, whether they thought the system was Wonderful, Satisfying, Usable, and Trustworthy. The results were clear. The median results for whether the participants thought the system was wonderful and whether it was satisfying were both 4. For whether the survey was usable and whether they trusted it to accurately complete the survey, the median score was 5. Figure 7 shows the median results and the standard deviation of the questions of Section 1 of the Post-Experimental Survey.

Question                                              Median Score    Standard Deviation
Terrible to Wonderful                                      4               0.590
Frustrating to Satisfying                                  4               0.655
Not Usable to Usable                                       5               0.680
To what extent do you trust the VUI Survey to
accurately create the survey (Do Not Trust to
Completely Trust)                                          5               0.746

Figure 7: Section 1 Results

8 The Post-Experimental Survey Results can be found in Appendix H.

The second part of the post-experimental survey consisted of ten items, for which the participants were asked whether they (A) Strongly Agree; (B) Agree; (C) Neutral; (D) Disagree; or (E) Strongly Disagree. When analyzing the data, the responses were coded on a 1-5 scale in order to compute the median score and the standard deviation: the number 5 was given to (A) Strongly Agree, counting down to 1 for (E) Strongly Disagree. Section 2 originally had twelve questions, but two were discarded after the completion of the study because they were misleading or confusing. The first item stated that the participant would use the VUI Survey Tool again; its median score was 4, meaning the majority of the participants agreed that they would use the tool again. The second item stated that the survey tool was easy for the participant to use. The median score for that statement was also 4. The third item stated that it was easy to get started. This could have been interpreted as getting started with accessing the system or getting started with the beginning of the survey.
Either way, an acceptable median score of 4 was recorded. The fourth item stated that it was easy to create the survey overall; this also had a median score of 4. The fifth item stated that if the participant made a mistake, it was easy to correct. This yielded a median score of 3. The sixth item stated that the system was accurate in recalling the correct information. This statement produced a median score of 5. This was a very important score, because if the system does not recall the correct information, the user becomes frustrated and distrustful of the system. Statement seven stated that the participant was able to successfully complete the task of creating the survey. This produced a median score of 4, showing that nearly everyone agreed that they were able to successfully complete the survey as expected. Item eight stated that the participant would have preferred a male voice. This resulted in a median score of 2, showing that most people disagreed with the statement. The survey creation tool has a female voice, and that seemed to be exactly what the participants wanted: out of all twenty-one participants, not a single person said they wished the voice had been male. Item nine stated that it was easy to understand the system's instructions. The median score for this statement was 4. This is very important because someone who does not understand the system's instructions would find it very difficult to interact with the system accurately and efficiently and receive the expected survey in return. Item ten stated that it was easy to speak to the system. This statement yielded a median score of 5. This is also an extremely important feature of the system.
The system must be able to hear responses clearly and at different volume levels, because not all people speak at the same loudness. Figure 8 shows the median results and the standard deviation of the statements of Section 2 of the Post-Experimental Survey.

Statement                                                        Median Score    Standard Deviation
I would use the VUI Survey Tool again                                 4               0.498
The survey tool was easy for me to use                                4               0.398
It was easy to get started                                            4               0.700
It was easy to create my survey                                       4               0.730
If I made a mistake, it was easy to correct                           3               0.913
The system was accurate in recalling the correct information          5               0.498
I was able to successfully complete the task                          4               0.727
I would have preferred a male voice                                   2               0.669
It was easy to understand the system's instructions                   4               0.700
It was easy to speak to the system                                    5               0.598

Rating Scale: 5 Strongly Agree, 4 Agree, 3 Neutral, 2 Disagree, 1 Strongly Disagree

Figure 8: Section 2 Results

Section 3 of the post-experimental survey is a two-question section with a place at the end for comments and suggestions. There were three questions to begin with, but one was discarded because it was confusing. Therefore, only the two short-answer questions, along with the comments and suggestions, will be analyzed. The top three responses to question 1 will be recorded and discussed. The first question in Section 3 asked how the participant would improve the VUI Survey Tool. One of the top responses was that participants would have liked the system to explain the types of questions offered before the survey began. This would probably be a good option to offer users, though not something everyone should be required to hear. Another top response was to have the option to go back to previously recorded questions and redo or listen to them. The last response to be discussed was that a longer period was needed for recording the questions and answer choices.
Some participants stated that the system cut them off in the middle of a question before they were finished. This occurred because different people pause for different lengths of time between words and sentences, and the pause time allowed before the system cut off was not long enough for some participants. The second question in Section 3 asked, if given the choice to create a survey using the Internet, paper, or the VUI Survey Tool, which would you choose and why. This is a very important part of the study: which method most people would prefer, and their reasons behind it. No one in the entire study stated that they would prefer a paper-based survey over the other choices. 57% of the participants said that, given the option, they would choose to create their survey using the VUI Survey Tool, while 38% would prefer the Internet over the VUI Survey Tool. The last 5% of the participants chose not to answer the question. Some of the reasons given for why the Internet would be better than the VUI Survey Tool follow. It was said that by creating the survey on the computer, it is easier to make corrections to the questions and the answer choices. Some participants said that they were frequent computer users anyway and that it would be faster for them because they were more familiar with computer applications. Some said that it would be nice to have an electronic copy of the questions and answers on the survey, which they could get simply by printing from the computer. Lastly, a non-native English-speaking participant stated that they would prefer a computer-based survey because they already struggle with the English language and the pronunciation of words, and using the computer would be less confusing. Some of the reasons given for why the VUI Survey Tool is the better choice for creating a survey follow.
Nearly everyone who chose this as the better option cited its easy accessibility as an important factor: the tool can be used anywhere, and it is easier to use a phone or a cell phone than to gain access to a computer. Another reason it was chosen as the best option was that it gave people the opportunity to have their own voice on their survey. Many participants liked that their voice would be on the survey and that they had the chance to hear what they sounded like giving it. This gives the user the opportunity to decide what voice they would like on the survey, rather than a computerized voice. The objective result collected was the task completion time for the entire survey. It was recorded in total seconds and then converted into minutes and seconds. Since all of the questions were of the same length for each person, and each person had the same number of questions, the times can be compared. The average time to create the sample survey of five questions was 265 seconds, which is 4 minutes and 25 seconds. The maximum time was 456 seconds, which converts to 7 minutes and 36 seconds, and the minimum time was 192 seconds, which is 3 minutes and 12 seconds. There is a considerable difference between the maximum and minimum times it took to complete the survey. This difference could be caused by some participants re-recording some of the questions along with the answer choices. Also, some participants took advantage of the barge-in feature (speaking over a system prompt before it finishes) that the application allowed. These are a few of the factors that can change the time of the survey creation. More important than the actual time it takes to create the survey, the user must be happy with the survey they created and the time it took to do so. Figure 9 shows the time log summary of the study and Figure 10 shows the histogram of the task completion time.
                        Minutes
Average Time            4.4
Maximum Time            7.6
Minimum Time            3.2
Standard Deviation      1.107076803

Figure 9: Time Log from Study

[Figure 10: Task Completion Time Histogram - task completion time in minutes for participant IDs 1-21]

After looking at all of the different data collected during the study, it was time to connect the pieces. Four participants did not complete the entire survey; these participants exited the system unintentionally and could not re-enter the application to finish. The task completion times are compared with the number of errors that occurred, for both the participants who completed the survey and those who did not. 9 The participants who had a greater number of errors, and those who re-recorded questions, had longer task completion times than those with no errors. Re-recording a question was not counted as an error, because whether to re-record a question is a personal preference. Most of the errors occurred when the system did not understand the user or when the user did not speak while the system was expecting an input. Figure 11 shows the comparison of the different factors of the data collected for the participants who completed the entire survey.

[Figure 11: Comparison of results of the collected data where participants completed the entire survey - task completion time, number of errors, and number of re-recorded questions per participant]

9 The entire table of information can be found in Appendix I.

Figure 12 shows the results of the participants who did not complete the entire survey during the study. Most of the participants who did not complete the survey had a lower task completion time than the participants who were able to create the entire survey.
[Figure 12: Comparison of results of the collected data where participants did not complete the entire survey - task completion time, number of errors, and number of re-recorded questions per participant]

5. CONCLUSION AND FUTURE WORK

5.1 CONCLUSION

This study was brought about by the creation of the VUI Survey Application Tool. The project was designed to allow a survey to be created over the telephone using only voice applications. It targeted developing a system that could be used by the entire population: it was developed with the intention that no technology experience would be needed and that people of any age would find the application useful as well as easy to use and interact with. It was also intended for all businesses and companies, so that any field of study could effectively use the survey creator. The study took place on Friday, March 23, 2007, in the Human Centered Computing Lab at Auburn University. The participants were given a sample survey to create using the VUI Survey Tool. After creating the survey, they were asked to evaluate the system based on their expert knowledge. The participants were chosen with the criterion that they had taken or were currently enrolled in a college course that discusses what makes a system usable and how to evaluate one effectively. The results of the study showed that more than half of the participants would prefer to create a survey using the VUI Survey Tool rather than a typical paper-based survey or even a computer Internet survey. The participants also expressed a sense that the system was satisfying, usable, and trustworthy. This study suggests that this could be a very successful application tool, and it seemed to be effective in achieving what it was meant to do for the participants.
5.2 FUTURE WORK

The future plans for this project are to expand the functionality of the application by adding more in-depth options and features to the creation of the survey. Based on the findings of this study, the survey will become a customized system that is more accommodating to users, following the suggestions and improvements that were given. Based on the findings of the survey and the comments on the questionnaires, a system that recognizes more dialects and accents in a person's speech is needed. This is a problem that not just this application faced; it is an issue that everyone in the speech technology field is aware of, and work continues toward a more accommodating way to interact with all people, no matter what accent or dialect of a language they speak. A suggestion provided in the comments section of the questionnaire was to have a pool of questions that people could draw from when creating their survey. Implementing this feature in the application will be investigated: a set of sample questions would be available for creators, to give users an idea of what to ask in a particular situation. After the new enhancements have been added, a new study will be conducted using the same questionnaires and procedures as the current study. There will then be two sets of data that can be compared and analyzed to see whether the changes improved the system with regard to user experience and satisfaction. There are also plans in the works to combine the findings of this study with another graduate student's survey application. After that combination has taken place, commercialization of this project is part of the future aspirations of this study.

REFERENCES

Belani, H., Pripuzie, K. & Kobas, K. (2005). Implementing Web-Surveys for Software Requirements Elicitation. 8th International Conference on Telecommunications -
ConTEL 2005 (June 15-17, 2005).

BeVocal Café (2007). Retrieved March 7, 2007, from http://cafe.bevocal.com/

CIA Fact Book (2006). Retrieved March 8, 2007, from https://www.cia.gov/cia/publications/factbook/geos/us.html#Comm

Couper, M.P., Blair, J. & Triplett, T. (1999). A comparison of mail and e-mail for a survey of employees in federal statistical agencies. Journal of Official Statistics, 15, 39-56.

Crawford, S.D., Couper, M.P. & Lamias, M.J. (2001). Web Surveys: Perception of burden. Social Science Computer Review, 19, 146-162.

Dillman, D.A. (2000). Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley & Sons, Inc.

Gavrilov, L.A. & Heuveline, P. (2003). Aging of Population. The Encyclopedia of Population. New York: Macmillan Reference USA. Available: http://longevity-science.org.Population_Aging.htm

Gillette, F. (2006). Americans, Too Busy to Do Errands, Read About Them Instead. CJR Daily (September 14, 2006). Available: http://www.cjrdaily.org/behind_the_news/americans_too_busy_to_do_erran.php

Gunn, H. (2002). Web-based Surveys: Changing the Survey Process. First Monday, volume 7, number 12 (December 2002). Available: http://firstmonday.org.issues/issue7_12/gunn/index.html

Halogen Software (2007). Retrieved March 9, 2007, from http://www.halogensoftware.com/products/esurveyoropenbrochure.php

Hurst, M. (2006). Survey: Customer Research and Results (February 8, 2006). Available: http://www.goodexperience.com/blog/archives/000511.php

Kaye, B.K. & Johnson, T.J. (1999). Research Methodology: Taming the Cyber Frontier. Social Science Computer Review, 17, 323-337.

Robb, D. (2006). Now We're Talking. ComputerWorld (October 2, 2006), Volume 40, Issue 40.

Selwyn, N. & Robson, K. (1998). Using e-mail as a research tool. Social Research Update. Available: http://www.soc.surrey.ac.uk/sru/SRU21.html

Solheim, S. (2004). Talking up Speech. eWeek (September 27, 2004), Volume 21, Issue 39.

Solomon, D. (2001). Conducting Web-Based Surveys.
ERIC Clearinghouse on Assessment and Evaluation, College Park, MD (December 2001). Available: http://www.ericdigests.org/2002-2/surveys.htm

SumQuest Software. Retrieved March 12, 2007. Available: http://www.sumquest.com/index.htm

APPENDICES

APPENDIX A

E-mail that was sent to list serves in an attempt to recruit participants

My name is Ashley Wachs and I am a Master's student in the Department of Computer Science and Software Engineering here at Auburn University, working under Dr. Juan Gilbert. I need your help in conducting a study as part of my research. The qualifications are that you have taken or are currently taking a college course that discusses the usability of a system, and that you are at least 19 years of age. If you decide to participate, you will test a voice interactive system and then evaluate the system. This will only take around 15-20 minutes. There are no risks or benefits to you if you participate. However, your help will be greatly appreciated. The study will be taking place on DATE in ROOM from TIMES. Thank you very much, and if you have any questions, please feel free to contact me by e-mail at wachsam@auburn.edu. Thank you for your time.

Ashley M. Wachs
Human Centered Computing Lab - http://www.humancenteredcomputing.org/
Department of Computer Science and Software Engineering
Auburn University
107 Dunstan Hall
Auburn, AL 36849-5347 U.S.A.
334-844-6322 (o)

APPENDIX B

Classroom script that was used in an attempt to recruit participants

Classroom Script for Announcement

Hello, my name is Ashley Wachs and I am a Master's student in the Department of Computer Science and Software Engineering here at Auburn University, working under Dr. Juan Gilbert. I need your help in conducting a study as part of my research. The qualifications are that you have taken or are currently taking a college course that discusses the usability of a system, and that you are at least 19 years of age.
If you decide to participate, you will test a voice interactive system and then evaluate the system. This will only take around 15-20 minutes. There are no risks or benefits to you if you participate. However, your help will be greatly appreciated. The study will be taking place on DATE in ROOM from TIMES. Thank you very much, and if you have any questions, please feel free to contact me by e-mail at wachsam@auburn.edu. Thank you for your time.

Ashley M. Wachs
Human Centered Computing Lab - http://www.humancenteredcomputing.org/
Department of Computer Science and Software Engineering
Auburn University
107 Dunstan Hall
Auburn, AL 36849-5347 U.S.A.
334-844-6322 (o)

APPENDIX C

Pre-Experiment Survey that was given to all participants in the study

Pre-Experiment Survey

ID:
Age:
Gender:
Race/Ethnicity:
Citizenship:
Highest Degree obtained (High School, BS, BA, MS, MA, PhD):
Do you have any disabilities (Yes or No): _______ If Yes, please explain:

Is English your native or second language?
( ) Native language
( ) Second language

In the section below, choose the response that most accurately describes you.

1. How often do you use the telephone daily?
( ) 0-3 times
( ) 4-6 times
( ) 7-9 times
( ) 10 or more times

2. Have you ever created a survey before?
( ) Yes
( ) No

APPENDIX D

Sample Survey that Participants used in the Study

VUI Sample Survey

Opening Message: Thank you for taking the Weather Survey.

Type of Question: True/False
Question 1: Pollution is the major cause of global warming.
    TRUE  FALSE

Type of Question: Multiple Choice
Question 2: What type of severe weather is the most dangerous?
    1. Floods
    2. Hurricanes
    3. Tornadoes
    4. Thunderstorms
    5. Blizzards

Type of Question: Open Ended
Question 3: If you could, how would you control the weather?

Type of Question: Yes/No
Question 4: Do you like going to class in the rain?
    YES  NO

Type of Question: Likert Scale
Question 5: How easy would it be for a computer to predict the weather?
    1 very easy   2 easy   3 no opinion   4 hard   5 very hard

APPENDIX E

Post Experiment Survey that was given to the Participants of the Study

Post-Experiment Survey

ID:

Please mark the number that best reflects your reaction to the VUI Survey Tool:

Terrible ............ Wonderful
( ) 1  ( ) 2  ( ) 3  ( ) 4  ( ) 5

Frustrating ......... Satisfying
( ) 1  ( ) 2  ( ) 3  ( ) 4  ( ) 5

Not Usable .......... Usable
( ) 1  ( ) 2  ( ) 3  ( ) 4  ( ) 5

To what extent do you trust VUI Survey to accurately create your survey?
Do Not Trust At All ......... Completely Trust
( ) 1  ( ) 2  ( ) 3  ( ) 4  ( ) 5

I would use the VUI Survey Tool again.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

The VUI Survey Tool would be easy to use by people who don't know a lot about surveys. - THROWN OUT
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

Please respond by selecting the reaction that best reflects your impressions:

The survey tool was easy for me to use.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

It was easy to get started.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

It was easy to create my survey.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

I knew what to say or do during the task. - THROWN OUT
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

If I made a mistake, it was easy to correct.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

The system was accurate in recalling the correct information.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

I was able to successfully complete the task.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

I would have preferred a male voice.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

It was easy to understand the system's instructions.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

It was easy to speak to the system.
( ) Strongly Agree  ( ) Agree  ( ) Neutral  ( ) Disagree  ( ) Strongly Disagree

How many questions did you re-record? ___________ - THROWN OUT

I would improve the VUI Survey Tool by:
_______________________________________________________________________
_______________________________________________________________________

If given the choice to create a survey using the Internet, paper or the VUI Survey Tool, which would you choose and why?
_______________________________________________________________________
_______________________________________________________________________

Additional comments/suggestions:
_______________________________________________________________________
_______________________________________________________________________

APPENDIX F

The Time Log of the Study Results

Time Log
ID   Total Seconds   Minutes   Seconds
 1        221           3        41
 2        208           3        28
 3        422           7         2
 4        204           3        24
 5        232           3        52
 6        252           4        12
 7        192           3        12
 8        232           3        52
 9        258           4        18
10        351           5        51
11        250           4        10
12        456           7        36
13        244           4         4
14        276           4        36
15        315           5        15
16        225           3        45
17        254           4        14
18        213           3        33
19        263           4        23
20        278           4        38
21        209           3        29

APPENDIX G

Pre-Experimental Survey Results

ID   Age   Gender   Race/Ethnicity     Citizenship   Highest Degree
 1   23    Female   White              USA           B.S.
 2   23    Female   White              USA           B.S.
 3   54    Female   White              USA           M.S.
 4   29    Female   Asian              Taiwan        B.S.
 5   24    Female   White              USA           B.S.
 6   26    Male     Asian              China         B.S.
 7   27    Female   Asian              India         PhD
 8   26    Male     Asian              India         M.S.
 9   23    Male     African-American   USA           B.S.
10   24    Female   African-American   USA           B.S.
11   19    Male     White              USA           High School
12   25    Male     Asian              India         B.S.
13   22    Male     White              USA           High School
14   23    Female   White              USA           B.S.
15   24    Male     African-American   USA           B.S.
16   25    Female   African-American   USA           M.S.
17   25    Male     Asian              Taiwan        B.S.
18   26    Male     Asian              India         M.S.
19   21    Female   White              USA           High School
20   27    Male     Asian              China         M.S.
21   24    Female   White              USA           M.S.
ID   Is English 1st Language   Often Use Phone   Ever Created Survey Before
 1   Yes                       7-9 times         Yes
 2   Yes                       4-6 times         No
 3   Yes                       0-3 times         Yes
 4   No                        4-6 times         No
 5   Yes                       7-9 times         No
 6   No                        4-6 times         Yes
 7   No                        10+ times         Yes
 8   No                        4-6 times         Yes
 9   Yes                       4-6 times         Yes
10   Yes                       7-9 times         Yes
11   Yes                       4-6 times         Yes
12   No                        10+ times         Yes
13   Yes                       4-6 times         No
14   Yes                       10+ times         No
15   Yes                       7-9 times         Yes
16   Yes                       4-6 times         Yes
17   No                        4-6 times         Yes
18   No                        4-6 times         Yes
19   Yes                       7-9 times         Yes
20   No                        4-6 times         Yes
21   Yes                       7-9 times         Yes

APPENDIX H

Post Experimental Survey Results

ID   Terrible/Wonderful   Frustrating/Satisfying   Not Usable/Usable   Trust   Use Again        Easy for Me to Use   Easy to Get Started
 1   3   3   4   4   Agree            Agree            Neutral
 2   4   4   4   4   Agree            Agree            Strongly Agree
 3   4   3   4   4   Agree            Agree            Neutral
 4   4   4   5   5   Strongly Agree   Strongly Agree   Strongly Agree
 5   4   4   5   4   Strongly Agree   Agree            Neutral
 6   5   5   5   5   Agree            Agree            Agree
 7   5   5   5   5   Strongly Agree   Strongly Agree   Strongly Agree
 8   4   4   5   5   Agree            Agree            Agree
 9   3   3   3   4   Neutral          Agree            Agree
10   3   4   3   2   Neutral          Agree            Agree
11   4   4   4   5   Agree            Agree            Agree
12   4   4   4   4   Agree            Agree            Disagree
13   4   4   5   5   Agree            Strongly Agree   Agree
14   5   5   5   5   Agree            Agree            Agree
15   4   4   5   5   Agree            Agree            Agree
16   5   5   5   5   Agree            Agree            Agree
17   4   4   4   4   Agree            Agree            Agree
18   4   4   5   4   Agree            Agree            Agree
19   4   4   5   4   Agree            Agree            Agree
20   4   5   5   5   Agree            Agree            Agree
21   4   5   4   5   Agree            Agree            Agree

ID   Easy to Create   Easy to Correct   Accurate in Recalling Info   Successfully Completed Task   Preferred Male Voice   Easy to Understand Instructions   Easy to Speak to System
 1   Agree            Strongly Agree    Agree            Agree            Neutral             Neutral          Agree
 2   Neutral          Agree             Agree            Neutral          Strongly Disagree   Neutral          Strongly Agree
 3   Agree            Strongly Agree    Strongly Agree   Disagree         Disagree            Strongly Agree   Strongly Agree
 4   Strongly Agree   Disagree          Agree            Agree            Neutral             Strongly Agree   Strongly Agree
 5   Agree            Agree             Strongly Agree   Agree            Neutral             Agree            Agree
 6   Strongly Agree   Neutral           Strongly Agree   Agree            Disagree            Agree            Strongly Agree
 7   Strongly Agree   Disagree          Strongly Agree   Strongly Agree   Strongly Disagree   Strongly Agree   Strongly Agree
 8   Agree            Neutral           Strongly Agree   Strongly Agree   Disagree            Strongly Agree   Strongly Agree
 9   Neutral          Agree             Agree            Agree            Neutral             Agree            Agree
10   Neutral          Agree             Agree            Agree            Neutral             Neutral          Agree
11   Strongly Agree   Disagree          Strongly Agree   Agree            Neutral             Strongly Agree   Strongly Agree
12   Strongly Agree   Disagree          Strongly Agree   Agree            Neutral             Agree            Neutral
13   Agree            Neutral           Strongly Agree   Strongly Agree   Neutral             Strongly Agree   Strongly Agree
14   Agree            Agree             Strongly Agree   Strongly Agree   Disagree            Agree            Agree
15   Strongly Agree   Neutral           Strongly Agree   Strongly Agree   Disagree            Agree            Strongly Agree
16   Agree            Agree             Strongly Agree   Agree            Disagree            Strongly Agree   Strongly Agree
17   Strongly Agree   Agree             Strongly Agree   Agree            Disagree            Strongly Agree   Strongly Agree
18   Agree            Neutral           Agree            Agree            Disagree            Agree            Agree
19   Strongly Agree   Neutral           Agree            Agree            Neutral             Agree            Strongly Agree
20   Strongly Agree   Neutral           Strongly Agree   Strongly Agree   Neutral             Agree            Strongly Agree
21   Strongly Agree   Neutral           Agree            Agree            Disagree            Agree            Agree

ID   Internet/VUI
 1   VUI
 2   VUI
 3   Internet
 4   Internet
 5   VUI
 6   VUI
 7   VUI
 8   (no response)
 9   Internet
10   Internet
11   Internet
12   Internet
13   VUI
14   VUI
15   VUI
16   VUI
17   VUI
18   Internet
19   Internet
20   VUI
21   VUI

APPENDIX I

Comparison of the Results

Task Completion Time (Minutes)   Completed   # of Errors   # of Re-recorded Questions
3.7                              Yes         0             0
3.5                              Yes         0             0
7                                No          0             2
3.4                              No          2             1
3.9                              Yes         0             0
4.2                              Yes         1             0
3.2                              No          1             0
3.9                              Yes         0             1
4.3                              Yes         0             0
5                                Yes         1             0
4.2                              Yes         0             0
7.6                              Yes         5             1
4.1                              Yes         0             0
4.6                              Yes         0             1
5.3                              Yes         0             1
3.8                              No          0             0
4.2                              Yes         0             0
3.6                              Yes         0             0
4.4                              Yes         0             1
4.6                              Yes         1             0
3.5                              Yes         0             0