A LONGITUDINAL COMPARATIVE STUDY EXAMINING FREQUENCY AND TYPES OF TECHNOLOGY USAGE BETWEEN PRACTICING CLASSROOM TEACHERS AND PRE-SERVICE EDUCATION MAJORS

Except where reference is made to the work of others, the work described in this dissertation is my own or was done in collaboration with my advisory committee. This dissertation does not include proprietary or classified information.

__________________________________
Virginia Bowman Wilcox

CERTIFICATE OF APPROVAL:

David M. Shannon, Professor, Educational Foundations, Leadership and Technology
Pamela C. Boyd, Chair, Professor, Curriculum and Teaching
Theresa McCormick, Assistant Professor, Curriculum and Teaching
Stephen L. McFarland, Dean, Graduate School

A LONGITUDINAL COMPARATIVE STUDY EXAMINING FREQUENCY AND TYPES OF TECHNOLOGY USAGE BETWEEN PRACTICING CLASSROOM TEACHERS AND PRE-SERVICE EDUCATION MAJORS

Virginia Bowman Wilcox

A Dissertation Submitted to the Graduate Faculty of Auburn University in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

Auburn, Alabama
August 8, 2005

A LONGITUDINAL COMPARATIVE STUDY EXAMINING FREQUENCY AND TYPES OF TECHNOLOGY USAGE BETWEEN PRACTICING CLASSROOM TEACHERS AND PRE-SERVICE EDUCATION MAJORS

Virginia B. Wilcox

Permission is granted to Auburn University to make copies of this dissertation at its discretion, upon request of individuals or institutions at their expense. The author reserves all publication rights.

Signature of Author
Date

VITA

Virginia (Fay) Bowman Wilcox, daughter of Mary and Daniel Bowman, was born September 18, 1968, in Macon, Georgia. She graduated from First Presbyterian Day School in 1986. She attended Wesleyan College in Macon, Georgia, and graduated with a Bachelor of Arts degree in Early Childhood Education in May of 1990. She began her teaching career at Dexter Elementary School in Fort Benning, Georgia, where she taught second through fifth grades from 1990 through February of 1997. She obtained her Master of Arts degree in Elementary Education from Auburn University in August of 1993. She began working toward her PhD the following summer. In February of 1997, she moved to Oregon, where she taught at Mount Scott Elementary School in North Clackamas, Oregon, for one year. In 1998, she moved to DuPont, Washington, where she taught fifth and sixth grade in Puyallup, Washington, until May of 2002. In June of 2002, she returned to Macon, Georgia, and began teaching in the Education Department at Wesleyan College. She married Howard Albert Wilcox III, son of Howard and Teresa Wilcox, July 30, 1990. She has three beautiful daughters: Georgia Marie Wilcox, born February 14, 2000; Teresa Reagan Wilcox, born October 1, 2001; and Charlotte Lynn Wilcox, born February 20, 2003. She and her family reside in Macon, Georgia. Her husband is employed by Intel.

DISSERTATION ABSTRACT

A LONGITUDINAL COMPARATIVE STUDY EXAMINING FREQUENCY AND TYPES OF TECHNOLOGY USAGE BETWEEN PRACTICING CLASSROOM TEACHERS AND PRE-SERVICE EDUCATION MAJORS

Virginia Bowman Wilcox
Doctor of Philosophy, August 8, 2005
(M.S., Auburn University, 1993)
(A.B., Wesleyan College, 1990)
146 Typed Pages
Directed by Dr. Pamela C. Boyd

The state of Georgia has allocated significant resources to train teachers and prepare them to use high-end technological devices, programs, and instruments in their classes.
The purpose of these financial disbursements was to create students who are prepared for a highly technological society and who are adept in the procedures necessary to use the equipment in sophisticated and applicable ways (Raudonis, 2004). Current research reflects that most schools are well equipped to accomplish this goal in regard to equipment, supplies, and materials. Nevertheless, school system technology assistants and trainers frequently point out that much of this equipment goes unused during the typical school year. The problem then points to lack of use on the part of the classroom teachers. Multiple training programs, courses, staff development offerings, incentives, and other training methods have been created with the intention of demonstrating first-hand how the available technology can be implemented within the classroom in more appropriate and meaningful ways. Studies have found no significant impact of such programs on the general teaching population in regard to their classroom usage, attitudes, and comfort levels involving technology (Laffey, 2004).

This study examined two groups of educators and the frequency and type of technology use maintained within their classrooms over a six-month period. A group of twenty practicing teachers in the Bibb County School System (Macon, Georgia) participated in a technology training course (InTech) taught by the researcher in spring of 2004. A second group of twelve pre-service education majors at Wesleyan College (Macon, Georgia) participated in the same course. Both groups were contacted again six months after completing the course. Their technology usage was examined and compared statistically. The goal was to determine which group adopted and maintained the most significant changes in personal and professional technology usage. The purpose of this study was to contribute to the body of research currently being conducted in order to determine the most effective and appropriate point at which such training should be provided during a teacher's preparation program.

It was discovered that while the practicing teachers increased their usage of technology across all areas, the level of increase was not maintained six months after the completion of the InTech course. The group of pre-service college students showed steady and continued growth across all areas and at all three collection intervals. This indicated that the earlier technological training is introduced in educators' preparatory programs, the more likely practitioners will be to facilitate its use when presented with opportunities to do so.

ACKNOWLEDGMENTS

The author would like to thank Dr. Pamela Boyd and Dr. David Shannon for their support and assistance throughout this process. Special thanks go particularly to my husband, Howard, who has always believed in me more than I believe in myself, and to my mother-in-law, Teresa, whose support has always been overwhelming. The efforts contained within this paper are dedicated to my three beautiful daughters, Georgia Marie, Teresa Reagan, and Charlotte Lynn; my hope for you is that you'll see through my example that you never stop learning. Finally, this study and the resulting degree of PhD are ultimately dedicated to the memory of my mother, Mary Carr Bowman. Truly none of this could have happened without her.

Style manual used: Publication Manual of the American Psychological Association, 5th Edition.
Computer software used: Microsoft Windows XP Professional, Microsoft Excel 2003, SPSS 13.0 for Windows.

TABLE OF CONTENTS

LIST OF FIGURES ........................................ xiii
LIST OF TABLES ......................................... xiv
I. INTRODUCTION TO STUDY ............................... 1
   Statement of the Problem ............................ 1
   Purpose of the Study ................................ 2
   Background for the Study ............................ 3
   Theoretical Perspective ............................. 4
   The Need for the Study .............................. 5
   Research Goals ...................................... 6
   Research Questions .................................. 7
   Limitations of Study ................................ 7
   Definition of Terms ................................. 8
   Summary ............................................. 12
II. REVIEW OF THE LITERATURE ........................... 13
   Introduction ........................................ 13
   Prevalent Technology Usage in Today's Classrooms ... 15
   Technological Training in Pre-Service Teacher Preparation Programs ... 17
   Factors Influencing Practicing Teachers' Technology Usage ... 21
      Limited Access ................................... 21
      Adequate Training ................................ 23
      Administrative Support ........................... 25
      Student Performance .............................. 27
   Best Practices in Regard to Effective Technology Integration ... 28
   Summary ............................................. 33
III. METHODOLOGY ....................................... 35
   Introduction ........................................ 35
   Research Questions .................................. 36
   Background Information .............................. 36
   Participants ........................................ 38
   Treatment of Participants ........................... 39
   Procedures .......................................... 42
   Data Collection ..................................... 43
   Instrumentation ..................................... 44
   Analysis of Data .................................... 45
IV. FINDINGS OF THE STUDY .............................. 46
   Overview and Analysis ............................... 46
   Instrument Reliability .............................. 47
   Analysis of Course Effect ........................... 59
   Analysis of Personal Computer Use ................... 65
   Analysis of Use of Computer in Teaching Applications ... 69
   Analysis of Integration of Technology into Subject Areas ... 72
   Group Effect ........................................ 76
   Summary ............................................. 79
V. CONCLUSIONS AND RECOMMENDATIONS .................... 81
   Introduction ........................................ 81
   Research Question #1 Results ........................ 82
   Research Question #2 ................................ 86
      Personal Use ..................................... 86
      Teaching Application ............................. 90
      Subject Area Integration ......................... 93
   Question #2 Results ................................. 95
   Conclusions ......................................... 97
   Implications and Suggestions ........................ 102
      In-Service Factors ............................... 102
         Equipment Availability ........................ 102
         Adequate Training ............................. 103
      Pre-Service Factors .............................. 104
         Continuation of Skill ......................... 104
         Modeling of Desired Behaviors ................. 105
   Restatement of Findings ............................. 106
   Significance of the Study ........................... 109
   Recommendations for Further Study ................... 110
   Summary ............................................. 112
REFERENCES ............................................. 114
APPENDICES ............................................. 119
   Appendix A: InTech Survey .......................... 120
   Appendix B: Classroom Announcement ................. 122
   Appendix C: Letter of Consent ...................... 124
   Appendix D: Macon State College Letter ............. 126
   Appendix E: Wesleyan College Letter ................ 128
   Appendix F: Six Month Reminder Letter .............. 130

LIST OF FIGURES

Figure 1 Between Group Comparison Chart ............... 64
Figure 2 Personal Use of Technology Chart ............. 66
Figure 3 Use of Technology in Teaching Applications Chart ... 69
Figure 4 Use of Technology in Subject Area Integration Chart ... 73
Figure 5 Summary of Pre- and Six-Month Technology Use Chart ... 96
Figure 6 Summary of Group A and Group B Technology Use Chart ... 98

LIST OF TABLES

Table 1 InTech Survey Item Breakdown .................. 48
Table 2 Reliability Results for Personal Use .......... 50
Table 3 Personal Use by Question Breakdown ............ 51
Table 4 Reliability Results for Teaching Application ... 53
Table 5 Teaching Application Use by Question Breakdown ... 54
Table 6 Reliability Results for Integration of Technology into Subject Areas ... 56
Table 7 Integration of Technology into Subject Area by Question Breakdown ... 57
Table 8 Reliability Results for Pre-, Post-, and Six Months ... 58
Table 9 Summary of Mixed-Model ANOVA .................. 61
Table 10 Descriptive Summary of Technology Use ........ 63
Table 11 Paired t-Test for Personal Use ............... 68
Table 12 Paired t-Test for Teaching Application ....... 71
Table 13 Paired t-Test for Subject Area Integration ... 75
Table 14 Between Subject Effects for Personal Use of Technology ... 76
Table 15 Between Subject Effects of Use of Technology in Teaching Applications ... 77
Table 16 Between Subject Effects of Integration of Technology in Subject Areas ... 78
Table 17 InTech Survey Personal Use Questions ......... 88
Table 18 InTech Survey Teaching Application Questions ... 92
Table 19 InTech Survey Subject Area Integration Questions ... 94

I. INTRODUCTION TO STUDY

Statement of the Problem

Ten years ago, Georgia State Lottery money was allocated to install technology in public schools. A decade later, the question arises: has it been worth the cost? In an examination of the research currently available on this topic, it was discovered that as little as five years ago, 80% of Georgia public schools reported high levels of technology in place and available to students (Raudonis, 2004).
That same year, only 10% of the teachers surveyed reported actually using this equipment to any extent within their classrooms. In a survey of Georgia public school teachers taken in 2003, approximately 80% of teachers rated their technology skills as low to moderate (Raudonis, 2004). The results of this survey identified the need for teacher training. The equipment is available and ready to use. However, teachers do not possess the skills, comfort level, or training needed to implement effective lessons or activities that actually integrate or connect to the available technology.

The state of Georgia has recognized a lack of adequately technology-trained educators in the state. In 1999, a course entitled InTech (Integrating Technology) was created. The goal of this course was to provide practicing teachers training that would model best practices in the application and integration of technology across subject areas with students. A secondary goal was to provide teachers with the skill level needed to lower anxiety levels in regard to technology and increase personal comfort levels with high-tech equipment, hopefully resulting in more frequent, as well as prolonged, usage of the equipment available (Redish, Holmes & Whitacre, 2004). The InTech course was mandated by the state of Georgia and became part of the certification process. In order to be certified to teach or to renew a current teaching certificate, a candidate must have undergone InTech training prior to the year 2006 (Redish et al., 2004).

The state of Georgia has begun to look at the results of this mandated state course. In a 2003 survey, only 8% of Georgia public school teachers rated their personal technology skills as high. About 50% reported using technology for student learning on a weekly basis. Only 15% actually saw connections between the use of technology in class and state standards (Raudonis, 2004). These numbers reflect the opening question: "has it been worth the cost?" Obviously the state of Georgia saw value in providing technological experiences for children, because the state mandated such a course and its subsequent certification requirement. What, if anything, has changed in the five years since the mandate was created?

Purpose of the Study

The purpose of this study was to investigate which of two groups (practicing in-service teachers or pre-service education majors) would not only benefit most from, but also retain, the information presented in an InTech course over an extended period of time. The goal was to determine the most effective time to present technological training to education candidates. The outcomes of this study should also help identify barriers that discourage the implementation of technology in the classroom. In addition, this study identified factors that could be used to develop teachers who possess an advanced ability to incorporate technology across the subject areas, thus resulting in more positive experiences regarding the use of technology for all involved.

The information obtained from this study would also be important for colleges of education, whose primary purpose is to train students to be competent in the preparation of their own students for working in today's, as well as tomorrow's, society. The research to this point indicates that the educational training received does not adequately prepare teachers to use technology effectively with students (Laffey, 2004).
The results obtained from this study should provide guidance for colleges of education as they continue to ascertain the most appropriate way to include technology training among all of the other areas they are mandated to provide. It is also hoped that those involved in monitoring technology use on a local level within individual school systems would be interested in these results, as they tie directly back to the specific changes and types of training that are needed, or not needed, to promote effective technology use in classrooms.

Background for the Study

This study follows a design similar to that of a quantitative program evaluation. The phenomenon studied was the effect of the required InTech course and the continued impact it may have had on a participant's usage and pedagogy six months after its completion. The data for this study were quantitative in nature, collected in the form of responses to a technology usage survey (Appendix A). These data were analyzed and used to determine which of the two groups, Group A (in-service practicing teachers) or Group B (pre-service college education majors at Wesleyan College), maintained the most significant changes in pedagogy over the extended six-month time period. Specific attention was given to three distinct areas of technology usage: personal technology use, use of technology in specific teaching applications, and the integration of technology usage into specific curricular subject area instruction. Consistent with social program evaluation, these data provided an overall portrayal of the quality and effectiveness not only of this particular InTech course, but also of the Bibb County School System's attempts to facilitate the use of technology within its schools.

Theoretical Perspective

The notion that students learn best when they can see a logical purpose for whatever skill they need to learn has always provided this researcher strong pedagogical guidance (Daniels & Bizar, 1998). The belief that students will retain information longer and move to higher levels of Bloom's Taxonomy (including application, analysis, and synthesis) holds true, but only after students have had truly meaningful experiences with the material they are attempting to learn (Daniels & Bizar, 1998). The use of technology is but one way to accomplish these feats. Whether teaching children or future teachers of children, students must be provided opportunities to experience, interact with, and see firsthand how the skill, technique, policy, or procedure can help or will affect them personally before they will be able to take the first step toward developing a philosophy that encompasses it.

The theoretical perspective of this study was interpretivist, and more precisely constructivist, in nature. Human beings cannot simply be told or taught anything; they can only construct meaning once they have had experiences with the information, ideas, concepts, and strategies (Zemelman, Daniels, & Hyde, 1998). The Integrating Technology (InTech) course followed a constructivist philosophy. The course contained little lecture or demonstration. The majority of the time spent in this course revolved around students experiencing the integration of technology first hand. They were immersed in an integrated unit setting and were able to obtain practical hands-on experience in every aspect of this course. The participants left this course and returned to their respective schools.
A follow-up contact was made with them six months after completing the InTech course.

The Need for the Study

The need for this study encompassed two distinct categories. First, the state of Georgia has committed a significant amount of resources to purchasing technological equipment. The state has dedicated a tremendous amount of time to the creation and implementation of technology training courses. It has also created state mandates and incentives to motivate teachers to use and apply what they have learned with students in real and meaningful ways. Based on the review of the literature, returns on the state's investments have not materialized. This study examined how frequently teachers used technology with their students, with specific attention to three areas: personal use, instructional use, and subject area integration. This study also examined which of two groups sustained the most significant change in technology usage. This would be an important factor to keep in mind when planning future staff development courses.

Secondly, colleges of education currently offer various forms of technological courses, ranging from a single mandatory course to total infusion of technology across all methods-based content area courses. This study focused on two groups of educators and the frequency with which they used technology in their teaching. It also focused on the maintenance of that usage over an extended period of time once the requirements of the InTech course had been removed. If the use of technology by students beyond graduation is deemed a priority, then the results of this study would provide valuable guidance to colleges as they attempt to create environments conducive to producing future educators who will not only use technology in their teaching, but also see value in its integration across subject areas.

Research Goals

The purpose of this study was to determine which of two groups (practicing in-service teachers or pre-service education majors) would not only benefit from, but also retain, the information presented in an InTech course over an extended period of time. The goal was to determine the most appropriate time to introduce technological training to future educators in order to ensure lasting, effective, and meaningful pedagogical change in regard to use of technology.

Research Questions

The research was guided by the following two questions:

1. Which group of InTech trained educators (in-service teachers or pre-service teachers) will maintain the most significant pedagogical change in regard to technology use over an extended period of time?

2. Which of three areas of technology usage (personal use, teaching applications, or subject area integration) will sustain the most significant change between the two groups?
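The analyses behind these questions (reported in Chapter IV as a mixed-model ANOVA and paired t-tests) were run in SPSS 13.0; no analysis code appears in this dissertation. As an illustration only, the following is a minimal sketch of how such a design, with two groups, three collection points (pre, post, six months), and subscale scores for each usage area, could be reproduced. The file name, column names, and the third-party pandas, pingouin, and scipy packages are all assumptions, not part of the original study.

```python
# Illustrative sketch only; the dissertation's analyses were run in SPSS 13.0.
# Assumes a hypothetical long-format file with columns:
#   subject (participant id), group ('A' in-service / 'B' pre-service),
#   time ('pre', 'post', 'six_month'), and a mean subscale score per
#   usage area, e.g. personal_use (averaged Likert-scale survey items).
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("intech_survey.csv")  # hypothetical file name

# Mixed-model ANOVA (cf. Table 9): 'time' is the within-subject factor,
# 'group' the between-subject factor; the group x time interaction tests
# whether the two groups' usage trajectories differ.
aov = pg.mixed_anova(data=df, dv="personal_use", within="time",
                     subject="subject", between="group")
print(aov)

# Paired t-test within one group (cf. Tables 11-13): pre vs. six-month
# personal use, probing whether initial gains were maintained.
a = df[df["group"] == "A"].pivot(index="subject", columns="time",
                                 values="personal_use")
t, p = stats.ttest_rel(a["pre"], a["six_month"])
print(f"Group A pre vs. six-month: t = {t:.2f}, p = {p:.4f}")
```

With samples of 19 and 12 participants, such tests would be underpowered, which is consistent with the sample-size limitation acknowledged below.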
Limitations of Study

The first and most important limitation of this study was the small participant size. The study began with 20 participants in Group A. Group A consisted of practicing in-service elementary school teachers from the Bibb County School System in Macon, Georgia. One participant quit teaching, bringing the final number in Group A to 19. Group B was made up of 12 pre-service senior elementary education majors from Wesleyan College in Macon, Georgia. Both groups were taught the InTech course by the researcher following the same course syllabus and guidelines. While the small group size allowed more personal interaction, facilitated better researcher/subject relationships, and added to the overall quality of each participant's technological training experience, it was a limitation as far as quantitative data are concerned. Such a small number of participants did not provide the base of numbers needed to make truly significant statements about usage, limitations, or benefits. The significant factors found will be used to construct future research studies and future course content.

Another limitation of this study was the spectrum of socio-economic levels represented by participants' individual schools. The initial goal was to focus this study on participants teaching within, or preparing to teach in, the Bibb County School System in Macon, Georgia. However, over the summer, three Group A participants were employed by nearby Houston County. This change from one school system to another could possibly have had an effect on the continued access participants had to certain types of technology they may have used at the beginning of the study. Rather than pick three new participants, it was decided to keep them in the group. It had also been hoped that participants would represent the full spectrum of socio-economic levels among schools. While this was the case when the groups were originally formed, due to intra-county transfers, ten out of nineteen Group A participants and six of the twelve Group B participants were assigned to what could be classified as low socio-economic schools, while other participants remained in middle to high socio-economic schools. Once again, rather than replace participants who had already begun the study, it was decided to keep them in the group as well.

Definition of Terms

The following definitions will be the standard interpretation used for the purposes of this study.

Assessment - The practice of determining if students have achieved objectives or goals established within a particular lesson or unit.

Authentic - Relates to real life, real events, real purposes, and real products; avoiding the use of packaged curriculum or scripted lesson plans that do not relate to real, rich life experiences.

Best Practices - A set of 13 guiding principles established by S. Zemelman, H. Daniels, and A. Hyde and published by Heinemann of Portsmouth, New Hampshire, in 1998.

CD-ROM - Compact Disk, Read Only Memory; a computer storage device.

Challenging - Stimulating to student interests, as well as learning behavior.

Cognitive - Promotes thinking and understanding instead of simply knowing and reciting.

Collaborative - Providing opportunities to interact responsibly with classmates in a variety of settings.

Constructivist - The means of actively recreating and reinventing knowledge, skills, and techniques.

Cooperative Groups - A group in which all members have a job to do or a role to play in the completion of a final project. Without the cooperation and teamwork of all members, the task cannot and will not be properly completed.

Criterion-referenced - Compared to objectives or goals established to meet certain criteria.

Democratic - Involves student citizenship in the decision-making processes that take place in the classroom, including completion of projects, scoring, displaying work, and more.

Developmental - Age appropriate and hierarchical in structure, content, and expectations.

Educators - Those who are currently teaching (in-service) or in a preparatory program to become teachers (pre-service).
Experiential - Engages students in active learning through actually experiencing events and activities, either in real or simulated fashion.

Expressive - Allows students to perform a variety of communicative activities in the presentation of their work, including but not limited to speech, writing, drawing, poetry, dance, drama, music, movement, and visual arts.

Heterogeneous - A group in which members are operating on a variety of instructional levels with the goal of learning from each other.

Holistic - Teaches concepts from whole to part, integrating content throughout other subject areas and never in isolation.

Homogeneous - A group in which all members are operating on similar levels of understanding, abilities, and performance.

In-service Teachers - Practicing classroom teachers who have chosen to take a particular course. The course may meet a requirement to maintain or renew certification, or the participant may take the course out of an intrinsic desire to improve existing skills or knowledge.

InTech - The Integrating Technology course created by Dr. Traci Redish and now used as a fulfillment of the technological proficiency requirement for the state of Georgia.

Integration - Combining a skill or content area with skills from other content areas in order to see a true, applicable, and meaningful use of the new skill in context.

Interpretivism - An approach in which the researcher attempts to interpret a phenomenon in the context of the surrounding culture and setting.

Metacognition - The ability to think about one's thinking; the reflective practice of realizing along the way that something is not making sense and taking appropriate steps to correct the confusion.

Multi-media - Technology that incorporates any two of the following: text, graphics, media, or sound.

Norm-referenced - Compared to others at the normal or average range.

Pedagogy - The art and science of teaching.

Pre-service Teachers - Individuals participating in a teacher preparation program, either as a traditional student seeking a Bachelor of Arts degree in education or as a non-traditional student with the purpose of adding to an already completed degree and becoming certified to teach.

Portfolio - A collection of a student's best work within a certain subject area.

Problem-based learning - Presenting a unique problem that needs to be solved as the foundation for a lesson, thereby reinforcing the importance of the skills that will be needed to solve the problem.

Reflective - Encourages students to reflect on performance, outcomes, and overall quality of work. Students should be allotted time to provide feedback for peers and to use any and all feedback received to improve future work.

Smart-Classroom - A room designed specifically for technology-enhanced lessons and teaching. Such a room may contain items such as, but not limited to, an interactive touch screen display, a teacher electronic workstation, a class set of computers with internet access, printer access, and digital video.

Social - Uses groupings that are homogeneous as well as heterogeneous in small and cooperative group settings to work toward the completion of projects and activities.

Student-Centered - Focusing the content studied and materials used in a class around the interests and needs of the students in the class rather than arbitrary and distant content or curriculum.

Synthesis - Combining elements into one single or unified entity.
Summary

The State of Georgia maintains that, in order to obtain a certificate to teach in the state (at any level), one must demonstrate technological proficiency. Bibb County and all schools encompassed within that county have defined technological proficiency as taking and passing the state-approved InTech course. However, local technological curriculum specialists in the Bibb County system still report minimal usage of the skills and techniques obtained in the InTech course. It is the goal of this researcher to provide data indicating that the primary factor in this minimal usage is the point at which technology training was received.

Chapter II will examine the current literature available on this subject in the areas of prevalent technology use in today's classrooms, technology training in pre-service teacher preparation programs, factors influencing practicing teachers' technology usage, and best practices in regard to effective technology instruction.

II. REVIEW OF THE LITERATURE

Introduction

Naisbitt (1982) explains that new technologies pass through three stages. In the first stage, the technology follows the line of least resistance into the new setting. At the second stage, new technology improves or replaces previously used items, programs, or materials. Finally, in the third stage, users discover new functions for the technology based on its potentials. They discover what they can do with it now that was not possible before.

Naisbitt's claims are confirmed by Peck and Doricott (1994): "Most educators have been stuck in the stage two level creating puzzles, delivering instruction, assessing student progress, and producing reports or newsletters" (p. 11). Schiffer stated, "However, unlike in businesses, computers in the classroom have increased, rather than decreased, teacher workloads. Many report that the classroom computers spend more time turned off than on and that the money spent would have been better used elsewhere" (Schiffer, 1999, p. 5). Peck and Doricott asked, "If we removed all of the computers from schools tomorrow, would it make a difference in the knowledge and skills students demonstrated upon graduation? Probably not. What if we removed all of the computers from businesses tomorrow? Most would find it impossible to continue" (p. 11). D'Ignazio (1993) ponders why schools simply rumble along virtually unchanged by the presence of computers. He stated, "Businesses have been building electronic highways while education has been creating an electronic dirt road. And sometimes on a dirt road, it's just as easy to just get out and walk" (p. 33).

According to O'Neil (1995), the most common uses of technology in the classroom were the use of video for presenting information, computer games and software for drill and practice, and word processing in middle and high school settings. Redish, Holmes and Whitacre (2003/2004) note that this type of usage is still the most commonly found in today's more technologically equipped classroom. The notion of reasoning with computer simulations, or of gathering information from databases, the internet, CD-ROM, or presentational software, is still rare in classroom settings even today, where these devices are more easily accessible. In secondary settings, the percentage of teachers who actually report using technology in any form as a part of their mandated curriculum is quite low. Nine percent reported that they employ computers while teaching English, 6-7% in Math, and only 3% for Social Studies (Redish et al., 2003/2004).
There are some who insist on hard evidence supporting the superiority of technology as an aid to teaching and learning before they are willing to advocate its use in the classroom. Others take the view that "technology is here to stay, and it should be included as a part of science and mathematics classrooms if instruction is to be relevant to students' daily lives" (Lederman & Niess, 2000, p. 347). This review of the literature will examine four areas in regard to technology use: prevalent usage in today's classrooms, technology training in pre-service teacher preparation programs, factors that have influenced practicing teachers' technology usage, and best practices in regard to effective technology instruction.

Prevalent Technology Usage in Today's Classrooms

According to Lowther, Ross and Morrison (2003), classroom teaching methods are remarkably resistant to change: "From the 1890s to today, teacher-centered practices still dominate the class arrangement, communication, dynamics, and instructional activities" (p. 35). Hokanson and Hooper (2000) agree, pointing out that teachers' reliance on computers for delivering instruction falls into the drill-and-practice and entertainment categories rather than "facilitating student-centered activities such as inquiry and problem solving" (p. 540). Even in a study where teachers were given class sets of laptops to use with their students, Lowther et al. (2003) noted that "although the students in the study had their own computers, two out of three teachers observed failed to use the technology in ways that substantially changed their former, teacher-centered approaches" (p. 25).

One reason for the lag in implementation is that teachers are not yet convinced that computer technology can significantly enhance learning. Until educators can be convinced that the existing technologies will not only increase student subject matter retention, but also make their jobs easier and more enjoyable, true technology implementation will never take place (Naisbitt, 1982). Educators at Naisbitt's (1982) third stage, where they discover new functions of technology based on its potentials, understand that it is what the student does that counts. There are some things, however, that only teachers can do. Teachers can build strong, productive relationships with students. Technologies cannot. Teachers can motivate students to love learning. Technologies cannot. Teachers can identify and meet students' emotional needs. Technologies cannot. Technology-based solutions in education can, and must, free the teacher to do the important work that requires human interaction, continuous evaluation, and improvement of the learning environment. However, teachers are resistant to taking the chance to use the equipment that is available. Peck and Doricott (1994) state, "When educators allow students to interact with technologies in meaningful ways for significant periods of time, the growth that follows will encourage educators to try new things" (p. 14). Slavin (2002) states it best:

Technology is often the Trojan horse through which innovation enters the school. To see students so engaged in learning that they lose track of time, to see a level of excitement that causes students to come to school early and stay late, and to have time to develop strong relationships with students and to meet their individual needs, will inspire educators to take more frequent and larger steps into stage three. (p. 19)
Most of the computer programs and software packages available for classroom use today are designed to give students a more active role in constructing knowledge. This brings about an implicit change in the role of the teacher. According to Kozma and Johnston (1991):

The teacher becomes more of a coach or a mentor, helping students solve problems presented by the software. While edifying to some faculty, early adopters report that this role is much more challenging than lecturing or guiding a well structured discussion. (p. 27)

Instead of assuming the traditional role of being the expert, posing the problems, and knowing the answers ahead of time, the teacher helps students as they engage problems of their own choosing, or problems with varying solutions depending on the parameters set by the student. At any point, a variety of problems could be tackled in class, some of them unfamiliar to the instructor. This requires more subject-matter expertise and more skill in guiding students to derive appropriate conclusions from an activity. In some cases it requires a strong ego and a willingness to reveal ignorance. Most practicing teachers are very uncomfortable in such a role. But then, as Kozma and Johnston (1991) state, "... this only models what academia is all about: the search for knowledge" (p. 28).

Technological Training in Pre-Service Teacher Preparation Programs

Studies of technology usage suggest that advanced technology is not widely or substantially improving schools (Web-based Education Commission, 2000). One of the most prominent explanations for the low level of impact is that teachers do not feel well prepared to use technology effectively (Becker, 1999). Current in-service teachers are not well prepared to use technology, nor does it appear that the next generation is being adequately prepared to enter the profession as technology-using teachers. Ertmer (2003) points out that "... only 44% of new teachers (three or fewer years in the classroom) feel well prepared to use technology in their teaching" (p. 124). Moursund (1999) surmised, "In the past few years, teacher education programs have made substantial progress in preparing future teachers in information technology, but they still have a long way to go" (Introduction section, para 2).

Teacher education programs need new knowledge about the implications of their practices and the potential of reform efforts to better prepare teachers to use technology in their teaching. After studies revealed that most teacher preparation programs did not prepare their students to use technology in the classroom, the Department of Education funded the development of standards and recommendations on how colleges should prepare teachers to use technology. The National Educational Technology Standards (NETS) were compiled by the U.S. Department of Education and the Office of Educational Technology and released within the National Education Technology Plan in May 2005. These standards gave colleges of education a set of technology use benchmarks that pre-service candidates should reach on the road to teacher certification.

According to Lederman and Niess (2000), there are three ways technology is currently being incorporated into teacher education programs. The first way revolves around the teacher educator as the primary user of the technology. A second way prepares the teachers to be the primary user of the technology.
A third approach is to prepare the teachers to have their future students using the technology to investigate concepts and solve meaningful problems in the content areas. Teachers must not only become users of a tool, but also design usage of the tool by learners. They must, according to Wertsch (1998), be able to "take something that belongs to others and make it their own" (p. 53). Teaching practices that are consistent with constructivist thought involve helping learners internalize or reshape new information to make it their own. Berg, Lasley, Raisch and Daniel (1998) discovered:

Exemplary technology-using teachers are using technology in their classrooms in ways that are overwhelmingly constructivist. That is, the technology students used most frequently in the teachers' classrooms were research, writing, and desktop publishing. Students in these classrooms are using this commonly found technology as a tool to explore new information and produce new products. They are actively engaged in learning. Each one of these applications provides students opportunities to process new information, to transform it, and to "make it their own." (p. 122)

It is this type of training that future teachers need to experience if they are ever to be expected to go beyond rote drill-and-practice usage of available technologies. However, even at the pre-service level, it has been noted that students cling to a model that defines the kind of teacher they envision themselves being. Laffey (2004) noted that "many students struggle with the seeming incompatibility of the classroom they had always envisioned teaching in and their fear of having the computer come between them and the children they wanted to teach" (p. 71).

Laffey identifies three stances at the pre-service level: mastery, appropriation, and resistance. He defines mastery as know-how. Students at this stage know how to use technology and use it in ways that help them complete assignments, make presentations, or display and organize data. According to Laffey, appropriation of technology would be seen when students use the technology beyond regular expected coursework and assignments. Perhaps personal usage has increased; one may even see a shift toward planning for how to utilize the available equipment in future lessons with students. Laffey identifies resistance as inability or unwillingness to transfer the "capability to her own teaching practices. The explanation for resistance may come from the context, the tools, or most likely, the personal history of the individual" (p. 362). He suggests strategies of removing technological focus from a one-course model and shifting toward an infusion of technology into all education methods and content courses. This approach, however, requires a faculty that is experienced enough with the available equipment to model appropriate use of the technologies in their courses and to require the pre-service teachers to use it in their work. According to the Alliance for Childhood 2001 report:

There is little, if any, research on how university and college faculty come to appropriate technology in their teaching. Faculty must integrate technology into methods courses so that as the pre-service teachers are learning how to select appropriate learning goals, design meaningful lessons, and arrange necessary materials to accomplish the expected goals, the potential of technology to enhance the learning is considered. (para 3)
Moursund (1999) suggests that teacher educators frame the two roles the computer can play in schools: as a tool for the acquisition of knowledge and empirical facts, or as a tool for the development of children's thinking.

Given the importance that the teacher-child relationship has for early childhood education teachers, and the controversy about using technology with young children, teacher preparation programs may find it beneficial to frame teaching the use of technology as a way to mediate the expressions, performances, and activities valued for children. (Moursund, para 6)

Ultimately, the earlier pre-service teachers are exposed to appropriate technology usage, the more comfortable they will be with it and therefore the more likely to use it with their future students. All in all, pre-service teachers need help to plan for how to successfully implement and manage technology in their teaching, such as knowledge of support from peers, working with computer teachers or media specialists in schools, taking continuing education, or developing strategies to let children help other children. The final factor rests with cooperating teachers. Wang, Ertmer and Newby (2004) state:

Observing cooperating teachers using computers during the student teaching experience was one of the three most important factors that influenced feelings of preparedness for the use of computers for instruction in their own classrooms. Apparently, observing role models (in this case supervising teachers) favorably influenced the student teachers to perform similarly. (p. 232)

With this in mind, colleges and universities need to be more selective when placing their student teachers in order to provide this type of experience. It is quite clear that colleges of education will have to change their practices in preparing educators for the 21st century. More importantly, the culture of the colleges of education must change, so that technology becomes an important responsibility for every faculty member, staff person, student, and administrator. This is essential because "a curriculum cannot be considered in isolation from the culture in which it is to be implemented" (Schrum, Skeele & Grant, 2002/2003, p. 257).

Factors Influencing Practicing Teachers' Technology Usage

Limited Access

There is little debate regarding the need for teachers to integrate technology into their classrooms as well as provide practical technology experiences for their students. Unfortunately, the rapid expansion of technology in today's society has failed to affect learning in significant ways. According to Schrum et al. (2002/2003), "teachers cite many reasons for not using technology in their classrooms, including lack of training and support, lack of awareness of the instructional potential of technology, lack of time to integrate technology into the curriculum, and plain old fear" (p. 258). Wang et al. (2004) also point out that teachers' uses of computers are likely to be influenced by multiple factors, including the accessibility of hardware, technical support, teachers' belief in their capacity to work effectively with technology, and lack of encouragement from supervisors.

According to Hasselbring and Tulbert (2002), it is estimated that there are between 1.5 and 2.1 million computers in public schools alone. Although this represents a significant monetary investment, most schools still do not have the quantity of computers necessary to make them an integral part of the instructional program. "The number of computers in U.S. schools translates into approximately 1 computer for every 30 students. With this ratio, it is not possible for every student to be a computer user; furthermore, for those who are, it is estimated that they spend on average a little more than 1 hour per week on the computer" (Hasselbring & Tulbert, 2002, p. 34).
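Those two figures are consistent with simple arithmetic. As a rough illustration (the enrollment and weekly-hours figures below are assumptions made for the sake of the check, not numbers taken from Hasselbring and Tulbert), roughly 50 million public school students sharing about 1.7 million machines gives

$$\frac{5.0 \times 10^{7}\ \text{students}}{1.7 \times 10^{6}\ \text{computers}} \approx 29\ \text{students per computer},$$

and if a classroom computer is available for roughly 30 instructional hours in a school week, each of the approximately 30 students sharing it can average only about

$$\frac{30\ \text{hours per week}}{30\ \text{students per computer}} \approx 1\ \text{hour per student per week}.$$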
There is general agreement that computing technologies have not had a significant impact on teaching and learning in K-12 in the United States, even though billions of dollars have been spent purchasing, equipping, and supporting the technology. Pierson (2001) points out:

Some critics of school technology use this situation to push their position that technology is not appropriate for children. Others put the failure on the backs of the classroom teachers. However, according to a snapshot survey of schools around the country, the primary reason that technology has not had an impact on teaching and learning is that students have, for all intents and purposes, not actually used the technology. (p. 414)

She goes on to claim the primary reason for this nonuse to be lack of access to the technology: "Having one computer in the classroom is not access, nor will it lead to significant student use. Frankly, technology can't have an impact if children have not had the opportunity to access and use the technology" (p. 415). The snapshot survey conducted by Norris, Sullivan, Poirot and Soloway (2003) noted that one teacher in six had no computers in his or her classroom, and nearly two-thirds of respondents had no more than one computer to be shared among their entire classroom. Norris et al. stated that "Less than 5% of respondents had more than five classroom computers that were in working condition. In other words, teachers with no more than one classroom computer outnumbered teachers with six or more computers by a factor of 7 to 1" (p. 17). Norris et al. (2003) go on to state, "Almost without exception, the single most significant predictor of technology use is the number of working classroom computers" (p. 16). Also significant, but less markedly so, are teachers' use of the Internet at school, the availability of curricular software, and the availability of adequate technical support to maintain the operational status of computers and networks. Simply stated, teachers cannot use what they do not have, or what does not work.

Adequate Training

Most practicing teachers also report not having adequate training in how to use the various technologies available to them within their classrooms. According to Hasselbring, only one-third of all teachers in grades K-12 have had as much as 10 hours of computer training. Many of the courses required of them in their undergraduate coursework dealt with the mechanics or operational side of the technology and less with the methods, pedagogy, and procedures that could be used to integrate technology across subject areas.
Professional development for computer technology needs to be ongoing, tied to student learning, focused on individual and organizational goals, driven by a long-term plan, and planned collaboratively by those who will participate in it. (p. 233) Because technology is a dynamic innovation, learning to use it as a personal or instructional tool requires a willingness to make mistakes and learn from them and an ability to take risks. Becker (1994) noted that exemplary technology-using teachers not only spent a good deal of personal time working with computers, but also had more extensive computer training and teaching experience as well as high levels of innovativeness and confidence. Pierson (2001) noted, ?These teachers were surrounded by colleagues who used computers for meaningful activities, enjoyed school and district level support for technology use, and had sufficient staff development opportunities? (p. 416). Perhaps in this case, the biggest barrier to technology use is time: time for training, time for teachers to try out new technologies in their classrooms, and time to talk to other teachers about technology. Teacher educators and administrators should not only provide 25 extensive training on educational technology, but also should facilitate the dispositions of openness to change and commitment to teaching improvement. That commitment must begin with the acknowledgement that a significant amount of time is needed throughout multiple school years; when a positive plan for implementation is in place, obtainable goals are established, and strong administrative support is present. Administrative Support The business realm and society as a whole have embraced computer technology and allowed it to reinvent the ways in which we create, find, exchange, and even think about information. School districts have found that they are no longer able to ignore such a deeply permeating innovation. As such, many school districts bow to societal pressure and fund technology without having a thoughtful plan for implementation. Pierson (2001) explains: This lack of foresight leaves an evident disparity between instances of classroom technology use, with teachers who are attempting innovative integration ideas sprinkled throughout a selection of users and nonusers. As a result, any success is found in isolated pockets where administrative support has been strong. (p. 413) Yet despite the increase in access to new technologies, schools are not sufficiently stocked, powered, or wired. O?Neil noted in 1995, ?About one-half of the computers in schools are older 8-bit models incapable of supporting advanced applications, such as CD-ROM or network integration? (p. 10). Sadly, those numbers have changed very little in the past fifteen years. Today?s schools do not have the older 8-bit models, but many still house and attempt to maintain outdated models for which parts can no longer be purchased (Redish, et al., 2003/2004). 26 Too many administrators are uninformed and uninvolved in the role technology plays in their schools. Many administrators have little firsthand experience with technology yet find they face the daunting task of guiding their schools through the change process. This fact manifests itself as Dawson and Rakes (2003) state, ?a principal who does not understand how to use technology makes very poor decisions, spends a lot of money on unnecessary things, or does not provide appropriate supplies or troubleshooting support when needed? (p. 32). 
Hence, according to Vannatta and Fordham (2004), "Administrators in all settings and at all levels play key roles in establishing either 'change' or 'maintenance' cultures within their educational systems" (p. 259). Dawson and Rakes (2003) point out that if teachers are to make the necessary adjustments in their teaching methods to accommodate the employment of technology, they need patience and support from school administrators:

The principal is a key facilitator in the effort to infuse technology into the school; therefore, technology training for principals, as well as for teachers, should be a priority. No matter how much training teachers receive to prepare them for technology integration, most will not successfully employ that training without the leadership of the principal. (p. 30)

As far back as 1995, O'Neil noted, "If teachers aren't given more time to explore the uses of various technologies, and if the help they need in terms of training and administrative support and expectation isn't available, progress toward the vision held by technology supporters will always be slow" (p. 11).

Student Performance

The final barrier to a technological transformation is student assessment. When students truly use technology in meaningful ways, they demonstrate new outcomes, such as creative problem-solving strategies or heightened abilities to collaborate while performing tasks. According to Dwyer (1994), "... their teachers struggled with how to translate those demonstrations into quantitative measures that could be entered into grade books" (p. 6). Another concern revolves around the pressure teachers feel to prepare students for standardized achievement tests. Most teachers spend time preparing students using traditional text-based, lecture-recitation-seatwork instructional approaches. Many teachers who were surveyed believed a shift toward more technological projects within their classrooms would detract from test preparation time, thus causing their students to fall behind or score poorly on the required assessments.

A study highlighted by Dwyer examined a program called Apple Classrooms of Tomorrow (ACOT). His findings disprove this belief:

In the sites that implemented the new electronic medium in problem-solving, open-ended, constructivist ways, student attendance improved across all sites, student attitude toward self and learning showed progress, and test scores indicated that, at the very least, students were doing as well as they might have without all of the technology, and some were clearly performing better. (p. 5)

He goes on to verify that analysis of scores at technological sites showed no significant increase or decrease, even though students were spending far less time on the standard curriculum as they developed more technology-related skills. He also found:

ACOT students wrote more, more effectively, and with greater fluidity. Teachers also found that their students finished whole units of study far more quickly than in past years. In one instance, a class completed the 6th grade math curriculum by the beginning of April, creating a quandary of what to do for math for the remainder of the year. In other words, student productivity increased. (p. 8)

Today's teachers report heightened pressure to teach to the test in light of programs such as No Child Left Behind, merit pay, and Adequate Yearly Progress lists.
With the shift in education seemingly moving toward quantity and away from quality, the true benefits of technology may never be fully realized.

Best Practices in Regard to Effective Technology Integration

Once teachers see the positive growth that can occur through the integration of technology, how then should they go about facilitating these necessary changes? Meaningful use of technology in schools goes far beyond simply dropping technology into classrooms. The greatest advances in the test schools occurred in classes where teachers were beginning to achieve a balance between the appropriate use of direct instruction strategies and collaborative, inquiry-driven knowledge-construction strategies. In those classes, Dwyer (1994) points out:

Children were seen as learners and expert resources; students were challenged by problems that were complex and open-ended. In assessing students' work, teachers looked for evidence of deeper understanding: statements of relationships, synthesis, and generalization of ideas to new domains. And, of course, students had opportunities to use a variety of tools to acquire, explore, and express ideas. (p. 9)

There must be a complete transition from one school of thought to another in order for this to take place. Both Dwyer (1994) and Zemelman, Daniels, and Hyde (1998) acknowledge that the focus should shift from teacher-centered, didactic activities to learner-centered, interactive activities. Teachers should transition from being the fact teller and subject area expert to being a collaborator and even, at times, a learner. The student role should move from simply being a listener and always being the learner to being a collaborator and sometimes being the expert. Instructional emphasis should shift away from memorization of facts and toward discovering relationships through inquiry and invention. Demonstration of success should no longer focus on the quantity of information but on the quality of understanding. Teachers should move from norm-referenced, multiple-choice assessments to criterion-referenced portfolios and performance-based authentic assessments. Finally, the use of technology should no longer be seen as drill and practice or simple word processing, but as a tool to enhance communication skills, collaboration, information access, and expression.

According to Marzano, Pickering, and Pollock (2001), there are nine strategies considered safe ways to get started: 1) examining similarities and differences, 2) summarizing and note taking, 3) creating and participating in self-assessment, 4) homework and practice settings, 5) nonlinguistic representations and presentations, 6) cooperative learning, 7) reinforcing students' metacognition, 8) generating and testing hypotheses, and 9) cues and advance organizers. Marzano et al. urge teachers to:

Start with generalized skills that can connect with most any state's broad scope and sequence or curriculum standards; demonstrating applicable uses for the skill, and how the computer can help make the task easier for the teacher and more meaningful for the student, is the first step toward facilitating any change. (p. 73)

Similarly, Zemelman, Daniels, and Hyde, in their 1998 book Best Practices, re-affirm the constructivist notion that "no one can be told that change is going to be good for them. Instead they must be placed into a situation where the necessity for the change or its direct application to them and their world become increasingly obvious" (p. 119).
The simple yet effective strategies and suggestions offered by the teams of Marzano et al. (2001) and Brabec, Fisher, and Pitler (2004) fall into that category quite well. Yet it all boils down to the simple fact that "... teachers are more likely to change and use computer technology if they are involved in discovering and testing how it can improve student achievement" (Royer, 2002, p. 234).

What, then, can and should be done to ensure not only that schools are equipped with the technologies students need to experience in order to gain the skills to succeed in the business world, but also that teachers are capable, ready, and willing to integrate those technologies into the existing curriculum? How should effective classroom use of technology take shape? Whitaker (1995) describes how this very thing was accomplished in the Tucson, Arizona, Unified School District. According to her report, it all stemmed from community demand and involvement. Local businesses receiving applicants from the school district approached the curriculum coordinators with a complaint: the school system's graduates did not possess the simple technology skills needed to perform basic job duties. Although the local companies and businesses wished to hire locally, they found an increasing need to look elsewhere for properly trained people. The superintendent worked with the local businesses to transform the curriculum by adding a fourth R: readiness for the world of work. Their plan took over five years from start to finish, but all involved have reported positive reactions from students, parents, teachers, and, of course, the community. Their experience points out a few critical areas to consider:

1) Include teachers in every aspect of the decision-making process.
2) Don't buy anything that looks or sounds flashy; examine the budget and the curriculum, and have a clear plan for how a purchase will be used before it is made.
3) Give all software, hardware, and peripheral purchasing the same weight attributed to textbook adoptions.
4) Don't stint on training; it should be ongoing, easy, readily available, and applicable.
5) If it's broken, fix it, and be quick about it. (Whitaker, 1995, pp. 8-12)

Sometimes the simple facts need to be stated in more official ways before they are taken seriously.

Along the same lines, Brabec, Fisher, and Pitler (2004) examined the number of ways single technology applications can be used to address different instructional strategies. They see most teachers viewing a program as an end rather than as a tool for reaching the end. They give examples of using word processing programs to create and use assessment rubrics and graphic organizers or to summarize articles and reading passages; they challenge conventional uses of familiar products and provide unique and motivating uses for them with students in classroom settings. The results indicate higher levels of on-task behavior, increased proficiency and retention, and, surprisingly, higher achievement test scores:

Teachers trained in these methods report it "easy to return to school; there was nothing to buy, all the software had been on my computer all along." Others reported a complete shift in behavior from students considered severe behavior problems prior to taking the course.
One teacher said, "Once given the freedom to create on and use the computer in these ways, perhaps they had more respect for me because I trusted them with this valuable equipment; whatever the reason, the behavior improved dramatically." (p. 10)

Their point was that the focus should be on lesson planning and unit preparation. Once teachers focus on content and classroom strategies, attention can then shift to ways in which technologies can enhance the lessons. Brabec et al. (2004) noted, "Building lessons on a solid, research-based foundation of effective strategies, adding appropriate technologies, and consistently applying those strategies should help ensure high-quality instruction that has the potential of maximizing student achievement" (p. 17).

The fact that Whitaker (1995) felt the need to list what may seem obvious points reiterates the earlier notion that the factors preventing teachers from using the available equipment must be addressed if change is to occur on any level, or if educators are expected to progress to Naisbitt's (1982) third stage of technology use. What stands out most about Whitaker's situation is that the expectation started with local businesses; the demand came from the outside. The expectations of change and increased use of the equipment came from the top, the superintendent. However, the teachers were included in all aspects of decision-making. The technology was not viewed as an added course or something extra. The notion here seemed to be pursuing the best and most efficient way to implement the technology while continuing to teach the expected state and local curriculum standards. The training that was provided focused on uses, strategies, and new ideas for using equipment in new and unique ways within walls already constructed. The last and most important factor came in the form of support. It goes without saying that if something is broken, it cannot be used.

Summary

It is not enough to acquire the technology. The technology must be used in appropriate ways to deliver powerful instruction. Simply placing powerful technology in the hands of teachers is not enough. Pre-service and in-service training must become a priority if schools are to have teachers who are both comfortable and competent with respect to the use of technology in their teaching. Training teachers to use technology effectively has unique requirements that distinguish it from traditional training activities. Most obviously, teachers need well-equipped facilities and an environment that allows them to explore and master the technology. Instructors for these activities must appreciate teachers' special concerns regarding computers. Moreover, training should be conducted over years, not days, with ongoing front-line technical support while teachers are practicing what they have learned during training. As Pierson (2001) stated:

Our society does not simply need teachers who know how to use computers. We need exemplary teachers who know how to effectively use all the tools at their disposal for the learning and benefit of students. According to the proposed definition of technology integration, technology in the hands of a merely adequate teacher will lack the experienced and thoughtful motivation necessary to embed it within a context of sound teaching practice. Conversely, technology in the hands of an exemplary teacher will not necessarily result in integrated and meaningful use.
Unless a teacher views technology use as an integral part of the learning process, it will remain a peripheral ancillary to his or her teaching. True integration can only be understood as the intersection of multiple types of teacher knowledge and, therefore, is likely as rare as expertise. Educational leaders would be well served to look beyond mere technology purchases and focus efforts instead on creating environments that are conducive to continued growth in pedagogy as well as in technology use. (p. 430)

Chapter III will explain the methodology, participants, and data collection procedures used in this study.

III. METHODOLOGY

Introduction

According to a 1998 Newsweek article (author unknown) entitled Technology Times and Trends, it took sixteen years from the time the personal computer was invented for it to reach one quarter of the United States population. That is almost half the time it took television to do so and roughly a third of the time it took electricity. The same article goes on to say that today's teens get nearly 50% of their information from video sources such as television, video games, the Internet, the World Wide Web, CD-ROMs, DVDs, and other media. Other common predictions for future use of technology include the following: 75% of all books will be published on-line by the year 2007; by 2008, computers will be capable of voice and handwriting recognition; by 2015, factory jobs will comprise less than 10% of the work force; and within the next ten years, the world's access to new information will double every six months.

In an age in which a new technological innovation is introduced every few months, how are classroom teachers stepping up to the challenge of preparing today's children for tomorrow's world of work? This study examined two groups of educators and their experiences in a course designed to equip them with the tools, strategies, capability, and experience to take that first step. On the first day of an InTech course, the two groups completed a survey that examined frequency and type of technology usage in three areas: personal usage, teaching applications, and subject area integration. The same survey was completed approximately twelve weeks later, on the last day of the course. Six months after completion of the InTech course, participants responded to a follow-up survey. The results gathered by this instrument were analyzed statistically in an effort to find out which of the two groups benefited more from the course, as evidenced by sustaining the more significant change in overall technology usage over the extended period of time. A mixed-model analysis of variance was used to analyze the data, and paired t-tests were used as follow-ups examining between-group interactions and change over time.

Research Questions

The research was guided by the following two questions:

1. Which group of InTech-trained educators (in-service teachers or pre-service teachers) will maintain the most significant pedagogical change in regard to technology use over an extended period of time?

2. Which of three areas of technology usage (personal use, teaching applications, or subject area integration) will sustain the most significant change between the two groups?

Background Information

InTech is a rainforest-themed technology course designed around constructivist principles. It was created by Dr. Traci Redish as part of her PhD program in 1993.
The course has been adopted by the state of Georgia, and sections pertaining specifically to middle grades and secondary educators have been added since the 1995 implementation date. The course is offered through InTech-certified sites throughout the state and is a requirement for the renewal of teaching certification or for obtaining an initial teaching certificate. In order to become InTech certified, one must complete the full 50 hours of the course as a participant and then complete a co-teaching assignment under the supervision of InTech-certified staff for another 50 hours. Upon successful completion of those two requirements, the applicant must complete 150 hours of solo teaching within the next year. This researcher participated in the initial InTech training course in fall of 2003. The co-teaching and solo-teaching components followed during spring of 2004 and into fall of 2004. The final phase and certification were completed in November of 2004.

Beginning in 2004, the state of Georgia bowed to complaints from the numerous educators faced with the prospect of losing their certificated status and began allowing participants to take a test through which they could exempt the InTech course altogether. This proposal met with heavy criticism from those who had worked so hard to establish the course as a requirement and from participants who had already completed the course. While the technology test may demonstrate knowledge of how to use technology, it does nothing to test the participant's ability to use that technology within the context of lessons, course content, and curricular areas, or to manage the use of the technology appropriately with large groups of children. Bibb County in Macon, Georgia, is one of the few counties that have chosen not to accept the state-approved test-out option. The county has mandated that all of its teachers take the full 50 hours of the InTech course to renew or obtain teaching certification. This requirement is currently in place to provide technology training to educators who may have graduated during a time when such courses were not offered as part of their initial teacher preparation programs.

Beginning in spring of 2006, all college programs that offer teacher preparatory courses and seek Georgia Professional Standards Commission (PSC) approval must provide exiting candidates with technology training equivalent to that obtained within the state-approved InTech course. It was this situation that sparked the idea for this study. Given the two groups that could receive the InTech training, which of the two (practicing in-service teachers or pre-service education majors) would not only benefit more from the course, but also sustain its implementation in the most significant ways after the course had ended?

Participants

Two groups of educators participated in this study. The sampling procedure was purposeful, and each participant fell distinctly into one of the two groups. All participants were either practicing teachers who signed up to take the InTech course as an in-service option in spring of 2004 (Group A) or pre-service senior education majors at Wesleyan College (Group B) who took the InTech course as part of their education degree requirements in fall of 2004. The groups were not randomly determined; they occurred naturally as a result of each participant's educational status, training, and current need. Group A originally consisted of 20 in-service teachers.
These educators signed up to take the InTech course taught by the researcher at Macon State College beginning in February of 2004. The participants ranged in age from 23 to 65. The majority had ten or more years of teaching experience and took the InTech course because it was needed to renew their Georgia State Teaching Certificates. Group B consisted of 12 pre-service senior-level education majors at Wesleyan College. These students took the InTech course taught by the researcher beginning in August of 2004. The participants ranged in age from 18 to 33. They were all senior-level students with no teaching experience, about to enter their full-time student teaching practicum. They took the InTech course because it was required to obtain an initial teaching certificate in the state of Georgia.

Treatment of Participants

Both groups experienced identical courses in methods of instruction, day-to-day material, delivery, projects, and assignments. The course was created around a rainforest theme and taught in a constructivist, hands-on way in which participants were actively involved in the lessons and in the use of the technology. Participants were placed in situations similar to those that should be used with their own students, allowing them the experience of actually using the technology to solve problems and create unique projects with common themes. On the first day of the course, each participant was read a statement (Appendix B) taken directly from the IRB Letter of Consent (Appendix C). The course proceeded as normal for approximately twelve weeks, meeting once weekly for approximately four hours each time, resulting in fifty hours of training time.

Throughout the course, participants experienced constructivist-based, integrated technology activities; presented research on the benefits of using technology with students; planned, taught, and assessed four technology-connected lessons; maintained journals; and created an electronic portfolio of completed assignments and projects. The course culminated in group presentations highlighting the five critical areas addressed throughout the InTech course:

1. Use of modern technologies. The focus of the course was to model the use of technologies not in a separated way, but as a tool used to enhance and facilitate higher-level learning and thinking within the content areas. The course focused on all areas of technology, including, but not limited to, software, the Internet, hardware, and multimedia applications.

2. Classroom management. One area frequently listed among the top five reasons for not using technology with a class is the inability to manage the potential chaos or to control students. The InTech course modeled a variety of management techniques that could work within a large computer lab setting as well as in a small, one-computer classroom. The course introduced a new management technique each day and placed participants in that setting, thereby allowing them to experience its effectiveness (good and bad) first hand. In alignment with the aims of the course, the participants were not told what was good and what was bad; they constructed that evaluation on their own through first-hand experience.

3. Curriculum standards. Another reason many teachers list for not utilizing technology with students is that it does not fit in with the curriculum they are expected to teach.
A major part of the InTech course allowed participants to look at curriculum standards that were currently in place and devise or construct alternate ways to address that content. The class was set up in an integrated-unit fashion in which the participants were actively involved in a rainforest unit. This unit had been carefully planned to coincide with 1st- through 5th-grade standards in Writing, Reading, Science, Math, and Social Studies. As teachers worked through the unit as part of the class, the realization of how the unit addressed those standards slowly developed. One goal of the course was for them to go back and do the same with students in their classrooms.

4. Enhanced pedagogical practice. Many teachers in the classroom today did not receive adequate training in the use of technology with children. Even those who are technologically proficient often do not feel comfortable doing anything more than allowing children to play games on the computer as part of a technology-connected lesson. Once again, modeling and immersion came into play as part of the InTech course. The instructor modeled and facilitated a true workshop-style, project-based, integrated approach to teaching, all the while utilizing the available technology as a tool to assist in accomplishing real tasks that had purpose and meaning. Participants were required to plan, teach, and assess four lessons similar in style and nature.

5. New designs for teaching and learning. The InTech course was presented in a format that was unique and new for most participants. Most participants were not accustomed to working in cooperative groups to complete a task. Rotation stations were established throughout the course, in which one participant was trained and became an expert on certain equipment, areas, or information; other participants then rotated through as the experts instructed them on vital points and concepts. Workshop scenarios were maintained when writing or reading course assignments and content. Participants broke into partner editing groups and article discussion groups and worked with each other to revise, edit, and interpret course materials. For many participants, this was their first exposure to these techniques. The lessons were designed so that the technology would not be the focus of the lesson, but rather a means used to complete the lesson or goal.

Participants were asked to highlight real-life examples of the implementation of these five critical areas within their own classrooms and/or schools throughout the duration of the course.

Procedures

Upon taking the course in the spring of 2004, the participants in Group A completed the survey instrument twice. The first thing they did on day one was complete a frequency-of-use survey (Appendix D) regarding their actual implementation or use of the variety of technologies and strategies to be utilized throughout the InTech course. The participants completed the same survey approximately three months later, on the last day of the course. The same group of in-service teachers was contacted again six months later and asked to complete the same survey. The pre-service group (Group B) took the InTech course in fall of 2004, and the same procedures were followed with this group. They were asked to complete the frequency-of-usage survey (Appendix B) on the first day of the course, completed the same form approximately three months later on the last day of the course, and were contacted six months later and asked to complete the same survey.
Permission was granted by both Macon State College in Macon, Georgia (Appendix D), and Wesleyan College in Macon, Georgia (Appendix E), to use the data obtained as part of both InTech courses in this study. Institutional Review Board (IRB) approval was obtained in July of 2004 (Appendix C). Once IRB approval was received, the data that had been collected as part of Group A's regular InTech course beginning in February of 2004 were obtained from the records on file at Macon State and copied for the purposes of this study.

Data Collection

Data collection for this study began in August of 2004, and all pertinent data had been obtained and coded by the end of April of 2005. Data were obtained from the Macon State archives in August of 2004, after IRB approval was granted to use pre-existing data from February and April of 2004. All data collected with this survey were coded with a group letter (A or B) and an assigned number; Group A used numbers 1-20, and Group B used numbers 1-12. Data collected on site were obtained directly by the researcher and stored in a locked filing cabinet in an office on campus at Wesleyan College in Macon, Georgia. Data collected six months after completion of the InTech course were obtained via United States Postal Service mail or an internal pony mail system. Both participating groups received the reminder letter (Appendix F) along with a self-addressed, stamped return envelope, as well as a 100 Grand candy bar as a thank-you for participation. Nineteen of the original twenty members of Group A returned their surveys. One participant quit teaching during that time frame; she returned her survey with a letter explaining her situation. All 12 members of Group B returned their surveys.

Instrumentation

The primary data collection instrument used in this study was the InTech survey (Appendix A). This was a frequency-of-usage survey set up on a 6-point Likert scale. The survey was arranged so that a high score indicated the most frequent usage of the technology in the week that had just ended: a score of 0 indicated no usage, whereas a score of 6 indicated that usage occurred more than once a day. The instrument contained 19 items, each falling into one of three distinct categories: personal use, teaching applications, and subject area integration. Dr. Traci Redish created this instrument during her dissertation study. It has content-related validity because it was created after completion of a literature review focusing on types of technology usage in classrooms and after multiple observations of classroom teachers using technology in classroom settings. Faculty teams were formed to list items to be included. A pilot administration of the instrument was conducted during fall of 1993, and a detailed item analysis was conducted in order to determine whether any items were less effective than others. Items were then reviewed and rewritten. A second pilot administration was conducted during spring of 1995. The version resulting from the spring 1995 pilot test is what is now used on the first and last day of each state-approved InTech course taught within the state of Georgia. The instrument is currently undergoing construct-related validation as the number of times it has been used increases.

Analysis of Data

A mixed-model analysis of variance (ANOVA) design with one between-subjects variable (Group) and one within-subjects variable (Time) was used.
Specifically, this analysis allowed examination of data between different groups (in-service and pre-service teachers) over time. A mixed model was used for each of the three areas measured on the InTech survey: personal usage, teaching applications, and subject area integration. Instances in which a single variable completely explains a phenomenon or difference are rare; the mixed-model design therefore allowed each factor to be tested while controlling for the others, making it more statistically powerful. Chapter IV will reveal the findings of the study.

IV. FINDINGS OF THE STUDY

Overview and Analysis

This study examined the effect of a course (InTech) on two groups' usage of technology in three distinct areas: personal use, teaching applications, and subject area integration. The goal of this study was to determine which of the two groups, in-service practicing classroom teachers (Group A) or pre-service senior education majors (Group B), would achieve and maintain the most significant change in technology usage over a six-month period of time. The overriding purpose was to determine the most appropriate and meaningful point during teacher candidate training at which to implement technology courses in order to achieve lasting and meaningful results.

The data collected for this study were quantitative in nature. They were analyzed using a mixed-model ANOVA, followed up by paired t-tests and repeated-measures analyses of the three specific areas. The dependent variable in this study was technology use by the two groups, measured by a frequency-of-usage survey administered at three points throughout the study. The InTech frequency-of-use survey (Appendix A) was administered on the first day of the InTech course, three months later on the last day of the InTech course, and again six months after the last day of the course. The instrument asked participants to rate the frequency with which they had used technology in their classrooms during the week that had just ended in three distinct areas: personal use, teaching applications, and subject area integration. Thus, the effect of the treatment, the InTech course, was measured by its effect on participants' frequency of technology usage. The dependent variables in this study were therefore the participants' frequencies of using technology in each of those three areas. These variables depended on two distinct independent variables: the passage of time and membership in one of the two identified groups. This chapter will analyze statistically the results obtained pertinent to these areas.

Instrument Reliability

The primary data collection instrument used in this study was the InTech survey (Appendix A), a frequency-of-usage survey set up on a 6-point Likert scale. The survey was arranged so that a high score indicated the most frequent usage of the technology in the week that had just ended: a score of 0 indicated no usage, whereas a score of 6 indicated that usage occurred more than once a day. The instrument contained 19 questions that fell into one of three distinct categories: personal use, teaching applications, and subject area integration. Dr. Traci Redish created this instrument during her dissertation study; its content-related validity was described in detail in Chapter III.
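To make the scoring concrete, the following is a minimal sketch, written in Python with the pandas library, of how responses to a survey of this kind could be totaled into the three subscale scores. The data frame and the column names (item_1 through item_19) are hypothetical illustrations, not the study's actual data; the item groupings follow the breakdown shown in Table 1 below.

    import pandas as pd

    # Hypothetical responses: one row per participant, one column per survey item,
    # each cell holding a 0-6 frequency rating for the week that had just ended.
    responses = pd.DataFrame(
        [[3, 4, 2, 1, 0, 1, 0, 2, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 2],
         [5, 5, 3, 2, 1, 2, 1, 3, 2, 1, 1, 0, 0, 2, 1, 0, 1, 2, 3]],
        columns=[f"item_{i}" for i in range(1, 20)],
    )

    # Item groupings per Table 1: items 1-4 measure personal use, items 5-13
    # measure teaching applications, and items 14-19 measure subject area integration.
    subscales = {
        "personal_use": range(1, 5),
        "teaching_applications": range(5, 14),
        "subject_area_integration": range(14, 20),
    }

    # Total each subscale; personal use can therefore range from 0 to 24,
    # teaching applications from 0 to 54, and subject area integration from 0 to 36.
    for name, items in subscales.items():
        responses[name] = responses[[f"item_{i}" for i in items]].sum(axis=1)

    print(responses[list(subscales)])

These subscale ranges are the same ones referenced later in the analysis of each area.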
Table 1 displays the 19 questions on the InTech survey and places each question distinctly within one of the three technology usage areas: personal use, teaching applications, or subject area integration.

Table 1
InTech Survey Item Breakdown

Item # | Question | Area
1 | Personal record keeping, communication, or documentation | Personal
2 | Send or receive information regarding your job via e-mail | Personal
3 | Use a computer to plan a lesson | Personal
4 | Use a computer to assist in the implementation of a lesson | Personal
5 | Use a projection device for a computer in your classroom | Teaching Application
6 | Plan and teach a technology-related lesson | Teaching Application
7 | Utilize multi-media technology in the presentation of a lesson | Teaching Application
8 | Use technology as a tool as you presented or taught a lesson | Teaching Application
9 | Take students to the computer lab for a lesson (taught by you, not free game time) | Teaching Application
10 | Allow students access to the computer for research | Teaching Application
11 | Allow students access to the computer to prepare projects or complete assignments | Teaching Application
12 | Implement involved multi-media projects | Teaching Application
13 | Encourage students to apply technological knowledge to create multi-media projects | Teaching Application
14 | Integrate any form of technology in the teaching of Reading | Subject Area Integration
15 | Integrate any form of technology in the teaching of Mathematics | Subject Area Integration
16 | Integrate any form of technology in the teaching of Social Studies | Subject Area Integration
17 | Integrate any form of technology in the teaching of Science | Subject Area Integration
18 | Integrate any form of technology in the teaching of Language Arts | Subject Area Integration
19 | Integrate any form of technology in classroom or time management | Subject Area Integration

It should be noted that items 1-4 dealt specifically with personal use of technology, items 5-13 focused on use of technology in teaching applications, and the last six, items 14-19, addressed the integration of technology into specific content or subject areas. Most experts note that the more items a scale contains to measure a particular area, the more reliable the measure is likely to be (Guilford & Fruchter, 1978; Sirkin, 1995). Therefore, the areas addressing personal use and subject area integration were likely to be the weakest of the three, due to the small number of items on the survey that addressed those specific areas.

The internal consistency of the instrument was tested for the purposes of this study.

Internal consistency estimates reliability in terms of how consistent the actual items are within the instrument. In other words, if an evaluation instrument is designed to measure some content area, then the items that comprise the overall instrument should all be consistent with each other. They should measure the same content and therefore be highly consistent with each other. (Shannon & Davenport, 2000, p. 120)

A Cronbach's alpha reliability test was run on each area of the instrument, and on the instrument as a whole, in order to check for internal consistency. The Cronbach's alpha coefficient is a measure of the squared correlation between observed scores and true scores; reliability is measured in terms of the ratio of true-score variance to observed-score variance. The relationship between the true score and the observed score should be strong, and this test examined that relationship.
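For readers who wish to reproduce this kind of check, here is a minimal sketch of the computation in Python with pandas, using a small made-up response matrix rather than the study's data. It computes Cronbach's alpha from the standard variance-based formula, along with the alpha-if-item-deleted values reported in the breakdown tables that follow.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
        k = items.shape[1]
        return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

    def alpha_if_deleted(items: pd.DataFrame) -> pd.Series:
        # Recompute alpha with each item dropped in turn. A value that rises above
        # the full-scale alpha flags an item that correlates weakly with the rest.
        return pd.Series({c: cronbach_alpha(items.drop(columns=c)) for c in items.columns})

    # Hypothetical 0-6 ratings from five participants on the four personal-use items.
    personal = pd.DataFrame({
        "item_1": [3, 5, 2, 6, 4],
        "item_2": [4, 5, 1, 6, 3],
        "item_3": [2, 4, 1, 5, 3],
        "item_4": [1, 3, 0, 4, 2],
    })

    print(f"alpha = {cronbach_alpha(personal):.3f}")
    print(alpha_if_deleted(personal).round(3))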
An alpha score close to 1 indicates a more reliable instrument. According to Nunnally (1978), there is no universally agreed-upon cut-off, but scores of .7 and above are usually considered acceptable.

Table 2 summarizes the results for the questions pertaining to personal computer use across all three time intervals.

Table 2
Reliability Results for Personal Use

Area | Cronbach's Alpha Score
Personal Pre- | .769
Personal Post- | .617
Personal 6-months | .759

Using the acceptable cut-off of .7 specified by Nunnally (1978), two of the three administrations examining personal usage tested as reliable. Personal use at the pre-collection interval was the most reliable, at .769; personal use at 6 months was reliable at .759. The least reliable of the three was personal use at the post-collection interval, whose Cronbach's alpha score of .617, compared with the acceptable cut-off of .7, indicated this to be the weakest administration of this particular section of the instrument. The same instrument was administered at all three points with no changes to any part of it. In an effort to determine why one administration would show evidence of lower reliability, a reliability test analyzing each specific item addressing personal computer use was performed. Table 3 analyzes personal use by specific question.

Table 3
Personal Use by Question Breakdown

Question # | Scale Mean if Item Deleted | Scale Variance if Item Deleted | Corrected Item-Total Correlation | Cronbach's Alpha if Item Deleted
1-pre | 6.219 | 21.725 | .5169 | .7628
2-pre | 6.406 | 24.572 | .5303 | .7355
3-pre | 8.000 | 23.807 | .6588 | .6693
4-pre | 9.000 | 27.097 | .6512 | .6952
1-post | 12.937 | 11.544 | .424 | .535
2-post | 13.063 | 12.577 | .271 | .627
3-post | 13.781 | 9.789 | .521 | .451
4-post | 14.688 | 8.673 | .411 | .553
1-6 mo | 11.742 | 24.865 | .578 | .717
2-6 mo | 12.226 | 23.447 | .478 | .742
3-6 mo | 12.839 | 17.873 | .606 | .677
4-6 mo | 13.613 | 16.378 | .652 | .650

In examining Table 3, the corrected item-total correlations should be strong and positive. Almost all of the items listed here showed moderate to strong correlations and fell within the positive range. The closer a correlation is to 1, the more consistent the item is with the other items and, therefore, the more reliably it measures what is intended. Items 3 and 4 showed consistently moderate to strong correlations throughout all three collection intervals. A score closer to 0 than to 1 indicates a lack of reliability. Item 2 at the post-collection interval obtained a score of .271, indicating low correlation with the rest of the items. This low correlation was not a pattern consistent across all three collection intervals; therefore, it was determined that the item added to the overall reliability of the instrument and should not be removed from the survey.

The pattern becomes somewhat easier to see when also looking at the Cronbach's alpha-if-item-deleted scores, which show where items began helping or hurting the overall score. The number reported in the final column indicates what would happen to the overall reliability of this particular instrument if the item or question were deleted altogether. An item that strongly influences this number, moving it well below or above the total alpha score reported in Table 2, is worth a closer look. If removal of an item causes the total alpha score to drop below the total reported alpha, the item is contributing to the scale's internal consistency and should remain in the instrument for future use.
Conversely, if removal of an item causes the total alpha score to rise above the total reported alpha, the item correlates weakly with the rest of the scale and should be examined to determine whether it needs to remain in the instrument for future use. After an examination of the reported numbers, no single question stood out as consistently strong and reliable across all three collection intervals; by the same token, no single question stood out as having a consistently negative impact on the scores. Item 2 at the post-collection interval would actually raise the reported alpha score from .617 to .627, but even that increase is not enough to bring the score into the acceptable range. Personal use of technology, as addressed by this instrument, was by far the weakest area and quite possibly should be revisited if the instrument is to be used again in the future.

Table 4 summarizes the reliability results across all three collection intervals for the use of technology in teaching applications.

Table 4
Reliability Results for Teaching Applications

Area | Cronbach's Alpha Score
Teach. App. Pre- | .934
Teach. App. Post- | .909
Teach. App. 6-months | .951

Unlike what was reported for personal use, the section of the InTech survey addressing the use of technology for teaching applications indicated strong reliability across all three collection intervals. The number of items specifically targeting this area was higher: nine questions addressed this area, compared with four for personal use. The alpha scores, compared with the acceptable cut-off of .7, indicate strong reliability across all intervals. A specific breakdown by question is provided in Table 5.

Table 5
Teaching Application by Question Breakdown

Question # | Scale Mean if Item Deleted | Scale Variance if Item Deleted | Corrected Item-Total Correlation | Cronbach's Alpha if Item Deleted
5-pre | 6.0313 | 111.612 | .495 | .943
6-pre | 5.9688 | 100.612 | .751 | .927
7-pre | 6.2188 | 106.628 | .809 | .923
8-pre | 6.0000 | 98.581 | .905 | .916
9-pre | 5.7500 | 110.516 | .535 | .940
10-pre | 6.0313 | 99.580 | .943 | .914
11-pre | 6.0625 | 99.867 | .928 | .915
12-pre | 6.4688 | 115.483 | .722 | .931
13-pre | 6.2188 | 105.531 | .835 | .922
5-post | 18.4063 | 104.120 | .831 | .888
6-post | 18.4375 | 104.706 | .910 | .882
7-post | 18.2500 | 109.548 | .761 | .894
8-post | 18.4063 | 105.475 | .874 | .885
9-post | 19.3125 | 118.609 | .651 | .902
10-post | 19.4063 | 113.797 | .664 | .901
11-post | 19.6563 | 114.426 | .656 | .902
12-post | 19.9375 | 137.093 | .305 | .919
13-post | 20.1875 | 131.060 | .505 | .911
5-6 mo | 16.6452 | 182.237 | .825 | .945
6-6 mo | 16.5806 | 181.985 | .943 | .938
7-6 mo | 16.2581 | 179.731 | .899 | .940
8-6 mo | 16.3226 | 178.759 | .924 | .939
9-6 mo | 16.0645 | 217.996 | .296 | .969
10-6 mo | 17.0000 | 190.467 | .815 | .945
11-6 mo | 16.9677 | 186.099 | .900 | .940
12-6 mo | 17.6774 | 200.159 | .851 | .945
13-6 mo | 17.4516 | 191.923 | .865 | .943

In an examination of Table 5 at the pre-collection interval, only two questions, numbers 5 and 9, stood out as items whose removal would have raised the originally reported pre-collection alpha score of .934. The change was very slight, and the two questions correlated only moderately with the rest, indicating that they were somewhat weaker than the other items in the section but not weak enough to harm its overall reliability. Question 9 would also raise the alpha slightly at the 6-month collection interval, from .951 to .969. Questions 12 and 13 showed the same pattern, but at the post-collection interval.
Question 12 would move the alpha from .909 to .919, and question 13 would move it to .911. No other question on the teaching applications section of the instrument showed a notable impact, positive or negative, on the overall alpha score across the three collection intervals. This finding indicated that the section addressing teaching applications was indeed reliable and consistently addressed the area it was intended to target.

Table 6 summarizes the reliability results across all three collection intervals for integration of technology into specific content subject areas.

Table 6
Reliability Results for Integration of Technology into Subject Areas

Area | Cronbach's Alpha Score
Subject Area Int. Pre- | .906
Subject Area Int. Post- | .909
Subject Area Int. 6-months | .922

Similar to what was seen for use of technology in teaching applications, the section of the InTech survey that addressed integration of technology into specific content-related subject areas indicated strong reliability across all three collection intervals. The number of items that specifically targeted this area was slightly smaller than the number addressing teaching applications, yet slightly larger than the number examining personal usage: six questions addressed this area, compared with four for personal use and nine for teaching applications. The alpha scores, compared with the acceptable cut-off of .7, showed strong reliability across all intervals. This indicated that, like the questions targeting teaching applications, the questions on the InTech survey targeting the integration of technology into specific content subject areas were reliable and consistent across all three collection intervals. A specific breakdown by question is provided in Table 7.

Table 7
Integration of Technology into Subject Area by Question Breakdown

Question # | Scale Mean if Item Deleted | Scale Variance if Item Deleted | Corrected Item-Total Correlation | Cronbach's Alpha if Item Deleted
14-pre | 3.1875 | 30.028 | .840 | .879
15-pre | 3.2188 | 30.241 | .846 | .879
16-pre | 3.0938 | 27.830 | .752 | .889
17-pre | 3.1250 | 27.790 | .753 | .889
18-pre | 3.1875 | 29.899 | .853 | .877
19-pre | 3.0938 | 29.959 | .542 | .925
14-post | 10.8750 | 39.145 | .773 | .890
15-post | 10.8125 | 36.738 | .921 | .868
16-post | 10.5313 | 42.644 | .670 | .904
17-post | 10.6250 | 39.790 | .859 | .881
18-post | 10.3438 | 35.910 | .790 | .887
19-post | 8.6875 | 38.351 | .583 | .925
14-6 mo | 10.9355 | 81.262 | .839 | .900
15-6 mo | 10.6774 | 83.426 | .771 | .909
16-6 mo | 10.8065 | 82.428 | .821 | .902
17-6 mo | 10.8065 | 81.161 | .836 | .900
18-6 mo | 10.7097 | 80.213 | .811 | .903
19-6 mo | 10.7419 | 84.331 | .613 | .933

Examining the data broken down by individual question, only one question, number 19, stood out as an item whose removal would raise the originally reported alpha scores, and it would do so across all three intervals: at the pre-collection interval, from .906 to .925; at the post-collection interval, from .909 to .925; and at the 6-month collection interval, from .922 to .933. The changes noted were very slight, and the question itself still correlated moderately with the rest. This pattern suggested that question 19 correlated less strongly with the other items than they did with one another, though not so weakly as to warrant its removal from the instrument.
No other question on the subject area integration section of the instrument showed a notable impact, positive or negative, on the overall alpha score across the three collection intervals. This result indicated that the section addressing subject area integration was indeed reliable and consistently addressed the area it was intended to target.

Three final reliability tests were run: one on the entire survey at each of the three collection intervals. The results are displayed in Table 8.

Table 8
Reliability Results for Pre-, Post-, and 6-Months

Area | Cronbach's Alpha Score
Pre- All | .922
Post- All | .942
6-month All | .963

Examining the entire instrument's reliability at all three data collection intervals revealed strong results. A comparison of these alpha scores to the acceptable cut-off of .7 showed that all three were well above it. The instrument at the pre-collection interval was the lowest, with an alpha score of .922, yet it remained well above the .7 threshold and therefore still demonstrated strong correlation across all questions and strong overall reliability for the instrument as a whole. A breakdown of each item at each interval was conducted, yet none of the items revealed changes to the overall reported alpha scores substantial enough to be discussed or displayed for the purposes of this study; the question breakdowns in the preceding sections addressed all the items that showed even minor changes on these final three tests.

Overall, the instrument showed some slight reliability issues at the post-collection interval in regard to personal use of technology. While that area was worthy of concern, it did not form a pattern, as the alpha scores for that particular section remained strong at both the previous and the subsequent data collections. The final collection at the 6-month interval yielded strong results and eased some of the concern about the reliability of this instrument. In all other areas addressed, and at all other collection intervals, the InTech survey produced strong alpha scores, indicating high correlation and strong internal consistency. The results discussed here support the conclusion that this was a highly reliable instrument.

Analysis of Course Effect

The basic goal of variance component estimation is to estimate the population covariation between random factors and the dependent variable. The population variances of the random factors can also be estimated, and significance tests can be performed to test whether the population covariation between the random factors and the dependent variable is nonzero. The analysis of variance (ANOVA) method provides an integrative approach to estimating variance components, because ANOVA techniques can be used to estimate the variance of random factors, to estimate the components of variance in the dependent variable attributable to the random factors, and to test whether the variance components differ significantly from zero. In this study, a mixed-model ANOVA was most appropriate. According to Shannon and Davenport (2000):

A mixed-model ANOVA is best to use in a pretest and posttest experimental design to determine the extent to which the treatment [the InTech course] had an influence over the subject's performance over time. In some cases, an additional follow-up may be used after a period of time [6 months] to determine the extent to which the treatment has continued to have an impact. (p. 273)
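As an illustration only (the original analysis was run in a standard statistics package), a mixed design of this kind could be reproduced with the pingouin library in Python, assuming the scored survey data are arranged in long format; the column names and values below are hypothetical.

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format data: one row per participant per collection
    # interval, with a between-subjects factor (group) and a within-subjects
    # factor (time).
    df = pd.DataFrame({
        "id":    [i for i in range(1, 7) for _ in range(3)],
        "group": ["A"] * 9 + ["B"] * 9,
        "time":  ["pre", "post", "6mo"] * 6,
        "score": [12, 19, 16, 10, 18, 15, 14, 20, 17,
                  6, 16, 18, 7, 17, 18, 5, 15, 19],
    })

    # Mixed-design ANOVA: one F test for the between-group effect, one for the
    # within-subjects time effect, and one for the group-by-time interaction,
    # mirroring the structure of Table 9.
    aov = pg.mixed_anova(data=df, dv="score", within="time",
                         between="group", subject="id")
    print(aov[["Source", "MS", "F", "p-unc"]])

The Source column of the output separates the group effect, the time effect, and their interaction, corresponding to the three F tests reported below.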
The results of the mixed-model ANOVA yielded three F tests: (1) between groups, (2) across time, and (3) for the interaction effect. Overall, the two groups were not statistically different in terms of their use of technology, but both groups did increase their use over time. In other words, the average use of technology for personal use, teaching applications, or subject area integration did not vary significantly by group.

Table 9
Summary of Mixed-Model ANOVA

Dependent Variable | Between-Group MS | Between-Group F | Time MS | Time F | Group by Time MS | Group by Time F
Personal Use | 110.98 | 2.21 | 646.40 | 35.82*** | 92.01 | 5.10**
Teaching Application | 187.70 | 1.73 | 788.13 | 18.74*** | 284.19 | 6.76**
Subject Area Integration | 1034.26 | 3.83 | 1744.70 | 17.85*** | 593.90 | 6.08**

*p < .05. **p < .01. ***p < .001.

The change over time in all three cases proved to be statistically significant, as did the combination of the passage of time and grouping (the interaction effect). These data indicated that the two independent variables together, belonging to a distinct group (A or B) and the passage of time, played a significant role in determining a participant's frequency of technology use in all three areas. These data would also indicate that the course had a significant effect on both groups. The InTech course took place within the time frame from the pre- to the post-collection intervals; because of this, any significant changes noted during that time frame can most likely be attributed to the effect of the InTech course itself. Any changes from the post- to the 6-month collection intervals would indicate a maintenance effect on the part of the individual participant. The passage of time, whether from the first day of the course to the last, from the last day to 6 months later, or from the first day to 6 months later, was the most significant independent variable noted. The combination of group membership (A or B) and the passage of time was the second most significant. Simply belonging to either group (A or B) was not, by itself, a significant variable in the usage of technology. If lower MS values indicated more significant effects, then the higher values would indicate less significant interactions; by that reasoning, while the significant change in personal use could be attributed to group membership, the passage of time, or both, the change in subject area integration could not be attributed as cleanly. While these data revealed significant interactions between the independent variables (group and time), further examination was necessary to determine the differences between the groups at specific points in time. Table 10 displays the data describing technology use within the groups at the three data collection points of the study.

Table 10
Descriptive Summary of Technology Use, Mean (SD)

Area | Group | Pre- | Post- | 6 months
Personal Use | Group A (n = 19) | 11.89 (6.35) | 18.89 (4.49) | 16.11 (6.58)
Personal Use | Group B (n = 12) | 6.33 (5.14) | 16.25 (3.22) | 17.58 (4.56)
Teaching App. | Group A (n = 19) | 7.58 (12.17) | 28.16 (10.96) | 19.05 (16.64)
Teaching App. | Group B (n = 12) | 4.33 (10.75) | 11.08 (5.26) | 18.83 (13.84)
Subj. Area Int. | Group A (n = 19) | 4.00 (7.73) | 15.74 (6.54) | 12.26 (11.42)
Subj. Area Int. | Group B (n = 12) | 2.83 (3.74) | 5.92 (2.78) | 14.50 (10.22)

Table 10 displays the means and standard deviations (SD) for each group in each area at each data collection point throughout the study.
The means displayed represent a statistical average across the instrument for each group within each area. An examination of the mean scores in Table 10 shows that Group A increased significantly from the pre- to the post-collection interval across all areas. That level, however, was not maintained longitudinally: in all three areas, Group A's mean scores dropped after the course ended. Group B, on the other hand, showed significant and steady increases in means at all three collection intervals and in all three areas. While Group B's gains across time may not look as dramatic as those seen in Group A from pre- to post-, their scores nevertheless increased and continued to do so after the course had ended. Figure 1 displays the total means for each group across each collection interval in a simpler line graph format.

[Figure 1. Between Group Comparison Chart: line graph of total mean technology use (y-axis 0-70) for Group A and Group B at the pre-, post-, and 6-month collection intervals.]

Figure 1 takes the numbers from Table 10 and displays them in a format that easily shows the significance of the change over time between the two groups. It can be seen from this chart that the two groups began the study at the pre-collection interval very similar in overall frequency of technology use in all three areas. In fact, Group A was higher by only about 10 data points, indicating use of computers across all areas one to two times more per week than Group B. The most significant difference between the two groups was seen at the post-collection interval, where Group A rose to approximately 30 data points higher than Group B by the end of the InTech course. This rise signifies use of the technology across all three areas about four to five times more per week by Group A than by Group B. The data collected at the 6-month interval once again showed that the two groups were not statistically different in regard to their overall technology usage. Group A dropped below Group B for the first time in the study, but only by a tiny margin: the difference between the two groups at the 6-month collection interval was a matter of 2 to 3 data points, indicating no meaningfully greater usage by Group B, even though Group B's total score was higher than Group A's.

What is significant about this graph is the trend in usage it reveals. Both groups benefited from participating in the course, as their frequency of usage rose from the pre- to the post-collection interval. However, while Group B continued to grow and maintained the effects of the InTech course, Group A actually dropped at the 6-month collection interval, indicating a lack of maintenance on their part. In short, the effects of the course were strong in both groups; however, Group B showed signs of maintaining, and possibly increasing, future usage should the trend seen here continue. The following sections analyze specific group performance in each area and at each level.

Analysis of Personal Computer Use

Looking specifically at the InTech survey, the highest score an individual item could obtain was a 6. The instrument had 19 questions, four of which addressed personal use of technology; therefore, the range of possible scores for personal use began at 0 and could reach as high as 24. Figure 2 displays this information in line graph format.

[Figure 2. Personal Use of Technology Chart: line graph of mean personal-use scores (y-axis 0-20) by collection interval. Group A: 11.89 pre-, 18.89 post-, 16.11 at 6 months; Group B: 6.33 pre-, 16.25 post-, 17.58 at 6 months.]
Examining Figure 2, the information displayed in Table 10 becomes even clearer. Group A, the practicing in-service teachers, began the InTech course using technology for personal reasons at an average of 11.89 out of a possible 24. This score indicates that, as a group, they used technology for personal reasons approximately once or twice in a typical week. Their usage rose significantly to 18.89 (3 to 4 times) due to the course, yet dropped to 16.11 (2 to 3 times) once the course ended. Though not significantly higher, the score at the 6-month collection point was still higher than when this group began the study. Conversely, Group B, the collegiate pre-service students, began the InTech course using technology for personal reasons at an average of 6.33 out of a possible 24. This score indicates that they used technology for personal reasons once or not at all in a typical week. This number jumped significantly to 16.25 (2 times) due to the course and continued to increase to 17.58 (3 times) at the 6-month point. Although the difference between the post- and the 6-month points was not significant, the difference from the pre- to both the post- and the 6-month points certainly was.

Paired t-tests were run for each group separately to further examine the personal usage of technology across the three time periods. Paired t-tests work well for a pre- and post-test design because they make two types of comparisons: they compare two scores within the same group, such as before and after a specific treatment (in this study, the InTech course and the passage of time), and they can also compare two related samples on the same dependent variable in a matched-pairs design. Table 11 displays the resulting statistical data.

Table 11
Paired t-Test for Personal Use

Paired Sample                        Mean      Std. Dev.   Std. Error   Lower       Upper      t       df   Sig. (2-tailed)
Group A Personal pre- to post-      -7.100     4.745       1.061        -9.320      -4.879     -6.69   19   < .001
Group A Personal post- to 6-month    2.789     6.276       1.439         -.236       5.815      1.93   18   .069
Group B Personal pre- to post-      -9.917     5.567       1.607       -13.454      -6.379     -6.17   11   < .001
Group B Personal post- to 6-month   -1.333     5.710       1.648        -4.961       2.295     -.809   11   .436

Note. Lower and Upper bound the 95% confidence interval of the difference.

The confidence interval of the difference is listed at 95% for this test, meaning that intervals constructed in this way would be expected to contain the true difference 95 times out of 100. The mean difference and the t score for the first pair for Group A (personal usage at the pre- and post-intervals) were negative, reflecting that scores at the post-interval exceeded those at the pre-interval. The same was noted for Group B from pre- to post-. The significance for both groups was < .001, indicating results this extreme would be expected by chance alone less than once in 1,000 samples. The indication here is that the difference in personal usage of computers from the pre- to the post-intervals was significant for both groups and due to something other than chance; quite possibly, the InTech course had a significant effect on the two groups. The difference in personal usage from the post- to the 6-month intervals was not significant (.069 for Group A, .436 for Group B). These data would indicate that the change in frequency of usage from the post- to the 6-month intervals could have been due to chance rather than explained by group membership, the course, or the passage of time.
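To make the mechanics of these paired comparisons concrete, the sketch below (Python, hypothetical data; not the study's analysis code) runs a paired t-test with scipy and computes the 95% confidence interval of the mean difference, the same quantities reported in Table 11. A negative mean difference arises whenever post scores exceed pre scores, exactly as in the table above.

    # Sketch only: hypothetical pre and post scores for the same participants.
    import numpy as np
    from scipy import stats

    pre  = np.array([12, 10, 6, 14, 9, 11, 15, 8, 13, 10], dtype=float)
    post = np.array([19, 18, 16, 20, 17, 18, 22, 15, 21, 18], dtype=float)

    t_stat, p_value = stats.ttest_rel(pre, post)    # paired (dependent) t-test

    diff = pre - post                               # negative: post exceeded pre
    se = diff.std(ddof=1) / np.sqrt(len(diff))      # SE of the mean difference
    margin = stats.t.ppf(0.975, df=len(diff) - 1) * se
    lower, upper = diff.mean() - margin, diff.mean() + margin

    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, "
          f"95% CI [{lower:.3f}, {upper:.3f}]")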
Analysis of Use of Computers in Teaching Applications

In analyzing the use of technology in teaching applications, it is necessary once more to look specifically at the InTech survey. The highest score an individual item could obtain was a 6, and the instrument had 19 questions. Nine of these questions specifically targeted teaching applications, meaning that for teaching applications the range of summed scores began at 0 and could reach as high as 54. Figure 3 displays this information in line graph format.

[Figure 3. Use of Technology in Teaching Applications Chart: line graph of teaching application means across the pre-, post-, and 6-month intervals (Group A: 7.58, 28.16, 19.05; Group B: 4.33, 11.08, 18.83).]

While examining Figure 3, some interesting trends can be noted. Group A began the course using technology for specific teaching applications at an average score of 7.58. This number means they used technology to teach or support the teaching of lessons approximately once a week or not at all. This number rose significantly as a result of the course: at the post-interval, Group A reported an increase in use to an average of 28.16, reflecting the use of technology to teach or support the teaching of lessons approximately 4 to 5 times a week. In a similar fashion to personal usage, Group A's average dropped once the course ended, to 19.05, indicating a consistent usage of technology to teach or support the teaching of lessons approximately 3 times in a typical week. This change remained significant across all time frames for this particular group.

Group B began the study at the pre-interval with an average teaching-application score of 4.33. This number indicated that this group opted most often not to use any form of technology when teaching lessons prior to beginning the course. At the post-interval, this group's average rose to 11.08, signifying an increase in usage from none to once or twice in a typical week. Remaining consistent with this group's personal usage, their 6-month interval scores continued to climb. The reported average of 18.83 at the 6-month interval points to the use of technology to teach or support the teaching of lessons at least 3 times within a typical week. While the mean scores for Group A and Group B were not significantly different at the 6-month interval, it is easy to see from the chart that changes due to the course did occur. The two groups happened to end up after 6 months at approximately the same level of technology use in regard to teaching applications; however, the paths each group took to get there differed quite significantly.

Paired t-tests were run to further examine the use of technology in teaching applications across the three time periods. As noted earlier, paired t-tests work well for a pre- and post-test design because they compare two scores within the same group, such as before and after a specific treatment (in this study, the InTech course and the passage of time), and can also compare two related samples on the same dependent variable in a matched-pairs design. Table 12 displays the resulting statistical data.
Table 12
Paired t-Test for Teaching Application

Paired Sample                             Mean      Std. Dev.   Std. Error   Lower       Upper      t      df   Sig. (2-tailed)
Group A Teaching App. pre- to post-      -20.30     10.588      2.368       -25.256     -15.344    -8.6    19   < .001
Group A Teaching App. post- to 6-month     9.10     15.308      3.512         1.727      16.483     2.6    18   .018
Group B Teaching App. pre- to post-       -6.75     10.524      3.038       -13.437       -.063    -2.2    11   .048
Group B Teaching App. post- to 6-month    -7.75     14.772      4.264       -17.136       1.635    -1.8    11   .096

Note. Lower and Upper bound the 95% confidence interval of the difference.

The confidence interval of the difference is again listed at 95% for this test. The mean difference and the t score for the first pair for both groups (teaching applications pre- to post-) were once again negative, reflecting higher scores at the post-interval. The significance value for Group A was < .001, indicating results this extreme would be expected by chance alone less than once in 1,000 samples; Group B reported a significance value of .048. The indication here was that the difference in frequency of technology use in teaching applications from the pre- to the post-intervals was significant for both groups and most likely due to some factor other than chance. From the post- to the 6-month intervals, the pattern differed by group: Group A's change (a decrease) was significant at the .018 level, whereas Group B's continued increase was not significant (.096). According to this table, Group A changed significantly over both pairs of t-tests, evidence that these changes were due to something other than chance; in the case of this study, that would most likely be the InTech course itself. In other words, both groups increased significantly from the pre- to the post-collection intervals; however, Group A decreased significantly from the post-interval to the 6-month follow-up, whereas Group B did not change significantly over that span.

Analysis of Integration of Technology into Subject Areas

In analyzing the use of technology in subject area integration, it becomes necessary once more to look at the InTech survey. The highest score an individual item could obtain was a 6, and the instrument had 19 questions. Six of these questions specifically targeted subject area integration, meaning that for subject area integration the range of summed scores began at 0 and could reach as high as 36. Figure 4 displays this information in line graph format.

[Figure 4. Use of Technology in Subject Area Integration Chart: line graph of subject area integration means across the pre-, post-, and 6-month intervals (Group A: 4.00, 15.74, 12.26; Group B: 2.83, 5.92, 14.50).]

By examining Figure 4, some interesting trends can be seen. Group A began the course integrating technology into specific subject areas at an average score of 4.00. This number means they integrated the use of technology into the teaching of specific content-related subject areas less than once a week or not at all. This number rose significantly as a result of the course: at the post-interval, Group A reported an increase in subject area integration to an average of 15.74, reflecting the integration of technology into specific content subject area courses or lessons approximately 2 to 3 times a week. Consistent with what was seen for both personal use and teaching applications, Group A's average dropped once the course ended, to 12.26.
This trend also indicated a consistent integration of technology into other subject areas approximately twice in a typical week. As in the other two areas of usage, this group's 6-month score nonetheless remained higher than its pre-course level.

Group B began the study at the pre-interval with an average subject area integration score of 2.83. This number indicated that this group opted not to integrate technology into the teaching of other subject areas at all. At the post-interval, this group's average rose to 5.92, signifying an increase in usage from not at all to at least once in a typical week. Remaining consistent with this group's technology usage in the other two areas, their 6-month interval scores continued to climb. The reported average of 14.50 at the 6-month interval indicated this group chose to integrate technology into the teaching of other subject areas at least 2 to 3 times within a typical week. The mean scores for Group A and Group B were not significantly different at the 6-month interval, yet it is easy to see from the chart that changes due to the course did occur. The two groups wound up once again after 6 months at approximately the same level of integration of technology into other subject areas; however, the paths each group took to get there, as before, differed quite significantly.

Yet again, paired t-tests were run, this time to further examine the integration of technology into subject areas across the three time periods. Table 13 displays the resulting statistical data.

Table 13
Paired t-Test for Subject Area Integration

Paired Sample                           Mean      Std. Dev.   Std. Error   Lower      Upper      t        df   Sig. (2-tailed)
Group A Subject Area pre- to post-     -11.60     5.062       1.132       -13.97      -9.231    -10.25    19   < .001
Group A Subject Area post- to 6-month    3.47     9.929       2.278        -1.31       8.259      1.53    18   .145
Group B Subject Area pre- to post-      -3.08     3.232        .933        -5.14      -1.029     -3.31    11   .007
Group B Subject Area post- to 6-month   -8.58    10.958       3.163       -15.55      -1.621     -2.71    11   .020

Note. Lower and Upper bound the 95% confidence interval of the difference.

The confidence interval of the difference is again listed at 95% for this test. The mean difference and the t score for the first pair (subject area integration pre- to post-) were, for both groups, once again negative, reflecting higher scores at the post-interval. The significance values (< .001 for Group A and .007 for Group B) indicated that results this extreme would rarely be expected by chance alone. It could be concluded that the change in integration of technology into specific subject areas from the pre- to the post-intervals was significant for both groups and most likely due to something other than chance. From the post- to the 6-month intervals, the change was not significant for Group A (.145) but was significant for Group B (.020). This result would indicate that Group A's drop over that span could have been due to chance, whereas Group B's continued increase most likely was not and is better explained by group membership, the course, or the passage of time.
Group Effect

Finally, a test of between-subjects effects was run in order to determine the effect that membership in one particular group may have had on each area of technology use over the duration of the study. Table 14 summarizes the personal use of technology between groups.

Table 14
Between-Subjects Effects for Personal Use of Technology

Dependent Variable    Type III Sum of Squares   df   Mean Square   F       Sig.
Personal pre          227.479                   1    227.479       6.490   .016
Personal post         51.444                    1    51.444        3.121   .088
Personal 6-month      16.068                    1    16.068        .462    .502

According to the numbers reported here, the most significant difference between Group A and Group B occurred on the first day of the course. The significance value of .016 was lower than the conventional .05 criterion, indicating that the two groups differed most in their usage of technology for personal use before the InTech course even began. The differences between the two groups at the post- and the 6-month intervals were not statistically large enough (both significance values were higher than the .05 cutoff) to attribute them to anything other than chance rather than to group membership. Therefore, it can be assumed that the course and membership in either Group A or Group B had only a slight statistical effect on a participant's frequency of using technology to complete personal tasks.

Table 15 analyzes the between-group differences in use of technology in teaching applications.

Table 15
Between-Subjects Effects for Use of Technology in Teaching Applications

Dependent Variable             Type III Sum of Squares   df   Mean Square   F        Sig.
Teaching Application pre       77.476                    1    77.476        .571     .456
Teaching Application post      2144.234                  1    2144.234      25.222   < .001
Teaching Application 6-month   .354                      1    .354          .001     .970

It was noted that the most significant difference between the two groups (A and B) in the use of technology in teaching a lesson occurred at the post-interval. That is, the biggest difference between the two groups in using technology to teach or assist in the teaching of a lesson was significant at the < .001 level at the post-collection interval, on the last day of the InTech course. The other two points were not significant at all: the pre-collection interval at the .456 level and the 6-month interval at the .970 level. These data indicated that being a member of either group (A or B) had the most significant impact on use of technology in teaching applications on the last day of the InTech course, quite possibly as a result of the course itself. Group membership did not play significantly into the frequency of using technology within teaching applications before the course began (unlike what was noted for personal use) or at the 6-month interval; any differences noted at those points could be attributed to chance and not to being a member of one of the two distinct groups within the study.
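These between-subjects tests amount to comparing the two groups separately at each collection interval. As a minimal sketch (Python, hypothetical data; not the study's analysis code), the same style of contrast can be run as a one-way ANOVA at each interval; with only two groups, the resulting F equals the square of the corresponding independent-samples t statistic, and the p values are read the same way as the Sig. columns above.

    # Sketch only: hypothetical scores for each group at each interval.
    import numpy as np
    from scipy import stats

    group_a = {"pre":  np.array([12., 10, 14,  9, 15]),
               "post": np.array([19., 18, 20, 17, 22]),
               "6mo":  np.array([16., 15, 17, 14, 18])}
    group_b = {"pre":  np.array([ 6.,  7,  5,  8,  6]),
               "post": np.array([16., 17, 15, 18, 16]),
               "6mo":  np.array([18., 17, 19, 16, 18])}

    # One between-groups F test per collection interval.
    for interval in ("pre", "post", "6mo"):
        f_stat, p_value = stats.f_oneway(group_a[interval], group_b[interval])
        print(f"{interval:>4}: F = {f_stat:.3f}, p = {p_value:.3f}")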
The final area to examine is the between-group analysis of subject area integration. Table 16 displays this information.

Table 16
Between-Subjects Effects for Integration of Technology Within Subject Areas

Dependent Variable          Type III Sum of Squares   df   Mean Square   F        Sig.
Subject Area Int. pre       10.011                    1    10.011        .236     .631
Subject Area Int. post      709.270                   1    709.270       24.068   < .001
Subject Area Int. 6-month   36.800                    1    36.800        .305     .585

Examining Table 16 revealed that the most significant difference between the two groups (A and B) in the integration of technology within subject areas occurred at the post-interval. In other words, the biggest difference between the two groups in integrating technology within specific subject areas was significant (at the < .001 level) at the post-collection interval, on the last day of the InTech course. The other two points were not significant at all: the pre-collection interval at the .631 level and the 6-month interval at the .585 level. These data indicated that being a member of either group (A or B) had the most significant impact on a participant's frequency of integrating technology into specific subject areas on the last day of the InTech course; similar to use of technology in teaching applications, this again could be a result of the course itself. Group membership did not play significantly into a participant's integration of technology into specific subject areas before the course began (unlike what was noted for personal use) or at the 6-month interval. Any differences noted at those points could be attributed to chance and not to being a member of one of the two distinct groups within the study.

Summary

Statistically, it can be seen that across all three scales, both groups increased significantly over time, due most likely to the InTech course itself. Membership in one of the two groups (A or B) by itself was not as significant as the combination of being within a particular group and the passage of time. According to the data summarized here, Group B grew gradually, and that trend was seen across all three data collection points, indicating its usage could grow even higher if tested again at another point in the future. Group A, however, did not maintain the effect of the course at the levels observed from the pre- to the post-collection intervals. Across all three areas, Group A increased from pre- to post-, possibly as a result of the InTech course; however, that increase was not maintained at the 6-month interval, where Group A showed drops in all three areas of technology usage. While the frequency of usage reported at the 6-month interval was still, in most cases, significantly higher than that reported at the pre-collection interval, the most significant drop for Group A was in the area pertaining to use of technology in teaching applications. Therefore, it can be noted that the most significant predictor of technology use during and after the treatment (the InTech course) in all areas of documented technology use (personal use, teaching applications, and subject area integration) was the combination of being distinctly within one of the two particular groups and the passage of time, or, in other words, the result of the course. Reasons behind these findings will be explored and explained in Chapter V.

V. CONCLUSIONS AND RECOMMENDATIONS

Introduction

The preceding chapter analyzed statistically the data obtained from this study. No interpretation, explanation, or discussion was given about why the outcomes described might have occurred. This chapter will take the results reported in Chapter IV and provide a more in-depth analysis based on knowledge of the circumstances behind the collection intervals, group dynamics, and other factors.
This section will also specifically target the statistical data needed to answer the two research questions that drove this study:

1. Which group of InTech-trained educators (in-service teachers or pre-service teachers) maintained the most significant pedagogical change in regard to technology use over an extended period of time?

2. Which of three areas of technology usage (personal use, teaching applications, or subject area integration) sustained the most significant change between the two groups?

Attention will be given to the long-reaching implications this study might have in the fields of pre-service teacher training as well as in-service training for practicing teachers. Many of the ideas and findings from the review of the literature will be supported and documented as a result of this analysis. The significance of this correspondence will also be addressed.

Research Question #1 Results

The primary question driving this study was: Which group of InTech-trained educators (in-service teachers or pre-service teachers) will maintain the most significant pedagogical change in regard to technology use over an extended period of time? In order to answer this question appropriately, it is necessary to look once again at a chart used in Chapter IV. Figure 1 portrays the comparison of technology usage in all three areas (personal use, teaching applications, and subject area integration) across all three data collection intervals for both groups (A and B).

[Figure 1. Between Group Comparison Chart, repeated from Chapter IV: line graph of total mean technology use for Groups A and B across the pre-, post-, and 6-month intervals.]

Upon further examination of Figure 1, it can be seen that Group A remained consistently higher in frequency of technology use from the pre- to the post-collection intervals. Group A dropped below Group B in frequency of usage at the 6-month collection interval, although admittedly not by much. In other words, the difference between the two groups at the beginning of the study (pre-) and at the end of the study (6 months) was not statistically significant. The largest difference between Group A and Group B in overall frequency of technology use occurred at the post-collection interval. The data portrayed here might suggest that the answer to the primary question would be that Group A gained the most from the InTech course. However, the question specifically asks which group maintained that change over the extended period (6 months) of time. In this case, a closer examination of the data revealed that Group B not only achieved but maintained significant gains throughout the duration of the study. Their usage of technology in all areas increased from the pre- to the post-collection intervals and, unlike their counterparts in Group A, continued to increase past the end of the InTech course and into the 6-month time frame. In fact, their steady rise across the collection intervals signifies a rising trend in this group's frequency of technology use, most likely as a direct result of taking the InTech course. Therefore, the answer to the question, "Which group of InTech-trained educators (in-service teachers or pre-service teachers) will maintain the most significant pedagogical change in regard to technology use over an extended period of time?" would have to be Group B.
It could be argued that the level of technology usage noted at the 6-month interval for Group A was significantly higher than it was prior to the InTech course, thus indicating that Group A not only benefited from the course but maintained that benefit over time. Unfortunately, this trend did not remain constant. The levels for Group A were observed to be dropping sharply in all areas, whereas the levels for Group B increased at every collection interval. It would be correct to note that the InTech course had a significant effect on Group A; that effect, however, was not maintained at the level reached upon completion of the course at the post-collection interval. Results such as these were anticipated for this group. Group A participants were required to use the technology as part of the InTech course: in order to obtain verification of completing the course, the participants in Group A had to plan, teach, and assess four technology-connected lessons. Therefore, the increase in the use of technology from the pre- to the post-collection intervals could be attributed directly to the requirements of the InTech course. The true pedagogical impact of the course can be found by looking at what Group A participants chose to do once that requirement was removed and they were back in their classrooms. Group A teachers still facilitated the use of the technology more than was observed prior to taking the InTech course, but not as frequently as was observed when they were required to do so.

Group B participants began the course using technology at extremely low levels. These participants not only increased their frequency of usage but also maintained a steady growth pattern across all three data collection intervals. Group B showed signs, based on the pattern of these data, that this trend of increasing frequency of technology usage could continue to rise over time. This trend may be due to the fact that these collegiate-level pre-service educators are in the beginning stages of determining their style of teaching and personal pedagogy. It was noted throughout the data collection intervals that the more they were placed into situations where they had the opportunity to plan and teach lessons on their own, the more they chose to facilitate the use of technology as a part of those lessons. Group B participants began the InTech course prior to the start of Wesleyan courses. The low numbers noted for the frequency of technology usage for the week that had just ended are logical, as these women were returning from summer vacations and had not used the technology. At the post-collection interval, timing once again played a significant role in the numbers portrayed. The end of the InTech course and the post-collection interval occurred during the week of final examinations. While Group B participants may have used technology for personal reasons in the week that had just ended, most were no longer in the lab schools, had completed the requirements for the InTech course much earlier in the semester, and were not using technology in teaching or integrating it into subject areas. Therefore, the true impact this course may have had on them can be seen at the 6-month collection interval. At this point, these participants were fully immersed in their full-time solo student teaching experiences.
In other words, the 6-month data collection interval occurred at the time when their supervising classroom teachers had turned control of the classroom over to them for three full weeks; the precise timing was the second week of their three-week solo experience. It was therefore logical to see the highest numbers at this interval, as this was the only time during the study that these participants had full access to implementing all areas addressed on the InTech survey. It can be determined, as a result of statistical analysis and these circumstances, that pre-service education majors will maintain the most significant pedagogical change in regard to technology use over an extended period of time.

Research Question #2

In order to answer question number 2 (Which of three areas of technology usage: personal use, teaching applications, or subject area integration will sustain the most significant change between the two groups?), it is necessary to break this section into three distinct parts, each addressing one of the areas listed within the secondary question for this study.

Personal Use

A chart used in Chapter IV to analyze personal use of technology is presented again here as Figure 2.

[Figure 2. Personal Use of Technology Chart, repeated from Chapter IV: line graph of personal use means across the pre-, post-, and 6-month intervals (Group A: 11.89, 18.89, 16.11; Group B: 6.33, 16.25, 17.58).]

According to Figure 2, Group B began this study using technology for personal reasons once or less than once in a typical week. Group A began the study using technology for personal reasons nearly twice as much: Group A's average use score of almost 12 signifies usage of technology for personal reasons at least twice in the week that had just ended. Across the three data collection intervals, the difference in personal technology usage was not significant between the two groups. The biggest and most significant spread occurred before the InTech course began, at the pre-collection interval. Both groups rose significantly at the post-collection interval as a result of the InTech course; both Group A and Group B increased their use of technology for personal reasons to approximately 2 to 3 times within a typical week. At the 6-month collection interval, both groups showed signs of strong maintenance at a level of 2 to 3 times a week. This trend is noteworthy for Group A despite the drop in its total mean, because that mean was still markedly higher at the end of the study than at the beginning. According to these numbers, Group A used technology for personal reasons more than Group B before the course began and maintained that advantage through the post-collection interval. It was not until the 6-month collection interval that Group B increased their use of technology for personal reasons to a level that surpassed that of Group A.

Analyzing the circumstances surrounding both groups revealed several significant reasons behind the reported results. It was initially surprising to see that a group of collegiate seniors did not use technology for personal reasons more than once in the week of the pre-collection interval. By examining the questions that addressed personal use on the InTech survey, the reason behind this low number can be more fully understood. Table 17 displays the questions from the InTech survey that addressed personal usage of technology.
Table 17
InTech Survey Personal Use Questions

Item   Question                                                      Category
1      Personal record keeping, communication, or documentation      Personal
2      Send or receive information regarding your job via e-mail     Personal
3      Use a computer to plan a lesson                               Personal
4      Use a computer to assist in the implementation of a lesson    Personal

As mentioned in the previous section, the pre-collection interval occurred prior to Wesleyan courses beginning and, consequently, prior to these students being placed in lab school settings to conduct field experience work. Therefore, there would have been no opportunity for Group B participants to use technology to maintain records, check e-mail, plan lessons, or implement lessons. Knowledge of this circumstance provides tremendous insight into the low frequency-of-usage numbers reported in this category. Conversely, the in-service teachers in Group A were in the classroom, working full time, throughout the duration of the InTech course. At no point during the study were they in a situation where they did not have access to a classroom computer or other equipment. It is therefore logical that Group A's frequency of personal technology use, as addressed by the questions in Table 17, would be higher than that of Group B. Similarly, at the post-collection interval, participants in Group B had, as before, completed their lab experience and were taking final examinations. Participants in Group A were still in their classrooms, yet had completed the course and had (as part of the course) used a computer template to type technology-connected lesson plans, sent and responded to e-mails pertaining to the course with the instructor, and even maintained an electronic portfolio, all for the purposes of completing the course. These situations were instrumental in understanding the significant rise observed in the frequency of usage from the pre- to the post-collection intervals in both groups, but most importantly in Group A.

Moving from the post- to the 6-month collection intervals, the two groups once again were not significantly different. Both groups did, however, show differences in the pattern and trend seen within the numbers reported. Group B participants showed a slight increase from the post- to the 6-month collection point: the difference from post- to 6-month was not significant, but the scores were slightly higher. The difference from the pre- to the 6-month interval was quite significant. This difference can be attributed directly to the effect of the course and to the change in the situation in which these participants were placed at each data collection point. Group B participants moved from a point where they had no reason to use the technology to a point where they were in the middle of a situation (full-time student teaching) that required them to facilitate the use of e-mail, electronic lesson planning, and classroom activities on a daily basis. Group A, however, did not show a strong level of maintenance past the post-collection interval. While their averages did rise significantly from the pre- to the post-collection interval, that trend was not maintained into the 6-month interval. In fact, the reported average at the 6-month interval had fallen below that reported at the post-collection point, remaining only somewhat higher than the pre-course level. This result means that while Group A participants, on average, used the technology for personal reasons more than Group B participants before the course began, their usage actually decreased once they were out of the course and no longer required to use it.
However, once the Group B participants were placed in a situation where they had the opportunity to use the technology more for personal reasons, they opted to do so more often than not.

Teaching Application

To assist in the examination of the data on use of technology for teaching applications, it is necessary once again to refer back to a chart first used in Chapter IV and presented here as Figure 3.

[Figure 3. Use of Technology in Teaching Applications Chart, repeated from Chapter IV: line graph of teaching application means across the pre-, post-, and 6-month intervals (Group A: 7.58, 28.16, 19.05; Group B: 4.33, 11.08, 18.83).]

An examination of the data represented in Figure 3 revealed a trend quite similar to that seen in personal usage. The two groups did not differ significantly at the pre- or the 6-month collection intervals. It was obvious, however, that Group A increased their usage of technology significantly from the pre- to the post-collection intervals. This increase was most assuredly attributable to the requirements of the course, in which the participants had to plan, teach, and assess four technology-connected lessons. These participants went from using technology in the teaching, or supporting the teaching, of specific lessons less than twice a week to more than four times a week, which corresponds directly to the number of lessons they were required to teach. This level of usage was not maintained once the course ended and those requirements were no longer in force. While the 6-month collection interval scores for Group A were still significantly higher than those reported at the pre-collection interval, they had dropped quite a bit from the post-collection point. This drop was most likely due to the fact that the course had ended, the participants were back in their classrooms, and the pressures of completing all that needed to be done took over. Most of these teachers still maintained an adequate average of technology usage in teaching their lessons even after the course ended. Would this remain constant? If the trend in the data noted here were to continue, the answer is probably not.

Group B participants demonstrated a slow and steady climb across all three data collection points. The participants in Group B reported using technology to teach or assist in the teaching of lessons less than once a week at the pre-collection interval. This average rose to approximately twice a week at the post-collection point and peaked at 3 to 4 times per week at the 6-month collection point; the frequency reported at the 6-month interval was roughly three times that reported at the pre-collection interval. This could be attributed to a combination of the results of the InTech course and the timing of the administration of the survey. As mentioned previously, the pre-collection point occurred during a time when these participants had no reason to utilize the technology for the purposes specified on the InTech survey. Table 18 displays the InTech survey questions that specifically addressed the use of technology to teach or support the teaching of specific lessons.
Table 18
InTech Survey Teaching Application Questions

Item   Question                                                                             Category
5      Use a projection device for a computer in your classroom                             Teaching Application
6      Plan and teach a technology-related lesson                                           Teaching Application
7      Utilize multi-media technology in the presentation of a lesson                       Teaching Application
8      Use technology as a tool as you presented or taught a lesson                         Teaching Application
9      Take students to the computer lab for a lesson (taught by you, not free game time)   Teaching Application
10     Allow students access to the computer for research                                   Teaching Application
11     Allow students access to the computer to prepare projects or complete assignments    Teaching Application
12     Implement involved multi-media projects                                              Teaching Application
13     Encourage students to apply technological knowledge to create multi-media projects   Teaching Application

It can be noted that most of the questions displayed in Table 18 required consistent access to students in order to be answered with high frequency numbers. The Group B participants did not have that level of access to students until the 6-month collection interval. These data support the notion that once these participants had the opportunity to use technology in teaching applications (and long after they had been required to do so), they opted, approximately 3 to 4 times a week, to facilitate the use of technology in the teaching of specific lessons.

Subject Area Integration

It is necessary to refer again to a chart previously used in Chapter IV to address this area. Figure 4 highlights the data pertaining to the integration of technology within specific content-related subject areas.

[Figure 4. Use of Technology in Subject Area Integration Chart, repeated from Chapter IV: line graph of subject area integration means across the pre-, post-, and 6-month intervals (Group A: 4.00, 15.74, 12.26; Group B: 2.83, 5.92, 14.50).]

The data displayed in Figure 4 signify perhaps the largest and most significant change in Group A's technology use across all three areas. Examining the integration of technology into specific content-related subjects showed that Group A reported this type of usage as once a week or less at the pre-collection interval. This number rose quite significantly, reflecting integration within subject areas roughly three times more often than before the InTech course began. While this outcome again could be attributed to the course requirements as described previously, it was interesting to note that this one usage area did not drop quite as drastically as the others once the course and its subsequent requirements ended. The numbers reported at the 6-month collection point indicated a consistent integration of technology into subject areas approximately 2 to 3 times in a week, a level that remained significantly higher than the numbers reported at the pre-collection point. Unlike in the other two areas, Group B did not show the same steady increase across collection intervals that had been seen before. In fact, the increase from the pre- to the post-collection interval, while statistically significant, was modest in size, reflecting integration of technology less than once a week at both points. The notable change for Group B came between the post- and the 6-month collection intervals, when the frequency of subject area integration rose from once a week to three times a week on average. This level of usage was seen well after the requirements of the course had passed. Once again, the time frame and requirements go a long way toward explaining the numbers reported.
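To make the scoring behind these subscale totals concrete, the sketch below (Python; illustrative names and values, not the instrument's actual scoring code) sums one participant's 0-6 item responses into the three area totals, following the item groupings shown in Table 17, Table 18, and Table 19 (below). With 4, 9, and 6 items respectively, the totals fall in the 0-24, 0-54, and 0-36 ranges described in Chapter IV.

    # Sketch only: item groupings per Tables 17-19; names are illustrative.
    PERSONAL    = range(1, 5)    # items 1-4   -> totals in 0-24
    TEACHING    = range(5, 14)   # items 5-13  -> totals in 0-54
    INTEGRATION = range(14, 20)  # items 14-19 -> totals in 0-36

    def subscale_totals(responses: dict[int, int]) -> dict[str, int]:
        """Sum one participant's 0-6 item responses into the three area totals."""
        assert all(0 <= v <= 6 for v in responses.values())
        return {
            "personal":    sum(responses[i] for i in PERSONAL),
            "teaching":    sum(responses[i] for i in TEACHING),
            "integration": sum(responses[i] for i in INTEGRATION),
        }

    # Example: a participant who answers 3 on every one of the 19 items.
    print(subscale_totals({i: 3 for i in range(1, 20)}))
    # -> {'personal': 12, 'teaching': 27, 'integration': 18}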
As noted previously, the pre- and post-collection points placed Group B participants in situations where they had no reason to integrate technology into the teaching of specific subject areas. As can be seen in Table 19, the questions from the InTech survey that addressed subject area integration were very specific.

Table 19
InTech Survey Subject Area Integration Questions

Item   Question                                                          Category
14     Integrate any form of technology in the teaching of Reading       Subject Area Integration
15     Integrate any form of technology in the teaching of Mathematics   Subject Area Integration
16     Integrate any form of technology in the teaching of Social Studies Subject Area Integration
17     Integrate any form of technology in the teaching of Science       Subject Area Integration
18     Integrate any form of technology in the teaching of Language Arts Subject Area Integration
19     Integrate any form of technology in classroom or time management  Subject Area Integration

In order to respond to the questions displayed in Table 19 with significant frequency, it would be necessary to have access to a classroom and to be in charge of the planning and implementation of the listed subject areas. For Group B, this level of access did not come until the 6-month collection point. It was interesting to note that once this particular group of participants had access to a classroom and was in charge of planning and implementing all content areas, their scores rose significantly. This rise would indicate the course had a strong effect on them. The line graph alone suggested little to no direct effect from the course, as the change for Group B from the pre- to the post-collection points was hardly significant. However, the jump from the pre- to the 6-month points was quite significant and remained consistent with the trend noted for the other two usage areas as well.

Question #2 Results

The final question that this study addressed was: Which of three areas of technology usage (personal use, teaching applications, or subject area integration) sustained the most significant change between the two groups? To determine the answer, it was necessary to look specifically at each area at the beginning of the study and again at the end of the study. Figure 5 displays this information in line graph format.

[Figure 5. Summary of Pre- and 6-Month Technology Use: graph of combined means (0-40) at the pre- and 6-month intervals for each type of use (personal use, teaching applications, subject area integration).]

Close examination of Figure 5 showed that the largest difference in mean score (average technology use) occurred in the area of teaching applications, where a spread of close to 30 data points was observed. This spread means that from the pre-collection interval to the 6-month collection interval, both groups increased their usage of technology to teach or assist in the teaching of lessons from approximately once or twice within a typical week to more than once a day. This change was quite significant when compared to the other two areas of technology use. Personal use had the smallest change. Both groups began the study using technology for personal reasons on an average of 2 to 3 times in a typical week. While this number rose to at least once a day for both groups, the overall difference between the pre- and the 6-month intervals was not as significant as that noted for teaching applications.
Finally, the integration of technology into specific content subject areas was also significant, yet not quite as significant as that noted for teaching applications. The graph shows that subject area integration began with both groups reporting use at an average of once or less in a typical week. That number rose significantly, to approximately 4 times in a week, at the 6-month collection point. While that increase was certainly significant, it did not reach the level noted for teaching applications. Therefore, based on the data noted here, the answer to the question of which of three areas of technology usage (personal use, teaching applications, or subject area integration) sustained the most significant change between the two groups would most certainly be the use of technology in teaching applications.

Conclusions

In conclusion, it can be noted that both groups benefited from the InTech course and made significant changes in pedagogical practice as a result. The best way to visualize this change is to see both groups' performance on all three areas side by side. Figure 6 displays the accumulation of the data for both groups in the three areas: personal use, teaching applications, and subject area integration.

[Figure 6. Summary of Group A and Group B Technology Use Chart: two side-by-side line graphs ("Summary of Group A Tech Use" and "Summary of Group B Tech Use") plotting each group's mean Personal, Teaching Application, and Subject Area scores across the pre-, post-, and 6-month intervals.]

Examining the two groups' data side by side, it is easy to see that the InTech course had quite a significant effect on both groups. In all areas, both groups showed significant gains from the pre- to the post-collection interval. This gain could be attributed directly to the InTech course and its subsequent requirements. The real picture of the overall impact of the course could be seen at the 6-month collection interval. Group A showed drops in all three technology usage areas from the initial gains that had been made at the post-collection interval. In two areas (personal use and subject area integration), the 6-month drop moved the participants back toward the level at which they began the study. This would indicate that Group A was not able to maintain the effects of the course over an extended period of time. Group B demonstrated steady and significant growth across all three collection intervals. While the impact of the course itself did not immediately register in the frequency of their technology usage, the strength of their numbers over time would indicate the impact was still significant. Unlike their Group A counterparts, Group B participants were able to maintain the effects over time, after the course and its subsequent requirements had ended.

It was also noted that the area of teaching applications sustained the most significant change between groups. This means that while both groups increased their overall frequency of technology usage as a result of the InTech course, the area that changed most significantly was the use of technology in the teaching of, or to assist in the teaching of, specific lessons. To see why the InTech focus primarily addressed the use of technologies for teaching applications, the five critical areas described in Chapter III are examined once again:
1. Use of modern technologies. The focus of the course was to model the use of technologies not in a separated way but as a tool used to enhance and facilitate higher-level learning and thinking within the content areas. The course addressed all areas of technology including, but not limited to, software, the Internet, hardware, and multimedia applications.

2. Classroom management. One area frequently listed among the top five reasons for not using technology with a class is the difficulty of managing the chaos or controlling students. The InTech course modeled a variety of management techniques that could work within a large computer lab setting as well as a small one-computer classroom. The course introduced a new management technique each day and placed participants in that setting, thereby allowing them to experience its effectiveness (good and bad) firsthand. In alignment with the aims of the course, the participants were not told what was good and what was bad; they constructed that evaluation on their own through firsthand experience.

3. Curriculum standards. Another reason many teachers list for not utilizing technology with students is that it does not fit in with the curriculum they are expected to teach. A major part of the InTech course allowed participants to look at curriculum standards that were currently in place and devise or construct alternate ways to address that content. The class was set up in an integrated-unit fashion in which the participants were actively involved in a rainforest unit. This unit had been carefully planned to coincide with 1st- through 5th-grade standards in Writing, Reading, Science, Math, and Social Studies. As teachers worked through the unit as part of the class, this realization slowly developed. One goal of the course was for them to go back and do the same with students in their classrooms.

4. Enhanced pedagogical practice. Many teachers in the classroom today did not receive adequate training in the use of technology with children. Even those who are technologically proficient often do not feel comfortable doing anything more than allowing children to play games on the computer as part of a technology-connected lesson. Once again, modeling and immersion came into play as part of the InTech course. The instructor modeled and facilitated a true workshop-style, project-based, integrated approach to teaching, all the while utilizing the available technology as a tool to assist in accomplishing real tasks that had purpose and meaning. Participants were required to plan, teach, and assess four lessons similar in style and nature.

5. New designs for teaching and learning. The InTech course was presented in a format that was unique and new for most participants. Most were not accustomed to working in cooperative groups to complete a task. Rotation stations were established throughout the course, where one participant was trained and became an expert on certain equipment, areas, or information; other participants then rotated through as the experts instructed them on vital points and concepts. Workshop scenarios were maintained when writing or reading course assignments and content. Participants broke into partner editing groups and article discussion groups and worked with each other to revise, edit, and interpret course materials. For many participants, this class was their first exposure to these techniques. The lessons were designed so that the technology would not be the focus of the lesson but a means to complete the lesson or goal.
It can be seen that the focus of the InTech course primarily addressed the use of technologies in teaching applications, and it would therefore be logical to see this particular area reporting the most significant change.

Implications and Suggestions

In-Service Factors

A final examination of all the accumulated data in comparison with the review of the literature brought a few major implications to light. While both groups of participants obviously benefited from the InTech course, a major concern arose from the significant drop observed in Group A's usage from the post- to the 6-month collection intervals. Looking back at the literature review, a couple of reasons for this come to light.

Equipment Availability

Norris et al. (2003) state, "Almost without exception, the single most significant predictor of technology use is the number of working classroom computers" (p. 16). Also significant, but less markedly so, are teachers' use of the Internet at school, the availability of curricular software, and the availability of adequate technical support to maintain the operational status of computers and networks. Simply stated, teachers cannot use what they do not have, or what does not work. Most of the participants in Group A frequently reported extreme frustration with the course requirements during class discussions, within journal entries, and through private conversations. Their general feelings revolved around the pressure of having to complete a project (in this case, teaching four technology-connected lessons) without the appropriate materials and equipment to do so. Many worked in situations not conducive to teaching technology-connected lessons and did so solely to complete the course. What should be done to remedy this situation? Spend the money wisely. If it is broken, fix it; if it needs replacing, do so; if someone does not have what is needed, find a way to provide it. Systems that expect or demand the use of technology should begin with those teachers who want to use the technology and then take every step needed to make it as easy as possible for them to do so.

Adequate Training

Royer (2002) reports:

    Many skills-based, one-shot sessions that help teachers learn how to make a web page, create an electronic concept map, or make a multimedia presentation are being offered. Teachers, however, need to understand how they can use it to develop student understandings and to support constructivism, cooperative learning, and problem-based learning. (p. 233)

While the InTech course was beneficial and did have a significant impact on the frequency of usage reported for this group, it is evident this trend did not remain consistent with the passage of time (6 months). As effective as the course may have been, it was still, as Royer (2002) stated, a "skills-based, one-shot session" and therefore not as likely to create any type of sustained pedagogical change. What should be done to remedy this situation? Royer again states it best:

    Professional development for computer technology needs to be ongoing, tied to student learning, focused on individual and organizational goals, driven by a long-term plan, and planned collaboratively by those who will participate in it. (p. 233)

Until these teachers are placed into situations where the benefits of using the technology (beyond the completion of a class or course requirements) are apparent, true change will never occur.
Pre-Service Factors

In examining Group B's frequency of usage across all three points, it was obvious that they not only benefited from the course but also continued to maintain that level of usage, and even increased it in all areas, well after the InTech course had ended. The average usage in all areas showed signs of steady and gradual increase, indicating a positive effect from the course in addition to significant changes in pedagogical practices. It would be logical then to ask: Would that growth trend remain constant should this group be surveyed again in 6 months' to a year's time? Closer examination of the data and situations involving the Group B participants, together with reflections from the review of the literature, highlights a few major issues in regard to teacher candidate training.

Continuation of the Skill

Laffey (2004) suggests strategies of removing the technological focus from a one-course model and shifting toward an infusion of technology into all education methods and content courses. This approach, however, requires a faculty that is experienced enough with the available equipment to model appropriate use of the technologies in their courses and to require the pre-service teachers to use them in their work. According to the Alliance for Childhood 2001 report:

    There is little, if any, research on how university and college faculty come to appropriate technology in their teaching. Faculty must integrate technology into methods courses so that as the pre-service teachers are learning how to select appropriate learning goals, design meaningful lessons, and arrange necessary materials to accomplish the expected goals, the potential of technology to enhance the learning is considered. (para. 3)

While Group B reported strong growth across all areas, that growth may not remain constant if this type of training remains a single stand-alone course. In other words, to ensure that the strong changes in frequency of usage remain strong and perhaps even increase, it would be necessary and appropriate to infuse technology into all methods-based education courses. Ultimately, the more frequently pre-service teachers are exposed to appropriate technology usage, the more comfortable they will be with it and, therefore, the more likely they will be to use it with their future students.

Modeling of Desired Behaviors

The final factor rests with cooperating teachers. Wang, Ertmer, and Newby (2004) state:

    Observing cooperating teachers using computers during the student teaching experience was one of the three most important factors that influenced feelings of preparedness for the use of computers for instruction in their own classrooms. Apparently, observing role models (in this case supervising teachers) favorably influenced the student teachers to perform similarly. (p. 232)

With this in mind, colleges and universities need to be more selective when placing their student teachers to ensure that placements can provide this type of experience. It is quite clear that colleges of education will have to change their practices in preparing educators for the 21st century. More importantly, the culture of the colleges of education must change so that technology becomes an important responsibility for every faculty member, staff person, student, and administrator. This change is essential because "a curriculum cannot be considered in isolation from the culture in which it is to be implemented" (Schrum et al., 2003, p. 257).
It is vital that colleges begin to demand more from the supervising teachers with whom they place their students. It should be mandated that lab schools, field experiences, student teaching, and class observations be under the guidance of highly trained, experienced, motivated, and accomplished educators. The college and the supervising teacher should view each placement as a partnership between the school and the college. The supervising teacher should maintain a role equivalent to that of the professors, and students should see the practical training and learning that take place within their classrooms as significant and as important as the lectures they receive from their college professors. Therefore, the schools and supervising teachers should be viewed as liaisons and extensions of the college into the school systems. All pedagogical frameworks, theories, philosophies, methods, and ideals should be shared and equally supported within both environments. This framework would ultimately include the infusion of technology into all subject areas.

Restatement of Findings

The results indicate that college-level students will incorporate and maintain the use of technology as a valued aspect of their personal pedagogy over an extended period of time to a greater degree than in-service teachers. This insight strongly indicates that technology integration courses are needed at the pre-service level. The earlier and more frequently such courses can be introduced, the better.

Statistical data showed that in-service teachers (Group A) benefited more from the actual participation in the course, showing significant increases in all areas of technology usage from the first day of the course to the last. This level of usage, unfortunately, was not maintained six months later. While the level of use at the six-month point was still higher than what was seen on the first day of the InTech course, it had dropped significantly in all areas. Thus, the indication was that the in-service teachers used the technology when they had to (for the purposes of completing the required course). However, once they returned to their respective schools and classrooms, the level of usage dropped, as they were no longer required to use the equipment. This corresponds with what was found in the review of the literature, as Royer (2002) reports:

Many skills-based, one-shot sessions that help teachers learn how to make a web page, create an electronic concept map, or make a multimedia presentation are being offered. Teachers, however, need to understand how they can use it to develop student understandings and to support constructivism, cooperative learning, and problem-based learning. Professional development for computer technology needs to be ongoing, tied to student learning, focused on individual and organizational goals, driven by a long-term plan, and planned collaboratively by those who will participate in it. (p. 233)

Until this type of in-service training begins replacing one-shot courses such as InTech, true pedagogical change at the in-service level may never be obtained.

Examination of the pre-service education majors (Group B) showed a trend slightly different from what was seen in Group A. Initially, the data made it appear as though the InTech course had no significant impact on Group B: this group's usage did not increase significantly from the first day of the course to the last.
This trend was explained by the fact that the InTech course began before the Wesleyan school term did; therefore, the students were not in lab schools or in their regular collegiate courses at the time. Three months later, on the last day of the course, the fall semester was ending, and students had finished their field and lab experiences and were preparing for final examinations. The significance in Group B's numbers came from the steady increase seen in all areas well after the completion of the course. This increase indicated that the skills, knowledge, and techniques obtained during the InTech course were indeed retained. Examination of the scores at the six-month interval revealed that these students were in the middle of their solo student-teaching experience. At that point, they had control over the classroom and all planning and implementation. The fact that their numbers were the highest during that time indicated that the InTech course had a significant effect: the participants in Group B made use of the strategies when they had the opportunity to do so. This again corresponds with the review of the literature. Ultimately, the earlier pre-service teachers are exposed to appropriate technology usage, the more comfortable they will be with it and, therefore, the more likely they will be to use it with their future students. All in all, pre-service teachers need help planning how to successfully implement and manage technology in their teaching, such as drawing on support from peers, working with computer teachers or media specialists in schools, taking continuing education courses, or developing strategies that let children help other children. The final factor rests with cooperating teachers. Wang, Ertmer, and Newby (2004) state:

Observing cooperating teachers using computers during the student teaching experience was one of the three most important factors that influenced feelings of preparedness for the use of computers for instruction in their own classrooms. Apparently, observing role models (in this case supervising teachers) favorably influenced the student teachers to perform similarly. (p. 232)

In order to ensure that pre-service teachers leave education preparatory programs fully capable of integrating technology in meaningful ways, not only do they need to be placed with supportive cooperating teachers, but they also need to see the faculty modeling, utilizing, and integrating the technology in their own courses. In other words, the use of technology should be infused throughout every course the pre-service teachers are expected to take.

Significance of the Study

The outcomes of this study should allow the State of Georgia to identify barriers that could produce aversions to the implementation of technology in the classroom. As a result, the state should also be able to see what factors would be favorable in the production of teachers who possess an increased ability to incorporate technology across the subject areas, thereby resulting in more positive experiences regarding the use of technology for all involved. The information obtained from this study would also be important for colleges of education, whose primary purpose is to train teacher candidates to be competent in preparing students for working in today's, as well as tomorrow's, society. The research to this point indicated that the educational training currently being received does not adequately prepare teachers to use technology effectively with students.
The results obtained from this study should provide guidance for colleges of education as they struggle with the most appropriate way to include technology training among all of the other areas they are mandated to provide. Finally, it is hoped that those who monitor technology use at the local level within individual school systems would be interested in these results, as they tie directly back to the strategies and methods that are, and are not, needed to promote effective technology use in classrooms.

Recommendations for Further Study

Several follow-up studies and projects could result from the outcomes of this study. The first would be a follow-up on the pre-service education majors (Group B) as they graduate from Wesleyan and begin their first year as classroom teachers. It would be interesting to contact this group again in a year's time to see whether the steady increase in technology usage is being maintained or whether it has leveled off or even dropped. A more in-depth look at the factors that may have influenced this outcome would also be needed and beneficial.

Another interesting study would be to investigate subject-area integration further. It would be fascinating to determine why some teachers found it easier to integrate the technology into particular subject areas and harder into others. This might also lead to some interesting discoveries in regard to the national standards within specific subject areas and how they have been, or possibly should be, modified to reflect today's more modern world and today's technology appropriately.

A final applicable study would be a closer examination of the use of technology as it pertains to teaching applications. A breakdown analyzing the ways participants use the technology with students would likely yield relevant findings, and categorizing that usage to correspond with the progressive levels of Bloom's Taxonomy would be a logical step. It would be interesting to determine which level of Bloom's Taxonomy the use of technology addressed the most. The review of the literature hinted that teachers use technology for rote drill-and-practice applications that would fall into the knowledge level of Bloom's Taxonomy. An in-depth study comparing the teachers who use technology at the higher levels of Bloom's Taxonomy, and the kind of training they received, against those stuck at the lower levels could yield very significant findings.

A benefit that has already materialized from the completion of this study was a new educational technology course to be offered at Wesleyan College beginning in fall of 2005. This course was designed to provide pre-service teachers with the skills, knowledge, experience, and confidence needed to appreciate the value of integrating technology across the curriculum, while also providing them with the tools they will need to avoid the pitfalls that will most assuredly come their way. Wesleyan College does not currently offer any form of educational technology course; for the past three years, the Education Department has contracted with a local community college to meet the Georgia Professional Standards Commission requirements for teacher candidate training in regard to technology. In 2004, the college received a Title III grant dedicated to the creation of a facility that could be used to teach such a course.
As of December 2004, the facility was in place and operational; all that was missing was the course and its content. The outcomes of this study were used in the creation of this course. Once all data collection and analysis connected to this study ended, the syllabus, course readings, projects, assignments, and assessments were created based on the results obtained. This course will be taught for the first time beginning fall semester of 2005.

Summary

While the pre-service teachers of Group B continued to increase their usage over time and maintained that growth trend well after the course ended, the in-service teachers of Group A reported a decrease in all areas six months after the course. The general conclusion that can be drawn is that in-service teachers grew significantly as a result of the course in all areas of technology usage; however, once the course ended and they were no longer required to use the technology, their usage dropped in all areas. While they still maintained a higher level of usage six months after completing the course than was shown on the first day of the course, it should be noted that the requirements of the course were most likely a significant factor in such a high rating from the pre- to the post-course time frames.

Pre-service teachers showed gradual increase from the pre- to post-course time frame, indicating course effects that were not as immediately significant as those Group A experienced. This group of pre-service college students experienced its most significant jump after the course ended. While this result may seem to indicate that the course did not have a significant effect on their technology usage, it was most likely due to the situation they were in at the time. When the InTech course began and the pre-course instrument was administered, Wesleyan's collegiate courses for the fall 2004 semester had not yet started. Three months later, at the post-course period, Wesleyan classes were ending, and final examinations were being administered. Six months later, at the final stage, all of the pre-service teachers were well immersed in their solo student-teaching internships. Their classroom teachers had, by this time, turned control of the classroom and all planning and implementation aspects over to them. It is believed the course did have an effect on the Group B participants that simply could not be seen until they were placed in a situation that finally allowed them to implement what they had experienced six months earlier.

REFERENCES

Alliance for Childhood. (2001). Tech tonic: Towards a new literacy of technology. College Park, MD: Alliance for Childhood.

Becker, H. J. (1994). How exemplary computer-using teachers differ from other teachers: Implications for realizing the potential of computers in schools [Electronic version]. Journal of Research on Computing in Education, 291-321.

Becker, H. J. (1999). Internet use by teachers (Report No. 1). Irvine, CA: University of California, Center for Research on Information Technology in Organizations. Retrieved November 11, 2004, from http://www.crito.uci.edu/TLC/html/findings.html

Berg, S. B., Lasley, C. R., Raisch, T. J., & Daniel, C. (1998, Winter). Exemplary technology use in elementary school classrooms. Journal of Research on Computing in Education, 31(2), 111-123.

Brabec, K., Fisher, K., & Pitler, H. (2004, February). Building better instruction. Learning & Leading with Technology, 5-18.

Daniels, H., & Bizar, M. (1998). Methods that matter: Six structures for best practice classrooms.
Portland, ME: Stenhouse Publishers.

Dawson, C., & Rakes, G. C. (2003, Fall). The influence of principals' technology training on the integration of technology into schools. Journal of Research on Technology in Education, 36(1), 29-43.

D'Ignazio, F. (1993). Electronic highways and classrooms of the future. In T. Cannings & L. Finkle (Eds.), The technology age classroom. Wilsonville, OR: Franklin, Beedle, and Associates.

Dwyer, D. (1994, April). Apple classrooms of tomorrow: What we've learned. Educational Leadership, 51(7), 4-10.

Ertmer, P. (2003). Transforming teacher education: Visions and strategies. Educational Technology, Research and Development, 51(1), 124-130.

Guilford, J. P., & Fruchter, B. (1978). Fundamental statistics in psychology and education. San Francisco, CA: McGraw-Hill.

Hasselbring, T. S., & Tulbert, B. (2002, Spring). Improving education through technology. Preventing School Failure, 35(3), 33-40.

Hokanson, B., & Hooper, S. (2000, Fall). Computers as cognitive media: Examining the potential of computers in education. Computers in Human Behavior, 51(5), 537-552.

Kozma, R. B. (2003, Fall). Technology and classroom practices: An international study. Journal of Research on Technology in Education, 36(1), 1-10.

Kozma, R. B., & Johnston, J. (1991, January/February). The technological revolution comes to the classroom. Change, 23(1), 10-33.

Laffey, J. (2004, Summer). Appropriation, mastery and resistance to technology in early childhood pre-service teacher education. Journal of Research on Technology in Education, 36(4), 361-383.

Lederman, N. G., & Niess, M. L. (2000, November). Technology for technology's sake or for the improvement of teaching and learning? School Science & Mathematics, 100(3), 345-350.

Lowther, D. L., Ross, S. M., & Morrison, G. M. (2003). When each one has one: The influences on teaching strategies and student achievement of using laptops in the classroom. Educational Technology, Research and Development, 51(3), 23-38.

Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Association for Supervision and Curriculum Development.

Moursund, D. (1999). Will new teachers be prepared to teach in a digital age? A national survey on information technology in teacher education. Santa Monica, CA: Milken Exchange on Education Technology. Retrieved November 11, 2004, from http://milkenexchange.org/research/iste_results.html

Naisbitt, J. (1982). Megatrends. New York: Warner Books.

Norris, C., Sullivan, T., Poirot, J., & Soloway, E. (2003, Fall). No access, no use, no impact: Snapshot surveys of educational technology in K-12. Journal of Research on Technology in Education, 36(1), 15-30.

Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York: McGraw-Hill.

O'Neil, J. (1995, October). Teachers and technology: Potential and pitfalls. Educational Leadership, 53(2), 10-11.

Peck, K. L., & Doricott, D. (1994, April). Why use technology? Educational Leadership, 51(7), 11-14.

Pierson, M. E. (2001, Summer). Technology integration practice as a function of pedagogical expertise. Journal of Research on Computing in Education, 33(4), 413-431.

Raudonis, L. (2004, May/June). Technology: After a decade, how are we doing? PageOne, 26(3), 4-25.

Redish, T., Holmes, E., & Whitacre, L. (2003/2004). Framework for integrating technology: Elementary handbook. Atlanta, GA: Educational Technology Training Centers.

Royer, R. (2002, May/June).
Supporting technology integration through action research. The Clearing House, 75(5), 233-237.

Schiffer, J. (1999). A framework for staff development. In A. Lieberman & L. Miller (Eds.), Staff development: New demands, new realities, new perspectives (pp. 4-23). New York: Teachers College, Columbia University.

Schrum, L., Skeele, R., & Grant, M. (2002/2003, Winter). One college of education's effort to infuse technology: A systematic approach to revisioning teaching and learning. Journal of Research on Technology in Education, 35(2), 256-272.

Shannon, D. M., & Davenport, M. A. (2000). Using SPSS to solve statistical problems: A self-instruction guide. Upper Saddle River, NJ: Merrill Prentice Hall.

Sirkin, M. R. (1995). Statistics for the social sciences. Thousand Oaks, CA: SAGE Publications.

Slavin, R. E. (2002). Evidence-based policies: Transforming educational practice and research. Educational Researcher, 31(7), 15-21.

Technology times and trends. (1998, April 13). Newsweek, p. 45.

U.S. Department of Education, Office of Educational Technology. (2005). National education technology plan. Washington, DC: U.S. Government Printing Office.

Vannatta, R. A., & Fordham, N. (2004, Spring). Teacher dispositions as predictors of classroom technology use. Journal of Research on Technology in Education, 36(3), 253-272.

Wang, L., Ertmer, P. A., & Newby, T. J. (2004, Spring). Increasing preservice teachers' self-efficacy beliefs for technology integration. Journal of Research on Technology in Education, 36(3), 231-251.

Web-based Education Commission. (2000). The power of the Internet for learning: Moving from promise to practice. Washington, DC: U.S. Government Printing Office.

Wertsch, J. (1998). Mind as action. New York: Oxford University Press.

Whitaker, L. (1995, February). Aim straight at the curriculum [Electronic version]. Electronic School Journal, 7-14. Retrieved November 11, 2004, from http://www.electronic-school.com/whitaker.html

Zemelman, S., Daniels, H., & Hyde, A. (1998). Best practices. Portsmouth, NH: Heinemann.

APPENDICES

APPENDIX A: INTECH SURVEY

APPENDIX B: CLASSROOM ANNOUNCEMENT

APPENDIX C: LETTER OF CONSENT

APPENDIX D: MACON STATE COLLEGE LETTER

APPENDIX E: WESLEYAN COLLEGE LETTER

APPENDIX F: SIX-MONTH REMINDER LETTER