DIRECT INSTRUCTION READING: EFFECTS OF THE READING MASTERY PLUS® LEVEL K CURRICULUM ON PRESCHOOL CHILDREN WITH DEVELOPMENTAL DELAYS

Except where reference is made to the work of others, the work described in this dissertation is my own or was done in collaboration with my advisory committee. This dissertation does not include proprietary or classified information.

Ryan M. Zayac

Certificate of Approval:

Jennifer M. Gillis, Assistant Professor, Psychology
James M. Johnston (Chair), Professor, Psychology
James F. McCoy, Associate Professor, Psychology
Craig B. Darch, Professor, Rehabilitation and Special Education
George T. Flowers, Interim Dean, Graduate School

DIRECT INSTRUCTION READING: EFFECTS OF THE READING MASTERY PLUS® LEVEL K CURRICULUM ON PRESCHOOL CHILDREN WITH DEVELOPMENTAL DELAYS

Ryan M. Zayac

A Dissertation Submitted to the Graduate Faculty of Auburn University in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

Auburn, Alabama
August 9, 2008

Permission is granted to Auburn University to make copies of this dissertation at its discretion, upon request of individuals or institutions and at their expense. The author reserves all publication rights.

Signature of Author: Ryan M. Zayac
Date of Graduation: August 9, 2008

DISSERTATION ABSTRACT

DIRECT INSTRUCTION READING: EFFECTS OF THE READING MASTERY PLUS® LEVEL K CURRICULUM ON PRESCHOOL CHILDREN WITH DEVELOPMENTAL DELAYS

Ryan M. Zayac

Doctor of Philosophy, August 9, 2008
(M.S., Auburn University, 2005)
(B.S., Allegheny College, 2002)

226 Typed Pages

Directed by James M. Johnston

Despite the success of Direct Instruction (DI) programs in teaching a variety of individuals how to read, there has been little research on the use of DI with young children with developmental delays. The purpose of the present study was to investigate the effects of the Reading Mastery Plus® Level K program on preschool children with developmental delays. The study demonstrated that preschool-aged children both with and without developmental delays are able to acquire beginning reading skills. Although the research design precludes identification of a functional relationship between the Reading Mastery Plus® Level K program and the participants' reading gains, the data showed that young children with developmental delays can acquire the skills necessary to begin reading. This is an important finding, especially considering that the number of children with autism spectrum disorder is increasing. Although the previous mindset that children with developmental delays are not capable of reading has changed, research on the effects of Direct Instruction with this population has seen only limited growth. The results of this study provide an appropriate starting point for extending this literature and for turning this research into practice.

ACKNOWLEDGMENTS

I would first and foremost like to thank God for providing me with such a blessed life and the opportunity to follow my dreams. I would also like to thank Dr. Jim Johnston for all of his help and guidance in shaping me into the student that I am today.
I would also like to acknowledge my committee, especially Drs. Jen Gillis and Craig Darch, for providing helpful comments and insight on this project. I also appreciate the help of Rachel Dawkins, who was instrumental in the data collection process. Thanks are also due to my wife Amber for all of her love and support throughout my time here at Auburn, and to my family for always being so supportive. War Eagle to you all!

Style manual used: Publication Manual of the American Psychological Association (5th edition)
Computer software used: Microsoft Word 2007 and Microsoft Excel 2007

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER I. INTRODUCTION
    The Need for Quality Education
    Why Children Fail to Read
    Perspectives on Reading Instruction
    Direct Instruction Reading
    Components of Direct Instruction Reading
    Research on Direct Instruction Reading
    A Direct Instruction Approach to Teaching Children With Developmental Delays How to Read
CHAPTER II. EXPERIMENT PROPER
    Method
        Participants
        Materials and Setting
        Dependent Measure
        Procedure
    Results and Discussion
CHAPTER III. GENERAL DISCUSSION
REFERENCES
APPENDIX

LIST OF TABLES

1. Participant characteristics
2. Number of trials needed to reach mastery criterion for each participant
3. Celeration rates of target behaviors
4. Average frequency of responding (per minute) at the conclusion of the study
A1. Key terms in beginning reading instruction
A2. Project Follow Through models
A3. Reading Mastery Plus® Level K placement test sample questions
A4. Sequence of letter-sound correspondence introduction

LIST OF FIGURES

1. Project Follow Through Results: Index of significant outcomes for all models
2. Project Follow Through Results: Comparison of third-grade students on the Metropolitan Achievement Test
3. Special orthography and signaling procedure
4. Percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Allison
5. Percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Allison
6. Percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Allison
7. Percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Danielle
8. Percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Danielle
9. Percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Danielle
10. Percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Megan
11. Percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Megan
12. Percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Megan
13. Percentage of correct responding to letter-sound correspondences "t", "n", and "c" by Megan
14. Percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Ricky
15. Percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Ricky
16. Percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Ricky
17. Percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Omar
18. Percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Omar
19. Responding by Allison across all letter-sound correspondence tasks
20. Responding by Danielle across all letter-sound correspondence tasks
21. Responding by Megan across all letter-sound correspondence tasks
22. Responding by Ricky across all letter-sound correspondence tasks
23. Responding by Omar across all letter-sound correspondence tasks
24. Percentage of correct responding on "say it fast" exercises by the individuals with developmental disabilities
25. Percentage of correct responding on "say it fast" exercises by the individuals without developmental disabilities
26. Responding by Allison on "say it fast" exercises
27. Responding by Danielle on "say it fast" exercises
28. Responding by Megan on "say it fast" exercises
29. Responding by Ricky on "say it fast" exercises
30. Responding by Omar on "say it fast" exercises
31. Percentage of correct responding on "say the sounds" exercises by the individuals with developmental disabilities
32. Percentage of correct responding on "say the sounds" exercises by the individuals without developmental disabilities
33. Responding by Allison on "say the sounds" exercises
34. Responding by Danielle on "say the sounds" exercises
35. Responding by Megan on "say the sounds" exercises
36. Responding by Ricky on "say the sounds" exercises
37. Responding by Omar on "say the sounds" exercises
38. Percentage of correct responding on "say the sounds-say it fast" exercises by the individuals with developmental disabilities
39. Percentage of correct responding on "say the sounds-say it fast" exercises by the individuals without developmental disabilities
40. Responding by Allison on "say the sounds-say it fast" exercises
41. Responding by Danielle on "say the sounds-say it fast" exercises
42. Responding by Megan on "say the sounds-say it fast" exercises
43. Responding by Ricky on "say the sounds-say it fast" exercises
44. Responding by Omar on "say the sounds-say it fast" exercises
45. Percentage of correct responding on "sounding out" exercises by the individuals with developmental disabilities
46. Percentage of correct responding on "sounding out" exercises by the individuals without developmental disabilities
47. Responding by Allison on "sounding out" exercises
48. Responding by Danielle on "sounding out" exercises
49. Responding by Megan on "sounding out" exercises
50. Responding by Ricky on "sounding out" exercises
51. Responding by Omar on "sounding out" exercises
52. Percentage of correct responding on "reading vocabulary" exercises by the individuals with developmental disabilities
53. Percentage of correct responding on "reading vocabulary" exercises by the individuals without developmental disabilities
54. Responding by Allison on "reading vocabulary" exercises
55. Responding by Danielle on "reading vocabulary" exercises
56. Responding by Megan on "reading vocabulary" exercises
57. Responding by Ricky on "reading vocabulary" exercises
58. Responding by Omar on "reading vocabulary" exercises
59. Participants' percentage of correct responding across all dependent measures
60. Participants' cumulative number of errors across all dependent measures
A1. Direct Instruction checklist
A2. Direct Instruction observation form
A3. Direct Instruction ratings form
A4. Direct Instruction general comments form
A5. Cumulative number of errors made to letter-sound correspondences "a", "m", and "s" by Allison
A6. Cumulative number of errors made to letter-sound correspondences "e", "r", and "d" by Allison
A7. Cumulative number of errors made to letter-sound correspondences "f", "i", and "th" by Allison
A8. Cumulative number of errors made to letter-sound correspondences "a", "m", and "s" by Danielle
A9. Cumulative number of errors made to letter-sound correspondences "e", "r", and "d" by Danielle
A10. Cumulative number of errors made to letter-sound correspondences "f", "i", and "th" by Danielle
A11. Cumulative number of errors made to letter-sound correspondences "a", "m", and "s" by Megan
A12. Cumulative number of errors made to letter-sound correspondences "e", "r", and "d" by Megan
A13. Cumulative number of errors made to letter-sound correspondences "f", "i", and "th" by Megan
A14. Cumulative number of errors made to letter-sound correspondences "t", "n", and "c" by Megan
A15. Cumulative number of errors made to letter-sound correspondences "a", "m", and "s" by Ricky
A16. Cumulative number of errors made to letter-sound correspondences "e", "r", and "d" by Ricky
A17. Cumulative number of errors made to letter-sound correspondences "f", "i", and "th" by Ricky
A18. Cumulative number of errors made to letter-sound correspondences "a", "m", and "s" by Omar
A19. Cumulative number of errors made to letter-sound correspondences "e", "r", and "d" by Omar
A20. Cumulative number of errors made by Allison during "say it fast" tasks
A21. Cumulative number of errors made by Danielle during "say it fast" tasks
A22. Cumulative number of errors made by Megan during "say it fast" tasks
A23. Cumulative number of errors made by Ricky during "say it fast" tasks
A24. Cumulative number of errors made by Omar during "say it fast" tasks
A25. Cumulative number of errors made by Allison during "say the sounds" tasks
A26. Cumulative number of errors made by Danielle during "say the sounds" tasks
A27. Cumulative number of errors made by Megan during "say the sounds" tasks
A28. Cumulative number of errors made by Ricky during "say the sounds" tasks
A29. Cumulative number of errors made by Omar during "say the sounds" tasks
A30. Cumulative number of errors made by Allison during "say the sounds-say it fast" tasks
A31. Cumulative number of errors made by Danielle during "say the sounds-say it fast" tasks
A32. Cumulative number of errors made by Megan during "say the sounds-say it fast" tasks
A33. Cumulative number of errors made by Ricky during "say the sounds-say it fast" tasks
A34. Cumulative number of errors made by Omar during "say the sounds-say it fast" tasks
A35. Cumulative number of errors made by Allison during "sounding out" tasks
A36. Cumulative number of errors made by Danielle during "sounding out" tasks
A37. Cumulative number of errors made by Megan during "sounding out" tasks
A38. Cumulative number of errors made by Ricky during "sounding out" tasks
A39. Cumulative number of errors made by Omar during "sounding out" tasks
A40. Cumulative number of errors made by Allison during "reading vocabulary" tasks
A41. Cumulative number of errors made by Danielle during "reading vocabulary" tasks
A42. Cumulative number of errors made by Megan during "reading vocabulary" tasks
A43. Cumulative number of errors made by Ricky during "reading vocabulary" tasks
A44. Cumulative number of errors made by Omar during "reading vocabulary" tasks
A45. Cumulative number of errors made by Allison across all skill sets
A46. Cumulative number of errors made by Danielle across all skill sets
A47. Cumulative number of errors made by Megan across all skill sets
A48. Cumulative number of errors made by Ricky across all skill sets
A49. Cumulative number of errors made by Omar across all skill sets

CHAPTER I. INTRODUCTION

The Need for Quality Education

Concern Over Educational Outcomes

In recent years, considerable national attention has been focused on educational reform in the United States (U.S.) (Kim & Axelrod, 2005; Marchand-Martella & Martella, 2002; Strauss, 2005). Concerns about public education are not new; however, their focus has shifted in recent times. Issues that have been paramount in the last 10 to 20 years include excessive high school dropout rates, an apparent decline in national and state test scores, an increasing achievement gap between international and U.S.
students, and the failure of funding increases to produce any discernible results in addressing these issues (Evers, 1998). As educators and politicians have continued to search for school reform models to address these issues, researchers have begun to identify factors that have contributed to our educational system's inadequacies. At the forefront is the issue of literacy, which is defined as the ability to read and write (American Heritage Dictionary, 1992). The U.S. Department of Education reported in 2004 that, across the nation, 40% of fourth-grade students failed to demonstrate even the basic literacy skills required for success in school; in low-socioeconomic schools, that figure rose to over 70% (as cited in Twyman, Layng, Stikeleather, & Hobbins, 2005). Furthermore, reading skills did not improve with additional years of instruction: the National Center for Education Statistics (1999) reported that 67% of eighth-grade students and 60% of twelfth-grade students did not meet the requirements necessary to be labeled proficient readers.

The failure of our school systems to adequately address this developing literacy achievement gap has implications not only for the individuals lacking these basic skills but also for U.S. society in general. Over the last 50 years, the U.S. economy has transformed from a workforce oriented toward physical labor to one that depends on efficient, technologically skilled personnel. As the rapid increase in technology has altered the employment landscape, the U.S. educational system has struggled to keep pace in supplying the market with a workforce that possesses the intellectual abilities needed to develop and work in these emerging industries. As employment opportunities for those with underdeveloped basic skills continue to decrease, the U.S. will be left with a growing unemployment problem. As the National Institute for Literacy (2000) reported, a high proportion (6.5%) of individuals with inadequate academic achievement (i.e., no high school degree) are unemployed and run the risk of continuing to live in poverty and raising families that may encounter the same educational problems.

A number of factors must be addressed to improve how our educational system teaches basic literacy skills. While the ability to teach students how to write is extremely important, a full discussion of this component of literacy is beyond the scope of this paper. Instead, the remainder of the paper will focus on reading instruction. The first important factor in improving reading instruction is the ability to identify those students who are most at risk for experiencing problems in this area (Evers, 1998; Marchand-Martella, Slocum, & Martella, 2004; Weaver, 2002).

Students At Risk for Reading Deficiencies

The majority of students who enter school at risk for reading disabilities fall into two broad groups. The first group begins school with adequate language ability (i.e., vocabulary, conceptual knowledge, etc.) but is at risk due to weaknesses in literacy skills such as letter knowledge and the sounds of the English language. These children generally have difficulty transitioning between printed text and oral language (Carnine, Silbert, Kame'enui, Tarver, & Jungjohann, 2006). The second group of students begins school with a deficit in both language ability and literacy skills.
These children often fall behind immediately because the instruction provided to them assumes that they are entering school with these basic skills (Carnine et al., 2006).

As previously mentioned, 40% of U.S. fourth-grade students are reading below a basic level; that is, they have difficulty reading and comprehending even the simplest of texts. When we examine specific groups within that national average, the results become even more discouraging. At every age (and subject) level, African Americans and Hispanics scored below Caucasians (Kim & Axelrod, 2005). Sixty-three percent of African American fourth graders and 58% of Hispanic children scored below the basic reading level, compared to 27% of Caucasian students (National Institute for Literacy, 2000). Another major concern is the growing achievement gap between students in affluent and middle-class school systems and students from minority and low-income school districts. In many of these districts, students from the minority groups discussed above account for 80% of the enrollment (Kober, 2001, as cited in Kim & Axelrod, 2005); and although local, state, and federal agencies have made efforts to assist the districts in bridging these gaps, the achievement levels of minority and disadvantaged students have declined over the last decade in comparison to other students (National Institute for Literacy, 2000).

In addition to minority and low-socioeconomic-status students, children identified as learning disabled have also been shown to have difficulty acquiring the skills needed to read at a basic level (Gersten, 1985). Of all the children recognized as learning disabled and requiring special education services, almost 80% have been classified as such because of impairment in their reading ability. Even with these students receiving additional services, more than twice as many students with learning disabilities fail to graduate from high school as compared to their peers (Commission on Excellence in Special Education, 2002).

The data show that there are a number of students whom we may expect to demonstrate academic deficiencies, including (a) racial minorities, (b) those in poverty, and (c) students with disabilities. The data also show that there are a number of students struggling to read whom we would not expect. Children raised by well-educated, middle-class parents throughout the country have also demonstrated reading deficiencies (National Center for Education Statistics, 1999). These results indicate that difficulty in learning to read effectively is not limited to individuals coming from educationally disadvantaged backgrounds.

The failure of our educational system to address this critical issue early and effectively for these at-risk students has led to what Stanovich (1986) popularized as Matthew effects. This term was selected based on the Bible passage from Matthew 25:29 (Revised Standard Version) that reads: "For unto everyone that hath shall be given, and he shall have abundance; but from him that hath not shall be taken away even that which he hath." From an educational standpoint, Matthew effects refer to the pattern in which children who enter school with strong academic skills acquire other skills with relative ease, while children with few of the prerequisite skills for beginning reading must struggle to learn the skills necessary to keep pace with the class. This "negative spiral of cumulative disadvantage"
(Carnine, Silbert, Kame'enui, & Tarver, 2004, p. 15) affects the child in all academic areas and can lead to the development of behavioral problems related to the inability to perform the task effectively, which only presents another obstacle to academic achievement.

Although some individuals debate whether there really is a reading crisis in U.S. schools (Allington, 2006; Strauss, 2005), the majority of researchers, administrators, and educators agree that improving our students' reading skills is a high priority (Evers, 1998; Simmons & Kame'enui, 1998). While a variety of labels have been used to identify at-risk children, the common denominator is their performance in reading and, more specifically, their reading failure. An important factor in attempting to improve our students' performance, then, is to identify the potential variables that may lead to their reading deficiencies.

Why Children Fail to Read

Word Recognition Deficits

Research has shown that reading problems primarily occur at the level of the individual word and largely revolve around the ability to orally decode the printed word into its component parts (Bradley & Bryant, 1983; Byrne & Fielding-Barnsley, 1991; Torgesen, 1997; Vellutino, 1991). In order to decode single words, the beginning reader must come to understand that reading in the English language, among others, is based on the alphabetic principle: units of print (graphemes) represent units of sound (phonemes). The ability to identify and manipulate these phonemes, known as phonemic awareness, has been shown to be a critical component of beginning reading (National Institute of Child Health and Human Development [NICHD], 2000).

Research conducted over the last two decades has clearly shown that students who enter school with a strong set of skills in phonological awareness (i.e., awareness of the larger parts of spoken language; see Table A1 for a description of common terms in reading instruction) and phonemic awareness are more successful in reading than students who do not (Gillon, 2004; Goswami & Bryant, 1990; Simmons & Kame'enui, 1998). Once children learn these phonological and phonemic awareness skills, they become more accurate at word recognition; when they become more accurate at word recognition, they begin to read more fluently and can devote more of their intellectual resources to reading for comprehension (Shankweiler et al., 1999; Stanovich, 1986). Students' mastery of these skills is a key factor in whether they will experience reading difficulty. A child's reading ability lies on a continuum: the same processes that allow children to read well lead to poor reading when those processes are deficient. Although current research has not identified a qualitative difference between the processes involved in reading disabilities and those involved in typical development, studies have examined other factors that may contribute to this deficiency in word recognition.

Genetic and Environmental Factors

Neurological influences. Although research in this area is relatively new, progress has been made in identifying the neural systems used for reading.
With the development of more advanced neural imaging techniques, such as functional magnetic resonance imaging (fMRI), researchers are now able to measure the changes that take place in neural activity in specific brain regions when discrete tasks are presented (Joseph, Noble, & Eden, 2001; Richards, 2001). While current research has still not identified all areas of the brain involved in reading, researchers have discovered three main regions involved in this process. Imaging studies have shown that the left cerebral hemisphere is the focal point of phonological analysis and comprehension. Interestingly, studies have shown that while non-impaired readers show most of their neural activation during reading in the left posterior superior temporal gyrus (STGp), inferior parietal, and temporoparietal areas, individuals with reading impairments predominantly show activation in the corresponding regions of the right cerebral hemisphere (Simos et al., 2002). Research in this area has also demonstrated that direct electrical stimulation of the left STGp severely disrupted decoding ability in non-impaired readers (Simos et al., 2000, as cited in Simos et al., 2002), further supporting the view that this region is critical in the reading process.

Researchers have observed these neurobiological differences in impaired readers across age, gender, cultures, and languages (Paulesu et al., 2001). The observation of these patterns in both adults and children suggests that these reading difficulties do not dissipate with maturity. Nevertheless, studies have shown that phonological and phonemic awareness instruction produces changes in children not only in behavioral performance but also in brain functioning (Shaywitz et al., 2003, as cited in Shaywitz & Shaywitz, 2004; Simos et al., 2002). These results suggest that although reading deficiencies clearly have a neurological basis, they are not a neurological disease. While it remains to be determined whether there is a critical time frame for producing these neural and behavioral changes, this research is extremely promising in that it supports the view that reading deficiencies can be remediated through explicit and systematic instruction.

Hereditary and environmental influences. Reading deficiencies have been shown to be highly heritable. Research has shown that when a parent has a reading disability, between 25% and 50% of the children will also have a reading deficiency, and that if one child in the family has a reading deficiency, 50% of his or her siblings will also be affected (Scarborough, 1990). In addition to the neurological areas involved in reading discussed above, researchers have identified several genetic loci (e.g., on chromosome 6) that are involved in reading deficiency (Cardon et al., 1994; Fisher & DeFries, 2002).

While genetic variables account for some of the variability in individuals' reading skills, they do not account for everything. The role of the environment is an extremely important factor in the development of reading skills. Children who are raised in a home where one or both parents have a reading deficiency are susceptible not only to the genetic factors but also to growing up in a relatively impoverished learning environment. Parents who read poorly may be less likely to read to their children and to spend time developing the skills (e.g., phonemic awareness, oral language) needed to succeed in beginning reading instruction.
The importance of parent-child interactions and their effects on vocabulary acquisition was demonstrated by Hart and Risley (1995), who reported that, among high-, middle-, and low-socioeconomic families, children from low-income households had acquired less than half as many words (approximately 500 vs. 1,100) as children from high-socioeconomic families by three years of age.

Instructional influences. While it is easy, and perhaps sometimes tempting, to attribute a child's academic failures to genetics and the home environment, one often underestimated factor is the influence of the instruction that is provided. Biological predispositions to reading deficiencies can be exacerbated when struggling readers do not receive the type and frequency of instruction necessary to improve reading ability. Combine the finding that struggling readers do not receive as much practice as non-impaired students (Allington, 1984) with the fact that struggling readers are often provided with reading materials that are too difficult for them (Stanovich, 1986), and it is not surprising to see the results that have been discussed throughout this paper. While instruction certainly plays an important part in why children fail to learn to read, it is also one of the few factors that we can effectively manipulate in order to prevent this outcome.

Preventing Reading Failure

Early intervention. In addressing students' reading failures, our school systems must be proactive and focus on prevention rather than intervention. While early intervention/prevention has long been regarded as logical and cost-efficient (Adams, 1990; Stanovich, 1986), even intensive programs like Head Start have not always produced the desired outcomes. While beginning academic instruction early is certainly beneficial, these unmet goals may be due to educational programs that are simply not sufficient (Foorman, Francis, Beeler, Winikates, & Fletcher, 1997). With an increased understanding of the key factors in teaching reading (NICHD, 1996, 2000), educators must now focus on implementing programs designed around what research has found.

Turning research into practice. "The separation between research and application in education can be characterized not merely as a gulf but as an abyss" (Sidman, as cited in Heward, 2005, p. 317). The education field is certainly not a static institution impervious to change; however, its application of scientifically based research programs in the classroom could arguably be described as such. Ideology, personal preference, and convenience seem to have driven the selection of curricula more so than research (Carnine, 1992; Gersten, 2001). The explanation for why these practices continue to exist within the education field is certainly complicated and involves a number of issues, including (a) teachers' lack of training in evaluating research and its implementation (Brophy & Good, 1986; Carnine, 1995); (b) educators' lack of interest in objective evidence (Olson, 1999; Watkins, 1996; see the later discussion of Project Follow Through); (c) an unwillingness to implement effective programs that require structured, fast-paced, and regular daily application (Lindsley, 1992); and (d) a failure by researchers to effectively communicate their findings to the "average" administrator and/or teacher (Gable & Warren, 1993).
In an attempt to bridge the gap between research and practice and to address the developing achievement differences between subgroups of students, the No Child Left Behind Act (NCLB) of 2001 was signed into law. The NCLB was an extension and reauthorization of the Elementary and Secondary Education Act (ESEA) of 1965, which also attempted to improve the education of disadvantaged (i.e., low-income) students through the appropriation of additional funds to teach reading (Title I). The NCLB, through the Reading First (K-3) and Early Reading First (pre-K) components of the act, extended the remedial-reading services allocated in the Title I section of the ESEA to include professional development for teachers and the requirement that the reading instruction delivered in the classroom be supported by scientific research (U.S. Department of Education, 2002).

The federal government's recognition of the literacy crisis in U.S. schools and its support in beginning to address this issue through acts such as the NCLB are a promising step in the right direction. The NCLB's goal of bringing all students up to proficient levels in reading and other areas by the end of the 2013-2014 school year is extremely ambitious and an area that will be addressed when the act comes up for review later this year ("Testing law may change," 2007). Nevertheless, with increased accountability and a new emphasis on teacher training, early intervention, and the implementation of reading programs supported by scientifically based research, it is a goal that is more attainable now than in any previous generation. In order to meet these high standards, though, reading instruction must be carefully designed and implemented.

Perspectives on Reading Instruction

Whole-language (meaning-based) Instruction

Advocates of the whole language approach to reading describe it not as a method of instruction but as a perspective that is based in part on the philosophy of holism (Goodman, 1992; Krashen, 2002). Holism, as it relates to education, rests on the belief that it is not possible to understand learning by analyzing its component parts, because these parts are intertwined in an indivisible manner and can only be studied as a whole (Weaver, 2002). Accordingly, the whole language approach to reading embraces this philosophy and examines the principles involved in reading from a strictly natural, or whole, standpoint.

The standard version of the whole language approach was developed in part by Goodman (1968), who began to study reading from a psycholinguistic perspective. Goodman (1982) referred to reading as a "psycholinguistic guessing game" (p. 33) in which the reader uses three different cueing systems (i.e., graphophonemic, semantic, and syntactic) to determine the meaning of the word or text, which is the ultimate goal of this approach. Raines and Canady (1990) describe a typical reading session in a whole language classroom as follows:

    First, the reader scans the print and predicts the meaning. Then the reader samples the print to confirm or reject the predicted meaning. If the prediction is confirmed, she moves on to the next sample. If the prediction is rejected, she either abandons or adjusts the prediction and moves on. As she moves through the text, predicting, sampling, and confirming information, she integrates the new information in with her previous knowledge. Comprehension is taking place as the reader reads and when the "whole" text is read.
    (p. 5)

Whole language advocates view this ability to read and comprehend text as a natural process similar to learning how to speak, suggesting that children learn the alphabetic principle and the other skills needed to read naturally, as a consequence of simply being exposed to literature-rich environments (Krashen, 2002).

In this meaning-emphasis approach, whole language programs select literature for children to read not on the basis of decodability but on how frequently the words appear in everyday print. In these situations, students are required to read what words they can and then to use a variety of sources (e.g., pictures, context, initial letter) as cues to determine any novel words (Fossett & Mirenda, 2006). In whole language programs, then, students do not learn basic reading skills first; instead, they learn the meaning of specific words and then use those meanings (along with other cues) to help decode unfamiliar words. Whole language theorists claim, however, that the natural development of these abilities is predicated on the premise that children must (a) be properly motivated; (b) have access to developmentally appropriate, high-quality, and culturally diverse literature that consists of real text, not decodable passages designed for reading instruction; (c) integrate their literacy skills throughout other areas of the curriculum, especially writing; and (d) have the opportunity to read frequently (Allington, 2006). While didactic instruction is not emphasized, this child-centered approach does support the teaching of certain skills (e.g., phonics) to develop fluency when that teaching is embedded within the context of the literature. This form of instruction is similar to incidental teaching (Hart & Risley, 1975), in that instruction in letter-sound correspondence or other reading skills is provided in the context of a naturally occurring teacher-student interaction, as opposed to a structured lesson.

As Adams and Bruck (1995) pointed out, the whole language approach became increasingly popular from the 1970s to the mid-1990s for a number of reasons. The movement's emphasis on comprehension and the integration of reading and writing did have some positive effects on literacy instruction, including an increase in the quality of literature in schools and the encouragement of families to spend more time reading together with their children. Nevertheless, not everyone agreed with the core components of the program. As the whole language approach was implemented in more and more classrooms across the U.S., a statistically significant decline in reading scores was recorded on the National Assessment of Educational Progress (National Center for Education Statistics, 1999). In response to these declining test scores, critics of the whole language approach began to generate considerable research that questioned the utility of these programs and, more specifically, how they addressed the teaching of phonics.

Phonics (code-based) Instruction

In direct contrast to the whole language approach, proponents of phonics instruction believe that beginning readers should receive systematic and explicit instruction in the alphabetic principle. Supporters of this system claimed that the ability to use letter-sound correspondences efficiently allows the reader to (a) recognize familiar words accurately and automatically, (b) independently decode new words using minimal contextual cues, and (c) devote more effort to comprehension (Carnine et al., 2004).
In order for phonics instruction to be utilized effectively, beginning readers need to acquire a few prerequisite abilities (Schieffer, Marchand-Martella, Martella, Simonsen, & Waldron-Soler, 2002). Assuming that the beginning reader has normal receptive and expressive language abilities, the first skill to be acquired is phonological awareness. Phonological awareness, as mentioned previously, is the ability to recognize that each word consists of smaller parts (e.g., syllables, phonemes, onsets, rimes) and that the sound structure of the word allows for devices such as rhyming and alliteration. For example, the word "mint" can be heard as a one-syllable word: mint. Mint can also be heard as individual phonemes, /m/ - /i/ - /n/ - /t/, or in its onset (i.e., the consonant sound(s) that occur before the first vowel) and rime (i.e., the vowel of a syllable and any consonants that follow) form: /m/ - /int/.

The second skill to be acquired is a subcategory of phonological awareness known as phonemic awareness. Phonemic awareness refers specifically to the ability to identify the phonemes contained within a word and to manipulate those sounds (e.g., blending, segmenting). Blending requires that the student translate a series of blended sounds (e.g., "mmmiiinnnt") into a word produced at a normal rate (e.g., "mint"). Exercises of this type allow students to experience the fact that words are composed of smaller units. In contrast, segmenting requires the student to take a word and say it slowly, holding each sound for a period of time and switching to the next sound in the word without pausing. Once this skill is mastered, students move on to segmenting words by pronouncing each phoneme in sequence, but without holding the sound. Because oral instruction in segmenting and blending does not require the student to understand letter-sound correspondences, these skills can be taught prior to instruction in any specific letter-sound associations. Instruction in phonemic awareness, therefore, prepares the student for the types of tasks encountered later in beginning reading activities (Carnine et al., 2006).
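The relationship between segmenting ("say the sounds") and blending ("say it fast") can be sketched in a few lines of code. The sketch below is only an illustrative toy, not part of the Reading Mastery Plus curriculum or of any analysis reported in this dissertation; the phoneme spellings, the word list, the stretch factor, and the rule that stop sounds are not held are simplifying assumptions chosen for readability.

```python
# Toy model of the two phonemic awareness exercises described above:
# segmenting ("say the sounds") stretches a word into its component sounds,
# and blending ("say it fast") collapses the stretched form back into the word.
# Phoneme spellings and the word list are illustrative assumptions, not IPA.

WORD_TO_PHONEMES = {
    "mint": ["m", "i", "n", "t"],
}

# Assumption: stop consonants cannot be held, so they are said only once.
STOP_SOUNDS = {"t", "d", "k", "c", "p", "b", "g"}


def segment(word, hold=3):
    """'Say the sounds': hold each continuous sound (e.g., 'mint' -> 'mmmiiinnnt')."""
    return "".join(
        p if p in STOP_SOUNDS else p * hold
        for p in WORD_TO_PHONEMES[word]
    )


def blend(stretched):
    """'Say it fast': collapse repeated sounds back to a word said at a normal rate.

    Collapsing adjacent repeats is good enough for this toy example, although it
    would mangle words that genuinely contain double letters.
    """
    collapsed = []
    for sound in stretched:
        if not collapsed or collapsed[-1] != sound:
            collapsed.append(sound)
    return "".join(collapsed)


if __name__ == "__main__":
    slow = segment("mint")
    print(slow)         # mmmiiinnnt
    print(blend(slow))  # mint
```

Running the script prints the stretched form "mmmiiinnnt" and then recovers "mint," mirroring the oral exercises described above.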
Whole language supporters have claimed that the explicit teaching of phonological and phonemic awareness, as well as of letter-sound correspondences, is unnecessary and an inefficient use of instructional time because the rules are complex and have numerous exceptions (Krashen, 2002; Weaver, 2002). Research on this topic has shown otherwise. The ability to display phonological and phonemic awareness has been demonstrated to be extremely important in reading aptitude and in the prediction of future reading performance (Bradley & Bryant, 1983; Cunningham, 1990; Lundberg, Frost, & Petersen, 1988; Lundberg, Olofsson, & Wall, 1980; Moore, Evans, & Dowson, 2005; O'Connor, Jenkins, Leicester, & Slocum, 1993; Olofsson & Lundberg, 1985; Torgesen, 1997). Results from this research indicate that deficits in the ability to use phonological awareness can explain a significant portion of beginning reading problems (e.g., word recognition) and difficulty in other related areas, including comprehension and vocabulary (see Smith, Simmons, & Kame'enui, 1998, for a review of this research).

With reading scores decreasing across the country and advocates of both whole language and phonics instruction citing research claiming that their program was the most effective, the U.S. Congress attempted to achieve a consensus and commissioned a book to review the scientific research (Adams, 1990) as well as two separate panels to examine the growing literacy crisis (NICHD, 2000; Snow, Burns, & Griffin, 1998). The results reported by all three sources largely confirmed the earlier work's findings; therefore, only the most recent findings are presented here.

National Reading Panel Report Findings

The National Reading Panel was commissioned in 1997 to assess the status of research-based knowledge on teaching children how to read. An examination of various databases produced approximately 100,000 research studies related to reading that had been published since 1966. In order to examine this literature, the panel gathered information from the previous panel's work (Snow et al., 1998), conducted regional public hearings, and consulted experts within the field to determine a prioritized list of topics to address (NICHD, 2000). Following this process, the National Reading Panel selected seven main topics for more intensive study. The topics selected included (a) alphabetics (i.e., phonemic awareness and phonics instruction); (b) fluency; (c) comprehension (i.e., vocabulary, text comprehension, teacher preparation, and comprehension strategies instruction); (d) teacher education and reading instruction; and (e) computer technology and reading instruction. After selection of the topics, panel members formed a subgroup to address each area.

In order for a study to be included in the panel's review, it had to meet all of the following criteria: (a) published in English in a refereed journal, (b) focused on reading development in children from preschool to grade 12, and (c) used an experimental or quasi-experimental design with a control group or a multiple-baseline design (NICHD, 2000). Following this selection process, a meta-analysis was performed if the data permitted, or a detailed descriptive analysis was conducted.

Results obtained from studies that assessed the effectiveness of phonemic awareness training indicated that explicitly teaching this skill was highly successful across a variety of students (e.g., age, gender, ability, socioeconomic status) in increasing reading skills and, in most cases, spelling skills. While the panel stated explicitly that reading programs should not focus solely on phonemic awareness, it recommended that they provide explicit instruction in this area. The meta-analysis conducted on studies examining the effectiveness of phonics instruction found similar results. Through direct and systematic instruction in letter-sound correspondences, children in kindergarten through sixth grade performed significantly better in decoding and spelling words than students who received little to no such instruction, especially children from low-socioeconomic backgrounds and those with reading disabilities. The results also indicated that first graders who received phonics instruction improved their reading comprehension, although older students showed no significant difference.

Examination of studies that focused on fluency indicated that repeated oral reading procedures that included feedback had a positive effect on word recognition, fluency, and comprehension across all students. Nevertheless, activities that only emphasized increased time spent reading (e.g., independent silent reading) without proper feedback showed little effect.
This is not to suggest that silent reading is not beneficial but, rather, that in developing fluency students need more support than practice alone.

Results from the meta-analysis that examined comprehension strategies offered several conclusions. The first is that instruction in vocabulary, whether through incidental teaching or repeated activities, is beneficial in improving children's comprehension skills. The results also indicate that specific instruction in a number of comprehension strategies (e.g., question answering, question generation, story structure, graphic organizers; see NICHD, 2000, p. 15, for a full description of strategies) is the most effective technique for increasing a student's reading comprehension. Finally, the results suggest that teachers should be sufficiently trained and skillful in their ability to adapt and respond to students' needs for feedback and for altered teaching strategies.

While the National Reading Panel's report (NICHD, 2000) was certainly not without its detractors (Cunningham, 2001; Strauss, 2005; Weaver, 2002), when combined with the findings of Adams' (1990) book and the Preventing Reading Difficulties report (Snow et al., 1998), the case for the teaching of phonics within a larger reading program was certainly justified. While these sources do recommend the teaching of phonics, they also suggest that the skills developed through phonics programs are necessary but not sufficient for becoming a proficient reader. Within a balanced reading program, students should also receive explicit teaching of fluency, vocabulary, and reading comprehension strategies. One approach to reading that contains all of these components is known as Direct Instruction (DI).

Direct Instruction Reading

Defining Direct Instruction

Direct Instruction has been defined and described in various ways since its development. The most common misconception is that DI is nothing more than teacher-directed instruction, as opposed to the child-centered approach in which the teacher acts as a facilitator for students (Adams & Engelmann, 1996). The term "direct instruction" was first introduced by Rosenshine (1976) as part of his examination of instructional variables related to effective teaching practices. Rosenshine's direct instruction (not capitalized) refers to teaching activities focused on academic matters in which goals are clear to the students, time allocated for instruction is sufficient and continuous, the content covered is extensive, student performance is monitored, questions produce many correct responses, and feedback to students is immediate (Becker & Carnine, 1978). Direct Instruction has thus often been misperceived as any systematic instruction with these features (Stein, Carnine, & Dixon, 1998). For the purposes of this paper, however, DI refers specifically to the teaching model developed by Bereiter and Engelmann (1966). The developers of this instructional approach based their theory on the belief that virtually all students can experience success and that, if they do not, there is a problem with the instructional design.

Historical Origins and Development

Unlike most child-centered models, DI evolved out of work with students from economically disadvantaged backgrounds (Becker & Carnine, 1978).
In contrast to whole language's Piagetian-influenced philosophy (Raines & Canady, 1990), DI did not endorse withholding instruction from students until they reached specific levels of developmental readiness (Marchand-Martella et al., 2004). Instead, Bereiter and Engelmann developed their program so that it could be implemented immediately with children struggling to keep pace with their normally developing peers. Instruction in the University of Illinois-affiliated preschool run by Bereiter and Engelmann consisted of intensive, sequenced presentations delivered through teacher-directed verbal instruction for two hours per day in small homogeneous groups (Marchand-Martella et al., 2004). This systematic approach to instruction resulted in strong progress in academic achievement for these disadvantaged students. After Bereiter left to take another position, Wes Becker was asked to become co-director of the preschool program. Shortly afterward, and encouraged by the results being produced by the program, Engelmann and Becker were asked by the Office of Education to design a program for children from kindergarten through third grade. The program that was designed became known as DISTAR (Direct Instruction System for Teaching Arithmetic and Reading) and was one of the models included in what would become known as Project Follow Through. Project Follow Through Originally designed as a comprehensive program to extend Head Start into the elementary grades, Project Follow Through shifted its focus from service to research when funding decreased (Becker, 1977). The project was redesigned to select, test, and evaluate different educational approaches and to collect data on their effectiveness in teaching disadvantaged children from kindergarten through third grade. Project Follow Through remains to date the largest educational experiment in history, involving over 100,000 children from 170 communities throughout the U.S. (Marchand-Martella et al., 2004). The initial testing phase lasted from 1968 to 1976, with the project continuing as a service program until 1995. State, school, and national officials nominated school districts that could stand to benefit from this study, and these school districts were able to choose among 22 different educational programs. After a presentation on each model, schools selected and implemented their sponsor's program in at least one school in their district. In order to be included in the actual data analysis, however, the educational program had to be implemented in three or more active school sites that could be compared to control school sites within the same community. Using this criterion, 9 of the 22 programs qualified to be included in the evaluation (Adams & Engelmann, 1996). Data from the schools were collected and analyzed across three kinds of outcomes. The first measure examined basic skills, such as word recognition, spelling, language, and math computation. The second measure examined cognitive-conceptual skills, including reading comprehension, math concepts, and problem solving. The final outcome examined was children's affect (i.e., self-esteem/self-concept). Data were collected and analyzed by two independent agencies. Analysis of the results examined differences between the educational programs and their control groups (locally and nationally), as well as comparisons among all nine of the sponsors' models (see Table A2 for a description of each model).
Analyses comparing the Follow Through models with control sites that used the school districts' normal curricula are displayed in Figure 1. If the scores on the outcomes described above were statistically significantly higher than those of the control group, the result was considered positive. If the control group scored higher than the Follow Through program and the difference was statistically significant, the result was considered negative. Non-significant differences were scored as zero and are represented by the vertical line in Figure 1 (for a full description of how these data were analyzed, see Adams & Engelmann, 1996, pp. 71-72). As the results show, DI was the only model that produced significant positive outcomes in all measurement categories. While DI is described as a basic skills model, it is interesting to note that it scored higher in the cognitive-conceptual area than any other model, including the cognitive-conceptual programs themselves (TEEM, Cognitive Curriculum, and Parent Education). Additionally, DI produced positive results on affective (self-esteem) measures. These two findings are particularly important, as critics of DI often suggest that it promotes only rote learning and that its focus on explicit and systematic instruction could lead to student overexertion and low self-esteem (Adams & Engelmann, 1996). Data that demonstrate the achievement differences between the nine models are presented in Figure 2. Figure 1. Project Follow Through results: index of significant outcomes (affective, cognitive, and basic skills) for all nine models (adapted from Marchand-Martella et al., 2004). Figure 2. Project Follow Through results: comparison of third-grade students on the Metropolitan Achievement Test (Total Reading, Total Math, Spelling, and Language percentile scores); the dashed line indicates the national average for disadvantaged children (adapted from Marchand-Martella et al., 2004). When examining these data, it is important to know that although the national average is the 50th percentile, disadvantaged students' mean scores are typically at the 20th percentile (Marchand-Martella et al., 2004). It is appropriate, then, to use the 20th percentile as a benchmark for measuring the effectiveness of each program. As Figure 2 shows, DI was the only model whose instruction raised students' scores above the 20th percentile in every category and, in three categories, to near the national average. The Behavior Analysis model was the only other program with all four measures above the 20th percentile, although its results were not as strong as those of DI. The data collected and reported by Project Follow Through clearly show that the DI approach to teaching produced much more success than any other program used in this study. Not surprisingly, proponents of the other methods questioned the findings of the study (House, Glass, McLean, & Walker, 1978), suggesting that the tests that were used were inappropriate and that the data analyses conducted by one of the testing agencies were flawed. Bereiter and Kurland (1981, 1982, as cited in Adams & Engelmann, 1996) followed the suggestions made by House et al.
(1978) and reanalyzed the data. Analysis of these data again showed that DI was the most effective out of all of the programs. Despite the empirical support of DI?s effectiveness, schools throughout the country did not rush to implement this model. Advocates of DI have suggested that due to the model?s philosophical approach (i.e., not child-centered) educational administrators have been reluctant to provide their endorsement (Carnine, 2000, as cited in Kim & Axelrod, 2005). Regardless of the educational system?s reluctance to recommend the use of DI, the data have clearly shown that it is an extremely effective program in teaching not only reading, but other subject areas as well. In order to understand why DI has been so 26 proficient in addressing the academic needs of students (especially disadvantaged), it is necessary to examine the components and strategies that underlie its design. Components of Direct Instruction Reading What Makes Direct Instruction Effective While there are a number of programs that are capable of producing acceptable academic achievement, few, if any, incorporate such a well designed, scientifically based curriculum. The strategies used within DI programs have been selected and tested prior to their implementation to assure their effectiveness. The following section describes some of the key features contained within DI programs and why they have led to such consistent results. Teaching Techniques The manner in which a teacher presents lessons is as important as the instructional design underlying the content being presented. Different teacher presentation techniques are appropriate for different stages of reading instruction, and DI programs allow for this to occur. For example, when children are first learning to read, instruction is in small groups and is highly interactive, with children primarily making oral responses and the teacher providing immediate feedback on their responses. Once children have learned to read accurately and with fluency, reading instruction techniques become more varied. The instruction in an upper-grade classroom in which all children are performing at grade level will include a variety of instructional activities. Sometimes the teacher may present lessons to the entire class. Other times, children may work collaboratively by providing feedback to each other. If some children have difficulty with a particular concept or strategy, the teacher may provide small-group instruction to the struggling students. 27 Another example of how a specific technique is altered at different times during a student?s reading acquisition is the teacher?s monitoring of student performance. During early reading instruction the teacher listens to oral responses and watches children?s mouths to see how they are pronouncing words (Carnine et al., 2004). Monitoring in the later grades focuses more on the teacher reviewing student?s written work and providing a combination of oral and written feedback. Scripted Presentations Direct Instruction teachers learn to follow lesson scripts very carefully. The use of detailed lesson scripts has been criticized because it presumably restricts and inhibits teacher initiative (Adams & Engelmann, 1996). In considering this possible limitation, it is important to evaluate some virtues afforded by the use of scripts. Scripts allow for the use of explicitly pre-tested examples and sequences. The teacher knows that if the student has the prerequisite skills, the teaching sequence will work (Engelmann & Carnine, 1982). 
The teacher does not have to spend time trying out various possible illustrations, choosing appropriate language, and analyzing possible teaching sequences. The scripts also make explicit the teacher behaviors required to follow them. Thus, the training requirements for a given program can be formalized in detail and executed. Also, a supervisor of a scripted program can walk into any room and within a few seconds be oriented to what should be going on and thus evaluate the situation and provide appropriate feedback. Finally, because the teaching sequence is standardized, it is easier to monitor the progress of the children with program-based tests. Although scripted presentations are not necessary or even desirable in all areas or levels of education, they 28 most certainly can serve an important role when dealing with universal competencies for children. Sequencing of Skills In using the scripted presentations provided within DI programs, the teacher has the benefit of providing students with instruction in strategies that have been carefully sequenced in order to prepare them for the tasks that follow. Direct Instruction utilizes four main guidelines in determining the sequence of skills. First, DI programs always teach the prerequisite skills necessary to be successful in the academic task. For example, students will develop phonemic awareness before they are required to begin sounding out single words. Second, examples that are consistent with the strategy that is being taught are provided first. Once the child is able to master the rule, then exceptions to it are taught. For example, DI teaches students the most common letter-sound correspondences first, then children are introduced to words where the same letter makes a different sound. The third guideline that DI follows is that easier skills should be taught before more difficult ones. By increasing the student?s chance of success early in the instructional period, the child will be more likely to participate in more difficult exercises later on (Mace et al., 1988). Finally, DI separates information that is likely to be confused by a number of lessons. For example, letters that look similar (e.g., b and d) or that sound similar (e.g., /m/ and /n/) are introduced with a number of lessons between them; over 90 in the case of b and d (Carnine, 1976a, 1981). Small-Group Instruction The use of small-group instruction composed of individuals who are at the same instructional level has many advantages. In general, they are more efficient than one-on- 29 one instruction (Biberdorf & Pear, 1977; Fink & Sandall, 1979) and provide for more teacher direction, prompts, reinforcement, correction, and individualization than is found in large-group instruction. They allow for an emphasis on oral communication, which is frequently a problem with individuals with mental retardation and other disabilities (Scruggs & Mastropieri, 1993). Finally, small groups provide a setting where repetitious practice on important building blocks can be made fun and where other individuals can be used as models. Choral responding During the beginning reading and early primary stages, the instructor should maximize students? responding by structuring tasks to incorporate unison responses (Carnine et al., 2004). Unison or choral responding is when all of the students that are receiving instruction answer at the same time, which helps to create a high level of student participation. 
Much of the instruction in the beginning and early primary stage is suitable for unison responding since the tasks usually have only one correct answer. For example, when shown the letter "s" and asked, "What sound?" the students answer, "sss." The main advantage of frequent unison responses is that all students actively practice each skill throughout the instructional period. Unison responses also provide the teacher with frequent information about each student's performance. Signaling The scripts used in small-group teaching tell the teacher how and when to give signals for the group to respond together. The effective use of signals (i.e., cues given to let students know when to respond) allows all students the opportunity to respond. If signals are not used, or if they are not used in an effective manner, it is likely that some of the higher functioning individuals in the group will respond before the lower-performing individuals have a chance to organize their responses (Carnine et al., 2004). This may result in the lower functioning students either repeating the same responses as other individuals in the group or not responding at all. This, in turn, may limit the lower functioning students' ability to master the material being taught. An example of a visual signal may be as follows (see Figure 3). When sounding out a word, a finger is used to point to the letter being sounded out. The students say the sound as long as the teacher touches the letter. The teacher moves his or her finger from sound to sound as they are to be said and lifts the finger away at the end of the word. This signaling procedure ensures that students blend the sounds, which minimizes word misidentification errors. Also, by pausing for a moment before signaling, the teacher provides instructionally naïve individuals with the few extra seconds needed to come up with the answer. Pacing Appropriate pacing contributes to student attentiveness and reduces errors. Students are usually more attentive to a lively, fast-paced presentation than to a slow, deliberate one (Carnine, 1976b). Also, frequent responding, which results from brisk teacher questioning, often enhances student attentiveness and increases the amount of practice the students receive. Nevertheless, quickly going from question to question does not mean that the teacher rushes students and requires them to answer questions without adequate time to determine the answer. The pause before the signal may be adjusted to allow for more time, especially during difficult tasks or with a group of individuals who are instructionally naïve (Adams & Carnine, 2003). Figure 3. Example from Reading Mastery Plus Level K displaying the special orthography and signaling procedure (Engelmann, Osborn, Bruner, Engelmann, & Seitz-Davis, 2002). Error Correction Detecting and immediately correcting students' errors is essential in accelerating their learning. The correction procedure in the DI model is directed toward the entire group and may involve as many as five different steps (i.e., model, lead, test, firm-up, delayed test). For example, if during a telescoping format (i.e., translating a series of blended sounds into a word spoken at a normal rate) the teacher says, "ffffiiiit," and after the signal a student responds "fid," the teacher would immediately correct the student's mistake and model the entire task. The teacher would say the correct word ("fit") and then lead the students in the same task by responding with them.
This ensures that students will hear a correct response as they continue to practice. Third, the teacher tests the students as they respond by themselves. Once this is complete, the instructor returns to the beginning of the exercise and presents the previous material in addition to the material that was just missed. This allows students to gain repeated practice with missed items. The final step is a delayed test during which the group or particular individuals are tested on the missed items during a point later in the lesson. This may be done several times throughout the lesson to provide multiple opportunities to practice more difficult items. Teaching to Mastery Possibly the most important feature of the DI model is the amount of surplus practice that is provided in the lessons. The DI programs provide cumulative review of earlier taught material and stress that once a concept or strategy is introduced, it is used frequently. Through the continuous assessment of each student?s ability, DI programs are able to determine if additional instruction is necessary. Adams and Engelmann (1996) 33 suggest that students should perform at least at 70% correct responses on any component of the lesson that has been introduced in the preceding lesson, 90% correct responses on components that have been introduced more than 2 lessons earlier, and at virtually 100% correct responding on material at the end of the current lesson. Failure to teach to mastery may lead to problems for students during later lessons. The inability to correctly identify letter-sound correspondences during beginning reading will greatly influence the individual?s ability to properly sound out words later in the program. If the instructor follows the guidelines described in the DI program, the process of teaching to mastery should be relatively efficient and eventually the students will require fewer repetitions each teaching session to reach mastery. Summary The components listed in the previous section encompass the basic principles utilized in the DI model of reading. These techniques provide the groundwork for the rest of the strategies used within each specific program (e.g., Reading Mastery, Corrective Reading). With a general understanding of these techniques it is now possible to turn to the research that has been conducted on DI reading programs. Research on Direct Instruction Reading Typical, At-risk, and Special Education Students The influence of DI programs on students? reading achievement has been investigated since its inception. The most thorough and longitudinal study of DI occurred during Project Follow Through. As the results from that study showed, DI has been extremely successful in improving the reading achievement of children from disadvantaged backgrounds. In addition to children from low-socioeconomic backgrounds, a variety of 34 other students have benefited from DI programs. Studies examining the effects of these programs on children with traumatic brain injury (Glang, Singer, Cooley, & Tish, 1992), epilepsy (Humphries, Neufeld, Johnson, Engels, & McKay, 2005) and even non-English speaking countries (Grossen & Kelly, 1992; Nakano, Kageyama, & Kinoshita, 1993) have all produced positive results. With an abundance of research conducted on DI and at-risk (i.e., racial minority, low- SES), typically developing, and special education students (for reviews see Gersten, 1985; Schieffer et al., 2002), a full review of this literature would be excessive. 
Instead, a brief overview of two meta-analyses conducted on DI research will be presented. In 1988, White conducted a meta-analysis that examined the effects of DI on the achievement of special education students. The analysis included 25 studies in total that compared DI programs to other instructional methods. Results from this analysis showed that no measure in any of the studies significantly favored the comparison group, while 53% of the measures significantly favored the DI groups. Furthermore, the calculated effect size for reading (decoding and comprehension) was 0.84, which is far above the effect size (.25 or 1/4 th of a standard deviation) that is usually considered as educationally significant (White, 1988). A further comparison between students with mild disabilities versus those with moderate to severe disabilities revealed no significant difference in the mean effect size, suggesting that the DI programs were effective across a range of disabilities. Data from the analysis also indicated that the DI programs were beneficial for students across a range of grades, elementary through secondary. More recently, Adams and Engelmann (1996) conducted a meta-analysis of all published DI comparison studies regardless of the student population. In order to be 35 included, however, the studies did need to meet several guidelines, including: (a) reported pretest scores, including means, standard deviations, and sample sizes; (b) multiple sessions of program implementation, (c) the use of a formal DI program and not just strategies included within these programs, (d) a non-single subject design (although the authors do note that these designs are appropriate, an effect size cannot be calculated); and (e) a comparison group (Adams & Engelmann, 1996). A total of 37 studies met the above criteria and were included in the analysis. Overall, the results indicated that the DI programs were more successful at producing a statistically significant increase in student achievement than comparison groups; 64.1% of the studies favored DI as opposed to only 1.2% of non-DI programs (34.7% of the studies produced no statistically significant difference). The effect size for this analysis was 0.69, which was lower than White?s (1988) findings, but still considerably above the educational significance benchmark of 0.25. The findings from these two meta-analyses become even more impressive when you compare them to Stahl and Miller?s (1989) meta-analysis on whole language programs that resulted in an effect size of 0.09. Students with Developmental Disabilities In contrast to the research that has been produced on typically developing, at-risk, and learning disabled students, there is a paucity of research examining the effects of DI on the teaching of reading skills to children with developmental disabilities. Research on reading by these children was virtually nonexistent prior to the late 1960s because of an emphasis on other types of skills and the general belief that these children could not learn how to read (Conners, 1992). Early research suggested that this belief was misguided; it showed that behavioral techniques could be powerful in teaching a basic sight-word 36 vocabulary to these students (Brown, Huppler, Pierce, York, & Sontag, 1974; Brown & Perlmutter, 1971). Since that time much of the instruction and research on reading by children with developmental disabilities has focused on sight-word approaches (Conners, 1992). 
Nevertheless, the development of DI programs has led to an examination of teaching reading using a phonics-based approach. Previous reviews of the literature on teaching reading skills to children with developmental disabilities (Conners, 1992, 2003; Gersten, 1985; Joseph & Seery, 2004; Katims, 2000; Lockery & Maggs, 1982) revealed several studies (Cohen, Heller, Alberto, & Fredrick, 2008; Conners, Rosenquist, Sligh, Atwell, & Kiser, 2006; Hoogeveen & Smeets, 1988; Hoogeveen, Smeets, & Lancioni, 1989; Hoogeveen, Smeets, & van der Houven, 1987; Singh & Singh, 1988; van Bysterveldt, Gillon, & Moran, 2006) that examined the use of phonics in teaching reading to children with developmental disabilities; however, they did not use DI curricula. Several studies were identified that used DI programs. One of the earliest studies to assess the effects of DI on individuals with mental retardation was Bracey, Maggs, and Morath (1975). Bracey and her colleagues tested the progress of six moderately mentally retarded children (IQs 30-40) who had all been placed in residential care facilities for a minimum of five years. The children ranged in age from 7-14 years old and, at the outset of the study, were unable to read single words by sounding out. Additionally, all children had speech impediments of varying degrees. Bracey et al. (1975) used the DISTAR Reading Level I program during which the children were presented with tasks related to blending (e.g., segmenting and telescoping) that focused on having the children learn to reproduce sounds and words when they were 37 presented slowly, and with rhyming exercises that were designed to make the children more aware of the parts of words and of the similarities and differences between them. The teacher provided instruction for 15 to 30 minutes everyday (for 2 years) on an individual basis rather than the suggested small-group format. While students were receiving instruction from the teacher, the other five individuals worked independently on worksheet activities from the reading program. The children were pre- and post-tested on the mastery tests contained within the reading program. Results from the study show that the children displayed significant improvement in the subskills of blending sounds, segmenting sounds, letter-sound correspondence, and sounding out. The conclusions of this study are limited due to the fact that they did not use a control condition; however, the fact that previously illiterate children displayed impressive gains demonstrates that the DI program more than likely was effective. In a project designed to demonstrate procedures for the systematic examination of individual rates and accuracy of progress in reading programs for moderately retarded children, Apffel, Kelleher, Lilly, and Richardson (1975) used two different reading programs. The authors examined the effects of Rebus, a whole-word approach method that focused more on independent work, and DISTAR Reading on the rate and accuracy of individuals? responses. Sixty students (no ages provided) who were identified as moderately retarded were split into two groups, with one receiving the DISTAR program and the other the Rebus program. Reading instruction was provided for 30 minutes per day in small-groups (approximately 4 children) over the course of one year. Every four weeks students were 38 tested using the mastery tests provided within the reading program to assess their acquisition of the specific skills (e.g., letter-sound correspondence, blending). 
The results showed that individuals in both groups were able to acquire some of the skills required to begin reading. Although the study did not compare the two programs, the results showed that children in the DISTAR group performed at a much higher level (rate and accuracy) than the students in the Rebus program. In a later study by Booth, Hewitt, Jenkins, & Maggs (1979), DISTAR Language and Reading programs were provided to 33 children (IQs 35-55) who ranged in age from 8-14 years old. Of the original 33 children, only 12 students participated in the reading portion of the study. Students received approximately 32 months of daily instruction using the DI reading program over the course of the study. To assess effects, students were tested at the end of each school year on several measures including the DISTAR mastery in reading test, Peabody Picture Vocabulary Test, Neale Analysis of Reading Ability, and Baldie Language Ability Test. Results from the study showed that the children who received the DI programs mastered most of the basic literacy skills tested. Prior to instruction, the children with moderate mental retardation were learning at about the rate of two months development for each five calendar months. After the program, the scores showed that most children gained 34 language and reading months during the 32 months of instruction, a rate that surpassed the development of children from the control group. Although the results are impressive, there are several concerns with this study. The fidelity of the program must be questioned. Although the authors state that instruction was provided for 32 months, they did not describe any of the daily procedures (i.e., group size, actual time spent instructing, etc.). In addition, although they used the mastery tests 39 in the program, they did not report any of those results, and chose to focus on a language test that required the students to write out answers, rather than provide any oral responses, which would have directly measured decoding skills. The results reported are convincing; however, given the limited description of the dependent measures and instruction time, generalizations should be cautioned. Gersten and Maggs (1982) examined the long-term effects over a five-year period of DISTAR Language and Reading instruction on 12 instructionally na?ve children in the high-moderate range of mental retardation, ranging in age from 6-12 years old. Reading instruction was begun 6 months into the first year and accompanied language instruction each day for 30 minutes. The students were pre- and post-tested on the Stanford-Binet Intelligence Test, and only post-tested on the Peabody Picture Vocabulary Test and the Baldie Language Ability Test. The Stanford-Binet scores were then compared for relative gains against the norm group. All of the children progressed to the final level of the reading program within the five-year period, and their IQs increased significantly more than would have been predicted by regression to the mean (41.9 to 50.6). O?Connor, Jenkins, Cole, & Mills (1993) compared the effects of Reading Mastery to another phonics based program entitled ?Meet the Superkids.? Like Reading Mastery, ?Superkids? introduces letter sounds in isolation, teaches sound blending, and selects reading vocabulary that have regular (i.e., most common sound) decodable spellings. Nevertheless, ?Superkids? adopts an entirely different stance on other aspects of program design. 
Direct Instruction programs stress faultless communication (Engelmann & Carnine, 1982). As discussed previously, DI programs separate letters and sounds that are auditorily or visually similar because when clustered, they are difficult to discriminate 40 (Carnine 1976, 1981). In contrast, ?Superkids? clusters letters with similar visual and auditory features with the belief that it will facilitate learning. Similar to the contrast in the order of letter-sound correspondences with DI programs, ?Superkids? does not use a specific error correction procedure, require that skills be taught to mastery, or use cumulative review. The study was conducted over a four-year period and used 81 6-year old children (divided into two groups) who demonstrated a deficit in cognitive development. The authors did not provide any other criteria (e.g., IQ) other than to say that these individuals all scored at least 2 standard deviations below the norm on a cognitive development test. Reading lessons occurred daily for 30 minutes in small homogeneous groups based on ability. All students were tested using the McCarthy Scales of Children?s Abilities and the Test of Early Reading (TERA) at the onset of the program. The California Achievement Test (CAT) was also administered at the beginning of the second year. The results showed that there was no significant difference in achievement between either of the two groups either at the end of the treatment year or at the follow-up testing one year later. Nevertheless, when the authors compared students who had made ?advanced progress? in both programs, students in Reading Mastery registered larger reading gains. Examination of this study offers several interesting insights into research on DI programs with children with mental retardation. First, the fact that there was not a significant difference between the two different treatment groups suggests that the determining factor in DI?s effectiveness may be phonics instruction and not the structured design features (e.g., scope and sequence, error corrections, etc.). Second, it is of interest to note whether the relative efficacy of DI reading for young children with mental 41 retardation is limited to relatively ?higher performers.? The aforementioned studies (Bracey et al., 1975; Gersten & Maggs, 1982) suggest that this is not the case, but these studies provided instruction for a longer period of time than the current study. The question to ask then is whether a one-year treatment period is sufficient to provide children with mental retardation with a solid base in reading skills. In a more recent study of the use of the DI model on teaching reading to children with mental retardation, Flores, Shippen, Alberto, and Crowe (2004) instructed children on letter-sound correspondence. The participants in the study were 6 children (ages 8-13 years) enrolled in a public elementary school, who were diagnosed with mental retardation (IQs 38-52). Prior to the use of the DI program (Corrective Reading), the children had been receiving instruction using the Edmark Reading Program, which uses a sight-word reading approach. Before beginning DI instruction, the children were given a criterion-referenced assessment of letter-sound correspondence, during which none of the students were able to correctly identify any of the letter sounds tested. The students received instruction on letter-sound correspondence, segmenting, blending, and word decoding according to the design described in Corrective Reading. 
The authors tested each student on five different measures that corresponded with the main phases of instruction. Probes for single letter identification presented the student with the target letter and several distracters. Students were then instructed to say the letter?s sound. Probes for discrimination and blending were used in which students had to correctly discriminate between letters and then blend the sound of the target letters together. Probes for decoding ? slow, presented CVC (consonant-vowel-consonant) words and asked the students to say the words slowly (e.g., ?sssaaat?). Probes for 42 decoding ? fast used the same presentation and required the students to say the word fast (e.g., ?sat?). Blending probes were also administered for words that had been explicitly taught (e.g., ?sam?) and for several words that had not been taught, but used the same letters (e.g., ?mat?). Probes were administered three times a week at separate times from reading instruction. During baseline, all of the students identified the target letters at 0% accuracy. The criterion for moving to the next target letter was three consecutive errorless probe trials. On average, during testing of the first target letter (/m/), students took 9.5 trials to reach mastery (range of 5-16 trials for the 5 students). Probes for the second target letter (/a/) resulted in mastery on average in 4.5 trials (criterion was met on average for the third (/s/) and fourth (/t/) letters in 3.2 and 4.4 trials, respectively). Testing continued with the examination of letter-sound discrimination and blending. These results showed that 4 of the 5 children reached criterion in the minimum of three probes (M = 3.2). All of the students demonstrated criterion level performance in blending and telescoping the instructed words in three or four probes. After four weeks, a follow-up probe of the three available students was given, and they all demonstrated mastery of the letter-sound correspondences. The results of this study clearly showed that individuals with mental retardation could learn letter-sound correspondences, and the decrease in trials to criterion suggests that students learned the generalized relationship between letters and sounds. One student, however, was unable to complete the program, as he could not pronounce the letter ?s? due to an articulation problem. In addition, he consistently responded to the presentation of the letter ?t? by saying the letter name rather than the sound. This student?s results 43 suggest a possible limitation to the use of DI programs with persons with mental retardation. If an individual has a severe language articulation disorder, reading through systematic decoding may not be an appropriate reading approach; however, this result may be idiosyncratic, and future research is necessary to address whether articulation may be a limiting factor in DI instruction. In the most recent study of the effects of a DI program on the acquisition of reading skills, Infantino and Hempenstall (2006) conducted a case study on a child with Autism Spectrum Disorder (ASD). The participant in the study was a seven-year-old male diagnosed with high functioning ASD who was enrolled in a mainstream primary school. The student?s teacher and parents both reported that he was having difficulty with reading and comprehending text. The authors of the study used the Corrective Reading Program ? Decoding Strand Level A as a means to try to increase the child?s reading skills. 
Presentations of the lessons contained within the program were provided by the child?s parents after they had received training and feedback on the program from the study?s authors. Pre- and post-tests were conducted on a number of batteries, including: (a) the Comprehensive Test of Phonological Processing (CTOPP), which measures phonological awareness, rapid naming, and phonological memory; (b) the Wide Range Achievement Test (WRAT) Word Recognition subtest, which assesses the individual?s ability to recognize words presented in a list; (c) the Woodcock Tests of Reading Mastery (WTRM), which assesses decoding skills using pseudowords; (d) the Spadafore Diagnostic Reading Test (SDRT), which measures listening comprehension and silent reading comprehension; and (e) the Dynamic Indicators of Basic Early Literacy Skills 44 (DIBELS) test, which examined the students reading fluency. Lessons from the program were scheduled to occur five days a week for 25 to 30 minutes. Nevertheless, actual lesson instruction occurred on average for only three days a week, for a total of 22 weeks. Results from this case study showed that the individual made strong gains in both listening and silent reading comprehension, improving almost two grade equivalents. Improvements in fluency (29 words per minute to 43 words per minute) and receptive language were also recorded. Interestingly, the child?s scores in phonological and decoding skills showed no statistically significant gains. The authors suggested that this may have occurred for a number of reasons. First, it has been previously reported that, in some cases, children with ASD have focused more on the whole word rather than its components (Fontenelle & Alarcon, 1982), making difficult the use of a phonics-based program. If this were the case, teaching reading through whole word recognition would become unproductive, as individuals can only store a limited amount of visual information (Share, 1995). This would in turn, make it very difficult for individuals with ASD to decode the majority of novel words. The second possibility for why the child did not progress in phonological and decoding skills is that the program that was selected (i.e., Corrective Reading) only offers a brief emphasis on these skills. When combined with the fact that the program was not implemented as frequently as it was designed to be, this may most likely be the reason for the individual?s lack of progress in the acquisition of these skills. Selection of a program that focuses more on teaching these skills may have led to a greater improvement in this area. 45 Despite the study?s findings that a DI program can be effective in increasing reading skills for a child with ASD, the results are limited. Given that the study only involved one child, the lack of generalization is obvious. Improvement in reading skills could have been due to maturation, although this is most likely not the case. The program was also administered at home, where the authors did not have full access to the session. Additionally, the fact that the child was diagnosed as ?high functioning? limits any generalizations to other children with ASD who are considered lower functioning. The results do provide some promise though for children with ASD. Direct Instruction programs have clearly shown their ability to help individuals effectively teach children of various backgrounds the skills necessary to begin reading. 
Nevertheless, the question of the full potential of the DI model to improve children with developmental disabilities? performance in reading has not been systematically addressed. A Direct Instruction Approach to Teaching Children with Developmental Delays How to Read Previous research has demonstrated the ability of DI programs to increase the acquisition of reading skills in older, higher functioning children from a variety of backgrounds, including those with developmental delays (Adams & Engelmann, 1996); however, research on children with developmental delays who are younger and lower functioning has not been as systematic. When identifying possible DI programs to use with this program, it is important to recall the recommendations of the National Reading Panel (NICHD, 2000). The panel suggested that beginning reading programs should: (a) teach phonemic awareness explicitly, (b) provide sequenced phonics instruction, (c) explicitly teach blending and segmenting, and (d) build fluency through repeated 46 presentations with appropriate error corrections and feedback. Reading Mastery Plus ? Level K (Engelmann et al., 2002) is one DI program that incorporates all of these recommendations. The paucity of research on the effects of DI on children with developmental delays dictates that research first demonstrate whether this form of instruction is effective with children in this population. While the ultimate goal of this line of research is to show that the DI program, and not extraneous variables, is responsible for increased reading abilities, preliminary research should first be focused on how children with developmental delays respond to reading instruction using the DI curriculum. Future experimenters may then address specific components within the DI program that may be altered to maximize the effectiveness of the curriculum. The purpose of the present study was to examine the effects of the Reading Mastery Plus ? Level K program on preschool students with developmental delays. 47 CHAPTER II: EXPERIMENT PROPER Method Participants Participants were recruited from two sites located in the southern United States. Three individuals diagnosed with a developmental delay and two individuals who were typically-developing participated in the study. Four of the participants were selected from a private, non-profit, integrated preschool specializing in applied behavior analytic services. The preschool served approximately six children with developmental delays and eight typically developing children aging in range from 30-72 months. The staff of the preschool consisted of: (a) a clinical director who was a Board Certified Behavior Analyst (BCBA) and a doctoral student at a local university, (b) an assistant director who was a BCBA and responsible for supervision of the classrooms and practicum students, (c) six students who were enrolled in a master?s program in applied behavior analysis at a local university, (d) three lead classroom teachers who had a bachelor?s degree in either early childhood education or psychology, and (e) approximately ten undergraduate students who were enrolled in an experiential learning class at a local university. The children diagnosed with developmental delays received approximately 2 hours of one-to-one instruction over the course of the school day (8:00am to 2:30pm). 
Individual instruction varied depending upon the participant?s skill deficits, but typically included sessions that focused on adaptive behavior skills (e.g., learning their phone 48 number or address), social skills, and academic instruction (e.g., counting, etc.). Prior instruction in beginning reading skills for the four participants at the preschool consisted of singing the alphabet song, which was then followed by practice in visually identifying the target letter for the day and modeling the teacher in saying its sound. The children also spent time listening to the teacher read from storybooks and looking through other developmentally appropriate texts. The fifth participant (Omar) attended a non-profit school that served children ages 2- 12 with autism spectrum disorder. The school?s treatment approach utilized behavior analytic principles to help each student obtain the goals and objectives identified for them. Approximately 15 students who were either enrolled in the half-day program (8:30am to 11:30am or 11:30am to 2:30pm) or the full-day program (8:30am to 2:30pm) attended the school. The staff at the school consisted of: (a) two doctoral level BCBAs, (b) three master?s level BCBAs, and (c) approximately 10 instructional specialists that had a minimum of a bachelor?s degree in education or a related field. A 1:1 student- teacher ratio was provided, with some small-group instruction provided as appropriate (specific to a student?s goals and objectives). While typically developing children normally begin to read starting around the age of 72 months, research has shown that some children as young as 30 months old have benefited from explicit instruction in phonological awareness (Lonigan, Anthony, Bloomfield, Dyer, & Samwel, 1999; Lonigan, Burgess, Anthony, & Barker, 1998; Weisberg & Savard, 1993). Participants who were below 36 months of age were not included in the study (Burack, Iarocci, Bowler, & Mottron, 2002). 49 The first participant, Allison, was a five year-old, Caucasian female diagnosed with Pervasive Developmental Disorder, Not Otherwise Specified (PDD-NOS). Allison was diagnosed by a clinician at a regional center that specializes in the diagnosis and treatment of developmental disabilities. Allison?s Autism Quotient (AQ) score on the Gilliam Autism Rating Scale (GARS; Gilliam, 1995) was 60. The AQ has an average of 100 and a standard deviation of 15 and is designed to indicate the likelihood that the individual has autism. According to the test manual, AQ scores are associated with the following probabilities of having autism: below 69, Very Low; 70-79, Low; 80-89, Below Average; 90-110, Average; 111-120, Above Average; 121-130, High; and above 131, Very High. Allison?s AQ score indicated a ?very low? probability of having autism category. While the GARS has been shown to underestimate the likelihood that children with autism would be classified as having autism (Lecavalier, 2005; South, et al., 2002), Allison?s extremely low score is most likely an accurate indicator of her abilities. Allison was also administered the Gilliam Asperger?s Disorder Scale (Gilliam, 2001) and received an Asperger?s Disorder Quotient of 72, which corresponded to being ?borderline? for having Asperger?s Disorder. Allison?s social interaction and spontaneous language usage were more limited than for typical peers and generally were related to specific personal requests or were in response to others? initiations. 
She would participate when called upon during normal academic instruction, but displayed inappropriate behaviors several times per week, including whining and crying in response to loud environments and intermittently engaging in hair-pulling behavior (i.e., pulling a small amount of hair from her scalp) during academic tasks. Nevertheless, these behaviors were not observed during any of the sessions conducted throughout the study. The second participant, Danielle, was a five year-old, African-American female diagnosed with PDD-NOS. Danielle was diagnosed by a clinician at a regional center that specializes in the diagnosis and treatment of developmental disabilities. Danielle received an AQ of 83 on the GARS, which indicated a "below average" probability for autism. Danielle's language skills were below average for a child of her age. She received a score of 78 on the Preschool Language Scale - 4 (PLS-4; Zimmerman, Steiner, & Pond, 2003), which corresponded with an age equivalent of 3 years and 2 months. She would often perseverate on irrelevant topics during social discourse and frequently failed to initiate conversation. Danielle actively participated in academic tasks during normal instruction and did not engage in any problem behaviors throughout the study. The third participant, Megan, was a four year-old, Caucasian female who was typically developing. No standardized scores were available for Megan, but the clinical director of the preschool reported that Megan's language and social skills were normal for a child of her age. She actively participated in all academic tasks and did not demonstrate any behavior problems during instruction. The fourth participant was also a typically developing child. Ricky, a four year-old, Indian male, engaged in all academic tasks during normal instruction and did not display any aberrant behaviors. Standardized assessment scores on various skills were not available for Ricky; however, the clinical director of the preschool reported that Ricky's social and language skills appeared to be average for a child of his age. The last participant, Omar, was a three year-old, African-American male who was suspected of having autism.* At the time of testing, no standardized assessment scores were available. Omar attended the full-day program at the school for children with autism and received instruction in a number of skill areas, including language development, social skills, gross and fine motor skills, and academic skills. Prior beginning reading skills instruction for Omar consisted of singing the alphabet song and visually identifying letters of the alphabet. Omar also spent time listening to the instructor read from storybooks and would look through other developmentally appropriate texts.

Table 1
Participant Characteristics

Participant    Age (Months) at Start of Study    Diagnosis/Development
Allison        71                                PDD-NOS
Danielle       66                                PDD-NOS
Megan          50                                Typical
Ricky          49                                Typical
Omar           39                                Autism*

* Suspected diagnosis (see text).

Omar participated in academic tasks when prompted, but often engaged in off-task behaviors (e.g., looking around the room, getting out of his seat) if the teacher did not redirect him. Omar did not initiate conversations and would engage in stereotyped behaviors, such as rocking back and forth. Additionally, Omar's parents reported that they were having feeding issues with him (i.e., Omar would only eat certain foods). Only individuals with appropriate consent were included in this study.
The consent package included a description of the procedures and possible risks of the experiment. 52 Both non-profit organizations and the participant?s legal guardian were required to provide consent for each individual?s participation before any testing began. At the beginning of each session, individuals were asked if they would like to participate in the activity. If the individuals responded no, they were not required to participate in that day?s session. If the individuals responded yes, they were taken to the training room. Prior to the beginning of the session, all individuals were informed that they could stop the activity at any time. Materials and Setting Training sessions for each child were conducted in an isolated, one-to-one training room at the preschool or non-profit facility during predetermined times when the children would normally be available for one-on-one instruction. Unlike most Direct Instruction reading sessions that are presented in a small group format (Carnine et al., 2004), training sessions for the participants in this study were presented in a one-on-one format, as it was difficult to group them homogenously. Instruction in beginning reading skills was presented using the Reading Mastery Plus ? Level K curriculum (Engelmann et al., 2002). Participants began instruction at the appropriate teaching session (e.g., letter- sound correspondence, blending, etc.) following the administration of a placement test that is contained within the Reading Mastery Plus ? Level K program (see Table A3 for a sample of the questions contained on the placement test). All training sessions were videotaped and reviewed to record data, check for treatment fidelity, and obtain interobserver agreement (IOA) data. The goal of the Reading Mastery Plus ? Level K program is to build a solid foundation of reading skills that permit children to start first grade ahead of where they 53 would start without this program. Through the reading component of this program, children learn letter-sound correspondences and decoding skills (blending and segmenting) that culminates in the reading of single words contained within short workbook stories (i.e., one to three words). Dependent Measures Letter-sound Correspondence (LS). In Reading Mastery Plus ? Level K, participants are initially taught to decode words by pronouncing the phonemes in each word. To accomplish this task, participants must be firm in their letter-sound correspondence. In letter-sound correspondence training, the participants were taught the most common sound for 13 different letters. The number of correct pronunciations, the number of errors, and the rate of correct pronunciations of each target letter were measured. Oral blending ? Say it Fast (SF). In this component of instruction, the participant was required to say the sounds of one-syllable words without pausing between the sounds (e.g., am). The number of correct pronunciations, the number of errors, and the rate of correct pronunciations of words in this category were measured. Oral blending ? Say the Sounds (SS). Activities in this component provided practice for the participant in oral blending without saying the word at its normal pace (e.g., rrraaannn). These activities allowed the participant to practice saying the sounds of words without pausing between the letters. The number of correct pronunciations, the number of errors, and the rate of correct pronunciations of sounds in this category were measured. Oral blending ? Say the Sounds-Say it Fast (SSSF). 
This component consolidated the previously taught skills of saying words slowly (i.e., ?say the sounds?) and saying words at their normal pace (i.e., ?say it fast?). The number of correct pronunciations, the 54 number of errors, and the rate of correct pronunciations of words in this category were measured. Sounding Out (SO). Activities in this component were similar to those in the ?say the sounds? track, except that the children were to read the text on the page rather than have the teacher model the sound. In these lessons, participants learned to pronounce the sounds slowly, without pausing between the sounds. Instruction in this format required the participant to sound out simple typical words (e.g., am) and nonsense words (e.g., ra). The number of correct pronunciations, the number of errors, and the rate of correct pronunciations of words in this category were measured. Reading Vocabulary (RV). The participants began to read regular words (i.e., words in which each letter corresponds with its most common sound) after they learned the letter- sound correspondences of the words being introduced. The first reading vocabulary words begin with continuous sounds (e.g., a, s, m), as children typically pronounce these words easier than words beginning with stop sounds (e.g., t, d, c) (Carnine et al., 2004). Words beginning with stop sounds were introduced later in the program, as well as a few common slightly irregular words (e.g., is). The number of correct pronunciations, the number of errors, and the rate of correct pronunciations of each target word were measured. Procedure The Reading Mastery Plus ? Level K program ideally should be implemented for 25 to 30 minutes each day for five days a week. The experimenter attempted to follow this suggestion; however, due to participant absences and scheduling conflicts, the program was not implemented as frequently as suggested. Instruction occurred approximately 3 55 times per week for 20 to 25 minutes per session, with extended periods (e.g., a week or longer) of non-instructional days occurring throughout the study. The author served as the primary instructor for reading sessions and was assisted by an undergraduate research assistant at the preschool and an instructional support specialist at the other non-profit facility. The author trained both assistants on how to present the lessons from the Reading Mastery Plus ? Level K program. Training occurred over the course of several weeks, with each assistant receiving approximately 15 hours of training. The author used a behavioral skills training approach that utilized instructions, modeling, rehearsal, and feedback to teach the assistants how to implement the Reading Mastery Plus ? Level K program appropriately. The assistants were judged by the author to be proficient in presenting the DI curriculum when they reached 100% accurate responding for three consecutive sessions on the DI checklist developed by Marchand- Martella, Lignugaris-Kraft, Pettigrew, & Leishman, 1995 (see Figures A1-A4). Prior to the beginning of instruction, baseline data were collected on each of the dependent measures described above. Once stable responding in the baseline phase was recorded, the participants began instruction in the Reading Mastery Plus ? Level K program. At the beginning of each session, each participant was asked if he/she would like to participate in the reading activity. 
If the individual responded no, he/she was not required to participate at that time but was approached later in the day and asked to participate again. If the participant responded yes, he/she was escorted to the training area. Prior to the beginning of the session, each of the participants was informed that he/she could stop the activity at any time.

Following the administration of the placement test provided within the curriculum, each child began at the appropriate lesson in the Reading Mastery Plus Level K program. Initial instruction consisted of teaching specific letter-sound correspondences. The Reading Mastery Plus Level K program introduces 13 sounds, with Level 1 of the program covering the remaining correspondences. The presentation of letters followed the sequence described within the program (see Table A4 for the order of letter-sound correspondence introduction). Instruction in letter-sound correspondence was conducted according to the curriculum, which uses a model (i.e., the instructor says the target sound first), lead (i.e., the instructor and student say the sound together), and test (i.e., the student says the sound alone) format.

New letters and skills were introduced once the participant mastered the current target behavior(s). Mastery was defined as response accuracy at or above 90% on skills and information introduced earlier in the program sequence (Adams & Engelmann, 1996; Engelmann, 2007), with response accuracy remaining within this range during further instruction. In addition, participants also had to pass the mastery tests contained within the curriculum in order to move on to the next group of lessons. If the participant did not master the current target behavior(s), the curriculum instructs the teacher to re-present the previous lesson(s) in a shortened "firming-up" procedure that provides the participant with additional practice and feedback on the target behavior(s). If, after 10 consecutive sessions, the participant was not at or above 70% correct responding, the instructor proceeded to the next target behavior; previous research has indicated that some participants may have extreme difficulty pronouncing specific letters and that further instruction may not currently be of benefit (Flores et al., 2004). If the participant was not able to master the next target behavior within 30 sessions, instruction using the Reading Mastery Plus Level K program was stopped.

Oral blending of sounds also began during letter-sound correspondence instruction. The first blending activity required that the participant learn how to "say it fast." Using the same model, lead, test format, the participants learned how to blend the sounds of words together at their normally spoken pace. The participants were also concurrently taught how to segment words, or say them slowly (i.e., phoneme by phoneme). Following the mastery of these skills and the participants' continued learning of letter-sound correspondences, instruction in how to sound out text was introduced. In the sounding out activities, the participant combined the skills of segmenting, blending, and letter-sound correspondence to begin to read regular words that contained the letter-sound correspondences that had previously been mastered. This section culminated in the participant being able to sound out a word and then say it at its normal pace.
The Reading Mastery Plus Level K program concludes by presenting the participant with the reading vocabulary words learned during the sounding out activities in three- to four-word sentences accompanied by pictures. The participant's final target behavior in the Level K program was to read these sentences at a normal pace.

Research Design

This study used an A-B design. While simple A-B comparisons are generally acknowledged to be weak designs for identifying functional relationships (Bailey & Burch, 2002), they do have merit when the research objective is to examine the effects of a procedure that has not yet been fully identified or explained (Johnston & Pennypacker, 1993). The ability to demonstrate the effects of a program on a population that has not been previously studied is a critical step in preliminary research (i.e., a demonstration research style; Johnston & Pennypacker, 1993). Prior to identifying functional relationships between the DI program and children's reading behavior, one must first determine whether these children are able to benefit from this instruction. Children's failure to acquire the reading skills taught by the DI program would suggest that there are specific components in the program that are not effective in teaching skills to this population; future researchers may then address these specific variables. Conversely, if the children do acquire the reading skills taught by the program, future researchers may begin to identify variables that could be altered to maximize the program's effectiveness.

While single-subject experimental designs have not been extensively used in the reading research field (Carlson, 1985; McCormick, 1990; Neuman & McCormick, 1995), the implementation of multiple A-B designs across different subjects offers some benefits that between-subjects designs cannot (Barger-Anderson, Domaracki, Kearney-Vakulick, & Kubina, 2004). In working with a relatively small sample size, this design allows the researcher to make preliminary assessments about the effectiveness of the DI curriculum for other children with developmental delays. While definitive causal statements cannot be made, the design does provide the researcher with the opportunity to identify potential variables that may influence the participants' acquisition of reading skills.

The use of this design is also appropriate given the target behaviors (e.g., letter-sound correspondence, blending, segmenting) being examined in the study. In reversal designs, the withdrawal of reading instruction in order to show that reading skills may return to baseline levels is questionable from an ethical standpoint. Additionally, once the participants learn a specific strategy, their target behavior may not return to baseline levels even after the instruction is withdrawn. The A-B design allows the researcher to examine the possible effects of the program without having to remove the treatment or determine whether the target behavior would return to baseline levels.

Data Analysis

At the end of each session, the data were examined to determine the effects associated with the DI lesson. Reading skills acquisition was examined in terms of: (a) the percentage of correct responses during sessions, (b) the number of errors during sessions, (c) the rate of responding, (d) the number of mastered letter-sound correspondences, and (e) the number of trials required to achieve mastery. These data were graphed and compared against the individual's initial baseline measurement and the baseline data obtained from the other participants. An increase from baseline in the number of letter-sound correspondences mastered, along with an increase in the percentage of correct responses on blending, segmenting, and sounding-out tests, would suggest that the DI program may be effective in teaching beginning reading skills. Additionally, a decrease in the number of trials required to master items and a decrease in the number of errors over the course of instruction would suggest that the Reading Mastery Plus Level K program may be effective and appropriate for teaching reading skills to young children with and without developmental disabilities. When examining the following results, it is important to remember that A-B designs cannot demonstrate functional relationships, and therefore the results must be interpreted with caution.
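As an illustrative aside only (not part of the original study), the per-session summary measures described above can be computed from a simple session record. The following Python sketch uses hypothetical data and field names, and it simplifies the mastery check to a single 90% accuracy criterion for one target skill.

    # Illustrative sketch only (hypothetical data): computing summary measures
    # (a) percent correct, (b) errors, (c) rate of responding, and
    # (e) trials to a simplified 90% mastery criterion for one target skill.
    sessions = [
        # (session_number, correct_responses, errors, minutes_on_task)
        (1, 4, 6, 2.0),
        (2, 7, 3, 2.0),
        (3, 9, 1, 1.5),
    ]

    MASTERY_ACCURACY = 90.0  # percent correct required by the study's criterion

    trials_to_mastery = None
    for number, correct, errors, minutes in sessions:
        percent_correct = 100.0 * correct / (correct + errors)
        rate_per_minute = correct / minutes
        print(f"Session {number}: {percent_correct:.0f}% correct, "
              f"{errors} errors, {rate_per_minute:.1f} correct/min")
        if trials_to_mastery is None and percent_correct >= MASTERY_ACCURACY:
            trials_to_mastery = number

    print("Sessions to reach the (simplified) mastery criterion:", trials_to_mastery)

Measure (d), the number of mastered letter-sound correspondences, would simply be the count of individual letters for which a criterion such as this has been met.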
Results and Discussion

Placement Test and Baseline Data

The placement test contained within the Reading Mastery Plus Level K program was administered to each of the participants before data collection began. The corresponding scores for all of the participants indicated that reading instruction should begin with the first lesson in the sequence. Following the placement test, baseline data were collected for each participant across all of the skills (e.g., letter-sound correspondence, say it fast, sounding out) presented in the Reading Mastery Plus Level K program. Baseline data collection continued until responding was stable. Once stable responding was recorded, instruction in the program began.

Letter-sound Correspondence

The performances of each individual on the various letter-sound correspondences are depicted in Figures 4-18 in terms of the percentage of correct responses across the instructional period (each figure plots percent correct responding by consecutive calendar days, with baseline and instruction phases marked). Despite individual differences, each participant demonstrated the ability to master a number of letter-sound correspondences, ranging from a high of 11 to a low of 5. Mastery was defined as response accuracy at or above 90% following the third day of instruction, with accuracy remaining at that level for the majority of the following sessions (Adams & Engelmann, 1996; Engelmann, 2007). While the difference in the number of mastered letter-sound correspondences was partially due to the number of instructional days presented to each individual, it also appears to have been due to the number of trials needed to reach mastery for each participant (see Table 2).

Figure 4. The percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Allison.
Figure 5. The percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Allison.

Figure 6. The percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Allison.

Figure 7. The percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Danielle.
Figure 8. The percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Danielle.

Figure 9. The percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Danielle.

Figure 10. The percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Megan.
Figure 11. The percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Megan.

Figure 12. The percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Megan.

Figure 13. The percentage of correct responding to letter-sound correspondences "t", "n", and "c" by Megan.
Figure 14. The percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Ricky.

Figure 15. The percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Ricky.

Figure 16. The percentage of correct responding to letter-sound correspondences "f", "i", and "th" by Ricky.
Figure 17. The percentage of correct responding to letter-sound correspondences "a", "m", and "s" by Omar.

Figure 18. The percentage of correct responding to letter-sound correspondences "e", "r", and "d" by Omar.

Table 2
Number of Trials to Reach Mastery Criterion for Each Participant

Participants with developmental delays
Participant | Days of Instruction | a  | m    | s    | e    | r    | d    | f | i | th | t | n
Allison     | 38                  | 12 | 5    | 3    | 12   | 9    | 10   | 7 | - | -  | - | -
Danielle    | 43                  | 9  | 3    | 9    | 4    | 9    | 11   | 5 | - | -  | - | -
Omar        | 37                  | 9  | 11   | 7    | 4    | 10   | -    | - | - | -  | - | -
Mean        | 39                  | 10 | 6.33 | 6.33 | 6.66 | 9.33 | 10.5 | 6 | - | -  | - | -

Typically developing participants
Participant | Days of Instruction | a | m   | s   | e | r   | d   | f | i   | th | t | n
Megan       | 45                  | 3 | 6   | 4   | 8 | 8   | 4   | 8 | 7   | 5  | 6 | 5
Ricky       | 37                  | 3 | 3   | 5   | 4 | 5   | 5   | 4 | 4   | 4  | - | -
Mean        | 41                  | 3 | 4.5 | 4.5 | 6 | 6.5 | 4.5 | 6 | 5.5 | 5  | - | -

When the participants were divided into groups based on their diagnoses (i.e., developmental delays versus typically developing), the results showed that the children with developmental delays took longer to master the letter-sound correspondences than their typically developing peers.
Despite this difference, the data show that the children with developmental delays were able to learn a number of letter-sound correspondences during the instructional period.

The data displayed in Figures 4-18 indicate that each of the participants was able to master a number of letter-sound correspondences; however, these data are limited in describing the participants' responding, since percentages are dimensionless quantities (Johnston & Pennypacker, 1993). For example, on lesson 104 both Danielle and Ricky had 100% correct responding during all of the letter-sound correspondence exercises. According to these data, there was no difference between Ricky's and Danielle's responding. Nevertheless, if one examines the frequency of correct responding, the data show that Ricky was making approximately 15 correct letter-sound correspondences per minute, while Danielle was making only about 3 correct responses per minute. The use of percentages obscures this differentiation in responding between participants and within individual sessions.

In order to provide a clearer and more accurate representation of responding, the responses of each participant across all of the letter-sound correspondences were charted using the Standard Celeration Chart (Pennypacker, Gutierrez, & Lindsley, 2003). The Standard Celeration Chart is a semi-logarithmic chart designed with a "multiply-divide" scale that allows individuals to chart and assess ratios of correct-response and error frequencies. By using a linear representation of trends in performance and quantifying them as multiplicative factors per week (e.g., correct responding multiplying by 2.5 per week, errors dividing by 1.75 per week), the chart introduced the measure of learning known as celeration (Binder & Watkins, 1990). Celeration is defined as "count per unit time per unit time (c/t/t) and is the basic unit of behavior change" (Pennypacker et al., 2003, p. 101). Unlike the cumulative record (see the Appendix for graphs of the participants' cumulative errors across all of the dependent measures), the Standard Celeration Chart allows users to easily measure the frequency of behaviors by using standard angles on the chart to measure learning, independent of performance level. The use of standard dimensions also allows users to avoid the distortion inherent in conventional graphs and to directly compare trends and magnitudes of effects (Pennypacker et al., 2003). The performances of the participants on the letter-sound correspondence tasks, as plotted on the Standard Celeration Chart, are depicted in Figures 19-23.

Figure 19. Responding by Allison across all letter-sound correspondences. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 20. Responding by Danielle across all letter-sound correspondences. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 21. Responding by Megan across all letter-sound correspondences. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 22. Responding by Ricky across all letter-sound correspondences. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 23. Responding by Omar across all letter-sound correspondences. Circles represent correct responses, while incorrect responses are indicated by an (x).
The celeration values were recorded using a Celeration Finder (White, 2003). According to the Precision Teaching literature, the minimal frequency change suggested for a significant acceleration is x1.25 per week (or ÷1.25 for a significant deceleration) (Legault, Maloney, & Giroux, 2000). The results show that all of the participants, including the children with developmental delays, increased their frequency of correct responding during the instructional period. Nevertheless, none of the participants reached an acceleration rate that would be considered significant according to the Precision Teaching literature.

This finding may be attributed to the way in which frequency was measured. In the Precision Teaching literature, data on responding are usually collected during a one-minute, timed measurement in which the participant performs the target behavior as quickly and accurately as possible (Binder & Watkins, 1990). This free-operant form of measurement allows the participant to perform the behavior at a frequency that is, ideally, independent of other factors. The frequency of responding in this study was measured during each individual lesson and not in a separate, timed measurement. Measured this way, the participants' responding depended on the rate at which the instructor presented the material as well as on their own frequency of responding. While the one-minute, timed measurement would have been the preferred form of measurement, the skills being assessed required that the instructor present the material to the participants. In order to measure frequency, the sessions were videotaped and a stopwatch was used to record the amount of time spent on each target behavior (e.g., say the sounds, reading vocabulary). Due to the fast-paced presentation of the DI exercises, it was not possible to measure interresponse time (IRT) or latency. Instead, the total count was divided by the total session time spent on the target behavior (e.g., letter-sound correspondence). This form of measurement is problematic because it combines duration and IRT and limits the conclusions that can be made about why the frequency of responding changed (Johnston & Pennypacker, 1993). Nevertheless, given the setting and the technology available to the author, this was the form of measurement that was selected.

Despite the limitations described above, the results did show that all of the participants increased their frequency of correct responding. If a more accurate form of measurement had been used (i.e., IRT as the denominator), the frequency of correct responding might have reached the level for a significant acceleration (x1.25 per week) suggested by the literature, though this conclusion cannot be assumed. Unlike the frequency of correct responding, the frequency of incorrect responses was not as systematic. A deceleration in incorrect responses (i.e., an increase in accuracy) occurred for three out of the five participants: Omar, Ricky, and Danielle all showed decreases in the frequency of incorrect responses (÷1.20, ÷1.17, and ÷1.10 per week, respectively). When combined with their results for the frequency of correct responding, these three participants all showed an improvement in fluency on letter-sound correspondences. Allison and Megan, however, slightly increased their frequency of errors on letter-sound correspondences over the course of instruction (x1.04 per week for each). These data indicate that while both girls increased the frequency at which they produced correct responses, they also increased the number of errors they made during those lessons. In comparing the participants with developmental delays to their typically developing peers, the data show that the children with delays actually showed greater increases in letter-sound correspondence acquisition over the course of instruction, although overall their peers responded at a higher frequency of correct responses per minute.
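For illustration only (the study read celeration values from the Standard Celeration Chart with a Celeration Finder rather than computing them in software), the sketch below shows one way per-minute frequencies and an approximate weekly celeration could be estimated from session counts and timings. All data values and variable names are hypothetical, and the straight-line fit on a log scale is only an approximation of a charted celeration line.

    import math

    # Hypothetical records for one target skill: (calendar_day, correct_count, minutes_on_task)
    sessions = [(1, 8, 2.0), (3, 10, 2.0), (8, 14, 1.5), (10, 18, 1.5)]

    # Frequency as measured in the study: total count divided by total time on the task.
    days = [day for day, _, _ in sessions]
    freqs = [count / minutes for _, count, minutes in sessions]

    # Approximate weekly celeration: least-squares line through log10(frequency)
    # versus calendar day, with the slope converted to a multiplicative factor per 7 days.
    n = len(sessions)
    logs = [math.log10(f) for f in freqs]
    day_mean = sum(days) / n
    log_mean = sum(logs) / n
    slope = (sum((d - day_mean) * (g - log_mean) for d, g in zip(days, logs))
             / sum((d - day_mean) ** 2 for d in days))
    celeration = 10 ** (slope * 7)

    if celeration >= 1:
        print(f"Correct responding accelerating at approximately x{celeration:.2f} per week")
    else:
        print(f"Correct responding decelerating at approximately /{1 / celeration:.2f} per week")

A value of x1.25 or greater (or ÷1.25 or greater for errors) would meet the significance guideline cited above.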
Say it Fast

Similar to the letter-sound correspondence exercises, all of the participants were able to master the ability to orally blend words by saying them at their normal rate (see Figures 24 and 25). All three of the participants with developmental delays took 7 trials to master the skill, while their typically developing peers took 3 (Ricky) and 10 (Megan) trials. When the data are examined using the Standard Celeration Chart, they show that each of the participants had a small acceleration in the frequency of their correct responding and a significant deceleration in their incorrect responses during the instructional period (see Figures 26-30). Neither the participants with developmental delays nor the typically developing children appeared to benefit more from this form of instruction, as their acceleration rates and frequencies of correct responses (excluding Ricky) were very similar.

Say the Sounds

Figures 31 and 32 show the performance of each of the five participants on oral-blending exercises that required them to say the correct letter-sound correspondence for various words without stopping between the sounds (e.g., nnnooo). Because "say the sounds" exercises are presented in only four lessons of the Reading Mastery Plus Level K program, the data on this skill are relatively limited compared to the other dependent measures. Although only one child (Ricky) met the mastery criterion, the other participants did demonstrate a substantial increase in the accuracy of their responding from baseline measures.

Figure 24. The percentage of correct responding on "say it fast" exercises by the three participants with developmental disabilities.

Figure 25. The percentage of correct responding on "say it fast" exercises by the two participants without developmental disabilities.

Figure 26. Responding by Allison on "say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).
Figure 27. Responding by Danielle on "say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 28. Responding by Megan on "say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 29. Responding by Ricky on "say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 30. Responding by Omar on "say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 31. The percentage of correct responding on "say the sounds" exercises by the three participants with developmental disabilities.

Figure 32. The percentage of correct responding on "say the sounds" exercises by the two participants without developmental disabilities.

When the participants with developmental delays were compared to their typically developing peers, the data showed that the children with delays did not perform as well on this segmenting skill as did the children without delays. Nevertheless, the difference in responding was not substantial enough to limit the progress of future lessons. Similar to the previously assessed skills, the Standard Celeration Charts (see Figures 33-37) indicate that each of the participants made an improvement in their frequency of correct responses, with Allison and Megan each making a significant improvement over the four lessons (x2.60 and x2.34 per week, respectively). Significant decelerations in the frequency of incorrect responses were also achieved by four out of the five participants, including all three of the children with developmental delays. While promising, these data should be interpreted with caution, since there are only a limited number of data points to evaluate. Data on the previous skills suggest that while celeration may change sharply during initial instruction on a target behavior, over an extended time the change in frequency becomes more stable.

Say the Sounds-Say it Fast

Say the Sounds-Say it Fast exercises consolidated the skills learned in the previous lessons of saying words fast ("say it fast") and saying words slowly ("say the sounds").
All five of the participants achieved the mastery criterion for this skill within the first 7 trials (see Figures 38 and 39), with no differences between the children with developmental delays and their typical peers. The three participants with delays showed slightly greater acceleration gains in correct responses per week (x1.12, x1.09, and x1.07 versus x1.04 and x1.03), although none of the participants' gains were considered significant (see Figures 40-44). Nevertheless, four out of the five participants demonstrated a significant deceleration in incorrect responses per week, with Ricky being the only one who did not meet the criterion (÷1.15); though it should be noted that Ricky's frequency of correct responses was much higher than that of the other participants.

Figure 33. Responding by Allison on "say the sounds" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 34. Responding by Danielle on "say the sounds" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 35. Responding by Megan on "say the sounds" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 36. Responding by Ricky on "say the sounds" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 37. Responding by Omar on "say the sounds" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 38. The percentage of correct responding on "say the sounds-say it fast" exercises by the three participants with developmental disabilities.

Figure 39. The percentage of correct responding on "say the sounds-say it fast" exercises by the two participants without developmental disabilities.

Sounding Out

The final pre-reading track was "sounding out." The activities in this track were similar to those in the "say the sounds" track, except that in this track the participants had to read the sounds instead of repeating the sounds that the instructor said.
Despite individual differences (see Figures 45 and 46), all of the participants were able to master this important skill. Two of the children with developmental delays (Omar and Danielle) took several more trials (14 and 10, respectively) than their typical peers (7 and 6, respectively) to reach criterion. Allison, however, was able to reach the mastery criterion within 7 trials. In contrast to the previous skills, the typically developing children demonstrated larger acceleration gains in correct responses over the course of instruction than did the children with disabilities (see Figures 47-51). Both Megan and Ricky (x1.25 and x1.30, respectively) demonstrated significant improvements on "sounding out" exercises, while Allison was the only child with disabilities to show such a gain (x1.28). Despite this fact, the other two children with delays showed a clear improvement in both their speed and accuracy in sounding out words (see Table 3 for a complete listing of celeration rates across all of the dependent measures).

Figure 40. Responding by Allison on "say the sounds-say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 41. Responding by Danielle on "say the sounds-say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 42. Responding by Megan on "say the sounds-say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 43. Responding by Ricky on "say the sounds-say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 44. Responding by Omar on "say the sounds-say it fast" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 45. The percentage of correct responding on "sounding out" exercises by the three participants with developmental disabilities.

Figure 46. The percentage of correct responding on "sounding out" exercises by the two participants without developmental disabilities.
Figure 47. Responding by Allison on "sounding out" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 48. Responding by Danielle on "sounding out" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 49. Responding by Megan on "sounding out" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 50. Responding by Ricky on "sounding out" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 51. Responding by Omar on "sounding out" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Table 3
Celeration Rates of Target Behaviors (per week)

Target Behavior             | Responses | Allison | Danielle | Omar  | Ricky | Megan
Letter-Sound Correspondence | Correct   | x1.12   | x1.10    | x1.10 | x1.00 | x1.02
Letter-Sound Correspondence | Incorrect | x1.04   | ÷1.10    | ÷1.17 | ÷1.17 | x1.04
Say it Fast                 | Correct   | x1.07   | x1.05    | x1.07 | x1.05 | x1.02
Say it Fast                 | Incorrect | ÷1.75   | ÷4.40    | ÷2.30 | ÷2.10 | ÷3.40
Say the Sounds              | Correct   | x2.60   | x1.00    | x1.03 | x1.15 | x2.34
Say the Sounds              | Incorrect | ÷3.80   | ÷1.45    | ÷3.90 | ÷9.60 | x1.00
Say the Sounds-Say it Fast  | Correct   | x1.12   | x1.07    | x1.09 | x1.03 | x1.04
Say the Sounds-Say it Fast  | Incorrect | ÷1.35   | ÷1.70    | ÷1.75 | ÷1.15 | ÷1.25
Sounding Out                | Correct   | x1.28   | x1.18    | x1.07 | x1.30 | x1.25
Sounding Out                | Incorrect | ÷3.10   | ÷3.00    | ÷2.05 | ÷2.90 | ÷2.30
Reading Vocabulary          | Correct   | x1.16   | x1.09    | x1.70 | x1.10 | x1.05
Reading Vocabulary          | Incorrect | ÷1.70   | ÷1.80    | ÷3.70 | ÷1.85 | ÷1.20

Reading Vocabulary

The study culminated with the participants reading simple, regular words (e.g., if, man, sit) that had been introduced in earlier lessons. Prior to the start of the Reading Mastery Plus Level K program, none of the five participants could sound out and say any of the printed words at a normal pace. After the implementation of the DI program, four out of the five children had met the mastery criterion (see Figures 52 and 53). Two of the three children with developmental delays (Allison and Danielle) were successfully reading words by the end of the study, both having taken 10 trials to reach the criterion. Omar was the only child who did not reach the mastery criterion; this result is most likely due to the limited number of trials presented to him before the study ended. The two typically developing children (Megan and Ricky) were both successful in reading the words presented to them and reached mastery in 9 and 6 trials, respectively. When examining the Standard Celeration Charts (see Figures 54-58), the data show that the children with developmental delays were able to correctly read between 4 words per minute (Omar) and 8 words per minute (Allison), with Danielle averaging about 7 words per minute. The typically developing children performed even better, with Megan reading about 14 words per minute and Ricky reaching over 20 words per minute by the conclusion of the study.
Overall Results

In examining the previous results and comparing the responding of all of the individuals across the various target behaviors (see Figure 59), the results showed that the children both with and without developmental delays were successful in acquiring the basic skills necessary to begin reading. Nevertheless, due to the A-B design, it is not possible to conclude that the results obtained were due solely to the Reading Mastery Plus Level K program and not to possible extraneous factors.

Figure 52. The percentage of correct responding on "reading vocabulary" exercises by the three participants with developmental disabilities.

Figure 53. The percentage of correct responding on "reading vocabulary" exercises by the two participants without developmental disabilities.

Figure 54. Responding by Allison on "reading vocabulary" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 55. Responding by Danielle on "reading vocabulary" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 56. Responding by Megan on "reading vocabulary" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 57. Responding by Ricky on "reading vocabulary" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 58. Responding by Omar on "reading vocabulary" exercises. Circles represent correct responses, while incorrect responses are indicated by an (x).

Figure 59. The percentage of correct responses made by the participants across all of the dependent measures.
While the participants with developmental delays were able to acquire the skills presented in the program, the results show that they were not as accurate in their responses as were their typical peers (see Figure 60). Additionally, the frequency of correct responding by the typically developing children was higher than that of the children with developmental delays (see Table 4). Nevertheless, as indicated by the celeration rates listed in Table 3, the accuracy (i.e., the deceleration in the frequency of incorrect responses) of the responses made by the children with delays improved significantly over the course of instruction on most skills.

Treatment Fidelity and Interobserver Agreement

Treatment fidelity was measured using a checklist of instructional procedures (Marchand-Martella et al., 1995; see Figures A1-A4). Approximately 35% of the sessions were checked for treatment fidelity, either through direct observation or videotape, and each of the treatment fidelity observations indicated 100% accurate implementation. Interobserver agreement probes were conducted approximately once a week, with 35% (70 out of 200) of the sessions being assessed. Videotaped sessions were assessed by the calibrating observer following completed lessons to allow for repeated viewings. Interobserver agreement was calculated by dividing the smaller score on the dependent measure by the larger score on the dependent measure and multiplying by 100 (e.g., 9 correct letter-sound correspondences ÷ 10 correct letter-sound correspondences x 100 = 90% agreement). Interobserver agreement was 100% on all probes that were observed.

Figure 60. The cumulative number of errors made by the participants during the instructional period.

Table 4
Average Frequency of Responding (per minute) at the Conclusion of the Study

Target Behavior                       | Allison | Danielle | Omar | Ricky | Megan
Letter-Sound Correspondence (correct) | 10      | 8        | 7    | 14    | 12
Say it Fast (correct)                 | 15      | 15       | 12   | 21    | 13
Say the Sounds (correct)              | 12      | 9        | 10   | 21    | 11
Say the Sounds-Say it Fast (correct)  | 17      | 13       | 10   | 22    | 16
Sounding Out (correct)                | 15      | 11       | 8    | 21    | 16
Reading Vocabulary (correct)          | 8       | 7        | 4    | 21    | 15

CHAPTER III: GENERAL DISCUSSION

The purpose of the present study was to investigate the effects of the Reading Mastery Plus Level K program on preschool children with developmental delays. Due to the research design used in this study, functional relationships between the Reading Mastery Plus Level K program and the participants' acquisition of beginning reading skills cannot be determined. Nevertheless, the results showed that preschool-aged children with and without developmental delays can acquire beginning reading skills.
These results address a gap in the literature, which has shown Direct Instruction to be effective for younger children without delays (Adams & Engelmann, 1996; Weisberg, 1988; Weisberg & Savard, 1993) and for older children with delays (Bradford, Shippen, Alberto, Houchins, & Flores, 2006; Flores & Ganz, 2007; Flores et al., 2004) but has provided limited information on teaching young children with developmental delays.

Over the course of instruction, all three of the children with developmental delays were able to master a number of letter-sound correspondences while increasing the frequency of their correct responses. Additionally, two of the children were able to decrease their frequency of incorrect responses on letter-sound correspondence tasks. This is noteworthy, considering that the letter-sound correspondence exercises were where the participants made the majority of their errors (see Figures A45-A49). On the other dependent measures, the typically developing children performed at higher frequencies on the majority of measures (e.g., approximately 17 words read per minute versus 6 words per minute on "reading vocabulary" exercises). Nevertheless, the participants with developmental delays actually showed larger gains over the course of instruction across all of the dependent measures. Their greatest area of improvement came in the deceleration of errors across the various target behaviors. While it would have been ideal for the children to achieve significant gains in both speed and accuracy (i.e., fluency), their improvement in correct responding was an important first step. By increasing their accuracy over the course of instruction, the children were able to experience a number of successful trials and receive an increased amount of praise from the instructors. Although it took these children longer to produce correct answers than their peers, the author preferred that the children take their time and make correct responses rather than respond quickly with an incorrect answer. Although this decreased their fluency scores, the children were mastering the target behaviors and, based on previous research, would probably be able to improve their fluency scores by simply engaging in repeated practice over a longer instructional period (NICHD, 2000).

While there has been a limited amount of research examining the use of Direct Instruction with preschool-aged children with developmental delays, the findings of this study support previous studies showing that children and adolescents with developmental delays are able to acquire decoding skills (Bradford et al., 2006; Flores et al., 2004; Nation, Clarke, Wright, & Williams, 2006). Nevertheless, given the wide variation in cognitive and linguistic skills seen in individuals with developmental delays, one must be cautious in generalizing this finding to the entire population.

Traditionally, children with developmental delays have struggled more with reading comprehension than with decoding skills (Chiang & Lin, 2007; Nation et al., 2006; O'Connor & Klein, 2004). While the Reading Mastery Plus Level K program provides limited instruction in comprehension, the results of this study do have important implications for reading comprehension. The data from this study suggest that preschool-aged children with developmental delays have the ability to acquire the skills necessary to begin reading.
Although these skills (e.g., letter-sound correspondence, blending, segmenting, etc.) do not directly cause text comprehension, they do play a vital role in allowing the child to decode words and read them at a rate that facilitates their ability to focus on the meaning of the text rather than on identifying or guessing what the word may be ? as is often seen in the whole-language approach. If the prerequisite skills necessary to begin reading can be taught to these children at an early age ? as this study suggests ? then, in later instruction, this may allow teachers to focus more of their time on the areas where the child is struggling (e.g., comprehension strategies) and devote less time to skills they have already mastered. Although there were many important findings in this investigation, there were several limitations. First, the research design used in this study only allows for a limited interpretation of the results. Since A-B designs do not demonstrate functional relationships, it is not possible to conclude that the results that were found were due solely to the Reading Mastery Plus ? Level K program. While it is likely that the gains demonstrated by the participants were due to the reading instruction, the author cannot conclude that there were no extraneous factors involved that may have influenced the results. Future studies utilizing a more efficient research design are needed to identify any 132 functional relationships between DI curricula and beginning reading skill acquisition with this population. Second, the intervention was administered by instructors who had limited experience in using Direct Instruction curricula. The author was familiar with DI methods based on classroom instruction and workshops, but he was not a certified trainer. Despite this fact, all of the participants showed gains in all of the dependent measures. Nevertheless, if the intervention had been presented by experienced instructors, these gains may have been more substantial. Additionally, the time spent on instruction was not as systematic as recommended by the program. Due to participant absences and scheduling conflicts, instruction was presented only about three times per week for 20 to 25 minutes, instead of the daily instruction that is recommended. Had the additional instruction been provided, it is possible that the participants would have seen larger gains in fluency measures (NICHD, 2000). Third, this research did not compare the Reading Mastery Plus ? Level K curriculum to any other reading programs (e.g., Headsprout; Layng, Twyman, & Stikeleather, 2003). While the results show that the participants? beginning reading skills improved from baseline measures, it is possible that other interventions may have been as effective, or more effective than the program that was used. Further research using other designs is needed to address this issue. Additionally, the researcher did not have control over any reading instruction that may have been delivered outside of the school setting. Based on conversations with one of the participants (Ricky), it was believed that he had been receiving additional instruction in beginning reading skills from his parents. Therefore, 133 the large improvements made by Ricky (and possibly other participants) cannot be contributed solely to the Reading Mastery Plus ? Level K curriculum. As previously noted, the form of measurement used to assess celeration rates also limited the interpretation of the results. 
The use of total session time in determining the frequency of responding provided a less accurate representation of the participants? target behaviors over the course of instruction. In using the total session time, the results were likely more conservative (i.e., lower frequencies of responding) than they would have been if IRT or latency had been used as a denominator. A more conservative approach was also used in assessing mastery. Whereas a number of studies have labeled mastery as three consecutive probes at 100% accuracy (Engelmann, 2007; Flores & Ganz, 2007; Flores et al., 2004), this study examined mastery by measuring all of the responses over an entire session. While first-time correct responding does show what skills the child has in his or her repertoire, by only measuring a single response the investigator is left with an incomplete picture of responding (Johnston & Pennypacker, 1993). Another limitation was that there was no opportunity to assess maintenance. Four out of the five participants were removed from the setting (i.e., they stopped attending the preschool/facility) prior to completion of the study; and due to time constraints, maintenance was not assessed with the final participant. While the participants did master a number of the target behaviors, it would have been beneficial to show that these skills were maintained once the instruction was discontinued. The heterogeneous nature of reading skills in children with developmental delays has been well documented (Chiang & Lin, 2007; Conners, 1992; Nation et al., 2006). Although the small group of children with developmental delays who participated in this 134 study demonstrated success with the Reading Mastery Plus ? Level K program, the degree to which generalizations can be made is severely limited. In order to generalize these results to the larger population, additional testing with a larger sample of children with varied levels of functioning is needed. The use of Direct Instruction in successfully teaching children, adolescents, and adults of various backgrounds how to read is impressive (Adams & Engelmann, 1996). Despite the success of these programs, there is little research on the use of DI for young children with developmental delays. Additional research is necessary in order to extend the current study?s findings and to identify ways of improving reading instruction for children who are expected to have difficulty in learning how to read (Nation et al., 2006). In addition to increasing the sample size and including children with a more diverse range of functioning, there are several other areas that research with this population should address. First, while normal DI programs are presented using the small-groups format, the research that has been conducted with younger children with developmental delays has largely occurred using a one-to-one format. Although one-to-one instruction is beneficial for the child, by being able to provide instruction to only one child at a time, the instructor is greatly limited as to how many reading sessions can be provided each day. Due to the heterogeneity of children with delays, it may be difficult to place them in reading groups; however, previous research has not assessed if this may be a possibility once the children reach a certain level of mastery (e.g., 6 letter-sound correspondences mastered, etc.). 
Future research should be conducted to identify techniques and 135 procedures for moving from individual instruction with students with developmental disabilities to teaching them using the standard small-group format. As the results of this study have shown, it is possible for instructors who have limited experience using DI curricula to teach children basic reading skills. Given the participants? success in acquiring beginning reading skills during the limited amount of instructional time, it would be interesting to see if these results could be extended to different settings (e.g., child?s home) while incorporating a variety of instructors (e.g., parents, family members, etc). Integrating parents and other family members into DI programs has been successful with typically developing children (Leach & Siddall, 1990), but there has not been any research conducted involving younger children with developmental delays. By providing additional practice time, these children would stand to make even more progress in areas in which they are likely to struggle. Future research on incorporating family members in the implementation of DI curricula would be helpful in demonstrating the utility of these programs and their effects with this particular population. In addition to the development of reading skills, future research on the implementation of DI programs could also examine its effects on other secondary behaviors. Given the fact that children with ASD and other developmental delays often have impaired language and social communication (Sigafoos, Schlosser, Green, O?Reilly, & Lancioni, 2008), it would be interesting to examine the effects that reading instruction has on these behaviors. Additionally, researchers could assess how children?s progress in DI programs affects aberrant classroom behaviors. 136 The current study demonstrated the fact that preschool-aged children both with and without developmental delays are able to acquire beginning reading skills. While the research design inhibits the identification of any functional relationships between the Reading Mastery Plus ? Level K program and the participants? reading gains, the data showed that young children with developmental delays can acquire skills that are necessary to begin reading. This study provides preliminary data on the use of Direct Instruction with preschool-aged children with developmental delays ? an area that the DI literature has yet to systematically investigate. The results of the study support previous research that has shown DI to be effective for a variety of individuals, including older children and adolescents with developmental delays (Adams & Engelmann, 1996). This is an important finding, especially considering the fact that the number of children being diagnosed with Autism Spectrum Disorder is increasing (Centers for Disease Control and Prevention, 2007), as is the placement of children with developmental delays into partially or fully integrated classrooms (Williamson, McLeskey, Hoppey, & Rentz, 2006). While there is still some debate over the developmental appropriateness of teaching reading skills at such an early age (New, 2001; Purcell & Rosemary, 2008), the previous mindset that children with developmental delays were not capable of reading has definitely changed (Connors, 1992). Despite this fact, the research on the effects of DI on this population has seen only limited growth. 
As such, additional research on the use of DI with preschool-aged children with developmental delays is warranted, and the results of this study provide an appropriate starting point for extending this literature and for turning this research into practice. 137 REFERENCES Adams, M. J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press. Adams, M. J., & Bruck, M. (1995). Resolving the ?great debate?. American Educator, 19, 10-20. Adams, G. L., & Carnine, D. W. (2003). Direct Instruction. In H. L. Swanson, K. R. Harris, & S. Graham (Eds.), Handbook of Learning Disabilities (pp. 403-416). New York: Guilford Press. Adams, G. L., & Engelmann, S. (1996). Research on Direct Instruction: 25 years beyond DISTAR. Seattle, WA: Educational Achievement Systems. Allington, R. L. (1984). Content coverage and contextual reading in reading groups. Journal of Reading Behavior, 16, 85-96. Allington, R. L. (2006). What really matters for struggling readers: Designing research- based programs (2 nd ed.). Boston: Pearson Education, Inc. American heritage dictionary (3 rd ed.). (1992). Boston: Houghton Mifflin. Apffel, J., Kelleher, J., Lilly, M. S., & Richardson, R. (1975). Developmental reading for moderately retarded children. Education and Training of the Mentally Retarded, 10, 229-235. Bailey, J. S., & Burch, M. R. (2002). Research methods in applied behavior analysis. Thousand Oaks, CA: Sage Publications. 138 Barger-Anderson, R., Domaracki, J. W., Kearney-Vakulick, N., & Kubina, R. M. (2004). Multiple baseline designs: The use of a single-case experimental design in literacy research. Reading Improvement, 41, 217-225. Becker, W. C. (1977). Teaching reading and language to the disadvantaged ? What we have learned from field research. Harvard Educational Review, 47, 518-543. Becker, W. C., & Carnine, D. W. (1978). Direct Instruction: A behavior theory model for comprehensive educational intervention with the disadvantaged. In S. W. Bijou & R. Ruiz (Eds.), Behavior modification: Contributions to education (pp. 145-210). Hillsdale, NJ: Lawrence Erlbaum. Bereiter, C., & Engelmann, S. (1966). Teaching disadvantaged children in the preschool. Englewood Cliffs, NJ: Prentice-Hall. Biberdorf, J. R., & Pear, J. J. (1977). Two-to-one versus one-to-one student-teacher ratios in the operant verbal training of retarded children. Journal of Applied Behavior Analysis, 10, 506. Binder, C., & Watkins, C. L. (1990). Precision teaching and Direct Instruction: Measurably superior instructional technology in schools. Performance Improvement Quarterly, 3, 74-96. Booth, A., Hewitt, D., Jenkins, W., & Maggs, A. (1979). Making retarded children literate: A five year study. Australian Journal of Mental Retardation, 5, 257-260. Bracey, S., Maggs, A., & Morath, P. (1975). The effects of a direct phonic approach in teaching reading with six moderately retarded children: Acquisition and mastery learning stages 1, 2. Slow Learning Child, 22, 83-90. 139 Bradford, S., Shippen, M. E., Alberto, P., Houchins, D. E., & Flores, M. (2006). Using systematic instruction to teach decoding skills to middle school students with moderate intellectual disabilities. Education and Training in Developmental Disabilities, 41, 333-343. Bradley, L., & Bryant, P. E. (1983). Categorizing sounds and learning to read ? a causal connection. Nature, 301, 419-421. Brophy, J., & Good, T. (1986). Teacher behavior and student achievement. In M. Wittrock (Ed.), Handbook of research on teaching (pp. 328-375). New York: Macmillan. 
Brown, L., Huppler, B., Pierce, L., York, B., & Sontag, E. (1974). Teaching trainable- level students to read unconjugated action verbs. Journal of Special Education, 8, 51-56. Brown, L., & Perlmutter, L. (1971). Teaching functional reading to trainable level retarded students. Education and Training of the Mentally Retarded, 6, 74-84. Burack, J. A., Iarocci, G., Bowler, D., & Mottron, L. (2002). Benefits and pitfalls in the merging of disciplines: The example of developmental psychopathology and the study of persons with autism. Development and Psychopathology, 14, 225-237. Byrne, B., & Fielding-Barnsley, R. (1991). Evaluation of a program to teach phonemic awareness to young children. Journal of Educational Psychology, 83, 451-455. van Bysterveldt, A. K., Gillon, G. T., & Moran, C. (2006). Enhancing phonological awareness and letter knowledge in preschool children with Down Syndrome. International Journal of Disability, Development, and Education, 53, 301-329. 140 Cardon, L. R., Smith, S. D., Fulker, D. W., Kimberling, B. S., Pennington, B. F., & DeFries, J. C. (1994). Quantitative trait locus for reading disability on chromosome 6. Science, 266, 276-279. Carlson, P. E. (1985). Updating and broadening the use of single subject designs in reading. Reading Psychology, 6, 251-265. Carnine, D. W. (1976a). Similar sound separation and cumulative introduction in learning letter-sound correspondences. Journal of Educational Leadership, 69, 368-372. Carnine, D. W. (1976b). Effects of two teacher presentation rates on off-task behavior, answering correctly, and participation. Journal of Applied Behavior Analysis, 9, 199-206. Carnine, D. W. (1981). Reducing training problems associated with visually and auditorily similar correspondences. Journal of Learning Disabilities, 14, 276- 279. Carnine, D. W. (1992). Expanding the notion of teachers? rights: Access to tools that work. Journal of Applied Behavior Analysis, 25, 13-19. Carnine, D. W. (1995). The professional context for collaboration and collaborative research. Remedial and Special Education, 16, 368-371. Carnine, D. W., Silbert, J., Kame?enui, E. J., & Tarver, S. G. (2004). Direct Instruction reading (4 th ed.). Columbus, OH: Pearson Education, Inc. Carnine, D. W., Silbert, J., Kame?enui, E. J., Tarver, S. G., & Jungjohann, K. (2006). Teaching struggling and at-risk readers: A Direct Instruction approach. Columbus, OH: Pearson Education, Inc. 141 Centers for Disease Control and Prevention. (2007). Prevalence of autism spectrum disorders and developmental disabilities monitoring 14 sites, United States, 2002. Morbidity and Mortality Weekly Report, 56, 1-11. Chiang, H. M., & Lin, Y. H. (2007). Reading comprehension instruction for students with autism spectrum disorders: A review of the literature. Focus on Autism and Other Developmental Disabilities, 22, 259-267. Cohen, E. T., Heller, K. W., Alberto, P., & Fredrick, L. D. (2008). Using a three-step decoding strategy with constant time delay to teach word reading to students with mild and moderate mental retardation. Focus on Autism and Other Developmental Disabilities, 23, 67-78. Commission on Excellence in Special Education (2002). A new era: Revitalizing special education for children and their families (USDOE Publication No. ED-02-PO- 0791). Washington, DC: Author. Conners, F. A. (1992). Reading instruction for students with moderate mental retardation: Review and analysis of research. American Journal of Mental Retardation, 96, 577-597. Conners, F. A. (2003). 
Reading skills and cognitive abilities of individuals with mental retardation. In L. Abbeduto (Ed.), International review of research in mental retardation (Vol. 27, pp. 191-229). San Diego, CA: San Diego Academic Press. Conners, F. A., Rosenquist, C. J., Sligh, A. C., Atwell, J. A., & Kiser, T. (2006). Phonological reading skills acquisition by children with mental retardation. Research in Developmental Disabilities, 27, 121-137. 142 Cunningham, A. E. (1990). Explicit versus implicit instruction in phonemic awareness. Journal of Experimental Child Psychology, 50, 429-444. Cunningham, J. W. (2001). The National Reading Panel report. Reading Research Quarterly, 36, 326-335. Engelmann, S. (2007). Student-program alignment and teaching to mastery. Journal of Direct Instruction, 7, 45-66. Engelmann, S., & Carnine, D. W. (1982). Theory of instruction: Principles and applications. New York: Irvington Publishers, Inc. Engelmann, S., Osborn, J., Bruner, E. C., Engelmann, O., & Seitz-Davis, K. L. (2002). Reading Mastery Plus: Direct Instruction reading ? Level K. Columbus, OH: SRA/McGraw Hill. Evers, W. M. (Ed.). (1998). What?s gone wrong in America?s classrooms. Stanford, CA: Hoover Institution Press. Fink, W. T., & Sandall, S. R. (1979). One to one versus group academic instruction with handicapped and non-handicapped preschool children. Mental Retardation, 16, 236-240. Fisher, S., & DeFries, J. C. (2002). Developmental dyslexia: Genetic dissection of a complex cognitive trait. Nature Reviews Neuroscience, 3, 767-780. Flores, M. M., & Ganz, J. B. (2007). Effectiveness of Direct Instruction for teaching statement inference, use of facts, and analogies to students with developmental disabilities and delays. Focus on Autism and Other Developmental Disabilities, 22, 244-251. 143 Flores, M. M., Shippen, M. E., & Alberto, P. A., & Crowe, L. (2004). Teaching letter- sound correspondences to students with moderate intellectual disabilities. Journal of Direct Instruction, 4, 173-188. Fontenelle, S., & Alarcon, M. (1982). Hyperlexia: Precocious word recognition in developmentally delayed children. Perceptual and Motor Skills, 55, 247-252. Foorman, B R., Francis, D. J., Beeler, T., Winikates, D., & Fletcher, J. M. (1997). Early intervention for children with reading problems: Study designs and preliminary findings. Learning Disabilities, 8, 63-71. Fossett, B., & Mirenda, P. (2006). Sight word reading in children with developmental disabilities: A comparison of paired associate and picture-to-text matching instruction. Research in Developmental Disabilities, 27, 411-429. Gable, R. A., & Warren, S. F. (1993). The enduring value of instructional research. In R. Gable & S. F. Warren (Eds.), Advances in mental retardation and developmental disabilities: Strategies for teaching students with mild to sever mental retardation (pp. 1-7). Philadelphia: Jessica Kingsley. Gersten, R. (1985). Direct Instruction with special education students: A review of evaluation research. The Journal of Special Education, 19, 41-58. Gersten, R. (2001). Sorting out the roles of research in the improvement of practice. Learning Disabilities Research and Practice, 16, 45-50. Gersten, R., & Maggs, A. (1982). Teaching the general case to moderately retarded children: Evaluation of a five year project. Analysis and Intervention in Developmental Disabilities, 2, 329-343. Gilliam, J. E. (1995). Gilliam autism rating scale. Austin: ProEd. 144 Gilliam, J. E. (2003). Gilliam asperger?s disorder scale. Austin: ProEd. Gillon, G. T. (2004). 
Phonological awareness: From research to practice. New York: Guilford Press. Glang, A., Singer, G., Cooley, E., & Tish, N. (1992). Tailoring Direct Instruction techniques for use with elementary students with brain injury. Journal of Head Trauma Rehabilitation, 7, 93-108. Goodman, K. S. (1968). The psycholinguistic nature of the reading process. In K. S. Goodman (Ed.), The psycholinguistic nature of the reading process (pp. 13-26). Detroit: Wayne State University Press. Goodman, K. S. (1982). What is universal about the reading process? In F. V. Gollasch (Ed.), Language and literacy: The selected writings of Kenneth S. Goodman (Vol. 1, pp. 71-75). Boston: Routledge & Kegan Paul. Goodman, K. S. (1992). I didn?t found whole language. The Reading Teacher, 46, 188- 198. Goswami, U., & Bryant, P. (1990). Phonological skills and learning to read. East Sussex, UK: Lawrence Erlbaum Associates Ltd., Publishers Grossen, B., & Kelly, B. F. (1992). The effectiveness of Direct Instruction in a third- world context. International Review of Education, 38, 81-85. Hart, B., & Risley, T. R. (1975). Incidental teaching of language in the preschool. Journal of Applied Behavior Analysis, 8, 411-420. Hart, B., & Risley, T. R. (1995). Meaningful differences in the everyday experience of young American children. Baltimore: Brookes. 145 Heward, W. L. (2005). Reasons applied behavior analysis is good for education and why those reasons have been insufficient. In W. L. Heward et al. (Eds.), Focus on behavior analysis in education: Achievements, challenges, and opportunities (pp. 316-348). Columbus, OH: Pearson Education, Inc. Hoogeven, F. R., & Smeets, P. M. (1988). Establishing phoneme blending in trainable mentally retarded children. Remedial and Special Education, 9, 46-53. Hoogeveen, F. R., Smeets, P. M., & Lancioni, G. E. (1989). Teaching moderately mentally retarded children basic reading skills. Research in Developmental Disabilities, 10, 1-18. Hoogeveen, F. R., Smeets, P. M., & van der Houven, J. E. (1987). Establishing letter- sound correspondences in children classified as trainable mentally retarded. Education and Training in Mental Retardation and Developmental Disabilities, 22, 77-84. House, E. R., Glass, G. V., McLean, L. D., & Walker, D. F. (1978). No simple answer: Critique of the Follow Through evaluation. Harvard Educational Review, 48, 128-160. Humphries, T., Neufeld, M., Johnson, C., Engles, K., & McKay, R. (2005). A pilot study of the effect of Direct Instruction programming on the academic performance of students with intractable epilepsy. Epilepsy & Behavior, 6, 405-412. Infantino, J., & Hempenstall, K. (2006). Effects of a decoding program on a child with Autism Spectrum Disorder. Australasian Journal of Special Education, 30, 126- 144. 146 Johnston, J. M., & Pennypacker, H. S. (1993). Strategies and tactics of behavioral research (2 nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers. Joseph, J., Noble, K., & Eden, G. (2001). The neurobiological basis of reading. Journal of Learning Disabilities, 34, 566-579. Joseph, L. M., & Seery, M. E. (2004). Where is the phonics?: A review of the literature on the use of phonetic analysis with students with mental retardation. Remedial and Special Education, 25, 88-94. Katims, D. (2000). Literacy instruction for people with mental retardation: Historical highlights and contemporary analysis. Education and Training in the Mental Retardation and Developmental Disabilities, 35, 3-15. Kemp, C. (1996). 
Does teaching young children with disabilities to read facilitate their language development? A critical review of current theory and empirical evidence. International Journal of Disability, Development and Education, 43, 175-187. Kim, T., & Axelrod, S. (2005). Direct Instruction: An educators? guide and a plea for action. The Behavior Analyst Today, 6, 111-120. Krashen, S. (2002). Defending whole language: The limits of phonics instruction and the efficacy of whole language instruction. Reading Improvement, 39, 32-42. Layng, T. V., Twyman, J. S., & Stikeleather, G. (2003). Headsprout early reading: Reliably teaching children to read. Behavioral Technology Today, 3, 7-20. Leach, D. J., & Siddall, S. W. (1990). Parental involvement in the teaching of reading: A comparison of hearing reading, paired reading, pause, prompt, praise, and direct instruction methods. British Journal of Educational Psychology, 60, 349-355. 147 Lecavalier, L. (2005). An evaluation of the Gilliam Autism Rating Scale. Journal of Autism and Developmental Disabilities, 35, 795-805. Legault, A., Maloney, M., & Giroux, N. (2000). Learning rates with Direct Instruction, Precision Teaching and the Corrective Reading Series. Journal of Precision Teaching and Celeration, 17, 89-91. Lindsley, O. R. (1992). Why aren?t effective teaching tools widely adopted? Journal of Applied Behavior Analysis, 25, 21-26. Lockery, M., & Maggs, A. (1982). Direct Instruction research in Australia: A ten-year analysis. Educational Psychology, 2, 263-288 Lonigan, C. J., Anthony, J. L., Bloomfield, B. G., Dyer, S. M., & Samwel, C. S. (1999). Effects of two shared-reading interventions on emergent literacy skills of at-risk preschoolers. Journal of Early Intervention, 22, 306-322. Lonigan, C. J., Burgess, S. R., Anthony, J. L., & Barker, T. A. (1998). Development of phonological sensitivity in 2-to 5-year-old children. Journal of Educational Psychology, 90, 294-311. Lundberg, I., Frost, J., & Petersen, O. (1988). Effects of an extensive program for stimulating phonological awareness in preschool children. Reading Research Quarterly, 23, 263-284. Lundberg, I., Olofsson, A., & Wall, S. (1980). Reading and spelling skills in the first school years predicted from phonemic awareness skills in kindergarten. Scandinavian Journal of Psychology, 21, 159-173. 148 Mace, F. C., Hock, M. L., Lalli, J. S., West, B. J., Belfiore, P., Pinter, E., & Brown, D. K. (1988). Behavioral momentum in the treatment of noncompliance. Journal of Applied Behavior Analysis, 21, 123-141. Marchand-Martella, N., Lignugaris-Kraft, B., Pettigrew, T., & Leishman, R. (1995). Direct Instruction supervision system. Logan, UT: Utah State University. Marchand-Martella, N., & Martella, R. (2002). An overview and research summary of peer-delivered corrective reading instruction. The Behavior Analyst Today, 3, 213-220. Marchand-Martella, N., Slocum, T., & Martella, R. (2004). Introduction to Direct Instruction. Boston: Pearson Education, Inc. McCormick, S. (1990). A case for the use of single-subject methodology in reading research. Journal of Research in Reading, 13, 69-81. Moore, C., Evans, D., & Dowson, M. (2005). The intricate nature of phonological awareness instruction. Special Education Perspectives, 14, 37-54. Nakano, Y., Kageyama, M., & Kinoshita, S. (1993). Using Direct Instruction to improve teacher performance, academic achievement, and classroom behavior in a Japanese public junior high school. Education and Treatment of Children, 16, 326-343. 
Nation, K., Clarke, P., Wright, B., & Williams, C. (2006). Patterns of reading ability in children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 36, 911-919. 149 National Center for Education Statistics. U.S. Department of Education. (1999). NAEP 1998 Reading report card for the nation. (NCES 1999-459). Washington, DC: Author. National Institute for Literacy. (2000). Workforce education. Retrieved February 6, 2007, from http://www.nifl.gov/nifl/facts/workforce.html National Institute of Child Health and Human Development. (1996). Thirty years of NICHD research: What we now know about how children learn to read. Effective School Practices, 15, 33-46. National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. (NIH Publication No. 00-4769).Washington, DC: U.S. Government Printing Office Neuman, S. B., & McCormick, S. (1995). Single-subject experimental research: Applications for literacy. Newark, DE: International Reading Association. New, R. S. (2001). Early literacy and developmentally appropriate practice: Rethinking the paradigm. In S. B. Neuman & D. K. Dickinson (Eds.). Handbook of Early Literacy Research (pp. 245-262). New York: The Guilford Press. O?Connor, I. M., & Klein, P. D., (2004). Exploration of strategies for facilitating the reading comprehension of high-functioning students with autism spectrum disorders. Journal of Autism and Developmental Disorders, 34, 115-127. 150 O?Connor, R., Jenkins, J., Cole, K., & Mills, P. (1993). Two approaches to reading instruction for children with disabilities: Does program design make a difference? Exceptional Children, 59, 312-323. O?Connor, R. E., Jenkins, J.R., Leicester, N., & Slocum, T. A. (1993). Teaching phonological awareness to young children with learning disabilities. Exceptional Children, 59, 532-546. Olofsson, A., & Lundberg, I. (1985). Evaluation of long term effects of phonemic awareness training in kindergarten: Illustrations of some methodological problems in evaluation research. Scandinavian Journal of Psychology, 26, 21-34. Olson, L. (1999). Researchers rate whole-school reform models. Education Week, 18(23), 1, 14-16. Paulesu, E., Demonet, J. F., Fazio, F., McCrory, E., Chanoine, V., Brunswick, N., et al. (2001). Dyslexia-cultural diversity and biological unity. Science, 291, 2165-2167. Paulson, L. H., Kelly, K. L., Jepson, S., van den Pol, R., Ashmore, R., Farrier, M., & Guilfoyle, S. (2004). The effects of an early reading curriculum on language and literacy development of Head Start children. Journal of Research in Childhood Education, 18, 169-178. Pennypacker, H. S., Gutierrez, A., & Lindlsey, O. R. (2003). Handbook of the Standard Celeration Chart: Deluxe edition. Gainesville, FL: Xerographics, Inc. Purcell, T., & Rosemary, C. A. (2008). Differentiating instruction in the preschool classroom: Bridging emergent literacy instruction and developmentally appropriate practice. In L. M. Justice & C. Vukelich (Eds.), Achieving Excellence in Preschool Literacy Instruction (pp. 221-241). New York: The Guilford Press. 151 Raines, S. C., & Canady, R. J. (1990). The whole language kindergarten. New York: Teachers College Press. Richards, T. L. (2001). Functional magnetic resonance imaging and spectroscopic imaging of the brain: Applications of fMRI and fMRS to reading disabilities and education. 
Learning Disability Quarterly, 24, 189-203. Rosenshine, B. (1976). Recent research on teaching behavior and student achievement. Journal of Teacher Education, 27, 61-64. Scarborough, H. S. (1990). Very early language deficits in dyslexic children. Child Development, 61, 1728-1743. Schieffer, C., Marchand-Martella, N. E., Martella, R. C., Simonsen, F. L., & Waldron- Soler, K. M. (2002). An analysis of the Reading Mastery program: Effective components and research review. Journal of Direct Instruction, 2, 87-119. Scruggs, T. E., & Mastropieri, M. A. (1993). Teaching students with mild mental retardation. In R. A. Gable & S. F. Warren (Eds.), Strategies for teaching students with mild to severe mental retardation (pp. 117-125). Baltimore: P. H. Brookes Publishing. Shankweiler, D., Lundquist, E., Katz, L., Stuebing, K. K., Fletcher, J. M., Brady, S., et al. (1999). Comprehension and decoding: Patterns of association in children with reading difficulties. Scientific Studies of Reading, 3, 69-94. Share, D. L. (1995). Phonological recoding and self-teaching: Sine qua non of reading acquisition. Cognition, 55, 151-218. Shaywitz, S. E., & Shaywitz, B. A. (2004). Disability and the brain. Educational Leadership, 61, 6-11. 152 Sigafoos, J., Schlosser, R. W., Green, V. A., O?Reilly, M., & Lancioni, G. E. (2008). Communication and Social Skills Assessment. In J. Matson (Ed.), Clinical Assessment and Intervention for Autism Spectrum Disorders (pp. 165-192). Boston: Academic Press. Simmons, D. C., & Kame?enui, E. J. (Eds.) (1998). What reading research tells us about children with diverse learning needs: Bases and basics. Mahwah, NJ: Lawrence Erlbaum Associates. Simos, P. G., Fletcher, J. M., Bergman, E., Breier, J. I., Foorman, B. R., Castillo, E. M., et al. (2002). Dyslexia-specific brain activation profile becomes normal following successful remedial training. Neurology, 58, 1203-1213. Singh, N. N., & Singh, J. (1988). Increasing oral reading proficiency through overcorrection and phonic analysis. American Journal on Mental Retardation, 93, 312-319. Smith, S. B., Simmons, D. C., & Kame?enui, E. J. (1998). Phonological awareness: Research bases. In D. C. Simmons & E. J. Kame?enui (Eds.), What reading research tells us about children with diverse learning needs: Bases and basics (pp. 61-121). Mahwah, NJ: Lawrence Erlbaum Associates. Snow, C. E., Burns, S., & Griffin, P. (1998). Preventing reading difficulties in young children. Washington, DC: National Academy Press. South, M., Williams, B. J., McMahon, W. M., Owley, T., Filipek, P. A., Shernoff, E., et al. (2002). Utility of the Gilliam Autism Rating Scale in research and clinical populations. Journal of Autism and Developmental Disabilities, 32, 593-599. 153 Stahl, S. A., & Miller, P. D. (1989). Whole language and language experience approaches for beginning reading: A quantitative research synthesis. Review of Educational Research, 59, 87-116. Stanovich, K. E. (1986). Matthew effects in reading: Some consequences of individual differences in the acquisition of literacy. Reading Research Quarterly, 21, 360- 406. Stein, M., Carnine, D. W., & Dixon, R. (1998). Direct Instruction: Integrating curriculum design and effective teaching practice. Intervention in School and Clinic, 33, 227- 235. Strauss, S. L. (2005). The linguistics, neurology, and politics of phonics: Silent ?e? speaks out. Mahwah, New Jersey: Lawrence Erlbaum Associates. Testing law may change to accommodate disabled students. (February 12, 2007). 
Retrieved from http://www.cnn.com/2007/EDUCATION/02/12/education.law.changes.ap Torgesen, J. K. (1997). The prevention and remediation of reading disabilities: Evaluating what we know from research. Journal of Academic Language Therapy, 1, 11-47. Twyman, J. S., Layng, T., Stikeleather, G., & Hobbins, K. A. (2005). A nonlinear approach to curriculum design: The role of behavior analysis in building an effective reading program. In W. L. Heward et al. (Eds.), Focus on behavior analysis in education: Achievements, challenges, and opportunities (pp. 55-68). Columbus, OH: Pearson Education, Inc. 154 U. S. Department of Education. (2002). No Child Left Behind: A desktop reference. Retrieved February 10, 2007, from http://www.ed.gov/admins/lead/account/nclbreference/reference.pdf Vellutino, F. R. (1991). Introduction to three studies on reading acquisition: Convergent findings on theoretical foundations of code-oriented versus whole-language approaches to reading instruction. Journal of Educational Psychology, 83, 437- 443. Watkins, C. L. (1996). Follow through: Why didn?t we? Effective School Practices, 15, 57-66. Weaver, C. (2002). Reading process and practice (3 rd ed.). Portsmouth, NH: Heinemann Weisberg, P. (1988). Direct Instruction in the preschool. Education and Treatment of Children, 11, 349-363. Weisberg, P., & Savard, C. F. (1993). Teaching preschoolers to read: Don?t stop between the sounds when segmenting words. Education and Treatment of Children, 16, 1- 18. White, O. R. (2003). The finder book for the standard celeration chart. Retrieved on April 2, 2008, from the University of Washington, College of Education Website: http://courses.washington.edu/edspe510/ White, W. A. T. (1988). A meta-analysis of the effects of Direct Instruction in special education. Education & Treatment of Children, 11, 364-374. Williamson, P., McLeskey, J., Hoppey, D., & Rentz, T. (2006). Educating students with mental retardation in general education classrooms. Exceptional Children, 72, 347-361. 155 Zimmerman, I., Steiner, V., & Pond, R. (2003). Pre-school language scale-IV. San Antonio, TX: Psychological Corporation. 156 APPENDIX 157 Table A1. Key Terms in Beginning Reading Instruction Term Definition Grapheme Phoneme Onset Rime Phonemic awareness Phonological awareness Phonics Synthetic phonics Analytic phonics Embedded phonics A written letter symbol used to represent a phoneme Ex: The printed letter ?A? The smallest unit of sound in a language. Ex: The sound that /a/ makes in the word ?fan? The initial consonant(s) in a word that occur before the first vowel Ex: The letters ?st? in the word ?stain? The first vowel(s) in a word and any consonant(s) that follow Ex: The letters ?ain? in the word ?stain? The understanding that each word consists of individual phonemes. Ex: The word ?fan? consists of the phonemes /f/ /a/ /n/ The understanding that language consists of larger units, including: words, onsets, rimes, syllables, and phonemes The relationships between written letters (graphemes) and spoken language (phonemes) Teaching students explicitly to convert letters into sounds and then blend the sounds to form words Teaching students to analyze letter-sound relations in previously learned words to avoid pronouncing sounds in isolation Teaching students phonics during incidental teaching opportunities 158 Table A2. Project Follow Through Models (adapted from Marchand-Martella et al., 2004) Model Description Direct Instruction Curriculum emphasis was reading, math, and language. 
Carefully sequenced lessons specified teachers behaviors. Instruction in small groups with frequent assessment. Behavior Analysis Focused on reading, writing, spelling, and math. Progress was continuously monitored. A token economy was used along with programmed instructional material. Parent Education Curriculum objectives varied according to each child?s needs. Focus was on motivating and training parents to serve as teaching aides. Classroom instruction followed a Piagetian approach. Responsive Education Self-paced and self-determined instruction. Primary focus was on problem solving and self-confidence. Assumed that if high self- esteem was developed, acquisition of academic skills would follow. Bank Street Focused on development of creativity, self- esteem, and language to express ideas. Used instruction similar to Head Start. Open Education Development of imagination, self-esteem, and flexibility to change were stressed. Children initiated and terminated activities. Language Development Approach Stressed bilingual development. Taught material in Spanish and English. Tucson Early Educational Model (TEEM) Development of broad intellectual skills using an approach similar to whole language. Child-centered approach. Cognitively Oriented Curriculum Focus on children?s reasoning skills in science, math, and reading. Based on Piagetian theory. Child-centered approach. 159 Table A3. Reading Mastery Plus ? Level K Placement Test Sample Questions 1. Show me your nose. (Child must point to or touch his/her nose) 2. Show me your head. (Child must point to or touch his/her head) 3. Show me your ear. (Child must point to or touch his/her ear) 4. What?s your whole name? (Child must say first & last name; middle name is optional) 5. What?s your first name? (Child must say first name only) 6. Instructor points at the man. What is this man doing? (Accept child saying: sleeping, going to sleep, or lying down. Do not accept: sleep, eyes shut, or got to sleep, etc.) 7. Instructor says, ?My turn to say the whole thing. This man is sleeping. Say that.? (Child replies, ?This [or that] man is sleeping.?) 8. Instructor points to the girl. What is this girl doing? (Accept child saying: eating, eating a hamburger, or an entire correct sentence. Do not accept: eat, eat a hamburger, etc.) 9. Instructor says, ?My turn to say the whole thing. This girl is eating. Say that.? (Child replies, ?This girl is eating or This girl is eating a hamburger.?) 160 Table A4. Sequence of Letter-sound Correspondence Introduction in Reading Mastery Plus ? Level K Symbol Pronounced As in a aaa and m mmm ram s sss bus ? ? ? ? eat r rrr bar d ddd mad f fff stuff i iii if th ththth this and bathe t t cat n nnn pan c c tack o ooo ox 161 Figure A1. Direct Instruction checklist (as adapted from Marchand-Martella, et al., 1995). 162 Figure A2. Direct Instruction observation form (as adapted from Marchand-Martella, et al., 1995). 163 Figure A3. Direct Instruction ratings form (as adapted from Marchand-Martella, et al., 1995). 164 Figure A4. Direct Instruction general comment form (as adapted from Marchand- Martella, et al., 1995). 165 Figure A5. The cumulative number of errors to letter-sound correspondences ?a?, ?m?, and ?s? by Megan. 
0 1 2 3 4 5 6 7 8 9 10 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "a" Allison 0 1 2 3 4 5 1479121518212427303363942454851545760636697275788184879093969102 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "m" Allison 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "s" Allison 166 Figure A6. The cumulative number of errors to letter-sound correspondences ?e?, ?r?, and ?d? by Allison. 0 1 2 3 4 5 6 7 8 9 10 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "e" Allison 0 2 4 6 8 10 12 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "r" Allison 0 5 10 15 20 25 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "d" Allison 167 Figure A7. The cumulative number of errors to letter-sound correspondences ?f?, ?i?, and ?th? by Allison. 0 1 2 3 4 5 6 7 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "f" Allison 0 5 10 15 20 25 30 35 40 45 50 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "i" Allison 0 1 2 3 4 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "th" Allison 168 Figure A8. The cumulative number of errors to letter-sound correspondences ?a?, ?m?, and ?s? by Danielle. 0 2 4 6 8 10 12 14 16 18 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u mu la t i v e N u mb e r o f E rro r s Consecutive Calendar Days Letter-Sound Correspondence - "a" Danielle 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u mu la t i v e N u mb e r o f E r ro rs Consecutive Calendar Days Letter-Sound Correspondence - "m" Danielle 0 1 2 3 4 5 6 7 8 9 10 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u mu la t i v e N u mb e r o f E rro r s Consecutive Calendar Days Letter-Sound Correspondence - "s" Danielle 169 Figure A9. The cumulative number of errors to letter-sound correspondences ?e?, ?r?, and ?d? by Danielle. 
0 2 4 6 8 10 12 14 16 18 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r ro rs Consecutive Calendar Days Letter-Sound Correspondence - "e" Danielle 0 2 4 6 8 10 12 14 16 18 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u mu la t i v e N u mb e r o f E r ro rs Consecutive Calendar Days Letter-Sound Correspondence - "r" Danielle 0 5 10 15 20 25 30 35 40 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r ro rs Consecutive Calendar Days Letter-Sound Correspondence - "d" Danielle 170 Figure A10. The cumulative number of errors to letter-sound correspondences ?f?, ?i?, and ?th? by Danielle. 0 1 2 3 4 5 6 7 8 9 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "f" Danielle 0 5 10 15 20 25 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100103 C u mu la t i v e N u mb e r o f E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "i" Danielle 0 1 2 3 4 5 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 82 85 88 91 94 97 100 103 C u mu la t i v e N u mb e r o f E r ro rs Consecutive Calendar Days Letter-Sound Correspondence - "th" Danielle 171 Figure A11. The cumulative number of errors to letter-sound correspondences ?a?, ?m?, and ?s? by Megan. 0 1 2 3 4 5 6 7 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u m u l a t i v e N u m b e r o f E rrors Consecutive Calendar Days Letter-Sound Correspondence - "a" Megan 0 1 2 3 4 5 6 7 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "m" Megan 0 1 2 3 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "s" Megan 172 Figure A12. The cumulative number of errors to letter-sound correspondences ?e?, ?r?, and ?d? by Megan. 0 2 4 6 8 10 12 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u m u l a t i v e N u m b e r o f E rrors Consecutive Calendar Days Letter-Sound Correspondence - "e" Megan 0 2 4 6 8 10 12 14 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "r" Megan 0 1 2 3 4 5 6 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "d" Megan 173 Figure A13. The cumulative number of errors to letter-sound correspondences ?f?, ?i?, and ?th? by Megan. 
0 1 2 3 4 5 6 7 8 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u m u l a t i v e N u m b e r o f E rrors Consecutive Calendar Days Letter-Sound Correspondence - "f" Megan 0 5 10 15 20 25 30 35 40 45 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "i" Megan 0 1 2 3 4 5 6 7 8 9 10 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "th" Megan 174 Figure A14. The cumulative number of errors to letter-sound correspondences ?t?, ?n?, and ?c? by Megan. 0 1 2 3 4 5 6 7 8 9 10 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u m u l a t i v e N u m b e r o f E rrors Consecutive Calendar Days Letter-Sound Correspondence - "t" Megan 0 1 2 3 4 5 6 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "n" Megan 0 1 2 3 4 5 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 Cu m u l a t i v e N u m b e r o f Er r o r s Consecutive Calendar Days Letter-Sound Correspondence - "c" Megan 175 Figure A15. The cumulative number of errors to letter-sound correspondences ?a?, ?m?, and ?s? by Ricky. 0 1 2 3 4 5 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rrors Consecutive Calendar Days Letter-Sound Correspondence - "a" Ricky 0 1 2 3 4 5 6 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "m" Ricky 0 1 2 3 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "s" Ricky 176 Figure A16. The cumulative number of errors to letter-sound correspondences ?e?, ?r?, and ?d? by Ricky. 0 1 2 3 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "e" Ricky 0 1 2 3 4 5 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "r" Ricky 0 1 2 3 4 5 6 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "d" Ricky 177 Figure A17. The cumulative number of errors to letter-sound correspondences ?f?, ?i?, and ?th? by Ricky. 
0 1 2 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "f" Ricky 0 1 2 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "i" Ricky 0 1 2 3 4 5 6 7 8 9 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "th" Ricky 178 Figure A18. The cumulative number of errors to letter-sound correspondences ?a?, ?m?, and ?s? by Omar. 0 5 10 15 20 25 30 35 1 4 7 101316192225283134374043464952555861646770737679 C u m u l a t i ve N u m b er of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "a" Omar 0 2 4 6 8 10 12 14 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "m" Omar 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 1 4 7 101316192225283134374043464952555861646770737679 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "s" Omar 179 Figure A19. The cumulative number of errors to letter-sound correspondences ?e?, ?r?, and ?d? by Omar. 0 1 2 3 4 5 6 1 4 7 101316192225283134374043464952555861646770737679 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Letter-Sound Correspondence - "e" Omar 0 5 10 15 20 25 30 1 4 7 101316192225283134374043464952555861646770737679 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "r" Omar 0 5 10 15 20 25 30 35 40 45 50 1 4 7 101316192225283134374043464952555861646770737679 C u m u l a t i v e N u m b e r o f E rro rs Consecutive Calendar Days Letter-Sound Correspondence - "d" Omar 180 Figure A20. The cumulative number of errors made by Allison during ?say it fast? exercises. 0 2 4 6 8 10 12 14 16 1 4 7 101316192225 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say it Fast Allison 181 Figure A21. The cumulative number of errors made by Danielle during ?say it fast? exercises. 0 2 4 6 8 10 12 14 123456789101121314151617181920212232425262728293031323 C u mu la t i v e N u mb e r o f E r r o r s Consecutive Calendar Days Say it Fast Danielle 182 Figure A22. The cumulative number of errors made by Megan during ?say it fast? exercises. 0 5 10 15 20 25 30 35 1 8 15 22 29 36 43 50 C u mu la t i v e N u m b e r o f E rro rs Consecutive Calendar Days Say it Fast Megan 183 Figure A23. The cumulative number of errors made by Ricky during ?say it fast? exercises. 0 2 4 6 8 10 12 14 135791131517192123252729313353739414345474951535575961636567697173757 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say it Fast Ricky 184 Figure A24. The cumulative number of errors made by Omar during ?say it fast? exercises. 0 2 4 6 8 10 12 14 16 18 20 1 4 7 1013161922252831343740434649525558 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say it Fast Omar 185 Figure A25. The cumulative number of errors made by Allison during ?say the sounds? exercises. 0 1 2 3 4 5 6 1 4 7 101316192 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds Allison 186 Figure A26. 
The cumulative number of errors made by Danielle during ?say the sounds? exercises. 0 1 2 3 4 5 6 7 8 1234567891011213141516171819202122324 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds Danielle 187 Figure A27. The cumulative number of errors made by Megan during ?say the sounds? exercises. 0 1 2 3 1 2 3 4 5 6 7 8 9 1011121314151617181920212223242526272829 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds Megan 188 Figure A28. The cumulative number of errors made by Ricky during ?say the sounds? exercises. 0 1 2 3 4 5 1234567891011213141516171819202122324252627282930313233435363738 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds Ricky 189 Figure A29. The cumulative number of errors made by Omar during ?say the sounds? exercises. 0 1 2 3 4 5 6 7 8 1234567891011213141516171819202122324 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds Omar 190 Figure A30. The cumulative number of errors made by Allison during ?say the sounds- say it fast? exercises. 0 2 4 6 8 10 12 14 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds -Say it Fast Allison 191 Figure A31. The cumulative number of errors made by Danielle during ?say the sounds- say it fast? exercises. 0 2 4 6 8 10 12 14 16 18 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds-Say it Fast Danielle 192 Figure A32. The cumulative number of errors made by Megan during ?say the sounds- say it fast? exercises. 0 2 4 6 8 10 12 14 16 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds-Say it Fast Megan 193 Figure A33. The cumulative number of errors made by Ricky during ?say the sounds-say it fast? exercises. 0 1 2 3 4 5 6 7 8 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds-Say it Fast Ricky 194 Figure A34. The cumulative number of errors made by Omar during ?say the sounds-say it fast? exercises. 0 2 4 6 8 10 12 14 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Say the Sounds-Say it Fast Omar 195 Figure A35. The cumulative number of errors made by Allison during ?sounding out? exercises. 0 1 2 3 4 5 6 7 1 4 7 10131619222528313437404346495255586164 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Sounding Out Allison 196 Figure A36. The cumulative number of errors made by Danielle during ?sounding out? exercises. 0 2 4 6 8 10 12 14 16 18 20 1 3 5 7 9 11131517192123252729313335373941434547495153555759616365676971 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Sounding Out Danielle 197 Figure A37. The cumulative number of errors made by Megan during ?sounding out? exercises. 0 1 2 3 4 1 4 7 1013161922252831343740434649525558 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Sounding Out Megan 198 Figure A38. The cumulative number of errors made by Ricky during ?sounding out? exercises. 
0 1 2 3 4 5 6 7 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Sounding Out Ricky 199 Figure A39. The cumulative number of errors made by Omar during ?sounding out? exercises. 0 5 10 15 20 25 30 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Sounding Out Omar 200 Figure A40. The cumulative number of errors made by Allison during ?reading vocabulary? exercises. 0 2 4 6 8 10 12 14 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Reading Vocabulary Allison 201 Figure A41. The cumulative number of errors made by Danielle during ?reading vocabulary? exercises. 0 5 10 15 20 25 1 4 7 101316192225283134374043464952555861646770737679828588919497100103 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Reading Vocabulary Danielle 202 Figure A42. The cumulative number of errors made by Megan during ?reading vocabulary? exercises. 0 5 10 15 20 25 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 204 211 C u mu la t i v e N u m b e r o f E rro rs Consecutive Calendar Days Reading Vocabulary Megan 203 Figure A43. The cumulative number of errors made by Ricky during ?reading vocabulary? exercises. 0 1 2 3 1 8 15 22 29 36 43 50 57 64 71 78 85 92 99 106 113 120 127 134 141 148 155 162 169 176 183 190 197 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Reading Vocabulary Ricky 204 Figure A44. The cumulative number of errors made by Omar during ?reading vocabulary? exercises. 0 1 2 3 4 5 6 7 8 9 10 1 4 7 10 13 16 19 22 25 28 31 34 37 40 43 46 49 52 55 58 61 64 67 70 73 76 79 C u m u l a t i ve N u m b e r of E r r o r s Consecutive Calendar Days Reading Vocabulary Omar 205 Figure A45. The cumulative number of errors made by Allison across all of the skill sets. 0 5 10 15 20 25 30 35 40 45 LS-a LS-m LS-s LS-e LS-r LS-d LS-f LS-i LS-th SF SS SSSF SO RV N u m b e r o f E r ro rs Skill Set Total Number of Errors Across Skills Allison 206 Figure A46. The cumulative number of errors made by Danielle across all of the skill sets. 0 5 10 15 20 25 30 35 40 LS-a LS-m LS-s LS-e LS-r LS-d LS-f LS-i LS-th SF SS SSSF SO RV N u m b e r o f E r ro rs Skill Set Total Number of Errors Across Skills Danielle 207 Figure A47. The cumulative number of errors made by Megan across all of the skill sets. 0 5 10 15 20 25 30 35 40 45 LS-a LS-m LS-s LS-e LS-r LS-d LS-f LS-i LS-th LS-t LS-n SF SS SSSF SO RV Nu m b e r o f E r r o r s Skill Set Total Number of Errors by Skill Megan 208 Figure A48. The cumulative number of errors made by Ricky across all of the skill sets. 0 2 4 6 8 10 12 14 LS-a LS-m LS-s LS-e LS-r LS-d LS-f LS-i LS-th SF SS SSSF SO RV Nu m b e r o f E r r o r s Skill Set Total Number of Errors Across Skills Ricky 209 Figure A49. The cumulative number of errors made by Omar across all of the skill sets. 0 5 10 15 20 25 30 35 40 45 LS-a LS-m LS-s LS-e LS-r LS-d SF SS SSSF SO RV N u mb e r o f E rro rs Skill Set Total Number of Errors Across Skills Omar