Comparing the Effectiveness of Reform Pedagogy to Traditional Didactic Lecture Methods in Teaching Remedial Mathematics at Four-Year Universities

by

Luke Alexander Smith

A dissertation submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Doctor of Philosophy

Auburn, Alabama
August 3, 2013

Keywords: remedial mathematics, reform mathematics, postsecondary education

Approved by
W. Gary Martin, Chair, Professor of Curriculum and Teaching
Marilyn E. Strutchens, Professor of Curriculum and Teaching
David Shannon, Professor of Educational Foundations, Leadership and Technology
Stephen E. Stuckwisch, Assistant Professor of Mathematics

Abstract

Postsecondary remedial mathematics courses often have relatively low pass rates compared to other courses (Bahr, 2008; Virginia College Community System, 2011) and have contributed to the view that mathematics is a gatekeeper for college success (Epper & Baker, 2009). This study addressed this situation by exploring recommendations made by various organizations, including the National Council of Teachers of Mathematics (NCTM) (2009, 2006, 2000, 1989), the American Mathematical Association of Two-Year Colleges (AMATYC) (2006), and the Mathematical Association of America's Committee on the Undergraduate Program in Mathematics (2011), to improve student learning in mathematics courses through various pedagogical techniques; in this study, the pedagogical practices advocated by these organizations are collectively referred to as "reform mathematics." The study was conducted at a mid-sized university in the southern United States. A quasi-experimental design was used to investigate the effectiveness of incorporating reform mathematics practices, as compared to didactic lecture techniques, in improving student success in remedial mathematics courses. Student success was measured in terms of pass rates, procedural ability, application ability, and change in mathematical self-efficacy. Repeated Measures ANOVAs, t-tests, and Fisher Exact tests were used to determine whether the treatment had an effect on student achievement variables. Additionally, qualitative data were gathered from students who were enrolled in the reform-oriented course to examine their perceptions of key aspects of reform mathematics instruction. While the results were not statistically significant, the trends within the data suggest that students may benefit from reform-oriented instruction.

Acknowledgments

First and foremost, I would like to thank my Lord and Savior Jesus Christ for providing me with the physical, mental, and emotional resources to complete this degree. I am awed by how He put all the pieces together to make this accomplishment possible. I would like to thank my wife, Michelle, for enduring all the nights that I was away studying while she diligently cared for our children. Your love and support throughout this program strengthened my resolve to see it through to its successful conclusion. To my mother and father, Guadalupe and Edgar Smith, I thank you for raising me and instilling within me your values and for putting me in schools where the teachers instilled a love for learning; I hope that I too may be able to do the same for my children. To my mother- and father-in-law, Cheri and Gary Maxwell, I thank you for taking care of my wife and kids while I was away from home. To my grandmother, Ruth Smith, I thank you for advocating the benefits of higher education throughout my childhood.
To my precious children, Katelyn, Kimberly, Kara, and Karlie, I thank you for being such great kids; although you may not understand it now, each of you gave me an immediate reason to understand better how to teach. I must also thank my coworkers for their support in making this study possible. To my supervisor, Susan Barganier, I thank you for providing me unrelenting support towards completing this program. To Dr. Lee, Dr. Schmidt, Dr. Peele, Dr. Smith, Dr. Boronski, Dr. Ray, Ms. Tomblin, and Mrs. Warren, I thank you for your continued support; each of you contributed towards my degree in invaluable ways. To Anna Wan, I thank you for the tremendous amount of support you gave, both by providing key resources and by making available your time, energy, and expertise. To Lisa Ross, Beth Hickman, and Dr. Gilbert Duenas, I thank you for helping me to set up my project for my study. To Dr. David Shannon, I thank you for meeting with me many times throughout this project and for helping me with the methodology and statistical processes used within this study. To Dr. Stephen Stuckwisch, I thank you for modeling how to make mathematics fun in your classes. To Dr. W. Gary Martin and Dr. Marilyn Strutchens, I thank you for showing me a completely new way of viewing teaching and for modeling that manner of teaching within your own classes. I would like to thank Dr. Martin specifically for the many meetings we had throughout the years in which he helped me understand how to improve my work; I am grateful for his investing his time and energy into making me a better teacher and researcher.

Table of Contents

Abstract
Acknowledgments
List of Figures
List of Tables
1. Introduction
   Strengths and Weaknesses of Developmental Education
   Reform Mathematics Pedagogy
   Purpose of the Study
2. Review of Related Literature
   Characteristics of Students Who Take Remedial Courses
   Effectiveness of Remedial Mathematics Courses in Postsecondary Education
   Efforts Made to Improve Student Success in Remedial Mathematics Courses
      Computer-Based Assistance for Students in Remedial Mathematics Courses
      Shortening the Length of Developmental Mathematics Programs
   A Promising Approach: Reform Mathematics Pedagogy
      Recommendations for K-12 Mathematics
         Principles
         Process Standards
         The Common Core
         The Equitable Nature of Reform Mathematics Pedagogy
      Recommendations for Post-secondary Mathematics Students
      Recommendations for Underprepared Post-secondary Mathematics Students
      The Effects of Reform-Oriented Classrooms on Student Achievement
         Student Achievement in Middle School Reform-Oriented Classrooms
         Student Achievement in Secondary Reform-Oriented Classrooms
         Student Achievement in College-level Reform-Oriented Classrooms
         Student Achievement in Remedial Postsecondary Reform-Oriented Classrooms
   Effects of Self-Efficacy on Student Achievement
   Synthesis of Relevant Studies
   Theoretical Framework
   Research Questions
CHAPTER 3: METHODOLOGY
   Design
   Context
      My Personal Background
   Description of Sample
   Instrumentation
      Dependent Measures
         Pass rates
         Procedural skills
         Application skills
         Mathematical self-efficacy
         Perspectives of treatments
      Validity and Reliability
         Procedural and application scores
         Reformed Teaching Observation Protocol
      Covariates
   Procedure
      Control Group
      Experimental Group Treatment
   Data Analysis
      Establishing Validity
      Selecting Covariates
      Analysis of Effects
      Qualitative Analysis
   Summary
CHAPTER 4: RESULTS
   Summary of Events
   Integrity of Treatment
   Inter-rater Reliability of Tests
   Quantitative Results
      Selecting Covariates
      Research Question 1: Procedural Skills
      Research Question 2: Application Skills
         Analysis of Procedural vs. Application Skills
      Research Question 3: Pass Rates
      Research Question 4: Students' Change in Mathematics Self-Efficacy
      Summary of the Quantitative Results
   Qualitative Results
      Comparison of Treatments
      Efficacy of Control Treatment
      Research Question 5: Students' Views about Reform Mathematics
      Summary of Qualitative Results
CHAPTER 5: CONCLUSIONS AND IMPLICATIONS
   Limitations
   Conclusions
      Research Questions 1 and 2: Procedural and Application Skills
      Research Question 3: Pass Rates
      Research Question 4: Change in Mathematics Self-Efficacy
      Research Question 5: Student Response to the Experimental Treatment
   Implications
      Teachers
      Administrators
   Conclusion
References
Appendix A: Permission Forms
Appendix B: Student Surveys
Appendix C: Sample Application Problems
Appendix D: Reformed Teaching Observation Protocol
Appendix E: Paired Lesson Plans
Appendix F: Responses to Open-ended Student Surveys

List of Figures

Figure 1: A sample procedural problem with corresponding grading rubric
Figure 2: A sample application problem with corresponding grading rubric
Figure 3: Mean adjusted procedural scores for control and experimental groups
Figure 4: Mean adjusted application scores for control and experimental groups
Figure 5: A solution obtained through the use of pictures
Figure 6: A solution obtained through systematic trial and error
Figure 7: Mean pre- and post-mathematical self-efficacy scores

List of Tables

Table 1: Computer-based mathematics instruction
Table 2: Shortening the length of the developmental sequence
Table 3: Effects of reform-oriented instruction in middle school mathematics courses
Table 4: Effects of reform-oriented instruction in secondary school mathematics courses
Table 5: Effects of reform-oriented instruction in postsecondary mathematics courses
Table 6: Effects of reform-oriented instruction in postsecondary remedial mathematics courses
Table 7: Effects of self-efficacy on student performance
Table 8: Demographics of sample
Table 9: Summary of differences between traditional and reform-oriented instruction
Table 10: Differences in RTOP scores between control and experimental sections
Table 11: Summary of inter-rater reliability Pearson correlation values
Table 12: Differences in continuous variables between groups
Table 13: Differences in dichotomous variables between groups
Table 14: Summary of procedural scores for control and experimental groups
Table 15: Statistical analysis of procedural scores between groups
Table 16: Summary of procedural scores adjusted for race
Table 17: Comparison of final exam scores between control and experimental groups
Table 18: Summary of application scores for control and experimental groups
Table 19: Statistical analysis for the difference in application scores
Table 20: Summary of application scores adjusted for race
Table 21: Comparison of non-algebraic strategies on application questions between groups
Table 22: Summary of pass rates
Table 23: Summary of students' change in mathematics self-efficacy
Table 24: Statistical analysis for students' change in mathematics self-efficacy

1. Introduction

Increased levels of education have been shown to have positive impacts on individuals and society as a whole. Compared to students with less education, students who earn a bachelor's degree or higher are more likely to earn higher salaries, generate more tax revenue, live a healthier lifestyle, obtain health insurance, acquire pensions, and perform civic duties, and they are less likely to receive public assistance (Baum & Payea, 2004; Perna, 2005). However, in an effort to earn a postsecondary degree, many students have found that they were underprepared for postsecondary mathematics and were required to take remedial mathematics courses (Fike & Fike, 2007; Alliance for Excellent Education [AEE], 2011; Radford et al., 2012).

Remedial mathematics classes are available to help students develop mathematical skills that should have been obtained in secondary mathematics courses. In 2008, roughly 72% of all tertiary schools and 90% of public tertiary schools in the United States offered remedial courses (National Center for Education Statistics [NCES], 2008). Roughly 42% of first-time postsecondary students in 2003-2004 were required to take remedial mathematics courses (Radford et al., 2012), and students who took remedial mathematics classes often met all other admission standards (Duranczyk & Higbee, 2006). However, since the attrition rates of remedial mathematics courses have often been reported to be around 50% (Phoenix, 1990; Ellington, 2005; Attewell et al., 2006; Fike & Fike, 2007; Bahr, 2008; Virginia College Community System [VCCS], 2011), and the likelihood of a student's departure from the remedial mathematics program increases significantly with the number of remedial courses that the student is required to take (Hern, 2012; Bahr, 2012; Complete College America [CCA], 2012), it is not surprising that as many as 72% of students in developmental mathematics sequences never attempted a college-level mathematics course (Wolfle, 2012). Thus, mathematics has been viewed as a gatekeeper for college success (Massachusetts Community College Executive Office, 2006; Fike & Fike, 2007; Epper & Baker, 2009).
Since students who successfully complete remedial mathematics courses often perform as well in their academic pursuits as students who did not need remedial courses (Attewell et al., 2006; Bettinger & Long, 2009; Bahr, 2010), researchers have investigated various areas related to the successful completion of mathematics courses taken by college freshmen, including the benefits of online assessment and the effectiveness of implementing pedagogical practices that align with the reform mathematics movement. For the purposes of this paper, "reform mathematics" represents the pedagogical practices that are advocated by organizations such as the National Council of Teachers of Mathematics (NCTM) (2009, 2006, 2000, 1989), the American Mathematical Association of Two-Year Colleges (AMATYC) (2006), and the Mathematical Association of America's Committee on the Undergraduate Program in Mathematics (2011). These practices include active student learning, a diminished role of the instructor as a source of knowledge, and student exploration and experimentation before formal presentation of mathematical theorems. More details about reform mathematics will be presented in the review of literature. These methods have been found successful in some contexts. Thus, in this paper I present literature arguing that the pedagogical practices within remedial mathematics classrooms should be aligned more closely with the practices advocated by the reform mathematics movement, and that moving from traditional didactic lecture toward a more reform-oriented style of instruction will improve the quality of instruction for students in remedial courses.

Strengths and Weaknesses of Developmental Education

Before continuing, it is important to briefly clarify the meaning of remedial education and developmental education, since the general public and many scholars use the two terms interchangeably (Institute for Higher Education Policy, 1998; Kozeracki, 2002; Parmer & Cutler, 2007; Radford et al., 2012). Developmental education programs emphasize a holistic approach (Boylan, Bonham, & White, 1999) to assist individuals who have failed to meet placement requirements by providing them a variety of courses and services that focus primarily on reading, writing, mathematics, studying strategies, and other affective variables that are important for college success (Tomlinson, 1989; Boylan & Bonham, 2007). Remedial courses are a subset of developmental education and refer exclusively to courses that are not at college level (Boylan, Bonham, & White, 1999; NCES, 2004); they have served as the core of developmental education (Brothen & Wambach, 2004). For the purposes of this paper, the term developmental will refer to the programs enacted by colleges that provide a range of services for underprepared students, and the term remedial will refer to the coursework that is taken at postsecondary institutions but is below college level.

Developmental education offers significant benefits to students, institutions, and society as a whole by providing access and equal opportunity to higher education (Tomlinson, 1989; Mills, 1998; McCabe & Day, 1998; Goldrick-Rab, 2010; Gallard, Albritton, & Morgan, 2010; VCCS, 2011). Since an individual's educational attainment is a significant predictor of occupational status and financial earnings (Kerckhoff, Raudenbush, & Glennie, 2001), developmental education offers individuals a "last chance"
to obtain benefits associated with higher education by preparing them for postsecondary work (Tomlinson, 1989; McCabe & Day, 1998; Gerlaugh et al., 2007; Gallard, Albritton, & Morgan, 2010). Postsecondary remediation develops in students the minimum skills that are necessary to function in the economy and democracy (Bahr, 2008). Many of the jobs in today's society require skills that are made available to students through developmental mathematics programs (McCabe & Day, 1998; Goldrick-Rab, 2010). Because many of the students who benefit from developmental education are able to improve their skills, and thus do not have to compete for the shrinking number of low-skill jobs that are available, developmental education plays an essential role in reducing the number of individuals in welfare and prison populations by helping students to become independent and self-sufficient (McCabe & Day, 1998; Gallard, Albritton, & Morgan, 2010).

Objections are sometimes raised regarding the costs associated with developmental education programs (Bahr, 2008; Gallard, Albritton, & Morgan, 2010; AEE, 2011). For many legislatures, postsecondary remediation has symbolized the devaluation of academic standards in tertiary education and the failure of America's precollegiate educational system (Mills, 1998; Boylan & Bonham, 2007), and many legislatures have only recently begun to recognize the importance of developmental education (Boylan & Bonham, 2007). Despite these objections, the benefits to society far exceed the costs associated with implementing developmental education (McCabe & Day, 1998; Saxon & Boylan, 2001; Gallard, Albritton, & Morgan, 2010), and developmental education programs consistently generate sufficient revenue to cover the costs of delivering their services (Saxon & Boylan, 2001).

Reform Mathematics Pedagogy

The high failure rates present in many developmental programs may exist because a significant proportion of remedial students' academic backgrounds are so weak that they are unable to succeed in even pre-collegiate courses (Adelman, 1995). The traditional lecture techniques that are commonly used in college classrooms provide these students little benefit (Adelman, 1995); if lecture techniques had worked in middle and secondary education, these students would not need to enroll in remedial courses at the postsecondary level (Boylan & Saxon, 1999; Trenholm, 2006). A high percentage of students fail remedial mathematics courses (Hern, 2012). On the other hand, students who pass them often do as well as students who do not need remedial mathematics courses; thus, it becomes clear that remedial mathematics courses work well for some students but not for others (Bahr, 2008). Because of the substantial benefits to students and society that come with college success, improvements need to be made to remedial mathematics courses so that more students can complete these courses and move closer to achieving their college degrees.

The pedagogy advocated by the reform mathematics movement may be a solution to improving the level of student understanding in postsecondary remedial mathematics courses. The current reform movement in school mathematics advocates that students engage in exploring mathematical phenomena, making conjectures, and analyzing the validity of those conjectures.
Recommendations made by the above organizations include a shift from traditional didactic lecture (the teaching method in which the teacher is the primary dispenser of knowledge to a group of passively engaged students) toward student-oriented classrooms that encourage active student participation in the learning process through engagement in worthwhile problem solving, collaboration among students, multiple representations, and technology. Many instructors still present the material to students through rote lecture, the process whereby the instructor provides information to passive, uninvolved students (Fry, Ketteridge, & Marshall, 2003; White-Clark, DiCarlo, & Gilchriest, 2008). In comparison, students enrolled in mathematics courses that adhere to reform pedagogy generally perform at least as well as students in comparable lecture-based courses. These findings have held at the middle school (Reys et al., 2003; Thompson, 2009), high school (Hirschhorn, 1993; Schoen, Hirsch, & Ziebarth, 1998; Thompson & Senk, 2001; Cichon & Ellis, 2003), and postsecondary levels of education (Lawson et al., 2002; Erickson & Shore, 2003; Ellington, 2005; Gordon, 2006).

Purpose of the Study

Current research does not adequately address the effectiveness of various teaching strategies employed within remedial mathematics classrooms in colleges and universities. For example, although research has been done on the effectiveness of computer-based assistance in remedial mathematics courses in which lecture-based instruction was either supplemented or replaced by computer-based instruction (Villarreal, 2003; Walker & Senger, 2007; Squires, Faulkner, & Hite, 2009), the scope of these studies was limited either to the effects of stimuli outside classroom instruction or to the effects of replacing instructors with computers; neither approach examined the teaching practices of the instructors. Furthermore, several studies have been performed on the effectiveness of remedial mathematics instructors' pedagogical decisions with respect to cooperative learning, use of technology, and problem-oriented approaches to learning (Phoenix, 1990; Erickson & Shore, 2003; Ellington, 2005); however, these studies possessed limitations in their comparative designs. Multiple studies have shown that students who received instruction in accordance with reform mathematics pedagogy tend to do at least as well as traditionally taught students in procedural skills and often better on application problems (Hirschhorn, 1993; Schoen, Hirsch, & Ziebarth, 1998; Senk & Thompson, 2006).

This study compared the effectiveness of reform pedagogy to didactic lecture methods in teaching remedial mathematics at a four-year university. The study was guided by the following broad research question: Is teaching remedial mathematics in a reform-oriented manner beneficial to university students? Five subquestions were addressed as follows:

1. Is there a significant difference in the pass rates in the remedial mathematics courses between university students who receive instruction consistent with reform pedagogy versus university students who receive instruction through traditional didactic lecture methods?

2. Is there a significant difference in mathematical procedural ability between university students who receive instruction consistent with reform pedagogy versus university students who receive instruction through traditional didactic lecture methods?
3. Is there a significant difference in mathematical problem solving ability between university students who receive instruction consistent with reform pedagogy versus university students who receive instruction through traditional didactic lecture methods?

4. Does the self-efficacy of university students in the reform classes improve as a result of instruction received in the reform classes?

5. What views about reform instruction will university students who are enrolled in a reform-oriented remedial mathematics course express upon completing one semester of reform-oriented mathematics instruction?

2. Review of Related Literature

Having discussed the importance of improving student success in remedial mathematics courses, it is important to examine the efforts made by others to enable students to learn mathematics. In this chapter, I review the literature that is relevant to the study. First, I present a description of the characteristics of students who take remedial courses. Next, I present studies that address the effectiveness of remedial mathematics courses. Third, I describe efforts made to improve student success in remedial courses through computer-based assistance as well as by shortening the length of developmental mathematics programs. Fourth, I present the main tenets of reform mathematics pedagogy, a promising alternative approach to improving student success in remedial mathematics courses. I will present an overview of reform mathematics pedagogy as advocated by the National Council of Teachers of Mathematics (NCTM) for K-12 mathematics, followed by recommendations made by the American Mathematical Association of Two-Year Colleges (AMATYC) and the Mathematical Association of America (MAA) for undergraduate mathematics courses that serve underprepared students. I will also present studies that address the effects on student achievement that can occur when values aligned with reform pedagogy are adopted within mathematics classrooms. Fifth, I review literature that addresses the impact that mathematics self-efficacy can have on student success in mathematics. Sixth, I present my case for developing a study that would examine the effects of reform pedagogy on student achievement in post-secondary remedial mathematics courses. Lastly, I explain the theoretical framework that will serve as the underpinning of my study.

Characteristics of Students Who Take Remedial Courses

This section presents socio-demographic information regarding race, gender, age, and income levels of students in remedial classes. Descriptions of common prior academic experiences and obstacles faced by students in remedial courses are also presented. According to NCES (Sparks & Malkus, 2013), the percentage of students who took remedial courses dropped sharply from 1999 to 2003 but increased slightly from 2003 to 2007; thus, the net difference between the 1999 data and the 2007 data showed that a lower percentage of students were taking remedial courses. This trend occurred across characteristics such as race, gender, and age. The following data describe the trends for first-year undergraduate students who attended public institutions. According to the data collected by NCES (Sparks & Malkus, 2013), in 2007-2008, 23.3% of all first-year students reported enrolling in a remedial course, as compared to 22.1% in 2003-2004 and 28.8% in 1999-2000.
During the 2007-2008 academic school year, the percentages of African American, Hispanic, Asian/Pacific Islander, and White students who reported taking a remedial course were 30.2%, 29.0%, 22.5%, and 19.9%, respectively. Although slightly higher than the 2003-2004 data, in which the percentages of African American, Hispanic, Asian/Pacific Islander, and White students who reported taking a remedial course were 27.4%, 26.8%, 20.1%, and 19.7%, respectively, the 2007-2008 data are still lower than the 1999-2000 data, in which 37.7%, 37.8%, 34.9%, and 24.7%, respectively, reported taking a remedial course. Thus, two points should be emphasized from these sets of data. First, remedial courses continue to be needed by students entering postsecondary education. Second, minority students continue to be significantly overrepresented in remedial courses, a phenomenon documented by other research (Bailey, Jenkins, & Leinbach, 2005; Bailey & Morest, 2006; AEE, 2011).

According to data gathered by NCES (Sparks & Malkus, 2013), female students were more likely than male students to take a remedial course in 2007-2008 (24.7% and 21.6%, respectively), in 2003-2004 (23.1% and 20.7%, respectively), and in 1999-2000 (29.1% and 28.5%, respectively). When comparing the data across the three collection points, it becomes clear that the overrepresentation of females in remedial courses continued to be an issue, a phenomenon voiced by research a decade earlier (Hagedorn et al., 1999). Approximately 23.8% of traditional college age students (ages 15 to 23 years old) reported having taken a remedial course during their first year, whereas 22.0% of older students ages 24 to 29 and 20.3% of students between 30 and 39 years of age reported taking a remedial course during their first year. Supplementing the data not provided by Sparks and Malkus (2013), Goldrick-Rab (2010) found that many of the students in community colleges who are enrolled in noncredit instruction are older adults from disadvantaged backgrounds. The consideration of adult learners is important because adult learners can face more difficulties in obtaining higher level mathematics skills than recent graduates do; adult learners often face more logistical and financial challenges. For example, adult learners are often the sole household earner and must coordinate daycare and time off from work (Woodard & Burkett, 2005; Golfin et al., 2005; Duranczyk & Higbee, 2006; AMATYC, 2006). Additionally, adult learners have often functioned at low levels of quantitative literacy and have a history of education failure (Golfin et al., 2005).

Students in remedial mathematics courses often meet all other admission standards but are limited in educational opportunities due to poor mathematical skills (Duranczyk & Higbee, 2006), a fact that reinforces the view of mathematics as a gatekeeper for college success (Epper & Baker, 2009). Many of the students in developmental courses face difficulties that are not experienced by traditional students; Duranczyk and Higbee (2006) aptly summarized this situation: "Nontraditional students--whether in terms of age, heritage, socioeconomic status, or educational history--often do not have the luxury of approaching higher education as full-time residential students, employed for fewer than 20 hours per week, supported primarily by their parents, and without the responsibility of caring for dependent family members" (p. 23).
Effectiveness of Remedial Mathematics Courses in Postsecondary Education

Proponents of remediation have stated that remedial courses help students develop skills to improve their chances of collegiate success (Bettinger & Long, 2009); however, not all researchers agree that remediation is effective (Perin, 2006; Attewell et al., 2006). The following studies describe various effects that remedial courses have had on underprepared students. The terms remediation and remedial in the following studies refer to courses that are below college level.

Bahr (2010) investigated the effectiveness of postsecondary remediation for students who were deficient in mathematics, English, or both mathematics and English. His sample consisted of 68,884 first-time, non-English Second Language college freshmen enrolled in one of California's community colleges during 1995. He continued to monitor these students for six years and found that students who completed remediation in either mathematics or English, as well as students who completed remediation in both mathematics and English, "experienced rates of credential completion and upward transfer that are comparable, or slightly superior, to those of students who attain college-level competency in math and English without remediation" (p. 195). In other words, students who successfully completed their remedial courses tended to do as well as students who were not required to take remedial courses.

Although it is encouraging to find that remedial courses adequately prepare students for future academic coursework, student persistence is problematic for remedial education. From a similar set of data, Bahr (2008) noted that only 1 in 4 students successfully completed the remedial courses, and of the students who did not successfully complete the remedial courses, roughly 80% did not complete a program of study or transfer to a 4-year institution. Bahr stated that future research should examine why remediation does not work for some students.

Johnson and Kuennen (2004) studied the impact that delaying remedial mathematics had on students' scores in freshman microeconomics, a quantitatively intensive course. From a sample of 1,462 freshman microeconomics students, the researchers found that students who did not need to take remedial mathematics scored higher than remedial students who had already passed their remedial mathematics courses, and these remedial students scored higher than remedial students who had not yet taken their remedial mathematics courses. The researchers found that the differences between all three groups were statistically significant. Because Johnson and Kuennen (2004) only examined microeconomics students, the researchers stated that further study could be done related to physics, chemistry, accounting, and other quantitative courses.

The results of Johnson and Kuennen's (2004) study differed slightly from Bahr's (2010) study. In Bahr's (2010) study, developmental students who completed the remedial classes did as well as students who did not need remedial classes, whereas in Johnson and Kuennen's (2004) study, developmental students who had completed their remedial classes scored slightly lower in microeconomics than students who did not need remedial classes.
Parmer and Cutler (2007) studied the performance in Math 101 (a college-level elementary algebra course) of students who had completed remedial mathematics (n = 591) as compared to students who did not need to take remedial mathematics (n = 437) at Sinclair Community College, Ohio. The researchers conducted a three-part project. First, the researchers issued a 15-question pre-course assessment on topics including writing percents as decimals, simplifying expressions containing fractions and exponents, squaring negative numbers, and solving linear equations. The researchers found that former remedial students answered on average 9.86 questions correctly as compared with 10.22 correctly answered questions for students who did not take remedial mathematics; the researchers did not state whether the differences were statistically significant but did state that both groups of students were similarly equipped for Math 101. Second, when the researchers analyzed the academic performance of the students throughout the course, they found that former remedial students scored the same as non-remedial students only on the first test; on all subsequent tests, remedial students scored lower than non-remedial students. Further, a significantly higher percentage of non-remedial students passed Math 101 than remedial students (53% and 46%, respectively). Lastly, the researchers issued an anonymous survey that asked students to report perceived difficulty on various topics throughout the course; these topics included factoring trinomials, solving linear equations, solving linear inequalities, and operations with polynomials. Former remedial students gave a higher difficulty rating to learning all 10 topics on the survey than did non-remedial students.

Attewell et al. (2006) compared students who successfully completed all their remedial mathematics courses on their first attempt to students who never enrolled in remedial coursework. The researchers analyzed data from students whose information was gathered from the 1988-2000 National Educational Longitudinal Study. After controlling for high school experiences and socio-demographic background, the researchers' logistic regression model found that students in two-year colleges who completed remedial mathematics courses were more likely to earn a degree than were comparably equipped students who did not enroll in remedial mathematics courses (n = 2,009, p < 0.001). However, for students (n = 3,833) enrolled in four-year universities, no significant difference in graduation rates was found between successful remedial students and non-remedial students.

Bettinger and Long (2009) found that when they controlled for students' ACT scores, high school GPA, family income, gender, and several other factors, remediation for underprepared mathematics students had a positive effect on helping these students to succeed at the college level. Their results came from tracking 28,000 full-time, traditional freshman students in 42 Ohio universities over a period of 6 years. The researchers noted that the placement of similarly prepared students (as indicated by their ACT scores and high school GPA, for example) into remedial classes was often determined by the university that they attended. By analyzing where these students were placed, the researchers found that students who successfully completed remedial courses were more likely to persist in college than were similar-ability students who had not enrolled in remedial courses.
Further, remediated students were more likely to complete their degree programs and less likely to transfer to a less selective college. Specifically, underprepared mathematics students who took remediation courses were 13.9% less likely to drop out of the program and 1.5% more likely to complete their degree within 6 years. One of the strengths of this study is that the results were based on data from multiple universities.

Several patterns emerge regarding the effectiveness of remedial mathematics courses. First, students who successfully completed their remedial mathematics courses tended to have graduation rates similar to those of students who did not need to enroll in remedial mathematics courses (Bahr, 2010; Attewell et al., 2006). Second, students who took remedial mathematics courses often did not perform as well in their quantitative classes as students who did not need to take remedial mathematics (Johnson & Kuennen, 2004; Parmer & Cutler, 2007). However, students who enrolled in remedial mathematics courses were more likely to graduate (Bettinger & Long, 2009) and perform better in quantitative courses (Johnson & Kuennen, 2004) than were students of equal ability who did not enroll in remedial mathematics courses. In other words, even though students who took remedial mathematics courses may not have performed as well in their quantitative courses as their counterparts who did not need remedial coursework, students who completed their remedial courses had graduation rates similar to those of students who did not need remedial coursework.

Efforts Made to Improve Student Success in Remedial Mathematics Courses

Two common approaches to improving student success in remedial mathematics courses, or to improving students' understanding of remedial mathematics topics, have been documented in the literature: the use of computer-based instruction and the decrease in the length of developmental mathematics sequences. The following two sections describe the effects that these two strategies have had on improving student success related to understanding remedial mathematics material.

Computer-Based Assistance for Students in Remedial Mathematics Courses

Implementing computer-based instruction to improve student learning in remedial mathematics courses is a common form of intervention initiated by universities (Villarreal, 2003; Walker & Senger, 2007; Squires, Faulkner, & Hite, 2009). Computer-based instruction has also been used to improve and remediate students' algebraic skills in credit-bearing mathematics courses (McSweeney & Weiss, 2003; Brouwer et al., 2009). The majority of the following researchers stated that the technologies used in their studies were either as effective as or more effective than traditional measures in remediating students' algebra skills; however, several of the following authors stated that the computer-based instruction implemented in their studies did not significantly improve student achievement, or the authors did not provide sufficient data to support the claim that computer-based instruction benefited their students.

Villarreal (2003) described efforts made by the University of Texas at Brownsville, where 49% of its students required help to begin college credit courses. Three developmental mathematics courses were offered at the college: Basic Mathematics, Introductory Algebra, and Intermediate Algebra.
The mathematics department first experimented with Computer Directed Instruction (CDI), in which students were enrolled in a computer-based, self-paced course that allowed them to attend the computer laboratory at their convenience; the department soon found that the students were not disciplined enough to complete the coursework in a timely manner. The mathematics department eventually constructed its Intermediate Algebra classes with both lecture and laboratory components. Students met for three hours per week in classroom instruction and three hours per week in the computer laboratory. The researcher noted that the passing rate for Intermediate Algebra increased an average of 12% over the following two years; however, the researcher did not provide statistical data to support this claim. The researcher also noted that the mathematics department offered students several paper/pencil laboratory sections in which students were encouraged to work together with active peer tutoring instead of working in computer laboratories; the instructor was able to work with small groups of students or individuals as necessary. Unfortunately, Villarreal (2003) did not provide any data regarding the success of this alternate approach. Further, the type of instruction that was offered during the paper/pencil laboratories was not described in detail.

Walker and Senger (2007) studied the effect of a computer software program called The Learning Equation (TLE) on the achievement of 120 minority developmental students enrolled in an intermediate algebra course at Alabama State University. Roughly half of the students were randomly placed into traditional courses, whereas the other half were placed into courses that used TLE. All of the students were given a pretest and posttest to determine the effectiveness of the software program, and the researchers found no significant difference in student achievement between the computer and the non-computer groups. Unfortunately, details about classroom instruction were unclear. For example, Walker and Senger (2007) reported that students in both the control and experimental classes received instruction through direct lectures that included ample use of PowerPoint and had access to tutors in a computer laboratory; however, the researchers did not describe the pedagogical practices (such as group work and classroom discussion) employed by the instructor. Additionally, the researchers did not provide a general list of topics covered in the course.

Squires, Faulkner, and Hite (2009) studied the effects of a "one-room schoolhouse" at Cleveland State Community College. The project involved a total of three developmental mathematics courses (basic mathematics, elementary algebra, and intermediate algebra) and three college-level mathematics courses (college algebra, introductory statistics, and finite mathematics). In 2008, students met for class in a computer lab one hour each week, during which time an instructor was available to help students and monitor their progress. Students were also required to attend a computer lab an additional two hours each week, where they continued to learn the material in the course. All of the courses were delivered using an online learning system. Each course consisted of 10 to 12 modules in which students watched a brief instructional video, completed homework, and then passed a quiz by scoring 70% or better.
Once students completed all of the modules for the course, they could start working on the material in the next course, thus allowing students to complete multiple courses during the semester. The researchers stated that the pass rate in remedial mathematics classes at Cleveland State had increased from 54% to 72%. The pass rate in college algebra (a college-level course) increased from 65% to 74%; however, the pass rates in the remaining college-level courses remained at 72%. The researchers also noted that costs in the mathematics department decreased 10% because of the restructuring of these courses.

McSweeney and Weiss (2003) performed a comparative study to examine the effectiveness of Math Online in improving students' algebra skills so that they might succeed in college-level Calculus 1 and Calculus 2 courses. Math Online is a self-paced, computer-based online course that is designed to give students extra practice and reinforcement in their algebra and precalculus skills outside the classroom. Students enrolled in the calculus sections that utilized Math Online were required to complete a set number of proctored multiple-choice quizzes outside the classroom during the semester in a local computer facility. Each instructor who participated in the experiment taught one traditional Calculus section and one section of Calculus in which students used Math Online to practice mathematical skills in addition to the lecture. Further, students did not know which type of course they would be taking until their first day of class. The researchers assessed student performance by 1) giving each instructor's students pre-tests and post-tests consisting of multiple-choice questions and 2) including common exam questions in each instructor's midterm and final exams (McSweeney & Weiss, 2003). When the researchers compared the pre-test and post-test scores for each instructor's two classes for Fall 2000 and Fall 2001 (a total of 24 classes containing roughly 25 students per class), they found that the experimental group scored an average of 1 question higher than the control group on the 15-question tests (p < 0.05). When the researchers examined the results of the common test questions, they found that the experimental groups did significantly better (p < 0.05) than the control group on roughly 25% of the questions and that there was no significant difference between the two groups on the other questions. Lastly, the researchers found that instructors teaching the experimental courses could teach the same amount of material in less time (7.5% less time) than when they taught the material without using Math Online.

Zavarella and Ignash (2009) examined the effectiveness of distance learning courses, traditional courses, and hybrid courses for students enrolled in Beginning Algebra at a large urban Florida community college. In their study, the researchers described distance learning courses as online courses in which students used packaged software that was delivered at a distance. In hybrid courses, students met on campus, and computers were used as the primary delivering agents of the course material; however, instructors acted as facilitators and delivered personalized instruction as needed. In the traditional courses, content was delivered in a face-to-face classroom setting through a lecture-style format.
Zavarella and Ignash (2009) found that the students who enrolled in traditional lecture courses were significantly less likely to withdraw from the course than were students who enrolled in the distance learning sections and the hybrid sections (20%, 39%, and 40%, respectively). One limitation of the study was the lack of description regarding the type of software that the students in the computer-based instruction used.

Brouwer et al. (2009) wanted to know whether frequently completing online tutorials with corresponding online assessments enhanced students' experiences in Calculus 1 and Business Statistics at the University of Amsterdam. The researchers studied a total of 650 freshman students who were required to take a concurrent remedial algebra course focused on algebraic skills during the first part of the semester. For Calculus students, the remedial course took place during the first five weeks of the semester; students were given a practice test on the third and fourth weeks and a final test on the fifth week. For the Statistics students, the course took place during the first ten weeks, and students were given two tests each week. Based on results from student surveys, the researchers found that the majority of both the Calculus and Statistics students found the remedial course to be designed appropriately (79% and 67%, respectively). Unfortunately, the researchers did not state whether the remedial course improved student performance in the Calculus and Statistics courses, nor did they describe the mathematical content that was assessed in the remedial algebra course. In other words, even though the students felt that the remedial course was designed appropriately, it was unclear whether the experimental students' scores differed significantly from the scores of students who did not take the remedial course.

Similar to the previous study by Squires, Faulkner, and Hite (2009), Bassett and Frost (2010) described the efforts made by Jackson State Community College to reduce the time that students spent in remedial mathematics courses. The college transformed its three remedial lecture-based mathematics courses into 12 computer modules run by the mathematics software program MyMathLabsPlus. In the new design, students could progress through the modules at their own pace and could complete a module by demonstrating 80% mastery of its content; therefore, students could complete their developmental coursework in just one term if they were motivated to do so. Faculty helped students by leading small group discussions on topics that students found difficult. The pass rate for the traditional remedial courses historically averaged 41% through Spring 2008; however, when the school transferred to remedial instruction delivered primarily through MyMathLabsPlus, the pass rate rose to 60% (n = 1,324) by Fall 2009. Additionally, student retention rates increased from 74% during the use of traditional lecture instruction to 83% in Spring 2009 during use of the computer-based instruction. Consistent with the study by Squires, Faulkner, and Hite (2009), Bassett and Frost also stated that the mathematics department reduced costs as a result of having to hire fewer instructors to teach the material. The study could have been strengthened by describing the type of instruction that was demonstrated by the MyMathLabsPlus software.
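As a purely illustrative aside, several of the pass-rate gains summarized above (for example, the rise from 41% to 60% reported by Bassett and Frost) are presented without an accompanying test of statistical significance. The sketch below shows how a pass-rate difference of that general size could be checked with a Fisher exact test, one of the tests named in the abstract of this study; the group sizes and counts are hypothetical rather than data from any cited study, and scipy is simply one library that provides the test.

```python
# Illustrative sketch only: a Fisher exact test on hypothetical pass/fail counts.
# The numbers below are invented to mirror a 41% vs. 60% pass-rate contrast;
# they are not data from Bassett and Frost (2010) or any other cited study.
from scipy.stats import fisher_exact

# Rows are groups (traditional lecture, computer-based); columns are [passed, failed].
hypothetical_counts = [
    [82, 118],   # traditional sections: 82 of a hypothetical 200 students passed (41%)
    [120, 80],   # computer-based sections: 120 of a hypothetical 200 students passed (60%)
]

odds_ratio, p_value = fisher_exact(hypothetical_counts, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```

With hypothetical groups of this size the difference would be statistically significant, whereas the same percentage gap across much smaller sections might not be, which is one reason raw pass-rate comparisons of this kind are difficult to interpret on their own.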
Vassiliou (2011) reported the efforts of the Florida community college Kendall Campus to use computer-assisted instruction to reduce the number of remedial mathematics, reading, and writing courses that students needed to take. Kendall Campus used a computer-based tutorial system called Advance College Readiness Online, which prescribed individualized lessons based on perceived student deficits in specific content areas. In the study, students first took a placement test to earn a baseline mathematics score. Second, the Advancer software program prescribed a series of individualized lessons to address deficits in arithmetic and elementary algebra. After students worked with the software on their own time (typically between 6 and 13 hours), students took the placement test again to establish a post-test score. The 180 students who participated in the study from 2006 to 2008 increased their post-test scores in algebra and arithmetic by 45% and 57%, respectively. The author also stated that 136 of the 216 students (63%) in the study placed into a higher remedial course, and 62 of those 136 students were able to avoid a remedial course altogether; additionally, the persistence and success rates of the students who used the Advancer tutorial system were greater than those of the students who received traditional remedial classroom instruction. However, with respect to rates of testing out of remedial courses and rates of persistence, the author did not distinguish between remedial mathematics courses and remedial reading and writing courses. Additionally, the study did not describe the type of instruction used by the Advancer software.

The studies in this section addressed the effectiveness of computer-based instruction for post-secondary mathematics courses and found mixed results. The combination of lecture and computer-based laboratory instruction improved student achievement in some studies (Villarreal, 2003; McSweeney & Weiss, 2003), as did the complete abandonment of traditional lectures in favor of self-paced, computer-based instruction (Squires, Faulkner, & Hite, 2009; Bassett & Frost, 2009). However, the benefits of computer-based instruction were limited. Villarreal (2003) reported that the self-paced, purely computer-based instructional design had to be modified to a lecture and laboratory design due to students' lack of discipline, and Zavarella and Ignash (2009) found that students were more likely to withdraw from computer-based mathematics courses. Additionally, Walker and Senger (2007) found no significant benefit in student achievement from using computer-based instruction, and Brouwer et al. (2009) provided no information regarding the effect of computer-based instruction on student achievement. See Table 1 for a summary of studies that examined the effects of computer-based instruction on student success in remedial courses.
Table 1
Computer-based mathematics instruction

Approach Used | Researcher | Results
Computer Directed Instruction (Lab only) in remedial algebra courses | Villarreal (2003) | Pure lab program was abandoned in favor of a lecture/lab combination because of lack of student discipline
Computer Directed Instruction (Lab only) in remedial algebra and college algebra courses | Squires, Faulkner, & Hite (2009) | Increase in pass rate from 54% to 72% for remedial courses; 65% to 74% for college algebra courses
Lecture/Lab Combination in remedial algebra courses | Villarreal (2003) | 12% increase in pass rate over 2 years
Lecture/Lab Combination in Calculus courses | McSweeney & Weiss (2003) | Lab groups did significantly better than non-lab groups
Lecture/Lab Combination in remedial algebra courses | Walker & Senger (2007) | No significant difference
Lecture/Lab Combination in Calculus and college-level statistics courses | Brouwer et al. (2009) | No results given about student achievement
Computer Directed Instruction (Distance Learning) in remedial algebra courses | Zavarella & Ignash (2009) | Distance learning groups had significantly higher withdrawal rates than face-to-face groups
Module-based Curriculum (Computer Directed Instruction) | Bassett & Frost (2009) | Pass rate in remedial courses rose from 41% to 60%
Enhanced Placement Scores through Computer Tutorials | Vassiliou (2011) | Arithmetic and algebra placement scores increased 45% and 57%, respectively; many students placed out of remedial courses

Shortening the Length of Developmental Mathematics Programs

Some schools are attempting to improve the success rate of students in developmental programs by decreasing the number of remedial courses that students must take before being permitted to take college-level courses (Merseth, 2011; Hern, 2012). The lower a student places in a developmental mathematics sequence, the more opportunities that student will have to exit the sequence (Bahr, 2012); thus, students who pass one remedial course may decide not to enroll in the subsequent course (CCA, 2012). This section describes efforts to reduce the number of required remedial courses in developmental mathematics sequences.

Merseth (2011) reported the efforts made by the Carnegie Foundation for the Advancement of Teaching to create Statway and Quantway, programs designed to improve student persistence and student engagement in developmental mathematics courses. These courses promoted two aspects that can benefit non-STEM (science, technology, engineering, and mathematics) students: a path in which students could obtain college credit in only two semesters and a curriculum that concentrated on quantitative literacy. Primarily focused on students enrolled in community colleges, Quantway and Statway promoted student success by engaging students in sense making about real-world issues and by compelling students to make decisions through numerical reasoning and argumentation. In Statway, instruction focused on statistical concepts and quantitative reasoning; mathematics served as a subplot that reinforced learning these topics. In Quantway, instruction focused on the understanding and application of mathematical concepts instead of the memorization of disconnected processes and procedures. Because Statway and Quantway were recently launched in Fall 2011 and Winter 2012, respectively, credit completion data is available only for Statway courses.

Byrk (2012) reported the results from the first cohort of Statway students.
Roughly 50% of Statway students earned college credit in one year. Byrk (2012) compared these results to California community college students who enrolled in traditional developmental mathematics sequences from Fall 2009 through Spring 2012: 17.4% of students who needed only one remedial mathematics course completed a college-level mathematics course in one year, 39.9% of students who needed only one remedial course earned college credit in three years, and 16.5% of students who needed two remedial courses earned college credit in three years. In order to demonstrate that the sequence of courses in Statway was comparably rigorous to other credit statistics courses, a statistics test was distributed to a national reference sample of students who had successfully completed a statistics course. The average score on the common exam was 64%, and the average score on the exam for the Statway cohort was 62.8% (Byrk, 2012). One limitation of the study is its focus on non-STEM students.

Hern (2012) reported the effect of implementing Path2Stats, a one-semester developmental course that prepared students for college statistics. The study was done in seven California community colleges during the 2011-2012 school year. There were no prerequisites for the course, and students began learning statistics on the first day of class. Any remedial arithmetic and algebraic concepts were reviewed when the statistical topics under study required them. In the study, 71 of the 119 Path2Stats students (60%) completed a college-level statistics course at the end of one year, as opposed to 362 of the 1,756 students (21%) who elected to enroll in the traditional remedial courses offered by the community colleges. One limitation of the study is that the researcher did not provide a description of the topics and classroom activities within the Path2Stats course. Another limitation of the study was the lack of focus on STEM students.

The studies in this section presented efforts made by institutions to increase student success by reducing the time that it took students to complete a college-level mathematics course. Some institutions used computer-based instruction, whereas other institutions redesigned the developmental mathematics curricula so that students could complete remedial coursework in one semester. The studies presented generally positive results regarding the effectiveness of decreasing the required length of developmental mathematics programs. See Table 2 for a summary of studies that examined the effects of a shortened developmental sequence on student success in remedial courses. Although the efforts in these studies showed promise, it may be difficult for departments to implement these changes since they would have to significantly redesign their developmental programs.

Table 2
Shortening the length of the developmental sequence

Approach Used | Researcher | Results
Two Semester College-Credit Track for non-STEM Students | Byrk (2012) | 50% of Statway cohort earned college credit in 1 year vs. 17.4% of traditional remedial students
Two Semester College-Credit Track for non-STEM Students | Hern (2012) | 60% of Path2Stats cohort earned college credit vs. 21% of traditional remedial students

A Promising Approach: Reform Mathematics Pedagogy

The preceding studies described efforts made by colleges to improve student achievement through computer-based instruction or by decreasing the length of the developmental mathematics sequence.
Computer-based instruction was often used to reinforce mathematical concepts outside the classroom, and decreasing the length of the developmental mathematics sequence was applied primarily to non-STEM students. The following set of studies will describe efforts to improve student achievement by modifying pedagogical practices inside the classroom; additionally, these practices can be used to improve instruction for both STEM and non-STEM students. Since the following studies are based on the ideas advocated by reform documents published by the National Council of Teachers of Mathematics (NCTM), the American Mathematical Association of Two-Year Colleges (AMATYC), and the Mathematical Association of America (MAA), I will first describe their main tenets before describing the effects that adopting such practices have had on student achievement in middle, secondary, and postsecondary mathematics classrooms.

Recommendations for K-12 Mathematics

The current standards-based reform movement began toward the end of the eighties with the publication of NCTM's Curriculum and Evaluation Standards (1989), followed by NCTM's Professional Standards (1991) and Assessment Standards (1995); in 2000, NCTM published Principles and Standards for School Mathematics, which synthesized into a single volume much of the information presented in the previous three publications (Piburn & Sawada, 2000). Principles and Standards for School Mathematics presented six "principles" and five "process standards" that articulated and guided the reform mathematics movement by presenting a strongly coherent picture of mathematics reform (Piburn & Sawada, 2000). A brief description of these principles and standards is provided below.

Principles. The principles described in NCTM's Principles and Standards for School Mathematics (2000) were designed to provide teachers and administrators guidance. The following six principles describe components of high-quality mathematics education. The Equity Principle states that all students, regardless of their personal characteristics, physical challenges, or backgrounds, should have the opportunity to study mathematics, have the support they need to learn mathematics, and have access to a challenging, coherent curriculum that is taught by capable mathematics teachers who hold high standards for their students. The Curriculum Principle states that coherent curricula demonstrate to students how different strands of mathematics relate to, and build on, one another; additionally, mathematics teachers should organize their lessons around fundamental mathematical concepts that can be extended and developed. The Teaching Principle states that teachers need to understand the big ideas in mathematics and carefully create experiences that help students develop an understanding of those ideas. The Learning Principle states that students learn by actively building upon prior knowledge, and students who learn with understanding are more likely to know when and how to use what they know. The Assessment Principle states that assessment should focus on both students' conceptual understanding and procedural skills, and mathematics teachers who include formative assessment throughout their lessons can furnish useful information to both teachers and students.
The Technology Principle states that technology (such as computers and graphing calculators) can help students to explore mathematical conjectures more easily than if they were to create representations by hand; also, students can use technology to perform routine procedures more quickly and accurately and thus explore a wider range of problems.

Process Standards. The standards described in NCTM's Principles and Standards for School Mathematics (2000) describe the mathematical content and processes that students in high-quality mathematics programs should learn. The Problem Solving Standard states that teachers who select worthwhile problems and create environments that encourage exploration can solidify and extend what students know, stimulate students' interest in learning mathematics, and enable them to persist in challenging problems. The Reasoning and Proof Standard states that students need to develop reasoning skills to be able to understand mathematics, and students at all grade levels should see that mathematics makes sense through exploring phenomena, making mathematical conjectures, and justifying results. The Communication Standard states that students who communicate their ideas to their teachers and peers build meaning and permanence for those ideas, and students who listen to others' explanations can deepen their own understanding, particularly when they disagree. The Connections Standard states that when teachers emphasize the interrelatedness between mathematical concepts and other disciplines, students can better learn those concepts as well as learn about the usefulness of mathematics; further, teachers should take advantage of the ample opportunities in science, medicine, commerce, and social science to provide their students mathematical experiences in context. The Representations Standard states that multiple representations, such as diagrams, graphs, tables, and symbolic expressions, should be emphasized throughout a student's mathematical education. As students develop their mathematical abilities, they develop a repertoire of mathematical representations and an ability to determine which representation is most advantageous for the problem at hand. Additionally, multiple representations allow students to move toward abstraction so that they can better understand the role that mathematics plays in revealing patterns.

A subsequent extension to the high school mathematics standards is the idea that students should reason through and make sense of mathematics. Reasoning and sense making are the foundations for NCTM's Process Standards (NCTM, 2009). Students who are able to reason through and make sense of newly presented mathematical concepts can organize their knowledge in ways that can improve their mathematical abilities. These students will be more likely to understand and retain new information because they will be able to link the new topics to skills and concepts they have already acquired. Teachers can help their students achieve mathematical competence by consistently encouraging students to develop increasingly sophisticated levels of reasoning (NCTM, 2009).

The Common Core. The Common Core State Standards for Mathematics is a set of K-12 mathematics standards adopted by most of the United States that defines what students should understand and be able to do throughout their study of mathematics (National Governors Association & Council of Chief State School Officers [NGA & CCSSO], 2010).
Although the Common Core does not describe methods of teaching mathematical concepts, it provides a set of grade-specific standards that students should meet as they become prepared for college and careers. Grounded in evidence regarding what knowledge and skills are necessary for postsecondary success, the Common Core is important to postsecondary education because it will provide the basis of knowledge and skills that students across America should have upon entering postsecondary institutions (Jones & King, 2012). Building upon years of work by NCTM and the National Research Council to define the mathematics that students need to understand, the Common Core articulates mathematical standards that can be implemented at the state level (NCTM, 2011). The Common Core and NCTM share a vision of a focused curriculum and identify critical areas in mathematics through 12th grade; further, both institutions generally agree upon the types of mathematical practices that students should be able to demonstrate (NCTM, 2011). Similar to NCTM's (2000) Process Standards described above, the Common Core proposed Standards for Mathematical Practice, which require students to be proficient with tables and graphs, to reason abstractly and quantitatively, to construct viable arguments and evaluate the arguments of others, to solve everyday problems, to use technology appropriately, to develop precision in communicating mathematics, to look for patterns within problems, and to evaluate the reasonableness of their solutions (NGA & CCSSO, 2010). Students will be prepared to enter a wide range of postsecondary-level courses if they are proficient with the Standards for Mathematical Practice that are listed in the Common Core (Conley et al., 2011).

A common theme in many documents advocating reform of mathematical instruction is the need for students to develop problem-solving skills in addition to computational fluency and conceptual understanding (NCTM, 2000; NGA & CCSSO, 2010). These abilities are mutually supportive and facilitate the learning of one another (NCTM, 2000). Teachers who use context-based problems to introduce mathematical principles can improve students' conceptual understanding of the mathematics by helping students 1) provide rich representations of a problem, 2) know when to apply mathematical principles, and 3) judge the reasonableness of their solutions (Schroeder & Lester, 1989). Students who learn the reasons behind the mathematical principles that they are taught are more likely to remember them correctly and apply them appropriately when confronted with new situations (Skemp, 2006). In contrast, because each of these components supports the others, students who are unable to determine when and how to use their knowledge will find their mathematical abilities to be fragile (NCTM, 2000). Teachers should therefore emphasize the interrelations between conceptual understanding and computational fluency in order to help students become more effective at problem solving (National Mathematics Advisory Panel [NMAP], 2008).

Jones and King (2012) described several implications of the Common Core for postsecondary education. First, postsecondary instructors will be able to increase the rigor of their courses, and institutions should be able to redirect funding to credit-level mathematics courses due to a decreased need for remedial mathematics courses.
Second, because the expectations within the Common Core are clearly articulated and upheld by postsecondary education, students will know that meeting these expectations will produce real benefits at the college level. Thus, students will be much more likely to meet those expectations because of the impending real-world consequences. Third, the Common Core adopted standards that correspond to the highest-performing states in the United States and countries around the world. Since the academic rigor within a curriculum is the most important factor in achieving postsecondary success (Adelman, 1999), the coordination between K-12 and postsecondary education regarding the effective implementation of the Common Core should lead to less remediation and higher success rates at the college level (AEE, 2011; Jones & King, 2012).

The Equitable Nature of Reform Mathematics Pedagogy. The classroom practices that are advocated by the National Council of Teachers of Mathematics have been identified as equitable with respect to increasing the achievement level of developmental students. The following paragraphs provide a brief description of what equity means, followed by a description of common equitable practices in mathematics classrooms.

Gutierrez (2007) stated that equity means fairness instead of sameness; at a basic level, equity can mean the inability to predict an individual's mathematical achievement based solely on student characteristics such as race, gender, and ethnicity. Stenmark (1989) stated that equity means having the same opportunities as others but also includes a support structure by which to take advantage of those opportunities. Banks and Banks (1995) stated that equity may not always mean treating differing groups the same; rather, sometimes it is necessary to treat groups differently in order to create equal-status situations for marginalized students.

Many paths exist to develop equitable instruction (Boaler & Staples, 2008), and equitable instruction does not necessarily need to include curricula that are designed to be culturally sensitive by using examples of students' cultures or students' practices outside of school (Banks & Banks, 1995; Boaler & Staples, 2008). Conceptually oriented mathematics materials that are consistently well taught produce more equitable results for students than do procedure-oriented curricula that are taught through a demonstration and practice approach (Banks & Banks, 1995). Maintaining high cognitive demand, emphasizing the importance of effort over innate ability to learn mathematics, providing clear expectations for learning practices, showing students how to explain and justify their answers and then requiring them to do so, and encouraging students to help other students as well as ask for help themselves have all contributed toward making instruction more equitable (Boaler & Staples, 2008). In equity-oriented classrooms, students are encouraged to actively construct knowledge and to learn from their peers through social interactions; further, students benefit from cooperative learning strategies when instructors take into account status differences among students (Banks & Banks, 1995).
Instructors who incorporate cooperative learning strategies require students to clarify their thinking through talking and writing, test their ideas against other students, appreciate the perspectives of other students, and develop group communication skills; thus students are encouraged to assume responsibility for their learning by expressing their opinions and asking questions (Boylan, Bonham, & Tafari, 2005). Because many developmental students will not be accustomed to cooperative learning activities, instructors should take care to help students become accustomed to such activities; failing to do so may produce additional inequities (Boylan, Bonham, & Tafari, 2005; Boaler & Staples, 2008).

Recommendations for Post-secondary Mathematics Students

The Mathematical Association of America and the American Mathematical Association of Two-Year Colleges (AMATYC) made recommendations that are specifically intended for college-level introductory mathematics courses. The following recommendations address the type of mathematical content that undergraduate students should learn, the development of intellectual abilities within these courses, and the pedagogical approaches that teachers should use when teaching introductory college-level mathematics courses.

Undergraduate mathematics curricula should develop the mathematical knowledge and skills of students so that they may pursue and achieve their career goals (AMATYC, 2006). By reducing the number of topics within undergraduate mathematics courses and covering the remaining topics in greater depth, students can learn the material with greater understanding and flexibility (AMATYC, 2006). Mathematical content that contains practical applications is especially important for adult learners (Goldrick-Rab, 2010), and although real-world problems do not help students with procedural skills, they do help students do well on other real-world problems (NMAP, 2008).

The Mathematical Association of America's Committee on the Undergraduate Program in Mathematics (CUPM) made a number of recommendations for college algebra courses. They advocated that students should become proficient with using systems of equations to model real-world situations, and they should understand the concept of rate of change and be familiar with linear, polynomial, exponential, and logarithmic functions (CUPM, 2011). It is also important for students to learn how to collect data and analyze it through statistical techniques such as fitting a curve to a scatter plot and using that curve to make predictions based on the trends within the data (CUPM, 2011).

Mathematics courses should also develop students' intellectual abilities. Students should develop their logical reasoning skills and their ability to communicate mathematical ideas in both oral and written form (CUPM, 2011). Instructors should help students analyze and synthesize information, and instructors should help students work collaboratively to explore mathematical phenomena and report their findings (CUPM, 2011). Students should be able to engage competently and confidently in problem-solving activities. Problem solving includes the ability to create and interpret mathematical models based on real-world situations (CUPM, 2011). When faced with a problem, students should develop a personal method of attacking it.
For example, such a method of attack may include rereading the problem, defining relevant variables, drawing a diagram, using appropriate methods of solution (analytic, numerical, graphical), interpreting the appropriateness of the solution, and revising the model if necessary (CUPM, 2011).

Instructors should emphasize conceptual understanding of mathematics when teaching students and should provide opportunities for students to explore mathematical material (CUPM, 2011). Such emphasis on conceptual understanding is important since students enter the classroom with preconceived notions, thereby necessitating that instructors engage students' initial understanding and help them to make analogies between new concepts and what they already know (Donovan, Bransford, & Pellegrino, 1999; Golfin et al., 2005). To improve conceptual understanding, algebraic techniques should be developed in the context of solving problems (CUPM, 2011). Additionally, technology (such as computers, calculators, and spreadsheets) can assist students in their mathematical explorations (CUPM, 2011). Instructors should also incorporate student-centered instruction through small group activities and projects (CUPM, 2011). Instructional techniques that involve personal interaction seem to benefit students who are struggling with the material (AMATYC, 2006).

Instructors should use a variety of assessments (in addition to individual quizzes and tests) to assess a student's level of understanding. Listening to students, asking them appropriate questions, and giving them opportunities to demonstrate their knowledge in a variety of ways are effective strategies for increasing student learning (AMATYC, 2006). Group homework, projects, presentations, activities, and quizzes can help instructors assess students' levels of understanding (CUPM, 2011).

In summary, instruction within a postsecondary introductory mathematics course should improve students' attitudes towards mathematics and prepare them for the mathematics they will encounter in future courses (CUPM, 2011). Instructors can improve students' conceptual understanding by promoting mathematical exploration through technology and group activities (AMATYC, 2006; CUPM, 2011). These courses should also prepare students to engage in mathematics that they might encounter in their own personal lives (CUPM, 2011).

Recommendations for Underprepared Post-secondary Mathematics Students

Recommendations have also been made by the Mathematical Association of America, AMATYC, and the U.S. Department of Education that are specifically intended for postsecondary remedial mathematics courses. The following recommendations address the type of mathematical content that underprepared students should learn, the development of intellectual abilities within these courses, and the pedagogical approaches that teachers should use when teaching remedial mathematics courses.

In order to pursue college-level mathematics successfully, students need to have a solid foundation in arithmetic, geometry, trigonometry, algebra 1 and 2, and statistics (Golfin et al., 2005), and students should come to view the mathematics within these areas as interrelated concepts instead of unrelated facts to be memorized (AMATYC, 2006). Solving proportions and knowing their applications to daily life is a key concept that students in remedial courses need to understand (AMATYC, 2006).
Additionally, remedial mathematics courses should minimize some algebraic topics such as factoring, radicals, and operations with rational expressions while instead emphasizing modeling, communication, and quantitative reasoning (AMATYC, 2006). In addition to specific types of mathematical knowledge, students must also be able to think critically, present sound solutions to problems using multiple representations, and apply knowledge in new contexts (Golfin et al., 2005). Students also need to gain confidence in solving real-world problems and build a reservoir of problem-solving strategies (AMATYC, 2006). As a result of the course, students should develop appropriate time-management skills and study habits, be comfortable working collaboratively, have successful experiences using technology to organize and analyze data, and become comfortable executing multistep problems (AMATYC, 2006).

Mathematics instructors should create classrooms that are authentically welcoming and supportive (Boylan, Bonham, & Tafari, 2005). Instructors can build trust in their classrooms by taking the time to learn about students as individuals and by creating spaces where students can learn more about themselves and their classmates (Boylan, Bonham, & Tafari, 2005). Instructors also need to provide positive experiences for underprepared students (AMATYC, 2006). Instructors can improve students' experiences in a course by modeling multiple problem-solving approaches, engaging students actively in the learning process, and providing students with adequate time to explore problems and to reflect upon and understand multiple approaches to solving problems (AMATYC, 2006).

A community-centered environment in which the instructor encourages small group discussions can increase student learning for several reasons. First, students in small group settings are more inclined to express disbelief and challenge ideas, thus providing a need for explicit mathematical argumentation (Golfin et al., 2005). Second, the members of the group bring with them insights and experience that can assist in the problem-solving process (Golfin et al., 2005). Third, by encouraging students to ask questions and express their opinions, collaborative learning encourages students to assume responsibility for their learning (Boylan, Bonham, & Tafari, 2005).

Technology can also be an effective strategy for increasing student learning. Although NMAP (2008) stated that no clear consensus could be reached regarding the effectiveness of technology-based delivery methods, instructors who used calculators to emphasize problem solving, real-world problems, or the development of critical thinking skills found greater success than instructors who used calculators to emphasize basic skills (Golfin et al., 2005). Similarly, AMATYC (2006) emphasized the importance of integrating technology into mathematics instruction in order to help students recognize numerical and graphical patterns. Instructors should also provide tasks during which students have successful experiences with technology, including calculators, spreadsheets, and other computer software (Golfin et al., 2005; AMATYC, 2006).

In summary, students in postsecondary remedial mathematics courses need to view mathematics as a balance of analyzing problems and using appropriate techniques to arrive at meaningful answers (AMATYC, 2006).
When their first attempts are unsuccessful, students in remedial mathematics courses need to be comfortable switching to alternative strategies to attack the problem (AMATYC, 2006). By using technology and fostering a supportive, community-centered classroom environment, instructors can employ classroom activities that improve students' confidence and problem-solving abilities (AMATYC, 2006).

The Effects of Reform-Oriented Classrooms on Student Achievement

The previous sections provide recommendations consistent with reform pedagogy, which emphasizes a balance of procedural fluency and conceptual understanding. Students actively participate in the learning process by exploring mathematical concepts in groups with the aid of technology and discussing with their classmates what they discovered. Students also develop conceptual understanding by understanding the reasons behind the mathematical principles they are taught. Further, teachers help their students develop their problem-solving abilities by presenting mathematics in real-world contexts.

The studies in the following paragraphs provide data regarding the effectiveness of reform-based curricula. Although different curricula are used throughout the studies, the reform-based curricula in the following studies mostly adhered to several pedagogical practices. First, mathematics should be presented in context and should have applications to real-world situations (Robinson & Robinson, 1998; Schoen, Hirsch, & Ziebarth, 1998; Thompson & Senk, 2001; Thompson, 2009). Second, exploration (often in small groups) and experimentation are important in helping students to understand formal theory (Robinson & Robinson, 1998; Schoen, Hirsch, & Ziebarth, 1998; Webb, 2003; Thompson, 2009). Third, graphing calculators and other technology are valuable tools for helping students to understand concepts (Schoen, Hirsch, & Ziebarth, 1998; Webb, 2003; Thompson, 2009). Fourth, representations (pictures, graphs, or other objects that illustrate concepts) can help students to make connections in mathematics (Thompson & Senk, 2001; Robinson & Robinson, 1998). Lastly, both routine and non-routine problems are presented during instruction (Robinson & Robinson, 1998; Thompson & Senk, 2001; Webb, 2003).

The content in college remedial mathematics courses includes many of the same concepts that are covered in middle school and high school mathematics courses (Bahr, 2008). Such topics include order of operations, signed numbers, solving first- and second-degree equations, factoring polynomials, and introduction to graphing (Parmer & Cutler, 2007). Because of the similarity between the mathematical content in middle and secondary courses and the content in tertiary courses, I include in the literature review studies that address the effectiveness of reform-oriented pedagogy in middle and secondary classrooms.

Student Achievement in Middle School Reform-Oriented Classrooms. The following studies describe the effect of implementing reform curricula or reform-oriented pedagogical practices in middle school mathematics classrooms. Each study was conducted for at least two years. Reys et al. (2003) compared the mathematics achievement of students who had used a reform-based curriculum for at least two years (Grades 6 and 7) with that of students who used traditional curricula during that time.
Three districts that had implemented reform-based curricula (either Connected Mathematics Project [Lappan et al., 1997] or MATH Thematics [Billstein & Williamson, 1998]) beginning in fall 1996 were compared with three individually matched comparison districts based on prior student achievement and socioeconomic levels. The Missouri Assessment Program (MAP) mathematics exam was used to establish a baseline by which to identify comparison districts and was also used as the posttest to measure mathematical achievement. Beginning in 1997 and continuing through 1999, the researchers compared the mathematical achievement of eighth-grade students on the MAP.

Reys et al. (2003) found that students who used reform-based curricula for at least two years during middle school performed as well as or better than students from the matched comparison districts. Additionally, all significant differences (p < 0.05) on the MAP were in favor of the students who used reform-based materials. The authors made two important comments regarding the strength of their study. First, the authors noted that their study would have been improved if all students had used the same textbook series throughout the middle grades, but such a scenario is rarely found in the real world. Second, because the authors had no direct information regarding the quality of teaching within the classrooms, they assumed that considerable variability in teaching existed across all of the schools in the study.

Mac Iver and Mac Iver (2009) examined the relationship between mathematical achievement growth and the number of years that urban schools implemented a whole school reform (WSR) model that included reform-based mathematics curricula. In 1999-2000, 12 of the 86 schools in the study used reform-based mathematics curricula; the remaining 74 schools either lacked a coherent mathematics program of instruction for their WSR model, or they lacked a WSR model altogether. The researchers used the Pennsylvania System of School Assessment (PSSA) to measure the achievement growth of 9,320 eighth-grade students across 86 Philadelphia schools. Since the PSSA is offered in both fifth and eighth grade, and since the test is vertically equated so that it is possible to measure scale score growth over time, the researchers compared the scores of the students from fifth grade to the scores these same students achieved in eighth grade. Using multi-level change models, the researchers analyzed scale scores and found that students enrolled for three years in schools that implemented a mathematics component in their whole school reform showed significantly higher mathematical achievement growth than students who attended schools that did not have a mathematics component in their whole school reform.

Mac Iver and Mac Iver's (2009) study could have been strengthened in two areas. First, it is important to note that their study only compared schools that had a reform-based mathematics component to schools that did not have any mathematics component. If the researchers had also analyzed schools that adopted a WSR model with a computation-focused curriculum, the researchers could have determined the following relationships: 1) how does a consistently implemented reform-based mathematics curriculum compare to a consistently implemented computation-focused mathematics curriculum?
(For example, do students who experience three straight years of a reform-based curriculum demonstrate better mathematical understanding than students who experience three straight years of a mathematics curriculum that is not reform-based?), and 2) how does a consistently implemented computation-focused mathematics curriculum compare to having no consistently implemented mathematics curriculum at all? (For example, do students who take three straight years of a non-reform mathematics curriculum demonstrate better mathematical understanding than students who experience different mathematics curricula from year to year?) Second, the researchers did not take into account the quality of instruction within the classrooms.

Thompson (2009) compared the effects of reform-based instruction and non-reform-based instruction on students' mathematical achievement as measured by the Iowa Test of Basic Skills (ITBS). Observers, who were trained to use an observation instrument adapted from mathematics and science education standards and the TIMSS survey, documented reform-based and non-reform-based mathematical activities and behaviors within classrooms. Examples of reform-based mathematical activities included 1) students using manipulatives, 2) students engaged in self-assessments, and 3) students working in pairs or small groups. Examples of non-reform-based activities included 1) students listening to a teacher lecture and 2) students working on pencil-and-paper worksheets. From 2000 to 2002, 408 observations were made of randomly selected Oklahoma City mathematics and science classrooms (204 mathematics and 204 science) in grades 6 to 9 containing roughly 10,000 students. Using specific reform-based and non-reform-based practices as independent variables and using student achievement as the dependent variable, the researchers analyzed the data using stepwise multiple regression procedures to identify variables for elimination. Thompson found that manipulatives, self-assessment, and group-based projects together contributed significantly to students' mathematics achievement, accounting for 3% of the variance in ITBS mathematics scores (p < 0.05). Thompson also found that none of the non-reform-based practices significantly contributed to mathematics achievement.

The preceding studies illustrated that students who were taught using reform-based pedagogy in the middle grades tended to do at least as well as students who received traditional instruction; however, the effect of reform-based instruction was more pronounced for students who had received such instruction for at least two years. Table 3 summarizes the studies that examine the impact of reform-oriented teaching on student success in middle school mathematics courses.

Table 3
Effects of reform-oriented instruction in middle school mathematics courses

Approach Used | Researcher | Results
Use of reform curricula (Connected Mathematics and MATH Thematics) | Reys et al. (2003) | Students using reform-based curricula for two years scored as well as or better than matched traditional students on the Missouri Assessment Program
Incorporation of reform mathematics curricula in the mathematics component of Whole School Reform programs | Mac Iver & Mac Iver (2009) | Students enrolled for three years in schools that implemented a reform mathematics component in their whole school reform showed higher mathematical achievement growth than students who attended schools that did not have a mathematics component in their whole school reform
Reform-oriented instruction | Thompson (2009) | The combination of manipulatives, self-assessment, and group-based projects accounted for 3% of the variance in students' mathematical achievement on the Iowa Test of Basic Skills; non-reform practices had no significant effect

Student Achievement in Secondary Reform-Oriented Classrooms. The following studies describe the effects of implementing reform curricula or reform-oriented pedagogical practices in secondary mathematics classrooms. All of the studies are comparative and base their results on students from multiple schools.

Hirschhorn (1993) reported the effects that a reform-based curriculum, the University of Chicago School Math Project (UCSMP), had on student achievement and attitudes towards mathematics. UCSMP began in 1983 with funding from the Amoco Foundation. The goal of the foundation was to improve school mathematics education by designing effective teaching materials. In an ex post facto study, the researchers compared students who completed four years of the reform curriculum to a carefully matched set of comparison students who received traditional curricula. A total of 141 students across three sites participated in the study. In spring 1990, students took three instruments as posttests: a) the Mathematics Level 1 Achievement Test, which covered geometry and second-year algebra, b) an "Application Test," which covered applications of arithmetic, geometry, algebra, and advanced algebra, and c) a student opinion survey.

The results showed that the students using the reform curriculum outperformed comparison students on the Application Test at all three sites. The reform students at sites A and B significantly outperformed the comparison students on the Level 1 Achievement Test and the Application Test. For the eleventh-grade cohort at site C, the comparison students outperformed the reform students on the Level 1 Achievement Test. For the tenth-grade cohort at site C, the reform students outperformed the comparison students on the Application Test. The researchers noted that the comparison students tended to perform better on factoring topics, whereas the reform curriculum deemphasized such topics. Additionally, the student opinion survey showed 1) very little difference in attitudes towards mathematics between UCSMP and comparison students, 2) that reform students who used a scientific calculator for at least four years were more likely to agree that calculators helped them to learn mathematics, and 3) that reform students were more likely to agree that using a calculator too much makes you forget how to do arithmetic. Lastly, the researchers stated that a conservative measure was necessary to assess the validity of the results since they did not formally examine the quality of teaching within the classrooms.
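The matched-comparison studies above generally report whether the difference between a reform group and a comparison group on an achievement measure is statistically significant. A minimal sketch of that kind of comparison is given below; the scores are simulated, the group means, standard deviations, and sizes are assumptions chosen only for illustration, and the calculation is not drawn from any of the studies reviewed in this section (Python, using numpy and scipy):

# Hypothetical posttest scores for a reform group and a matched comparison group;
# none of these numbers come from the studies summarized above.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
reform = rng.normal(loc=72, scale=10, size=120)       # assumed mean 72, SD 10
comparison = rng.normal(loc=68, scale=10, size=120)   # assumed mean 68, SD 10

# Welch's t-test (no equal-variance assumption) for the difference in means
t_stat, p_value = ttest_ind(reform, comparison, equal_var=False)

# Cohen's d as a simple effect-size summary of the same difference
pooled_sd = np.sqrt((reform.var(ddof=1) + comparison.var(ddof=1)) / 2)
cohens_d = (reform.mean() - comparison.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3g}, d = {cohens_d:.2f}")

Reporting an effect size such as Cohen's d alongside the p value helps readers judge whether a statistically significant difference is also practically meaningful.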
Schoen, Hirsch, and Ziebarth (1998) examined the effects of the reform-oriented curriculum, the Core-Plus Mathematics Project (Coxford et al., 1998), on student achievement, including achievement measured by the Iowa Tests of Educational Development and a test based on the National Assessment of Educational Progress. Beginning in 1997, the researchers performed a longitudinal study of high school freshmen located in several states through their first year of post-high school education. To establish a baseline and properly match students in the Core-Plus group to students in the comparison group, the researchers administered the Ability to Do Quantitative Thinking (ATDQT) standardized test as a pretest to all students; the ATDQT is a subtest of the Iowa Tests of Educational Development. At the end of each year, the researchers administered open-ended posttests that were developed by the Core-Plus Mathematics Project evaluation team. At the beginning of the study, 2,944 Core-Plus students from 33 schools and 527 comparison students from 11 schools participated. After one year, 2,270 Core-Plus and 201 comparison students remained in the study, and after two years, 1,457 Core-Plus students and no comparison students remained in the study. The researchers found that Core-Plus students demonstrated better reasoning in quantitative situations on the ATDQT than did comparison students; Core-Plus students were better able to apply algebra and geometry concepts on posttests; and while comparison students outperformed Core-Plus students in algebraic procedures at the end of the first year, a significant difference no longer existed at the end of the second year.

Thompson and Senk (2001) examined the difference in student achievement between 150 students who used UCSMP high school curricula and 156 students who used traditional curricula. A total of 16 second-year algebra classes located in four schools across four states participated in the study; the schools represented a variety of educational and socioeconomic conditions. Each UCSMP class at a school had a paired non-UCSMP class at that same school, where both sets of students were of comparable mathematical abilities. The researchers administered a pretest measuring entering algebra and geometry knowledge to determine if the students were comparably matched. At the end of the school year the researchers administered a posttest, the Advanced Algebra Multiple-Choice Posttest, which measured students' content knowledge. On the entire multiple-choice posttest, the researchers found that the differences in mean percentages between the paired classes were statistically significant for five of the eight pairs of classes, all favoring the UCSMP classes. No significant difference existed between the remaining three pairs of classes. The authors cautioned that although the posttest was designed to be fair to both types of classes, teacher feedback indicated that major differences in content coverage existed among classes. However, for the Fair Test (which included items for which both sets of teachers reported that their students had opportunities to learn the needed content), UCSMP classes again outperformed the comparison classes seven out of eight times, with four of these differences being statistically significant in favor of the UCSMP classes; the remaining differences were not statistically significant. On the Conservative Test, which emphasized mathematical skills, the difference in achievement between the two groups was not statistically significant.
On the Problem-Solving and Understanding Test, all but one set of differences was statistically significant in favor of the UCSMP classes. Thus, the UCSMP curricula tended to help students understand mathematics and did not adversely affect procedural skills.

Continuing the string of studies on UCSMP, Senk and Thompson (2006) reported a secondary analysis of the solutions written by the second-year algebra students from Thompson and Senk's (2001) study. The students in the analysis used either UCSMP Advanced Algebra or a traditional second-year algebra curriculum. The researchers found that UCSMP students scored higher than non-UCSMP students overall on the Problem Solving and Understanding Test and on a multiple-choice achievement test. The researchers also found that UCSMP students used graphical and numerical strategies more frequently than students who used comparison textbooks. Moreover, since UCSMP students left fewer questions blank, the researchers hypothesized that the emphasis of UCSMP on multiple dimensions of understanding and multiple solution approaches better helped students begin a problem than the other curricula studied.

Researchers investigated whether enrollment in the reform-oriented curriculum, the Interactive Mathematics Program (IMP) (Fendel et al., 1999), 1) increased the percentage of students who took college-qualifying high school mathematics courses and 2) impacted student achievement as measured by the Comprehensive Test of Basic Skills (CTBS) and the Scholastic Aptitude Test (SAT) (Webb, 2003). The Interactive Mathematics Program is a four-year college-preparatory curriculum for grades 9-12 that integrates a wide range of mathematics and frequently uses technology throughout the program. IMP encourages students to use graphing calculators and work cooperatively to solve both routine and non-routine problems. Students are expected to experiment with examples, search for and articulate patterns, and provide conjectures to be tested. Students are encouraged to verbalize their thinking as evidenced by classroom activities including presentations, small-group activities, and written explanations. By using their teachers, classmates, textbook, and other resources, students are encouraged to become independent learners.

A total of 1,121 student transcripts from the class of 1993 across three diversely populated California high schools were analyzed. The researchers found that a significantly higher percentage of IMP students decided to pursue a fourth year of high school mathematics than did students who were enrolled in more traditional mathematics courses (64% and 38%, respectively). When analyzing student performance on the Comprehensive Test of Basic Skills (CTBS), the researchers found no significant difference between IMP students and non-IMP students. With respect to the Scholastic Aptitude Test (SAT), the researchers found that the IMP students in one of the high schools scored significantly higher than the non-IMP students from that same school; for the other two schools, no significant difference was found. Thus, students enrolled in the IMP from the ninth grade performed at least as well as non-IMP students when considering SAT and CTBS scores. A weakness of the study was that students volunteered for the IMP curricula; thus, the study's lack of random assignment made it difficult for the researchers to ensure that the mathematical abilities of each cohort of students were similar at the beginning of the study.
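As a rough illustration of how a difference in proportions such as the 64% versus 38% fourth-year course-taking rates reported by Webb (2003) can be tested, the sketch below applies a two-proportion z-test. The per-group transcript counts are assumptions (the study reported 1,121 transcripts in total, but the split between IMP and non-IMP students is not restated here), so the output is illustrative rather than a reanalysis of the study (Python, using statsmodels):

# Illustrative only: assumed group sizes summing to the reported 1,121 transcripts.
from statsmodels.stats.proportion import proportions_ztest

n_imp, n_non_imp = 500, 621                       # assumed split, not study data
took_fourth_year = [round(0.64 * n_imp),          # IMP students taking a fourth year
                    round(0.38 * n_non_imp)]      # non-IMP students taking a fourth year

z_stat, p_value = proportions_ztest(count=took_fourth_year, nobs=[n_imp, n_non_imp])
print(f"z = {z_stat:.2f}, p = {p_value:.3g}")

With groups this large, the normal-approximation z-test and an exact test give essentially the same answer; the substantive question is whether self-selection into IMP, rather than the curriculum itself, explains part of the difference.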
Cichon and Ellis (2003) reported that researchers gathered data from the graduating classes of 1997, 1998, and 1999 who used MATH Connections (Robinson & Robinson, 1998), a reform-oriented curriculum. MATH Connections is a curriculum designed according to the goals of NCTM. This curriculum blends different areas of mathematics, technology, cooperative learning, and real-world situations to help students understand the mathematical concepts presented (Cichon & Ellis, 2003). From the eight schools selected, the MATH students were matched with comparison students based on their performance on the mathematics portion of the standardized Connecticut Mastery Test (CMT) administered in the eighth grade. Observations were made of both MATH and non-MATH classes, and items were rated on a scale from 1 (none) to 5 (extensive). Both types of classes had a similar amount of group interaction, but MATH classes had significantly more on-task student behavior than did comparison classes (4.4 and 3.8, respectively) and significantly more complex cognitive levels of discourse than comparison classes (2.3 and 1.9, respectively). The researchers noted that several features of MATH classes promoted an environment of successful conceptual understanding and problem-solving experiences. These features included frequent use of multiple representations of mathematics such as visual and symbolic representations, the routine incorporation of graphing calculators, the focus on classroom arguments and open-ended questions that could be answered in multiple ways, and the use of problem-solving activities that involve real-world situations.

When the researchers analyzed the performance of both groups on the Connecticut Academic Performance Test (CAPT), taken in the tenth grade, they found that 60% of the 558 MATH students met or exceeded the state goal compared to 55% of the 745 comparison students; the result was statistically significant. The researchers found no significant difference between the CAPT mean scores of students from both groups who had matching CMT scores. However, when the researchers used the CMT score of students as a covariate to control for incoming high school mathematics ability, they found that MATH students significantly outperformed the comparison students on the CAPT. Thus the students using MATH Connections were at least as successful in learning mathematics as the comparison students.

The previous studies demonstrate that students in reform-oriented secondary mathematics courses often outperformed equally matched comparison students in reasoning skills and application problems. Additionally, any differences in procedural and algebraic ability between the two groups diminished after a couple of years. Table 4 summarizes the studies that examine the impact of reform-based mathematics curricula on student success in secondary mathematics courses.

Table 4
Effects of reform-oriented instruction in secondary school mathematics courses

Approach Used | Researcher | Results
University of Chicago School Math Project (UCSMP) | Hirschhorn (1993) | Students using UCSMP consistently outperformed matched comparison students on the Application Test
Core-Plus Mathematics Project (CPMP) | Schoen, Hirsch, & Ziebarth (1998) | Compared to matched students, students using CPMP demonstrated better reasoning in quantitative situations, were better able to apply algebra and geometry concepts, and eliminated deficits in procedural ability by the end of the second year
Approach: University of Chicago School Math Project (UCSMP). Researcher: Thompson & Senk (2001). Results: Students using UCSMP performed as well as or better than matched students on problem-solving tests; no significant difference existed between the groups in procedural ability.

Approach: Interactive Mathematics Program (IMP). Researcher: Webb (2003). Results: Compared to non-IMP students, students using IMP were more likely to enroll in a fourth year of high school mathematics.

Approach: MATH Connections. Researcher: Cichon & Ellis (2003). Results: Students in MATH classes had more on-task student behavior and more complex cognitive levels of discourse, and they were more likely to meet state mathematics standards than students in comparison classes.

Approach: University of Chicago School Math Project (UCSMP). Researcher: Senk & Thompson (2006). Results: Students using UCSMP used graphical and numerical strategies more frequently than matched comparison students.

Student Achievement in College-level Reform-Oriented Classrooms. The following studies address efforts made by postsecondary institutions to improve student performance in college-level mathematics. Although several of the authors did not explicitly refer to reform documents for their motivation to implement the described classroom changes, the changes implemented in the studies often aligned with pedagogical practices advocated by reform documents. All of the following studies are comparative studies.

Hurley, Koehn, and Ganter (1999) reported that the University of Connecticut conducted a 5-year longitudinal study (Fall 1989 to Spring 1994) of 579 students who took either a traditional calculus course or an experimental computer-integrated calculus course. Although both sections used the same text, the experimental course included the following: 1) students participated in a computer-laboratory period for one class hour per week, 2) students engaged in a group problem-solving session that addressed both conceptual and computational questions, and 3) students were encouraged by instructors in both sessions to explore and analyze the problems provided. When analyzing the students' performance on a common final exam, which included both conceptual and procedural questions, the researchers found that the experimental sections outperformed the traditional sections each semester; however, the authors did not indicate whether the differences were statistically significant. The data also showed that taking the computer-integrated calculus course was the only statistically significant factor that correlated with persistence in technical majors among females. For males, persistence in technical majors correlated significantly with the calculus course taken as well as with Mathematics SAT score. Lastly, students who took the computer-integrated calculus course completed significantly more (p < 0.02) post-calculus major courses than did students who took the traditional calculus course.

Lawson et al. (2002) observed six sections of Math Theory for Elementary Teachers. Three sections were taught by instructors who were influenced by the Arizona Collaborative for Excellence in the Preparation of Teachers (ACEPT), and the other three sections were taught by instructors who were not influenced by ACEPT. ACEPT is a National Science Foundation-sponsored program that attempts to improve mathematics instruction at Arizona State University by incorporating reformed teaching methods into undergraduate mathematics and science courses. At the beginning and end of the semester, students were administered a test that measured computational skills, number sense, and conceptual understanding.
Each instructor was evaluated at least twice using the Reformed Teaching Observation Protocol (RTOP), a 25-question observation instrument developed by ACEPT to measure the degree to which a classroom's activities align with reform pedagogy (Piburn & Sawada, 2000). The researchers found correlations between the following pairs of items: 1) student post-test scores and mean instructor RTOP scores (r = 0.94, p < 0.001), 2) normalized student achievement gains and mean instructor RTOP scores (r = 0.86, p < 0.001), and 3) student post-test number sense scores and mean instructor RTOP scores (r = 0.92, p < 0.001). However, the researchers did not find a relationship between student post-test performance on the computational skills test and the instructors' mean RTOP scores. Unfortunately, the researchers did not state the number of students who were involved in the study.

Ellington (2005) reported that Virginia Commonwealth University (VCU), an urban institution with over 28,000 undergraduate and graduate students, developed a college algebra course that attempted to focus on mathematics topics that were important to other disciplines, develop students' abilities to work as a team and to communicate quantitative ideas orally and in writing, and emphasize the development of mathematical models and the use of technology. In Fall 2004, the researcher compared 284 students across 8 sections of the modeling-based classes to 989 students enrolled in 28 sections of traditionally taught skills-based classes. The experimental students took tests that consisted of 70% modeling questions and 30% skills questions, and they spent the majority of the class period working in groups of 2-4 students on modeling problems with intermittent pauses for whole-class or partial-class discussion on issues or skills that needed to be addressed. Additionally, graphing calculators were emphasized on a daily basis, often to find and evaluate mathematical models. The author found that roughly 72% of the students in the experimental group earned a grade of A, B, or C as compared to 50% of the traditional students (p < 0.01). The DFW rates (the percentage of students who earned a final grade of "D", "F", or "Withdrawal" for the course) for the experimental and traditional classes were 28% and 51%, respectively; however, the author did not state whether this difference was statistically significant. Students in both sections were administered ten common questions on their final exams that covered algebraic computations and modeling applications. The experimental students scored significantly higher than the traditional students (p < 0.001) on these common final exam questions. When comparing the students in both courses who earned a C or higher, the experimental students outperformed the traditional students on the skills questions, the modeling questions, and the combined set of questions. In the subsequent mathematics courses (Spring 2005), significantly more traditional students than experimental students earned an A, B, or C in precalculus (70% and 56%, respectively; p < 0.01). However, no significant difference in ABC rates existed between the two groups in the subsequent business mathematics course (Ellington, 2005). Several limitations regarding the results of the study should be noted.
First, each experimental instructor was assigned two teaching assistants to attend all class meetings to help students who were having difficulty and to facilitate group activities; outside of class, the assistants tutored students and ran help sessions before each test. The author reported no such advantage for the traditional students. Second, the experimental final grades included group projects (20%) and class activities (10%), whereas the traditional final grades included only homework and tests. Lastly, many of the experimental sections had an attendance policy that significantly penalized students' grades for unexcused absences; the traditional sections did not have such a policy.

Gordon (2006) reported that in Fall 1999, researchers at New York Institute of Technology (NYIT) compared the achievement and attitudes of 37 students enrolled across two reform-modeling precalculus classes to those of 27 students enrolled across two traditionally taught precalculus classes. The traditional classes were lecture-based and focused on routine algebraic manipulations, whereas in the reform-modeling classes, algebraic manipulations arose in the context of problem solving. The reform-modeling course focused on conceptual understanding of mathematical ideas, problem solving, and realistic applications. Additionally, all classes used graphing calculators. When the researchers compared the students' answers on ten common questions on the final exam (primarily procedural in nature), they found that the reform students significantly (p < 0.05) outperformed the traditional students. The researchers also conducted a student attitudinal survey through a pre-post survey design. The results showed that the reform students expressed more positive attitudes towards mathematics, were more likely to view mathematics as connected to situations beyond mathematics courses, and were more likely to view technology as important to learning mathematics.

The studies in the previous section demonstrate that students in college-level mathematics courses who receive problem-oriented instruction, combined with appropriate use of technology and cooperative learning, can perform at least as well as comparison students in terms of pass rates and performance on examinations. Additionally, students who received reform-oriented instruction tended to have more positive attitudes towards mathematics. Ellington's (2005) study was the only study in which students who received reform-oriented instruction performed worse in a subsequent course (precalculus), although those students performed as well as the comparison students in the subsequent business mathematics course. Table 5 summarizes the studies that examine the impact of reform-oriented instruction on student success in postsecondary mathematics courses.

Table 5
Effects of reform-oriented instruction in postsecondary mathematics courses

Approach: Integration of computer laboratory meetings involving group work and exploration into a calculus course. Researcher: Hurley, Koehn, & Ganter (1999). Results: Compared to traditional courses, students in the computer-integrated course scored higher on the final exam, and females in the course were more likely to pursue technical majors.

Approach: Reform-oriented instruction in a college-level mathematics course for elementary teachers. Researcher: Lawson et al. (2002). Results: Significantly high correlations existed between reform-oriented instruction and students' post-test scores, achievement gains, and number sense scores.
Approach: Integration of a problem-oriented approach and cross-disciplinary content into a remedial algebra course. Researcher: Erickson & Shore (2003). Results: Students in the experimental course earned higher test scores and reported more positive attitudes than students in the traditional remedial course.

Approach: Integration of cross-disciplinary topics, group work, and technology into a college-level mathematics course. Researcher: Ellington (2005). Results: Students in the experimental group had higher pass rates and higher scores in the algebra course, lower pass rates in the subsequent precalculus course, and no significant difference in the subsequent business mathematics course.

Approach: Reform-oriented instruction in a college-level precalculus course. Researcher: Gordon (2006). Results: Students in the reform course demonstrated higher procedural skills and more positive attitudes towards mathematics.

Student Achievement in Remedial Postsecondary Reform-Oriented Classrooms. The previous studies addressed efforts to improve student performance in college-level mathematics courses. The following studies address efforts made by postsecondary institutions to improve student performance in remedial mathematics courses. Although several of the authors did not explicitly refer to reform documents for their motivation to implement the described classroom changes, the changes implemented in the studies often aligned with pedagogical practices advocated by reform documents. With the exception of Phoenix (1990), all of the following are comparative studies.

Phoenix (1990) examined the effect that the following classroom practices had on her students' achievement in a remedial mathematics course: 1) student verbalization and immediate feedback, 2) cooperative learning, 3) a concept/discovery-based approach, and 4) creative classroom activities. Students' placement into the course was based on their performance on the college's mathematics placement test. The average score on the placement test was 15.1 with a standard deviation of 5.1 (a score of 25 out of 40 is passing). At the end of the semester, 25 of the original 30 students scored an average of 28.1 on the placement test with a standard deviation of 6.6. The researcher also reported that 16 students passed the course, 9 students failed the course, and 5 students withdrew from the course; thus, the class had a 53% pass rate. Compared to other sections of the course, the instructor's class outperformed 10 of the 12 other sections. Although the pedagogical techniques reported seemed promising, the study could have been strengthened in several areas. First, the researcher, who was also the instructor, did not address teacher effect; the instructor may have been a significantly better teacher than the other instructors teaching the course. Second, the researcher could have buttressed her claim that her class was taught significantly differently from the other 12 classes by providing a scoring instrument that evaluated the activities within her classroom. Lastly, to her credit, the researcher noted that the results of the study were inconclusive. Future studies of this nature could be strengthened by presenting qualitative data that demonstrate students' appreciation for the classroom activities or quantitative data that allow comparisons between types of instruction across classrooms.
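To convey roughly how large the reported placement-test gain is, the sketch below expresses it as a standardized mean difference using only the means and standard deviations given above. This calculation was not part of Phoenix's study, and it ignores the fact that the 25 retested students are a subset of the original 30, so it should be read only as a back-of-the-envelope illustration.

```python
import math

# Values reported by Phoenix (1990), as summarized above.
pre_mean, pre_sd, pre_n = 15.1, 5.1, 30     # placement scores at entry
post_mean, post_sd, post_n = 28.1, 6.6, 25  # scores of the 25 students retested

# Pooled standard deviation and an approximate standardized gain (Cohen's d).
pooled_sd = math.sqrt(((pre_n - 1) * pre_sd**2 + (post_n - 1) * post_sd**2)
                      / (pre_n + post_n - 2))
d = (post_mean - pre_mean) / pooled_sd
print(f"approximate standardized gain: d = {d:.2f}")
```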
Erickson and Shore (2003) studied the effects of integrating a problem-oriented approach to learning as well as cross-disciplinary content from health disciplines (such as nursing and physical therapy) into a remedial intermediate algebra course offered within the Physical Therapist Assistant program at Allegany College of Maryland. The faculty of the health departments designed problems that were intended to demonstrate to students how mathematics was used in the health disciplines. The intermediate mathematics course covered polynomials, linear and quadratic equations, radicals, systems of equations, and graphing of functions. Students received instruction through lecture, classroom discussion, and in-class problem solving. Although the authors did not report the sample size, they found that the students who were enrolled in the cross-disciplinary, problem-oriented courses earned significantly higher test scores and reported more positive attitudes than students in the traditional mathematics courses.

Hooker (2011) studied the effect that collaborative learning had on students enrolled in a remedial algebra course at a small Tribal community college. The researcher used an experimental design in which the control group (n = 31) and the experimental group (n = 30) were taught concurrently during Fall 2008. In the experimental group, students sat in groups of 4 to 8; each group was assigned a group leader, a former student whom the instructor had chosen and trained. During the first three days of each week, students worked on problems and assignments; on the fourth day of class, the students engaged in a special in-class workshop in which an activity was given to each group of students. The activity was chosen to engage students in challenging real-life applications of the content that was taught during the week. The activity was also designed to lead students to 1) talk about the problem, 2) practice using new vocabulary and concepts, 3) think about different ways to apply the new concepts, and 4) learn how to work together. The researcher found that the experimental group had a higher pass rate than the control group (43% and 35%, respectively), and a higher percentage of the students in the experimental group persisted to the end of the course compared to the students in the control group (47% and 32%, respectively). However, the author did not state whether these results were statistically significant.

The studies in the previous section demonstrate that students in postsecondary remedial mathematics courses who received problem-oriented instruction or cooperative learning instruction performed at least as well as comparison students in terms of pass rates and performance on examinations. Additionally, students who received problem-oriented instruction tended to have more positive attitudes towards mathematics. Table 6 summarizes the studies that examined the impact of reform-oriented instruction on student success in postsecondary remedial mathematics courses.
Table 6
Effects of reform-oriented instruction in postsecondary remedial mathematics courses

Approach: Incorporation of student verbalization, cooperative learning, and a concept/discovery-based approach into a remedial algebra course. Researcher: Phoenix (1990). Results: Inconclusive, since the instructor, who was also the researcher, taught only one section and compared the results from her section to sections taught by other instructors.

Approach: Integration of a problem-oriented approach and cross-disciplinary content into a remedial algebra course. Researcher: Erickson & Shore (2003). Results: Students in the experimental course earned higher test scores and reported more positive attitudes than students in the traditional remedial course.

Approach: Collaborative learning in a remedial algebra course. Researcher: Hooker (2011). Results: Students in the collaborative learning course had higher pass rates and higher persistence rates to the end of the course.

Effects of Self-Efficacy on Student Achievement

Another important component in improving students' success is self-efficacy. Bandura (1997a) described self-efficacy as the "beliefs in one's capabilities to organize and execute the courses of action required to produce given attainments" (p. 3). In other words, self-efficacy refers to a person's confidence in their own abilities to accomplish the goals at hand. Bandura (1997b) stated that since people try to exercise control over the events in their lives, they have a much stronger incentive to act if they believe that their actions will be effective. Thus, individuals with low self-efficacy have low aspirations and weak commitment to their goals, and they avoid difficult tasks; in contrast, individuals with high self-efficacy set high goals, sustain strong commitment, and view difficult tasks as challenges to be mastered (Bandura, 1997b). The following studies describe the effects of self-efficacy on student learning and present either the direct effects or the mediating effects of self-efficacy on performance. In all of the following studies, students with higher self-efficacy either demonstrated higher mathematical performance or demonstrated higher performance on variables affecting mathematical performance.

Pintrich and De Groot (1990) examined the effects of self-efficacy on the cognitive strategies employed by students and on students' academic performance. "Cognitive strategies" were defined by the researchers as the strategies that students used to learn, remember, and understand the material; such strategies included rehearsal, elaboration, and organizational strategies. The sample consisted of 173 seventh graders from science and English classes in a middle-class, predominantly White Michigan city school district, and the data consisted of students' responses on the Likert-style Motivated Strategies for Learning Questionnaire as well as students' performance on class work, quizzes, tests, essays, and reports. The researchers found that higher levels of self-efficacy were correlated with higher levels of cognitive strategy use (r = 0.33, p < 0.001) and that students with high self-efficacy were significantly more likely to use cognitive strategies than were students with low self-efficacy (p < 0.02). However, when controlling for cognitive engagement variables in a regression analysis, self-efficacy was not significantly related to students' academic performance.
The researchers suggested that although cognitive engagement was more directly related to academic performance, self-efficacy played a facilitative role in relation to cognitive engagement.

Pajares and Kranzler (1995) studied 329 high school students from two Southern public schools, roughly 79% of whom were non-Hispanic White. Students completed a mathematics self-efficacy instrument called the Mathematics Confidence Scale (Dowling, 1978). The researchers found a significant correlation between mathematics self-efficacy and mathematics performance (r = 0.64, p < 0.0001). Using path analysis, the researchers found that the direct effect of self-efficacy on mathematical performance (β = 0.349) was as strong as the direct effect of ability on performance (β = 0.324). However, a large majority of students in their study demonstrated a level of mathematical confidence that was often not matched by their mathematical competence, and the high school students in their sample demonstrated higher levels of mathematical overconfidence than did college undergraduates in the researchers' previous investigations.

Pajares and Graham (1999) tracked 273 students from grade 6 through grade 8. The sample contained roughly the same number of boys as girls and consisted of 70% non-Hispanic Whites. The Southern suburban public middle school followed a mathematics curriculum that was consistent with the standards of the National Council of Teachers of Mathematics. After controlling for several variables, including previous mathematics achievement, perceived value of mathematics, self-regulation, and anxiety, the researchers used multiple regression to determine the contribution that self-efficacy made to mathematical performance. They found that self-efficacy accounted for a modest but statistically significant portion of the variance in mathematical performance (R² change = 0.03, p < 0.05). Additionally, they found that students tended to be biased towards overconfidence.

Pietsch, Walker, and Chapman (2003) studied the relationship between mathematics self-efficacy and the mathematical performance of 416 high school students aged 13 to 16 in Sydney, Australia. The students came from low socioeconomic backgrounds, and 80% of the students were from non-English-speaking backgrounds. The researchers designed a self-efficacy survey to assess students' mathematics self-efficacy, and they assessed students' mathematical performance using end-of-term examinations. Using confirmatory factor analysis and structural equation modeling techniques, the researchers found that mathematics self-efficacy significantly affected mathematical performance and cited the results of one model (goodness-of-fit index = 0.92) in which the path coefficient from self-efficacy to mathematics performance was 0.53 (p < 0.05).

Mousoulides and Philippou (2005) analyzed 194 sophomore pre-service teachers who were enrolled in a mathematics course during Fall 2004. Of the participants, 18% were male and 82% were female. Using the 26-item Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich et al., 1993), the researchers devised a model that contained seven variables, including self-efficacy, task value, and metacognitive strategies. They found that the fit of their model was good relative to typical standards (Comparative Fit Index = 0.923, χ² = 451, df = 303, RMSEA = 0.056). The researchers found that self-efficacy had a causal effect of 0.33 on achievement.
Thus, the researchers concluded that self-efficacy was a strong predictor of academic performance in mathematics and that their study corroborated the study of Pintrich and De Groot (1990).

Kitsantas, Cheema, and Ware (2011) analyzed data from the 2003 Program for International Student Assessment (PISA) and school questionnaires from NCES (2003). Based on the mathematics literacy of 3,776 15-year-olds enrolled in grades 9, 10, and 11 across 221 schools, the researchers found a significant correlation between mathematics self-efficacy and mathematics achievement (r = 0.54, p < 0.001). Using multiple regression analysis, the researchers found that mathematics self-efficacy accounted for 20% of the total variation in mathematics achievement (p < 0.001) after controlling for gender, race, relative time spent on mathematics homework, and homework support. The researchers concluded that educators should help their students feel efficacious in using the mathematics to which they have been exposed. Although the researchers controlled for several variables, they did not control for students' prior mathematics achievement or mathematical abilities.

The previous studies demonstrate that self-efficacy is positively related to student engagement and student performance. A major finding in many self-efficacy studies is that, after controlling for students' previous performance, students' beliefs in their abilities strongly predict their mathematical performance (Wigfield & Eccles, 2002). Further, the confidence that students possess in their own mathematical abilities helps to determine how they use the knowledge and skills that they possess (Pajares & Kranzler, 1995). Because high mathematics self-efficacy helps students develop greater interest in and perseverance towards solving mathematical problems (Pajares & Kranzler, 1995; Olani et al., 2011), educators should help all students feel efficacious in handling the mathematics to which they have been exposed (Kitsantas, Cheema, & Ware, 2011). Educators can help to develop mathematical self-efficacy in their students by developing a learning environment in which students' ideas are valued and respected and in which students develop ownership of their ideas. Such an environment can develop positive dispositions in students towards mathematics, which in turn encourages students to engage in mathematical reasoning and thus acquire conceptual understanding (Mueller, Yankelewitz, & Maher, 2011). Table 7 summarizes the studies that examined the effect that self-efficacy had on student performance.
Table 7
Effects of self-efficacy on student performance

Approach: Examined the effects of self-efficacy on the cognitive strategies and academic performance of middle school students in science and English courses. Researcher: Pintrich & De Groot (1990). Results: Higher levels of self-efficacy were correlated with higher levels of cognitive strategy use.

Approach: Examined the relationship between mathematics self-efficacy and the mathematical performance of secondary students. Researcher: Pajares & Kranzler (1995). Results: A significant correlation existed between mathematics self-efficacy and mathematical performance.

Approach: Examined the relationship between mathematics self-efficacy and the mathematical performance of middle school students. Researcher: Pajares & Graham (1999). Results: Mathematics self-efficacy accounted for 3% of the variance in mathematical performance.

Approach: Examined the relationship between mathematics self-efficacy and the mathematical performance of secondary students. Researcher: Pietsch, Walker, & Chapman (2003). Results: Mathematics self-efficacy significantly affected mathematical performance.

Approach: Examined the impact of mathematics self-efficacy on preservice teachers' mathematical performance. Researcher: Mousoulides & Philippou (2005). Results: Mathematics self-efficacy was a strong predictor of mathematical performance.

Approach: Compared secondary students' responses on questionnaires to their mathematical literacy achievement. Researcher: Kitsantas, Cheema, & Ware (2011). Results: Mathematics self-efficacy accounted for 20% of the variation in mathematics achievement.

Synthesis of Relevant Studies

The first part of the literature review describes the characteristics of students in remedial courses, the effectiveness of remedial mathematics courses in improving the chances of academic success for underprepared students, and the effects that computer-based instruction and shortened developmental sequences have had on student achievement in remedial mathematics courses. The second part of the literature review describes the major tenets of the reform mathematics movement for K-12 mathematics classrooms as voiced by NCTM, recommendations that specifically address postsecondary mathematics courses for prepared and underprepared students, studies that describe the effects that reform-oriented curricula and pedagogical practices have had on student achievement, and studies that describe the effect that mathematics self-efficacy has had on improving student achievement.

Based on the literature, attempts to improve remedial mathematics courses through computer-based instruction produced mixed results, and the implementation of such instruction required considerable financial and logistical support. Decreasing the number of required remedial mathematics courses showed promise, but the studies involved drastic redesigns of mathematics departments' developmental programs (Squires, Faulkner, & Hite, 2009; Bassett & Frost, 2010) and focused primarily on non-STEM students (Bryk, 2012). However, students in reform-oriented secondary and postsecondary mathematics courses tended to do at least as well as students in traditional lecture courses in terms of pass rates (Hooker, 2011), overall test scores (Erickson & Shore, 2003), procedural ability (Reys et al., 2003; Thompson, 2009), and problem-solving ability (Hirschhorn, 1993; Schoen, Hirsch, & Ziebarth, 1998; Thompson & Senk, 2001).
In conclusion, based on 1) the fact that the topics covered in tertiary remedial mathematics courses are equivalent to those covered in middle and secondary school mathematics courses and 2) the successes described by the studies that incorporated reform-oriented curricular and pedagogical changes within middle and secondary classrooms, I conducted a quasi-experimental study in which I taught one set of students using primarily traditional didactic lecture techniques and another set of students using pedagogical practices that more closely align with those advocated by reform documents. My study examined the effectiveness of reform-based pedagogy in terms of pass rates, procedural ability, and problem-solving ability. Additionally, since studies have shown that students' beliefs in their own mathematical abilities can influence their mathematical performance (Wigfield & Eccles, 2002), my study also examined the effect that reform-based pedagogy has on students' mathematical self-efficacy. I also provided documentation regarding the daily classroom practices within the study (such as a series of RTOP scores), and I gathered qualitative data from the students in the reform-oriented class to learn how they felt about key aspects of reform-oriented instruction.

Theoretical Framework

I hold to the theoretical perspective of constructivism, a perspective that has significantly influenced mathematics education over the past several decades (Ernest, 1997). Constructivism is the belief that individuals create their own knowledge by modifying their existing concepts when presented with new evidence and experiences (Annetta & Dotger, 2006). Constructivism views learning as an active process, since individuals wrestle to reconcile their perceptions of the world with their existing knowledge frameworks (Anderson et al., 1994).

Ernest (1997) advised that researchers who adhere to a constructivist epistemology should do so with caution and humility. Since knowledge is attained through individual and social experiences, knowledge constructed by individuals may not align with an objective reality (Cooner, 2005). Additionally, since constructivism highlights the subjective interrelationship between the researcher and the participants, as well as the co-construction of meaning between the two, the researcher is not considered an objective observer and must therefore acknowledge his or her values to readers (Mills, Bonner, & Francis, 2006). Lerman (1989) stated that constructivism has been described as consisting of two hypotheses: "1) knowledge is actively constructed by the cognizing subject, not passively received from the environment, and 2) coming to know is an adaptive process that organizes one's experiential world; it does not discover an independent, pre-existing world outside the mind of the knower" (p. 1). Lerman (1989) further stated that researchers who accept only the first hypothesis are considered "weak" constructivists, and researchers who accept both hypotheses are considered "radical" constructivists. With respect to a research perspective, I consider myself a weak constructivist since I assume that individuals construct their own knowledge and that an objective reality does exist. Ernest (1997) described several important implications of a constructivist framework for mathematics education research.
Researchers need to 1) attend to their own beliefs about knowledge, 2) attend to the constructs that participants bring with them into the study, 3) attend to the social contexts of learning, 4) carefully use methodological techniques since truth can be acquired in more than one manner, 5) attend to the negotiation and shared meaning of the knowledge constructed by the participant, and 6) question the learner's subjective knowledge. Thus, a constructivist researcher needs to consider participants as whole persons in light of the complex social context that exists among the participant, teacher, and researcher.

Research Questions

In an effort to improve my understanding regarding the effectiveness of reform-based pedagogical practices in the context of postsecondary remedial mathematics courses at a four-year university, I conducted a mixed methods study that examined both quantitative and qualitative data. The quantitative data helped me verify empirically the success of each treatment, and the qualitative data helped me understand the strengths and weaknesses of reform-based pedagogy from students' perspectives. My study was guided by the following broad research question: Is teaching remedial mathematics in a reform-oriented manner beneficial to university students? Five subquestions were addressed in my study, as follows:

1. Is there a significant difference in pass rates in remedial mathematics courses between university students who receive instruction consistent with reform pedagogy and university students who receive instruction through traditional didactic lecture methods?

2. Is there a significant difference in mathematical procedural ability between university students who receive instruction consistent with reform pedagogy and university students who receive instruction through traditional didactic lecture methods?

3. Is there a significant difference in mathematical problem-solving ability between university students who receive instruction consistent with reform pedagogy and university students who receive instruction through traditional didactic lecture methods?

4. Does the self-efficacy of university students in the reform classes improve as a result of the instruction received in the reform classes?

5. What views about reform instruction will university students who are enrolled in a reform-oriented remedial mathematics course express upon completing one semester of reform-oriented mathematics instruction?

As outlined in the following chapter, the first four subquestions were answered quantitatively and addressed pass rates, procedural ability, application ability, and change in mathematical self-efficacy. The fifth subquestion was answered qualitatively and addressed students' perceptions of the reform-oriented course.

CHAPTER 3: METHODOLOGY

This mixed-methods study was designed to gather knowledge about the effectiveness of reform-oriented instruction in postsecondary remedial mathematics courses. This chapter will first present the design of the study, followed by the context of the study. Next, the instrumentation and data analysis plan will be discussed. Last, the procedure describing the treatments that were implemented in the study will be presented.

Design

Creswell (2007) stated that the goals of research influence the approaches that are used in research. The broad goal of this study was to determine whether or not reform-oriented instruction was an effective means of teaching mathematics. Due to my constructivist theoretical framework, students'
perceptions regarding the effectiveness of the treatments were important in ascertaining the effectiveness of reform-oriented instruction (Ernest, 1997). This theoretical perspective therefore necessitated that I use a mixed methods design for this study. The quasi-experimental portion of the study attempted to determine the effect that reform-oriented instruction had on the following student achievement outcomes: procedural abilities, application abilities, pass rates, and change in mathematics self-efficacy. The success of the reform-oriented instruction was viewed in contrast to the success of the didactic lecture instruction. The study was quasi-experimental because the students were not randomly selected; instead, students selected their course according to what best accommodated their schedules. However, no policies existed that systematically placed students into one particular class or the other, and students did not know until the first day of class which treatment they would receive. A quasi-experimental design was chosen in order to establish a cause-and-effect relationship between the treatment and students' success in the course, and the results were controlled for variables that could influence them (Gravetter & Wallnau, 2004). This quasi-experimental design was an appropriate choice for this study since 1) it is a commonly used method for discovering generalizations about phenomena, 2) it attempts to validate empirically the relationships between teaching practices and student learning, and 3) a successful experiment can provide replicable and objective generalizations (Ernest, 1997; Carnine & Gersten, 2000). In the experiment, the didactic lecture course was defined as the control group, and the reform-oriented course was defined as the experimental group. Covariates were used in the study to temper the results based on significant demographic differences between the two groups.

However, focusing only on the quantitative aspects of a study can neglect important qualities that are worth examining (Miles & Huberman, 1994). I also wanted to understand students' views of reform-oriented instruction. Specifically, to what extent and in what areas did students perceive that the reform-oriented instruction benefited them? Therefore, I incorporated a qualitative component into the study that allowed me to understand the perspectives of the students who received reform-oriented instruction (Merriam, 2001). The qualitative portion of the study examined the views that students in the reform-oriented course had of reform-oriented instruction; these views were solicited from students through anonymous end-of-course surveys.

Context

The South East University (SEU) (a pseudonym) at which this study took place offers a range of undergraduate and graduate degrees in the schools of Business, Liberal Arts, Education, Nursing, and Sciences. In Fall 2011, 5,305 students were enrolled at SEU. During the 2011-2012 school year, a total of 874 students graduated from SEU, with 63% and 37% of those students earning a bachelor's degree and a master's degree, respectively. With respect to ethnicity, approximately 55% of the student population was White, 31% Black, 2% Asian, 2% Hispanic, and 10% other. Additionally, 37% of the students were male, and 63% were female. Since SEU also encourages part-time studies, 38% of its students were part-time.
Lastly, the average ACT score for entering freshmen at SEU was 22.0.

The remedial course sequence for mathematics at SEU consisted of Math 0700 (Elementary Algebra) and Math 0800 (Intermediate Algebra), followed by credit-level courses such as Finite Mathematics and Precalculus. In order to take Math 0800, a student must either pass Math 0700 (the preceding remedial mathematics course) or place directly into Math 0800 by taking a computerized placement test generated by software purchased by SEU's mathematics department. Traditionally, Math 0800 (Intermediate Algebra) at SEU was taught in the following sequence: 1) techniques for factoring, 2) rational expressions and equations, 3) graphing quadratic, square root, absolute value, and linear functions and performing operations with functions, 4) simplifying radical expressions and solving radical equations, and 5) solving and graphing quadratic equations. During the 2010-2011 school year, 606 students and 487 students enrolled in Math 0700 and Math 0800, respectively. The pass rates for Math 0700 and Math 0800 were 51.8% and 46.6%, respectively. Collectively, 49.5% of the 1,093 students who enrolled in remedial mathematics courses passed their remedial courses.

Dr. Jones (a pseudonym), head of the SEU mathematics department, agreed that an experimental section using alternative instructional methods could be offered on the condition that the experimental students would receive instruction that was on essentially the same level, in terms of difficulty and type of problems, as the instruction received by students in the traditional course. The remedial mathematics tests were designed by the SEU mathematics department and stressed primarily algebraic manipulations, with one to two application problems on each test. However, I had been given permission to modify the tests slightly by adding or removing questions as I deemed necessary. I also received permission from Dr. Jones to modify the grading scales of the experimental and control classes to allow for homework assignments and class participation grades, as well as to remove from the course the concepts of completing the square and graphing of circles.

Students in remedial classes at SEU were required to attend a Math Lab once per week in addition to the classroom lectures, during which time students used a computer-based format to work on homework and quizzes. Each Math Lab session lasted one to two hours. Students in remedial courses (including those in the Control and Experimental groups) were required to pass the lab in order to pass their course; passing the lab was based on their attendance, homework, and quiz averages within the Math Lab.

Before proceeding with the study, I sought permission from the Institutional Review Board (IRB) to conduct my study. (Consent forms are located in Appendix A.) I obtained permission from the Institutional Review Boards of both Auburn University and SEU. Students were given the option of "opting into" the study by permitting me to use their data in the study.

My Personal Background

Since I was the instructor for both courses in this study, it is important to consider my teaching background. I earned a Bachelor's degree in Education, double majoring in Secondary Mathematics and General Science, after which I immediately pursued a Master's degree in Mathematics Education.
Upon completing my Master's degree, I taught the spectrum of secondary mathematics and science courses for three years at a high school before obtaining a position managing a mathematics and sciences tutoring facility at the university in which the present research study was conducted. I also served as an adjunct instructor for the mathematics department at this university and taught many freshman-level mathematics courses, including a substantial number of remedial mathematics courses.

I experienced traditional lecture instruction in my primary, secondary, and most of my postsecondary mathematics courses. Upon earning my undergraduate degree, I entered the teaching arena and taught in the same manner that I was taught: traditional lecture instruction. When I entered graduate school, I was exposed to the pedagogy advocated by reform mathematics organizations; however, I was hesitant to modify my perspectives until I could experience an impetus to justify such a modification. I perceived the lack of student performance to be largely due to a lack of student effort. Then one day during class a professor made the comment, "In the end, teachers do not have any control over what students do outside the classroom. All the teacher can control is what happens inside the classroom." This statement helped to change my focus from the deficiencies of my students to the deficiencies of the course and my teaching style. Ironically, it was during my second semester of graduate school that I became disheartened by the abysmal performance of students in my remedial mathematics courses. I felt that I had maximized the benefits of traditional instruction, and yet the students in my remedial mathematics classes were still failing at unacceptable rates. I was no longer satisfied with my current way of teaching, and I was prepared to consider different methods of teaching mathematics. Through the pedagogical courses in my graduate program, a reform-oriented graduate mathematics course that taught me mathematics using reform pedagogy, and extended feedback from colleagues who were trained to use reform pedagogy, I gradually became comfortable teaching in a reform-oriented manner and even developed a preference for that style of teaching.

I began modifying the Math 0800 course in order to determine whether it was feasible to teach such an intensely procedurally oriented course in a reform-based manner. Including my first pilot course in Summer 2010, I conducted three pilot courses over several semesters and found each course to be fairly successful in terms of students' pass rates and general student feedback. By the time I taught the reform-based course for this study in Spring 2012, I had become comfortable teaching the Math 0800 course in a reform-based manner. I had also previously taught this course numerous times in a more traditional, lecture-based manner.

Description of Sample

Students in the study chose to be in their respective courses based on what best fit their schedules. No policy was in place that systematically placed a disproportionate number of students into one class or the other. The reform-oriented course was taught in Spring 2012, and the traditional lecture-based course was taught in Fall 2012. Students in this study were recruited from their respective classes within the first few class meetings of each course. At the third class meeting, a representative from SEU's IRB described the nature of the study to the students and gave them the opportunity to participate in the study.
I (the instructor) was not present during this interaction. Students elected to participate in the study by filling out an "informed consent form" and a "grade release form" (see Appendix A). The IRB representative collected the forms and kept them confidential from me until the final grades had been distributed at the end of the course.

Surveys were used to collect demographic information from students in the sample regarding their age, race, gender, prior mathematical knowledge, number of hours employed each week, and number of credit hours attempted for the current semester (see Appendix B). Table 8 summarizes the demographic data for the sample. The differences between the two groups will be analyzed later when I present the results of the study.

Table 8
Demographics of sample (values reported as Overall, n = 29; Treatment, n = 18; Control, n = 11)

Age, mean (SD): 21.8 (5.6); 22.4 (6.5); 20.7 (3.7)
Prior Mathematical Knowledge, mean (SD): 24.7 (6.0); 26.1 (6.2); 22.6 (5.2)
Hours of Employment per Week, mean (SD): 10.8 (14.1); 11.7 (15.5); 9.5 (12.0)
Attempted Number of Credit Hours, mean (SD): 14.0 (2.6); 14.8 (2.2); 12.8 (2.8)
Race, n (%)
  Black: 11 (37.9); 4 (22.2); 7 (63.6)
  White: 18 (62.1); 14 (77.8); 4 (36.4)
  Hispanic: 0 (0.0); 0 (0.0); 0 (0.0)
  Native American: 0 (0.0); 0 (0.0); 0 (0.0)
  Asian/Pacific Islander: 0 (0.0); 0 (0.0); 0 (0.0)
  Other: 0 (0.0); 0 (0.0); 0 (0.0)
Gender, n (%)
  Male: 9 (31.0); 6 (33.3); 3 (27.3)
  Female: 20 (69.0); 12 (66.7); 8 (72.7)

Two students who agreed to participate were removed from the study. One student was removed from the Control group because she attended only the first two days of class, and one student was removed from the Experimental group because the student missed the first two weeks as well as the last four weeks of the course. Removing these students from the study was appropriate because of their lack of exposure to the treatments in their respective groups.

Instrumentation

Various levels of measures were employed in this study. First, measures were used to address the primary research questions about pass rates, procedural skills, application skills, and change in mathematical self-efficacy. Second, the Reformed Teaching Observation Protocol and an additional test grader were used to establish that the study maintained an acceptable level of validity and reliability. Third, covariates acted as measures to temper differences in class data based on demographic differences between the groups.

Dependent Measures

A lack of consensus appears to exist regarding the appropriate metrics that should be used in evaluating the effectiveness of tertiary developmental mathematics programs, perhaps in part because there is a lack of consensus regarding the ultimate role of these programs. The issue that emerges is whether developmental courses should aim to ensure that students who complete the program attain a high level of mathematics competency, or whether such programs should aim to ensure that larger numbers of students complete the course at a slightly lower, yet acceptable, level of competency (Golfin et al., 2005). Some researchers have examined pass rates (typically a C or higher), average exam scores, or final class GPA to assess the level of mathematical competency gained by students (Phoenix, 1990; Squires, Faulkner, & Hite, 2009). However, these results may come at the expense of higher withdrawal rates (Golfin et al., 2005).
On the other hand, some researchers are more concerned with pass rates than with content mastery because their concern is whether an instructional approach can help larger numbers of underprepared students succeed in basic skills instruction (Golfin et al., 2005). My study addressed both the pass rates of students and their level of content mastery in Math 0800. Content mastery was divided into two parts: procedural skills and problem-solving abilities. Procedural skills included students' abilities to simplify algebraic expressions or solve equations without any type of real-world or situation-based context; for example, one problem might be "Solve the following equation for x: x² + 2x = 1." Problem-solving skills included students' abilities to solve situation-based problems by using a given equation or by devising their own method to solve the problem if no equation was given (see Appendix C). Additionally, I examined the mathematics self-efficacy of students in both classes since it is an important predictor of mathematics problem solving (Pajares & Kranzler, 1995). Lastly, the study collected data regarding students' views of the pedagogical practices used during their respective courses.

Pass rates. This study examined the difference in pass/fail rates between the two groups of students. For each class, the students who initially "enrolled" in the class were defined as those who completed at least one test, and students who withdrew from the course or failed due to excessive absences were grouped with other students who failed but regularly attended class.

Procedural skills. This study examined the effect that the treatments had on students' procedural skills. Math 0800 has five regular tests consisting entirely of short-answer questions, and each test consists primarily of procedural questions. Students' performance on the procedural questions was used to determine if a significant difference in procedural skills existed between the two classes. A rubric was developed for each test to aid in the consistency of the grading. Emphasis in the rubric was placed on students' demonstrating understanding of key concepts, and arithmetical mistakes were not heavily penalized. An example of a procedural problem and the corresponding grading rubric is provided in Figure 1.

Figure 1: A sample procedural problem with corresponding grading rubric

Application skills. This study examined the effect that the treatments had on students' application skills. Each of the five regular tests in Math 0800 contained one to two short-answer application problems. Students' performance on the application questions was used to determine if a significant difference in application skills existed between the two classes. A rubric was also developed to aid in the grading of the application problems. Emphasis was placed on students' demonstrating understanding of key concepts, and arithmetic mistakes were not heavily penalized. See Figure 2 for an example of an application problem and its corresponding rubric.

Figure 2: A sample application problem with corresponding grading rubric

Mathematical self-efficacy. Students' mathematical self-efficacy was defined by their responses on a survey. I used Midgley et al.'s (2000) five-question, Likert-scale Mathematics Self-Efficacy Survey. The Cronbach's alpha for the survey was 0.78. See Appendix B for a copy of the survey.

Perspectives on instruction.
The students in the control and experimental groups were given anonymous surveys at the end of the course to solicit their likes and dislikes regarding the teaching styles employed throughout their respective courses. The experimental group's survey contained three additional questions that were not on the control group's survey. These extra questions solicited students' perspectives regarding the incorporation of three key reform practices into daily instruction: group work, student presentations, and graphing calculators. See Appendix B for a copy of the surveys.

Validity and Reliability

The results of my study were based on two key items: 1) how well I graded my students' tests and 2) how well I maintained fidelity to the intended treatments (traditional lecture instruction vs. reform-oriented instruction). Since objectivity and replicability are significant components of quantitative research (Cohen, Manion, & Morrison, 2007), I arranged for an outside grader to confirm the accuracy of my test grading. Additionally, two colleagues in the field of education performed multiple classroom observations to document the extent to which I maintained fidelity to the intended treatments. The following sections explain these procedures in more detail.

Procedural and application scores. I wanted to establish inter-rater reliability in order to support the validity of the procedural and application scores earned by the students. For the five free-response tests that were given each semester, I met with a colleague in mathematics education to grade a portion of the tests after each set of tests was administered. Specifically, at each meeting I graded six tests, and my colleague graded the same six tests. My colleague and I used the same grading rubric, and any significant differences in test scores were analyzed and resolved. The researcher's and the colleague's procedural scores, as well as their application scores, were intended to maintain a Pearson correlation of at least 0.8.

Reformed Teaching Observation Protocol. The Reformed Teaching Observation Protocol (RTOP) was used to corroborate the claim that the two types of instruction were significantly different from one another. The RTOP is a 25-item observation protocol that was devised by Piburn and associates (2000) through the Arizona Collaborative for Excellence in the Preparation of Teachers (ACEPT) to assess the level of reformed teaching present in mathematics and science lessons (see Appendix D). I arranged two paired lessons (lessons in which I taught the same set of concepts to both classes) that were observed by colleagues in the field of education. The selected lessons were representative of the instruction that each group of students received. The RTOP was chosen because it aligned with reform pedagogy. Additionally, its creators designed the instrument to be easy to administer and appropriate for K-20 mathematics and science classrooms (Piburn & Sawada, 2000).

Covariates

In order to determine if the Control and Experimental groups were comparable, students in both courses were given surveys at the beginning of the semester in which they provided demographic data regarding Age, Race, Gender, Prior Mathematical Knowledge, Number of Attempted Credit Hours, and Number of Hours of Employment (see Appendix B).
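The following minimal sketch illustrates the kind of baseline comparability check this implies, assuming a Python environment with SciPy: a t-test for a continuous demographic variable and a Fisher exact test for a dichotomous one. The gender counts come from Table 8; the age values are hypothetical placeholders, since individual survey responses are not reported here. This is not the analysis code used in the study.

```python
from scipy.stats import ttest_ind, fisher_exact

# Placeholder ages standing in for individual survey responses (hypothetical values,
# 18 treatment students and 11 control students to match the group sizes in Table 8).
treatment_ages = [19, 20, 22, 25, 31, 20, 21, 18, 23, 27, 26, 22, 20, 19, 24, 28, 21, 17]
control_ages = [18, 19, 20, 22, 21, 19, 24, 20, 18, 23, 23]

t_stat, p_age = ttest_ind(treatment_ages, control_ages, equal_var=False)
print(f"Age: t = {t_stat:.2f}, p = {p_age:.3f}")

# Gender counts by group from Table 8 (rows: treatment, control; columns: male, female).
gender_table = [[6, 12], [3, 8]]
odds_ratio, p_gender = fisher_exact(gender_table)
print(f"Gender: Fisher exact p = {p_gender:.3f}")
```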
Establishing prior mathematical knowledge is important in comparative studies (Senk & Thompson, 2003); therefore, prior mathematical knowledge was assessed by giving the final exam from Math 0700 (the previous math course) as a pretest on the first day of class. Similarly, data regarding students' gender, race, age, employment intensity, and course load were collected due to their potential impact on students' mathematical achievement (Hagedorn et al., 1999; Bahr, 2008).

Procedure

This study was quasi-experimental in that students were not randomly assigned to the control and experimental treatments. Students did not know before the first day of class which section would receive the experimental treatment. Soon after students were notified of the study, one student in the experimental course transferred to a different section of the course. Both the experimental and control sections were offered at 8 a.m. on Mondays and Wednesdays. Both sets of students used the same textbook and attended Math Lab once per week in addition to the classroom lectures; during lab time, students used a computer-based format to work on homework and quizzes. Students were required to pass the lab in order to pass the course, and passing the Math Lab was based on attendance, homework, and quiz averages within the lab. Because a few topics were removed from the experimental course (and thus from the traditional course as well), accommodations were made for students in my classes. The amount of material that was covered, as well as the manner in which the students were graded, was the same for the two groups. In other words, both groups covered the same material, took the same tests, and were graded the same way. Additionally, both classes had an attendance policy (unlike the study presented in Ellington [2005]). Due to an administrative policy at SEU that limited SEU staff members to teaching one course per semester, I taught the experimental course in Spring 2012 and the control course in Fall 2012. An advantage of my teaching both classes was that it prevented "teacher effect" from becoming a factor in the study.

Control Group

The Control group received traditional didactic instruction. In other words, I spent roughly 95% of class time explaining the concepts to the students, with the remaining time filled with students asking questions. Instruction proceeded in the following manner: 1) introduce the concept, 2) explain the concept in abstract terms without any realistic context, and 3) show how the methods developed from the abstract presentation can be used to solve application problems containing a real-world context. According to Schroeder and Lester (1989), this approach could be called teaching for problem solving, since students are first shown how to perform the procedural skills and then shown how to use those skills to solve both routine and non-routine problems. Technology was not used to reinforce or explain mathematical concepts. (Currently, the use of graphing calculators is discouraged, and usually prohibited, in Math 0800 courses at SEU.) Students sat in individual desks and did not work in groups during class. When students asked whether something was correct or incorrect, I answered their questions to their satisfaction, but I did not probe their understanding to help them figure out the answer to their own question.
Experimental Group Treatment

The Experimental group received instruction slanted toward reform pedagogy as illustrated by the NCTM (2000, 2009), AMATYC (1995, 2006), and CUPM (1998, 2011) documents. Thus, I provided students opportunities to understand mathematical concepts on their own or with the help of their classmates through group work. I provided these opportunities by carefully developing the mathematical concepts through real-life applications or through mathematical scenarios that encouraged exploration, so that students came to understand the mathematical principles in question before the concepts were formally presented to the class. Schroeder and Lester (1989) referred to this paradigm as teaching via problem solving: the process of introducing reasonable problem situations that embody mathematical concepts and then developing mathematical techniques in response to those problems. Additionally, I gave students opportunities to explain mathematical principles to the rest of the class before I formally explained the concepts. Students sat at tables that fostered interaction and discussed their findings to the questions posed to them. After students had time to explore the problems and discuss their findings with their classmates, they were asked to present their work to the rest of the class. As students presented, the other students were expected to critique the presented information to determine its accuracy; thus, students were encouraged to engage in respectful, constructive criticism in an intellectually safe environment. During group work, when a student asked me if a particular approach or answer was correct, my default response was "What do you think?", "How could you check your answer?", or "What do your classmates think?" I avoided directly answering the question and instead guided the student in a direction that allowed him or her to determine whether the answer or approach was correct. In other words, if the approach or answer was incorrect, I guided the student in a direction that illustrated that something was amiss. As described by Pines and West (1986), once students reflect upon the compatibility of their conceptions and experiences, they are much more likely to accept formal theories as their own. If the approach was correct, then the students reinforced their understanding as they discussed their findings with their classmates, or they devised a way to check the reasonableness of their solutions. Student presentations to the class were standard practice. Having students present to the class reinforced what the presenters had learned, helped other students understand the mathematical concept, and helped students better understand the material by justifying the reasoning behind their solutions to their classmates. Students used the document camera to present their solutions, since the document camera saves a significant amount of class time by removing the need for students to recreate their solutions as they present. Graded homework differed between the experimental and control sections in terms of the type of homework assigned; however, the amount of time necessary to complete the homework assignments was roughly the same for both classes. Both sections were given suggested problems within the text that would reinforce procedural skills.
However, the control group was assigned graded homework based on exercises within the book that primarily emphasized procedural skills. The experimental group was assigned graded homework that addressed conceptual understanding. These conceptually oriented problems required students to relate the mathematics to realistic applications, produce several forms of justification including tables and graphs, explore concepts that commonly act as stumbling blocks, and clearly articulate solutions and the meaning of those solutions. In the experimental section, students were encouraged to use tables, graphs, and algebraic approaches to understand mathematical concepts. These students also used graphing calculators to understand solutions to linear, quadratic, and radical equations. Thus, students would not be completely reliant on algebraic techniques to solve these types of problems. Rather, they would be able to quickly construct a picture to test the reasonableness of their answers, or they could use the graph to prompt them in the right direction. Graphing calculators were supplied to students during class, but the students were responsible for obtaining graphing calculators for use outside the classroom. To help students obtain access to graphing calculators, students were encouraged to use the graphing calculators available in a nearby tutoring facility, and they were shown how to access online graphing calculators. Table 9 summarizes the key differences between the traditional course and the reform-oriented course. Additionally, see Appendix E to view two sets of paired lesson plans that demonstrate the difference between traditional and reform-oriented instruction.

Table 9
Summary of differences between traditional and reform-oriented instruction

Traditional: The teacher is the sole dispenser of knowledge and serves as a "sage on the stage."
Reform-oriented: The students regularly present their knowledge and findings to the class; the teacher serves as a "guide on the side" (NCTM, 2000; White-Clark, DiCarlo, & Gilchriest, 2008).

Traditional: Direct lecture is extensively used.
Reform-oriented: Direct lecture is kept to a minimum (Boylan, Bonham, & Tafari, 2005).

Traditional: Students are passive learners.
Reform-oriented: Students are active learners (MCCEO, 2006; AMATYC, 2006).

Traditional: Classroom discourse consists primarily of teacher-to-student discourse.
Reform-oriented: Classroom discourse consists significantly of student-to-student and student-to-teacher discourse (NCTM, 1991).

Traditional: Socratic questioning is not employed.
Reform-oriented: Socratic questioning is significantly employed (Vosniadou & Brewer, 1987).

Traditional: Student exploration and experimentation are not encouraged before formal theorems are presented.
Reform-oriented: Student exploration and experimentation are encouraged before formal theorems are presented (AMATYC, 2006; Thompson, 2009).

Traditional: The teacher values the most efficient means of solving a problem.
Reform-oriented: The teacher values multiple problem-solving approaches; efficiency is a secondary concern (AMATYC, 2006).

Traditional: Algebraic techniques are presented as the primary means of solving problems.
Reform-oriented: Pictures, tables, and graphs are emphasized in addition to algebraic techniques in order to help students improve conceptual understanding and solve problems (NCTM, 2000; AMATYC, 2006).

Traditional: Students master algebraic techniques before learning how to apply such techniques to story problems.
Reform-oriented: Story problems act as vehicles through which to introduce mathematical concepts (Schroeder & Lester, 1989).

Traditional: Primarily procedural/routine problems are emphasized during instruction.
Reform-oriented: Both conceptual/non-routine problems and procedural/routine problems are emphasized during instruction (Robinson & Robinson, 1998; Thompson, 2001; Webb, 2003).

Traditional: Students work on problems in isolation.
Reform-oriented: Students work on problems collaboratively in small groups, engage in small-group and whole-class discussions, and present solutions to the class (NCTM, 2000; Boylan, Bonham, & Tafari, 2005; Golfin et al., 2005; AMATYC, 2006; Thompson, 2009).

Traditional: Limited use of technology.
Reform-oriented: Extensive use of technology through the graphing and table functions of graphing calculators (Golfin et al., 2005; AMATYC, 2006; Webb, 2003; Thompson, 2009).

Traditional: Homework emphasizes primarily procedural skills.
Reform-oriented: Homework emphasizes both conceptual understanding and procedural skills (NCTM, 2000).

Traditional: Assessment is mostly summative through homework and tests.
Reform-oriented: Assessment is strongly formative through class and group discussions, in addition to summative assessments of homework and tests (NCTM, 2000).

Data Analysis

The following paragraphs describe the sequence in which the collected data in this study were analyzed. In short, the researcher verified that the RTOP scores were significantly different, determined which covariates were necessary to include in the statistical analyses, performed the statistical analyses, and analyzed the results from the student surveys.

Establishing Validity

The RTOP was incorporated into the study to establish that two distinct treatments had indeed taken place. With respect to the difference in RTOP scores required in order to state that the types of instruction were significantly different, MacIsaac and Falconer (2002) distinguished between high school and university RTOP scores. Using physics lessons as a backdrop, the researchers stated that a traditional university lecture that is passive in nature would produce an RTOP score less than 20, whereas a traditional high school lecture with student questions would produce an RTOP score less than 45. When describing observations of high school mathematics and biology teachers, Judson and Lawson (2007) similarly categorized RTOP scores of less than 30 as low and an RTOP score of 43 as moderate. MacIsaac and Falconer (2002) emphasized that the preceding scores approximate the amount of reform instruction implemented in a classroom but that any RTOP score greater than 50 indicates a considerable presence of reformed teaching in a lesson. Lawson et al. (2002) judged the success of their program by comparing the RTOP scores of ACEPT-influenced teachers to those of non-ACEPT teachers. When examining the mean RTOP scores for third-year teachers, the ACEPT teachers earned significantly higher RTOP scores than non-ACEPT teachers (62 and 45, respectively, p < 0.05). Thus an average RTOP score for the control class that is close to 20 and an average RTOP score for the experimental class that is at least 40 would align with the classifications presented by MacIsaac and Falconer (2002) and Judson and Lawson (2007); additionally, the difference in RTOP scores between the two classes would meet the 17-point difference presented by Lawson et al. (2002).
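As a concrete restatement of the criteria just described, the short sketch below checks a pair of observed RTOP averages against the planned benchmarks (a control average close to 20, an experimental average of at least 40, and a gap of at least 17 points). It is only an illustration of the decision rule; the function name, the tolerance used to interpret "close to 20," and the example scores are assumptions, not part of the study's instruments.

# A small helper restating the RTOP benchmarks described above.
# The function name, the +/-10 tolerance on "close to 20," and the example
# averages are hypothetical, not taken from the study.
def rtop_criteria_met(control_avg, experimental_avg,
                      control_target=20.0, experimental_min=40.0, min_gap=17.0):
    """Return True if the observed class averages align with the planned benchmarks."""
    control_is_traditional = control_avg <= control_target + 10  # loose reading of "close to 20"
    experimental_is_reformed = experimental_avg >= experimental_min
    gap_is_large_enough = (experimental_avg - control_avg) >= min_gap
    return control_is_traditional and experimental_is_reformed and gap_is_large_enough

# Example averages in the general neighborhood of those later reported in Table 10
print(rtop_criteria_met(control_avg=25.0, experimental_avg=88.0))  # True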
Selecting Covariates

For each of the statistical analyses that were performed, the researcher determined which variables (Age, Race, Gender, Prior Mathematical Knowledge, Number of Attempted Credit Hours, and Number of Hours of Employment) needed to be included as covariates. Age, Prior Mathematical Knowledge, Number of Attempted Credit Hours, and Number of Hours of Employment were treated as continuous variables; therefore, the difference in averages between the Control and Experimental groups with respect to each of these variables was analyzed using a t-test. Race and Gender were considered categorical variables; therefore, the differences in percentages with respect to these variables were analyzed using a Fisher's Exact Test. Data regarding these variables were obtained from students through surveys administered at the beginning of the course. In this study, there was no reason to expect that the two classes would differ significantly with respect to any of the aforementioned variables. A common rule of thumb when controlling for variables in statistical analysis is to allow one variable into the study per 10 to 15 students (Osborne & Costello, 2004). Since the sample consisted of only 29 students, and since the study had already introduced the variable Treatment, I only introduced variables that were significantly different between the two groups and which also significantly impacted the research question's dependent variable.

Determining whether a variable needed to be included as a covariate involved two steps. First, the appropriate statistical test was used to determine if the Control and Experimental groups significantly differed with respect to a variable. The differences between the two groups were analyzed using a t-test for continuous variables (Age, Credit Hours, Work Hours, and Prior Mathematical Knowledge) and a Fisher's Exact Test for categorical variables (Race and Gender). If the two groups did not differ significantly with respect to the variable, then the variable was not included as a covariate. However, if the two groups did differ with respect to that variable, then step two was invoked: treat the variable as an independent variable and determine if it alone had an effect on the dependent variable. If the variable had a significant effect on the dependent variable, then it would be included as a covariate in the final statistical analysis.
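As a rough illustration of the two-step screening just described, the sketch below runs the group-comparison test for one continuous variable and one categorical variable and then checks whether a flagged variable predicts the dependent variable. All values are hypothetical placeholders (the categorical counts happen to mirror the race distribution reported later in Table 13); the scipy calls illustrate the procedure rather than the software actually used in the study.

# A minimal sketch of the two-step covariate screening described above.
# All values below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy.stats import ttest_ind, fisher_exact, linregress

ALPHA = 0.05

# Step 1a: a continuous variable (e.g., attempted credit hours) -- independent-samples t-test
credit_hours_control = [12, 15, 9, 12, 16, 12, 13, 12, 15, 12, 13]
credit_hours_experimental = [15, 16, 12, 15, 17, 13, 16, 15, 14, 16, 15, 13, 17, 12, 16, 15, 14, 16]
_, p_continuous = ttest_ind(credit_hours_control, credit_hours_experimental)

# Step 1b: a categorical variable -- Fisher's Exact Test on a 2x2 contingency table
contingency = [[7, 4],    # control:       category A, category B
               [4, 14]]   # experimental:  category A, category B
_, p_categorical = fisher_exact(contingency)

# Step 2: only if the groups differ on a variable, test whether that variable alone
# predicts the dependent variable (here, a placeholder mean procedural score per student).
include_as_covariate = False
if p_continuous < ALPHA:
    all_credit_hours = np.array(credit_hours_control + credit_hours_experimental)
    mean_procedural = 60 + 1.5 * all_credit_hours + np.random.default_rng(0).normal(0, 8, all_credit_hours.size)
    _, _, _, p_effect, _ = linregress(all_credit_hours, mean_procedural)
    include_as_covariate = p_effect < ALPHA

print(f"group difference, continuous variable:  p = {p_continuous:.3f}")
print(f"group difference, categorical variable: p = {p_categorical:.3f}")
print(f"include the continuous variable as a covariate: {include_as_covariate}")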
Analysis of Effects

The differences in students' Procedural scores and Application scores were each analyzed using a 2 (experimental group vs. control group) x 5 (five regular tests) Repeated Measures ANOVA. The statistical analyses contained any covariates on which the two groups significantly differed and which also significantly impacted the dependent variable. Similar to a t-test, a Repeated Measures ANOVA is a statistical procedure used to test the null hypothesis that the means of variables do not differ. However, in a Repeated Measures ANOVA, each participant in the study is tested multiple times and therefore contributes multiple values under the same variable. A Repeated Measures ANOVA was appropriate for analyzing differences in procedural and application scores because each student in the study was given multiple tests (and therefore contributed multiple values) that evaluated the student's skill set with respect to a specific dependent variable (Huck, 2004).

A Repeated Measures ANOVA produces two types of results that are relevant to this study: 1) between-groups results and 2) within-groups interaction results. For example, in addressing the first research question, students in the Control group and the Experimental group were given a series of five tests over the course of the semester that evaluated their ability to solve procedural problems. A statistically significant "between-groups" result would imply that the average procedural ability of the Control group was significantly different from the average procedural ability of the Experimental group. In contrast, a statistically significant "within-groups interaction" result (denoted by the phrase "Procedural * Treatment") would imply that each group's procedural scores changed differently over time; that is, the change in the Control students' average procedural scores throughout the study was significantly different from the way that the Experimental students' average procedural scores changed throughout the study. A graph with intersecting lines can be an indicator that a significant interaction effect exists in the data (Huck, 2004).

The difference in Pass Rates between the two groups was analyzed using a Fisher's Exact Test because of the test's usefulness in analyzing the difference in percentages between two groups; additionally, a Fisher's Exact Test was chosen due to its ability to accommodate small sample sizes (Huck, 2004). The results of the analysis were controlled for any necessary covariates. The difference in students' change in mathematical self-efficacy was analyzed using a 2 (Experimental group vs. Control group) x 2 (Pre/Post Test) Repeated Measures ANOVA. In contrast to the 2 x 5 Repeated Measures ANOVA used to analyze the difference in procedural scores, this Repeated Measures ANOVA was employed simply to determine if a within-subjects interaction took place. In other words, the test was used to determine if one group changed significantly more in mathematical self-efficacy than the other group across the period of instruction.

The size of the treatment effects was determined by calculating Cohen's d and partial eta squared. Cohen's d provides the difference in means between two groups divided by the standard deviation of the sample. For example, a Cohen's d of 0.3 implies that the mean difference between the two groups was 0.3 standard deviations. A Cohen's d less than 0.2 implies a small treatment effect, a value between 0.2 and 0.8 implies a medium treatment effect, and a value greater than 0.8 implies a large effect. Similarly, a partial eta squared value directly indicates the percentage of the variability in the data that is due to the differences between treatments. For example, a partial eta squared of 0.425 implies that 42.5% of the variability in the data is due to the differences between treatments (Gravetter & Wallnau, 2004).
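The two effect-size measures described above are straightforward to compute once the group values and the ANOVA sums of squares are available. The sketch below shows both calculations; the pooled-standard-deviation form of Cohen's d is one common reading of "the standard deviation of the sample," and the score lists and sums of squares are hypothetical placeholders rather than values from the study.

# A minimal sketch of the two effect-size calculations described above.
# The inputs are hypothetical placeholders, not values from the study.
import numpy as np

def cohens_d(group_a, group_b):
    """Difference in means divided by a pooled standard deviation."""
    a, b = np.asarray(group_a, dtype=float), np.asarray(group_b, dtype=float)
    pooled_var = ((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1)) / (a.size + b.size - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

def partial_eta_squared(ss_effect, ss_error):
    """Share of variability attributable to the effect, relative to effect plus error."""
    return ss_effect / (ss_effect + ss_error)

experimental = [72, 65, 80, 77, 69, 75]   # placeholder test scores (%)
control = [68, 61, 74, 70, 66, 72]

print(f"Cohen's d = {cohens_d(experimental, control):.2f}")                # means differ by this many pooled SDs
print(f"partial eta squared = {partial_eta_squared(120.0, 2400.0):.3f}")   # 0.048 -> 4.8% of the variability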
Qualitative Analysis

I coded students' comments on the end-of-course surveys according to a coding strategy advocated by Miles and Huberman (1994). Prior to beginning the study, I created a "start list" of predefined codes based on the survey questions and students' possible responses to those questions. After administering the survey, I used a representational approach to code students' answers. According to Sapsford (1999), researchers who use this approach represent the surface content fairly by using key words to identify core concepts. Throughout the coding process, some of the predetermined codes grew unwieldy and seemed ill-fitting. I therefore reassessed the strength of my original codes and created subcodes in order to produce a better fit for the collected data. Additional codes were also created from themes that emerged from the collected data. The coding process terminated once all of the students' statements could be readily classified according to the existing set of codes (Miles & Huberman, 1994).

Summary

A quasi-experimental design was used to test the effectiveness of teaching a remedial mathematics course in a reform-oriented manner as opposed to teaching the same remedial mathematics course using didactic lecture. The effectiveness of each treatment was based on students' course pass rates, procedural skills, application skills, and mathematical self-efficacy. Additionally, students provided their perspectives regarding their respective treatments through an anonymous survey issued at the end of the course. The results of the statistical analyses were controlled for demographic variables on which the two groups significantly differed, and the validity of the results was supported by the use of an outside grader and through colleague classroom observations. Additionally, the results of the student surveys provided qualitative data and were categorized according to students' perspectives on the teaching techniques employed during their respective courses. In the following chapter, I will present the findings of this study.

CHAPTER 4: RESULTS

The first section in this chapter will describe key events that occurred during the study. Second will be the results of the classroom observations through the lens of the Reformed Teaching Observation Protocol (RTOP), followed by the results of the inter-rater reliability for the grading of tests. Lastly, the results for the five research questions will be presented based on the data analysis methods presented in the prior chapter.

Summary of Events

When the plan for this study was first developed, the policy at Southeast University allowed me to teach multiple courses per semester. However, the policy at the university changed during the planning of the study. I spoke with the head of the Mathematics Department and asked if a waiver could be submitted that would allow the researcher to teach two classes concurrently for the Spring 2012 semester. The head of the Mathematics Department agreed to file a waiver request since it would assist the researcher in completing his dissertation project. However, the upper administration at Southeast University denied the waiver request; thus, I was required to teach the Experimental class in Spring 2012 and the Control class in Fall 2012. From the early planning stages of the study, the head of the Mathematics Department fully supported the development and execution of the study. Even though some students from the multiple pilot studies complained about the methods in the reform-oriented classes, the head of the Mathematics Department told me not to worry about the complaints and that I had the department head's full support. With administrative support in place, I began the study by teaching the Experimental class in Spring 2012. Several weeks into the study, the head of the Mathematics Department approached me and stated that the Spring 2012 semester would be the last time that I could teach in a reform-oriented manner.
When I asked about the withdrawal of support, the department head stated that multiple students had complained about the teaching method and that even a parent (who was an instructor and researcher at another university) had contacted the assistant dean of the School of Sciences at Southeast University. The administration decided to accommodate the complaining parent by moving his child to another class. The assistant dean of Sciences later spoke with me and stated that she defended my actions. However, the department head stated that the accumulation of complaints from the current semester as well as previous semesters caused him to withdraw his support of reform-based teaching. Later conversations with the department head revealed that he would allow me to teach in a reform-based manner, but such teaching could not take place in the context of a research study. After this initial series of complaints, I finished teaching the Experimental course without any further incidents. In contrast, the Control course was taught in the Fall 2012 semester without incident, and I was not informed of any student complaints regarding my style of traditional teaching.

Integrity of Treatment

The first step in my analysis was to determine the degree to which the appropriate teaching methods had been delivered to the two classes included in the study. The Reformed Teaching Observation Protocol (RTOP) was used by two colleagues in the field of education in order to establish that the Control class had received traditional mathematics instruction and that the Experimental class had received reform-oriented instruction. Two paired lessons were observed in each course: "Completing the Square" and "Shifting of Graphs". The lesson on "Completing the Square" was taught early in the course, and the lesson on "Shifting of Graphs" was taught midway through the course. Both colleagues were present for each of the observations. Recall that the RTOP produces scores from 0 through 100, where a lesson receiving a score higher than 50 is considered to have significant incorporation of reform-oriented pedagogy. The two lessons observed in the Control group received firmly traditional scores, and the two lessons in the Experimental group received firmly reform-oriented scores. Table 10 presents the scores for the four classroom observations from each observer. Thus, the Experimental section did appear to receive reform-oriented instruction while the Control section did not.

Table 10
Differences in RTOP scores between control and experimental sections

                    Completing the Square Lesson        Shifting of Graphs Lesson
Section             Rater 1   Rater 2   Average         Rater 1   Rater 2   Average
Control               22        23       22.5             22        35       28.8
Experimental*         92        80       86.0             91        91       91.0
Note. A score greater than 50 indicates significant use of reform pedagogy.

Inter-rater Reliability of Tests

The next step in my analysis was to examine the inter-rater reliability of the test scores assigned to the students in order to support the validity of the scores they earned. The relationship between the researcher's and the colleague's graded tests for Procedural Scores across the five tests ranged from r = .825 to r = 1.000, with a median value of r = .996. The relationship for Application Scores across the five tests ranged from r = .948 to r = 1.000, with a median value of r = 1.000. See Table 11 for a summary of the correlations between my scores and the outside grader's scores. This table indicates that the outside grader and I maintained the desired Pearson correlation of at least r = .8 throughout the study. Thus, the scores in the study were valid.

Table 11
Summary of inter-rater reliability Pearson correlation values

                      Control Group                              Experimental Group
                      Test 1  Test 2  Test 3  Test 4  Test 5     Test 1  Test 2  Test 3  Test 4  Test 5
Procedural Scores      .995    .993    .985    .999   1.000       .825    .998    .997    .988    .998
Application Scores     .979    .948   1.000   1.000   1.000       .974   1.000   1.000   1.000   1.000
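The inter-rater check summarized in Table 11 amounts to computing a Pearson correlation between the two graders' scores on each shared set of tests and confirming that it stays at or above the 0.8 criterion. The sketch below shows that calculation; the score lists are hypothetical placeholders, not scores from the study.

# A minimal sketch of the inter-rater reliability check summarized in Table 11.
# The scores below are hypothetical placeholders, not data from the study.
from scipy.stats import pearsonr

CRITERION = 0.8  # minimum acceptable Pearson correlation between graders

researcher_scores = [82, 74, 91, 65, 88, 70]      # percent of rubric points on six shared tests
outside_grader_scores = [80, 76, 90, 62, 87, 73]

r, _ = pearsonr(researcher_scores, outside_grader_scores)
print(f"Pearson r = {r:.3f}")

if r < CRITERION:
    print("Below the 0.8 criterion; discrepant tests would be re-examined and resolved.")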
Quantitative Results

The Control group and the Experimental group were analyzed in four different ways: 1) student performance on procedural problems, 2) student performance on application problems, 3) students' pass rates, and 4) students' change in mathematics self-efficacy. The first section provides a description of how I determined which variables should function as covariates. Following that section are the results of each of the above four analyses.

Selecting Covariates

I analyzed the degree to which the Experimental section was comparable to the Control section. Thus, data were collected to determine if a significant difference existed between the two sections in terms of Race, Age, Gender, Prior Mathematical Knowledge, Number of Hours Employed, and Number of Attempted Credit Hours. I used a t-test to analyze the four continuous variables (Prior Mathematical Knowledge, Hours of Employment, Credit Hours, and Age). The Experimental and Control groups differed significantly only with respect to the number of Credit Hours that students were taking (p = 0.043). See Table 12 for the results of the analyses. Thus, Credit Hours was identified as a potential covariate.

Table 12
Differences in continuous variables between groups

Variable                        Treatment      n     Mean    Std Dev    (t, p-value)
Prior Mathematical Knowledge    Control       11    22.64      5.2      (-1.49, .148)
                                Experimental  16    26.06      6.2
                                Total         27
Hours of Employment             Control       11     9.50     12.0      (-.396, .696)
                                Experimental  18    11.67     15.5
                                Total         29
Credit Hours                    Control       11    12.82      2.8      (-2.125, .043)
                                Experimental  18    14.78      2.2
                                Total         29
Age                             Control       11    20.73      3.7      (-.801, .430)
                                Experimental  18    22.44      6.5
                                Total         29

For the two dichotomous variables, Gender and Race, I used a Fisher's Exact Test and determined that only Race was significantly different between the Control and Experimental groups (p = 0.048). See Table 13 for the results of the analyses. Thus, Race was also identified as a potential covariate.

Table 13
Differences in dichotomous variables between groups

Variable   Treatment      n    Male   Female   p-value
Gender     Control       11      3       8      1.00
           Experimental  18      6      12
           Total         29
Race       Control       11      7       4      .048
           Experimental  18      4      14
           Total         29
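For reference, the sketch below runs the same kind of Fisher's Exact Test on the counts reported in Table 13. The scipy call is an illustration of the procedure rather than the software actually used in the study.

# A sketch of the Fisher's Exact Tests on the dichotomous variables, using the
# counts reported in Table 13 (Gender: 3/8 vs. 6/12; Race: 7/4 vs. 4/14).
from scipy.stats import fisher_exact

gender_table = [[3, 8],    # Control: male, female
                [6, 12]]   # Experimental: male, female
race_table = [[7, 4],      # Control: counts in the two race categories
              [4, 14]]     # Experimental: counts in the two race categories

_, p_gender = fisher_exact(gender_table)
_, p_race = fisher_exact(race_table)

# To rounding, these should match the reported values of p = 1.00 and p = .048.
print(f"Gender: p = {p_gender:.3f}")
print(f"Race:   p = {p_race:.3f}")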
Research Question 1: Procedural Skills

My first research question examined whether the two groups demonstrated similar procedural skills throughout the five tests within the course. Data were gathered from students who completed all five tests: ten students from the control group and seventeen students from the experimental group. Table 14 provides a summary of students' procedural scores. According to the table, the average difference in Procedural scores between the two groups was 2.0 points in favor of the Control group. The Control group earned higher marks on the first, second, and fourth tests, and the Experimental group earned higher marks on the third and fifth tests.

Table 14
Summary of procedural scores for control and experimental groups

                            Test 1:     Test 2: Rational   Test 3:     Test 4:     Test 5: Quadratic   Average
                            Factoring   Expressions        Functions   Radicals    Equations           Difference
Control        Mean           80.4        77.6               74.6        72.7        62.2
               Std Dev        10.1        14.0                9.6        17.2        16.2
Experimental   Mean           76.7        67.6               75.0        65.8        72.2
               Std Dev        14.5        17.4               11.4        15.3        17.4
Difference in Means (E-C)     -3.7       -10.0               +0.4        -6.9       +10.0                -2.0
Cohen's d                    -0.30       -0.63                0.04       -0.42        0.60
Note: The values listed represent percentages of points earned.

A 2 (Treatment) x 5 (Procedural Test Scores) Repeated Measures ANOVA was used because each student in the two groups took a total of five tests. The independent variable was the Treatment, and the dependent variable was the Procedural Test Scores. Additionally, Mauchly's Test of Sphericity indicated that the assumption of sphericity for the analysis was not violated (Mauchly's W = .906, df = 9, p = .988). Race was included as a covariate because of the statistically significant interaction between Race and students' Procedural Scores (Procedural Scores * Race F = 4.625, p = .002). Although Race was not a focus in this study, Race was used as a covariate to minimize the differences between the Control and Experimental groups and therefore improve the accuracy of the statistical model. The other potential covariate, Credit Hours, was not included as a covariate in this analysis because it had no significant effect on students' Procedural Scores (between-subjects effect F = 1.271, p = .316; Procedural Scores * Credit Hours F = .924, p = .580).

The two groups did not differ significantly in their overall procedural scores (test of between-groups "Treatment" effect: F = .365, p = .551, Power = .089). Additionally, the treatment did not have a significant effect on the students' procedural scores over time (test of within-groups interaction "Procedural Scores * Treatment": F = 1.285, p = .281, Power = .388). Refer to Table 15 for a summary of the analysis.

Table 15
Statistical analysis of procedural scores between groups

                                  df   Mean Square     F      Sig    Partial Eta Squared   Observed Power
Between Groups
  Race                             1     149.009      .206   .654          .009                 .072
  Treatment                        1     263.406      .365   .551          .015                 .089
  Error                           24     721.651
Within Groups
  Procedural Tests                 4                 4.606   .002          .161                 .937
  Procedural Tests * Treatment     4                 1.285   .281          .051                 .388
  Procedural Tests * Race          4                 1.975   .104          .076                 .575

When Race was included as a covariate in the analysis of procedural scores, the average difference between the two groups increased to 3.4 points in favor of the Control group. As with the unadjusted scores, the Control group scored higher on the first, second, and fourth tests, and the Experimental group scored higher on the third and fifth tests. The Cohen's d values indicate a large treatment effect on the first, second, and fourth tests (d > 0.8) and a medium effect for the third and fifth tests (0.2 < d < 0.8). Table 16 provides a summary of students' procedural scores after being adjusted for Race.
Table 16
Summary of procedural scores adjusted for race

                            Test 1:     Test 2: Rational   Test 3:     Test 4:     Test 5: Quadratic   Average
                            Factoring   Expressions        Functions   Radicals    Equations           Difference
Control        Mean           81.7        76.3               74.3        73.1        66.4
               Std Error       4.6         5.8                3.9         5.7         5.8
Experimental   Mean           76.0        68.4               75.2        65.5        69.7
               Std Error       3.4         4.3                2.8         4.2         4.2
Difference in Means (E-C)     -5.7        -7.9               +0.9        -7.6        +3.3                -3.4
Cohen's d                    -1.41       -1.55                0.27       -1.52        0.65
Note: The values listed represent percentages of points earned.

As illustrated in Figure 3, the two groups experienced different trends in performance with respect to procedural scores. The Control group started with an average adjusted procedural score of 81.7% on the first test and steadily declined to an average adjusted procedural score of 66.4% on the fifth test, a decrease of 15.3 percentage points over the semester. On the other hand, the Experimental group experienced increases and decreases across the five tests but did not experience as much of a decline (76.0% to 69.7%) over the semester.

Figure 3: Mean adjusted procedural scores for control and experimental groups

Although not part of my original design, I used the department's comprehensive final exam to further examine whether the two groups demonstrated a significant difference in procedural ability. The departmental final exam was multiple-choice and consisted almost entirely of procedural problems. Data were gathered from students who completed the final exam: ten students from the control group and fifteen students from the experimental group. I performed an ANOVA and found that Race should not be included as a covariate since it did not have a significant effect on final exam scores (F = 0.007, p = 0.932). I also used multiple regression to determine that Credit Hours should not be included as a covariate because it did not have a significant effect on final exam scores (t = -0.413, p = 0.683). Using a t-test, I found that the difference in final exam scores between the two groups was not statistically significant (t = -0.223, p = 0.825). The results in Table 17 show that the procedural ability of the two groups was comparable.

Table 17
Comparison of final exam scores between control and experimental groups

                    n    Mean Final Exam Score    Standard Deviation
Control            10          71.2%                    12.9
Experimental       15          72.4%                    13.4
Difference (E-C)               +1.2%
Cohen's d                       0.0912
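Because only the group sizes, means, and standard deviations in Table 17 are needed for this comparison, the t-test can be approximated directly from those summary statistics. The sketch below uses scipy's summary-statistics form of the independent-samples t-test; it is an illustration rather than the software actually used, and rounding in the reported means and standard deviations may produce small differences from the reported t = -0.223, p = 0.825.

# A sketch of the final-exam comparison using the summary statistics in Table 17.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=71.2, std1=12.9, nobs1=10,   # Control group
    mean2=72.4, std2=13.4, nobs2=15,   # Experimental group
    equal_var=True,                    # pooled-variance (Student's) t-test
)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # expected to be close to t = -0.223, p = 0.825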
Research Question 2: Application Skills

My second research question analyzed whether the two groups demonstrated similar application skills throughout the five tests within the course. Data were gathered from students who completed all five tests: ten students from the control group and seventeen students from the experimental group. Table 18 provides a summary of students' application scores throughout the semester. According to the table, the average difference in application scores between the two groups was 13.7 points in favor of the Experimental group.

Table 18
Summary of application scores for control and experimental groups

                            Test 1:     Test 2: Rational   Test 3:     Test 4:     Test 5: Quadratic   Average
                            Factoring   Expressions        Functions   Radicals    Equations           Difference
Control        Mean           62.0        67.5               68.1        82.0        41.2
               Std Dev        27.0        23.7               33.1        21.5        40.0
Experimental   Mean           72.4        74.5               90.8        87.1        64.7
               Std Dev        18.6        30.7               20.0        16.5        39.6
Difference (E-C)             +10.4        +7.0              +22.7        +5.1       +23.5               +13.7
Cohen's d                     0.44        0.25               0.83        0.27        0.59
Note. The values listed represent percentages of points earned.

As in the first research question, a 2 (Treatment) x 5 (Application Test Scores) Repeated Measures ANOVA was used because each student in both groups took a total of five tests. The independent variable was the Treatment, and the dependent variable was the Application Test Scores. Additionally, Mauchly's Test of Sphericity indicated that the assumption of sphericity for the analysis was not violated (Mauchly's W = .603, df = 9, p = .253). Race was included as a covariate because of its statistically significant between-groups effect (F = 4.517, p = .044). In contrast, the data showed that the Number of Credit Hours Attempted by students had no significant effect on students' application scores (between-subjects effect F = 1.122, p = .390; Application * Credit Hours F = .577, p = .948); therefore, the Number of Credit Hours Attempted was not included as a covariate.

The Experimental group outperformed the Control group on the application problems of every test. However, when Race was entered as a covariate, the overall difference between the two groups' application scores was not significant (F = 1.051, p = .315, Power = .166). Further, the treatment did not have a significant effect on the students' application scores over time (within-groups interaction "Application Scores * Treatment" F = .297, p = .879, Power = .114). See Table 19 for a summary of the analysis.

Table 19
Statistical analysis for the difference in application scores

                                  df   Mean Square     F      Sig    Partial Eta Squared   Observed Power
Between Groups
  Race                             1    2439.224     1.628   .214          .064                 .232
  Treatment                        1    1575.674     1.051   .315          .042                 .166
  Error                           24    1498.514
Within Groups
  Application Tests                4    1397.822     2.364   .058          .090                 .664
  Application Tests * Treatment    4     175.679      .297   .879          .012                 .114
  Application Tests * Race         4     322.167      .545   .703          .022                 .177
  Error (application tests)       96     591.385

Table 20 provides a summary of students' application scores after being adjusted for Race. When Race was taken into account, the average difference between the two groups decreased to 8.3 points in favor of the Experimental group. The Experimental group earned higher marks on all of the tests and scored substantially higher on the fifth test. Although not statistically significant, the Cohen's d values indicate either medium or large treatment effects for all five tests.

Table 20
Summary of application scores adjusted for race

                            Test 1:     Test 2: Rational   Test 3:     Test 4:     Test 5: Quadratic   Average
                            Factoring   Expressions        Functions   Radicals    Equations           Difference
Control        Mean           65.0        68.4               76.2        83.8        44.5
               Std Error       7.8        10.2                8.4         6.6        14.2
Experimental   Mean           70.6        74.0               86.1        86.0        62.8
               Std Error       5.7         7.5                6.2         4.8        10.4
Difference (E-C)              +5.6        +5.6               +9.9        +2.2       +18.3                +8.3
Cohen's d                     0.82        0.63               1.34        0.38        1.47
Note. The values listed represent percentages of points earned.

As illustrated by the graph in Figure 4, the two groups experienced similar trends in performance. The Experimental group started with an average adjusted application score that was 5.6 points higher than the Control group's. The two groups roughly maintained that difference throughout the course until the fifth test, on which the gap between the two groups grew to 18.3 points. Although the Experimental group consistently scored higher than the Control group throughout the course, the overall difference in application scores between the two groups was not significant.

Figure 4: Mean adjusted application scores for control and experimental groups
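The 2 x 5 repeated measures analyses above were run in standard statistical software. One way to approximate the same between-groups and interaction tests in Python is a long-format mixed-effects model with a random intercept per student, as sketched below; this is an approximation of the ANOVA approach rather than the exact procedure used in the study, and the data frame shown is a hypothetical placeholder.

# A rough long-format approximation of the 2 x 5 repeated measures analysis,
# using a mixed-effects model with a random intercept per student.
# The data below are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for i in range(27):                                   # 27 students with complete test data
    treatment = "experimental" if i < 17 else "control"
    race = rng.choice(["A", "B"])                     # placeholder two-level covariate
    for test in range(1, 6):                          # five tests per student
        rows.append({
            "student": f"s{i}",
            "treatment": treatment,
            "race": race,
            "test": test,
            "score": rng.normal(72, 12),              # placeholder application score (%)
        })
data = pd.DataFrame(rows)

# score ~ treatment, test, their interaction, and the race covariate;
# the random intercept accounts for repeated scores from the same student.
model = smf.mixedlm("score ~ C(treatment) * C(test) + C(race)", data, groups=data["student"])
result = model.fit()
print(result.summary())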
Analysis of Procedural vs. Application Skills. Due to a trend that appeared in each group's performance on the procedural problems and the application problems, an additional analysis was conducted. A strong correlation existed in the control group between students' average procedural scores and their average application scores (r = .7645); of the ten students in the control group who completed the course, seven (70.0%) had higher procedural scores than application scores. In contrast, a weak correlation existed in the experimental group between students' average procedural scores and average application scores (r = .2308); of the seventeen students who completed the course, only six (35.3%) had higher procedural scores than application scores. In other words, students in the Experimental group were more likely than students in the Control group to earn average application scores that were higher than their average procedural scores across the five tests. However, a 2x2 Fisher's Exact Test revealed that the difference in trends exhibited between the two groups was not statistically significant (p = 0.12).

The data showed that several students in the experimental course earned low procedural averages but were able to earn relatively high application scores. For example, Gary (a pseudonym) earned a 41% average procedural score and a 78% average application score across the five tests during the course; thus, his average application score was 37 points higher than his average procedural score. Five other students similarly earned much higher average application scores (at least 16 points higher) than procedural scores. In contrast, no student in the control group earned an average application score that was more than 11 points higher than his or her average procedural score.

In order to better understand why each group of students experienced different correlations between their average procedural and average application scores, students' responses on the application problems were further examined to determine what types of methods were used to answer the problems. The examination showed that in addition to solving problems through algebraic means, students also used pictures to understand the situation within a problem or to supplement their algebraic ability to solve a problem. Figure 5 illustrates how the use of pictures helped a student answer application problem 2B.

Figure 5: A solution obtained through the use of pictures

The analysis also showed that students used systematic trial and error approaches by constructing a table of values or by evaluating multiple solutions until the correct solution was attained. These approaches were often used in place of more formal algebraic techniques to solve application problems. Figure 6 illustrates the systematic trial and error approach that a student used to answer question 5A.

Figure 6: A solution obtained through systematic trial and error

Table 21 describes the difference in usage of supplemental methods such as pictures and systematic trial and error between the control and experimental groups. According to Table 21, students in the experimental group used pictures and systematic trial and error methods much more often than students in the control group.

Table 21
Comparison of non-algebraic strategies on application questions between groups

Question   Strategy                 Control n (%)    Experimental n (%)
1A         Pictures                   1 (10.0)          3 (17.6)
           Tables/Trial & Error       0 (0.0)           4 (23.5)
1B         Pictures                  10 (100)          17 (100)
           Tables/Trial & Error       0 (0.0)           2 (11.8)
2A         Pictures                   0 (0.0)           5 (29.4)
           Tables/Trial & Error       0 (0.0)           0 (0.0)
2B         Pictures                   0 (0.0)           8 (47.1)
           Tables/Trial & Error       0 (0.0)           0 (0.0)
3A         Pictures                   0 (0.0)           0 (0.0)
           Tables/Trial & Error       0 (0.0)           2 (11.8)
3B         Pictures                   0 (0.0)           0 (0.0)
           Tables/Trial & Error       6 (60.0)         12 (70.6)
4A*        Pictures                   8 (80.0)         17 (100.0)
           Tables/Trial & Error       0 (0.0)           0 (0.0)
4B         Pictures                   0 (0.0)           3 (17.6)
           Tables/Trial & Error       0 (0.0)           1 (5.9)
5A         Pictures                   0 (0.0)           2 (11.8)
           Tables/Trial & Error       1 (10.0)         11 (64.7)
* Problem 4A specifically asked students to draw a picture that described the scenario in the problem.
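The supplemental analysis above rests on two calculations: a within-group Pearson correlation between each student's average procedural and application scores, and a 2x2 Fisher's Exact Test on how many students in each group scored higher on procedural problems than on application problems (7 of 10 control students vs. 6 of 17 experimental students). The sketch below shows both; the per-student averages are hypothetical placeholders, while the 2x2 counts are those reported in the text.

# A sketch of the supplemental procedural-vs.-application analysis described above.
import numpy as np
from scipy.stats import pearsonr, fisher_exact

# Hypothetical per-student averages for one group (placeholders, not the study's data)
procedural_avg = np.array([80, 74, 68, 91, 62, 77, 85, 70, 66, 73])
application_avg = np.array([72, 70, 64, 88, 55, 74, 80, 69, 61, 70])
r, _ = pearsonr(procedural_avg, application_avg)
print(f"within-group correlation: r = {r:.3f}")

# Counts reported in the text: [higher procedural average, not higher]
trend_table = [[7, 3],    # Control
               [6, 11]]   # Experimental
_, p_trend = fisher_exact(trend_table)
print(f"Fisher's Exact Test on the trend: p = {p_trend:.2f}")  # reported as p = 0.12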
Research Question 3: Pass Rates

My third research question attempted to determine if the two groups exhibited similar pass rates for the course. Using logistic regression, I established that neither Race (Wald = .423, p = .515) nor Credit Hours (Wald = .158, p = .691) should be included as a covariate because they did not have a significant effect on students' course pass rates. Data were gathered from all students who agreed to participate in the study: eleven students from the Control group and eighteen students from the Experimental group. A 2 (Control/Experimental) x 2 (Pass/Fail) Fisher's Exact Test found that the difference in pass rates between the two groups was minimal and likely due to chance (p = 1.00). Table 22 summarizes these results.

Table 22
Summary of pass rates

                n    Passed    Failed    Pass Rate    Sig
Control        11       7         4        63.6%      1.00
Experimental   18      11         7        61.1%

Research Question 4: Students' Change in Mathematics Self-Efficacy

My fourth research question attempted to determine if the two groups demonstrated similar changes in mathematics self-efficacy throughout the course. Because Credit Hours and Mathematics Self-Efficacy were both continuous variables, I used linear regression to establish that Credit Hours did not need to be a covariate (t = -0.575, p = 0.571). I also used a 2 x 2 Repeated Measures ANOVA to establish that Race did not need to be a covariate (Race * Efficacy F = 0.810, p = 0.378). Data were gathered from students who completed the Pre- and Post-Mathematics Self-Efficacy Surveys (a 5-question Likert-style survey): ten students from the Control group and thirteen students from the Experimental group. The dependent variable was students' Mathematics Self-Efficacy, and the independent variable was the Treatment. The relationship between these two variables was analyzed using a 2 (Pre-Efficacy/Post-Efficacy) x 2 (Control/Experimental) Repeated Measures ANOVA. Table 23 summarizes the students' change in mathematics self-efficacy.

Table 23
Summary of students' change in mathematics self-efficacy

                   n    Mean Pre-Test   Std Dev Pre-Test   Mean Post-Test   Std Dev Post-Test   Change in Means
Control           10        4.34             0.55               3.98              0.84              -0.36
Experimental      13        4.11             0.64               4.18              0.96              +0.07
Difference (E-C)            -0.23                              +0.20                                +0.43
Note. Survey results are based on a 5-point scale.

The graph in Figure 7 shows that at the beginning of the course, the Control group reported a self-efficacy score that was 0.23 points higher than the Experimental group's score. However, at the end of the course, the Experimental group reported a self-efficacy score that was 0.20 points higher than the Control group's score.
Figure 7: Mean pre- and post-mathematical self-efficacy scores

The Control group reported a modest drop in mathematical self-efficacy by the end of the course, and the Experimental group reported a slight increase in mathematical self-efficacy by the end of the course, resulting in a net difference of 0.43 in favor of the Experimental group. According to the Repeated Measures ANOVA, the effect that the treatment had on each group's change in mathematics self-efficacy was not significant (Treatment * Efficacy F = 1.014, p = .325, Power = .161). Table 24 provides a summary of the statistical analysis.

Table 24
Statistical analysis for students' change in mathematics self-efficacy

                            df   Mean Square     F       Sig    Partial Eta Squared   Observed Power
Between Groups
  Treatment(a)               1       .002        .003    .995          .000                 .050
  Error                     21       .659
Within Groups
  Efficacy                   1       .226        .426    .521          .020                 .096
  Efficacy * Treatment(a)    1       .540       1.014    .325          .046                 .161
  Error (Efficacy)          21       .532
Note. "a" denotes the result that is relevant to the researcher's question.

Summary of the Quantitative Results

With respect to procedural skills, the Control group outperformed the Experimental group on three of the five tests, and with respect to application skills, the Experimental group outperformed the Control group on all five tests. However, the differences in both the procedural skills and the application skills of the students were not significant. Students in the Control group had a stronger correlation between their average procedural scores and application scores than did students in the Experimental group; this difference may be due to the Experimental students' using non-algebraic strategies on application problems more often than the Control students did. With respect to pass rates, the Control group had a slightly higher pass rate than the Experimental group; however, the difference in pass rates was not statistically significant. With respect to change in mathematics self-efficacy, the Experimental group maintained its starting level of mathematics self-efficacy, while the self-efficacy of the Control group decreased by the completion of the course. However, the difference in the changes in mathematics self-efficacy was not statistically significant.

Qualitative Results

The following sections provide the results from the anonymous free-response student surveys. The sample for the qualitative data consisted of forty-five respondents, as opposed to the sample for the quantitative data, which consisted of twenty-nine participants. The qualitative data had a larger sample than the quantitative data because the qualitative data were gathered anonymously through end-of-course surveys from all remaining students in each course. The twenty-three respondents from the Control group primarily addressed my ability as their instructor to explain the information, as well as the mathematics department's design of the course. The twenty-two respondents from the Experimental group addressed issues similar to those addressed by the Control group. However, since the students in the Experimental group were asked additional questions regarding various components of the experimental teaching method, the students in the Experimental group addressed issues that did not apply to the students in the Control group. I used the representational approach described by Sapsford (1999) to identify core concepts in students' responses, and I terminated the coding process once all of the students' statements could be classified according to the existing set of codes (Miles & Huberman, 1994).
Comparison of Treatments

The first question of the student surveys for both groups was the same: "How does this math class compare to other math classes that you have had? Explain." The purpose of this question was to support the claim that the pedagogical techniques used in the Experimental group were significantly different from those used in the Control group. The types of comments that students in the Control group made were consistent with those expected from a traditionally taught classroom. For example, a significant number of comments favorably addressed the instructor's ability to communicate mathematical concepts and to provide a relatively enjoyable learning experience. The remaining comments addressed the pacing and difficulty of the course; the majority of these comments were also positive. See Appendix F for a summary of the Control group's comments for question 1 of the student survey.

In contrast, the types of comments made by the students in the Experimental group were consistent with the pedagogy advocated by reform documents. The positive responses about the course showed that 1) students were given the opportunity to learn mathematics through extensive interactions with their peers, 2) the instructor minimized explicit mathematical instruction, 3) classroom instruction incorporated significant use of pictures and graphs (in addition to algebraic techniques), 4) mathematics was related to the real world, and 5) students were developing mathematical tools and ways of thinking that could be used in future mathematics classes. The negative responses similarly provided insight into the daily classroom experiences by referring to students' investigating mathematical phenomena on their own, students' being required to work with the members of their group, and the instructor's practice of engaging students in questioning techniques. See Appendix F for a summary of the Experimental group's comments for question 1 of the student survey.

Efficacy of Control Treatment

Responses to the remaining questions on the Control group's survey were gathered in order to support the claim that the Control group was taught fairly in the eyes of the students. Students' responses to questions 2-4 were grouped together but kept separate from their responses to question 1. The remaining questions on the Control student survey were the following:

2) What are some things you liked about the course?
3) What are some things you did not like about the course?
4) Other comments.

Many students stated that the instructor adequately and enthusiastically explained the material. Students' comments regarding the structure of the course were more divided. Some students disagreed about the difficulty of the course, while other students commented negatively on the mathematics department's design of the course (such as not allowing calculators on the final exam). Students were similarly divided over the homework policy. Some students felt that the homework problems helped them understand the material, while other students felt that too much homework was assigned. Several students negatively commented on the time of day for the course (8 a.m.).
Interestingly, however, when given the opportunity to state negative characteristics of the course, ten of the twenty-three responding students explicitly stated that the course did not contain any negative qualities. Overall, 62 positive comments and 16 negative comments were made by the students, and of the 16 negative comments, 10 addressed factors beyond the instructor's control (such as departmental policy and the class's meeting time). Thus, based on the proportionally high number of positive comments about this particular mathematics course, it is clear that the large majority of the students responded positively to the way that the course was taught. See Appendix F for a summary of the Control group's comments for questions 2-4 of the student survey.

Research Question 5: Students' Views about Reform Mathematics

Additional questions were placed on the Experimental group's survey in order to solicit explicit feedback regarding several key components of reform pedagogy that were employed during the study. Students' responses to questions 2-7 (see Appendix F) were grouped together but kept separate from their responses to question 1, which had a different purpose. The remaining questions were the following:

2) What are some things you liked about the course?
3) What are some things you did not like about the course?
4) To what extent did you like working with your classmates during class? Explain.
5) Did you find the graphing calculator useful? If yes, please explain how/when it was useful.
6) To what extent did you benefit from presenting your work to the class (or watching your classmates present their work to the class)? Explain.
7) Other comments.

Students possessed a generally positive view of student presentations, noting that the presentations 1) pushed them to perform good mathematical work, 2) helped them better understand mathematical concepts by observing how their classmates' approaches compared to their own, and 3) helped them increase their confidence in their mathematical and speaking abilities. Some students, however, felt that they did not benefit from peer presentations and thought that class time could be better used by the instructor working through additional problems.

Students also possessed a generally positive view of working together in groups. Many students seemed to feel that they benefited from the support structure provided by their groups, both in terms of helping one another understand a concept and in terms of sharing alternative ways to view a particular concept. Some students preferred not to work in groups because they did not like to share their work or because they would have liked the instructor himself to explicitly communicate the mathematical material.

Students overwhelmingly liked the use of graphing calculators. Students expressed that graphing calculators helped them solve problems, graph functions, and verify that their answers were correct. The calculator's ability to create graphs and tables provided students an alternative means to solve problems other than purely algebraic techniques. No negative comments were made regarding the use of calculators.

Students provided mixed reviews regarding the teaching methods used during the course. While some students enjoyed every facet of the course and stated that the mathematics course was "fun," other students disliked the course just as strongly.
Several students would have preferred to minimize student discussions so that more examples could be worked during class, and some students felt that the classroom instruction did not connect well with the problems that were in the book and on the tests. A few students also stated that the mathematical techniques developed during class, as well as the types of problems solved during class, did not correspond to the techniques and problems presented in the Math Lab. Students acknowledged the inherent trade-off between covering fewer problems in greater depth (questioning, group discussions, student presentations) and the teacher solving more problems during class with the expectation that students will understand the material once a sufficient number of examples have been presented.

Summary of Qualitative Results

The responses from the students' anonymous end-of-course surveys supported my claim that each group received the treatment it was intended to receive. Based on the ratio of positive to negative comments, students appeared to find student presentations, group work, and graphing calculators beneficial. Students generally held positive views of student presentations, and they also held generally positive views of working together in groups. Further, every comment regarding the use of graphing calculators was positive. However, students expressed mixed views regarding the relatively small number of examples worked by the instructor; many students felt that they would have benefited from the instructor working out more examples during class. Overall, students made considerably more positive comments than negative comments about the reform-oriented course and its components (109 positive and 26 negative).

CHAPTER 5: CONCLUSIONS AND IMPLICATIONS

In this chapter, the limitations and conclusions of the study will be presented. Subsequently, the implications of the study for teachers and administrators will be discussed. Lastly, directions for future research studies will be suggested.

Limitations

The current study contains several limitations beyond my control. First, the study was quasi-experimental in that students were not randomly assigned to a treatment; instead, students enrolled in the course section that fit their schedules. Second, I could not control for all possible variables, including the extent and type of resources that students used outside the classroom. Third, the study had a small sample size. A larger sample size would have been preferred in order to increase the statistical power of the study and therefore provide a higher level of confidence in the study's results. Fourth, the two classes were conducted at two different points in time. It is possible that students in each of the classes were affected by a different set of social events outside the classroom, and the passage of time may also have resulted in my maturing in some manner between courses.

Two important limitations that this study attempted to mitigate were a lack of fidelity to the treatments and the possibility of researcher bias. Scores on the Reformed Teaching Observation Protocol (RTOP) (Piburn & Sawada, 2000) supported my claim that the Control group received instruction consistent with traditional lecture methods and that the Experimental group received instruction consistent with reform pedagogy.
Additionally, the open-ended student surveys provided further evidence that students experienced two different types of instruction in their respective courses. The survey data also indicated that students in the Control group felt that the instructor appropriately implemented traditional teaching techniques, thus helping to mitigate the possibility that the researcher inadvertently provided the Control group with lower-quality instruction than the Experimental group.

Conclusions

Although the data did not yield statistically significant results, the trends within the data were consistent with those of similar studies of other reform mathematics classrooms. Key results for each of the research questions are presented in the sections below.

Research Questions 1 and 2: Procedural and Application Skills

Though the results were not statistically significant, the trends within the data suggested that incorporating reform-oriented pedagogy into postsecondary remedial courses may improve students' problem-solving abilities without sacrificing procedural proficiency. The trends in this study are consistent with prior research on secondary students in which reform students scored as well as traditional lecture students on procedural skills and better on problem-solving skills (Hirschhorn, 1993; Schoen, Hirsch, & Ziebarth, 1998; Thompson & Senk, 2001).

With respect to students' procedural skills in this study, students in the Control group scored higher on the first test but gradually declined in performance on each subsequent test throughout the course. In contrast, students in the Experimental course experienced both increases and declines throughout the course. The average score on the comprehensive, procedural final exam for the Control course was nearly the same as the average score for the Experimental course. With respect to application skills, students in the Experimental course outperformed students in the Control course on all five tests. The changes in performance of the two classes generally mirrored each other on the application portions of the five tests.

Students in the Experimental group demonstrated a much weaker correlation between their average procedural scores and average application scores than did students in the Control group. Students in the Experimental group often earned higher average application scores than procedural scores across the five tests in the course. In contrast, students in the Control group tended to earn lower average application scores relative to their average procedural scores across the five tests. The reason for this reversal in trends may be the difference in how the Experimental and Control students were taught the material. Students in the Control group were taught the most efficient methods for solving problems; these methods were most often algebraic methods that were introduced in a general context first and then demonstrated later in a specific context (such as a story problem). In contrast, the Experimental group was taught various methods for exploring problems, such as systematic trial and error and using the table and graphing functions of a graphing calculator; algebraic methods were often introduced after students had been given time to explore problems and develop reasonable solutions based on non-algebraic techniques. Thus, students in the Experimental group had more methods available for solving application problems in the event that one of those methods (such as the algebraic method) failed them.
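To make the group contrast just described concrete, the sketch below shows one way the procedural-application correlation could be computed for each group. This is an illustrative sketch only, assuming SciPy is available; the score lists are invented for illustration and are not data from this study.

```python
# Hypothetical per-student averages across the five tests (0-100 scale).
# These numbers are invented solely to illustrate the correlation contrast.
from scipy.stats import pearsonr

groups = {
    "Experimental": {
        "procedural":  [72, 65, 80, 58, 90, 77, 69, 84],
        "application": [78, 74, 82, 70, 88, 81, 76, 85],
    },
    "Control": {
        "procedural":  [85, 70, 92, 66, 88, 75, 80, 73],
        "application": [74, 58, 83, 55, 79, 63, 70, 61],
    },
}

for name, scores in groups.items():
    # Pearson correlation between each student's average procedural and application scores.
    r, p = pearsonr(scores["procedural"], scores["application"])
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```

A noticeably smaller r value for the Experimental group would correspond to the weaker coupling between procedural and application performance described above.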
The results of this study were similar to those of Senk and Thompson (2006), in which secondary students in reform-based courses were more likely to use graphical and numerical strategies to solve problems than were their matched comparison students.

Research Question 3: Pass Rates

This study found no significant difference in the pass rates of the students in the Experimental and Control groups. Recall that a lack of consensus exists regarding maintaining higher standards in remedial mathematics courses versus accepting lower mathematics standards in exchange for higher pass rates (Golfin et al., 2005). In this light, the results of the study were promising: the students who received the Experimental treatment were able to meet the high standards of the course without a reduction in student pass rates. In other words, the gains made by the Experimental group did not come at the expense of higher rates of attrition.

Research Question 4: Change in Mathematics Self-Efficacy

This study found no significant difference in the change in mathematics self-efficacy between the two groups. However, the trends in the data suggested that, compared to traditional lecture methods, reform-oriented instruction may produce more favorable changes in students' mathematics self-efficacy. This trend towards improving mathematics self-efficacy in the Experimental course was not surprising. First, several students commented that presenting their work to the class improved their confidence in their mathematical abilities despite constructive criticisms made by their classmates. One student commented: "When presenting my work to the class I gained more confidence in the way I was solving my problems. Others were also able to point out flaws in my work as I was [able to point out flaws in] theirs. It basically made the whole class a big group." Second, it was common practice throughout the course for the students in the Experimental group to spend at least 10-15 minutes on a problem or a set of related problems. The daily behavior expected of and exhibited by the students in the Experimental group aligned with Bandura's (1997b) description of high self-efficacy: sustaining strong commitment towards a goal and viewing tasks as challenges to be mastered.

Research Question 5: Student Response to the Experimental Treatment

The data from the free-response student surveys showed that the students in the reform-oriented course expressed generally positive comments about the style of instruction. When asked to compare their current class to other mathematics classes they had taken, more than one third of the students commented that their understanding of the material resulted from non-algebraic methods of communication. With respect to group work, graphing calculators, and student presentations (key elements of reform pedagogy), students' comments in the reform-oriented class were mostly favorable. Student comments revealed that discussing problems during group work and student presentations helped them to improve their understanding of the material by considering different perspectives; these perspectives helped students to learn from each other and resulted in their "knowing the problems inside and out." Other comments revealed that students found graphing calculators useful because they helped improve conceptual understanding and provided alternative means of solving problems through tables and graphs.
Interestingly, no negative comments were made by students about the ability to use graphing calculators in the course. However, roughly half of the students in the Experimental course expressed a desire to see the instructor present and solve more problems during class. Such comments were not surprising given that many of the students' mathematical backgrounds consisted primarily of traditional lecture instruction. Recall that the comments made by students in the traditional lecture course expressed very high opinions of that teaching method, in large part because the instructor "tried his best in giving [students] easy ways to solve the problems" and would teach "in detail the steps of the problem." In other words, many of the students in the study seemed to prefer that the teacher solve multiple problems in a clear and detailed manner.

Implications

The results of this study can inform both teachers and administrators who work with postsecondary remedial mathematics students, and I discuss the implications for each group in turn. Lastly, although studies have consistently shown that the most important factor in school effectiveness is the teacher (Boaler, 2008), I conclude this section with a reminder of the vital role that administrators play in the implementation of reform curricula.

Teachers

Teachers who are interested in implementing more reform-oriented pedagogy in their remedial mathematics courses may be discouraged if the objectives within their courses are procedural in nature; however, this study demonstrated that substantial implementation of the pedagogy advocated by reform documents is possible in such courses. Although the objectives of the remedial mathematics course in this study were primarily procedural in nature, I was able to transform the inherently procedure-oriented course into a reform-oriented course by incorporating into daily activities pedagogical techniques that were consistent with reform pedagogy. Because the course textbook used in the study reinforced primarily procedural skills, I supplemented homework problems and classroom activities with those I created myself as well as those that I obtained from the literature. Finding reform-oriented lessons was not exceedingly difficult: the objectives in a postsecondary remedial mathematics course are similar to the objectives at the secondary level, and ample reform-oriented curricula addressing secondary mathematics are available for teachers to tailor to the needs of their classes.

Based on the resistance from students in this study to the new teaching approach, teachers who would like to incorporate more reform-oriented techniques into their classrooms need to be aware that many of their incoming students will likely have had little exposure to reform-oriented instruction. Teachers should therefore make certain that students understand at the beginning of the course the expectations for the reform-oriented course (such as active student engagement through questioning, group work, and justification of answers to peers) and should immerse the students in reform-oriented activities from the outset of the course.
Teachers who clearly explain and demonstrate the desired learning practices to their students can improve the equitable nature of their instruction (Boaler & Staples, 2008). For example, I designed an activity for the first day of the reform-based class in which students were required to answer a basic algebra problem and then explain how they knew that their answer was correct. In order to model what was expected of them, I answered the first question, "What is 1x + 2x?", in two different ways. First, I broke the problem into "x + x + x" and argued that my answer was 3x because I had a total of three x's. My second approach involved plugging a number (say, 4) in for x and showing that 1(4) + 2(4) = 3(4). After answering students' questions, I let them attempt the next few problems, such as "What is (x²)(x³)?" Although many of the students stated that the answer was x⁵, their sole justification was given by phrases such as "that's the rule in the book" or "that's what I was taught by my high school teacher." I then asked these students whether their classmates had arrived at the same answer and whether there was any way to convince someone holding an opposing view of the validity of their answer. Thus, this first activity set the tone for the course in that students quickly saw that they were expected to find the answers to problems without simply relying on theorems within the book; in other words, they were expected to derive a theorem or at least to understand why a theorem made sense. Additionally, students realized that they were expected to work with each other to make sure that they correctly understood the material, and if students disagreed on a solution, they were to attempt to reconcile their differences. As the course progressed, the students gradually began developing ways to make sense of problems and to verify their answers through methods other than "the book said so."

Having taught the experimental section, I offer an additional observation from this study: teachers need to remember to be patient. It may take time for students to adjust their classroom learning habits from those of passive observers to those of active participants. Kara (a pseudonym), one of the students in the Experimental course, at first did not appreciate the activities and types of problems assigned in the course. Early in the course, Kara emailed me, expressing some frustration that the classroom activities and assigned homework did not correspond to the types of problems presented in the text (which were procedural in nature). I replied that the classroom activities and homework were intended to address conceptual understanding and should therefore help to minimize common student mistakes. One week later, Kara emailed me, stating, "I have been working on the practice test, and I am remembering all of the mistakes I was making before you tutored me. All the times I was forgetting the steps, and I had to figure it out...I have not forgotten! It stuck with me. Thanks again for your help." Kara's diligence in "figuring things out" ultimately helped her to do very well in the course. At the end of the course, she emailed me and stated that she thoroughly enjoyed my teaching style. Kara's success story illustrates why teachers should expect, but not become overwhelmed by, students' initial frustrations with reform-oriented teaching.
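As an aside, the algebra behind the two first-day prompts described above can be written out explicitly. The expansion argument for (x²)(x³) shown here is one justification students could construct in line with the expectations described; it is an illustrative note rather than a transcription of the course materials.

```latex
\begin{align*}
1x + 2x &= x + (x + x) = 3x,
  \qquad\text{checked numerically: } 1(4) + 2(4) = 12 = 3(4);\\[4pt]
(x^{2})(x^{3}) &= (x \cdot x)(x \cdot x \cdot x) = x^{5},
  \qquad\text{so the rule } x^{a}x^{b} = x^{a+b} \text{ can be justified rather than quoted.}
\end{align*}
```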
Just as the students in prior studies (Schoen, Hirsch, & Ziebarth, 1998; Reys et al., 2003) took up to two years to fully adapt to reform-oriented instruction, Kara's experiences similarly demonstrate that it can take time for students to realize the benefits of learning mathematics in a reform-oriented manner.

In this experimental course and in previous pilot experimental courses, I consistently found time to be a significant constraint. Whereas traditional lecture methods often ask students to copy the teacher's notes from the board (with or without understanding), reform mathematics pedagogy asks students to experiment, to discuss the findings of their experiments with classmates, and to learn from the findings of other classmates. Simply stated, these cognitive processes take time. Teachers therefore need to tailor their lessons in such a way as to allocate sufficient time for students to fully engage in the lessons.

Lastly, for teachers wishing to transform a more traditional course into a more reform-oriented course, it is critical to acquire administrative support. If students are not won over by the advantages of reform-oriented instruction, a strongly reform-oriented course may prompt students to complain to the administration or to give negative feedback on course evaluations. Teachers need to make certain that administrators are aware of the likelihood of student complaints and that administrators are prepared to address those complaints diplomatically.

Administrators

Administrators need to understand the advantages of reform-oriented instruction. The goal of reform-oriented mathematics is not to develop students' problem-solving ability at the expense of procedural proficiency. Rather, its goal is to develop students' conceptual knowledge of the mathematics with the expectation that students will better develop and retain procedural skills, as well as understand under which conditions those procedural skills should be applied. The goals of reform-oriented instruction coincide with the calls made by various reports for postsecondary students to develop critical thinking and problem-solving skills (Conley & Bodone, 2002; AEE, 2011). The trends in the data gathered in this study are consistent with the claims of reform mathematics: teaching in a reform-oriented manner does no harm while potentially providing multiple benefits.

Administrators who want their instructors to teach in a reform-oriented manner need to provide those instructors with training and support; for example, they should consider encouraging instructors to attend conferences, workshops, and other types of training that can help instructors better understand how to implement reform-oriented pedagogy. Many instructors know how to teach only in a traditional manner; thus, administrators may need to invest time and resources into helping their instructors understand an alternative way to teach mathematics. I would never have been able to teach in a reform-oriented manner had it not been for the tools that I acquired from my graduate program in mathematics education. Because I had known only traditional lecture methods until entering graduate school, the daily pedagogical modeling by my professors was instrumental in helping me to understand how to teach in a reform-oriented manner.
Lastly, this study reiterates the warning given above to teachers: administrative support is a critical component of any reform process. The data from this study add to the body of literature indicating that reform-oriented teaching quite often does no harm and in many cases has the potential to help. In her 2008 book What's Math Got to Do with It?, researcher Jo Boaler described a secondary mathematics department that adopted an award-winning reform mathematics curriculum that was supported by its teachers. Despite the success of the academically rigorous and engaging curriculum, a small group of parents used misleading information to lobby other parents and students into signing a petition that required the mathematics department to abandon its reform mathematics curriculum. Ultimately, the parents prevailed against the wishes of the teachers, and the mathematics department returned to "the traditional books and methods of teaching that they had used for many years, with very little success" (p. 33). Likewise, in my study, student complaints (rather than statistical data linked to student success) caused the department to retract its support of my teaching in a reform-based manner.

Future Research

Although the results of this study were not statistically significant, they were promising. Additional studies could be designed to overcome the limitations of the present study. Several directions for future research, as well as factors to consider in designing those studies, are presented in the paragraphs below.

The statistical power of my study was limited by its small sample size. Researchers should consider replicating this type of study with a much larger sample size. A larger sample size would improve the statistical power of the study and increase the chances of detecting a difference between treatments if one existed. Larger sample sizes would also make it easier to study various covariates (such as age and race) in the context of reform mathematics; thus, researchers could better understand the effectiveness of reform mathematics for different subpopulations of students. As a first step, researchers may need to develop a method of improving the participation rate of students in their research, as low participation was the major cause of my small sample size.

Additionally, future studies may consider employing multiple teachers who are capable of teaching in both traditional and reform-oriented manners. Using such a design would strengthen the results of such studies by mitigating teacher effects. Depending on the scope and duration of the study, conducting the paired classes during the same academic terms may also help researchers strengthen their research design by minimizing the effects of student and instructor maturation. In designing such studies, researchers should seriously consider designing paired courses that meet at the same time of day; multiple students in the present study commented that the meeting time of the class (8:00 a.m.) affected their outlook on the course.

Extending the duration of the treatment may also be a point of further interest. For example, if a mathematics program contained multiple levels of remedial mathematics courses (such as Elementary Algebra followed by Intermediate Algebra), researchers could examine whether the effect of the treatment increased with additional exposure. The present study implemented the treatment for a total of one academic semester.
However, other studies analyzing the success of new programs advocate allowing a program to continue for at least two years in order to properly assess its success (Schoen, Hirsch, & Ziebarth, 1998). While some studies may examine the effectiveness of teaching methods within the remedial mathematics course itself, other studies may track cohorts of students and examine their success in subsequent credit-bearing courses. Success in future courses could be analyzed both in terms of overall pass rates and in terms of academic achievement within the course. Researchers may also wish to examine the number and level of mathematics courses that students take during their postsecondary education, based on their exposure to traditional and reform-oriented curricula. The results of these studies could further be analyzed according to variables such as race, gender, and socioeconomic status.

Researchers could also examine the effectiveness of teachers collaborating to improve their remedial mathematics courses by implementing reform-based instruction. The Carnegie Foundation's Networked Improvement Community is a prime example of how instructors and researchers can work together to supply the research community with recommendations for non-STEM postsecondary mathematics courses (Merseth, 2011). The collaboration network used in developing Path2Stats (Hern, 2012) is another example of how collaboration networks can provide invaluable support for teachers who wish to improve the structure of their mathematics courses. Similarly, the body of literature could benefit from studies examining how postsecondary remedial mathematics instructors work together to implement reform-based strategies in their classrooms. These studies could also examine the paths that entire postsecondary mathematics departments take to restructure their remedial mathematics courses.

Conclusion

The results of this study extended prior research on reform-based instruction for secondary and introductory postsecondary courses to postsecondary remedial mathematics courses. Although its results were not statistically significant, this study demonstrated that reform-oriented practices at the postsecondary remedial mathematics level have the potential to improve students' problem-solving ability and mathematical self-efficacy; these benefits may be achieved without sacrificing procedural skills or student pass rates. Students who received reform-oriented instruction were more likely than students who received didactic lecture instruction to use non-algebraic methods such as pictures and systematic trial and error to solve application problems. The students in the reform-oriented course may have developed this behavior because of their consistent exposure to word problems throughout the course; they grew accustomed to taking information from stories and interpreting that information mathematically through tables, graphs, and pictures. Comments made by the students in the reform-oriented group about the type of instruction they received were generally positive; however, many students felt that they would have better understood the material if the instructor had directly explained the concepts and worked many examples during class. Although this study showed that it is possible to incorporate reform-based pedagogical practices into a procedurally oriented course without limiting students' mathematical achievement (and possibly while enhancing it), administrator support is essential if instructors are to teach in a reform-oriented manner.
Administrators and teachers who attempt to implement reform-oriented instruction should be prepared to address student complaints regarding the structure of the course. Additionally, administrators who would like to see reform-based practices in their mathematics classrooms should provide their instructors with sufficient training and support. Future research should further examine the effectiveness of reform-oriented pedagogy in postsecondary remedial mathematics courses from both students' and teachers' perspectives.

References

Adelman, C. (1995). The new college course map and transcript files: Changes in course-taking and achievement, 1972-1993 (2nd ed.). Washington, DC: National Center for Educational Statistics. Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor's degree attainment. Washington, DC: U.S. Department of Education. Retrieved from http://www2.ed.gov/pubs/Toolbox/toolbox.html Alliance for Excellent Education. (2011, May). Saving now and saving later: How high school reform can reduce the nation's wasted remediation dollars (Issue Brief). Washington, DC. Retrieved from http://www.all4ed.org/publication_material/IssueBrief/SavingNowSavingLaterRemediation American Mathematical Association of Two-Year Colleges. (1995). Crossroads in mathematics: Standards for introductory college mathematics before calculus. Memphis, TN: Author. American Mathematical Association of Two-Year Colleges (AMATYC). (2006). Beyond crossroads: Implementing mathematics standards in the first two years of college. Memphis, TN: Author. Anderson, R., Anderson, B., Varank-Martin, M., Romangnano, L., Bielenberg, J., Mieras, A., et al. (1994). Issues of curriculum reform in science, mathematics, and higher order thinking across the disciplines (Curriculum Reform Project Series 0-16-043073-9). Washington, DC: U.S. Department of Education. Annetta, L. A., & Dotger, S. (2006). Aligning preservice teacher basic science knowledge with INTASCII and NSTA core content standards. Eurasia Journal of Mathematics, Science & Technology Education, 2(2), 40-58. Attewell, P., Lavin, D., Domina, T., & Levey, T. (2006). New evidence on college remediation. Journal of Higher Education, 77(5), 886-924. Bahr, P. (2008). Does mathematics remediation work? A comparative analysis of academic attainment among community college students. Research in Higher Education, 49(5), 420-450. Bahr, P. (2010). Revisiting the efficacy of postsecondary remediation: The moderating effects of depth/breadth of deficiency. Review of Higher Education, 33(2), 177-205. Bahr, P. (2012). Deconstructing remediation in community colleges: Exploring associations between course-taking patterns, course outcomes, and attrition from the remedial math and remedial writing sequences. Research in Higher Education, 53(6), 661-693. Bailey, T., Jenkins, D., & Leinbach, T. (2005). Community college low-income and minority student completion study: Descriptive statistics from the 1992 high school cohort. New York, NY: Community College Research Center, Columbia University. Bailey, T. & Morest, V. (2006). Defending the community college equity agenda. Baltimore: Johns Hopkins University Press. Bandura, A. (1997a). Self-efficacy: The exercise of control. New York: W. H. Freeman and Company. Bandura, A. (1997b). Self-efficacy. Harvard Mental Health Letter, 13(9), 4. Banks, C. & Banks, J. (1995).
Equity pedagogy: An essential component of multicultural education. Theory into Practice, 34(3), 152-158. Bassett, M. & Frost, B. (2010). Smart math: Removing roadblocks to college success. Community College Journal of Research and Practice, 34(11), 869-873. Baum, S., & Payea, K. (2004). Education pays 2004: The benefits of higher education 142 for individuals and society. Washington, DC: College Board. Bettinger, E., & Long, B. (2009). Addressing the needs of underprepared students in higher education: Does college remediation work? Journal of Human Resources, 44(3), 736- 771. Billstein, R. & Williamson, J. (1998). Middle grades MATH thematics. Evanston, IL: McDougal Littell. Boaler, J. (2008). What?s math got to do with it? New York: Penguin Group. Boaler, J. & Staples, M. (2008). Creating mathematical futures through an equitable teaching approach: The case of Railside School. Teachers College Record, 110(3), 609-644. Boylan, H. & Bonham, B. (2007). 30 years of developmental education: A retrospective. Journal of Developmental Education, 30(3), 2-4. Boylan, H., Bonham, B., & Tafari, G. (2005). Evaluating the outcomes of developmental education. New Directions for Institutional Research, 125(1), 59-72. Boylan, H., Bonham, B., & White, S. (1999). Developmental and remedial education in postsecondary education. New Directions for Higher Education, 108, 87-101. Boylan, H. & Saxon, D. (1999). What works in remediation: Lessons from 30 years of research. 1999. Retrieved from http://www.ncde.appstate.edu/reserve_reading/what_works.htm Brothen, T. & Wambach, C. (2004). Refocusing developmental education. Journal of Developmental Education, 28(2), 16-22. Brouwer, N., Ekimova, L., Jasinska, M., Gastel, L. & Meckauskaite, E. (2009). Enhancing mathematics by online assessments: Two cases of remedial education. Industry & Higher Education, 23(4), 277-283. 143 Byrk, A. (2012, July). Celebrating a year of accomplishments and embracing the improvement challenges ahead. Poster session presented at the Pathways National Forum, Santa Cruz, CA. Retrieved from http://www.carnegiefoundation.org/developmental-math/celebrating- year-accomplishments-and-embracing-the-improvement-challenges-ahead Carnine, D., & Gersten, R. (2000). The nature and roles of research in improving achievement in mathematics. Journal for Research in Mathematics Education, 31(2), 138-43. Cichon, D. & Ellis, J. (2002). The effects of MATH Connections on student achievement, confidence and perception. In S. L. Senk and D. R. Thompson (Eds.), Standards-based school mathematics curricula: What are they? What do students learn? (pp. 345?374). Mahwah, N.J.: Lawrence Erlbaum Associates. Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education. Routledge Publishing: New York. Committee on the Undergraduate Program in Mathematics (1998). Quantitative reasoning for college graduates: A complement to the standards. Washington, DC: Mathematical Association of America. Committee on the Undergraduate Program in Mathematics (2011). College algebra guidelines. In S. Ganter & W. Haver (Eds.), Partner discipline recommendations for introductory college mathematics and the implications for college algebra (pp. 45 - 47). Washington, D.C.: Mathematical Association of America. Complete College America (2012). Core principles for transforming remedial education: A joint statement. Retrieved from http://www.completecollege.org/docs/ Remediation_Joint_Statement-Embargo.pdf 144 Conley, D. & Bodone, F. (2002). 
University expectations for student success: Implications for system alignment and state standard and Assessment policies. Retrieved from http://www.eric.ed.gov/ERICWebPortal/search/detailmini.jsp?_nfpb=true&_&ERICExtS earch_SearchValue_0=ED464922&ERICExtSearch_SearchType_0=no&accno=ED4649 22 Conley, D., Drummond, K., de Gonzalez, A., Rooseboom, J., & Stout, O. (2011). Reaching the goal: The applicability and importance of the Common Core State Standards to college and career readiness. Eugene, OR: Educational Policy Improvement Center. Retrieved from https://www.epiconline.org/publications/documents/ReachingtheGoal- ExecutiveSummary.pdf Cooner, T. (2005). Dialectical constructivism: Reflections on creating a web-mediated enquiry- based learning environment. Social Work Education, 24(4), 375-390. Coxford, A. F., Fey, J. T., Hirsch, C. R., Schoen, H. L., Burrill, G., Hart, E. W., Watkins, A. E., Messenger, M. J., & Ritsema, B. (1998). Contemporary mathematics in context: A unified approach. Chicago: Everyday Learning Corporation. Creswell, J., W. (2007). Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks, CA: Sage Publications. Donovan, M., Bransford, J., & Pellegrino, J. (1999). How people learn: Bridging research and practice. Washington, D.C.: National Academy Press. Dowling, D. (1978). The development of a mathematics confidence scale and its application in the study of confidence in women college students. Unpublished doctoral dissertation, Ohio State University, Columbus. 145 Duranczyk, I. & Higbee, J. (2006). Developmental mathematics in 4-Year institutions: Denying access. Journal of Developmental Education, 30(1), 22-31. Ellington, A. J. (2005). A modeling-based college algebra course and its effect on student achievement. Primus: Problems, Resources & Issues in Mathematics Undergraduate Studies, 15(3), 193-214. Epper, R. M., & Baker, E. D. (2009). Technology solutions for developmental math: An overview of current and emerging practices. Seattle, WA: William and Flora Hewlett Foundation and the Bill and Melinda Gates Foundation. Erickson, M., & Shore, M. (2003). Physical therapist assistant students and developmental mathematics: An integrative curriculum approach to teaching mathematics. Mathematics & Computer Education, 37(3), 374-380. Ernest, P. (1997). The epistemological basis of qualitative research in mathematics education: A postmodern perspective. In A. R. Teppo (Ed.), Qualitative research methods in mathematics education. Reston, VA: National Council of Teachers of Mathematics. Fendel, D., Resek, D., Alper, L., & Fraser, S. (1999). Interactive mathematics program. Emeryville, CA: Key Curriculum Press. Fike, D. & Fike, R. (2007). Does faculty employment status impact developmental mathematics outcomes? Journal of Developmental Education, 21(1), 2-11. Fry, H., Ketteridge, S., & Marshall, S. (2003). A handbook for teaching and learning in higher education: Enhancing academic practice (3rd ed.). New York: Taylor and Francis. Gallard, A. J., Albritton, F., & Morgan, M. W. (2010). A comprehensive cost/benefit model: Developmental student success impact. Journal of Developmental Education, 34(1), 10- 25. 146 Gerlaugh, K., Thompson, L., Boylan, H., & Davis, H. (2007). National study of developmental education II: Baseline data for community colleges. Research in Developmental Education, 20(4), 1-4. Goldrick-Rab, S. (2010). Challenges and opportunities for improving community college student success. 
Review of Educational Research, 80(3), 437-469. Golfin, P., Jordan, W., Hull, D., & Ruffin, M. (2005). Strengthening mathematics skills at the postsecondary level: Literature review and analysis, Washington, D.C.: U.S. Department of Education. Gordon, F. (2006). Assessing what students learn: Reform versus traditional precalculus and follow-up calculus. In N. Hastings (Ed.), A fresh start for collegiate mathematics: Rethinking the courses below calculus (181-192). Washington, D.C.: Mathematical Association of America. Gravetter, F., & Wallnau, L. (2004). Statistics for the behavioral sciences. Belmont, CA: Thomson Learning, Inc. Guti?rrez, R. (2007, October). Context matters: Equity, success, and the future of mathematics education. Invited plenary at the North American Chapter of the International Group for the Psychology of Mathematics Education, Lake Tahoe, NV. Hagedorn, L., Siadat, M., Fogel, S., Nora, A., & Pascarella, E. (1999). Success in college mathematics: Comparisons between remedial and nonremedial first-year college students. Research in Higher Education, 40(3), 261-284. Hern, K. (2012). Acceleration across California: Shorter pathways in developmental English and math. Change, 44(3), 60-68. 147 Hirschhorn, D. B. (1993). A Longitudinal study of students completing four years of UCSMP mathematics. Journal for Research in Mathematics Education, 24(2), 136-158. Hooker, D. (2011). Small peer-led collaborative learning groups in developmental math classes at a tribal community college. Multicultural Perspectives, 13(4), 220-226. Huck, S. (2004). Reading statistics and research (4th ed). New York: Pearson Education. Hurley, J., Koehn, U., & Ganter, S. (1999). Effects of calculus reform: Local and national. The American Mathematical Monthly, 106(9). 800-811. Institute for Higher Education Policy. (1998). College remediation: What it is, what it costs, what's at stake. Washington D.C.: The Institute for Higher Education Policy. Johnson, M., & Kuennen, E. (2004). Delaying developmental mathematics: The characteristics and costs. Journal of Developmental Education, 28(2), 24-29. Jones, A., & King, J. (2012). The Common Core State Standards: A vital tool for higher education. Change, 44(6), 37-43. Judson, E., & Lawson, A. (2007). What is the role of constructivist teachers within faculty communication networks? Journal of Research in Science Teaching, 44(3), 490-505. Kerckhoff, A., Raudenbush, S., & Glennie, E. (2001). Education, cognitive skill, and labor force outcomes. Sociology of Education, 74(1), 1-24. Kitsantas, A., Cheema, J., & Ware, H. W. (2011). Mathematics achievement: The role of homework and self-efficacy beliefs. Journal of Advanced Academics, 22(2), 310-339. Kozeracki, C. (2002). ERIC review: Issues in developmental education. Community College Review, 29(4), 83. Lappan, G., Fey, J., Fitgerald, W., Friel, S., & Phillips, E. (1997). Connected mathematics project. Palo Alto, CA: Dale Seymour. 148 Lawson, A., Russell, B., Irene, B., Carlson, M., Falconer, K., & Hestenes, D. (2002). Evaluating college science and mathematics instruction. Journal of College Science Teaching, 31(6), 388-393. Lerman, S. (1989). Constructivism, mathematics and mathematics education. Educational Studies in Mathematics, 20(2), 211-23. Mac Iver, M. A. & Mac Iver, D. J. (2009). Urban middle-grade student mathematics achievement growth under comprehensive school reform. Journal of Educational Research, 102(3), 223-236. MacIsaac, D. & Falconer, K. (2002). Reforming physics instruction via RTOP. 
Physics Teacher, 40(8), 479-486. Massachusetts Community College Executive Office. (2006). 100% Math Initiative: Building a foundation for student success in developmental math. Boston: Author. McCabe, R. & Day, P. (1998). Developmental education: A twenty-first century social and economic imperative. New York, NY: College Board. McSweeney, L., & Weiss, J. (2003). Assessing the math online tool: A progress report. Mathematics & Computer Education, 37(3), 348-357. Merriam, S. B. (2001). Qualitative research and case study applications in education. San Francisco, CA: Jossey-Bass. Merseth, K. K. (2011). Update: Report on innovations in developmental mathematics ? moving mathematical graveyards. Journal of Developmental Education, 34(3), 32-39. Midgley, C., Maehr, M. L., Hruda, L. Z., Anderman, E., Anderman, L., Freeman, K. E., et al. (2000). Manual for the Patterns of Adaptive Learning Scales (PALS). Ann Arbor: University of Michigan. 149 Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: SAGE Publications. Mills, M. (1998). From coordinating board to campus. Journal of Higher Education, 69(6), 672- 697. Mills, J., Bonner, A., & Francis, K. (2006). The development of constructivist grounded theory. International Journal of Qualitative Methods, 5(1), 1-10. Mousoulides, N., & Philippou, G. (2005). Students? motivational beliefs, self-regulation strategies and mathematics achievement. In H. L. Chick & J. L. Vincent (Eds.), Proceedings of the 29th conference of the International Group for the Psychology of Mathematics Education (PME) (pp. 321-328). Melbourne, Australia: PME. Mueller, M., Yankelewitz, D., & Maher, C. (2011). Sense making as motivation in doing mathematics: Results from two studies. Mathematics Educator, 20(2), 33-43. National Center for Education Statistics (2004). Remedial education at degree-granting postsecondary institutions in fall 2000 (NCES 2004-010). Washington, DC: U.S. Department of Education. Retrieved from http://nces.ed.gov/pubs2004/2004010.pdf National Center for Education Statistics (2008). 1989?90 through 2007?08 integrated postsecondary education data system, "institutional characteristics survey" (IPEDS- IC:89?99), and Fall 2000 through Fall 2007. Retrieved from http://nces.ed.gov/programs/digest/d08/tables/dt08_328.asp National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA.: Author. National Council of Teachers of Mathematics. (1991). Professional standards for teaching mathematics. Reston, VA.: Author. 150 National Council of Teachers of Mathematics. (1995). Assessment standards for school mathematics. Reston, VA.: Author. National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA.: Author. National Council of Teachers of Mathematics. (2009). Focus on high school mathematics: Reasoning and sense making. Reston, VA: Author. National Council of Teachers of Mathematics. (2011). Making it happen. Reston, VA: Author. National Governors Association & Council of Chief State School Officers. (2010). Common core state standards. Washington, D.C.: Author. Retrieved from http://www.corestandards.org/the-standards National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education. Olani, A. A., Hoekstra, R. R., Harskamp, E. E., & van der Werf, G. G. (2011). 
Statistical reasoning ability, self-efficacy, and value beliefs in a reform based university statistics course. Electronic Journal of Research in Educational Psychology, 9(1), 49-72. Osborne, J., & Costello, A. (2004). Sample size and subject to item ratio in principal components analysis. Practical Assessment, Research & Evaluation, 9(11). Retrieved from http://pareonline.net/getvn.asp?v=9&n=11 Pajares, F., & Kranzler, J. (1995). Self-efficacy beliefs and general mental ability in mathematical problem-solving. Contemporary Educational Psychology, 20(4), 426. 151 Pajares, F. & Graham, L. (1999). Self-efficacy, motivation constructs, and mathematics performance of entering middle school students. Contemporary Educational Psychology, 24(2), 124-139. Parmer, P. & Cutler, J. (2007). Easing the transition: Building better bridges between developmental and college-level math. Journal of Applied Research in the Community College, 15(1), 37-45. Perin, D. (2006). Can community colleges protect both access and standards? The problem of remediation. Teachers College Record, 108(3), 339-373. Perna, L. W. (2005). The benefits of higher education: Sex, racial/ethnic, and socioeconomic group differences. Review of Higher Education, 29(1), 23-52. Phoenix, C. Y. (1990). A four-strategy approach used to teach remedial mathematics in a freshman year program. Community Review, 11(1), 45-53. Piburn, M., & Sawada, D. (2000). Reformed teaching observation protocol (RTOP): Reference manual (ACEPT Technical Report IN00-3). Retrieved from http://www.ecept.net/rtop/RTOP_Reference_Manual.pdf Pietsch, J., Walker, R., & Chapman, E. (2003). The relationship among self-concept, self- efficacy, and performance in mathematics during secondary school. Journal of Educational Psychology, 95(3), 589-603. Pines, A., & West, L. (1986). Conceptual understanding and science learning: An Interpretation of research within a sources-of-knowledge framework. Science Education, 70(5), 583- 604. Pintrich, P.R. & De Groot, E. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33-50. 152 Radford, A., Pearson, J., Ho, P., Chambers, E., & Ferlazzo, D. (2012). Remedial coursework in postsecondary education: The students, their outcomes, and strategies for improvement. Berkley, California: MPR Associates. Reys, R., Reys, B., Lapan, R., Holliday, G., & Wasman, D. (2003). Assessing the impact of Standards-based middle grades mathematics curriculum materials on student achievement. Journal for Research in Mathematics Education 34(1): 74 - 95. Robinson, E. & Robinson, M. (1998). MATH Connections: A secondary mathematics core curriculum content overview. In Guide to Standards-based instructional materials in secondary mathematics (1st ed.). Ithaca, New York: COMPASS. Sapsford, R. (1999). Survey research. London: Sage. Saxon, D., & Boylan, H. (2001). The cost of remedial education in higher education. Journal of Developmental Education, 25(2), 2. Schoen, H., Hirsch, C., & Ziebarth, S. (1998). An emerging profile of the mathematical achievement of students in the Core-Plus Mathematics Project. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA. Schroeder, T. L. & Lester, F. K. (1989). Developing understanding in mathematics via problem solving. In P. R. Trafton (Ed.), New directions for elementary school mathematics (pp. 31-56). Reston, VA: National Council of Teachers of Mathematics. Senk, S. & Thompson, D. 
(2003) Standards-based school mathematics curricula: What are they? What do students learn? Lawrence Erlbaum Associates, Inc: Mahwah. Senk, S. & Thompson, D. (2006). Strategies used by second-year algebra students to solve problems. Journal for Research in Mathematics Education 37(2), 116-128. 153 Skemp, R. (2006). Relational understanding and instrumental understanding. Mathematics Teaching in the Middle School, 12(2), 88-95. (Reprinted from Mathematics Teaching, 77, pp. 20-26, 1976) Sparks, D. & Malkus, N. (2013). First-year undergraduate remedial coursetaking: 1999-2000, 2003-04, 2007-08 (NCES 2013-013). Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013013 Squires, J., Faulkner, J., & Hite, C. (2009). Do the math: Course redesign?s impact on learning and scheduling. Community College Journal of Research and Practice, 33(11), 883-886. Stenmark, J. (1989). One point of view: What is mathematics equity? Arithmetic Teacher, 36(3), 3. Thompson, C. J. (2009). Preparation, practice, and performance: An empirical examination of the impact of Standards-based Instruction on secondary students' math and science achievement. Research in Education, 81, 53-62. Thompson, D. & Senk, S. (2001). The effects of curriculum on achievement in second-year algebra: The example of the University of Chicago School Mathematics Project. Journal for Research in Mathematics Education, 32(1), 58-84. Tomlinson, L. M. (1989). Postsecondary developmental programs: A traditional agenda with new imperatives. Washington, D.C.: The George Washington University. Trenholm, S. (2006). A study on the efficacy of computer-mediated developmental math instruction for traditional community college students. Research & Teaching in Developmental Education, 22(2), 51-62. Vassiliou, J. (2011). Improved outcomes with computer-assisted instruction in mathematics and English language skills for Hispanic students in need of remedial education at Miami 154 Dade College, Florida. Community College Journal of Research and Practice, 35(3), 191-201. Villarreal, L. (2003). A Step in the positive direction: Integrating a computer laboratory component into developmental algebra courses. Mathematics & Computer Education, 37(1), 72-78. Virginia Community College System, Office of Institutional Research &. Effectiveness. (2011). Developmental education annual report: Tracking the fall 2006 cohort and five-year historical trends. Retrieved from http://www.vccs.edu/Portals/0/ContentAreas/AcademicServices/Dev_Ed_Annual_Report _201102.pdf Vosniadou, S., & Brewer, W. (1987). Theories of knowledge restructuring in development. Review of Educational Research, 57(1), 51-67. Walker, S., & Senger, E. (2007). Using technology to teach developmental African-American algebra students. Journal of Computers in Mathematics & Science Teaching, 26(3), 217- 231. Webb, N. (2003). The impact of the Interactive Mathematics Program on student learning. In S. L. Senk & D. R. Thompson (Eds.), Standards-based school mathematics curricula: What are they? What do students learn? (pp. 375-398). Mahwah: Lawrence Erlbaum Associates. White-Clark, R., DiCarlo, M., & Gilchriest, N. (2008). "Guide on the side": An instructional approach to meet mathematics standards. High School Journal, 91(4), 40-44. Wigfield, A. & Eccles, J. (2002). The development of competence beliefs, expectancies for success, and achievement values from childhood through adolescence. In A. Wigfield & 155 J. Eccles (Eds.), Development of achievement motivation (pp. 91-120). 
San Diego: Academic Press. Wolfle, J. D. (2012). Success and persistence of developmental mathematics students based on age and ethnicity. Community College Enterprise, 18(2), 39-54. Woodard, T., & Burkett, S. (2005). Comparing success rates of developmental math students. Inquiry, 10(1), 54-63. Zavarella, C., & Ignash, J. (2009). Instructional delivery in developmental mathematics: Impact on retention. Journal of Developmental Education, 32(3), 2-4. 156 Appendix A Permission Forms 157 Grade Release Form 158 Informed Consent Form 159 160 Appendix B Student Surveys 161 Efficacy Survey Below are some questions about you as a student in this math class. Please circle the number that best describes what you think. Your responses will be kept anonymous. 1. I'm certain I can master the skills taught in math class this semester. 1 2 3 4 5 NOT AT ALL TRUE SOMEWHAT TRUE VERY TRUE 2. I'm certain I can figure out how to do the most difficult class work in math class. 1 2 3 4 5 NOT AT ALL TRUE SOMEWHAT TRUE VERY TRUE 3. I can do almost all the work in math class if I don't give up. 1 2 3 4 5 NOT AT ALL TRUE SOMEWHAT TRUE VERY TRUE 4. Even if the work is hard in math class, I can learn it. 1 2 3 4 5 NOT AT ALL TRUE SOMEWHAT TRUE VERY TRUE 5. I can do even the hardest work in this math class if I try. 1 2 3 4 5 NOT AT ALL TRUE SOMEWHAT TRUE VERY TRUE 162 Demographic Information For the purposes of this study, please provide the following information. The information will be kept confidential. Name: ______________________ On average, for how many hours are you employed each week? ___________ For how many credit-hours did you enroll this semester? __________ What is your age? __________ What is your sex: Male Female Please specify your race: Black/African American White Hispanic Native American Asian/Pacific Islander Other 163 Anonymous End-of-Term Student Survey (Traditional Course) 1. How does this math class compare to other math classes that you have had? Explain. 2. What are some things you liked about the course? 3. What are some things you did not like about the course? 4. Other comments (room on back) 164 Anonymous End-of-Term Student Survey (Reform-oriented Course) 1. How does this math class compare to other math classes that you have had? Explain. 2. What are some things you liked about the course? 3. What are some things you did not like about the course? 4. To what extent did you like working with your classmates during class? Explain. 5. Did you find the graphing calculator useful? If yes, please explain how/when it was useful. 6. To what extent did you benefit from presenting your work to the class (or watching your classmates present their work to the class)? Explain. 7. Other comments (room on back) 165 Appendix C Sample Application Problems 166 Sample Application Problems Test 1 (Factoring): A construction worker accidently drops a tool from the top of a 256- foot building. The height h of the tool after t seconds is given by h = -16t2 + 256. When will the tool hit the ground? Test 2 (Rational Equations): On an architect?s blueprint, 1 inch corresponds to 4 feet. Find the length of a wall represented by a line 3 ? inches long on the blueprint. Round to the nearest tenth if necessary. Test 3 (Functions): Michelle just purchased a used car from her uncle and agreed to pay him a certain amount of money at the end of each month. After 3 months, she owed $6700 on the car. After 7 months, she owed $4300 on the car. Will she be able to pay off her car by the end of the 12th month? 
Explain carefully.

Test 4 (Radicals): A CSI Forensic Team found a dead man lying in the road next to a very tall apartment building. The Forensic Team determined that the man was traveling at least 90 feet per second when he hit the ground. If the formula describes how fast a person will be falling when they hit the ground based on their initial height off the ground, how high off the ground was the man when he fell off the building?

Test 5 (Quadratic Equations): The following equation describes the profit, P(x), that a car dealership makes based on the number of employees, x, that it hires: P(x) = -3x² + 240x. A) Find the number of employees that the dealership should hire in order to maximize its profit. B) What is the maximum profit that the dealership can make?

Appendix D
Reformed Teaching Observation Protocol

Appendix E
Paired Lesson Plans

Traditional Lesson Plan for Difference of Squares

Title of Lesson: Factoring Differences of Squares (Traditional course)
Audience: Math 0800
Content Objectives:
Have students see that (x + a)(x - a) = x² - a²
Have students see that adding x feet to the length of a square and then subtracting x feet from the width of the square reduces the area of the square by x² square feet
Behavioral Objectives:
The student will be able to factor an expression containing a difference of squares
The student will be able to verify that the expression was factored correctly
Prerequisites:
How to multiply binomial expressions: (a + b)(c + d)
How to add like terms
Materials: None

Procedure

Phase Preliminary:
Overview: Ask students if they have any questions on their homework or other concepts already covered in class.
Grouping: Students sit in desks by themselves, facing the instructor.
Tasks:
1) The instructor will answer homework questions posed by the students.
Key Questions: Are there any questions on the homework or other material that we have covered?

Phase 1:
Overview: Demonstrate to students that the product (A + B)(A - B) will always result in an answer of the form A² - B².
Grouping: Students sit in desks by themselves, facing the instructor.
Tasks:
1) The instructor will present the generalization for the difference of squares, (A + B)(A - B) = A² - B²: "There is a special pattern in math called the 'difference of squares.' The pattern was given its name because anytime you multiply two terms (A - B)(A + B) [write this on the board], you will always get an answer that looks like A² - B² [continue writing on the board: = A² - B²]."
2) The instructor will then demonstrate this difference of squares pattern with the following problems:
a) (x + 3)(x - 3) = x² - 9
b) (x + 5)(x - 5) = x² - 25
Key questions: Suppose you have (Y + Z)(Y - Z). Without working it out, what will the product look like? Are there any questions?

Phase 2:
Overview: Apply the factoring pattern in Phase 1 to factor several relatively simple difference of squares expressions.
Grouping: Students sit in desks by themselves, facing the instructor.
Tasks:
1) The instructor will explain that recognizing a difference of squares can enable someone to identify the product from which it came.
Appendix D
Reformed Teaching Observation Protocol

Appendix E
Paired Lesson Plans

Traditional Lesson Plan for Difference of Squares

Title of Lesson: Factoring Differences of Squares (Traditional course)
Audience: Math 0800
Content Objectives:
Have students see that (x + a)(x - a) = x² - a²
Have students see that adding x feet to the length of a square and then subtracting x feet from the width of the square reduces the area of the square by x² square feet
Behavioral Objectives:
The student will be able to factor an expression containing a difference of squares
The student will be able to verify that the expression was factored correctly
Prerequisites:
How to multiply binomial expressions: (a + b)(c + d)
How to add like terms
Materials: None

Procedure

Phase Preliminary:
Overview: Ask students if they have any questions on their homework or other concepts already covered in class.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks: 1) The instructor will answer homework questions posed by the students.
Key Questions: Are there any questions with the homework or other material that we have covered?

Phase 1:
Overview: Demonstrate to students that the product (A + B)(A - B) will always result in an answer of the form A² - B².
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will present the generalization for the difference of squares: (A + B)(A - B) = A² - B². "There is a special pattern in math called the 'difference of squares.' The pattern was given its name because anytime you multiply two terms (A - B)(A + B) [write this on the board], you will always get an answer that looks like this: A² - B² [continue writing on the board: = A² - B²]."
2) The instructor will then demonstrate this difference of squares pattern with the following problems:
a) (x + 3)(x - 3) = x² - 9
b) (x + 5)(x - 5) = x² - 25
Key Questions: Suppose you have (Y + Z)(Y - Z). Without working it out, what will the product look like? Are there any questions?

Phase 2:
Overview: Apply the factoring pattern in Phase 1 to factor several relatively simple difference of squares expressions.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will explain that recognizing a difference of squares can enable someone to identify the product from which it came.
2) The instructor will then state the following rules for factoring a difference of squares:
a) Verify that the expression is indeed a difference of squares by i) observing that the expression is a difference of two terms and ii) identifying the square root of each term
b) Write the square roots down in their corresponding parentheses
c) Place an addition sign in one set of parentheses and a subtraction sign in the other set of parentheses
3) The instructor will demonstrate how to factor the following problems:
a) x² - 49
b) y⁴ - 100b²
c) z¹⁰ - 144R⁶
4) The instructor will then use the previous problems to show students how to check their answers by multiplying the factors back out.
5) The instructor will ask students if they have any questions. Once all questions are answered, the instructor will ask students to factor the following problem:
t⁴ - 49z⁶     Solution: t⁴ - 49z⁶ = (t² - 7z³)(t² + 7z³)
Question: How do you know if you have factored correctly?
Key Questions: If you have a difference of squares, how do you find its factors? Are there any questions?

Phase 3:
Overview: Apply the factoring pattern in Phase 1 to factor other, more complicated variations of difference of squares problems.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will present and solve a variety of factoring problems that embody different ways in which the difference of squares pattern can emerge:
a) 3x² - 12 (factor out the 3 first in order to reveal a difference of squares)
b) -9x² + 100 (rearrange the terms in order to reveal a difference of squares)
c) P⁸ - 16 (perform the difference of squares twice to fully factor the expression)
2) The instructor will then present the factoring problem "x² + 9" and emphasize that problems possessing the "sum of squares" pattern cannot be factored.
3) The instructor will then ask students if they have any questions. After questions are answered, the instructor will ask students to factor the following problems:
a) x⁴ - 81     Solution: x⁴ - 81 = (x² - 9)(x² + 9) = (x + 3)(x - 3)(x² + 9)
Questions: Can x² - 9 be factored down further? How about x² + 9?
b) 162 - 8y⁴     Solution: 162 - 8y⁴ = 2(81 - 4y⁴) = 2(9 - 2y²)(9 + 2y²)
Question: What did you have to do to see that you had a difference of squares?
Question: How many times did you have to factor a difference of squares?
Key Questions: What are signs that you might have a difference of squares pattern? What might you need to do in order to see a difference of squares pattern? Can you factor a sum of squares? Are there any questions?

Phase 4:
Overview: Apply the difference of squares factoring pattern to real estate.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will present the following scenario: "A city developer normally sells square lots (S × S). However, he offers a special deal to a newlywed couple. For the same price as the square lot, the developer will turn the square lot into a rectangle by adding B feet to the length and then subtracting B feet from the width. Should the couple accept the developer's offer?"
2) The instructor first will solve the scenario abstractly in terms of S and B. Solution: The original lot has an area of (S)(S) = S². The modified lot will have an area of (S + B)(S - B) = S² - B². So the modified lot will have B² less area. Therefore the couple should not accept the offer.
3) The instructor will then present a solution to the scenario by letting S = 10 and B = 3. Solution: The original lot has an area of (10)(10) = 100 square feet. The modified lot will have an area of (10 + 3)(10 - 3) = 100 + 30 - 30 - 9 = 100 - 9 = 91 square feet.
Key Questions: Are there any questions?
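Both this traditional lesson and the reform-oriented lesson that follows rest on the identity presented in Phase 1. A one-line verification by direct expansion, added here for reference, is:

\[
(A + B)(A - B) = A^2 - AB + AB - B^2 = A^2 - B^2,
\]

which also accounts for the Phase 4 result: (S + B)(S - B) = S² - B², so the modified lot always loses B² square feet of area.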
Reform-oriented Lesson Plan for Difference of Squares

Title of Lesson: Factoring Differences of Squares (Reform-oriented course)
Audience: Math 0800
Content Objectives:
Have students see that (x + a)(x - a) = x² - a²
Have students see that adding x feet to the length of a square and then subtracting x feet from the width of the square reduces the area of the square by x² square feet
Behavioral Objectives:
The student will be able to factor an expression containing a difference of squares
The student will be able to verify that the expression was factored correctly
Prerequisites:
How to multiply binomial expressions: (a + b)(c + d)
How to add like terms
Materials: None

Procedure

Phase 1:
Overview: Present a situation where a developer modifies square lots by adding a particular distance to the length of the square and then subtracting the same distance from the width of the square (to form a rectangle).
Grouping: Students will sit in groups that are heterogeneous in mathematical ability.
Tasks:
1) Determine if the developer's modifications alter the area of the original lot
2) Determine the effect that adding/subtracting x feet to a square lot has on the area of the original lot
3) When given the area of the new/modified lot, determine what changes the developer made
4) Generalize the factoring pattern for x² - c²
Key Questions: How could you go about figuring out what happens to the area of the lot when the developer implements his changes? What happens to the area of the lot when the developer adds/subtracts the same amount to each side? How much does the area of the lot change when the developer adds/subtracts x feet to each side? What strategies (e.g., pictures, tables) helped you discover the effect that occurred when altering the original lots?

Phase 2:
Overview: Apply the factoring pattern in Phase 1 to factor various expressions.
Grouping: Students will sit in groups that are heterogeneous in mathematical ability.
Tasks:
1) Factor: x⁴ - 81     Solution: x⁴ - 81 = (x² - 9)(x² + 9) = (x + 3)(x - 3)(x² + 9)
Questions: Can x² - 9 be factored down further? How about x² + 9?
2) Factor: 162 - 8y⁴     Solution: 162 - 8y⁴ = 2(81 - 4y⁴) = 2(9 - 2y²)(9 + 2y²)
Question: What did you have to do to see that you had a difference of squares?
Question: How many times did you have to factor a difference of squares?
Key Questions: How do you verify that you have factored a problem correctly? How do you factor a sum of squares (like x² + 9)?

Handout Given to Students

Recall that a city developer wanted to change the boring square house lots of a neighborhood into more creative rectangular lots. To spice things up, he added, say, 2 meters to the length of the square and then subtracted the same amount (in this case 2 meters) from the depth. So if the developer adds some amount to the length but immediately subtracts that same amount from the depth, what happens to the area of the lot?
For each of the changes that were made to the square lots, complete these tasks:
* Make and label a sketch of the original square lot, using the variable X to represent the length of the original square
* Make and label a sketch of the new lot, using the variable X to represent the length of the original square
* Write an expression for the area of the new lot as a product of its length and width
* Write an expression without parentheses for the area of the new lot as a sum of smaller areas. Use your sketch to explain this expression.

1) Suppose that the developer wanted to increase one side of the square lot by 5 meters but decrease the other side by 5 meters. Will the area of the new lot be the same as the area of the original lot?
2) Suppose that the developer wanted to increase one side of the square lot by 3 meters but decrease the other side by 3 meters. Will the area of the new lot be the same as the area of the original lot?
3) So when the developer increases/decreases the lot by some amount, B, what happens to the area of the lot?
4) If you see a lot with an area of x² - 16, what were the original dimensions?
5) If you see something like x² - c², how do you factor that?

Factor: x⁴ - 81
Factor: 162 - 8y⁴

Dimensions of new lot          Area
(blank table for recording student work)
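For reference, the two factoring exercises at the end of the handout work out as follows (a sketch added here; each line can be checked by expanding the right-hand side):

\[
\begin{aligned}
x^4 - 81 &= (x^2 - 9)(x^2 + 9) = (x + 3)(x - 3)(x^2 + 9),\\
162 - 8y^4 &= 2(81 - 4y^4) = 2(9 - 2y^2)(9 + 2y^2).
\end{aligned}
\]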
Traditional Lesson for Shifting of Graphs

Title of Lesson: Shifting of Graphs
Audience: Math 0800
Content Objectives:
Have students recognize 4 families of graphs (linear, quadratic, radical, absolute value)
Have students see that y = (x + a)² will shift the graph horizontally "-a" units
Have students see that y = x² + a will shift the graph vertically "a" units
Have students see that y = -x² will flip the graph
Behavioral Objectives:
The student will be able to graph quadratic, radical, and absolute value functions
Prerequisites:
How to create and use an x/y table
How to compute absolute value expressions
Materials: None

Procedure

Phase Preliminary:
Overview: Ask students if they have any questions on their homework or other concepts already covered in class.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks: 1) The instructor will answer homework questions posed by the students.
Key Questions: Are there any questions with the homework or other material that we have covered?

Phase 1:
Overview: Provide four common functions and their corresponding x/y tables and graphs.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will write the following functions on the board and then explain how to use an x/y table to graph each of the functions:
y = x     y = x²     y = √x     y = |x|
2) The instructor will explain that the above four functions are common in mathematics courses and that it is important to become familiar with each of their shapes: "These 4 functions are 'mother functions' because they are the simplest versions of each family of functions, and from each of them come all of the other functions that we will see in this course. Note that the four basic shapes are lines, U's, cursive r's, and v's."
Key Questions: If you ever forget what the shape of a graph is, what can you do? (Use an x/y table.)

Phase 2:
Overview: Demonstrate the change in the y = x² function's graph by systematically modifying the original function.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will state that for the y = x² family of functions, y = x² + a will shift the graph "a" spaces along the y-axis, y = (x + a)² will shift the graph "-a" spaces along the x-axis, and y = -x² will flip the graph.
2) The instructor will modify y = x² and provide x/y tables to support the creation of the newly adjusted graphs:
y = x²     y = x² + 1     y = x² - 1     y = (x + 1)²     y = (x - 1)²     y = -x²
3) The instructor will provide other tips:
a) Notice that these types of graphs have a natural symmetry
b) It helps to focus on the vertex of a graph when shifting it
Key Questions: What is the basic shape of a y = x² graph? How does adding/subtracting within the squaring mechanism affect the graph compared to adding/subtracting outside the squaring mechanism? What happens to the original graph when you put a negative sign on the x²?

Phase 3:
Overview: Demonstrate the change in the y = √x function's graph by systematically modifying the original function.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will state that for the y = √x family of functions, y = √x + a will shift the graph "a" spaces along the y-axis, y = √(x + a) will shift the graph "-a" spaces along the x-axis, and y = -√x will flip the graph.
2) The instructor will modify y = √x and provide x/y tables to support the creation of newly adjusted graphs:
y = √x     y = √x + 1     y = √x - 1     y = √(x + 1) (students try)     y = √(x - 1)     y = -√x
3) The instructor will provide other tips:
a) Notice that the graph does not go infinitely in both directions; it has a "starting point"
b) The starting point exists because negative numbers are not allowed in the square root
c) The graph starts off rather quickly and then grows very slowly
Key Questions: What is the basic shape of a y = √x graph? How does adding/subtracting inside the radical affect the graph compared to adding/subtracting outside the radical? What happens to the original graph when you put a negative sign in front of the √x?

Phase 4:
Overview: Demonstrate the change in the y = |x| function's graph by systematically modifying the original function.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will state that for the y = |x| family of functions, y = |x| + a will shift the graph "a" spaces along the y-axis, y = |x + a| will shift the graph "-a" spaces along the x-axis, and y = -|x| will flip the graph.
2) The instructor will modify y = |x| and provide x/y tables to support the creation of newly adjusted graphs:
y = |x|     y = |x| + 1 (students try)     y = |x| - 1     y = |x + 1|     y = |x - 1|     y = -|x|
Key Questions: What is the basic shape of a y = |x| graph? How does adding/subtracting inside the absolute value affect the graph compared to adding/subtracting outside the absolute value? What happens to the original graph when you put a negative sign in front of the |x|?

Phase 5:
Overview: Demonstrate the change in each function's graph by modifying the original function in multiple ways.
Grouping: Students sit in desks by themselves facing the instructor.
Tasks:
1) The instructor will state that the above changes are cumulative; modifying a function in 2 ways will cause the graph to change in those 2 respective ways. For example:
y = (x + 2)² - 1 will cause the vertex (and thus the rest of the graph) to shift left 2 spaces and down 1 space. The instructor will show the change in the graph by using both the rules and an x/y table.
y = -√(x - 2) + 3 will cause the vertex (and thus the rest of the graph) to shift right 2 spaces and up 3 spaces; the negative sign in front of the square root will cause the graph to flip. The instructor will show the change in the graph by using both the rules and an x/y table.
2) Ask students to graph several functions:
a) Ask students to graph y = -|x + 4| - 2 on their own. After a few minutes, ask them to describe the graph of the function.
b) Ask students to graph y = on their own. After a few minutes, ask them to describe the graph of the function.
c) Ask students to graph y = (x - 3)² - 1 on their own. After a few minutes, ask them to describe the graph of the function.
Key Questions: How can you look at a graph and determine its shape, orientation, and location? If you forget which way a graph is supposed to shift, what can you do to figure out the graph's correct orientation? (Use an x/y table.) What range of x-values should you use when creating an x/y table to graph a function? (At least -6 to 6.)
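As a concrete illustration of the cumulative shifting rules in Phase 5 (a sketch added for reference, not part of the original lesson plan), the first example can be checked against a short x/y table:

\[
y = (x + 2)^2 - 1:\qquad
\begin{array}{c|ccccc}
x & -4 & -3 & -2 & -1 & 0\\ \hline
y & 3 & 0 & -1 & 0 & 3
\end{array}
\]

The minimum point is (-2, -1), which sits 2 units left of and 1 unit below the vertex of y = x², exactly as the shifting rules predict.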
Reform-oriented Lesson for Shifting of Graphs

Title of Lesson: Shifting of Graphs (Reform-oriented course)
Audience: Math 0800
Content Objectives:
Have students recognize 4 families of graphs (linear, quadratic, radical, absolute value)
Have students see that y = (x + a)² will shift the graph horizontally "-a" units
Have students see that y = x² + a will shift the graph vertically "a" units
Have students see that y = -x² will flip the graph
Behavioral Objectives:
The student will be able to graph quadratic, radical, and absolute value functions
Prerequisites:
How to create and use an x/y table
How to compute absolute value expressions
Materials: Graphing calculators

Procedure

Phase 1:
Overview: Present graphs of 4 functions (absolute value, quadratic, radical, and linear) and ask the students to match those graphs to their corresponding equations.
Grouping: Students will sit in preselected groups that are heterogeneous in mathematical ability.
Tasks:
1) Students match the four graphs presented to their corresponding equations by using either an x/y table or the graphing calculator.
Key Questions: What is the basic shape of each of the following graphs: quadratic, radical, absolute value, linear? How are the linear and absolute value graphs related? How are the quadratic and absolute value graphs similar and different? How is the radical graph different from the other graphs? How could you convince someone that the graph of a quadratic function (for example) is shaped like a U? If you use an x/y table to graph the absolute value function, and you only use the points associated with x = 0, 1, 2, what type of graph might you create? If you use an x/y table to graph your functions, which x-values and how many x-values should you use?

Phase 2:
Overview: Students figure out what happens to the original graphs when the equations of those graphs are modified.
Grouping: Students will sit in preselected groups that are heterogeneous in mathematical ability.
Tasks:
1) Students graph 6 carefully selected versions of y = x² and determine if they can make any generalizations regarding the way the original graph is changed:
y = x²     y = -x²     y = (x + 2)²     y = (x - 2)²     y = x² + 2     y = x² - 2
2) Students confirm within their groups that their graphs are correct, and then the class as a whole will verify that the graphs are correct. Students will also describe how they graphed the functions.
3) Students will discuss generalizations within their groups and then discuss their generalizations with the rest of the class.
Key Questions: What generalizations/patterns do you notice? Do you think that the patterns you see hold for other types of functions?
What did you notice when you added 2 inside the squaring mechanism versus when you subtracted 2 within the squaring mechanism? What happened when you added 2 outside the squaring mechanism versus when you subtracted 2 outside the squaring mechanism? How does adding/subtracting within the squaring mechanism affect the graph compared to adding/subtracting outside the squaring mechanism?

Phase 3:
Overview: Students determine if the generalizations made in Phase 2 hold for absolute value equations.
Grouping: Students will sit in preselected groups that are heterogeneous in mathematical ability.
Tasks:
1) Students graph 6 carefully selected versions of y = |x|:
y = |x|     y = -|x|     y = |x + 1|     y = |x - 1|     y = |x| + 1     y = |x| - 1
2) Students confirm within their groups that their graphs are correct, and then the class as a whole will verify that the graphs are correct and discuss how they graphed the functions.
3) Students will discuss within their groups whether their generalizations from quadratic functions also hold for absolute value functions.
Key Questions: How many of the generalizations you made for quadratic functions held for absolute value functions? Do you think these generalizations hold for radical functions also? What are the pros and cons of using a graphing calculator to graph functions?

Phase 4:
Overview: Students use the previous generalizations to graph functions with combinations of shifting [e.g., y = -(x + 2)² - 4].
Grouping: Students will sit in groups that are heterogeneous in mathematical ability.
Tasks:
1) Students will consider the functions below and declare how they think each graph will look:
f(x) = (x - 3)² + 2     f(x) = -|x - 2| - 4
2) Students will then verify that their declarations are correct.
Key Questions: How can you look at a graph and determine its shape, orientation, and location? If you are ever unsure of how a graph should look, what can you do?

Handout Given to Students

Families of Functions

1. Match the following functions to their corresponding graphs and x/y tables. (The four graphs appear on the handout.)

Tables of values:

x:     -3   -2   -1    0    1    2    3
f(x):   3    2    1    0    1    2    3

x:     -3   -2   -1    0    1    2    3
f(x):  -3   -2   -1    0    1    2    3

x:     -3          -2          -1          0    1    4    9
f(x):  undefined   undefined   undefined   0    1    2    3

x:     -3   -2   -1    0    1    2    3
f(x):   9    4    1    0    1    4    9

Quick question: Are the above graphs functions? How do you know?

2. The above functions are sometimes called "parent" functions because they are written as simply as possible. But what happens when you start changing these parent functions one piece at a time? Let's first look at the parent function f(x) = x². How do you think the picture associated with f(x) = x² will change as you modify different parts of the function? Graph the following functions (LABEL EACH GRID) and see if you can detect a pattern. (Use as many of the following grids as you like to keep your graphs from getting too cluttered.) (Hints: 1. Divide the work among your teammates; 2. your graphing calculator can save you bunches of time.)

f(x) = x²     f(x) = (x + 2)²     f(x) = (x - 2)²     f(x) = x² + 2     f(x) = x² - 2     f(x) = -x²

Can you make any generalizations about how the picture will change based on how the function is changed?

3. In the previous section, you may have developed an idea about how the graph will change when you modify different parts of the function. Let's see if your suspicions hold true for the next set of functions.

f(x) = |x|     f(x) = |x + 2|     f(x) = |x - 2|     f(x) = |x| + 2     f(x) = |x| - 2     f(x) = -|x|

What generalizations can you make now?

4. Based on your observations above, what do you think will happen to the graphs of the following functions? FIRST WRITE DOWN WHAT YOU THINK WILL HAPPEN. After you've done that, then see if you were right.

f(x) = (x - 3)² + 2
f(x) = -|x - 2| - 4

If you are ever unsure of how a graph should look (and you don't have your graphing calculator), what can you do?
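The generalization this handout is driving at can also be justified algebraically. The short argument below is an added sketch for reference, not part of the handout itself: if g(x) = f(x - h) + k, then for every input a,

\[
g(a + h) = f(a) + k,
\]

so each point (a, f(a)) on the graph of f corresponds to the point (a + h, f(a) + k) on the graph of g; the whole graph shifts h units horizontally and k units vertically. Taking f(x) = x², h = 3, and k = 2 gives the first function in item 4, f(x) = (x - 3)² + 2, whose vertex therefore sits at (3, 2).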
Appendix F
Responses to Open-ended Student Surveys

For each group and question, categories are listed with the number of comments coded to each category, followed by representative student quotes.

Student Responses from the Control Group for Question 1

Question 1: "How does this math class compare to other math classes that you have had? Explain."

Comments about the teacher
Positive comments about the instructor's explanations (11 comments): HH. This class has been very helpful. Mr. Smith taught us in detail the steps of a problem. KK. It is different because in my other math classes I was lost because I didn't have my instructor break the work down. LL. The teacher explained things better in this class. TT. This Math 0800 class is very easy because our instructor simplifies the problems, so everyone can learn it and grasp on to the concept. VV. [The instructor] explains basic concepts better than other instructors I have had.
Positive comments about the instructor in general (7 comments): AA. The only good thing about [the course] was the professor. He made me not dread coming to class. BB. The teacher spent more time covering the material and actually helping students. EE. Awesome teacher. RR. The teacher provides an in-depth learning experience. SS. I enjoyed this math class [because] my teacher was very funny.
Neutral comments about the instructor in general (1 comment): JJ. I have never had a teacher have their back towards the class.

Miscellaneous comments about the course
Positive miscellaneous comments about the course (5 comments): QQ. This math class is much easier to follow and understand. QQ. The pace that is set is extremely acceptable. WW. [The course] was pretty cool. FF. [The course was] more hands on. MM. It actually breaks down the material for my understanding.
Neutral miscellaneous comments about the course (2 comments): II. This class is more in depth. OO. [The course] was a review because I knew most of the material.
Negative miscellaneous comments about the course (1 comment): PP. [The course] was very difficult. I had a hard time in this class.

Student Responses from the Experimental Group for Question 1

Question 1: "How does this math class compare to other math classes that you have had? Explain."

Conceptual Understanding
Positive comments about improving conceptual understanding (8 comments): C. This math class was easier to understand. The pictures and graphs helped me to visualize the problems and the concepts. This made a big difference for me. K. We actually learned WHY things in math are the way they are. And we were asked why does a graph do this and how does the equation give certain values. Other classes told us "this is the answer and that's it." (quotes added) S. Concepts taught by using logical explanations as opposed to algebraic. P. [The class] was different, but I learned MUCH more. I feel like I learned "math", not just 0800 stuff. I feel like I have a lot more tools now to use in my next math course. I am more confident with numbers now.

Group work/student interactions
Positive comments about group work/student interactions (4 comments): O. It was more open and interactive. R. Never worked in groups before; it was fun. S. Sitting in groups is different. More discussion than lecture.
Negative comments about group work/student interactions (1 comment): J. It was harder to learn in [this class] because we had to ask group members for answers/explanations rather than [the] teacher.

Opportunities for students to learn the material themselves
Positive comments about students having opportunities to learn the material themselves (1 comment): E. Lots different in the sense "we teach ourselves" by trial and error, but it's a good different!
Neutral comments about students having opportunities to learn the material themselves (1 comment): G. It is different, almost taught by class with some instruction by teacher.
Negative comments about students having opportunities to learn the material themselves (1 comment): T. I don't deal well with "figure it out yourself" methods. I need to be told how to do something or else I will never get it.

Real World Applications
Real world applications (neutral comment, 1 comment): A. [This class] applied more to [the] real world.

Miscellaneous comments about the course
Positive miscellaneous comments about the course (3 comments): B. Glad there was homework unlike my other class. Q. One of the best [classes]. U. [This class was] much better than previous classes. I enjoyed the method used to teach this class.
Neutral miscellaneous comments about the course (2 comments): H. It was harder, but still able to be learned. L. It was a little more challenging.
Negative miscellaneous comments about the course (1 comment): I. In previous classes, when [a student] was asking a question, the instructor NEVER answered back with a question. There is no point to [answering back with a question] when a [student] asks a question; [the student] needs help, not being asked another question and confusing them even more. This is far worse than the Pakistan teacher I had in high school.

Student Responses from the Control Group for Questions 2-4

Student Learning
Positive classroom environment (1 comment): SS. [The instructor] actually cared about our opinions on the lessons.

Comments about the Teaching Method
Positive comments about the teaching method (10 comments): AA. I enjoyed the lecture. HH. The teacher knows what he is doing and tried his best in giving us easy ways to solve the problems. JJ. [The instructor] does not make math boring. RR. The teaching style of [the instructor] is very good. SS. I liked that [the instructor] made math as fun as he knew how. QQ. [The instructor] is my favorite thing about this course, because even though he's brilliant, he can explain things in a VERY easy to understand way.
Negative comments about the teaching method (2 comments): JJ. [The instructor's] back towards the class. JJ. The use of the word "like". Math is a science, not an art, so it is "is" not "like," so I have been told by past teachers.

Comments about the teacher
Positive comments about the teacher (9 comments): DD. The professor was very helpful. MM. The enthusiasm from the teacher. SS. I really enjoyed my instructor. I think he is a wonderful teacher. I wish I could take him for all of my math courses. I'm going to miss him. OO. Professor Smith made sure everyone understood.

Comments about the course in general
Positive comments about the course in general (3 comments): GG. [The course] was easy. LL. The material wasn't as hard as I thought. PP. [The course is] preparing me for [the subsequent math course].
Negative comments about the course in general (5 comments): GG. Some tests were hard. II. [There was] not enough time. NN. We could only retake 1 test.
TT. [I did not like that students] couldn't use calculators on the [final exam] (I needed [the calculator] at times.)

Comments about homework
Positive comments about homework (4 comments): UU. I liked the homework. It wasn't overwhelming but it helped a lot. WW. [I liked the] homework.
Negative comments about homework (2 comments): TT. [I did not like] all the homework problems.

Other comments
Positive comments about the Math Lab (1 comment): VV. Math Lab provides a lot of extra practice; very helpful.
Negative comments about the Math Lab (1 comment): AA. The math lab was mind-numbing. Busy work isn't for me. Also, I feel that it hindered me in making [me] better.
Outside tutoring (1 comment): TT. [I liked getting help from] tutor[s] from the Instructional Support Lab.
Negative comments about the class's meeting time (5 comments): CC. I hate that I chose a morning class. VV. 8am. Blah!
Students' positive responses when asked about any negatives in the course (10 comments): BB. Nothing [was bad]. Everything was great. MM. N/A [nothing was bad].

Student Responses from the Experimental Group for Questions 2-7

Student Learning
Learning with Understanding (4 comments): K. Everything was broken down and nothing was harder than it had to be. My favorite part was learning how to find the square root of a number with a calculator and how the process of using monsters and certain level prisons were used. N. [I liked the] VISUAL LEARNING. P. [There were] no huge "math terms".
Positive classroom environment (1 comment): T. [I liked the] freedom to express ideas.

Student Presentations
Positive comments about student presentations in general (4 comments): O. [I liked] the way everyone was kinda forced to get involved and talk in front of class. L. It made me nervous, but it pushed me to make sure I had the right answers.
Mixed comments about student presentations in general (2 comments): G. [They helped me] a lot, except when they were wrong, then it confused me even more. T. Seeing different ways of doing something both confused and helped me.
Negative comments about student presentations in general (3 comments): J. [I] would rather have had [the] teacher explain it versus another student [explaining it]. S. Not applicable; more scrutiny/arguing took up class time that could have been spent on more examples and material.
Student presentations improved student learning (6 comments): K. It gave us a chance to not only compare answers but to see HOW we got the answer or if we found an easier way to do a problem. M. [Watching classmates present their work] helped me to understand what I had done in simple terms. N. Explaining it to others helps you learn. U. [Student presentations helped] a lot. It helped me retain what I learned in class.
Student presentations helped students to learn multiple ways to solve a problem/view a concept (7 comments): B. You could see how people worked problems different ways. K. [Student presentations helped us see] if we found an easier way to do a problem.
Student presentations improved students' confidence in their math abilities/public speaking skills (4 comments): C. When presenting my work to the class I gained more confidence in the way I was solving my problems. Others were also able to point out flaws in my work as I was [able to point out flaws in] theirs. It basically made the whole class a big group. E. [They helped me] not to be afraid of being wrong. N. [It] helped me to see that I can do things right.

Group Work
Positive comments about group work (20 comments): A. Yes, [it was] fun to see how everybody had a different thought process.
C. It was fun to argue over whose answers were wrong and right and then find out why, and some students had great ways of explaining things too. M. [I] enjoyed the interaction. [We] supported each other. O. It made class more fun and more interesting. You make friends easier. P. [I] LOVED it! It was VERY helpful to me and I learned A LOT from my classmates. Sometimes on the test I knew how to do a problem because I remembered something a classmate said. U. [Working with classmates helped] a lot. It brought up different views and opinions of problems that were beneficial to knowing problems inside and out.
Negative comments about group work (5 comments): C. Sometimes the people in the groups could be a little bit distracting. J. [I did not like] working in groups and depending on your table for correct answers and ways to solve problems. F. Not really. I like the teacher to teach. J. [I] would prefer to work with classmates like once a week instead of every day. T. I like talking to them but not sharing work.

Graphing Calculators
Positive comments about graphing calculators in general (8 comments): O. Yes, I had never used one since this class, and those things can practically solve the problem for you. P. Yes [I liked them], but I knew I couldn't use it on the final so I felt I couldn't "depend" on it. T. Yes. The use (once I got a handle on using it) of the table function and graphing equations was helpful.
Graphing calculators helped me to understand problems (2 comments): G. [They helped with] actually seeing how problems were worked. S. Yes, graphs helped comprehension.
Graphing calculators helped me check my answers (4 comments): L. Yes, it helped to verify my answers. N. VERY! [It was] awesome to learn ways to check.
Graphing calculators made graphing easier (8 comments): C. Yes. When my mind went blank on figuring out how to graph an equation, I remembered the graphing calculator way which saved me a couple of times. J. Yes, [they] helped with graphs, tables, and square roots. K. Yes, when you want to have a visual of the vertex or look at the x-intercepts or see how the graph shifts when you have y = x² vs. y = (x + 1)² - 4.
Graphing calculators helped me to solve problems (4 comments): A. Yes, [I liked graphing calculators because] basic math [took] less time to figure out. R. Yes, as soon as I learned how to use [the calculator], the problems became easier. U. Yes, [calculators] helped me solve problems using graphs and tables. I liked that I didn't have to solve everything mathematically.

Comments about the Teaching Method
Positive comments about the teaching method (7 comments): E. The professor didn't stand in front of class and lecture boringly every day. N. [I] LOVED [the course]. All of the course was helpful. G. Math was fun this semester. K. The class was great. I haven't had a decent math teacher since 8th grade, and I didn't hate waking up in the morning for math for a change after the teachers in high school.
Negative comments about the teaching method (11 comments): F. Too much time trying to figure things out on my own. I like a teacher that teaches the whole time. Example, example, example. O. Need to focus a little more on working a few more problems that are going to be on the test. S. Too much discussion; [it] takes longer to get through material. H. I would rather have learned the regular way of teaching with just problems instead of stories.

Comments about the teacher
Positive comments about the teacher (7 comments): N. Very enthusiastic teacher. P. [The teacher] does not make us feel stupid; you do not talk down to us.
V. The teacher is an excellent teacher, tutor, a lot of fun, and very knowledgeable and helpful.

Other comments
Conflicts with the Math Lab (3 comments): G. The class [way] did not correspond with the math lab way. P. Sometimes [the teacher] did not match what we [students] did in [Math] lab and that was hard. Sometimes I needed more practice than just one worksheet, because it was different than the lab work. S. Examples in class were not [the] same as [Math] lab or book (as in-depth).
Positive general comments (7 comments): A. [I liked the] handouts. B. [I liked] the homework. N. Thank you for EVERYTHING! O. Keep up the good work Mr. Luke.
Negative comment about the class's meeting time (1 comment): U. [The class] was at 8am.