STRATEGIC FACTORS OF INSTITUTIONAL PRACTICE IMPACTING STUDENT SUCCESS IN THE COMMUNITY COLLEGE AS PERCEIVED BY STUDENTS AND FACULTY: ACADEMIC PREPARATION, WORK ETHICS AND INSTITUTIONAL SUPPORT Except where reference is made to the work of others, the work described in this dissertation is my own or was done in collaboration with my advisory committee. This dissertation does not include proprietary or classified information. Kenneth Edward Scott Certificate of Approval: James V. Wright, Co-Chair, Professor, Counselor, Leadership and Special Education; Maria M. Witte, Co-Chair, Associate Professor, Educational Foundations, Leadership and Technology; Margaret E. Ross, Associate Professor, Educational Foundations, Leadership and Technology; David C. DiRamio, Assistant Professor, Educational Foundations, Leadership and Technology; Joe F. Pittman, Interim Dean, Graduate School. STRATEGIC FACTORS OF INSTITUTIONAL PRACTICE IMPACTING STUDENT SUCCESS IN THE COMMUNITY COLLEGE AS PERCEIVED BY STUDENTS AND FACULTY: ACADEMIC PREPARATION, WORK ETHICS AND INSTITUTIONAL SUPPORT Kenneth Edward Scott A Dissertation Submitted to the Graduate Faculty of Auburn University in Partial Fulfillment of the Requirements for the Degree of Doctor of Education Auburn, Alabama May 10, 2008 iii STRATEGIC FACTORS OF INSTITUTIONAL PRACTICE IMPACTING STUDENT SUCCESS IN THE COMMUNITY COLLEGE AS PERCEIVED BY STUDENTS AND FACULTY: ACADEMIC PREPARATION, WORK ETHICS AND INSTITUTIONAL SUPPORT Kenneth Edward Scott Permission is granted to Auburn University to make copies of this dissertation at its discretion, upon request of individuals or institutions at their expense. The author reserves all publication rights. Signature of Author Date of Graduation iv VITA Kenneth Edward Scott, son of Willie Edward Scott and Mary Faye Brannen-Scott, was born May 28, 1953, in Jesup, Georgia, United States of America. He graduated from Greenville High School, Greenville, Alabama, in 1971, and was called to Naval (Military Intelligence) service during the Vietnam Conflict. During and subsequent to military service, he earned the following degrees, awards, or certifications: (1) AS Degree, University of Maryland (1978); (2) BSEET Degree, Georgia Southern University (1983); (3) MED Degree, Auburn University Montgomery (1991); (4) thirty postgraduate hours in CIS from Auburn University Montgomery and Troy University (1993); (5) National Teaching Excellence Award (1989); (6) nominee for Chancellor's Faculty of the Year award (2000); and (7) Cisco Certified Networking Associate (CCNA) (2001, 2008). He has been in the Community College system for 23 years and has served in the positions of Division Director, Department Chair, Program Coordinator, or Instructor for Computer Information Systems for 21 of the 23 years. Two years in the system were spent as a Software Engineer. In private business, he served as a Project Manager/Engineer for 2 years; in the Navy, he served in the covert/overt operations of Military Intelligence for 7 years, 6 months, and 18 days in the Far East, Europe, and the United States. His marriage of 29 wonderful years to Rita Inez Morosky Scott has happily resulted in a beautiful daughter, Tera Rebekah Scott.
v DISSERTATION ABSTRACT STRATEGIC FACTORS OF INSTITUTIONAL PRACTICE IMPACTING STUDENT SUCCESS IN THE COMMUNITY COLLEGE AS PERCEIVED BY STUDENTS AND FACULTY: ACADEMIC PREPARATION, WORK ETHICS AND INSTITUTIONAL SUPPORT Kenneth Edward Scott Doctor of Education, Auburn University, May 10, 2008 (M.Ed., Educational Leadership, Auburn University Montgomery, 1991) (B.S.E.E.T., Electrical Engineering, Georgia Southern University, 1983) (A.A., Business Administration, University of Maryland, 1978) 387 typed pages Directed by Maria Martinez Witte and James V. Wright Student success has been widely researched; however, community college student success as an outcome of institutional practice has not (Bailey, 2006a; Jenkins, 2006). Moreover, college student success is influenced by a wide range of factors. The factors for this study were derived from three significant studies: 1) the meta-analysis by Robbins et al. (2004), in which nine broad constructs of college or student success were identified; 2) the in-depth literature analysis by Kuh et al. (2006), identifying 14 indicators of student success; and 3) the study by Smith (2005), which suggested 51 student success competencies for online faculty. vi The three studies noted (Kuh et al., 2006; Robbins et al., 2004; Smith, 2005) were used to design the Strategic-Impact-Triad (SIT) Model, which compiled the list of variables into three categorical factors: 1) academic preparation, 2) work ethics, and 3) institutional support. The Strategic-Impact-Triad Model was assessed within the framework of institutional or management practice. To measure the impact the SIT Model factors had on student success, survey data were collected from community college faculty and students. The data were used to assess how students and faculty perceived academic preparation, work ethics, and institutional support as inseparable factors specifically influencing student success within the framework of institutional practice. The findings of the study suggested that there is a statistically significant difference between student and faculty perceptions of work ethics and institutional support as these variables impact student success in the community college within the domain of institutional practice; conversely, perceptions of academic preparation did not differ significantly between students and faculty. The research also suggested that in order to improve the Teaching-Learning-Assessment Domain to maximize student success, the relationships of these SIT factors must be better understood as a Strategic-Impact-Triad, not solely as individual, stand-alone components within the practices of an educational institution. vii ACKNOWLEDGMENTS No man, or person, is an island. This has never been more apropos than when a doctoral candidate undertakes the challenge of completing a dissertation concurrent with family obligations. Granted, it may involve fulfilling a requirement in a doctoral program; however, it always involves support from others, as we are all beings who need others; we are not islands. I would like to thank my dearest wife, Rita, who encouraged me to undertake this trek across the universe of learning. To my most precious daughter, Tera, "dad says" thanks for making me laugh just for the "fun" of it. I give my love always to each of you. To Dr. James V. Wright, thank you for encouragement to focus on the matter at hand and see the light at the end of the tunnel (it wasn't a train, after all!). For Dr. Maria M.
Witte, a consummate educator who always had time to answer questions and provide direction, you have my deepest gratitude. For Dr. Margaret E. Ross, a statistician of extraordinary proportions who brought about statistical significance to what was truly statistically significant, thank you for opening the door to "meaningful measurements." And, for Dr. David C. DiRamio, may you forever remain the catalyst for change in the lives of the students you encounter. To each of these distinguished Auburn University professors, "my hat is off to you!" viii To Jim Manring, Ph.D., P.E.: your example in the classroom has not been lost in the transition from student to teacher in my own life. It is because of what you have demonstrated in my own learning that I have embarked upon this journey of teaching-learning. It is my solemn prayer that your retirement has been a time of greatest self-reflection and personal happiness. You may never know how much you challenged and changed the lives of your students, but I know the Records of Heaven surely do! To my mom and dad, how can a simple "thanks" ever say what needs to be said? My dad was never able to realize his own education, yet he is one of the most intelligent people I know, and this dissertation is dedicated especially to him and mom. They have earned a Heavenly Reward for all of their years of sacrifice, love, and patience. God bless you both; you are two very special people! To Charlene Cannady and Jichul Kim, doctoral candidate "colleagues," thanks for feedback on several fronts. Charlene and Jichul, you've made a big difference, even though you may have felt as if your contributions were "statistically insignificant." There were times, Lord, when "throwing in the towel" seemed preferable to 4 A.M. sleepless mornings, excessively late nights analyzing data, or balancing school and two full-time jobs, and You know I stopped writing for a time! You reminded me to consider the ways of the Ant and be wise even without "guide, overseer or ruler," and I started writing again for a reason! (Proverbs, Chapter 6) P.S. To Barbara Anne Spears: to say thanks for your help with access to data sources will never express my gratitude. You are an excellent administrator and your kindness will never be forgotten! ix Style manual or journal used: Publication Manual of the American Psychological Association, 5th Edition. Computer software used: 1) SPSS 12.0.1 through 15.0, Windows Pro/Vista; 2) Microsoft Office Suite 2002/2003/2007; and 3) SurveyMonkey (http://www.surveymonkey.com). x TABLE OF CONTENTS Page LIST OF TABLES .......................................................................................................... xiii LIST OF FIGURES ......................................................................................................... xvi CHAPTER I.
INTRODUCTION .......................................................................................1 Introduction ..................................................................................................1 Background ..................................................................................................5 Perceptions in the Community College ..................................................24 College Student Success and Institutional Practice: Competing Agendas ................................................................................25 Academic Preparation ............................................................................26 Work Ethics or Soft Skills ......................................................................27 Institutional Support................................................................................28 Statement of the Problem ...........................................................................30 Purpose of the Study ..................................................................................31 Research Questions ....................................................................................34 Significance of the Study ...........................................................................35 Limitations of the Study.............................................................................36 Assumptions of the Study ..........................................................................37 Definitions of Key Terms ..........................................................................38 Organization of the Study ..........................................................................43 Chapter Summary ......................................................................................44 II. LITERATURE REVIEW ...........................................................................46 Introduction ................................................................................................46 Historical Perspective and Role of the Community College .....................47 Demographics of the Community College ................................................53 Issues of the Community College System of Education ............................62 Challenges in the Community College ......................................................63 Choosing Among Competing Agendas ..................................................63 Meeting the Needs of a Changing Society ..............................................66 More Students and Less Money ..............................................................69 Opportunities in the Community College ..................................................74 From Open Door Policy to Significance ....................................................76 The Framework of Institutional Practice and Student Success ..................78 Factor 1: Academic Preparation.................................................................92 xi Factor 2: Work Ethics of Students and Faculty ......................................126 Factor 3: Institutional Support .................................................................165 Chapter Summary ....................................................................................181 III. 
METHODS ..............................................................................................183 Introduction ..............................................................................................183 Design of the Study ..................................................................................186 Theoretical Framework .........................................................................186 Research Questions .................................................................................190 Population and Sample ...........................................................................191 Instrumentation ........................................................................................199 Item and Domain Development ............................................................200 Description of Study .............................................................................202 Academic Preparation ...........................................................................203 Work Ethics ..........................................................................................205 Institutional Support..............................................................................208 Qualitative Open-Ended Questions.......................................................210 Reliability and Validity ............................................................................211 Panel of Experts ....................................................................................212 Cronbach's Alpha, Principal Component Analysis, and ANOVAs .....213 Data Collection and Procedures ...............................................................229 Confidentiality and Anonymity ...............................................................233 Chapter Summary ...................................................................................234 IV.
RESULTS ................................................................................................236 Introduction ..............................................................................................236 Characteristics of the Sample...................................................................237 Student Participants .............................................................................239 Faculty Participants ...............................................................................243 Quantitative Analysis and Findings .........................................................246 Student and Faculty Perceptions ...........................................................246 Research Questions ..................................................................................251 Research Question 1 .............................................................................251 Research Question 2 .............................................................................254 Research Question 3 .............................................................................256 Research Question 4 .............................................................................258 Strategic-Impact-Triad (SIT) Model Coefficient Equation ..............258 Research Question 4 Findings ..........................................................261 Qualitative Analysis and Findings ...........................................................266 Academic Preparation Themes .............................................................267 Work Ethics Themes .............................................................................268 Institutional Support Themes ...............................................................268 Institutional Practices Themes .............................................................269 Chapter Summary ....................................................................................270 xii V. 
SUMMARY, CONCLUSIONS, RECOMMENDATIONS AND IMPLICATIONS ............................................................................272 Introduction ..............................................................................................272 Summary of the Study .............................................................................275 Conclusions ..............................................................................................277 Recommendations ....................................................................................286 Implications..............................................................................................291 Chapter Summary ...................................................................................294 REFERENCES ................................................................................................................297 APPENDICES .................................................................................................................338 APPENDIX A: Auburn University, Student Information Sheet .................................339 APPENDIX B: Student and Faculty Perceptions of College Student Success: STUDENT SURVEY v.2 ..................................................................341 APPENDIX C: Auburn University, Faculty Information Sheet ..................................347 APPENDIX D: Student and Faculty Perceptions of College Student Success: FACULTY SURVEY v.2 ..................................................................349 APPENDIX E: Web Portals.........................................................................................355 APPENDIX F: Sample Participation Request Letter to College President .................356 APPENDIX G: Sample Response Letter from College President ................................359 APPENDIX H: Practices of APD, WED, & ISD Correlated to Research ....................360 APPENDIX I: Office of Human Subjects Research, Approval 07-087 EP 0705: Approval: May 14, 2007 .....................................................................363 APPENDIX J: Office of Human Subjects Research, Approval 07-087 EP 0705: Approval: July 20, 2007 ......................................................................364 APPENDIX K: Florida Community College Jacksonville Approval Letter ...............365 APPENDIX L: Jefferson State Community College Approval Letter ........................366 APPENDIX M: Appeal Letter to Participants, 26 November 2007 .............................367 APPENDIX N: Participating College Information ......................................................368 APPENDIX N: Page 3 of 3, Detailed Student Demographic Data Matrix ..................370 APPENDIX O: Detailed Faculty Demographic Data Matrix ......................................371 xiii LIST OF TABLES Table Page 1. Strategic-Impact-Triad Factors as Related to Constructs of Student Success ...........8 2. Strategic-Impact-Triad Factors as Related to Indicators of Student Success ..........10 3. Strategic-Impact-Triad Factors as Related to Faculty Competencies for Student Success ..................................................................................................12 4. Students' Misconceptions about Preparing For and Attending College ..................22 5. Community College Generations, Characteristics, Principles, and Earning Power ..................................................................................................52 6.
Number of Community Colleges by State (2004) and Population Served 2001-2002 ....................................................................................................55 7. Community College Fast Fact Data ........................................................................59 8. Comparative Sample of Demographic Datasets: Alabama Commission on Higher Education (ACHE) and American Association of Community Colleges (AACC) ....................................................................................................61 9. Criticisms of Professional Development Efforts .....................................................64 10. Community Colleges Participating in the Achieving the Dream (2005) Initiative ......................................................................................................80 11. Regional Accreditation and Higher Learning Commissions ..................................89 12. A Nation at Risk: Indicators of the Risk and Current References ..........................97 13. Academic Preparation Impact Projections ............................................................100 14. Alignment between High School Graduation and College Admissions Course Requirements ............................................................................................104 xiv 15. Alabama Commission on Higher Education, High School Report: Enrollment in Alabama Public Colleges and Universities (First-Time Freshmen) ...........................................................................................119 16. Action Steps for Policymakers to Prepare All Students for the Workforce and College ..........................................................................................123 17. Employability Skills for Australian Small and Medium Sized Enterprises ...........129 18. Work Ethics of Different Generations ...................................................................131 19. Work Ethics Taught in the Two-Year Technical Colleges in Georgia ..................133 20. Comparative Summary of Work Ethics .................................................................164 21. How Changes in Society's Values Have Impacted the Work Ethic in America ...164 22. Student Affairs Major Functions within the Domain of Institutional Support ......167 23. Original Colleges Randomly Selected for Participation in the Study....................193 24. Actual Participants in the Pilot Study Phase ..........................................................195 25. Pilot Study Respondent Data .................................................................................197 26. Potential Community and Technical Colleges Surveyed for the Final Dataset .....199 27. Practices of the Academic Preparation Domain (APD) .........................................204 28. Practices of the Work Ethics Domain (WED) .......................................................206 29. Practices of the Institutional Support Domain (ISD) .............................................210 30. Instrumentation to Correlate Quantitative and Qualitative Themes .......................211 31. Principal Component Analysis for Pilot Study Variables (Independent Analysis) ...........................................................................................217 32. Adjusted Principal Component Analysis for Pilot Study Variables (Independent Analysis) ...........................................................................................220 33.
Composite 36-Item Reliability Analysis .................................................................222 34. Principal Component Analysis of Self-Reported Student Abilities (Practices)......224 35. Pilot Data ANOVAs Indicating Validity, Reliability and Significance .................225 xv 36. Pilot Data ANOVAs, Bonferroni-Holm (BH) Adjusted Correction Model ..........227 37. Participating Community and Technical Colleges Surveyed for the Final Dataset ..........................................................................................................238 38. Student Demographics ...........................................................................................242 39. Faculty Demographics ...........................................................................................245 40. Strategic-Impact-Triad (SIT) Perceptions: Students Compared to Faculty ............250 41. Faculty and Student Strategic-Impact-Triad Factor Domain Assessments ............264 xvi LIST OF FIGURES Figure Page 1. The Strategic-Impact-Triad Model of Student Success Within the Framework of Institutional Practice, Perceptions, and the Teaching-Learning-Assessment Domain ................................................................16 2. Community College Global Model of Student Success .........................................45 3. Comparative Demographics of Community Colleges and 4-Year Institutions: 2003-04 .............................................................54 4. College-Readiness/Student Success Impact Model ................................................99 5. Synopsis of State Scores by Number of Relative Score Groupings .....................108 6. Comparison of Core Curriculum Participants.......................................................111 7. Core Curriculum College-Readiness Model .........................................................115 8. The Relationship of Perceptions and the Strategic-Impact-Triad Factors ............121 9. Misaligned Perception Model of Student's Academic Preparation .....................121 10. Aligned Perception Model of Student's Academic Preparation ...........................122 11. Strategic-Impact-Triad Model Algorithm .............................................................189 12. Work Ethics Hierarchical Rating ..........................................................................208 13. Strategic-Impact-Triad (SIT) Model Coefficient Equation ..................................261 14. Survey Questions to Descriptively Assess Research Question 4..........................262 1 CHAPTER I INTRODUCTION "Let us think of education as the means of developing our greatest abilities, because in each of us there is a private hope and dream which, fulfilled, can be translated into benefit for everyone and greater strength for our nation." --- John F. Kennedy Introduction Research has suggested that students want to be successful (Brock et al., 2007; Horn, Nevill & Griffith, 2006). Student success, therefore, has been widely researched; however, community college student success as an outcome of institutional practice has not (Bailey, 2006a; Jenkins, 2006). A search of the Education Resources Information Center (ERIC) database on the keyword college student success returned 6,287 documents and studies dating back to 1929.
College student success is influenced by an extreme range and depth of impact factors; however, in spite of the large number of publications in the student success domain, Braxton (2006) stipulated the following counter-argument: ?college student success stands as a topic that cries out for some form of systematic empirical attention. Without the benefit of such scholarly attention, uninformed, ad hoc views on student success and ways to achieve student success will emerge?we have witnessed a decline in the past two decades in the research of how, and to what extent, the collective attitudes and behaviors of faculty and administrators and the environments of colleges and universities are seen as contributing to student success. (p. 1) 2 There are a significant number of identified variables which influence student success (Bailey et al., 2005a; Callan, Finney, Kirst, Usdan & Venezia, 2006; Hirsch, 2001; Karp, Bailey, Hughes & Fermin, 2005; Kaye, Lord & Bottoms, 2006; Kuh et al., 2006; Long, 2006; Richardson, 2006; U.S. Department of Education, 2006; Weimer, 1994). Therefore, a basic question that must be answered is: What is a formal definition for college student success or student success? How might student success be defined and how is it measured? Also, once college student success has been defined and measured, how is the application of research intended to improve student success, including intervention methods (Hirsch, 2001)? As defined by the National Postsecondary Education Cooperative [NPEC] (2006): What is student success? Is it earning a degree, acquiring new knowledge and skills, getting a job after graduation? Students, families, faculty, legislators, trustees, the press and the public, all have ideas about what constitutes student success, and their ideas aren?t necessarily the same. Understanding student success becomes even more complicated when we consider the diversity of students in postsecondary education. Is success measured the same way for 22 year old full-time college students and for 45 year old part-time students? For students with high test scores and for those who don?t yet write and compute at a college-level? Should success be measured the same way for these students? How can decision-makers be better informed about the many ways in which postsecondary student success may be defined and measured? (p. 1) For this study, community college student success or achievement is defined as any improvement within the life of the student, meaning this: if a student improves his or her reading level, learns to emulate the positive character of a faculty member, fosters a supportive relationship with the institution, is able to discern and apply the process of life-long learning, becomes a valuable member of society, graduates, acquires training, obtains industry certification, improves his or her attitude, or advances his or her life 3 positively as a result of the impact of the educational process?these ?positive attributes? constitute community college student success. Graduation is the ultimate community college student success outcome (Capaldi, Lombardi & Yellen, 2006; VanWagoner, Bowman & Spraggs, 2005); however, graduation is not the only student success outcome for community college students (Dale & Drake, 2005; Horn & Ethington, 2002; Horn, Nevill & Griffin, 2006; Kozeracki, 2002). 
In terms of student success and the volume of prospective college students, high school students indicated that they intend to pursue a college degree at levels proportionate with the millennial generation, e.g., students born between 1982 and 2002. The millennial generation is 33% larger than one of the foremost student populations in U. S. History, e.g., the Baby Boomers (Coomes & DeBard, 2004). Of the high school students espousing college attendance, it was projected that 45% will attend community colleges (American Association of Community Colleges [AACC], 2006a, 2007; Phillippe & Sullivan, 2005). While the number of students declaring their intent to pursue postsecondary education was projected to increase, inconsistencies in the defining trends and realities of college-readiness and student success seem to exist (Conley, 2005). Long (2006) argued that ?student success is a multidimensional issue with varying definitions of the benchmarks? (p. 2). Student success is influenced by many factors: family structure; socioeconomic forces; the P-12 system of academic preparation; student engagement, motivation, viable work ethics or soft skills; characteristics and practices of individual colleges to include pedagogic practices; policy design and implementation; and educational leadership (ACT, 2005a; Bailey et al., 2005a; Bailey, 2006a; Capaldi, Lombardi & Yellen, 2006; Hill & Petty, 1995; McLeish, 2002; Robbins et al., 2004). 4 Bailey (2006a), Byrd and MacDonald (2005) and Long (2006) suggested that institutional, management, or administrative practice has the potential to strategically influence student outcomes. As noted by McClenney and Greene (2005), students who enroll in a community college face many challenges. A major issue addressed in the study was why some students are successful while other students are not. McClenney and Greene (2005, p. 2) suggested the following argument: ?Why, then, do some [community college students] persevere while others leave before they meet their goals? Institutional practice can tip the balance.? Consequently, organizational structures and institutional practices which possess the potential to impact student success should be reviewed on a regular basis to positively ?tip the scale.? Long (2006) argued that ?looking at successful students and the best practices exhibited by institutions is the first step in identifying possible methods for addressing the hazards that limit student success? (p. 7). Consequently, factors which impact college student success are inherent in institutional practice. Moreover, it is crucial that community college leaders look for trends which have resulted from institutional practice?or environmental factors influencing student success. Perceptions and practices establish operational trends within the community college which influences student outcomes. An example of analyzing trends was argued by Conley (2005): An ever-increasing proportion of high school students in the United States today aspire to college. Yet statistics indicate that the percentage of college students receiving bachelor?s degrees has remained relatively constant over the past twenty-five years, that it now takes on average five years to get a four-year college degree, and that somewhere between 30 percent and 60 percent of students now require remedial education upon entry to college, depending on the type of institution they attend. 
Also over the past twenty-five years, SAT and ACT scores have risen only slightly in math and been relatively constant in reading, high school grade point average has gradually risen, and the proportion of students taking 5 college preparatory courses has grown as well. How do we explain the seeming inconsistencies between these trends? The answer can be found in part in the distinction between being college-eligible and college-ready. (p. xi) This study examined factors and issues associated with students who are both college-eligible and college-ready in terms of their respective success. While students are eligible and subsequently enroll, what pre-and-present-college factors influence their success? What institutional practices support student success, and specifically, what factors identified in this study were suggested as the most influential factors which impacted student success? And finally, this study suggested that perceptions are significant variables which influence institutional practice and that institutional practice influences perceptions of both students and faculty. The community college should assess practices and variables which impact college student success, e.g., perceptions, institutional practices, academic preparation, work ethics, and institutional support. Background Student success was defined and measured to establish the global framework for this study (National Postsecondary Educational Cooperative, 2006). To delimit and limit the scope of factors influencing student success in this study was to design a model which narrowed this investigation. Consequently, a Strategic-Impact-Triad (SIT) Model was designed to focus the emphasis of the study on the student success domain, with full acknowledgement of the vast array of student success variables identified in the context of previous and on-going research. 6 The SIT Model was derived from and correlated to the work of three major studies. First, Robbins et al. (2004) conducted a meta-analysis of 109 studies for the purpose of extracting constructs which indicated or suggested college or student success (nine constructs defined). Secondly, Kuh et al. (2006) conducted a thorough review of student success literature and identified 14 indicators of student success; and, Smith (2005) studied the characteristics of on-line instruction to derive a list of 51 competencies suggested as effective student success instructional methods; these methods, although specific to on-line instruction, are explicitly applicable to instructional methods in the classroom and directly related to promoting student achievement. The SIT Model logically identified: 1) the existence and practice of perceptions within an educational institution; 2) the relationship between factors influencing student success and the Teaching-Learning-Assessment Domain (TLAD); 3) the codependence of academic preparation, work ethics, and institutional support; 4) and the global organizational framework of institutional practice within which faculty and students function collectively and separately to promote college student success (Bailey et al., 2005a; Bailey, 2006a; Bailey & Alfonso, 2005; Braxton, 2006; Kuh et al., 2006; NPEC, 2006; Robbins et al., 2004; Tinto & Pusser, 2006). 
The components of the Strategic-Impact-Triad Model are inseparable; each part has a unique role to play in the success of students attending college or preparing for the workforce; nevertheless, each component is related to, a correlate of, and interdependent with every other component within the Strategic-Impact-Triad Model. The factors of academic preparation, work ethics, and institutional support form the basis for providing community colleges with a model to promote focused resources and practices toward 7 student success as an outcome of recompiling numerous variables into the Strategic-Impact-Triad Model factors. First, Robbins et al. (2004) identified nine broad constructs related to college student success: 1) contextual influences, 2) academic-related skills, 3) general self-concept, 4) academic self-efficacy, 5) social involvement, 6) perceived social support, 7) institutional commitment, 8) academic goals, and 9) achievement motivation. The nine constructs were recompiled and correlated into three interdependent, multi-faceted factors within the Strategic-Impact-Triad (SIT) Model. To redirect the nine constructs into the SIT Model factors, a correlation to categorize the constructs into the SIT factors was accomplished as follows (Robbins et al., 2004): 1) academic preparation correlated to constructs #2, #4, #6, #8, and #9; 2) work ethics correlated to constructs #2, #3, #5, #6, #8, and #9; and 3) institutional support correlated to constructs #1, #3, #5, #6, and #7. The correlation between Robbins et al. (2004) and the SIT factors was detailed in Table 1, whereas the corresponding correlations for Kuh et al. (2006) and Smith (2005) were indicated in Table 2 and Table 3, respectively. The study by Robbins et al. (2004) identified nine broad constructs related to student success. However, Robbins et al. (2004) also suggested the following: "conceptual confusion occurs when defining college success and its determinants" (p. 261). Therefore, the purpose of the Strategic-Impact-Triad Model was to refocus community college practices to reduce "conceptual confusion" and enhance institutional practices to improve community college student success. 8 Table 1 Strategic-Impact-Triad Factors as Related to Constructs of Student Success Strategic-Impact-Triad Factors Name of Construct (1-9) Robbins et al. (2004) Construct Defined Institutional Support Contextual influence (1) The favorability of the environment; the extent that supporting resources are available to students, including (1) availability of financial supports, (2) institution size, and (3) institution selectivity. Academic Preparation, Work Ethics Academic-related skills (2) Cognitive, behavioral, and affective tools and abilities necessary to successfully complete tasks, achieve goals, and manage academic demands. Institutional Support, Work Ethics General self-concept (3) One's general beliefs and perceptions about him/herself that influence his/her actions and environmental responses. Academic Preparation Academic self-efficacy (4) Self-evaluation of one's ability and/or chances for success in the academic environment. Work Ethics, Institutional Support Social involvement (5) The extent that students feel connected to the college environment; the quality of students' relationships with peers, faculty, and others in college; the extent that students are involved in campus activities. Academic Preparation, Work Ethics, Institutional Support Perceived social support (6) Students'
perception of the availability of the social networks that support them in college. Institutional Support Institutional commitment (7) Students? confidence of and satisfaction with their institutional choice; the extent that students feel committed to the college they are currently enrolled in; their overall attachment to college. Academic Preparation, Work Ethics Academic goals (8) One?s persistence with and commitment to action, including general and specific goal-directed behavior, in particular, commitment to attaining the college degree; one?s appreciation of the value of college education. Academic Preparation, Work Ethics Achievement motivation (9) One?s motivation to achieve success; enjoyment of surmounting obstacles and completing tasks undertaken; the drive to strive for success and excellence. Detailed Summary Relationship: Academic Preparation Correlated to Constructs: 2, 4, 6, 8, 9 Work Ethics Correlated to Constructs: 2, 3, 5, 6, 8, 9 Institutional Support Correlated to Constructs: 1, 3, 5, 6, 7 Note. From ?Do Psychological and Study Skill Factors Predict College Outcomes? A Meta-Analysis,? by S.B. Robbins et al., 2004, Psychological Bulletin, 130(2), p. 267. 9 Secondly, Kuh et al. (2006) conducted an in-depth study of the student success literature utilizing the following methodology: We conducted this extensive review of the literature related to student success, broadly defined, to develop an informed perspective on policies, programs, and practices that contribute to desired outcomes of postsecondary education. The research team developed a search strategy for identifying relevant literature and created a list of key search terms, authors, and related topics to focus the literature search. More than 70 search words, 40 authors, and 30 organizations were identified as salient. In addition to searching for these terms via online library databases, we also devised a plan to explore reports found on pertinent foundations and organization websites. Colleagues across the country were consulted to uncover additional research on student success that was less accessible through conventional means. (p. 149) Specifically, the study investigated more than 700 compiled relevant documents in the following categories and amounts: 1) 200 pre-college characteristics; 2) 300 postsecondary educational experiences; 3) 290 institutional conditions; and 4) 130 post- college outcomes. As with Robbins et al. (2004), Kuh?s et al. (2006) indicators were correlated to the Strategic-Impact-Triad Model factors and are shown as student success indicators in Table 2. The indicators were: 1) student goal attainment; 2) course retention and success; 3) success in subsequent coursework; 4) Fall-to-Fall persistence; 5) time to degree; 6) degree or certificate completion; 7) graduate school enrollment and employment; 8) transfer rate and success; 9) employer assessment of students; 10) academic value added; 11) student satisfaction; 12) student professional growth and development; 13) student involvement; and, 14) citizenship and engagement. A direct and summary correlation of student success indicators to SIT Model factors is shown in Table 2. 10 Table 2 Strategic-Impact-Triad Factors as Related to Indicators of Student Success Strategic-Impact-Triad Factors Name of Indicator (1-14) Kuh et al. 
(2006) Indicators Defined Academic Preparation, Institutional Support Student Goal Attainment (1) To what extent are students attaining their final educational goal as indicated on their application and advising record? Academic Preparation, Institutional Support Course Retention and Success (2) At what rate do students complete the individual courses in which they enroll? At what rates are D, F, and W grades awarded in particular courses? Institutional Support, Work Ethics Success in Subsequent Coursework (3) How successful are students in courses that are linearly sequential especially in math, science, and English? Academic Preparation Fall-to-Fall Persistence (4) At what rate do students continue their education one complete academic year to the next, in accordance with their educational goal? Work Ethics, Institutional Support Time to Degree (5) How many semesters elapsed prior to degree attainment? What percentage of full-time students attempt and complete the average credit hour load per term? Academic Preparation, Work Ethics, Institutional Support Degree or Certificate Completion (6) What number and percentage of students complete their chosen degree or certificate program? Institutional Support Graduate School Enrollment and Employment (7) At what level are students enrolling in graduate and professional school and attaining employment and advancement relevant to their degree or certificate program? Academic Preparation, Institutional Support Transfer Rate and Success (8) At 2-year institutions, what percentage of students completes their educational goal of transferring to a 4-year institution? How does the success of transfer students compare to students that started at the institution? Academic Preparation, Work Ethics, Institutional Support Employer Assessment of Students (9) How satisfied are employers with students? knowledge, qualities, and skills? Academic Preparation, Work Ethics Academic Value Added (10) What knowledge and skills have students acquired during their undergraduate experience? Institutional Support Student Satisfaction (11) How satisfied are students with access, instructional and student services, facilities, and campus life? Work Ethics Student Professional Growth and Development (12) What are the self-perceived personal growth, community involvement, and moral development of students completing their education at the institution? Academic Preparation, Work Ethics, Institutional Support Student Involvement (13) To what extent are students participating in educationally purposeful activities? Work Ethics Citizenship and Engagement (14) To what extent are students acquiring habits of the mind and heart in college that will benefit them and society in the future? Detailed Summary Relationship: Academic Preparation Correlated to Indicators: 1, 2, 4, 6, 8, 9, 10, 13 Work Ethics Correlated to Indicators: 3, 5, 6, 9, 10, 12, 13, 14 Institutional Support Correlated to Indicators: 1, 2, 3, 5, 6, 7, 8, 9, 11, 13 Note. From ?What Matters to Student Success: A Review of the Literature?, by G.D. Kuh et al., 2006, National Postsecondary Education Cooperative [NPEC], p. 151. 11 Thirdly, Smith?s (2005) research frames the study with the following benchmarks related to quality instruction: 1) institutional support, 2) course development, 3) teaching/learning, 4) course structure, 5) student support, 6) faculty support, and 7) evaluation and assessment. As a result of the benchmarks, 51 competencies for effective instruction were suggested. 
As noted by Smith (2005), ?learner-centered programs and competent instructors are two oft-cited keys to [student] success in higher education? (p. 1). And, although the focus of Smith?s (2005) study is related to online instruction, the competencies apply directly to the SIT Model factors of academic preparation, work ethics, and institutional support. Items specifically related to online instruction as compared to classroom techniques will be identified in this study (see Table 3). Fifteen examples of the 51 instructional competencies are indicated as follows: 1) [understand and] avoid overloading new students at the start of a course; 2) communicate high expectations; 3) evaluate ourselves; 4) evaluate students; 5) foster learning centeredness; 6) promote collaborative participation; 7) use humor; 8) [assess and] respect institutional performance guidelines; 9) model good participation; 10) help integrate students into the institution and its culture; 11) manage student expectations; 12) give prompt feedback; 13) use the web as a resource; 14) emphasize time on task; and, 15) develop relationships. The entire set of competencies and their correlation to the SIT Model factors are shown in Table 3. To reiterate, it is argued in this study that although the 51 competencies are related to online instruction, the 51 competencies are directly applicable to the SIT Model factors; the three SIT Model factors were also correlated to the 51 competencies to stipulate how the SIT factors recompiled the 51 competencies as influential variables promoting college student success within institutional practice. 12 Table 3 Strategic-Impact-Triad Factors as Related to Faculty Competencies for Student Success Faculty Competencies for Student Success: 1 - 30 Faculty Competencies for Student Success: 31 - 51 1. Act like a learning facilitator rather than a professor; 2. Avoid overloading new students at the start of a course; 3. Be clear about course requirements; 31. Make the transition to the online learning environment; 32. Manage student expectations; 4. Be willing to contact students who are not participating; 5. Become a lifelong learner; 6. Communicate high expectations; 33. Mandate participation. Step in and set limits if participation wanes or if the conversation is headed in the wrong direction; 7. Communicate technical information in plain English; 8. Create a warm and inviting atmosphere that promotes the development of a sense of community among participants; 34. Model good participation; 35. Network with others involved in online education; 9. Create an effective [online] syllabus?one that lays out the terms of the class interaction?the expected responsibilities and duties, the grading criteria, the musts and don?t s of behavior, and explains the geography of the course; 36. Prepare students for [online] learning; 37. Promote collaborative learning; 10. Deal effectively with disruptive students; 11. Define participation and grading criteria; 12. Develop reciprocity and cooperation among students; 13. Develop relationships; 14. Effectively and efficiently (administer) the course; 38. Promote reflection; 39. Provide structure for students but allow for flexibility and negotiation; 15. Effectively use whatever technology has been selected to support [online] learning; 16. Emphasize time on task; 40. Remember that there are people attached to the work on the screen; 41. Respect diverse talents and ways of learning; 17. Encourage contacts between students and faculty; 18. 
Encourage students to bring real-life examples into the online classroom; 19. Evaluate ourselves; 42. Respect institutional performance guidelines; 43. Respect privacy issues; 44. Set up a well-organized course site; 20. Evaluate students; 21. Foster learner centeredness; 22. Get students to respect assignment due dates and agreed-upon working times; 45. Teach students about online learning; 46. Translate content for online delivery; 23. Give prompt feedback; 24. Harness the technology; 25. Help integrate students into the institution and its culture; 47. Use active learning techniques; 48. Use best practices to promote participation; 26. Help students develop critical thinking skills; 27. Help students identify and use appropriate learning techniques; 49. Use humor; 50. Use the web as a resource; 28. Help students identify strengths and areas of needed improvement; 29. Keep informed of the latest trends and issues; continually improve your skills and knowledge; 30. Maintain the momentum of the course; 51. Most of all have fun and open yourself to learning as much from your students as they will learn from one another and from you! Detailed Summary Relationship: Academic Preparation Correlated to: 1-3, 5, 6, 9, 11, 16, 18, 20-24, 26-28, 31, 35-37, 39, 45-49, and 51 Work Ethics Correlated to: 2, 4, 6, 8, 10-14, 17-20, 22, 23, 26, 29, 30, 32-34, 37, 38, 40, 41, 43, 48, 49, and 51 Institutional Support Correlated to: 2, 5, 8, 15, 19-21, 23-25, 29, 30, 33, 39, 40, 42, 44, 45, 49, and 50 Note. From "Fifty-one competencies for online instruction," T. Smith, 2005, The Journal of Educators Online, 2(2), pp. 15-18. 13 As suggested in Tables 1-3, many named factors impact college student success. In each table, the Detailed Summary Relationship row correlates the Strategic-Impact-Triad Model factors to the constructs, indicators, or competencies within the studies by Robbins et al. (2004), Kuh et al. (2006), and Smith (2005), respectively. The community college must be cognizant of the relationship between the SIT Model factors and the variables identified in these studies if improvement in student success is to be accomplished. Moreover, the SIT Model factors are inseparable from institutional practice. For example, institutional support cannot be segmented from student success if organizational practices fail to provide a favorable environment for the success of the student population or the support of instructional efforts (Bailey, 2006a; Jenkins, 2006; Richardson, 2006). Of particular concern in Tables 1-3 are the implied variables which are very often ignored in research related to student success, e.g., perceptions of faculty and students (Achieve, Inc., 2005; Brancato, 2003; Gillum & Davies, 2003; Jenkins, 2005; Levine & Cureton, 1998; McGuire & Williams, 2002; Reason, Terenzini & Domingo, 2005). Revealing the perceptions of students and faculty in this study was crucial to identifying the underlying reasons for the attitudes and actions expressed in educational practice. Moreover, this study is heavily dependent on perceptions of the factors which influence student success in the community college; specifically, how do students and faculty perceive the SIT Model factors as influencing, promoting, and supporting student achievement?
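To make the recompilation concrete, the Detailed Summary Relationship rows in Tables 1-3 can be read as a simple mapping from each SIT factor to the numbered items it absorbs from the three source studies. The sketch below illustrates that reading for the Robbins et al. (2004) constructs in Table 1; the data structure and helper function are hypothetical conveniences for interpreting the tables, not instruments used in the study.

```python
# Illustrative sketch only: the Detailed Summary Relationship row of Table 1
# expressed as a mapping from SIT factors to the Robbins et al. (2004)
# construct numbers they recompile. A hypothetical reading aid, not study code.

SIT_TO_ROBBINS_CONSTRUCTS = {
    "academic_preparation": {2, 4, 6, 8, 9},
    "work_ethics": {2, 3, 5, 6, 8, 9},
    "institutional_support": {1, 3, 5, 6, 7},
}

def factors_for_construct(construct_number: int) -> list:
    """Return every SIT factor that recompiles the given construct number."""
    return [factor
            for factor, constructs in SIT_TO_ROBBINS_CONSTRUCTS.items()
            if construct_number in constructs]

if __name__ == "__main__":
    # Construct 6 (perceived social support) maps to all three factors,
    # which reflects the study's claim that the factors are interdependent.
    print(factors_for_construct(6))
```

Reading the rows this way makes the overlap among the factors explicit: several constructs feed more than one factor, which is consistent with the argument that the SIT factors are inseparable rather than stand-alone components.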
Students who approach academic preparation as impartial observers in the achievement process are likely to discover that faculty members rely heavily on academic preparation to form the basis of promoting educational maturity in subject matter and to 14 move students toward success in terms of graduation rates or achieving stated educational goals. Within the SIT Model, student success is a complex set of dynamic issues stemming from years of development. The years of development: 1) begin at an individual's birth, 2) are developed in grades K-12, 3) are influenced by socioeconomic forces, and 4) are called into action the day the college-eligible student is accepted into the community college (Conley, 2005; Horn, Nevill & Griffith, 2006). Once enrolled, however, what factors influence the success of the student? Robbins et al. (2004) suggested the nine constructs previously noted; Kuh et al. (2006) listed the 14 indicators of student success; Smith (2005) identified 51 instructional competencies; the American College Testing Service argued that a lack of college-readiness is a major detractor of student success in college (ACT, 2006a); policies have been identified as detrimental to student success due to misaligned outcomes of application and practice (Achieve, Inc., 2005); Hirsch (2001) suggested that student outcomes are not always attributable to the outcome itself, but "what the student perceives as the cause for the outcome that will strongly affect motivation" to achieve (p. 73); basic skills deficiencies are strong indicators that student success is heavily dependent upon institutional policies such as intervention methodologies, including remediation or basic developmental skills (ACT, 2005a); and, as previously noted, Robbins et al. (2004) argued that "conceptual confusion occurs when defining college success and its determinants" (p. 261). While academic preparation, work ethics, and institutional support are the three factors to be evaluated in this study, it is vital to understand that student success is also inextricably aligned with the domains of teaching, learning, assessment, and institutional 15 practice. For this study, the Strategic-Impact-Triad Model factors are considered within the global framework of the Teaching-Learning-Assessment Domain, insofar as to ensure that the nature of this study is understood in the proper context. The goal of the Teaching-Learning-Assessment Domain is to foster, garner, and maximize college student success (Braxton, 2006; Long, 2006; Spelling, 2003; Weimer, 1994). Figure 1 provides a graphical representation of the Strategic-Impact-Triad Model, which includes academic preparation, work ethics, institutional support, the Teaching-Learning-Assessment Domain (TLAD), the influence of perceptions, and institutional practice as the foundation upon which to assess the institutional achievement of community college student success. The nine college success constructs suggested by Robbins et al. (2004), the 14 indicators of student success noted by Kuh et al. (2006), and the 51 faculty competencies identified by Smith (2005) are all inherent variables in the Strategic-Impact-Triad Model, including the Teaching-Learning-Assessment Domain, perceptions, and institutional practice.
The SIT Model, therefore, is suggested as a prima facie student success model to inform community college leaders that such a functional model is a prerequisite to identifying institutional practices which may hinder or support college student success, specifically in terms of academic preparation, work ethics, and institutional support.

Figure 1. The Strategic-Impact-Triad Model of Student Success within the Framework of Institutional Practice, Perceptions, and the Teaching-Learning-Assessment Domain. [The figure depicts the three SIT factors (academic preparation, work ethics, and institutional support) converging on student success within the Teaching-Learning-Assessment Domain, bounded by perceptions and institutional practice. Embedded variables: Robbins et al. (2004), nine constructs of college success; Kuh et al. (2006), 14 indicators of student success; Smith (2005), 51 faculty competencies; the perceptions of faculty and students; and the institutional practices which influence the SIT Model factors.]

As a theoretical construct, student success was defined by Walter W. Powell (1989), University of Arizona, as "the whole is not greater than the sum of its parts, but some of the parts are pretty darn good!" (p. 490). If student success is considered in terms of grouping the Strategic-Impact-Triad Model (SIT) factors in Figure 1, the grouped-factor relationship is co-dependent (Powell, 1989). In this study, institutional support is one of three "parts" required to achieve community college student success (Amey & Long, 1998; Weimer, 1994). If a student with excellent academic preparation and a solid work ethic enrolls in a community college that practices unacceptable methods of institutional support, the student is more likely to consider the institution unsupportive of his or her educational goals. Conversely, if a student is cordially and openly welcomed and supported by the institution, but the student has poor academic preparation and even poorer work ethics, the community college is tasked with improving the academic preparation and work ethics of the student. As suggested by Richardson (2006), "there is widespread agreement that improving the academic preparation of students for college needs to be a priority" (p. 3). As previously noted, student success is not only about graduation; success is a measure of improving the lives of students in both tangible and intangible ways as outcomes of the college experience. As suggested by Powell (1989), some of the factors are pretty darn good; nevertheless, this study suggested that for students to achieve their best success, the whole is strategically dependent upon its parts, both those that are "pretty darn good" (Powell, 1989, p. 490) and those that must be improved. Moreover, institutional practice promotes student success through the design of positive outcomes associated with academic preparation, work ethics, and institutional support. Institutional practices which impede the success of college students must be identified and positively altered, particularly the factors within the Strategic-Impact-Triad Model (Long, 2006). Students entering the doors of the community college may not be attending to attain a degree or transfer to a four-year college (Horn, Nevill & Griffith, 2006). If a student defines his or her goal as a specific employment certification, the community college is responsible for promoting institutional practices that enable students to meet their stated objectives.
College student success depends upon identifying individual student goals and providing support for individual success because employment opportunities for college students in the post-industrial age are dependent upon successful educational outcomes (American College Testing Service [ACT], 2005a, 2005b; Boswell, 2004; 18 Boswell & Wilson, 2004; College Board, 2004; Forster, 2006; National Center for Education Statistics (NCES), 1995, 1997, 2000, 2005; Swanson, 2004). Student success is contingent upon college-readiness and college-readiness has been defined as a national education priority (Byrd & MacDonald, 2005). To invoke linear, logical thinking is to suggest that because student success is dependent on the SIT Model factors of influence, and because student success is synonymous with college- readiness, college success is also a national education priority (Kirst & Venezia, 2006; Phillips & Skelly, 2006; U.S. Department of Education, 2000). Therefore, logic dictates that because college student success is largely dependent upon effective institutional practices, the need to investigate factors related to successful institutional practices which promote student success is also a national education priority (Byrd & MacDonald, 2005; Long, 2006; Richardson, 2006)?and community colleges are no exception in uncovering the practices which promote and/or hinder college student achievement. As suggested by the ACT (2006a), basic skills needed for success in the workplace and in college are converging. It is important that the co-relationship between academic preparation, work ethics, and institutional support be understood as the basis of student success and workforce preparation. To further bind the construct of college student success to the fundamental premise of why students need to be ready for and succeed in college or the workforce is to consider the following (ACT, 2006a, p. 2): In the business world, there is little doubt that the skills needed for success in work and in college are now converging. We are preparing a nation of citizens, and they all need to have the opportunity to be educated to a standard that prepares them to succeed in college and in the workplace. A central goal of American public high schools must be to prepare all young people to standards of readiness for both postsecondary education and workforce training?standards that our research shows are comparable. 19 The American College Testing Service (ACT) (2005a) conducted a study of college entrance examinations and concluded that the percentage of ACT-tested high school graduates who were able to meet or surpass all three college-readiness benchmarks was of considerable concern ? a mere 22% of the 1.2 million students tested in 2004. Benchmarks referenced in the study were college-level courses in English, Mathematics, and Science. ACT officials defined student success as earning at least a C in a for-credit course without a prerequisite for remediation. The reference to college- level courses included both two-and-four-year institutions. Although the study suggested a significant potential increase in college-readiness deficiencies as the number of college- bound students also increase, policies which address deficiencies in student preparation for college-level work have the significant potential to offset this negative trend, which included improved college student success (Dougherty & Hong, 2005; Dougherty, Reid, & Nienhusser, 2006; Hughes & Karp, 2006). 
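As a point of scale, and only as an illustrative restatement of the ACT (2005a) figures rather than an additional finding, the reported proportion implies that roughly

\[ 0.22 \times 1{,}200{,}000 \approx 264{,}000 \]

of the 2004 ACT-tested graduates met all three college-readiness benchmarks, leaving on the order of 936,000 who did not.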
Lovett and Mundhenk (2004) suggested that a college degree has replaced the high-school diploma as the gateway to the American middle class and workforce readiness. The relationship between postsecondary education, employment, global competition, and student success, is that in the competitively global society in which individuals co-exist, it is imperative that the students who will meet the needs of the future workforce are empowered with knowledge, skills and the ability to perpetually learn throughout their lives (Krueger, 2006). The report by Krueger (2006) also indicated that the U.S. Bureau of Labor Statistics estimated that the ?fastest growing and highest paying occupations between now and 2014 require some form of postsecondary education? (p. 1). Student success in the context of earning power is also noted by 20 Gillum and Davies (2003): ?As a general rule, they would make more money?they would also stand a better chance of getting employment?it would increase their chance of getting a job over someone that didn?t have the education? (p. 249). Additionally, Conklin and Smith (2004) suggested that the future economic survival of the nation is critically important for and directly related to student success: Never before in U.S. history has the quality of human resources?the skills and education of its people?been so important to the economic prospects of states and their residents. Within the next 20 years, the nation will lack 14 million people with postsecondary education unless states realize significant improvements in high school and postsecondary performance. High school and postsecondary completion rates and college-readiness [student success] need to improve, particularly among disadvantaged populations. (p. 1) Although student success in college may be viewed as a broad set of paradigms of preparation to enter college, successful outcomes, and contributions to society, student success is influenced by perceptions, attitudes, and reality (Gillum & Davies, 2003; Reason, Terenzini & Domingo, 2005). The reality of student success must be extracted from the interrelationships of academic preparation, work ethics, and institutional support within the framework of institutional or management practice (Bailey, 2006a; Jenkins, 2006). Ultimately, student success is the end-all, be-all framework of an individual?s educational goals?so that individuals complete their respective postsecondary education or training in order to acquire gainful employment, and once employed, to remain competitive in the market as an asset for employers, nationally and internationally (Baum & Payea, 2005; Lord, 2002a). Furthermore, attitudes and perceptions are more intrinsic but just as powerful. When attitudes and perceptions do not mesh with established policies and practices, 21 problems arise. As a result, it is imperative that institutional research is undertaken to correlate perceptions to the Strategic-Impact-Triad Model factors as a means to assess how perceptions impact institutional decisions that initiate and drive student success, e.g., institutional policies and practices within the P-16 infrastructure (Dougherty & Hong, 2005; Education Commission of the States, 2006; Knight, Moore & Coperthwaite, 1997). If the perceptions of students and faculty are not properly aligned and supported by relevant institutional policies and practices, the framework for establishing programs of college student success may be misaligned, ineffective or detrimental to positive student outcomes. 
For example, there is a widening gap between educators' expectations of their students and students' own expectations for success (Achieve, Inc., 2005; Brancato, 2003; Jenkins, 2005; Levine & Cureton, 1998; McGuire & Williams, 2002). If the community college disregards these perceptions, negative institutional practices may be allowed to perpetuate in the form of detractors from student success. Additionally, how might the perceptions of students and faculty provide input into the community college as a means to improve the relationship between students, faculty, and the community college, and as a methodology to improve the reliability and validity of student success? To partially answer this question, Table 4 presents perceptions expressed by students concerning their academic preparation, enrollment, and success in college. If students do not perceive the overall college process correctly, it is critical that community college leaders and faculty recognize these misunderstood perceptions so that student-faculty relationships may be improved, as well as implementing intervention procedures and practices (Hirsch, 2001). For this study, student and faculty perceptions were used to assess the factors within the Strategic-Impact-Triad Model.

Table 4

Students' Misconceptions about Preparing For and Attending College

Many students believe that: I can't afford college.
In reality: Students and parents regularly overestimate the cost of college.

Many students believe that: I have to be a stellar athlete or student to get financial aid.
In reality: Most students receive some form of financial aid.

Many students believe that: Meeting high school graduation requirements will prepare me for college.
In reality: Adequate preparation for college usually requires a more demanding curriculum than is reflected in minimum requirements for high school graduation, sometimes even if that curriculum is termed "college prep."

Many students believe that: Getting into college is the hardest part.
In reality: For the majority of students, the hardest part is completing college.

Many students believe that: Community colleges don't have academic standards.
In reality: Students usually must take placement tests at community colleges in order to qualify for college-level work.

Many students believe that: It's better to take easier classes in high school and get better grades.
In reality: One of the best predictors of college success is taking rigorous high school classes. Getting good grades in lower-level classes will not prepare students for college-level work.

Many students believe that: My senior year in high school doesn't matter.
In reality: The classes students take in their senior year will often determine the classes they are able to take in college and how well-prepared they are for those classes.

Many students believe that: I don't have to worry about my grades, or the kind of classes I take, until my sophomore year.
In reality: Many colleges look at sophomore year grades, and, in order to enroll in college-level courses, students need to prepare well for college. This means taking a well-thought-out series of courses starting no later than 8th or 10th grade.

Many students believe that: I can't start thinking about financial aid until I know where I'm going to college.
In reality: Students need to file a federal aid form prior to when most colleges send out their acceptance letters. This applies to students who attend community colleges, too, even though they can apply and enroll in the fall of the year they wish to attend.

Many students believe that: I can take whatever classes I want when I get to college.
In reality: Most colleges and universities require entering students to take placement exams in core subject areas. Those tests will determine the classes students can take.

Note. An excerpt from "Betraying the College Dream," by Andrea Venezia, Michael W.
Kirst and Anthony L. Antonio, March, 2003, Stanford University Bridge Project, p. 31. 23 While the premise of the Strategic-Impact-Triad Model construct is to correlate its factors via perceptual reflections from students and faculty, a reality must also be identified. The institution is not solely responsible for the success of the student as there are extenuating circumstances at work. Students are exposed to a full gamut of experiences prior to college; however, it is the responsibility of the institution to become cognizant of student experiences to match student needs to student success practices (Haycock, 2006; Perna & Thomas, 2006). Student experiences form the perceptions students have of their respective or prospective community colleges (see Table 4). There is also a misconception by community college students in terms of what classes a student can take. The perception of students is that they can take any courses they want. In reality, community colleges require placement tests, transfer courses, or ACT/SAT scores to place students in the correct courses. If placement tests indicate scores which do not meet the COMPASS-normed, cutoff-scores (ACT COMPASS System, 2006) to place students in college-level English, Math, or Writing courses, students are directed into remedial courses to prepare them for college-level coursework (Greene, 2000). Institutional practices which support a culture of remediation-is-a- necessary-evil is more likely to negatively impact student success as a perceptual outcome on the part of the student (Alliance for Excellent Education, 2006). Institutional practice is as much a perception as the misconceptions of students preparing for and attending college (Venezia, Kirst & Antonio, 2003). As noted by Kuh (2007), ?what students perceive that an institution values and emphasizes makes a difference? in student success (p. 7). 24 Perceptions in the Community College Open door policies in the community college system of education across the nation have set a course of significant access to higher education for countless numbers of students (Horn, Nevill & Griffith, 2006; Vaughn, 2004; Wirt, Choy, Rooney, & Provasnik, 2005). Alongside this influx of students in the community college comes the necessity to understand the perceptions of students in terms of their actions. As suggested by Horn and Ethington (2002), ?there is a strong relationship between the extent to which students become involved in the academic and social systems of their educational institutions and their perceived gains in growth and development and the attainment of their educational goals? (p. 404). The relationship between a student?s institution and perceived success is what Bailey (2006a) refers to as ?what colleges and universities can do to promote their students? success? (p. 2), or a ?can-do? attitude. For the community college, understanding perceptions between students and faculty is vital to promoting and improving student success. In terms of the SIT Model and institutional practice, it is the perceptions of faculty and students which will inform the community college decision-makers about the impact institutional effort has on academic preparation practice, work ethics practice, and institutional support practice. Furthermore, the comparison of the perceptions between faculty and students provides feedback and input into how the factors of the SIT Model either support or deter student success. 
And most importantly, if community college students and faculty perceive institutional practice as hindering student success, it is imperative that the community college leadership seek out these perceptions to become action-items to improve student success throughout the institution (Long, 2006; Richardson, 2006). 25 College Student Success and Institutional Practice: Competing Agendas Competing agendas as suggested by The Chronicle of Higher Education (2004) are educational issues which will press the community college most for solutions. The relationship between college student success and institutional practice is an example of a pressing issue and one of the greatest challenges facing the community college system of education (Alvarado, 2006; Callan, Finney, Kirst, Usdan, & Venezia, 2006). Bailey et al. (2005a) conducted a study of student success in the community college and identified several ?institutional characteristics that affect[ed] the success of community college students? (p. 2). Success, as argued in the study, was a composite of several competing agendas, including but not limited to: financial resources, efforts in retention, multi- institutional attendance, leadership, faculty relations, and local political influence. Underlying the analysis of competing agendas in the community college is the relationship between institutional resources and accountability (Boggs, 2004; Dougherty & Hong, 2005; Jacobson, 2005; The Chronicle of Higher Education, 2004; VanWagoner, Bowman & Spraggs, 2005). For example, institutional resources are required to promote student success at the same time that measurable outcomes must be reported to verify student achievement. To verify student achievement, accountability practices must be effective from enrollment to achievement of stated educational goals. As suggested by Voorhees and Zhou (2000), ?efforts to assist community college students in defining [and achieving] their goals should last beyond their initial period of matriculation? (p. 232). College student success is an outcome of institutional practice; even so, the success of students can often end up as a competing agenda with practices which do not promote student success in the institution (Haycock, 2006). 26 Academic Preparation For the high school graduate who enrolls in the community college, academic preparation is but one of the prerequisites to success. Basic skills are the ingredients upon which the educational system is dependent for establishing, maintaining, enhancing, and supporting global competitiveness (Krueger, 2006; Phillips & Skelly, 2006). A student?s readiness for college?not eligibility?is specifically dependent upon prior academic preparation as a contributing factor. For instance, as basic skills improve, student success is more likely to improve linearly. Studies by the ACT (2006a, 2006b), Kaye, Lord, and Bottoms (2006), Callan, Finney, Kirst, Usdan, and Venezia (2006), Karp, Bailey, Hughes, and Fermin (2005), and the U.S. Department of Education (2006), have all noted that academic preparation is one of the most important educational challenges this nation has ever faced; academic preparation is critical to college student success. According to Byrd & MacDonald (2005), 41% of students attending community colleges were underprepared in at least one of the basic skills needed to succeed in college, e.g., reading, writing, or math. 
Community colleges must be intentional connoisseurs of institutional research because the data are the guide to effective institutional practices to counteract student academic preparation deficiencies. Moreover, academic preparation as a factor of student success is about meeting the needs of the student, whether that preparation is remedial coursework, extra time on task, tutoring sessions, effective teaching, or even one-on-one time with the instructor after class. Student success is heavily dependent on the opportunity to be challenged and supported academically within the framework of institutional practices which promote student achievement (Braxton, 2006).

Work Ethics or Soft Skills

Work ethics has been defined as "the desirable characteristics for a potential employee" (Hill & Petty, 1995, p. 59). Also referred to as employability or soft skills, work ethics play a vital role in student success. According to Robinson (2000), employability skills are basic job skills which are fundamental to "getting, keeping, and doing well on a job" (p. 1). For the community or technical college student, the work ethics that apply on the job transpose directly to doing well in the classroom. Strom and Strom (1999) used the Peer and Self-Evaluation System (PSES) to inform teachers in the community college about group interaction from the student point of view. The premise of the PSES was "based on the assumption that groups of people who can work together will be the key to success in the emerging global marketplace" (p. 171), while "group success depends on individual accountability" (p. 172). Teamwork is a work ethic and a college student and faculty success indicator. According to WorkEthics.Org (2006), the number one priority of Georgia's employers is to create a viable and effective workforce by teaching the following work ethics to students: 1) Attendance, 2) Teamwork, 3) Attitude, 4) Organizational Skills, 5) Cooperation, 6) Character, 7) Appearance, 8) Productivity, 9) Communication, and 10) Respect. Students who attend the community or technical college without these work ethics are more likely to be less prepared to do college-level work than those students who possess these traits to a greater degree. Therefore, these work ethics have a direct impact on student success and are direct factors about which students and faculty have perceptions. Measuring, comparing, and reporting these relationships will inform the community or technical college of actions to be taken in promoting student success.

Institutional Support

Institutional support is a factor which impacts student success before, during, and after enrollment. As previously noted, college-readiness includes a plethora of factors, themes, or variables which have the potential to positively or negatively influence the student's ability to be prepared to enter college. Once the student has achieved enrollment, institutional support structures should permeate each thread of the college student success domain. Weimer (1994) argued that "we believe that the factors which affect student learning exist in four different areas: 1) the curriculum, 2) with faculty and in the classroom, 3) outside the classroom, and 4) via the organizational policies and structures of the institutions they attend" (p. 4).
Amey and Long (1998) conducted a study comparing successful and unsuccessful underprepared students, e.g., deficits in college-readiness; the study concluded that ?differences in outcomes for the students in the two groups were related to actions taken by the students and/or the institution while the student was in attendance? (p. 5). Institutional support in the community college is to foster recruitment, retention, goal attainment, and graduation for every student. It should be noted that institutional support is a matter of institutional practice and that ineffective institutional support or practice is significantly harmful to student success (Bailey, 2006a; Long, 2006; Richardson, 2006). A student who is academically prepared for college successfully enrolls and begins attending a community college. The goal of the institutional support framework is to meet the ?student services? needs of the student. When ?student services? fail to provide for the success of the student as a factor of the students overall persistence to continue in college, student success is harmed. College-readiness is comprised of not 29 only being ready to enter college, but also to persist while in college. Institutional support structures are those policies and practices which enable students to persist while in college and include, but are not limited to: an efficient registration process; remedial courses as needed; advising; parking; college culture; counseling; learning-conducive facilities; approachable faculty; friendly support staff; and, institutional leadership. If these variables do not lend themselves to the success of the student, a student who is ready for college will be more likely to consider the institution unsupportive of their success and less likely to persist; institutional support and practice should be reflective of a significant community college (VanWagoner, Bowman, & Spraggs, 2005). As students and faculty interact in the domain of teaching, learning, and assessment, the variables (factors) of academic preparation, work ethics, and institutional support may easily interfere with the relationship between faculty and student. Wyatt, Saunders, and Selmer (2005) noted that ?69 percent of the student respondents indicated that they were achieving their academic potential, [whereas] only 22% of the faculty respondents felt that their students were reaching their academic potential? (p. 32). The perceptual difference between students and faculty indicated that to improve the relationship between these groups, the factor of academic preparation needed to be understood to a greater degree by investigating how these groups perceived academic preparation as promoting student success. Faculty and student groups are the independent variables used to assess the dependent variables of academic preparation, work ethics, and institutional support as impacting community college student success. To ignore student and faculty perceptions as impacting student success is to avoid the reality of a methodology to improve student achievement in the community college. 30 Statement of the Problem As community college students and faculty interact in the Teaching-Learning- Assessment Domain, critical factors influence student success. A significant amount of research has been conducted to investigate the enormity of variables impacting the student success domain. 
However, there is a shortage of research investigating the relationship of variables impacting community college student success within the framework of institutional practice. For this study, no mention was found in the literature of the grouped impact factors as noted in the Strategic-Impact-Triad Model, e.g., academic preparation, work ethics, and institutional support. Therefore, the problem undertaken in this study was to investigate the relationship of the Strategic-Impact-Triad Model factors as they impacted student success as perceived by students and faculty within the framework of institutional practices (Bailey, 2006a; Kuh et al., 2004). If community colleges do not properly acknowledge the perceptions of individuals directly involved in the success of a student, policies and application are more likely to be haphazardly practiced (Long, 2006). The problem, therefore, is that to ignore the data is to make decisions which are more likely to be erroneous. As argued by Braxton (2006), "...college student success stands as a topic that cries out for some form of systematic empirical attention. Without the benefit of such scholarly attention, uninformed, ad hoc views on student success and [ineffective] ways to achieve student success will emerge" (p. 1). A directly related problem to be examined is how the perceptions of faculty and students may be utilized as input variables to influence the realignment of policy and practice to promote student success (ACTE, 2006; Dobelle, 2006; Kuh et al., 2006). The major hypothesis of this study was that there is a statistically significant perceptual difference or similarity regarding student success within and between student and faculty groups; additionally, the study compared the perceptions of students and faculty as they respectively perceived the relationship between institutional practice, academic preparation, work ethics, and institutional support as these variables positively or negatively influenced college student success. This research is directed at community college decision-makers to suggest the power of perceptions in influencing policy decisions and institutional practice. The questions to be answered in this study were operationalized by grouping students and faculty as primary sources of perceptual data. Additionally, the research questions investigate relationships between and within groups to suggest the strengths or weaknesses of correlation between perceptions and how student success may be improved (Adelman, 2006; Maypole & Davies, 2001; NCES, 2003; Overby, 2004; Sanoff, 2006).

Purpose of the Study

The purpose of this study was to investigate underlying perceptions of students and faculty and how these perceptions relate to student success initiatives and policies. Students entering the doors of the community college have self-expectations, differences in high school preparation, and personal experiences which may significantly differ from what faculty members perceive or expect of students (Perin, 2006). Variances in skills, experiences, and perceptions become evident when students are required to take a placement test or complete an attitudes/opinions survey, e.g., the Comprehensive Computer-Adaptive Testing System (COMPASS) (ACT COMPASS System, 2006) or the College Student Inventory (CSI-B) (Noel-Levitz, 2006).
Outcomes of these types of entrance 32 exams or surveys in the community college give rise to a concern for student preparation to enter college, and to also compete at an acceptable level through the maze of coursework, study skills, and persistence. The number of students requiring remediation to formally begin community college level courses range from 30% to 94%, with the 94% being a valid outlier for very specific high school systems (Conley, 2005; Hammons, 2004; Phipps, 1998; Spann, 2000). This study explored the perceptions of college student success by students and faculty in the community college and correlated these findings with qualitative open- ended questions. Outcomes of this study are to inform not only educational administrators of the serious issues surrounding these perceptions of student success and institutional practice, but to also inform policy designers that perceptions can be used as input variables to properly redress misaligned or ineffective policies or practice to better promote college student success, including remedial education. In terms of how students and faculty separately and collectively perceive student success, Lindholm, Szelenyi, Hurtado, and Korn (2005) noted that 36% of postsecondary faculty (from four-and-two-year institutions, both public and private) considered that most students are well prepared academically for college. Forty-one percent of all survey respondents ? and 65% of faculty at public two-year colleges ? revealed that most of the students they taught lacked the basic skills needed for college-level coursework, whereas 70% of entering college students perceived themselves as above average or in the highest 10% academically. These perceptions by faculty and students can have detrimental outcomes for students if they are translated into policy action, reflected in faculty practice, or remain unchallenged by policy-makers. 33 Student success is a matter of perception on the part of both the student and faculty member (Dalgety & Coll, 2006; Lynch, 2005; Sanoff, 2006). Levine and Cureton (1998) suggested an increasing gap between how students learn most effectively as compared to faculty teaching methods. Students have a perception of learning that is practical, real-world, linearly-structured, and primarily focused on the concrete, physical environment. Conversely, faculty view learning as a process of stimulating students by using concepts, ideas, and abstractions. Furthermore, the perception of faculty is that students should be independent learners and need a significant level of autonomy in their assigned work. The major disconnect between these two group perceptions is best summarized by the results of Levine and Cureton (1998): ?Small wonder, then, that frustration results and that every year faculty believe students are less well prepared, while students increasingly think their classes are incomprehensible? (p. 16). Misaligned perceptions even extend to developmental or remedial studies in which selected faculty view successful completers of developmental or remedial courses as ?academic underachievers? (Overby, 2004, p. 1). The dichotomy in current research indicated that comparative perceptions of students and faculty do not necessarily align themselves in terms of teaching, learning, policy directives, practice, college-readiness, or student success. Brozik (2004) reflected on student preparation and success: No kidding, I mean it. Whom do I blame? 
I teach upper-division and graduate courses, and I am constantly confronted with students who cannot spell, who do not or will not read, and whose math skills are simply appalling. I spend a whole lot of time trying to get these kids up to a reasonable level of literacy. I should be teaching content, but, oh no, I just try to get past sentence fragments. (p. 25) 34 College-readiness has been studied and identified as a problematic source of educational dysfunction. The outcomes of a lack of college-readiness are specifically and minimally indicated in test scores, GPA, writing, reading, and college student success. However, this study investigated the perceptions of students and faculty to focus on respective viewpoints which are used as a basis to make decisions to improve student success. The outcomes of these measured perceptions will then become the framework to determine how institutional practice may be challenged for the purpose of enhancing future practice which perpetually enhances student success in college and life. Research Questions The following research questions were used in this study: 1. What is the relationship between faculty and students? perceptions in assessing the impact that academic preparation has on the success of the college student? 2. What is the relationship between faculty and students? perceptions in assessing the impact that work ethics has on the success of the college student? 3. What is the relationship between faculty and students? perceptions in assessing the impact that institutional support has on the success of the college student? 4. What is the relationship between faculty and students? perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support? 35 Significance of the Study The significance of this study is embedded in the daily routines of education. Students and faculty regularly meet to exchange ideas, participate in teaching-learning, interact as human beings, and react as they respectively perceive their environments. Perceptions are force-multipliers in the eyes of the individual and, therefore, must be understood and researched to a significant level in order to become a catalyst for change. This study will assist in identifying perceptions which impact student success policy and practice. College student success policy which is ineffective or insignificant interferes with educational outcomes at the earliest stages of the P-16 process and in many cases proceeds through middle school, high school and college (Van de Water & Rainwater, 2001). This study will have a potential impact on policy designers as student success issues are studied and promulgated to the educational community. As underlying perceptions suggest the actual interpretation and application of policies applied to student success, this study will have considerable significance to policy designers who impact the lives of the future student population in the United States. This study will contribute to the literature in how students and faculty ? two major educational players in the teaching- learning process ? relate to each other perceptually and what these differences might suggest to college student success stakeholders and policy-makers (Bailey & Alfonso, 2005; Kuh et al., 2006; Luo & Jamieson-Drake, 2004; Robbins et al., 2005; Smart, Feldman & Ethington, 2006; Smith, 2005; Tinto & Pusser, 2006). 
36 Limitations of the Study The limitations of this study are summarized below: 1. Perceptions data were collected only from community college students and faculty and may limit the specific applicability and transferability of the research to four- year institutions; 2. Stakeholders in this study are recognized as all individuals for whom college student success is a part of their respective consideration. The limitation in this regard is that this study delimits the stakeholders to students and faculty, with full disclosure that all stakeholder input and perceptions would necessitate a much broader scale of research; 3. Sampling sub-scales of student respondents in this study did not specifically distinguish between full-time, part-time, first-year, first-generation, returning student, gender, or reverse-transfer students in the population; 4. Sampling sub-scales of faculty respondents in this study did not specifically distinguish between full-time, part-time, gender, and years of experience; 5. Independent and dependent variables which define, categorize, quantify, qualify, or impact student success are delimited in scope to focus this study on the Strategic-Impact-Triad Model factors of academic preparation, work ethics, and institutional support (within the framework of institutional practice). 37 Assumptions of the Study The assumptions of this study are summarized below: 1. This study assumed that faculty in the community college are readily cognizant of issues and research regarding student success and have well-established perceptions of academic preparation, work ethics, institutional support, institutional practices, and are motivated to provide viable input; 2. This study assumed that students in the community college are nominally cognizant of issues regarding student success, have sufficiently-established perceptions of academic preparation, work ethics, institutional support, institutional practices, and are motivated to provide viable input; 3. It was assumed that students and faculty provided accurate feedback to survey questions as a matter of actual perceptions related to academic preparation, work ethics, institutional support, and institutional practices; 4. The underlying global assumption for this study was that the findings from three major studies could be grouped to form the factors in the Strategic-Impact-Triad. Based on a review of the literature, the Strategic-Impact-Triad Model was derived from expert opinions of established researchers: Robbins et al. (2004), Kuh et al. (2006), and Smith (2005). To delimit the scope of the findings by Robbins et al., (2004), Kuh el al. (2006), and Smith (2005), this dissertation assumed a composite, grouping methodology to categorically derive the factors of academic preparation, work ethics, and institutional support as the determining factors which would be investigated within the framework of institutional practice. 38 Definitions of Key Terms The following terms are used in this study and indicate general and specific applicability to the community college system of education. American College Testing Service. The American College Testing Service is interchangeable with ACT, or ACT, Inc. Most references in this study for the American College Testing Service will be indicated as ACT. Baby Boomers. Individuals born between 1946 ? 1964 and comprise the largest student population in the history of education until the rise of the millennial generation (see definition for Millennials). College-eligible. 
The process established by policy in the educational community in which a student has met all requirements for entry into college. College Preparation, College Preparedness, Student Preparedness, Student Readiness, or Student Preparation. These items are synonyms for college-readiness. College-Readiness. The conceptual ideal that a student is academically prepared to engage and persist in the rigors of college-level work (courses) as a means to complete a college degree (Kazis, 2006). College-readiness also includes any postsecondary education or training in which a student is prepared to engage for the purpose of improving his or her life-long learning and self-sustaining workforce attributes. College-Readiness Policy Realignment Model. A model to indicate the need to realign college-readiness policies to improve the system of P-16. This definition has a significant relationship to perceptions as policy and perceptions are correlates of one another. 39 College student success. A dynamic, moving-target construct which signifies that a student has achieved a stated goal which may not necessarily be a college degree but may include only a set of courses, a technical objective, or a field-of-study certification. Student success is difficult to define due to the extreme number of factors to define student success. For this study, college student success is a general theme in which any factor which hinders students from improving themselves is categorized as part of the construct of college student success (Kuh et al., 2006; Bailey, 2006a; Robbins et al., 2004). COMPASS. A copyright testing and placement service of the ACT, COMPASS is ?much more than a series of tests.? The COMPASS? system is a comprehensive computer-adaptive testing system that helps place students into appropriate courses and maximizes the information postsecondary schools need to ensure student success. Community College(s) or Community College System of Education. The national educational system of two-year institutions includes technical, community and junior colleges offering postsecondary education ranging from specialized certificates in technical training to two-year transfer college degrees of general studies or highly professional fields. Included in this definition is the interchange of the terms ?community college?, ?community/junior college?, ?junior college?, ?technical college?, ?community/junior/technical?, or the generic term of ?community college? to represent the community college system of education. ?Technical College? will be used specifically when defining or describing the technical college as a vocational institution, when appropriate, or its attributes. 40 Dual-Enrollment. The process of high school students dually enrolled in high school and college as a means to increase their potential for college-readiness success. Faculty. Specific to this study, faculty will be classified as those individuals with primary, secondary, or tertiary responsibility in the classroom as ?instructor of record? within the community college system of education. Gen X. The generation of students born between 1961 and 1981 and have been identified as a group with the attributes which differ from other generations and require an understanding of their perceptions of college-readiness. Middle Schoolism. 
An approach to educating children in the middle grades (usually grades 5-8), popularized in the latter half of the 20th century, that contributed to a precipitous decline in academic achievement among American early adolescents (Yecke, 2005). Millennials. The generation of students born between 1982 and 2002 and have been identified as the largest potential pool of students since the Baby Boomer generation and will statistically and significantly impact college-readiness or student success research. Open-Door Policies. Within the community college system of education, open- door policies are those policies and practices which afford ?open-access? to all students who apply to enroll in a community college regardless of the declared objective of the individual student, e.g., one course, a certificate, retraining, vocational training, degree, transfer courses, etc. (Milliron & E. de los Santos, 2004; Phillippe & Sullivan, 2005). P-12. The system of K-12 to include Pre-K as noted in the research literature and as referenced in P-16. 41 P-16. An acronym for a seamless educational system in which college-readiness policies linearly support the longitudinal process of P-16 education from Pre-K to completion of a four-college degree. The P-16 system has three student success delimiters: 1) guiding a child from early care through high school to prepare for college; 2) the successful completion of a two-year college degree or technical training; 3) the successful completion of a four-year college degree (Paredes, 2006; Pipho, 2001). Perceptions. Perceptions are defined as the processes which form ideas and understandings about the world in which an individual lives. Society, peers, upbringing, experiences, high school, rules, laws, policies, and so forth, are the ?shapers? of individual perceptions. Emphasis in this study is given to how policy has influenced perceptions and how perceptions might realign college-readiness or student success practices and/or policies. Perceptions Research. A systematic process of statistical analysis which measures and reports perceptual data to indicate the impact on college student success. Perceptions research is used in this study to suggest how these perceptions might statistically impact the present and future actions of policy designers. Policy or Policies. A written document or set of documents in which the document(s) is/are presented to an organization as a matter of guide to achieve specific or general goals. Policies may be interpreted differently in terms of how the policies are perceived, carried out, and reflected in the culture of the organization. For this study, policy is further defined as ?the catalyst which creates educational perceptions and outcomes? (Venezia, 2005). 42 Policy Alignment. Policies which are aligned are effective guidelines which reflect the perceptions and actual practices in the educational institutions. Alignment is the process of contiguous positive correlation between what is perceived and practiced in the halls of the institution and institutional practice or application (Venezia, 2005). Policy Designers. Any individual or group of individuals who have influence on shaping policy or institutional practice which impacts college student success. Policy Realignment. 
The intentional process of reviewing current educational policy and practice in full view of feedback, input, opinions, and perceptions by all stake holders for the sole purpose to realign policy and practice to improve educational outcomes, e.g., community college student success. Remediation or Remedial Education. The requirement of a student to participate in a developmental course prior to the student being permitted to participate in a college- level course of the same or related subject matter as required by the institution (NCES, 2004-010; NCES, 97-584). Remediation is noted in this study as a variable of deficiency in college-readiness and is determined by community college testing services for high school students who have not taken the ACT, SAT or who do not have transfer courses in General Education core courses. Remedial education courses provide the solution. Reverse Transfers. Degree-holding students attending a community college to upgrade a skill, acquire new skills, or acquire non-credit learning. Of the students in the reverse transfer process, 28% have at least a Bachelor?s Degree (Boggs, 2004). Students. Community college students are individuals enrolled in the college in any course, program of study, or activity in which the stated goal is a degree, certificate, specialized training, college transfer, or other stated goal. 43 Stakeholders. Any individual or group of individuals which have direct or indirect influence on student success policy or practice at any level in the P-16 system. Student Swirl. The non-linear matriculation of community college students as they enter and/or leave college in pursuit of their educational goals. Organization of the Study The organization of this study is segmented into five Chapters. Chapter I included an introduction to the scope of student success and stipulated the objectives of the research in terms of the research questions. Moreover, the relationship between student success, policies, and outcomes has been suggested and included the problem to be researched, limitations, specific terms, significance and purpose of the study. Chapter II presents a review of the directly and indirectly related literature of student success, perceptions of students and faculty, the community college, factors of the Strategic-Impact-Triad Model, institutional practice, and a summary. In Chapter III, the methodology of the study is organized into the research design, population, sampling, instrumentation, procedure, data analysis, confidentiality and anonymity, reliability and validity, and a summary. The results or findings of the study will be statistically presented in Chapter IV, whereas Chapter V will discuss the conclusions and recommendations of the study. An Appendix includes Survey Instruments, Letters of Intent, pilot test information, and other supporting or related research material pertinent to this study. 44 Chapter Summary College student success is a significantly investigated educational phenomenon, while student success in the context of institutional practice is not. Because institutional practice strategically impacts student success, there is a need for additional and continued research to uncover and suggest evolving solutions. Furthermore, this chapter identified studies which have investigated student success; from these studies, a Strategic-Impact- Triad Model was designed to address three factors significantly impacting student success: 1) academic preparation, 2) work ethics, 3) and institutional support. 
The factors in the Strategic-Impact-Triad Model were assessed within the framework of institutional practice to better understand the underlying relationships promoting and improving college student success. Student success is a national priority; in the absence of college-readiness, the outlook for enrollment, persistence, graduation, and a strong economy is comparatively and statistically less impressive than under a strong national policy of college-readiness and success for all students (Phillips & Skelly, 2006). Figure 2 provides a global graphical summation of this chapter and indicates the depth of interactive variables involved in achieving student success. The overarching hypothesis of this study was that the perceptions of students and faculty in the community college are the results of institutional practice and that practice positively or negatively impacts student success. The applicability of the findings of this study suggested to policy-makers that how policies are perceived is the reality of how policies are practiced. To omit perceptions as variables in designing policy is to omit a major source of valuable information in making life-changing decisions for students, faculty, and other stakeholders directly or indirectly related to college student success.

Figure 2. Community College Global Model of Student Success. [The figure depicts students and faculty within the student success domain, bounded by perceptions and institutional practice, and lists a non-exhaustive set of "voluminous factors" impacting college student success, e.g., parents, legislators and policymakers, student engagement, educational administrators, faculty perceptions, high school GPA and remediation, ACT/SAT, dual enrollment, dropout/stopout, diversity of ability and preparation, peer pressure, attitudes and work ethics, community support, core curriculum, workforce issues, socioeconomic considerations, high school support and counseling, first-year and first-generation students, Millennials and Gen X, student swirl, enrollment delay, returning adult learners, perceptions of self, others, school, and college, misaligned policy, and educational practice, together with the embedded studies of Robbins et al. (2004, nine constructs), Kuh et al. (2006, 14 indicators), Smith (2005, 51 competencies), and ERIC (6,287 documents).]

CHAPTER II

LITERATURE REVIEW

"When you reach a point in your life when you have to question the magic of creation, and all the world seems against you, remember: we do not live in a world of reality, we live in a world of perceptions... it is up to you to decide, to which world you belong." --- Justin S., Age 15 --- Indiana, 2005

"People can be divided into three groups: 1) Those that make things happen; 2) Those who watch things happen; and, 3) Those who wonder what's happening." --- Anonymous

Introduction

This chapter reviewed the literature related to student success, perceptions, and the relationship between academic preparation, work ethics, institutional support, and institutional practice. To address college student success in the context of the community college, a historical perspective of community colleges was presented as a stage upon which to pursue the literature on issues and factors impacting student success.
The chapter was generally delimited into the topics of: the history, role, demographics, and issues of the community college; institutional practices and student success research; academic preparation; work ethics; institutional support; and institutional practices. All of these areas included perspectives on student-faculty perceptions. The chapter also included several tables, figures, and a significant summary.

Historical Perspective and Role of the Community College

The Yale Report of 1828 proposed, among other things, that students who were deficient in their college-readiness preparation for college success should not be allowed to enroll in the college. However, Charles W. Eliot, in his 1869 presidential inaugural address to the faculty and staff at Harvard University, disagreed with the report. He took the opposing view when he openly suggested that "...the American College is obliged to supplement the American school. Whatever elementary instruction the schools fail to give, the college must supply" (Spann, 2000, p. 2). The reference to college has historically and gradually come to include community, junior, and technical colleges, while the reference to supplementing the American school has specific implications for the open-door policies of the community college system of education (Bailey et al., 2005a; Bailey, 2006b; Evelyn, 2004a; Franco, 2002). Blocker, Plummer, and Richardson (1965) noted the existence of private and public two-year colleges in the 1800s. The validity of these institutions as a correlation to the historical context of the community/junior/technical college construct is that these "institutions acted as post-secondary schools and were similar in content to the first two years of American colleges of the times" (Geller, 2001, p. 3) by dividing "the upper and lower divisions" of the college or university (Wattenbarger & Witt, 1995, p. 18). Three examples cited were (1) Monticello College, established in 1835; (2) Susquehanna University, in 1858; and (3) the University of Michigan's junior college, in 1883. A major exponential factor which contributed to the creation of public colleges during this period was the Morrill Act of 1862 (Phillippe & Sullivan, 2005), in which 30,000-acre land grants were parsed out to individual states. Embedded in the longitudinal construct of the Morrill Act was the framework to provide open access to postsecondary education and the need to train future workers for the industrial revolution of the first decades of the 20th century. As a result, junior colleges evolved to offer both liberal arts education and vocational training, as compared to traditional public and private colleges and universities which maintained a more liberal arts or professional approach to education (Cohen & Brawer, 2003). Established in 1901, Joliet Junior College is the oldest continuously operating public two-year college in the United States (Phillippe & Sullivan, 2005). Joliet Junior College was conceptualized by William Rainey Harper and J. Stanley Brown as an extension of high school by offering a "fifth and sixth year of study beyond high school that was comparable to the first two years of college" (p. 1).
One of the major purposes of Joliet Junior College was to provide open access to high school graduates who lacked financial resources or who were not college-ready for the academic rigors of highly competitive universities; this was a precursor to the open-door policy of the community college system of education in operation today (Boggs, 2004; Chronicle of Higher Education, 2004; NCES, 2003, 2006).

According to Wattenbarger and Witt (1995) and Cohen and Brawer (2003), by 1901 there were approximately nine two-year colleges in existence. Moreover, historians of the junior college movement agreed that the national movement officially began during the 1890s in the Midwest. Several notable individuals have been credited with the junior college movement: (1) William Rainey Harper of the University of Chicago, (2) Alexis Lange of the University of California at Berkeley, (3) Edmund James of the University of Illinois, (4) David Starr Jordan of Stanford, (5) Henry Tappan of the University of Michigan, and (6) Anthony Caminetti, State Senator of California (Erdman & Ogden, 2000; Kintzer & Bryant, 1998; Wattenbarger & Witt, 1995).

Of particular importance in the literature on community colleges, William Rainey Harper of the University of Chicago publicly argued for a separation of the first two years of college from the more advanced studies of the last two years and graduate work. He was noted to have opined that "it is not until the end of the sophomore year that university methods of instruction may be employed to advantage" (Brint & Karabel, 1989, p. 24). Moreover, Harper's hypothesis suggested five attributes (Eells, 1931, p. 48) of the first two years of college which have direct research implications for today's junior, community, and technical colleges (Cohen, 2005; Erdman & Ogden, 2000; Kintzer & Bryant, 1998).
1. Many students will find it convenient to give up college work at the end of the sophomore year (Ausburn, 2002; Horn, Nevill & Griffith, 2006);
2. Many students who would not otherwise do so will undertake at least two years of college work (Bragg, 2001; Burd, 2006; Voorhees & Zhou, 2000; Welsh, Brake & Choi, 2005);
3. The professional schools will be able to raise their standards for admission (Almeida, 1991; VanWagoner, Bowman & Spraggs, 2005);
4. Many academies and high schools will be encouraged to develop higher work (Phillips & Skelly, 2006; Van de Water & Rainwater, 2001); and,
5. Many colleges which have not the means to do the work of the junior and senior years will be satisfied under this arrangement to do the lower work (Day & McCabe, 1997; Florida Community Colleges & Workforce Education, 2005; Wattenbarger, Haynes & Smith, 1982; Jenkins & Boswell, 2002).

Although Harper suggested the separation of the first two years from the latter years of a student's college studies, the outcome of Harper's five attributes has been of great significance to the system of two-year education, including the role and identity of the community college system of education. The role and identity of the community college were solidified in the 1947 Truman Commission Report on Higher Education, when it was noted that "the federal government, for the first time, fully recognized the important role of the community colleges. The Truman Commission called for public postsecondary education for all Americans, regardless of race, creed, color, sex, or economic status" (Smith, 1997, p. 1264).
Community colleges serve a unique and vital role in the educational system of the United States, as noted in state-by-state community college studies of, for example, Massachusetts, the State University of New York, New Jersey, Colorado, and Oregon (DuBois, 1999; Harbour, Davies & Lewis, 2006; Motta, 1999; Nespoli & Gilroy, 1999; Widson et al., 2006). The community college system of education has evolved to become the dynamic bridge between high school graduates, a two-year degree, a four-year college degree, and workforce training. For a significant number of these working-class students, a two-year college degree is the only option available when pursuing postsecondary education (Burd, 2006; Voorhees & Zhou, 2000); conversely, the dynamics of community college student goals do not always translate into a degree, either two-year or four-year, due to factors or variables that challenge student success or deficits in college-readiness (Bailey et al., 2005a; Bailey, Jenkins & Leinbach, 2005; Venezia et al., 2005; Sanoff, 2006). However, Boggs (2004) suggested that the community college is a fundamental higher education resource throughout the nation.

Milliron and E. de los Santos (2004) and the American Association of Community Colleges (2006a) suggested the following: (1) the community college is an integral part of the educational system which cannot presently be ignored; (2) the community college system of education is projected to remain a future national education and economic asset, as corroborated by the research of VanWagoner, Bowman and Spraggs (2005); and (3) the community college is a significantly powerful and comprehensive institution of "educational, economic, and social dynamics" (Milliron & E. de los Santos, 2004, p. 106) which provides opportunity for personal and professional goals via the comprehensiveness of the community college system. For example:

The comprehensive community college is woven into the fabric of American life, and increasingly into the social tapestry of the world. The students of the community college run multinational corporations, fly through space, star in movies, provide leadership in statehouses, and map the human genome. (Milliron & E. de los Santos, 2004, p. 106)

Categorical data supporting Milliron and E. de los Santos and related to the generations, characteristics, and role of the community college system of education are compiled and summarized in Table 5. For example, the generations noted suggested that the community college has been a mainstay for the development of a workforce which has generally not been privy, able, or prepared to attend four-year colleges or universities for the past 100-plus years. In terms of economic scale, the community college system of education provides future earning power for community college students, counties, and states, as suggested by Gillum and Davies (2003). Table 5 identified how perceptions of legislators do not necessarily mesh with actual facts or data, particularly when the data is available through research but invalidated from an absence of application or acceptance (Cohen, 2005).

Table 5
Community College Generations, Characteristics, Principles, and Earning Power

Generations and Characteristics (Tillery & Deegan, 1985):
1. 1900-1930: an extension of secondary schools
2. 1930-1950: Junior College Generation
3. 1950-1970: Community College Generation
4. 1970-1985: Comprehensive Community College Generation
5. 1985-1999: unnamed (Tillery & Deegan, 1985)
6. 1999-2006: Learning Community College Generation (Geller, 2001); Linchpin Institutions (Milliron & E. de los Santos, 2004)

Six Key Principles of the Learning Community College (O'Banion, 1997):
1. creating substantive change in individual learners
2. engaging learners as full partners in the learning process
3. creating and offering as many options for learners as possible
4. assisting learners in forming and participating in collaborative learning projects
5. defining the roles of learning facilitators by the needs of the learners
6. documenting improved and expanded learning for its learners, the only way the learning college and its facilitators succeed

Earning Power (Gillum & Davies, 2003, pp. 249-250): Based on the qualitative and quantitative data developed and presented in the study, it is possible to draw certain conclusions:
1. the perceptions that are held by these legislators concerning the degree to which this community college impacted the state and local economies are based more on a belief than on quantifiable data;
2. the perceptions that are held by the legislators concerning the impact that the college has on individual earning ability also are based more on belief than quantifiable data;
3. the data generated by the economic impact analysis model seems to indicate the value this community college has to both the state's and county's economies;
4. the economic impact model clearly indicates that the college is a major source of secondary jobs for both the county and state;
5. the data generated from the state's wage records provide direct evidence confirming that program completion at the college has a positive impact on the earning power of its program completers.

As demonstrated in Table 5, perceptions are important not only in understanding the characteristics, principles, and earning power in the community college; they form the basis of action in the daily business of the two-year college system. As suggested by VanWagoner, Bowman and Spraggs (2005), the significant community college is an institution in which the principles, characteristics, and practices establish a framework to build "a pervasive passion for mission and accomplishments" (p. 47), inclusive of creating a culture of inquiry to improve student success in the community college.

Demographics of the Community College

Diversity and open-door policies have become synonymous terms in the community college and define the community college system of education in the United States (Almeida, 1991; Bragg, 2001; Burd, 2006; Levine & Cureton, 1998; Phillippe & Sullivan, 2005). According to Horn, Nevill, and Griffith (2006, p. iv), "compared with students attending 4-year colleges, community college students are more likely to be older, female, Black or Hispanic, and from low-income families," as noted by the 2003-04 data in Figure 3. Moreover, the traditional-age student population in the community college has been on the rise over the past decade (Adelman, 2005). At present, the majority of students in the community college are classified as independent students, who have the following attributes: (1) 24 or older, (2) considered financially independent from parents for financial aid classification, and (3) married and/or have children. As noted in Figure 3, 61% of community college students are classified as independent, as compared to 35% of students at public or private 4-year institutions.
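Because this classification shapes both the financial aid figures and the demographic comparisons that follow, the three attributes listed above can be read as a simple any-of rule. The sketch below is only an illustration of that reading, not the federal definition (which includes additional criteria such as veteran status); the Student structure and is_independent helper are hypothetical names introduced here.

# Illustrative sketch only: encodes the three attributes above as an any-of rule.
# The actual federal financial-aid definition of an independent student includes
# additional criteria (e.g., veteran status), so this is a simplified approximation.
from dataclasses import dataclass

@dataclass
class Student:
    age: int
    financially_independent: bool   # independent of parents for aid classification
    married_or_has_children: bool

def is_independent(student: Student) -> bool:
    """Return True if the student meets any of the three listed attributes."""
    return (student.age >= 24
            or student.financially_independent
            or student.married_or_has_children)

# Example: a 29-year-old student (the average community college age reported in
# this chapter) is classified as independent under this simplified rule.
print(is_independent(Student(age=29, financially_independent=False,
                             married_or_has_children=False)))   # True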
Additionally, and in support of Horn, Nevill, and Griffith (2006), Phillippe and Sullivan (2005) suggested that the majority of community college students are "financially independent, working full or part time and supporting families while enrolled [and that] for those who aspire to a higher standard of living, community colleges are truly an open door" (p. 19). Open-door policies in the community college represent a responsibility on the part of the community college to ensure that students have the opportunity to be successful completers of stated educational goals (Bailey et al., 2005a). College student success in the community college is a critical-mass issue, with national implications for the future quality of life for all citizens (Kuh et al., 2006).

Figure 3. Comparative Demographics of Community Colleges and 4-Year Institutions: 2003-04. [The figure charts the demographic characteristics of undergraduates enrolled in community colleges versus 4-year institutions in 2003-04: median age, 24 versus 21; female, 59% versus 55%; low-income, 26% versus 20%; White, 60% versus 69%; Black, 15% versus 11%; Hispanic, 14% versus 10%; and independent, 61% versus 35%.]

The impact that the community college system of education has on the national scope of educational opportunities is demonstrated in Table 6. Of particular value to the global workforce is the number of students influenced by the community colleges across the nation (Phillippe & Sullivan, 2005, p. 9).

Franco (2002) suggested that the community college has a civic role to prepare students for the work of democracy. As indicated in Table 6, particularly noting the number of lives touched by these institutions, not only do community colleges prepare a significant number of individuals for the workforce, these institutions also foster the principles of citizenship. Franco (2002) argued that because community colleges serve the majority of underserved, or low-income, students, the community college system of education has a role to play as "America's democracy colleges" (p. 131).
Table 6 Number of Community Colleges by State (2004) and Population Served 2001-2002 State Private Public Tribal Tot Population aged 18 or older % of population served Fall 2001 % of population served 2001- 2002 Rank Alabama 1 23 0 24 3,355,089 2.3% 3.6% 28 Alaska 0 5 0 5 443,064 2.6% 5.4% 11 Arizona 1 19 2 22 3,861,087 4.8% 8.9% 2 Arkansas 1 24 0 25 2,011,990 2.3% 3.6% 27 California 24 111 1 136 25,177,335 5.9% 9.8% 1 Colorado 0 15 0 15 3,291,814 2.4% 4.2% 20 Connecticut 5 12 0 17 2,593,471 1.7% 2.5% 42 Delaware 0 3 0 3 Di 604,636 2.0% 3.0% 34 District of Columbia 1 0 0 1 460,873 0.0% 0.0% 51 Florida 3 28 0 31 12,568,154 2.7% 4.4% 17 Georgia 6 37 0 43 6,612,187 1.8% 2.8% 37 Hawaii 3 7 0 10 931,428 2.8% 4.0% 23 Idaho 0 4 0 4 950,204 2.8% 4.1% 22 Illinois 8 45 0 53 9,272,276 3.7% 7.4% 4 Indiana 2 3 0 5 4,535,822 1.5% 2.5% 41 Iowa 5 15 0 20 2,221,237 3.3% 5.0% 13 Kansas 3 22 0 25 1,998,360 3.6% 6.2% 7 Kentucky 1 16 0 17 3,079,098 2.3% 3.3% 31 Louisiana 0 11 0 11 3,268,183 1.0% 1.8% 48 Maine 2 8 0 10 991,471 1.3% 1.8% 47 Maryland 1 18 0 19 4,017,277 2.7% 4.1% 21 56 Table 6 (continued) State Private Public Tribal Tot Population aged 18 or older % of population served Fall 2001 % of population served 2001- 2002 Rank Massachusetts 9 17 0 26 4,925,984 1.8% 2.7% 38 Michigan 2 28 2 32 7,433,782 2.7% 4.4% 18 Minnesota 3 28 2 33 3,717,580 3.5% 4.7% 15 Mississippi 0 16 0 16 2,094,765 2.9% 4.0% 24 Missouri 5 14 0 19 4,229,728 2.0% 3.2% 32 Montana 0 8 7 15 685,747 1.5% 2.4% 43 Nebraska 0 7 2 9 1,276,129 2.8% 5.6% 9 Nevada 1 4 0 5 1,543,076 3.2% 5.0% 14 New Hampshire 3 4 0 7 951,142 1.6% 2.3% 44 New Jersey 2 19 0 21 6,396,274 2.1% 3.0% 35 New Mexico 0 15 3 18 1,328,276 4.1% 7.4% 3 New York 18 43 0 61 14,441,533 2.0% 2.8% 36 North Carolina 2 59 0 61 6,171,175 2.9% 4.3% 19 North Dakota 0 5 5 10 485,091 1.9% 2.7% 39 Ohio 6 34 0 40 8,537,248 2.1% 3.1% 33 Oklahoma 0 15 0 15 2,588,799 2.4% 3.7% 26 Oregon 1 14 0 15 2,618,763 3.3% 6.0% 8 Pennsylvania 9 18 0 27 9,418,495 1.4% 2.1% 45 Rhode Island 1 1 0 2 814,451 2.3% 3.3% 30 South Carolina 1 17 0 18 3,046,567 2.4% 3.5% 29 South Dakota 1 4 4 9 560,348 1.2% 1.5% 49 Tennessee 2 13 0 15 4,348,929 1.8% 2.6% 40 Texas 6 66 0 72 15,302,983 3.2% 5.5% 10 Utah 1 5 0 6 1,549,836 3.6% 5.4% 12 Vermont 2 2 0 4 471,443 1.3% 1.4% 50 Virginia 2 24 0 26 5,433,719 2.8% 4.5% 16 Washington 1 33 1 35 4,483,340 4.1% 7.3% 5 West Virginia 1 10 0 11 1,406,199 1.3% 1.8% 46 Wisconsin 0 17 2 19 4,064,317 2.7% 4.0% 25 Wyoming 0 7 0 7 359,486 4.6% 6.8% 6 Outlying Territories 2 6 0 8 ----- ----- ----- ----- TOTALS 171 986 29 1,186 212,490,261 2.9% 4.8% ----- Update January 2007: AACC 180 991 31 1,202 Sources: Phillippe, K., & Sullivan, L. (2005). National Profile of Community Colleges: Trends & Statistics, American Association of Community Colleges, 4th Ed., Community College Press: Washington, DC.; American Association of Community Colleges. Community College Facts at a Glance. Retrieved May 11, 2007, from http://www.aacc.nche.edu/Content/NavigationMenu/AboutCommunity Colleges/Fast_Facts1/ Fast_Facts.htm. 57 As indicated in Table 6, the breadth of the community college system of education has been demonstrated, including growth in the years of 2005 to 2007 (American Association of Community Colleges, 2007). Moreover, Table 7 indicated a fast fact set of data which signified the demographics of the community college system in Toto (American Association of Community Colleges, 2007; Phillippe & Shults, 2003; Phillippe & Sullivan, 2005;). 
Of significance in Table 7 is that 46% of all undergraduates attend community, junior or technical colleges. Of statistical significance, also indicated in Table 7, is: (1) 62% of applicants for the licensed professional registered nurse graduated from community colleges; (2) 65% of new healthcare workers get their training at community colleges; (3) approximately 550,000 associate degrees and 270,000 two-year certificates are issued annually; (4) a significant portion of the underrepresented population are served by the community college system; (5) 48% of community colleges offer welfare-to-work programs, with 54% of those not participating with plans to offer these life-changing programs; and (6) approximately 11.6 million students are enrolled in community colleges, including both credit and non-credit. Additional demographic data related to the community college as noted in the literature was: (1) 28% of reverse transfers?students seeking non-credit and credit courses in the community college?have at least a Bachelor?s degree (Boggs, 2004; Bragg, 2001); (2) average age of students is 29 years of age (Phillippe & Sullivan, 2005) with a trend for traditional age students (18-24 years old) on the rise (Boggs, 2004); (3) community college student profiles included percentages from all undergraduates: (a) 47% of African-Americans, (b) 55% of Hispanics, (c) 47% of Asian/Pacific Islanders, (d) 58 57% Native Americans, (e) 45% of first-time freshmen, and (f) 60% women, with 40% men (Phillippe & Sullivan, 2005); (4) 65% of students who attend community colleges are from families with annual incomes equal-to-or-less-than $20,000, compared to only 8.6% attending two-year institutions when the family income exceeds $100,000 (Boswell, 2004); and, (5) alumni are considered as sources of funding, noting that a generational transfer of $41-trillion is predicted in the next 45 years (Strout, 2006). A few additional facts as presented by the American Association of Community Colleges (2007) are of significant importance: 1) Health care: 50% of new nurses and the majority of other new health-care workers are educated at community colleges; 2) International programs: Close to 100,000 international students attend community colleges?about 39% of all international undergraduate students in the United States; 3) Workforce training: 95% of businesses and organizations that employ community college graduates recommend community college workforce education and training programs; 4) Homeland security: Close to 80% of firefighters, law enforcement officers, and EMTs are credentialed at community colleges; 5) Five hottest community college programs: registered nursing, law enforcement, licensed practical nursing, radiology, and computer technologies; and, 6) Earnings: The average expected lifetime earnings for a graduate with an associate degree are $1.6 million?about $.4 million more than a high school graduate earns (see Table 7 for additional details). 59 Table 7 Community College Fast Fact Data Fast Facts Data on the Community College System of Education Number of Community Colleges: Public Institutions ? 991 Private Institutions ? 180 Tribal Institutions ? 31 Total ? 1,202 Welfare Reform: 48% of community colleges offer welfare-to-work programs. Of those that do not, 54% plan to offer programs specifically designed for welfare recipients. Enrollment: 11.6 million students 6.6 million credit students 5 million non-credit students 46% of all U.S. 
undergraduates 45% of first-time freshmen 59% women; 41% men 60% part-time; 40% full-time (full time = 12 + credit hours) Student Profile: 47% of Black undergraduate students 55% of Hispanic 47% of Asian/Pacific Islander 57% of Native American Average Age ? 29 Years 21 or younger ? 43% 22 ? 39 is 42% 40 or older ? 16% Students Receiving Financial Aid: Any aid ? 47% Federal Grants ? 23% State Aid ? 12% Federal loans ? 11% Healthcare: 65% of new healthcare workers get their training at community colleges. Percentage of Federal Financial Aid: Pell Grants ? 32% Campus-based aid ? 9.8% Stafford Loans: Subsidized ? 5.4% Unsubsidized ? 4.4% Revenue Sources (Public Colleges): 44% - State Funds 20% - Tuition and Fees 20% - Local Funds 5% - Federal Funds 11% - Other Governance (Public Colleges): More than 600 boards of trustees 6,000 board members 29 states ? local boards 16 states ? state boards 4 states ? local/state boards Degrees and Certificates Annually: More than 550,000 associate degrees Nearly 270,000 two-year certificates In 2003, 62% of applicants taking the national registered nurse examination to become licensed professional registered nurses were graduates of associate degree programs. Tuition and Fees: $2,191 average annual tuition at public community colleges Training: 95% of businesses and organizations that use them recommend community college workforce education and training programs. Information Technology: More than 95% of community colleges are Internet connected. In recent years, the average starting salary for graduates of information technology programs has increased more than 24%, from $20,753 to $25, 771 Sources: Phillippe, K., & Sullivan, L. (2005). National Profile of Community Colleges: Trends & Statistics, American Association of Community Colleges, 4th Ed., Community College Press, Washington, DC., and American Association of Community Colleges (2006b). American Association of Community Colleges; Community College Facts at a Glance. Retrieved May 11, 2007, from http://www.aacc.nche.edu /Content/NavigationMenu/About Community Colleges/Fast_Facts1/ Fast_Facts.htm. 60 Grubb and Lazerson (2004) suggested that community colleges need to reach for success by building on their individual strengths. Community college demographics demonstrated that ?virtually every region in America now has a community college? (Grubb & Lazerson, p. B116), with a range from 3 in Delaware to a high of 136 in California (Phillippe & Sullivan, 2005). Community colleges are poised to become a significant force in the educational system of the nation, as the data suggested. A specific comparative indicator of community college demographics is compiled in Table 8. The data noted in the table stipulated the relationship between the Alabama Commission on Higher Education (ACHE, 2006) and the American Association of Community Colleges ([AACC], 2007; Phillippe & Sullivan, 2005). For example, does the national dataset promulgated by the American Association of Community Colleges suggest a close correlation to the Alabama Commission on Higher Education data? For this study, the close correlation lends validity and reliability to the survey of community and technical colleges in Alabama as a viable population sample for transferability to the larger context of national community college applications. 
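The comparison described above is, at bottom, a simple aggregation: institution-level headcounts are summed into system-wide percentages and set beside the AACC national reference values. The short sketch below illustrates that calculation using two institutions copied from Table 8 (shown below); the restriction to two institutions and to the gender columns is for brevity, and the helper name system_percentages is introduced here for illustration only.

# Minimal sketch of the comparison, not the study's instrument: sums headcounts
# from two ACHE institutions (Table 8, below) and converts them to percentages
# for comparison with the AACC national reference values. Names and the choice
# of columns are illustrative.
ache_sample = {
    "Alabama Southern Community College": {"total": 1270, "male": 433, "female": 837},
    "Bevill State Community College": {"total": 3873, "male": 1318, "female": 2555},
}
aacc_national = {"male": 41.0, "female": 59.0}  # AACC reference percentages

def system_percentages(institutions):
    """Aggregate institutional headcounts and express each category as a percentage."""
    grand_total = sum(inst["total"] for inst in institutions.values())
    return {category: round(100 * sum(inst[category] for inst in institutions.values())
                            / grand_total, 1)
            for category in ("male", "female")}

for category, pct in system_percentages(ache_sample).items():
    print(f"{category}: ACHE sample {pct}% vs. AACC national {aacc_national[category]}%")

Run over the full set of institutions in Table 8, the same aggregation reproduces the system totals reported there (for example, 62% female against the 59% national figure).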
The total population of the entire two-year college system was not indicated in Table 8; additionally, remediation was not addressed in terms of a comparison between the Alabama Commission on Higher Education and the American Association of Community Colleges.

Table 8
Comparative Sample of Demographic Datasets: Alabama Commission on Higher Education (ACHE) and American Association of Community Colleges (AACC)

1. American Association of Community Colleges (AACC % reference): Male 41%; Female 59%; Black 13%; Hispanic 14%; Asian 6%; Native American 1%; Remedial N/A.

2. Alabama Commission on Higher Education (ACHE %; all data are referenced to the dataset provided by ACHE):
Alabama Southern Community College: Pop 1,270 (100%); Male 433 (34%); Female 837 (66%); Black 494 (39%); White 763 (60%); Hispanic 3 (<1%); Asian 1 (<1%); Native American 4 (<1%); Remedial Education 402 (32%)
Bevill State Community College: Pop 3,873 (100%); Male 1,318 (34%); Female 2,555 (66%); Black 470 (12%); White 3,221 (83%); Hispanic 21 (<1%); Asian 108 (3%); Native American 3 (<1%); Remedial Education 794 (21%)
Bishop State Community College: Pop 4,888 (100%); Male 1,652 (34%); Female 3,236 (66%); Black 3,064 (63%); White 1,601 (33%); Hispanic 28 (<1%); Asian 70 (1.4%); Native American 38 (<1%); Remedial Education 776 (16%)
Calhoun State Community College: Pop 8,629 (100%); Male 3,614 (42%); Female 5,015 (58%); Black 1,608 (19%); White 6,362 (74%); Hispanic 158 (2%); Asian 132 (1.5%); Native American 281 (3%); Remedial Education 1,610 (19%)
Central Alabama Community College: Pop 2,169 (100%); Male 711 (33%); Female 1,458 (67%); Black 502 (23%); White 1,621 (75%); Hispanic 22 (1%); Asian 9 (<1%); Native American 8 (<1%); Remedial Education 381 (18%)
Jefferson Davis Community College: Pop 1,151 (100%); Male 513 (45%); Female 638 (55%); Black 344 (30%); White 704 (61%); Hispanic 10 (<1%); Asian 5 (<1%); Native American 38 (3.3%); Remedial Education 178 (16%)
Lawson State Community College: Pop 3,371 (100%); Male 1,183 (35%); Female 2,188 (65%); Black 2,787 (83%); White 493 (15%); Hispanic 12 (<1%); Asian 13 (<1%); Native American 5 (<1%); Remedial Education 880 (26%)
Reid State Technical College: Pop 660 (100%); Male 198 (30%); Female 462 (70%); Black 363 (55%); White 279 (42%); Hispanic 3 (<1%); Asian 3 (<1%); Native American 11 (2%); Remedial Education 21 (3.2%)
Trenholm State Technical College: Pop 1,439 (100%); Male 719 (50%); Female 720 (50%); Black 856 (59%); White 548 (38%); Hispanic 6 (<1%); Asian 9 (<1%); Native American 2 (<1%); Remedial Education 221 (15%)
TOTALS: Pop 27,450 (100%); Male 10,341 (38%); Female 17,109 (62%); Black 10,487 (38%); White 15,592 (57%); Hispanic 263 (1%); Asian 350 (1%); Native American 390 (1%); Remedial Education 5,484 (20%)
National Percentages: Male 41%; Female 59%; Black 13%; White 66%; Hispanic 14%; Asian 6%; Native American 1%; Remedial N/A

Sources: American Association of Community Colleges: http://www.aacc.nche.edu/; Alabama Commission on Higher Education: http://www.ache.state.al.us/

Issues of the Community College System of Education

The community college system of education, which consists of junior, community, and technical colleges, is an integral part of higher education fraught with challenges and opportunities. Milliron and Wilson (2004) suggested that "if they didn't exist ... we'd have to invent them" (p. 22); Eaton (2006) recommended to accrediting organizations that they "... do not [step] away from the historic community college commitment to access ... [this] would be a sad development for all of us and millions of students we serve" (p. 92); Honeyman and Sullivan (2006) suggested to Florida delegates that "to facilitate discussion and tackle these substantive policy issues" (p. 178) was critical to solving the pressing issues facing America's community colleges; and Milliron and E. de los Santos (2004) contended that "many community colleges have become a nexus of lifelong learning in their communities" (p. 106). Additionally, Franco (2002) suggested that:

Ultimately, community colleges, in taking stock at the turn of a new century, have to determine their own developmental trajectory. By developing sustainable service-learning partnerships with K-12 schools, community-based organizations, and universities, community colleges can genuinely democratize higher education, the communities they serve, and the students they educate. (p.
135)

As the literature suggested, there are many pressing and competing issues in higher education; as an integral part of the educational system, community colleges are not exempt. This study categorically denoted the issues as: (1) challenges in the community college, and (2) opportunities in the community college. The issues addressed included funding, enrollment, competition, diversity, opportunities, and workforce development. Moreover, the issues identified impact college student success.

Challenges in the Community College

The community college system of education faces an onslaught of challenges in the next five years (The Chronicle of Higher Education, 2004). Six views were discussed and delineated as: (1) choosing among competing agendas, (2) meeting the needs of a changing society, (3) staying focused on suitable missions, (4) more students and less money, (5) hiring employees and motivating them, and (6) fragmentation, isolation, and divisiveness. Furthermore, Evelyn (2004a) suggested that community colleges have an image problem; DeGenaro (2006) noted that "critical discussions of 2 year college mission[s] should also be fostered" (p. 544); and Eaton (2006) and Bragg (2001) argued the need to protect the policy of open access to public community colleges. Of the many issues that challenge the community college, this study addressed the following three topics: (1) Choosing Among Competing Agendas, (2) Meeting the Needs of a Changing Society, and (3) More Students and Less Money.

Choosing Among Competing Agendas

Competing agendas, as suggested by The Chronicle of Higher Education (2004), are the educational issues which will press the community college most for solutions. For example, professional development as compared to serving underrepresented or underprepared student populations suggested two interdependent, but contextually separate, agendas. Shkodriani (2004) indicated that community colleges are prime resources for teacher professional development, whereas Education Secretary Margaret Spellings suggested that community colleges were an ideal starting point for low- and moderate-income students as the source of education most likely to promote student success, all other factors being equal (Burd, 2006). These two competing agendas have underlying variables which require the community college to design different approaches. Variances in the approaches to providing for professional development, underrepresented populations, student success, and other agendas can be found in the structure of the resources to support each function or agenda within the community college (Boggs, 2004; Dicroce, 2005; Dougherty & Hong, 2005; Grubb & Lazerson, 2004; Strout, 2006). For instance, Shkodriani (2004, p. 4) suggested inherent problems in the way teacher professional development was structured, managed, and delivered, as indicated in Table 9.

Table 9
Criticisms of Professional Development Efforts
1. Inflexible and too short: instructors have a predetermined amount of material to get through in a short amount of time.
2. Often designed as "one size fits all," operating as if all participants have the same background, the same subject areas, and learn at the same pace and in the same way.
3. Inconvenient, involving travel to areas sometimes a distance from home or school: it takes place outside the classroom environment and requires additional time beyond the normal daily schedule.
4. Teachers are not involved in determining program content.
As put forth in Table 9, the issues related to professional development require personnel resources devoted to seeking solutions for each sub-issue identified. Consequently, as the resources are allocated to address each problem, these resources may compete with other agendas. Instructional resources dedicated to the agenda of professional development may compete for resources to simultaneously address the instructional process to support underrepresented students or students with deficits in 65 college-readiness, e.g., student success. Competing agendas require resources and as resources increasingly become scarce within the community college, tough choices have to be made as to which programs are supported and those which are postponed or unmet. Bailey et al., (2005a) conducted a study of student success in the community college. The study identified several ?institutional characteristics that affect[ed] the success of community college students? (p. 2). Success, as argued in the study, was a composite of several competing agendas, including but not limited to: financial resources, efforts in retention, multi-institutional attendance, leadership, faculty relations, and local political influence. Additionally, Bragg (2001) argued that: ? community colleges are continually expected to prepare individuals for careers, but vocational preparation need not be divorced from transfer opportunities. Indeed, enhancing transfer opportunities in all facets of the community college curriculum, including programs once thought terminal, can enhance opportunities for social mobility for all students. (p. 111) Underlying the analysis of competing agendas in the community college is the relationship between institutional resources and accountability (Boggs, 2004; Dougherty & Hong, 2005; Jacobson, 2005; The Chronicle of Higher Education, 2004; VanWagoner, Bowman & Spraggs, 2005). Zarkesh & Beas (2004) studied performance indicators and performance-based funding in community colleges. In order to assess performance indicators, the study investigated indicators in the larger context of the movement towards accountability. The application of accountability is the watchdog of the competing agendas phenomena. As competing agendas vie for resources, stakeholders are looking to the community college as efficient centers of vocational training, academic preparation, and to facilitate higher education, all the while balancing competing agendas to maximize positive student outcomes (VanWagoner, Bowman & Spraggs, 2005). 66 Evelyn (2004a) identified several entities which are looking closely at the community college for leadership and solutions to competing agendas. The entities noted were lawmakers, students, the business community, individual states, and even the community colleges themselves: issues identified were supply and demand, funding, and policy; resources listed were funding diversification, experience, and physical capacity. As suggested by The Chronicle of Higher Education (2004): [Physical] capacity is rapidly becoming the most critical challenge facing community colleges. More students are enrolling in community colleges than ever before ? the result of an echo baby boom, immigration, job competition, and the need for retraining generated by corporate downsizing. There are, however, too few faculty members to teach too many students, and precious little classroom and laboratory space is available for needed classes in both the arts and sciences and in career programs. 
(B.10) A review of the literature on the community college supports the framework of competing agendas that must be addressed and solved within the community college. Moreover, competing agendas will require the community college system of education to rethink priorities and seek alternate sources of support, inclusive of private donations (Strout, 2006). As suggested by Evelyn (2004b), ?with new missions, surging enrollments, and falling support, even the promise of access for all is in question? (p. A27). While competing agendas are critical issues, the community college is also charged with meeting the needs of a rapidly changing society. Meeting the needs of a changing society includes the competing and evolving agenda of student success. Meeting the Needs of a Changing Society As indicated by Closson (1996), ?the combined forces of demographics, social changes, and advancing technology create a swiftly changing society? (p. 3). A changing 67 society does not allow the community college system of education an exemption to remain in limbo: workforce development is contingent upon the community college to remain a rapid-responder for training (Ashburn, 2006; Milliron & Wilson, 2004). Status quo in higher education is cause for great concern; moreover, as societal forces shape the direction of national goals, education and training become the holistic catalyst to respond as force-multipliers in the lives of its citizens (Dicroce, 2005; Jacobson, 2005). Nowhere is the impetus for change greater than in the community college and one of the major delimiters in this process is capacity (VanWagoner, Bowman, & Spraggs, 2005; Zarkesh & Beas, 2004). Proactive community college capacity is defined as: The primary goal for higher education policy in this era is not to increase capacity in traditional ways but to address public needs and priorities? needs and priorities that include greater emphasis than in the past on accountability, cost and prices, efficiency, and effectiveness. In fact, even states whose population growth requires increased capacity are likely to look as much to productivity improvements (such as greater use of current campus facilities) as to new campuses to meet the higher educational needs of their citizens. (Callan, Doyle & Finney, 2001, p. 18) As suggested in the definition, capacity has a direct correlation to meeting the actual or perceived needs of a rapidly changing society. However, the capacity of the community college to meet the sundry needs of society is not restricted only to the number of teachers or classrooms (Callan, Doyle & Finney, 2001). Capacity will require a paradigm shift from reacting to the challenges of a changing society to proactive opportunities and innovative practices to lead a changing society (VanWagoner, Bowman & Spraggs, 2005). Wattenbarger (1983) conducted a study to determine the value of research for improving the community college. The study suggested that unless problems are investigated as a function of institutional research for the purpose of ?turning theory 68 into action? (p. 58), viable, proactive, and innovative change is less likely to occur. The study conducted by Wattenbarger (1983) was supported by Cohen?s (2005) investigation of practitioners and researchers: ?research on community colleges has been conducted for many decades, and for just as many years it has been ignored by community college practitioners ? even when the practitioner and the researcher are the same person?? (p. 51). 
Furthermore, Cohen?s (2005) study identified two constructs which support the relationship of community college research to proactive solutions for a changing society: (1) ?educational problems are always unique and for that reason always require unique responses, tailored as best as possible to the idiosyncrasies of the actual, unique situation? (p. 59), and (2) ?for community college practitioners to attend to research conducted in the [community college], the divide between research and practice must be bridged? (p. 59). To meet the needs of a changing society, the community college system of education must utilize its innovative prowess to understand the evolving community it serves. Consequently, to understand the underlying causes of a changing society enables the community college to proactively meet the needs of its constituents and stakeholders by taking the reins of community leadership and participation (VanWagoner, Bowman & Spraggs, 2005). And to understand that enrollment levels are projected to increase at the same time that fiscal support is level-funded or reduced, suggested that competing agendas are also attributes of a changing society. 69 More Students and Less Money One of the most profound challenges the community college system of education will face in the next decade is the influx of college-eligible students (Conley, 2005). The U.S. Department of Education projects that by 2009, 75% of high school seniors will likely attend college (Boggs, 2004), which included an estimated 45% enrolled in public two-year technical, community and junior college institutions (Horn, Nevill & Griffith, 2006; National Center for Educational Statistics, 2003). The Reference Service Press (2003) reported that current estimates for college-eligible students were expected to reach 15.3 million students, with a 15% increase to a projected 17.7 million students by the year 2012. Using the 45% enrollment projection for community colleges, the influx of students will range from 6.88 million (15.3 x .45) to 7.96 million (17.7 x .45) over the next decade. Statistically, there are 1,202 community, junior, and technical colleges serving a range of 6.88 to 7.96 million credit students between 2007 and 2012, with another 5 million non-credit students (Phillippe & Sullivan, 2005). Considering the mean as a broadly defined reference, average enrollment per two-year institution is estimated at 6,622 students (7.96 million / 1,202) by 2012. The Alabama [Community] College System has a total of 79,771 students in the system with an average Fall 2005 enrollment of 3,191 students (Alabama Commission on Higher Education, 2006). The numbers suggested that for many community colleges ? all other things being equal -- prioritizing competing agendas will potentially become tantamount to rejecting the long-standing open-door policy of the community college system of education (Windham, Perkins & Rogers, 2001). Student success is suggested as a competing agenda. 70 Boggs (2004) studied major issues impacting the community college system of education. The investigation detailed the many competing agendas and specifically noted several indicators related to student success and funding: 1. California and Florida turned away 175,000 and 35,000 students, respectively, due to insufficient resources; 2. State funding for community colleges dropped by nearly $584.8 million between 2002 and 2003, and 22 states, or 44% of the states supporting community colleges, reported decreased funding; 3. 
Institutions averaged 60% of their funding from state and local funds (35% for public four-year institutions), with only 21% of funding derived from tuition; 4. Tuition increased by 7.9% in the Fall of 2002 and 13.8% in the Fall of 2003; California planned to increase tuition in 2003-04 by as much as 63.6%, and Virginia Community Colleges raised tuition by $15.59 per credit hour to $52.71 (Larose, 2003); 5. Many community colleges have frozen or reduced course sections and, in extreme cases, have eliminated whole programs and summer sessions; 6. Close to half of all students who pursue higher education will do so in the community college; and, 7. Twenty-eight percent of students seeking credit and non-credit courses in the community college have at least a Bachelor's degree.

Enrollment is projected to increase exponentially, while state funding follows a more linear scale (Hendrick, Hightower & Gregory, 2006). As noted by Milliron and Wilson (2004), the juxtaposition of enrollment and funding may be classified as "funding agony and opportunity" (p. 56). Opportunity is synonymous with the methods, materials, and manpower resources to establish a significant community college promoting student success. As noted by VanWagoner, Bowman and Spraggs (2005), "In the significant community college, the number of students passing through the 'in' door is not the important success measure; the number persisting to the graduation-transfer-employment door is of the greatest importance" (p. 39). Significant community colleges will pursue every means of opportunity to acquire alternate sources of funding and support, while funding agony is a multifaceted process. Components of funding agony are state appropriations, tuition, and institutional expenditures; nevertheless, funding woes are not without potential remedy. Funding remedy in the community college is a leadership-derived culture of entrepreneurialism (Strout, 2006).

Enrollment agony may be found in several key issues. First, students have characteristics which impact enrollment, such as student swirl, in which linear matriculation occurs infrequently (Borden, 2004; Komives & Woodard, 2003). As noted by Milliron and Wilson (2004), "students are more diverse and increasingly 'swirled,' using community colleges for short-cycle training, industry certification, reverse transfer, or graduate school options" (p. 55). And as students swirl, toward achievement or success, enrollment demands increase while funding remains level or is reduced. Second, student diversity has increased exponentially (Horn, Nevill & Griffith, 2006; Kraman, 2006). As suggested by Hendrick, Hightower, and Gregory (2006):

In the last 40 years, 2-year college enrollments have exploded in the United States. Sheer numbers of students demanding higher education at the community college level, combined with issues of decreased funding and increased accountability, have put increasingly severe stress on the traditional open door policy of community colleges. (p.
628) Additional enrollment and funding issues in the community college literature included: an increase in on-line students as the demand for distance education in the community college continues to rise (Carnevale, 2006); how to best meet the needs [competing agendas] of the millennial generation?the largest student population in history--as they enroll in the community college and impact instructional and institutional processes (Coomes & Debard, 2004; Debard, 2004); college-ready as compared to college-eligible, a significant difference in the ability of students to enroll, persist, and succeed (Conley, 2005): included in the student success aspect is the amount of remedial courses students may require (Boulard, 2004; Conley, 2005; Spann, 2000); the influx of immigrants seeking to enroll and immerse themselves in the culture of the nation, while pursuing vocational training or degrees (Wang, 2004); and dual-enrollment programs, in which high school students dually-enroll in community college credit courses (Hugo, 2001; Karp, Bailey, Hughes, & Fermin, 2005; Kleiner & Lewis, 2005). Palazesi & Bower (2006) studied the baby boomers as they reinvented themselves by taking advantage of the offerings within the community college. The study?noting the relationship of more students and less money?suggested that baby boomers give significance to ?older adults [who] increasingly represent a larger population in postsecondary education? (p. 45). The study noted that as baby boomers attended community colleges for educational services, they generated revenue for the institution at the same time that they perpetuated increased enrollment. Demographic trends indicated 73 that the number of traditional students, ages 18 to 25, will begin to level out concurrently with the retirement era of baby boomers in 2011. As noted in the study, it is imperative that community colleges understand the intrinsic value baby boomers assign to the services provided by two-year institutions. To understand this generation?s need to acquire life-long learning should give rise to relevance in the significant community college. Even in light of the challenge of more students and less funding, innovative measures will create opportunities heretofore unlooked for in the community college system of education (VanWagoner, Bowman & Spraggs, 2005). Within the next decade, the community college system of education will face many challenges. The system will encounter more students, without the much needed appropriations for additional services; competing agendas will require difficult choices as to which functions can and cannot be funded or supported; and, as society changes, the community college must be proactive in its leadership role to provide viable solutions to the community it serves. Challenges in the community college are not without potential solutions. However, solutions are the result of proactive thinking, research, application, and leadership. As suggested by VanWagoner, Bowman, and Spraggs (2005): The move from success to significance will not be easy. Community college leaders will have to think differently, act differently, and respond differently to their environments. Nevertheless, the parts are there. Community colleges have long attracted leaders within their organizations who want to make a difference, who rise above the traditional culture, and who share a vision for the future. 
There has never been a better time or a greater need for community colleges to assume their significant role in creating the future?Community colleges are the right institutions at the right time, if we make the critical move to significance. (p. 50) A ?critical move to significance? (VanWagoner, Bowman & Spraggs, 2005, p. 50) includes a conscientious effort on the part of community college 74 leaders to understand the framework of institutional practice as a methodology to improve student success. Consequently, institutional practice and student success are direct correlates of one another, although not always positive correlates. Opportunities in the Community College The community college system of education, like its four-year counterpart, has alumni in every profession and sector of employment. Many of the alumni from both educational systems included noted individuals, some more significantly known than others. Boggs (2004) provided the following community college examples: Eileen Collins, NASA?s first female mission commander; Dustin Hoffman, winner of an Academy Award; Kweisi Mfume, former Congressman and NAACP President; Nobel Prize recipient and chemist, Bruce Merrifield; Dr. J. Craig Venter, lead scientist in decoding the human genome; and, Bonnie Blair, Olympic speed skater. A further review of the literature on community colleges revealed that although challenges exist in the two-year system, there is also considerable evidence that community colleges have established themselves as change-agents in the educational arena (Milliron & E. de los Santos, 2004; VanWagoner, Bowman, & Spraggs, 2005). Mellow and Talmadge (2005) investigated the diversity of LaGuardia Community College. LaGuardia?s population consisted of students from 159 different countries, speaking 110 different languages, and 66% were foreign born. As a result of the enormous diversity in the student population, LaGuardia developed significant and lasting changes to its operations, or what it termed ?organizational-change initiatives? (p. 61). An outcome of the initiatives instituted at LaGuardia garnered the college 75 significant accolades: it was ?identified by the National Survey of Student Engagement as one of three top-performing large community colleges ? [and it] ? received a Certificate of Excellence from the Hesburgh Awards for significant contributions to faculty development that enhances undergraduate teaching and learning? (p. 65). LaGuardia is but one of the many significant achievements in the community college system of education. The achievements of the community college are opportunities to excel in: areas of remedial education, which is a direct component of student success (Hendrick, Hightower & Gregory, 2006); the critical involvement the colleges play in the preparation of the nation?s first responders--professionals such as law enforcement officers, firefighters, or emergency medical technicians (American Association of Community Colleges, 2006a); workforce readiness as 95% of businesses and organizations who employ community college graduates recommend community college workforce education and training (American Association of Community Colleges, 2006b). Each one of these areas includes components of college student success. LaGuardia is a prime example of how the community college is able to look inward to apply its power in changing the lives of its students. Changing the lives of students is an inherent component in promoting student achievement. 
The community college system of education is a significant partner in the training and education of 45% of all undergraduates (Lamkin, 2004). Although many examples and studies could be cited in supporting the opportunities and achievements within the community college, VanWagoner, Bowman and Spraggs (2005) suggested the following regarding opportunities in the community college system of education: 76 Demand for services is increasing. Support from communities is strong. Business and industry leaders are increasingly turning to community colleges as their workforce providers. Large foundations are increasing their support?Community colleges are now more respected, better understood, and better positioned than at any other time in their history. But our challenges have risen with our status, and we must now impose a new paradigm upon ourselves?More than just a training provider, significant community colleges are economic drivers and essential community resources. (p. 38) From Open Door Policy to Significance Open door policies in the community college system of education across the nation have set a course of significant access to higher education for countless numbers of students (Horn, Nevill & Griffith, 2006; Vaughn, 2004; Wirt, Choy, Rooney, & Provasnik, 2005). Significant access included: (1) opportunities for students to participate in dual-enrollment programs to earn college credit and prepare for college while in high school (Jordan, Cavalluzzo & Corallo, 2006; Kisker, 2006); (2) remedial courses to support college-eligible students who required help in acquiring prerequisite skills in Math, English, or Reading (Arendale, 2005; CCSSE, 2005; Greene & Forster, 2003; Oudenhoven, 2002; Spann, 2000); (3) the development of enhanced citizenry and democratization for students who may have been exempt from ?equal opportunity for all for social and economic mobility? (Franco, 2002, p. 120); (4) students whose only goal is to acquire technical skills to become gainfully employed in the workforce beyond a high school diploma (Ashburn, 2006; Field, 2005); and (5) partnerships between community colleges and the business sectors served to ensure economic and workforce development (ACT, 2006a; Olson, 2006; VanWagoner, Bowman & Spraggs, 2005). Although the community college system of education is founded on the principles of open-door access, it has also been challenged by the disparities of the preparation of 77 students entering its doors. For example, students arriving at the doors of the community college are more likely to need remediation than students entering the doors of 4-year colleges and universities (Boulard, 2004; Horn, Nevill & Griffith, 2006; Jenkins, 2006; Jenkins & Boswell, 2002). As a result of these types of challenges in the community college, the two-year college system has become a significant educational liaison between high school graduates and a four-year college degree or job training (Byrd & MacDonald, 2005; Perin, 2002, 2006). Therefore, this study will consider the significant open-access attributes of the community college to investigate the perceptions of students and faculty as they respectively reflect on student success and the factors which significantly impact student success. As noted by the historical context, issues, and opportunities in the community college, the system of two-year education is poised to utilize its resources?although limited in specific areas?to meet the needs of its student population. 
In order to improve student success, strategic impact factors must be identified, investigated, assessed, challenged, and applied in practice. For this study, the strategic impact factors have been identified as academic preparation, work ethics, and institutional support, all within the contextual framework of institutional practice. It is imperative that community colleges strive to become significant institutions of life-long learning for all students; the most effective avenue toward this goal is to ensure that student success is embedded in the daily institutional practices of the institution, from Top Administrator to Facility Maintenance Engineer.

The Framework of Institutional Practice and Student Success

As previously noted, student success has received significant attention in the annals of scholars; nevertheless, community college student success as an outcome of institutional practice has not (Bailey, 2006a; Jenkins, 2006). Bailey et al. (2004) argued that "we have found virtually no research that attempts to define and assess program institutionalization or broader college-wide reforms" (p. 42). Institutional practice is as complex and varied as the factors related to quantifying college student success (Bailey et al., 2005a; Bailey, 2006a; Braxton, 2006; Kuh et al., 2006; Long, 2006; NPEC, 2006; Richardson, 2006; Robbins et al., 2004). To understand the strategic value and impact that institutional practice has on the success of college students is to acknowledge the need to reform organizational practices which negatively impact student opportunities for success (Bailey et al., 2004; Morris et al., 2005). Therefore, positive institutional or organizational practices are critical to the effectiveness of an educational institution in initiating and promoting college student success. In terms of the critical-mass influence of institutional practice, McClenney and Greene (2005) suggested the following:

... enrolling in a community college may be as intimidating for those students who eventually succeed as it is for those who don't. Why, then, do some persevere while others leave before they meet their goals? Institutional practice can tip the balance. (p. 2)

Consequently, if the organizational practices within an institution can "tip the balance" (McClenney & Greene, 2005, p. 2) toward student success, educational institutions must develop a culture of inquiry to proactively promote student success (Achieving the Dream, 2005). Before addressing the culture of inquiry as related to institutional practice and student success, what broad definition might be applied to institutional or organizational practice in the community college? The Resource Guide for Institutional Transformation to Improve Student Success at Community Colleges by Achieving the Dream (2005) suggested the following construct-definition:

The process for institutional transformation ... presupposes a college has certain characteristics and commitments. Its leadership must be strongly and visibly committed to the goal of increasing student success. It must be willing to undertake a process of honest self-examination. And it must be prepared to engage in a participatory planning process that includes a broad cross-section of faculty, staff, administrators, students, and members of the larger community outside the college ... after an initial round of analysis, planning, implementation, and evaluation, the process begins again, generating further goals and further improvement in student [success] outcomes. (p.
"Overview of the Process")

The Achieving the Dream (2005) initiative was originally begun in 2004 as a consortium of partners and community colleges. The initial resource guide is a culmination of research and investigation of 27 community colleges for the purpose of promoting institutional change to improve student success throughout community colleges. Presently, there are over 58 two-year and other colleges participating in the Achieving the Dream (2005) initiative, and their demographics are indicated in Table 10. The significance noted in Table 10 is the number of students who are influenced by the participating institutions, with future implications for student success. For example, it is assumed that these institutions have initiated organizational practices to promote student success beyond the status quo, and that assumption carries a quid pro quo for the community college system of education. In other words, the quid pro quo argues the reciprocal need for community college leaders to seek out best practices in the colleges participating in the Achieving the Dream (2005) initiative and to implement those practices applicable to their respective institutions to proactively and perpetually promote student success (Kuh et al., 2006).

Table 10
Community Colleges Participating in the Achieving the Dream (2005) Initiative (IPEDS 2004 Enrollment)

2004
Florida: Broward Community College, Fort Lauderdale (32,948); Hillsborough Community College, Tampa (22,123); Tallahassee Community College, Tallahassee (29,556); Valencia Community College, Orlando (29,556)
New Mexico: Central New Mexico Community College, Albuquerque (22,077); New Mexico State University-Dona Ana, Las Cruces (6,038); San Juan College, Farmington (514); Santa Fe Community College, Santa Fe (3,897); Southwestern Indian Polytechnic Institute, Albuquerque (772); University of New Mexico-Gallup, Gallup (3,056)
North Carolina: Durham Technical Community College, Durham (5,534); Guilford Technical Community College, Jamestown (8,491); Martin Community College, Williamston (927); Wayne Community College, Goldsboro (3,272)
Texas: Alamo Community College District, San Antonio (49,485); Brookhaven College, Dallas (10,446); Coastal Bend College, Beeville (4,013); El Paso Community College District, El Paso (23,828); Galveston College, Galveston (2,353); South Texas College, McAllen (17,130); Southwest Texas Junior College, Uvalde (5,140)
Virginia: Danville Community College, Danville (4,060); Mountain Empire Community College, Big Stone Gap (2,906); Patrick Henry Community College, Martinsville (3,341); Paul D.
Camp Community College, Franklin (1,468); Tidewater Community College, Norfolk (22,691)

2005
Connecticut: Capital Community College, Hartford (3,436); Housatonic Community College, Bridgeport (4,701); Norwalk Community College, Norwalk (5,790)
Ohio: Cuyahoga Community College, Cleveland (24,664); Jefferson Community College, Steubenville (1,658); North Central State College, Mansfield (4,389); Sinclair Community College, Dayton (19,622); Zane State College, Zanesville (1,789)

2006
Pennsylvania: Community College of Allegheny County, Pittsburgh (19,292); Community College of Beaver County, Monaca (2,490); Delaware County Community College, Media (10,824); Montgomery County Community College, Blue Bell (8,915); Northampton Community College, Bethlehem (8,270); Community College of Philadelphia, Philadelphia (20,606); Westmoreland County Community College, Youngwood (6,290)
Texas: Alvin Community College, Alvin (3,932); Brazosport College, Lake Jackson (3,389); College of the Mainland, Texas City (3,948); Lee College, Baytown (5,954); North Harris Montgomery Community College District, The Woodlands (35,788); San Jacinto College, Pasadena (24,519); Wharton County Junior College, Wharton (6,100); Prairie View A&M University, Prairie View (8,006); Texas Southern University and University of Houston-Downtown, Houston (22,635)
Washington: Big Bend Community College, Moses Lake (1,919); Yakima Valley Community College, Yakima (4,737); Highline Community College, Des Moines (5,610); Tacoma Community College, Tacoma (6,471); Renton Technical College, Renton (3,682); Seattle Central Community College, Seattle (10,000)

Total: 605,048

Within the Achieving the Dream (2006) Model, a database was provided for researchers to assess the progress of students in the program and to correlate student success with the institutional practices of the participating community colleges. For these data to be viable and to improve the success of students, colleges participating in the initiative have identified practices that will help students succeed; institutions submitted themes and data to the database. The practices identified were: 1) putting a sharper focus on developmental education, as approximately 50% of community college students need some form of remediation; once the remediation has been successfully completed, students are noted to have an equal chance to succeed in their respective educational goals, including degree completion or university transfer (Kozeracki, 2002); 2) improved instructional techniques such as collaborative learning, paired classes, and learning communities (VanWagoner, Bowman & Spraggs, 2005); 3) student success courses, including classes in the Teaching-Learning-Assessment Domain (TLAD) which support student achievement in time management, study skills, etc. (WorkEthics.Org, 2006; The Conference Board et al., 2006); 4) advising services, including faculty advising, in which students establish goals, mapping out strategies to achieve those goals by clearly stating educational objectives (Dale & Drake, 2005); and 5) greater involvement of faculty, staff, and community members as key players in the success of college students (Capaldi, Lombardi & Yellen, 2006).

Innovative thinking was a common practice in the colleges participating in the Achieving the Dream (2005, 2006) initiative. Kuh, Kinzie, Schuh, and Whitt (2005a) studied the Documenting Effective Educational Practices (DEEP) project and concluded:

... although generally self-critical, they [the colleges with a culture of inquiry] aren't plagued by a culture of complaint, in large part because of their bent toward innovation.
To varying degrees, they're emblematic of the learning organizations described by Peter Senge and the firms studied by Jim Collins that catapulted from good to great. (p. 46)

Although the Documenting Effective Educational Practices (DEEP) project pertained to four-year institutions, and the Achieving the Dream (2005, 2006) project was specific to community colleges, significant similarities in terms of institutional practice impacting student success were suggested. Kinzie and Kuh (2004) argued that "nearly two years of intensive research in the daily work of twenty institutions [DEEP Project] may finally put to rest any doubt that building cross-campus collaborations to facilitate student success is essential" (p. 2). Similarly, in the themes of the Achieving the Dream (2005, 2006) initiative, innovation is key to bridging best institutional practices and enhanced student success. The antithesis to student success is suggested by Merrow's (2006) article, My College Education: Looking at the Whole Elephant:

In our effort to describe the beast, we were impressed by students who squeeze as much as they can from their college experience and by teachers who dedicate their energies to seeing students succeed. Too much is left to chance, however, and too many lives are blighted by our national indifference to what is happening on our campuses during the years between admission and graduation. (p. 15)

In addition to the Achieving the Dream (2005, 2006) initiative and the Documenting Effective Educational Practices (DEEP) project, the Community College Survey of Student Engagement (CCSSE) (2005) study surveyed 133,281 student respondents at 257 participating colleges in 38 states (p. 23). In the composite survey years of 2002, 2003, 2004, and 2005, the number of respondents represented a population of 2,360,316 students across the spectrum of 404 CCSSE member colleges within 43 states, representing approximately 36% of the national pool of community colleges and 37% of the 6,318,779 credit students. As recently noted, to promote student success beyond the status quo is quid pro quo for the community college system of education; or, in the words of Kuh, Kinzie, Schuh, and Whitt (2005a): "a time-honored approach to improving effectiveness is to learn what high-performing organizations within a given industry do and then to determine which of their practices are replicable in other settings" (p. 44). To reiterate: the quid pro quo argues the reciprocal need for community college leaders to seek out best practices in high-performing community colleges and to determine whether those practices can be applied in local colleges to promote and improve college student success within the framework of institutional practice.

Considering the studies from Achieving the Dream (2005) and the Community College Survey of Student Engagement (CCSSE) (2005), the imperative for community college leaders and policy-makers to understand the impact that institutional practice has on student success is beyond dispute. Even prior to considering the specific factors noted in the Strategic-Impact-Triad Model, the data suggested that institutional practice is critical to student success. Specifically, Kuh (2001) and Marti (2005) separately evaluated the reliability and validity of the Community College Student Report (CCSR) within the CCSSE study. As argued by Marti (2005):

The CCSR was adapted from the National Survey of Student Engagement (NSSE), with permission from Indiana University.
The NSSE instrument was developed in 1999 for use in four-year colleges and universities [with] a high degree of intentional overlap between the NSSE and CCSSE instruments. Of the 79 items measuring student engagement on the NSSE instrument, 56 of those items appear on the 2003 version of the CCSR, representing a 71% overlap between the two instruments. Psychometric properties of the NSSE instrument have been explored extensively and have demonstrated that the instrument is reliable and valid. (p. 1)

Kuh (2001) suggested the following insight:

In general, the psychometric properties of the NSSE are very good, as the vast majority of items equal or exceed recommended measurement levels. Those items that are not in the normal range on certain indicators, such as kurtosis and skewness, are due to the nature of the student experience, not because of psychometric shortcomings of the instrument. The face and construct validity of the survey are strong. This is not surprising because national assessment experts designed the instrument and most of the items have been used for years in established college student assessment programs. In addition, we made improvements to individual items and the overall instrument based on what was learned from focus groups, cognitive testing, and the psychometric analysis on the results from the Spring 1999 field test, the inaugural national administration in Spring 2000, and the Spring 2001 administration. The results seem to be relatively stable from one year to the next and non-respondents are generally comparable [to] respondents in many ways, though contrary to popular belief non-respondents appear to be slightly more engaged than respondents. (p. 23 of 26)

Based on the reliability and validity of the data collected and the subsequent analysis, benchmarks of effective educational practice were identified. The benchmarks are: 1) active and collaborative learning; 2) student effort; 3) academic challenge; 4) student-faculty interaction; and, 5) support for learners. Further substantiation of the benchmarks noted in the research by the CCSSE is found in several studies related to student success and institutional practices: a) Achieving the Dream (2005, 2006); b) the Documenting Effective Educational Practices (DEEP) project (Kinzie & Kuh, 2004; Kuh, Kinzie, Schuh, & Whitt, 2005a, 2005b); c) the nine college or student success constructs by Robbins et al. (2004); d) a significant literature review of what matters to student success (Kuh et al., 2006); e) Bailey and Alfonso's (2005) analysis of research on program effectiveness at community colleges; f) Tinto and Pusser's (2006) work on creating a model of institutional action to promote college student success; and, g) Smith's (2005) 51 competencies of instructional effectiveness.

Within all the studies and research noted, it is suggested in this dissertation that institutional practice and student success are interdependent upon one another. For an educational institution to attempt to achieve student success in the absence of effective practices is related to Braxton's (2006) research: "... college student success stands as a topic that cries out for some form of systematic empirical attention. Without the benefit of such scholarly attention, uninformed, ad hoc views on student success and [ineffective] ways to achieve student success will emerge" (p. 1).
The action-oriented practice of a culture of inquiry is not the only method available to community colleges for addressing the relationship of institutional practice and student success, but it is the one identified in this study as the most powerful. Reid (2004) suggested that "educators need to be inquirers into professional practice" (p. 3) who routinely question their practices and assumptions and "who are capable of investigating the effects of their teaching on student learning" (p. 3). The questions include, but are not limited to, "issues, problems, concerns, dilemmas, contradictions, and interesting situations" (p. 3). Reid's (2004) dimensions of the construct of inquiry can be transposed to institutional practices and to the need for the community college to possess a culture of inquiry that assesses strategies and organizational activities to determine which institutional factors promote or harm student success: 1) the Conceptual Dimension, the conscious analysis of events which transpire in the institution; the Conceptual Dimension includes examining the theory behind the practice and the exploration of alternatives; and 2) the Critical Dimension, the justification of actions based on "moral, ethical and socio-political issues" (p. 4); the Critical Dimension includes understanding the environmental framework needed for a culture of inquiry to fully develop as a critical-mass function in the contextual framework of institutional practice.

How might a culture of inquiry be implemented and operationalized to advantage in the community college to maximize positive efforts toward institutional practices that promote student success? First, a culture of inquiry is not the only nomenclature used to define the strategic inside-out approach to understanding an institution's organizational practices to significantly improve student success. VanWagoner, Bowman, and Spraggs (2005) used the category of The Significant Community College to suggest that "the number of students passing through the 'in door' is not the important success measure; the number persisting to the graduation-transfer-employment door is of the greatest concern" (p. 39), and that such colleges "will continually seek to understand how students learn and what promotes and impedes [student] success" (p. 49); Tinto and Pusser (2006) classified a culture of inquiry as institutional action for student success; McClenney and Greene (2005) used the term culture of engagement; and Hanson (2006) referenced the learning community college to define a culture of inquiry. All of these identifiers generally apply to the outcomes which result from an institution which practices a culture of inquiry as a methodology to assess and improve college student success.

Secondly, a culture of inquiry to determine institutional practices which impact student success is explicitly within the purview of community college leaders. As noted by Boswell and Wilson (2004), "... community college leaders have a responsibility to re-examine their own practices and assumptions, holding themselves accountable for adopting cost-effective and learning-centered strategies that help ensure student success" (p. 49). The third point is that community colleges should strive to implement a culture of evidence within the culture of inquiry, across the total set of institutional practices (Dowd, 2005).
For example, Bailey and Alfonso (2005) framed a culture of evidence within the culture of inquiry as one in which "institutional research functions play a more prominent role and faculty and administrators are more fully engaged with data and research about success of their students, using those data to make decisions" (p. 3). Stated differently, if data are used to make decisions about student success as a result of organizational practices of inquiry, student success is more likely to be improved. Moreover, Bailey and Alfonso (2005) suggested six strategies to implement a culture of evidence within a culture of inquiry:

1. Assess and invest in the necessary resources to establish an effective institutional research function; follow up with skills development for individuals within the institutional research department;
2. Recognize and accept that assessments of program effectiveness are difficult and require commitment by staff, faculty, and administration (Miller, 2006);
3. Synthesize and prepare understandable research reports on student success and outcomes, which include both quantitative and qualitative methodologies;
4. Make concerted efforts to involve faculty, staff, administration, and, when and where applicable, even students in opportunities for the research endeavors;
5. Develop a systematic and efficient process of dissemination for publishing the findings throughout the institution and community; and,
6. Promote collaborative processes among institutional research offices to share data, best practices, outcomes, and other information which supports student success in the community college system of education.

The fourth point is drawn from Biswas (2006), who conducted a study on the impact that accrediting bodies may possibly have on student success, based on the construct that a culture of inquiry and evidence can be a powerful tool in efforts to improve student outcomes. For example, the website of the Southern Association of Colleges and Schools (SACS) provided the following statement: "The Mission of the Southern Association of Colleges and Schools is the Improvement of Education in the South Through Accreditation." This study would argue that "improvement in education" includes a thorough immersion in a culture of inquiry to meet not only the requirements of the Southern Association of Colleges and Schools (SACS), but also, and more importantly, the success needs of students in the community college. By practicing the daily application-construct of "We exceed the minimum requirements of SACS in order that our students are successful beyond mere expectations," community colleges are more likely to initiate a culture of inquiry as a matter of daily and expected practice. The regional accrediting commissions are indicated in Table 11 (Biswas, 2006, p. 5).
Table 11
Regional Accreditation and Higher Learning Commissions

Middle States Association of Colleges and Schools, Commission on Higher Education (accredits community colleges and four-year institutions): Delaware, District of Columbia, Maryland, New Jersey, New York, Pennsylvania, Puerto Rico
New England Association of Schools and Colleges, Commission on Institutions of Higher Education (accredits community colleges and four-year institutions): Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont
North Central Association of Schools and Colleges, Higher Learning Commission (accredits community colleges and four-year institutions): Arkansas, Arizona, Colorado, Iowa, Illinois, Indiana, Kansas, Michigan, Missouri, Nebraska, New Mexico, North Dakota, Ohio, Oklahoma, South Dakota, West Virginia, Wisconsin, Wyoming
Northwest Association of Schools and Colleges, Commission on Colleges (accredits community colleges and four-year institutions): Alaska, Idaho, Montana, Nevada, Oregon, Utah, Washington
Southern Association of Colleges and Schools, Commission on Colleges (accredits community colleges and four-year institutions): Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, Virginia
Western Association of Schools and Colleges, Accrediting Commission for Community and Junior Colleges and Commission for Senior Colleges and Universities: California, Hawaii, Pacific territories

Source: Biswas, R. (2006). A supporting role: How accreditors can help promote the success of community college students. An Achieving the Dream Policy Brief, October 2006. Retrieved January 3, 2007, from http://www.achievingthedream.org/default.tp

Within the analysis by Biswas (2006), the questions to be answered were: 1) can the accreditation process become an effective lever in improving institutional practices to promote student success; 2) can regional agencies do more in terms of standards or other actions to guide institutions in addressing the plethora of student success challenges local institutions face; and, 3) what methodologies within the accrediting process might be utilized to support and accelerate the process of improving student success? To address these questions, the following conclusion was drawn:

There is a growing sense, among institutions and accrediting bodies alike, that accreditation would benefit from moving toward an ongoing process of continuous improvement based on a culture of evidence, built around the central themes of student learning and student success. This shift from a periodic, discontinuous seminal event will require a parallel shift from a compliance framework to an improvement framework, with data driving the undertaking. (Biswas, 2006, p. 20)

Institutional practices have been suggested as critical to the success of students in the community college (Bailey et al., 2005a; Braxton, 2006; Hirsch, 2001; Kuh et al., 2006; Achieving the Dream, 2005, 2006; Community College Survey of Student Engagement, 2005). A key component of this relationship is the culture of inquiry to continually ask tough questions which will drive the institution to significantly and continually improve student success (Reid, 2004; Tinto & Pusser, 2006; Greene, 2005; Bailey & Alfonso, 2005). Within the framework of institutional practice to promote college student success, certain strategic factors need specific investigation to determine their collective and interdependent impact on student outcomes.
The factors for this study have been identified in the Strategic-Impact-Triad Model and will be analyzed as follows: a) Factor 1: Academic Preparation; b) Factor 2: Work Ethics of Students and Faculty; and, c) Factor 3: Institutional Support. These factors will be measured from the respective perceptions of students and faculty in the community college setting.

As previously noted, variances in the perceptions of students and faculty tend to have consequences which are not always in the best interest of the student. Through no fault of the faculty member or the student, the differences between these groups in terms of factors related to student success are more likely to create minor or major issues which interfere with student achievement. And, as previously noted, student achievement or success is not solely within the purview of earning a degree. To reiterate, Long (2006) argued that "student success is a multidimensional issue with varying definitions of the benchmarks" (p. 2). The benchmarks include perceptions of academic preparation, work ethics, and institutional support, all within the framework of institutional practice, within a culture of evidence, within a culture of inquiry (Brock et al., 2007).

The next sections of this study will delve into the Strategic-Impact-Triad factors in terms of previous and on-going research. As contended in this study, the SIT Model factors, or domains of practice, are strongly suggested as required for students to be successful, even if success is identified to have "varying definitions of the benchmarks" (Long, 2006, p. 2). The three domains as identified in this study are academic preparation, work ethics, and institutional support. The three domains are considered to be interrelated and co-dependent upon each other for positive student outcomes. Therefore, the sections that follow will discuss the three domains in detail, using underlying constructs to identify and characterize the variables which comprise the domains. These variables are not strictly comprehensive for this study; however, they are the characteristics which generally and specifically define the domains of interest. For this study, these characteristics correlate to the practices which impact student success or achievement within the community college system of education.

Factor 1: Academic Preparation

Conley (2005), Kirst and Venezia (2004), and Kaye, Lord, and Bottoms (2006) agreed on a specific delineation between a student being college-ready and college-eligible. While college-eligibility is a process of high school graduation, timely application, transcript submission, and the proverbial on-campus visit, college-readiness, or academic preparation, is the foundation for participation, performance, student success, persistence to graduation, and life-long learning. The framework for student success is applicable to any field of study or career choice. For example, engineering requires greater emphasis on reading and math skills, compared to English majors who depend more heavily on reading and writing skills; moreover, employees in all trades and professions across the nation require a solid foundation in basic skills for everyday tasks, with everyday tasks becoming increasingly technologically complex (ACT, 2006b). Additionally, for the high school graduate who enrolls in the community college, basic skills are just as vital to degree attainment as they are to degrees in the university setting.
Basic skills are the ingredients upon which the educational system depends for establishing, maintaining, enhancing, and supporting global competitiveness (Krueger, 2006; Phillips & Skelly, 2006). Consequently, the development of academic preparation and student success are intertwined as co-dependent, assumed positively correlated variables. For instance, as basic skills improve, student success is more likely to improve linearly. Studies by the ACT (2006a, 2006b), Kaye, Lord, and Bottoms (2006), Callan, Finney, Kirst, Usdan, and Venezia (2006), Karp, Bailey, Hughes, and Fermin (2005), and the U.S. Department of Education (2006) have all noted that academic preparation is one of the most important educational challenges this nation has ever faced. Specifically, the U.S. Department of Education (2006), in its report Answering the Challenge of a Changing World: Strengthening Education for the 21st Century, suggested the following academic preparation related issues:

1) America is facing a rapidly changing global workforce at an exponential pace never before experienced;
2) Technological innovation and newfound freedoms around the world are spurring competition for the American worker well beyond basic skills;
3) The high school diploma at one time was status quo and adequate for a major portion of the jobs in the nation, with a college degree as the educational crème-de-la-crème;
4) The National Defense Education Act of 1958, signed into law by President Eisenhower, stated: "The Congress hereby finds and declares that the security of the Nation requires the fullest development of the mental resources and technical skills of its young men and women" (p. 4);
5) "U.S. manufacturing will no longer employ millions in low-skilled jobs. Tomorrow's jobs will go to those with education in science, engineering, and mathematics and to highly-skilled technical workers. Such a workforce is an important key to future growth, productivity, and competitiveness" (p. 4) (National Association of Manufacturers [NAM], 2005);
6) Ninety percent of the fastest-growing jobs of the future will require some type or types of postsecondary education;
7) If current trends continue, by 2012 over 40% of factory jobs will require postsecondary education;
8) Almost half of our 17-year-olds do not have the basic understanding of math needed to qualify for a production associate's job at a modern auto plant;
9) More than half of the undergraduate degrees awarded in China are in the fields of science, technology, engineering and math, compared to 16% in the U.S.;
10) Educational leadership has been challenged as a result of data indicating that many developed nations outperform the U.S. in international tests; these test scores are linked to a lack of challenging course work in high school, suggesting an ominous outlook for many American schools, including 2- and 4-year institutions.

Each one of the items within the report by the U.S. Department of Education (2006) suggested that preparation for college or the workforce is a national mandate with implications so far reaching that the very fabric of the nation's economic survival is at stake. Moreover, the indicators in the study comprise many input variables which negatively impact student success or workforce outcomes. Of significance is the substantive relationship between academic preparation in high school and being prepared for either college or the workforce.
According to Barton (2006), the most common reason that companies reject applicants as hourly production workers is that the workers do not have adequate basic employability skills: the data indicated that the rate of rejection was 69%, or 69 potential workers out of every 100 applicants, due to a lack of basic employability skills, with particular emphasis on basic academic skills. Lovett and Mundhenk (2004) argued that "a college degree has replaced a high-school diploma as the gateway to the American middle class" (p. 2) and, as noted by the National Association of Manufacturers (NAM) (2005), the future success of the workforce will be more dependent on employees with higher or postsecondary education than at any time in history. Consequently, the success of higher education depends on multiple sources: improved student academic preparation, funding, community support, legislative action, and other factors; student academic preparation, in turn, is inextricably dependent on educational policies and perceptions which mandate a process to continually improve the spectrum of academic preparation policy and practice (Callan, Finney, Kirst, Usdan, & Venezia, 2006; Jenkins, 2006; Southern Regional Education Board, 2006).

College-readiness, also known as academic preparation, has been investigated and reported as one of seven national education priorities (Byrd & MacDonald, 2005). In this same study, it was noted that 41% of community college students and 29% of all entering college students are academically challenged in at least one of the basic skills needed to succeed in college, e.g., reading, writing, or math. The literature on college-readiness and academic preparation is quite extensive, including considerable research on the basic skills of entering freshmen, with variances in policy redress, perceptions, and transferability to the public good (Achieve, Inc., 2006; Greene & Winters, 2005; Wirt, Choy, Rooney, & Provasnik, 2005). A particularly vital component with regard to college-readiness is the level of reading ability of students who are college-eligible (ACT, 2006b; National Assessment of Adult Literacy (NAAL), 2005).

The need for studies on academic preparation, i.e., factors for college success, resulted in one of the most profound projects ever undertaken in the history of education (Caboni & Adisu, 2004). However, Graham (2003) noted that at the time of A Nation at Risk, there were "some [who] rejected the arguments as over-blown rhetoric" (p. vii). T.H. Bell, Secretary of Education under President Ronald Reagan, on August 26, 1981, established The National Commission on Excellence in Education (NCEE) (1983). Secretary Bell tasked the NCEE with preparing a report on the quality and condition of education. The study, A Nation at Risk: The Imperative for Educational Reform (NAR), was completed in 1983 and provided a detailed qualitative analysis of issues facing American education, including recommended solutions to the problems identified. Although the study is over 20 years old, the significance of the findings is directly related to the issues which impact student success in the present generation of college-eligible students (Conley, 2005). The issues are compiled in Table 12 and imply the depth of academic preparation deficits longitudinally before the study and for nearly twenty-five years subsequent to the study.
Moreover, as college student success research is conducted in the educational community, the outcomes and findings of the studies are generally and alarmingly consistent with the indicators in A Nation at Risk (Horn, Nevill & Griffith, 2006; Kuh et al., 2006). The relationship between the indicators in Table 12, academic preparation, and institutional practice in promoting student success is that community colleges must consider academic preparation as a two-part characteristic of students. The first part is pre-college; the second part is present-college. For example, using the ACT COMPASS System (2006) to test incoming students provides academic preparation indicators to support student success, i.e., suggested developmental courses: this institutional practice addresses a pre-college student characteristic. The present-college characteristic of students is related to academic preparation while in college, that is, those institutional practices which provide present-day academic preparation and success.

Table 12
A Nation at Risk: Indicators of the Risk (1983) and Current References (2004-2008)

1. International comparisons of student achievement, completed a decade ago, reveal that on 19 academic tests American students were never first or second and, in comparison with other industrialized nations, were last seven times. (Conley, 2005; ACT, 2005a, 2005b, 2006a, 2006b; College Board, 2006b)
2. Some 23 million American adults are functionally illiterate by the simplest tests of everyday reading, writing, and comprehension. (NAAL, 2005)
3. About 13% of all 17-year-olds in the United States can be considered functionally illiterate. Functional illiteracy among minority youth may run as high as 40%. (ACT, 2005a, 2005b, 2006a, 2006b)
4. Average achievement of high school students on most standardized tests is now lower than 26 years ago when Sputnik was launched. (ACT, 2005a, 2005b, 2006a, 2006b)
5. Tested ability for over 50% of gifted students (population) does not positively correlate to their respective achievement in school. (Carey, 2006)
6. The College Board's Scholastic Aptitude Tests (SAT) demonstrate a virtually unbroken decline from 1963 to 1980. Average verbal scores fell over 50 points and average mathematics scores dropped nearly 40 points. (Reference Service Press, 2003; College Board, 2006a, 2006b)
7. College Board achievement tests also reveal consistent declines in recent years in such subjects as Physics and English.
8. Both the number and proportion of students demonstrating superior achievement on the SATs (i.e., those with scores of 650 or higher) have also dramatically declined. (ACT, 2005a, 2005b, 2006a, 2006b)
9. Many 17-year-olds do not possess the higher-order intellectual skills we should expect of them. Nearly 40% cannot draw inferences from written material; only one-fifth can write a persuasive essay; and only one-third can solve a mathematics problem requiring several steps. (Boswell & Wilson, 2004; NCES, 2005; Lovett & Mundhenk, 2004)
10. There was a steady decline in science achievement scores of U.S. 17-year-olds as measured by national assessments of science in 1969, 1973, and 1977.
11. Between 1975 and 1980, remedial mathematics courses in public 4-year colleges increased by 72% and now constitute one-quarter of all mathematics courses taught in those institutions. (Bettinger & Long, 2004, 2005)
12. Average tested achievement of students graduating from college is also lower.
13. Business and military leaders complain that they are required to spend millions of dollars on costly remedial education and training programs in such basic skills as reading, writing, spelling, and computation. The Department of the Navy, for example, reported to the Commission that one-quarter of its recent recruits cannot read at the ninth grade level, the minimum needed simply to understand written safety instructions. Without remedial work they cannot even begin, much less complete, the sophisticated training essential in much of the modern military. (Krueger, 2006; Bettinger & Long, 2005; Attewell, Lavin, Domina, & Levey, 2006; Horn, Nevill & Griffith, 2006)

Source: National Commission on Excellence in Education (NCEE). (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office. (Section: Indicators of the Risk, pp. 8-9).

Caboni and Adisu (2004), in analyzing A Nation at Risk (NAR) and comparing current trends which have directly or collaterally ensued from the 1983 study, suggested that A Nation at Risk (through no fault of the report or its intended outcomes) has resulted in a modicum of positive changes in the educational system in the United States; however, Caboni and Adisu (2004) and Gordon (2003) also indicated that many issues remain in flux. For example, the original NAR report focused on curriculum, remediation, and teaching. A particular outcome of the NAR report was the conclusion that high schools should develop and participate in a set of core courses referred to as the New Basics Curriculum (NBC). While the goal of the NBC was to initiate and support improved high school preparation for college student success, an additional suggested outcome and benefit was to significantly reduce the level of remediation for graduating high school seniors. However, as indicated by Caboni and Adisu (2004):

According to the National Commission on the High School Senior Year, only 10 states have aligned their high school graduation requirements in English and only two states have done so in Math. Additionally, only 20% of schools require students to take the New Basics Curriculum recommended by NAR. (p. 168)

As further noted in Figure 4, there are many variables which positively or negatively impact the domain of student success. In terms of college-readiness, or its equivalent of academic preparation, as a college student's academic preparation level increases, it is more likely that the student will be successful, gain significant employment, enjoy an improved quality of life, and contribute more fully to society. Conversely, as suggested in Figure 4, a student who has deficits in academic preparation before and during college will face significant difficulties in achieving academic success, or even attaining the status quo (lower levels in quality of life). The variables in Figure 4 also possess the potential to hinder the applicability of the college-readiness process, resulting in negative outcomes in the sphere of national competitiveness, global leadership, quality of life, and unfavorable perceptions of the educational system beginning with P-12 (Conley, 2005; Daugherty, 2005; Lord, Marks & Creech, 2005).

Figure 4. College-Readiness/Student Success Impact Model. [The model relates college-readiness (improved or deficit) and education at all types and levels (recruitment, retention, graduation rates, policies, workforce input, P-16 initiatives, student success, educational practices, etc.) to the U.S. economy, competitive global influence, an educated workforce, quality of life, and perceptions.]

The U.S.
Department of Education projected that by 2009, 75% of high school seniors will likely attend college (Boggs, 2004), which included an estimated 42% enrolled in public two-year technical, community, and junior college institutions (Horn, Nevill & Griffith, 2006; National Center for Education Statistics (NCES), 2003). Moreover, as noted by the College Board (2004, 2005), colleges and universities over the last eight years have increased student enrollment from 14.3 million to 15.3 million to reach an all-time record high number of students. College enrollment is expected to increase another 15% to an estimated 17.7 million students by the year 2012. Assuming the validity and reliability of the College Board's projection, academic preparation outcomes have the baseline potential to positively or negatively affect the nation's future workforce, leadership in a global economy, and students' personal and professional success (personal and professional success being based on the academic ability to achieve). Table 13 suggested the potential impact of a lack of college-readiness or academic preparation, based on the composite projections by the College Board, the U.S. Department of Education, and the American College Testing Service. The data presented in Table 13 do not directly address the variables of workforce readiness, economics, policy issues, work ethics, institutional support, or institutional practices.

Table 13
Academic Preparation Impact Projections

Reporting agency | Actual or projected year | % attendance | Students attending | Students not attending | Impact of college-readiness (remedial, developmental, or completion rates)
College Board | 2012 | 75% | 17.7 million | Baseline | 3.9 million college-ready (.22 x 17.7 million)
U.S. Department of Education | 2009 | 100% | 22.2 million | 4.5 million | 4.9 million potentially college-ready (.22 x 22.2 million)
ACT, Inc. (2005a) | 2004 | n/a | n/a | n/a | Only 22% met or exceeded college-readiness benchmarks
ACT, Inc. (2005b) | 1983-2005 | n/a | n/a | n/a | All two-year college completion rates: 30%; national completion rates for four-year colleges: 51.8%

As put forth in Table 13, a lack of college-readiness has the potential to impact the performance, and oftentimes the completion rates, of students enrolled in college or of those potential college-eligible students seeking to enroll in college (Dounay, 2006a; Maloney, 2003). Furthermore, if the data as projected are within a few percentage points of being correct, this would suggest the depth of the problem of students not prepared for the rigors of college-level work or opportunities in the workforce. The result of deficits in college-readiness or academic preparation suggested a negative impact on the economy, society, and higher education (Boswell & Wilson, 2004; NCES, 2005). Additional data from the American College Testing Service (ACT, 2006) indicated that student preparation for college-level reading is at its lowest point in more than a decade, spanning 1994 to 2005. Additionally, the study noted that "it is also recognized today that the knowledge and skills needed for college are equivalent to those needed in the workplace" (p. 3), including reading skills.
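The college-readiness impact estimates in Table 13 above follow from straightforward arithmetic on the cited projections. The short calculation below is offered only as a minimal illustrative sketch (in Python); the 22% benchmark rate and the population figures are taken from the table, while the variable names are the researcher's own and are not part of the cited studies.

    # Sketch reproducing the Table 13 college-readiness estimates.
    # Assumes the 22% ACT benchmark rate and the enrollment projections cited above.
    ACT_READY_RATE = 0.22  # share of tested graduates meeting ACT readiness benchmarks

    college_board_2012 = 17_700_000   # College Board projected enrollment
    us_doe_2009 = 22_200_000          # U.S. Department of Education projected population

    print(f"{college_board_2012 * ACT_READY_RATE:,.0f}")  # about 3.9 million college-ready
    print(f"{us_doe_2009 * ACT_READY_RATE:,.0f}")         # about 4.9 million potentially college-ready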
Education, including reading skills, begins in the formative years of grade school, transitions to and through high school, and concludes when the individual has successfully completed his or her stated educational objective, whether that objective is a degree, a certificate, or vocational training, all of which are indicators of student success. The overarching theme of the Academic Preparation Impact Projections (Table 13) is to offer the following: "As a state policy-maker and education leader, you will see considerable variety in state policies. You will be able to assess your individual state policies by how well they support your state's overall college-readiness effort" (Daugherty, 2005, p. 2). In terms of institutional practices related to academic preparation in the two-year college system, community college leaders should be the champions of improved state policies which promote and support student achievement.

Phillips and Skelly (2006), citing Gaston Caperton, president of The College Board, noted that "The future of this country is going to be won in the public schools. We are in an education race, not an arms race. To successfully compete in a global economy, our students need to be prepared" to earn a living wage (p. 26). In relationship to a living wage, the U.S. Department of Labor, Bureau of Labor Statistics (2000), noted that college graduates over the age of 25 earn nearly twice as much as those in the workforce who have only high school diplomas. In subsequent and related studies, the College Board (2004, 2005) conducted research in the area of educational benefits. Generally, these findings suggested that education supports not only individuals financially, but also society, through increased tax revenues, improved health benefits, a politically informed citizenry, and life-long learners. The recommendations of the research, therefore, suggested that students who are academically prepared to acquire postsecondary education will be in a position to support themselves to a higher standard and, consequently, to contribute to society in a more positive and progressive manner.

The Stanford University Bridge Project study examined barriers between high school and college and identified misaligned policies and perceptions which hampered or altogether prevented students from being prepared for college or from being equipped to make proper and informed decisions. Additionally, Collins and Chandler (1997) investigated perceptions that parents and students had with respect to learning environments at school. The students were generally less positive about their school environments than were the parents. The study suggested that policies which address these negative perceptions are more likely to initiate positive change if the perceptions are identified and given consideration for change. Notwithstanding, policy formulation and alignment is a systematic process which requires perpetual assessment to measure the effectiveness of policy on the educational process known as student success (Callan, Finney, Kirst, Usdan, & Venezia, 2006). The Stanford and the Collins and Chandler studies indicated that policies and perceptions, when out of sync, tend to have a detrimental impact on academic preparation, and ultimately on college student success. As noted in Table 14 (Dounay, 2006b), many states have begun to evaluate their methods of preparing high school students for the rigors of college, with the specific purpose of enabling those students to succeed there.
Dounay (2006b) studied the academic preparation processes within the states to determine the overall trend to improve student success as a national endeavor. The following definitions for Table 14 were provided: 1) Fully Aligned (FA): state has standard high school academic preparation that meets or exceeds the number of units as well as course types (i.e., Algebra I, lab sciences) required to maximize college student success; 2) Partially Aligned (PA): state has standard high school academic preparation that meets or exceeds the number of units, but does not meet the course types to achieve maximum student success; and, 3) Not Aligned (NA) has at least one of the following: (a) does not require all high school students to complete the number of academic preparation courses in each subject required to prepare students for college success; (b) does not have statewide high school graduation requirements; and/or (c) does not have statewide college admissions requirements (college admissions requirements set by individual institutions). 104 The alignment of high school curricula to college student success is directly related to pre-college student characteristics of baseline academic preparation. For example, if high school students do not sufficiently prepare academically for college or the workforce, success in college-level courses or life-long learning promotes a greater concern to community college leaders, faculty, and society (Wyatt, Saunders, & Selmer, 2005). Table 14 Alignment Between High School Graduation and College Admissions Course Requirements State State-set admissions requirements for state public institutions (5) Legend: NA = Not Aligned (0) PA = Partially Aligned (3) NI = Not Included (0) FA = Fully Aligned (5) una = unavailable Requirements in: * FA or PA in one or more core courses Rating (6) English Math Social Studies Science Foreign Language Alabama - No 0.00 NA NA NA NA NA Alaska - No 0.00 NA NA NA NA NA American Samoa - No 0.00 NA NA NA NA NA Arizona * + Yes 1.83 PA NA PA NA NI Arkansas * + Yes 2.67 PA NA FA PA NI California + Yes 2.17 NA NA FA PA NA Colorado - No 0.00 NA NA NA NA NA Connecticut - No 0.00 NA NA NA NA NA Delaware - No 0.00 NA NA NA NA NA Dist. 
of Columbia - No 0.00 NA NA NA NA NA Florida * + Yes 3.50 PA PA FA FA NA Georgia * + Yes 2.67 PA NA FA PA NA Hawaii - No 0.00 NA NA NA NA NI Idaho * + Yes 2.50 FA NA FA NA NI Illinois - No 0.00 NA NA NA NA NA Indiana - No 0.00 NA NA NA NA NA Iowa - No 0.00 NA NA NA NA NA Kansas * + Yes 1.83 PA NA PA NA NI Kentucky * + Yes 3.50 FA PA FA PA NA Louisiana * + Yes (1) 3.17 PA PA FA PA NA Maine - No 0.00 NA NA NA NA NA Maryland - No 0.00 NA NA NA NA NA Massachusetts + Yes 0.83 NA NA NA NA NA Michigan - No 0.00 NA NA NA NA NA Minnesota - No (2) 0.00 NA NA NA NA NA Mississippi * + Yes 2.83 PA PA PA PA NI 105 Table 14 (continued) State State-set admissions requirements for state public institutions (5) Legend: NA = Not Aligned (0) PA = Partially Aligned (3) NI = Not Included (0) FA = Fully Aligned (5) una = unavailable Requirements in: * FA or PA in one or more core courses Rating (6) English Math Social Studies Science Foreign Language Missouri + Yes (3) 1.33 NA NA NA PA NI Montana * + Yes 1.83 PA NA NA PA NI Nebraska - No 0.00 NA NA NA NA NA Nevada * + Yes 1.83 PA PA NA NA NI New Hampshire - No 0.00 NA NA NA NA NA New Jersey - No 0.00 NA NA NA NA NA New Mexico - No 0.00 NA NA NA NA NA New York - No 0.00 NA NA NA NA NI North Carolina * + Yes (4) 2.17 NA NA FA PA NA North Dakota + Yes 0.83 NA NA NA NA NI Ohio - No 0.00 NA NA NA NA NA Oklahoma * + Yes 3.50 PA FA FA PA NI Oregon * + Yes (5) 1.83 NA NA PA PA NA Pennsylvania - No 0.00 NA NA NA NA NA Rhode Island - No 0.00 NA NA NA NA NA South Carolina * + Yes 3.17 PA PA FA PA NA South Dakota * + Yes 2.17 PA NA FA NA NI Tennessee - No 0.00 NA NA NA NA NA Texas - No 0.00 NA NA NA NA NA Utah - No 0.00 NA NA NA NA NA Vermont - No 0.00 NA NA NA NA NA Virgin Islands una NA una una una una una Virginia - No 0.00 NA NA NA NA NA Washington * + Yes 1.33 NA NA NA PA NA West Virginia * + Yes 3.50 PA FA FA PA NI Wisconsin * + Yes 2.50 FA NA FA NA NI Wyoming * + Yes 3.17 PA PA FA PA NI Totals: Valid Entries = 52 All Columns = 53 una = not included - No = 28 + Yes = 24 - No = 54% + Yes = 46% NA: 35 PA: 14 FA: 3 NA: 67% PA: 27% FA: .06% NA: 43 PA: 7 FA: 2 NA: 83% PA: 13% FA: .04% NA: 34 PA: 4 FA: 14 NA: 67% PA: .04% FA: 27% NA: 36 PA: 15 FA: 1 NA: 69% PA: 29% FA: .02% NA: 36 NI: 16 NA: 69% NI: 31% Special Note: No state indicated that all core courses had fulfilled Full Alignment of English, Math, Social Studies, Science, and Foreign Language (1) Notes: State has ?Regents Core,? but institutions may adopt additional requirements. (2) Notes/Citation: Admissions requirements set independently by University of Minnesota and Minnesota State Colleges and Universities (MNSCU) (3) Notes/Citation: Units below are from 16-unit core curriculum required to apply to a public 4-year college. Students must complete a 17-unit core curriculum to apply to the University of Missouri. (4) Notes/Citation: Alignment reflects undergraduate admissions requirements effective Fall 2006 (5) Notes/Citation: Alignment reflects undergraduate admissions requirements effective Fall 2006. Graduates of Oregon high schools may also use the Proficiency-based Admission Standards System (PASS) option to substitute for English, mathematics, science, social science, and second language subject requirements. (6) A rating of 5.00 is Fully Aligned across all core courses plus the state has set state wide admissions requirements. Source: Dounay, J. (2006b). Alignment between high school graduation and college admissions course requirements. Denver, CO: Education Commission of the States. 
As indicated in Table 14, 46% of the states have established admissions requirements for public institutions, while 54% of the states have yet to complete this task. It is noted that these are admissions requirements, not requirements for the New Basics Curriculum recommended by NAR. Additionally, Caboni and Adisu (2004) argued that between A Nation at Risk in 1983 and 2004, only 20% of schools had required high school students to complete the basic core curriculum. However, the study by Dounay (2006b) indicated that states have only slightly improved their alignment processes, as follows: 1) English: 67% not aligned, 27% partially aligned, and .06% fully aligned; 2) Math: 83% not aligned, 13% partially aligned, and .04% fully aligned; 3) Social Studies: 67% not aligned, .04% partially aligned, and 27% fully aligned; 4) Science: 69% not aligned, 29% partially aligned, and .02% fully aligned; and, 5) Foreign Language: 69% not aligned and 31% not included in the analysis. Statistical averages are as follows: 1) courses not aligned, 71%; 2) courses partially aligned, 17.7%; 3) courses fully aligned, 6.8%; and, 4) courses not included, 31%, with no average to be computed. Although there is a statistical difference in the outcomes measured by Caboni and Adisu (2004) and Dounay (2006b), the difference is not significant in terms of academic preparation and college student success, which indicated that considerable work is still required to prepare students for college success. [This significance includes recognition of all factors affecting student success.] Or, as noted by Dounay (2006b): "High remediation rates among first-year students in both 2- and 4-year postsecondary institutions suggest[ed] that existing state and local graduation requirements are not adequately aligned with postsecondary expectations" (p. 1). Stated as an underlying construct to define academic preparation: if the educational systems within the individual states average full alignment for the individual core curriculum components of English, Math, Social Studies, and Science at the descriptive statistical values of .06%, .04%, 27%, and .02%, respectively, academic preparation is negatively and seriously impacted in the formative years of the nation's future college students and workforce.

This study analyzed each state by calculating a set of ratings to view numerically how well or how poorly states were working towards college-readiness by aligning their core curriculum to the demands of college and work. The rating scale was organized as follows: (1) if the state had "state-set" admissions requirements, the state was awarded 5 points; (2) if the state had any core courses which were fully aligned, the state was given 5 points for every course fully aligned; (3) if the state only partially aligned its core courses with college or work requirements, the state was allowed only 3 points for each course partially aligned; and, (4) for any course not aligned or not included, 0 points were credited to the state. As there were 6 categories, the totals were calculated and divided by 6 to derive a statistical measure of the strength of the state's efforts. The highest possible score was 5.00 (30/6). A score of 0.00 indicated that considerable work was required to foster alignment between core academic preparation courses and higher education to better prepare students for college or work.
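To make the scoring concrete, the rating can be restated as a short calculation. The following is a minimal illustrative sketch (in Python) of the scoring rules just described; the two state rows used as examples are copied from Table 14, and the function name is the researcher's own shorthand rather than part of Dounay's (2006b) analysis.

    # Sketch of the state-alignment rating used for Table 14 and Figure 5.
    # Scoring rules restated from the text: 5 points for state-set admissions
    # requirements, 5 points per fully aligned (FA) core course, 3 points per
    # partially aligned (PA) course, 0 points for not aligned (NA) or not
    # included (NI); the total is divided by the six categories.
    COURSE_POINTS = {"FA": 5, "PA": 3, "NA": 0, "NI": 0}

    def alignment_rating(state_set_admissions, course_codes):
        points = 5 if state_set_admissions else 0
        points += sum(COURSE_POINTS[code] for code in course_codes)
        return points / 6  # admissions category plus five core subjects

    # West Virginia: state-set requirements; English PA, Math FA, Social Studies FA,
    # Science PA, Foreign Language NI -> (5 + 3 + 5 + 5 + 3 + 0) / 6 = 3.50
    print(alignment_rating(True, ["PA", "FA", "FA", "PA", "NI"]))   # 3.5

    # Alabama: no state-set requirements and no aligned courses -> 0.00
    print(alignment_rating(False, ["NA", "NA", "NA", "NA", "NA"]))  # 0.0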
States with the highest calculated rating of 3.50 suggested that inroads have been made; however, to maximize the efforts towards the best-prepared students for college and the workforce, the ultimate goal was to achieve a rating of 5.00. Even if a state achieved a rating of 5.00, this would not preclude other factors from negatively influencing academic preparation (Barton, 2006; Kuh et al., 2006). Figure 5 shows the number of states correlated to their corresponding scores. While the scale in Figure 5 noted the state scores relative to their correlated groupings, it should also be pointed out that the majority of the scores are located below the 3.0 rating. Moreover, only 14% of the states scored above a rating of 3.00, while 76.9% scored less than the 50th percentile mark of 2.50. The ratings suggested that core curriculum alignment is significantly less effective than recommended by the indicators in A Nation at Risk (see Table 12 and Figure 5).

Figure 5. Synopsis of State Scores by Number of Relative Score Groupings. [Scores range from 0.00 (0% aligned across the core) to 5.00 (100% aligned). Number of states per score grouping: 0.00, 28 states; 0.83, 2; 1.33, 2; 1.83, 5; 2.17, 3; 2.50, 2; 2.67, 2; 2.83, 1; 3.17, 3; 3.50, 4. Four of 52 states and territories reached a score of 3.50, with an additional 3 states scoring 3.17; only 14% of the states scored better than 3.0.]

In relation to Dounay (2006a, 2006b) and Caboni and Adisu (2004), Achieve, Inc. (2006) also studied the 50 states in terms of their respective alignment of high school policies with the demands of college and work. The study identified the current trend that "there is a large gap between what high schools expect and what colleges and employers demand, an expectations gap" (p. 3). Furthermore, the study noted that only five states (California, Indiana, Nebraska, New York, and Wyoming) reported that they had completed the alignment process, which included standards validated by higher education communities and business partners. Standards in California and Indiana were analyzed by Achieve, Inc., and found to be well aligned with the American Diploma Project (ADP) (2004) college and workforce readiness benchmarks. Conversely, as suggested in Table 14, the ratings for California and Indiana were 2.17 and 0.00, respectively. The inconsistencies strongly suggested that college-readiness and academic preparation are far from being an exact science; furthermore, reporting data as meeting the alignment process does not constitute validity and reliability across the actual student success outcomes as measured by accountability procedures (American Diploma Project, 2004; Dounay, 2006c; Institute for Higher Education Policy, 2006; L'Orange & Ewell, 2006). Nevertheless, as argued by Adelman (2006), a rigorous high school curriculum is a strong predictor, though not a guarantor, of academic preparation as a factor of college student success. Achieve, Inc. (2006) also evaluated an additional thirty states which reported that they were initiating action to align their high school standards to the demands of college and the workforce. Example studies which correlated significantly with Caboni and Adisu (2004), Achieve, Inc.
(2006), and Dounay (2006b) are as follows: Adelman (1999) reported that 50% of first-year college students needed to upgrade their Math or English; Attewell, Lavin, Domina, and Levey (2006) argued that 40% of traditional students needed remediation, with higher rates for nontraditional students; Bettinger and Long (2005) indicated that 40% of first-year community college students took remedial courses; the ACT (2006b) revealed that only 56% of 2005 high school graduates took a core preparatory curriculum (e.g., the NBC); and Callan, Finney, Kirst, Usdan, and Venezia (2006) noted that:

Educators and policymakers have known since the 1980s that this country would need a more highly educated workforce. For the past several decades, they have broadcast a consistent message urging high school students to attend college, and students have responded. Today's high school students have higher academic aspirations than ever before; almost 90% of high school students of all racial and ethnic groups aspire to attend college. Almost 60% of high school graduates enroll in college right after high school, and many additional students enroll in college within a few years of high school graduation. But educators and policymakers have not fulfilled their side of the bargain; they have not developed coherent state systems of education that adequately prepare high school students for the academic expectations of college. (p. 3)

Furthermore, a study by Achieve, Inc. (2005) reported that college instructors and employers confirmed their perceptions that high school graduates lack preparation for college-level classes and the skills to advance beyond entry-level jobs. Specifically, survey data indicated that instructors believed that 42% of the students were not ready for college-level classes; a strong positive correlation to the instructor data was also noted by employers, who indicated that 45% of the potential workforce lacked skills to advance beyond entry-level positions. This relationship is also supported by the ACT (2006b) and the Teaching Commission (2006). The ACT (2006b) tested 1.2 million high school graduates in 2005. Of this group, 56% (672,000 students) revealed that they had completed a type of core curriculum while in high school. Conversely, the data indicated that 44% (528,000 students) did not complete a core curriculum in high school, but opted for courses outside the framework of the college-readiness/academic preparation process. Extending this analysis to the future projected college-bound student population, if the trend holds to within ±5 percentage points, a significant concern arises for America's composite college student success future. For example, Figure 6 shows a comparison of the 2005 ACT (2006) data in graphical form against the College Board's (2004, 2005) projected data for 2017. Using interpolation on a linear scale between 2005 and 2017, based on the data from the ACT (2006) and the College Board (2004, 2005), which is statistically significant, the projected outlook for students not participating in a core curriculum suggested that an inordinate number of students will not have taken advantage of the core curriculum to achieve a baseline level of academic preparation. A result of fewer core curriculum courses in high school will be a decreased likelihood or opportunity of being better prepared for college success, work, or life in general.
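The projection developed in the passage that follows reduces to two multiplications: the College Board's 17.7 million students projected for 2017, the roughly 88% who intend to pursue postsecondary education, and the 44% arriving without the core curriculum. A minimal sketch in Python, assuming these shares simply hold constant through 2017 (the function and variable names are illustrative only), is:

```python
# Sketch of the 2017 projection discussed below, assuming the 2005 shares hold constant.
def project_unprepared(total_projected, arriving_share, non_core_share):
    arriving = total_projected * arriving_share    # students arriving at postsecondary doors
    unprepared = arriving * non_core_share         # arrivals without the high school core curriculum
    return arriving, unprepared

arriving, unprepared = project_unprepared(17_700_000, 0.88, 0.44)
print(f"{arriving:,.0f} arriving; {unprepared:,.0f} without the core curriculum")
# -> 15,576,000 arriving; 6,853,440 without the core curriculum

# Sensitivity to the +/- 5 percentage-point band noted in the text:
for share in (0.39, 0.44, 0.49):
    _, u = project_unprepared(17_700_000, 0.88, share)
    print(f"non-core share {share:.0%}: {u:,.0f} students")
```

The final loop varies the non-core share across the five-point band noted above, showing how sensitive the headline figure is to that single assumption.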
In other words, when students omit the courses which are more likely to prepare them for college-level work, the pre-college phase of academic preparation is severely shortchanged. Figure 6. Comparison of Core Curriculum Participants. (The figure contrasts the 2005 ACT results, in which 672,000 of 1,200,000 tested graduates (56%) completed a core curriculum and 528,000 (44%) did not, with the 2017 projection of 9,912,000 completers (56%) and 7,788,000 non-completers (44%) among 17,700,000 projected students.) Numerically, if the trend continues for students who do not avail themselves of the high school core curriculum, students ill-prepared for life and college may prove to be an educational albatross for the foreseeable future. Specifically, noting that almost 90% of high school students of all racial and ethnic groups aspire to attend college (Callan, Finney, Kirst, Usdan, & Venezia, 2006) and that 88% of all students surveyed in a study conducted by Venezia, Kirst, and Antonio (2003a) intend to pursue postsecondary education, the indication is as follows (using 88%): 1) 88% of 17,700,000 is 15,576,000 students knocking at the doors of universities and community colleges in 2017; 2) of the 15,576,000 students arriving for postsecondary education or training, 44% will not have received the high school core curriculum; and 3) at 44%, there will be 6,853,440 students who potentially are not academically prepared for success in college. The difference between the total projection of 17,700,000 and the 88% declared attendees will require further investigation as to levels of college-readiness, the variables determining delayed attendance, and so forth. More crucial are the data indicating the possibility, and probability, that 44% of the total projected 17,700,000 students will arrive for postsecondary education not having completed the high school core curriculum: some 7,788,000 students who are more likely to be less prepared for college success or the workforce than those who completed the core curriculum (ACT, 2006c; Adelman, 2006; Dounay, 2006b, 2006c, 2006d, 2006e). The ACT (2006c) suggested that the core curriculum should include four years of English and three years each of mathematics, science, and social studies. Barton (2006) indicated that even when there is a core curriculum, it is difficult to pin down what constitutes achievement among the participants of core courses, as variations exist within schools, between schools, and even across school systems and states. To glean specificity from within the variances of achievement, Barton (2006) analyzed the evaluation system used by the National Assessment of Educational Progress (NAEP), as NAEP specifically delineated what students were to know and be able to do to receive achievement levels of basic, proficient, or advanced in each subject area for the fourth, eighth, and twelfth grades. For example, in 2003, 29% of 8th graders reached the proficient level in math, while about 50% of 12th graders reached the same rating. The achievement level of proficient in math was categorized as follows: "the definition of proficient in eighth-grade mathematics describes a considerable level of mathematical ability; this level is set at a scale score of 299 on NAEP's 0 to 500 scale" (p. 24). Figure 7 suggested that one of the most influential variables positively impacting academic preparation is the development of basic skills; the antithesis of this construct is a debilitating educational outcome requiring incalculable community college man-hours and scarce resources to correct.
Assigning blame for remediation per se does not address problems or seek solutions; however, several studies have identified misaligned policies as problematic and suggested solutions to address these inadequacies (Attewell, Lavin, Domina, & Levey, 2006; Bettinger & Long, 2005; Dounay, 2006b; Daugherty, 2005; Lord, 2002a, 2002b). For example, suggested solutions included policies which led the charge to improve the structure for basic skills development so that students can succeed in college and the workforce without the need for remediation, as well as policies to inform stakeholders about the process and structure of college-readiness to promote student success. As noted by Dounay (2006c), students and parents are not educated on Carnegie units, the number of units required, the specific courses in those units, variances in units required for high school graduation, and units required for college admission, e.g., college-readiness, academic preparation, college student success. Furthermore, as states fail to align high school core curricula fully with college-readiness requirements, the gap continues to widen between preparation in high school and potential success in college or the workforce (Achieve, Inc., 2006; Boswell & Wilson, 2004; Phelan, 2004; Wirt, Choy, Rooney, & Provasnik, 2005; Starratt, 2003). At the heart of academic preparation research are data suggesting that the preparation of young minds is too often perfunctory for achieving postsecondary educational success and the betterment of life, liberty, and the pursuit of economic happiness, and even survival (Byrd & MacDonald, 2005; Greene & Foster, 2003; Greene & Winters, 2005; Phillips & Skelly, 2006). Figure 7 also suggests that as students progress through the formative years of their learning process (P-12), the system of education must be fully aligned to maximize, at a minimum, the opportunity for young adults to mature into "thinking-learners," not automatons who can recite phrases, numbers, or pass standardized test questions as a rote exercise (Carey, 2006). Figure 7. Core Curriculum College-Readiness Model. (The figure depicts basic skills as the foundation: a fully aligned core curriculum of four years of English and three years each of Math, Science, and Social Studies yields positive college-readiness and students better prepared for college, life, and the workforce, whereas a non-core, not-fully-aligned pathway of variable course-taking yields negative college-readiness and students less prepared for college, life, and the workforce. Fully aligned denotes a state whose standard high school graduation requirements meet or exceed the number of units and the course types required for college success.) When academic preparation and institutional practice are assessed as co-variables impacting student success, it is imperative that community colleges foster a culture of inquiry to fully recognize the academic and diverse needs of students. As suggested by Clagett (2004) and Terenzini and colleagues (1994):

In the past, we have tended to develop new student support programs implicitly assuming that the challenge is to help students adapt to the institution. For nontraditional and diverse students, however, the logic needs to be reversed: Institutions must seek ways in which they can change so as to accommodate the transitional and learning needs of first-generation and other nontraditional students. Some students will flourish in their new environment without institutional intervention. Others, however, will require assistance that is initiated by institutional representatives,
faculty and staff. Faculty cannot assume that their sole responsibility is to teach and advise, and that if students do not take advantage of what they have to offer it is the student's problem. The burden of responsibility for taking advantage of transitional support mechanisms cannot rest with the student alone. (Terenzini, 1994, p. 72)

One of the prime areas where institutional practice must address academic preparation as a detrimental factor which harms student success is remedial education. College students and faculty have dichotomous perceptions regarding remedial or developmental education. Several students interviewed during the literature review process of this study were quoted as saying, "Why do I have to take these basic courses when I made all A's and B's in high school?" Furthermore, the comments made by these students were given substantial acknowledgment by Olsen (2000): "High schools produced record numbers of graduates with A and B grade point averages, while colleges and universities reported a significant and costly growth in remedial courses" (p. 104). Substantiating the data analysis by Olsen (2000), Brozik (2004) reflected bluntly on student academic preparation and college student success:

No kidding, I mean it. Whom do I blame? I teach upper-division and graduate courses, and I am constantly confronted with students who cannot spell, who do not or will not read, and whose math skills are simply appalling. I spend a whole lot of time trying to get these kids up to a reasonable level of literacy. I should be teaching content, but, oh no, I just try to get past sentence fragments. (p. 25)

A longitudinal study by Woodruff and Ziomek (2004), part of an ACT research effort, was conducted over the years 1991 to 2003. The study investigated high school grade point average (HSGPA) inflation, identified and measured as the relationship between HSGPA and ACT assessment scores. Findings from the thirteen-year study indicated that grade inflation was a significant contributor to college remediation. Also specified in the research is the definition of inflation: "HSGPAs increased without a concomitant increase in achievement, as measured by the ACT" (p. ii). Of concern, this research also noted that grade inflation may be present in many colleges and that, as a result of grade inflation, there is a relational connection to "credential inflation" (p. 10). As faculty in the community college become better prepared to recognize the variances in student academic preparation, practices which better support student success should begin to evolve (Achieving the Dream, 2005). For example, as faculty in the community college come to understand the trend established by shortfalls in academic preparation resulting in remedial coursework, institutional practices which enable students to achieve at higher levels can be initiated and improved in a longitudinal manner. As a result, students will receive present-college academic preparation that improves their respective chances to succeed in college, whether that success is a degree, transfer, an improved work ethic, a better outlook on life, a renewal of motivation to achieve, or simply the attainment of a certificate. As noted in Table 15, remediation rates for students attending community or technical colleges ranged from 41.1% to 44.9% across the fall terms of 2001 through 2006.
On average, remedial courses were taken by 43.2% (26,515 of 61,417) of first-time students enrolled between 2001 and 2006. Institutional practices to meet this academic preparation shortfall are the responsibility of every stakeholder associated with the community college system of education (Achieving the Dream, 2005; Bailey et al., 2005a; Kuh et al., 2006; Richardson, 2006; VanWagoner, Bowman, & Spraggs, 2005). Brock et al. (2007) reported findings on the latest outcomes of the Achieving the Dream Initiative (a consortium of MDRC and the CCRC). One of the outcomes of the study was as follows: "Colleges implemented a wide array of strategies to improve student success, including strengthening academic advising and orientation programs, revamping developmental education, and offering professional development for faculty and staff" (p. iii). Key to this finding within the present context is the improved institutional practice of "revamping developmental education" (p. iii). How does this revamping translate into institutional practice that improves student achievement? First, the primary goal of the Achieving the Dream Initiative is to establish a culture of evidence in the community college. Second, the colleges in the initiative commit to collecting significant data on practices in the college and, in turn, using the data to improve student achievement. Third, data associated with developmental or remedial programs are used to improve the step-by-step process of testing, advising, course selection, follow-up, tutoring, and so forth. Therefore, students who need a basic math course simply are not enrolled in the class without a "linear success factor."

Table 15
Alabama Commission on Higher Education, High School Report: Enrollment in Alabama Public Colleges and Universities (First-Time Freshmen)

                          Total First-Time   Remedial    Remedial      Remedial Math   Total      Remedial %
                          Enrolled           Math Only   English Only  and English     Remedial   (of row total)
Fall Term 2006
  2-Year Colleges         10,291             1,965       797           1,471           4,233      41.1%
  4-Year Coll./Univ.      10,827             1,018       382           352             1,752      16.2%
  Subtotal                21,118             2,983       1,179         1,823           5,985      28.3%
Fall Term 2005
  2-Year Colleges         10,294             1,863       945           1,505           4,313      41.9%
  4-Year Coll./Univ.      10,448             641         254           275             1,170      11.2%
  Subtotal                20,742             2,504       1,199         1,780           5,483      26.4%
Fall Term 2004
  2-Year Colleges         9,782              1,973       740           1,501           4,214      43.0%
  4-Year Coll./Univ.      9,792              909         272           385             1,566      15.9%
  Subtotal                19,547             2,882       1,102         1,886           5,780      29.5%
Fall Term 2003
  2-Year Colleges         10,652             2,021       953           1,702           4,676      43.9%
  4-Year Coll./Univ.      9,713              1,034       287           418             1,739      17.9%
  Subtotal                20,365             3,055       1,240         2,120           6,415      31.5%
Fall Term 2002
  2-Year Colleges         10,213             1,774       1,051         1,766           4,591      44.9%
  4-Year Coll./Univ.      9,713              1,034       287           418             1,739      17.9%
  Subtotal                19,740             2,828       1,374         2,637           6,839      34.6%
Fall Term 2001
  2-Year Colleges         10,185             1,627       1,133         1,728           4,488      44.1%
  4-Year Coll./Univ.      8,753              995         206           408             1,609      18.4%
  Subtotal                18,938             2,622       1,339         2,136           6,097      32.2%
2-Year Totals, 2001-2006  61,417             11,223      5,619         9,673           26,515     43.2% (avg)
4-Year Totals, 2001-2006  59,246             5,631       1,688         2,256           9,575      16.2% (avg)

Difference between 2-year colleges and 4-year colleges/universities: community and technical colleges enrolled a 27-percentage-point larger share of first-time students in remedial courses.
Source: http://www.ache.state.al.us
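As a quick check on the averages reported in Table 15, the 2-year and 4-year rates and the gap between them can be recomputed from the totals. A minimal sketch in Python (variable names are illustrative only) is:

```python
# Recompute the Table 15 averages from the 2001-2006 totals listed above.
two_year_remedial, two_year_enrolled = 26_515, 61_417
four_year_remedial, four_year_enrolled = 9_575, 59_246

two_year_rate = two_year_remedial / two_year_enrolled      # ~0.432 -> 43.2%
four_year_rate = four_year_remedial / four_year_enrolled   # ~0.162 -> 16.2%
gap_points = (two_year_rate - four_year_rate) * 100        # ~27 percentage points

print(f"2-year: {two_year_rate:.1%}; 4-year: {four_year_rate:.1%}; gap: {gap_points:.0f} points")
```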
This study seeks to address the perceptions of students and faculty in order to assess the relationship between institutional practices and academic preparation as a major factor impacting student success. If the perceptions of students and faculty are not properly aligned and supported by relevant policies, the framework for establishing successful programs for student success may also be misaligned, ineffective, and harmful to student achievement. Currently, there is a large gap between educators' expectations of their students and students' own expectations for success, including computer competence (McGuire & Williams, 2002; Messineo & DeOllos, 2005; Brancato, 2003). Without perception studies, the educational system in America would very likely understand itself far less well than it does (Park, Scherer & Glynn, 2001; Konings, Brand-Gruwel & Merrienboer, 2005). Moreover, there have been many studies of perceptions about various issues or topics in education: for example, faculty beliefs about teaching at a research university (Wright, 2005); societal perceptions that schools "have abandoned academic standards, . . . undermined American economic competitiveness, . . . breed social disorder, . . . waste massive sums of money, . . . no longer provide a reliable way for people to get ahead, and . . . reinforce societal inequality in American society" (LaBaree, 1997, p. 69); student perceptions of community college classroom environments as contributing to or hindering their learning (Veltri, Banning & Davies, 2006); the mental image that a college student is a recent high school graduate who is young, white, middle- or upper-income, and who will pursue only a four-year degree on a residential campus (Lamkin, 2004); and even military and civilian comparisons of education (Franke, 2001). The power of perceptions is graphically illustrated in Figure 8, Figure 9, and Figure 10. Figure 8. The Relationship of Perceptions and the Strategic-Impact-Triad Factors. The bold arrows indicate the direct influence perceptions have on each independent SIT Model factor; the dashed arrows indicate the interdependent, collective influence perceptions have on all SIT Model factors. Perceptions in this model are all-inclusive, at all levels of practice. Figure 9. Misaligned Perception Model of Student's Academic Preparation. Perceptions influence decisions made by students and faculty in the outcomes affecting student success, specifically how academic preparation is perceived by students and faculty as a variable impacting student success.
In this model, the faculty member perceives the student to be unprepared solely on the basis of classroom outcomes, without knowing the underlying factors influencing the student's actual ability to be successful. (In the figure, a faculty member assessing a hypothetical EDLD1000 student interprets a fair outcome of "C" as evidence that the student is not prepared for college-level work and apparently not motivated; the student's 45-hour work week, wife and new baby, community service, and sense that the professor does not try to understand the situation, all indicators of a high level of college-readiness, remain outside the instructor's misaligned framework for perceiving student readiness.) Figure 10. Aligned Perception Model of Student's Academic Preparation. In this model, the faculty member perceives that the student's ability to succeed is influenced by a lack of academic preparation prior to college; intervention then occurs to provide academic support to promote and improve student success. (In the figure, the same fair outcome of "C" prompts the faculty member to ask whether the student needs help; the student's possible need for remedial courses, limited family support, poor self-esteem, and reluctance to ask questions for fear of seeming "dumb," indicators of a low level of college-readiness, fall within an aligned framework for perceiving student readiness.) Studies have concluded that academic preparation is directly linked to student success (Dounay, 2006a, 2006b; Phillips & Skelly, 2006; ACT, 2006). Academic preparation and institutional practice are impact factors which together influence college student success in the community college (Achieving the Dream, 2005; Institute for Higher Education Policy, 2006). Perceptions are input variables which impact academic preparation, institutional practice, and student success (Woodruff & Ziomek, 2004). Therefore, to assess the impact of academic preparation on the success of community college students within the framework of institutional practice, student and faculty respondents will be surveyed to collect data for analysis. The outcome of Factor 1 is to inform community college leaders that institutional practice is a key impact factor in promoting student success in the community college and that academic preparation must include pre-college and present-college attributes to promote and achieve student success. As noted by the College Board (2006a, 2006b) in Table 16, creating a set of action steps to improve student success is vital to institutional, community, student, and national success.
Table 16 Action Steps for Policymakers to Prepare All Students for the Workforce and College Action Step Description 1 Use the common expectation to establish a statewide commitment that all students will be prepared for college and workforce training programs when they graduate from high school 2 Require that all students take a rigorous core preparatory course program in high school 3 Hold schools and states accountable for preparing all students for college and workforce training programs through rigorous core courses 4 Ensure that state standards reflect the skills needed for college and workforce training readiness for all students 5 Provide funding measures of college and workforce training readiness skills to be used as statewide high school assessments 6 Begin measuring student progress with aligned assessments as early as the eighth grade to monitor progress, make appropriate interventions, and maximize the number of high school graduates who are ready for college and workforce training programs 7 Use the common expectations of college and workforce training readiness as a prerequisite for entry into funded training or developmental programs (e.g., incumbent worker training) and offer remediation for those who do not meet this expectation 8 Communicate the common expectation of college and workplace training readiness to all stakeholders, including businesses, workforce and economic development associations, and educational institutions. Source: College Board. (2006b). Ready for college and ready for work: Same or different? ACT: College and Workforce Training Readiness. Iowa City, IA. (p. 9) Academic preparation in the community college is dependent upon academic preparation in the years leading up to college attendance. While in the community college, institutional practices which improve pre-college preparation must be assessed and given proper attention; institutional practices during present-college academic 124 preparation includes any and all instances of educational practice which promotes the success of students in terms of their stated educational objectives. For example, what perceptual differences might students and faculty members indicate which will assess the relationship between academic preparation and student success in the community college in the contextual framework of institutional practice? Sample constructs are listed below: (Robbins et al., 2004; Kuh et al., 2006; Smith, 2005) a. caring faculty are essential to the academic preparation and success of students; effective teaching is an institutional practice which promotes academic preparation; b. GPA inflation distorts academic success which negatively impacts student achievement (ACT, 2005c); c. academic preparation before college is a prerequisite to college student success; d. students and faculty view academic preparation very differently which can have unintended negative student success results; e. weak academic preparation requires students to work harder to achieve success; and, f. community colleges failing to offer remedial courses to students with weak academic preparation are a poor example of institutional practices promoting student success. The literature suggested that academic preparation is a longitudinal process in the life of a college student. The outcome of a lack of academic preparation is the level of success a community college student achieves in real-life, including lifelong learning. 
To assess academic preparation as an institutional practice promoting or hindering community college student success is vital to all stakeholders in the student success domain?but most importantly, to the students and their families. 125 Community college student success is a two part process. As previously noted, the two parts are: 1) pre-college, and 2) present-college. Pre-college academic preparation includes those factors associated with high school, socioeconomic forces, and the development of an individual?s personal work ethics. Stated differently, pre-college academic preparation includes all factors which influence the development of the prospective college student to be successful in college-level requirements. Conversely, present-college academic preparation includes those things that are accomplished between the signing of the college application and the accomplishment of the individual?s stated educational objective?whatever that objective/goal may be. Achieving academic preparation in college is a success indicator, just as improving the level of writing skill of a student is a success indicator. Moreover, student success in terms of academic preparation is not solely the passing of a test or the successful completion of a lab project. Rather, academic preparation is an improvement in the life of the student. When educators think in terms of academics, it may often be considered as ?book learning.? For this study, academic preparation is denoted as that which has the potential to improve the lives of students on several levels. In other words, if an individual attends a community college and that same individual assumes that he or she can make the Dean?s List without a commitment to academic preparation, would the perception of academic preparation be similar or different between faculty and student? Therefore, the perceptual differences regarding academic preparation need to be better understood so that ?academic preparation? might begin to be a conceptualized, center-line framework for both student and faculty?for the purpose of improving the level of success students enjoy as a result of their college experience. 126 Factor 2: Work Ethics of Students and Faculty Work ethics have been defined as ?the desirable characteristics for a potential employee? (Hill & Petty, 1995, p. 59). Also referred to as employability or soft skills, what role might work ethics play?as in institutional practice?in impacting student success? According to Robinson (2000), employability skills are basic job skills which are perfunctory to ?getting, keeping, and doing well on a job? (p. 1). For the community or technical college student, the transposition of work ethics on the job is specifically applicable to doing well in the classroom. For example, Strom, Strom, and Moore (1999) used the Peer and Self-Evaluation System (PSES) to inform teachers about group interaction from the student point of view. The framework for the Peer and Self- Evaluation System was derived from field testing the system with 300 high school students (p. 539). The premise of the PSES was to enable ?a teacher [to] help students gain the ability to judge themselves? (p. 541) and to ?assess their own efforts to enhance team productivity? (p. 541). Additionally, the PSES suggested that groups of people who can work together will be the key to success in the emerging global marketplace? simultaneously validating that group success depends on individual accountability. 
Consequently, teamwork is a work ethic and a college student and faculty success indicator (The Conference Board et al., 2006). According to WorkEthics.Org (2006), the number one priority of Georgia?s employers is to create a viable and effective workforce by teaching the following work ethics to students: 1) Attendance, 2) Teamwork, 3), Attitude, 4), Organizational Skills, 5) Cooperation, 6), Character, 7) Appearance, 8) Productivity, 9) Communication, and 10) Respect. Students who attend the community or technical college without these work 127 ethics are more likely to be less prepared to do college-level work than students who possess these traits to a greater degree (Hill & Fouts, 2005; Hill & Petty, 1995; Kezar, 2006; The Conference Board et al., 2006; VanWagoner, 2006). Therefore, work ethics have a direct impact on student success and are strategic baseline factors about which students and faculty have perceptions. To measure, compare, and report the impact of work ethics as a factor of student success is to inform the community or technical college of institutional actions to be taken in improving student achievement. A review of the literature produced what might be considered synonyms for the conceptual framework of work ethics behavior by both student and faculty in the community college (Hill & Fouts, 2005; Robinson, 2000). For instance, the following examples suggested the variances and interchangeability in the application and use of terminology: 1) The Conference Board et al. (2006) interchanged the terms Professionalism/Work Ethic and Ethics/Social Responsibility (p. 9) to define two of the applied skills needed to successfully perform in the workplace. Moreover, for community or technical college graduates, the five most frequently reported applied skills considered ?very important? by employers across the United States were: a) Professionalism/Work Ethic (83.4%); b) Teamwork/Collaboration (82.7%); c) Oral Communications (82%); d) Critical Thinking/Problem Solving (72.7%); and, e) Written Communications (71.5%) (p. 20); 2) Robinson (2000) used the term employability skills to define a set of basic skills students must have to acquire employment and once employed, to maintain their employment. Robinson (2000) also noted that employability skills are teachable skills, similar to teaching organizational skills (Bakunas & Holley, 2004) or communication skills (Emanuel, 2005); 3) Waggoner (2006) suggested the term 128 soft skills to identify the following student characteristics: ?? courtesy, respect for others, work ethic, teamwork, self-discipline, self-confidence, conformity to norms, language proficiency, behavior, communication skills?listening, teamwork, and responsibility? (p. 4). Moreover, Waggoner (2006) argued that a problem arises when professors perceive that students in their classroom are deficient in soft skills, yet the instructor fails to address the deficiencies; the counter argument was that ?teaching soft skills with the hard skills recognizes that professors are teaching the whole person? (Waggoner, 2006, p. 4); 4) McAdams (2007) report, The Hottest Skills for 2007, used the concept/term of business acumen to identify skills and work ethic traits for individuals in the Information Technology community. Specifically, project management was identified as one of the hottest areas for future employment. 
Project managers, according to McAdams (2007), needed to be savvy individuals who were able to communicate effectively, motivate others, multitask, demonstrate interpersonal skills, instill impressions of trust, and demonstrate reliability; and, 5) International studies have also researched the effects or lack thereof of work ethics. For example, Rose [United Kingdom] (2005), studied the effects of increased levels of employee qualifications on the work ethic of individuals and argued that ?more surveys [are needed] to provide data on employee commitment to work, orientation to work, and attachment to work? (p. 153). Additionally, McLeish [Australia] (2002) studied the employability skills needed by small and medium sized Australian enterprises by utilizing interview and focus group research methodologies (see Table 17). 129 Table 17 Employability Skills for Australian Small and Medium Sized Enterprises Personal Values Loyalty, Commitment, Honesty, Positive self-esteem, Enthusiasm, Reliability, and Positive personal presentation Theme Employability Skill Indicators Inter- Personal Skills Communication Listens and understands Speaks clearly and directly Writes clearly Negotiates effectively Reading independently Teamwork Works well with peers, customers, supervisors and support staff Works across different ages Transfers effectively between individual work and team work Knows their own role as part of the team in the work situation Shows cultural sensitivity Initiative and enterprise skills Problem- solving Develops creative solutions Is practical Shows independence and initiative in identifying problems and solving them Problem solves in teams Able to estimate and calculate Understands tables of figures and can interpret graphs Understands basic budgeting Initiative and enterprise Adapts to new situations Develops a strategic vision Learning skills Planning and organizing Manages time, self, and able to work alone Resourceful Makes decisions Understands relationships amongst workplace processes and systems Adapts resource allocations to cope with contingencies Establishes clear project goals and deliverables Allocates people and other resources to tasks Self awareness Has a personal vision and goals Evaluates and monitors own performance Learning Has enthusiasm for ongoing learning Willing to learn in any setting Open to new ideas and techniques Prepared to invest time and effort in learning new skills Acknowledges the need to learn in order to accommodate change Workplace skills Technology Able to relate the use of technology to work Has basic computer skills Willing to upgrade technology skills Willing to use a range of technologies Uses technology to seek, process and present the information Used physical abilities for the application of technology Relevant physical ability to apply technology Source: McLeish, A. (2002). Employability skills for Australian small and medium sized enterprises. Commonwealth Department of Education Science & Training: Australia. 130 Because of the variations in the terminology applied to work ethics by indigenous institutions and international organizations, this study will use the terms of work ethics, employability skills, and soft skills interchangeably, lending emphasis to the ten work ethics previously noted by WorkEthics.Org (2006). The preference for these ten work ethics is indicated as a subset of factors of the work ethic in the Strategic-Impact-Triad Model. 
Furthermore, the crux of the work ethics argument in this study is: what impact on student success might work ethics have? And what institutional practices foster the application of work ethics as a factor to promote student success?both in college and in the workforce? One of the first issues to be addressed is to inquire into the impact generational differences might have on institutional practice and student success in relation to work ethics (Lancaster & Stillman, 2003; Martin & Tulgan, 2002; Raines, 2003; Stillman, 2003; Zemke, Raines, & Filipczak, 1999). The University of Michigan Health Systems (2002) outlined a scenario between a faculty member and a student in which the value systems of the individuals differed. The premise of the scenario was to inform both students and faculty members that these differences were normal with respect to the work ethic value system of each individual. For example, the faculty member was born in the 1950?s, perceived that the student was not dedicated to the job, and was disrespectful; whereas, the student?born in the 1970?s?was self-assured, confident, and perceived her performance as dedicated and efficient. The scenario is played out countless times daily in community or technical college classrooms all across the nation. A summary of the generational differences as noted by the University of Michigan Health Systems (2002) is presented in Table 18. 131 Table 18 Work Ethics of Different Generations Generation Description of Work Ethic Values They Bring to Work Born before the end of World War 1 (1945) Dedicated to the job Are dedicated, hard workers Believe in following rules and abiding by the law Show respect for authority Are patient and do not need instant gratification Born after World War II but before 1960 Ambitious and driven to succeed on the job Have an optimistic outlook Hard workers who want personal gratification from work that they do Believe in self-improvement and growth Born between 1960 and 1980 Want a balance between job and personal life Aware of diversity and think globally Want to balance work with other parts of life Tend to be informal Rely on themselves Are practical in their approach to work Want to have fun at work Like to work with latest technology Born after 1980 Dedicated to the job Have an optimistic outlook Are self-assured and achievement-focused Believe in strong morals and serving the community Aware of diversity As presented in Table 18, generational differences in perceptions of work ethics can have confrontational outcomes. The following texts, written by various consultants and communication specialists, suggested that the generational differences impact the workplace whether these differences are acknowledged or ignored. The references are: 1) Lancaster and Stillman (2003), When Generations Collide: Who They Are, Why They Clash, How to Solve the Generational Puzzle at Work; 2) Martin and Tulgan (2002), Managing the Generational Mix: From Collision to Collaboration; 3) Raines (2003), Connecting Generations; and 4) Zemke, Raines, and Filipczak (1999), Generations at Work: Managing the Clash of Veterans, Boomers, Xers, and Nexters in Your Workplace. 132 The generations defined in the referenced texts are: 1) veterans or traditionalists, born before 1945; 2) baby boomers, born between 1946 and 1964; 3) generation X, or Gen Xers, born between 1965 and 1980; and 4) millennials or GenY, Echo Boomers, or Nexters, born from 1981 to the present. 
Although the lengthy detail of the variances in value systems of work ethics of these generations is beyond the scope of this study, the value of knowing that the differences exist is argued as influential in the success of college students (Haworth, 1997). For instance, community college practices cannot be made in a vacuum. A prime example is an action taken by faculty or administration in the absence of understanding their primary constituents?their students. Lancaster and Stillman (2003) suggested that the millennial generation desires for management, e.g., faculty and administration, to collaborate directly with them as compared to issuing directives for them to follow. For faculty to practice classroom techniques which are heavily directive is to invite discontent on the part of the millennial students in the classroom, thereby potentially negatively impacting student success. As noted by Lancaster and Stillman (2003), the differences in generations can cause ?clashpoints?, or conflicting issues related to values, views of authority, work and communication styles, expectations of leadership, the institutional environment, and work versus leisure. Consequently, one area that the community college must address is the differences in the work ethics of students as compared to the work ethics of the faculty. A major ?clashpoint? might be the differences in value systems or work ethics between students and faculty. This study argued that differences in the perceptions of teamwork on a project is an ideal ?clashpoint? that requires redress to improve the student work ethic of teamwork. 133 As noted earlier by WorkEthics.Org (2006), the work ethics program in the State of Georgia includes ten specific items which are included in classroom instruction. The ten work ethics and their corresponding definitions are shown in Table 19. 
Table 19
Work Ethics Taught in the Two-Year Technical Colleges in Georgia

Attendance: Attends class; arrives/leaves on time; notifies instructor in advance of planned absence.
Character: Displays loyalty, honesty, trustworthiness, dependability, reliability, initiative, self-discipline, and self-responsibility.
Teamwork: Respects the rights of others; respects confidentiality; is a team worker; is cooperative; is assertive; displays a customer service attitude; seeks opportunities for continuous learning; demonstrates mannerly behavior.
Appearance: Displays appropriate dress, grooming, hygiene, and etiquette.
Attitude: Demonstrates a positive attitude; appears self-confident; has realistic expectations of self.
Productivity: Follows safety practices; conserves materials; keeps work area neat and clean; follows directions and procedures; makes up assignments punctually; participates.
Organizational Skills: Manifests skill in prioritizing and management of time and stress; demonstrates flexibility in handling change.
Communication: Displays appropriate nonverbal (eye contact, body language) and oral (listening, telephone etiquette, grammar) skills.
Cooperation: Displays leadership skills; appropriately handles criticism, conflicts, and complaints; demonstrates problem-solving capability; maintains appropriate relationships with supervisors and peers; follows chain of command.
Respect: Deals appropriately with cultural/racial diversity; does not engage in harassment of any kind.
Source: http://www.workethics.org/contact.htm

As indicated in Table 19, the work ethic traits describe the characteristics that employers desire in prospective employees, including community and technical college graduates. When the data in Table 19 are compared to the data outlined in Table 17, several common themes emerge. It should be noted that the data in Table 17 come from a study in Australia directed at employability skills, whereas Table 19 is focused on classroom instruction to support student success as a result of institutional practices; yet both sets of data are driven by workforce initiatives for national and global economic and social improvement. The common traits instilled in the community or technical college and practiced in the workplace are generally identical; conceding that the wording is not exact, the conceptual framework and underlying constructs are, as strongly argued in this dissertation, practically identical: theoretically, educationally, contextually, and ideologically. In total agreement with the study by McLeish (2002), the constructs derived by WorkEthics.Org (2006), and the research conducted by Waggoner (2006), The Conference Board et al. (2006) surveyed over 400 employers across the United States, correlated the educational preparation of high schools, community colleges, and four-year institutions to work ethics, and concluded that:

the findings indicate that applied skills [applied skills refer to those skills that enable new entrants to use the basic knowledge acquired in school to perform in the workplace] on all educational levels trump basic knowledge and skills, such as Reading Comprehension and Mathematics. In other words, while the "three R's" are still fundamental to any new workforce entrant's ability to do the job, employers emphasize that applied skills like Teamwork/Collaboration and Critical Thinking are "very important" to success at work. (p. 9)

According to U.S.
Department of Labor estimates, 80% of workers who lost their jobs do so not because of deficient occupational skills, but because of poor work ethics. Waggoner (2006) lends support to this statistic as she argued for the inclusion of soft skills in the classroom to prepare students for the workplace, e.g., student success. In 135 fact, Waggoner (2006) noted that ?beyond the classroom, a lack of soft skills is more likely to get an individual?s employment terminated than a lack of cognitive or technological skills? (p. 5). If the underlying theme is suggestively true that individuals in the workforce are more likely to lose jobs because of deficient work ethics, then the community college has a new mandate to uncover institutional practices to promote student success in every conceivable facet associated with work ethics. If the community college adopts the ten work ethics as instituted for over 30 years in the State of Georgia (WorkEthics.Org, 2006), the leadership of the community college system of education should ensure that all layers of institutional function are embedded with sound practices of work ethics?to maximize opportunity for every student to achieve to their maximum potential?inclusive of the development, enhancement, and reinforcement of personal and professional work ethics by students and faculty. In the words of Waggoner (2006): But what happens when professors perceive there are individuals in their college classroom who are deficient in their soft skills? When did it enter a professor?s job description to teach students fundamentals of courtesy, social graces, and collegiality while teaching the hard skills of inferential statistics? After all, aren?t professors to teach the specialized knowledge honed in their doctoral programs? Teaching soft skills with the hard skills recognizes that professors are teaching the whole person [student success]. (p. 4) It has been indicated in the literature, studies, and opinions, that work ethics is important to individuals and the future economic survival of the nation. To emphasize the value of each work ethic as promulgated by WorkEthics.Org (2006), each individual ethic will be discussed as a sub-factor of the work ethic in the Strategic-Impact-Triad Model. The discussion for each ethic is intended to establish the ten ethics as a composite baseline to identify the work ethics practices of students and faculty as 136 comparative to assess how these ten ethics impact college student success. One limitation to be noted is that further study is warranted to determine which work ethic might be suggested as the most influential as compared to the other 9 ethics or what grouping has more statistical influence than other groupings: that research is set-aside for future, detailed analysis. Attendance. Students in any type of college course must participate. Participation may be partially defined as attendance, and attendance is one of the most important?if not the most important?work ethic that a student must exhibit. As defined in Table 19, attendance for college students is: attends class; arrives/leaves on time; notifies instructor in advance of planned absence. Even for on-line courses, students must participate, e.g., e-attendance (Smith, 2005). Thus, attendance signifies that if students do not participate via the proverbial process of either in-class or on-line attendance, student success is suspect?not absolute (Marburger, 2001, 2006; Romer, 1993). 
Institutional practice which supports and encourages students to attend and participate is vital to college student success (Brewer & Burgess, 2005; Davidovitch & Soen, 2006; Gump, 2005; Stanca, 2004, 2006). Marburger (2006) conducted an empirical study to investigate the issue of whether mandatory student attendance made a significant difference in student success. The evidence in the study suggested that the enforcement of an attendance policy on absenteeism was beneficial to student achievement. A portion of the data accumulated indicated that daily absenteeism averaged 18.5%, with a range from 8.5% to a high of 44.1% on any given day. It was also noted that Friday had the largest missed class days, and the absences increased gradually as the semester progressed (p. 149). 137 Marburger (2006) maintained daily attendance records to correlate material covered on a specific day during the semester. The two classes in the study were taught by the same instructor, in two different Fall Semesters, during the same time slot (MWF 12:00), and the participants (economics students) were informed of the impact that their attendance would have on the grading in the course, e.g., no attendance policy versus strict enforcement of institutional attendance policy. To address validity and reliability of the parameters, ?both sections were held on fall semesters during the same time slot and taught by the same instructor, and differences in absenteeism could be traced to the enforced attendance policy? (p. 150). To determine specificity in correlation-of-absenteeism to student success, the material taught on each day was recorded and mapped to individual student attendance records per day, per class. Multiple choice exams given to students in both classes?no attendance policy and enforced attendance policy?resulted in indicators suggesting that students who attended class more frequently performed slightly better on the exams than did students who were more inclined to participate in ?nonacademic uses of their time? (p. 148). Accounting for variances in the independent variables of age, gpa, oncampus, localresident, credithours, workhours, wednesday, friday, and nopolicy (p. 151), was intended to address the differences in absenteeism. The outcome of the study by Marburger (2006) meshed with Romer (1993) suggesting that the body of literature on the relationship between absenteeism and student performance indicated an inverse relationship. Romer?s (1993) contention was that there was a significant link between absenteeism and student learning or success; Marburger (2006), although supporting Romer?s (1993) study, suggested that ?whereas the 138 relationship between a mandatory attendance policy and learning is statistically significant, the impact does not appear to be substantial? (p. 154). Therefore, it is imperative that research be conducted in the community college to ascertain the impact of attendance on student success as a work ethic practice. Furthermore, to relate institutional practice which promotes or hinders student success as an outcome of the work ethic of attendance also needs investigation. Additional studies also generally supported the work of Marburger (2001, 2006) and Romer (1993). 
Gump (2005) argued that a strong negative correlation is suggested between absences and a student?s final grades; Brewer and Burgess (2005) conceded that faculty have a role to play in the attendance of students: ?when college students are not motivated in a particular class, a common outcome is a lost desire to attend class, followed by frequent absences and plummeting grades? (p. 24); Davidovitch and Soen (2006) argued from a different attendance perspective: they countered that student attendance may have serious consequences for the performance evaluation of instructors. ?The vast majority of academic institutions make use of student evaluations for any and all of the purposes cited above, without taking into consideration a possible relationship between student attendance in a particular course and student evaluation of the course instructor? (p. 693); and, Stanca (2006) countered that attendance indicators omit ?unobservable individual characteristics, such as ability, effort, and motivation? (p. 252) and that motivated students who succeed, even when not attending regularly, ?implies that estimates of the impact of attendance on academic performance are likely to be subject to omitted variable bias? (p. 252). 139 According to WorkEthics.Org (2006), the work ethic of attendance is taught in the classroom as a vital component of student success. The institutional practice of attendance is perceived differently by faculty and students, and therefore, has need of study to assess the impact on student achievement in college courses, the attainment of educational goals, and even in the workplace. Attendance, as argued in this study, should be a positive correlation between college student success and success in the workplace. Character. WorkEthics.Org (2006), as noted in Table 19, defines character as: displays loyalty, honesty, trustworthiness, dependability, reliability, initiative, self- discipline, and self-responsibility. How might this work ethic characteristic and its interrelated components impact student success in college? First, the attributes for character in this definition are many and give rise to future separate or co-related studies for each item or grouped items; secondly, a modicum of common sense is inherently assumed that community college students and faculty are individuals who possess and carry out the attributes noted in the character definition; that individuals of character are more likely to be successful in the classroom and life than those individuals who exemplify few?if any?of the attributes indicated. Thirdly, as argued by Anderson (2000), character education in an ideal world is a collaborative and harmonious effort and the responsibility of families, schools, and communities; however, in reality, the consistency of the collaborative effort does not exist. In terms of the community college classroom as an avenue to offer character development, Anderson (2000) noted that ?the classroom could be one arena to reinforce, model, and practice positive character traits on a daily basis; therefore, the teacher is central to character education. The processes (classroom strategies utilized and environment created) within the classroom are critical? 140 (p. 139). Conversely, the first line of defense in the development of community college students? character is in the home with the parents long before the college provides open- door acceptance for the student. 
Cordry and Wilson (2004) analyzed the hours parents had with their children from birth until day one of school, including the hours that schools have with students. The data were: 1) by the age of five, parents will have 43,800 hours with the child; 2) each school year the school will spend 1,260 hours with the student, and the parents an additional 7,490 hours; 3) by graduation, the student will have spent 16,380 hours with teachers and 97,370 hours with parents; and, 4) the ratio of parents to teacher is 76% to 24%, respectively (p. 56). As noted by Cordry and Wilson (2004), ?active parental involvement improves student morale, attitudes, and academic achievement; thus, by taking on an active role, parents reduce their child?s risk of failure academically and reduce the chances of dropping out before graduation? (p. 57). Parental involvement is more likely to develop in individuals the character traits of respect, responsibility, fairness, and hard work?character traits needed by community college students (Anderson, 2000) and those taught in the WorkEthics.Org (2006) program. Assuming that parental involvement is a variable that is not considered in the process of a student?s application to attend the community college, the inclusion of teaching the work ethic of character?by action and word?becomes even more important to student success. A selected example: suppose that students who have misguided scruples due to a lack of parental involvement (or other factors) decide to perform unethical practices during exams, labs, or other necessary development endeavors. Do these acts of cheating or unethical behavior interfere with student success? 141 According to Rudebock (2005), Puka (2005), and Sterngold (2004), cheating is a serious problem and harms students in potential ways that cannot always be measured by scores; aside from cheating, more serious unethical behavior by students is harmful to the college, other students, and society. Puka (2005) suggested that in the context of world problems, ?the ethical problems of college life are small? (p. 32). However, Puka (2005) cited date rape, racism, homophobia, sexism, drug use, suicide, theft, and vandalism as serious unethical problems on college campuses. The counterpoint of Puka (2005) on the issue of cheating/dishonesty is: Most faculty and administrators, however, rate academic dishonesty a high crime, fatal to education. Obviously, cheating is wrong: an affront to learning and self-integrity. But even where cheating is widespread, seeming to threaten the educational mission of a university [or community college], its touted harms do not stand scrutiny. Cheating need not decrease overall learning at college. Largely this is because learning and test-achievement do not correlate well; tests are not very good measures of the learning process. Thus, to cheat on tests also is not automatically to cheat oneself as a learner. (p. 32) One additional contention of Puka (2005) is to note that some faculty and administration identify problems in the character of students while they fail to see the character flaws they themselves possess. If true, do such dichotomous actions create perceptions in the eyes of students that pose problems for students to succeed in college? In other words, if students and faculty perceive their own character traits as acceptable or higher, while one or both actually have character faults, will the character faults interfere with student success? 
This dissertation argues that variances in character are to be expected; however, negative character traits are more likely to harm student success in terms of a student's outlook on life than in terms of the student's academic achievement. However, if negative character traits include unethical or similar actions by faculty, students, or both, the level of academic achievement may be suspect. Or are these perceptions simply a matter of disharmony that does not, in fact, interfere with how students achieve in the community college? Sterngold (2004) noted the data of the 2003 National Survey of Student Engagement, in which 87% of college students who took the survey responded that they knew of someone who had used an Internet source without giving proper credit. Conversely, Sterngold (2004) also reported that only two-fifths of students in another study admitted to cheating. In short, to study the character of students is to investigate a host of variables related to how the character of students may interfere with their success in college. In light of the national study by The Conference Board et al. (2006), when employers across the nation are concerned about the quality of the future workforce in terms of the work ethic of the individual and not overly concerned with the technical skills of the same individual, there is cause for concern and for promoting student success through the inclusion of work ethics in the classroom.
For the community college, the development of student character is tentative at best; however, institutional practices which are easily observed by students, both in the classroom and in the halls of the institution, may prove to have desirable outcomes for student character modification. It is understood that students should not cheat, lie, or commit other unethical acts. Furthermore, it is assumed that faculty will also demonstrate the same positive ethical character traits that are expected of students attending classes in the two-year college systems across the nation. To understand how students and faculty express their respective differences in regard to the work ethic of character is valuable in the overall process of promoting work ethics to further the success of community college students, and is particularly valuable to the future employment tenure of graduating students.
Teamwork [and collaboration]. The creation of student teams and the collaboration within the group is an attempt by faculty in the community college to improve the work ethic of team members; moreover, student projects are intended to mirror group projects in the workplace (Hansen, 2006; Strom & Strom, 1999; Strom & Strom, 2002; The Conference Board et al., 2006). According to Tarricone and Luca (2002, p. 54), "skills such as problem solving, communication, collaboration, interpersonal skills, social skills, and time management are actively being targeted by prospective employers as essential requirements for employability." Hansen (2006) argued that teamwork includes many tangible benefits for students: 1) the hands-on approach to learning in teams resulted in greater active learning, with increased comprehension and retention of information; 2) higher levels of student motivation and achievement were observed; 3) improved communication skills; and 4) stronger interpersonal and social skills.
Furthermore, Hansen (2006) suggested that active skills development was an improved method in fostering team skills development than the traditional lecture-style type of teaching method. Teamwork, therefore, is the co-ethic to the following: ?No man is an island, entire of itself? because I am involved in mankind?whereof I am a member? ? (Alford, 1839, p. 574). Members of a team cannot successfully function as islands; student teams are not exempt from the mainland, as are team players in the world of work. The community college classrooms and online learning activities should be 144 precisely geared to teamwork in every facet of the educational process to promote student success. Two definitions of teamwork are correlated here for emphasis: 1) WorkEthics.Org (2006) defined teamwork in Table 19 as: respects the rights of others; respects confidentiality; is a team worker; is cooperative; is assertive; displays a customer service attitude; seeks opportunities for continuous learning; demonstrates mannerly behavior, and 2) McLeish (2002), in Table 17, defined teamwork as: works well with peers, customers, supervisors and support staff; works across different ages; transfers effectively between individual work and team work; knows their own role as part of the team in the work situation; and, shows cultural sensitivity. The attributes as noted by WorkEthics.Org (2006) and McLeish (2002) are consistent themes of collaboration, consideration, and goal-orientation. As a merged set of constructs related to teamwork, WorkEthics.Org (2006) and McLeish (2002) accentuate the need to correlate classroom and workforce methodology of teamwork as promoting success for students while in college and subsequent to graduation. However, Strom and Strom (2002) identified two specific problems associated with teamwork in the community college: 1) what is the process in assessing performance of individuals in group work; and, 2) the design of effective assignments that guide students to actively practice the specific teamwork skills they are expected to learn. Additionally, Strom & Strom (1999) argued ?group success depends on individual accountability? (p. 172), which is consistent with the findings of McLeish (2002), The Conference Board et al. (2006), National Association of Manufacturers (2005), the U.S. Department of Labor, Bureau of Labor Statistics (2000), and Hughes and Karp (2006). 145 In other words, the group is only as strong as the accountability and teamwork ethic of the members who comprise the team; and, the individual is not above or beyond the strength of the combined teamwork ethic or collaborative resources of the team. As previously noted by Powell (1989), some of the factors [individuals] are pretty darn good; nevertheless, the whole [team] is strategically dependent upon its parts [accountability of team members]?those that are ?pretty darn good? (Powell, 1989, p. 490) and those that are not: this conceptual model is precisely applicable to teamwork in the community college. Furthermore, the conceptual framework offered by Powell (1989) is a succinct guide to community college leaders, faculty, and students in assessing the work ethic of teamwork to improve student success in college and as a method to ?hedge? the future transition from college to the world of work. It is argued in this study that success in the workforce is an indicator of previous success in college or workforce training. 
Individual success as an outcome of college student success translates into a brighter outlook for work and life. To assess the work ethic of teamwork for both student and faculty member in the community college is to address the impact that this attribute has on student success. Consequently, the need to investigate institutional practices specific to student and faculty teamwork work ethics is suggested as beneficial to improving the success of students. As argued by Hansen (2006), "Teams and teamwork have been long used by business and, over the years, much has been written on the subject specifically examining the development and use of teams in college to help prepare students to be productive members of work teams" (p. 11).
Appearance. A controversial topic, student appearance in the classroom is subject to interpretation. For example, McLeish (2002) refers to this attribute as how the individual presents himself or herself to the public, whereas WorkEthics.Org (2006) considers appearance as: displays appropriate dress, grooming, hygiene, and etiquette. The dichotomy in this attribute is captured in the questions posed by many community college faculty members: Does student appearance impact student success, and are students aware of how to dress for job interviews, meetings, and/or group presentations? Juhnke et al. (1987) argued that appearance does have an impact but that the type of assistance is situational. For example, "variations in dress and facial features and traditional manipulations of attractiveness seem to have different effects in different contexts, and the appropriateness of appearance in the situation in which it is encountered will be an important influence on [obtaining help] helping" (p. 318). For the community college student and faculty member, does student appearance translate into the type and amount of support faculty tend to provide to students either in class or outside class? Granted, students should be reasonably dressed for class and present themselves with "appropriate dress, grooming, hygiene, and etiquette" (WorkEthics.Org, 2006, http://www.workethics.org/). However, does the appearance of a student promote or hinder their success in terms of support from other students, faculty, or the institution? A search on Amazon.com for dressing for success on the job resulted in numerous books on how to dress for the job and how appearance is important to employers. For the community college graduate, dressing appropriately for the job involves understanding the type of work, from technical occupations to professional employment. Regardless, dressing for success is important in the workforce. Nevertheless, is the appearance of a student in class or on campus going to impact the success of a student? Will showing up in "baggy clothes" prevent a student from achieving stated educational goals? To answer this question as related to student success is to seek the perceptions of students and faculty as a tool to assess how appearance impacts student success, with full knowledge that organizations assume that graduates have a viable sense of ethical appearance for the appropriate job type and level. Student appearance is a matter of high controversy. The larger issue is to assess the impact that student dress has on the success of a student, specifically related to academic success and life-long learning. Thus, the need to assess student and faculty perceptions of appearance as a factor impacting student achievement warrants further investigation.
In particular, faculty and students may have significant perceptual variances in terms of student appearance, as noted in the example below:
Several weeks ago, I was at juvenile court monitoring a student disciplinary action for a school district. A fifteen-year-old boy had been called before the judge on a breaking and entering charge. He was wearing a black concert t-shirt. On the back of the shirt was the Grim Reaper, his skull grinning from under a black velvet hood, holding his traditional scythe in one bony hand, and reaching around to molest the virtually nude woman standing in front of him. As the boy was trying to convince the judge he was innocent, I leaned over to the assistant district attorney sitting beside me and with a very knowing air, I whispered, "You know, if I were that boy's attorney, I don't believe I would have recommended wearing that particular shirt this morning." "Oh, that's nothing," responded the D.A. wryly. "You should have been here last week when a girl on a D.W.I. charge was wearing a Budweiser t-shirt." ... What students wear has become a major issue in the nation's public schools [and colleges and universities]. (Gilbert, 1999, p. 3)
Attitude. WorkEthics.Org (2006), in Table 19, defines attitude as: demonstrates a positive attitude; appears self-confident; has realistic expectations of self. McLeish (2002) does not use the label of attitude, but rather refers to attitude as self-awareness and positive self-esteem. Can it be fathomed that the attitude of a student, or faculty member, could conceivably impact the success of a student? As noted by Horn, Nevill, and Griffith (2006), the majority of community college students are employed, have families, and are less prepared for college than students who attend four-year institutions. Consequently, do these additional duties of life create a situation where students do not have positive attitudes or normal self-expectations? Should community colleges be interested in the attitudes which students have of themselves, the college, fellow students, or life in general? And do these attitudes and self-evaluations support the success of students? A study conducted by Miller, Pope, and Steinmann (2005) partially responded to these questions: "For institutions to be responsive to student needs, and to better understand how students view themselves in relation to the institution and institutionalized outcomes, there must be considerable work done to identify student characteristics, attitudes, beliefs, patterns of behavior, and other general perceptions" (p. 598). A similar study concluded that "the largest contributors to student satisfaction and success were the caring attitude of the instructor and the supportive environment created by fellow students" (McKinney, McKinney, Franiuk, & Schweitzer, 2006, p. 281). If the community college desires to understand the attitudes of students as those attitudes impact student success, institutional practices must be established to achieve two goals: 1) identify student attitudes, both positive and negative; and 2) capitalize on the positive attitudes as a means to improve the negative attitudes espoused by other students. Measuring the perceptions of students and faculty related to attitudes is one method of assessing how the attitude work ethic influences student success in the context of institutional practices. Negative student attitudes can interfere with student learning.
For example, students who are prone to perceive an instructor as not caring about them will not view the course requirements the same way as students who have a positive attitude about the course and the instructor (Braxton, 2006; Miller, Pope & Steinmann, 2005).
Productivity. Hill and Petty (1995) conducted a study and found that when the American worker was compared to international counterparts, the American employee was generally viewed as less productive and as lacking the strong work ethic exhibited by many off-shore workers. Furthermore, The Conference Board et al. (2006) conducted a major study of 400 employers nationwide. One of the outcomes of the study related to concerns over productivity was: "Over the next five years employer respondents expect to reduce their hiring of high school graduates and increase the hiring of post-secondary educated workers" (p. 58). The implication for community colleges is that it is more vital now than ever to establish institutional practices to help students learn the value of productivity, as fewer high school graduates will be able to find employment without at least a community college degree or postsecondary technical training. Hamilton-Attwell (1998) supported the contention that an organization is partly responsible for ensuring that individuals who possess a solid set of work ethics have the opportunity to practice those ethics, as measured by the level of productivity achieved. In other words, if individuals who have a valued sense of work ethics encounter an environment which does not support applying those ethics to promote positive work behavior, they will be less successful, and not as a result of any lack of work ethics on their part. The community college has little recourse to deny the suggested outcome of the study by Hamilton-Attwell (1998).
Yankelovich (1982) and Hamilton-Attwell (1998) separated the idea of the work ethic from that of work behavior: "...it is important that we remain aware of the differences between work behavior – what people do in the workplace – and work ethic – a set of beliefs and perceptions about work" (Hamilton-Attwell, 1998, p. 79). If work ethics and work behavior are indeed dichotomous in nature, then the community college should be aware that investigating this variance yields valuable data in regard to establishing institutional practices to measure both outcomes. If students are viewed as being less productive in their course work, it may be a result of inaccurate beliefs or perceptions about work. If this were the case, the community college would have insight into the actions of students and could create practices that address the differences between the ethic and the behavior of the student in order to promote and improve student success. In the words of Hamilton-Attwell (1998), "is there a link between work ethic and productivity? Yes, but a sound work ethic among employees [and students] will not necessarily lead to productivity improvement – only if the employees [or students] experience a benefit of their behavior will they use it" (p. 86).
In other words, to increase the productivity of students in terms of the quantity and quality of the work they produce, research has opened the doors of knowledge: the community college has a mandate to provide the infrastructure to support student success through the intentional institutional practices it employs to improve student achievement, vis-à-vis productivity (Hamilton-Attwell, 1998; Johnson, 2007). Pierson and Holmes (2007) conducted a comparative study related to how students perceived their own respective work ethics as they prepared to enter the world of work. Embedded in this study was the relationship between the work ethic of the student and the outcome of their efforts in the workforce, e.g., the productivity of the graduates once they are employed. Pierson and Holmes (2007) noted the following: "Clearly, since work consumes such a monumental portion of each of our lives, shouldn't attitudes and values regarding it be studied?" (p. 2). The study used the Occupational Work Ethic Inventory [OWEI] (Petty, 1993) to measure the expressed work habits, attitudes, and values of future employees, with specific comparative analysis between genders. As an outcome of the study, Pierson and Holmes (2007) suggested that:
This research concludes that graduating seniors at one rural State University have a favorable self-perception of their occupational work ethic. It also helps to dispel the notion of stereotypical negative character traits that have been attributed to today's young people... Results from this study should strengthen university faculty and administrators' confidence that graduates are leaving college armed with a strong attitude toward work. Business and industry should be pleased that employees they hire right out of college really do know the meaning of a day's work for a day's pay. (p. 8)
Although the work by Pierson and Holmes (2007) is in reference to the university setting, community college students are viewed less favorably in the study issued by The Conference Board et al. (2006). Nevertheless, the research by Pierson and Holmes (2007) suggested one solution to the major concerns noted in the study by The Conference Board et al. (2006). Based on the relationship between the research of Pierson and Holmes (2007) and The Conference Board et al. (2006), the community college was informed of the value that research plays in forming policies and practices that significantly enhance the success of college students in terms of work ethics, productivity, and technical skill sets. Use of the Occupational Work Ethic Inventory [OWEI] (Petty, 1993) is strongly suggested in this dissertation as an incentive for leadership and policy-makers to become involved in a culture of inquiry and to model significant changes that promote student success, e.g., enrollment, retention, improved work ethics, academic preparation, institutional support, and graduation rates (Bailey et al., 2005b; Bailey et al., 2006; Horn, Nevill & Griffith, 2006; Jacoby, 2006). This study seeks to better understand how students and faculty, separately and collectively, perceive the work ethic of productivity. If students and faculty are in agreement that students are highly productive, the measured perceptions of the two groups should align, and the data should reflect that positive correlation; a minimal illustrative sketch of one way such group perceptions might be compared follows.
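The sketch below is not drawn from this dissertation's instrumentation or data; it is offered only as a hedged illustration, in Python, of how mean perception ratings from two survey groups might be compared. The group labels, the invented Likert-scale values, and the choice of Welch's independent-samples t-test are all assumptions made for the example.

# Hypothetical illustration: comparing student and faculty perceptions of the
# "productivity" work ethic on a 5-point Likert scale. All values are invented.
import numpy as np
from scipy import stats

# Invented survey responses (1 = strongly disagree ... 5 = strongly agree)
student_ratings = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4])
faculty_ratings = np.array([3, 3, 4, 2, 3, 4, 3, 2, 3, 3])

# Welch's independent-samples t-test: do the two group means differ?
t_stat, p_value = stats.ttest_ind(student_ratings, faculty_ratings, equal_var=False)

print(f"Student mean: {student_ratings.mean():.2f}")
print(f"Faculty mean: {faculty_ratings.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value (e.g., below .05) would suggest that students and faculty
# perceive this work ethic differently; a large p-value would suggest alignment.

In practice, the appropriate statistical test depends on the survey design and scaling; the point of the sketch is simply that, once collected, perception data from the two groups can be compared directly.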
The results of the data should then be instrumental in initiating institutional practices which regularly assess the practice of productivity for both students and faculty members (Hamilton-Attwell, 1998). Consequently, leadership within the community college should seek every opportunity to address each and every potential misaligned practice to support and actively promote student success (Johnson, 2007; VanWagoner, Bowman & Spraggs, 2005). Organizational Skills. Students who are proficient at organizational skills are more likely to practice the following: manifests skill in prioritizing and management of time and stress; demonstrates flexibility in handling change (WorkEthics.Org, 2006); or manages time, self, and able to work alone, resourceful, make decisions, understands relationships amongst workplace processes and systems; adapts resource allocations to cope with contingencies; establishes clear project goals and deliverables; and allocates individuals and other resources to tasks (McLeish, 2002). Provided the community college implemented practices to measure the outcomes of students in terms of their organizational skills, the main effect of such actions within the college are suggested as instrumental in creating a baseline policy structure to improve student success; for students who have little or none of the characteristics of the organizational skills noted by WorkEthics.Org (2006) or McLeish (2002), the community college would be well 153 positioned to teach organizational skills to students to promote success. Moreover, students are less likely to be receptive to institutional nonchalant attitudes: ?I?m proud of my organizational skills; I love to tell other people what to do? (www.cyberslayer.co.uk). Bakunas and Holley (2004) argued that organizational skills need to be taught in the classroom. Instances of disorganization were cited: students came to class without pen or pencil; notebooks were in total disarray, leaving a trail of paper as the student migrated from class-to-class; or constant forgetfulness of important items. Although the study by Bakunas and Holley (2004) was related to students in earlier grades or in high school, they have provided an essential guide for community colleges who may be interested in developing organizational skills for their students. The key elements are: 1) help students understand that they are responsible for materials for their work, such as pencils, paper, notebooks, etc., as this correlated directly to community college students; 2) provide guidance as to how students might organize their materials, readings, group work, binders, etc, as this correlated directly to community college students; 3) demonstrate how to take efficient study notes and important information to become studious with the material under investigation, as this correlated directly to community college students; and, 4) provide students with the tools necessary for them to understand how their actions translate into organizational behavior, as this correlated directly to community college students. Institutional practices to identify students who have difficulty with organizational skills are a matter of intentional application. The practices include, but are not limited to, survey methods, interviews, orientation sessions, and profile analysis. 
Unless the leadership of the community college is adept [and willing] at looking at the strategic 154 layers of influence on student success, assessing what students need in the way of organizational skills may be difficult to identify and more difficult to implement as effective practices to support student success. Urso and Sygielski (2007) argued: ? community college students are capable of making successful transfers to four-year colleges or universities. In order to achieve this substantial goal, many students had to learn how to use their work ethic(s) to provide for themselves adequate resources or become self-managed. One particular case-in point: individuals like Mary Ann and Tony are exceptional time managers. Both students had to develop those skills because of the many responsibilities associated with studying, working and raising a family. Understanding what their instructors expect of them, they are able to allocate the appropriate amount of time for studies, student life and personal responsibilities. (p. 16) Organizational skills in the workforce are important to the success of college graduates, particularly community college graduates. As suggested by ContinuingEducation.com (2007), organizational skills are not the same for all individuals. Moreover, a good work ethic of being organized helps [students and] employees to be more productive, feel that work is structured and progress is made, and creates a sense of order in their lives. Furthermore, it is suggested that being organized is efficient because it helps individuals maintain predictability in their work environments, e.g., items in places when and where needed, efficient output, enhanced productivity. Community colleges should recognize that many students are not necessarily passive learners (Marshall, 2007). Armed with the knowledge that students are actively engaged in learning is incentive for community colleges to facilitate processes and practices which meet the needs of students in specific terms of their individual success in college, including the provision of practicing and teaching organizational skills as a workforce ethic needed by business and industry (The Conference Board et al., 2006). 155 Moreover, acquisition of data related to students and faculty who are effectively organized will provide a baseline from which other students and faculty might measure their own respective level of organizational skills. Data which are obtained from student and faculty perceptions indicate the significant relationship of how each group might react to the other in terms of organization, planning and educational outcomes. Communication. Communication skills or oral communications is indicated in the literature as a profound work ethic skill needed by graduates at all levels of education (Emanuel, 2005; The Conference Board et al., 2006). In fact, the study conducted by Emanuel (2005), argued that ?Good communication skills fuel self-confidence and enable people to exert more control over their lives. Such people know how to effectively research, conceptualize, organize, and present ideas and arguments?speaking skills are more important to job success than are specific technical skills? (p. 153). Emanuel (2005) also asked the question of whether community college students are prepared for the workforce as indicated by their communication skills. Crawley and Klomparens (2000), conducted a study of 500 Ph.D. 
alumni from Michigan State University in the years from 1982?1993 and found that there was a short-list of skills most likely to promote a successful career. These skills were: conflict resolution, communication, and teamwork. Consequently, although these skills are identified from Ph.D. alumni at a major university with national recognition, these skills are excellent indicators for community college students to improve their success in college and in the workforce upon graduation. As indicated in Table 17, McLeish (2002), defined the communication work ethics as: listens and understands, speaks clearly and directly, writes clearly, negotiates 156 effectively, and reading independently; similarly, WorkEthics.Org (2006, Table 19) defined the communication work ethic as: displays appropriate nonverbal (eye contact, body language) and oral (listening, telephone etiquette, grammar) skills. As noted in these definitions, there is a strong correlation among all the studies identified, e.g., Crawley and Klomparens (2000), The Conference Board et al. (2006), and Emanuel (2005). At issue for community college leaders and policy-makers is: if communication skills are a work ethic of the magnitude as noted in the body of research on communication skills (ACT, 2006b; Crawley & Klomparens, 2000; Emanuel, 2005; The Conference Board et al., 2006; et al.), what institutional practices might be considered as a critical-mass outcome to support student success in terms of communication skills? If communication skills are defined as utilizing writing and oral skills to communicate effectively, then assessing the skill level of entering students is vital to student achievement. For example, remedial education has come under fire in recent times as a matter intended for the community college and not four-year colleges or universities (Bettinger & Long, 2005; Jenkins & Boswell, 2002; McJunkin, 2005; Perin, 2006). Provided the accuracy of the remediation movement, community colleges should attempt to practice methodologies which effectively measure student academic preparation in basic skills as a pre-college characteristic, and support present-college activities for maximization of student success. As a result, the level of efficient institutional practices to move students from remediation to credit classes will promote student success, including the communication skills work ethic (McJunkin, 2005; Soliday, 2002). 157 For this study, the assessment of perceptions between faculty and students in terms of communications skills is critical to student success. For example, faculty who perceive that students are [or should be] prepared for college-level work will establish lesson-plans or lab activities relevant to the course of study. However, if students entering the doors of the community college perceive themselves as prepared for college- level work, but who in reality are not academically prepared, the outcomes of the work performed by students will not reach a level sufficient to be successful at college-level work. Therefore, students may achieve a minimal passing grade, but what is the reality of the outcome? As previously noted in this study, student success is not solely about achieving a degree or stated educational goal. Success also includes the ability of students to improve their skills at being better communicators (ACT, 2006b; Crawley and Klomparens, 2000; Emanuel, 2005). Improved communication skills impact the ability of students to present findings relative to coursework and work. 
If a student cannot understand what is read and communicate the material effectively, the student will achieve lower levels of knowledge in the field of study or the ability to perform in the workforce at levels consistent with the expectations of employers (The Conference Board et al., 2006). Consequently, community colleges must establish a methodology to assess the perceptions of students and faculty so that the understanding of both groups is aligned to maximize the success of students. Two independent mindsets regarding communications skills may lead to frustration and reduced levels of student achievement. Cooperation. A search of an online thesaurus service, using Yahoo! Education, indicated several synonyms for cooperation: co-action, collaboration, synergy, teamwork, 158 affiliation, alliance, association, combination, conjunction, connection, and partnership. According to Wikipedia, the idea of cooperation is born out of the necessity to form cohesive groups to support the success of human interaction. Students and faculty in the community college are a prime example of human interaction. Palmer (2000), in How Community Colleges Can Create Productive Collaborations with Local Schools, suggested that college students are best served when productive partnerships are established between the community college in the service area and local schools. Moreover, Kuh, Kinzie, Schuh, and Whitt (2005b) suggested that what works at one college may not be ideal for another college: nevertheless, ?the absence of such a blueprint and the fact that many roads lead to student success are, in fact, good news for those who desire to enhance student learning and engagement at their own institutions? (p. 21). At the heart of engagement is a cooperative and collaborative spirit to promote student success (Johnson, 2007; Woods, 2007). WorkEthics.Org (2006) defined cooperation as: displays leadership skills; appropriately handles criticism, conflicts, and complaints; demonstrates problem-solving capability; maintains appropriate relationships with supervisors and peers; follows chain of command. McLeish (2002), laces the attributes of cooperation throughout the employability skills required by employees. For example, interpersonal skills included works well with peers, customers, supervisors, and support staff, whereas initiative and enterprise skills included problem solving in teams. How might the community college interpret the application of cooperation in the work ethics practices of the institution? Strom and Strom (2002) suggested that students participate in cooperative learning groups: 159 The effects of student participation in cooperative learning groups are well known. Researchers commonly report student gains in problem solving skills, more favorable attitudes toward education, increased willingness to try new and difficult tasks, an enhanced sense of belonging, greater appreciation for persons of other ethnic backgrounds, reduction of misbehavior, and better relationships with classmates. Students also grow from listening to the viewpoints of others, encouraging teammates, showing empathy, negotiating conflict, and making an effort to help peers understand lessons. (p. 315) Furthermore, Strom and Strom (2002) recommended that faculty become aware of the factors associated with the Collaboration-Integration Theory (CIT), which is implemented through Cooperative Learning Exercises and Roles (CLEAR). 
The Collaboration-Integration Theory finds its application in the goals of CLEAR: 1) encourage students to acquire an active role in their learning; 2) guide the student into collaborative actions when working in groups; 3) enable all participants to participate in the group effort; 4) ensure that students have a chance to contribute a unique position relative to the group consensus; 5) reduce boredom from passive activity by allowing all students the opportunity to function by differentiating member roles; and 6) allow sufficient contact in the group setting so that fair and effective peer evaluations may be conducted. For faculty in the community college, the Collaboration-Integration Theory (CIT), as practiced through the goals of the Cooperative Learning Exercises and Roles (CLEAR) process, offers excellent suggestions for improving student success by modeling cooperative and collaborative practices. Should faculty consider becoming a member of the team to promote greater student success? The answer is a resounding yes, as participation becomes an effective instructional practice that encourages an enhanced level of cooperative trust between students and faculty. Cooperation and trust are important ingredients in the attributes sought by employers in the U.S. and Australia (McLeish, 2002; The Conference Board et al., 2006; National Association of Manufacturers, 2005).
Respect. The final work ethic noted by WorkEthics.Org (2006) is respect, defined as: deals appropriately with cultural/racial diversity; does not engage in harassment of any kind. McLeish (2002) refers to respect as: works across different ages and shows cultural sensitivity. Within the community college, one practice that should be implemented in terms of respect is to determine whether students and faculty apply this work ethic in daily relationships. An important step, then, is to correlate respect in the college setting with respect in the workforce. In other words, students who do not practice respect for others in college are less likely to become employees who practice respect for customers or fellow employees. Miley and Gonsalves (2005) conducted a study which investigated the relationships that students perceived existed in the classroom. For example, students were surveyed to assess their perceptions of the teaching attributes of faculty. The results of the study indicated that students and faculty have very different perceptions of instructional teaching styles and, furthermore, that faculty "also may have misconceptions of what students perceive as good teaching" (p. 20). Moreover, Gorko et al. (1994) studied relationships between students and faculty and concluded that students desire more equality and respect from faculty; conversely, faculty perceived that students need order in the classroom, want to be entertained, feel that faculty should be a pillar of virtue, and believe that faculty and students should maintain a less-than-distant relationship, e.g., a sort of buddy relationship.
Appleby (1990) also examined the relationship between faculty and students. The outcome of the study indicated that faculty perceived immature and inattentive students as a problem for classroom effectiveness, whereas students viewed faculty who were short on empathy and poor communicators as a detriment to their success in the course. As noted in the studies by Appleby (1990), Gorko et al.
(1994), Miley and Gonsalves (2005), and Walsh and Maffei (1994), the relationship between faculty and students is a matter of perceived mutual respect. What might these four studies suggest to community colleges in terms of student success? First, respect in the classroom is a prerequisite to student learning. Prensky (2006) argued that faculty and students share a common bond in the classroom, namely? the teaching-learning domain: ?With such an atmosphere of mutual disrespect festering in our classrooms, learning is becoming increasingly difficult. Before you can teach or learn from someone, you need to genuinely respect them? (Prensky, 2006, p. 96). Second, Phelps (2006) suggested that ?teachers who use respect as a behavioral norm desire to serve students actively? (p. 70). To serve students actively includes understanding students? own sense of respect within the context of beliefs, attitudes, and actions. To understand both the student and faculty work ethic of respect is to inform the community college of actions to be taken to improve student success. Third, Eric Chester (2005) a noted business consultant, in Getting Them to Give a Damn, describes the younger generation of workers. In the description of the attitudes and ethics of these workers, Chester (2005) provides a unique perspective on respect based on years of experience: 162 And don?t think for a moment that your kidployees [or students] don?t value respect themselves. On the contrary, they know all too well what respect is and, more importantly, the power it holds? [they] crave respect [and] will go to great lengths to get it, but when it comes to giving respect, you might find them stingy. They won?t automatically respect you simply because of your age, position, or title. They don?t want to yield their power or put you in a position of control over them. In a strange reversal of the traditional dynamic between youth and age, they believe that they?re owed respect automatically?but that you have to prove that you?re worthy of their consideration. (p. 22) Community colleges have access to significant research to guide the process to revamp institutional practices as related to the work ethic of respect between students, faculty, and administration (Appleby, 1990; Chester, 2005; Gorko et al., 1994; Miley & Gonsalves, 2005; Phelps, 2006; Prensky, 2006; Walsh & Maffei, 1994). Moreover, for the community college to improve student success is to assess the impact that respect has on student achievement, particularly as it implicates the relationship between faculty and student. As a significant element within the Strategic-Impact-Triad Model, work ethics is a critical factor to the success of future employees. The Conference Board et al. (2006) indicated that ?Professionalism/Work Ethic, Teamwork/Collaboration and Oral Communications are rated as the three most important applied skills needed by entrants in today?s workforce? (p. 10). Because the community college is poised as the bridge between high school and work or between high school and a four-year degree, the impetus for understanding how work ethics impacts student success has never been more relevant than at any time in the historical context of the educational system in the United States (Dicroce, 2005; Franco, 2002; Kuh et al., 2006; Robbins et al., 2004; Smith, 2005). 
Institutional practices which promote a solid set of work ethics are noted in the literature as significantly valuable to individuals, not only as community college students but also as members of society and future workforce participants. Perceptions from students and faculty related to work ethics can inform community college leaders and policy-makers about the influence work ethics has on student success, thereby initiating a mandate for improvement in the community college, specifically in terms of institutional practice and support. In some circles related to this study during the literature review phase, faculty members commented that if a student attends the community college without the prerequisite work ethics already established as part of the psyche of the individual, there is little hope of changing the outlook of the student. Conversely, there is sufficient research to support the contention that the work ethics of the student are extremely important for success in life and work, and are "teachable." Subsequent to findings in the literature, community colleges should initiate methods and practices to achieve the following: 1) assess the work ethics of students upon arrival; 2) implement practices and support structures to improve the work ethics of students; and 3) assess the results of the work ethics practices and support structures to determine their impact on student success, e.g., improved outlook on college, life, and work. As previously noted, Cohen's (2005) investigation of practitioners and researchers gives cause for concern: "... research on community colleges has been conducted for many decades, and for just as many years it has been ignored by community college practitioners" (p. 51). A summary and comparison of work ethics is indicated in Table 20, whereas Chester (2005) provides an overview of how [work] ethics have changed in society, as suggested in Table 21.

Table 20
Comparative Summary of Work Ethics

WorkEthics.Org (2006), powered by East Central Technical College (retrieved October 1, 2006, from http://www.workethics.org/): Attendance; Character; Teamwork; Appearance; Attitude; Productivity; Organizational Skills; Communication; Cooperation; Respect.

McLeish, A. (2002), Employability skills for Australian small and medium sized enterprises, Commonwealth Department of Education Science & Training, Australia: Communication; Teamwork; Problem-Solving; Initiative and enterprise; Planning and organization; Self awareness; Learning; Technology.

Table 21
How Changes in Society's Values Have Impacted the Work Ethic in America

Us | Them
Parents were dedicated to the company | Parents complain about work
Parents/Schools taught work ethic | Parents/Schools don't teach the work ethic
Work hard, feel proud, get ahead! | Work hard, feel tired, miss out!
Adults were defined by their vocation | Adults are defined by wealth and leisure time
The customer was king | Customers are equal, not elevated
Dress for success | Personal image is everything
Buy into the company credo | Don't sell out to anyone at any time
Get on with a good company that takes care of you | Every company will eventually outsource you or automate your job
You climbed the corporate ladder and retired with a pension | You build your resume with vast experience from many jobs and retire with an IRA

Then | Now
Teens had to work to buy a car and cool stuff | Parents give teens a car and other cool stuff
Jobs for teens were hard to find | Jobs for teens are in endless supply
The boss was the boss | The boss is your peer, if not your buddy
Employers were to be respected above all | Employees are to be respected above all
School first, then job, then friends/activities | Friends/activities first, then school, then job
Unethical employees were fired, vilified | Unethical employees can become CEO
A kid is a kid is a kid: you're no different than anyone else. If you want to achieve great things, you have to work harder than the next guy | Every kid is a gift from God. You're special. You're destined to do great things and have it all someday, because no one else in the world is exactly like you.

(Source: Chester, E. (2005). Getting them to give a damn: How to get your front line to care about your bottom line. Chicago, IL: Dearborn Trade Publishing, p. 16.)

Factor 3: Institutional Support
For the community college, institutional support has the connotation of identifying, assessing, and providing solutions for the needs and diversity of students, e.g., variances in academic preparation; differences in work ethics; attitudes about life, college, and themselves; and perceived and actual views on how the community college supports them as individuals. For example, Kozeracki and Brooks (2006) conducted a study on the impact that developmental education had on the success of community college students. It was suggested that "students' success should be measured by their ability to move from developmental courses to college-level courses and then to achieve success in transfer or vocational programs of study" (p. 63). In order to be successful in the transition from basic skills to college-level courses, the institution must provide systemic support structures to enable student success on many fronts (Dungy, 2003; Restauri, 2004; Veltri, Banning, & Davies, 2006). As suggested by Dungy (2003), "In every organizational structure, student affairs professionals should try to organize themselves so that they can use both old and new techniques to help students succeed in their academic life" (p. 342). Institutional support is co-equal to the domain of organizational structure, inclusive of actions which the college exhibits and practices in support of students. For example: 1) When students needed academic advising with a Plan of Study, how did the advising support student success? 2) If students enrolled with academic preparation deficiencies, how were these deficiencies determined, and what policies were embedded into institutional practices which provided unbridled support to enable student achievement?
3) As students navigated official college administrative requirements, were they given adequate and consistent direction and support to encourage them to successfully navigate the administrative maze, while remaining sane and encouraged to persist to graduation? And 4) Are institutional facilities sufficiently adequate, in terms of comfort, usability, and cleanliness, to promote an environmental infrastructure which is conducive to student success? These are but a few questions that community colleges must consider in terms of assessing institutional support structures to promote student success. The classic IPEDS definition of institutional support is:
A functional expense category that includes expenses for the day-to-day operational support of the institution. Includes expenses for general administrative services, central executive-level activities concerned with management and long range planning, legal and fiscal operations, space management, employee personnel and records, logistical services such as purchasing and printing, and public relations and development. Also includes information technology expenses related to institutional support activities. If an institution does not separately budget and expense information technology resources, the costs associated with student services and operation and maintenance of plant will also be applied to this function. (http://nces.ed.gov/ipeds/glossary/index.asp?id=186)
FASB institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation. GASB institutions do not include operation and maintenance of plant or interest, but may, as an option, distribute depreciation expense (http://nces.ed.gov/ipeds/glossary). Within the Integrated Postsecondary Education Data System (IPEDS) definition, the educational functions are supported by administrative decisions to fund or not fund the operations as monies are available. Within the community college, institutional functions include, but are not limited to: a) advising, b) remedial programs, c) student services or admissions, d) facilities, e) social functions, f) memberships, g) recruitment, retention, and graduation, h) institutional research, i) instructional activities, and j) administrative processes. Specifically, Dungy (2003) identified several areas under the title of Student Affairs, e.g., institutional support, as indicated in Table 22. "Student affairs" in this study is equivalent to all functions required by the community college to support student success. Additionally, the names identified in Table 22 may not be exactly the same between two-year colleges; however, the functional intent within each named department or division will be considered the same in terms of institutional support structures to promote student success.
Table 22 Student Affairs Major Functions Within the Domain of Institutional Support Functional Area Descriptor Statement Professional Journals/Membership Academic Advising Help students create a plan of study to reach their respective educational goal NACADA Journal Admissions To inform prospective students about the institution, programs; to recruit, screen, accept applications Journal of College Admissions; College Board Review Assessment, Research, and Program Evaluation Colleges and universities gather data about their students, including, but not limited to, grades, test scores, and demographics Review of Research in Education; Research in Higher Education; Standards for Educational Psychological Testing Athletics In small liberal arts colleges and community colleges, student affairs divisions have responsibility for intercollegiate athletics NCAA Manual; NCAA News; JUCO Review Campus Safety Safety and enforcement of laws on campus; may report to business affairs or student affairs Campus Law Enforcement Journal Career Development To help students find satisfying and rewarding employment; career development specialists also help students with career exploration, planning their job search, and other skills such as resume writing, interviewing, and making effective presentations Journal of Career Planning and Employment; Career Development Quarterly; Spotlight on Career Services, Recruitment, and HR/Staffing College or Student Unions Functions as a service center and gathering place for students, faculty, staff and alumni ACUI Bulletin; Programming Community Service and Service Learning Programs Community service is usually a volunteer program may or may not be connected to a for-credit academic program The Compact News; NSEE Quarterly; Journal of Experiential Education Commuter Services and Off-Campus Housing Commuter students may be defined as all students who do not live in institution- owned housing on campus Commuter 168 Table 22 (continued) Functional Area Descriptor Statement Professional Journals/Membership Counseling and Psychological Services Helping students work through psychological and emotional issues that may affect their academic success and personal development Counseling Psychologist; Division 17 Newsletter; Journal of Counseling and Development Dean of Students Office Responds to students, faculty, staff, parents, community members, and others concerned with student-related issues or concerns that arise on campus; helps students while establishing and enforcing both community standards and institutional standards Net Results; NASPA Journal; Journal of College Student Development; About Campus Dining and Food Services Services range from vending machines to full-service food courts that rival commercial establishments outside the campus Newswave Disability Support Services Colleges and universities are require to provide support services for students with disabilities, to include academic services such as note takers and interpreters Journal of Postsecondary Education And Disability Enrollment Management In a competitive environment in higher education, colleges and universities have made recruitment and retention of students a priority Journal of College Admissions; College & University Financial Aid Role of the financial aid office is to help students create a plan to finance their education NASFAA Newsletter; Journal of Student Financial Aid Fundraising and Fund Development A number of student affairs divisions have added fundraising and fund development to 
supporting student success Currents; Chronicle of Philanthropy Graduate and Professional Student Services Graduate and professional careers, and transfer processes within the community college; includes admissions, alumni relations, judicial affairs, orientation, student organizations, leadership programs, and academic functions as fellowships and assistantships ACPA, NASPA, Council of Graduate Schools, Association of American Medical Colleges, Association of American Law Schools, American Assembly of Collegiate Schools of Business Greek Affairs Fraternities and Sororities, emphasizing community building, socialization, and adherence to the values of scholarship, leadership, and community services Association of Fraternity Advisors; Perspectives Health Services On-campus facilities or off-campus providers ACHA Journal International Student Services To support international students and ensure compatibility between the college and international student needs NSFSA Newsletter; International Educator Magazine Judicial Affairs Ensure academic integrity, ethics, and behavioral standards of the institution are maintained; includes a method to resolve issues on rules and regulations Synthesis; ASJA Newsletter; Journal of College and University Law; The College Student and the Courts Leadership Programs Integrating training of undergraduate students, may include partnerships between college and community organizations Concepts and Connections; Leadership Studies Journal 169 Table 22 (continued) Functional Area Descriptor Statement Professional Journals/Membership Lesbian, Gay, Bisexual, and Transgender (LGBT) Student Services Provide resources and services that encourage a welcoming and safe environment for lesbian, gay, bisexual, and transgender students, faculty, and staff National Consortium of Directors of LGBT Resources in Higher Education; no publications Multicultural Student Services Welcome, support, empower, and integrate all students into the life of the campus Black Issues in Higher Education; Hispanic Outlook; Journal of American Indian Education; Journal of Asian American Studies Orientation and New Student Programs To welcome new students to campus, as well as for introducing them to the history, traditions, educational programs, academic requirements, and student life on campus Journal of the Freshman Year Experience; Journal of College Orientation and Transition Recreation and Fitness Programs To promote good health and wellness, to teach physical skills, and encourage a positive social interaction among students NIRSA Journal Religious Programs and Services To support a variety of faiths and religions on campus; may include chaplains. Dialogue; NACUC News; Realm of Higher Education Registration Services Enrollment and registration for classes College and University Residence Life and Housing Activities for resident life on campus, and includes elements of off-campus activities Journal of College and University Student Housing; ACUHO-I Talking Stick Student Activities Student activities is responsible for providing a range of programs and services Campus Activity Programming Women?s Centers Through counseling and educational materials, the centers focus on issues such as equity, leadership, money management, safety, health, strategies to combine family and work, and relationship violence NWSA Journal Source: Dungy, G. (2003). Organization and Functions of Student Affairs. Student Services: A Handbook for the Profession (Komives & Woodard, 2003), pp. 339-356. 
As suggested in Table 22, institutional support is a conglomerate of many factors. Although the factors cut across many departments or divisions within an educational institution, all of them play a role in the success of students. As noted by Hirsch (2001), "It is not possible to know if these students are capable of college-level work without offering them assistance and evaluating the results of such an intervention" (p. 3). Similarly, Kuh, Kinzie, Schuh, and Whitt (2005b) argued that the institutions most successful in reaching students and promoting their success are those that understand that "student success is not a function of osmosis" (p. 268). Both Hirsch (2001) and Kuh, Kinzie, Schuh, and Whitt (2005a, 2005b) argued forcefully that institutional support structures are critical to the success of students: the structures, they contend, are embedded and layered throughout the institution. Most importantly, outcomes of the support structures should be assessed regularly to determine whether they are effective or need to be modified, or eliminated and replaced. Institutional support structures, by general and specific category and function, are shown in Table 22.

Sandeen (2004) noted that "student affairs staff must demonstrate with their knowledge, insight, and organizational skills that they have something real to contribute to the academic process" (p. 32). While student affairs (or student services, institutional support, etc.) is a key unit within the organization for investigating how well institutional support structures promote student achievement, community college leaders are pivotal in coordinating and establishing the institutional practices that support student success within the framework of institutional support (Achieving the Dream, 2005; Strout, 2006; VanWagoner, Bowman & Spraggs, 2005). Specifically, Boswell and Wilson (2004) noted that "... community college leaders have a responsibility to re-examine their own practices and assumptions, holding themselves accountable for adopting cost-effective and learning-centered strategies that help ensure student success" (p. 49). What are the strategies suggested by Boswell and Wilson (2004)? They are the functional areas noted in Table 22.

Senge (1990) suggested that "there is a tremendous tendency of people high in the organization to become remote from reality and the facts, to begin to hypothesize and conjecture without any formal grounding of their theories" (p. 351). Taken together, Senge (1990), Boswell and Wilson (2004), and Dungy (2003) suggested to community colleges that institutional support structures have significant potential to promote student success as a direct outcome of institutional practice. First, competing agendas in the community colleges are a major cause of concern when attempting to allocate resources to institutional support structures globally throughout the institution (Boggs, 2004; Burd, 2006; Dicroce, 2005; Dougherty & Hong, 2005; Shkodriani, 2004; Strout, 2006; The Chronicle of Higher Education, 2004). Next, community college leaders must understand the relationship between contemporary students and the institution as a true relationship, not just an extraneous or minor part of a student's college experience (Edwards, 2007).
And, third, as noted by Romero, Purdy, Rodriquez, and Richards (2005), "... leaders need access to the vast literature in fields such as the social sciences, management, economics, and education from which to draw when making decisions" (p. 291). Laden (2002), however, argued that "... most educators [leaders] choose to work in a setting that focuses on teaching, application, and drawing from a knowledge base of experiences rather than focusing on the production of, dissemination, and transference of empirically based knowledge" (p. 2).

Leadership provides the roadmap; institutional support provides the structured stops on the trip; students are the passengers; faculty members are the bus drivers; a successful road trip is student success; and arriving home is graduation. The stops on that trip correspond to the functional areas noted in Table 22. Four functional areas are briefly discussed here to inform community college leaders, policy-makers, and other stakeholders that institutional support is more than just a function of helping students by "sending" them to student affairs. The four functional areas discussed are: 1) academic advising; 2) registration services; 3) orientation and new student programs; and 4) institutional facilities [not specifically named in Table 22]. It should be noted that the items indicated in Table 22 are each supported by years of research, individually, in groups, and as a composite set of variables. To illustrate how assessing the perceptions of students and faculty can clarify the impact that institutional support structures have on student success, the following four selective examples are addressed. These examples are exploratory rather than exhaustive; a definitive treatment would yield much richer results but lies beyond the time and space available here.

Academic Advising. Students participating in the advising process invoke an institutional practice within the context of an institutional support structure. Student affairs has advisors for students in many areas of college life, and for students attending major universities, advisors may or may not be faculty (McArthur, 2005). However, a community college student is more likely to be a commuter student with limited time on campus before or after class; consequently, "while little can be done to influence 'background characteristics' or 'environmental' circumstances of community college students, the creation of institutional mechanisms to maximize student/faculty contact is likely to result in greater levels of integration and hence persistence" (Halpin, 1990, p. 31).

Dale and Drake (2005), in researching the relationship between academic and student affairs, noted that "academic advising, which includes assisting students with setting clear educational goals and developing academic plans, provides another opportunity for student and academic affairs to collaborate" (p. 60). Within the study by Dale and Drake (2005), Valencia Community College (VCC) was cited as a prime example of how institutional support structures enabled students to succeed in college. VCC developed the LifeMap program to guide students in using the college's resources, that is, institutional support structures in action. LifeMap linked institutional services and individuals to help students achieve their academic goals.
The collaborative links included essential components of institutional support, such as faculty, courses, staff, technology, and programs and services, to help students succeed in college. As a result, Valencia Community College's semester-to-semester persistence rates increased from 65% in 1994-95 to 79% in 2003-04 (Romano, 2004). "Valencia Community College has also experienced increases in enrollment, course completion rates, graduation rates, and transfer rates into state universities, and currently awards more associate degrees than any other community college in the United States" (Romano, 2004).

Although student success with respect to academic advising is strongly indicated at Valencia Community College, the Community College Survey of Student Engagement (CCSSE) (2006) noted some disturbing trends. CCSSE (2006) reported that 67% of remedial students and 53% of college-level students indicated that advising was very important to them, even more important than some other institutional services, e.g., student-aid advising, child care, or tutoring. Yet 26% of students participating in remedial courses and 41% of students taking college-level courses indicated that they rarely or never participated in academic advising, a critical institutional support service offered to community college students to promote their academic success. The data reported by CCSSE (2006) respond to the issue of advising: when students participate in advising, does this activity support student success in the context of an institutional support structure? As suggested by the data, academic advising is an institutional support function which helps community college students succeed in college. Community colleges should use these data to become informed of the institutional practices associated with academic advising in order to improve student learning and, ultimately, community college student success. Consequently, a community college that is cognizant of the value advising offers in supporting the decisions of students should strive to ensure that this practice is: 1) assessed regularly; 2) instilled in every member of administration, faculty, and staff; and 3) structured so that students, particularly, are given the opportunity to give feedback on the impact advising had on their individual educational goals. Additionally, faculty advisors are the linchpin for correlating student outcomes to the practice of student advising.

Registration Services. Registration is a natural successor to academic advising, and registration services, as an extension of institutional support services, are likewise intended to help students succeed in college. The Office of Institutional Research at Johnson County Community College (JCCC) (1996), Overland Park, KS, conducted a study to assess how students perceived the adequacy and functions of the college's services. The respondents were asked to rate each of the 17 student services provided by the institution: 1) financial aid, 2) counseling center, 3) admissions and records, 4) career center, 5) bookstore, 6) make-up/telecourse [online] testing lab, 7) orientation, pre-advising, 8) business office, 9) student activities, 10) new student assessment/placement, 11) job listing/recruiting, 12) student government, 13) library, 14) food services, 15) computer labs, 16) children's center, and 17) access center.
Of the services offered by JCCC, 75% of the respondents indicated that they would use eight of the services more than the other services provided. The top-rated service was touchtone registration [now online registration services], reported by respondents as the number one institutional service they needed and used. Furthermore, a study conducted by the Center for Digital Education (2005) revealed that contemporary students, or those categorized as millennials (the name given to a generation of 60 million people born between 1979 and 1994; T.H.E. Journal, 2004), are technologically savvy. Consequently, these students use technology to access their records, register online, download and upload lessons, and so forth. A notable survey question in the study conducted by the Center for Digital Education (2005) was: Students can complete course registration transactions online as "c" [previous question] and pay course registration fees online. Responses, by institutional category, were: 1) Small/Rural community colleges: 51%; 2) Mid/Suburban: 79%; and 3) Large/Urban: 81%. The data indicate that the institutions provided these services to students as a matter of institutional practice to support their success. If students cannot register and access their information and records securely, persistence is suspect, leading to students who are disgruntled with their community college and more apt to transfer or drop out. Therefore, community colleges should consistently assess the perceptions of both students and faculty members to understand how institutional services support or harm student success. Noting again that student success has many underlying factors, the community college errs in assuming that present services are adequate while faculty, students, or both perceive the services to be less than adequate. Perceptions, in terms of institutional support, must therefore be thoroughly understood.

Orientation and New Student Programs. Student orientation is an important success factor for students entering the doors of the community college, as well as four-year colleges and institutions. The value of orientation, new student programs, and transfer has been summarized by the National Orientation Directors Association in the stated purpose of the Journal of College Orientation and Transition, which "focuses on the trends, practices, research, and development of programs, policies, and activities related to the matriculation, orientation, transition, and retention of college students. Also encouraged are literature reviews, 'how-to' articles, innovative initiatives, successful practices, and new ideas." Within the community college, is orientation considered a part of the institutional support structure? And do students benefit from such a process? Derby and Smith (2004) answered these questions by analyzing the effects and relationships of a community college orientation course on the retention of students. In the study, it was argued that:

... the term "drop-out" denotes a student who has permanently left the institution. In the literature, drop-outs are assumed to have been academically underprepared. A puzzling issue regarding drop-outs, however, is that this view fails to consider those students who were academically prepared (and indeed academically successful), but left the institution because that institution failed to meet the student's academic [and other support] needs. (p. 764)
Derby and Smith (2004), through a review of the literature, suggested that studies on the relationship between college orientation courses and retention in the community college are scarce. The study conducted by Derby and Smith (2004), however, included 7,466 students enrolled between Fall Semester 1999 and Spring Semester 2002. The findings [limited discussion for effect] are summarized as: 1) "for the first cohort of non-reverse transfer students ... a greater proportion of students who took the orientation course obtained their degrees than did those students who did not take the orientation course" (p. 768); 2) "a greater proportion of students who took the orientation course did not fit the 'drop-out' criteria and, conversely, those students who did take the enrollment course were less likely to drop out" (p. 768); and 3) "It appears that associations exist between taking an orientation course and student retention, particularly with respect to associate degree attainment within the two-year traditional time frame" (p. 770).

Lorenzetti (2006) and Hicks (2005) noted that orientation courses are beneficial to students entering college. Because students vary in college-readiness and life experiences, orientation courses help students become acclimated to the institution, its functions, and the support structures available to help them succeed. However, Lorenzetti (2006) argued for caution about treating an orientation course as a perfunctory practice. If the course does not offer feedback, evaluation, progressive information, and specificity to help students navigate the maze of college life, orientation courses become less effective, offsetting the outcomes suggested by the research of Derby and Smith (2004). The relationship between student retention and an orientation course clearly warrants discussion; how the community college can improve student success by improving college orientation courses, however, is a matter for further research. Regardless, orientation is an institutional service provided to students as a matter of institutional practice to promote community college success.

Institutional Facilities. How are institutional facilities to be defined and qualified? First, a college is a city unto itself, consisting of tar and lines in parking lots, sidewalks, buildings, chairs, a library, heating and cooling, hallways, restrooms, and a plethora of other physical attributes. To qualify institutional facilities is to include the argument that facilities, all of them, are important to the success of college students. For example, parking lots are linked to buildings, which are linked to classrooms, which are linked to facility practices conducive to student success. In terms of definition, the proper term is physical plant, which includes all parts of the college or university as an institutional support system serving students, faculty, administration, and all other community stakeholders. Veltri, Banning, and Davies (2006) conducted an investigation into the relationship between the perceptions of students and the environmental factors experienced by students in the classroom. In the study, it was suggested that "various studies ... all hinted at linkages between the classroom's physical qualities and student learning and persistence ... researchers believe that the classroom plays a key role in postsecondary student development and learning" (p. 518).
Moreover, an argument in the study suggested that not only do students perceive the effects of problems in classroom qualities, but so do faculty. As a result, classrooms which are not conducive to a baseline standard for student learning, e.g., student success, are a major concern to community college faculty and, hopefully, to community college leaders. As noted by Veltri, Banning, and Davies (2006), students indicated that classrooms which were student-learning-friendly provided an environment supporting their academic achievement. Conversely, classrooms which did not reflect institutional practices to support student achievement provided an atmosphere where learning was, at best, very difficult for various physical reasons, including furniture, arrangement, windows, blinds, and climate control.

Additionally, students who pay tuition, who will someday be required to repay student loans, or who defer other financial desires in order to remain able to pay tuition, view the facilities as a requirement to support their learning. In other words, for students who pay tuition outright, a classroom or other facility in unacceptable condition is an affront to their implied contract with the college. In short, students view unacceptable facilities as a practice suggesting that the institution "does not care about them," and such facilities may have a detrimental impact on student success. "Studies have shown that classroom behaviors such as aggression, interaction, attendance, questioning, and attitudes like satisfaction can all be influenced by the classroom environment" (Banning, 1992, p. 24). Moreover, pedagogy is important in the design of the classroom to maximize the delivery of material, in whatever form that material may take (Niemeyer, 2001).

Classrooms are not the only institutional support system in the college for which student and faculty perceptions of support merit attention. For example, are items such as vending machines placed in strategic locations for the convenience of students and faculty; are buildings labeled properly so that students and faculty can locate them; is the web site up to date so that students, faculty, and the community find relevant information for decisions about class locations or registration; and are facilities clean? These institutional support structures are but a portion of the total physical plant within the auspices and control of the community college; however, without viable learning-conducive classrooms, clean facilities, and student support systems, community college student success is subject to review and improvement to promote college student achievement. It is critical that the community college regularly review the perceptions of students and faculty in terms of institutional support structures. The review, via surveys or open-ended questions on the college web site, can provide invaluable information and data about how the support structures are viewed. For the community college to assume that facilities meet the needs of students and faculty is to assume that student success is a matter of status quo. Institutional support systems, as indicated in Table 22, are many. In fact, many community colleges, within current budget and facility constraints, may experience difficulties in meeting the needs of students in all areas of institutional support.
However, to effectively promote student success as a factor of the Strategic-Impact-Triad Model, it is vital that, as a minimum, the community college understand the perceptions of students and faculty as a means to measure the current institutional practices directly tied to institutional support, and thereby improve how students achieve their respective educational goals. By understanding how students and faculty perceive the institutional support structures, the community college is better informed of what practices work and what practices do not, and can establish plans to modify the current institutional practices embedded in institutional support structures to improve student success. In the words of Charles Dickens in Great Expectations, "Take nothing on looks; take everything on evidence. There's no better rule."

Chapter Summary

To state that there are countless variables which influence student success is a serious understatement. As a method to inform the community college system of education of the depth of variables to be investigated to improve college student success, a listing of the factors, categorically denoted, was presented in Figure 3, Community College Global Model of Student Success (see Chapter I). In seeking a methodology to realistically measure factors impacting community college student success, the Strategic-Impact-Triad Model was developed. The Strategic-Impact-Triad Model, which purports to measure the perceptions of students and faculty to assess institutional practices, compiled the variables from the Community College Global Model of Student Success and the studies conducted by Robbins et al. (2004), Kuh et al. (2006), and Smith (2005) into three student success impact factors. The Strategic-Impact-Triad Model factors were categorized as: 1) academic preparation, 2) work ethics, and 3) institutional support. The Strategic-Impact-Triad Model was used as a structured guide to assess the factors within the context of institutional practice to promote and improve college student success. Based on the extensive literature review, and the lack of research using the three grouped factors noted, the following conclusions were derived:

1. Student success is important to the national scope of education in the United States, including the national and global workforce;

2. Academic preparation of students has two components: a) pre-college and b) present-college; both components are important to understanding how institutional practices impact student success in the community college;

3. Work ethics, as noted by WorkEthics.Org (2006) and supported by McLeish (2004), is comprised of several sub-variables. The ten sub-variable components were reviewed and suggested that work ethics impacts student success, both while in college and subsequent to college graduation or goal attainment;

4. Institutional support is important to the success of students, particularly in terms of persistence to graduation;

5. Institutional practices impact each Strategic-Impact-Triad Model factor, to varying degrees;

6. Institutional practices impact student success; to improve the practices as related to student success, student and faculty perceptions should be used as input measures to determine the impact and the depth of the impact; and,
7. Outcomes of the literature review supported the design of the Strategic-Impact-Triad Model as a methodology to assess the impact of academic preparation, work ethics, and institutional support on the success of community college students within the contextual framework of institutional practice.

Capaldi, Lombardi, and Yellen (2006) summarized the mindset that community colleges should adopt to improve graduation rates, e.g., college student success. Graduation rates are statistically significant indicators of college achievement as outcomes of institutional practices:

colleges and universities can implement programs that improve low [graduation or student success] rates by addressing the causes that they do control. They can ensure that prospective students understand the requirements for academic success and the preparation they need to succeed. And once students have matriculated, institutions can clear the path to the degree. (Capaldi, Lombardi & Yellen, 2006, p. 45)

CHAPTER III

METHODS

"It seems safe to say that significant discovery, really creative thinking, does not occur with regard to problems about which the thinker is lukewarm." --- Mary Henle

"Take nothing on its looks; take everything on evidence. There's no better rule." --- Charles Dickens (1812-1870), Great Expectations

"We are proposing a kind of collective inquiry not only into the content of what each of us says, thinks, and feels, but also into the underlying motivations, assumptions, and beliefs that lead us to do so." --- David Bohm, Donald Factor, and Peter Garrett

Introduction

Community colleges have become the doorway to higher education for a significant number of college-eligible students (Hendrick, Hightower & Gregory, 2006; Kisker, 2006; Perin, 2006; The Alabama College System, 2005). Conley (2005) defined college-eligible in terms of fulfilling various admission requirements; however, he also dichotomized college-eligibility and college-readiness. Whereas college-eligibility is a process of ensuring that the correct college entrance forms have been completed and course prerequisites have been taken in high school, college-readiness was the benchmark for establishing student success. In the context of this study, college-readiness falls within the educational constraints and practices of the community college. How educational practices are assessed is critical to identifying how those practices should be improved to enhance the success opportunities provided to community college students.

Students and faculty in the community college have different perceptions of college-eligibility, college-readiness, and student success outcomes (Grimes & David, 1999; Merrow, 2006). A significant number of students perceived themselves to have mastered their courses in high school, achieving an impressive GPA (Smith, 2006); conversely, a substantial number of faculty members perceived student academic readiness differently. According to Lindholm, Szelenyi, Hurtado, and Korn (2005), only 50% of faculty indicated satisfaction with the college-readiness of their students. In terms of how students and faculty separately and collectively perceived college-readiness (e.g., the benchmark for student success), Lindholm, Szelenyi, Hurtado, and Korn (2005) noted that only 36% of postsecondary faculty (from four- and two-year institutions, both public and private) considered that most students are well prepared academically for college.
Forty-one percent of all survey respondents, and 65% of faculty at public two-year colleges, revealed that most of the students they taught lacked the basic skills needed for college-level coursework. In stark contrast, 70% of entering college students perceived themselves as above average or in the highest 10% academically, and 48% reported earning "A" grades in high school. Because of these reported significant perceptual differences between students and faculty in the community college with respect to student success, this study investigated how specific factors of institutional practice were perceived as having influenced student achievement. Perceptual differences between students and faculty in the community college formed the data framework for this study, noting that college-readiness and student achievement are two sides of the same community college success coin.

This study investigated the underlying perceptions of students and faculty as a means to assess the relationship between perceptions and the student success domains of academic preparation, work ethics, and institutional support. Moreover, there were two fixed-factor variables set by the researcher, which facilitated the coding of students and faculty as the independent variables (IVs). To identify the direct and indirect input of student and faculty perceptions as factors impacting student success, a one-way ANOVA was conducted using academic preparation, work ethics, and institutional support as the dependent variables (DVs). To investigate the relationship between the dependent, independent, interrelated, and interdependent variables, Chapter III addresses several topics: methodology and research design; general hypothesis and research questions; population, sample, confidentiality, and anonymity; procedures for data collection, analysis, and coding; instrumentation development and design, including panel review and survey enhancement considerations; and pilot study data, factor analysis (principal component analysis), reliability, and validity. Final dataset analysis and findings are discussed in detail in Chapter IV. In addition to the noted topics, this chapter briefly discusses the transition from the original paper survey to the online survey using SurveyMonkey.com. The transition was made because of the overwhelming under-participation of respondents who completed the paper-formatted surveys, together with the incompleteness and inaccuracies of the limited samples returned. Thus, the pilot sample population was revamped from community and technical colleges in Alabama only to community and technical colleges in Alabama, Georgia, and Florida. Criteria for selecting the final participating community colleges are also presented.

Design of the Study

To facilitate a method to uncover the underlying perceptions of students and faculty, a survey design using both quantitative and qualitative data was conducted (Tashakkori & Teddlie, 2003). The mixed-methods approach included both qualitative and quantitative methods for the purpose of "collecting and analyzing both quantitative and qualitative data in a single study" (Creswell, 2003, p. 210). During the literature review phase of this study, no valid and reliable survey instrument was identified that assessed the combined domains of interest (academic preparation, work ethics, and institutional support).
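Although the survey instrument itself had to be developed, the planned quantitative comparison is straightforward to express. The following minimal sketch illustrates the one-way ANOVA described above, with group membership (student or faculty) as the fixed factor and each SIT domain score as a dependent variable; the file name and column names are hypothetical placeholders rather than the study's actual data files.

```python
# Minimal illustration of the planned comparison: a one-way ANOVA of the
# student and faculty groups on each SIT domain score. The file name and
# column names below are hypothetical placeholders, not the study's data.
import pandas as pd
from scipy import stats

df = pd.read_csv("sit_responses.csv")  # one row per respondent

for domain in ["academic_preparation", "work_ethics", "institutional_support"]:
    students = df.loc[df["group"] == "student", domain].dropna()
    faculty = df.loc[df["group"] == "faculty", domain].dropna()
    # With only two groups, the one-way ANOVA F test is equivalent to an
    # independent-samples t test (F equals t squared).
    f_stat, p_value = stats.f_oneway(students, faculty)
    print(f"{domain}: F = {f_stat:.2f}, p = {p_value:.4f}")
```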
Because no suitable instrument was identified, the design of this study adhered to the theoretical application described by Pett, Lackey, and Sullivan (2003, p. 13): "The development of valid and reliable instruments takes time, patience, and knowledge ... with careful preparation and testing, it is possible to produce, under most circumstances, reliable and valid measures of a construct ... that can be evaluated using factor analysis."

Theoretical Framework

The theoretical framework of this study is closely identified with the construct noted by Tabachnick and Fidell (2007):

Descriptive statistics describe samples of subjects in terms of variables or combinations of variables. Inferential statistical techniques test hypotheses about differences in populations on the basis of measurements made on samples of subjects. If reliable differences are found, descriptive statistics are then used to provide estimations of central tendency, and the like, in the population ... use of inferential and descriptive statistics is rarely an either-or proposition. We are usually interested in both describing and making inferences about a data set. (p. 7)

The inferential and descriptive results of this study are used to inform community college administrators, faculty, staff, and other stakeholders that student success is dynamic in form. As demonstrated in the algorithm in Figure 11, the cyclic nature of student success is a process of perpetual review of current practices based on an understanding of one's subject matter; the community college's subject matter is the investigation of factors impacting student success. In the context of this study, student success is impacted by practices embedded within the constructs of academic preparation, work ethics, and institutional support. As noted in Figure 11, the process cycles from understanding to review, and this cyclic process creates a theoretical framework by which community colleges may incrementally improve the success of students. It should also be noted that the Strategic-Impact-Triad factors in Figure 11 are not mutually exclusive; rather, the variables are interdependent and the process, therefore, is interdependent. Consequently, Figure 11 demonstrates a logical mechanism by which community colleges might begin to better understand the relationship of the SIT factors which impact student success, thereby improving community college student success one practice at a time.

The methodological design of the study was an assimilation of theoretical constructs drawn from a thorough review of the pertinent literature on student success, with an emphasis on the specific constructs of academic preparation, work ethics, and institutional support. Moreover, this study was designed to assess these three critical factors impacting student success in the community college to address two specific goals: 1) to suggest a culture of evidence (Brock et al., 2007) showing that student and faculty perceptions provide relevant input for positively enhancing institutional practice; and 2) to operationalize the findings of the study as an intentional institutional framework to improve student success (Johnson, 2007). To summarize the methodology and design of this study is to promulgate the following purpose for investigating institutional practices:

To find out what is happening and act on what you find. This is akin to turning on a flashlight in a darkened alley. You never know quite what to expect. However, for those brave enough to turn on the light, the problems only hinted at in the shadows can be forthrightly dealt with.
Such an enterprise, especially when you are dealing with more subjective measures such as organizational climate, team morale, or management style, is especially problematic. Translating this information into action helpful to the company's [community college's] success is an additional issue that many fail to address. (Chaudron, 2006, p. 3)

Figure 11. Strategic-Impact-Triad Model Algorithm. The algorithm indicates the logical flow beginning at the entry point of a student's efforts to be successful in the community college. The flow analyzes research, asks pertinent questions, sifts the data, filters the data through the variables in the SIT Model, verifies outcomes, improves practices, and begins the process anew.

Research Questions

Addressing the methodology and design of this study required stating opposing hypotheses as guiding principles. The null hypothesis is that faculty and students are likely to have similar perceptions of the Strategic-Impact-Triad (SIT) factors influencing student success. Conversely, the alternate hypothesis is that faculty and students are likely to view the SIT factors influencing student success differently, to a statistically significant degree. Research supported the argument that perceptions of the SIT factors of influence are more likely to be divergent between the student and faculty populations (Kuh, Kinzie, Schuh, & Whitt, 2005b; Lindholm, Szelenyi, Hurtado, & Korn, 2005; Smith, 2006). This study used the following research questions:

1) What is the relationship between faculty and students' perceptions in assessing the impact that academic preparation has on the success of the college student?

2) What is the relationship between faculty and students' perceptions in assessing the impact that work ethics has on the success of the college student?

3) What is the relationship between faculty and students' perceptions in assessing the impact that institutional support has on the success of the college student?

4) What is the relationship between faculty and students' perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support?

Population and Sample

A random sampling methodology was implemented to acquire data for this study. To avoid sample bias (e.g., limiting application to the greater population), there were no delimiting or restrictive factors included in the randomized data collection process (Gravetter & Wallnau, 2007). In terms of applied methods, students and faculty were contacted via an institutional representative for voluntary participation in the study. Participants in each respective group who desired to contribute to the study were provided two web links. Each set of links corresponded to the appropriate group: students were provided links to the STUDENT INFORMATION SHEET (see Appendix A) and the student survey instrument (see Appendix B); faculty members were provided links to the FACULTY INFORMATION SHEET (see Appendix C) and the faculty survey instrument (see Appendix D). Student and faculty web portals for survey access are shown in Appendix E. [Web portals are specific http links within web sites.]
The respective populations for the community college system of education were approximately 11,600,000 students and 593,211 faculty, or 12,193,231 total potential respondents, split across two fixed-factor groups (American Association of Community Colleges [AACC], 2006a, 2006b, 2007; Digest of Education Statistics, 2005, Table 223; Phillippe & Sullivan, 2005). In consideration of the logistical difficulty of sampling the total set of community college student and faculty populations, inclusive of letters to 1,202 college presidents, the original survey method was to randomly sample faculty and students in the community and technical colleges within The Alabama College System ([ACS], 2005). [Note: ACS is the two-year college system in Alabama.]

The Alabama College System (2005) is a sample of the entire community college system population and suggested general transferability to the two-year college system in toto. The ACS (2005) had a sample population of 79,059 students and 1,755 full-time faculty members, with a much greater faculty sample when part-time faculty members were included (Fall 2006-2007 students numbered 98,805; Fall 2007-2008 data were unavailable at the time of this study). The Alabama College System was initially selected based on the following considerations: a) geographical proximity to Auburn University; b) diversity of students in the system; c) students and faculty in the system represented a valid sample of the community college system nationally; and d) the assumption that the sample from the system would provide transferability to the community college system nationally (Gravetter & Wallnau, 2007; Sugden, Smith, & Jones, 2000).

The Alabama College System (2005, 2007) consisted of 22 community colleges and 4 technical colleges. Although the initial colleges identified to participate in the study were limited in geographical context, this study initially and randomly selected a combination of 5 community and 3 technical colleges in The Alabama College System (2005, 2007) located throughout the State of Alabama. The colleges selected, with demographic data, are indicated in Table 23. The colleges in Table 23 represented an intentionally unbiased sample reflective of the total population of community colleges nationwide, with the goal of surveying a diverse and representative sample of students and faculty. There were no specific criteria for the colleges selected, other than researcher experience in The Alabama College System.

Table 23

Original Colleges Randomly Selected for Participation in the Study

No. | Acronym | Institution Name | Location | Students | Faculty*
1 | TSTC | Trenholm St. Technical College | Montgomery, AL | 1,501 | 130
2 | CVCC | Chattahoochee Valley Community College | Phenix City, AL | 2,049 | 114
3 | DSTC | Drake State Technical College | Huntsville, AL | 939 | 58
4 | EOCC | Enterprise-Ozark Community College | Enterprise, AL | 2,295 | 131
5 | NASCC | Northeast Alabama St. Community College | Rainsville, AL | 2,789 | 211
6 | NSCC | Northwest-Shoals Community College | Muscle Shoals, AL | 4,567 | 334
7 | RSTC | Reid State Technical College | Evergreen, AL | 662 | 50
8 | GCWSCC | G. C. Wallace State Community College | Selma, AL | 1,844 | 111
Totals: 8 colleges | | | | 16,646 | 1,139

* Faculty counts include both full-time and part-time faculty; part-time faculty were invited to participate, which increased the number of potential participants significantly.
Data Source 1: http://www.ache.state.al.us
Data Source 2: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/ftptpersonnel.aspx
Data Source 3: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/stuhdctbygenrace.aspx
As indicated in Table 23, the total original sample subset of potential respondents was 16,646 students and 1,139 faculty members, with a significant increase when part-time faculty members were included. Although the distinction between full- and part-time faculty members was not analyzed statistically as a sub-group, data identifying differences in faculty employment were requested on the faculty survey for general demographic analysis. Of the community and technical colleges contacted to participate in the study, three responded, indicating a participation rate of 38%. Subsequent to the letters sent to the presidents of the 8 colleges in Table 23 (see Appendix F, Sample Presidential Request Letter; Appendix G, Sample Presidential Approval), an additional college learned of the study and requested to participate, which increased the participation rate to 44%. Additionally, Dean Barbara Anne Spears of H. Councill Trenholm State Technical College, Coordinator of the Alabama Master Teacher Seminar, July 8-12, 2007, agreed to administer surveys at the 2007 annual Alabama Master Teacher Seminar by asking faculty at random to voluntarily complete a survey (after reading a FACULTY INFORMATION SHEET, see Appendix C). Table 24 indicates the colleges which agreed to participate in the study and includes the notation relevant to the Alabama Master Teacher Seminar. The Alabama Master Teacher Seminar was included because of the experience of the faculty attending; experienced faculty, as perceived and stipulated by the researcher of this study, would provide valuable perceptual data.

The return rates of the first draft of the paper-based survey instruments were less than 1% across all participants. As a result of this phenomenally low return rate, coupled with the incompleteness of many surveys returned, the researcher decided to rewrite the survey instruments, convert the surveys to a web-based method, contact the colleges again, and request that they participate as pilot study participants only. For each institution in Table 24, institutional representatives were contacted directly by phone, email, and letter to emphasize the importance of a minimum level of participation for creating a reliable and valid survey instrument, inclusive of construct validity via factor analysis (principal component analysis) of the survey instrument questions (Pett, Lackey & Sullivan, 2003). Reconsidering the structure of the survey methodology and questions became the catalyst for revising the role of the original colleges that had submitted correspondence to participate in the study. The participating colleges indicated in Table 24 became the focal point for the initial pilot study, subsequent to a thorough panel review of the revised survey questions. As can be seen in Table 24, the potential sample for the pilot study was quite large in comparison to typical pilot study samples (Lancaster, Dodd & Williamson, 2004). However, based on the initial paper survey response rate of less than 1%, compounded by the incompleteness of the forms returned, the researcher was reluctant to conduct the pilot study with fewer colleges than indicated in Table 24. The pilot study colleges yielded a potential student sample of approximately 7,000; the potential faculty respondents numbered well over 500 when full- and part-time faculty members were included.
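The construct-validity step referenced above, factor analysis by principal component analysis of the pilot item responses, can be outlined as follows. This is an illustrative sketch only: the file name, column labels, and the decision to extract exactly three components are assumptions made for the example, not a description of the study's actual procedure.

```python
# Illustrative sketch of a principal component analysis of pilot item
# responses; the file name, columns, and the three-component choice are
# assumptions for this example, not the study's actual analysis.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

items = pd.read_csv("pilot_items.csv").dropna()   # one column per survey item
scaled = StandardScaler().fit_transform(items)    # standardize before PCA

pca = PCA(n_components=3)                         # three hypothesized constructs
pca.fit(scaled)

print("Variance explained:", pca.explained_variance_ratio_.round(3))

# Component loadings: rows are items, columns are the extracted components.
# The AP/WE/IS labels are placeholders for inspection, not confirmed factors.
loadings = pd.DataFrame(pca.components_.T, index=items.columns,
                        columns=["AP", "WE", "IS"])
print(loadings.round(2))
```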
To encourage participation, the researcher contacted local college liaisons regularly via email and by phone. A letter of appeal was also sent to the contact person at each college for distribution to faculty and students. The only remaining recourse would have been for the principal investigator to drive to each location and request participation from students and faculty in person; whether such an on-site method would have improved the response rates remains speculative.

Table 24

Actual Participants in the Pilot Study Phase

Acronym | Name of College or Activity | Location | Students | Faculty (P/F: T)
TSTC | Trenholm State Technical College | Montgomery, AL | 1,501 | 56/74: 130
DSTC | Drake State Technical College | Huntsville, AL | 939 | 34/24: 58
EOCC | Enterprise-Ozark Community College | Enterprise, AL | 2,295 | 66/65: 131
CACC | Central Alabama Community College | Alexander City, AL | 2,985 | 238/62: 300
MTS (LBW) | Master Teacher Seminar, July 8-12, 2007 (Wallace Community College) | Lurleen B. Wallace, Dothan, AL | NA | 50 (est.)
Totals: 5 | | | 7,720 | 619

Notes: 1. Faculty figures include estimates of full- and part-time faculty (the analysis did not treat part-time faculty as a sub-group except for descriptive indicators); P/F: T equates to Part-Time/Full-Time: Total. 2. Master Teacher Seminar data were not included in the pilot test; the total paper-survey respondents were insufficient to conduct statistical analysis of the dataset returned. 3. Source a: http://nces.ed.gov/ipedspas/reportOnVars.asp. 4. Source b: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/ftptpersonnel.aspx. 5. Source c: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/stuhdctbygenrace.aspx.

Data returned from the pilot study identified a total of 265 respondents, comprised of 68 faculty members and 197 students. Relative to the sample population, the total response was 265 of 8,339 (see Table 24), a return rate of 0.031778, or 3.2%. Of the total number of responses, there were 26 exclusions from the dataset due to incomplete responses, a loss of 9.8% of the returned data (7 faculty and 19 students); the actual completion rate of the pilot study surveys was therefore 90.2%. Within the total pilot sample, the ratio of faculty to student respondents was 26% faculty to 74% students, or about 3:1 in favor of student respondents. This respondent ratio differs considerably from the ratio in the underlying populations and may have an impact on the data reported; nationally, the community college system comprises roughly 11,600,000 students and 593,211 faculty members, or 1 instructor per 19.6 students, on average (American Association of Community Colleges [AACC], 2006a, 2006b, 2007; Digest of Education Statistics, 2005, Table 223; Phillippe & Sullivan, 2005). As the purpose of the pilot study was to evaluate and establish the reliability of the survey instruments, general demographic data are not presented here (Field, 2005; Pallant, 2007). However, the participating colleges and the number of respondents from each are reported in Table 25, which lists the respondents by college, group, and percentage of the total respondents, of the respective college, and of the total population sample. The data indicate the difficulties of performing research using survey methods (Asiu, Antons & Fultz, 1998; Goho, 2002; Porter & Umbach, 2006). While some response rates were acceptable, other colleges produced response rates which could be interpreted as outliers.
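The return-rate figures reported above follow directly from the counts in Table 24; the short sketch below simply restates that arithmetic so the percentages can be reproduced.

```python
# Reproducing the pilot-study return-rate arithmetic reported above.
potential_students, potential_faculty = 7_720, 619       # Table 24 totals
total_sample = potential_students + potential_faculty    # 8,339
respondents = 265
faculty_resp, student_resp = 68, 197
excluded = 26                                             # incomplete responses

return_rate = respondents / total_sample                  # ~0.0318 -> 3.2%
completion_rate = (respondents - excluded) / respondents  # ~0.902  -> 90.2%

print(f"Return rate: {return_rate:.1%}")
print(f"Completion rate: {completion_rate:.1%}")
print(f"Faculty share: {faculty_resp / respondents:.0%}, "
      f"student share: {student_resp / respondents:.0%}")
```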
Table 25

Pilot Study Respondent Data

College | Students: n/college total (% of college) | Faculty: n/college total (% of college) | Students: n/7,720 (% of population) | Faculty: n/619 (% of population)
TSTC | 117/1,501: 7.8% | 23/130: 17.7% | 117/7,720: 1.5% | 23/619: 3.7%
DSTC | 41/939: 4.4% | 9/58: 15.5% | 41/7,720: 0.5% | 9/619: 1.5%
EOCC | 5/2,295: 0.2% | 16/131: 12.2% | 5/7,720: 0.06% | 16/619: 2.6%
CACC | 34/2,985: 1% | 20/300: 6.7% | 34/7,720: 0.4% | 20/619: 3.2%
Total | Student respondents: 197/197: 100% | Faculty respondents: 68/68: 100% | Potential students: 197/7,720: 2.6% | Potential faculty: 68/619: 11%

Subsequent to the pilot study, and inclusive of the factor analysis (principal component analysis) of the collected data, the final dataset for the study was requested from the colleges indicated in Table 26. The population sample for these potential community or technical colleges was as follows: 1) potential student respondents: 121,753; and 2) potential faculty respondents: 6,557. As previously noted, no criteria were established for the random selection of the community or technical colleges, with the exception of Valencia Community College (VCC) and Florida Community College at Jacksonville (FCCJ). VCC was cited in this study as utilizing highly effective student success support structures (Achieving the Dream, 2005; Dale & Drake, 2005). Both VCC and FCCJ were selected because of the large number of students and faculty and the diversity within the colleges, as noted in a review of each college's web site.

A total of 18 community or technical colleges were included in the potential final dataset. The colleges requested to participate were: 2 technical colleges in Georgia, 2 community colleges in Florida, and 14 community colleges in Alabama. The Georgia technical colleges were selected due to their use of the WorkEthics.Org (2006) material cited in this study. The eighteen colleges requested to participate constituted a voluntary sample from the population of the community college system of education in the United States, a valid sample of the population with suggested transferability of the analysis to the total population. The demographic data for the potential participating colleges are indicated in Table 26. Each participating college's contact person was given the web portals for students and faculty, respectively. When a student or faculty member decided to access the survey, the respondent was requested to complete the survey at a time that was most convenient for him or her. Convenience was defined as completing the survey off campus, so that the survey might be given due consideration in the privacy of one's home or at a distance from coworkers. This requested practice was intended to encourage free and open input from both students and faculty and to improve the response rates within the final dataset. As noted in Table 26, 18 college presidents were contacted to participate in this study. The discussion of the actual participating colleges is presented in Chapter IV.
Table 26

Potential Community and Technical Colleges Surveyed for the Final Dataset

Acronym | Name of Community or Technical College | Location | Students | Faculty (P/F: T)
ASCC | Alabama Southern Community College | Monroeville, AL | 2,548 | 49/58: 107
ATC | Altamaha Technical College | Jesup, GA | 1,921 | 65/41: 106
BeSCC | Bevill State Community College | Sumiton, AL | 6,513 | 329/121: 450
BiSCC | Bishop State Community College | Mobile, AL | 4,074 | 94/119: 213
CCC | Calhoun Community College | Decatur, AL | 9,345 | 480/134: 614
CGTC | Central Georgia Technical College | Macon, GA | 4,873 | 375/109: 484
CVCC | Chattahoochee Valley Community College | Phenix City, AL | 2,049 | 85/29: 114
FCCJ | Florida Community College at Jacksonville | Jacksonville, FL | 23,700 | 346/375: 721
GSCC | Gadsden State Community College | Gadsden, AL | 5,040 | 369/162: 531
JHFSCC | James H. Faulkner St. Community College | Bay Minette, AL | 3,332 | 150/66: 216
JDCC | Jefferson Davis Community College | Brewton, AL | 1,084 | 58/47: 105
JSCC | Jefferson State Community College | Birmingham, AL | 7,729 | 480/138: 618
NASCC | Northeast Alabama St. Community College | Rainsville, AL | 2,789 | 170/41: 211
RSTC | Reid State Technical College | Evergreen, AL | 662 | 27/23: 50
SUSCC | Southern Union State Community College | Opelika, AL | 4,731 | 197/95: 292
TALSCC | T.A. Lawson State Community College | Birmingham, AL | 5,595 | 155/108: 236
VCC | Valencia Community College | Orlando, FL | 29,636 | 861/298: 1,159
WSCC | Wallace State Community College | Hanceville, AL | 6,132 | 199/131: 330
Totals | | | 121,753 | 6,557 (Part-time: 4,489; Full-time: 2,095)

Notes: 1. Faculty figures include estimates of full- and part-time faculty (the analysis did not treat part-time faculty as a sub-group); P/F: T equates to Part-Time/Full-Time: Total. 2. Source a: http://nces.ed.gov/ipedspas/reportOnVars.asp. 3. Source b: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/ftptpersonnel.aspx. 4. Source c: http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/stuhdctbygenrace.aspx.

Instrumentation

During survey development, it was understood that low response rates may be common among respondents. For example, Asiu, Antons, and Fultz (1998) studied the phenomenon known as "survey saturation" and its effect on respondents; Goho (2002) suggested that mixed-mode surveys had little positive effect on return rates; and Porter and Umbach (2006) studied variations in response rates across student and faculty responses. These studies provided the researcher with information used to create a survey most conducive to ease of use and applicability, including layout, clarity, and minimal time to completion. The following sections discuss the development of the survey in detail.

Item and Domain Development

In terms of variable (question) design considerations, each question is noted as a correlate of a trend in the educational literature. For example, the book by Kuh, Kinzie, Schuh, and Whitt (2005b), Student Success in College: Creating Conditions That Matter, provided an extensive look at practices which supported the success of students. In this research-intensive text, the authors described a number of colleges participating in the Documenting Effective Educational Practice (DEEP) system as identified in the National Survey of Student Engagement (NSSE) (Kuh, Kinzie, Schuh & Whitt, 2005b, p. 10). A thorough review of this text indicated practices for the domains identified in the Strategic-Impact-Triad Model.
One example of a variable associated with practices impacting academic preparation, as identified in Section 4 of the student survey (see Appendix B) and Section 4 of the faculty survey (see Appendix D), was the item of "writing assignments," or writing across the curriculum. Writing across the curriculum in many DEEP colleges and universities was the practice of having students complete various types of writing exercises in every class, whether a short essay or an original case study. The intent of writing across the curriculum was to enable students to improve their academic preparation by becoming better writers, thereby improving their success within the goal of becoming academically prepared (e.g., the domain of academic preparation). The notation of the DEEP colleges is but one example of the process used to derive the variables measuring the constructs of academic preparation, work ethics, and institutional support. A table correlating the survey instrument questions with the supporting studies and reports is provided in Appendix H.

The development of a data collection instrument began during the literature review phase of this study. Steps in this development process included: 1) identifying the constructs to be measured; 2) creating questions as valid and reliable measures of the constructs; and 3) designing the appropriate scale of fit for the constructs identified (DeVellis, 2003; Fowler, 2002; Spector, 1992). Subsequent to designing the initial question set and survey format, the survey was distributed via SurveyMonkey.com to a panel of experts. The panel review version of the survey included 6 comment textboxes. Review surveys were distributed to 4 national community college organizations (League for Innovation in the Community College, American Association of Community Colleges, Community College Research Center, and Office of Community College Research and Leadership), students, faculty, experienced researchers, and the dissertation committee members. In total, 33 requests for comment were made, and 19 evaluations were returned, a panel review response rate of 58%. The panel members who responded suggested several changes, including, but not limited to: content restructuring, reduction of questions per construct, consistency of wording, and the elimination of some sections or the addition of others. All recommendations were meshed into the final draft versions of each respective survey. The final draft surveys were then released to the pilot schools (see Table 24). The contact persons at each college participating in the web-based pilot study were then sent emails containing the updated web portals to each respective survey.

Description of Survey

The survey instruments included standard demographic sections for both faculty and students to identify and describe the participants. For comparative analysis, and to establish the variance between students and faculty, a series of 15 self-reported categories was identified in the next section. These 15 items were intended to set the benchmark between the perceptions of students and faculty in terms of student abilities related to success. The 15 items corresponded to the study by Lindholm, Szelenyi, Hurtado, and Korn (2005), which surveyed 40,670 full-time college faculty members at 421 two-year colleges, four-year colleges, and universities nationwide (p. 3).
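Because the sections just described are combined into multi-item scales, internal-consistency reliability is a natural check during instrument development. The sketch below computes Cronbach's alpha, one common reliability index; it is offered purely as an illustration, not as the specific statistic reported for these instruments, and the item columns referenced in the example are hypothetical.

```python
# Illustrative only: Cronbach's alpha for a set of Likert items, one common
# internal-consistency check during instrument development. The column names
# in the commented example are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Return Cronbach's alpha for a DataFrame with one column per item."""
    items = items.dropna()
    k = items.shape[1]                                  # number of items
    item_var_sum = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)           # variance of scale totals
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Example (hypothetical columns): reliability of 12 academic-preparation items.
# ap_items = responses[[f"ap_{i}" for i in range(1, 13)]]
# print(round(cronbach_alpha(ap_items), 2))
```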
The results of the study by Lindholm, Szelenyi, Hurtado, and Korn (2005) suggested that faculty and students would reject the null hypothesis previously stated that students and faculty view student success the same. The Likert-scale for this 15-item section was a condensed version of a 5-point scale, reduced to a 3-point scale for ease of interpretation. The 3- points of the scale were: 1) below average, 2) average, and 3) above average. In the next major section of each respective survey and to assess the perceptions of students and faculty in the three domains of the Strategic-Impact-Model, 12 questions were asked of each respondent for each of the constructs in the SIT Model. Each question within the total 36 question set (12 per domain) was based on a type of practice within the community college which impacted student success, whether that practice was direct or indirect. The 4-point Likert-scale used to measure the responses in each domain were: 1) Not Important, 2) Somewhat Important, 3) Important, and 4) Very Important. A No Opinion option was not used in this survey at the suggestion of a panel review member, which was contrary to the opinion suggested by Dillman (2000). 203 The final two sections of the surveys included open-ended questions and a three- question section on the factors of the SIT Model. The open-ended questions were ?self- explanatory?, while the three questions related to the SIT Model were ?yes-no? type responses. The researcher wanted to assess the perceptions of students and faculty in terms of whether these groups believed that academic preparation, work ethics and institutional support were either required or not required for students to be successful in the community college. These three questions in both faculty and student surveys measured an absolute reference in terms of SIT Model factors as being required or not required to promote student success (see Student Survey, Appendix B and Faculty Survey, Appendix D). The next four sections of this study will address the development of the survey questions, including the relationship of survey question to a specific construct, as well as a review of the open-ended questions. These sections are under the headings of: Academic Preparation, Work Ethics, Institutional Support, and Open-Ended Questions. Academic Preparation Factor 1, academic preparation, was reviewed for pre-college and present-college implications. For example, the survey questions were designed to seek the relationship that faculty and students had about how the dependent variable of academic preparation impacted the educational outcome of college student success; the questions sought to statistically assess and compare the perceptions of the sample groups in order to determine the main effect of academic preparation on student achievement as reported by students and faculty. 204 As a result of the literature investigation, a set of variables or characteristics defining the domain of academic preparation was prepared. These variables or characteristics, based on specific practices in the domain of academic preparation, are indicated in Table 27. Table 27 Practices of the Academic Preparation Domain (APD) Item Survey Question/Item: Student Question: How important are the following items or activities in helping you to be successful in your college work? (Academic Preparation) Faculty Question: How important are the following items or activities in helping students be successful in college? 
APD Practices Identified (Examples) Not an exhaustive list Based on research Includes established practices Are elements of student success Practices based on measured perceptions Are codependent on work ethics and institutional support 1 Writing assignments Effective student writing across the curriculum 2 Reading the textbook Effective student reading across the curriculum 3 Getting feedback on assignments and tests Collaborative learning development 4 Having instructors as advisors Time management, planning, retention 5 Using email to get help with class material Using technology as learning assistant 6 Instructors who challenge and encourage me Student support at a personal level 7 Participating in labs with real-world exercises Learning to deal with real problems 8 Having online study guides for each course Structured methodology to enhance learning 9 Tests that actually cover the material taught Valid instruments to evaluate and guide students 10 Getting help from instructors during office hours Indicators that instructors are willing to help 11 Receiving feedback about progress in a course Establishment of shared teaching-learning 12 Having a syllabus that is a learning guide Roadmap to success in courses 205 Work Ethics Factor 2, work ethics, was also based on a detailed review of pertinent literature. In terms of instrumentation development, many sources were used to define the characteristics of work ethics to develop research questions. Two major sources were WorkEthics.Org (2006) and McLeish (2005), noting that numerous other studies contributed to the development of questions. For example, The Conference Board et al. (2006) indicated that ?Professionalism/Work Ethic, Teamwork/Collaboration and Oral Communications are rated as the three most important applied skills needed by entrants in today?s workforce? (p. 10). The same methodology and principles used to develop and correlate survey question to research for academic preparation was used in the sections on work ethics and institutional support. As a result of the literature investigation, a set of variables or characteristics defining the domain of work ethics was prepared. These variables or characteristics, based on specific practices in the domain of work ethics, are indicated in Table 28. 206 Table 28 Practices of the Work Ethics Domain (WED) Item Survey Question/Item: Student Question: How important are the following items or activities in helping you to be successful in your college work? (Work Ethics) Faculty Question: How important are the following items or activities in helping students be successful in college? 
WED Practices Identified (Examples) Not an exhaustive list Based on research Includes established practices Are elements of student success Practices based on academic and workforce analysis Are codependent on academic preparation and institutional support 1 Showing up for class on time Understanding the necessity of parameters 2 Students take the initiative to make up missed work due to absences Responsibility and leadership 3 Attending class regularly Participation to achieve 4 Appearance Appropriateness of the situation 5 Students as a team player in group projects Teamwork, caring attitude 6 Students helping other students succeed Teamwork, caring attitude 7 Students improving their organizational skills Learning to become efficient in college/life 8 Treating people with respect Dignity and respect as key to success 9 Instructors giving students feedback on their work ethics Practice to evaluate and guide students in terms of their work ethics 10 Hearing from business and community leaders about work ethics Emphasize the importance of work ethics by those in the workforce 11 Being an effective manager of time Time management in college and life 12 Earning an A by unethical methods Honesty/Integrity Using the research by WorkEthics.Org (2006), McLeish (2005), The Conference Board et al. (2006), and other studies in the domain of work ethics, provided the foundation for the survey questions to assess the main effect of work ethics on the success of the community college student as perceived by students and faculty. More importantly, the work ethics section survey questions were intended to measure the perceptions of students and faculty to impact institutional change to improve student success within the framework of daily work ethics practice. For example, to better understand that faculty and students view work ethics statistically different will inform the community college of changes to be made to improve student success; in reality, as 207 noted in the research of both scholar and business domain, a bad attitude or non- attendance is a problem subject to cause loss of gainful employment. The instructional practice of altering an attitude or encouraging participation via face-to-face attendance is statistically significant to students, faculty, education, the workforce, and global competition (Braxton, 2006; Noel-Levitz, 2006). According to a study released by the Boston Area Advanced Technological Education Connections (BATEC) (2007), ?The case for soft skills might appear to be open-and-shut, given industry?s strong endorsement. Paradoxically, despite the importance of employability skills, neither educators nor students appear to appreciate them as employers do? (p. 33). Consequently, to further assess the value of the work ethic attributes, the survey initially included an ordering of the attributes as shown in Figure 12. The purpose of this hierarchical method was to determine if a best predictor might be determined that would significantly improve student success when compared to the other work ethics. However, this activity was deleted from the survey process as a recommendation of several review panel members. 208 Of the ten (10) work ethic characteristics below, please arrange these in the order from MOST important to LEAST important? Also, please respond to the two questions below related to STUDENTS and FACULTY. 1. Attendance 2. Character 3. Teamwork 4. Appearance 5. Attitude 6. Productivity 7. Organizational Skills 8. Communication 9. Cooperation 10. 
Respect Place the corresponding number of the Work Ethic in the boxes below. MOST Important to ????????????? L EAST Important Of the ten WORK ETHICS listed, which ONE do you think is the MOST IMPORTANT for a STUDENT? Of the ten WORK ETHICS listed, which ONE do you think is the MOST IMPORTANT for an INSTRUCTOR? Figure 12. Work Ethics Hierarchical Rating. The rating mechanism intended to assess the work ethics from most important to the least important for the purpose of establishing practices which are correlated to the level of the ethic. For example, if the highest rated ethic is attendance and attendance is a chronic problem in classrooms, policy and practice would be given additional emphasis to promote the value of attendance for student success. Methodology would be established, measured, and evaluated for enhancement relative to attendance practices. Institutional Support Factor 3, institutional support, is also important as a variable which promotes or harms student success in the community college. Institutional support is strategically and exceptionally important because without the infrastructure to support student success, the community college will be unable to keep its commitment to provide open access to the diversity of students attending the two-year college (Boggs, 2004; NCES, 2003; Vaughn, 2004). Asking questions to assess the main effect that institutional support has on student success as perceived by students and faculty within the contextual framework of institutional practice are necessary to improve student achievement. Questions in this section have also been correlated to research. For example, it might be argued and counter-argued that something as simple as a classroom might impact the success of students. According to Appleby (1990), classrooms which do not 209 support the learning of students are suggested as detrimental to the success of students. Learning-conducive classrooms and buildings (Kuh, Kinzie, Schuh & Whitt, 2005) are an integral part of institutional support, and variances in the perceptions of this specific variable are noted in several questions. Institutional support cuts across all demarcation lines within a college; without effective institutional support structures applied to daily practice, student success is diminished. As a result of the literature investigation, a set of variables or characteristics defining the domain of institutional support was prepared. These variables or characteristics, based on specific practices in the domain of institutional support, are indicated in Table 29. Institutional support in this study is the most impersonal student success domain. For example, faculty and/or students may perceive that institutional support structures are farthest from their control, and therefore, are more likely to feel powerless to cause change in institutional support structures. The survey measured perceptions of students and faculty within a very limited set of practices related to institutional support. Moreover, when students and/or faculty perceive that ?offices? on campus do not provide support for student and/or faculty problems, these types of actions/practices are explicitly those actions/practices this study intended to measure (see Table 29). 210 Table 29 Practices of the Institutional Support Domain (ISD) Item Survey Question/Item: Student Question: How important are the following items or activities in helping you to be successful in your college work? 
(Institutional Support) Faculty Question: How important are the following items or activities in helping students be successful in college? ISD Practices Identified (Examples) Not an exhaustive list Based on research Includes established practices Are elements of student success Practices based on academic and workforce analysis Are codependent on academic preparation and work ethics 1 Having problems resolved satisfactorily Direct individual support to promote success 2 Perceiving faculty, staff and administrators as accessible and helpful Open/collaborative/interactive leadership to build a learning community 3 Feeling safe on campus to study Freedom to concentrate on the task at hand 4 Getting help in finding meaningful employment Direct individual support to promote success 5 Permission to call any individual associated with the college Open/collaborative/interactive leadership to build a learning community 6 Online registration is available when needed Direct individual support to promote success 7 Being in classrooms that are clean Indicators of ?institutional concern for the client? 8 Understanding the mission of the college How the college supports the student and why 9 Having student organizations that enrich the learning experience Direct and indirect individual support to promote success 10 Giving feedback to administrators on how to improve the college Students become change agents 11 Having community services published on the web site Support structures made available for students and faculty 12 Resources for student support are reliably accessible Support is needed in all areas when needed by student and faculty Qualitative Open-Ended Questions In addition to the survey questions related to the domains of academic preparation, work ethics, and institutional support, qualitative open-ended questions were included in each survey. The open-ended questions presented to students and faculty respondents are identified in Table 30. The questions in Table 30 captured themes from respondents which may be used to improve student success. For example, if the pre- 211 defined questions in the domain sections of the survey failed to address specific perceptions from students or faculty members, open-ended questions may potentially gather and report such crucial information (Dillman, 1991). Specifically, if a respondent did not report a complete answer to a scaled survey question, an open-ended question may shed light on the actual intent of the respondent. Faculty and student themes will be presented in Chapter IV. Table 30 Instrumentation to Correlate Quantitative Data and Qualitative Themes. Faculty/Student Question Content #1 What should community colleges do to support students who are academically unprepared? #2 How can community colleges help students or faculty acquire and practice good work ethics? #3 What can a community do to improve its institutional support to help students succeed in college from enrollment to graduation? #4 What institutional practices (actions by members of the college) have you observed which helps or harms the success of a student? Reliability and Validity Reliability and validity of the student and faculty survey instruments and scores were evaluated using two methods. Method one was to assess content validity of the scale by ?having items reviewed by experts for relevance to the domain[s] of interest? (DeVellis, 2003, p. 50). 
Method two assessed: 1) reliability, using Cronbach's alpha to gauge the homogeneity of the item scores within the scale; and 2) validity, using Principal Component Analysis (PCA). Each method is discussed in the next two sections.
Panel of Experts
Assessing validity in this study required consideration of several variables related to validation. According to Pallant (2007), "there is no one clear-cut indicator of a scale's validity. The validation of a scale involves the collection of empirical evidence concerning its use" (p. 7). Therefore, to address the validity of the survey instrument scores, a diverse panel of expert reviewers became the source for validating the instrument. As previously discussed in the section on Instrumentation, Item and Domain Development, the panel consisted of nationally recognized community college entities, students, faculty, and the dissertation committee members. Each of the six major sections of the instrument included a comment textbox for feedback on content validity, the relationship of questions to the domain under investigation, and overall structure/design. The expert panel of reviewers provided extensive feedback on question content and suggested several revisions, including: reordering of questions, consistent wording, the number of questions per domain, scale modification, inclusion of reverse-coded questions, and revisions of open-ended questions. The panel members also noted that the survey questions within each domain related to the domain under investigation. Additionally, panel members were asked to review each section, including the directions for that section. The panel reviewed and commented on each section of the survey, and the feedback suggested that the objectives of each section were met by the collection methods within that section. Based on the extensive feedback and corrective suggestions from the panel of experts, the survey items were content validated to measure what they intended to measure within each domain of interest in the study (DeVellis, 2003; Fowler, 2002; Spector, 1992).
Cronbach's Alpha, Principal Component Analysis, and ANOVAs
To validate the survey instrument scores, a pilot test was conducted (Pett, Lackey & Sullivan, 2003; Tabachnick & Fidell, 2007). The pilot test was conducted within the institutional culture and control of the participating technical and community colleges as previously identified (see Table 24). The researcher reviewed acceptable levels of reliability for survey-data domain scores (Streiner, 2003). It was projected that Cronbach's coefficient of reliability would yield acceptable alpha values ≥ .700 (Cortina, 1993; Field, 2005; Kline, 1999; Nunnaly, 1978), respective of sample size implications, to indicate internal consistency and homogeneity of the survey instrument variables/questions (Shannon & Davenport, 2001). Furthermore, utilizing the statistical technique of principal component analysis (PCA), survey data were analyzed to assess the independent factor loadings for, and the relationships within, the three domains of this study. This study used PCA as a primary method for "selecting and measuring a set of variables...determining the number of factors...[and]...interpreting the results" (Tabachnick & Fidell, 2007, p. 608). This study did not specifically use the statistical techniques of Factor Analysis (FA) or the models of Confirmatory Factor Analysis (CFA) or Exploratory Factor Analysis (EFA).
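For illustration only, the reliability index just described can be expressed outside of SPSS as a minimal sketch of Cronbach's alpha. The function and the small response matrix below are hypothetical placeholders rather than the study's data or procedure; the only assumptions carried over are the 4-point importance scale and the ≥ .700 acceptance threshold.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                  # items in the domain (e.g., 12)
    item_variances = items.var(axis=0, ddof=1)          # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical responses: six respondents answering four items on the 4-point scale.
pilot_domain = np.array([
    [4, 3, 4, 4],
    [3, 3, 3, 4],
    [2, 2, 3, 3],
    [4, 4, 4, 4],
    [3, 2, 3, 3],
    [4, 3, 4, 3],
])

print(f"Cronbach's alpha = {cronbach_alpha(pilot_domain):.3f}")  # values >= .700 treated as acceptable
```

Under this convention, the domain-level alpha values reported later in the pilot analysis (.821, .824, and .901) all exceed the .700 criterion.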
The goal in this study was to establish a survey instrument that possessed acceptable levels of validity, assessed through PCA, and reliability, assessed through Cronbach's alpha coefficient. The terms PCA and FA may be used interchangeably in this study; however, the intent of either notation (PCA or FA) is specific to the methods used in principal component analysis. As stipulated by Tabachnick and Fidell (2007), DeVellis (2003), and Field (2005), FA and PCA do not produce the same results. "FA produces factors, while PCA produces components" (Tabachnick & Fidell, 2007, p. 609). As noted by Field (2005):
there are two approaches to locating underlying dimensions of a data set: factor analysis and principal component analysis. These techniques differ in the communality estimates that are used. Simplistically, though, factor analysis derives a mathematical model from which factors are estimated, whereas principal component analysis merely decomposes the original data into a set of linear variables...only factor analysis can estimate the underlying factors and it relies on various assumptions for these estimates to be accurate. Principal component analysis is concerned only with establishing which linear components exist within the data and how a particular variable might contribute to that component. (p. 630)
Therefore, in this study the use of "factor" or "component" is interpreted as "component" within the methods of principal component analysis. To obtain statistical component output in SPSS, the options available within the factor analysis structure were configured as follows: 1) the number of factors for each independent domain in the SIT Model was set at 2 (determined by the researcher); 2) the rotation solution was set to Varimax; and, 3) loading values were ordered from high to low on the loaded components, with coefficients having absolute values below .4000 suppressed (Field, 2005; Pallant, 2007; Tabachnick & Fidell, 2007). The purpose of the PCA was not to research the survey instrument to its maximum potential, final conclusion, or a complex modeling structure, e.g., CFA, EFA; rather, principal component analysis was utilized in this study to obtain an "empirical summary of the data set" (Tabachnick & Fidell, 2007, p. 635) to support the validity of the survey instrument factors and scores. To initially test the validity of the survey instrument scores, a pilot study was conducted. The pilot study surveys were made available to respondents for a period of 30 calendar days. At the end of this period, the survey instruments were taken off-line and the data were processed. The pilot study data are presented in Chapter III as the precursor for the final data analysis in Chapter IV, which is based on the actual dataset collected from the voluntarily participating colleges (see Table 25 for colleges contacted). Using the Statistical Package for the Social Sciences (SPSS), the pilot study data scores were assessed. In terms of reported perceptions in the pilot dataset, Table 31 indicated the PCA component results for the independent variable domains. The analysis suggested that the factor loadings identified dimensions within the domains; however, according to Pallant (2007), Field (2005), and Tabachnick and Fidell (2007), the following statistical indicators suggested that the data contained sufficient elements to indicate that the survey instrument was valid.
The elements noted are: 1) Cronbach?s alpha coefficients of internal consistency of survey instrument scale scores (Cortina, 1993; Kline, 1999; Nunnaly, 1978; Pallant, 2007); and 2) the use of factor analysis or principal component analysis to indicate strength of correlation between survey question scores. The factor analysis indicators are discussed below: 1. Bartlett?s Test of Sphericity. Bartlett?s measure tests the null hypothesis that the original correlation matrix is an identity matrix. For factor analysis to work, we need some relationships between variables and if the R-matrix were an identity matrix, then all correlation coefficients would be zero. Therefore, we want this test to be significant (i.e. have a significance value less than .05). A significant test tells us that the R- matrix is not an identity matrix; therefore, there are some relationships between the variables we hope to include in the analysis. For these data, Bartlett?s test is highly significant (p < .001), and therefore factor analysis is appropriate (Field, 2005, p. 652); Bartlett?s Test of Sphericity should be statistically significant at p < .05. (Pallant, 2007, p. 185) 216 2. Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy. The KMO index has a range between 0 and 1, with suggested minimal adequacy as .6. For an index ? .6, the KMO indicates factors analysis is an appropriate statistical method. (Pallant, 2007; Tabachnick & Fidell, 2007) 3. Correlation Matrix. Coefficients within the matrix should have significant numerical values ? .3000. A lack of these values indicates that factor analysis may not be an appropriate statistical method. (Pallant, 2007; Tabachnick & Fidell, 2007) Table 31 provides the results of the principal component analysis of the pilot study data as reported by students and faculty. The scores were compiled in SPSS and assessed using factor analysis settings as previously noted, e.g., Varimax rotation. The results in Table 31 indicated that for each domain, Cronbach?s coefficient of internal consistency indicated a reliable survey instrument. The domains for the Strategic- Impact-Triad Model to assess the validity of the data reported were: Academic Preparation (.821), Work Ethics (.824), and Institutional Support (.901). Factor analysis measures for Bartlett?s Test of Sphericity, Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy, and the Correlation Matrix, respectively, were each within the range for statistical significance. Factor analysis calculations from the survey scale scores are indicated in Table 31. The Notes section of Table 31 indicated detailed output from SPSS to support the survey instrument as a valid instrument. Additional calculations are also show in Table 32. 217 Table 31 Principal Component Analysis for Pilot Study Variables (Independent Analysis) Academic Preparation Work Ethics Institutional Support Factor Factor Factor 1 2 1 2 1 2 Q11 .730 Q6 .773 Q11 .787 Q10 .708 Q10 .770 Q10 .777 Q9 .684 Q4 .725 Q9 .775 Q7 .676 Q7 .720 Q8 .741 Q6 .655 Q9 .695 Q6 .638 Q12 .576 Q5 .659 Q12 .615 .404 Q4 .508 Q11 .627 .423 Q5 .553 .429 Q5 .452 .406 Q8 .604 Q1 .797 Q8 .427 .409 Q2 .824 Q2 .773 Q2 .780 Q3 .803 Q3 .764 Q1 .759 Q1 .711 Q4 .572 Q3 .586 Q12 (1) .408 -.456 Q7 .468 .488 1. Cronbach?s Reliability Coefficient: .821 (Q1-12); ** Q1-3: = .624 * Q4-12: = .810 ** Q4, 6-7,9-12: = .785 * Q1-3,5,8: = .674 2. Kaiser-Meyer-Olkin Measures of Sampling Adequacy: KMO = .831 3. Bartlett?s Test of Sphericity: Sig. = .000 1. 
Cronbach's Reliability Coefficient: .824 (Q1-12); .880 (Q1-11)
** Q1-3 = .826; * Q4-11 = .870; ** Q4-7, 8, 9-10 = .856; * Q1-3, 11 = .807
2. Kaiser-Meyer-Olkin Measure of Sampling Adequacy: KMO = .869
3. Bartlett's Test of Sphericity: Sig. = .000
1. Cronbach's Reliability Coefficient: .901 (Q1-12)
** Q1-4 = .772; * Q5-12 = .884; ** Q6, 8-11 = .851; * Q1-5, 7, 12 = .835
2. Kaiser-Meyer-Olkin Measure of Sampling Adequacy: KMO = .907
3. Bartlett's Test of Sphericity: Sig. = .000
Notes: (** omitted cross-loadings for each factor)
1. Variable Q12 in the Work Ethics domain was reverse coded. Using all variables in the Work Ethics domain to perform the reliability analysis resulted in a Cronbach's alpha of .824; using only variables Q1-Q11 in the Work Ethics domain resulted in a Cronbach's alpha of .880.
2. Academic Preparation: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations;
3. Work Ethics: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations;
4. Institutional Support: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations;
5. Loadings on variables with values of < .4000 were not included in the analysis;
6. Bartlett's Test of Sphericity must be significant at p < .05 for factor analysis to be considered appropriate (Tabachnick & Fidell, 2007);
7. The Kaiser-Meyer-Olkin (KMO) index ranges from 0 to 1, with .6 suggested as the minimum for good factor analysis (Tabachnick & Fidell, 2007).
8. Correlation Matrix coefficient ratio for: a) Academic Preparation: 42% ≥ .3000 (60/144); b) Work Ethics: 72% ≥ .3000 (103/144); c) Institutional Support: 88.2% ≥ .3000 (127/144)
Within the work ethics domain in Table 31, variable/question #12 was worded as: "Earning an A by unethical methods." The intent of this question/variable was to measure the underlying construct of honesty/integrity within the domain of work ethics. Interestingly, although this question was reverse coded, it appeared to measure a latent construct with four dimensions. The dimensions were: 1) students who considered honesty/integrity an important work ethic in earning an A; 2) students who considered honesty/integrity an unimportant work ethic in earning an A, e.g., any means of earning an A is a solution; 3) faculty who considered honesty/integrity an important work ethic in earning an A; and, 4) faculty who reported that earning an A by unethical methods was a work ethic practiced by many students. Although this variable or characteristic within the domain of work ethics is an important variable for analysis, the measured outcomes in the pilot study were inconclusive. Therefore, the analysis indicated in Table 31 and Table 32 suggested that Question 12 of the work ethics domain requires significant modification for further study. Nevertheless, to further correlate the dimensions in Question 12 of the work ethics domain, the question was included in the final dataset process. Before analyzing the data indicated in Table 32, the dimensions of the results in Table 31 will be discussed as a methodology for general construct validation (reliability) within the framework of educational practices. For the academic preparation domain, the dimensional components which appear to be pertinent to the practices of academic preparation were: Q4, 6, 7, 9 -
12: instructor dependent practices; Q1 - 3, student dependent practices; Q5 and 8: online interdependent processes (practices). Q5 and 8 cross-loaded, indicating that these variables or practices are cross-relational between student and faculty effort for success. The domain of work ethics compiled as follows: Q4 ? 10: categorical ethics-skills development; Q1 ? 3: student participatory intent; Q11, time management. Q11 cross- 219 loaded, indicating that this variable is correlated to both student and faculty practices to maximize student success. Q12 is not correlated to any factor as its application in the PCA analysis distorts the applicability of the domain. And, for the domain of institutional practice, the loadings indicated the following: Q1 ? 4: direct student support; Q6, 8 ? 11: indirect student support; Q5, 7, 11: miscellaneous student support, and as cross-loaded variables, these items are institutional functions more than faculty or student functions. Table 32, indicated the version of the factor analysis which ?removed? the cross- loadings and restructured the constructs within each domain in terms of specific practices. Controversial in nature, cross-loadings have been identified as problematic in some scholarly circles and also a matter of researcher interpretation on the other hand (DeVellis, 2003; Field, 2005; Pett, Lackey & Sullivan, 2003; Pallant, 2007; Tabachnick & Fidell, 2007). For this study, the highest factor loadings per variable were used to interpret the results acknowledging that further development of the survey instrument specific to analytical depth of FA, CFA, or EFA?not PCA?was warranted. Notwithstanding, all indicators were that the pilot study statistical analysis suggested acceptable levels of validity resulting from the pilot dataset. As suggested by Pallant (2007) and Pett, Lackey and Sullivan (2003), coefficient groupings per factor were processed to determine the reliability coefficient within each domain. For example, the researcher of this study determined that the highest loadings per domain would be grouped and Cronbach?s alpha determined. Those internal consistency results are indicated in Table 32, as well as the revised constructs or dimensions within each domain. 220 Table 32 Adjusted Principal Component Analysis for Pilot Study Variables (Independent Analysis) Academic Preparation Work Ethics Institutional Support Factor Factor Factor 1 2 1 2 1 2 Q11 .730 Q6 .773 Q11 .787 Q10 .708 Q10 .770 Q10 .777 Q9 .684 Q4 .725 Q9 .775 Q7 .676 Q7 .720 Q8 .741 Q6 .655 Q9 .695 Q6 .638 Q12 .576 Q5 .659 Q12 .615 Q4 .508 Q11 .627 Q5 .553 Q5 .452 Q8 .604 Q1 .797 Q8 .427 Q2 .824 Q2 .773 Q2 .780 Q3 .803 Q3 .764 Q1 .759 Q1 .711 Q4 .572 Q3 .586 Q12* .408 -.456 Q7 .488 Cronbach?s Reliability Coefficient: .821 (Q1-12) Reliability ? Factor 1 = .810 (Q4 ? Q12) Reliability ? Factor 2 = .624 (Q1 ? Q3) Cronbach?s Reliability (Q1-12) Coefficient: .824 (Q1-11: 880) Reliability ? Factor 1 = .870 (Q4 ? Q11) Reliability ? Factor 2 = .826 (Q1 ? Q3) Reliability ? Factor 2 = .249 (Q1 ? Q3, Q12) Cronbach?s Reliability Coefficient: .901 (Q1-12) Reliability ? Factor 1 = .879 (Q5 ? Q6, Q8 - Q12) Reliability ? Factor 2 = .795 (Q1 ? Q4, Q7) Factor 1: Instructor Derived Practices Factor 2: Student Derived Practices Factor 1: Categorical Ethics Factor 2: Student Driven Intent Factor 1: Direct Institutional Support Factor 2: Indirect Institutional Support Notes (see Table 27 for additional statistical details.) 1. Academic Preparation: Extraction Method: Principal Component Analysis. 
Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations; 2. Work Ethics: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations; 3. Institutional Support: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 3 iterations; 4. Loadings on variables with absolute values of < .4000 were not included in the analysis. As indicated in Table 32, the principal component analysis of the three domains of the SIT Model suggested that for each domain there were two dimensions per domain. Although the SIT Model suggested that sub-scales may be present, all statistical indicators lead the researcher to conclude that the survey instruments have demonstrated that reliability and validity had been established. Included in the methods to ascertain the 221 validity of the survey instrument scores for further use in the final dataset collection process, the three domains of the SIT Model were analyzed interdependently. Cronbach?s internal consistency results of the independent factor analysis of the SIT Model domains are as follows: 1) academic preparation (.821), with factor 1 (.810), factor 2 (.624); 2) work ethics (.824), with factor 1 (.870), factor 2 (.826), and 3) institutional support (.901), with factor 1 (.879) and factor 2 (.795). These indicators suggested that the methods used in the survey instrument were measuring valid results. To validate student success in terms of a composite of the institutional practices in this study as interdependent variables, each set of 12 scores were combined and assessed in SPSS as one dataset. Cronbach?s internal consistency of the interdependent analysis of the SIT Model domains (academic preparation, work ethics and institutional support) was .931. This assessment also included principal component analysis using Bartlett?s Test of Sphericity, Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy, and the Correlation Matrix, respectively; each indicator was within the range suggesting statistical significance. Factor derivatives were not assessed in the 36-item evaluation. The purpose in statistically assessing the 36-items in this study was to test the scores for underlying problems which may result from relationships between each domain. For example, each domain suggested that the scores presented by faculty and student respondents suggested score reliability; nevertheless, to assess the constructs within the total set of 36 variables or questions, Cronbach?s alpha was calculated, as well as significance within PCA. Additionally, means and standard deviations are also shown in Table 33 for item-by-item interpretation relative to item variance and reliability. 222 Table 33 Composite 36-Item Reliability Analysis SIT Model Domain Questions/Variables: AP = Academic Preparation; WE ? Work Ethics; IS ? 
Institutional Support Stu Mean Stu SD Fac Mean Fac SD Academic Preparation Variables: Q1-12 1 Writing Assignments 3.217 0.742 3.290 0.797 2 Reading the textbook 3.328 0.723 3.452 0.619 3 Students getting feedback on assignments and tests 3.678 0.535 3.581 0.529 4 Having instructors as advisors 3.544 0.637 3.194 0.827 5 Using email to give help with class material 3.206 0.830 2.790 0.792 6 Instructors who challenge and encourage students 3.617 0.610 3.742 0.477 7 Designing labs with real-world exercises 3.589 0.596 3.581 0.560 8 Having online study guides to help students learn 3.111 0.865 2.677 0.919 9 Tests that actually cover the material taught 3.800 0.415 3.694 0.516 10 Giving students help during office hours 3.583 0.597 3.597 0.527 11 Giving students feedback about progress in a course 3.656 0.563 3.645 0.546 12 Designing a syllabus that is a learning guide 3.544 0.646 3.226 0.818 Work Ethics Variables: Q1-12 (13-24) 1(13) Showing up for class on time 3.689 0.562 3.710 0.458 2(14) Students take initiative to make up work due to absences 3.794 0.445 3.839 0.371 3(15) Attending class regularly 3.778 0.467 3.855 0.355 4(16) Appearance 3.344 0.765 2.903 0.900 5(17) Being a team player in group projects 3.600 0.565 3.452 0.563 6(18) Helping other students succeed 3.339 0.756 3.065 0.744 7(19) Students improving their organizational skills 3.644 0.556 3.500 0.536 8(20) Treating people with respect 3.794 0.419 3.565 0.562 9(21) Instructors giving students feedback on their work ethics 3.578 0.651 3.532 0.593 10(22) Hearing from business/community leaders about work ethics 3.133 0.868 3.048 0.858 11(23) Being an effective manager of time 3.650 0.534 3.645 0.515 12(24) Earning an A by unethical methods 2.356 1.319 1.581 1.095 Institutional Support Variables: Q1-12 (25-36) 1(25) Having problems resolved satisfactorily 3.561 0.581 3.403 0.527 2(26) Perceiving faculty/staff/admin as accessible and helpful 3.611 0.583 3.565 0.532 3(27) Feeling safe on campus to study 3.700 0.527 3.645 0.482 4(28) Getting help in finding meaningful employment 3.581 0.677 3.290 0.776 5(29) Permission to call any individual associated with the college 3.372 0.740 2.758 0.803 6(30) Online registration is available when needed 3.467 0.646 3.306 0.737 7(31) Being in classrooms that are clean 3.589 0.547 3.468 0.564 8(32) Understanding the mission of the college 3.335 0.742 2.710 0.930 9(33) Student organizations that enrich the learning experience 3.361 0.768 3.226 0.663 10(34) Feedback to administrators on how to improve the college 3.408 0.700 3.258 0.676 11(35) Having community services published on the web site 3.228 0.797 2.839 0.793 12(36) Resources for student support are reliably accessible 3.556 0.591 3.484 0.593 Notes: 1. (N = 265) 2. Scale: (1) Not Important, (2) Somewhat Important, (3) Important, and (4) Very Important 3. Cronbach?s Reliability Coefficient for Internal Consistency: .931 4. Kaiser-Meyer-Olkin Measure of Sampling Adequacy: KMO = .908 5. Bartlett?s Test of Sphericity: a. Approx. Chi-Square, 4358.660; b. df = 630; c. Sig. = .000 6. Correlation Matrix table identified 48.23% loading coefficients as ? 0.3000 (625/1296). 7. Principal Component Analysis was not processed in the composite scale; this action is reserved for further study. 223 In each survey, students and faculty were asked to rate the abilities of students in terms of general practices impacting student success. For example, students and faculty were asked to rate the leadership abilities of students. 
Leadership was the practice being assessed and the measure being tested was whether the perception of these sample groups was more likely to agree or disagree. This section of the survey was to establish a benchmark that students and faculty did not agree in terms of factors related to self- perceptions of student abilities, e.g., student success. The data was collected in a 15 item set with a scale of: 1) Below Average, 2) Average, and 3) Above Average. This scale established a benchmark from which perceptions would be compared and used to respond to Research Question IV. For example, using the 15 items in the self-reported abilities, the variances between students and faculty established that a strong correlation existed between previous studies and the data collected in this study?which supported validity (expected outcomes) of the survey scores. As indicated in Table 34, Cronbach?s reliability coefficient of internal consistency for the self-reported (abilities/practices) pilot study dataset was .919. This assessment also included principal component analysis using Bartlett?s Test of Sphericity, Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy, and the Correlation Matrix, respectively; each indicator was within the range suggesting statistical significance. Factor derivatives were assessed and identified, but were not detailed in terms of FA, EFA or CFA, as validity of the survey instrument scores was of paramount concern. 224 Table 34 Principal Component Analysis of Self-Reported Student Abilities (Practices) Rotated Component Matrix(a) : (N = 265) Factor Faculty Student Cronbach?s Reliability Coefficient: .919 1 2 Mean/SD Mean/SD I15 ? Work Ethic .808 1.677/0.536 2.570/0.546 I4 ? Motivation to succeed in college .805 2.048/0.638 2.730/0.480 I10 ? Enjoy learning new things .767 2.113/0.704 2.720/0.483 I3 ? Team player .762 2.081/0.489 2.290/0.538 I9 ? Respect for others .728 2.048/0.585 2.780/0.452 I6 ? Producing quality work .702 1.871/0.586 2.500/0.531 I14 - Leadership .692 1.790/0.547 2.390/0.539 I11 ? Reading ability .620 1.710/0.584 2.490/0.578 I2 ? Writing ability .592 1.694/0.499 2.290/0.538 I5 ? Oral presentations .587 1.726/0.518 2.120/0.603 I12 ? Time management .586 1.661/0.571 2.190/0.602 I1 - Attendance .511 2.129/0.586 2.530/0.540 I13 ? Math skills .771 1.694/0.499 2.130/0.604 I8 ? Success in high school .719 1.903/0.503 2.300/0.607 I7 ? Computer skills .693 2.032/0.572 2.290/0.645 1. Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a Rotation converged in 3 iterations. 2. Factor 1: Cronbach?s Coefficient Alpha for internal consistency: ? = .919 3. Factor 2: Cronbach?s Coefficient Alpha for internal consistency: ? = .669 4. Kaiser-Meyer-Olkin Measure of Sampling Adequacy: KMO = .927 5. Bartlett?s Test of Sphericity: a. Approx. Chi-Square, 1985.468; b. df = 105; c. Sig. = .000 6. Factor 1: Perceived Ethics: Factor 2: Perceived Skills Proficiency 7. Correlation Matrix table identified 81.3% loading coefficients as ? 0.3000 (183/225). Table 35 provided a summary of the pilot data score analysis. Faculty and student (independent variables) scores were reported in the SIT Model domains of academic preparation, work ethics, and institutional support, as well as self-reported abilities (practices) and the SIT Model grouped-domain variables (dependent variables). 
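As a minimal sketch only, the factorability indicators reported above (Bartlett's Test of Sphericity, the KMO index, and varimax-rotated loadings with small coefficients suppressed) could be approximated outside of SPSS with the Python factor_analyzer package. The synthetic response matrix, the variable names, and the minres extraction used by the package are assumptions for illustration; they are not the study's data, and the extraction method differs from the principal-components procedure actually used.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical pilot responses: 120 respondents by 12 domain items on the 4-point scale.
rng = np.random.default_rng(0)
latent = rng.normal(size=(120, 2))                       # two underlying dimensions
raw = latent @ rng.normal(size=(2, 12)) + rng.normal(scale=0.8, size=(120, 12))
items = pd.DataFrame(np.clip(np.round(raw + 2.5), 1, 4),
                     columns=[f"Q{i}" for i in range(1, 13)])

# Bartlett's Test of Sphericity: a significant result (p < .05) supports factorability.
chi_square, p_value = calculate_bartlett_sphericity(items)

# Kaiser-Meyer-Olkin Measure of Sampling Adequacy: an overall KMO >= .6 supports adequacy.
_, kmo_total = calculate_kmo(items)

# Two-factor solution with Varimax rotation, mirroring the SPSS configuration;
# extraction here is the package default (minres), not principal components.
fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns, columns=["Factor 1", "Factor 2"])

print(f"Bartlett chi-square = {chi_square:.2f}, p = {p_value:.4f}; KMO = {kmo_total:.3f}")
print(loadings.where(loadings.abs() >= 0.40).round(3))   # suppress loadings below .40
```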
Specific to homogeneity of the scores within the respective domains, Cronbach?s alpha coefficient of internal consistency for reliability ranged from .821 to .919, indicating strong inter- item correlations in each dependent variable. To evaluate significance and other factors related to validity, ANOVAs were used to evaluate domain scores. 225 To assess the validity of the survey instrument scores as measuring what they claimed to measure in this study, ANOVAs were used to compare group means in the domains of the SIT Model. The ANOVAs suggested validity in terms of statistical evidence that the outcomes of the scores suggested that differences in the group perceptions existed and that these differences were statistically significant. The statistical evidence indicated that the study measured what it intended to measure, e.g., test scores were appropriate and meaningful for assessing student and faculty perceptions of student success in the domains of the SIT Model. The information in Table 35 provided a summary of the pilot data score analysis as a method to suggest that the survey instruments? scores possessed validity and reliability (Chaudron, 2006; DeVellis, 2003; Field, 2005; Fowler, 2002; Pallant, 2007; Pett, Lackey & Sullivan, 2005; Shannon & Davenport, 2001; Tabachnick and Fidell, 2007). Table 35 Pilot Data ANOVAs Indicating Validity, Reliability and Significance * SIT Model Factors Source df ? F *p Observed Power ?2 Levene?s /Sig FA/PCA (Table #) Self-Reported Abilities 1, 254 .919 124.632 .000 1.000 .329 3.585/.059 34 Institutional Support 1, 240 .901 11.286 .001 .929 .047 4.400/.037 31,32 Academic Preparation, Work Ethics, and Institutional Support 1, 240 .931 10.201 .002 .889 .041 .606/.437 N/A Work Ethics 1, 240 .824 7.892 .005 .799 .032 .322/.571 31,32 Academic Preparation 1, 240 .821 4.271 .040 .539 .017 .771/.381 31,32 * p < .05 (Faculty N = 68; Student N = 197.) [For adjusted multiple comparisons, see Table 36]. 226 Specific to the data provided in Table 35, this study did not invoke the methodology of a priori to analyze differences in student and faculty groups. However, as a result of using the same independent variables to compare multiple variances (relative to Type I and Type II error avoidance), an a posteriori analysis was applied to the pilot study data. The technique used was the Bonferroni-Holm (BH) pairwise- comparison correction model of the alpha level (Aickin, 2004; Pallant, 2007). The Bonferroni-Holm model has a logical ordering structure of p1 < p2 < ... < pn (Aickin, 2004, p. 183). In terms of multiple comparisons, the algorithm is indicated as: if p1 < ?/n, reject the corresponding null hypothesis and continue; if p2 < ?/(n-1), reject the corresponding null hypothesis and continue; if p3 < ?/(n-2), reject the corresponding null hypothesis and continue; ?until all tests are evaluated using the Bonferroni -Holm methodology. As is indicated in Table 36, the p-values associated with the ANOVA (see Table 35), were adjusted using the BH algorithm. The data processed in the Bonferroni-Holm algorithm resulted in the following adjusted p-values, noting that the independent variables (IV) are constant, e.g., students and faculty. The following methodology to evaluate the pilot data using multiple measurements is provided: 1. Measurement 1: IV to DV, Self-Reported Abilities: ?/n (.05/5) = .01; p1 = .000. Measurement 1 (p1 < ?/n, .000 < .01) rejects the corresponding null hypothesis that students and faculty view student abilities the same; 2. 
Measurement 2: IV to DV, Institutional Support: α/(n-1) (.05/4) = .0125; p2 = .001. Measurement 2 (p2 < α/(n-1), .001 < .0125) rejects the corresponding null hypothesis that students and faculty view institutional support the same;
3. Measurement 3: IV to composite DVs (academic preparation, work ethics, and institutional support): α/(n-2) (.05/3) = .0167; p3 = .002. Measurement 3 (p3 < α/(n-2), .002 < .0167) rejects the corresponding null hypothesis that students and faculty view the overall combined practices of academic preparation, work ethics, and institutional support the same;
4. Measurement 4: IV to DV, Work Ethics: α/(n-3) (.05/2) = .025; p4 = .005. Measurement 4 (p4 < α/(n-3), .005 < .025) rejects the corresponding null hypothesis that students and faculty view work ethics the same; and,
5. Measurement 5: IV to DV, Academic Preparation: α/(n-4) (.05/1) = .05; p5 = .040. Measurement 5 (p5 < α/(n-4), .040 < .05) rejects the corresponding null hypothesis that students and faculty view academic preparation the same.
Table 36
Pilot Data ANOVAs, Bonferroni-Holm (BH) Adjusted Correction Model
SIT Model Factors | Source df | α | F | p* | BH | Observed Power | η² | Levene's/Sig | FA/PCA (Table #)
Self-Reported Abilities | 1, 254 | .919 | 124.632 | .000* | .0100 | 1.000 | .329 | 3.585/.059 | 34
Institutional Support | 1, 240 | .901 | 11.286 | .001* | .0125 | .929 | .047 | 4.400/.037 | 31, 32
Academic Preparation, Work Ethics, and Institutional Support | 1, 240 | .931 | 10.201 | .002* | .0167 | .889 | .041 | .606/.437 | N/A
Work Ethics | 1, 240 | .824 | 7.892 | .005* | .0250 | .799 | .032 | .322/.571 | 31, 32
Academic Preparation | 1, 240 | .821 | 4.271 | .040* | .0500 | .539 | .017 | .771/.381 | 31, 32
* Significant p-value after the Bonferroni-Holm (BH) step-down correction methodology: (.05/5 = .01; .05/4 = .0125; .05/3 = .0167; .05/2 = .025; .05/1 = .05)
As part of the methodology of the pilot study, data outcomes were reviewed in terms of Type I and Type II errors. To reduce the chance of reaching the wrong conclusion when performing analysis of variance procedures, this study compared the pilot dataset outcomes against the principles of Type I and Type II errors. A Type I error indicates that "we may reject the null hypothesis when it is, in fact, true" (Pallant, 2007, p. 205); conversely, a Type II error occurs when "we fail to reject a null hypothesis when it is, in fact, false, i.e., believing that the groups do not differ, when in fact they do" (Pallant, 2007, p. 205). A review of the output suggested that the coefficients and statistical values indicated that neither a Type I nor a Type II principle had been violated, setting the stage for the instruments to be used for the final dataset. Both types of error concern the decision to reject, or fail to reject, the null hypothesis, viewed from different interpretive perspectives. The original hypothesis for this study was that students and faculty do not agree in how they respectively perceive student success; specifically, in how the practices of academic preparation, work ethics, and institutional support are perceived. In an effort to avoid a Type I or Type II error, statistical significance was reviewed using various methodologies, values, and indicators. Because a Type I error is to reject the null hypothesis (there is no difference in student and faculty perceptions) when it is actually true (students and faculty, in fact, have similar perceptions), erroneous conclusions may be drawn or erroneous decisions made, e.g., misguided policies and practices.
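For illustration, the step-down logic enumerated in Measurements 1 through 5 can be reproduced with a short sketch: a one-way ANOVA per dependent variable, followed by the Bonferroni-Holm comparison of each ordered p-value against α/n, α/(n-1), and so on. The group score arrays below are hypothetical placeholders, not the pilot data, and scipy's f_oneway stands in for the SPSS ANOVA procedure.

```python
from scipy import stats

def holm_stepdown(p_values, alpha=0.05):
    """Bonferroni-Holm: test the smallest p at alpha/n, the next at alpha/(n-1), and so on."""
    n = len(p_values)
    order = sorted(range(n), key=lambda i: p_values[i])   # indices sorted by ascending p
    reject = [False] * n
    for step, i in enumerate(order):
        if p_values[i] < alpha / (n - step):
            reject[i] = True
        else:
            break                                         # once a test fails, retain the rest
    return reject

# Hypothetical domain scores for the two independent-variable groups (students, faculty).
measures = {
    "Self-Reported Abilities": ([2.6, 2.7, 2.5, 2.8, 2.4], [1.7, 1.9, 1.6, 2.0, 1.8]),
    "Institutional Support":   ([3.5, 3.6, 3.4, 3.7, 3.3], [3.3, 3.2, 3.4, 3.1, 3.3]),
    "Work Ethics":             ([3.6, 3.5, 3.7, 3.4, 3.6], [3.4, 3.3, 3.5, 3.2, 3.4]),
}

p_values = []
for name, (students, faculty) in measures.items():
    f_stat, p = stats.f_oneway(students, faculty)         # one-way ANOVA with two groups
    p_values.append(p)
    print(f"{name}: F = {f_stat:.3f}, p = {p:.4f}")

print("Reject null after Holm correction:", holm_stepdown(p_values))
```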
Therefore, for this study, the Bonferroni-Holm p-value adjustment model of multiple comparisons was used to reduce the increased problem with Type I errors when making multiple comparisons between the IV and multiple DV?s. Type II errors were reviewed in light of the statistical analysis using the Bonferroni-Holm model; 229 the data and outcomes indicated that a Type II error had not been assumed (students and faculty view student success the same, but that construct was rejected). Finally, the methodology of the pilot study dataset and analytical analysis was conducted within the statistical principles of the following assumptions (Pallant, 2007): 1) to address variances between the groups, the dependent variable(s) utilized a continuous scale to more accurately measure significant levels of variance; 2) random sampling was practiced; 3) respondents were independent of one another; 4) the samples were extracted from populations which were normally distributed; and, 5) that samples were obtained from populations of equal variance. Data Collection and Procedures This study was conducted to assess the variances in perceptions of students and faculty as related to evaluating and improving strategic factors that influence student achievement. As a working framework, this study applied procedures of data collection and analysis similar to research practices used in business organizations (Chaudron, 2006). To obtain perceptual data from students and faculty, online surveys were administered to students and faculty. Faculty and student respondents who choose to participate were directed to separate web portals to access the respective online surveys. Students were directed to the Student and Faculty Perceptions of College Student Success: STUDENT SURVEY, while faculty were directed to the Student and Faculty Perceptions of College Student Success: FACULTY SURVEY. (Online surveys were available via an SSL-encrypted link to SurveyMonkey.com.) 230 Initially, the survey instruments were developed in a paper-format and mailed to respective participating colleges. In response to extremely low return rates and incomplete surveys, the paper surveys were quickly discarded and replaced with globally accessible online surveys. The research protocols were each (paper and online) approved by the Auburn University Institutional Review Board for the Use of Human Subjects in Research. The initial Protocol Approval form is displayed in Appendix I, whereas the Protocol Modification document is shown in Appendix J. In order to facilitate maximum data collection, close coordination between the college liaison and researcher was maintained throughout the data collection process. The data collection relationship between local college contact and principal investigator included both pilot study and final dataset phases of this study (see Table 24 and Table 26). Utilizing an advanced technology, the method established to collect the data was a simple process of providing students or faculty links to respective web portals. Each web portal included information sheets and respective surveys. (Web portals are defined in this study as a specific http link to a web server.) To provide easy access for respondents to self-report their perceptions, the surveys were designed and modified using SurveyMonkey.com. 
The procedure for the actual data collection was a process of forwarding the links to college liaisons, who in turn, forwarded the links to listserv web or email servers for faculty or students, e.g., allfaculty@localcollege.edu, allstudents@localcollege.edu. For colleges that did not have listserv functions in place for students, listserv functions for faculty was the only method of reaching out to student respondents. Faculty were sent both sets of links and forwarded these links to students either in class mailing lists or informing students by sundry means that the survey was 231 available, if they desired to voluntarily participate. Although the method may seem simple in practice, the composite framework of designing and upgrading the survey and the technology of implementation and data collection required considerable man-hours of serious effort. SurveyMonkey.com provided a web-based, interactive object-oriented database software application which allowed the researcher to design sections, sub-sections, scales, and various question configurations throughout the survey instrument. The sections and questions were varied as required by the researcher and suggested by the survey review panel. For example, Liker-scale items were a simple ?point-and-click? process, as well as many other types and formats of questions. The design method of SurveyMonkey.com is geared to all types of qualitative and quantitative studies, inclusive of single-mode or mixed-method research. Data coding was initially accomplished using SurveyMonkey.com. The data were formatted in SurveyMonkey.com for import into Microsoft Excel 2007. Data were examined in SPSS using statistical procedures to determine reliability, alpha-coefficients, factor analysis (principle component analysis), and analysis of research questions. (The detail of the analysis was discussed in the section on Reliability and Validity.) Data analysis consisted of two main statistical functions: 1) qualitative analysis, and 2) quantitative analysis. First, open ended questions were provided to both students and faculty to extract data themes relevant to the study. Section 5 of the student survey and Section 7 of the faculty survey included open ended questions. For example, to access themes associated with student success, Question #2 asked: How can community colleges help students or faculty acquire and practice good work ethics? Question 2 was 232 used to seek the open-ended responses from faculty and students so that common themes might emerge. The researcher used the open-ended responses to gain insight into scaled responses which may not have been evident in the scaled responses. The comments were processed in Microsoft Word 2007, and searches were conducted for commonalities within the comments from both student and faculty respondents. According to Creswell (2003), commonalities in qualitative studies provide phrases or metaphors which are linked parts to the whole. Therefore, the comments from faculty and students were used as ?information data-words? to identify word associations to the scaled data in the survey questions. The open-ended responses are discussed as part of the analysis or each research question in Chapter IV. Responses were also correlated to the SIT Model domains to provide additional reliability and validity of the data collected during this study. This correlation was addressed by matching themes to specific domains. 
Second, to statistically assess the value, consistency, and variability of the scaled data, a one-way ANOVA was used. The ANOVA was selected as a method to measure the perceptual impact of the independent variables (faculty and students) on the Strategic-Impact-Triad Model dependent variables of academic preparation, work ethics, and institutional support, within the context of institutional practices impacting community college student success. Additionally, descriptive and comparative methods were included in this study to inform the community college of the depth of concern that the community college system of education should foster for maximizing student success and achievement at all levels of the Teaching-Learning-Assessment Domain.

Confidentiality and Anonymity

Regarding the respondents who participated in the pilot study and the final dataset collection process, this study practiced the highest levels of integrity in maintaining the anonymity of all respondents. Any participant who accidentally or voluntarily provided personal identification in the open-ended questions was assured that this identifying information was treated confidentially. As Moss (1998) suggested in The Role of Consequences in Validity Theory, "the definition of validity is not just an interesting philosophical question; it can be seen to have real ethical, political, and economic consequences" (p. 6). Although the discussion by Moss (1998) was related to the domain of validity resulting from measurement practices, the larger context of that construct extends to the protection of human subjects. Consequently, for this study it was imperative that the participants understood that their perceptions and opinions (or unintended self-reported personal identification) were provided in absolute confidentiality. Under no circumstances were any responses or unintentional identifications provided to any individual, institution, organization, or entity; doing so would have violated the trust of participants to remain anonymous. Contextually, this study used methods to ensure that the "real ethical, political, and economic consequences" (Moss, 1998, p. 6) of data collection and reporting were properly obtained, but never at the expense of the anonymity of any individual participant or group of participants.

Chapter Summary

This chapter provided a roadmap as to how the methods of the study unfolded. A survey was developed by the researcher, tested using pilot data, and analyzed in SPSS. The outcomes of the SPSS data indicated that the survey instruments possessed elements of validity, reliability, and functionality. Additionally, the survey instruments were reviewed by appropriate individuals, and the results of the feedback indicated a viable research instrument. Descriptive analysis using factor analysis (principal component analysis) was presented as an indication that the variability within the domains of the study correlated to an acceptable level. This outcome provided evidence that the survey instruments were acceptable for continuing the study by collecting the final dataset, as previously noted in Chapter III. The pilot study, while attaining fewer respondents than expected, supported the methodological design of the study, e.g., acceptable levels for Cronbach's alpha, observed power, Levene's Test of Equality of Error Variances, factor analysis (principal component analysis), the Bonferroni-Holm p-value adjustment, and significant outcomes (p < .05).
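For reference, the Bonferroni-Holm step-down adjustment applied in the pilot analysis can be sketched in a few lines; the p-values below are hypothetical and are not drawn from the pilot or final datasets, and the study itself conducted the adjustment within its SPSS-based workflow.

    def holm_adjust(p_values, alpha=0.05):
        """Bonferroni-Holm step-down adjustment.
        Returns (original p, adjusted p, reject?) in the input order."""
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])  # ascending p
        adjusted = [None] * m
        running_max = 0.0
        for rank, idx in enumerate(order):
            adj = min((m - rank) * p_values[idx], 1.0)
            running_max = max(running_max, adj)  # enforce monotonicity
            adjusted[idx] = running_max
        return [(p, adj, adj < alpha) for p, adj in zip(p_values, adjusted)]

    # Hypothetical p-values for three dependent-variable comparisons
    print(holm_adjust([0.018, 0.342, 0.001]))
    # -> the smallest p is tested against alpha/3, the next against alpha/2,
    #    and the largest against alpha, which controls the familywise error rate

This is why the adjustment reduces Type I error inflation relative to testing each comparison at an unadjusted alpha of .05.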
Additionally, the qualitative and quantitative elements of the survey instruments were explained; analysis of these sections indicated that, for respondents who completed the survey, all portions of the survey were viable and valid collection components within the instrument domains of interest. Chapter IV will present the statistical content, detail, and analysis of the final dataset. In addition, each domain will be statistically and descriptively assessed. The dataset will be presented in two forms: 1) statistical form (quantitative), and 2) descriptive form (qualitative). Correlation between statistical and descriptive data will also be presented, including qualitative data from the pool of respondents.

CHAPTER IV

RESULTS

"In faith, there is enough light for those who want to believe and enough shadows to blind those who don't." --- Blaise Pascal

"The world we've made as a result of the level of thinking we have done thus far creates problems that we cannot solve at the same level at which we created them." --- Albert Einstein

Introduction

Community colleges have become an open door for more than 11 million full- and part-time students across the nation (American Association of Community Colleges, 2006a, 2006b, 2007; Phillippe & Sullivan, 2005). With the arrival of each student, there is an influx of variations in attitudes, experiences, educational backgrounds, and perceptions. These variables are inherent in every student in the community college system of education, as well as in every faculty member, whether that faculty member is full-time or part-time. Moreover, faculty members also have attitudes, experiences, educational backgrounds, and perceptions which do not necessarily align with those of students. This study sought to understand the differences or similarities in how students and faculty express their respective attitudes, experiences, educational backgrounds, and perceptions. Chapter IV will explore the final dataset for this study. The final dataset will be used as the focal point to statistically and descriptively respond to the research questions posed in Chapter I. The research questions used in this study were:

1. What is the relationship between faculty and students' perceptions in assessing the impact that academic preparation has on the success of the college student?
2. What is the relationship between faculty and students' perceptions in assessing the impact that work ethics has on the success of the college student?
3. What is the relationship between faculty and students' perceptions in assessing the impact that institutional support has on the success of the college student?
4. What is the relationship between faculty and students' perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support?

Characteristics of the Sample

As indicated in Chapter III (see Table 26), the initial set of potential participating colleges for the final dataset was quite extensive. The potential samples from the original total populations were 121,753 students and 6,557 faculty. During a period of 30 calendar days in which letters were sent to college presidents, 6 of the 18 colleges chose to participate, a participation rate of 33%. The revised potential sample size included 51,771 students and 3,073 faculty members.
In the final analysis of the collection process, a total of 396 students and 152 faculty members participated in the study. Community or technical colleges that voluntarily participated are shown in Table 37. Upon receipt of approval letters from college presidents (see samples, Appendix K and Appendix L), surveys and information letters were sent to the college liaison (see Appendix A, B, C & D). Fifteen days after transmitting the web portals, a Letter of Appeal (see Appendix M) was sent to encourage increased participation. The surveys were accessible in SurveyMonkey.com for a period of 30 calendar days, with follow-up phone calls and emails from the principal investigator during this time period.

Table 37
Participating* Community and Technical Colleges Surveyed for the Final Dataset

Community or Technical College | Location | Students | Faculty P/F: T | Participation
Alabama Southern Comm Coll | Monroeville, AL | 2,548 | 49/58: 107 | No (Note 5)
Altamaha Technical College | Jesup, GA | 1,921 | 65/41: 106 | No (Note 6)
Bevill State Comm College | Sumiton, AL | 6,513 | 329/121: 450 | No (Note 5)
Bishop State Comm College | Mobile, AL | 4,074 | 94/119: 213 | No (Note 5)
*Calhoun Comm College | Decatur, AL | 9,345 | 480/134: 614 | Yes*
*Central Georgia Technical Coll | Macon, GA | 4,873 | 375/109: 484 | Yes*
Chattahoochee Valley Comm Coll | Phenix City, AL | 2,049 | 85/29: 114 | No (Note 5)
*Florida Comm Coll at Jacksonville | Jacksonville, FL | 23,700 | 346/375: 721 | Yes*
*Gadsden State Comm College | Gadsden, AL | 5,040 | 369/162: 531 | Yes*
James H. Faulkner St. Comm Coll | Bay Minette, AL | 3,332 | 150/66: 216 | No (Note 5)
*Jefferson Davis Comm College | Brewton, AL | 1,084 | 58/47: 105 | Yes*
*Jefferson State Comm College | Birmingham, AL | 7,729 | 480/138: 618 | Yes*
Northeast Alabama St. Comm Coll | Rainsville, AL | 2,789 | 170/41: 211 | No (Note 5)
Reid State Technical College | Evergreen, AL | 662 | 27/23: 50 | No (Note 5)
Southern Union State Comm Coll | Opelika, AL | 4,731 | 197/95: 292 | No (Note 5)
T.A. Lawson State Comm College | Birmingham, AL | 5,595 | 155/108: 236 | No (Note 5)
Valencia Comm College | Orlando, FL | 29,636 | 861/298: 1,159 | No (Note 6)
Wallace State Comm College | Hanceville, AL | 6,132 | 199/131: 330 | No (Note 5)
Totals: 121,753 students; 6,557 faculty
Adjusted Totals (Participants)*: 51,771 students; 3,073 faculty

Notes:
1. Faculty in the dataset includes estimates of full- and part-time (analysis did not factor sub-groups of part-time faculty). P/F: T equates to Part-Time/Full-Time: Total.
2. Source a) http://nces.ed.gov/ipedspas/reportOnVars.asp
3. Source b) http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/ftptpersonnel.aspx
4. Source c) http://www.acs.cc.al.us/facts/2006-2007/enrollment/fall/stuhdctbygenrace.aspx
5. College did not respond.
6. College responded, but could not participate.

Student Participants

Student demographic data requested in the survey described student participants in terms of age, gender, employment, ethnicity, educational goals, and family status. Moreover, demographics contributed an important set of descriptive data to better understand who the student respondents were in the study. Many students in the community or technical college are older, have families, and work to support those families. This information is important when analyzing the results of reported data in terms of student success.
These variables are valuable to the community college system as the institutions move forward to improve academic preparation, work ethics, or institutional support practices and policies to enhance the success of students across the full spectrum of success. As indicated in Table 38, the actual number of student respondents was 396, or a return rate of less than 1% for the final student dataset. A composite return rate for both students and faculty members was approximately 1%. Student data related to ethnicity included the following: White (non-Hispanic), 68.4%; African-American (non-Hispanic), 24.7%; Hispanic (Latino/Latina), 2.0%; Asian/Pacific Islander, 2.0%; American Indian/Alaskan Native, 0.0%; and Other, 2.8%. The survey instrument did not provide categorical options for the "other" category. Detailed demographic data for students are provided in Appendix N. Generally for students, female participants outnumbered male participants by a ratio of 293:103, or 74% female students to 26% male students, a nearly 3:1 margin. Comparatively, students ranged in age as follows: 19-24, 32.3%; 25-34, 33.6%; and 35 or older, 34.1%, with 67.7% being 25 years of age or older. For the community college, research indicated that students tend to be older than the traditional 19-24 age range (Adelman, 2005; Horn, Nevill & Griffith, 2006). The results of this study were consistent with that national data on community college student ages. Additional student demographic data indicated that 29.3% of students were first-time attendees, whereas 55.6% reported they were returning or transfer students. Among the student respondents, 20.4% noted that they were updating skills, pursuing professional certification, or attending for reasons not included in the survey instrument. Students, as a matter of perceived preparation for college, indicated that 61.6% earned A's and B's in high school, with 26.8% reporting C grades, and 11.6% recalling that they were in the "D grades or below" area or could not remember their respective high school GPA. In terms of degrees sought, students indicated that their educational goals included: Associate Degrees, 28.8%; Bachelor Degrees, 28.8%; Master Degrees, 26.8%; and PhD, EdD, JD, or MD, 15.6%. Related to high school GPA and degrees sought, students reported the following relationship between remedial courses taken and reported academic preparation for college: 48.7% completed a basic Math course; 39.3% indicated they needed a basic English course; 21.0% reported the necessity of a basic Reading course; and 42.7% noted that remedial or developmental courses were "not applicable." It is unknown in this study whether the 42.7% indicating "not applicable" did not require remedial coursework or did not report their participation. In terms of specific variables impacting student success, the amount of work and family requirements were included in the survey. Students responded that 53.0% worked full-time, 22.0% worked part-time, and 22.7% did not work while attending class. Family impact was noted as follows: 33.3% were married with children, 6.3% were married but did not currently have children, and 17.9% were single parents. Although the survey instrument did not categorize combinations of employment/marital status, students indicated that 75% of them worked either full-time or part-time while attending college; moreover, 57.5% of the students were involved in some type of family requirements, with or without children.
Stated differently, the data in this study indicated that 3 of 4 students worked while attending college, and that more than 1 in 2 students were involved in family situations. These data suggested to the community and technical college that the Strategic-Impact-Triad Model factors have a significant impact on student success, e.g., sufficient time to devote to academic preparation. Student demographic data suggested that family and work are important variables impacting student success, e.g., time with work and family is time away from academic preparation. Additionally, as noted in the reported demographic data regarding remedial courses taken, a lack of academic preparation impedes the success of students if institutional support practices are not regularly reviewed to understand student abilities and provide resources for student success. In this study, the demographic variables framed the perceptions that students 1) bring with them to the academic table, 2) use to form and practice work ethics, and 3) use to interpret institutional support structures within the college. A synopsis of the student demographic data is provided in Table 38, with a detailed view of the demographic data provided in Appendix N.

Table 38
Student Demographics

CCC (Gender M/F: 6/12; Total 18)
  Age: 19-24: 8; 25-34: 2; 35 or older: 8
  Educational Goals: AS: 6; BS: 5; MS: 7; PhD: 0; JD/MD: 0
  Employment: Full-Time: 6; Part-Time: 7; Do Not Work: 5
  Family: Married w/Children: 6; Married, no Children: 2; Single Parent: 1

CGTC (Gender M/F: 38/137; Total 175)
  Age: 19-24: 45; 25-34: 64; 35 or older: 66
  Educational Goals: AS: 72; BS: 40; MS: 35; PhD: 18; JD/MD: 10
  Employment: Full-Time: 92; Part-Time: 33; Do Not Work: 44
  Family: Married w/Children: 53; Married, no Children: 13; Single Parent: 40

FCCJ (Gender M/F: 11/2; Total 13)
  Age: 19-24: 0; 25-34: 6; 35 or older: 7
  Educational Goals: AS: 3; BS: 10; MS: 0; PhD: 0; JD/MD: 0
  Employment: Full-Time: 8; Part-Time: 3; Do Not Work: 2
  Family: Married w/Children: 7; Married, no Children: 0; Single Parent: 1

GSCC (Gender M/F: 1/5; Total 6)
  Age: 19-24: 1; 25-34: 2; 35 or older: 3
  Educational Goals: AS: 2; BS: 2; MS: 2; PhD: 0; JD/MD: 0
  Employment: Full-Time: 2; Part-Time: 1; Do Not Work: 0
  Family: Married w/Children: 2; Married, no Children: 1; Single Parent: 0

JDCC (Gender M/F: 0/4; Total 4)
  Age: 19-24: 1; 25-34: 0; 35 or older: 3
  Educational Goals: AS: 0; BS: 1; MS: 2; PhD: 1; JD/MD: 0
  Employment: Full-Time: 2; Part-Time: 1; Do Not Work: 1
  Family: Married w/Children: 1; Married, no Children: 0; Single Parent: 0

JSCC (Gender M/F: 49/131; Total 180)
  Age: 19-24: 69; 25-34: 64; 35 or older: 47
  Educational Goals: AS: 29; BS: 61; MS: 59; PhD: 19; JD/MD: 12
  Employment: Full-Time: 101; Part-Time: 38; Do Not Work: 36
  Family: Married w/Children: 58; Married, no Children: 8; Single Parent: 26

Total Respondents: 396 Students (N = 396)

Faculty Participants

Faculty demographic data requested in the survey characterized faculty participants in terms of gender, age, ethnicity, education, years of teaching experience, and types of teaching assignments, e.g., online versus in class. Moreover, faculty demographics contributed an important set of descriptive data to better understand who the faculty respondents were in the study. Data reported indicated faculty were highly experienced, with 68.4% having 5 years or more of teaching experience; moreover, 83.6% were older than 35 years of age. These variables indicated that the faculty respondents were experienced in the practices of their respective institutions, thereby providing a valuable set of perceptual data for use in assessing practices related to academic preparation, work ethics, and institutional support. As indicated in Table 39, the actual number of faculty respondents was 152 faculty members, or a return rate of approximately 5% (152 of 3,073 potential faculty respondents) for the faculty dataset.
A composite return rate for both students and faculty members was approximately 1%. Faculty data related to ethnicity included the following: White (non-Hispanic), 85.5%; African-American (non-Hispanic), 7.9%; Hispanic (Latino/Latina), 0.7%; Asian/Pacific Islander, 2.0%; American Indian/Alaskan Native, 0.7%; and Other, 3.3%. The survey instrument did not provide categorical options for the "other" category. Faculty member sample demographic data are presented in Table 39, with detailed demographic data for faculty members provided in Appendix O. Generally for faculty, there were 101 female faculty members (66.4%) compared to 51 male instructors (33.6%), a 2:1 relationship. Faculty age ranges were: 19-24, 2.0%; 25-34, 14.5%; and 35 or older, 83.6%, with 98.1% being 25 years of age or older. For faculty, educational degrees were reported as 12.5% Bachelors, 71.7% Masters, 14.5% Doctorate, and 1.3% holding either an MD or JD professional licensure. In terms of teaching experience and employment status, faculty responded that 28.3% had 5 or fewer years of teaching experience, while 69.1% had 6 or more years in the classroom. The make-up of the teaching status was: full-time, 63.2%, and part-time, 18.4%. Additionally, faculty noted that their assigned duties were distributed as: teaching technical courses only, 31.2%; general education (non-technical), 47.1%; teaching in-class and online courses, 38.4%; teaching in-class only, 50.0%; and teaching online courses only, 2.2%. Faculty demographic data, as in the case of student demographic data, provided valuable information which suggested that faculty, as do students, have perceptions which faculty members 1) bring with them to the academic table, 2) use to form and practice work ethics, and 3) use to interpret institutional support structures within the college. A synopsis of the faculty member demographic data is provided in Table 39, with a detailed view of the demographic data provided in Appendix O.

Table 39
Faculty Demographics

CCC (Gender M/F: 12/18; Total 30)
  Age: 19-24: 2; 25-34: 3; 35 or older: 25
  Degree Status: BS: 2; MS: 21; PhD: 6; JD/MD: 1
  Teaching Experience: 5 or less years: 10; 6 to 10 years: 6; More than 10 years: 13
  Teaching Status: Full-Time: 12; Part-Time: 11

CGTC (Gender M/F: 10/16; Total 26)
  Age: 19-24: 0; 25-34: 0; 35 or older: 26
  Degree Status: BS: 9; MS: 17; PhD: 0; JD/MD: 0
  Teaching Experience: 5 or less years: 8; 6 to 10 years: 4; More than 10 years: 14
  Teaching Status: Full-Time: 17; Part-Time: 5

FCCJ (Gender M/F: 0/1; Total 1)
  Age: 19-24: 0; 25-34: 0; 35 or older: 1
  Degree Status: BS: 0; MS: 1; PhD: 0; JD/MD: 0
  Teaching Experience: 5 or less years: 0; 6 to 10 years: 0; More than 10 years: 1
  Teaching Status: Full-Time: 1; Part-Time: 0

GSCC (Gender M/F: 9/15; Total 24)
  Age: 19-24: 0; 25-34: 3; 35 or older: 21
  Degree Status: BS: 3; MS: 17; PhD: 3; JD/MD: 1
  Teaching Experience: 5 or less years: 5; 6 to 10 years: 2; More than 10 years: 16
  Teaching Status: Full-Time: 21; Part-Time: 1

JDCC (Gender M/F: 2/6; Total 8)
  Age: 19-24: 0; 25-34: 1; 35 or older: 7
  Degree Status: BS: 0; MS: 5; PhD: 3; JD/MD: 0
  Teaching Experience: 5 or less years: 1; 6 to 10 years: 0; More than 10 years: 7
  Teaching Status: Full-Time: 6; Part-Time: 2

JSCC (Gender M/F: 18/45; Total 63)
  Age: 19-24: 1; 25-34: 15; 35 or older: 47
  Degree Status: BS: 5; MS: 49; PhD: 9; JD/MD: 0
  Teaching Experience: 5 or less years: 19; 6 to 10 years: 15; More than 10 years: 26
  Teaching Status: Full-Time: 38; Part-Time: 9

Total Respondents: 152 Faculty (N = 152)

Quantitative Analysis and Findings

This section of the study will present the quantitative findings in relationship to each research question. An ANOVA was used to assess the impact that academic preparation, work ethics, and institutional support, separately and collectively, had on student success as perceived by students and faculty.
For the analysis, faculty and students were the independent variables; the dependent variables were academic preparation, work ethics, and institutional support. The quantitative analysis and findings that follow will address 1) student and faculty perceptions, and 2) the research questions for this study.

Student and Faculty Perceptions

To establish a statistical benchmark that students and faculty possess variances in their respective perceptions within the student success domain (the totality of practices), a set of 15 self-reported categories or practices related to student success was included in the survey instrument. The 15 categories (practices) were: 1) attendance, 2) writing ability, 3) team player, 4) motivation to succeed in college, 5) oral presentations, 6) producing quality work, 7) computer skills, 8) success in high school, 9) respect for others, 10) enjoy learning new things, 11) reading ability, 12) time management, 13) math skills, 14) leadership, and 15) work ethic. These 15 categories of educational practice were framed within the Strategic-Impact-Triad Model factors of academic preparation, work ethics, and institutional support. To establish the perceptual benchmark scores, the following questions were presented to students and faculty, respectively: 1) "Compared to other Community College students at my college, I would rate myself in the following categories as:" and 2) "Based on your experience as an instructor, how would you rate the general performance of your students in the categories below?" The categories (categories and practices are hereafter interchangeable) identified in the questions to faculty and students are the 15 items previously noted. These 15 items were scored using a 3-point Likert scale of: 1) below average, 2) average, and 3) above average. Descriptively, students rated themselves in the "average-to-above-average" range in all 15 categories, based on the highest percentage response for each category. For example, students rated themselves in attendance as "above average" at 62.9% of respondents, work ethic as "above average" at 68.0%, and leadership as "average" at 51.7% of respondents (see Table 40). No student category had its greatest percentage in the "below average" scale. The variance in student self-rated perceptions of student success (practices) compared to faculty ratings of students was consistent with previous research (Lindholm, Szelenyi, Hurtado & Korn, 2005; Wyatt, Saunders & Selmer, 2005). Faculty scores suggested that perceptions of student abilities, e.g., practices to be successful in college, were rated in the "below-average-to-average" range in each category. For example, faculty rated students in attendance as "average" at 67.4%, work ethic as "average" at 57.2%, and leadership as "average" at 64.5% of respondents (see Table 40). Statistically, faculty reported the following percentage scores for each rating across all 15 categories: 1) 25.2% "below average"; 2) 63.4% "average"; and 3) 11.4% "above average". For faculty, 88.6% rated students in the 15 categories, or areas of practice, as "below-average or average". Only 11.4% of faculty perceived students as "above average" in the practices impacting student success. Students reported the following percentage scores for each rating across all 15 categories: 1) 5.2% "below average"; 2) 43.2% "average"; and 3) 51.5% "above average". For students, 94.7% rated themselves in the 15 categories or practices as "average-to-above-average".
Only 5.2% of student ratings across the categories were "below average". The 15 items were compared in SPSS using students/faculty as the independent variables and the student ability item mean scores as the dependent variable. Cronbach's alpha coefficient of reliability and internal consistency was calculated at .911 (strong inter-item correlation), N = 548, with 3.5% of responses excluded (96.5% valid cases). The faculty mean (M = 1.86, N = 138, SD = .363) was significantly different (0.60 on a 3-point scale) from the student mean (M = 2.46, N = 391, SD = .302), validating the group mean directional variance and magnitude expected in this study. To assess the relationship between students and faculty (independent variable) and the self-reported-practices scores (dependent variable) in order to establish the perceptual benchmark, an analysis of variance (ANOVA) was used with α = .05. The results of the ANOVA indicated statistical significance, F (1,527) = 360.692, p < .001, η² = .406, observed power = 1.0. The significance suggested that students and faculty do not perceive student success abilities (practices) similarly. The eta squared effect size of 0.406 was large (Cohen, 1988; Field, 2005; Pallant, 2007), suggesting that the difference between student and faculty perceptions accounted for approximately 41% of the total variance in the student ability (practices) scores. The observed power statistic of 1.0 suggested confidence in rejecting the null hypothesis that there was no real difference between the groups. Additionally, Levene's Test of Equality of Error Variances indicated that the assumption of equality of variances across the population groups represented by the reported sample scores was not violated, F (1,527) = 3.429, p > .05; therefore, Welch's and Brown-Forsythe's robust tests of equality of means were not included in this section of the study, as these robust tests are "preferable when the assumption of the homogeneity of variance is violated" (Pallant, 2007, p. 246). Table 40 provides the quantitative findings of the self-reported-practices scores by faculty and students. As noted in the columns for student and faculty means, students consistently reported higher scores than faculty, with mean differences ranging from 0.31 to 0.89. Moreover, each item score assessed suggested statistical significance at p < .001.
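For readers less familiar with the internal-consistency check reported above, Cronbach's alpha for a respondent-by-item score matrix can be sketched as follows. The ratings shown are hypothetical 3-point responses on a handful of items; the study itself computed alpha in SPSS on the full 15-item set.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical 3-point ratings from six respondents on four items
    ratings = [[3, 2, 3, 3],
               [2, 2, 2, 3],
               [3, 3, 3, 3],
               [1, 2, 2, 2],
               [2, 2, 3, 2],
               [3, 3, 3, 3]]
    print(round(cronbach_alpha(ratings), 3))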
Table 40
Strategic-Impact-Triad (SIT) Perceptions: Students Compared to Faculty

Item | Students: Below Avg / Avg / Above Avg | Faculty: Below Avg / Avg / Above Avg | Stu M | Fac M | Mean Diff.
Attendance | 1.8% (7) / 35.3% (138) / 62.9% (246) | 13.0% (18) / 67.4% (93) / 19.6% (27) | 2.61 | 2.07 | 0.55
Writing ability | 1.8% (7) / 52.9% (207) / 45.3% (177) | 36.2% (50) / 60.9% (84) / 2.9% (4) | 2.43 | 1.67 | 0.77
Team player | 2.3% (9) / 45.0% (176) / 52.7% (206) | 15.9% (22) / 72.5% (100) / 11.6% (16) | 2.50 | 1.96 | 0.55
Motivation to succeed in college | 1.0% (4) / 28.6% (112) / 70.3% (275) | 18.1% (25) / 62.3% (86) / 19.6% (27) | 2.69 | 2.01 | 0.68
Oral presentations | 13.3% (52) / 62.7% (245) / 24.0% (94) | 26.8% (37) / 66.7% (92) / 6.5% (9) | 2.11 | 1.80 | 0.31
Producing quality work | 0.8% (3) / 43.7% (171) / 55.5% (217) | 25.4% (35) / 70.3% (97) / 4.3% (6) | 2.55 | 1.79 | 0.76
Computer skills | 5.1% (20) / 46.5% (182) / 48.3% (189) | 21.0% (29) / 60.9% (84) / 18.1% (25) | 2.43 | 1.97 | 0.46
Success in high school | 10.7% (42) / 50.1% (196) / 39.1% (153) | 17.4% (24) / 73.9% (102) / 8.7% (12) | 2.28 | 1.91 | 0.37
Respect for others | 0.8% (3) / 21.5% (84) / 77.7% (304) | 13.0% (18) / 63.0% (87) / 23.9% (33) | 2.77 | 2.11 | 0.66
Enjoy learning new things | 0.5% (2) / 26.3% (103) / 73.1% (286) | 15.2% (21) / 64.5% (89) / 20.3% (28) | 2.73 | 2.05 | 0.68
Reading ability | 2.3% (9) / 40.4% (158) / 57.3% (224) | 28.3% (39) / 63.8% (88) / 8.0% (11) | 2.55 | 1.80 | 0.75
Time management | 12.0% (47) / 54.0% (211) / 34.0% (133) | 44.9% (62) / 47.8% (66) / 7.2% (10) | 2.22 | 1.62 | 0.60
Math skills | 18.4% (72) / 59.3% (232) / 22.3% (87) | 40.6% (56) / 55.8% (77) / 3.6% (5) | 2.04 | 1.63 | 0.41
Leadership | 5.9% (23) / 51.7% (202) / 42.5% (166) | 29.0% (40) / 64.5% (89) / 6.5% (9) | 2.37 | 1.78 | 0.59
Work ethic | 1.5% (6) / 30.4% (119) / 68.0% (266) | 32.6% (45) / 57.2% (79) / 10.1% (14) | 2.66 | 1.78 | 0.89
N = 548

Research Questions

Data analysis and findings for each research question are presented in this section. Analysis of variance (ANOVA) was used to assess the reported student and faculty scores. The ANOVAs in this study were conducted with regard to the standard assumptions of ANOVA: 1) normal population distributions, 2) variances that are fairly similar, 3) independent observations, and 4) dependent variables measured on an interval scale (Field, 2005).

Research Question 1

Research question 1 assessed the interaction between faculty and students by posing the following question: What is the relationship between faculty and students' perceptions in assessing the impact that academic preparation has on the success of the college student? To address this question, the domain of academic preparation (the first factor of influence on student success) was studied in detail. Data calculated for research question 1 returned a Cronbach's alpha internal consistency coefficient of .837 (suggesting a strong inter-item positive score correlation), with N = 548 and 5.5% of the responses excluded (94.5% valid cases). Statistically, the mean difference between students (M = 3.434, N = 382, SD = .440) and faculty (M = 3.393, N = 136, SD = .426) was relatively small (0.041 on a 4-point scale) for this domain of interest, academic preparation. The mean difference between students and faculty also indicated that the direction of the difference, although slight, suggested that students were more likely to agree that the academic preparation practices were a positive determinant of community college student success.
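The one-way ANOVA and eta-squared computations reported for each research question follow the same general form. As a minimal sketch under stated assumptions, the fragment below uses hypothetical domain scores for two groups and SciPy rather than SPSS; it is illustrative only and does not reproduce the study's data or output.

    import numpy as np
    from scipy import stats

    # Hypothetical domain mean scores for two groups (not study data)
    students = np.array([3.6, 3.2, 3.8, 3.4, 3.5, 3.1, 3.7])
    faculty  = np.array([3.3, 3.0, 3.4, 3.2, 3.5])

    f_stat, p_value = stats.f_oneway(students, faculty)

    # Eta squared = SS_between / SS_total for a one-way design
    all_scores = np.concatenate([students, faculty])
    grand_mean = all_scores.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2
                     for g in (students, faculty))
    ss_total = ((all_scores - grand_mean) ** 2).sum()
    eta_squared = ss_between / ss_total

    print(f"F = {f_stat:.3f}, p = {p_value:.3f}, eta^2 = {eta_squared:.3f}")

Eta squared is reported alongside each F statistic in this chapter because it expresses the proportion of total variance in the dependent-variable scores attributable to group membership.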
To assess student and faculty reported scores as a method to evaluate the main effect that academic preparation practices had on the success of community college students, an ANOVA with α = .05 was used. The results of the analysis of variance were not statistically significant, F (1,516) = .904, p = .342, η² = .002, observed power = .158. The findings suggested that these groups did not report a significant difference in their perceptions of the effect that academic preparation practices had on community college student success. The eta squared effect size of 0.002 quantified the difference between students and faculty as a "negligible effect" in the total variance of the academic preparation scores. The magnitude of the effect size, therefore, is small and indicated that this finding was not substantive. The findings, including the observed power, suggested that the null hypothesis should be retained, with respect to potential Type I and Type II errors. Moreover, Levene's Test of Equality of Error Variances indicated that the assumption of equality of variances across the population samples represented by the reported scores was not violated, F (1,516) = .187, p = .665. Upon review of the demographic data for this study, it was noted that the gender ratios for students and faculty, respectively, were approximately 3:1 (female-to-male) and 2:1 (female-to-male). The mean difference between male students and faculty (M = 3.316, N = 148, SD = .538) and female students and faculty (M = 3.466, N = 370, SD = .381) was minimal (0.15). The mean difference indicated that female respondents were more likely than male respondents to positively perceive the academic preparation practices impacting community college student success. Regarding this gender variance between respondents, an ANOVA with α = .05 was calculated for these sample groups. The analysis of variance, in terms of gender, indicated that the difference in perceptions was statistically significant, F (1,516) = 12.662, p < .001, η² = .024, observed power = .944. Although the observed power indicated a 94% chance of detecting an effect in the sample scores between the independent and dependent variables (Field, 2005), the eta squared statistic suggested that this between-group effect was small. Only 2.4% of the variance in academic preparation scores is predictable by the strength of the relationship of perceptions between male faculty and students and female faculty and students (Pallant, 2007). Levene's Test of Equality of Error Variances indicated that the assumption of equality of variances across the gender population samples had been violated, F (1,516) = 14.227, p < .001 (the variances of the groups are significantly different). Due to the violation of Levene's equality of variance, Welch's and Brown-Forsythe's robust tests of equality of means were reviewed and indicated statistical significance, F (1, 208.659) = 9.508, p = .002, suggesting a rejection of the null hypothesis in terms of gender variance. Welch's and Brown-Forsythe's statistics are useful for analysis when there is an unequal N in the groups being compared (Field, 2005; Pallant, 2007). The statistical results for research question 1 suggested that students and faculty perceived the associated practices of academic preparation impacting community college students similarly, indicating that student and faculty perceptions "are in agreement" regarding the academic preparation practices impacting student success in the community college. Gender analysis suggested perceptual differences in the domain of academic preparation practices impacting community college student success.
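Where Levene's test indicated unequal variances, the robust Welch and Brown-Forsythe procedures were consulted, as described above. A minimal sketch of that decision logic is shown below with hypothetical scores; because only two groups are compared here, the Welch ANOVA is carried out via Welch's t-test (equal_var=False), which is equivalent in the two-group case. This is an illustration only, not the study's SPSS procedure.

    import numpy as np
    from scipy import stats

    # Hypothetical academic-preparation scores by gender (not study data)
    male   = np.array([3.1, 3.4, 2.9, 3.6, 3.2, 3.0])
    female = np.array([3.5, 3.6, 3.4, 3.7, 3.5, 3.3, 3.6])

    # Levene's test of equality of error variances
    w_stat, p_levene = stats.levene(male, female)

    if p_levene < .05:
        # Homogeneity violated: fall back to the robust (Welch) comparison
        t_stat, p_robust = stats.ttest_ind(male, female, equal_var=False)
        print(f"Levene violated (p = {p_levene:.3f}); Welch t = {t_stat:.3f}, p = {p_robust:.3f}")
    else:
        f_stat, p_anova = stats.f_oneway(male, female)
        print(f"Levene satisfied (p = {p_levene:.3f}); F = {f_stat:.3f}, p = {p_anova:.3f}")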
Research Question 2

Research question 2 assessed the interaction between faculty and students by posing the following question: What is the relationship between faculty and students' perceptions in assessing the impact that work ethics has on the success of the college student? To address this question, the domain of work ethics (the second factor of influence on student success) was studied in detail. Data reported for research question 2 returned a Cronbach's alpha internal consistency coefficient of .851 (indicating a strong inter-item positive score correlation), with N = 531 and 6.8% of the responses excluded (93.2% valid cases). Statistically, the mean difference between students (M = 3.318, N = 367, SD = .481) and faculty (M = 3.207, N = 135, SD = .417) for this domain of interest, work ethics, was 0.111. The mean statistics indicated that students were more likely than faculty to report scores promoting the value of work ethics practices on the success of the community college student. To assess student and faculty reported scores as a method to evaluate the impact (main effect) that the work ethics practices had on the success of community college students, an ANOVA with α = .05 was used. The results of the analysis of variance suggested statistical significance, F (1,500) = 5.628, p = .018, η² = .011, observed power = .658. The statistical relationship between faculty and students suggested that these groups were more likely to have contrasting perceptions about the methods and practices of work ethics impacting community or technical college student achievement. Eta squared quantified a weak effect (1%) in the total variance of the dependent variable (work ethics) scores as impacted by the perceptual relationship between the independent variables (student/faculty). Additionally, Levene's Test of Equality of Error Variances indicated that the assumption of equality of variances across the population samples, in the domain of work ethics perceptions, was not violated, F (1,500) = 3.389, p = .066. Welch's and Brown-Forsythe's robust tests of equality of means were also significant, F (1, 273.530) = 6.424, p = .012. As previously noted, these statistics are useful when the groups being compared have unequal Ns (Field, 2005; Pallant, 2007). As with academic preparation, the domain of work ethics was evaluated using gender as the independent variable (due to the large difference in the number of male and female respondents). Using an ANOVA, gender perceptions of work ethics indicated a statistically significant difference in the practices impacting student success, F (1,500) = 14.570, p < .001, η² = .011, observed power = .658. Eta squared quantified a weak effect (1%) in the total variance of the dependent variable (work ethics) scores as impacted by the perceptual relationship between the independent variables specific to gender, e.g., male and female respondents. Levene's statistic was violated, F (1,500) = 7.994, p = .005, while Welch's and Brown-Forsythe's robust tests of equality of means were significant, F (1, 210.371) = 11.669, p = .001. The statistical results for research question 2 suggested that students and faculty perceived the associated practices of work ethics impacting community college students differently, indicating that student and faculty perceptions "are not in agreement" regarding the work ethics practices impacting student success in the community college.
Gender analysis also suggested perceptual differences in the domain of work ethics practices impacting community college student success.

Research Question 3

Research question 3 assessed the relationship between faculty and students by posing the following question: What is the relationship between faculty and students' perceptions in assessing the impact that institutional support has on the success of the college student? To address this question, the domain of institutional support (the third factor of influence on student success) was studied in detail. Data reported for research question 3 returned a Cronbach's alpha reliability coefficient of .897, with N = 531 and 7.3% of the responses excluded (92.7% valid cases). Statistically, the mean difference between students (M = 3.318, N = 366, SD = .533) and faculty (M = 3.116, N = 133, SD = .512) was notable for the sample sizes returned (0.202). As with academic preparation and work ethics, the mean difference indicated that students tended to agree more strongly than faculty that the institutional support practices impacting community college student success were important. To assess student and faculty reported scores as a method to evaluate the impact that institutional support practices had on the success of community college students, an ANOVA with α = .05 was used. The results of the analysis of variance suggested statistical significance, F (1,497) = 14.237, p < .001, η² = .028, observed power = .965. The relationship between faculty and students suggested that these groups view the methods and practices to achieve institutional support for community or technical college students differently. Eta squared quantified a weak effect (3%) in the total variance of the dependent variable (institutional support) scores as impacted by the perceptual relationship between the independent variables (student/faculty). Additionally, Levene's Test of Equality of Error Variances indicated that the assumption of equality of variances across the population groups, in the domain of institutional support perceptions, represented by the reported sample was not violated, F (1,497) = .208, p = .649. As with academic preparation and work ethics, the domain of institutional support was evaluated using gender as the independent variable (due to the large difference in the number of male and female respondents). Using an ANOVA, gender perceptions of institutional support indicated a statistically significant difference in the practices impacting student success, F (1,497) = 17.158, p < .001, η² = .033, observed power = .985. Eta squared quantified a weak effect (3%) in the total variance of the dependent variable (institutional support) scores as impacted by the perceptual relationship between the independent variables specific to gender, e.g., male and female respondents. Levene's statistic was violated, F (1,497) = 12.914, p < .001, while Welch's and Brown-Forsythe's robust tests of equality of means were significant, F (1, 199.852) = 12.999, p < .001. As previously noted, these statistics are useful when the groups being compared have unequal Ns (Field, 2005; Pallant, 2007). The statistical results for research question 3 suggested that students and faculty perceived the associated practices of institutional support impacting community college students differently, indicating that student and faculty perceptions "are not in agreement" regarding the institutional support practices impacting student success in the community college.
Gender analysis also suggested perceptual differences in the domain of institutional support practices impacting community college student success.

Research Question 4

Research question 4 consists of two parts: Part One is a theoretical discussion of the interdependent relationship of the SIT Model factors of academic preparation, work ethics, and institutional support, and is a prerequisite discussion for the findings presented in Part Two. Part Two includes descriptive and statistical findings from the data analysis specific to research question 4.

Strategic-Impact-Triad (SIT) Model Coefficient Equation

Research question 4 assessed the relationship between faculty and students by posing the following question: What is the relationship between faculty and students' perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support? Research question 4 pertains specifically to the structured logic that each SIT Model factor is interdependent with every other factor. Interdependency of the three factors is similar to a coefficient equation for student success. Student success has many definitions (National Postsecondary Education Cooperative [NPEC], 2006); nevertheless, whatever definition is applied to student success in the context of the student success domain, the factors or domains of the Strategic-Impact-Triad Model are highly dependent upon one another as co-requisites for students to succeed in the community college. For this study, the coefficient equation is directly correlated to the SIT Model factors. An example of a coefficient equation for student success is illustrated in Figure 13. As demonstrated in the figure, each term in the equation takes a coefficient between 0.0 and 1.0, so the composite score ranges from 0.0 (minimum success) to 3.0 (maximum success). Minimum success is defined to be a student who 1) graduates with minimal academic skills and little desire to become a life-long learner, 2) believes and actively demonstrates that work ethics are for others to practice, and 3) would have transferred many times if the opportunities had presented themselves. For this study, perceptions of the factors (domains) of academic preparation (AP), work ethics (WE), and institutional support (IS) were measured using scaled responses. However, to promote student success in the community or technical college, it is imperative that students and faculty understand that collectively, AP + WE + IS are necessary for students to achieve to their highest potential. The better a student is academically prepared while in college, "and" the more positive the work ethic, "and" the more significant the institutional support, the greater the success of the community college student. Within each term of the SIT Model Coefficient Equation is a numerical coefficient which indicates the level of success within that factor (term). For example, in Figure 13, if xAP were to contain a score of .7AP, this could equate to 70% on a 100-point scale; the .7 coefficient would suggest that the student's academic preparation was at 70%. To continue, if a student were to demonstrate a work ethic that yielded a result of .3, then the work ethic would only be 30% of some normalized value system. And, lastly, if institutional support was perceived by this student to be "unacceptable," it might be rated at a .2 coefficient, or given a 20% effectiveness score.
Therefore, the SIT Model Coefficient Equation would be: .7AP + .3WE + .2IS = 1.2 (level of student success). The goal for the SIT Model Coefficient Equation is to obtain a score as near to 3.0 as possible. The closer to 3.0, the more likely the student will have achieved maximum success, provided the SIT Model factors are undergirded with effective educational practices to promote the achievement of the 3.0 score, or student success. If community college institutional practices were guided by policies, attitudes, and support structures to enable as many students as possible to achieve 70% in each component of the SIT Model Coefficient Equation, student success could be improved significantly. Students who attend community colleges that practice methods of achievement at the .7 level per term (SIT domains) in the equation would yield student success scores of 2.1, making 2.1 the goal for policies and practices. The following example is shown as a method to explain the SIT Model Coefficient Equation; in the next section, the findings for research question 4 will be discussed and demonstrated using the SIT Model Coefficient Equation. Sample: John Doe: .8AP + .4WE + .9IS = 2.1. Mr. Doe has reached 2.1, the minimum aggregate objective for student success within the SIT Model; however, the goal is also to reach .7 within each domain of influence on student success. The community college could therefore improve practices to correct the work ethics deficiency, as well as establish a culture of evidence for transferability to other community colleges (see Figure 13).

xAP + yWE + zIS = a (Level of Student Success)

Stated Minimal Objective:  .7AP + .7WE + .7IS = 2.1  (student achieves 70% in each student success domain)
Maximum Success:           1.0AP + 1.0WE + 1.0IS = 3.0  (scale score 3.0)
(unlimited variances in scores)
Scale Midpoint:            .5AP + .5WE + .5IS = 1.5  (scale score 1.5)
(unlimited variances in scores)
No Success:                0AP + 0WE + 0IS = 0.0  (scale score 0)

Figure 13. Strategic-Impact-Triad (SIT) Model Coefficient Equation. This model covers a range of 0.0 to 3.0, with 3.0 being the highest achievable student success goal to be promoted by the community college. The coefficient scores of .5 per domain are illustrated to show the midpoint of the scale for student success; even at .5 per domain, the student falls short of the goal of .7 per domain and, therefore, is more likely to be unsuccessful in the community college. Coefficient x is related to practices of academic preparation for student success; y, practices of work ethics; z, practices of institutional support. Coefficient a represents the numerical score for the interrelationship of the SIT Model factors applied to maximize student success in the community college.
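Because the SIT Model Coefficient Equation is a simple additive composite, the worked examples above can be restated in a few lines of code. The sketch below is illustrative only; the function name and the way coefficients might be derived from scaled survey responses are assumptions for demonstration, not part of the study's instrumentation.

    def sit_score(ap, we, inst):
        """Composite SIT Model coefficient score on a 0.0-3.0 scale.
        Each argument is a 0.0-1.0 coefficient for academic preparation,
        work ethics, and institutional support, respectively."""
        return ap + we + inst

    # Worked example from the text: John Doe
    print(sit_score(0.8, 0.4, 0.9))   # 2.1 -> meets the 2.1 aggregate objective,
                                      # but the .4 work ethics term is below the .7 target
    # Stated minimal objective: .7 in each domain
    print(sit_score(0.7, 0.7, 0.7))   # 2.1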
Research Question 4 Findings

This section of the study consisted of two areas of findings. The first finding is related to a descriptive analysis of how students and faculty perceived the necessity of the SIT Model domains for community college student success. The second finding is specific to statistical data from analyzing the SIT Model domains of academic preparation, work ethics, and institutional support in SPSS as interdependent or codependent variables in the success of community college students. The questions posed to students and faculty in descriptively addressing research question 4 were designed to determine whether their respective perceptions regarding the domains of the SIT Model were 1) required for student success, or 2) not required for student success. The researcher desired to investigate the relationship between respondents in terms of understanding the intrinsic and actual value assigned to the SIT domains related to student success, e.g., the perceived content validity of the interdependency or codependency of academic preparation, work ethics, and institutional support. The method used to acquire descriptive data for research question 4 is indicated in Figure 14. Figure 14 provides the scale (1, required; 2, not required) by which students and faculty reported their "required-or-not-required" responses, indicating the perceived co-dependency of the SIT Model factors as impacting community college student success.

How do you respond to the following statements?
                             Required for Student Success    Not Required for Success
Academic Preparation is:                [ ]                            [ ]
Work Ethics are:                        [ ]                            [ ]
Institutional Support is:               [ ]                            [ ]

Figure 14. Survey Questions to Descriptively Assess Research Question 4.

The reported scores and frequencies ( f ) of the responses for research question 4 are provided in Table 41. The findings in Table 41 suggested that faculty and students highly agreed that the three SIT Model factors were required for student success. Faculty and student respondents reported the following: 1) total respondents agreeing that academic preparation was required for student success, 98.8%; 2) total respondents agreeing that work ethics was required for student success, 96.4%; and 3) total respondents agreeing that institutional support was required for student success, 93.7%. Using these findings from the total respondents, the SIT Model Coefficient Equation, and interpolation, the coefficient scores of the equation would yield: .988AP + .964WE + .937IS = 2.889 (level of student success). In terms of within-group differences, the following findings were reported: students agreed that academic preparation, work ethics, and institutional support were required at 98.4%, 96.2%, and 94.0%, respectively. Comparatively, faculty agreed that academic preparation, work ethics, and institutional support were required at 100%, 96.9%, and 93.0%, respectively. The variance between the group-level SIT Model Coefficient Equations is computed as the difference between the faculty student success coefficient of 2.899 and the student coefficient of 2.886, or approximately 0.013. The descriptive variance between groups for research question 4 suggested that faculty and students strongly agreed that these factors are required if students are to succeed in the community college, at a standard not yet practiced by the community college as an organizational unit within the United States educational system. As a result of the finding that faculty and students indicated that academic preparation, work ethics, and institutional support were required for community college student success, a co-related finding was identified.
The co-related descriptive finding for research question 4 is the relationship between the 15 self-reported practices (see Table 40) and the descriptive results of the factors required for student success (see Table 41). This relationship indicated a dependent, but dichotomous, construct. The dependency and dichotomy in this finding are that these groups strongly agreed that academic preparation, work ethics, and institutional support were required for community college student success (the dependency) at the same time that these groups indicated a statistical difference in student abilities or practices (the dichotomy). This co-related finding is based on reported scores from the groups in the study and is significant for community colleges: it identifies perceptions reported by these groups that should be investigated to their full conclusion. The finding related to the agreement over the required need for the SIT Model factors impacting student success is a powerful "mandate" for change, just as the significant difference in the reported perceptions of student abilities/practices is a "mandate" for change, all for the improvement in the success of students. To reiterate, student success in the context of this study means maximizing student success for life-long learning. This finding is suggestive of promoting policies and practices to improve student achievement.

Table 41
Faculty and Student Strategic-Impact-Triad Factor Domain Assessments

                      AP ( f ) / AP %    WE ( f ) / WE %    IS ( f ) / IS %    Range or %
Faculty
  Required ( f )      129 / 100%         125 / 96.9%        120 / 93.0%        0-129 (Range)
  Not Required ( f )  0 / 0%             4 / 3.1%           9 / 7.0%           0-129 (Range)
  Sub-Total           129 / 100%         129 / 100%         129 / 100%         129 / 100%
  No Response         23 / 15.1%         23 / 15.1%         23 / 15.1%         23 / 15.1%
  Respondents         129 / 84.8%        129 / 84.8%        129 / 84.8%        129 / 84.8%
  Total (N = 152)     152 / 100%         152 / 100%         152 / 100%         152 / 100%
Student
  Required ( f )      360 / 98.4%        352 / 96.2%        344 / 94.0%        0-366 (Range)
  Not Required ( f )  6 / 1.6%           14 / 3.8%          22 / 6.0%          0-366 (Range)
  Sub-Total           366 / 100%         366 / 100%         366 / 100%         366 / 100%
  No Response         30 / 8.2%          30 / 8.2%          30 / 8.2%          30 / 8.2%
  Respondents         366 / 92.4%        366 / 92.4%        366 / 92.4%        366 / 92.4%
  Total (N = 396)     396 / 100%         396 / 100%         396 / 100%         396 / 100%
Notes: 1. Students, N = 396; Faculty, N = 152. 2. AP = Academic Preparation; WE = Work Ethics; IS = Institutional Support. 3. Percentage totals are rounded to obtain 100%.

In addition to the descriptive findings discussed, all thirty-six item scores (12 from each domain) were coded and analyzed; Cronbach's coefficient of reliability and internal consistency was .937. Faculty and students were the independent variables, and the mean scores of the 36 items were the dependent variable. Using an ANOVA with α = .05, the analysis indicated statistical significance, F (1,499) = 8.181, p = .004, η² = .016, observed power = .815. Eta squared quantified a weak effect (2%) in the total variance of the dependent variable (composite item) scores as impacted by the perceptual relationship between the independent variables (students/faculty). Levene's test of homogeneity of variance was not violated, F (1,499) = .483, p = .488. Welch's and Brown-Forsythe's robust tests of equality of means were also significant, F (1, 259.816) = 8.887, p = .003.
Faculty mean scores (M = 3.236, N = 135, SD = .391) were lower than student mean scores (M = 3.357, N = 366, SD = .428), suggesting that across all item scores students tended to agree more strongly than faculty that the practices within the three domains of the SIT Model were important. To summarize the statistical and descriptive findings, the scores evaluated indicated that students and faculty perceived the impact of academic preparation practices on student success similarly (research question 1); for the practices of work ethics and institutional support as impacting student success, faculty and students indicated that there was a significant perceptual difference (research questions 2 and 3). For research question 4, faculty and student scores indicated that there was a statistically significant difference in student abilities or practices, which were essentially practices across all three domains within the SIT Model. Descriptive analysis suggested that students and faculty strongly agreed that the domains of academic preparation, work ethics, and institutional support were critical factors necessary for community college students to be successful.

Qualitative Analysis and Findings

Qualitative data were obtained from four open-ended questions in this study to support the quantitative findings. The open-ended questions related to community college practices for student success were:

1. What should community colleges do to support students who are academically unprepared?
2. How can community colleges help students acquire and practice good work ethics?
3. What can a community college do to improve its institutional support to help students succeed in college from enrollment to graduation?
4. What institutional practices (actions by members of the college) have you observed that helps or harms the success of a student?

Analysis of the qualitative data identified several themes associated with practices impacting the success of community college students. To extract themes in the open-ended responses from respondents for each SIT Model domain, keyword searches were conducted, resulting in the following pertinent qualitative findings impacting student success between and within the SIT Model domains of practice.

Academic Preparation Themes

1. Maximize Tutoring Services (269 references);
2. Involved and Caring Faculty to Help Students Succeed (183 references);
3. Promote the Purpose and Offerings in Remedial Courses (175 references);
4. Effective Academic Advising (72 references).

The following comments by a student and a faculty member support the findings of the practices to improve student success within the domain of academic preparation [SR = Student Respondent; FR = Faculty Respondent]. Responses are for the question: What should community colleges do to support students who are academically unprepared?

Pay a little more attention to students. Don't just "tell" them there is help, show them. I am 40 years old almost and I feel like I am in a fight all by myself. Sure, they "tell" me there is help, but they give off the attitude of "I'm not going to be the one to help though". NOBODY has bothered to ask me how I am doing with the exception of 1 instructor. I feel that advisors need to get involved more with students and get out from behind their desk to make the effort to let the students know a little about them. I am almost through my 1st semester and just found out I had an advisor. I sometimes get tired of feeling like a number to the system.
You hear a lot of things like "Do you know how many students there are here? We can't single out each individual!" Yes you can! That is why I give you my money!!! Sorry, getting off my soap box now. [SR250] 1. Be honest with the student! 2. Enforce attendance, tardiness, and class requirements. 3. Put the student in one or more developmental classes. 4. Motivate students to succeed by being friendly and professional. 5. Return tests and assignments THE NEXT CLASS DAY! 6. Require work in Math and English Labs. 7. Offer encouragement one-on-one and be positive in dealing with students. 8. Be patient!!! 9. Require extensive work--not "Mickey Mouse" level work--and then offer help. 10. Be accessible to students. [FR71] 268 Work Ethics Themes 1. The Necessity of Work Ethics (245 references); 2. Practice Work Ethics Daily (160 references); 3. Group Work Accountability (94 references); 4. Involvement of Community and Business Leaders (71 references); 5. Feedback on Personal Work Ethics (45 references); 6. Instill Work Ethics via Workshops (22 references). The following comments by a student and faculty member support the findings of the practices to improve student success within the domain of work ethics [SR ? Student Respondent; FR ? Faculty Respondent]. Responses are for the question: How can community colleges help students acquire and practice good work ethics? Students can acquire good ethics hopefully from their home environment but also in the classroom. If the teacher reaches out in excellence the students will respond. [SR120] 1. Be an example first; 2. Include discussions of good work ethics in appropriate courses. 3. Personal instructor-student counseling. 4. By having local leaders and employers address the student body regarding what is expected of graduates who will be hired. 5. Hold students accountable for their actions; i.e., enforce attendance policies, academic standards, dress codes, etc. [FR54] Institutional Support Themes 1. College Wide Dynamic Support Services (321 references); 2. Effective Advising at all Levels of the Institution (131 references); 3. Easy and Open Access to Administrators (56 references); 269 4. Innovative Institutional Practices to Support Student Success (52 references); 5. Effective Online Resources (48 references); 6. Consistent Communications from the Institution (36 references). The following comments by a student and faculty member support the findings of the practices to improve student success within the domain of institutional support [SR ? Student Respondent; FR ? Faculty Respondent]. Responses are for the question: What can a community college do to improve its institutional support to help students succeed in college from enrollment to graduation? I wish some staff would not make things so difficult, such as having students go back and forth between offices. Why can?t the offices not gather forms and send them collectively to the next station instead of having students carry them from place to place? Of course this is not always the case, but it seems office workers could be a little more sensitive to the amount of footwork the student has to do to accommodate paper trails. Also, it would be helpful if the textbooks were listed on-line. I understand that the college bookstore needs to make their profit, but many students live far from campus and would like the opportunity to purchase books from online bookstores, or maybe even online with the college bookstore. 
That being said, my college does an overall great job trying to meet student needs. [SR28]

Create a dynamic support system to deal with the pressures and problems that college students are faced with, from a young college freshman who is homesick to an older student who is struggling financially to both attend classes and provide for a family. [FR4]

Institutional Practices Themes

1. Variations of Providing Specific Help to Students (328 references);
2. Classroom Practices, Good and Bad (297 references);
3. Specific Actions to Support Student Success (64 references);
4. Conduct, Ethics, and Professionalism (47 references);
5. Deliberate Negative Attitudes for Students (20 references);
6. Email Responses with Excessive Delays (13 references).

The following comments by a student and a faculty member support the findings on practices to improve student success within the domain of institutional practices [SR – Student Respondent; FR – Faculty Respondent]. Responses are for the question: What institutional practices (actions by members of the college) have you observed that help or harm the success of a student?

Since I am an older and returning student, I have experienced a lot of helpful practices at this institution. I think when the staff sees the same student over and over again and seem to believe that the student is there for the long haul, then the practices of honesty, concern, pride, collaboration, constructive criticism... are to me what has helped me to be a better student. [FR28]

Really caring about the whole student and not just the tasks to be completed is an excellent institutional practice. [FR2]

Chapter Summary

This chapter presented the survey results in two forms: quantitative and qualitative. The quantitative analysis suggested that students and faculty view factors impacting student success in the community college in a variety of ways: 1) self-reported abilities were significantly different, in that faculty perceived community college students as less than prepared to perform above average in college-level work; 2) students and faculty reported that they perceived the practices evaluated within the academic preparation domain similarly, while for the domains of work ethics and institutional support the data indicated that students and faculty differed in their perceptions of the practices impacting student success; and 3) students and faculty strongly agreed that for a student to be successful in the community college, the student must possess functional elements of academic preparation, work ethics, and institutional support. Additionally, the qualitative data indicated that students and faculty differ in how student success is perceived across each SIT Model factor. Themes were extracted from the open-ended responses of all respondents, and these themes supported the quantitative findings reported in this chapter. Chapter V will provide a summary of the study as well as present conclusions, recommendations, and implications. Quantitative and qualitative data will be used to support the topics discussed in Chapter V.

CHAPTER V

SUMMARY, CONCLUSIONS, RECOMMENDATIONS, AND IMPLICATIONS

"There is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things." --- Niccolo Machiavelli

"In theory there is no difference between theory and practice. In practice there is."
--- Yogi Berra ?There are those who look at things the way they are, and ask why... I dream of things that never were, and ask why not?? --- Robert Kennedy Introduction This study, Strategic Factors of Institutional Practice Impacting Student Success in the Community College as Perceived by Students and Faculty: Academic Preparation, Work Ethics and Institutional Support, investigated the relationship between the impact factors identified to assess the influence of these factors on community college student success. Student success was defined as a multidimensional construct, suggesting that the definition was more than likely not similarly defined by individuals associated with the community college educational process. One possible definition for community college student success was the level of graduation or transfer rates. Unfortunately, not all community college students enroll to graduate or transfer; nor do all community college students arrive equipped to graduate or transfer. Therefore, whatever definition is applied to community college student success, there are many factors which impact the success of students in the two-year college system. 273 Of all the impact factors related to student success, academic preparation, work ethics and institutional support are strategically positioned within the domain of community college student success. The strategic reference is that these domains are primary constructs related to student success and provide a framework for restructuring the policies and practices to improve student achievement. Consequently, this study used four research questions to assess the relationship between students and faculty as these populations perceived the impact characteristics within the Strategic-Impact-Triad (SIT) Model factors (or domains). The impact factors were academic preparation (prior to college and during college), work ethics (for students and faculty), and institutional support (to support students and faculty). These three success domains were assessed separately and collectively. Separately, twelve characteristics (practices) were rated by students and faculty for each domain. The surveys presented to students and faculty members assessed the comparative perception scores for each domain, utilizing twelve characteristics of practice?or a total of 36 practices within the SIT Model. To better understand the disparities between faculty and students, this study created an instrument for capturing these variances in the form of perceptual responses to factors influencing student success within the community college. As previously noted, the domains of interest were academic preparation, work ethics, and institutional support. These three domains in the educational processes of the community college are?by the very nature of daily practice?necessary for students to be successful in college. Chapter I provided a background for the study, identified the problem to be investigated, stipulated the research questions to be answered, noted the significance of the study, set limitations and assumptions for the study, and identified key terms. 274 Chapter II addressed the SIT Model factors in terms of an extensive and pertinent literature review, inclusive of the historical basis for the existence of community colleges, challenges faced by the community college system of education, and created the framework of the SIT Model factors. 
For Chapter III, methods to assess respondent data were explored, inclusive of characteristics of the population sample, survey development, validity and reliability, data collection and procedures, and confidentiality and anonymity. Chapter IV discussed the findings of the study and correlated the findings to the research questions. The correlation between the research questions and the findings informed community colleges that perceptions of practices within each SIT Model domain suggested differences between students and faculty. Chapter V will provide a summary of the study, identify conclusions and suggest recommendations, and discuss implications of the study. To reiterate, the following research questions were used in this study: 1. What is the relationship between faculty and students? perceptions in assessing the impact that academic preparation has on the success of the college student? 2. What is the relationship between faculty and students? perceptions in assessing the impact that work ethics has on the success of the college student? 3. What is the relationship between faculty and students? perceptions in assessing the impact that institutional support has on the success of the college student? 275 4. What is the relationship between faculty and students? perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support? Summary of the Study Qualitative and quantitative methods were used to address the research questions in this study. The purpose of this framework was to promote a study ingrained with the principle promulgated by Moss (1998): ?the definition of validity is not just an interesting philosophical question; it can be seen to have real ethical, political, and economic consequences? (p. 6). For this study, its primary purpose was to investigate the relationship between two major groups within the community college system of education: students and faculty. The relationship under investigation was the perceptual outlook between and within these groups in terms of practices impacting community college student success. The study intended to contribute to perceptions and student success research, as well as to enable community colleges to better understand factors impacting their respective student body success, e.g., academic preparation, work ethics and institutional support. With an improved understanding of these domains of influence on student success, community colleges might be able to improve: 1) the relationship between students and faculty; 2) policies and practices in each domain of the SIT Model to promote success opportunities for all its students; and, 3) the workforce by preparing a better employee, citizen, and life- long learner. 276 To assess the relationship between these groups, this study used a survey to measure the responses of faculty and students (see Appendix B and Appendix D). The data reported suggested that perceptual differences related to student success does exist between these groups in terms of: 1) abilities of students to be successful in the community college; 2) practices related to academic preparation, work ethics, and institutional support; and, 3) the perceptions of the importance of the SIT Model domains as required for student success. Subsequent to pilot testing and validating the survey instruments, the web-based survey was open to respondents for 30 calendar days. 
At the conclusion of this timeframe, 152 faculty members and 396 students had volunteered to participate in the study. Using an analysis of variance (ANOVA) to assess the differences in the mean scores reported by the sample groups, the findings suggested the following:

1. Students and faculty did not agree that student abilities generally support student success in the community college (p < .001);
2. Academic preparation (AP) practices impacting the success of community college students were perceived similarly by the two groups (p = .342);
3. Work ethics (WE) practices impacting the success of community college students were perceived differently by the two groups (p = .018);
4. Institutional support (IS) practices impacting the success of community college students were perceived differently by the two groups (p < .001); and
5. Faculty members and students each reported that the SIT Model factors of influence on student success were required: faculty AP = 100%, WE = 96.9%, and IS = 93.0%; student AP = 98.4%, WE = 96.2%, and IS = 94.0%.

The quantitative and qualitative findings suggested that the research questions had been properly assessed (Moss, 1998).

Conclusions

Conclusions for the study are discussed in this section. The conclusions include quantitative and qualitative data, as well as comments from participants, to support the conclusions drawn from this study. Student and faculty comments refer to the practices of academic preparation, work ethics, and institutional support and can be used to identify, evaluate, and enhance the success of community or technical college students (SR – Student Respondent; FR – Faculty Respondent). Each conclusion addresses a specific finding of the study, beginning with the vital role of response rates and the importance of responding to surveys.

For this study, response rates during both phases of the data collection process were exceptionally low. The pilot study return was 3.2%, compared to approximately 1% for the final dataset. One need for increased sample sizes is related to sampling error (Fowler, 2002). For the entire community college system, with over 11 million students and 500,000 faculty members, a return rate of either 3.2% or 1% would yield large samples; however, for inferential purposes, samples should be as large as possible if the findings of this study are to be generalized to the populations under investigation. The second reason to improve response rates is related to the depth of the qualitative data (noting that quantitative scales, which promote consistent scoring, are not exempt from consideration). For this study, perceptions were variables that are both common and unique. Common perceptions are those which connect research to application through themes converted to measurable practice; unique perceptions are those themes which sometimes offer revolutionary or evolutionary methods for change. Because response rates were low in this study, it was concluded that there was a potential loss of valuable qualitative research data which may be important to decision makers in more fully understanding practices within the SIT Model domains to improve student success. Although survey response rates may generally be on the decline (Asiu, Antons & Fultz, 1998; Goho, 2002; Porter & Umbach, 2006), surveys are a proven vehicle for data collection (Chaudron, 2006; DeVellis, 2003; Spector, 1992).
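As a rough illustration of the sampling-error concern noted above (this calculation is not reported in the study), the margin of error for a proportion estimated from a random sample shrinks as the sample grows; the sketch below applies the standard normal-approximation formula to a sample the size of the student group, under the simplifying assumption of simple random sampling.

```python
# Illustrative sketch: normal-approximation margin of error for a sample proportion.
# Simple random sampling is assumed here; this is not a claim about the study's design.
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: 94% of 366 student respondents rated institutional support as required.
print(round(margin_of_error(0.94, 366), 3))   # ~0.024, i.e. roughly +/- 2.4 points
```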
The value of survey data is its potential to become a catalyst for improving the success of students in ways heretofore unknown. As suggested by a student: The college should determine specific student needs like this survey. A survey should be done to see what a student needs help with and then group students who have the same needs. This will help students get the help they need because the college can?t help every single person with so many different needs. [SR81] Another conclusion drawn from the data reported in this study was that a student success benchmark had been positively correlated to previous research and was useful in analyzing the research questions (Grimes & David, 1999; Kuh, Kinzie, Schuh & Whitt, 2005b; Lindholm, Szelenyi, Hurtado & Korn, 2005; Merrow, 2006; Smith, 2006; Wyatt, Saunders & Selmer, 2005). The benchmark has two components: 1) students and faculty do not view/perceive student abilities (practices) in the same way; 2) students and faculty highly agree that for students to be successful in the community college, elements of academic preparation, work ethics, and institutional support must be present and practiced in the institution. The benchmark, therefore, can be stated in terms of the following conclusion: students and faculty indicated that student success requires the SIT Model factors, even if students and faculty suggested disagreement about the practices and abilities of community college students to be successful. 279 To understand the value of the conclusion (benchmark) that students and faculty indicated agreement that the SIT model factors are required, while noting disagreement about the abilities or practices of students to be successful, consider the following comment: Perhaps, aiding the academically unprepared could come in the form of more required orientation classes that must be pre-requisite to students who drop below an A-B coming out of high school. In these orientation classes, a sampling of college English (writing, rhetoric, logic, grammar), math, science, reading, history....could be given...keyboarding....many students arrive to meet the challenge on the first day of class...and feel inadequate....lessons on attitude, how to cope with college failure, how to cope with working a job and studying in the college classroom, how to relate to family and college at the same time, how to deal with personal finances and spending during the college experience, how to study more efficiently...in this orientation, incorporate full-time, experienced faculty members who will address these orientations....For Example, have a vibrant member from each department...an outstanding, master teacher state in terms that young people understand...on how to achieve in college. [FR34] The undertones in the statement by this faculty member, suggested that he or she agreed that community college students are generally inadequately prepared for college level work; however, this same faculty member gives credence to the conclusion that the SIT Model domain practices are required for students to be successful in the community college. Statistically, the differences in reported perceptions by students and faculty suggested that these groups viewed the main effect of academic preparation on student success similarly (p = .342). Survey scores indicated that students and faculty are more likely to agree regarding the practices identified in the academic preparation section of the survey. 
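The keyword-derived reference counts reported in the next paragraph, and in the Chapter IV theme lists, reflect frequency counts over the open-ended responses. A minimal sketch of such a tally follows; the theme-to-keyword mapping and the sample responses are invented for illustration and do not represent the study's actual coding scheme.

```python
# Illustrative sketch: tallying keyword hits in open-ended responses by theme.
# The theme-to-keyword mapping and the responses below are invented examples.
from collections import Counter
import re

THEME_KEYWORDS = {
    "Tutoring Services": ["tutor", "tutoring", "lab"],
    "Academic Advising": ["advisor", "advising"],
}

def count_theme_references(responses, theme_keywords=THEME_KEYWORDS):
    """Count how many keyword hits each theme receives across all responses."""
    counts = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())
        for theme, keywords in theme_keywords.items():
            counts[theme] += sum(words.count(k) for k in keywords)
    return counts

responses = [
    "More tutoring and a math lab would help.",
    "I did not know I had an advisor until this semester.",
]
print(count_theme_references(responses))
# Counter({'Tutoring Services': 2, 'Academic Advising': 1})
```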
Based on a search of keywords in the open-ended comments in terms of what the community college should do to support students who are academically unprepared, 280 there were: 1) 269 references to providing tutorial services; 2) 183 suggestions that faculty should be more involved in the academic success of students; 3) 175 references to offering remedial courses; and 4) 72 references to improved academic advising. Although students and faculty reported that they agreed with each other on the academic preparation practices impacting student success, the researcher concluded that this finding should be interpreted with caution. Caution was urged due to 23 years of classroom experience and daily interaction with community college students in which the practices noted in the survey have been actively used. As a result of many years of experience with students and faculty, the researcher concluded that the reported scores by students and faculty members do not generally correlate well with actual academic preparation practices in and out of the classroom. Notwithstanding this conclusion, comments provided by faculty and students indicated strong support of the goal of ensuring effective practices to improve student achievement within the domain of academic preparation. Question presented: What should community colleges do to support students who are academically unprepared? Community colleges should offer remedial courses for students who are unprepared for higher level education. If a student is not capable of participating in higher education, due to a learning disability or lack of discipline, community colleges should offer technical programs. Campus tutors, professor availability, and accurate advising are other options community colleges should offer to support student success. [SR247] Determine why, and what factors led to their unpreparedness, and channel assistance towards that area for the students. [FR68] Give them all the support and encouragement possible through excellent, caring staff and available tutorial services. Staff, although very busy, need to take more time out to personally help all students. [SR90] 281 Faculty and students responded to the practices of work ethics by reporting a statistically significant perceptual difference (p = .018). Drawing an erroneous conclusion in this factor (work ethics) for student success is as detrimental to student success as to assume that academic preparation practices cannot be evaluated and improved. Nevertheless, there were several qualitative inputs which suggested that if students enroll in the community college lacking in a viable work ethic, it may be too late to instill good work ethics in the lives of students or faculty. The following statements from faculty and students correlate to the construct that work ethics needs to be ?brought? at the time of enrollment. Question presented: How can community colleges help students acquire and practice good work ethics? It may be too late to teach ethics once they get here. [FR11] I don't know that the college can help students acquire and practice good work ethics. I do believe that instructors that enforce deadlines, standards, and academic honesty help reinforce good work ethics and can motivate students to develop or reach those goals. [FR27] The first thing is by expecting them to have them. Accommodations should not be made to cover up poor work ethics. [FR77] I think society, parents and role models dictate this. It is not the job of the college. 
[FR116]

The mold is set for most people at this age. This may be the first time they are required to do quality work and submit it on time. The work ethic tends to be how to negotiate the best deal. [FR137]

Don't know, either the student has ethical behavior or he or she does not. You cannot grow ethics. [FR150]

That is the responsibility of the student. [SR11]

You can always say with workshops and seminars... but this is something that falls back on the home life and the parents of the students. [SR40]

Work ethics is entirely up to the individual, not the organization. [SR116]

In contrast to the students and faculty who perceived work ethics as necessary but falling outside the domain of community college responsibilities, there were also many students and faculty who perceived that student success is strongly impacted by work ethics. The following sample comments support the inclusion of work ethics in the community college to promote student success:

Have instructors use more group work. Instructors can also educate students on good work ethics. [FR147]

Students can acquire good ethics hopefully from their home environment but also in the classroom. If the teacher reaches out in excellence the students will respond. [SR183]

Tell students the truth. "If you don't work, you don't eat." [SR75]

Teach about work ethics. Have business people come in and speak about it. [SR162]

They really can't because students are just too busy; therefore, it leaves no room for college professors to do what they can to help the situation. [SR28]

By designing the curriculum to meet the needs of the employer. [SR17]

Demonstrate good work ethics from faculty. Could have a workshop on work ethics. [FR28]

The work ethics comments provided here by students and faculty are a sample of the total work ethics comments. The sample indicated that there is a difference between faculty and students in regard to the impact that work ethics has on student success. The conclusion regarding the practices of work ethics to improve student success is simply that the quantitative and qualitative findings support work ethics as part of all instructional programs within the community college system of education.

Cronbach's coefficient of internal consistency for the institutional support domain was .897, indicating that the institutional support items formed a reliable scale. The ANOVA results reported earlier (p < .001) indicated that students and faculty, more so than for academic preparation or work ethics, did not agree regarding the impact that institutional support had on student success. One way to understand this variance can be found in the comment made by the following student:

I wish some staff would not make things so difficult, such as having students go back and forth between offices. Why can't the offices not gather forms and send them collectively to the next station instead of having students carry them from place to place? Of course this is not always the case, but it seems office workers could be a little more sensitive to the amount of footwork the student has to do to accommodate paper trails. Also, it would be helpful if the textbooks were listed on-line. I understand that the college bookstore needs to make their profit, but many students live far from campus and would like the opportunity to purchase books from online bookstores, or maybe even online with the college bookstore. That being said, my college does an overall great job trying to meet student needs.
[SR28] The comment by this student is endemic of many comments returned by both students and faculty. There were many positive responses; however, there were also many comments which indicated that institutional support was less than expected by students and/or faculty. In terms of a conclusion regarding institutional support, there was ample data returned to suggest that student success is impacted by practices related to institutional support. One student summed up the value of the institutional support process: ?Assist the student from beginning to end? (SR20). Consequently, this study concluded that without adequate, professional, and caring institutional support, students are less likely to persist at the current community college if other options exist. 284 Institutional support is exactly the combined set of student success practices which should precisely ?assist the student from beginning to end? (SR20). The following comments by faculty and students will indicate the variances in perceptions regarding institutional practices to promote the success of community college students. Question posed: What can a community college do to improve its institutional support to help students succeed in college from enrollment to graduation? Follow up with the student?s progress from start to finish. [SR72] Keep the communication open at all times. [SR87] Respond to emails and phone calls, so students are not nervous or anxious about what to expect. [SR112] Have more activities to boost morale. Keep check by giving surveys more than once so that they will know how we feel and what areas we may need assistance in. [SR143] Consistently remind the student that College is the path to a better life that is right around the corner. Help them keep the end results in mind. [SR164] They can continue to care about the students and inform students from time to time about what resources are there to help and how we can get it. [SR173] Make services such as advising and tutoring available for all students. [SR274] Have staff members that want to see the students succeed and to go the extra mile to help. [SR25] Always have someone available for the student to talk to, or get advice from. [SR70] Encourage instructors to utilize community leaders, and others in particular professions as speakers in classes, touching on various topics covered in the text. [FR4] 285 Have a better working relationship with industry. Provide more co-op and apprenticeship programs. Allow students to see what their career really consists of. [FR5] Effective communication is the key for students to succeed: communication from the top down and the bottom up. [FR13] Have a student center, mentorship, student support center. [FR28] Provide auxiliary support programs, reasonable office hours for faculty, a library that is available. [SR37] Spend less on useless activities (new buildings, unproductive administrators, and academic fads) and more on libraries, labs, and tutors. [SR64] Admit students that are qualified, create an environment in which "the college" is seen not as an adversary but a supporter. [FR80] Better signage to help students find faculty and appropriate buildings on campus; caring people in Financial Aid and Enrollment Services Departments. [SR116] Treat students as our primary customer and provide maximum access to the elements that would provide support to his or her success. [SR142] This study concluded that institutional support is the glue that holds academic preparation and work ethics together. 
Without adequate academic preparation a student may succeed, but at such a rudimentary level that few employers will remotely consider hiring the individual. The same is true in regards to work ethics; a student who practices the most basic work ethics traits is more likely to encounter employment in areas where ?trivialization? is the norm. Trivialization is defined to be employment where few opportunities for advancements exist, learning new skills is not practiced, and minimal salary and benefits are the norm. Trivialization results in individuals who have demonstrated a history of poor work ethics, including basic educational skills. 286 Consequently, for academic support to thrive hand-in-hand with work ethics, the community college must progressively promote a receptive educational environment (institutional support structures) as a method to induce co-dependence between all three SIT Model factors. For the community college which practices a minimalist approach to student success, students may graduate. They may graduate without academic preparation skills sufficient to carry them to the next level of education; they may graduate with work ethics that will keep them at the bottom rung of the workforce; and they may graduate from their Alma Mater with a disgruntled attitude towards an institution which failed to meet their educational and basic student needs. For this study, the conclusion drawn from the findings related to institutional support is that without an effective organizational infrastructure to support academic preparation and work ethic practices to promote student achievement, serious consequences lie on the horizon for community colleges. Recommendations Several recommendations are discussed in this study. Although the focus for this study was the research questions, other findings are also included. The research questions will be discussed, followed by other recommendations derived from the study. Findings for research question 1 indicated that the relationship between students and faculty was not significant, suggesting that these groups considered the practices of academic preparation similarly. A sub-group calculation was conducted using gender as the independent variable. The finding was that gender difference was statistically significant, F (1,516) = 12.662, p < .001. The recommendation for research question 1 is 287 to conduct further study within the domain of academic preparation impacting student success using sub-groups to determine underlying constructs which have statistical significance. This recommendation is applicable to each SIT Model factor of influence on student achievement. For example, if sub-groups related to gender, ethnicity, administration, or individual experiences, were statistically significant, the possibility exists that these differences in the leadership of the community college may have both a direct and indirect bearing on the policies and practices established in the college which promotes the maximum success of the student body. This also applies to each factor within the SIT Model. Findings for research question 2 indicated that work ethics was statistically significant, suggesting that students and faculty viewed practices to promote student success in the community college differently. It is recommended that further study be conducted to better understand the complexity of the relationship of work ethics as practiced by students and faculty. 
This recommendation includes an analysis to determine the best predictor to support student success within the ten work ethics noted by WorkEthics.Org (2006). Additionally, to better assess the work ethics in the community college, a method to pre-test and post-test students is recommended. This data would provide a basis of comparison, providing opportunity for control group studies to investigate specific instructional and practical methods to determine if work ethics may be influenced by practices in the community college. Workforce studies are also recommended in this area specific to the relationship of the community college as a training partner for the development or enhancement of employee work ethics. 288 Findings for research question 3 indicated that students and faculty did not agree regarding the practices of institutional support to promote student success. It is recommended that additional study be conducted on the practices of institutional support across a much larger population to determine central tendencies across institutions within the community college system of education, identifying institutional success trends. For example, a broader scale of research may suggest community college practices which are identified as common among institutions of similar size, student body, and program offerings. Moreover, because students and faculty practice education within the control of the community college, it is important that institutional structures be better analyzed and understood as a method to promote student success. A poor example of effective institutional support would be the arrival of a student with poor work ethics, minimal basic skills, and who leaves the college with poor work ethics, having received little development in this area, and who graduates with minimal skills and a modicum of intellectual capital to use in the workforce. Findings for research question 4 indicated a very strong relationship between students and faculty. The recommendation for research question 4 includes the SIT Model Coefficient Equation as a method to develop policies to address each factor of the SIT Model in relation to the combined factors. The recommendation is suggested to develop a working model to measure student levels of success in each of the SIT Model domains to assess the overall needs of the student in each area; specifically, how well does the student perform academically, practice work ethics, and understand the institutional support structures in place to support maximum success. In the words of one wise student: ?Work with them to get them prepared? [SR114]. 289 Further investigation is recommended to discern the relationship between the 88.4% who indicated that they attained an A/B/C average in high school while approximately 50% required remedial Math, 30% needed remedial English, and 21% needed remedial Reading. This recommendation supports research question 1. To infer the self-reported benchmark identified in this study has application to community colleges throughout the system is to also recommend that this benchmark needs further testing and validation from additional studies with a much broader set of faculty and student respondents. Assuming that similar findings occur ( p < .001 for each self-reported practice assessed; over 90% of students and faculty in agreement in all three domains), this information could be used as the catalyst to redesign policies and practices to significantly improve student success. 
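One hedged illustration of how a replication could formally check this benchmark is to compare the faculty and student "required / not required" frequencies from Table 41 with a chi-square test of independence; this particular test was not part of the study's analysis and is shown only as a sketch.

```python
# Illustrative sketch (not an analysis reported in the study): comparing the
# faculty and student "required / not required" frequencies for institutional
# support (Table 41) with a chi-square test of independence.
from scipy.stats import chi2_contingency

# Rows: faculty, students; columns: required, not required (institutional support).
table = [[120, 9],
         [344, 22]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")   # a non-significant p would suggest
                                           # similar "required" rates in both groups
```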
Redesign in this context means building an effective system of practices that are achievable, measurable, and effective. It is recommended that community college leaders and other stakeholders treat the findings of this study as an information resource for investigating practices to improve the success of students in the community college. Research findings that are never applied tend to have little effect on the practices of the community college. In simple terms, if the leadership and other participants in the community college assume that "all is well" when, in fact, many practices remain untested or unvalidated, it becomes a game of chance whether students will be successful at a level that makes them productive in their chosen field, progressing because of academic preparedness, a viable work ethic, and a thirst for learning throughout one's life as a result of being exposed to the effective and caring practices of an educational institution.

It is recommended that this entire study be replicated across a much larger base of community and technical colleges to validate the study, inclusive of the survey instruments (faculty instrument and student instrument).

It is recommended that the survey instrument be evaluated using the statistical procedures associated with factor analysis, exploratory factor analysis, or confirmatory factor analysis, to explore any latent constructs within the individual SIT Model domains and within the overall student success domain (e.g., the 36 characteristics identified in this study).

It is recommended that the Strategic-Impact-Triad Model be researched to validate the SIT Model Coefficient Equation through a longitudinal study. This type of study would provide a framework to ascertain its merits and long-term application in the community college. This type of research may yield a model for entrance evaluations assessing student success functions related to the codependency of the SIT Model domains.

It is recommended that this study be replicated in the four-year college system and compared to studies within the two-year college system to discern what may be extracted from successful practices in the four-year system and applied to the two-year system, or vice versa. The underlying relationship in this recommendation is that community colleges are the "proving ground" for many transfer students.

Implications

An extensive number of studies have been conducted related to the domain of student success (Bailey, 2006a; Braxton, 2006; Brock et al., 2007; Horn, Nevill & Griffith, 2006; Jenkins, 2006). Moreover, student success comprises a large contingent of factors which influence student achievement. Student success has two main players in the process: students, who are both participants in and recipients of the educational processes that promote student success; and faculty, who are also participants in and recipients of the same processes. To delimit the entire spectrum of variables in this study, the Strategic-Impact-Triad Model was designed as a framework for conducting the study. The findings of this study indicated that students and faculty perceive practices impacting student success in both similar and dissimilar ways. To address these differences and similarities, the following three implications are stipulated.

First, as previously noted in this study, students in the community college are more likely to arrive at the institution unprepared for college requirements.
College is a conglomeration of study, reading, writing, work, family, time management, and many more activities necessary to be successful. The implication suggested here is that community college leaders should lead the way in finding every conceivable solution to creating a culture of practice which promotes student achievement at every possible turn in the college. Although every member of an educational institution is an integral part of ensuring student success, the college President and Deans are those individuals who ?hold the reins of power? to authorize resources to be applied to policies and practice. Without this leadership support, small pockets of success may be achieved; without the backing of the college leadership, the implication that student success will continue to be 292 studied as it has for years without effective outcomes or improvement in student success is the construct opined by Cohen (2005): ?research on community colleges has been conducted for many decades, and for just as many years it has been ignored by community college practitioners ? even when the practitioner and the researcher are the same person?? (p. 51). Leadership is the linchpin to student success. For without resources applied to the SIT Model domains of influence, student success will continue to follow the path identified by Cohen (2005). Next, faculty members have a major role to play in the SIT Model domains impacting the success of the very students they teach and interact with on a daily basis. The SIT Model proposed in this study is a model to create a culture of inquiry (Achieving the Dream, 2005; Dowd, 2005; McClenney & Greene, 2005; Reid, 2004; VanWagoner, Bowman & Spraggs, 2005). The implication stated here is that a culture of inquiry is a relentless pursuit of positive outcomes for students. For example, as noted in the survey questions for academic preparation, the practice of writing assignments is intended to be a part of the culture of inquiry to assess how this practice might be used to improve student achievement to improve the overall goal of student success?not just any form of success, but life-long success. Within the domain of work ethics, being a team player in group projects was a practice identified to help students learn the meaning of working in teams to accomplish goals, establish priorities, consider the well-being of the individual and the group?what has been called ?group dynamics.? As this implication is a relentless pursuit, this is a time intensive process. Within this study, it is suggested that time intensive educational practices should be based upon the simple premise of ?building blocks? or assembling a puzzle. Start with one piece and build upon that foundation. 293 Therefore, the implication for faculty is that the SIT Model will appear to be an insurmountable challenge, fraught with educational potholes. This study would declare that this is incorrect logic. The SIT Model is a process involving all participants, and the process is built one piece at a time. Without the culture of inquiry, the SIT Model is simply another model which will be ?? ignored by community college practitioners? (Cohen, 2005, p. 31). Without the culture of inquiry and faculty buy-in, the success of students will remain a practice of status quo. The Strategic-Impact-Triad Model is suggested as a method to bridge the culture of inquiry and the buy-in of faculty to foster practices across all endeavors of student achievement. 
The final implication for this study deals directly with the primary recipient of the SIT Model domains of influence?the community college student. The implication for this individual is broad in scope and serious in consequences. As the national economy depends on skilled workers and individuals of integrity, the community college student stands on the threshold of success?provided the SIT Model domains impacting student success are implemented. Whether the process is dubbed the SIT Model, effective institutional practices, or any other name, the goal is to promote the success of each student to his or her fullest potential. The implication suggested here is that failure to assess practices within the community college leaves the success of the student body to chance, not structured, evaluated practice. It could be argued that community colleges evaluate their practices on a regular basis and the counter-argument would be the qualitative feedback from students and faculty. The implication is simple: without a model to guide student success, through a structured set of practices which are evaluated and modified as needed, student achievement exists at the cost of inconsistent practice. 294 Chapter Summary The purpose in conducting this study was to investigate the relationships between students and faculty in terms of perceptions of practices impacting community college success. Four research questions were derived from a comprehensive review of pertinent literature. The four questions were: 1. What is the relationship between faculty and students? perceptions in assessing the impact that academic preparation has on the success of the college student? 2. What is the relationship between faculty and students? perceptions in assessing the impact that work ethics has on the success of the college student? 3. What is the relationship between faculty and students? perceptions in assessing the impact that institutional support has on the success of the college student? 4. What is the relationship between faculty and students? perceptions in assessing institutional practice to promote student success as specifically related to academic preparation, work ethics, and institutional support? To address the questions, an extensive review of the literature was undertaken and completed; a model was designed to establish a framework for the study (Strategic- Impact-Triad (SIT) Model); and a survey was designed, pilot tested, and submitted to the sample groups consisting of 396 students and 152 faculty members. Six colleges agreed to voluntarily participate in the study, covering Alabama, Florida, and Georgia. Reported 295 data by the participants were used to assess the scores as a method to specifically respond to each research question. The findings of the study indicated that differences do exist between students and faculty in terms of the practices within the community colleges participating in the study. While there were differences, there was also agreement in the domain of academic preparation. An ANOVA was used to determine significant relationships between the groups in the three domains of practice influencing student success: academic preparation, work ethics, and institutional support. Additionally, qualitative themes indicated support for the identified domains within the SIT Model. The findings of this study indicated that students and faculty do have different perceptions about factors impacting student success in the community college. 
Based on the data reported, there were more differences than similarities, which is what the researcher hypothesized throughout the study. The only statistical variance which indicated student and faculty member agreement was in the domain of academic preparation. Qualitative data supported the similarities within academic preparation practices. The overarching conclusion for this study may best be served by quoting Derek Bok (2006), in Our Underachieving Colleges: A Candid Look at How Much Students Learn and Why They Should Be Learning More: 296 One can always scoff at educational research or dismiss existing studies that involve institutions and students different from one?s own. Rather than carp at research, however, faculties would do better to support careful studies of critical reasoning within their own college. In the short term, such inquiries could help them decide which methods of teaching and learning are most appropriate and effective. In the longer run, research could explore the next great pedagogic frontier and help instructors understand how to evaluate individual students and adjust their methods of teaching to fit the varying cognitive styles, preconceptions, and epistemic assumptions that undergraduates bring to the classroom. So long as work of this kind remains undone, colleges run the risk of continuing to rely on familiar methods of instruction and curricular policies that do far less than they should to develop the very cognitive abilities that faculties endorse so strongly as the principal aim of a college education. (p. 145) The comments by Bok (2006) are an axiom related to the Strategic-Impact-Triad Model domains of influence on student success. So long as studies of this kind ?remain undone, colleges run the risk of continuing to rely on familiar methods of instruction and curricular policies? that fail to recognize relationships of factors impacting student success (p. 145). The SIT Model is a framework for change, but not for the sake of change; rather, change to improve the lives of students in the community college. The snowball-effect is that success now is much more likely to breed success in future generations of students throughout the totality of the community college system of education?this success is global in context, yet fundamentally human in its most basic form. The basic form of humanness is explained by Dr. Linda Lujan (2006) when she described her educational encounter with a community college: ?Fortunately, I found wonderful instructors, good advisers, and an environment that supported and encouraged me? (p. B21). Dr. Lujan?s comment is precisely the intended impact of the SIT Model in the community college. 297 REFERENCES Achieve, Inc. (2005). Measuring what matters: Creating a longitudinal data system to improve student achievement. Washington, DC. Retrieved 24 October 2006 from http://www.achieve.org/. Achieve, Inc. (2006). Closing the expectations gap, 2006: An annual 50-state progress report on the alignment of high school policies with the demands of college and work. Washington, DC. Retrieved 24 October 2006 from http://www.achieve.org/. Achieving the Dream. (2005). Resource guide for institutional transformation to improve student success at community colleges. Retrieved January 11, 2007, from http://www.achievingthedream.org. Achieving the Dream. (2006). Success is what counts: National initiative promotes change to improve student success at community colleges. 
Retrieved January 11, 2007, from http://www.achievingthedream.org/ _images/_index 03/SuccessCounts.pdf. ACT COMPASS System. (2006). Comprehensive Computer-Adaptive Testing System. ACT, Inc. Retrieved July 7, 2006, from http://www.act.org/compass/index.html. ACT. (2005a). Crisis at the core: Preparing all students for college and work (IC 050805270 7416). Iowa City, IA: Author. 298 ACT. (2005b). 2005 Retention/Completion Summary Tables. Iowa City, IA: Author. ACT. (2005c). Are high school grades inflated? (IC 050805240). Iowa City, IA: Author. ACT. (2006a). Ready to succeed: All students prepared for college and work (IC 0402SE060 7913). Iowa City, IA: Author. ACT. (2006b). Reading between the lines: What the ACT reveals about college-readiness in reading (7538). Iowa City, IA: Author. ACT. (2006c). Benefits of a high school core curriculum (8191). Iowa City, IA: Author. ACTE. (2006). ACTE voices position on high school reform. Nine points to change our schools. Techniques: Connecting Education & Careers, 81(3), 12. Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor?s degree attainment. Washington, DC: U. S. Department of Education, Office of Educational Research and Improvement. Adelman, C. (2005). Moving into two ? and moving on: The community college in the lives of traditional-age students. U.S. Department of Education. Washington, DC: Office of Vocational and Adult Education. Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through college. Washington, DC: U. S. Department of Education. Aickin, M. (2004). Inference and scientific exploration. Journal of Scientific Exploration, 18(2), 179-186. Alabama Commission on Higher Education. (2006). Student Database Reports, Fall 2005, Enrollment Summary Report ? 2 Year. Retrieved July 14, 2006, from http://www.ache.state.al.us/Abstract0506/Student%20Database/Index.htm. 299 Alford, 1839, 574-5; The Works of John Donne, vol III, 1572- 1631. Alliance for Excellent Education. (2006). Paying double: Inadequate high schools and community college remediation. Issue Brief (August 2006). Washington, DC: Author. Almeida, D. (1991). Do underprepared students and those with lower academic skills belong in the community college? A question of policy in light of the ?mission?. Community College Review, 18(4), 28-32. Alvarado, D. (2006). Postsecondary success and pluralism: A call for systemic coherency. National Postsecondary Education Cooperative, October 2006, http://nces.ed.gov/npec/. American Association of Community Colleges. (2006a). First responders: Community colleges on the front line of security. Washington, DC. American Association of Community Colleges. (2006b). Community College Fact Sheet. Retrieved July 17, 2006, from http://www.aacc.nche.edu/Content/Navigation. American Association of Community Colleges. (2007). Community College Facts at a Glance. Retrieved May 11, 2007, from http://www.aacc.nche.edu/Content/ NavigationMenu/AboutCommunityColleges/Fast_Facts1/Fast_Facts.htm. American Diploma Project. (2004). Creating a high school diploma that counts. A Partnership of Achieve, Inc., The Education Trust, and Thomas B. Fordham Foundation. Amey, J., & Long, P. (1998). Developmental coursework and early placement: Success strategies for underprepared community college students. Community College Journal, 22(3), 3-10. 300 Anderson, D. (2000). Character education: Who is responsible? Journal of Instructional Psychology, 27(3), 139-142. Appleby, D. (1990). 
Faculty and student perceptions of irritating behaviors in the college classroom. Journal of Staff, Program, and Organizational Development, 8, 41- 46. Arendale, D. (2005). Terms of endearment: Words that define and guide developmental education. Journal of College Reading and Learning, 35(2), 66-82. Ashburn, E. (2006). Colleges should modify their worker-training programs to meet needs of employers, survey finds. The Chronicle of Higher Education. Retrieved July 14, 2006, from http://chronicle.com/daily/2006/06/2006061202n.htm. Asiu, B., & Antons, C. & Fultz, M. (1998). Undergraduate perceptions of survey participation: Improving response rates and validity. Paper presented at the Annual Forum of the Association for Institutional Research (38th, Minneapolis, Minnesota, May 17-20, 1998. Attewell, P., Lavin, D., Domina, T., & Levey, T. (2006). New evidence on college remediation. The Journal of Higher Education, 77(5), 886-924. Ausburn, L. (2002). The freedom versus focus dilemma in a customized self-directed learning environment: A comparison of the perceptions of adult and younger students. Community College Journal of Research and Practice, 26, 225-235. Bailey, T. (2006a). Research on institution level practice for postsecondary student success. National Postsecondary Education Cooperative, October 2006, http://nces.ed.gov/npec/. 301 Bailey, T. (2006b). Ten years later: The community college research center and the changing roles of community colleges. CCFC Currents, Community College Research Center: Teachers College, Columbia University, New York. Bailey, T. et al. (2004). Improving student attainment in community colleges: Institutional characteristics and policies. Community College Research Center: Teachers College, Columbia University. Bailey, T. et al. (2005a). Community college student success: What institutional characteristics make a difference? Community College Research Center: CCRC Working Paper No. 3. Bailey, T. et al. (2005b). Beyond student right-to-know data: Factors that can explain community college graduation rates. CCRC Brief Number 29, ERIC: ED489092. Bailey, T., et al. (2006). Is student right-to-know all you should know? An analysis of community college graduation rates. Research in Higher Education, 47(5), 491- 519. Bailey, T., & Alfonso, M. (2005). Paths to persistence: An analysis of research on program effectiveness at community colleges. Community College Research Center: Teachers College, Columbia University, 6(1), 1-28. Bailey, T., Jenkins, D., & Leinbach, T. (2005). Graduation rates, student goals, and measuring community college effectiveness. Community College Research Center: Teachers College, Columbia University, New York. Bakunas, B., & Holley, W. (2004). Teaching organizational skills. Clearing House, 77(3), 92-95. 302 Barton, P. (2006). High school reform and work: Facing labor market realities. Policy Information Report: Princeton, NJ: Educational Testing Service. Baum, S, & Payea, K. (2005). Education Pays 2004. College Board, Revised Edition, 2005. Bettinger, E., & Long, B. (2004). Shape up or ship out: The effects on remediation on students at four-year colleges (Working Paper No. 10369). Cambridge, MA: National Bureau of Economic Research. Retrieved from the National Bureau of Economic Research Web site: www.nber.org/papers/w10369. Bettinger, E., & Long, B. (2005). Remediation at the community college: Student participation and outcomes. New Directions for Community Colleges, 129, 17-26. Biswas, R. (2006). 
A supporting role: How accreditors can help promote the success of community college students. An Achieving the Dream Policy Brief, October 2006. Retrieved January 3, 2007, from http://www.achievingthedream.org/default.tp. Blocker, C., Plummer, R., & Richardson, R. (1965). The two-year college: A social synthesis. Englewood Cliffs, NJ: Prentice-Hall, Inc. Boggs, G. (2004). Community colleges in a perfect storm. Change, 36(6), 6-11. Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton, NJ: Princeton University Press. Borden, V. (2004). Accommodating student swirl: When traditional students are no longer the traditional. Change, March/April 2004, 10-17. Boston Area Advanced Technological Education Connections (BATEC). (2007). BATEC Information Technology Workforce Skills Study. BATEC, University of Massachusetts: Boston, MA. Boswell, K. (2004). Bridges or barriers? Public policy and the community college transfer function. Change, 36(6), 22-29. Boswell, K., & Wilson, C. (2004). Keeping America's promise: A report on the future of the community college. Education Commission of the States, Denver, CO. Boulard, G. (2004). A fresh start: Remedial education gives students a new lease on college life. Community College Week, 17(1), 6-8. Bragg, D. (2001). Community college access, mission, and outcomes: Considering intriguing intersections and challenges. Peabody Journal of Education, 76(1), 93-116. Brancato, V. (2003). Professional development in higher education. New Directions for Adult and Continuing Education, 98, 59-65. Braxton, J. (2006). Faculty professional choices in teaching that foster student success. National Postsecondary Education Cooperative, June 2006, http://nces.ed.gov/npec/. Brint, S., & Karabel, J. (1989). The diverted dream: Community colleges and the promise of educational opportunity in America, 1900-1985. NY: Oxford University Press. Brock, T., et al. (2007). Building a culture of evidence for community college student success: Early progress in the Achieving the Dream Initiative. MDRC: New York, and Community College Research Center: Columbia University. Brozik, D. (2004). Whom do I blame? Phi Kappa Phi Forum, 84(4), 25-26. Burd, S. (2006). Working-class students increasingly end up at community colleges, giving up on a 4-year degree. The Chronicle of Higher Education, 52(40), A23. Byrd, K., & MacDonald, G. (2005). Defining college-readiness from the inside out: First-generation college student perspectives. Community College Review, 33(1), 22-37. Caboni, T., & Adisu, M. (2004). A Nation at Risk after 20 years: Continuing implications for higher education. Peabody Journal of Education, 79(1), 164-176. Callan, P., Doyle, W., & Finney, J. (2001). Evaluating state higher education performance: Measuring up 2000. Change, 33(2), 19. Callan, P., Finney, J., Kirst, M., Usdan, M., & Venezia, A. (2006). Claiming common ground: State policymaking for improving college-readiness and success. The National Center for Public Policy and Higher Education: San Jose, CA. Capaldi, E., Lombardi, J., & Yellen, V. (2006). Improving graduation rates: A simple method that works. Change, 38(4), 44-50. Carey, K. (2006). Is our students learning? The measurements elite colleges don't want you to see. The Washington Monthly, 38(9), 26-29. Carnevale, D. (2006). Michigan community colleges expect more online high-school students. The Chronicle of Higher Education, 52(36), A38. CCSSE (2005).
Engaging students, challenging the odds: 2005 findings. Community College Survey of Student Engagement, Community College Leadership Program, The University of Texas at Austin. Austin, TX: Author. CCSSE (2006). Act on fact: Using data to improve student success, 2006 findings. Community College Survey of Student Engagement, Community College Leadership Program, The University of Texas at Austin. Austin, TX: Author. Center for Digital Education. (2005). Digital community colleges and the coming of the millennials 2005. Report of Major Findings from the 2005 Digital Community Colleges Survey. Folsom, CA: Author. Retrieved January 7, 2007, from http://www.centerdigitaled.com. Chaudron, D. (2006). Master of all you survey: How to use surveys to improve organizations, teams and leaders. Organized Change Publications: San Diego, CA. Clagett, C. (2004). Applying ad hoc institutional research findings to college strategic planning. New Directions for Institutional Research, 2004(123), 33-48. Closson, R. (1996). The learning society: How shall community colleges respond? Community College Review, 24(Summer 1996), 3-18. Cohen, A. (2005). UCLA community college review: Why practitioners and researchers ignore each other (even when they are the same person). Community College Review, 33, 51-62. Cohen, A., & Brawer, F. (2003). The American community college (4th ed.). San Francisco: Jossey-Bass. Cohen, J. W. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates. College Board (2004). Education Pays 2004: The Benefits of Higher Education for Individuals and Society. New York, NY: Author. College Board (2005). Education Pays Update. A Supplement to Education Pays 2004: The Benefits of Higher Education for Individuals and Society. New York, NY: Author. College Board (2006a). Teachers and the uncertain American future. Center for Innovative Thought: New York, NY: Author. College Board (2006b). Ready for college and ready for work: Same or different? ACT: College and Workforce Training Readiness: Iowa City, IA. (IC 0402TA060). Collins, M., & Chandler, K. (1997). A guide to using data from the National Household Education Survey, NES 97-561. U.S. Department of Education, National Center for Educational Statistics. Washington, DC. Conley, D. (2005). College Knowledge: What it Really Takes for Students to Succeed and What We Can Do to Get Them Ready. San Francisco, CA: Jossey-Bass. ContinuingEducation.com. (2007). Organizational skills and practices. Retrieved January 5, 2007, from http://www.continuingeducation.com/pharmtech/orgskills/orgskills.pdf. Coomes, M., & Debard, R. (2004). A generational approach to understanding students. New Directions for Student Services, 106, 5-16. Cordry, S., & Wilson, J. (2004). Parents as first teacher. Education, 125(1), 56-62. Cortina, J. (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78, 98-104. Crawley, G., & Klomparens, K. (2000). The 1997-98 MSU Ph.D. Alumni Survey. Michigan State University, East Lansing, MI. Creswell, J. (2003). Research design: Qualitative, quantitative, and mixed methods approaches. SAGE Publications: Thousand Oaks, CA. Dale, P., & Drake, T. (2005). Connecting academic and student affairs to enhance student learning and success. New Directions for Community Colleges, 131, 51-64. Dalgety, J., & Coll, R. (2006). The influence of first-year chemistry students'
learning experiences on their educational choices. Assessment & Evaluation in Higher Education, 31(3), 303-328. Daugherty, R. (2005). High school to college and careers: Aligning state policies. Southern Regional Education Board, Atlanta, GA. Davidovitch, N., & Soen, D. (2006). Class attendance and students? evaluation of their college instructor. College Student Journal, 40(3), 691-703. Day, P., & McCabe, R. (1997). Remedial education: A social and economic imperative. Executive Issue Paper submitted to American Association of Community Colleges, October, 1997. DeBard, R. (2004). Millennials coming to college. New Directions for Student Services, 106, 33 ? 45. DeGenaro, W. (2006). Community colleges, the media, and the rhetoric of inevitability. Community College Journal of Research and Practice, 30, 529-545. Derby, D., & Smith, T. (2004). An orientation course and community college retention. Community College Journal of Research and Practice, 28, 763-773. DeVellis, R. (2003). Scale development: Theory and applications (2nd ed.). Sage Applied Social Research Methods Series, Vol. 26. Thousand Oaks, CA: Sage. 308 Dicroce, D. (2005). How to make community colleges the first leg of a journey. The Chronicle of Higher Education, 52 (10), B22. Digest of Education Statistics. (2005). Employees in degree-granting institutions, by employment status, sex, control and type of institution, and primary occupation: Fall 2003. U. S. Department of Education, National Center for Education Statistics, 2003 Integrated Postsecondary Education Data System, Winter 2003- 04. Table 223. Dillman, D. (1991). The design and administration of mail surveys. Annual Review of Sociology. 17: 225-249. Dillman, D. (2000). Mail and Internet surveys: the tailored design method (2nd ed.). New York: John Wiley & Sons, Inc. Dobelle, E. (2006). Reform for college-readiness. The Journal of New England Board of Higher Education, 20(4), 9. Dougherty, K, & Hong, E. (2005). State systems of performance accountability for community colleges: Impacts and lessons for policymakers. An Achieving the Dream Policy Brief. Community College Research Center, Columbia University. Dougherty, K., Reid, M., & Nienhusser, H. (2006). State policies to achieve the dream in five states: An audit of state policies to aid student access to and success in community colleges in the first five achieving the dream states. Community College Research Center, Columbia University. Dounay, J. (2006a). Ensuring rigor in the high school curriculum: What states are doing. Denver, CO: Education Commission of the States. 309 Dounay, J. (2006b). Alignment between high school graduation and college admissions course requirements. Denver, CO: Education Commission of the States. Dounay, J. (2006c). Alignment of high school graduation requirement and state-set college admissions requirements. Denver, CO: Education Commission of the States. Dounay, J. (2006d). Embedding college readiness indicators in high school curriculum and assessments. Denver, CO: Education Commission of the States. Dounay, J. (2006e). Involving families in high school and college expectations. Denver, CO: Education Commission of the States. Dowd, A. (2005). Best practices for assessing performance in community colleges: The emerging culture of inquiry. Community College Student Success Project, University of Massachusetts: Boston, pp. 1-41. Doyle, W. (2006). Community college transfers and college graduation: Whose choices matter most? Change, May/June 2006, 56-58. DuBois, G. (1999). 
State University of New York: Fulfilling the promise. Community College Journal of Research and Practice, 23: 255-268. Dungy, G. (2003). Organization and Functions of Student Affairs. Student Services: A Handbook for the Profession (Komives & Woodard, 2003). Jossey-Bass: 4th Ed., San Francisco, CA. Eaton, J. (2006). Recreating America?s community colleges: Implications of the substantive issues in their future. Community College Journal of Research and Practice, 30, 91-93. 310 Education Commission of the States. (2006). The National Forum on Education Policy. Minneapolis, MN. July 11-14, 2006. Edwards, T. (2007). Shaping organizational futures through generative leadership. Leadership Abstracts, 20(2), 1 ? 4. Eells, W. (1931). The junior college. NY: Houghton Mifflin Company. Emanuel, R. (2005). The case for fundamentals of oral communication. Community College Journal of Research and Practice, 29, 153?162. Erdman, H., & Ogden, W. (2000). Reconsidering William Rainey Harper as ?Father of the Junior College?. College Student Journal, 34(3), 434-439. Evelyn, J. (2004a). 2-year colleges face an identity crisis: They play a central role in job training and access to higher education, but their public image suffers. The Chronicle of Higher Education, 51(10, B1. Evelyn, J. (2004b). Community colleges at a crossroads: With new missions, surging enrollment, and falling support, even the promise of access for all is in question. The Chronicle of Higher Education, 50(34), A27. Evelyn, J. (2005). Recruiting the world: Community colleges go globe-trotting. 2-year institutions are drawing more international students, and recruiters are crossing oceans to meet them. The Chronicle of Higher Education, 52(10), B1. Field, A. (2005). Discovering Statistics Using SPSS (2nd ed.). Thousand Oaks, CA: SAGE Publications, Inc. Field, K. (2005). House committee is expected to approve worker-training bill with funds for community colleges. The Chronicle of Higher Education. Retrieved July 18, 2006, from http://chronicle.com/daily/2005/02/2005021702n.htm. 311 Florida Community Colleges & Workforce Development (2005). Developmental education in Florida community colleges. Florida Department of Education, Tallahassee, FL. Forster, G. (2006). The embarrassing good news on college access. The Chronicle of Higher Education, 52(27), B50. Fowler, F. (2002). Survey research methods (3rd ed.). Sage Applied Social Research Methods Series, Vol., 1. Thousand Oaks, CA: Sage. Franco, R. (2002). The civic role of community colleges: Preparing students for the work of democracy. The Journal of Public Affairs, 2002 Supplement 1, 6, 119-138. Franke, V. (2001). Generation X and the military: A comparison of attitudes and values between West Point cadets and college students. Journal of Political and Military Sociology, 29 (92-119), 92 ? 119. Gilbert, C. (1999). We are what we wear: Revisiting student dress codes. Brigham Young University Education and Law Journal, 2, 3-17. Gillum, F., & Davies, T. (2003). The reality of perceptions: The future earning power of community college students. Community College Journal of Research and Practice, 27, 239-252. Goho, J. (2002). Mixed mode effects in a community college graduate survey. Paper presented at the Annual Forum of the Association for Institutional Research (42nd, Toronto, Canada, June 2-5, 2002). Gordon, D. (2003). A nation reformed? American education 20 years after A Nation at Risk. Cambridge, MA: Harvard Education Press. 312 Gorko, et al. (1994). 
Myths about student-faculty relationships: What do students really want? Journal on Excellence in College Teaching, 5, 51-65. Graham, P. (2003). A nation reformed? American education 20 years after A Nation at Risk (pp. vii-xi). D. Gordon (Ed.) in Forward. Cambridge, MA: Harvard Education Press. Gravetter, F., & Wallnau, L. (2007). Statistics for behavioral sciences. Thomson Wadsworth: Belmont, CA. Greene, J. (2000). The cost of remedial education: How much Michigan pays when students fail to learn basic skills. MacKinac Center for Public Policy, Midland, Michigan. Retrieved September 27, 2006, from http://www.mackinac.org. Greene, J., & Forster, G. (2003). Public high school graduation and college readiness rates in the United States. Center for Civic Innovation at the Manhattan Institute: Education Working Paper, No. 3. Manhattan Institute for Policy Research: New York. Greene, J., & Winters, M. (2005). Public high school graduation and college-readiness rates: 1991-2002. Center for Civic Innovation at the Manhattan Institute: Education Working Paper, No. 8. Center for Civic Innovation at the Manhattan Institute for Policy Research: New York. Grimes, S., & David, K. (1999). Underprepared community college students: Implications of attitudinal and experiential differences. Community College Review, 27(2), 73-92. Grubb, N., & Lazerson, M. (2004). Community colleges need to build on their strengths. The Chronicle of Higher Education, 51 (10), B16. 313 Halpin, R. (1990). An application of the Tinto model to the analysis of freshman persistence in a community college. Community College Review, 17(4), 22-33. Hamilton-Attwell, A. (1998). Productivity and work ethics. Work Study, 47(3), 79-86. Hammons, C. (2004). The cost of remedial education: How much Alabama pays when students fail to learn basic skills. Alabama Policy Institute, Birmingham, AL. Hansen, R. (2006). Benefits and problems with student teams: Suggestions for improving team projects. Journal of Education for Business, 82(1), 11-19. Hanson, C. (2006). From learning to education: A new paradigm for the community college. Community College Review, 34(2), 128-138. Harbour, C., Davies, T., & Lewis, C. (2006). Colorado?s voucher legislation and the consequences for community colleges, Community College Review, 33(3/4), 1-18. Haworth, J. (1997). The misrepresentation of the Generation X. About Campus, Sept- Oct, 10 ? 15. Haycock, K. (2006). Promise abandoned: How policy choices and institutional practices restrict college opportunities. The Education Trust: Washington, DC. Hendrick, R., Hightower, W., & Gregory, D. (2006). State funding limitations and community college open door policy: Conflicting priorities? Community College Journal of Research and Practice, 30, 627-640. Hicks, R. (2005). Assessing the academic, personal and social experiences of pre-college students. NACAC Journal, 186, 18-24. Hill, R., & Fouts, S. (2005). Work ethic and employment status: A study of jobseekers. Journal of Industrial Teacher Education, 42(3), 48-65. 314 Hill, R., & Petty, G. (1995). A new look at selected employability skills: A factor analysis of the occupational work ethic. Journal of Vocational Education Research, 20(4), 59-73. Hirsch, G. (2001). Helping college students succeed: A model for effective intervention. Brunner-Routledge: Philadelphia, PA. Honeyman, D., & Sullivan, M. (2006). Recreating America?s community colleges: Critical policy issues facing America?s community colleges. 
Community College Journal of Research and Practice, 30, 177-182. Horn, L., Nevill, S., & Griffith, J. (2006). Profile of undergraduates in U.S. postsecondary education institutions: 2003-04: With a special analysis of community college students (NCES 2006-184). U.S. Department of Education. Washington, DC: National Center for Educational Statistics. Horn, R., & Ethington, C. (2002). Self-reported beliefs of community college students regarding their growth and development: Ethnic and enrollment status differences. Community College Journal of Research and Practice, 26(5), 401-413. Hughes, K, & Karp, M. (2006). Strengthening transitions by encouraging career pathways: A look at state policies and practices. American Association of Community Colleges and League for Innovation in the Community College. Hugo, E. (2001). Dual enrollment for underrepresented student populations. New Directions for Community Colleges, 113, 67-72. Institute for Higher Education Policy. (2006). Making accountability work: Community college and statewide higher education accountability systems. Washington, DC: Author. 315 Jacobson, D. (2005). The new core competence of the community college. Change, July/August, 27(4), 52-61. Jacoby, D. (2006). Effects of part-time faculty employment on community college graduation rates. Journal of Higher Education, 77(6), 1081-1103. Jenkins, D. (2006). What community college policies and practices are effective in promoting student success? A study of high-and low-impact institutions. Community College Research Center, New York, NY. Jenkins, D., & Boswell, K. (2002). State policies on community college remedial education: Findings from a national survey. Education Commission of the States: Denver, CO. Jenkins, L. (2006). The $100 billion problem: Permitting forgetfulness. School Administrator, 63(3), 44-45. Jenkins, R. (2005). Know thy students: The truth about community-college students often flies in the face of long-established stereotypes. The Chronicle of Higher Education, 52(6), Page C1. Johnson County Community College (JCCC) (1996). Johnson County Community College Student Needs Assessment. Office of Institutional Research, Oct 96, ED 405059, JC970218 Johnson, M. (2007). Wallace State?s new rules of business: Affirming the truths of intentional transformation. Community College Journal of Research and Practice, 31 (6), 511 ? 516. 316 Jordan, W., Cavalluzzo, L., & Corallo, C. (2006). Community college and high school reform: Lessons from five case studies. Community College Journal of Research and Practice, 30, 729-749. Juhnke, et al. (1987). Effects of attractiveness and nature of request on helping behavior. Journal of Social Psychology, 127(4), 317-322. Karp, M., Bailey, T., Hughes, K., and Fermin, B. (2005). Update to state dual enrollment policies: Addressing access and quality. U.S. Department of Education, Office of Vocational and Adult Education, Washington, DC. Kaye, R., Lord, J., & Bottoms, G. (2006). Getting students ready for college and careers. Southern Regional Education Board. Atlanta, GA: SREB. Kazis, R. (2006). Building a pipeline for college access and success. The Journal of the New England Board of Higher Education, 20(4), 13-15. Kezar, A. (2006). Redesigning for collaboration in learning initiatives: An examination of four highly collaborative campuses. Journal of Higher Education, 77(5), 804- 838. Kintzer, F., & Bryant, D. (1998). Global perceptions of the community college. Community College Review, 26, 35 ? 55. Kinzie, J., & Kuh, G. (2004). 
Going DEEP: Learning from campuses that share responsibility for student success. About Campus: December 2004, 2-8. Kirst, M., & Venezia, A. (2004). From high school to college: Improving opportunities for success in postsecondary education. San Francisco, CA: Jossey-Bass. Kirst, M., & Venezia, A. (2006). What states must do. The Chronicle of Higher Education, 52 (27), B36. 317 Kisker, C. (2006). Integrating high school and the community college: Previous efforts and current possibilities. Community College Review, 34(1), 68-86. Kleiner, B., & Lewis, L. (2005). Dual enrollment of high school students at postsecondary institutions: 2002-03 (NCES 2005-008). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Kline, P. (1999). The handbook of psychological testing (2nd edition). London: Routledge. Knight, W, & Moore, M., & Coperthwaite, C. (1997). Institutional research: Knowledge, skills, and perceptions of effectiveness. Research in Higher Education, 38(4), 419 ? 433. Komives, S., & Woodard, D. (2003). Student Services: A Handbook for the Profession. 4th Ed., Jossey-Bass: San Francisco, CA. Konings, K., Brand-Gruwel, S., & Merrienboer, J. (2005). Towards more power learning environments through combining perceptions of designers, teachers, and students. The British Psychological Society, 75, 645-660. Kozeracki, Carol. (2002). ERIC review: Issues in developmental education. Community College Review, 29, 83-100. Kozeracki, C., & Brooks, B. (2006). Emerging institutional support for developmental education. New Directions for Community Colleges, 136, 63-73. Kraman, J. (2006). Closing the expectations gap 2006: An annual 50-state progress report on the alignment of high school policies with the demands of college and work. Achieve, Inc.: Washington, DC. 318 Krueger, C. (2006). Dual enrollment: Policy issues confronting state policymakers. Education Commission of the States: Denver, CO. Kuh et al. (2006). What matters to student success: A review of the literature. Commissioned Report for the National Symposium on Postsecondary Student Success: Spearheading a Dialog on Student Success. National Postsecondary Education Cooperative, July 2006, http://nces.ed.gov/npec/. Kuh, G. (2001). The National Survey of Student Engagement: Conceptual framework and overview of psychometric properties. Bloomington, IN: Indiana University Center for Postsecondary Research. Kuh, G., Kinzie, J., Schuh, J., & Whitt, E. (2005a). Never let it rest: Lessons about student success from high-performing colleges and universities. Change, 37(4), 44-51. Kuh, G., Kinzie, J., Schuh, J., & Whitt, E. (2005b). Student Success in College: Creating Conditions That Matter. Jossey-Bass: San Francisco, CA. Kuh, G. (2007). What student engagement data tell us about college readiness. Peer Review, 9(1), 4 ? 8. Labaree, D. (1997). Public goods, private goods: The American struggle over educational goals. American Educational Research Journal, 34(1), 39 ? 81. Laden, B. (2002). A cross-national perspective: How community college leaders understand and use research in decision-making in California and Ontario. Paper presented at the annual meeting of the Association for the Study of Education, Sacramento, California. 319 Lamkin, M. (2004). To achieve the dream, first look at the facts. Change, 36(6), 12 ? 15. Lancaster, G., Dodd, S., & Williamson, P. (2004). Design and analysis of pilot studies: recommendations for good practice. Journal of Evaluation in Clinical Practice, 10(2), 307 ? 312. Lancaster, L. 
(2003). The click and clash of generations. Library Journal, 128(17), 36- 39. Lancaster, L., & Stillman, D. (2002). When Generations Collide: Who They Are. Why They Clash. How to Solve the Generational Puzzle at Work. HarperCollins: New York, NY. Larose, M. (2003, August 19). Budget cuts forcing tuition hikes across the country. Community College Times. Retrieved July 14, 2006, from http://www.aacc.nche.edu/Template.cfm?Section=NewsandEvents&template=/Co ntentManagement/ContentDisplay.cfm&ContentID=10973&InterestCategoryID= 272. Levine, A. & Cureton, J. (1998). Collegiate life: An obituary. Change, 30(3), 12-17. Lindholm, J., Szelenyi, K., Hurtado, S., & Korn, W. (2005). The American college teacher: National norms for the 2004-2005 HERI faculty survey. University of California, Los Angeles: Higher Education Research Institute. Long, B. (2006). Using research to improve student success: What more could be done. National Postsecondary Education Cooperative, October 2006, http://nces.ed.gov/npec/. 320 L?Orange, H., & Ewell, P. (2006). P-16 data systems: An alignment status report. Data Quality Campaign & National Center for Educational Accountability. Lord, J. (2002a). Student readiness for college: Connecting state policies. Southern Regional Education Board (SREB): Atlanta, GA. Lord, J. (2002b). High school to college and careers: Aligning state policies. Southern Regional Education Board (SREB): Atlanta, GA. Lord, J., Marks, J., & Creech, J. (2005). Creating college opportunity for all: Prepared students and affordable colleges. Southern Regional Education Board (SREB): Atlanta, GA. Lorenzetti, J. (2006). How not to run an orientation course: Research reveals flaws in orientation course for online students. Distance Education Report 10, 7(3), pp. 1,2 of 2. Lovett, C., & Mundhenk, R. (2004). How can colleges prove they?re doing their jobs?: We need an honest conversation. The Chronicle of Higher Education. The Chronicle Review, 1-10. Retrieved September 6, 2005, from http://chronicle.com/weekly/v51/i02/02b00601.htm/. Lujan, L. (2006). I wanted to give something back. The Chronicle of Higher Education, 53(10), B21. Luo, J., & Jamieson-Drake, D. (2004). Linking student precollege characteristics to college development outcomes: The search for a meaningful way to inform institutional practice and policy. Paper presented at the Annual Forum of the Association for Institutional Research (AIR) (44th, Boston, MA, May 28-June2, 2004). 321 Lynch, D. (2005). Differences between student and faculty perceptions of learning strategies. The Teaching Professor, 19, 4. Maloney, W. (2003). Connecting the texts of their lives to academic literacy: Creating success for at-risk, first-year college students. Journal of Adolescent & Adult Literacy, 46(8), 664-673. Marburger, D. (2001). Absenteeism and undergraduate performance. Journal of Economic Education, 32(Spring), 99-109. Marburger, D. (2006). Does mandatory attendance improve student performance? Journal of Economic Education, 37(2), 148-155. Marshall, S. (2007). No student is a passive learner. Communications of the ACM, 50(1), 12. Marti, N. (2005). Overview of the CCSSE instrument and psychometric properties. The Community College Survey of Student Engagement. Retrieved January 3, 2007, from http://hawaii.hawaii.edu/assessment/Resources/CCSSE%202004/Appendix /psychometrics_paper.doc. Martin, C., & Tulgan, B. (2002). Managing the Generation Mix: From Collision to Collaboration. HRD Press, Inc.: Amherst, MA. Maypole, J., & Davies, T. (2001). Students? 
perceptions of constructivist learning in a community college American History II survey course. Community College Review, 29(2), 54 ? 79. McArthur, R. (2005). Faculty-based advising: An important factor in community college retention. Community College Review, 32(4), 1-19. 322 McClenney, K., & Greene, T. (2005). A tale of two students: Building a culture of engagement in the community college. About Campus, July-August 2005, 2-7. McLeish, A. (2002). Employability skills for Australian small and medium sized enterprises. Commonwealth Department of Education Science & Training: Australia. McJunkin, K. (2005). Remedial education in the community colleges: Understanding the problem and proposing solutions. UCLA Community College Bibliography. Community College Journal of Research and Practice, 29(6), 495-500. McKinney, J., McKinney, K., Franiuk, R., & Schweitzer, J. (2006). The college classroom as a community: Impact on student attitudes and learning. College Teaching, 54(3), 281-284. Mellow, G., & Talmadge, R. (2005). Creating the resilient community college. Change, 37(3), 58-66. Merrow, J. (2006). My college education: Looking at the whole elephant. Change, 38(3), 8-15. Messineo, M., & DeOllos, I. (2005). Are we assuming too much? Exploring students? perceptions of their computer competence. College Teaching, 53 (2), 50 ? 55. Miller, M. (2006). Assessing college-level learning. The National Center for Public Policy and Higher Education: San Jose, CA. Miller, M., & Pope, M., & Steinmann, T. (2005). A profile of contemporary community college student involvement, technology use, and reliance on selected college life skills. College Student Journal, 39(3), 596-603. 323 Miley, W., & Gonsalves, S. (2005). A simple way to collect data on how students view teaching styles. College Teaching, 53(1), 20. Milliron, M., & E. de los Santos, G. (2004). Making the most of community colleges on the road ahead. Community College Journal of Research and Practice, 28, 105- 122. Milliron, M., & Wilson, C. (2004). No need to invent them: Community colleges and their place in the education landscape. Change, November/December, 22-58. Morris, et al. (2005). Measuring what matters: Milestones and transitions for student success. Community College Student Success Project. Presented at the Annual Forum of the Association for Institutional Research (CIR), June 1, 2005: San Diego, California. Motta, J. (1999). Community colleges in Massachusetts. Community College Journal of Research and Practice, 23: 243-253. National Assessment of Adult Literacy (NAAL) (2005). A first look at the literacy of America?s adults in the 21st century. National Center of Education Statistics. U.S. Department of Education: Jessup, MD. National Association of Manufacturers. (2005). The looming workforce crisis: Preparing American workers for 21st century competition. Labor Day Report 2005. Washington, DC: Author. National Center for Educational Statistics. (2003). U.S. Department of Education, Community College Students: Goals, Academic Preparation and Outcomes, NCES 2003-164, Washington, DC: U.S. Government Printing Office. 324 National Commission on Excellence in Education (NCEE) (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office. National Postsecondary Education Cooperative [NPEC] (2006). Student outcomes: Current Activities, Student Success. Retrieved December 5, 2006 from http://nces.ed.gov/npec/student_outcomes_ca.asp. Nespoli, L., & Gilroy, H. (1999). 
New Jersey's community colleges: An experiment in "Coordinated Autonomy." Community College Journal of Research and Practice, 23: 269-281. Noel-Levitz (2006). National Freshman Attitudes Report. 2006 National Research Study, 2006 National Freshman Attitudes Study: Denver, CO. Nunnally, J. (1978). Psychometric theory. New York: McGraw-Hill. O'Banion, T. (1997). A Learning College for the 21st Century. American Council on Education, Series on Higher Education. Oryx Press: Phoenix, AZ. Olsen, D. (2000). Institutional research. New Directions for Higher Education, 111, 103-111. Olson, L. (2006). Skills for work, college readiness are found comparable. Education Week, 25(36), 1-19. Oudenhoven, B. (2002). Remediation at the community college: Pressing issues, uncertain solutions. New Directions for Community Colleges, 117(Spring 2002), 35-44. Overby, B. (2004). Reality versus perception: Using research to resolve misconceptions about developmental programs and promote credibility and acceptance. Inquiry, 9(1), 1-10. Palazesi, L., & Bower, B. (2006). Self-identity modification and intent to return: Baby boomers reinvent themselves using the community college. Community College Review, 33, 44-67. Pallant, J. (2007). SPSS survival manual: A step by step guide to data analysis using SPSS for Windows (3rd ed.). Berkshire, England: Open University Press. Palmer, J. (Ed.). (2000). How community colleges can create productive collaborations with local schools. New Directions for Community Colleges, 111 (Fall 2000). San Francisco: Jossey-Bass. Paredes, R. (2006). Developing a statewide strategy for P-16: Closing the gaps between public education and higher education. Commissioner's Report on Higher Education, April 20, 2006. Pathways to College Network. (2004). A shared agenda: A leadership challenge to improve college access and success. The Education Resources Institute (TERI). Boston, MA: Author. Pathways to College Network. (2006). College readiness for all toolbox. Retrieved September 20, 2006, from http://www.pathwaystocollege.net/index.html. Perin, D. (2002). The location of developmental education in community colleges: A discussion of the merits of mainstreaming vs. centralization. Community College Review, 30(1), 27-44. Perin, D. (2006). Can community colleges protect both access and standards? The problem of remediation. Teachers College Record, 108(3), 339-373. Perna, L., & Thomas, S. (2006). A framework for reducing the college success gap and promoting success for all. National Postsecondary Education Cooperative, July 2006, http://nces.ed.gov/npec/. Pett, M., Lackey, N., & Sullivan, J. (2003). Making sense of factor analysis: The use of factor analysis for instrument development in health care research. Sage Publications, Inc.: Thousand Oaks, California. Petty, G. (1993). Development of the occupational work ethic inventory. Paper presented at the 1993 annual American Vocational Association meeting, Nashville, Tennessee. Phelan, D. (2004). Enrollment policies and student access at community colleges. Education Commission of the States: Denver, CO. Phelps, P. (2006). The three Rs of professionalism: When teachers commit to three key values, professionalism improves. Kappa Delta Pi, 42(2), 69-71. Phillips, D., & Skelly, K. (2006). College-readiness for all. School Administrator, 63(1), 26-32. Phipps, R. (1998). College remediation: What it is, what it costs, what's at stake. The Institute for Higher Education Policy: Washington, DC. Phillippe, K., & Shults, C. (2003).
State-by-state profile of community colleges. American Association of Community Colleges, 6th Ed., Community College Press, Washington, DC. 327 Phillippe, K., & Sullivan, L. (2005). National Profile of Community Colleges: Trends & Statistics, American Association of Community Colleges, 4th Ed., Community College Press: Washington, DC. Pierson, P., & Holmes, G. (2007). Perceptions of work ethic among college seniors: A comparative study. Journal of College and Character, 2, 1-10. Retrieved January 17, 2007, from http://www.collegevalues.org/articles.cfm?a=1&id=603 . Pipho, C. (2001). State policy options to support a P-16 system of public education. Education Commission of the States: Denver, CO. Porter, S., & Umbach, P. (2006). Student survey response rates across institutions: Why do they vary? Research in Higher Education, 47(2), 229-240. Powell, W. (1989). The whole is not greater than the sum of its parts, but some of the parts are pretty darn good. Contemporary Sociology, 18(4), 490-493. Prensky, M. (2006). On being disrespectful. Educational Leadership, 64(2), 92. Puka, B. (2005). Student cheating. Liberal Education, 91(3), 32-25. Raines, C. (2003). Connecting Generations: The Sourcebook for a New Workplace. Crisp Publications, Inc.: www.cripslearning.com. Reason, R., Terenzini, R., & Domingo, R. (2005). Developing social and personal competence in the first year of college. Paper presented at the meeting of the Association for the Study of Higher Education, November 2005, Philadelphia, PA. Reference Service Press (2003). College enrollment at all-time high. Retrieved November 23, 2005, from http://www.rspfunding.com/articles/article/1436530/14618.htm. 328 Reid, A. (2004). Towards a culture of inquiry in DCES. Department of Education and Children?s Services, Government of South Australia. Occasional Paper Series, No. 1, 1-19. Restauri, S. (2004). Creating an effective online distance education program using targeted support factors. TechTrends: Linking Research & Practice to Improve Learning, Nov/Dec 2004, 48(6), p. 32-39. Richardson, E. (2006). Promoting broad access and student achievement: A test of the public will. National Postsecondary Education Cooperative, October 2006., http://nces.ed.gov/npec/. Robbins et al., (2004). Do psychosocial and study skill factors predict college outcomes? A meta-analysis. Psychological Bulletin, 130(2), 261-288. Robinson, J. (2000). What are employability skills? The Workplace, 1(3), 1-3. Romano, J. (2004). LifeMap: A learning-centered system for student success. Paper presented at the National Council on Student Development annual conference, Orlando, FL, Oct. 2004. Romer, D. (1993). Do students go to class? Should they? Journal of Economic Perspectives, 7(3), 167-174. Romero, M., Purdy, L., & Rodriquez, L., & Richards, S. (2005). Research needs and practices of community college practitioners. Community College Journal of Research and Practice, 29(4), 289-302. Rose, M. (2005). Do rising levels of qualifications alter work ethic, work orientation and organizational commitment for the worse? Evidence from the UK, 1985-2001. Journal of Education and Work, 18(2), 131-164. 329 Rudebock, R. (2005). When honesty is the expected policy. College Teaching, 53(4), 145-145. Sandeen, A. (2004). Educating the whole student: The growing academic importance of student affairs. Change, May/June, 28-33. Sanoff, A. (2006). What professors and teachers think: A perception gap over student?s preparation. The Chronicle of Higher Education, 52 (27), B9. Senge, P. 
(1990). The fifth discipline: The art and practice of the learning organization. New York: Currency Doubleday. Shannon, D., & Davenport, M.A. (2001). Using SPSS to solve statistical problems: A self-instruction guide. Upper Saddle River, NJ: Prentice-Hall. Shkodriani, G. (2004). Community colleges as professional development: Resources for working teachers. Education Commission of the States: Denver, CO. Smith, H. (1997). A federal perspective on community colleges. Journal of Chemical Education. 74 (11), 1264. Smith, T. (2005). Fifty-one competencies for online instruction. The Journal of Educators Online, 2(2), 1-18. Smith, V. (2006). Bridging the gap between high school and college: An interview with David Spence. Change, 38(3), 40-46. Soliday, M. (2002). The politics of remediation. Pittsburgh, PA: University of Pittsburgh Press. Southern Regional Education Board. (2006). Challenge to lead: The momentum continues. Atlanta, GA: 2006 Annual Report. 330 Spann, M. (2000). Remediation: A must for the 21st century learning society. Community College Policy Center: Denver, Colorado. Spector, P. (1992). Summated rating scale construction: An introduction (Sage University Paper series on Quantitative Applications in the Social Sciences, No. 07-082). Newbury Park, CA: Sage. Sperling, C. (2003). How community colleges understand the scholarship of teaching and learning. Community College Journal of Research and Practice, 27, 593- 601. Stanca, L. (2004). The effects of attendance on academic performance: Panel data evidence for Introductory Microeconomics. University of Milano-Bicocca, Department of Economics: Working Papers. Stanca, L. (2006). The effects of attendance on academic performance: Panel data evidence for Introductory Microeconomics. Journal of Economic Education, 37(3), 251-266. Starratt, R. (2003). Opportunities to learn and the accountability agenda. Phi Delta Kappan, 85(4), 298-303. Stein, M., et al. (2005). College prep 101: Principal Leadership, 6(1), 22 ? 26. Sterngold, A. (2004). Confronting plagiarism. Change, 36(3), 16-21. Streiner, D. (2003). Being inconsistent about consistency: When coefficient alpha does and doesn?t matter. Journal of Personality Assessment, 80(3), 217-222. Strom, P., Strom, R., & Moore, E. (1999). Peer and self-evaluation of teamwork skills. Journal of Adolescence, 22(4), 539-553. 331 Strom, P., & Strom, R. (1999). Making students accountable for teamwork. Community College Journal of Research and Practice, 23, 171-182. Strom, P., & Strom, R. (2002). Overcoming limitations of cooperative learning among community college students. Community College Journal of Research and Practice, 26, 315-331. Strout, E. (2006). Community colleges struggle when it comes to soliciting private donations: As government support lags, 2-year colleges face challenges as they increase development efforts. The Chronicle of Higher Education, 52(23), A25. Sugden, R., Smith, T., & Jones, R. (2000). Cochran?s rule for simple random sampling. Royal Statistical Society, 62, Part 4, 787-793. Swanson, C. (2004). Projections of 2003-04 high school graduates: Supplemental analysis based on findings from who graduates? Who doesn?t. Urban Institute, Washington, DC. Swigart, T, & Murrell, P. (2001). Factors influencing estimates of gains made among African-American and Caucasian community college students. Community College Journal of Research and Practice, 25, 297-312. Tabachnick, B., & Fidell, L. (2007). Using multivariate statistics (5th ed). Boston, MA: Pearson Education, Inc. 
Tarricone, P., & Luca, J. (2002). Employees, teamwork, and social independence?a formula for successful business? Team Performance Management, 8, 54-59. Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social and behavioral research. SAGE Publications. Thousand Oaks, CA. 332 Teaching Commission. (2006). Teaching at risk: Progress & potholes. New York: The Teaching Commission. Retrieved August 31, 2006, from http://www.theteachingcommission.org/ press/pdfs/ProgressandPotholes.pdf. Terenzini, P., Rendon, L., Upcraft, M., Miller, S., Allison, K., Gregg, P., and Jalomo, R. (1994). The transition to college: Diverse students, diverse stories. Research in Higher Education, 35(1), 57-73. T.H.E. Journal. (2004). Digital Community Colleges and the Coming of the ?Millennials?. T.H.E. Journal, 32(3), 14-15. The Alabama College System. (ACS)(2005). The Chancellor?s Report 2005. Montgomery, AL. The Alabama College System. (ACS)(2007). The Alabama College Systems: Facts and Enrollment Statistics, Enrollment Reports by Term. Retrieved October 24, 2007, from http://www.acs.cc.al.us/facts/enrollindex.aspx. The Chronicle of Higher Education. (2004). The biggest challenge for community colleges: 6 views. Community Colleges. Retrieved June 26, 2006, from http://chronicle.com/weekly/v51/i10/10b01001.htm. The Conference Board et al. (2006). Are they really ready to work? Employers? perspectives on the basic knowledge and applied skills of new entrants to the 21st century U.S. workforce. The Conference Board, Inc., the Partnership for 21st Century Skills, Corporate Voices for Working Families, and the Society for Human Resource Management: Author. (ISBN: No. 0-8237-0888-8.) 333 Tillery, D., & Deegan, W. (1985). The evolution of two-year colleges through four generations. In Deegan, W., & Tillery, D. (Eds.), Renewing the American community college: Priorities and strategies for effective leadership (pp. 3-33). San Francisco: Jossey-Bass Publishers. Tinto, V., & Pusser, B. (2006). Moving from theory to action: Building a model of institutional action for student success. National Postsecondary Education Cooperative, June 2006, http://nces.ed.gov/npec/. University of Michigan Health Systems. (2002). Work Ethics of Different Generations. Retrieved January 21, 2007, from http://www.med.umich.edu/diversity/pdffiles/file29.pdf. Urso, D., & Sygielski, J. (2007). Why community college students make successful transfer students. Journal of College Admission, Winter2007 (194), 12-17. U.S. Census Bureau. (2006). State & county quickfacts. Retrieved September 27, 2006, from http://quickfacts.census.gov/qfd/states/06000.html. U. S. Department of Education (1997), National Center for Educational Statistics, Education and the Economy: An Indicators Report, NCES 97-269, by Paul T. Decker, Jennifer King Rice, Mary T. Moore, and Mary R. Rollefson, Project Officer. Washington, DC: 1997. U. S. Department of Education. (2000). Corporate involvement in education: Achieving our national education priorities. The seven priorities of the U.S. Department of Education. Washington, DC: (Eric Document Reproduction Service No. ED440307). 334 U. S. Department of Education (2003), National Center for Education Statistics. Remedial Education at Degree-Granting Postsecondary Institutions in Fall 2000, NCES 2004-010, by Basmat Parsad and Laurie Lewis, Project Officer: Bernard Greene. Washington, DC: 2003. U. S. Department of Education. (2006). 
Answering the challenge of a changing world: Strengthening education for the 21st century. Washington, DC: Office of the Secretary. U. S. Department of Labor, Bureau of Labor Statistics (2000). Earnings and unemployment for persons 25 and over. Retrieved May 9, 2006, from http://www.bls.gov/opub/working/page6b.htm. Van Alphen, A., Helfens, R., Hasman, A., & Imbos, T. (1994). Likert or Rasch? Nothing is more applicable than good theory. Journal of Advanced Nursing, 20, 196-201. Van de Water, G., & Rainwater, T. (2001). What is P-16 Education? A primer for legislators ? a practical introduction to the concept, language and policy issues of an integrated system of public education. The Education Commission of the States, Denver, CO. VanWagoner, R., & Bowman, L., & Spraggs, L. (2005). Editor?s Choice: The significant community college. Community College Review, 33(1), 38-50. Vaughn, G. (2004). How to keep open access in community colleges. The Education Digest, 69(6), 52-55. Veltri, S., & Banning, J., & Davies, T. (2006). The community college classroom environment: Student perceptions. College Student Journal, 40(3), 517-527. 335 Venezia, A., & Kirst, M., & Antonio, A. (2003a). Fix K-16 disconnections, or betray the college dream. Education Digest, 68(9), 34-39. Venezia, A., & Kirst, M., & Antonio, A. (2003b). Betraying the college dream: How disconnected K-12 and Postsecondary education systems undermine student aspirations. A Project of the Stanford Institute for Higher Education Research, Stanford University?s Bridge Project. Venezia, A., et al. (2005). The governance divide: A report on a four-state study on improving college-readiness and success. The National Center for Public Policy and Higher Education. September 2005. Voorhees, R., & Zhou, D. (2000). Intentions and goals at the community college: Association student perceptions and demographics. Community College Journal of Research and Practice, 24: 219-232. Waggoner, J. (2006). Nothing hard about soft skills in the college classroom. MountainRise, 3(2), 1-31. Walsh, D., & Maffei, M. (1994). Never in a class by themselves: An examination of behaviors affecting the student-professor relationship. Journal of Excellence in College Teaching, 5, 23-49. Wang, W. (2004). UCLA community college review: Community education in the community college. Community College Review, 32(32), 43-56. Wattenbarger, J. (1983). Research as a basis for improving the community college. Community College Review, 10, 58-62. Wattenbarger, J., & Haynes, F., & Smith, A. (1982). Coping with complexity: Another viewpoint for community colleges. Community College Review, 10, 3 ? 12. 336 Wattenbarger, J., & Witt, A. (1995). Origins of the California system: How the junior college movement came to California. Community College Review, 22 (4), 17- 25. Weimer, M. (1994). An introduction to the National Center on Postsecondary Teaching, Learning, and Assessment. Innovative Higher Education, 19(1), 3-6. Welsh, J., & Brake, N., & Choi, N. (2005). Student participation and performance in dual-credit courses in a reform movement. Community College Journal of Research and Practice, 29, 199-213. Widson, J., et al. (2006). As assessment of web accessibility knowledge and needs at Oregon community colleges, Community College Review, 33(3/4), 19-37. Wimberly, G., & Noeth, R. (2004). Schools involving parents in early postsecondary planning. ACT Policy Report. ACT: Iowa City, IA. (IC 050804020 4746). Wimberly, G., & Noeth, R. (2005). College readiness begins in middle school. 
ACT Policy Report. ACT: Iowa City, IA. (IC 050805040 6091). Windham, P., Perkins, G., & Rogers, J. (2001). Concurrent-use campuses: Part of the new definition of access. Community College Review, 29(3), 39-55. Wirt, J., Choy, S., Rooney, P., & Provasnik, S. (2005). The condition of education 2005. U.S. Department of Education, National Center for Education Statistics (NCES 2005-094). Washington, DC: U.S. Government Printing Office. Woods, M. (2007). Transformation of student success. Community College Journal of Research and Practice, 31(6), 485-486. Woodruff, D., & Ziomek, R. (2004). High school grade inflation from 1991 to 2003. ACT Research Report Series, 2004-5. WorkEthics.Org. (2006). Powered by East Central Technical College. Retrieved October 1, 2006, from http://www.workethics.org/. Wright, M. (2005). Always at odds?: Congruence in faculty beliefs about teaching at a research university. The Journal of Higher Education, 76(3), 331-353. Yankelovich, D. (1982). The work ethic is underemployed. Psychology Today, May, 5-8. Yecke, C. (2005). Mayhem in the middle: How middle schools have failed America - and how to make them work. Thomas B. Fordham Institute. Washington, DC. Zarkesh, M., & Beas, A. (2004). UCLA community college review: Performance indicators and performance-based funding in community colleges. Community College Review, 31, 62-76. Zemke, R., Raines, C., & Filipczak, B. (2000). Generations at Work: Managing the Clash of Veterans, Boomers, Xers, and Nexters in Your Workplace. American Management Association: New York, NY.

APPENDICES

APPENDIX A, Page 1 of 2
APPENDIX A, Page 2 of 2

APPENDIX B: Student and Faculty Perceptions of College Student Success: STUDENT SURVEY v.2

1. Statement of Participation (1 / 5)
1. I have read the STUDENT INFORMATION SHEET explaining this study. This survey is asking you as a Community College Student to respond to some statements and give your opinion as to what you believe about the statements. Your instructors (professors) have been given the same questions and this is an opportunity for you to respond openly. You will in NO WAY be identified and your responses are 100% secure and confidential. Neither your instructors nor your college will know which student gave what answer. Please click NEXT and you will be directed to the Community College Student Survey.

2. Community College Student Demographic Data (2 / 5)
1. Please indicate the name of your college in the textbox below.
2. Gender: Male; Female
3. Age Group: 18-24; 25-34; 35 or older
4. Indicate your enrollment status as noted by the options below (check all that apply): I am a first-time college student; I am a returning or transfer student; I have a degree, I'm here to update my skills; I am attending to obtain a professional certification only; None of these options apply to me
5. Please select the appropriate item: White (Non-Hispanic); African-American (Non-Hispanic); Hispanic (Latino/Latina); Asian/Pacific Islander; American Indian/Alaskan Native; Other
6. What was your Grade Point Average in high school? 4.0-3.0 (A's and B's); 2.9-2.0 (C Average); 1.9 or Below (D Average or Below); Don't know or remember
7. What is the highest degree you hope to obtain? Associate Degree; Bachelor Degree; Masters Degree; Doctorate (PhD, EdD); JD (Law) or MD (Medical)
8. Employment and marital status (please check those that apply to you):
Work Full-Time; Work Part-Time; Don't Work While in School; Married, with children; Married, no children; Single parent
9. What remedial or developmental education courses have you taken at this or another college (check all that apply)? Basic Math; Basic English; Basic Reading; Not Applicable

3. Student Group Performance (3 / 5)
1. Compared to other Community College Students at my college, I would rate myself in the following categories as Below Average, Average, or Above Average: 1. Attendance; 2. Writing ability; 3. Team player; 4. Motivation to succeed in college; 5. Oral presentations; 6. Producing quality work; 7. Computer skills; 8. Success in high school; 9. Respect for others; 10. Enjoy learning new things; 11. Reading ability; 12. Time management; 13. Math skills; 14. Leadership; 15. Work ethic.

4. Academic Preparation, Work Ethics & Institutional Support (4 / 5)
1. How important are the following items or activities in helping you to be successful in your college work? (Academic Preparation; rated Not Important (1), Somewhat Important (2), Important (3), or Very Important (4)): 1. Writing assignments; 2. Reading the textbook; 3. Getting feedback on assignments and tests; 4. Having instructors as advisors; 5. Using email to get help with class material; 6. Instructors who challenge and encourage me; 7. Participating in labs with real-world exercises; 8. Having online study guides for each course; 9. Tests that actually cover the material taught; 10. Getting help from instructors during office hours; 11. Receiving feedback about progress in a course; 12. Having a syllabus that is a learning guide.
2. How important are the following items or activities in helping you to be successful in your college work? (Work Ethics; same four-point scale): 1. Showing up for class on time; 2. Taking the initiative to make up missed work due to absences; 3. Attending class regularly; 4. Appearance; 5. Being a team player in group projects; 6. Helping other students succeed; 7. Improving my organizational skills; 8. Treating people with respect; 9. Getting feedback from instructors on my work ethics; 10. Hearing from business and community leaders about work ethics; 11. Being an effective manager of my time; 12. Earning an A by unethical methods.
3. How important are the following items or activities in helping you to be successful in your college work? (Institutional Support; same four-point scale): 1. Having problems resolved satisfactorily; 2. Perceiving faculty, staff and administrators as accessible and helpful; 3. Feeling safe on campus to study; 4. Getting help in finding meaningful employment; 5. Permission to call any individual associated with the college; 6. Online registration is available when needed; 7. Being in classrooms that are clean; 8. Understanding the mission of the college; 9. Having student organizations that enrich the learning experience; 10. Giving feedback to administrators on how to improve the college; 11. Having community services published on the web site;
12. Resources for student support are reliably accessible.

5. Community College Student Opinions and Comments (5 / 5)
1. What should community colleges do to support students who are academically unprepared?
2. How can community colleges help students acquire and practice good work ethics?
3. What can a community college do to improve its institutional support to help students succeed in college from enrollment to graduation?
4. What institutional practices (actions by members of the college) have you observed which help or harm the success of a student?
5. How do you respond to the following statements? (Required for Student Success / Not Required for Student Success) Academic Preparation is; Work Ethics are; Institutional Support is.
6. Click DONE. And a personal thank you for taking the time to help with this study. Kenneth Scott...

APPENDIX C, Page 1 of 2
APPENDIX C, Page 2 of 2

APPENDIX D: Student and Faculty Perceptions of College Student Success: FACULTY SURVEY v.2

1. Statement of Participation (1 / 5)
I have read the FACULTY INFORMATION SHEET explaining this study. The following survey is asking you as a faculty member to respond to some statements and give your opinion as to what you believe about the statements. Your students have been given the same questions and this is an opportunity for you to respond openly. You will in NO WAY be identified and your responses are 100% secure and confidential. Neither your administrators, colleagues, nor your students will know which instructor gave what answer. Please click NEXT and you will be directed to the Community College Faculty Survey.

2. Community College Faculty Demographic Data (2 / 5)
1. Please indicate the name of your college in the textbox below.
2. Gender: Male; Female
3. Age Group: 18-24; 25-34; 35 or older
4. Please select the appropriate item: White (Non-Hispanic); African-American (Non-Hispanic); Hispanic (Latino/Latina); Asian/Pacific Islander; American Indian/Alaskan Native; Other
5. Highest Degree Earned: Bachelor's; Master's; Doctorate; JD or MD
6. Please provide the years of Teaching Experience and your Current Employment Status (select all that apply): 5 or less; 6-10; > 10; Full-time; Part-time

3. Instructional Status and General Student Performance (3 / 5)
1. Please select EACH item that applies to your teaching assignments and/or classes: Teach Technical Courses Only; Teach General Education Courses (Non-Technical); Teach In-Class AND On-Line Courses; Teach In-Class ONLY; Teach On-Line ONLY
2. Student Group Performance
3. Based on your experience as an instructor, how would you rate the general performance of your students in the categories below (Below Average, Average, or Above Average)? 1. Attendance; 2. Writing ability; 3. Team player; 4. Motivation to succeed in college; 5. Oral presentations; 6. Producing quality work; 7. Computer skills; 8. Success in high school; 9. Respect for others; 10. Enjoy learning new things; 11. Reading ability; 12. Time management; 13. Math skills; 14. Leadership; 15. Work ethic.

4. Academic Preparation, Work Ethics & Institutional Support (4 / 5)
1. How important are the following items or activities in helping students be successful in college?
Rated Not Important (1), Somewhat Important (2), Important (3), or Very Important (4): 1. Writing assignments; 2. Reading the textbook; 3. Students getting feedback on assignments and tests; 4. Having instructors as advisors; 5. Using email to get help with class material; 6. Instructors who challenge and encourage students; 7. Designing labs with real-world exercises; 8. Having online study guides to help students learn; 9. Tests that actually cover the material taught; 10. Giving students help during office hours; 11. Giving students feedback about progress in a course; 12. Designing a syllabus that is a learning guide

APPENDIX D, Page 4 of 6

2. How important are the following items or activities in helping students be successful in college? Rated Not Important (1), Somewhat Important (2), Important (3), or Very Important (4): 1. Showing up for class on time; 2. Students take the initiative to make up missed work due to absences; 3. Attending class regularly; 4. Appearance; 5. Students as a team player in group projects; 6. Students helping other students succeed; 7. Students improving their organizational skills; 8. Treating people with respect; 9. Instructors giving students feedback on their work ethics; 10. Hearing from business and community leaders about work ethics; 11. Being an effective manager of time; 12. Earning an A by unethical methods

APPENDIX D, Page 5 of 6

3. How important are the following items or activities in helping students be successful in college? Rated Not Important (1), Somewhat Important (2), Important (3), or Very Important (4): 1. Having problems resolved satisfactorily; 2. Perceiving faculty, staff and administrators as accessible and helpful; 3. Feeling safe on campus to study; 4. Getting help in finding meaningful employment; 5. Permission to call any individual associated with the college; 6. Online registration is available when needed; 7. Being in classrooms that are clean; 8. Understanding the mission of the college; 9. Having student organizations that enrich the learning experience; 10. Giving feedback to administrators on how to improve the college; 11. Having community services published on the web site; 12. Resources for student support are reliably accessible

APPENDIX D, Page 6 of 6

5. Community College Faculty Opinions and Comments (5 / 5)

1. What should community colleges do to support students who are academically unprepared?
2. How can community colleges help students acquire and practice good work ethics?
3. What can a community college do to improve its institutional support to help students succeed in college from enrollment to graduation?
4. What institutional practices (actions by members of the college) have you observed which help or harm the success of a student?
5. How do you respond to the following statements? (Response options: Required for Student Success, Not Required for Student Success)
Academic Preparation is:
Work Ethics are:
Institutional Support is:
6. Click DONE. And a personal thank you for taking the time to help with this study. Kenneth Scott...

APPENDIX E: Web Portals

FOR STUDENTS ONLY

The following survey is a study by Ken Scott, doctoral candidate at Auburn University and Instructor at TrenholmTech in Montgomery, Alabama.
I NEED YOUR HELP! Please send any questions to one of the email addresses below, or you may call one of the numbers listed.

K. Edward Scott (Ken), CCNA, CCAI
Work: 334-420-4392; Home: 334-279-6480
scottk1@auburn.edu; kscott@trenholmtech.cc.al.us; skinner777@knology.net
War Eagle!

For STUDENTS wishing to participate in this study: 1) please click on the STUDENT INFORMATION SHEET link below and read the information provided (you may print a copy for your files); 2) after reading the STUDENT INFORMATION SHEET, click on the back button in your browser; and then 3) click the link "STUDENT SURVEY," which will allow you access to the online STUDENT SURVEY.

STUDENT PARTICIPATION: STUDENT INFORMATION SHEET / STUDENT SURVEY

FOR FACULTY ONLY

The following survey is a study by Ken Scott, doctoral candidate at Auburn University and Instructor at TrenholmTech in Montgomery, Alabama. I NEED YOUR HELP! Please send any questions to one of the email addresses below, or you may call one of the numbers listed.

K. Edward Scott (Ken), CCNA, CCAI
Work: 334-420-4392; Home: 334-279-6480
scottk1@auburn.edu; kscott@trenholmtech.cc.al.us; skinner777@knology.net
War Eagle!

For FACULTY wishing to participate in the study: 1) please click on the FACULTY INFORMATION SHEET link below and read the information provided (you may print a copy for your files); 2) after reading the FACULTY INFORMATION SHEET, click on the back button in your browser; and then 3) click the link "FACULTY SURVEY," which will allow you access to the online FACULTY SURVEY.

FACULTY PARTICIPATION: FACULTY INFORMATION SHEET / FACULTY SURVEY

APPENDIX F, Page 1 of 3

Auburn University
Educational Foundations, Leadership and Technology
4036 Haley Center, Auburn University, Alabama 36849-5221
Telephone: (334) 844-4460; Fax: (334) 844-3072

April 30, 2007

Dr. Stafford L. Thompson, President
Enterprise-Ozark Community College, Enterprise Campus
600 Plaza Drive, Enterprise, AL 36330

REF: Strategic Factors of Institutional Practice Which Impacts Student Success in the Community College as Perceived by Students and Faculty: Academic Preparation, Work Ethics, and Institutional Support.

Dear Dr. Thompson:

My name is Kenneth Scott (Principal Investigator). At present, I am a faculty member in Computer Information Systems at H. Councill Trenholm State Technical College in Montgomery, Alabama. I am also enrolled in the doctoral program in Higher Education at Auburn University in Auburn, Alabama. Dr. Maria Martinez Witte is my dissertation committee Co-Chair and may be contacted at (334) 844-4460 or wittemm@auburn.edu if you have any additional questions.

This letter is being sent to you to request permission to conduct a study at your institution for my dissertation project. The purpose of the study is to investigate the perceptions of students and faculty related to academic preparation, work ethics, and institutional support within the framework of institutional practice in order to improve student success within the community college. Results of the study, in the form of a dissertation copy, will be made available to you, should you desire. Please know that the Survey Instruments (enclosed) are strictly confidential and anonymous to protect Enterprise-Ozark Community College, students, and faculty. Surveys are a one-time process, take about 15 to 20 minutes to complete, and are completed at the convenience of the student or faculty member.
The study will measure the perceptions of students and faculty so that community or technical colleges may better understand the factors which impact student success, thereby providing valuable information to leaders in community or technical colleges. If you would kindly permit the study to be conducted at Enterprise-Ozark Community College, I respectfully request a Letter of Consent to conduct the study (please see page 3). Additionally, I would also request a contact person with whom I might closely coordinate the study, including mailing/collection of surveys, meeting face-to-face as needed, posting flyers, providing Information Sheets to participants, etc. All materials and costs will be provided and assumed by the Principal Investigator of this dissertation project.

APPENDIX F, Page 2 of 3

Dr. Stafford L. Thompson
April 30, 2007
Page 2

If you have questions that I may answer, please contact me. I can be reached at the following phone number or one of the email addresses noted below:

Office Phone: (334) 420-4392 (H. Councill Trenholm State Technical College, Patterson Campus)
Office Email: kscott@trenholmtech.cc.al.us
Auburn Email: scottk1@auburn.edu

Included with this letter, as draft enclosures, are the following items for your review:
1. Student and Faculty Perceptions of College Student Success: FACULTY SURVEY;
2. Student and Faculty Perceptions of College Student Success: STUDENT SURVEY;
3. FACULTY INFORMATION SHEET;
4. STUDENT INFORMATION SHEET (Please note that students must be 19 years old or older to participate; this item adheres to Auburn University Institutional Review Board policy involving Human Subjects in the State of Alabama);
5. An invitational Flyer to encourage student and faculty participation;
6. Procedural Steps & Script to Conduct a Dissertation Survey;
7. Letters of Appreciation (to be forwarded upon completion of the study).

Dr. Thompson, please accept my sincerest gratitude for your consideration in allowing the institution to participate in this very important study. Should you have any questions, please don't hesitate to contact me or Dr. Maria Witte at Auburn University.

Very best regards,

K. Edward Scott (Ken)
H. Councill Trenholm State Technical College
Instructor, Computer Information Systems; Director, Cisco Regional Academy
Phone: (334) 420-4392; Email: kscott@trenholmtech.cc.al.us; scottk1@auburn.edu
Principal Investigator & Doctoral Candidate, Auburn University
Educational Foundations, Leadership and Technology

c: Dr. Maria Martinez Witte, Committee Co-Chair
Associate Professor, Adult Education; EFLT Graduate Program Officer
Educational Foundations, Leadership and Technology Department
4036 Haley Center (Office: 4012 Haley Center)
Auburn University, AL 36849-5221

APPENDIX F, Page 3 of 3

Sample Letter of Consent

Enterprise-Ozark Community College, Enterprise Campus
600 Plaza Drive, Enterprise, AL 36330

April 30, 2007

Auburn University Institutional Review Board
c/o Office of Human Subjects
307 Samford Hall
Auburn, AL 36849
ATTN: Ms. Susan Anderson

Dear Institutional Review Board:

Please note that Mr. Kenneth Scott (Principal Investigator) has the permission of Enterprise-Ozark Community College to conduct research on our campus for his dissertation protocol, Strategic Factors of Institutional Practice Which Impacts Student Success in the Community College as Perceived by Students and Faculty: Academic Preparation, Work Ethics, and Institutional Support. Mr. Scott proposes to work closely with the contact person, Dr.
Ben Smith (Institutional Liaison), Dean of Students, to coordinate the study on the campus of Enterprise-Ozark Community College. Dr. Smith can be reached at 334-555-1212 or bsmith@eocc.edu. The coordination between Dr. Ben Smith and Kenneth Scott is approved so that surveys may be mailed to Enterprise-Ozark Community College, distributed to students and faculty, secured, and returned by mail to Kenneth Scott. Mr. Scott will be handling all costs and coordination associated with his study. Because respective survey instruments will take approximately 15 to 20 minutes to complete, faculty members and students of Enterprise-Ozark Community College may participate in the study at the convenience of the participant. Mr. Scott has informed me that the survey process will follow procedures of anonymity and confidentiality to protect Enterprise-Ozark Community College, students, and faculty members; and that the study conducted will in no way initiate ill will between students, faculty, and Enterprise-Ozark Community College or between Enterprise-Ozark Community College faculty, students, administration, and Auburn University. If there are any questions, please contact my office.

Regards,

Dr. Stafford L. Thompson, President
Enterprise-Ozark Community College

c: K. Edward Scott
Instructor, Computer Information Systems; Doctoral Candidate, Auburn University
H. Councill Trenholm State Technical College, Patterson Campus
3920 Troy Highway, Montgomery, AL 36116

APPENDIX G

APPENDIX H, Page 1 of 3

Practices of the Academic Preparation Domain (APD) Correlated to Research/Studies
(References below are not all-inclusive of available research.)

Writing assignments: Achieve, Inc., 2006; ACT, 2006a, 2006b; Krueger, 2006; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Bettinger & Long, 2005; Attewell, Lavin, Domina, & Levey, 2006; Horn, Nevill & Griffith, 2006; NAAL, 2005

Reading the textbook: Arendale, 2005; Bok, 2006; Byrd & MacDonald, 2005; CCSSE, 2005; Conley, 2005; Greene & Forster, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Oudenhoven, 2002; Spann, 2000

Getting feedback on assignments and tests: Achieving the Dream (2005, 2006); Derby & Smith, 2004; Kinzie & Kuh, 2004; Lorenzetti, 2006; Martin & Tulgan, 2002; Smith, 2005; The Conference Board et al., 2006

Having instructors as advisors: Brock et al., 2007; Dale & Drake, 2005; Dungy, 2003; VanWagoner, Bowman, & Spraggs, 2005; Kuh et al., 2006; McArthur, 2005; Restauri, 2004

Using email to get help with class material: Pett, Lackey & Sullivan, 2003; [SR68]; [SR112]; Smith, 2005; (researcher practice)

Instructors who challenge and encourage me: Brewer & Burgess, 2005; Davidovitch & Soen, 2006; Gump, 2005; Philips & Skelly, 2006; Smith, 2005; Stanca, 2004, 2006; Van de Water & Rainwater, 2001

Participating in labs with real-world exercises: Bok, 2006; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); (researcher practice)

Having online study guides for each course: Johnson County Community College (JCCC), 1996; Smith, 2005; (researcher practice)

Tests that actually cover the material taught: Bok, 2006; Hirsch, 2001; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b)

Getting help from instructors during office hours: Achieving the Dream (2005, 2006); Derby & Smith, 2004; Kinzie & Kuh, 2004; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Lorenzetti, 2006; Martin & Tulgan, 2002; Smith, 2005; The Conference Board et al., 2006

Receiving feedback about progress in a course: Achieving the Dream (2005, 2006); Derby &
Smith, 2004; Kinzie & Kuh, 2004; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Lorenzetti, 2006; Martin & Tulgan, 2002; Smith, 2005; The Conference Board et al., 2006

Having a syllabus that is a learning guide: Braxton, 2006; Long, 2006; Smith, 2005; Spelling, 2003; Weimer, 1994; (researcher practice)

APPENDIX H, Page 2 of 3

Practices of the Work Ethics Domain (WED) Correlated to Research/Studies
(References below are not all-inclusive of available research.)

Showing up for class on time: Brewer & Burgess, 2005; Davidovitch & Soen, 2006; Gump, 2005; Marburger, 2006; McLeish, 2002; Stanca, 2004; WorkEthics.Org, 2006

Students take the initiative to make up missed work due to absences: Brewer & Burgess, 2005; Hill & Petty, 1995; Horn, Nevill & Griffith, 2006; McLeish, 2002; WorkEthics.Org, 2006

Attending class regularly: Brewer & Burgess, 2005; Davidovitch & Soen, 2006; Gump, 2005; Marburger, 2006; McLeish, 2002; Stanca, 2004, 2006; WorkEthics.Org, 2006

Appearance: Gilbert, 1999; Juhnke et al., 1987; McLeish, 2002; WorkEthics.Org, 2006

Students as a team player in group projects: Hansen, 2006; McLeish, 2002; Strom, Strom & Moore, 1999; Tarricone & Luca, 2002; WorkEthics.Org, 2006

Students helping other students succeed: National Association of Manufacturers, 2005; Hughes & Karp, 2006; McAdams, 2007; McLeish, 2002; WorkEthics.Org, 2006

Students improving their organizational skills: Bakunas & Holley, 2004; Hamilton-Attwell, 1998; Johnson, 2007; McLeish, 2002; Pierson & Holmes, 2007; WorkEthics.Org, 2006

Treating people with respect: Anderson, 2000; Cordry & Wilson, 2004; McKinney, McKinney, Franiuk & Schweitzer, 2006; McLeish, 2002; WorkEthics.Org, 2006

Instructors giving students feedback on their work ethics: Cohen, 2005; Crawley & Klomparens, 2000; Emanuel, 2005; McJunkin, 2005; McLeish, 2002; Soliday, 2002; WorkEthics.Org, 2006

Hearing from business and community leaders about work ethics: Chester, 2005; Cohen, 2005; McLeish, 2002; WorkEthics.Org, 2006; Waggoner, 2006

Being an effective manager of time: ContinuingEducation.com, 2007; Hamilton-Attwell, 1998; Hill & Petty, 1995; McLeish, 2002; WorkEthics.Org, 2006

Earning an A by unethical methods: McLeish, 2002; Rudebock, 2005; Puka, 2005; Sterngold, 2004; WorkEthics.Org, 2006

APPENDIX H, Page 3 of 3

Practices of the Institutional Support Domain (ISD) Correlated to Research/Studies
(References below are not all-inclusive of available research.)

Having problems resolved satisfactorily: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

Perceiving faculty, staff and administrators as accessible and helpful: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

Feeling safe on campus to study: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

Getting help in finding meaningful employment: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

Permission to call any individual associated with the college: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

Online registration is available when needed: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, &
Whitt (2005a, 2005b); Restauri, 2004

Being in classrooms that are clean: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004; Veltri, Banning, & Davies, 2006

Understanding the mission of the college: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004; Robbins et al., 2004

Having student organizations that enrich the learning experience: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004; Veltri, Banning, & Davies, 2006

Giving feedback to administrators on how to improve the college: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004; Veltri, Banning, & Davies, 2006

Having community services published on the web site: Bok, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004; Veltri, Banning, & Davies, 2006

Resources for student support are reliably accessible: Bok, 2006; CCSSE, 2006; Dungy, 2003; Hirsch, 2001; Komives & Woodard, 2003; Kuh, Kinzie, Schuh, & Whitt (2005a, 2005b); Restauri, 2004

APPENDIX I

APPENDIX J

APPENDIX K

APPENDIX L

APPENDIX M

To: Students and Faculty of Community and Technical Colleges
From: Ken Scott, Instructor - CIS, Trenholm State Tech College & Student, Auburn University
Subj: Dissertation for a doctoral research project, Auburn University
Date: 26 November 2007

This letter is an appeal to students and faculty in the two-year college system (Community College (CC) or Technical College (TC)), wherever you may be. If you are reading this, your President has agreed to help with the study I am conducting. You see, I am both a teacher and a student. As an instructor at a Tech College, I understand the process of teaching and learning; as a student, I also understand the process of going to class, taking tests, and writing papers. It is this latter activity about which I need your help: writing a paper. Please know that this paper is a huge undertaking and includes some "heavy" research activities, including all that "stats" stuff you may know about or have heard of.

So, what am I asking you to do for me? Well, it's really quite simple. For me to write the paper, I need data. The data of which I speak are your perceptions about some questions which will help me understand how students and faculty relate in some areas of CC or TC practice (things we do in the CC or TC). Before I go any further, I am "on your side" when it comes to surveys; I know they take time and we get them all the time. But I want you to please consider something. Just suppose for a moment that your responses could help change the lives of many of the students who will come after you to attend a CC or TC. Right now, there are about 11,600,000 students in the two-year college system. Suppose your answers could help change the lives of 25% of those students. That's 2.9 million students! Without your help, you and I will never know if this change was even possible.

So, here's what I'm asking of you, whether you are a student or faculty member: Contribute 15 minutes of your time! Give me 15 minutes of your time to think through the questions being asked and to give your opinions on some comment-type questions.
The entire process takes only 15 minutes, but it is possible that those 15 minutes could help change education in ways that will enable students to be successful beyond the methods we currently practice in community or technical colleges. I don't have all the answers to all the issues we face, whether student or teacher, but I would surely like to know whether your opinions might shed light on some issues that have a real potential to become a force for change. And without your input, feedback, and/or comments, these "unanswered questions" will remain unanswered!

The surveys will be officially open to you upon receipt of this letter, on or about the 26th of November, and the surveys will close on or about December 12th. These surveys are completely online, can be completed as your time allows, and you can even access them from home, the coffee shop, or wherever you decide. Your input is needed. The only compensation that I can offer is a sincere Thank You for your help and time. I don't have any means to offer Antarctica Excursions or Moon Flights (I pay tuition at Auburn and my daughter is in college)! Finally, you should know that your responses are not shared with anyone (your identity is TOTALLY anonymous and confidential). I have a secure (SSL encryption) link, the data are protected, and your college, instructors, administration, or your friends will not know the data you have shared.

Please take a moment and complete a survey. And, who knows, maybe one day you might desire my feedback on a survey; I'll be the first to volunteer. I deeply appreciate your consideration,

Ken Scott, Doctoral Candidate, Auburn University
Instructor - CIS; Director - CISCO Regional Academy (334-420-4392)
Trenholm State Technical College, kscott@trenholmtech.cc.al.us; scottk1@auburn.edu

If you have questions/comments/problems, please let me know via email or give me a call.
Student Web Link: http://www.knology.net/~skinner777/images/ScottStudents.htm
Faculty Web Link: http://www.knology.net/~skinner777/images/ScottFaculty.htm

APPENDIX N, Page 1 of 3

Pages 1 of 3 and 2 of 3 provide participating college information; Page 3 of 3 shows the complete student demographic matrix data table for participating colleges; Appendix O is the faculty demographic matrix data table.

CCC: Calhoun Community College is a comprehensive community college with locations in Decatur, Huntsville, and Redstone Arsenal, Alabama. The North Alabama area is one of the fastest growing areas in the South. The area economy includes high-tech, high-profile industries such as NASA, U.S. Army, Boeing, Teledyne Brown, McDonnell Douglas, Intergraph, General Electric, TRW, Unisys, 3M, Monsanto, General Motors, and others. The institution, with over 9,000 students and 600+ full- and part-time employees, is the largest and one of the most progressive community colleges in Alabama.

CGTC: Welcome to Central Georgia Technical College, where we will help you imagine the possibilities. Whether you are an individual pursuing personal career goals or a representative of business and industry seeking solutions to business concerns, CGTC is here to assist you. We believe our best attributes are the quality of our instructional programs and the dedication of our faculty.
We hope you will also consider these other attributes: a wide choice of programs; scheduling flexibility and web-based courses; an excellent library; an attractive campus; six convenient locations; and student services designed to make your college experience a successful one. We are dedicated to our students and their aspirations, which continue to inspire and encourage us to bring all of our resources to bear upon our mission of supporting educational, economic, and community development. The faculty and staff join me in welcoming you to what we know will be an exciting and rewarding educational experience in the center of Georgia. We look forward to seeing you on our campuses and in our virtual classrooms. Please explore our website and visit our campuses to learn how we make positive differences for students and communities.

FCCJ: At Florida Community College Jacksonville, Success Starts with the Right Education. In a fast-moving global economy, the knowledge and skills people learn must be relevant, purposeful, and quickly adaptable. That's why people are coming to Florida Community College in record numbers: more than 64,000 students in 2005-06. More than ever, leaders in business, education and government are recognizing the tremendous value of community colleges and the difference our programs make in people's lives. Florida Community College is growing its reputation as the largest, most dynamic and most influential higher education institution on the First Coast. Our workforce development program is the largest in Florida, helping thousands of students prepare for high-demand careers each year. We also have built the state's largest online learning program, military education program, and information technology curriculum. The size and quality of our education programs produce significant economic impact for the region. Employers depend on us to deliver the highly skilled employees they need to compete and grow in the future. The impact of our college extends beyond student success and economic growth to the arts and culture in our community. Through the Artist Series, honored by the State of Florida as a "Major Cultural Institution," we bring top-rated Broadway productions, concerts and cultural performances to Jacksonville each year. Our Community: Total student headcount: 64,230 (2005-06); Median age in college-credit programs: 27 years old; Median age in continuing education programs: 39 years old. The majority of students pursue associate degrees or other career-training credentials. The balance of the student population is enrolled in high school completion or basic education programs, special academic programs, or professional development classes. Our diverse student body closely mirrors the diversity of Northeast Florida. Our English language programs attract students from about 120 countries around the world, a marvelous complement to the rich diversity of our campus experience. Faculty and Employees: Total full-time faculty: 404 (August 2006). Full-time faculty degrees: Doctorate: 20%, Master's: 70%, Bachelor's/Other: 10%; Total employees: 2,386.

APPENDIX N, Page 2 of 3

GSCC: The decision about which college or university to attend is one of the most important you'll ever make. It is a choice that will shape the rest of your life. You'll want to choose a college that will meet not only your academic and/or technical needs but will give you the individual attention to help you reach your goals as well.
Your decision should also make financial sense, particularly if you are not so sure about your college major. It must be affordable and allow you some "slack" as you decide about your future. That is why we invite you to consider Gadsden State Community College. Program Quality and Diversity: In selecting a college, you are probably considering whether the institution offers your intended major, whether you can receive help in choosing a major if you are undecided about a course of study, and whether the program or courses are as good as those at other institutions you are considering. Reports from some of the universities to which GSCC students transfer have shown that GSCC students do as well as, and in some cases better than, students who complete their first two years at the university. Many GSCC alumni have graduated from four-year institutions with honors and have continued their education to complete law school, medical school, pharmacy school, MBA programs, and graduate degrees in other fields of study.

JDCC: Jefferson Davis Community College is a comprehensive community college with campuses in Brewton and Atmore offering students a unique educational experience. Class sizes are small, allowing faculty to provide personalized instruction and hands-on training. The College is committed to providing access to the latest information technology. Mission Statement: Jefferson Davis Community College, one of the public two-year colleges of the Alabama College System, provides accessible quality educational opportunities, promotes economic growth, and enhances the quality of life for the college service area.

JSCC: Jefferson State Community College, one of Alabama's leading two-year colleges, has provided excellence in education and workforce training for the greater Birmingham area for almost 40 years. Founded in 1965, Jefferson State offers more than 120 university transfer programs, 20 career programs with multiple options, and numerous certificate programs. Jefferson State also offers a comprehensive approach to workforce training through a variety of credit and non-credit programs. Jefferson State is accredited by the Commission on Colleges of the Southern Association of Colleges and Schools (1866 Southern Lane, Decatur, Georgia 30033-4097; telephone (404) 679-4501) to award the Associate Degree. The college served over 11,000 students (approximately 7,300 for credit and 3,600 for non-credit). Several convenient locations, along with online instruction, provide the accessibility needed in a busy lifestyle. Locations include the Shelby Campus in northern Shelby County, the Jefferson Campus in eastern Jefferson County, the St. Clair Center in Moody, and the Pell City Center.

APPENDIX N, Page 3 of 3

Detailed Student Demographics Data Matrix (values listed as Total; CCC; CGTC; FCCJ; GSCC; JDCC; JSCC)

Gender (Nr of Males/Females): 396; 6/12; 38/137; 11/2; 1/5; 0/4; 49/131

Age Grouping:
18-24: 124; 8; 45; 0; 1; 1; 69
25-34: 138; 2; 64; 6; 2; 0; 64
35 or older: 134; 8; 66; 7; 3; 3; 47

Enrollment Status:
First-time college student: 116; 7; 49; 2; 1; 2; 55
Returning or transfer student: 219; 10; 89; 8; 3; 2; 107
Have a degree, updating skills: 39; 1; 21; 3; 0; 0; 14
Professional certification only: 28; 0; 24; 1; 1; 0; 2
None of these apply to me: 14; 0; 10; 0; 1; 0; 3

Ethnicity:
White (Non-Hispanic): 270; 17; 97; 7; 3; 2; 144
African-American (Non-Hispanic): 98; 1; 68; 4; 2; 2; 21
Hispanic (Latino/Latina): 8; 0; 2; 1; 0; 0; 5
Asian/Pacific Islander: 8; 0; 2; 1; 0; 0; 5
American Indian/Alaskan Native: 0; 0; 0; 0; 0; 0; 0
Other: 11; 0; 6; 0; 0; 0; 5

GPA in High School:
4.0 -
3.0 (A's and B's): 243; 12; 120; 8; 5; 3; 95
2.9 - 2.0 (C Average): 106; 3; 43; 3; 1; 1; 55
1.9 or Below (D Average or Below): 8; 1; 2; 0; 0; 0; 5
Don't Know or Remember: 39; 2; 10; 2; 0; 0; 25

Highest Degree Goal:
Associate Degree: 112; 6; 72; 3; 2; 0; 29
Bachelor Degree: 119; 5; 40; 10; 2; 1; 61
Masters Degree: 105; 7; 35; 0; 2; 2; 59
Doctorate (PhD, EdD): 38; 0; 18; 0; 0; 1; 19
JD (Law) or MD (Medical): 22; 0; 10; 0; 0; 0; 12

Employment or Marital Status:
Work Full-Time: 211; 6; 92; 8; 2; 2; 101
Work Part-Time: 83; 7; 33; 3; 1; 1; 38
Don't Work While in School: 88; 5; 44; 2; 0; 1; 36
Married, with children: 127; 6; 53; 7; 2; 1; 58
Married, no children: 24; 2; 13; 0; 1; 0; 8
Single Parent: 68; 1; 40; 1; 0; 0; 26

Remedial or Developmental Courses:
Basic Math: 192; 6; 85; 6; 4; 2; 89
Basic English: 116; 4; 54; 4; 0; 1; 53
Basic Reading: 83; 0; 52; 4; 1; 1; 25
Not Applicable: 168; 11; 66; 6; 2; 1; 82

Notes: 1. CCC = Calhoun Community College; 2. CGTC = Central Georgia Technical College; 3. FCCJ = Florida Community College Jacksonville; 4. GSCC = Gadsden State Community College; 5. JDCC = Jefferson Davis Community College; 6. JSCC = Jefferson State Community College

APPENDIX O

Detailed Faculty Demographics Data Matrix (values listed as Total; CCC; CGTC; FCCJ; GSCC; JDCC; JSCC)

Gender (Nr of Males/Females): 152; 12/18; 10/16; 0/1; 9/15; 2/6; 18/45

Age Grouping:
19-24: 3; 2; 0; 0; 0; 0; 1
25-34: 22; 3; 0; 0; 3; 1; 15
35 or older: 127; 25; 26; 1; 21; 7; 47

Ethnicity:
White (Non-Hispanic): 130; 27; 22; 1; 19; 6; 55
African-American (Non-Hispanic): 12; 0; 1; 0; 2; 2; 7
Hispanic (Latino/Latina): 1; 0; 1; 0; 0; 0; 0
Asian/Pacific Islander: 3; 2; 0; 0; 1; 0; 0
American Indian/Alaskan Native: 1; 0; 1; 0; 0; 0; 0
Other: 5; 1; 1; 0; 2; 0; 1

Highest Degree Earned:
Bachelor's Degree: 19; 2; 9; 0; 3; 0; 5
Master's Degree: 110; 21; 17; 1; 17; 5; 49
Doctorate (PhD, EdD): 21; 6; 0; 0; 3; 3; 9
JD (Law) or MD (Medical): 2; 1; 0; 0; 1; 0; 0

Years Teaching Experience, Current Employment Status:
5 or less years: 43; 10; 8; 0; 5; 1; 19
6 to 10 years: 27; 6; 4; 0; 2; 0; 15
More than 10 years: 77; 13; 14; 1; 16; 7; 26
Full-Time: 95; 12; 17; 1; 21; 6; 38
Part-Time: 28; 11; 5; 0; 1; 2; 9

Teaching Assignments:
Teach Technical Courses Only: 43; 4; 15; 1; 7; 1; 15
Teach General Ed (Non-Technical): 65; 16; 5; 0; 11; 5; 28
Teach In-Class AND On-Line Courses: 54; 11; 9; 1; 12; 1; 20
Teach In-Class Only: 69; 12; 9; 0; 7; 6; 35
Teach On-Line Only: 3; 1; 0; 0; 1; 0; 1

Notes: 1. CCC = Calhoun Community College; 2. CGTC = Central Georgia Technical College; 3. FCCJ = Florida Community College Jacksonville; 4. GSCC = Gadsden State Community College; 5. JDCC = Jefferson Davis Community College; 6. JSCC = Jefferson State Community College