An Investigation of Organizational Information Security Risk Analysis

by

Stephen Zachariah Jourdan

A dissertation submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Doctor of Philosophy

Auburn, Alabama
May 14, 2010

Keywords: information systems, security, risk analysis, ISRA

Copyright 2010 by Stephen Zachariah Jourdan

Approved by
R. Kelly Rainer, Jr., Chair, Professor of Management Information Systems
Thomas E. Marshall, Associate Professor of Management Information Systems
F. Nelson Ford, Associate Professor of Management Information Systems

Abstract

From the dawn of the information age, technology has advanced rapidly to today, where networked computers are almost ubiquitous. One of the problems with connecting computers together is the increased vulnerability to information security threats. Computer viruses, denial of service attacks, and intruders hacking into organizational information systems are becoming commonplace (Mitnick & Simon, 2002; Bodin, Gordon, & Loeb, 2005). In recent years, practitioners and researchers have begun to study issues related to information security (Straub & Welke, 1998). One component of this research is assessing the information security risk analysis practices of the organization (Cavusoglu, Mishra, & Raghunathan, 2004). Despite a growing number and variety of information security threats, many organizations continue to neglect implementing information security policies and procedures. The likelihood that an organization's information systems can fall victim to these threats is known as information systems risk (Straub & Welke, 1998). To combat these threats, an organization must undergo a rigorous process of self-analysis.

Rainer, Snyder, and Carr (1991) published one of the seminal papers related to Information Security Risk Analysis (ISRA). Since the publication of that work, very little research has been conducted to investigate the risk analysis processes that organizations conduct to assess and remedy the variety of information security threats that exist in a modern networking environment. To better understand the current state of this information security risk analysis (ISRA) process, this study used a two-phase approach. In the first phase, a questionnaire using both open-ended and closed-ended questions was administered to a group of information security professionals (N=32). The results of this initial investigation led to a second-phase questionnaire in which a regression model was tested using a new sample of information security professionals (N=144).

The qualitative and quantitative results of this study show that organizations are beginning to conduct regularly scheduled ISRA processes. However, the results also show that organizations still have room for improvement before they reach an ideal ISRA process. In this exploratory study, a regression model tested the effects of the frequency of the ISRA process, the number of methodologies used in the ISRA process, the use of insurance to protect the organization's information assets, the calculation of Return on Investment for security expenditures, the perceived significance of threats to the organization's information systems, the support of top management for the ISRA process, and the security culture of the organization. All of these variables indicated a positive effect on perceived ISRA effectiveness. Limitations of the study and implications for researchers and managers are discussed.
Acknowledgements

My deepest gratitude goes to my wife, Jessica. Her loving patience and kindness have helped me to complete this project. I am very thankful for my wonderful wife, and I dedicate this dissertation to her.

I would like to thank my wonderful dissertation chair, Dr. R. Kelly Rainer, Jr., for his encouragement and guidance seeing me through this project and the entire doctoral program. He always seemed to know when to push me to do more. I could not have asked for a better chairperson, friend, and mentor.

I would also like to thank my parents because they showed me, by example, how to set and achieve goals. They always encouraged me to achieve my dreams, and for that I am truly in their debt.

I extend my gratitude to my committee members, Dr. Nelson Ford and Dr. Thomas Marshall, for their continuous support throughout my doctoral program. A special thanks to all professors who offered doctoral seminars for the students. Dr. Terry Byrd, Dr. William Boulton, Dr. Casey Cegielski, Dr. Christopher Craighead, Dr. Dianne Hall, Dr. Junior Feild, Dr. Allison Jones-Farmer, Dr. Alejandro Lazarte, and Dr. R. Kelly Rainer, Jr. provided seminars that laid the foundation for my career in academia, and I thank them for that.

Table of Contents

Abstract .... ii
Acknowledgments .... v
List of Tables .... viii
List of Figures .... ix
Chapter 1: Introduction .... 1
    Research Objective of the Study .... 5
    Organization of the Dissertation .... 6
Chapter 2: Literature Review .... 7
    Information Security .... 7
    Risk Management .... 9
    Risk Analysis .... 11
    Alternative ISRA Approaches .... 14
Chapter 3: Methodology .... 19
    Step 1. Instrument Creation .... 20
    Step 2. Phase One Data Collection .... 21
    Step 3. Phase One Data Analysis .... 22
        CISSP Sample Characteristics .... 22
        Threat Significance .... 25
        Risk Factors .... 28
        Return on Investment for Information Security .... 29
        Insurance for Information Security .... 30
        ISRA Frequency .... 30
        ISRA Participation and Approval .... 31
        Improved ISRA Process .... 32
    Step 4. Model Development .... 37
    Step 5. Phase Two Data Collection .... 43
    Step 6. Phase Two Data Analysis .... 45
        Common Method Bias .... 50
Chapter 4: Results .... 51
    Model Estimation .... 51
    Results of Hypothesis Tests .... 51
Chapter 5: Discussion & Conclusion .... 54
    Contributions of the Study .... 56
    Limitations of the Study .... 56
    Implications for Research & Practice .... 57
    Conclusion of the Study .... 58
References .... 59
Appendix A  Email Blast to the ISRA Survey Phase One Participants .... 68
Appendix B  The Information Security Risk Analysis Questionnaire - Phase One .... 69
Appendix C  Screen Capture of ISRA Questionnaire - Phase One .... 79
Appendix D  Email Blast to the ISRA Survey Phase Two Participants .... 80
Appendix E  ISRA Questionnaire - Phase Two .... 81
Appendix F  Screen Capture of ISRA Questionnaire - Phase Two .... 87

List of Tables

Table 1: InfoSec Practices .... 9
Table 2: Participants' InfoSec Certifications .... 22
Table 3: Sample Characteristics of Phase One Respondents .... 24
Table 4: Phase One Respondents' Country, Worker Status, & InfoSec Responsibility .... 25
Table 5: Threat Significance by Percentage .... 27
Table 6: Risk Factors by Percentage .... 28
Table 7: ROI and Insurance for Information Security .... 30
Table 8: ISRA Frequency .... 31
Table 9: ISRA Participation and Approval .... 32
Table 10: Proposed ISRA Process Agreement .... 33
Table 11: ISRA and Loss Exposure Methodologies .... 36
Table 12: Summary of Proposed Hypotheses .... 42
Table 13: Sample Characteristics of Phase Two Participants .... 47
Table 14: Proposed Model Variables and Definitions .... 48
Table 15: Means, Standard Deviations, Intercorrelations, and Coefficient Alphas for Study Variables .... 49
Table 16: Table of Model Results .... 53

List of Figures

Figure 1: Development of InfoSec Activities .... 8
Figure 2: Six Methodological Steps .... 20
Figure 3: Improved ISRA Process .... 34
Figure 4: ISRA Effectiveness Model .... 43

CHAPTER I
INTRODUCTION

From the dawn of the information age, technology has advanced rapidly until today, where networked computers are almost ubiquitous. A main concern with connecting computers together is that this increases an information system's exposure to information security threats. As a result of this exposure, computer viruses, denial of service attacks, and intruders hacking into organizational information systems are becoming commonplace (Mitnick & Simon, 2002; Bodin, Gordon, & Loeb, 2005). In recent years, society has become aware of computer-related security (i.e. information security) issues through stories in the popular news media. Computer viruses, identity theft, denial of service attacks, and incidents of informational espionage have become major news stories.
Even when an organization is using firewalls, virus protection software, intrusion detection systems, and other advanced technologies, the organization's computers, networks, and information are not safe (Moore, 2003). According to the 2007 CSI Computer Crime and Security Survey, "The average annual loss reported in this year's survey shot up to $350,424 from $168,000 the previous year. Not since the 2004 report have average losses been this high" (Richardson, 2007, p. 2). This level of security-related hazard is nothing new, but organizations have historically been oblivious to these dangers and have subsequently minimized information security expenditures. This lack of security investment led Straub and Welke (1998) to state, "Information security concerns are often ignored by top managers, middle managers, and employees alike. As a result, many information systems are far less secure than they might otherwise be, and security breaches are far more frequent and damaging than is necessary" (Straub & Welke, 1998, p. 2).

Even when top management supports the security initiatives, investments to protect against known vulnerabilities may not be sufficient to assure that an organization's information assets are safe. New threats are continuously being designed and deployed by cybercriminals to exploit vulnerabilities that defending organizations have not yet discovered. Extant literature has identified the advantages for these organizations of sharing information about new vulnerabilities, attacks, and damages from breaches (Ma & Pearson, 2005; Kotulic & Clark, 2004; Dutta & McCrohan, 2002). Yet, firms are hesitant to share security-related information.

Information security related crime is responsible for a significant amount of financial loss to companies conducting business through the Internet (Gordon, Loeb, Lucyshyn, & Richardson, 2004). Internet-based attacks on corporate information assets, motivated by criminals with malicious intent, have been increasing in number and sophistication. However, the full degree of financial losses due to information security breaches is difficult to assess because the majority of organizations are hesitant to report breaches for fear of market reprisal (Campbell, Gordon, Loeb, & Zhou, 2003).

In the current business environment, information systems security (InfoSec) has been proclaimed as a key issue for the development of a global Information Society (Commission of the European Communities, 1994). Information security has attracted the attention of researchers, professionals, journalists, legislators, governments, and citizens. One would expect this publicity to raise awareness and lead organizations to invest in security. However, recent surveys show that the actual situation is rather frustrating. Hinde (1998) analyzed the results of three recent surveys in the UK and compared them to the results of past surveys to conclude that "... the underlined messages of key risks; of lack of awareness; and of lack of preparedness by management have not altered since the very first UK Audit Commission Survey conducted in 1981". In addition to the lack of improvement, key results included the following: one in five organizations had suffered a serious breach of security; security policies were inadequate; and there was a significant gap between awareness of security risks and the steps taken to avoid them. Regarding this awareness-to-action gap, the Business Information Security Survey (Hinde, 1998) concluded that "... the regrettable truth is that people often know how to avoid security breaches and yet do nothing about it. According to the survey results, more than half of the organisations that had suffered security breaches felt they could have done something to prevent it" (Hinde, 1998).

Information technology (IT) professionals often find great difficulty in convincing corporate management to invest in security projects (Lindup, 1996). Corporate management usually supports projects that can prove their cost-effectiveness, follow stable and recognized methodologies that ensure their successful completion, demonstrate compliance with the corporate strategic plan, and allow their effect on the organization to be assessed.
Even with these inherent barriers, organizations have taken these threats seriously and have begun to invest in both technology and human resources to protect their information assets (Conry-Murray, 2003). Despite this effort, the pace of innovation by cybercriminals to exploit these vulnerabilities has increased. This development has made it more difficult for any single organization to protect its network alone, because information security is a complex technology-based ecosystem of attackers and defenders involved in a continuous learning process (Knapp, Morris, Rainer & Byrd, 2003).

In addition to this complex external environment, organizational strategy affects the role that information technology plays (Henderson & Venkatraman, 1993). At one extreme, emerging technology drives the strategy of the firm (Huber, 1990). At the other extreme, technology is merely a necessary tool to support operations (Carr, 2003). A firm's technological orientation (technological opportunism) drives investment to build the capability of identifying, assimilating, transforming, and exploiting emerging technology (Srinivasan, Lilien, & Rangaswamy, 2002). A firm's technological opportunism determines the degree to which it chooses to capitalize on emerging technologies such as the Internet. Leveraging Internet technology does not come without risks, including exposure to external attack. In an environment with scarce capital, organizations must decide how to allocate their resources to minimize this risk and protect themselves from security threats in the most cost-effective way. The main goal of this study is to investigate this process.

Research Objective of the Study

Using both qualitative and quantitative methods, this study attempts to learn more about the analysis that organizations undergo to allocate their security resources. This process, information security risk analysis (ISRA), is a form of risk management undertaken to reduce the negative outcomes of security breaches. The breaches threatening information assets take many forms. Threats can be external (e.g. viruses, cybercriminals, and natural disasters) or internal (e.g. human error, technical obsolescence, and ineffective security controls). With a seemingly infinite number of threats poised against information assets and a limited amount of financial resources and personnel, firms must choose which assets are most critical to the organization's survival. To protect the organization, choices must be made to balance risk factors such as maintaining legal requirements or avoiding lawsuits from customers (Whitman, 2003). If a firm focuses too much on one factor, resources are wasted that could be used to balance the risk posed by another threat. Existing ISRA processes are not holistic; these methods rely on a very simplistic model of the organization defined in terms of assets, mainly data, hardware, and software. This research attempts, for the first time, to examine the ISRA process in the context of the entire organization. Due to the very limited research about ISRA in the context of the entire organization, the researcher determined that an open-ended questionnaire would be the best methodology to begin the investigation of this process.

Organization of the Dissertation
The dissertation is organized into five chapters. Chapter I introduces the topic of Information Security Risk Analysis (ISRA). Chapter II provides a theoretical perspective by reviewing the relevant literature regarding the process investigated in this study. The chapter provides a literature background for ISRA. Chapter III covers the research methodology that explored the ISRA process. This chapter describes the six methodological steps of the project, from survey creation to the creation of a proposed theoretical model. Chapter IV shows the results of testing the proposed regression model. Chapter V includes a discussion of the findings, major contributions, limitations of the study, and implications for research and practice. This discussion is followed by a conclusion to the study.

CHAPTER II
LITERATURE REVIEW

The first section of this chapter introduces the topic of information security and defines it as a concept separate from network security and computer security. The literature base for risk management is reviewed in the second section. The research on risk analysis in the information security context is discussed in the third section. The final section of Chapter II discusses the various ISRA approaches in detail.

Information Security

Information Security (InfoSec) is the set of processes, procedures, personnel, and technology charged with protecting an organization's information assets (Whitman & Mattord, 2003). This set of practices begins at the top of the organization, with the senior executives analyzing the external environment and the current organizational structure to create the organization's strategy. The executives work together with the head of each functional area (i.e. Chief Financial Officer, Chief Operating Officer, etc.) to create policies for their respective functional areas. The head of the Information Systems functional area, usually the Chief Information Officer (CIO), responds to this organizational mandate by creating the IS policy, which dictates the structure of the organization's information systems and the policies of each department within the IS functional area. The CIO then works with the Chief Information Security Officer (CISO) to create the InfoSec Policy as a subset of the IS function's policy creation process (Whitman & Mattord, 2003; Rainer, Turban, & Potter, 2007). For a graphic depiction of this InfoSec Policy creation process, see Figure 1.

Figure 1. Development of InfoSec Activities

The InfoSec policy contains detailed plans and procedures for how the department will carry out all of the InfoSec activities. These activities include end-user training, operations, project management, risk management, and policy evaluation. End-user training is developed by the information security department to reduce the number of security-related incidents that occur through the users' lack of awareness. Operations deals with the day-to-day maintenance of current information security systems and all other support activities. Project management deals with the creation and implementation of new security systems. Risk management is the process of identifying vulnerabilities in an information system and taking action to control for those weaknesses. As new vulnerabilities appear, changes must be made to the organization's InfoSec policy to address these threats, including contingency plans for incident response, disaster recovery, and business continuity planning (Whitman & Mattord, 2004). This study focuses on Risk Analysis as a subset of Risk Management, as depicted in Table 1.
Table 1. InfoSec Practices (Whitman & Mattord, 2004)

End-User Training: Information security education, training, and awareness
Operations: Updating and maintaining current InfoSec systems
Project Management: Designing and implementing new InfoSec projects
Risk Management: Identifying and controlling for risks to information assets
Policy Evaluation: Assessing current policy, making changes, contingency planning

Risk Management

Researchers and practitioners have long stated that information technology (IT) projects were not secure from their inception. To deal with the complexities and uncertainties that increasingly surround technological change and its management, risk management can be an extremely powerful approach. Risk is sometimes seen as a negative concept with respect to IT in organizations because it implies that something could go wrong with an IT project. Conventionally, in IT projects, risks have been narrowly defined, limited only to the financial success or failure associated with a project's completion. Today, with IT becoming integral to a company's existence, the stakes are considerably higher and broader in scope (Smith, McKeen, & Staples, 2001).

The results of Smith et al.'s (2001) study, which involved a focus group of senior IT managers from a number of organizations in a variety of industries, were composed of the managers' presentations and a review of the current research on risk management. Smith et al. (2001) concluded that IT managers must learn to control both the problems and the potential that risk represents. The study developed several general principles to help IT managers deal effectively with these risks. Effective risk management involves taking a holistic approach to risk, developing a risk management policy, establishing clear accountabilities and responsibilities, balancing risk exposure against controls, being open about risks to reduce conflict and information hiding, enforcing risk management practices, and learning what works and what does not from past experience (Smith et al., 2001).

The underlying problem with risk is that managers are generally unaware of the full range of actions that they can take to reduce risk. Because of this lack of knowledge, subsequent actions to plan for and cope with risk are less effective. This is one viable explanation for why losses from computer abuse and computer disasters today are still so uncomfortably large and potentially devastating (Straub, 1998). An effective method for increasing an organization's knowledge of the risks and countermeasures associated with IT is to undergo some form of risk analysis.

Risk Analysis

Rainer, Snyder, and Carr (1991) defined risk analysis (RA) as "the process managers use to examine the threats facing their IT assets and the vulnerabilities of those assets to the risks" (Rainer, Snyder, & Carr, 1991, p. 133). Rainer et al. (1991) further stated that RA consisted of identifying assets, identifying threats to those assets, and determining the vulnerability of those assets to the threats, and that RA methodologies were either quantitative or qualitative. These methodologies would ideally be acceptable to all stakeholders (i.e. management, users, and the IS department), be comprehensive enough to assess all risks, be logically sound, be practical enough to deliver the best protection for the investment, and be conducive to learning through documentation and records of the RA process (Rainer, Snyder, & Carr, 1991).
Risk analysis (RA) is the predominant methodology for ISRA. It is a rather straightforward methodology that follows five stages: asset identification/valuation, threat assessment, vulnerability assessment, existing/planned safeguard assessment, and risk assessment (International Standards Organization, 2006). Baskerville (1991) stated that almost all information security professionals use RA as a tool to justify the cost of security controls to management and attributed part of the success of RA to its use as a communication link between the security and management professionals who must make decisions concerning investments in InfoSec. "Its simple probability arithmetic allows the security problem to be expressed in a calculus that is familiar to management and in terms (monetary) that permit comparison with capital opportunity costs" (Baskerville, 1991, p. 752).

Other researchers have attempted to improve upon this calculus. Gordon and Loeb (2002a) proposed an economic model composed of three parameters of a firm's expected loss due to information security breaches: the probability of a threat occurring, the probability that a threat would be successful (the likelihood of a breach), and the loss resulting from a successful security breach. This model assigns the probability of a threat under the implicit assumption that all threats have an equal probability of occurring and the explicit determination that the firm cannot influence the probability of the occurrence of a threat. The model also assigns the value of the expected loss as a function of the probability of a breach. This logic makes the implicit assumption that firms are concerned with an average loss, instead of an extreme case.
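Expressed as a worked equation (a sketch for illustration only; the symbol choices below are made here and are not necessarily those used in the original paper), the three parameters combine multiplicatively into an expected loss:

\[
E[\text{loss}] = t \times v \times \lambda
\]

where t is the probability that a threat occurs, v is the probability that the threat succeeds given that it occurs (the vulnerability, or likelihood of a breach), and λ is the monetary loss from a successful breach. Under this formulation, a security investment is evaluated by how much it reduces v relative to the investment's cost, which is why the model describes an average, rather than a worst-case, loss.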
Straub and Welke (1998) identify industry susceptibility to risk, past firm actions taken to secure information systems, and personal awareness of security risk as drivers of a manager's perception of risk. However, the study does not explicitly identify whether a firm's strategy, influenced by technological opportunism, has an influence on the perception of security risk. Firms recognize that it is important to carefully weigh actions to address information security, because the market responds unfavorably to firms that spend either too much or too little to secure their information assets (Campbell et al., 2003). The prevailing wisdom is that investments in information security have been shown to have a diminishing return (Gordon et al., 2002a). However, the problem is complex: "Normal tools utilized to evaluate investments such as ROI or IRR may not be appropriate" (Gordon & Loeb, 2002b, p. 28).

Investment to protect against known threats is necessary but not sufficient to guarantee security because the information security environment is, by definition, characterized by uncertainty. Firm investment can be categorized along a continuum of firm activism. At one end, firms seek to transfer the risk through insurance or outsourcing contracts, and at the other end of the spectrum firms invest proactively in dynamic capabilities as a strategy to provide flexibility to address environmental uncertainty (Brealey, Myers, & Allen, 2005). Kogut and Kulatilaka (2001) identified lobbying the government as an additional form of proactive investment.

Even organizations that are proactive with respect to information security have reported uncertainty about the thoroughness of their preparations. The Computer Security Institute (CSI) stated in its 2007 report that the average annual loss reported by U.S. companies in the 2007 CSI Computer Crime and Security Survey more than doubled, from $168,000 in the 2006 report to $350,424 in the 2007 survey. This ends a five-year trend of lower reported losses (Richardson, 2007). Financial fraud overtook virus attacks as the source of the greatest financial loss. Virus losses, which had been the leading cause of loss for seven straight years, fell to second place. Another significant cause of loss was system penetration by outsiders. According to the results, almost one-fifth of those respondents who suffered one or more kinds of security incident said they had suffered a "targeted attack" (i.e. a malware attack aimed exclusively at their organization or at organizations within a small subset of the general population). Insider abuse of network access or e-mail (such as trafficking in pornography or pirated software) edged out virus incidents as the most prevalent security problem, with 59% and 52% of respondents reporting each, respectively. At a period when experts throughout the industry have been discussing with concern the growing sophistication and stealth of cyber attacks, respondents are saying they lost significantly more money in 2006, as stated by Robert Richardson, CSI director and author of the survey (Richardson, 2007).

The study by Campbell, Gordon, Loeb, and Zhou (2003) further illustrates the financial dangers associated with information security issues. The study examined the economic effect of information security breaches reported in newspapers for publicly traded US corporations. They found limited evidence of an overall negative stock market reaction to public announcements of information security breaches. However, further investigation revealed that the nature of the breach affected the result. Campbell et al. (2003) found a highly significant negative market reaction for information security breaches involving unauthorized access to confidential data, but no significant reaction when the breach did not involve confidential information. Thus, stock market participants appeared to discriminate across types of breaches when assessing their economic impact on affected firms. These findings were consistent with the argument that the economic consequences of information security breaches vary according to the nature of the underlying assets affected by the breach (Campbell et al., 2003). To accomplish the goal of minimizing risk to information assets with a minimum investment, several ISRA approaches have been proposed.

Alternative ISRA Approaches

In their paper, Rainer et al. (1991) categorized many RA methodologies into either the quantitative or the qualitative category. Annualized loss expectancy (ALE), Courtney, the Livermore Risk Analysis Methodology (LRAM), and Stochastic Dominance were all classified as expected value analyses, in which the loss exposure is a function of the asset's vulnerability to a threat multiplied by the likelihood of the threat materializing, with the Delphi method used to solicit information and obtain consensus from users. These methodologies have the advantages of forcing the organization to identify its most vulnerable assets, develop contingency plans to operate without those assets, and test those plans to demonstrate how critical the assets are to the organization. The disadvantages of these methodologies are imprecision and cost. Measuring the probabilities of these assets being attacked by these threats is a very imprecise endeavor. While imprecise, the process can also be very expensive in time, labor, and dollars invested (Rainer, Snyder, & Carr, 1991).
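As a point of reference, the expected value calculation that underlies ALE-style methods is usually written as follows (a generic sketch with illustrative numbers, not figures taken from Rainer et al.):

\[
\text{SLE} = \text{asset value} \times \text{exposure factor}, \qquad
\text{ALE} = \text{SLE} \times \text{ARO}
\]

where the single loss expectancy (SLE) is the dollar loss from one occurrence of a threat, the exposure factor is the fraction of the asset's value lost in that occurrence, and the annualized rate of occurrence (ARO) is the estimated number of occurrences per year. For example, a $200,000 asset with an exposure factor of 0.25 and an ARO of 0.5 yields an ALE of $25,000 per year, which can then be compared against the annual cost of a proposed safeguard.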
Rainer et al. (1991) described qualitative methodologies as an alternative to the more extensive quantitative methodologies. The qualitative methodologies include Scenario Analysis, Fuzzy Metrics, and questionnaires. As with the quantitative methodologies, the Delphi method can be used to clarify the variables under investigation. These methodologies have the advantage of being much less costly than the quantitative methods. However, the qualitative methodologies have the inherent disadvantage of defining risk in vague terms (i.e. low, medium, high, strong, weak, etc.) that do not provide exact dollar values and probabilities (Rainer, Snyder, & Carr, 1991).

Since the publication of the study by Rainer, Snyder, and Carr (1991), other researchers have attempted to add to the portfolio of methodologies that an organization can use for RA. Holbein, Teufel, and Bauknecht (1996) proposed the use of transaction-based business models for security design in organizations. These models are used to specify need-to-know authorizations and role-based access rights, based on information exchange and the client-supplier model. Backhouse and Dhillon (1996) rely on the conversational structures deriving from speech act theory (Searle, 1987) to propose a theoretical and conceptual foundation for analyzing IS security. They argue that an analysis of structures of responsibility in organizations may lead to the development of secure IS. These approaches support the analysis of the organization for the purpose of formulating specific security requirements. However, it has not been shown whether they can be used within a comprehensive ISRA methodology.

Badenhorst and Eloff (1989) have proposed an integrated methodology for ISRA. Their framework of a methodology for the life-cycle of computer security in an organization consists of five phases: initiation, establishment of a computer security policy, risk analysis/project definition, installation, and maintenance. This methodology incorporates risk analysis into a comprehensive security framework. Organizational issues are addressed in the initiation stage. The second stage includes the development of a computer security policy, based on the mission statement of the organization, and the establishment of a computer security steering committee. However, this methodology does not include any kind of organizational analysis.

Hitchings' (1995) approach attempts to integrate risk analysis with organizational analysis in the context of a generic framework for ISRA. The proposed framework comprises the following phases:
1. Analysis of the organization and definition of relevant systems.
2. Security analysis of relevant systems.
3. Risk analysis.
4. Security design.
5. Security implementation, monitoring, and management.
In the first phase, the organization is analysed to determine the systems that need to be examined from a security perspective. The second phase concerns the security analysis of those systems and includes the analysis of business processes and the interpretive analysis of information in the organization. Risk analysis is then performed in the context of the organization. Security design, in the fourth phase, produces a security plan that includes a security policy and specific countermeasures. Finally, security implementation is coupled to monitoring and management.
Monitoring and management are continuous activities that aim at keeping risk at a tolerable level (Hitchings, 1995). The weakness of this approach is that it views risk analysis as one step in a process, when in fact the risk analysis is the entire process.

These initial attempts at an ISRA process are important attempts to develop workable security controls for the organization, and the quality of security controls can significantly influence all categories of risk. Traditionally, researchers and institutions recognized the direct impact from incidents related to fraud, theft, or accidental damage. Many security weaknesses, however, can directly increase exposure in other areas. For example, the potential for legal liability related to customer privacy breaches may present additional risk. A strong information security program reduces levels of reputation, operational, legal, and strategic risk by limiting the institution's vulnerability to intrusion attempts and maintaining customer confidence and trust in the institution. Security concerns can quickly erode customer confidence and potentially decrease the adoption rate and rate of return on investment for strategically important products or services. Practitioners and risk managers should incorporate security issues into their risk analysis process for each risk category. Financial institutions should ensure that security risk assessments adequately consider potential risk in all business lines and risk categories.

Information security risk analysis is the process used to identify and understand risks to the confidentiality, integrity, and availability of information and information systems. In its simplest form, a risk analysis consists of the identification and valuation of assets and an analysis of those assets in relation to potential threats and vulnerabilities, resulting in a ranking of risks (i.e. risk factors) to mitigate. The resulting information should be used to develop strategies to mitigate these risks. An adequate assessment identifies the value and sensitivity of information and system components and then balances that knowledge with the exposure from threats and vulnerabilities. The next chapter describes the methodology used to gather more information about this complex business process.

CHAPTER III
RESEARCH METHODOLOGY

This research study combines quantitative and qualitative interviewing techniques in two phases. Quantitative interview studies attempt to report how many people fall into particular categories and the relationships between one category and another. These studies, characterized by closed-ended Likert-scale questions, collect numbers as data, but this is not why they are quantitative. These studies, characterized by the sample survey, attempt to maximize the sample's generalizability to the population under investigation (Scandura & Williams, 2000). They are quantitative because all of their results can be presented as a table of numbers (Weiss, 1994). In contrast, qualitative interview data tends to be narrative in nature. A qualitative interview produces rich, detailed answers, while a quantitative interview is designed to produce data that can be coded and processed quickly. In qualitative interviewing, the researcher is much more interested in the interviewee's point of view. This is in direct contrast to a structured quantitative interview, where the researcher decides all of the questions and answers for the respondent (Bryman & Bell, 2003).
Researchers can combine quantitative and qualitative interview techniques in a single study (Bryman & Bell, 2003). Figure 2 illustrates the two phases of the study combining quantitative and qualitative interviewing techniques. The following sections provide detailed descriptions of the six methodological steps used in the two phases.

Figure 2. Six Methodological Steps

Step 1. Instrument Creation

The first phase of this methodology began by creating a survey instrument that would explore the complicated ISRA process. To accomplish this, an instrument was created by the principal researcher. Then, an expert panel including two accomplished university researchers and four Certified Information Systems Security Professionals (CISSPs) was consulted. CISSPs are members of the non-profit International Information Systems Security Certification Consortium who agree to a code of ethics, possess a minimum level of professional experience, pass a comprehensive exam, and earn continuing professional education credits (www.isc2.org). This expert panel reviewed the questionnaire and suggested improvements regarding various aspects of the ISRA process, including content validity and potential intrusiveness. Suggestions were made, changes implemented, and feedback given in several stages over a two-month period. After this iterative refinement process, the instrument was deemed ready for data collection.

Step 2. Phase One Data Collection

To initiate data collection, an email (see Appendix A) was sent to 300 CISSPs asking for their participation in a study being conducted by a researcher at Auburn University. The email explained the purposes of the study and assured possible participants that any information they provided was strictly anonymous and would only be used for research purposes. The email also explained that participation in the study required only that participants fill out a short survey, which would take between 20 and 30 minutes. Finally, the email directed those who desired to participate to download the attached spreadsheet, complete each part, and email the spreadsheet back to the researcher.

Those who did not respond to the first request for participation were contacted again with a second email. This second communication was sent approximately one week after the original communication and was the last time that non-responders were contacted. Finally, after two weeks the first phase of the data collection ended and the data were analyzed. Specifically, 300 individuals were contacted about participation in the study. Of the 300 individuals contacted, 32 completed the semi-structured survey, for a response rate of 10.67%. A copy of the text for the semi-structured survey is included in Appendix B. A screenshot of the Microsoft Excel worksheet is included in Appendix C.

Step 3. Phase One Data Analysis

The sample is notable for several reasons. First, the participants all held the CISSP certification (Table 2), indicating a standard of information security knowledge and experience. In addition to the CISSP certification, 25.1% of the participants held at least one additional information security related certification. Second, the CISSP certification is one of the most selective certifications in the information security profession, and individuals who earn this certification are held to the highest professional and ethical standards.
Third, the sample of InfoSec professionals provided data from individuals who are highly knowledgeable about the ISRA process at their respective organizations. Finally, the holders of the CISSP certification work in a variety of information security roles in a diverse array of organizations.

Table 2. Participants' InfoSec Certifications

Please select your certification. (Response Percent / Response Count)
    None: 0.0% / 0
    CISSP: 100.0% / 32
    SSCP: 18.8% / 6
    CAP: 0.0% / 0
    Other: 6.3% / 2

CISSP Sample Characteristics

Table 3 illustrates the diversity of the sample with respect to number of employees, type of industry, job position, IT experience, and InfoSec experience. The sample had participants who worked at a mix of small, medium, and large organizations. The respondents worked in a variety of industries in both the public and private sectors. The professionals also worked in a variety of roles in the organization, from rank-and-file workers represented by the Other IT/Technical/Scientific/Professional category through all levels of management, from department head up to the owner and executive level of the organization. These professionals had a variety of IT and InfoSec experience, with the vast majority being mid-level professionals with between six and fifteen years of experience.

Table 3. Sample Characteristics of Phase One Respondents

Employees:
    More than 15,001: 25.0%
    From 7,501 to 15,000: 9.4%
    From 2,501 to 7,500: 25.0%
    From 501 to 2,500: 18.8%
    500 or less: 21.9%
Industry (largest represented include):
    Finance, Banking, & Insurance: 18.8%
    Consultant: 12.5%
    Information Technology/Security/Telecom: 12.5%
    Manufacturing: 12.5%
    Government (federal, military, local, etc.): 6.3%
    Medical/Healthcare (public or private): 6.3%
    Consumer Products/Retail/Wholesale: 6.3%
    Utilities: 6.3%
    Professional Services (Legal, Marketing, etc.): 3.1%
    Education/Training: 3.1%
    Energy: 3.1%
    Publishing: 3.1%
    Travel/Hospitality: 3.1%
    Real Estate/Property Management: 3.1%
Job Position:
    Other IT/Technical/Scientific/Professional: 40.6%
    MIS/IS/IT/Technical management: 28.1%
    Consultant/Contractor: 12.5%
    Department Manager/Supervisor/Director: 9.4%
    Owner/Partner: 6.3%
    Senior Manager/Executive: 3.1%
IT Experience:
    5 years or less: 3.1%
    Between 6 and 10: 43.8%
    Between 11 and 15: 25.0%
    Between 16 and 20: 15.6%
    More than 20: 12.5%
InfoSec Experience:
    5 years or less: 31.3%
    Between 6 and 10: 46.9%
    Between 11 and 15: 12.5%
    Between 16 and 20: 3.1%
    More than 20: 6.3%

Table 4 further describes the participants. These professionals mostly worked in North America, represented by Canada and the United States, but a few other countries were also represented. The majority (78.1%) considered themselves permanent employees, while 21.9% labeled themselves as outsourced workers. Most of these professionals considered information security one of their primary job responsibilities.

Table 4. Phase One Respondents' Country, Worker Status, & InfoSec Responsibility

Select the country where you perform the majority of your work.
    United States - United States of America: 68.8%
    Canada: 18.8%
    United Kingdom: 6.3%
    Saudi Arabia - Kingdom of Saudi Arabia: 3.1%
    South Africa - Republic of South Africa: 3.1%
Are you an outsourced (consultant) worker?
    YES, I'm an outsourced worker: 21.9%
    NO, I'm a regular/permanent employee: 78.1%
Is information security a primary or secondary responsibility of your current job?
    Primary: 62.5%
    Secondary: 37.5%

Threat Significance

One of the critical tasks in the ISRA process is to identify threats and rank them according to significance.
Organizations have limited resources with which countermeasures may be implemented. Whitman (2003) used a list of threats to determine whether organizations were concerned about the information security environment. That study resulted in a weighted ranking of threats that was similar to the 2002 CSI/FBI Annual Computer Crime and Security Survey (Whitman, 2003; Power, 2002). This questionnaire uses the same items and a 5-point Likert scale to ask participants to rate each threat's significance from extremely insignificant to extremely significant. The results, shown in Table 5, indicate that the vast majority (more than 90%) of participants listed acts of human failure, deliberate acts of espionage or trespass, deliberate acts of sabotage or vandalism, deliberate acts of theft, and deliberate software attacks as the most significant threats to their respective organizations.

Table 5. Threat Significance by Percentage
(Percentages per threat: Extremely insignificant / Insignificant / Neither insignificant nor significant / Significant / Extremely significant)

Act of human failure: 0.0% / 3.1% / 6.3% / 37.5% / 53.1%
Compromises to intellectual property: 3.1% / 25.0% / 9.4% / 28.1% / 34.4%
Deliberate acts of espionage or trespass: 0.0% / 9.4% / 0.0% / 12.5% / 78.1%
Deliberate acts of information extortion: 6.3% / 15.6% / 6.3% / 15.6% / 56.3%
Deliberate acts of sabotage or vandalism: 3.1% / 6.3% / 0.0% / 21.9% / 68.8%
Deliberate acts of theft: 0.0% / 3.1% / 0.0% / 25.0% / 71.9%
Deliberate software attacks: 0.0% / 9.4% / 0.0% / 46.9% / 43.8%
Forces of nature: 0.0% / 12.5% / 3.1% / 46.9% / 37.5%
Quality of service deviations from service providers: 0.0% / 21.9% / 6.3% / 50.0% / 21.9%
Technical hardware failures or errors: 0.0% / 15.6% / 6.3% / 50.0% / 28.1%
Technical software failures or errors: 0.0% / 21.9% / 3.1% / 46.9% / 28.1%
Technical obsolescence: 6.3% / 31.3% / 6.3% / 25.0% / 31.3%

Risk Factors

Baker, Rees, and Tippet (2007) stated that while organizations are attempting to take advantage of information technology to be competitive, those that do not pay heed to information security are actually making their organizations less competitive due to increased vulnerabilities. Management is faced with an array of information security standards and technologies, but no reliable criteria for making effective strategic decisions and determining the priority of those decisions regarding InfoSec expenditures. The Office of Homeland Security (2002) noted a lack of real-world data on how organizations set priorities among all the risks in a modern computing environment (i.e. risk factors). Table 6 shows that many organizations use some or all of the risk factors to plan their respective InfoSec strategies. When questioned about the Other category, participants' answers were more industry specific. Participants were concerned about violations of patient confidentiality in the medical industry, regulatory requirements in the financial services industry, and downstream liability in a variety of industries.

Table 6. Risk Factors by Percentage

When developing risk factors for your organization's risk analysis, which factors does your organization focus on the most? (Yes / No)
    Legal, regulatory, or statutory requirements: 78.13% / 21.88%
    Loss of consumer confidence: 75.00% / 25.00%
    Damage to organization's image/brand: 78.13% / 21.88%
    Financial losses: 93.75% / 6.25%
    Risks to infrastructure: 81.25% / 18.75%
    Risks of possible lawsuits: 71.88% / 28.13%
    Business requirements for information confidentiality, integrity, and availability: 75.00% / 25.00%
    Other: 25.00% / 75.00%

Return on Investment for Information Security

The financial return for investing in information security countermeasures has historically been difficult to calculate (Gordon & Loeb, 2002a; Gordon & Loeb, 2002b). Several strategies have been used in an attempt to place a dollar figure on a business concept that is difficult to quantify. The most common strategy is using fear, uncertainty, and doubt (FUD) to sell investments using anecdotal stories from real-world worst-case scenarios. The second method is to estimate return on investment (ROI) for information security based on the cost of countermeasures. Another method is to use indirect estimates of the possible costs associated with security breaches. A more traditional approach involves using a traditional risk or decision analysis framework (Cavusoglu et al., 2004).
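One commonly cited practitioner formulation, return on security investment (ROSI), is offered here only as an illustrative sketch; it is not drawn from the studies cited above. It compares the reduction in expected annual loss produced by a safeguard to the safeguard's cost:

\[
\text{ROSI} = \frac{(\text{ALE}_{\text{before}} - \text{ALE}_{\text{after}}) - \text{safeguard cost}}{\text{safeguard cost}}
\]

For example, a control costing $20,000 per year that reduces ALE from $60,000 to $25,000 would yield a ROSI of 0.75, or 75%. One reason such standard investment tools may not be appropriate (Gordon & Loeb, 2002b) is that the inputs to this calculation, particularly the post-control ALE, are difficult to estimate.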
This research project simply asked respondents whether their organization was using any method for the calculation of ROI for information security expenditures (Table 7). Of the respondents who stated their organization calculated ROI for information security, none would answer any follow-up questions regarding the specifics of how their organization accomplishes this task. Several individuals specifically stated that they could not disclose that information due to the proprietary nature of the methodology.

Table 7. ROI and Insurance for Information Security

Does your organization calculate Return on Investment (ROI) for information security investments and expenses?
    Yes: 15.6%
    No: 84.4%
Does your organization purchase insurance to cover its information assets?
    Yes: 28.1%
    No: 71.9%

Insurance for Information Security

A minority of professionals (see Table 7) indicated that their organization used insurance to protect their information assets. When asked further about the details regarding the insuring of their organization's information assets, respondents varied in the percentage of assets insured, from the most critical assets only (10-15% of assets insured) to all information assets (90-100% of assets insured). The participants also indicated a wide variety of insurance strategies, from traditional insurance, to outsourcing a variety of redundant services, to the establishment of a variety of cold, warm, and hot sites ready to go if disaster strikes. When these additional strategies were considered under the category of insurance, most participants agreed that their organization is using some form of insurance.

ISRA Frequency

When asked about the frequency of the ISRA process at their organizations, approximately one-fourth of respondents chose never or rarely for their department and organization (Table 8). The fact that this many organizations are conducting their ISRA process with such haphazard infrequency is troubling. About half chose either annually or quarterly for their department and organization. The remainder chose Weekly/Monthly or Continuously for the frequency of their respective ISRA processes. When further probed about the frequency of the process at their organizations, individuals from this group made comments stating that this is an ongoing process, with committees that meet regularly throughout the year.

Table 8. ISRA Frequency

How often is the information security risk analysis conducted for your department within your organization?
    Never: 9.4%
    Rarely: 15.6%
    Annually: 28.1%
    Quarterly: 12.5%
    Weekly/Monthly: 6.3%
    Continuously: 28.1%
How often is the information security risk analysis conducted for your entire organization?
    Never: 6.3%
    Rarely: 21.9%
    Annually: 25.0%
    Quarterly: 25.0%
    Weekly/Monthly: 3.1%
    Continuously: 18.8%

ISRA Participation and Approval

First, the expert panel was curious to know who participated in the ISRA process. The panel hoped that the ISRA process was not simply delegated to the IT department and forgotten, and believed that when an organization used professionals with a diverse knowledge of all the functional areas, a more successful ISRA process could be achieved. Second, the panel wanted to know whether the ISRA process was achieving support from the executives and other managers in their respective organizations. Finally, the panel was interested in knowing who had final approval of the ISRA process. The results of these queries are shown in Table 9.

Table 9. ISRA Participation and Approval

Which of the following individuals at your organization participate in information security risk analysis? (Yes / No)
    Owner/Partner: 28.13% / 71.88%
    Senior Manager/Executive (e.g. CEO, CIO): 65.63% / 34.38%
    Department Manager/Supervisor/Director: 87.50% / 12.50%
    MIS/IS/IT/Technical management: 93.75% / 6.25%
    Other Managerial: 68.75% / 31.25%
    Consultant/Contractor: 84.38% / 15.63%
    Other IT/Technical/Scientific/Professional: 87.50% / 12.50%
    Other Employees: 40.63% / 59.38%
Which of the following individuals at your organization have final approval of the information security risk analysis? (Yes / No)
    Owner/Partner: 21.88% / 78.13%
    Senior Manager/Executive (e.g. CEO, CIO): 81.25% / 18.75%
    Department Manager/Supervisor/Director: 40.63% / 59.38%
    MIS/IS/IT/Technical management: 28.13% / 71.88%
    Other Managerial: 6.25% / 93.75%
    Consultant/Contractor: 9.38% / 90.63%
    Other IT/Technical/Scientific/Professional: 6.25% / 93.75%
    Other Employees: 6.25% / 93.75%

Improved ISRA Process

As noted earlier, many ISRA processes are available to the practitioner. These processes have been developed by academics (Rainer et al., 1991; Holbein et al., 1996; Backhouse & Dhillon, 1996), government agencies (ISO, 2006; OHS, 2002), or consultants hired by government agencies (Stoneburner, Goguen, & Feringa, 2002) in an attempt to give organizations a step-by-step process by which to conduct their ISRA. The expert panel attempted to develop a simple process reflecting the best practices of a modern organization. The six-step process (Table 10) was met with great enthusiasm by the survey participants; an illustrative computational sketch of Steps 4 and 5 appears after the table.

Table 10. Proposed ISRA Process Agreement

Step 1: Determine IT assets.
Step 2: Determine value of IT assets.
Step 3: Enumerate possible threats to IT assets.
Step 4: Determine vulnerability of assets to specific threats.
Step 5: Determine risk exposure for organization.
Step 6: Minimize exposure and/or purchase insurance to minimize risk exposure.

Do you agree with the process above for information security risk assessment?
    Yes: 96.9%
    No: 3.1%
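To make the quantitative intent of Steps 4 and 5 concrete, the following minimal Python sketch shows one way the per-asset and organizational risk exposure could be computed. It is offered purely as an illustration of the arithmetic implied by the table; the asset names, values, and probabilities are hypothetical and do not come from the survey data.

# Illustrative only: expected annual exposure per asset = value x threat likelihood x vulnerability.
# Steps 1-2: assets and their estimated values (hypothetical figures).
assets = {"customer database": 500_000, "web server": 150_000}

# Steps 3-4: per-asset threats with (annual likelihood, vulnerability given the threat occurs).
threats = {
    "customer database": [("insider abuse", 0.30, 0.20), ("software attack", 0.50, 0.10)],
    "web server": [("software attack", 0.50, 0.25)],
}

# Step 5: risk exposure per asset and for the organization as a whole.
exposure = {
    asset: sum(likelihood * vulnerability * assets[asset]
               for _, likelihood, vulnerability in threats[asset])
    for asset in assets
}
total_exposure = sum(exposure.values())

for asset, value in exposure.items():
    print(f"{asset}: expected annual exposure ${value:,.0f}")
print(f"Organizational exposure: ${total_exposure:,.0f}")

# Step 6 would compare this exposure against the cost of safeguards or insurance premiums.

Running the sketch with these hypothetical figures gives an exposure of $55,000 for the customer database, $18,750 for the web server, and $73,750 overall, which an organization could then weigh against mitigation or insurance costs in Step 6.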
Despite this high level of agreement, many participants noted that the six-step process did not contain a step to add new threats and to reprioritize threats that were no longer important. Several other comments were made asking the researchers to consider the iterative ISRA process and how changes to the InfoSec policy were made as a result of the ISRA process. See Figure 3 for the proposed Information Security Risk Analysis methodology as part of a broad security risk management framework. Figure 3. Improved ISRA Process 34 35 To develop the lists in the questionnaire, several sources were used. Beginning with a seminal work in ISRA (Rainer et al., 1991) and ending with recent books on the subject (Whitman & Mattord, 2003; Whitman & Mattord, 2004), a fairly extensive list of methodologies was developed. The expert panel considered this a thorough list of methodologies used in the ISRA process and was interested to know how many were in use. As shown in Table 11, in the information security risk assessment/audit category, assessment of the routers and servers, anti-virus software, and the use of firewalls were the most popular methodologies. The most popular methodologies to measure loss exposure were the Delphi technique/brainstorming, contractor assessments, single loss expectancy (SLE), questionnaires, and surveys. Another interesting finding was that many organizations relied on a variety of both qualitative and quantitative methodologies, as encouraged by Rainer et al. (1991). In the Other category for both questions, a few respondents listed proprietary technologies and software not specifically listed in the questionnaire. However, upon further investigation, all of the answers given in the Other category could be classified in the categories listed on the survey. 36 Table 11. ISRA and Loss Exposure Methodologies Select all information security risk assessment/audit methodologies used at your organization. Yes No Anti-virus software analysis 90.63% 9.38% Password cracking and improvement 84.38% 15.63% Firewall implementation and correction of configuration errors 93.75% 6.25% Vulnerability testing/correction 87.50% 12.50% War dialing (scanning for unauthorized modems and fax machines) 59.38% 40.63% Identification of critical infrastructure components 87.50% 12.50% Physical security review 84.38% 15.63% Centralized information storage location review 81.25% 18.75% Access control evaluation 84.38% 15.63% Certification identification 62.50% 37.50% Integration of the firewall, VPN and e-commerce 65.63% 34.38% Assessment of the routers and servers 93.75% 6.25% Cryptography review 62.50% 37.50% Computer Security Policy review and documentation 81.25% 18.75% Other 25.00% 75.00% Choose all the methodologies your organization uses to measure the possible loss exposure of information assets. Yes No Consultant/Contractor Assessments 78.13% 21.88% Annualized Loss Expectancy (ALE) 56.25% 43.75% Courtney's ALE Method 21.88% 78.13% Cost-Benefit Analysis (CBA) 56.25% 43.75% Annualized Rate of Occurrence (ARO) 37.50% 62.50% Single Loss Expectancy (SLE) 75.00% 25.00% Livermore Risk Analysis Methodology (LRAM) 21.88% 78.13% Stochastic Dominance/Daily Loss Formula 21.88% 78.13% Scenario Analysis 65.63% 34.38% Delphi technique/brainstorming 81.25% 18.75% OCTAVE method 25.00% 75.00% Fuzzy Metrics 21.88% 78.13% Questionnaires 75.00% 25.00% Surveys 75.00% 25.00% Other 6.25% 93.75% 37
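Several of the loss-exposure measures listed in Table 11 reduce to a small amount of arithmetic. The sketch below shows the textbook single loss expectancy (SLE), annualized rate of occurrence (ARO), and annualized loss expectancy (ALE) relationships, and one common way a return-style figure for a security expenditure can be derived from them; all dollar amounts and rates are hypothetical, and no surveyed organization disclosed computing ROI in this way.

# Textbook loss-expectancy relationships (hypothetical figures).
asset_value = 250_000            # value of the information asset (AV)
exposure_factor = 0.40           # fraction of the asset lost in a single incident (EF)
aro_before = 0.5                 # expected incidents per year without the new control (ARO)

sle = asset_value * exposure_factor   # Single Loss Expectancy = AV x EF
ale_before = sle * aro_before         # Annualized Loss Expectancy = SLE x ARO

# Suppose a countermeasure costing $20,000 per year is expected to cut the ARO to 0.1.
countermeasure_cost = 20_000
ale_after = sle * 0.1

# One simple return-style figure: annual loss avoided, net of cost, relative to cost.
rosi = (ale_before - ale_after - countermeasure_cost) / countermeasure_cost
print(f"SLE = ${sle:,.0f}; ALE before = ${ale_before:,.0f}; ALE after = ${ale_after:,.0f}")
print(f"Return on the security investment = {rosi:.0%}")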
Step 4. Model Development Throughout the first phase of this research project, a theme that emerged many times was the success (i.e. effectiveness) of the ISRA process. The literature contains no means of measuring the effectiveness of this complicated process because very few studies measuring the effectiveness of any aspect of information security exist. One study attempted to measure user perceptions of concern for security as a measure of IS security effectiveness (Straub & Goodhue, 1991). Another developed a perceived measure of security effectiveness using responses about overall security deterrence and prevention, as well as the protection level of computer hardware, software, data, and services (Kankanhalli, Hock-Hai, Bernard, & Kwok-Kee, 2003). Another attempted to create a mediation model of information security effectiveness (Knapp, 2005). In this study, the perceived ISRA effectiveness variable is based on the subjective judgment of security professionals and is adapted directly from the 5-item scale of Information Security Effectiveness (Kankanhalli et al., 2003; Knapp, 2005; Knapp, 2006). Using self-reported, subjective measures has been frequently debated (Podsakoff & Organ, 1986; Straub, Boudreau, & Gefen, 2004). Despite the debate, self-reported, subjective measures can be an appropriate research tool for exploratory studies (Spector, 1994). Frequency. Organizations that are successful at any initiative require practice to achieve success at that initiative, and that knowledge must be captured, organized, and disseminated repeatedly due to the ever-changing business environment (Davenport & 38 Prusak, 1998). A successful system for evaluating the threats to information assets at an organization occurs as an iterative process where the organization improves the quality of its security policies and procedures over time (Gordon & Loeb, 2006; Knapp, Marshall, Rainer, & Ford, 2006). Hypothesis 1: The frequency of the information security risk analysis process will be positively related to their perceptions of information security risk analysis effectiveness. Number of Methodologies. Rainer, Snyder, and Carr (1991) warned organizations against using only one methodology to conduct the ISRA at their organization. A combination of different qualitative and quantitative methodologies will be the most effective strategy to manage the IT risks to organizations (Rainer, Snyder, & Carr, 1991). An economically based, formal process for evaluating the threats to information assets at an organization is not achieved without a combination of methodologies (Gordon & Loeb, 2006). Hypothesis 2: The number of methodologies used in the information security risk analysis process will be positively related to their perceptions of information security risk analysis effectiveness. Insurance for InfoSec. Organizations have long wanted to protect their information because the potential for substantial economic loss exists through the theft of proprietary information, natural disasters, and other potential attacks. Implementing expensive InfoSec countermeasures does not guarantee full protection. A new solution to this problem is cyber-risk insurance policies. These policies provide financial protection in the event of an information security breach, and organizations that are mature in their 39 ISRA process will lead their respective industries in this practice (Gordon, Loeb, & Sohail, 2003). Hypothesis 3: The purchase of insurance to protect the organization's information assets will be positively related to their perceptions of information security risk analysis effectiveness. ROI for InfoSec. Resources to invest in information systems
are scarce in every organization, and when organizations allocate capital for any IT expenditure, stakeholders need to be assured that the outlay will be a wise use of funds. Even now, many organizations have broken or non-existent ROI processes for information security expenditures (May, 1997). Organizations that have implemented metrics for their InfoSec expenditures gain the accountability offered by being able to measure where their security dollars can be invested with the most benefit (Cavusoglu et al., 2004). Hypothesis 4: The calculation of Return on Investment (ROI) for the organization's information security investments will be positively related to their perceptions of information security risk analysis effectiveness. Threat significance. Straub and Welke (1998) warned organizations to stop ignoring the threats to their information assets. By not perceiving these threats as significant, organizations will have information systems that are far less secure than they could be. Without understanding the threats arrayed against the organization, it is more likely that breaches will occur often and be costly when they do occur (Straub & Welke, 1998). Whitman (2003) took this one step further to encourage organizations to rank these threats as to their significance. Profiling the threats and knowing the threats 40 are the first steps in implementing countermeasures to fight the threat, whether that threat is an individual, group, or force of nature (Whitman, 2003). Hypothesis 5: Information security professionals' perceptions of the significance of the threats against the organization's information systems will be positively related to their perceptions of information security risk analysis effectiveness. Top Management Support. Another recurring topic of discussion in the qualitative phase of this project was the importance of having a management team that supported the ISRA process. Top management support is the degree to which management believes in and allocates resources to the IS function (Ragu-Nathan, Apigian, Ragu-Nathan, & Tu, 2004). In the IS literature, the construct of top management support has been identified as the most frequently hypothesized variable contributing to IS implementation success (Markus, 1981; Sharma & Yetton, 2003). Top management (i.e. executives) significantly influences resource allocation and acts as a champion of change to create a productive environment for successful IS implementation (Thong, Yap, & Raman, 1997). For four decades, top management support has been recognized as critical for effective computer security management (Allen, 1968; Wasserman, 1969; Parker, 1981). Dutta & McCrohan (2002) stated that effective organizational computer security does not start with firewalls or anti-virus software, but with top management support. Hypothesis 6: Information security professionals' perceptions of top management support for the information security risk analysis will be positively related to their perceptions of information security risk analysis effectiveness. Security Culture. Culture can be defined as a set of beliefs, values, understandings, and norms shared by members of an organization (Daft & Marcic, 2001). 41 Culture has been an important topic in the practitioner literature (Santarelli, 2005) and has been identified as an opportunity for future IS research in security (Kankanhalli et al., 2003).
The culture construct has been explored for its role regarding the implementation of new behaviors and organizational improvement initiatives (Detert, Schroeder, & Mauriel, 2000). In the IS literature, organizational culture has been examined as an opposing force resisting new technologies and transformations (Robey & Boudreau, 1999) and as a factor impacting organizational security (von Solms & von Solms, 2004). Hypothesis 7: Information security professionals' perceptions of the organization's security culture will be positively related to their perceptions of information security risk analysis effectiveness. 42 Table 12. Summary of Proposed Hypotheses Hypotheses 1. The frequency of the information security risk analysis process will be positively related to their perceptions of information security risk analysis effectiveness. 2. The number of methodologies used in the information security risk analysis process will be positively related to their perceptions of information security risk analysis effectiveness. 3. The purchase of insurance to protect the organization's information assets will be positively related to their perceptions of information security risk analysis effectiveness. 4. The calculation of Return on Investment (ROI) for the organization's information security investments will be positively related to their perceptions of information security risk analysis effectiveness. 5. Information security professionals' perceptions of the significance of the threats against the organization's information systems will be positively related to their perceptions of information security risk analysis effectiveness. 6. Information security professionals' perceptions of top management support for the information security risk analysis will be positively related to their perceptions of information security risk analysis effectiveness. 7. Information security professionals' perceptions of the organization's security culture will be positively related to their perceptions of information security risk analysis effectiveness. In this study, a model was proposed and tested. The model consolidates the existing research on the information security risk analysis process and tests the relationships of several components to ISRA effectiveness. The model predicts that ISRA effectiveness is positively related to the frequency of the process, the number of methodologies used, the purchase of insurance to protect information assets, the calculation of ROI for security expenditures, the significance of threats in the environment, the support of top management, and the culture of security at the organization. Figure 4 provides a depiction of the hypothesized model. Figure 4. ISRA Effectiveness Model 43 44 Step 5. Phase Two Data Collection Many concerns have been raised about using online web surveys in academic research. These concerns include constructing internet surveys, receiving incomplete or multiple responses, and managing confidentiality concerns (Simsek & Veiga, 2001; Stanton & Rogelberg, 2001). The researcher used a popular online survey firm (SurveyMonkey) to develop the second phase online web survey. This software was designed with mechanisms in place to accept only fully completed surveys, eliminating the need to discard incomplete surveys. The software only allowed participants to register for the survey one time, avoiding the danger of a single participant filling out multiple surveys. This list of email addresses was stored on a separate server to maintain the participants'
confidentiality. Once the surveys were completed, the researcher received the data in formatted files that were easily loaded into SPSS. The survey responses did not need to be re-keyed under the internet web survey procedures outlined above, and the data validation and collection procedures were much less labor intensive than when using traditional paper surveys. Participants for the second phase of this study were recruited from lists obtained from a lead generation company that collects names of participants for marketing research. This company, Majon International, possessed lists of willing survey participants, including information security professionals. As part of the service provided to the researcher, the company sent emails (see Appendix D), using their email system, to each potential participant in the second phase of the data collection. An email was sent to each information security professional asking for their participation in a study being conducted by a researcher at Auburn University. The email explained the purposes of the study and assured possible participants that any information they provided would be strictly anonymous and would only be used for research purposes. The email also explained that participation in the study required only that participants fill out 45 a short survey, which would take between 15 and 20 minutes. Finally, the email directed those who desired to participate to click the survey link and begin. Once a participant clicked the link, he or she was routed to the web survey and could begin entering information. After filling out the survey and clicking submit, participants were routed to a third page which thanked them for their helpful participation and reminded them that the information provided was strictly anonymous and would only be used for research purposes. Those who did not respond to the first request for participation were contacted again with a second similar email. This second communication was sent approximately one week after the original communication and was the last time that non-responders were contacted. Finally, once a sufficient number of responses were received, the survey was taken off the web and the data were analyzed. Specifically, 1,000 individuals were contacted about participation in the study. Of the 1,000 individuals contacted, 144 completed the web survey for a response rate of 14.4%. A copy of the web survey is included in Appendix E. Step 6. Phase Two Data Analysis The phase two participants were similar to the respondents in the first phase (Table 3). Table 13 illustrates the diversity of this sample with respect to number of employees, type of industry, job position, IT experience, and InfoSec experience. The sample had participants who worked at a mix of small, medium, and large organizations. The respondents worked in a variety of industries in both the public and private sector. The professionals also worked in a variety of roles in the organization, from rank-and-file 46 workers represented by the Other IT/Technical/Scientific/Professional category through all levels of management from department head up to the owner and executive level of the organization. These professionals had a variety of IT and InfoSec experience, with the vast majority being mid-level professionals with between six and fifteen years of experience. 47 Table 13.
Sample Characteristics of Phase Two Participants Employees: More than 5,000 59% From 501 to 5,000 22% 500 or less 19% Industry: Largest represented include: Government-federal, military, local, etc. 22% Finance, Banking, & Insurance 13% Consumer Products/Retail/Wholesale 9% Medical/Healthcare-public or private 7% Manufacturing 6% Utilities 6% Information Technology/Security/Telecom 6% Education/Training 5% Non-Profit 4% Professional Services-Legal, Marketing, etc. 4% Transportation/Warehousing 4% Travel/Hospitality 4% Job Position: Consultant/Contractor 15% Department Manager/Supervisor/Director 10% MIS/IS/IT/Technical management 38% Other IT/Technical/Scientific/Professional 21% Other Managerial 3% Owner/Partner 5% Senior Manager/Executive 9% IT Experience: 5 years or less 3% Between 6 and 10 33% Between 11 and 15 33% Between 16 and 20 26% More than 20 6% InfoSec Experience: 5 years or less 39% Between 6 and 10 49% Between 11 and 15 10% Between 16 and 20 2% More than 20 0% 48 In addition to demographic data, information was collected on the organization's ISRA process. Questions related to the frequency of the process, the methodologies used in the process, the use of insurance, the calculation of ROI for security expenditures, the significance of the threats, the support of top management, the security culture, and the effectiveness of the ISRA process were included. Table 14 contains each study variable and its definition. The means, standard deviations, intercorrelations, and coefficient alphas, when applicable, of all study variables are presented in Table 15. Table 14. Proposed Model Variables and Definitions Study Variable Definition 1 Frequency Dummy variable coded as 1 if the organization conducts the ISRA process continuously, weekly, or monthly and 0 if the process is completed less frequently 2 Methodologies Ordinal variable coded as 1 if the organization uses 6 or fewer methodologies, 2 for 7 to 12, 3 for 13 to 18, 4 for 19 to 24, and 5 for 25 or more 3 Insurance Dummy variable coded as 1 if the organization purchases insurance to protect its information assets and 0 if the organization does not 4 ROI Dummy variable coded as 1 if the organization calculates return on investment for security investments to protect its information assets and 0 if the organization does not 5 Threat Significance Average of the participant's answers rating the significance of 12 information security threats 6 Top Management Support Average of the participant's answers to the 3-item Top Management Support scale 7 Security Culture Average of the participant's answers to the 5-item Security Culture scale 8 ISRA Effectiveness Average of the participant's answers to the 5-item ISRA Effectiveness scale 49 Table 15. Means, Standard Deviations, Intercorrelations and Coefficient Alphas for Study Variables Mean SD 1 2 3 4 1 ISRA Effectiveness 3.994 1.050 1.000 2 Frequency 0.285 0.453 0.581 1.000 3 Methodologies 2.965 1.300 0.777 0.787 1.000 4 Insurance 0.486 0.502 0.651 0.556 0.754 1.000 5 ROI 0.440 0.496 0.683 0.736 0.802 0.797 6 Threats 4.243 0.579 0.741 0.703 0.756 0.687 7 Top Mngt. Support 3.963 1.093 0.964 0.601 0.777 0.667 8 Security Culture 3.960 1.027 0.950 0.596 0.778 0.673 Table 15 (continued). Means, Standard Deviations, Intercorrelations and Coefficient Alphas for Study Variables 5 6 7 8 1 ISRA Effectiveness 2 Frequency 3 Methodologies 4 Insurance 5 ROI 1.000 6 Threats 0.760 1.000 7 Top Mngt.
Support 0.700 0.707 1.000 8 Security Culture 0.737 0.712 0.942 1.000 50 Common Method Bias Common Method Bias (CMB) occurs when the predictor and criterion variables are obtained from the same source, measured in the same context, and the source of the method bias cannot be identified (Podsakoff, MacKenzie, Lee, & Podsakoff, 2003). Podsakoff et al. (2003) stated that this bias is inherent in all survey research and provided a summary of sources and methods for dealing with common method problems. According to their work, the researcher should use all procedural remedies in survey design, separate the predictor and criterion variables psychologically, and guarantee response anonymity (Podsakoff et al., 2003). This study attempted to minimize the effects of CMB by carefully reviewing items to check for clarity of meaning, using scales with fewer items, removing headings in the survey instrument to remove potential priming effects, randomizing items to combat the social desirability effect, and promising all respondents anonymity to encourage candid responses (Podsakoff et al., 2003). The proposed regression model and the results of the regression analysis are discussed in Chapter IV. 51 CHAPTER IV RESULTS Model Estimation Taken together, the constructs and variables discussed in Chapter III allow the development of the following model. Y = β0 + β1(Freq) + β2(Meth) + β3(Ins) + β4(ROI) + β5(Threat) + β6(TMS) + β7(SC) Where: Y = Dependent variable, ISRA Effectiveness Freq = Frequency of ISRA Process Meth = Number of Methodologies Ins = Purchase Insurance ROI = Calculate ROI Threat = Threat Significance TMS = Top Management Support SC = Security Culture
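The model above is an ordinary least squares specification. As a rough sketch of how the variables defined in Table 14 might be constructed from raw questionnaire responses and the model estimated, the Python code below uses pandas and statsmodels; the file name and column names are hypothetical placeholders rather than the instrument's actual field names.

# Sketch only: the input file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("isra_phase2_responses.csv")

# Frequency: 1 if the ISRA is conducted continuously (i.e. weekly or monthly), else 0.
df["Freq"] = (df["isra_frequency"] == "Continuously (i.e. weekly or monthly)").astype(int)

# Methodologies: count the methods checked, then bin into the five ordered categories of Table 14.
method_count = df.filter(like="method_").sum(axis=1)
df["Meth"] = pd.cut(method_count, bins=[0, 6, 12, 18, 24, 28],
                    labels=[1, 2, 3, 4, 5], include_lowest=True).astype(int)

# Insurance and ROI: simple yes/no indicators.
df["Ins"] = (df["purchases_insurance"] == "Yes").astype(int)
df["ROI"] = (df["calculates_roi"] == "Yes").astype(int)

# Threat significance and the three multi-item scales: item averages.
df["Threat"] = df.filter(like="threat_").mean(axis=1)
df["TMS"] = df.filter(like="tms_").mean(axis=1)
df["SC"] = df.filter(like="culture_").mean(axis=1)
df["Effect"] = df.filter(like="effectiveness_").mean(axis=1)

model = smf.ols("Effect ~ Freq + Meth + Ins + ROI + Threat + TMS + SC", data=df).fit()
print(model.summary())   # coefficients, standard errors, p-values, and R-squared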
Results of Hypothesis Tests Hypothesis 1 predicted a positive relationship between the frequency of the organization's ISRA process and the perceived effectiveness of the ISRA process. While the reported p-value is significant at .045, the results demonstrate a negative coefficient of -.150. Therefore, due to an inverse relationship, Hypothesis 1 is not supported. 52 Hypothesis 2 predicted a positive relationship between the number of methodologies used in the information security risk analysis process and the perceived effectiveness of the ISRA process. The reported coefficient of .066 is positive and the reported p-value of .047 is significant at alpha level .05. Hypothesis 2 is supported. Hypothesis 3 predicted a positive relationship between the purchase of insurance to protect an organization's information assets and the perceived effectiveness of the ISRA process. The reported p-value is not significant at .338, and the results demonstrate a negative coefficient of -.065. Therefore, Hypothesis 3 is not supported. Hypothesis 4 predicted a positive relationship between the calculation of ROI for an organization's information security expenditures and the perceived effectiveness of the ISRA process. While the reported p-value is significant at .014, the results demonstrate a negative coefficient of -.204. Therefore, due to an inverse relationship, Hypothesis 4 is not supported. Hypothesis 5 predicted a positive relationship between information security professionals' perceptions of the significance of the threats against the organization's information systems and the perceived effectiveness of the ISRA process. The reported coefficient of .270 is positive and the reported p-value of .000 is significant at alpha level .05. Hypothesis 5 is supported. Hypothesis 6 predicted a positive relationship between information security professionals' perceptions of Top Management Support for the ISRA process and the perceived effectiveness of the ISRA process. The reported coefficient of .527 is positive 53 and the reported p-value of .000 is significant at alpha level .05. Hypothesis 6 is supported. The last test, for Hypothesis 7, predicted a positive relationship between information security professionals' perceptions of the security culture of the organization and the perceived effectiveness of the ISRA process. The results in this case support the hypothesis with a positive coefficient of .362 and a p-value of .000. Hypothesis 7 is supported. Table 16 provides a summary of the complete model results. Table 16. Table of Model Results Variable Coefficient Std Error P-Value Supported Frequency of ISRA Process -0.150 0.074 0.045* No Number of Methodologies 0.066 0.033 0.047* Yes Purchase Insurance -0.065 0.068 0.338 No Calculate ROI -0.204 0.082 0.014* No Threat Significance 0.270 0.056 0.000* Yes Top Management Support 0.527 0.052 0.000* Yes Security Culture 0.362 0.057 0.000* Yes Note: * p < .05 The next chapter includes a discussion of the findings, major contributions, limitations of the study, and implications for research and practice. This discussion is followed by a conclusion to the study. 54 CHAPTER V DISCUSSION & CONCLUSION The regression model constructed and tested in this study explains a large portion of the variance associated with security professionals' perceptions of the effectiveness of the ISRA process at their organizations, reporting an R² of .937. Four of the variables measured were found to be significant in the hypothesized direction: (a) Number of Methodologies, (b) Threat Significance, (c) Top Management Support, and (d) Security Culture. However, Hypothesis 3, dealing with the purchase of insurance, was not supported due to a nonsignificant p-value. Also, Hypotheses 1 and 4, dealing with the frequency of the ISRA process and the calculation of ROI respectively, were not supported due to directional inconsistencies. Four hypotheses were supported with positive coefficients. The number of methodologies used in the ISRA process, the threat significance, top management support, and security culture all had a positive effect on perceived ISRA effectiveness. Organizations that use more methodologies likely have a more developed ISRA process. By comparison, a firm that is beginning its initial ISRA may be using only one or two methodologies. The veteran ISRA organizations that understand the severity and complexity of the threats would also be expected to work harder to conduct a very thorough analysis. Alternatively, a novice organization would only be in the beginning stages of 55 learning about all possible threats and vulnerabilities. Top management support is crucial in this endeavor because this process uses organizational resources to do a professional job, and organizations where management withholds its support will not be able to complete all necessary ISRA activities due to budgetary concerns. An organization's security culture would also be critical to develop a successful ISRA process because all stakeholders would be vigilant for issues that could bring harm to an organization's information systems. The frequency of the ISRA process and the calculation of ROI for security reported significant, yet negative, coefficients. This is not what was hypothesized.
This study will not attempt to demonstrate the cause of these negative relationships. However, a possible explanation for the negative relationship with frequency may be that information security professionals perceive that the effectiveness of the ISRA process does not necessarily dictate that organizations should conduct this process more frequently. An organization may achieve a high return on its security investment by conducting a thorough annual ISRA as opposed to a half-hearted monthly or quarterly affair. The survey participants may be more impressed with the quality of the process regardless of its frequency. Additionally, these professionals may be more focused on the security of the organization's information assets than on financial measures like ROI. With experience, these professionals have likely seen expensive investments in information security countermeasures that yielded very little improvement in the organization's overall information security. These professionals have also likely seen great improvements in 56 the organization's information security with very little investment. The relationships between ISRA effectiveness, ISRA frequency, and information security ROI will need to be explored in future research projects. Contributions of the Study This study makes several contributions to the limited information security risk analysis body of knowledge. The ISRA process was investigated across a variety of industries. This investigation provided insight into the ISRA process by using qualitative and quantitative data collection methods. A model of the ISRA process and a list of risk factors for the ISRA process were each developed and agreed upon by the professionals themselves. This study gained insight into the frequency of and participants in the ISRA process conducted across both the department and the organization. A model for the broader framework of information risk management was introduced. In the context of the ISRA process, this study added to the knowledge of both managers and security professionals. Ma and Pearson (2005) stated that it was necessary to explore these interrelationships between management practices and security objectives. Future studies need to continue the exploration of this research stream to ensure that organizations have the most efficient and effective ISRA process possible. Limitations of the Study This study has several limitations. First, this study only questioned security professionals who had obtained the CISSP designation. By focusing only on these security professionals, this study may have overlooked the many competent information security professionals who do not have this certification. Many organizations may be conducting a competent ISRA process without a single CISSP on staff. Certain 57 industries may not even require this certification, and some organizations may even develop their own training for conducting this analysis. Second, the scope of the organizations involved in this study was broad in terms of industry sector (i.e. education, government, and business). Future studies may need to focus on a specific sector due to the likelihood that different industry sectors focus on different risk factors when determining their risk exposure. Finally, the sample size was not large enough to conduct a more thorough analysis of the quantitative data.
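To put the sample-size limitation in rough quantitative terms, the short sketch below computes the statistical power of the overall F test for a seven-predictor regression at the study's N of 144, and the smallest N that reaches .80 power, using Cohen's conventional medium effect size; the effect-size benchmark is an assumption of the example, not a quantity estimated from this study's data.

# Rough power check for a regression with 7 predictors (hypothetical effect size).
from scipy import stats

def regression_power(n, f_squared, u=7, alpha=0.05):
    # Power of the overall F test for a multiple regression with u predictors.
    v = n - u - 1                            # denominator degrees of freedom
    ncp = f_squared * (u + v + 1)            # noncentrality parameter, Cohen's f^2 * (u + v + 1)
    f_crit = stats.f.ppf(1 - alpha, u, v)    # critical F under the null hypothesis
    return stats.ncf.sf(f_crit, u, v, ncp)   # probability that F exceeds the critical value

print(round(regression_power(144, 0.15), 3))   # power at N = 144 for a "medium" effect (f^2 = 0.15)

n = 10
while regression_power(n, 0.15) < 0.80:        # smallest N reaching .80 power for the same effect
    n += 1
print(n)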
Further investigation is required to develop techniques for collecting data from information security professionals in sufficient quantity to permit more thorough quantitative analyses. Implications for Research & Practice In their 2004 study, Kotulic and Clark proposed a conceptual model based on the study of risk management at the firm level. Although considerable time and effort were expended in attempting to validate the usefulness of their proposed model, this effort was not successful. Kotulic and Clark (2004) provided a description of the problems faced while attempting to collect data from information security professionals. Research regarding an organization's information security practices is very intrusive. Information security professionals are, by nature, distrustful of anyone attempting to collect information about how they do their jobs. Kotulic and Clark (2004) sent out a mass mailing of 1,540 unsolicited survey packages, and despite many efforts to solicit a response, received nine complete responses, giving them a response rate of .61%. The authors went on to state that it is nearly impossible to collect information security data from an organization without a major supporter. This research project faced similar 58 obstacles, but this non-response issue was remedied by targeting information security professionals who had opted to receive questionnaires from researchers. Using this strategy, this research project did achieve a favorable response rate. Until researchers find creative ways to reach these nervous participants, who do not feel safe disclosing security information about their respective organizations, the growth of the information security body of knowledge is going to be hampered by failed research projects. Managers who are serious about protecting their organization's information assets need to ensure that a thorough organizational information security risk analysis is being conducted at their organization. Without top management support, information security professionals cannot develop and maintain processes that identify new threats, protect the organization's assets from existing threats, and develop dynamic and thorough security policies that build an organizational culture with security as one of its core values. Considering the dangers and costs associated with security incidents, it is critical today for organizations to take this process seriously in order to secure their valuable information assets. Conclusion of the Study This research effort has made a significant contribution to the information security risk analysis body of knowledge, but much work remains. Judging by the high volume of threats to information security assets, the value of a competent ISRA process will continue to grow across a variety of industries for the foreseeable future. Thus, practitioners and researchers should continuously seek to work together to understand the dynamics of the ISRA process and improve the methods for its execution. 59 REFERENCES Allen, B. (1968). Danger Ahead! Safeguard Your Computer. Harvard Business Review, 46(6), 97-101. Backhouse, J. and Dhillon, G. (1996). Structures of responsibility and security of information systems. European Journal of Information Systems, 5(1), 2-9. Badenhorst, K.P. and Eloff, J.H.P. (1989). Framework of a methodology for the life cycle of computer security in an organization. Computers & Security, 8, 433-442. Baker, W.H., Rees, L.P., and Tippet, P.S. (2007).
Necessary Measures: Metric-driven information security risk assessment and decision making. Communications of the ACM, 50(10), 101-106. Baskerville, R. (1991). Risk analysis: An interpretive feasibility tool in justifying information systems security. European Journal of Information Systems, 1(2), 121-130. Bodin, L.D., Gordon, L.A., and Loeb, M.P. (2005). Evaluating Information Security Investments Using the Analytic Hierarchy Process. Communications of the ACM, 48(2), 79-83. Brealey, R.A., Myers, S.C., and Allen, F. (2005). Principles of Corporate Finance (8th ed.). Boston, MA: McGraw-Hill Irwin. 60 Bryman, A. and Bell, E. (2003). Business Research Methods. New York, NY: Oxford University Press. Campbell, K., Gordon, L. A., Loeb, M. P., and Zhou, L. (2003). The Economic Cost of Publicly Announced Information Security Breaches: Empirical evidence from the stock market. Journal of Computer Security, 11, 431-448. Carr, N.G. (2003). IT Doesn't Matter. Harvard Business Review, 81(5), 41-51. Cavusoglu, H., Mishra, B., and Raghunathan, S. (2004). A Model for Evaluating IT Security Investments. Communications of the ACM, 47(7), 87-92. Commission of the European Communities. (1994). Europe's Way to the Information Society: An action plan. Retrieved May 2007, from http://aei.pitt.edu/947/01/info_socieity_action_plan_COM_94_347.pdf Conry-Murray, A. (2003). Justifying Security Spending. Network Magazine, 18(3), 44. Daft, R. L., & Marcic, D. (2001). Understanding Management (3rd ed.). New York: Harcourt College Publishers. Davenport, T.H. and Prusak, L. (1998). Working Knowledge: How organizations manage what they know. Cambridge, MA: Harvard Business School Press. Detert, J. R., Schroeder, R. G., & Mauriel, J. J. (2000). A Framework for Linking Culture and Improvement in Organizations. Academy of Management Review, 25(4), 850-863. Dutta, A. and McCrohan, K. (2002). Management's Role in Information Security in a Cyber Economy. California Management Review, 45(1), 67-87. 61 Gordon, L.A., and Loeb, M.P. (2002a). The Economics of Information Security Investment. ACM Transactions on Information and System Security, 5(4), 438-457. Gordon, L.A., and Loeb, M.P. (2002b). Return on Information Security Investments: Myth vs. Reality. Strategic Finance, 26-31. Gordon, L.A., Loeb, M.P., and Sohail, T. (2003). A Framework for Using Insurance for Cyber-Risk Management. Communications of the ACM, 46(3), 81-85. Gordon, L.A., Loeb, M.P., Lucyshyn, W., and Richardson, R. (2004). The 9th Annual Computer Crime and Security Survey. San Francisco, CA: Computer Security Institute. Gordon, L.A. and Loeb, M.P. (2006). Budgeting Process for Information Security Expenditures. Communications of the ACM, 49(1), 121-125. Henderson, J.C., and Venkatraman, N. (1993). Strategic Alignment: Leveraging Information Technology for Transforming Organizations. IBM Systems Journal, 32(1), 4-16. Hinde, S. (1998). Recent Security Surveys. Computers & Security, 17, 207-210. Hitchings, J. (1995). Achieving an Integrated Design: The way forward for information security. In Eloff, J. and von Solms, S. (Eds.), Information Security - the Next Decade. London: Chapman & Hall. Holbein, R., Teufel, S., and Bauknecht, K. (1996). The use of business process models for security design in organizations. In S. Katsikas and D. Gritzalis (Eds.), Information Systems Security: Facing the Information Society of the 21st Century. London: Chapman & Hall. 62 Huber, G.P. (1990). A Theory of the
Effects of Advanced Information Technologies on Organizational Design, Intelligence, and Decision Making. Academy of Management Review, 15(1), 47-71. International Information Systems Security Certification Consortium, Inc. (2007). Frequently Asked Questions. Retrieved December 10, 2007, from https://www.isc2.org/cgi-bin/content.cgi?category=84. International Standards Organization. (2006). Information technology - Guidelines for the management of IT security - Part 5: Management guidance on network security. Retrieved May 2007, from http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=31142 Kankanhalli, A., Hock-Hai, T., Bernard, C. Y. T., & Kwok-Kee, W. (2003). An Integrative Study of Information Systems Security Effectiveness. International Journal of Information Management, 23(2), 139-154. Knapp, K., Morris, F., Rainer, R.K., Jr., and Byrd, T.A. (2003). Defense Mechanisms of Biological Cells: A framework for network security thinking. Communications of the Association for Information Systems, 12, 701-719. Knapp, K. J. (2005). A Model of Managerial Effectiveness in Information Security: From grounded theory to empirical test. Dissertation Abstracts International. (UMI No. 3201451). 63 Knapp, K., Morris, F., Rainer, R.K., Jr., and Ford, F.N. (2006). Information Security: Management's effect on culture and policy. Information Management & Computer Security, 14(1), 24-36. Kotulic, A.G. and Clark, J.C. (2004). Why there aren't more information security research studies. Information & Management, 41, 597-607. Lindup, K. (1996). The role of information security in corporate governance. Computers & Security, 15, 477-485. Ma, Q. and Pearson, M. J. (2005). ISO 17799: "Best Practices" in Information Security Management. Communications of the Association for Information Systems, 15, 577-591. May, T.A. (1997). The Death of ROI: Rethinking IT value measurement. Information Management & Computer Security, 5(3), 90-92. Mitnick, K.D. and Simon, W.L. (2002). The Art of Deception: Controlling the Human Element of Security. Indianapolis, IN: Wiley Publications. Moore, M. M. (2003). Employee Security Education: Pillars of your community. Retrieved April, 2007, from http://www.csoonline.com/read/010903/pillars.html OHS. (2002). National Strategy for Homeland Security. Office of Homeland Security. Parker, D. B. (1981). Computer Security Management. Reston, Virginia: Reston Publishing Company. Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y., & Podsakoff, N. P. (2003). Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies. Journal of Applied Psychology, 88(5), 879-903. 64 Podsakoff, P. M., & Organ, D. W. (1986). Self-Reports in Organizational Research: Problems and Prospects. Journal of Management, 12(4), 531-544. Power, R. (2002). CSI/FBI Computer Crime and Security Survey. Computer Security Issues & Trends, 8(1), 1-24. Ragu-Nathan, B. S., Apigian, C. H., Ragu-Nathan, T. S., & Tu, Q. (2004). A Path Analytic Study of the Effect of Top Management Support for Information Systems Performance. Omega, 32, 459-471. Rainer, R. K., Jr., Snyder, C. S., and Carr, H. H. (1991). Risk Analysis for Information Technology. Journal of Management Information Systems, 8(1), 129-147. Rainer, R. K., Jr., Turban, E., and Potter, R.E. (2007). Introduction to Information Systems: Supporting and Transforming Business. Hoboken, NJ: John Wiley & Sons, Inc. Richardson, R. (2007). The 12th Annual Computer Crime and Security Survey.
San Francisco, CA: Computer Security Institute. Robey, D., & Boudreau, M.-C. (1999). Accounting for the Contradictory Organizational Consequences of Information Technology: Theoretical Directions and Methodological Implications. Information Systems Research, 10(2), 167-185. Santarelli, S. (2005). Creating a Corporate Security Culture. Retrieved May, 2006, from http://searchsecurity.techtarget.com/tip/0,289483,sid14_gci1137072,00.html Scandura, T. A., and Williams, E. A. (2000). Research Methodology in Management: Current practices, trends, and implications for future research. Academy of Management Journal, 43(6), 1248-1264. 65 Searle, J. R. (1987). Speech Acts: An Essay in the Philosophy of Language. New York, NY: Cambridge University Press. Sharma, R., & Yetton, P. (2003). The Contingent Effects of Management Support and Task Interdependence on Successful Information Systems Implementation. MIS Quarterly, 27(4), 533-555. Simsek, Z., & Veiga, J.F. (2001). A primer on Internet organizational surveys. Organizational Research Methods, 4(3), 218-235. Smith, H. A., McKeen, J. D., and Staples, S. D. (2001). Risk Management in Information Systems: Problems and potential. Communications of the Association for Information Systems, 7, 1-28. Spector, P. E. (1994). Using Self-Report Questionnaires in OB Research: A Comment on the Use of a Controversial Method. Journal of Organizational Behavior, 15, 385-392. Srinivasan, R., Lilien, G. L., and Rangaswamy, A. (2002). Technological Opportunism and Radical Technology Adoption: An application to e-Business. Journal of Marketing, 66(3), 47-63. Stanton, J.M., & Rogelberg, S.G. (2001). Using Internet/Intranet Web pages to collect organizational research data. Organizational Research Methods, 4(3), 200-217. Stoneburner, G., Goguen, A., and Feringa, A. (2002). Risk Management Guide for Information Technology Systems. National Institute of Standards and Technology. Straub, D.W. and Welke, R.J. (1998). Coping with Systems Risk: Security planning models for management decision making. MIS Quarterly, 22(4), 441-469. 66 Straub, D. W., Boudreau, M. C., & Gefen, D. (2004). Validating Guidelines for IS Positivist Research. Communications of the AIS, 13(24), 380-427. Suh, B. and Ingoo, H. (2003). The IS Risk Analysis Based on a Business Model. Information & Management, 41, 149-158. Thong, J. Y. L., Yap, C. S., & Raman, K. S. (1997). Environments of Information Systems Implementation in Small Businesses. Journal of Organizational Computing and Electronic Commerce, 7(4), 253-278. von Solms, R., & von Solms, B. (2004). From Policies to Culture. Computers & Security, 23, 275-279. Wasserman, J.J. (1969). Plugging the Leaks in Computer Security. Harvard Business Review, 47(5), 119-129. Weiss, R.S. (1994). Learning from Strangers: The art and method of qualitative interview studies. New York, NY: The Free Press. Whitman, M.E. (2003). Enemy at the Gate: Threats to information security. Communications of the ACM, 46(8), 91-95. Whitman, M.E. and Mattord, H.J. (2003). Principles of Information Security. Boston, MA: Course Technology. Whitman, M.E. and Mattord, H.J. (2004). Management of Information Security. Boston, MA: Course Technology. 67 APPENDICES 68 Appendix A Email Blast to the ISRA Survey Phase One Participants Auburn University - ISRA Survey Sent out September 18, 2007 Recently, you were kind enough to participate in a research study conducted by Auburn University's Dr. Ken Knapp and Dr. Thomas Marshall.
In that study, you indicated that you would be interested in participating in a survey on risk management. We are currently conducting a study to create a set of best practices for information security risk management. All responses will be completely anonymous and confidential. All results will be reported only in a summarized fashion. No participant information will be revealed. We will be happy to provide you with an executive summary of our results if you would let us know that you would like one. When you are ready to take the survey, just fill out the Excel spreadsheet attached to this email. Please, begin on the worksheet "Introduction". When you complete the survey, please email the completed spreadsheet to jourdsz@auburn.edu. It should take between 20 and 30 minutes to complete the survey. Thank you in advance for your participation. Zack Jourdan Ph.D. Candidate Auburn University 69 APPENDIX B The Information Security Risk Analysis Questionnaire - Phase One Survey on Information Security Risk Analysis 1. Introduction Thank you for your interest in this questionnaire. Through your participation, we hope to learn more about important aspects of information security risk analysis. This survey asks for your opinion about the risk analysis practices of the organization where you currently work or the organization that you support. This survey is on the worksheets (tabs at bottom) named Introduction, Demographics, Threats, Risk Factors, and Risk Analysis. Prerequisites for taking this survey: 1. You are an information security professional (i.e. CISSP or SSCP). OR 2. You have sufficient experience at the current organization where you work to have an opinion about its risk analysis practices. Consultants or outsourced employees: If you divide your time supporting more than one client, answer the questions in relation to the organization where you spend most of your time. Privacy Statement: Zack Jourdan is conducting this study. Please, address any questions you may have about this survey to Zack Jourdan (jourdsz@auburn.edu). Information collected in this study will be part of a dissertation and published in professional journals. Only aggregate results will be published. Information obtained in this study identifiable to you will be held in the strictest of confidence. Other than an email address, only general demographic questions will be asked. Your email address will not be shared with anyone. Please participate only once. All participants will receive a report of the results by email. 70 Your decision whether or not to participate will not jeopardize your relationship with (ISC)² or Auburn University. If you withdraw from this study, we will delete all provided information. If you agree to participate, please fill out all portions of this survey. If you do not agree to participate, please forward this file to colleagues who might be interested in completing this survey. Please, select the best answer from the blue list boxes. Please, type longer answers in the green text boxes. Please enter your email address. Please select your certification: None CISSP SSCP CAP Other If you have more than one certification or certifications not in the list, please describe here: 2. Demographics Instructions: All questions pertain to the entire organization where you work or the organization that you support. Answering these questions is very important for correct interpretation of the questionnaire results. Please, select the best answer from the blue list boxes.
Please, type longer answers in the green textboxes. How many employees work in this organization? 500 or less From 501 to 2,500 From 2,501 to 7,500 From 7,501 to 15,000 More than 15,001 71 Select the country where you perform the majority of your work. List box of countries Are you an outsourced (consultant) worker? NO, I?m a regular/permanent employee. YES, I?m an outsourced worker. From the list below, select the primary industry that best describes the organization where you do the majority of your work. (Choose only one.) Consultant Government-federal, military, local, etc. Medical/Healthcare-public or private Finance, Banking, & Insurance Professional Services-Legal, Marketing, etc. Consumer Products/Retail/Wholesale Education/Training Energy Information Technology/Security/Telecom Entertainment Industrial Technology Manufacturing Non-Profit Publishing Travel/Hospitality Transportation/Warehousing Utilities Real Estate/Property Management Other If you chose other for industry, please describe the industry where you do most of your work. Which of the following describes your primary job function? Owner/Partner Senior Manager/Executive (e.g. CEO, CIO) Department Manager/Supervisor/Director MIS/IS/IT/Technical management Other Managerial Consultant/Contractor Other IT/Technical/Scientific/Professional 72 How many total years of experience do you have in information technology? 5 years or less Between 6 and 10 Between 11 and 15 Between 16 and 20 More than 20 How many total years of experience do you have in information security? 5 years or less Between 6 and 10 Between 11 and 15 Between 16 and 20 More than 20 Is information security a primary or secondary responsibility of your current job? Primary Secondary 73 3. Threat For each threat listed below, please choose the threats significance to your organization. Extremely significant 5 Significant 4 Neither insignificant nor significant 3 Insignificant 2 Extremely insignificant 1 Please, choose yes if your organization emphasizes this risk factor and no if your organization does not emphasize this risk factor. Please, respond for each risk factor in the list. Threat Example Act of human failure. Accidents, employee mistakes Compromises to intellectual property Piracy, copyright infringement Deliberate acts of espionage or trespass Unauthorized access and/or data collection Deliberate acts of information extortion Blackmail for information disclosure Deliberate acts of sabotage or vandalism Destruction of systems or information Deliberate acts of theft Illegal confiscation of equipment or information Deliberate software attacks Viruses, worms, macros, denial-of-service Forces of nature Fire, flood, earthquake, lightning Quality of service deviations from service providers Power and WAN quality of service issues Technical hardware failures of errors Equipment failure Technical software failures or errors Bugs, code problems, unknown loopholes Technical obsolescence Antiquated or outdated technologies 74 4. Risk Factors and Insurance When developing risk factors for your organization's risk analyses, which factors do your organization focus on the most? 
Legal, regulatory, or statutory requirements Loss of consumer confidence Damage to organization?s image/brand Financial losses Risks to infrastructure Risks of possible lawsuits Business requirements for information confidentiality, integrity, and availability Other If you selected other for risk factors, please describe here: Describe the main factors that your organization uses to establish acceptable risk levels. What would you add to, delete from, or alter in the above list of factors? Does your organization calculate Return on Investment (ROI) for information security investments and expenses? Yes No If your company does calculate ROI for information security, please describe how your organization calculates this ROI. Does your organization purchase insurance to cover its information assets? Yes No As a percentage, how much of your organization?s tangible information assets (i.e. physical assets, buildings, equipment, computer hardware, etc.) are covered by insurance? Why did your organization choose that percentage? 75 As a percentage, how much of your organization?s intangible information assets (i.e. profits, temporary operating expenses, intellectual properties, electronic files, databases, proprietary programs, etc.) are covered by insurance? Why did your organization choose that percentage? 5. Risk Analysis How often is the information security risk analysis conducted for your department within your organization? Never Rarely Annually Quarterly Weekly/Monthly Continuously How often is the information security risk analysis conducted for your entire organization? Never Rarely Annually Quarterly Weekly/Monthly Continuously Which of the following individuals at your organization participate in information security risk analysis? Please, choose yes if this individual is involved and no if this person is not involved. Please, answer yes or no for each individual. Owner/Partner Senior Manager/Executive (e.g. CEO, CIO) Department Manager/Supervisor/Director MIS/IS/IT/Technical management Other Managerial Consultant/Contractor Other IT/Technical/Scientific/Professional Other employees At your organization, who is involved in the information security risk analysis process? Please, describe the individuals and their roles here: 76 Which of the following individuals at your organization have final approval of the information security risk analysis? Please, choose yes if this individual is involved and no if this person is not involved. Please, answer yes or no for each individual. Owner/Partner Senior Manager/Executive (e.g. CEO, CIO) Department Manager/Supervisor/Director MIS/IS/IT/Technical management Other Managerial Consultant/Contractor Other IT/Technical/Scientific/Professional Other employees At your organization, who has final approval of the information security risk analysis process? Please, describe the individuals and their roles here: 1. Determine IT assets 2. Determine value of IT assets. 3. Enumerate possible threats to IT assets. 4. Determine vulnerability of assets to specific threats. 5. Determine risk exposure for organization. 6. Minimize exposure and/or purchase insurance to minimize risk exposure. Do you agree with the process for information security risk analysis in the above list? Yes No What would you add to, delete from, or alter on this list of steps for information security risk analysis/audit? 77 Select all information security risk analysis/audit methodologies used at your organization. 
Please, choose yes if your organization uses this methodology and no if your organization does not use this methodology. Please, answer yes or no for each methodology. Anti-virus software analysis Password cracking and improvement Firewall implementation and correction of configuration errors Vulnerability testing/correction War dialing (scanning for unauthorized modems and fax machines) Identification of critical infrastructure components Physical security review Centralized information storage location review Access control evaluation Certification identification Integration of the firewall, VPN and e-commerce Assessment of the routers and servers Cryptography review Computer Security Policy review and documentation Other If you selected other for methodology, please describe here: Describe the combination of information security risk analysis/audit methodologies used at your organization. What would you add to, delete from, or alter the methodologies listed above? 78 Choose all the methodologies your organization uses to measure the possible loss exposure of information assets. Please, choose yes if your organization uses this methodology and no if your organization does not use this methodology. Please, answer yes or no for each methodology. Consultant/Contractor Assessments Annualized Loss Expectancy (ALE) Courtney?s ALE Method Cost-Benefit Analysis (CBA) Annualized Rate of Occurrence (ARO) Single Loss Expectancy (SLE) Livermore Risk Analysis Methodology (LRAM) Stochastic Dominance/Daily Loss Formula Scenario Analysis Delphi technique/brainstorming OCTAVE method Fuzzy Metrics Questionnaires Surveys Other If you selected other for methodology, please describe here: Describe the methodologies your organization use to measure the possible loss exposure of information assets. What would you add to, delete from, or alter in the above list of methodologies? 79 APPENDIX C Screen Capture of ISRA Questionnaire ? Phase One 80 APPENDIX D Email Blast to the ISRA Survey Phase Two Participants Auburn University - ISRA Survey Sent out May 31st, 2009 Recently, you indicated that you would be interested in participating in a survey on topics related to information security. We are currently conducting a study to investigate how organizations conduct their information security risk analysis. This survey contains no questions pertaining to the vendors, techniques, or personnel used by your organization during the risk analysis process. All responses will be completely anonymous and confidential. All results will be reported only in a summarized fashion. No participant information will be revealed. We will be happy to provide you with an executive summary of our results. When you are ready to take the survey, just click on this link: Information Security Risk Analysis Survey You will be taken to a web-based survey. This survey will not collect any personal information including your email address, Internet Service Provider, IP address, MAC address, or any other personal information. If you have any questions about the survey or would like an executive summary of the results, please email jourdsz@auburn.edu. The survey should take between 15 and 20 minutes to complete. Thank you in advance for your participation. Zack Jourdan Ph.D. Candidate Auburn University 81 APPENDIX E The ISRA Questionnaire ? Phase Two Survey on Information Security Risk Analysis Introduction Thank you for your interest in this questionnaire. 
APPENDIX C
Screen Capture of ISRA Questionnaire - Phase One

APPENDIX D
Email Blast to the ISRA Survey Phase Two Participants

Auburn University - ISRA Survey
Sent out May 31st, 2009

Recently, you indicated that you would be interested in participating in a survey on topics related to information security. We are currently conducting a study to investigate how organizations conduct their information security risk analysis. This survey contains no questions pertaining to the vendors, techniques, or personnel used by your organization during the risk analysis process.

All responses will be completely anonymous and confidential. All results will be reported only in a summarized fashion. No participant information will be revealed. We will be happy to provide you with an executive summary of our results.

When you are ready to take the survey, just click on this link: Information Security Risk Analysis Survey

You will be taken to a web-based survey. This survey will not collect any personal information including your email address, Internet Service Provider, IP address, MAC address, or any other personal information. If you have any questions about the survey or would like an executive summary of the results, please email jourdsz@auburn.edu. The survey should take between 15 and 20 minutes to complete.

Thank you in advance for your participation.

Zack Jourdan
Ph.D. Candidate
Auburn University

APPENDIX E
The ISRA Questionnaire - Phase Two

Survey on Information Security Risk Analysis

Introduction
Thank you for your interest in this questionnaire. Through your participation, we hope to learn more about important aspects of information security risk analysis. This survey asks for your opinion about the risk analysis practices of the organization where you currently work.

Prerequisites for taking this survey:
1. Your organization conducts a formal information security risk analysis process.
2. You are an information security professional (i.e. CISSP or SSCP).
OR
3. You have sufficient experience at the current organization where you work to have an opinion about its risk analysis practices.

Privacy Statement
Zack Jourdan is conducting this study. Please, address any questions you may have about this survey to Zack Jourdan (jourdsz@auburn.edu). Information collected in this study will be part of a dissertation and published in professional journals. Only aggregate results will be published. This survey will not collect any personal information including your email address, Internet Service Provider, IP address, MAC address, or any other personal information. No information identifying your organization or you will be collected. Please participate only once. Your decision whether or not to participate will not jeopardize your relationship with Auburn University. If you withdraw from this study, we will delete all provided information. If you agree to participate, please fill out all portions of this survey. If you do not agree to participate, please close your browser's window.

Demographics
Instructions: All questions pertain to the entire organization where you work. Answering these questions is very important for correct interpretation of the questionnaire results. Please, select the best answer.

How many employees work in this organization?
500 or less
From 501 to 5,000
More than 5,000

From the list below, select the primary industry that best describes the organization where you do the majority of your work. (Choose only one.)
Consultant
Government-federal, military, local, etc.
Medical/Healthcare-public or private
Finance, Banking, & Insurance
Professional Services-Legal, Marketing, etc.
Consumer Products/Retail/Wholesale
Education/Training
Energy
Information Technology/Security/Telecom
Entertainment
Industrial Technology
Manufacturing
Non-Profit
Publishing
Travel/Hospitality
Transportation/Warehousing
Utilities
Real Estate/Property Management

Which of the following describes your primary job function?
Owner/Partner
Senior Manager/Executive (e.g. CEO, CIO)
Department Manager/Supervisor/Director
MIS/IS/IT/Technical management
Other Managerial
Consultant/Contractor
Other IT/Technical/Scientific/Professional

How many total years of experience do you have in information technology?

How many total years of experience do you have in information security?

Threats
For each threat listed below, please choose the threat's significance to your organization.
Extremely significant = 5
Significant = 4
Neither insignificant nor significant = 3
Insignificant = 2
Extremely insignificant = 1

Threat: Example
Act of human failure: Accidents, employee mistakes
Compromises to intellectual property: Piracy, copyright infringement
Deliberate acts of espionage or trespass: Unauthorized access and/or data collection
Deliberate acts of information extortion: Blackmail for information disclosure
Deliberate acts of sabotage or vandalism: Destruction of systems or information
Deliberate acts of theft: Illegal confiscation of equipment or information
Deliberate software attacks: Viruses, worms, macros, denial-of-service
Forces of nature: Fire, flood, earthquake, lightning
Quality of service deviations from service providers: Power and WAN quality of service issues
Technical hardware failures or errors: Equipment failure
Technical software failures or errors: Bugs, code problems, unknown loopholes
Technical obsolescence: Antiquated or outdated technologies

Does your organization calculate Return on Investment (ROI) for information security investments and expenses?
Yes No

Does your organization purchase insurance to protect its information assets?
Yes No

How often is the information security risk analysis conducted for your entire organization?
Once per year or less often
Quarterly/Semiannually
Continuously (i.e. weekly or monthly)

This is a list of information security risk analysis/audit methodologies that are possibly used at your organization. Please, select the methodologies used by your organization.
Anti-virus software analysis
Password cracking and improvement
Firewall implementation and correction of configuration errors
Vulnerability testing/correction
War dialing (scanning for unauthorized modems and fax machines)
Identification of critical infrastructure components
Physical security review
Centralized information storage location review
Access control evaluation
Certification identification
Integration of the firewall, VPN and e-commerce
Assessment of the routers and servers
Cryptography review
Computer Security Policy review and documentation
Consultant/Contractor Assessments
Annualized Loss Expectancy (ALE)
Courtney's ALE Method
Cost-Benefit Analysis (CBA)
Annualized Rate of Occurrence (ARO)
Single Loss Expectancy (SLE)
Livermore Risk Analysis Methodology (LRAM)
Stochastic Dominance/Daily Loss Formula
Scenario Analysis
Delphi technique/brainstorming
OCTAVE method
Fuzzy Metrics
Questionnaires
Surveys

For each statement below, please choose the answer that describes your organization.
Strongly Disagree = SD
Disagree = D
Neutral = N
Agree = A
Strongly Agree = SA

Security culture
In my organization...
Employees value the importance of security.
Security has traditionally been considered an important organizational value.
Practicing good security is an accepted way of doing business.
The overall environment fosters security-minded thinking.
Information security is a key norm shared by organizational members.

Top Management Support
Top Management is interested in the implementation of an Information Security Risk Analysis Process.
Top Management considers an Information Security Risk Analysis Process as important to the organization.
Top Management has effectively communicated its support for an Information Security Risk Analysis Process.

ISRA Effectiveness
Risk analyses are conducted prior to writing new security policies.
Top management is properly informed of vital information security risk analysis developments.
The information security risk analysis program is successful.
The information security risk analysis program protects the organization's information assets.
The information security risk analysis program is thorough.

APPENDIX F
Screen Capture of ISRA Questionnaire - Phase Two