Corporate information risk : an information security governance framework
- Authors: Posthumus, Shaun Murray
- Date: 2006
- Subjects: Computer security , Business enterprises -- Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9776 , http://hdl.handle.net/10948/814 , Computer security , Business enterprises -- Computer networks -- Security measures
- Description: Information Security is currently viewed from a technical point of view only. Some authors believe that Information Security is a process that involves more than mere Risk Management at the department level, as it is also a strategic and potentially legal issue. Hence, there is a need to elevate the importance of Information Security to a governance level through Information Security Governance, and this study proposes a framework to help guide the Board of Directors in their Information Security Governance efforts. IT is a major facilitator of organizational business processes, and these processes manipulate and transmit sensitive customer and financial information. IT, which involves major risks, may threaten the security of corporate information assets. Therefore, IT requires attention at board level to ensure that technology-related information risks are within an organization’s accepted risk appetite. However, IT issues are a neglected topic at board level, and this could bring about Enron-esque disasters. Therefore, there is a need for the Board of Directors to direct and control IT-related risks effectively, to reduce the potential for Information Security breaches and bring about a stronger system of internal control. The IT Oversight Committee is a proven means of achieving this, and this study further motivates the necessity for such a committee to solidify an organization’s Information Security posture, among other IT-related issues.
- Full Text:
- Date Issued: 2006
Cybersecurity: reducing the attack surface
- Authors: Thomson, Kerry-Lynn
- Subjects: Computer security , Computer networks -- Security measures , f-sa
- Language: English
- Type: Lectures
- Identifier: http://hdl.handle.net/10948/52885 , vital:44319
- Description: Almost 60% of the world’s population has access to the internet, and most organisations today rely on internet connectivity to conduct business and carry out daily operations. Further to this, it is estimated that concepts such as the Internet of Things (IoT) will facilitate the connection of over 125 billion ‘things’ by the year 2030. However, as people and devices become more and more interconnected, and more data is shared, the question that must be asked is: are we doing so securely? Each year, cybercriminals cost organisations and individuals millions of dollars, using techniques such as phishing, social engineering, malware and denial-of-service attacks. In particular, the Covid-19 pandemic was accompanied by a so-called ‘cybercrime pandemic’: threat actors adapted their techniques to target people with Covid-19-themed cyberattacks and phishing campaigns that exploited their stress and anxiety during the pandemic. Cybersecurity and cybercrime exist in a symbiotic relationship in cyberspace: as cybersecurity gets stronger, cybercriminals need to become stronger to overcome those defences; and as the cybercriminals become stronger, so too must the defences. This symbiotic relationship plays out on what is called the attack surface. An attack surface comprises the exposed areas of an organisation that make its systems more vulnerable to attack; essentially, it is all the gaps in an organisation’s security that could be compromised by a threat actor. This attack surface is increased when organisations incorporate IoT technologies, migrate to the cloud and decentralise their workforces, as happened during the pandemic with many people working from home. It is essential that organisations reduce the digital attack surface, and the vulnerabilities introduced through devices connected to the internet, with technical strategies and solutions.
However, the focus of cybersecurity is often on the digital attack surface and technical solutions, with less of a focus on the human aspects of cybersecurity. The human attack surface encompasses all the vulnerabilities introduced through the actions and activities of employees. These employees should be given the necessary cybersecurity awareness, training and education to reduce the human attack surface of organisations. However, it is not only employees of organisations who are online; all individuals who interact online should be cybersecurity aware and know how to reduce their own digital and human attack surfaces, or digital footprints. This paper emphasises the importance of utilising people as part of the cybersecurity defence through the cultivation of cybersecurity cultures in organisations and a cybersecurity-conscious society.
- Full Text:
Data-centric security : towards a utopian model for protecting corporate data on mobile devices
- Authors: Mayisela, Simphiwe Hector
- Date: 2014
- Subjects: Computer security , Computer networks -- Security measures , Business enterprises -- Computer networks -- Security measures , Mobile computing -- Security measures , Mobile communication systems -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4688 , http://hdl.handle.net/10962/d1011094 , Computer security , Computer networks -- Security measures , Business enterprises -- Computer networks -- Security measures , Mobile computing -- Security measures , Mobile communication systems -- Security measures
- Description: Data-centric security is significant in understanding, assessing and mitigating the various risks and impacts of sharing information outside corporate boundaries. Information generally leaves corporate boundaries through mobile devices. Mobile devices continue to evolve as multi-functional tools for everyday life, surpassing their initial intended use. This added capability and the increasingly extensive use of mobile devices do not come without a degree of risk; hence the need to guard and protect information as it exists beyond the corporate boundaries and throughout its lifecycle. Literature on existing models crafted to protect data, rather than the infrastructure in which the data resides, is reviewed. Technologies that organisations have implemented to adopt the data-centric model are studied. A utopian model that takes into account the shortcomings of existing technologies and the deficiencies of common theories is proposed. Two sets of qualitative studies are reported: the first is a preliminary online survey to assess the ubiquity of mobile devices and the extent of technology adoption towards implementation of the data-centric model; the second comprises a focus survey and expert interviews pertaining to technologies that organisations have implemented to adopt the data-centric model. The latter study revealed insufficient data at the time of writing for the results to be statistically significant; however, indicative trends supported the assertions documented in the literature review. The question that this research answers is whether or not current technology implementations designed to mitigate risks from mobile devices actually address business requirements. This research question, answered through these two sets of qualitative studies, revealed inconsistencies between the technology implementations and business requirements.
The thesis concludes by proposing a realistic model, based on the outcome of the qualitative studies, which bridges the gap between the technology implementations and business requirements. Future work that could be conducted in light of the findings and comments from this research is also considered.
- Full Text:
- Date Issued: 2014
Deploying DNSSEC in islands of security
- Authors: Murisa, Wesley Vengayi
- Date: 2013 , 2013-03-31
- Subjects: Internet domain names , Computer security , Computer network protocols , Computer security -- Africa
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4577 , http://hdl.handle.net/10962/d1003053 , Internet domain names , Computer security , Computer network protocols , Computer security -- Africa
- Description: The Domain Name System (DNS), a name resolution protocol, is one of the vulnerable network protocols that has been subjected to many security attacks, such as cache poisoning, denial of service and the 'Kaminsky' spoofing attack. When DNS was designed, security was not incorporated into its design. The DNS Security Extensions (DNSSEC) provide security to the name resolution process by using public key cryptosystems. Although DNSSEC has backward compatibility with unsecured zones, it only offers security to clients when communicating with security-aware zones. Widespread deployment of DNSSEC is therefore necessary to secure the name resolution process and provide security to the Internet. Only a few Top Level Domains (TLDs) have deployed DNSSEC, which inherently makes it difficult for their sub-domains to implement the security extensions to the DNS. This study analyses mechanisms that can be used by domains in islands of security to deploy DNSSEC so that the name resolution process can be secured in two specific cases: where the TLD is not signed, or where the domain registrar is not able to support signed domains. The DNS client-side mechanisms evaluated in this study include web browser plug-ins, local validating resolvers and domain look-aside validation. The results of the study show that web browser plug-ins cannot work on their own without local validating resolvers. The web browser validators, however, proved useful in indicating to the user whether a domain has been validated or not. Local resolvers present a more secure option for Internet users who cannot trust the communication channel between their stub resolvers and remote name servers. However, they do not provide a way of showing the user whether a domain name has been correctly validated or not. Based on the results of the tests conducted, it is recommended that local validators be used together with browser validators for visibility and improved security.
On the DNS server side, Domain Look-aside Validation (DLV) presents a viable alternative for organizations in islands of security, like most countries in Africa, where only two country code Top Level Domains (ccTLDs) have deployed DNSSEC. This research recommends the use of DLV by corporates to provide DNS security to both internal and external users accessing their web-based services.
- Full Text:
- Date Issued: 2013
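The "island of security" problem described in this record arises because a chain of trust is anchored in the parent zone: the parent publishes a DS record containing a digest of the child's DNSKEY, and an unsigned parent can never publish that link. As a rough illustration (hypothetical key material, not taken from the thesis), the SHA-256 DS digest defined in RFC 4509 can be computed as follows:

```python
import hashlib
import struct

def name_to_wire(name: str) -> bytes:
    """Encode a domain name in canonical DNS wire format
    (lower-cased, length-prefixed labels, terminated by a zero byte)."""
    wire = b""
    for label in name.rstrip(".").split("."):
        wire += bytes([len(label)]) + label.lower().encode("ascii")
    return wire + b"\x00"

def ds_sha256_digest(owner: str, flags: int, protocol: int,
                     algorithm: int, public_key: bytes) -> bytes:
    """Compute the digest a signed parent zone would publish in a DS
    record for a child's DNSKEY: SHA-256(owner name | DNSKEY RDATA)."""
    # DNSKEY RDATA: 16-bit flags, 8-bit protocol, 8-bit algorithm, key.
    rdata = struct.pack("!HBB", flags, protocol, algorithm) + public_key
    return hashlib.sha256(name_to_wire(owner) + rdata).digest()

# Hypothetical KSK (flags=257, protocol=3, algorithm=8/RSASHA256).
fake_key = b"\x03\x01\x00\x01" + b"\x00" * 64
print(ds_sha256_digest("example.co.zw", 257, 3, 8, fake_key).hex())
```

Without a signed parent to hold this digest, validators cannot extend the chain of trust downward, which is precisely the gap that trust anchors distributed out of band, and DLV registries, were designed to fill.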
Educating users about information security by means of game play
- Authors: Monk, Thomas Philippus
- Date: 2011
- Subjects: Computer security , Educational games -- Design , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9748 , http://hdl.handle.net/10948/1493 , Computer security , Educational games -- Design , Computer networks -- Security measures
- Description: Information is necessary for any business to function. However, if one does not manage one’s information assets properly, then one’s business is likely to be at risk. By implementing Information Security controls, procedures and/or safeguards, one can secure information assets against risks. The risks of an organisation can be mitigated if employees implement safety measures. However, employees are often unable to work securely due to a lack of knowledge. This dissertation evaluates the premise that a computer game could be used to educate employees about Information Security. A game was developed with the aim of educating employees in this regard. If people were motivated to play the game, without external motivation from an organisation, then people would also, indirectly, be motivated to learn about Information Security. Therefore, a secondary aim of this game was to be self-motivating. An experiment was conducted in order to test whether or not these aims were met. The experiment was conducted on a play-test group and a control group. The play-test group played the game before completing a questionnaire that tested the information security knowledge of participants, while the control group simply completed the questionnaire. The two groups’ answers were compared in order to obtain results. This dissertation discusses the research design of the experiment and also provides an analysis of the results. The game design is also discussed, which provides guidelines for future game designers to follow. The experiment indicated that the game is motivational, but perhaps not educational enough. However, the results suggest that a computer game can still be used to teach users about Information Security. Factors that involved consequence and repetition contributed towards the educational value of the game, whilst competitiveness and rewards contributed to the motivational aspect of the game.
- Full Text:
- Date Issued: 2011
Establishing an information security culture in organizations : an outcomes based education approach
- Authors: Van Niekerk, Johannes Frederick
- Date: 2005
- Subjects: Computer security , Management information systems -- Security measures , Competency-based education
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9742 , http://hdl.handle.net/10948/164 , Computer security , Management information systems -- Security measures , Competency-based education
- Description: Information security is crucial to the continuous well-being of modern organizations. Humans play a significant role in the processes needed to secure an organization's information resources. Without an adequate level of user co-operation and knowledge, many security techniques are liable to be misused or misinterpreted by users. This may result in an adequate security measure becoming inadequate. It is therefore necessary to educate the organization's employees regarding information security, and also to establish a corporate sub-culture of information security in the organization, which will ensure that the employees have the correct attitude towards their security responsibilities. Current information security education programs fail to pay sufficient attention to the behavioral sciences. There also exists a lack of knowledge regarding the principles and processes that would be needed for the establishment of a corporate sub-culture specific to information security. Without both the necessary knowledge and the desired attitude amongst the employees, it will be impossible to guarantee that the organization's information resources are secure. It would therefore make sense to address both these dimensions of the human factor in information security using a single, integrated, holistic approach. This dissertation presents such an approach, which is based on an integration of sound behavioral theories.
- Full Text:
- Date Issued: 2005
File integrity checking
- Authors: Motara, Yusuf Moosa
- Date: 2006
- Subjects: Linux , Operating systems (Computers) , Database design , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4682 , http://hdl.handle.net/10962/d1007701 , Linux , Operating systems (Computers) , Database design , Computer security
- Description: This thesis looks at file execution as an attack vector that leads to the execution of unauthorized code. File integrity checking is examined as a means of removing this attack vector, and the design, implementation, and evaluation of a best-of-breed file integrity checker for the Linux operating system is undertaken. We conclude that the resultant file integrity checker does succeed in removing file execution as an attack vector, does so at a computational cost that is negligible, and displays innovative and useful features that are not currently found in any other Linux file integrity checker.
- Full Text:
- Date Issued: 2006
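The core idea of this record, recording a baseline of known-good file digests and refusing to trust a file whose contents have changed, can be sketched as follows. This is an illustrative sketch only, not the thesis's Linux tool, and the function names are invented for the example:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """Record known-good digests for a set of trusted files."""
    return {str(p): file_digest(Path(p)) for p in paths}

def verify(baseline):
    """Return the files whose current contents no longer match the
    baseline, i.e. candidates that should not be executed."""
    return [p for p, d in baseline.items() if file_digest(Path(p)) != d]
```

A real checker of the kind the thesis describes would additionally protect the baseline itself from tampering and hook verification into the execution path, which is where the interesting design work lies.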
Guidelines for the protection of stored sensitive information assets within small, medium and micro enterprises
- Authors: Scharnick, Nicholas
- Date: 2018
- Subjects: Computer security , Information technology -- Security measures , Data protection , Business -- Data processing -- Security measures , Small business -- Data processing -- Security measures -- South Africa
- Language: English
- Type: Thesis , Masters , MIT
- Identifier: http://hdl.handle.net/10948/34799 , vital:33452
- Description: Technology has become important in the business environment, as it keeps a business competitive and drives its business processes. However, in the era of mobile devices, easy access to the internet and a wide variety of other communication mechanisms, the security of the business from a technological perspective is constantly under threat. The problem that this research aims to address is that there is currently a lack of understanding among SMMEs of how to protect their stored sensitive information assets. This study intends to assist small businesses, specifically Small, Medium and Micro Enterprises (SMMEs), in protecting and securing information while it is in storage. SMMEs usually do not have the resources available to fully address information security related concerns that could pose a threat to the well-being and success of the business. In order to address the problem identified, and to assist SMMEs in better protecting their stored information assets, the outcome of this research is a set of guidelines for protecting stored sensitive information assets. Through a qualitative content analysis of the literature, a number of information security standards, best practices and frameworks, including the ISO 27000 series of standards, COBIT, ITIL and various NIST publications, were analysed to determine how these security approaches address security concerns that arise when considering the storage of sensitive information. Following the literature analysis, a survey was developed and distributed to a wide variety of SMMEs in order to determine what their information security requirements might be, as well as how they address information security. The results obtained, coupled with the literature analysis, served as input for the development of a number of guidelines that can assist SMMEs in protecting stored sensitive information assets.
- Full Text:
- Date Issued: 2018
Implementing the CoSaWoE models in a commercial workflow product
- Authors: Erwee, Carmen
- Date: 2005
- Subjects: Computers -- Access control , Workflow , Computer security , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9732 , http://hdl.handle.net/10948/169 , Computers -- Access control , Workflow , Computer security , Data protection
- Description: Workflow systems have gained popularity not only as a research topic, but also as a key component of Enterprise Resource Planning packages and e-business. Comprehensive workflow products that automate intra- as well as inter-organizational information flow are now available for commercial use. Standardization efforts have centred mostly on the interoperability of these systems; however, a standard access control model has yet to be adopted. The research community has developed several models for access control to be included as part of workflow functionality. Commercial systems, however, still implement access control functionality in a proprietary manner. This dissertation investigates whether a comprehensive model for context-sensitive access control, namely CoSAWoE, can be purposefully implemented in a commercial workflow product. Using methods such as an exploratory prototype, various aspects of the model were implemented to gain an understanding of the difficulties developers face when attempting to map the model to existing proprietary software. Oracle Workflow was chosen as an example of a commercial workflow product. An investigation of the features of this product, together with the prototype, revealed the ability to effect access control in a similar manner to the model: by specifying access control constraints during administration and design, and then enforcing those constraints dynamically during run-time. However, only certain components within these two aspects of the model directly affected the commercial workflow product. It was argued that the first two requirements of context-sensitive access control, order of events and strict least privilege, addressed by the object design, role engineering and session control components of the model, can be simulated if such capabilities are not pertinently available as part of the product. As such, guidelines were provided for how this can be achieved in Oracle Workflow.
However, most of the implementation effort focussed on the last requirement of context-sensitive access control, namely separation of duties. The CoSAWoE model proposes SoD administration steps that include expressing various business rules through a set of conflicting entities which are maintained outside the scope of the workflow system. This component was implemented easily enough through tables created in a relational database. Evaluating these conflicts during run-time to control worklist generation proved more difficult. First, a thorough understanding of the way in which workflow history is maintained was necessary. A reusable function was developed to prune user lists according to user involvement in previous tasks in the workflow and the conflicts specified for those users and tasks. However, due to the lack of a central access control service, this reusable function must be included in the appropriate places in the workflow process model. Furthermore, the dissertation utilized a practical example to develop a prototype. This prototype served a dual purpose: firstly, to aid the author's understanding of the features and principles involved, and secondly, to illustrate and explore the implementation of the model as described in the previous paragraphs. In conclusion, the dissertation summarized which of the CoSAWoE model's components were found to be product agnostic, directly or indirectly implementable, or not implementable in the chosen workflow product. The lessons learnt and issues surrounding the implementation effort were also discussed, before further research in terms of XML documents as data containers for the workflow process was suggested.
- Full Text:
- Date Issued: 2005
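The run-time separation-of-duties check described in the abstract above — pruning a worklist according to each user's involvement in previous tasks and a set of conflicting entities — might be sketched as follows. All names here are hypothetical illustrations; the thesis implements this logic as a reusable function inside Oracle Workflow, backed by conflict tables in a relational database, not as Python.

```python
def prune_worklist(candidates, task, history, conflicts):
    """
    Remove candidates whose involvement in earlier tasks conflicts
    with the task now being assigned.

    candidates -- users eligible for the task before the SoD check
    history    -- list of (user, task) pairs already performed in
                  this workflow instance
    conflicts  -- set of frozensets {task_a, task_b} that may not be
                  performed by the same user
    """
    allowed = []
    for user in candidates:
        done = {t for (u, t) in history if u == user}
        if any(frozenset((task, t)) in conflicts for t in done):
            continue  # user already performed a conflicting task
        allowed.append(user)
    return allowed
```

As in the thesis, the conflict definitions live outside the workflow engine, so the same pruning function can be dropped into each point of the process model where a worklist is generated.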
Information security awareness: generic content, tools and techniques
- Authors: Mauwa, Hope
- Date: 2007
- Subjects: Computer security , Data protection , Computers -- Safety measures , Information technology -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9733 , http://hdl.handle.net/10948/560 , Computer security , Data protection , Computers -- Safety measures , Information technology -- Security measures
- Description: In today’s computing environment, awareness programmes play a much more important role in organizations’ complete information security programmes. Information security awareness programmes exist to change behaviour or reinforce good security practices, and to provide a baseline of security knowledge for all information users. Security awareness is a learning process, which changes individual and organizational attitudes and perceptions so that the importance of security and the adverse consequences of its failure are realized. Therefore, with proper awareness, employees become the most effective layer in an organization’s security defence. Given the important role that these awareness programmes play in organizations’ complete information security programmes, any organization that is serious about information security must implement one. But though awareness programmes have become increasingly important, the level of awareness in most organizations is still low. It seems that the current approach to developing these programmes does not satisfy the needs of most organizations. Therefore, another approach, which tries to meet the needs of most organizations, is proposed in this project as part of the solution to raising the level of awareness programmes in organizations.
- Full Text:
- Date Issued: 2007
Introducing hippocratic log files for personal privacy control
- Authors: Rutherford, Andrew
- Date: 2005
- Subjects: Computer security , Internet -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9743 , http://hdl.handle.net/10948/171 , Computer security , Internet -- Security measures
- Description: The rapid growth of the Internet has served to intensify existing privacy concerns of the individual, to the point that privacy is the number one concern amongst Internet users today. Tools exist that can provide users with a choice of anonymity or pseudonymity. However, many Web transactions require the release of personally identifying information, thus rendering such tools infeasible in many instances. Since it is then a given that users are often required to release personal information, which could be recorded, it follows that they require a greater degree of control over the information they release. Hippocratic databases, designed by Agrawal, Kiernan, Srikant, and Xu (2002), aim to give users greater control over information stored in a database. Their design was inspired by the medical Hippocratic oath, and makes data privacy protection a fundamental responsibility of the database itself. To achieve the privacy of data, Hippocratic databases are governed by 10 key privacy principles. This dissertation argues that, aside from a few challenges, the 10 principles of Hippocratic databases can be applied to log files. This argument is supported by presenting a high-level functional view of a Hippocratic log file architecture. This architecture focuses on issues that highlight the control users gain over their personal information that is collected in log files. By presenting a layered view of the aforementioned architecture, it was, furthermore, possible to provide greater insight into the major processes that would be at work in a Hippocratic log file implementation. An exploratory prototype served to understand and demonstrate certain of the architectural components of Hippocratic log files. This dissertation thus makes a contribution to the ideal of providing users with greater control over their personal information, by proposing the use of Hippocratic log files.
- Full Text:
- Date Issued: 2005
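Two of the Hippocratic privacy principles mentioned in the abstract above — purpose specification (data may only be released for purposes the subject consented to) and limited retention (data is discarded after its retention period) — can be illustrated with a small sketch. The class and method names here are hypothetical; the dissertation's layered architecture is considerably richer than this.

```python
import time
from dataclasses import dataclass, field

@dataclass
class LogEntry:
    user: str
    event: str
    purposes: set                       # purposes the data subject consented to
    timestamp: float = field(default_factory=time.time)

class HippocraticLog:
    def __init__(self):
        self._entries = []

    def record(self, user, event, purposes):
        self._entries.append(LogEntry(user, event, set(purposes)))

    def query(self, purpose):
        """Limited use: release only entries whose consent covers the purpose."""
        return [e for e in self._entries if purpose in e.purposes]

    def expire(self, max_age_seconds):
        """Limited retention: discard entries older than the retention period."""
        cutoff = time.time() - max_age_seconds
        self._entries = [e for e in self._entries if e.timestamp >= cutoff]
```

The key inversion, as in Hippocratic databases, is that the log itself enforces the privacy rules rather than trusting every consumer of the log to do so.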
Limiting vulnerability exposure through effective patch management: threat mitigation through vulnerability remediation
- Authors: White, Dominic Stjohn Dolin
- Date: 2007 , 2007-02-08
- Subjects: Computer networks -- Security measures , Computer viruses , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4629 , http://hdl.handle.net/10962/d1006510 , Computer networks -- Security measures , Computer viruses , Computer security
- Description: This document aims to provide a complete discussion on vulnerability and patch management. The first chapters look at the trends relating to vulnerabilities, exploits, attacks and patches. These trends describe the drivers of patch and vulnerability management and situate the discussion in the current security climate. The following chapters then aim to present both policy and technical solutions to the problem. The policies described lay out a comprehensive set of steps that can be followed by any organisation to implement their own patch management policy, including practical advice on integration with other policies, managing risk, identifying vulnerabilities, strategies for reducing downtime and generating metrics to measure progress. Having covered the steps that can be taken by users, a strategy describing how best a vendor should implement a related patch release policy is provided. An argument is made that current monthly patch release schedules are inadequate to allow users to most effectively and timeously mitigate vulnerabilities. The final chapters discuss the technical aspect of automating parts of the policies described. In particular, the concept of 'defense in depth' is used to discuss additional strategies for 'buying time' during the patch process. The document then goes on to conclude that, in the face of increasing malicious activity and more complex patching, solid frameworks such as those provided in this document are required to ensure an organisation can fully manage the patching process. However, more research is required to fully understand vulnerabilities and exploits. In particular, more attention must be paid to threats, as little work has been done to fully understand threat-agent capabilities and activities on a day-to-day basis.
- Full Text:
- Date Issued: 2007
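The risk-management step in the patch policy described above can be made concrete with a toy scoring function that ranks pending patches. This particular scheme (severity × exposure × criticality) is an assumption for illustration, not the framework the thesis actually develops; real policies would also weigh downtime, testing effort and vendor schedules.

```python
def prioritize(patches):
    """
    Rank pending patches by a simple risk product: vulnerability
    severity (0-10), host exposure (0-1) and asset criticality (0-1).
    Highest-risk patches are returned first.
    """
    return sorted(
        patches,
        key=lambda p: p["severity"] * p["exposure"] * p["criticality"],
        reverse=True,
    )
```

A prioritised list like this is what turns a fixed monthly release schedule into a remediation order an administrator can act on within limited maintenance windows.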
Log analysis aided by latent semantic mapping
- Authors: Buys, Stephanus
- Date: 2013 , 2013-04-14
- Subjects: Latent semantic indexing , Data mining , Computer networks -- Security measures , Computer hackers , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4575 , http://hdl.handle.net/10962/d1002963 , Latent semantic indexing , Data mining , Computer networks -- Security measures , Computer hackers , Computer security
- Description: In an age of zero-day exploits and increased on-line attacks on computing infrastructure, operational security practitioners are becoming increasingly aware of the value of the information captured in log events. Analysis of these events is critical during incident response, forensic investigations related to network breaches, hacking attacks and data leaks. Such analysis has led to the discipline of Security Event Analysis, also known as Log Analysis. There are several challenges when dealing with events, foremost being the increased volumes at which events are often generated and stored. Furthermore, events are often captured as unstructured data, with very little consistency in the formats or contents of the events. In this environment, security analysts and implementers of Log Management (LM) or Security Information and Event Management (SIEM) systems face the daunting task of identifying, classifying and disambiguating massive volumes of events in order for security analysis and automation to proceed. Latent Semantic Mapping (LSM) is a proven paradigm shown to be an effective method of, among other things, enabling word clustering, document clustering, topic clustering and semantic inference. This research is an investigation into the practical application of LSM in the discipline of Security Event Analysis, showing the value of using LSM to assist practitioners in identifying types of events, classifying events as belonging to certain sources or technologies, and disambiguating different events from each other. The culmination of this research presents adaptations to traditional natural language processing techniques that resulted in improved efficacy of LSM when dealing with Security Event Analysis. This research provides strong evidence supporting the wider adoption and use of LSM, as well as further investigation into Security Event Analysis assisted by LSM and other natural language processing or machine learning techniques.
- Full Text:
- Date Issued: 2013
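The latent semantic technique underlying the work above can be sketched in a few lines: build a term-document matrix over log lines, factor it with a singular value decomposition, and compare events in the reduced space, where events from the same source cluster together. This toy version (raw term counts, no weighting or smoothing) is an assumption for illustration; the thesis applies far more elaborate preprocessing to real event streams.

```python
import numpy as np

def latent_space(docs, k=2):
    """Project documents (log lines) into a k-dimensional latent space."""
    vocab = sorted({w for d in docs for w in d.split()})
    # term-document count matrix: rows are words, columns are documents
    A = np.array([[d.split().count(w) for d in docs] for w in vocab],
                 dtype=float)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # document coordinates along the k strongest latent dimensions
    return (np.diag(s[:k]) @ Vt[:k]).T

def cosine(a, b):
    """Cosine similarity between two latent-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

For example, two sshd authentication-failure lines end up far more similar to each other in the latent space than either is to an unrelated httpd access line, which is the clustering behaviour the research exploits for event identification.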
NeGPAIM : a model for the proactive detection of information security intrusions, utilizing fuzzy logic and neural network techniques
- Authors: Botha, Martin
- Date: 2003
- Subjects: Computer security , Fuzzy logic , Neural networks (Computer science)
- Language: English
- Type: Thesis , Doctoral , DTech (Computer Studies)
- Identifier: vital:10792 , http://hdl.handle.net/10948/142 , Computer security , Fuzzy logic , Neural networks (Computer science)
- Description: “Information is the lifeblood of any organisation and everything an organisation does involves using information in some way” (Peppard, 1993, p.5). Therefore, it can be argued that information is an organisation’s most precious asset and as with all other assets, like equipment, money, personnel, and so on, this asset needs to be protected properly at all times (Whitman & Mattord, 2003, pp.1-14). The introduction of modern technologies, such as e-commerce, will not only increase the value of information, but will also increase security requirements of those organizations that are intending to utilize such technologies. Evidence of these requirements can be observed in the 2001 CSI/FBI Computer Crime and Security Survey (Power, 2001). According to this source, the annual financial losses caused through security breaches in 2001 have increased by 277% when compared to the results from 1997. The 2002 and 2003 Computer Crime and Security Survey confirms this by stating that the threat of computer crime and other related information security breaches continues unabated and that the financial toll is mounting (Richardson, 2003). Information is normally protected by means of a process of identifying, implementing, managing and maintaining a set of information security controls, countermeasures or safeguards (GMITS, 1998). In the rest of this thesis, the term security controls will be utilized when referring to information protection mechanisms or procedures. These security controls can be of a physical (for example, door locks), a technical (for example, passwords) and/or a procedural nature (for example, to make back-up copies of critical files)(Pfleeger, 2003, pp.22-23; Stallings, 1995, p.1). 
The effective identification, implementation, management and maintenance of this set of security controls are usually integrated into an Information Security Management Program, the objective of which is to ensure an acceptable level of information confidentiality, integrity and availability within the organisation at all times (Pfleeger, 2003, pp.10-12; Whitman & Mattord, 2003, pp.1-14; Von Solms, 1993). Once the most effective security controls have been identified and implemented, it is important that this level of security be maintained through a process of continued control. For this reason, it is important that proper change management, measurement, audit, monitoring and detection be implemented (Bruce & Dempsey, 1997). Monitoring and detection are important functions and refer to the ability to identify and detect situations where information security policies have been compromised and/or breached or security violations have taken place (BS 7799, 1999; GMITS, 1998; Von Solms, 1993). The Information Security Officer is usually the person responsible for most of the operational tasks in the control process within an Information Security Management Program (Von Solms, 1993). In practice, these tasks could also be performed by a system administrator, network administrator, etc. In the rest of the thesis, the person responsible for these tasks will be referred to as the system administrator. These tasks have proved to be very challenging and demanding. The main reason for this is the rapid advancement of technology in the discipline of Information Technology, for example, the modern distributed computing environment, the Internet, the “freedom” of end-users, the introduction of e-commerce, etc. (Whitman & Mattord, 2003, p.9; Sundaram, 2000, p.1; Moses, 2001, p.6; Allen, 2001, p.1). 
As a result of the importance of this control process, and especially the monitoring and detection tasks, it is vital that the system administrator has proper tools at his/her disposal to perform this task effectively. Many of the tools that are currently available to the system administrator utilise technical controls, such as audit logs and user profiles. Audit logs are normally used to record all events executed on a system. These logs are simply files that record security and non-security related events that take place on a computer system within an organisation. For this reason, these logs can be used by these tools to gain valuable information on security violations, such as intrusions, and, therefore, are able to monitor the current actions of each user (Microsoft, 2002; Smith, 1989, pp. 116-117). User profiles are files that contain information about users’ desktop operating environments and are used by the operating system to structure each user environment so that it is the same each time a user logs onto the system (Microsoft, 2002; Block, 1994, p.54). Thus, a user profile is used to indicate which actions the user is allowed to perform on the system. Both technical controls (audit logs and user profiles) are frequently available in most computer environments (such as UNIX, firewalls, Windows, etc.) (Cooper et al., 1995, p.129). Therefore, seeing that the audit logs record most events taking place on an information system and the user profile indicates the authorized actions of each user, the system administrator could most probably utilise these controls in a more proactive manner.
- Full Text:
- Date Issued: 2003
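The pairing of audit logs and user profiles described in this abstract lends itself to a simple proactive check: compare each logged action against the set of actions the acting user's profile authorises. A minimal sketch, with purely illustrative log and profile formats (not taken from the thesis):

```python
# Sketch: flag audit-log events that fall outside a user's authorised actions.
# The record formats and field names below are hypothetical.

# User profiles: map each user to the set of actions they may perform.
profiles = {
    "alice": {"login", "read_file", "write_file"},
    "bob": {"login", "read_file"},
}

# Audit log: one (user, action) event per entry.
audit_log = [
    ("alice", "login"),
    ("bob", "write_file"),   # not authorised for bob
    ("bob", "read_file"),
]

def find_violations(log, profiles):
    """Return events whose action is not in the acting user's profile."""
    return [(user, action) for user, action in log
            if action not in profiles.get(user, set())]

violations = find_violations(audit_log, profiles)
print(violations)  # -> [('bob', 'write_file')]
```

Unknown users fall back to an empty profile, so any action they perform is flagged, which is the conservative choice for a monitoring tool.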
NetwIOC: a framework for the automated generation of network-based IOCs for malware information sharing and defence
- Authors: Rudman, Lauren Lynne
- Date: 2018
- Subjects: Malware (Computer software) , Computer networks Security measures , Computer security , Python (Computer program language)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/60639 , vital:27809
- Description: With the substantial number of new malware variants found each day, it is useful to have an efficient way to retrieve Indicators of Compromise (IOCs) from the malware in a format suitable for sharing and detection. In the past, these indicators were manually created after inspection of binary samples and network traffic. The Cuckoo Sandbox is an existing dynamic malware analysis system which meets the requirements for the proposed framework and was extended by adding a few custom modules. This research explored a way to automate the generation of detailed network-based IOCs in a popular format which can be used for sharing. This was done through careful filtering and analysis of the PCAP file generated by the sandbox, and placing these values into the correct type of STIX objects using Python. Through several evaluations, analysis of what type of network traffic can be expected for the creation of IOCs was conducted, including a brief case study that examined the effect of analysis time on the number of IOCs created. Using the automatically generated IOCs to create defence and detection mechanisms for the network was evaluated and proved successful. A proof of concept sharing platform developed for the STIX IOCs is showcased at the end of the research.
- Full Text:
- Date Issued: 2018
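The core step the abstract describes, placing values extracted from sandbox network traffic into STIX objects, can be sketched by wrapping each observable in a STIX 2.1-style Indicator pattern. The observable values and the filtering are illustrative assumptions; this is not NetwIOC's actual module code:

```python
import json
import uuid
from datetime import datetime, timezone

# Sketch: wrap network observables (e.g. pulled from a Cuckoo PCAP) into
# STIX 2.1-style Indicator objects. Values below are hypothetical.

def make_indicator(pattern, name):
    """Build a minimal STIX 2.1 Indicator as a plain dict."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
    }

# Hypothetical observables extracted from the sandbox's network capture:
observables = {
    "ipv4-addr:value": ["203.0.113.7"],
    "domain-name:value": ["malicious.example"],
}

indicators = [
    make_indicator(f"[{obj_type} = '{value}']", f"Contacted {value}")
    for obj_type, values in observables.items()
    for value in values
]

print(json.dumps(indicators[0], indent=2))
```

In practice one would emit these through a STIX library rather than raw dicts, but the pattern-string construction is the part that maps PCAP-derived values to shareable IOCs.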
Phishing within e-commerce: reducing the risk, increasing the trust
- Authors: Megaw, Gregory M
- Date: 2010
- Subjects: Phishing , Identity theft -- Prevention , Electronic commerce , Computer security , Internet -- Safety measures
- Language: English
- Type: Thesis , Masters , MCom (Information Systems)
- Identifier: vital:11131 , http://hdl.handle.net/10353/376 , Phishing , Identity theft -- Prevention , Electronic commerce , Computer security , Internet -- Safety measures
- Description: E-Commerce has been plagued with problems since its inception and this study examines one of these problems: the lack of user trust in E-Commerce created by the risk of phishing. Phishing has grown exponentially together with the expansion of the Internet. This growth and the advancement of technology has not only benefited honest Internet users, but has enabled criminals to increase their effectiveness, which has caused considerable damage to this budding area of commerce. Moreover, it has negatively impacted both the user and online business by breaking down the trust relationship between them. In an attempt to explore this problem, the following was considered: First, E-Commerce’s vulnerability to phishing attacks. By referring to the Common Criteria Security Model, various critical security areas within E-Commerce are identified, as well as the areas of vulnerability and weakness. Second, the methods and techniques used in phishing, such as phishing e-mails, websites and addresses, distributed attacks and redirected attacks, as well as the data that phishers seek to obtain, are examined. Furthermore, the way to reduce the risk of phishing and in turn increase the trust between users and websites is identified. Here the importance of trust and the Uncertainty Reduction Theory, together with the fine balance between trust and control, is explored. Finally, the study presents Critical Success Factors that aid in phishing prevention and control, these being: User Authentication, Website Authentication, E-mail Authentication, Data Cryptography, Communication, and Active Risk Mitigation.
- Full Text:
- Date Issued: 2010
Pseudo-random access compressed archive for security log data
- Authors: Radley, Johannes Jurgens
- Date: 2015
- Subjects: Computer security , Information storage and retrieval systems , Data compression (Computer science)
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4723 , http://hdl.handle.net/10962/d1020019
- Description: We are surrounded by an increasing number of devices and applications that produce a huge quantity of machine-generated data. Almost all the machine data contains some element of security information that can be used to discover, monitor and investigate security events. The work proposes a pseudo-random access compressed storage method for log data to be used with an information retrieval system that in turn provides the ability to search and correlate log data and the corresponding events. We explain the method for converting log files into distinct events and storing the events in a compressed file. This yields an entry identifier for each log entry that provides a pointer that can be used by indexing methods. The research also evaluates the compression performance penalties encountered by using this storage system, including decreased compression ratio, as well as increased compression and decompression times.
- Full Text:
- Date Issued: 2015
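The storage scheme the abstract outlines, compressed log entries addressable by an entry identifier, is commonly realised by compressing events in blocks and keeping an index from entry ID to (block, offset), so a lookup decompresses only one block. A minimal sketch under those assumptions (block size, framing, and names are illustrative, not the thesis's design):

```python
import zlib

# Sketch of pseudo-random access over compressed log entries: events are
# compressed in fixed-size blocks, and an index maps each entry identifier
# to (block number, offset within block). Fetching an entry decompresses
# only the block that holds it.

BLOCK_ENTRIES = 2  # entries per compressed block (tiny, for demonstration)

def build_archive(entries):
    """Compress entries block by block; return the blocks and an ID index."""
    blocks, index = [], {}
    for start in range(0, len(entries), BLOCK_ENTRIES):
        chunk = entries[start:start + BLOCK_ENTRIES]
        for offset in range(len(chunk)):
            index[start + offset] = (len(blocks), offset)
        blocks.append(zlib.compress("\n".join(chunk).encode()))
    return blocks, index

def fetch(blocks, index, entry_id):
    """Decompress only the block containing entry_id and return the entry."""
    block_no, offset = index[entry_id]
    return zlib.decompress(blocks[block_no]).decode().split("\n")[offset]

logs = ["ev0 login alice", "ev1 fail bob", "ev2 read alice", "ev3 sudo bob"]
blocks, index = build_archive(logs)
print(fetch(blocks, index, 3))  # -> ev3 sudo bob
```

The trade-off the thesis measures falls out of the block size: smaller blocks give cheaper random access but a worse compression ratio, since the compressor sees less context per block.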
Remote fidelity of Container-Based Network Emulators
- Authors: Peach, Schalk Willem
- Date: 2021-10-29
- Subjects: Computer networks Security measures , Intrusion detection systems (Computer security) , Computer security , Host-based intrusion detection systems (Computer security) , Emulators (Computer programs) , Computer network protocols , Container-Based Network Emulators (CBNEs) , Network Experimentation Platforms (NEPs)
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192141 , vital:45199
- Description: This thesis examines if Container-Based Network Emulators (CBNEs) are able to instantiate emulated nodes that provide sufficient realism to be used in information security experiments. The realism measure used is based on the information available from the point of view of a remote attacker. During the evaluation of a Container-Based Network Emulator (CBNE) as a platform to replicate production networks for information security experiments, it was observed that nmap fingerprinting returned Operating System (OS) family and version results inconsistent with that of the host Operating System (OS). CBNEs utilise Linux namespaces, the technology used for containerisation, to instantiate “emulated” hosts for experimental networks. Linux containers partition resources of the host OS to create lightweight virtual machines that share a single OS kernel. As all emulated hosts share the same kernel in a CBNE network, there is a reasonable expectation that the fingerprints of the host OS and emulated hosts should be the same. Based on how CBNEs instantiate emulated networks and that fingerprinting returned inconsistent results, it was hypothesised that the technologies used to construct CBNEs are capable of influencing fingerprints generated by utilities such as nmap. It was predicted that hosts emulated using different CBNEs would show deviations in remotely generated fingerprints when compared to fingerprints generated for the host OS. An experimental network consisting of two emulated hosts and a Layer 2 switch was instantiated on multiple CBNEs using the same host OS. Active and passive fingerprinting was conducted between the emulated hosts to generate fingerprints and OS family and version matches. Passive fingerprinting failed to produce OS family and version matches as the fingerprint databases for these utilities are no longer maintained. 
For active fingerprinting, the OS family results were consistent between tested systems and the host OS, though the OS version results reported were inconsistent. A comparison of the generated fingerprints revealed that for certain CBNEs fingerprint features related to network stack optimisations of the host OS deviated from other CBNEs and the host OS. The hypothesis that CBNEs can influence remotely generated fingerprints was partially confirmed. One CBNE system modified Linux kernel networking options, causing a deviation from fingerprints generated for other tested systems and the host OS. The hypothesis was also partially rejected, as the technologies used by CBNEs do not otherwise influence the remote fidelity of emulated hosts. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
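The fingerprint comparison the abstract describes amounts to diffing remotely observable network-stack features of each emulated host against the host OS. A toy sketch of that comparison; the feature names and values are hypothetical stand-ins for the kinds of TCP/IP options a CBNE might alter (this is not nmap's fingerprint format):

```python
# Sketch: diff remotely observable TCP/IP fingerprint features between a
# host OS and hosts emulated by different CBNEs. Features and values are
# illustrative examples of kernel networking options, not nmap output.

host_os = {"ttl": 64, "window_size": 64240, "tcp_sack": True, "timestamps": True}

emulated = {
    "cbne_a": {"ttl": 64, "window_size": 64240, "tcp_sack": True, "timestamps": True},
    "cbne_b": {"ttl": 64, "window_size": 29200, "tcp_sack": False, "timestamps": True},
}

def deviations(reference, fingerprint):
    """Return features whose observed value differs from the reference OS."""
    return {k: (reference.get(k), v) for k, v in fingerprint.items()
            if reference.get(k) != v}

for name, fp in emulated.items():
    print(name, deviations(host_os, fp))
```

An empty diff (as for `cbne_a`) matches the thesis's expectation for hosts sharing the host kernel; a non-empty diff (as for `cbne_b`) models a CBNE that tunes kernel networking options and thereby shifts the remote fingerprint.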
The cost of free instant messaging: an attack modelling perspective
- Authors: Du Preez, Riekert
- Date: 2006
- Subjects: Computer security , Instant messaging , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9797 , http://hdl.handle.net/10948/499 , http://hdl.handle.net/10948/d1011921 , Computer security , Instant messaging , Data protection
- Description: Instant Messaging (IM) has grown tremendously over the last few years. Even though IM was originally developed as a social chat system, it has found a place in many companies, where it is being used as an essential business tool. However, many businesses rely on free IM and have not implemented a secure corporate IM solution. Most free IM clients were never intended for use in the workplace and, therefore, lack strong security features and administrative control. Consequently, free IM clients can provide attackers with an entry point for malicious code in an organization’s network that can ultimately lead to a company’s information assets being compromised. Therefore, even though free IM allows for better collaboration in the workplace, it comes at a cost, as the title of this dissertation suggests. This dissertation sets out to answer the question of how free IM can facilitate an attack on a company’s information assets. To answer the research question, the dissertation defines an IM attack model that models the ways in which an information system can be attacked when free IM is used within an organization. The IM attack model was created by categorising IM threats using the STRIDE threat classification scheme. The attacks that realize the categorised threats were then modelled using attack trees as the chosen attack modelling tool. Attack trees were chosen because of their ability to model the sequence of attacker actions during an attack. The author defined an enhanced graphical notation that was adopted for the attack trees used to create the IM attack model. The enhanced attack tree notation extends traditional attack trees to allow nodes in the trees to be of different classes and, therefore, allows attack trees to convey more information. During the process of defining the IM attack model, a number of experiments were conducted where IM vulnerabilities were exploited. 
Thereafter, a case study was constructed to document a simulated attack on an information system that involves the exploitation of IM vulnerabilities. The case study demonstrates how an attacker’s attack path relates to the IM attack model in a practical scenario. The IM attack model provides insight into how IM can facilitate an attack on a company’s information assets. The creation of the attack model for free IM led to several realizations. The IM attack model revealed that even though the use of free IM clients may seem harmless, such IM clients can facilitate an attack on a company’s information assets. Furthermore, certain IM vulnerabilities may not pose a great risk by themselves, but when combined with the exploitation of other vulnerabilities, a much greater threat can be realized. These realizations hold true to what French playwright Jean Anouilh once said: “What you get free costs too much”.
- Full Text:
- Date Issued: 2006
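Attack trees of the kind this abstract describes refine a root goal through AND nodes (all sub-attacks required) and OR nodes (any sub-attack suffices). A minimal sketch of that structure and its evaluation; the node classes and the example IM scenario are illustrative, not the dissertation's enhanced notation:

```python
from dataclasses import dataclass, field
from typing import List

# Sketch of an attack tree with AND/OR refinement. The example tree is a
# hypothetical IM attack, not one taken from the dissertation.

@dataclass
class Node:
    label: str
    gate: str = "LEAF"                 # "AND", "OR", or "LEAF"
    children: List["Node"] = field(default_factory=list)

def achievable(node, achieved_leaves):
    """A leaf succeeds if the attacker achieved it; an AND node needs all
    children to succeed, an OR node needs at least one."""
    if node.gate == "LEAF":
        return node.label in achieved_leaves
    results = [achievable(c, achieved_leaves) for c in node.children]
    return all(results) if node.gate == "AND" else any(results)

# Compromise via IM requires reaching the user (either delivery path works)
# AND the user executing the payload.
root = Node("compromise via IM", "AND", [
    Node("reach user", "OR", [Node("file transfer"), Node("malicious link")]),
    Node("user executes payload"),
])

print(achievable(root, {"malicious link", "user executes payload"}))  # -> True
```

The AND node captures the abstract's point that individually low-risk vulnerabilities combine into a greater threat: neither delivery nor execution alone reaches the root goal, but together they do.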
- Authors: Du Preez, Riekert
- Date: 2006
- Subjects: Computer security , Instant messaging , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9797 , http://hdl.handle.net/10948/499 , http://hdl.handle.net/10948/d1011921 , Computer security , Instant messaging , Data protection
- Description: Instant Messaging (IM) has grown tremendously over the last few years. Even though IM was originally developed as a social chat system, it has found a place in many companies, where it is being used as an essential business tool. However, many businesses rely on free IM and have not implemented a secure corporate IM solution. Most free IM clients were never intended for use in the workplace and, therefore, lack strong security features and administrative control. Consequently, free IM clients can provide attackers with an entry point for malicious code in an organization’s network that can ultimately lead to a company’s information assets being compromised. Therefore, even though free IM allows for better collaboration in the workplace, it comes at a cost, as the title of this dissertation suggests. This dissertation sets out to answer the question of how free IM can facilitate an attack on a company’s information assets. To answer the research question, the dissertation defines an IM attack model that models the ways in which an information system can be attacked when free IM is used within an organization. The IM attack model was created by categorising IM threats using the STRIDE threat classification scheme. The attacks that realize the categorised threats were then modelled using attack trees as the chosen attack modelling tool. Attack trees were chosen because of their ability to model the sequence of attacker actions during an attack. The author defined an enhanced graphical notation that was adopted for the attack trees used to create the IM attack model. The enhanced attack tree notation extends traditional attack trees to allow nodes in the trees to be of different classes and, therefore, allows attack trees to convey more information. During the process of defining the IM attack model, a number of experiments were conducted where IM vulnerabilities were exploited. 
Thereafter, a case study was constructed to document a simulated attack on an information system that involves the exploitation of IM vulnerabilities. The case study demonstrates how an attacker’s attack path relates to the IM attack model in a practical scenario. The IM attack model provides insight into how IM can facilitate an attack on a company’s information assets. The creation of the attack model for free IM lead to several realizations. The IM attack model revealed that even though the use of free IM clients may seem harmless, such IM clients can facilitate an attack on a company’s information assets. Furthermore, certain IM vulnerabilities may not pose a great risk by themselves, but when combined with the exploitation of other vulnerabilities, a much greater threat can be realized. These realizations hold true to what French playwright Jean Anouilh once said: “What you get free costs too much”.
- Full Text:
- Date Issued: 2006
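The abstract above describes modelling attacks as trees whose nodes combine attacker actions with AND/OR logic. As a rough illustration of the underlying idea (not the dissertation's enhanced notation, and with hypothetical node names), a basic attack tree can be sketched as:

```python
# Basic AND/OR attack tree sketch. Node labels and the example tree are
# hypothetical illustrations, not taken from the dissertation's IM attack model.

class AttackNode:
    """A node in an attack tree. Leaf nodes represent attacker actions;
    inner nodes combine their children with AND (all required) or OR (any one)."""

    def __init__(self, label, gate="OR", children=None, achieved=False):
        self.label = label
        self.gate = gate          # "AND" or "OR" for inner nodes
        self.children = children or []
        self.achieved = achieved  # for leaves: has this action succeeded?

    def realized(self):
        """Return True if this node's (sub)goal is achieved."""
        if not self.children:
            return self.achieved
        results = [c.realized() for c in self.children]
        return all(results) if self.gate == "AND" else any(results)

# Hypothetical root goal: compromise information assets via a free IM client.
root = AttackNode("Compromise information assets", "AND", [
    AttackNode("Deliver malicious payload over IM", "OR", [
        AttackNode("Send infected file transfer", achieved=True),
        AttackNode("Send link to drive-by download"),
    ]),
    AttackNode("Victim executes payload", achieved=True),
])

print(root.realized())  # True: one OR branch and its AND sibling both succeed
```

This mirrors the abstract's observation that individually modest vulnerabilities can combine (here, via an AND node) into a much greater threat.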
The effective combating of intrusion attacks through fuzzy logic and neural networks
- Authors: Goss, Robert Melvin
- Date: 2007
- Subjects: Computer security , Fuzzy logic , Neural networks (Computer science)
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9794 , http://hdl.handle.net/10948/512 , http://hdl.handle.net/10948/d1011917 , Computer security , Fuzzy logic , Neural networks (Computer science)
- Description: Properly securing an organization’s information and computing resources has become paramount in modern business. Since the advent of the Internet, securing this organizational information has become increasingly difficult. Organizations deploy many security mechanisms to protect their data; intrusion detection systems in particular have an increasingly valuable role to play, and as networks grow, administrators need better ways to monitor their systems. Currently, many intrusion detection systems lack the means to accurately monitor and report on wireless segments within the corporate network. This dissertation proposes an extension to the NeGPAIM model, known as NeGPAIM-W, which allows for the accurate detection of attacks originating on wireless network segments. The NeGPAIM-W model is able to detect both wired and wireless-based attacks and, with the aforementioned extensions to the original model, also provides for the correlation of intrusion attacks sourced on both wired and wireless network segments, yielding a holistic detection strategy for an organization. Attack detection is accomplished through the use of fuzzy logic and neural networks. The model works on the assumption that each user has, and leaves, a unique footprint on a computer system; thus, all intrusive behaviour on the system and the networks which support it can be traced back to the user account that was used to perform it.
- Full Text:
- Date Issued: 2007
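The NeGPAIM abstract describes scoring user behaviour against a unique footprint using fuzzy logic. As a generic sketch of how fuzzy membership functions can grade such indicators (the indicator names, membership shapes, and thresholds here are hypothetical, not the NeGPAIM implementation):

```python
# Generic fuzzy scoring sketch for behavioural intrusion indicators.
# Indicators, membership function shapes, and thresholds are hypothetical.

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def suspicion(login_hour_deviation, failed_logins):
    """Fuzzy-OR (max) of two hypothetical indicators, each mapped to a
    'suspicious' membership degree in [0, 1]."""
    # Deviation from the user's usual login hour, in hours (peak at 12).
    time_score = triangular(login_hour_deviation, 2.0, 12.0, 22.0)
    # Consecutive failed login attempts (peak at 10).
    fail_score = triangular(failed_logins, 1.0, 10.0, 19.0)
    return max(time_score, fail_score)

print(suspicion(0.5, 0))   # 0.0 -> behaviour matches the user's footprint
print(suspicion(7.0, 6))   # nonzero -> deviation from the footprint
```

Taking the maximum is the standard fuzzy-OR: either indicator deviating from the user's footprint raises the overall suspicion degree, which could then feed a downstream classifier such as a neural network.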