A national strategy towards cultivating a cybersecurity culture in South Africa
- Authors: Gcaza, Noluxolo
- Date: 2017
- Subjects: Computer networks -- Security measures , Cyberspace -- Security measures , Computer security -- South Africa , Subculture -- South Africa
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10948/13735 , vital:27303
- Description: In modern society, cyberspace is interwoven into the daily lives of many. Cyberspace is increasingly redefining how people communicate, as well as how they gain access to and share information. Technology has transformed the way the business world operates by introducing new ways of trading goods and services whilst bolstering traditional business methods. It has also altered the way nations govern. Thus, individuals, organisations and nations rely on this technology to perform significant functions. Alongside the positive innovations afforded by cyberspace, however, those who use it are exposed to a variety of risks. Cyberspace is beset by criminal activities such as fraud and identity theft, to name but a few. Nonetheless, the negative impact of these cyber threats does not outweigh the advantages of cyberspace. In light of such threats, there is a call for all entities that reap the benefits of online services to institute cybersecurity. As such, cybersecurity is a necessity for individuals, organisations and nations alike. In practice, cybersecurity focuses on preventing and mitigating security risks that might compromise relevant assets. For a long time, technology-centred measures have been deemed the most significant solution for mitigating such risks. However, after a legacy of unsuccessful technological efforts, it became clear that such solutions in isolation are insufficient to mitigate all cyber-related risks. This is mainly due to the role that humans play in the security process: the human factor. In isolation, technology-centred measures tend to fail to counter the human factor because many users perceive security measures as an obstacle and consequently a waste of time. This perception can be attributed to the perceived difficulty of the security measure, as well as apparent mistrust and misinterpretation of the measure.
Hence, cybersecurity necessitates a solution that encourages acceptable user behaviour in the reality of cyberspace. The cultivation of a cybersecurity culture is thus regarded as the best approach for addressing the human factors that weaken the cybersecurity chain. While the role of culture in pursuing cybersecurity is well appreciated, research focused on defining and measuring cybersecurity culture is still in its infancy. Furthermore, studies have shown that there are no widely accepted key concepts that delimit such a culture. However, the notion that such a culture is not well delineated has not prevented national governments from pursuing a culture in which all citizens behave in a way that promotes cybersecurity. As a result, many countries now run national cybersecurity campaigns to foster a culture of cybersecurity at a national level. South Africa is among the nations that have identified cultivating a culture of cybersecurity as a strategic priority. However, there is an apparent lack of a practical plan for cultivating such a culture in South Africa. Thus, this study sought firstly to confirm from the existing body of knowledge that cybersecurity culture is indeed ill-defined and, secondly, to delineate what constitutes a national cybersecurity culture. Finally, and primarily, it sought to devise a national strategy that would assist South Africa in fulfilling its objective of cultivating a culture of cybersecurity at a national level.
- Full Text:
An Information Security Policy Compliance Reinforcement and Assessment Framework
- Authors: Gundu, Tapiwa
- Date: 2017
- Subjects: Computer security , Information technology -- Security measures , Business -- Data processing -- Security measures , Computer networks -- Security measures
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10353/9556 , vital:34445
- Description: The majority of SMEs have adopted information and communication technology (ICT) services. However, this has exposed their systems to new internal and external security vulnerabilities. These SMEs seem more concerned with vulnerabilities related to external threats than with internal threats, although researchers and industry suggest that a substantial proportion of security incidents originate from insiders. The internal threat is often addressed by, firstly, a security policy to direct activities and, secondly, organisational information security training and awareness programmes. These two approaches aim to ensure that employees are proficient in their roles and know how to carry out their responsibilities securely. A significant amount of research has been conducted to ensure that information security programmes communicate the information security policy effectively and reinforce sound security practice. However, an assessment of the genuine effectiveness of such programmes is seldom carried out. The purposes of this research study were, firstly, to highlight the flaws in assessing behavioural intentions and equating such intentions with actual behaviours in information security and, secondly, to present an information security policy compliance reinforcement and assessment framework which assists in promoting the conversion of intentions into actual behaviours and in assessing the behavioural change. The approach used was based on the Theory of Planned Behaviour, knowledge-attitude-behaviour theory and Deterrence Theory. Expert review and action research methods were used to validate and refine the framework. The action research was rigorously conducted in four iterations at an SME in South Africa and involved 30 participating employees.
The main findings of the study revealed that even though employees may be well trained and aware of good information security practice, they may be either unable or unwilling to comply with it. The findings also revealed that awareness drives which lead to secure behavioural intentions are merely a first step in information security compliance. The study found that not all behavioural intentions converted into actual secure behaviours; only 64% did. However, deterrence, using rewards for good behaviour and punishment for undesirable behaviour, was able to increase the conversion by 21%.
- Full Text:
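The intention-to-behaviour conversion rate reported above can be illustrated with a small sketch. The employee counts below are hypothetical, chosen only so that the arithmetic reproduces the 64% figure the abstract cites:

```python
# Hypothetical illustration of the intention-to-behaviour conversion rate.
# The counts are invented; the thesis itself worked with 30 employees
# over four action research iterations.

def conversion_rate(intended: int, acted: int) -> float:
    """Fraction of secure behavioural intentions that became actual behaviours."""
    if intended == 0:
        raise ValueError("no recorded intentions")
    return acted / intended

# Example: if 25 employees intended to comply and 16 actually did,
# the conversion rate is 64%.
rate = conversion_rate(25, 16)
print(f"conversion: {rate:.0%}")  # → conversion: 64%
```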
Digital forensic model for computer networks
- Authors: Sanyamahwe, Tendai
- Date: 2011
- Subjects: Computer crimes -- Investigation , Evidence, Criminal , Computer networks -- Security measures , Electronic evidence , Forensic sciences , Internet -- Security measures
- Language: English
- Type: Thesis , Masters , MCom (Information Systems)
- Identifier: vital:11127 , http://hdl.handle.net/10353/d1000968 , Computer crimes -- Investigation , Evidence, Criminal , Computer networks -- Security measures , Electronic evidence , Forensic sciences , Internet -- Security measures
- Description: The Internet has become important since information is now stored in digital form and is transported, both within and between organisations, in large amounts through computer networks. Nevertheless, there are individuals and groups who utilise the Internet to harm businesses, because they can remain relatively anonymous. To prosecute such criminals, forensic practitioners have to follow a well-defined procedure to convict responsible cyber-criminals in a court of law. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics that utilises evidence from network devices such as firewalls, switches and routers. Network log mining is a process supported by prevailing South African laws such as the Computer Evidence Act, 57 of 1983; the Electronic Communications and Transactions (ECT) Act, 25 of 2002; and the Electronic Communications Act, 36 of 2005. Internationally, laws and regulations supporting network log mining include the Sarbanes-Oxley Act and the Foreign Corrupt Practices Act (FCPA) of the USA, and the Bribery Act of the UK. A digital forensic model for computer networks focusing on network log mining has been developed based on the literature reviewed and critical thought. The development of the model followed the Design Science methodology. However, this research project argues that some important aspects are not fully addressed by the prevailing South African legislation supporting digital forensic investigations. With that in mind, this research project proposes a set of Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and its assimilation into the community.
The model was sent to IT experts for validation, which provided the qualitative element and the primary data of this research project. These experts found the proposed model to be unique and comprehensive, adding new knowledge to the field of Information Technology. A paper was also written from this research project.
- Full Text:
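The network log mining step at the heart of the model can be sketched as follows. The log format, timestamps and addresses below are hypothetical, since real firewalls, switches and routers each emit their own formats:

```python
import re
from collections import defaultdict

# A minimal sketch of one log mining step: extracting a per-source-IP
# event timeline from device logs, the kind of evidence used to trace
# cyber-criminals. The line format here is invented for illustration.
LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}) "
    r"(?P<action>ALLOW|DENY) src=(?P<src>\d{1,3}(?:\.\d{1,3}){3})"
)

def mine(lines):
    """Group timestamped firewall events by source IP address."""
    timeline = defaultdict(list)
    for line in lines:
        m = LINE.search(line)
        if m:
            timeline[m.group("src")].append((m.group("ts"), m.group("action")))
    return dict(timeline)

sample = [
    "2011-03-01T10:15:00 DENY src=192.0.2.7 dst=10.0.0.5 port=22",
    "2011-03-01T10:15:04 DENY src=192.0.2.7 dst=10.0.0.5 port=23",
    "2011-03-01T11:02:11 ALLOW src=198.51.100.9 dst=10.0.0.8 port=80",
]
events = mine(sample)
print(events["192.0.2.7"])  # two DENY events from the same source
```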
Limiting vulnerability exposure through effective patch management: threat mitigation through vulnerability remediation
- Authors: White, Dominic Stjohn Dolin
- Date: 2007 , 2007-02-08
- Subjects: Computer networks -- Security measures , Computer viruses , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4629 , http://hdl.handle.net/10962/d1006510 , Computer networks -- Security measures , Computer viruses , Computer security
- Description: This document aims to provide a complete discussion of vulnerability and patch management. The first chapters look at the trends relating to vulnerabilities, exploits, attacks and patches. These trends describe the drivers of patch and vulnerability management and situate the discussion in the current security climate. The following chapters then aim to present both policy and technical solutions to the problem. The policies described lay out a comprehensive set of steps that can be followed by any organisation to implement its own patch management policy, including practical advice on integration with other policies, managing risk, identifying vulnerabilities, strategies for reducing downtime, and generating metrics to measure progress. Having covered the steps that can be taken by users, a strategy describing how best a vendor should implement a related patch release policy is provided. An argument is made that current monthly patch release schedules are inadequate to allow users to mitigate vulnerabilities most effectively and timeously. The final chapters discuss the technical aspects of automating parts of the policies described. In particular, the concept of 'defense in depth' is used to discuss additional strategies for 'buying time' during the patch process. The document then concludes that, in the face of increasing malicious activity and more complex patching, solid frameworks such as those provided in this document are required to ensure an organisation can fully manage the patching process. However, more research is required to fully understand vulnerabilities and exploits. In particular, more attention must be paid to threats, as little work has been done to fully understand threat-agent capabilities and activities on a day-to-day basis.
- Full Text:
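One of the progress metrics the policy chapters call for, the exposure window between vulnerability disclosure and patch deployment, can be sketched as follows. The vulnerability names and dates are invented for illustration:

```python
from datetime import date

# A sketch of a simple patch management metric: days of exposure from
# vulnerability disclosure to patch deployment, and the mean across
# vulnerabilities. Identifiers and dates are hypothetical.

def exposure_days(disclosed: date, patched: date) -> int:
    """Number of days a vulnerability remained unpatched."""
    return (patched - disclosed).days

window = {
    "VULN-A": exposure_days(date(2007, 1, 2), date(2007, 1, 12)),
    "VULN-B": exposure_days(date(2007, 1, 9), date(2007, 2, 13)),
}
mean_exposure = sum(window.values()) / len(window)
print(window, mean_exposure)
```

Tracking this number per vulnerability, and watching its mean fall, is one concrete way an organisation can measure the progress of its patch management policy.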
A framework for information security governance in SMMEs
- Authors: Coertze, Jacques Jacobus
- Date: 2012
- Subjects: Business -- Data processing -- Security measures , Management information systems -- Security measures , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9810 , http://hdl.handle.net/10948/d1014083
- Description: It has been found that many small, medium and micro-sized enterprises (SMMEs) do not comply with sound information security governance principles, specifically those involved in drafting information security policies and monitoring compliance, mainly as a result of restricted resources and expertise. Research suggests that this problem occurs worldwide and that its impact on SMMEs is great. The problem is further compounded by the fact that, in the modern information technology environment, many larger organisations are providing SMMEs with access to their networks. This exposes not only the SMMEs to security risks, but the larger organisations as well. In previous research, an information security management framework and toolbox was developed to assist SMMEs in drafting information security policies. Although this research was of some help to SMMEs, further research has shown that an even greater problem exists with the governance of information security, as a result of the advancements identified in information security literature. The aim of this dissertation is therefore to establish an information security governance framework that requires minimal effort and little expertise to alleviate governance problems. It is believed that such a framework would be useful for SMMEs and would result in the improved implementation of information security governance.
- Full Text:
An investigation into interoperable end-to-end mobile web service security
- Authors: Moyo, Thamsanqa
- Date: 2008
- Subjects: Web services , Mobile computing , Smartphones , Internetworking (Telecommunication) , Computer networks -- Security measures , XML (Document markup language) , Microsoft .NET Framework , Java (Computer program language)
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4595 , http://hdl.handle.net/10962/d1004838 , Web services , Mobile computing , Smartphones , Internetworking (Telecommunication) , Computer networks -- Security measures , XML (Document markup language) , Microsoft .NET Framework , Java (Computer program language)
- Description: The capacity to engage in web services transactions on smartphones is growing as these devices become increasingly powerful and sophisticated. This capacity for mobile web services is being realised through mobile applications that consume web services hosted on larger computing devices. This thesis investigates the effect that end-to-end web services security has on the interoperability between mobile web services requesters and traditional web services providers. SOAP web services are the preferred web services approach for this investigation. Although WS-Security is recognised as demanding on mobile hardware and network resources, the selection of appropriate WS-Security mechanisms lessens this burden. An attempt to implement such mechanisms on smartphones is carried out via an experiment conducted on the Java Micro Edition (Java ME) and .NET Compact Framework (.NET CF) smartphone platforms. The experiment shows that the implementation of interoperable, end-to-end, mobile web services security on both platforms is reliant on third-party libraries. This reliance results in poor developer support and exposes developers to the complexity of cryptography. The experiment also shows that no standard message size optimisation libraries are available for either platform. The implementation carried out on the .NET CF is also shown to rely on the underlying operating system. It is concluded that standard WS-Security APIs must be provided on smartphone platforms to avoid the problems of poor developer support and the additional complexity of cryptography. It is recommended that these APIs include a message optimisation technique. It is further recommended that WS-Security APIs be completely operating-system independent when implemented in managed code.
This thesis contributes by: providing a snapshot of mobile web services security; identifying the smartphone platform state of readiness for end-to-end secure web services; and providing a set of recommendations that may improve this state of readiness. These contributions are of increasing importance as mobile web services evolve from a simple point-to-point environment to the more complex enterprise environment.
- Full Text:
A framework for malicious host fingerprinting using distributed network sensors
- Authors: Hunter, Samuel Oswald
- Date: 2018
- Subjects: Computer networks -- Security measures , Malware (Computer software) , Multisensor data fusion , Distributed Sensor Networks , Automated Reconnaissance Framework , Latency Based Multilateration
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/60653 , vital:27811
- Description: Numerous software agents exist and are responsible for the increasing volumes of malicious traffic observed on the Internet today. From a technical perspective, the existing techniques for monitoring malicious agents and traffic were not developed to allow for the interrogation of the source of malicious traffic. This interrogation, or reconnaissance, would be considered active analysis as opposed to the existing, mostly passive analysis. Unlike passive analysis, active techniques are time-sensitive and their results become increasingly inaccurate as the time delta between observation and interrogation increases. In addition, some studies have shown that the geographic separation of hosts on the Internet has resulted in pockets of different malicious agents and traffic targeting victims. As such, it is important to perform any kind of data collection over various sources and in distributed IP address space. The data-gathering and exposure capabilities of sensors such as honeypots and network telescopes were extended through the development of near-realtime Distributed Sensor Network modules that allowed for the near-realtime analysis of malicious traffic from distributed, heterogeneous monitoring sensors. In order to utilise the data exposed by these modules, an Automated Reconnaissance Framework was created; this framework was tasked with active and passive information collection and analysis of data in near-realtime, and was designed from an adapted Multi Sensor Data Fusion model. The hypothesis was that if sufficiently distinct characteristics of a host could be identified, combined they could act as a unique fingerprint for that host, potentially allowing for the re-identification of that host even if its IP address had changed. To this end, the concept of Latency Based Multilateration was introduced, acting as an additional metric for remote host fingerprinting.
The vast amount of information gathered by the AR-Framework required the development of visualisation tools that could illustrate this data in near-realtime and provide various degrees of interaction to accommodate human interpretation of such data. Ultimately, the data collected through the application of the near-realtime Distributed Sensor Network and the AR-Framework provided a unique perspective on a malicious host demographic, allowing new correlations to be drawn between attributes such as common open ports, operating systems, location and the inferred intent of these malicious hosts. The result expands our current understanding of malicious hosts on the Internet and enables further research in the area.
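The Latency Based Multilateration idea described above can be sketched as follows. This is a hypothetical illustration, not the thesis's implementation: each distributed sensor records its round-trip time to a target host, and the resulting latency vector acts as one fingerprint component that can be compared across observations even after the host's IP address changes. Sensor names and values here are invented.

```python
import math

# Hypothetical sketch: each sensor measures round-trip time (ms) to a
# target host; the ordered latency vector is one fingerprint component.

def latency_fingerprint(rtts_by_sensor):
    """Return a latency vector ordered by sensor name."""
    return tuple(rtts_by_sensor[name] for name in sorted(rtts_by_sensor))

def similarity(fp_a, fp_b):
    """Inverse Euclidean distance between two latency vectors (1.0 = identical)."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(fp_a, fp_b)))
    return 1.0 / (1.0 + dist)

# The same host observed twice (IP changed, latencies near-identical)
# versus a different host in a very different network position.
first  = latency_fingerprint({"sensor-za": 180.0, "sensor-eu": 35.0,  "sensor-us": 110.0})
second = latency_fingerprint({"sensor-za": 182.5, "sensor-eu": 33.8,  "sensor-us": 112.1})
other  = latency_fingerprint({"sensor-za": 20.0,  "sensor-eu": 210.0, "sensor-us": 160.0})

assert similarity(first, second) > similarity(first, other)
```

In practice such a latency metric would be fused with other observed characteristics (open ports, operating system, banners) before re-identification is attempted, as a latency vector alone is sensitive to routing changes.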
- Full Text:
Towards a framework for the integration of information security into undergraduate computing curricula
- Date: 2017
- Subjects: Information technology -- Study and teaching , Computer security -- Study and teaching , Educational technology , Computer networks -- Security measures
- Type: Thesis , Masters , MTech
- Identifier: http://hdl.handle.net/10948/13691 , http://vital.seals.ac.za:8080/${Handle} , vital:27296
- Description: Information is an important and valuable asset, in both our everyday lives and in various organisations. Information is subject to numerous threats; these can originate internally or externally to the organisation and could be accidental, intentional or caused by natural disasters. As an important organisational asset, information should be appropriately protected from threats and threat agents regardless of their origin. Organisational employees are, however, often cited as the “weakest link” in the attempt to protect organisational information systems and related information assets. In addition, employees are one of the biggest and closest threat agents to an organisation’s information systems and its security. Upon graduating, computing (Computer Science, Information Systems and Information Technology) graduates typically become organisational employees. Within organisations, computing graduates often take on roles and responsibilities that involve designing, developing, implementing, upgrading and maintaining the information systems that store, process and transmit organisational information assets. It is, therefore, important that these computing graduates possess the necessary information security skills, knowledge and understanding that could enable them to perform their roles and responsibilities in a secure manner. These information security skills, knowledge and understanding can be acquired through information security education obtained through a qualification that is offered at a higher education institution. At many higher education institutions where information security is taught, it is taught as a single, isolated module at the fourth year level of study. The problem with this is that some computing students do not advance to this level, and many of those who do, do not elect information security as a module. 
This means that these students may graduate and be employed by organisations lacking the necessary information security skills, knowledge and understanding to perform their roles and responsibilities securely. Consequently, this could increase the number of employees who are the “weakest link” in securing organisational information systems and related information assets. The ACM, as a key role player that provides educational guidelines for the development of computing curricula, recommends that information security should be pervasively integrated into computing curricula. However, these guidelines and recommendations do not provide sufficient guidance on “how” computing educators can pervasively integrate information security into their modules. Therefore, the problem identified by this research is that “currently, no generally used framework exists to aid the pervasive integration of information security into undergraduate computing curricula”. The primary research objective of this study, therefore, is to develop a framework to aid the pervasive integration of information security into undergraduate computing curricula. In order to meet this objective, secondary objectives were met, namely: To develop an understanding of the importance of information security; to determine the importance of information security education as it relates to undergraduate computing curricula; and to determine computing educators’ perspectives on information security education in a South African context. Various research methods were used to achieve this study’s research objectives. These research methods included a literature review which was used to define and provide an in-depth discussion relating to the domain in which this study is contained, namely: information security and information security education. 
Furthermore, a survey which took the form of semi-structured interviews supported by a questionnaire, was used to elicit computing educators’ perspectives on information security education in a South African context. Argumentation was used to argue towards the proposed framework to aid the pervasive integration of information security into undergraduate computing curricula. In addition, modelling techniques were used to model the proposed framework and scenarios were used to demonstrate how a computing department could implement the proposed framework. Finally, elite interviews supported by a questionnaire were conducted to validate the proposed framework. It is envisaged that the proposed framework could assist computing departments and undergraduate computing educators in the integration of information security into their curricula. Furthermore, the pervasive integration of information security into undergraduate computing curricula could ensure that computing graduates exit higher education institutions possessing the necessary information security skills, knowledge and understanding to enable them to perform their roles and responsibilities securely. It is hoped that this could enable computing graduates to become a stronger link in securing organisational information systems and related assets.
- Full Text:
A framework towards effective control in information security governance
- Authors: Viljoen, Melanie
- Date: 2009
- Subjects: Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9773 , http://hdl.handle.net/10948/887 , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Description: The importance of information in business today has made the need to properly secure this asset evident. Information security has become a responsibility for all managers of an organization. To better support more efficient management of information security, timely information security management information should be made available to all managers. Smaller organizations face special challenges with regard to information security management and reporting due to limited resources (Ross, 2008). This dissertation discusses a Framework for Information Security Management Information (FISMI) that aims to improve the visibility and contribute to better management of information security throughout an organization by enabling the provision of summarized, comprehensive information security management information to all managers in an affordable manner.
- Full Text:
Topic map for representing network security competencies
- Authors: Yekela, Odwa
- Date: 2018
- Subjects: Computer networks , Computer networks -- Security measures , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MIT
- Identifier: http://hdl.handle.net/10948/36368 , vital:33931
- Description: Competencies represent the knowledge, skills and attitudes required for job roles. Organisations need to understand and grow competencies within their workforce in order to be more competitive and to maximise new market opportunities. Competency Management is the process of introducing, managing and enforcing competencies in organisations. Through this process, occupational competencies can be assessed to see if candidates match the required job role expectations. The assessment of competencies can be conceptualised from two perspectives. The first is ‘competency frameworks’, which describe competencies from a high-level overview; as such, they are regarded as the “What” element of competency. The second perspective is ‘competency-based learning’, which focuses on addressing competencies from a more detailed, task-oriented perspective. Competency-based learning is regarded as the “How” element of competency. Currently, there is no available tool that can map the “What” with the “How” element of competency. Such a mapping would provide a more holistic approach to representing competencies. This dissertation adopts the topic map standard in order to demonstrate a holistic approach to mapping competencies, specifically in network security. This is accomplished through the design and evaluation of a Design Science artefact. In this research process a topic map data model was constructed by mapping the “What” and “How” elements together. To demonstrate the applicability of the model, it was implemented in a Computer Security Incident Response Team (CSIRT) recruitment scenario. The aim of this demonstration was to prove that the topic map could be implemented in an organisational context.
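The mapping this abstract describes, linking a high-level competency (the "What") to the concrete tasks that evidence it (the "How"), can be illustrated with a minimal topic-map-style structure. All topic and association names below are hypothetical, invented for illustration; they are not taken from the thesis's actual data model.

```python
# Hypothetical topic-map-style association between a competency
# (the "What") and the tasks that evidence it (the "How"),
# loosely modelled on a CSIRT recruitment scenario.
topic_map = {
    "topics": {
        "incident-triage":   {"type": "competency"},  # the "What"
        "classify-alert":    {"type": "task"},        # the "How"
        "escalate-incident": {"type": "task"},
    },
    "associations": [
        {"type": "evidenced-by", "competency": "incident-triage", "task": "classify-alert"},
        {"type": "evidenced-by", "competency": "incident-triage", "task": "escalate-incident"},
    ],
}

def tasks_for(competency):
    """Resolve the 'How' tasks associated with a 'What' competency."""
    return [a["task"] for a in topic_map["associations"]
            if a["type"] == "evidenced-by" and a["competency"] == competency]

assert tasks_for("incident-triage") == ["classify-alert", "escalate-incident"]
```

A full topic map (as in the ISO/IEC 13250 standard) would also carry occurrences and scopes; this sketch shows only the topic/association pairing that joins the two competency perspectives.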
- Full Text:
Distributed authentication for resource control
- Authors: Burdis, Keith Robert
- Date: 2000
- Subjects: Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4630 , http://hdl.handle.net/10962/d1006512 , Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Description: This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and provide additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys. We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms, we developed the Secure Remote Password SASL mechanism (SRP-SASL), which provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
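The Secure Remote Password (SRP) protocol underlying mechanisms such as SRPGM and SRP-SASL lets a server store a password *verifier* rather than the password itself: the client derives x = H(salt ‖ password) and the server keeps v = g^x mod N. The sketch below illustrates only that verifier derivation, with a deliberately toy-sized group; real SRP deployments use the large standardised safe-prime groups, and the full protocol adds the ephemeral key exchange that this sketch omits.

```python
import hashlib
import os

# Toy-sized sketch of the SRP verifier computation. N is a small
# Mersenne prime for illustration only; real SRP uses the large
# standardised group parameters.
N = 2**127 - 1
g = 3

def compute_verifier(salt: bytes, password: bytes) -> int:
    """Derive v = g^x mod N with x = H(salt || password)."""
    x = int.from_bytes(hashlib.sha256(salt + password).digest(), "big") % N
    return pow(g, x, N)

salt = os.urandom(16)
v_a = compute_verifier(salt, b"hunter2")
v_b = compute_verifier(salt, b"hunter2")
v_c = compute_verifier(salt, b"hunter3")

assert v_a == v_b  # deterministic for the same salt and password
assert v_a != v_c  # a different password yields a different verifier
```

Because the server holds only (salt, v), a stolen credential database does not directly expose passwords, and no long-term asymmetric keys are required, which is the property the abstract highlights for SRPGM.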
- Full Text:
Operational risk model for MSEs: impact on organisational information communication technology
- Authors: Bayaga, Anass
- Date: 2011
- Subjects: Risk management -- Statistical methods , Computer networks -- Security measures , Risk assessment
- Language: English
- Type: Thesis , Masters , M Comm
- Identifier: http://hdl.handle.net/10353/8332 , vital:32270
- Description: The aim of the study was to investigate the impact of Information Communication Technology Operational Risk Management (ICT ORM) on the performance of a Medium Small Enterprise (MSE). The study was based upon a survey design to collect the primary data from 107 respondents using simple random sampling. The research instrument was administered online. A one-stage normative model, associative in nature, was developed based upon a review of previous research and in line with the research findings. The model elicited five factors based upon the multiple regression analysis of the data: principal causes of ORM failure related to ICT; change management requirements and ICT risk; characteristic(s) of information; challenges posed by ORM solutions; and evaluation models affecting ICT adoption within MSEs. Based on the methodologies used in this study, including factor analysis and multivariate regression analysis, it is recommended that this model be applied to monitor these changes more closely and to measure the changing strategies and associated factors, such as insufficient or improper user participation in the systems development process, identified as potential barriers to the effective adoption and implementation of ICT within an MSE.
- Full Text:
A cyber security awareness and education framework for South Africa
- Authors: Kortjan, Noloxolo
- Date: 2013
- Subjects: Computer networks -- Security measures , Computer crimes -- Prevention , Computer security
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9811 , http://hdl.handle.net/10948/d1014829
- Description: The Internet is becoming increasingly interwoven in the daily life of many individuals, organisations and nations. It has, to a large extent, had a positive effect on the way people communicate. It has also introduced new avenues for business and has offered nations an opportunity to govern online. Nevertheless, although cyberspace offers an endless list of services and opportunities, it is also accompanied by many risks. One of these risks is cybercrime. The Internet has given criminals a platform on which to grow and proliferate. As a result of the abstract nature of the Internet, it is easy for these criminals to go unpunished. Moreover, many who use the Internet are not aware of such threats; therefore they may themselves be at risk, together with businesses and governmental assets and infrastructure. In view of this, there is a need for cyber security awareness and education initiatives that will cultivate users who are well versed in the risks associated with the Internet. In this context, it is the role of the government to empower all levels of society by providing the necessary knowledge and expertise to act securely online. However, there is a definite lack in South Africa (SA) in this regard, as there are currently no government-led cyber security awareness and education initiatives. The primary research objective of this study, therefore, is to propose a cyber security awareness and education framework for SA that will assist in creating a cyber-secure culture among all of its Internet users.
- Full Text:
Bolvedere: a scalable network flow threat analysis system
- Authors: Herbert, Alan
- Date: 2019
- Subjects: Bolvedere (Computer network analysis system) , Computer networks -- Scalability , Computer networks -- Measurement , Computer networks -- Security measures , Telecommunication -- Traffic -- Measurement
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/71557 , vital:29873
- Description: Since the advent of the Internet, and its public availability in the late 1990s, there have been significant advancements in network technologies and thus a significant increase in the bandwidth available to network users, both human and automated. Although this growth is of great value to network users, it has led to an increase in malicious network-based activities, and it is theorised that, as more services become available on the Internet, the volume of such activities will continue to grow. Because of this, there is a need to monitor, understand and (where needed) respond to events on networks worldwide. Although this line of thought is simple in its reasoning, undertaking such a task is no small feat. Full packet analysis is a method of network surveillance that seeks out specific characteristics within network traffic that may tell of malicious activity or of anomalies in regular network usage. It is carried out within firewalls and implemented through packet classification. In the context of the networks that make up the Internet, this form of packet analysis has become infeasible: the volume of traffic introduced onto these networks every day is so large that there are simply not enough processing resources to perform such a task on every packet in real time. One could combat this problem by performing post-incident forensics, archiving packets and processing them later; however, as one cannot process all incoming packets, the archive will eventually run out of space. Full packet analysis is also hindered by the fact that some existing, commonly-used solutions are designed around a single host and a single thread of execution, an outdated approach that is far slower than necessary on current computing technology. This research explores the conceptual design and implementation of a scalable network traffic analysis system named Bolvedere.
Analysis performed by Bolvedere simply asks whether the existence of a connection, coupled with its associated metadata, is enough to conclude something meaningful about that connection. This idea draws away from the traditional processing of every single byte in every packet monitored on a network link (Deep Packet Inspection) by working with connection flows instead. Bolvedere performs its work by leveraging the NetFlow version 9 and IPFIX protocols, but is not limited to these. It is implemented using a modular approach that allows either complete execution of the system on a single host or the horizontal scaling-out of subsystems across multiple hosts. The use of multiple hosts is achieved through ZeroMQ (ZMQ), whose lightweight interprocess communication allows Bolvedere to scale out horizontally, increasing processing resources and thus analysis throughput. Many underlying mechanisms in Bolvedere have been automated to make the system more user-friendly: the user need only tell Bolvedere what information they wish to analyse, and the system will then rebuild itself in order to achieve the required task. Bolvedere has also been hardware-accelerated through the use of Field-Programmable Gate Array (FPGA) technologies, which more than doubled the total throughput of the system.
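The flow-metadata idea described in this abstract can be illustrated with a short, self-contained Python sketch. Everything here is illustrative, not Bolvedere's actual code: the Flow record carries only a few of the fields a NetFlow v9/IPFIX exporter would emit, and flag_port_scans is a hypothetical heuristic (a source touching an unusually wide spread of destination ports). The point is that such conclusions can be drawn without inspecting a single packet payload.

```python
from collections import defaultdict
from typing import NamedTuple


class Flow(NamedTuple):
    """A minimal connection-flow record: a small, illustrative subset
    of the metadata a NetFlow v9 / IPFIX exporter would emit."""
    src_ip: str
    dst_ip: str
    dst_port: int
    bytes: int


def flag_port_scans(flows, target_threshold=100):
    """Flag sources that contact an unusually wide set of
    (destination, port) targets -- a conclusion drawn purely from
    flow metadata, with no packet payloads inspected."""
    targets_seen = defaultdict(set)
    for f in flows:
        targets_seen[f.src_ip].add((f.dst_ip, f.dst_port))
    return sorted(src for src, targets in targets_seen.items()
                  if len(targets) >= target_threshold)


# A scanner probing 150 ports on one host, amid ordinary HTTPS traffic.
flows = [Flow("10.0.0.9", "192.0.2.1", p, 60) for p in range(150)]
flows += [Flow("10.0.0.2", "192.0.2.1", 443, 5000)] * 20
```

In a Bolvedere-style deployment, records like these would be fanned out over ZMQ sockets to many analysis workers on separate hosts rather than processed in one loop; the heuristic itself is unchanged by that scaling.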
- Full Text:
Data-centric security : towards a utopian model for protecting corporate data on mobile devices
- Authors: Mayisela, Simphiwe Hector
- Date: 2014
- Subjects: Computer security , Computer networks -- Security measures , Business enterprises -- Computer networks -- Security measures , Mobile computing -- Security measures , Mobile communication systems -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4688 , http://hdl.handle.net/10962/d1011094 , Computer security , Computer networks -- Security measures , Business enterprises -- Computer networks -- Security measures , Mobile computing -- Security measures , Mobile communication systems -- Security measures
- Description: Data-centric security is significant in understanding, assessing and mitigating the various risks and impacts of sharing information outside corporate boundaries. Information generally leaves corporate boundaries through mobile devices. Mobile devices continue to evolve as multi-functional tools for everyday life, surpassing their initial intended use. This added capability and increasingly extensive use of mobile devices do not come without a degree of risk, hence the need to guard and protect information as it exists beyond the corporate boundaries and throughout its lifecycle. Literature on existing models crafted to protect data, rather than the infrastructure in which the data resides, is reviewed. Technologies that organisations have implemented to adopt the data-centric model are studied. A utopian model that takes into account the shortcomings of existing technologies and the deficiencies of common theories is proposed. Two sets of qualitative studies are reported: the first is a preliminary online survey to assess the ubiquity of mobile devices and the extent of technology adoption towards implementation of the data-centric model; the second comprises a focus survey and expert interviews pertaining to technologies that organisations have implemented to adopt the data-centric model. The latter study revealed insufficient data at the time of writing for the results to be statistically significant; however, indicative trends supported the assertions documented in the literature review. The question that this research answers is whether or not current technology implementations designed to mitigate risks from mobile devices actually address business requirements. This research question, answered through these two sets of qualitative studies, uncovered inconsistencies between the technology implementations and the business requirements.
The thesis concludes by proposing a realistic model, based on the outcome of the qualitative studies, which bridges the gap between the technology implementations and the business requirements. Future work that could be conducted in light of the findings and comments from this research is also considered.
- Full Text:
ISGOP: A model for an information security governance platform
- Authors: Manjezi, Zandile
- Date: 2020
- Subjects: Electronic data processing departments -- Security measures , Computer networks -- Security measures , Data protection
- Language: English
- Type: Thesis , Masters , MIT
- Identifier: http://hdl.handle.net/10948/46130 , vital:39505
- Description: Sound information security governance is an important part of every business. However, the widespread ransomware attacks that occur regularly cast a shadow of doubt on information security governance practices. Countermeasures to prevent and mitigate ransomware attacks are well known, yet knowledge of these countermeasures is not enough to ensure good information security governance. What matters is how the countermeasures are implemented across a business. Therefore, an information security governance structure is needed to oversee the deployment of these countermeasures. This research study proposes an information security governance model called ISGoP, which describes an information security governance platform comprising a data aspect and a functional aspect. ISGoP adopted ideas from existing frameworks. An information security governance framework known as the Direct-Control Cycle was analysed. This provided ISGoP with conceptual components, such as information security-related documents and the relationships that exist between them. It is important to understand these conceptual components when distributing information security-related documents across all levels of management for a holistic implementation. Security-related documents and their relationships comprise the data aspect of ISGoP. Another framework that influenced ISGoP is the SABSA framework, an enterprise architecture framework that enables interoperability and ensures collaboration between the people working for a business. Ideas from the SABSA framework were used to identify roles within the information security governance framework, and the SABSA life cycle stages were also adopted by ISGoP. Various functions define the functional aspect of ISGoP. These functions are organised according to the life cycle stages and the views defined for the various roles. A case study was used to evaluate the possible utility of ISGoP.
The case study explored a prototype implementation of ISGoP in a company. In addition to demonstrating its utility, the case study also allowed the model to be refined. ISGoP as a model must be refined and modified for specific business circumstances but lays a solid foundation to assist businesses in implementing sound information security governance.
- Full Text:
Securing softswitches from malicious attacks
- Authors: Opie, Jake Weyman
- Date: 2007
- Subjects: Internet telephony -- Security measures , Computer networks -- Security measures , Digital telephone systems , Communication -- Technological innovations , Computer network protocols , TCP/IP (Computer network protocol) , Switching theory
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4683 , http://hdl.handle.net/10962/d1007714 , Internet telephony -- Security measures , Computer networks -- Security measures , Digital telephone systems , Communication -- Technological innovations , Computer network protocols , TCP/IP (Computer network protocol) , Switching theory
- Description: Traditionally, real-time communication, such as voice calls, has run on separate, closed networks. Of all the limitations that these networks had, the ability of malicious attacks to cripple communication was not a crucial one. This situation has changed radically now that real-time communication and data have merged to share the same network. The objective of this project is to investigate how to secure softswitches, which offer functionality similar to that of Private Branch Exchanges (PBXs), against malicious attacks. The focus of the project is a practical investigation of how to secure ILANGA, an ASTERISK-based system under development at Rhodes University. This practical investigation is based on performing six varied experiments on the different components of ILANGA. Before the six experiments are performed, basic preliminary security measures and the restrictions placed on access to the database are discussed. The outcomes of these experiments are discussed and the precise reasons why the attacks were either successful or unsuccessful are given. Suggestions of a theoretical nature on how to defend against the successful attacks are also presented.
- Full Text:
A strategy to promote awareness and adherence to information security policy at Capricorn District Municipality
- Authors: Mamabolo, Mokgadi Hellen
- Date: 2019
- Subjects: Computer security -- Management , Data protection -- Management , Computer security , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MPhil
- Identifier: http://hdl.handle.net/10948/40867 , vital:36245
- Description: The purpose of this research was to investigate the reasons for non-adherence to the information security policy (ISP) and to measure the current level of adherence to the ISP. The research revealed that non-adherence to the ISP is caused by a lack of training or awareness, and by non-communication of the ISP to employees. The study was conducted at Capricorn District Municipality, Polokwane Local Municipality, Molemole Local Municipality and Blouberg Local Municipality. A web-based questionnaire (QuestionPro) was developed and directed to every official who uses or interacts with municipal information, to quantify the level of adherence to the ISP by employees. An email with the questionnaire link, administered by www.questionpro.com, was then sent to the population of 152 employees. Presently, ISP adherence is one of the key concerns faced by organisations. Employees are perceived as one of the reasons for security breaches within organisations; hence, it is of paramount importance that these breaches are noticed, alongside technical matters. Most researchers have reasoned that non-adherence to the ISP is one of the major challenges faced by organisations. Non-adherence to the ISP will lead to potential information security threats and unauthorised access to information that might compromise municipal business operations. The Information Security Officer, together with the help of management, must educate employees regarding the value of information security and why it is crucial to adhere to these policies. The proposed strategy summarises the various concepts required in the promotion of awareness of, and adherence to, an effective ISP. Ultimately, this research study concludes that if management continually trains employees, raising awareness about the ISP and monitoring their adherence to it, this should increase the adherence level.
- Full Text:
A comparison of exact string search algorithms for deep packet inspection
- Authors: Hunt, Kieran
- Date: 2018
- Subjects: Algorithms , Firewalls (Computer security) , Computer networks -- Security measures , Intrusion detection systems (Computer security) , Deep Packet Inspection
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/60629 , vital:27807
- Description: Every day, computer networks throughout the world face a constant onslaught of attacks. To combat these, network administrators are forced to employ a multitude of mitigating measures. Devices such as firewalls and Intrusion Detection Systems are prevalent today and employ extensive Deep Packet Inspection to scrutinise each piece of network traffic. Systems such as these usually require specialised hardware to meet the demand imposed by high throughput networks. Hardware like this is extremely expensive and singular in its function. It is with this in mind that the string search algorithms are introduced. These algorithms have been proven to perform well when searching through large volumes of text and may be able to perform equally well in the context of Deep Packet Inspection. String search algorithms are designed to match a single pattern to a substring of a given piece of text. This is not unlike the heuristics employed by traditional Deep Packet Inspection systems. This research compares the performance of a large number of string search algorithms during packet processing. Deep Packet Inspection places stringent restrictions on the reliability and speed of the algorithms due to increased performance pressures. A test system had to be designed in order to properly test the string search algorithms in the context of Deep Packet Inspection. The system allowed for precise and repeatable tests of each algorithm and then for their comparison. Of the algorithms tested, the Horspool and Quick Search algorithms posted the best results for both speed and reliability. The Not So Naive and Rabin-Karp algorithms were slowest overall.
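As a concrete reference point for the comparison described in this abstract, the Horspool (Boyer-Moore-Horspool) algorithm it reports among the fastest can be sketched in a few lines of Python. This simplified single-pattern, string-based form and the function name are illustrative only; a real Deep Packet Inspection engine would operate over raw bytes at much higher volumes.

```python
def horspool_search(pattern: str, text: str) -> int:
    """Return the index of the first occurrence of pattern in text,
    or -1 if absent (Boyer-Moore-Horspool exact string search)."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    # Bad-character shift table: for each character occurring before the
    # pattern's last position, the distance from its last such occurrence
    # to the end of the pattern. Absent characters shift the full length.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    i = 0  # left edge of the current search window
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        # The text character aligned with the pattern's last position
        # decides how far the window may safely jump.
        i += shift.get(text[i + m - 1], m)
    return -1
```

The bad-character table lets the window skip up to len(pattern) characters per mismatch rather than advancing one position at a time, which is why Horspool tends to outperform naive scanning on long packet payloads.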
- Full Text:
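The Horspool algorithm named in this abstract can be illustrated with a short sketch. This is a generic textbook rendering of Boyer-Moore-Horspool over byte strings, not the thesis's actual test implementation; the function name and sample payload are invented for illustration.

```python
def horspool_search(pattern: bytes, text: bytes) -> int:
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1 if m > n else 0
    # Bad-character shift table: for each byte in the pattern (except the
    # last position), the distance from its last occurrence to the end.
    shift = {b: m - i - 1 for i, b in enumerate(pattern[:-1])}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        # Shift by the table entry for the byte aligned with the
        # pattern's last position; unseen bytes allow a full-length skip.
        i += shift.get(text[i + m - 1], m)
    return -1

# Hypothetical usage: scanning a packet payload for a signature.
print(horspool_search(b"EVIL", b"payload EVIL bytes"))  # 8
```

The large skips on bytes absent from the pattern are what make Horspool attractive for packet payloads, where most bytes will not match any signature.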
A holistic approach to network security in OGSA-based grid systems
- Authors: Loutsios, Demetrios
- Date: 2006
- Subjects: Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9736 , http://hdl.handle.net/10948/550 , Computer networks -- Security measures
- Description: Grid computing technologies facilitate complex scientific collaborations between globally dispersed parties, which make use of heterogeneous technologies and computing systems. However, in recent years the commercial sector has developed a growing interest in Grid technologies. Prominent Grid researchers have predicted that Grids will grow into the commercial mainstream, even though their origins lie in scientific research. This mirrors the way the Internet started as a vehicle for research collaboration between universities and government institutions and grew into a technology with large commercial applications. Grids facilitate complex trust relationships between globally dispersed business partners, research groups, and non-profit organizations. Almost any dispersed “virtual organization” willing to share computing resources can make use of Grid technologies. Grid computing facilitates the networking of shared services; the inter-connection of a potentially unlimited number of computing resources within a “Grid” is possible. Grid technologies leverage a range of open standards and technologies to provide interoperability between heterogeneous computing systems. Newer Grids build on key capabilities of Web-Service technologies to provide easy and dynamic publishing and discovery of Grid resources. Due to the inter-organisational nature of Grid systems, there is a need to provide adequate security to Grid users and to Grid resources. This research proposes a framework, using a specific brokered pattern, which addresses several common Grid security challenges, including: providing secure and consistent cross-site authentication and authorization; single sign-on capabilities for Grid users; underlying platform and runtime security; and Grid network communications and messaging security. These Grid security challenges can be viewed as comprising two (proposed) logical layers of a Grid.
These layers are a Common Grid Layer (higher-level Grid interactions) and a Local Resource Layer (lower-level technology security concerns). This research is concerned with providing a generic and holistic security framework to secure both layers. It makes extensive use of STRIDE (Microsoft's mnemonic for classifying security threats: Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, and Elevation of privilege) as part of a holistic Grid security framework. STRIDE and key Grid-related standards, such as the Open Grid Services Architecture (OGSA), the Web Services Resource Framework (WSRF), and the Globus Toolkit, are used to formulate the proposed framework.
- Full Text:
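The STRIDE classification this abstract relies on can be sketched as a per-asset review checklist. The category names are Microsoft's standard six; the example Grid threats and the helper function below are illustrative assumptions, not the mappings defined in the thesis.

```python
# STRIDE categories mapped to hypothetical Grid-layer example threats.
# The example threats are assumptions for illustration only.
STRIDE = {
    "Spoofing": "forged credentials presented to a Grid service",
    "Tampering": "modification of job data in transit between sites",
    "Repudiation": "a user denying submission of a resource request",
    "Information disclosure": "job results leaking to unauthorised parties",
    "Denial of service": "flooding a broker with bogus job submissions",
    "Elevation of privilege": "a job escaping its runtime sandbox",
}

def checklist(asset: str) -> list:
    """Produce one review question per STRIDE category for an asset."""
    return [f"{cat}: could '{asset}' be exposed to {threat}?"
            for cat, threat in STRIDE.items()]

for question in checklist("job broker"):
    print(question)
```

Walking each asset in both the Common Grid Layer and the Local Resource Layer through all six categories is one way such a threat model could be applied systematically.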